Sample records for technology reducing uncertainty

  1. Trapped between two tails: trading off scientific uncertainties via climate targets

    NASA Astrophysics Data System (ADS)

    Lemoine, Derek; McJeon, Haewon C.

    2013-09-01

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  2. Trapped Between Two Tails: Trading Off Scientific Uncertainties via Climate Targets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemoine, Derek M.; McJeon, Haewon C.

    2013-08-20

Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 ppm and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  3. The option to abandon: stimulating innovative groundwater remediation technologies characterized by technological uncertainty.

    PubMed

    Compernolle, T; Van Passel, S; Huisman, K; Kort, P

    2014-10-15

Many studies on technology adoption demonstrate that uncertainty leads to a postponement of investments by integrating a wait option in the economic analysis. The aim of this study, however, is to demonstrate how investment in new technologies can be stimulated by integrating an option to abandon. Furthermore, this real option analysis not only considers the ex ante decision analysis of the investment in a new technology under uncertainty, but also allows for an ex post evaluation of the investment. Based on a case study regarding the adoption of an innovative groundwater remediation strategy, it is demonstrated that when the option to abandon the innovative technology is taken into account, the decision maker decides to invest in this technology while also determining an optimal timing to abandon the technology if its operation proves to be inefficient. To reduce uncertainty about the effectiveness of groundwater remediation technologies, samples are taken. Our analysis shows that when the initial belief in an effective innovative technology is low, it is important that these samples provide correct information in order to justify the adoption of the innovative technology.
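
A minimal sketch of the abandonment-option logic described in this abstract, not the paper's model: a Monte Carlo comparison of expected life-cycle cost with and without the option to abandon an innovative remediation technology after an informative sample. All costs, the prior belief, and the perfect-information sample are invented placeholders.

```python
# Hedged sketch: value of an "option to abandon" an innovative remediation
# technology, in the spirit of Compernolle et al. All numbers are invented
# placeholders, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
n_sims, years, rate = 10_000, 10, 0.05
cost_innovative, cost_conventional = 40.0, 100.0   # annual operating costs
p_effective = 0.6                                  # prior belief the technology works

disc = (1 + rate) ** -np.arange(years)

def pv(costs):                     # present value of an annual cost stream
    return float(np.sum(costs * disc))

effective = rng.random(n_sims) < p_effective

# Without the abandon option: stuck with whichever outcome is realized.
no_option = np.where(effective,
                     pv(np.full(years, cost_innovative)),
                     pv(np.full(years, cost_innovative * 3)))  # failed tech is costly

# With the option: a sample after year 1 reveals effectiveness; if ineffective,
# abandon and switch to the conventional technology for the remaining years.
switch = pv(np.concatenate(([cost_innovative],
                            np.full(years - 1, cost_conventional))))
with_option = np.where(effective, pv(np.full(years, cost_innovative)), switch)

print(f"expected cost, no option   : {no_option.mean():8.1f}")
print(f"expected cost, with option : {with_option.mean():8.1f}")
print(f"value of the abandon option: {no_option.mean() - with_option.mean():8.1f}")
```

The gap between the two expected costs is the value of the abandon option; the paper's contribution is the optimal abandonment timing and the treatment of imperfect sampling information, both of which this toy omits.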

  4. The HTA Risk Analysis Chart: Visualising the Need for and Potential Value of Managed Entry Agreements in Health Technology Assessment.

    PubMed

    Grimm, Sabine Elisabeth; Strong, Mark; Brennan, Alan; Wailoo, Allan J

    2017-12-01

    Recent changes to the regulatory landscape of pharmaceuticals may sometimes require reimbursement authorities to issue guidance on technologies that have a less mature evidence base. Decision makers need to be aware of risks associated with such health technology assessment (HTA) decisions and the potential to manage this risk through managed entry agreements (MEAs). This work develops methods for quantifying risk associated with specific MEAs and for clearly communicating this to decision makers. We develop the 'HTA risk analysis chart', in which we present the payer strategy and uncertainty burden (P-SUB) as a measure of overall risk. The P-SUB consists of the payer uncertainty burden (PUB), the risk stemming from decision uncertainty as to which is the truly optimal technology from the relevant set of technologies, and the payer strategy burden (PSB), the additional risk of approving a technology that is not expected to be optimal. We demonstrate the approach using three recent technology appraisals from the UK National Institute for Health and Clinical Excellence (NICE), each of which considered a price-based MEA. The HTA risk analysis chart was calculated using results from standard probabilistic sensitivity analyses. In all three HTAs, the new interventions were associated with substantial risk as measured by the P-SUB. For one of these technologies, the P-SUB was reduced to zero with the proposed price reduction, making this intervention cost effective with near complete certainty. For the other two, the risk reduced substantially with a much reduced PSB and a slightly increased PUB. The HTA risk analysis chart shows the risk that the healthcare payer incurs under unresolved decision uncertainty and when considering recommending a technology that is not expected to be optimal given current evidence. This allows the simultaneous consideration of financial and data-collection MEA schemes in an easily understood format. The use of HTA risk analysis charts will help to ensure that MEAs are considered within a standard utility-maximising health economic decision-making framework.
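
The P-SUB decomposition lends itself to a small worked example. The sketch below computes PUB and PSB from synthetic probabilistic sensitivity analysis samples; reading PUB as an EVPI-style quantity is our gloss on the abstract, and all net-benefit distributions are invented.

```python
# Hedged sketch of the P-SUB = PUB + PSB decomposition described above,
# computed from probabilistic sensitivity analysis (PSA) samples of net
# monetary benefit. Numbers are invented, not from the NICE appraisals.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
nb = np.column_stack([
    rng.normal(10_000, 3_000, n),   # technology A
    rng.normal(11_000, 4_000, n),   # technology B (expected optimum)
    rng.normal( 9_500, 2_500, n),   # technology C
])

expected = nb.mean(axis=0)
optimal = int(expected.argmax())

# PUB: risk from not knowing which technology is truly optimal (EVPI-like).
pub = nb.max(axis=1).mean() - expected[optimal]

def p_sub(approved):
    """PSB (expected shortfall of the approved technology) plus PUB."""
    psb = expected[optimal] - expected[approved]
    return psb + pub

for k, name in enumerate("ABC"):
    print(f"approve {name}: P-SUB = {p_sub(k):8.1f}")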

  5. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework, and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
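
To see why reduced order models pay off here, the hedged sketch below contrasts brute-force Monte Carlo propagation through an "expensive" solver with propagation through a cheap surrogate trained on a handful of solver calls. A cubic polynomial fit stands in for the Polynomial Chaos / POD machinery, and the toy flutter-speed relation is invented.

```python
# Hedged illustration of the cost issue: Monte Carlo propagation directly
# through a costly solver vs. through a cheap surrogate. The "aeroelastic
# solver" below is a toy stand-in, not any real analysis code.
import numpy as np

rng = np.random.default_rng(2)

def expensive_flutter_speed(stiffness):
    # Placeholder for a costly aeroelastic analysis (invented relation).
    return 200.0 + 15.0 * np.sqrt(stiffness) - 0.02 * stiffness

# Brute force: 100,000 solver calls.
k_samples = rng.normal(1_000.0, 80.0, 100_000)
direct = expensive_flutter_speed(k_samples)

# Surrogate: 20 solver calls to train, then 100,000 cheap evaluations.
k_train = np.linspace(700, 1_300, 20)
coef = np.polyfit(k_train, expensive_flutter_speed(k_train), deg=3)
surrogate = np.polyval(coef, k_samples)

for name, v in [("direct MC", direct), ("surrogate", surrogate)]:
    print(f"{name:9s}: mean = {v.mean():7.2f}, 95% band = "
          f"[{np.percentile(v, 2.5):7.2f}, {np.percentile(v, 97.5):7.2f}]")
```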

  6. Technology Assessment Requirements for Programs and Projects

    NASA Technical Reports Server (NTRS)

    Bilbro, James W.

    2006-01-01

Program/project uncertainty can most simply be defined as the unpredictability of its outcome. As might be expected, the degree of uncertainty depends substantially on program/project type. For high-tech programs/projects, uncertainty all too frequently translates into schedule slips, cost overruns, and occasionally even cancellations or failures. The root cause of such events is often attributed to inadequate definition of requirements. If such were indeed the root cause, then correcting the situation would simply be a matter of requiring better requirements definition; but since history seems frequently to repeat itself, this must not be the case - at least not in total. There are in fact many contributors to schedule slips, cost overruns, project cancellations, and failures, among them lack of adequate requirements definition. The case can be made, however, that many of these contributors are related to the degree of uncertainty at the outset of the project, and further, that a dominant factor in the degree of uncertainty is the maturity of the technology required to bring the project to fruition. This presentation discusses the concept of relating degrees of uncertainty to Technology Readiness Levels (TRL) and their associated Advancement Degree of Difficulty (AD2) levels. It also briefly describes a quantifiable process to establish the appropriate TRL for a given technology, and quantifies through the AD2 what is required to move it from its current TRL to the desired TRL in order to reduce risk and maximize the likelihood of successfully infusing the technology.

  7. A methodology for uncertainty quantification in quantitative technology valuation based on expert elicitation

    NASA Astrophysics Data System (ADS)

    Akram, Muhammad Farooq Bin

The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods, and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation is critical. The uncertainty in defining the impact of an input on performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in the case of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions made during the elicitation process, when experts are otherwise forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSM). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher order technology interactions. A test case for quantification of epistemic uncertainty on a large-scale problem, a combined cycle power generation system, was selected. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence theory based technique provides more insight on the uncertainties arising from incomplete information or lack of knowledge, as compared to deterministic or probability theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capturing higher order technology interactions, and improvement in predicted system performance.
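
Dempster's rule of combination, the evidence-theory operation this thesis builds on, can be stated compactly in code. The sketch below combines two invented expert mass assignments over a two-element frame and reports belief and plausibility bounds; it illustrates the general technique only, not the thesis's TSM implementation.

```python
# Hedged sketch of Dempster's rule of combination for two experts judging a
# technology's impact. Masses are invented; the frame {low, high} is a
# deliberately tiny example of the general method.
from itertools import product

LOW, HIGH = frozenset({"low"}), frozenset({"high"})
EITHER = LOW | HIGH                      # "don't know" mass

expert1 = {LOW: 0.5, HIGH: 0.2, EITHER: 0.3}
expert2 = {LOW: 0.3, HIGH: 0.4, EITHER: 0.3}

def dempster(m1, m2):
    """Combine two basic probability assignments, renormalizing conflict."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

m = dempster(expert1, expert2)
belief_low = sum(w for s, w in m.items() if s <= LOW)    # Bel(low)
plaus_low = sum(w for s, w in m.items() if s & LOW)      # Pl(low)
print(f"Bel(low) = {belief_low:.3f}, Pl(low) = {plaus_low:.3f}")
```

The [Bel, Pl] interval, rather than a single probability, is what makes this representation attractive when experts cannot justify full distributions.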

  8. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), about which technologies will be available (technological uncertainty and choices), about when we choose to start acting globally on climate change (political choices), and about how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical, future energy demand, and mitigation technology uncertainties. This provides crucial information for policy making, since it helps to clarify the relationship between mitigation costs and their potential to reduce the risk of exceeding 2°C, or other temperature limits like 3°C or 1.5°C, under a wide range of scenarios.

  9. Progress of Aircraft System Noise Assessment with Uncertainty Quantification for the Environmentally Responsible Aviation Project

    NASA Technical Reports Server (NTRS)

    Thomas, Russell H.; Burley, Casey L.; Guo, Yueping

    2016-01-01

Aircraft system noise predictions have been performed for NASA modeled hybrid wing body aircraft advanced concepts with 2025 entry-into-service technology assumptions. The system noise predictions were developed over the period from 2009 to 2016 as a result of improved modeling of the aircraft concepts, design changes, technology development, flight path modeling, and the use of extensive integrated system level experimental data. In addition, the system noise prediction models and process have been improved in many ways. An additional process is developed here for quantifying the uncertainty with a 95% confidence level. This uncertainty applies only to the aircraft system noise prediction process. For three points in time during this period, the vehicle designs, technologies, and noise prediction process are documented. For each of the three predictions, and with the information available at each of those points in time, the uncertainty is quantified using the direct Monte Carlo method with 10,000 simulations. For the prediction of cumulative noise of an advanced aircraft at the conceptual level of design, the total uncertainty band has been reduced from 12.2 to 9.6 EPNL dB. A value of 3.6 EPNL dB is proposed as the lower limit of uncertainty possible for the cumulative system noise prediction of an advanced aircraft concept.
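
A hedged sketch of the direct Monte Carlo step described above: draw component-level prediction errors, sum to a cumulative metric, and read the 95% band off the percentiles. The component means and standard deviations are invented, not the report's values.

```python
# Hedged sketch: 95% uncertainty band for a cumulative noise metric via
# 10,000 direct Monte Carlo simulations, mirroring the approach above.
import numpy as np

rng = np.random.default_rng(3)
n = 10_000

# Cumulative noise = sum of three certification-point EPNL predictions, each
# with its own (assumed) prediction uncertainty in EPNL dB.
lateral    = rng.normal(88.0, 1.5, n)
flyover    = rng.normal(85.0, 2.0, n)
approach   = rng.normal(92.0, 1.0, n)
cumulative = lateral + flyover + approach

lo, hi = np.percentile(cumulative, [2.5, 97.5])
print(f"95% band: [{lo:.1f}, {hi:.1f}] EPNL dB, width = {hi - lo:.1f} dB")
```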

  10. River Protection Project Technology and Innovation Roadmap.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reid, D. S.; Wooley, T. A.; Kelly, S. E.

    The Technology and Innovation Roadmap is a planning tool for WRPS management, DOE ORP, DOE EM, and others to understand the risks and technology gaps associated with the RPP mission. The roadmap identifies and prioritizes technical areas that require technology solutions and underscores where timely and appropriate technology development can have the greatest impact to reduce those risks and uncertainties. The roadmap also serves as a tool for determining allocation of resources.

  11. Assessing the Expected Value of Research Studies in Reducing Uncertainty and Improving Implementation Dynamics.

    PubMed

    Grimm, Sabine E; Dixon, Simon; Stevens, John W

    2017-07-01

With low implementation of cost-effective health technologies being a problem in many health systems, it is worth considering the potential effects of research on implementation at the time of health technology assessment. Meaningful and realistic implementation estimates must be dynamic in nature. Our objective is to extend existing methods for assessing the value of research studies in terms of both reduction of uncertainty and improvement in implementation, by considering diffusion based on expert beliefs with and without further research, conditional on the strength of evidence. We use expected value of sample information and expected value of specific implementation measure concepts, accounting for the effects of specific research studies on implementation and the reduction of uncertainty. Diffusion theory and elicitation of expert beliefs about the shape of diffusion curves inform implementation dynamics. We illustrate use of the resulting dynamic expected value of research in a preterm birth screening technology, and results are compared with those from a static analysis. Allowing for diffusion based on expert beliefs had a significant impact on the expected value of research in the case study, suggesting that mistakes are made where static implementation levels are assumed. Incorporating the effects of research on implementation resulted in an increase in the expected value of research compared to the expected value of sample information alone. Assessing the expected value of research in reducing uncertainty and improving implementation dynamics has the potential to complement currently used analyses in health technology assessments, especially in recommendations for further research. The combination of expected value of research, diffusion theory, and elicitation described in this article is an important addition to the existing methods of health technology assessment.
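
A minimal sketch of the implementation-dynamics idea: a Bass diffusion curve, one common choice that expert-elicited uptake curves could parameterize (the paper's exact functional form may differ), replaces a static uptake level when weighting the per-period value of research. All parameters are invented.

```python
# Hedged sketch: dynamic (diffusion-weighted) vs static implementation when
# valuing research. Bass parameters p, q and all values are invented.
import numpy as np

def bass_adoption(p, q, periods):
    """Cumulative adoption fraction F(t) of the discrete Bass diffusion model."""
    f = np.zeros(periods)
    for t in range(1, periods):
        f[t] = f[t-1] + (p + q * f[t-1]) * (1.0 - f[t-1])
    return f

years = 10
static_uptake = 0.4                      # flat implementation assumption
dynamic_uptake = bass_adoption(p=0.05, q=0.6, periods=years)

per_patient_value = 1_000.0              # assumed value of research per treated patient
population = 5_000                       # assumed eligible patients per year

static_value = years * static_uptake * population * per_patient_value
dynamic_value = float(dynamic_uptake.sum() * population * per_patient_value)
print(f"static implementation : {static_value:12,.0f}")
print(f"dynamic implementation: {dynamic_value:12,.0f}")
```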

  12. Stress Corrosion of Ceramic Materials

    DTIC Science & Technology

    1981-10-01

stresses are liable to fail after an indeterminate period of time, leading to a considerable uncertainty in the safe design stress. One of the objectives...of modern ceramics technology is to reduce the uncertainty associated with structural design, and hence, to improve our capabilities of designing...processes that occur during stress corrosion cracking. Recent advances in the area of structural design with ceramic materials have led to several

  13. Ionospheric Response to Extremes in the Space Environment: Establishing Benchmarks for the Space Weather Action Plan.

    NASA Astrophysics Data System (ADS)

    Viereck, R. A.; Azeem, S. I.

    2017-12-01

One of the goals of the National Space Weather Action Plan is to establish extreme event benchmarks. These benchmarks are estimates of environmental parameters that impact technologies and systems during extreme space weather events. Quantitative assessment of anticipated conditions during these extreme space weather events will enable operators and users of affected technologies to develop plans for mitigating space weather risks and improve preparedness. The ionosphere is one of the most important regions of space because so many applications either depend on ionospheric space weather for their operation (HF communication, over-the-horizon radars) or can be deleteriously affected by ionospheric conditions (e.g. GNSS navigation and timing, UHF satellite communications, synthetic aperture radar, HF communications). Since the processes that influence the ionosphere vary over time scales from seconds to years, it continues to be a challenge to adequately predict its behavior in many circumstances. Estimates with large uncertainties, in excess of 100%, may result in operators of impacted technologies over- or under-preparing for such events. The goal of the next phase of the benchmarking activity is to reduce these uncertainties. In this presentation, we will focus on the sources of uncertainty in the ionospheric response to extreme geomagnetic storms. We will then discuss various research efforts required to better understand the underlying processes of ionospheric variability and how the uncertainties in ionospheric response to extreme space weather could be reduced and the estimates improved.

  14. State perspectives on clean coal technology deployment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, T.

    1997-12-31

    State governments have been funding partners in the Clean Coal Technology program since its beginnings. Today, regulatory and market uncertainties and tight budgets have reduced state investment in energy R and D, but states have developed program initiatives in support of deployment. State officials think that the federal government must continue to support these technologies in the deployment phase. Discussions of national energy policy must include attention to the Clean Coal Technology program and its accomplishments.

  15. Planck Constant Determination from Power Equivalence

    NASA Astrophysics Data System (ADS)

    Newell, David B.

    2000-04-01

Equating mechanical to electrical power links the kilogram, the meter, and the second to the practical realizations of the ohm and the volt derived from the quantum Hall and the Josephson effects, yielding an SI determination of the Planck constant. The NIST watt balance uses this power equivalence principle, and in 1998 measured the Planck constant with a combined relative standard uncertainty of 8.7 × 10⁻⁸, the most accurate determination to date. The next generation of the NIST watt balance is now being assembled. Modifications to the experimental facilities have been made to reduce the uncertainty components from vibrations and electromagnetic interference. A vacuum chamber has been installed to reduce the uncertainty components associated with performing the experiment in air. Most of the apparatus is in place and diagnostic testing of the balance should begin this year. Once a combined relative standard uncertainty of one part in 10⁸ has been reached, the power equivalence principle can be used to monitor the possible drift in the artifact mass standard, the kilogram, and provide an accurate alternative definition of mass in terms of fundamental constants. *Electricity Division, Electronics and Electrical Engineering Laboratory, Technology Administration, U.S. Department of Commerce. Contribution of the National Institute of Standards and Technology, not subject to copyright in the U.S.
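
A short worked relation may make the power equivalence principle concrete; this is our reconstruction of the standard argument, not text from the record.

```latex
% Our gloss on why equating mechanical and electrical power determines h.
% Voltage is realized via the Josephson effect (K_J = 2e/h) and resistance
% via the quantum Hall effect (R_K = h/e^2).
\begin{align*}
  m g v &= U I && \text{(watt balance: mechanical power = electrical power)} \\
  K_J^{2} R_K &= \left(\frac{2e}{h}\right)^{2} \frac{h}{e^{2}} = \frac{4}{h}
  && \Longrightarrow\quad h = \frac{4}{K_J^{2} R_K}.
\end{align*}
```

Measuring the electrical power in Josephson and quantum-Hall units while measuring m, g, and v mechanically therefore fixes h in SI units, which is the principle the abstract invokes.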

  16. Regulatory uncertainty and the associated business risk for emerging technologies

    NASA Astrophysics Data System (ADS)

    Hoerr, Robert A.

    2011-04-01

An oversight system specifically concerned with nanomaterials should be flexible enough to take into account the unique aspects of individual novel materials and the settings in which they might be used, while recognizing that heretofore unrecognized safety issues may require future modifications. This article considers a question not explicitly considered by the project team: what is the risk that uncertainty over how regulatory oversight will be applied to nanomaterials will delay or block the development of this emerging technology, thereby depriving human health of potential and substantial benefits? An ambiguous regulatory environment could delay the availability of valuable new technology and therapeutics for human health by reducing access to investment capital. Venture capitalists list regulatory uncertainty as a major reason not to invest at all in certain areas. Uncertainty is far more difficult to evaluate than risk, which lends itself to quantitative models and can be factored into projections of return on possible investments. Loss of time has a large impact on investment return. An examination of regulatory case histories suggests that an increase in regulatory testing requirements, where the path is well-defined, is far less costly than a delay of a year or more in achieving product approval and market launch.

  17. A systematic uncertainty analysis for liner impedance eduction technology

    NASA Astrophysics Data System (ADS)

    Zhou, Lin; Bodén, Hans

    2015-11-01

The so-called impedance eduction technology is widely used for obtaining acoustic properties of liners used in aircraft engines. The measurement uncertainties of this technology are still not well understood, even though such understanding is essential for data quality assessment and model validation. A systematic framework based on multivariate analysis is presented in this paper to provide 95 percent confidence interval uncertainty estimates in the process of impedance eduction. The analysis is made using a single-mode straightforward method based on transmission coefficients, involving the classic Ingard-Myers boundary condition. The multivariate technique makes it possible to obtain an uncertainty analysis for the possibly correlated real and imaginary parts of the complex quantities. The results show that the errors in impedance results at low frequency mainly depend on the variability of the transmission coefficients, while the mean Mach number accuracy is the most important source of error at high frequencies. The effect of the Mach numbers used in the wave dispersion equation and in the Ingard-Myers boundary condition has been separated for comparison of the outcome of impedance eduction. A local Mach number based on friction velocity is suggested as a way to reduce the inconsistencies found when estimating impedance using upstream and downstream acoustic excitation.
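
A hedged sketch of the multivariate step: treating the educed impedance as a correlated (real, imaginary) pair and deriving 95% intervals from the sample covariance. The synthetic scatter stands in for the transmission-coefficient variability the paper propagates.

```python
# Hedged sketch: 95% intervals for the correlated real and imaginary parts
# of an educed impedance. Samples are synthetic, not measured data.
import numpy as np

rng = np.random.default_rng(4)

true = np.array([1.2, -0.8])            # (resistance, reactance), invented
cov = np.array([[0.010, 0.006],
                [0.006, 0.020]])        # correlated scatter, invented
samples = rng.multivariate_normal(true, cov, size=500)

mean = samples.mean(axis=0)
c = np.cov(samples, rowvar=False)
half_width = 1.96 * np.sqrt(np.diag(c))  # per-component 95% intervals

print(f"Re(z) = {mean[0]:.3f} +/- {half_width[0]:.3f}")
print(f"Im(z) = {mean[1]:.3f} +/- {half_width[1]:.3f}")
print(f"correlation(Re, Im) = {c[0,1] / np.sqrt(c[0,0]*c[1,1]):.2f}")
```

Reporting the correlation alongside the intervals is the point of the multivariate treatment: a diagonal-only analysis would misstate the joint confidence region.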

  18. Surface reconstruction and deformation monitoring of stratospheric airship based on laser scanning technology

    NASA Astrophysics Data System (ADS)

    Guo, Kai; Xie, Yongjie; Ye, Hu; Zhang, Song; Li, Yunfei

    2018-04-01

Due to the uncertainty of a stratospheric airship's shape and the safety problems caused by this uncertainty, surface reconstruction and surface deformation monitoring of the airship were conducted based on laser scanning technology, and a √3-subdivision scheme based on Shepard interpolation was developed. A comparison was then conducted between our subdivision scheme and the original √3-subdivision scheme. The result shows our subdivision scheme could reduce the shrinkage of the surface and the number of narrow triangles. In addition, our subdivision scheme could keep the sharp features. Thus, surface reconstruction and surface deformation monitoring of the airship can be conducted precisely with our subdivision scheme.
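
Shepard interpolation, the ingredient the modified √3-subdivision scheme builds on, is compact enough to sketch. The inverse-distance-weighted estimate below places a new surface point from nearby scanned points; the data and power parameter are invented, and this is not the authors' full subdivision scheme.

```python
# Hedged sketch of Shepard (inverse-distance-weighted) interpolation, used
# here to estimate a surface height from scattered scan points.
import numpy as np

def shepard(query, points, values, power=2.0):
    """Interpolate the value at `query` from scattered (point, value) samples."""
    d = np.linalg.norm(points - query, axis=1)
    if np.any(d < 1e-12):                 # query coincides with a sample
        return float(values[np.argmin(d)])
    w = d ** -power
    return float(np.dot(w, values) / w.sum())

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
z = np.array([0.00, 0.10, 0.05, 0.20])    # e.g. scanned surface heights
print(f"z(0.5, 0.5) = {shepard(np.array([0.5, 0.5]), pts, z):.4f}")
```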

  19. SunShot solar power reduces costs and uncertainty in future low-carbon electricity systems.

    PubMed

    Mileva, Ana; Nelson, James H; Johnston, Josiah; Kammen, Daniel M

    2013-08-20

    The United States Department of Energy's SunShot Initiative has set cost-reduction targets of $1/watt for central-station solar technologies. We use SWITCH, a high-resolution electricity system planning model, to study the implications of achieving these targets for technology deployment and electricity costs in western North America, focusing on scenarios limiting carbon emissions to 80% below 1990 levels by 2050. We find that achieving the SunShot target for solar photovoltaics would allow this technology to provide more than a third of electric power in the region, displacing natural gas in the medium term and reducing the need for nuclear and carbon capture and sequestration (CCS) technologies, which face technological and cost uncertainties, by 2050. We demonstrate that a diverse portfolio of technological options can help integrate high levels of solar generation successfully and cost-effectively. The deployment of GW-scale storage plays a central role in facilitating solar deployment and the availability of flexible loads could increase the solar penetration level further. In the scenarios investigated, achieving the SunShot target can substantially mitigate the cost of implementing a carbon cap, decreasing power costs by up to 14% and saving up to $20 billion ($2010) annually by 2050 relative to scenarios with Reference solar costs.

  20. The timing and probability of treatment switch under cost uncertainty: an application to patients with gastrointestinal stromal tumor.

    PubMed

    de Mello-Sampayo, Felipa

    2014-03-01

Cost fluctuations render the outcome of any treatment switch uncertain, so that decision makers might have to wait for more information before optimally switching treatments, especially when the incremental cost per quality-adjusted life year (QALY) gained cannot be fully recovered later on. The objective is to analyze the timing of a treatment switch under cost uncertainty. A dynamic stochastic model for the optimal timing of a treatment switch is developed and applied to a problem in medical decision making, i.e. to patients with unresectable gastrointestinal stromal tumour (GIST). The theoretical model suggests that cost uncertainty reduces expected net benefit. In addition, cost volatility discourages switching treatments. The stochastic model also illustrates that as technologies become less cost competitive, the cost uncertainty becomes more dominant. With limited substitutability, higher quality of technologies will increase the demand for those technologies disregarding the cost uncertainty. The results of the empirical application suggest that the first-line treatment may be the better choice when considering lifetime welfare. Under uncertainty and irreversibility, low-risk patients must begin the second-line treatment as soon as possible, which is precisely when the second-line treatment is least valuable. As the costs of reversing current treatment impacts fall, it becomes more feasible to provide the option-preserving treatment to these low-risk individuals later on.

  1. A quadranomial real options model for evaluation of emissions trading and technology

    NASA Astrophysics Data System (ADS)

    Sarkis, Joseph; Tamarkin, Maurry

    2005-11-01

Greenhouse gas (GHG) emissions have been tied to global climate change. One popular policy instrument that seems to have gained credibility, with explicit mention of its application in the Kyoto Protocol, is the use of permit trading and cap-and-trade mechanisms. Organizations functioning within this environment will need to manage their resources appropriately to remain competitive. Organizations will either have the opportunity to purchase emissions credits (offsets) from a market trading scheme or seek to reduce their emissions through different measures. Some measures may include investment in new technologies that will reduce their reliance on GHG emitting practices. In many countries, large organizations and institutions generate their own power to operate their facilities. Much of this power is generated (or bought) from GHG producing technology. Specific renewable energy sources such as wind and solar photovoltaic technology may become more feasible alternatives available to a large percentage of these organizations if they are able to take advantage of and incorporate the market for GHG emissions trading in their analyses. To help organizations evaluate investment in these renewable energy technologies, we introduce a real options based model that takes into consideration uncertainties associated with the technology and those associated with the GHG trading market. The real options analysis considers both the stochastic (uncertainty) nature of the exercise price of the technology and the stochastic nature of the market trading price of the GHG emissions.
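
The two-factor structure that motivates a quadranomial lattice can be sketched by Monte Carlo instead, with a simplification: one European-style decision date rather than a lattice of early-exercise opportunities. Both the permit price and the technology cost follow correlated geometric Brownian motions; every parameter below is invented.

```python
# Hedged sketch: two correlated stochastic factors (GHG permit price and
# technology exercise cost), valued with a single decision date at year T.
# This is a simplification of the quadranomial lattice, not the paper's model.
import numpy as np

rng = np.random.default_rng(5)
n, T, rate = 50_000, 5.0, 0.05

p0, k0 = 20.0, 150.0            # initial permit price ($/tCO2) and tech cost
sig_p, sig_k = 0.25, 0.15       # volatilities of permit price and tech cost
corr = -0.3                     # assumed correlation between the two shocks
q = 0.5                         # invented: tCO2 of permits avoided per year

L = np.linalg.cholesky(np.array([[1.0, corr], [corr, 1.0]]))
z = L @ rng.standard_normal((2, n))
p_T = p0 * np.exp(-0.5 * sig_p**2 * T + sig_p * np.sqrt(T) * z[0])
k_T = k0 * np.exp(-0.5 * sig_k**2 * T + sig_k * np.sqrt(T) * z[1])

# Decide once at year T whether to invest: perpetuity of avoided permit
# purchases versus the realized technology cost.
payoff = np.maximum(q * p_T / rate - k_T, 0.0)
print(f"option value: {np.exp(-rate * T) * payoff.mean():.1f}")
print(f"static NPV  : {q * p0 / rate - k0:.1f}")
```

The option value exceeding the static NPV is exactly the flexibility value that the lattice would quantify date by date.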

  2. Breaking through the uncertainty ceiling in LA-ICP-MS U-Pb geochronology

    NASA Astrophysics Data System (ADS)

    Horstwood, M.

    2016-12-01

Sources of systematic uncertainty associated with session-to-session bias are the dominant contributor to the 2% (2s) uncertainty ceiling that currently limits the accuracy of LA-ICP-MS U-Pb geochronology. Sources include differential downhole fractionation (LIEF), 'matrix effects' and ablation volume differences, which result in irreproducibility of the same reference material across sessions. Current mitigation methods include correcting for LIEF mathematically, using matrix-matched reference materials, annealing material to reduce or eliminate radiation damage effects, and tuning for robust plasma conditions. Reducing the depth and volume of ablation can also mitigate these problems and should contribute to the reduction of the uncertainty ceiling. Reducing analysed volume leads to increased detection efficiency, reduced matrix effects, eliminates LIEF, obviates ablation rate differences and reduces the likelihood of intercepting complex growth zones with depth, thereby apparently improving material homogeneity. High detection efficiencies (% level) and low sampling volumes (20 µm box, 1-2 µm deep) can now be achieved using MC-ICP-MS, such that low volume ablations should be considered part of the toolbox of methods targeted at improving the reproducibility of LA-ICP-MS U-Pb geochronology. In combination with other strategies these improvements should be feasible on any ICP platform. However, reducing the volume of analysis reduces detected counts and requires a change of analytical approach in order to mitigate this. Appropriate strategies may include the use of high efficiency cell and torch technologies and the optimisation of acquisition protocols and data handling techniques such as condensing signal peaks, using log ratios and total signal integration. The tools required to break the 2% (2s) uncertainty ceiling in LA-ICP-MS U-Pb geochronology are likely now known but require a coherent strategy and change of approach to combine their implementation and realise this goal. This study will highlight these changes and efforts towards reducing the uncertainty contribution for LA-ICP-MS U-Pb geochronology.

  3. PRaVDA: High Energy Physics towards proton Computed Tomography

    NASA Astrophysics Data System (ADS)

    Price, T.; PRaVDA Consortium

    2016-07-01

Proton radiotherapy is an increasingly popular modality for treating cancers of the head and neck, and in paediatrics. To maximise the potential of proton radiotherapy it is essential to know the distribution, and more importantly the proton stopping powers, of the body tissues between the proton beam and the tumour. A stopping power map could be measured directly, and uncertainties in the treatment vastly reduced, if the patient were imaged with protons instead of conventional x-rays. Here we outline the application of technologies developed for High Energy Physics to provide clinical-quality proton Computed Tomography, thereby reducing range uncertainties and enhancing the treatment of cancer.

  4. ISA implementation and uncertainty: a literature review and expert elicitation study.

    PubMed

    van der Pas, J W G M; Marchau, V A W J; Walker, W E; van Wee, G P; Vlassenroot, S H

    2012-09-01

Each day, an average of over 116 people die from traffic accidents in the European Union. One out of three fatalities is estimated to be the result of speeding. The current state of technology makes it possible to make speeding more difficult, or even impossible, by placing intelligent speed limiters (so-called ISA devices) in vehicles. Although the ISA technology has been available for some years now, and reducing the number of road traffic fatalities and injuries has been high on the European political agenda, implementation still seems to be far away. Experts indicate that there are still too many uncertainties surrounding ISA implementation, and dealing with these uncertainties is essential for implementing ISA. In this paper, a systematic and representative inventory of the uncertainties is made based upon the literature. Furthermore, experts in the field of ISA were surveyed and asked which uncertainties are barriers to ISA implementation, and how uncertain these uncertainties are. We found that the long-term effects and the effects of large-scale implementation of ISA are still uncertain and are the most important barriers to the implementation of the most effective types of ISA. One way to deal with these uncertainties would be to start implementation on a small scale and gradually expand the penetration, in order to learn how ISA influences the transport system over time.

  5. Probabilistic cost estimates for climate change mitigation.

    PubMed

    Rogelj, Joeri; McCollum, David L; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-01-03

    For more than a decade, the target of keeping global warming below 2 °C has been a key focus of the international climate debate. In response, the scientific community has published a number of scenario studies that estimate the costs of achieving such a target. Producing these estimates remains a challenge, particularly because of relatively well known, but poorly quantified, uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on the one hand, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other hand, has spent years improving its understanding of the geophysical response of the Earth system to emissions of greenhouse gases. This geophysical response remains a key uncertainty in the cost of mitigation scenarios but has been integrated with assessments of other uncertainties in only a rudimentary manner, that is, for equilibrium conditions. Here we bridge this gap between the two research communities by generating distributions of the costs associated with limiting transient global temperature increase to below specific values, taking into account uncertainties in four factors: geophysical, technological, social and political. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, social factors influencing future energy demand and, lastly, technological uncertainties surrounding the availability of greenhouse gas mitigation options. Our information on temperature risk and mitigation costs provides crucial information for policy-making, because it clarifies the relative importance of mitigation costs, energy demand and the timing of global action in reducing the risk of exceeding a global temperature increase of 2 °C, or other limits such as 3 °C or 1.5 °C, across a wide range of scenarios.

  6. Digital Sensor Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Ken D.; Quinn, Edward L.; Mauck, Jerry L.

The nuclear industry has been slow to incorporate digital sensor technology into nuclear plant designs due to concerns with digital qualification issues. However, the benefits of digital sensor technology for nuclear plant instrumentation are substantial in terms of accuracy and reliability. This paper, which refers to a final report issued in 2013, demonstrates these benefits in direct comparisons of digital and analog sensor applications. Improved accuracy results from the superior operating characteristics of digital sensors. These include improvements in sensor accuracy and drift and other related parameters which reduce total loop uncertainty and thereby increase safety and operating margins. An example instrument loop uncertainty calculation for a pressure sensor application is presented to illustrate these improvements. This is a side-by-side comparison of the instrument loop uncertainty for both an analog and a digital sensor in the same pressure measurement application. Similarly, improved sensor reliability is illustrated with a sample calculation for determining the probability of failure on demand, an industry standard reliability measure. This looks at equivalent analog and digital temperature sensors to draw the comparison. The results confirm substantial reliability improvement with the digital sensor, due in large part to the ability to continuously monitor the health of a digital sensor such that problems can be immediately identified and corrected. This greatly reduces the likelihood of a latent failure condition of the sensor at the time of a design basis event. Notwithstanding the benefits of digital sensors, there are certain qualification issues that are inherent with digital technology and these are described in the report. One major qualification impediment for digital sensor implementation is software common cause failure (SCCF).
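
A hedged sketch of what a side-by-side loop-uncertainty comparison looks like: independent error terms combined by the square-root-sum-of-squares rule commonly used in setpoint calculations. The term names and percentages are invented, not the report's values.

```python
# Hedged sketch: analog vs digital instrument-loop uncertainty via the
# square-root-sum-of-squares (SRSS) combination of independent terms.
import math

def loop_uncertainty(terms):
    """SRSS combination of independent uncertainty terms (% of span)."""
    return math.sqrt(sum(v * v for v in terms.values()))

analog = {"sensor accuracy": 0.5, "drift": 0.8, "M&TE": 0.3,
          "rack accuracy": 0.4, "temperature effect": 0.5}
digital = {"sensor accuracy": 0.2, "drift": 0.1, "M&TE": 0.3,
           "rack accuracy": 0.1, "temperature effect": 0.2}

print(f"analog loop : +/- {loop_uncertainty(analog):.2f} % span")
print(f"digital loop: +/- {loop_uncertainty(digital):.2f} % span")
```

The reduced total loop uncertainty is what translates directly into the larger safety and operating margins the paper describes.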

  7. The choice of strategic alternatives under increasing regulation in high technology companies.

    PubMed

    Birnbaum, P H

    1984-09-01

    The strategic response of U.S. high technology companies in the medical X-ray manufacturing industry to increased governmental regulations from 1962 to 1977 is examined. Results suggest that regulations increase consumer and competitor uncertainty, with the consequence that firms select less risky strategies and decrease the riskier new product invention strategy. Larger firms reduce inventions less than smaller firms.

  8. Assessment of Data and Knowledge Fusion Strategies for Diagnostics and Prognostics

    DTIC Science & Technology

    2001-04-05

prognostic technologies has proven effective in reducing false alarm rates, increasing confidence levels in early fault detection, and predicting time...or better than the sum of the parts. Specific to health management, this means reduced uncertainty in current condition assessment...achieve time synchronous averaged vibration features. [Figure 1 - Fusion Application Areas]

  9. Optimization under variability and uncertainty: a case study for NOx emissions control for a gasification system.

    PubMed

    Chen, Jianjun; Frey, H Christopher

    2004-12-15

    Methods for optimization of process technologies considering the distinction between variability and uncertainty are developed and applied to case studies of NOx control for Integrated Gasification Combined Cycle systems. Existing methods of stochastic optimization (SO) and stochastic programming (SP) are demonstrated. A comparison of SO and SP results provides the value of collecting additional information to reduce uncertainty. For example, an expected annual benefit of 240,000 dollars is estimated if uncertainty can be reduced before a final design is chosen. SO and SP are typically applied to uncertainty. However, when applied to variability, the benefit of dynamic process control is obtained. For example, an annual savings of 1 million dollars could be achieved if the system is adjusted to changes in process conditions. When variability and uncertainty are treated distinctively, a coupled stochastic optimization and programming method and a two-dimensional stochastic programming method are demonstrated via a case study. For the case study, the mean annual benefit of dynamic process control is estimated to be 700,000 dollars, with a 95% confidence range of 500,000 dollars to 940,000 dollars. These methods are expected to be of greatest utility for problems involving a large commitment of resources, for which small differences in designs can produce large cost savings.
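
The "value of collecting additional information" comparison has a compact form: the expected cost of committing to one design now versus choosing after the uncertainty resolves. The sketch below illustrates that calculation on invented cost distributions, not the paper's IGCC model.

```python
# Hedged sketch: here-and-now design choice under uncertainty (stochastic
# programming style) vs a wait-and-see choice after uncertainty resolves.
# The difference is the expected value of resolving the uncertainty.
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# Annualized cost ($/yr) of two NOx-control designs under an uncertain input.
cost_a = rng.normal(5.0e6, 1.2e6, n)
cost_b = rng.normal(5.3e6, 0.4e6, n)
costs = np.stack([cost_a, cost_b])

here_and_now = costs.mean(axis=1).min()   # commit to one design now
wait_and_see = costs.min(axis=0).mean()   # pick per realized scenario
print(f"expected value of resolving uncertainty: "
      f"{here_and_now - wait_and_see:,.0f} $/yr")
```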

  10. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghanem, Roger

QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST was to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  11. Reduced-order model based active disturbance rejection control of hydraulic servo system with singular value perturbation theory.

    PubMed

    Wang, Chengwen; Quan, Long; Zhang, Shijie; Meng, Hongjun; Lan, Yuan

    2017-03-01

Hydraulic servomechanism is the typical mechanical/hydraulic double-dynamics coupling system, with high stiffness control and mismatched uncertainty input problems that hinder direct application of many advanced control approaches in the hydraulic servo field. In this paper, by introducing singular value perturbation theory, the original double-dynamics coupling model of the hydraulic servomechanism was reduced to an integral-chain system, so that the popular ADRC (active disturbance rejection control) technology could be directly applied to the reduced system. In addition, the high stiffness control and mismatched uncertainty input problems are avoided. The validity of the simplified model is analyzed and proven theoretically. The standard linear ADRC algorithm is then developed based on the obtained reduced-order model. Extensive comparative co-simulations and experiments are carried out to illustrate the effectiveness of the proposed method.
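
A minimal sketch of the ADRC core applied to an integral-chain model like the reduced system above: a linear extended state observer estimates the lumped disturbance of a double integrator from position measurements alone. Plant, disturbance, and gains are invented; this is the generic textbook construction, not the paper's controller.

```python
# Hedged sketch: linear extended state observer (ESO), the core of linear
# ADRC, tracking the "total disturbance" of a double-integrator plant.
import numpy as np

dt, steps = 1e-3, 6250
b0, wo = 1.0, 50.0                        # input gain guess, observer bandwidth
l1, l2, l3 = 3*wo, 3*wo**2, wo**3         # bandwidth-parameterized ESO gains

x = np.zeros(2)                           # true plant state [pos, vel]
zhat = np.zeros(3)                        # ESO state [pos, vel, disturbance]
u = 1.0                                   # constant test input

for k in range(steps):
    d = 2.0 * np.sin(2 * np.pi * 0.2 * k * dt)    # unknown disturbance
    # True plant: x'' = d + b0*u  (integral-chain form after model reduction).
    x += dt * np.array([x[1], d + b0 * u])
    # ESO: copies the chain and corrects with the measured position error.
    e = x[0] - zhat[0]
    zhat += dt * np.array([zhat[1] + l1 * e,
                           zhat[2] + b0 * u + l2 * e,
                           l3 * e])

print(f"true disturbance: {2.0 * np.sin(2*np.pi*0.2*steps*dt):+.3f}")
print(f"ESO estimate    : {zhat[2]:+.3f}")
```

The controller then cancels the estimated disturbance, which is why ADRC needs the integral-chain form the singular-perturbation reduction provides.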

  12. Climate regulation enhances the value of second generation biofuel technology

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Steinbuks, J.; Tyner, W.

    2014-12-01

Commercial scale implementation of second generation (2G) biofuels has long been 'just over the horizon - perhaps a decade away'. However, with recent innovations and higher oil prices, we appear to be on the verge of finally seeing commercial scale implementations of cellulosic-to-liquid fuel conversion technologies. Interest in this technology derives from many quarters. Environmentalists see this as a way of reducing our carbon footprint; however, absent a global market for carbon emissions, private firms will not factor this into their investment decisions. Those interested in poverty and nutrition see this as a channel for lessening biofuels' impact on food prices. But what is 2G technology worth to society? How valuable are prospective improvements in this technology? And how are these valuations affected by future uncertainties, including climate regulation, climate change impacts, and energy prices? This paper addresses all of these questions. We employ FABLE, a dynamic optimization model for the world's land resources, which characterizes the optimal long run path for protected natural lands, managed forests, crop and livestock land use, energy extraction and biofuels over the period 2005-2105. By running this model twice for each future state of the world - once with 2G biofuels technology available and once without - we measure the contribution of the technology to global welfare. Given the uncertainty in how these technologies are likely to evolve, we consider a range of cost estimates - from optimistic to pessimistic. In addition to technological uncertainty, there is great uncertainty in the conditions characterizing our baseline for the 21st century. For each of the 2G technology scenarios, we therefore also consider a range of outcomes for key drivers of global land use, including: population, income, oil prices, climate change impacts and climate regulation. We find that the social valuation of 2G technologies depends critically on climate change regulations and future oil prices. In the base case with no climate policy and higher oil prices, the value of second generation biofuels is roughly $8 billion. With stringent climate change regulations in place, 2G biofuels are worth about fifty percent more.

  13. Project management lessons learned on SDIO's Delta Star and Single Stage Rocket Technology programs

    NASA Technical Reports Server (NTRS)

    Klevatt, Paul L.

    1992-01-01

The topics are presented in viewgraph form and include the following: a Delta Star (Delta 183) Program Overview, lessons learned, and rapid prototyping and the Single Stage Rocket Technology (SSRT) Program. The basic objectives of the Strategic Defense Initiative programs are to quickly reduce key uncertainties to a manageable range of parameters and solutions, and to yield results applicable to focusing subsequent research dollars on high-payoff areas.

  14. Aerothermodynamics and Turbulence

    DTIC Science & Technology

    2013-03-08

Topics include: Surface Heat Transfer and Detailed Flow Structure; Fuel Injection in a Scramjet Combustor; Reduced Uncertainty in Complex Flows; Addressing...hypersonic flight data to capture shock interaction unsteadiness; National Hypersonic Foundational Research Plan; Joint Technology Office...Hypersonics Basic Science Roadmap; Assessment of SOA and Future Research Directions; Ongoing Basic Research for Understanding and Controlling Noise.

  15. Coal supply and cost under technological and environmental uncertainty

    NASA Astrophysics Data System (ADS)

    Chan, Melissa

This thesis estimates available coal resources, recoverability, mining costs, environmental impacts, and environmental control costs for the United States under technological and environmental uncertainty. It argues for a comprehensive, well-planned research program that will resolve resource uncertainty and innovate new technologies to improve recovery and environmental performance. A stochastic process and cost model (constant 2005 dollars) for longwall, continuous, and surface mines based on current technology and mining practice data was constructed. It estimates production and cost ranges within 5-11 percent of 2006 prices and production rates. The model was applied to the National Coal Resource Assessment. Assuming the cheapest mining method is chosen to extract coal, 250-320 billion tons are recoverable. Two-thirds to all of the coal resource can be mined at a cost less than $4/mmBTU. If U.S. coal demand substantially increases, as projected in alternate Energy Information Administration (EIA) scenarios, resources might not last more than 100 years. By scheduling cost to meet EIA-projected demand, estimated cost uncertainty increases over time. It costs less than $15/ton to mine in the first 10 years of a 100-year time period, $10-30/ton in the following 50 years, and $15-90/ton thereafter. Environmental impacts assessed are subsidence from underground mines, surface mine pit area, erosion, acid mine drainage, and air pollutant and methane emissions. The analysis reveals that environmental impacts are significant and increasing as coal demand increases. Control technologies recommended to reduce these impacts are backfilling underground mines, surface pit reclamation, substitution of robotic underground mining systems for surface pit mining, soil replacement for erosion, placing barriers between exposed coal and the elements to avoid acid formation, and coalbed methane development to avoid methane emissions during mining. The costs to apply these technologies to meet more stringent environmental regulation scenarios are estimated. The results show that meeting these regulatory scenarios could increase mining costs two to six times over the business-as-usual cost, which could significantly affect the cost of coal-powered electricity generation. This thesis provides a first estimate of resource availability, mining cost, and environmental impact assessment and cost analysis. Available resource is not completely reported, so the available estimate is lower than the actual resource. Mining costs are optimized, so they provide a low estimate of potential costs. Environmental impact estimates are on the high end of the potential impact that may be incurred, because it is assumed that impact is unavoidable. Control costs vary. Estimated costs to control subsidence and surface mine pit impacts are suitable estimates of the cost to reduce land impacts. Erosion control and robotic mining system costs are lower, and methane and acid mine drainage control costs are higher, than they may be in the case that these impacts must be reduced.

  16. Active and Passive Hydrologic Tomographic Surveys:A Revolution in Hydrology (Invited)

    NASA Astrophysics Data System (ADS)

    Yeh, T. J.

    2013-12-01

Mathematical forward or inverse problems of flow through geological media always have unique solutions if the necessary conditions are given. Unique mathematical solutions to forward or inverse modeling of field problems are, however, always uncertain (an infinite number of possibilities) for many reasons. These include non-representativeness of the governing equations, inaccurate necessary conditions, multi-scale heterogeneity, scale discrepancies between observation and model, noise, and others. Conditional stochastic approaches, which derive the unbiased solution and quantify the solution uncertainty, are therefore most appropriate for forward and inverse modeling of hydrological processes. Conditioning using non-redundant data sets reduces uncertainty. In this presentation, we explain non-redundant data sets in cross-hole aquifer tests, and demonstrate that active hydraulic tomographic survey (using man-made excitations) is a cost-effective approach to collect the same type of, but non-redundant, data sets for reducing uncertainty in the inverse modeling. We subsequently show that including flux measurements (a non-redundant data set) collected in the same well setup as in hydraulic tomography improves the estimated hydraulic conductivity field. We finally conclude with examples and propositions regarding how to collect and analyze data intelligently by exploiting natural recurrent events (river stage fluctuations, earthquakes, lightning, etc.) as energy sources for basin-scale passive tomographic surveys. The development of information fusion technologies that integrate traditional point measurements and active/passive hydrogeophysical tomographic surveys, as well as advances in sensor, computing, and information technologies, may ultimately advance our capability of characterizing groundwater basins to achieve resolution far beyond the feat of current science and technology.

  17. STochastic Analysis of Technical Systems (STATS): A model for evaluating combined effects of multiple uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kranz, L.; VanKuiken, J.C.; Gillette, J.L.

    1989-12-01

    The STATS model, now modified to run on microcomputers, uses user-defined component uncertainties to calculate composite uncertainty distributions for systems or technologies. The program can be used to investigate uncertainties for a single technology or to compare two technologies. Although the term "technology" is used throughout the program screens, the program can accommodate very broad problem definitions. For example, electrical demand uncertainties, health risks associated with toxic material exposures, or traffic queuing delay times can be estimated. The terminology adopted in this version of STATS reflects the purpose of the earlier version, which was to aid in comparing advanced electrical generating technologies. A comparison of two clean coal technologies in two power plants is given as a case study illustration. 7 refs., 35 figs., 7 tabs.
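
    A minimal sketch of the STATS idea, combining user-defined component uncertainties into a composite distribution by Monte Carlo; the components and parameter values below are hypothetical stand-ins for a generating technology, not values from the report.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 50_000

        # User-defined component uncertainties (hypothetical)
        capital = rng.triangular(1500.0, 1800.0, 2500.0, N)  # $/kW
        heat_rate = rng.normal(9.8, 0.4, N)                  # MBtu/MWh
        fuel = rng.uniform(1.5, 3.0, N)                      # $/MBtu

        # Composite cost distribution via a simple fixed-charge-rate levelization
        fcr, hours = 0.12, 7000.0
        cost = capital * fcr / hours * 1000.0 + heat_rate * fuel  # $/MWh
        print(f"median {np.median(cost):.1f} $/MWh, 90% interval "
              f"{np.percentile(cost, 5):.1f}-{np.percentile(cost, 95):.1f} $/MWh")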

  18. Evaluation and Uncertainty of a New Method to Detect Suspected Nuclear and WMD Activity: Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurzeja, R.; Werth, D.; Buckley, R.

    The Atmospheric Technology Group at SRNL developed a new method to detect signals from Weapons of Mass Destruction (WMD) activities in a time series of chemical measurements at a downwind location. This method was tested with radioxenon measured in Russia and Japan after the 2013 underground test in North Korea. This LDRD calculated the uncertainty in the method with the measured data, and also for a case with the signal reduced to one-tenth of its measured value. The research showed that the uncertainty in the calculated probability of origin from the North Korean test site was small enough to confirm the test. The method was also well-behaved for small signal strengths.

  19. Failure Bounding And Sensitivity Analysis Applied To Monte Carlo Entry, Descent, And Landing Simulations

    NASA Technical Reports Server (NTRS)

    Gaebler, John A.; Tolson, Robert H.

    2010-01-01

    In the study of entry, descent, and landing, Monte Carlo sampling methods are often employed to study the uncertainty in the designed trajectory. The large number of uncertain inputs and outputs, coupled with complicated non-linear models, can make interpretation of the results difficult. Three methods that provide statistical insights are applied to an entry, descent, and landing simulation. The advantages and disadvantages of each method are discussed in terms of the insights gained versus the computational cost. The first method investigated was failure domain bounding, which aims to reduce the computational cost of assessing the failure probability. Next, a variance-based sensitivity analysis was studied for its ability to identify which input variable uncertainty has the greatest impact on the uncertainty of an output. Finally, probabilistic sensitivity analysis was used to calculate certain sensitivities at a reduced computational cost. These methods produce valuable information that identifies critical mission parameters and needs for new technology, but generally at a significant computational cost.
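
    For the variance-based part, a compact sketch using the Saltelli pick-freeze estimator of first-order Sobol indices on a toy response; the model and input ranges are illustrative, not the paper's entry, descent, and landing simulation.

        import numpy as np

        def model(x):
            # Toy nonlinear response standing in for a trajectory output
            return x[:, 0] + 2.0 * x[:, 1] ** 2 + np.sin(np.pi * x[:, 2])

        rng = np.random.default_rng(1)
        N, d = 20_000, 3
        A = rng.uniform(-1.0, 1.0, (N, d))
        B = rng.uniform(-1.0, 1.0, (N, d))
        fA, fB = model(A), model(B)
        varY = np.var(np.concatenate([fA, fB]))

        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]                          # resample only input i
            Si = np.mean(fB * (model(ABi) - fA)) / varY  # Saltelli (2010) estimator
            print(f"first-order Sobol index S_{i} ~ {Si:.2f}")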

  20. Foregone benefits of important food crop improvements in Sub-Saharan Africa

    PubMed Central

    2017-01-01

    A number of new crops have been developed that address important traits of particular relevance for smallholder farmers in Africa. Scientists, policy makers, and other stakeholders have raised concerns that the approval process for these new crops causes delays that are often scientifically unjustified. This article develops a real option model for the optimal regulation of a risky technology that enhances economic welfare and reduces malnutrition. We consider gradual adoption of the technology and show that delaying approval reduces uncertainty about perceived risks of the technology. Optimal conditions for approval incorporate parameters of the stochastic processes governing the dynamics of risk. The model is applied to three cases of improved crops, which either are, or are expected to be, delayed by the regulatory process. The benefits and costs of the crops are presented in a partial equilibrium that considers changes in adoption over time and the foregone benefits caused by a delay in approval under irreversibility and uncertainty. We derive the equilibrium conditions where the net benefits of the technology equal the costs that would justify a delay. The sooner information about the safety of the technology arrives, the lower the costs needed to justify a delay, i.e., it pays more to delay. The costs of a delay can be substantial: e.g., a one-year delay in approval of the pod-borer-resistant cowpea in Nigeria would cost the country about 33 million to 46 million USD and between 100 and 3,000 lives. PMID:28749984
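
    The discounting logic behind such foregone-benefit numbers can be sketched in a few lines; the logistic adoption curve, benefit level, and discount rate below are invented for illustration and are not the paper's calibrated values.

        import numpy as np

        years = np.arange(25)
        adoption = 1.0 / (1.0 + np.exp(-(years - 6) / 2.0))  # gradual adoption
        benefits = 60.0 * adoption       # hypothetical net benefits, million USD/yr
        r = 0.05                         # discount rate

        def npv(stream, delay_years):
            t = np.arange(len(stream)) + delay_years
            return np.sum(stream / (1.0 + r) ** t)

        loss = npv(benefits, 0) - npv(benefits, 1)
        print(f"foregone benefit of a one-year approval delay ~ {loss:.0f} million USD")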

  1. Monopolistic competition and the health care sector.

    PubMed

    Hilsenrath, P

    1991-07-01

    The model of monopolistic competition is appropriate for describing the behavior of the health care sector in the United States. Uncertainty about the quality of medical and related services promotes product differentiation, especially when consumers do not bear the full costs of care. New technologies can be used to signal quality even when their clinical usefulness is unproven. Recent cost containment measures may reduce the employment of ineffective technologies but may also inhibit the adoption of genuinely useful developments.

  2. Fission Power System Technology for NASA Exploration Missions

    NASA Technical Reports Server (NTRS)

    Mason, Lee; Houts, Michael

    2011-01-01

    Under the NASA Exploration Technology Development Program, and in partnership with the Department of Energy (DOE), NASA is conducting a project to mature Fission Power System (FPS) technology. A primary project goal is to develop viable system options to support future NASA mission needs for nuclear power. The main FPS project objectives are as follows: 1) Develop FPS concepts that meet expected NASA mission power requirements at reasonable cost with added benefits over other options. 2) Establish a hardware-based technical foundation for FPS design concepts and reduce overall development risk. 3) Reduce the cost uncertainties for FPS and establish greater credibility for flight system cost estimates. 4) Generate the key products to allow NASA decision-makers to consider FPS as a preferred option for flight development. In order to achieve these goals, the FPS project has two main thrusts: concept definition and risk reduction. Under concept definition, NASA and DOE are performing trade studies, defining requirements, developing analytical tools, and formulating system concepts. A typical FPS consists of the reactor, shield, power conversion, heat rejection, and power management and distribution (PMAD). Studies are performed to identify the desired design parameters for each subsystem that allow the system to meet the requirements with reasonable cost and development risk. Risk reduction provides the means to evaluate technologies in a laboratory test environment. Non-nuclear hardware prototypes are built and tested to verify performance expectations, gain operating experience, and resolve design uncertainties.

  3. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and program development budget and schedule requirements. However, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative and nondescript estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and their subsequent impacts on capability, budget, and schedule requirements led to the conclusion that an analysis process coupling a probabilistic technique, such as Monte Carlo simulation, with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific requirement constraints to be established. These probability-of-success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and a computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. To demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring a carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of the methodology, as well as its limitations, which should be addressed in the future to narrow the gap between the current and desired states.
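
    A minimal sketch of the probabilistic core of such a method: Monte Carlo draws of cost, schedule, and performance whose spreads widen at lower maturity, reduced to a single probability of meeting all requirement constraints. The TRL-to-spread mapping and thresholds are hypothetical, not ENTERPRISE's calibrated models.

        import numpy as np

        rng = np.random.default_rng(7)
        N = 100_000

        trl_sigma = {4: 0.35, 6: 0.20, 8: 0.08}  # hypothetical maturity -> spread
        sigma = trl_sigma[6]

        cost = rng.lognormal(np.log(250.0), sigma, N)   # development cost, $M
        months = rng.lognormal(np.log(48.0), sigma, N)  # development schedule
        perf = rng.normal(1.0, sigma / 2.0, N)          # normalized capability

        # Requirements robustness: probability of meeting all constraints at once
        ok = (cost <= 300.0) & (months <= 54.0) & (perf >= 0.9)
        print(f"P(requirements met) ~ {ok.mean():.2f}")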

  4. Modeling sustainability in renewable energy supply chain systems

    NASA Astrophysics Data System (ADS)

    Xie, Fei

    This dissertation aims at modeling the sustainability of renewable fuel supply chain systems against emerging challenges. In particular, the dissertation focuses on biofuel supply chain system design and develops advanced modeling frameworks and corresponding solution methods for tackling challenges in sustaining biofuel supply chain systems. These challenges include: (1) integrating "environmental thinking" into long-term biofuel supply chain planning; (2) adopting multimodal transportation to mitigate seasonality in biofuel supply chain operations; (3) providing strategies for hedging against uncertainty in conversion technology; and (4) developing methodologies for long-term sequential planning of the biofuel supply chain under uncertainties. All models are mixed-integer programs, which also involve multi-objective programming and two-stage/multistage stochastic programming methods. In particular, for long-term sequential planning under uncertainties, to reduce the computational challenge posed by the exponential expansion of the scenario tree, I also developed an efficient ND-Max method that outperforms CPLEX and the Nested Decomposition method. Through result analysis of four independent studies, it is found that the proposed modeling frameworks can effectively improve economic performance, enhance environmental benefits, and reduce risks due to system uncertainties for biofuel supply chain systems.
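
    A toy two-stage stochastic program in the spirit of such recourse models, solved by enumeration to stay solver-free; the scenario yields, probabilities, prices, and penalties are invented.

        import numpy as np

        yields = np.array([800.0, 1000.0, 1200.0])  # biomass scenarios, tons
        probs = np.array([0.3, 0.4, 0.3])

        build_cost = 50.0  # $/ton of capacity (annualized, first-stage decision)
        margin = 120.0     # $/ton profit on converted biomass (second stage)
        idle_pen = 40.0    # $/ton penalty for idle capacity (illustrative)

        def expected_profit(capacity):
            converted = np.minimum(capacity, yields)  # recourse decision
            idle = capacity - converted
            profit = margin * converted - idle_pen * idle - build_cost * capacity
            return float(np.dot(probs, profit))

        caps = np.linspace(600.0, 1400.0, 161)
        best = caps[np.argmax([expected_profit(c) for c in caps])]
        print(f"optimal first-stage capacity ~ {best:.0f} tons")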

  5. Reducing the uncertainty in robotic machining by modal analysis

    NASA Astrophysics Data System (ADS)

    Alberdi, Iñigo; Pelegay, Jose Angel; Arrazola, Pedro Jose; Ørskov, Klaus Bonde

    2017-10-01

    The use of industrial robots for machining could lead to large cost and energy savings for the manufacturing industry. Machining robots offer several advantages with respect to CNC machines, such as flexibility, a wide working space, adaptability, and relatively low cost. However, some drawbacks are preventing widespread adoption of robotic solutions, namely lower stiffness, vibration/chatter problems, and lower accuracy and repeatability. Because of these issues, conservative cutting parameters are normally chosen, resulting in a low material removal rate (MRR). In this article, an example of a modal analysis of a robot is presented. For that purpose, tap-testing technology is introduced, which aims at maximizing productivity, reducing the uncertainty in the selection of cutting parameters, and offering a stable process free from chatter vibrations.
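
    Modal parameters identified by tap testing feed directly into chatter stability limits. The sketch below evaluates the classical regenerative-chatter bound a_lim = -1 / (2 Ks Re G(iw)) for a single dominant mode; the modal values are illustrative of a compliant robot, not the paper's measurements.

        import numpy as np

        k = 8.0e6                # modal stiffness, N/m
        zeta = 0.03              # damping ratio
        wn = 2.0 * np.pi * 55.0  # natural frequency, rad/s
        Ks = 1.5e9               # specific cutting force coefficient, N/m^2

        w = np.linspace(0.5 * wn, 2.0 * wn, 5000)
        G = wn**2 / (k * (wn**2 - w**2 + 2j * zeta * wn * w))  # 1-DOF FRF

        ReG = G.real
        a_lim = -1.0 / (2.0 * Ks * ReG[ReG < 0.0])  # chatter needs Re G < 0
        print(f"minimum stable depth of cut ~ {a_lim.min() * 1e3:.2f} mm")
        print(f"closed-form check           ~ {2*k*zeta*(1+zeta)/Ks * 1e3:.2f} mm")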

  6. Technology Development Activities for the Space Environment and its Effects on Spacecraft

    NASA Technical Reports Server (NTRS)

    Kauffman, Billy; Hardage, Donna; Minor, Jody; Barth, Janet; LaBel, Ken

    2003-01-01

    Reducing size and weight of spacecraft, along with demanding increased performance capabilities, introduces many uncertainties in the engineering design community on how emerging microelectronics will perform in space. The engineering design community is forever behind on obtaining and developing new tools and guidelines to mitigate the harmful effects of the space environment. Adding to this complexity is the push to use Commercial-off-the-shelf (COTS) and shrinking microelectronics behind less shielding and the potential usage of unproven technologies such as large solar sail structures and nuclear electric propulsion. In order to drive down these uncertainties, various programs are working together to avoid duplication, save what resources are available in this technical area and possess a focused agenda to insert these new developments into future mission designs. This paper will describe the relationship between the Living With a Star (LWS): Space Environment Testbeds (SET) Project and NASA's Space Environments and Effects (SEE) Program and their technology development activities funded as a result from the recent SEE Program's NASA Research Announcement.

  7. Qualification Testing Versus Quantitative Reliability Testing of PV - Gaining Confidence in a Rapidly Changing Technology: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L

    Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, the consequence of a possible failure, variability in manufacturing, installation, and operation, as well as uncertainty in the measured acceleration factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and the prioritization of research, to provide a technical basis both for the standards and for the analysis related to their application. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.
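
    To see how uncertainty in a measured acceleration factor propagates into lifetime predictions, consider an Arrhenius sketch; the activation-energy distribution and temperatures are assumptions for illustration, not values from any standard.

        import numpy as np

        rng = np.random.default_rng(3)
        N = 100_000
        kB = 8.617e-5  # Boltzmann constant, eV/K

        Ea = rng.normal(0.7, 0.1, N)      # hypothetical activation energy, eV
        T_test, T_field = 358.15, 318.15  # 85 C chamber vs ~45 C field, in K

        # Arrhenius acceleration factor mapping chamber hours to field hours
        AF = np.exp(Ea / kB * (1.0 / T_field - 1.0 / T_test))
        field_years = 1000.0 * AF / 8766.0  # field-equivalent of a 1000 h test
        print(f"median {np.median(field_years):.1f} y, 90% interval "
              f"{np.percentile(field_years, 5):.1f}-"
              f"{np.percentile(field_years, 95):.1f} y")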

  8. A critical review of the use of technology to provide psychosocial support for children and young people with long-term conditions.

    PubMed

    Aldiss, Susie; Baggott, Christina; Gibson, Faith; Mobbs, Sarah; Taylor, Rachel M

    2015-01-01

    Advances in technology have offered health professionals alternative mediums of providing support to patients with long-term conditions. This critical review evaluated and assessed the benefit of electronic media technologies in supporting children and young people with long-term conditions. Of 664 references identified, 40 met the inclusion criteria. Supportive technology tended to increase disease-related knowledge and improve aspects of psychosocial function. Supportive technology did not improve quality of life, reduce health service use or decrease school absences. The poor methodological quality of current evidence and lack of involvement of users in product development contribute to the uncertainty that supportive technology is beneficial. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Directed International Technological Change and Climate Policy: New Methods for Identifying Robust Policies Under Conditions of Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Molina-Perez, Edmundo

    It is widely recognized that international environmental technological change is key to reducing the rapidly rising greenhouse gas emissions of emerging nations. In 2010, the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) agreed to the creation of the Green Climate Fund (GCF). This new multilateral organization has been created with the collective contributions of COP members and has been tasked with directing over USD 100 billion per year towards investments that can enhance the development and diffusion of clean energy technologies in both advanced and emerging nations (Helm and Pichler, 2015). The landmark agreement arrived at during COP 21 has reaffirmed the key role that the GCF plays in enabling climate mitigation, as it is now necessary to align large-scale climate financing efforts with the long-term goals agreed at Paris 2015. This study argues that because of the incomplete understanding of the mechanics of international technological change, the multiplicity of policy options, and ultimately the presence of deep uncertainty about climate and technological change, climate financing institutions such as the GCF require new analytical methods for designing long-term robust investment plans. Motivated by these challenges, this dissertation shows that the application of new analytical methods, such as Robust Decision Making (RDM) and Exploratory Modeling (Lempert, Popper and Bankes, 2003), to the study of international technological change and climate policy provides useful insights that can be used for designing a robust architecture of international technological cooperation for climate change mitigation. For this study I developed an exploratory dynamic integrated assessment model (EDIAM), which is used as the scenario generator in a large computational experiment. The scope of the experimental design considers a broad set of climate and technological scenarios. These scenarios combine five sources of uncertainty: climate change, the elasticity of substitution between renewable and fossil energy, and three different sources of technological uncertainty (i.e., R&D returns, innovation propensity, and technological transferability). The performance of eight different GCF and non-GCF based policy regimes is evaluated in light of various end-of-century climate policy targets. I then combine traditional scenario discovery data mining methods (Bryant and Lempert, 2010) with high-dimensional stacking methods (Suzuki, Stem and Manzocchi, 2015; Taylor et al., 2006; LeBlanc, Ward and Wittels, 1990) to quantitatively characterize the conditions under which it is possible to stabilize greenhouse gas emissions and keep temperature rise below 2°C before the end of the century. Finally, I describe a method by which the results of scenario discovery can be combined with high-dimensional stacking to construct a dynamic architecture of low-cost technological cooperation. This dynamic architecture consists of adaptive pathways (Kwakkel, Haasnoot and Walker, 2014; Haasnoot et al., 2013) which begin with carbon taxation across both regions as a critical near-term action. In subsequent phases, different forms of cooperation are triggered depending on the unfolding climate and technological conditions. I show that there is no single policy regime that dominates over the entire uncertainty space. Instead, I find that it is possible to combine these different architectures into a dynamic framework for technological cooperation across regions that can be adapted to unfolding climate and technological conditions, leading to a greater rate of success and lower costs in meeting the end-of-century climate change objectives agreed at the 2015 Paris Conference of the Parties.
    Keywords: international technological change, emerging nations, climate change, technological uncertainties, Green Climate Fund.

  10. A methodology for spacecraft technology insertion analysis balancing benefit, cost, and risk

    NASA Astrophysics Data System (ADS)

    Bearden, David Allen

    Emerging technologies are changing the way space missions are developed and implemented. Technology development programs are proceeding with the goal of enhancing spacecraft performance and reducing mass and cost. However, it is often the case that technology insertion assessment activities, in the interest of maximizing performance and/or mass reduction, do not consider synergistic system-level effects. Furthermore, even though technical risks are often identified as a large cost and schedule driver, many design processes ignore the effects of cost and schedule uncertainty. This research is based on the hypothesis that technology selection is a problem of balancing interrelated (and potentially competing) objectives. Current spacecraft technology selection approaches are summarized, and a Methodology for Evaluating and Ranking Insertion of Technology (MERIT) that expands on these practices to attack otherwise unsolved problems is demonstrated. MERIT combines the modern techniques of technology maturity measures, parametric models, genetic algorithms, and risk assessment (cost and schedule) in a unique manner to resolve very difficult issues including: user-generated uncertainty, relationships between cost/schedule and complexity, and technology "portfolio" management. While the methodology is sufficiently generic that it may in theory be applied to a number of technology insertion problems, this research focuses on application to the specific case of small (<500 kg) satellite design. Small satellite missions are of particular interest because they are often developed under rigid programmatic (cost and schedule) constraints and are motivated to introduce advanced technologies into the design. MERIT is demonstrated for programs procured under varying conditions and constraints such as stringent performance goals, not-to-exceed costs, or hard schedule requirements. MERIT's contributions to the engineering community are its: unique coupling of the aspects of performance, cost, and schedule; assessment of system-level impacts of technology insertion; procedures for estimating uncertainties (risks) associated with advanced technology; and application of heuristics to facilitate informed system-level technology utilization decisions earlier in the conceptual design phase. MERIT extends the state of the art in technology insertion assessment and selection practice and, if adopted, may aid designers in determining the configuration of complex systems that meet essential requirements in a timely, cost-effective manner.

  11. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    PubMed

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors.
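
    The behavioral account can be caricatured in a few lines: a Bayesian learner updates its belief from an outcome only when the uncertainty is reducible, and ignores even surprising outcomes otherwise. This Beta-Bernoulli sketch is a simplification of the paper's model, not its actual specification.

        import numpy as np

        def update(a, b, outcome, reducible):
            # Update the Beta(a, b) belief only for reducible uncertainty;
            # under irreducible uncertainty the outcome is uninformative noise.
            if reducible:
                return a + outcome, b + (1 - outcome)
            return a, b

        rng = np.random.default_rng(5)
        p_true, (a, b) = 0.8, (1.0, 1.0)  # true reward rate, uniform prior
        for _ in range(20):
            a, b = update(a, b, int(rng.random() < p_true), reducible=True)
        print(f"posterior mean after reducible signals: {a / (a + b):.2f}")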

  12. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty

    PubMed Central

    2017-01-01

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. PMID:28626019

  13. FIELD ANALYTICAL METHODS: ADVANCED FIELD MONITORING METHODS DEVELOPMENT AND EVALUATION OF NEW AND INNOVATIVE TECHNOLOGIES THAT SUPPORT THE SITE CHARACTERIZATION AND MONITORING REQUIREMENTS OF THE SUPERFUND PROGRAM.

    EPA Science Inventory

    The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...

  14. Valuing flexibilities in the design of urban water management systems.

    PubMed

    Deng, Yinghan; Cardin, Michel-Alexandre; Babovic, Vladan; Santhanakrishnan, Deepak; Schmitter, Petra; Meshgi, Ali

    2013-12-15

    Climate change and rapid urbanization require decision-makers to develop a long-term forward assessment of sustainable urban water management projects. This is further complicated by the difficulties of assessing sustainable designs and various design scenarios from an economic standpoint. A conventional valuation approach for urban water management projects, like Discounted Cash Flow (DCF) analysis, fails to incorporate uncertainties such as the amount of rainfall, the unit cost of water, and other uncertainties associated with future changes in technological domains. Such an approach also fails to include the value of flexibility, which enables managers to adapt and reconfigure systems over time as uncertainty unfolds. This work describes an integrated framework to value investments in urban water management systems under uncertainty. It also extends the conventional DCF analysis through explicit consideration of flexibility in systems design and management. The approach incorporates flexibility as intelligent decision-making mechanisms that enable systems to avoid future downside risks and increase opportunities for upside gains over a range of possible futures. A water catchment area in Singapore was chosen to assess the value of a flexible extension of standard drainage canals and a flexible deployment of a novel water catchment technology based on green roofs and porous pavements. Results show that integrating uncertainty and flexibility explicitly into the decision-making process can reduce initial capital expenditure, improve value for investment, and enable decision-makers to learn more about system requirements during the lifetime of the project. Copyright © 2013 Elsevier Ltd. All rights reserved.
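
    A stylized comparison of a rigid design against a design carrying an expansion option, valued by Monte Carlo over uncertain demand; every magnitude below (demand process, capacities, costs, trigger level) is invented for illustration.

        import numpy as np

        rng = np.random.default_rng(11)
        N, T, r = 20_000, 20, 0.04
        demand = 100.0 + np.cumsum(rng.normal(2.0, 6.0, (N, T)), axis=1)
        disc = (1.0 + r) ** -np.arange(1, T + 1)

        def npv_fixed():
            # Build full capacity (160) up front for 80
            return 0.5 * np.minimum(demand, 160.0) @ disc - 80.0

        def npv_flexible():
            # Build 100 for 55; expand to 160 for 35 once demand exceeds 120
            cap = np.full(N, 100.0)
            expanded = np.zeros(N, bool)
            cash = np.empty((N, T))
            exp_cost = np.zeros(N)
            for t in range(T):
                trigger = (~expanded) & (demand[:, t] > 120.0)
                exp_cost += np.where(trigger, 35.0 * disc[t], 0.0)
                expanded |= trigger
                cap[expanded] = 160.0
                cash[:, t] = 0.5 * np.minimum(demand[:, t], cap)
            return cash @ disc - 55.0 - exp_cost

        print(f"fixed    E[NPV] = {npv_fixed().mean():.1f}")
        print(f"flexible E[NPV] = {npv_flexible().mean():.1f}")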

  15. Exploring Best Practice Skills to Predict Uncertainties in Venture Capital Investment Decision-Making

    NASA Astrophysics Data System (ADS)

    Blum, David Arthur

    Algae biodiesel is the sole sustainable and abundant transportation fuel source that can replace petrol diesel use; however, high competition and economic uncertainties exist, influencing independent venture capital (IVC) decision making. Technology, market, management, and government action uncertainties influence competition and economic uncertainties in the venture capital industry. The purpose of this qualitative case study was to identify the best practice skills at IVC firms for predicting uncertainty between early and late funding stages. The basis of the study was real options theory, a framework used to evaluate and understand the economic and competition uncertainties inherent in natural resource investment and energy derived from plant-based oils. Data were collected from interviews of 24 venture capital partners based in the United States who invest in algae and other renewable energy solutions. Data were analyzed by coding and theme development interwoven with the conceptual framework. Eight themes emerged: (a) expected returns model, (b) due diligence, (c) invest in specific sectors, (d) reduced uncertainty-late stage, (e) coopetition, (f) portfolio firm relationships, (g) differentiation strategy, and (h) modeling uncertainty and best practice. The most noteworthy finding was that predicting uncertainty at the early stage was impractical; at the expansion and late funding stages, however, predicting uncertainty was possible. The implications of these findings will affect social change by providing independent venture capitalists with best practice skills to increase successful exits, lessen uncertainty, and encourage increased funding of renewable energy firms, contributing to cleaner and healthier communities throughout the United States.

  16. Space Environments and Effects (SEE) Program: Spacecraft Charging Technology Development Activities

    NASA Technical Reports Server (NTRS)

    Kauffman, Billy; Hardage, Donna; Minor, Jody

    2003-01-01

    Reducing size and weight of spacecraft, along with demanding increased performance capabilities, introduces many uncertainties in the engineering design community on how materials and spacecraft systems will perform in space. The engineering design community is forever behind on obtaining and developing new tools and guidelines to mitigate the harmful effects of the space environment. Adding to this complexity is the continued push to use Commercial-off-the-shelf (COTS) microelectronics, potential usage of unproven technologies such as large solar sail structures and nuclear electric propulsion. In order to drive down these uncertainties, various programs are working together to avoid duplication, save what resources are available in this technical area and possess a focused agenda to insert these new developments into future mission designs. This paper will introduce the SEE Program, briefly discuss past and currently sponsored spacecraft charging activities and possible future endeavors.

  17. Space Environments and Effects (SEE) Program: Spacecraft Charging Technology Development Activities

    NASA Technical Reports Server (NTRS)

    Kauffman, B.; Hardage, D.; Minor, J.

    2004-01-01

    Reducing size and weight of spacecraft, along with demanding increased performance capabilities, introduces many uncertainties in the engineering design community on how materials and spacecraft systems will perform in space. The engineering design community is forever behind on obtaining and developing new tools and guidelines to mitigate the harmful effects of the space environment. Adding to this complexity is the continued push to use Commercial-off-the-Shelf (COTS) microelectronics, potential usage of unproven technologies such as large solar sail structures and nuclear electric propulsion. In order to drive down these uncertainties, various programs are working together to avoid duplication, save what resources are available in this technical area and possess a focused agenda to insert these new developments into future mission designs. This paper will introduce the SEE Program, briefly discuss past and currently sponsored spacecraft charging activities and possible future endeavors.

  18. Assessment of the impact of sampler change on the uncertainty related to geothermal water sampling

    NASA Astrophysics Data System (ADS)

    Wątor, Katarzyna; Mika, Anna; Sekuła, Klaudia; Kmiecik, Ewa

    2018-02-01

    The aim of this study is to assess the impact of a change of samplers on the uncertainty associated with the process of geothermal water sampling. The study was carried out on geothermal water exploited in the Podhale region, southern Poland (Małopolska province). To estimate the uncertainty associated with sampling, the results of determinations of metasilicic acid (H2SiO3) in normal and duplicate samples collected in two series were used (in each series the samples were collected by a qualified sampler). Chemical analyses were performed using the ICP-OES method in the certified Hydrogeochemical Laboratory of the Hydrogeology and Engineering Geology Department at the AGH University of Science and Technology in Krakow (Certificate of Polish Centre for Accreditation No. AB 1050). To evaluate the uncertainty arising from sampling, an empirical approach was implemented, based on double analysis of normal and duplicate samples taken from the same well in each series of testing. The results were analyzed using ROBAN software, based on the robust analysis of variance technique (rANOVA). The research showed that with qualified and experienced samplers, the uncertainty connected with sampling can be reduced, resulting in a small measurement uncertainty.
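
    The duplicate-sample logic can be illustrated compactly. The sketch below uses the classical duplicate-pair variance estimate rather than the robust ANOVA (rANOVA) actually applied in the study, and the H2SiO3 values are hypothetical.

        import numpy as np

        # Hypothetical H2SiO3 results (mg/L) for normal/duplicate sample pairs
        normal = np.array([48.2, 47.9, 49.1, 48.5, 47.6, 48.8])
        duplicate = np.array([48.0, 48.3, 48.7, 48.9, 47.9, 48.4])

        d = normal - duplicate
        s_meas = np.sqrt(np.sum(d**2) / (2 * len(d)))  # sampling+analysis std dev
        mean = np.mean(np.concatenate([normal, duplicate]))
        print(f"s = {s_meas:.3f} mg/L, expanded uncertainty "
              f"U = {2.0 * 100.0 * s_meas / mean:.2f}% (k=2)")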

  19. Accuracy requirements and uncertainties in radiotherapy: a report of the International Atomic Energy Agency.

    PubMed

    van der Merwe, Debbie; Van Dyk, Jacob; Healy, Brendan; Zubizarreta, Eduardo; Izewska, Joanna; Mijnheer, Ben; Meghzifene, Ahmed

    2017-01-01

    Radiotherapy technology continues to advance and the expectation of improved outcomes requires greater accuracy in various radiotherapy steps. Different factors affect the overall accuracy of dose delivery. Institutional comprehensive quality assurance (QA) programs should ensure that uncertainties are maintained at acceptable levels. The International Atomic Energy Agency has recently developed a report summarizing the accuracy achievable and the suggested action levels, for each step in the radiotherapy process. Overview of the report: The report seeks to promote awareness and encourage quantification of uncertainties in order to promote safer and more effective patient treatments. The radiotherapy process and the radiobiological and clinical frameworks that define the need for accuracy are depicted. Factors that influence uncertainty are described for a range of techniques, technologies and systems. Methodologies for determining and combining uncertainties are presented, and strategies for reducing uncertainties through QA programs are suggested. The role of quality audits in providing international benchmarking of achievable accuracy and realistic action levels is also discussed. The report concludes with nine general recommendations: (1) Radiotherapy should be applied as accurately as reasonably achievable, technical and biological factors being taken into account. (2) For consistency in prescribing, reporting and recording, recommendations of the International Commission on Radiation Units and Measurements should be implemented. (3) Each institution should determine uncertainties for their treatment procedures. Sample data are tabulated for typical clinical scenarios with estimates of the levels of accuracy that are practically achievable and suggested action levels. (4) Independent dosimetry audits should be performed regularly. (5) Comprehensive quality assurance programs should be in place. (6) Professional staff should be appropriately educated and adequate staffing levels should be maintained. (7) For reporting purposes, uncertainties should be presented. (8) Manufacturers should provide training on all equipment. (9) Research should aid in improving the accuracy of radiotherapy. Some example research projects are suggested.

  20. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, the drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies, local climate change impacts, and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty in the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainty is not dominant when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken. In this case, the major part of the uncertainty in the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
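
    The cascade can be sketched as a Monte Carlo over the bulk uncertainty sources with a crude variance attribution; the distributions and the assumed 70% risk reduction are illustrative, not the Odense case values.

        import numpy as np

        rng = np.random.default_rng(9)
        N = 50_000

        climate = rng.lognormal(np.log(1.3), 0.15, N)  # rainfall-intensity factor
        runoff = rng.normal(1.0, 0.10, N)              # hydrological model factor
        damage = rng.lognormal(0.0, 0.25, N)           # stage-damage / repair cost
        capex = rng.normal(10.0, 1.5, N)               # measure cost, MEUR
        rate = rng.uniform(0.02, 0.06, N)              # discount rate

        ead = 1.2 * climate * runoff * damage          # expected annual damage
        annuity = (1.0 - (1.0 + rate) ** -50) / rate   # 50-year horizon
        npv = 0.7 * ead * annuity - capex              # measure avoids 70% of risk

        for name, x in [("climate", climate), ("runoff", runoff),
                        ("damage", damage), ("capex", capex), ("rate", rate)]:
            rho = np.corrcoef(x, npv)[0, 1]
            print(f"{name:8s} variance share ~ {rho**2:.2f}")
        print(f"P(NPV > 0) = {(npv > 0).mean():.2f}")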

  1. Satellite Power Systems (SPS) concept definition study exhibit C. Volume 3: Experimental verification definition

    NASA Technical Reports Server (NTRS)

    1979-01-01

    An environmentally oriented microwave technology exploratory research program aimed at reducing the uncertainty associated with microwave power system critical technical issues is described. Topics discussed include: (1) Solar Power Satellite System (SPS) development plan elements; (2) critical technology issues related to the SPS preliminary reference configuration; (3) pilot plant to demonstrate commercial viability of the SPS system; and (4) research areas required to demonstrate feasibility of the SPS system. Progress in the development of advanced GaAs solar cells is reported along with a power distribution subsystem.

  2. [Traditional Chinese Medicine data management policy in big data environment].

    PubMed

    Liang, Yang; Ding, Chang-Song; Huang, Xin-di; Deng, Le

    2018-02-01

    As traditional data management models cannot effectively manage the massive data in traditional Chinese medicine (TCM), owing to the uncertainty of data object attributes and the diversity and abstraction of data representation, a management strategy for TCM data based on big data technology is proposed. Based on the true characteristics of TCM data, this strategy addresses the uncertainty of data object attributes in TCM information and the non-uniformity of data representation by using the modeless properties of stored objects in big data technology. A hybrid indexing mode was also used to resolve the conflicts brought about by different storage modes in the indexing process, with powerful query processing of massive data through an efficient parallel MapReduce process. The theoretical analysis provided the management framework and its key technology, while its performance was tested on Hadoop using several common traditional Chinese medicines and prescriptions from a practical TCM data source. Results showed that this strategy can effectively solve the storage problem of TCM information, with good performance in query efficiency, completeness, and robustness. Copyright© by the Chinese Pharmaceutical Association.

  3. Robust Rate Maximization for Heterogeneous Wireless Networks under Channel Uncertainties

    PubMed Central

    Xu, Yongjun; Hu, Yuan; Li, Guoquan

    2018-01-01

    Heterogeneous wireless networks are a promising technology in next-generation wireless communication networks and have been shown to efficiently reduce the blind areas of mobile communication and improve network coverage compared with traditional wireless communication networks. In this paper, a robust power allocation problem for a two-tier heterogeneous wireless network is formulated based on orthogonal frequency-division multiplexing technology. Under the consideration of imperfect channel state information (CSI), the robust sum-rate maximization problem is built while avoiding severe cross-tier interference to the macrocell user and maintaining the minimum rate requirement of each femtocell user. To be practical, both the channel estimation errors from the femtocells to the macrocell and the link uncertainties of each femtocell user are simultaneously considered in terms of user outage probabilities. The optimization problem is analyzed under no CSI feedback, using a cumulative distribution function, and under partial CSI with a Gaussian distribution of the channel estimation error. The robust optimization problem is converted into a convex optimization problem, which is solved using Lagrange dual theory and a subgradient algorithm. Simulation results demonstrate the effectiveness of the proposed algorithm and the impact of channel uncertainties on system performance. PMID:29466315

  4. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

    The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which major sources of uncertainty remain in predicting the level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as in the technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.

  5. Validation of exposure time for discharge measurements made with two bottom-tracking acoustic doppler current profilers

    USGS Publications Warehouse

    Czuba, J.A.; Oberg, K.

    2008-01-01

    Previous work by Oberg and Mueller of the U.S. Geological Survey in 2007 concluded that exposure time (total time spent sampling the flow) is a critical factor in reducing measurement uncertainty. In a subsequent paper, Oberg and Mueller validated these conclusions using one set of data to show that the effect of exposure time on the uncertainty of the measured discharge is independent of stream width, depth, and range of boat speeds. Analysis of eight StreamPro acoustic Doppler current profiler (ADCP) measurements indicate that they fall within and show a similar trend to the Rio Grande ADCP data previously reported. Four special validation measurements were made for the purpose of verifying the conclusions of Oberg and Mueller regarding exposure time for Rio Grande and StreamPro ADCPs. Analysis of these measurements confirms that exposure time is a critical factor in reducing measurement uncertainty and is independent of stream width, depth, and range of boat speeds. Furthermore, it appears that the relation between measured discharge uncertainty and exposure time is similar for both Rio Grande and StreamPro ADCPs. These results are applicable to ADCPs that make use of broadband technology using bottom-tracking to obtain the boat velocity. Based on this work, a minimum of two transects should be collected with an exposure time for all transects greater than or equal to 720 seconds in order to achieve an uncertainty of ±5 percent when using bottom-tracking ADCPs. © 2008 IEEE.
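
    The averaging effect behind the exposure-time guidance can be sketched under the simplifying (and optimistic) assumption of independent one-second samples; real velocity noise is autocorrelated, so field convergence is slower than this idealization suggests, and the noise magnitude below is invented.

        import numpy as np

        rng = np.random.default_rng(13)
        true_q = 250.0   # discharge, m^3/s
        sigma_1s = 25.0  # hypothetical 1-s noise of the instantaneous estimate

        def pct_uncertainty(exposure_s, trials=2000):
            # Each measurement averages exposure_s one-second samples
            q = true_q + rng.normal(0.0, sigma_1s, (trials, exposure_s)).mean(axis=1)
            return 100.0 * q.std() / true_q

        for t in (120, 360, 720, 1440):
            print(f"exposure {t:5d} s -> ~{pct_uncertainty(t):.2f}% uncertainty")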

  6. Vadose zone transport field study: Detailed test plan for simulated leak tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AL Ward; GW Gee

    2000-06-23

    The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties and limited use of hydrologic characterization and monitoring technologies have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose zone transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes). The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; and identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  7. Simulating and validating coastal gradients in wind energy resources

    NASA Astrophysics Data System (ADS)

    Hahmann, Andrea; Floors, Rogier; Karagali, Ioanna; Vasiljevic, Nikola; Lea, Guillaume; Simon, Elliot; Courtney, Michael; Badger, Merete; Peña, Alfredo; Hasager, Charlotte

    2016-04-01

    The experimental campaign of the RUNE (Reducing Uncertainty of Near-shore wind resource Estimates) project took place on the western coast of Denmark during the winter of 2015-2016. The campaign used onshore scanning lidar technology combined with ocean and satellite information and produced a unique dataset for studying the transition in boundary layer dynamics across the coastal zone. The RUNE project aims at reducing the uncertainty of near-shore wind resource estimates produced by mesoscale modeling. With this in mind, simulations using the Weather Research and Forecasting (WRF) model were performed to identify the sensitivity of the coastal gradients of wind energy resources to various model parameters and inputs, among them the model horizontal grid spacing and the planetary boundary layer and surface-layer schemes. We report on the differences among these simulations and present preliminary results of the comparison of the model simulations with the RUNE lidar and satellite measurements and observations from a near-coastal tall mast.

  8. Unconventional nozzle tradeoff study. [space tug propulsion

    NASA Technical Reports Server (NTRS)

    Obrien, C. J.

    1979-01-01

    Plug cluster engine design, performance, weight, envelope, operational characteristics, development cost, and payload capability were evaluated, and comparisons were made with other space tug engine candidates using oxygen/hydrogen propellants. Parametric performance data were generated for existing developed or high-technology thrust chambers clustered around a plug nozzle of very large diameter. The uncertainties in the performance prediction of plug cluster engines with large gaps between the modules (thrust chambers) were evaluated. The major uncertainty involves the aerodynamics of the flow from discrete nozzles and the failure of this flow to achieve the pressure ratio corresponding to the defined area ratio for a plug cluster. This uncertainty was reduced through a cluster design that consists of a plug contour formed from the cluster of high-area-ratio bell nozzles that have been scarfed. Lightweight, high-area-ratio bell nozzles were achieved through the use of AGCarb (carbon-carbon cloth) nozzle extensions.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muller, L; Soldner, A; Kirk, M

    Purpose: The beam range uncertainty presents a special challenge for proton therapy. Novel technologies currently under development offer strategies to reduce the range uncertainty [1,2]. This work quantifies the potential advantages that could be realized by such a reduction for dosimetrically challenging chordomas at the base of skull. Therapeutic improvement was assessed by evaluating tumor control probabilities (TCP) and normal tissue complication probabilities (NTCP). Methods: Treatment plans were made for a modulated-scanned proton delivery technique using the Eclipse treatment planning system. The prescription dose was 7920 cGy to the CTV. Three different range uncertainty scenarios were considered: 5 mm (3.5% of the beam range + 1 mm, representing current clinical practice, “Curr”), 2 mm (1.3%), and 1 mm (0.7%). For each of 4 patients, 3 different PTVs were defined via uniform expansion of the CTV by the value of the range uncertainty. TCP and NTCPs for organs-at-risk (OARs) were calculated using the Lyman-Kutcher-Burman formalism [3] and published model parameters [Terahara [4]; QUANTEC S10; Burman, Red Journal v21, pp 123]. Our plan optimization strategy was to achieve PTV coverage close to prescription while maintaining OAR NTCP values at or better than the Curr plan. Results: The average TCP values for the 5, 2, and 1 mm range uncertainty scenarios are 51%, 55%, and 65%. The improvement in TCP for patients was between 4 and 30%, depending primarily on the proximity of the GTV to OARs. The average NTCPs for the brainstem and cord were about 4% and 1%, respectively, for all target margins. Conclusion: For base of skull chordomas, reduced target margins can substantially increase the TCP without increasing the NTCP. This work demonstrates the potential significance of a reduction in the range uncertainty for proton beams.
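
    A minimal implementation of the Lyman-Kutcher-Burman NTCP calculation referred to above; the DVH bins and parameter values are illustrative, not the study's patient data or its cited parameter sets.

        import numpy as np
        from math import erf, sqrt

        def lkb_ntcp(doses, volumes, TD50, m, n):
            # Generalized EUD from a differential DVH, then the LKB probit
            eud = float(np.sum(volumes * doses ** (1.0 / n)) ** n)
            t = (eud - TD50) / (m * TD50)
            return 0.5 * (1.0 + erf(t / sqrt(2.0)))

        doses = np.array([10.0, 30.0, 50.0, 60.0])  # bin doses, Gy (illustrative)
        vols = np.array([0.40, 0.30, 0.20, 0.10])   # fractional organ volumes
        ntcp = lkb_ntcp(doses, vols, TD50=65.0, m=0.14, n=0.16)
        print(f"NTCP ~ {100.0 * ntcp:.1f}%")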

  10. Climate: Policy, Modeling, and Federal Priorities (Invited)

    NASA Astrophysics Data System (ADS)

    Koonin, S.; Department Of Energy Office Of The Under SecretaryScience

    2010-12-01

    The Administration has set ambitious national goals to reduce our dependence on fossil fuels and reduce anthropogenic greenhouse gas (GHG) emissions. The US and other countries involved in the U.N. Framework Convention on Climate Change continue to work toward a goal of establishing a viable treaty that would encompass limits on emissions and codify actions that nations would take to reduce emissions. These negotiations are informed by the science of climate change and by our understanding of how changes in technology and the economy might affect the overall climate in the future. I will describe the present efforts within the U.S. Department of Energy, and the federal government more generally, to address issues related to climate change. These include state-of-the-art climate modeling and uncertainty assessment, economic and climate scenario planning based on best estimates of different technology trajectories, adaptation strategies for climate change, and monitoring and reporting for treaty verification.

  11. A coupled stochastic inverse-management framework for dealing with nonpoint agriculture pollution under groundwater parameter uncertainty

    NASA Astrophysics Data System (ADS)

    Llopis-Albert, Carlos; Palacios-Marqués, Daniel; Merigó, José M.

    2014-04-01

    In this paper a methodology for the stochastic management of groundwater quality problems is presented, which can be used to provide agricultural advisory services. A stochastic algorithm to solve the coupled flow and mass transport inverse problem is combined with a stochastic management approach to develop methods for integrating uncertainty, thus obtaining more reliable policies on groundwater nitrate pollution control from agriculture. The stochastic inverse model identifies non-Gaussian parameters and reduces uncertainty in heterogeneous aquifers by constraining stochastic simulations to data. The management model determines the spatial and temporal distribution of fertilizer application rates that maximizes net benefits in agriculture, constrained by quality requirements in groundwater at various control sites. The quality constraints can be taken, for instance, from those set by water laws such as the EU Water Framework Directive (WFD). Furthermore, the methodology provides the trade-off between higher economic returns and reliability in meeting the environmental standards, and can therefore help stakeholders in the decision-making process under uncertainty. The methodology has been successfully applied to a 2D synthetic aquifer, where an uncertainty assessment has been carried out by means of Monte Carlo simulation techniques.

  12. Asteroid approach covariance analysis for the Clementine mission

    NASA Technical Reports Server (NTRS)

    Ionasescu, Rodica; Sonnabend, David

    1993-01-01

    The Clementine mission is designed to test Strategic Defense Initiative Organization (SDIO) technology, the Brilliant Pebbles and Brilliant Eyes sensors, by mapping the lunar surface and flying by the asteroid Geographos. The capability of two of the instruments available on board the spacecraft, the lidar (laser radar) and the UV/Visible camera, is used in the covariance analysis to obtain the spacecraft delivery uncertainties at the asteroid. These uncertainties are due primarily to asteroid ephemeris uncertainties. On-board optical navigation reduces the uncertainty in the knowledge of the spacecraft position in the direction perpendicular to the incoming asymptote to a one-sigma value of under 1 km, at the closest approach distance of 100 km. The uncertainty in the knowledge of the encounter time is about 0.1 seconds for a flyby velocity of 10.85 km/s. The magnitude of these uncertainties is due largely to Center Finding Errors (CFE). These systematic errors represent the accuracy expected in locating the center of the asteroid in the optical navigation images, in the absence of a topographic model for the asteroid. The direction of the incoming asymptote cannot be estimated accurately until minutes before the asteroid flyby, and correcting for it would require autonomous navigation. Orbit determination errors dominate over maneuver execution errors, and the final delivery accuracy attained is basically the orbit determination uncertainty before the final maneuver.

  13. Characterizing model uncertainties in the life cycle of lignocellulose-based ethanol fuels.

    PubMed

    Spatari, Sabrina; MacLean, Heather L

    2010-11-15

    Renewable and low carbon fuel standards being developed at federal and state levels require an estimation of the life cycle carbon intensity (LCCI) of candidate fuels that can substitute for gasoline, such as second generation bioethanol. Estimating the LCCI of such fuels with a high degree of confidence requires the use of probabilistic methods to account for known sources of uncertainty. We construct life cycle models for the bioconversion of agricultural residue (corn stover) and energy crops (switchgrass) and explicitly examine uncertainty using Monte Carlo simulation. Using statistical methods to identify significant model variables from public data sets and Aspen Plus chemical process models, we estimate stochastic life cycle greenhouse gas (GHG) emissions for the two feedstocks combined with two promising fuel conversion technologies. The approach can be generalized to other biofuel systems. Our results show potentially high and uncertain GHG emissions for switchgrass-ethanol due to uncertain CO₂ flux from land use change and N₂O flux from N fertilizer. However, corn stover-ethanol, with its low-in-magnitude, tight-in-spread LCCI distribution, shows considerable promise for reducing life cycle GHG emissions relative to gasoline and corn-ethanol. Coproducts are important for reducing the LCCI of all ethanol fuels we examine.
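
    A minimal sketch of the kind of Monte Carlo propagation described above: uncertain life cycle contributions are sampled and summed into a distribution for the LCCI. All distribution shapes and magnitudes below are invented for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical per-MJ contributions to life cycle GHG intensity (g CO2e/MJ)
farming    = rng.normal(12.0, 2.0, N)             # fertilizer, field operations
n2o_flux   = rng.lognormal(np.log(8.0), 0.5, N)   # skewed N2O from N fertilizer
luc_co2    = rng.triangular(-5.0, 10.0, 40.0, N)  # uncertain land-use change CO2
conversion = rng.normal(6.0, 1.5, N)              # biorefinery inputs
credit     = rng.normal(-4.0, 1.0, N)             # electricity co-product credit

lcci = farming + n2o_flux + luc_co2 + conversion + credit
lo, med, hi = np.percentile(lcci, [2.5, 50, 97.5])
print(f"LCCI median {med:.1f} g CO2e/MJ, 95% interval [{lo:.1f}, {hi:.1f}]")
```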

  14. Investments in energy technological change under uncertainty

    NASA Astrophysics Data System (ADS)

    Shittu, Ekundayo

    2009-12-01

    This dissertation addresses the crucial problem of how environmental policy uncertainty influences investments in energy technological change. The rising level of carbon emissions due to increasing global energy consumption calls for a policy shift. In order to stem the negative consequences on the climate, policymakers are concerned with designing an optimal regulation that will encourage technology investments. However, decision makers face uncertainties surrounding future environmental policy. The first part considers the treatment of technological change in theoretical models. This part has two purposes: (1) to show, through illustrative examples, that technological change can lead to quite different, and surprising, impacts on the marginal costs of pollution abatement; we demonstrate the intriguing and uncommon result that technological change can increase the marginal costs of pollution abatement over some range of abatement; and (2) to show the impact of this uncommon observation on policy. We find that under the assumption of technical change that can increase the marginal cost of pollution abatement over some range, the ranking of policy instruments is affected. The second part builds on the first by considering the impact of uncertainty in the carbon tax on investments in a portfolio of technologies. We determine the response of energy R&D investments as the carbon tax increases, both overall and for technology-specific investments, and the impact of risk in the carbon tax on the portfolio. We find that the response of the optimal investment in a portfolio of technologies to an increasing carbon tax depends on the relative costs of the programs and the elasticity of substitution between fossil and non-fossil energy inputs. In the third part, we zoom in on the portfolio model to consider how uncertainty in the magnitude and timing of a carbon tax influences investments. Under a two-stage continuous-time optimal control model, we consider the impact of these uncertainties on R&D spending that aims to lower the cost of non-fossil energy technology. We find that magnitude uncertainty tallies with the classical results in that it discourages near-term investment, whereas timing uncertainty increases near-term investment.

  15. Data quality control in eco-environmental monitoring

    NASA Astrophysics Data System (ADS)

    Lu, Chunyan; Wang, Jing

    2007-11-01

    With the development of science and technology, a number of environmental issues, such as sustainable development, climate change, environmental pollution, and land degradation, have become serious, and greater attention has been paid to environmental protection. The Chinese government has gradually launched eco-environmental construction projects. In 1999, China began to carry out the Grain-for-Green project in the west to improve the eco-environment. The project has had positive effects, but some questions still cannot be answered: How are the new grassland and forest performing? Where are they? What should be done in the future? To answer these questions, the government began to monitor the eco-environment based on remote sensing technology. Geographic information can be obtained in a timely manner, but the issue of uncertainty has become increasingly recognized, and this uncertainty affects the reliability of applications using the data. This article analyzes the process of eco-environmental monitoring and the uncertainty of geographic information, and discusses methods of data quality control. Spot5 panchromatic and multi-spectral data from 2003 (2002) were used, combined with land use survey data at the scale of 1:10,000, topographic data at the scale of 1:10,000, and the local Grain-for-Green project map; social and economic data were also collected. Eco-environmental monitoring is a process consisting of several steps, such as image geometric correction, image matching, and information extraction. Based on visual and automated methods, land converted from cultivated land to grass and forest was identified by comparing the information from remote sensing data with the land survey data and local Grain-for-Green project data, combined with field survey. The uncertainty in this process was analyzed: positional uncertainty, attribute uncertainty, and thematic uncertainty were all evident. Positional uncertainty mainly derives from image geometric correction; the data source and the number and spatial distribution of the control points are important sources of uncertainty. Attribute uncertainty mainly derives from the information extraction process, where the land classification system and human error are the main factors inducing uncertainty, and unclear concept definitions induce thematic uncertainty. Based on these sources of uncertainty, data quality control methods are put forward to improve data quality. First, it is important to choose appropriate remote sensing data and other basic data. Second, the accuracy of the digital orthophoto map should be controlled. Third, it is necessary to check the result data against relevant data quality criteria to guarantee GIS data quality.

  16. National Center for Nuclear Security: The Nuclear Forensics Project (F2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klingensmith, A. L.

    These presentation visuals introduce the National Center for Nuclear Security. Its chartered mission is to enhance the Nation’s verification and detection capabilities in support of nuclear arms control and nonproliferation through R&D activities at the NNSS. It has three focus areas: Treaty Verification Technologies, Nonproliferation Technologies, and Technical Nuclear Forensics. The objectives of nuclear forensics are to reduce uncertainty in the nuclear forensics process and to improve the scientific defensibility of nuclear forensics conclusions when applied to near-surface nuclear detonations. Research is in four key areas: nuclear physics, debris collection and analysis, prompt diagnostics, and radiochemistry.

  17. Primary data collection in health technology assessment.

    PubMed

    McIsaac, Michelle L; Goeree, Ron; Brophy, James M

    2007-01-01

    This study discusses the value of primary data collection as part of health technology assessment (HTA). Primary data collection can help reduce uncertainty in HTA and better inform evidence-based decision making. However, methodological issues such as choosing appropriate study design and practical concerns such as the value of collecting additional information need to be addressed. The authors emphasize the conditions required for successful primary data collection in HTA: experienced researchers, sufficient funding, and coordination among stakeholders, government, and researchers. The authors conclude that, under specific conditions, primary data collection is a worthwhile endeavor in the HTA process.

  18. Incorporating climate-system and carbon-cycle uncertainties in integrated assessments of climate change. (Invited)

    NASA Astrophysics Data System (ADS)

    Rogelj, J.; McCollum, D. L.; Reisinger, A.; Knutti, R.; Riahi, K.; Meinshausen, M.

    2013-12-01

    The field of integrated assessment draws from a large body of knowledge across a range of disciplines to gain robust insights about possible interactions, trade-offs, and synergies. Integrated assessment of climate change, for example, uses knowledge from the fields of energy system science, economics, geophysics, demography, climate change impacts, and many others. Each of these fields comes with its associated caveats and uncertainties, which should be taken into account when assessing any results. The geophysical system and its associated uncertainties are often represented by models of reduced complexity in integrated assessment modelling frameworks. Such models include simple representations of the carbon cycle and climate system, and are often based on the global energy balance equation. A prominent example of such a model is the 'Model for the Assessment of Greenhouse Gas Induced Climate Change', MAGICC. Here we show how a model like MAGICC can be used for the representation of geophysical uncertainties. Its strengths, weaknesses, and limitations are discussed and illustrated by means of an analysis which attempts to integrate socio-economic and geophysical uncertainties. These uncertainties in the geophysical response of the Earth system to greenhouse gases remain key for estimating the cost of greenhouse gas emission mitigation scenarios. We look at uncertainties in four dimensions: geophysical, technological, social and political. Our results indicate that while geophysical uncertainties are an important factor influencing projections of mitigation costs, political choices that delay mitigation by one or two decades have a much more pronounced effect.
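
    Reduced-complexity models of the MAGICC type are built around the global energy balance equation, C dT/dt = F(t) - lambda*T. The sketch below integrates that one-box balance for an arbitrary forcing ramp; the feedback parameter and heat capacity are illustrative values, not MAGICC's calibration.

```python
import numpy as np

def energy_balance(forcing_wm2, lam=1.2, c=8.0, dt_years=1.0):
    """One-box global energy balance: C dT/dt = F(t) - lambda * T.

    lam: climate feedback parameter (W m-2 K-1); c: effective heat
    capacity (W yr m-2 K-1). Values are illustrative placeholders.
    """
    temp, temps = 0.0, []
    for f in forcing_wm2:
        temp += dt_years * (f - lam * temp) / c  # forward Euler step
        temps.append(temp)
    return np.array(temps)

# Linear forcing ramp to ~3.7 W/m2 (roughly a CO2 doubling) over 100 years
forcing = np.linspace(0.0, 3.7, 100)
print(f"warming after 100 yr: {energy_balance(forcing)[-1]:.2f} K")
```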

  19. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
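
    As a sketch of the SRC step in the global sensitivity analysis described above, the snippet below regresses Monte Carlo output samples on the sampled inputs and rescales the slopes into standardized regression coefficients. The parameter distributions and the response function are hypothetical stand-ins for the nucleation and growth kinetics.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5000

# Hypothetical uncertain kinetic parameters sampled for Monte Carlo runs
X = np.column_stack([
    rng.normal(2.0, 0.2, N),   # nucleation order (illustrative)
    rng.normal(1.5, 0.1, N),   # growth order (illustrative)
    rng.normal(1.0, 0.3, N),   # nucleation rate constant (scaled)
])
# Toy model response standing in for a CSD summary statistic
y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.5, N)

# Standardized regression coefficients: OLS slopes rescaled by sigma_x/sigma_y
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(N), X]), y, rcond=None)
src = beta[1:] * X.std(axis=0) / y.std()
print(dict(zip(["nucleation order", "growth order", "rate const"], src.round(2))))
```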

  20. The impact of uncertainty on optimal emission policies

    NASA Astrophysics Data System (ADS)

    Botta, Nicola; Jansson, Patrik; Ionescu, Cezar

    2018-05-01

    We apply a computational framework for specifying and solving sequential decision problems to study the impact of three kinds of uncertainties on optimal emission policies in a stylized sequential emission problem. We find that uncertainties about the implementability of decisions on emission reductions (or increases) have a greater impact on optimal policies than uncertainties about the availability of effective emission reduction technologies and uncertainties about the implications of trespassing critical cumulated emission thresholds. The results show that uncertainties about the implementability of decisions on emission reductions (or increases) call for more precautionary policies. In other words, delaying emission reductions to the point in time when effective technologies will become available is suboptimal when these uncertainties are accounted for rigorously. By contrast, uncertainties about the implications of exceeding critical cumulated emission thresholds tend to make early emission reductions less rewarding.
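
    The paper's framework is not reproduced here, but the core of such sequential decision problems is backward induction over an uncertain transition. A toy sketch with invented numbers: a chosen reduction is only implemented with some probability, and crossing a cumulative emission threshold at the horizon incurs a damage cost.

```python
from functools import lru_cache

# Illustrative calibration, not the paper's: T periods, a cumulative
# threshold, an implementability probability, and reduction/damage costs.
T, THRESHOLD = 5, 3
P_IMPL, COST_R, DAMAGE = 0.7, 1.0, 10.0

@lru_cache(maxsize=None)
def cost_to_go(t, cum):
    """Expected cost under the optimal policy, by backward induction."""
    if t == T:
        return DAMAGE if cum > THRESHOLD else 0.0
    emit = cost_to_go(t + 1, cum + 1)          # emit one unit this period
    reduce_ = COST_R + P_IMPL * cost_to_go(t + 1, cum) \
              + (1 - P_IMPL) * cost_to_go(t + 1, cum + 1)
    return min(emit, reduce_)

print(f"expected cost of the optimal policy: {cost_to_go(0, 0):.2f}")
```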

  1. Life support technology investment strategies for flight programs: An application of decision analysis

    NASA Technical Reports Server (NTRS)

    Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.

    1993-01-01

    Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for the development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.

  2. Life support technology investment strategies for flight programs: An application of decision analysis

    NASA Technical Reports Server (NTRS)

    Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.

    1993-01-01

    Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.

  3. Precision Departure Release Capability (PDRC) Technology Description

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn A.; Capps, Richard; Day, Kevin; Robinson, Corissia; Null, Jody R.

    2013-01-01

    After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations, release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision, enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Technology Description. Companion papers include the Final Report and a Concept of Operations.

  4. Effective information channels for reducing costs of environmentally- friendly technologies: evidence from residential PV markets

    NASA Astrophysics Data System (ADS)

    Rai, Varun; Robinson, Scott A.

    2013-03-01

    Realizing the environmental benefits of solar photovoltaics (PV) will require reducing costs associated with perception, informational gaps and technological uncertainties. To identify opportunities to decrease costs associated with residential PV adoption, in this letter we use multivariate regression models to analyze a unique, household-level dataset of PV adopters in Texas (USA) to systematically quantify the effect of different information channels on aspiring PV adopters’ decision-making. We find that the length of the decision period depends on the business model, such as whether the system was bought or leased, and on special opportunities to learn, such as the influence of other PV owners in the neighborhood. This influence accrues passively through merely witnessing PV systems in the neighborhood, increasing confidence and motivation, as well as actively through peer-to-peer communications. Using these insights we propose a new framework to provide public information on PV that could drastically reduce barriers to PV adoption, thereby accelerating its market penetration and environmental benefits. This framework could also serve as a model for other distributed generation technologies.

  5. Essays on Adoption and Diffusion of New Technology in Supply Chains

    ERIC Educational Resources Information Center

    Choi, Daeheon

    2012-01-01

    Over the past decades, network technologies across supply chains have been introduced and promoted with premised benefits for all participants. However, industry experience with the adoption process of some technologies suggests that some firms face a great amount of uncertainty in estimating the benefits of adoption. This uncertainty will…

  6. Evaluating the effects of China's pollution control on inter-annual trends and uncertainties of atmospheric mercury emissions

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Zhong, H.; Zhang, J.; Nielsen, C. P.

    2014-10-01

    China's atmospheric mercury (Hg) emissions of anthropogenic origin have been effectively restrained through the national policy of air pollution control. Improved methods based on available field measurements are developed to quantify the benefits of Hg abatement through various emission control measures. Those measures include increased use of flue gas desulfurization (FGD) and selective catalytic reduction (SCR) systems in the power sector, precalciners with fabric filters (FF) for cement production, machinery coking with electrostatic precipitators (ESP) for iron and steel production, and advanced manufacturing technologies for nonferrous metal smelting. Declining trends in emission factors for those sources are revealed, leading to a much slower growth of national total Hg emissions than that of energy and the economy, from 679 metric tons (t) in 2005 to 750 t in 2012. In particular, nearly half of the emissions from the above-mentioned four source types are estimated to have been avoided in 2012, attributed to the expansion of technologies with high energy efficiencies and air pollutant removal rates after 2005. The speciation of Hg emissions has remained stable in recent years, with mass fractions of around 55, 39 and 6% for Hg0, Hg2+ and Hgp, respectively. The lower estimate of Hg emissions compared with previous inventories is supported by limited chemistry simulation work, but medium-to-long-term observation of ambient Hg levels is further needed to justify the inter-annual trends of estimated Hg emissions. With improved implementation of emission controls and energy saving, a 23% reduction in annual Hg emissions compared to 2012 is expected for the most optimistic case in 2030, with total emissions below 600 t. While Hg emissions are evaluated to be gradually constrained, increased uncertainties are quantified with Monte Carlo simulation for recent years, particularly for power and certain industrial sources. The uncertainty of Hg emissions from coal-fired power plants, as an example, increased from -48%/+73% in 2005 to -50%/+89% in 2012 (expressed as a 95% confidence interval). This is attributed mainly to the rapidly increased penetration of advanced manufacturing and pollutant control technologies: the unclear operation status or relatively small sample size of field measurements on those technologies results in lower but highly varied emission factors. To further confirm the benefits of pollution control policies with reduced uncertainty, therefore, systematic investigations specific to Hg pollution sources are recommended, and the variability of temporal trends and spatial distributions of Hg emissions needs to be better tracked for a country undergoing dramatic changes in economic, energy, and air pollution status.
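
    The Monte Carlo treatment of inventory uncertainty can be sketched as follows: emissions are a product of uncertain activity, content, and control terms, and an asymmetric 95% confidence interval is read off the sampled distribution. Every distribution and number here is invented for illustration, not the inventory's actual data.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 50_000

# Hypothetical coal-fired power sector: Hg emission = coal burned x Hg content
# x release fraction after air pollution control devices (APCDs)
coal_mt = rng.normal(2000, 100, N)               # coal burned (Mt)
hg_ppm  = rng.lognormal(np.log(0.17), 0.4, N)    # Hg content, skewed (ppm)
release = 1 - rng.uniform(0.5, 0.95, N)          # APCD removal of 50-95%

emissions_t = coal_mt * 1e6 * hg_ppm * 1e-6 * release  # tonnes Hg
lo, med, hi = np.percentile(emissions_t, [2.5, 50, 97.5])
print(f"median {med:.0f} t, 95% CI "
      f"-{(1 - lo / med) * 100:.0f}% / +{(hi / med - 1) * 100:.0f}%")
```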

  7. Linear Parameter Varying Control Synthesis for Actuator Failure, Based on Estimated Parameter

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Wu, N. Eva; Belcastro, Christine

    2002-01-01

    The design of a linear parameter varying (LPV) controller for an aircraft under actuator failure cases is presented. The controller synthesis for actuator failure cases is formulated as linear matrix inequality (LMI) optimizations based on an estimated failure parameter with pre-defined estimation error bounds. The inherent conservatism of the LPV control synthesis methodology is reduced using a scaling factor on the uncertainty block which represents estimated parameter uncertainties. The fault parameter is estimated using a two-stage Kalman filter. The simulation results of the designed LPV controller for a HiMAT (Highly Maneuverable Aircraft Technology) vehicle with the on-line estimator show that the desired performance and robustness objectives are achieved for actuator failure cases.

  8. Evaluation of Uncertainty in Constituent Input Parameters for Modeling the Fate of IMX 101 Components

    DTIC Science & Technology

    2017-05-01

    ERDC/EL TR-17-7, Environmental Security Technology Certification Program (ESTCP), May 2017: Evaluation of Uncertainty in Constituent Input Parameters… The …Environmental Evaluation and Characterization System (TREECS™) was applied to a groundwater site and a surface water site to evaluate the sensitivity…

  9. Life cycle evaluation of emerging lignocellulosic ethanol conversion technologies.

    PubMed

    Spatari, Sabrina; Bagley, David M; MacLean, Heather L

    2010-01-01

    Lignocellulosic ethanol holds promise for addressing climate change and energy security issues associated with personal transportation by lowering the fuel mix's carbon intensity and petroleum demand. We compare the technological features and life cycle environmental impacts of near- and mid-term ethanol bioconversion technologies in the United States. Key uncertainties in the major processes (pre-treatment, hydrolysis, and fermentation) are evaluated. The potential to reduce fossil energy use and greenhouse gas (GHG) emissions varies among bioconversion processes, although all options studied are considerably more attractive than gasoline. Anticipated future performance is found to be considerably more attractive than that published in the literature as being achieved to date. Electricity co-product credits are important in characterizing the GHG impacts of different ethanol production pathways; however, in the absence of near-term liquid transportation fuel alternatives to gasoline, optimizing ethanol facilities to produce ethanol (as opposed to co-products) is important for reducing the carbon intensity of the road transportation sector and for energy security.

  10. National research and education network

    NASA Technical Reports Server (NTRS)

    Villasenor, Tony

    1991-01-01

    Some goals of this network are as follows: Extend U.S. technological leadership in high performance computing and computer communications; Provide wide dissemination and application of the technologies, both to speed the pace of innovation and to serve the national economy, national security, education, and the global environment; and Spur gains in U.S. productivity and industrial competitiveness by making high performance computing and networking technologies an integral part of the design and production process. Strategies for achieving these goals are as follows: Support solutions to important scientific and technical challenges through a vigorous R and D effort; Reduce the uncertainties facing industry in the R and D and use of this technology through increased cooperation between government, industry, and universities and by the continued use of government and government-funded facilities as a prototype user for early commercial HPCC products; and Support the underlying research, network, and computational infrastructures on which U.S. high performance computing technology is based.

  11. Economic and technological aspects of the market introduction of renewable power technologies

    NASA Astrophysics Data System (ADS)

    Worlen, Christine M.

    Renewable energy, if developed and delivered with appropriate technologies, is cleaner, more evenly distributed, and safer than conventional energy systems. Many countries and several states in the United States promote the development and introduction of technologies for "green" electricity production. This dissertation investigates economic and technological aspects of this process for wind energy. In liberalized electricity markets, policy makers use economic incentives to encourage the adoption of renewables. Choosing from a large range of possible policies and instruments is a multi-criteria decision process. This dissertation evaluates the criteria used and the trade-offs among the criteria, and develops a hierarchical flow scheme that policy makers can use to choose the most appropriate policy for a given situation. Economic incentives and market transformation programs seek to reduce costs through mass deployment in order to make renewable technologies competitive. Cost reduction is measured in "experience curves" that posit negative exponential relationships between cumulative deployment and production cost. This analysis reveals the weaknesses in conventional experience curve analyses for wind turbines, and concludes that the concept is limited by data availability, a weak conceptual foundation, and inappropriate statistical estimation. A revised model specifies a more complete set of economic and technological forces that determine the cost of wind power. Econometric results indicate that experience and upscaling of turbine sizes accounted for the observed cost reduction in wind turbines in the United States, Denmark and Germany between 1983 and 2001. These trends are likely to continue. In addition, future cost reductions will result from economies of scale in production. Observed differences in the performance of theoretically equivalent policy instruments could arise from economic uncertainty. To test this hypothesis, a methodology for the quantitative comparison of economic incentive schemes and their effect on uncertainty and investor behavior in renewable power markets is developed using option value theory of investment. Critical investment thresholds compared with actual benefit-cost ratios for several case studies in Germany indicate that uncertainty in prices for wind power and green certificates would delay investment. In Germany, the fixed-tariff system effectively removes this barrier.
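
    The experience curve critiqued above posits cost = a * (cumulative capacity)^b with b < 0, so the learning rate per doubling of capacity is 1 - 2^b after a log-log fit. A minimal sketch with invented observations:

```python
import numpy as np

# Hypothetical observations: cumulative installed capacity (MW) and unit cost
capacity = np.array([100, 300, 1000, 3000, 10000, 30000])
cost = np.array([1400, 1250, 1050, 930, 800, 700])  # $/kW, illustrative

# Experience curve: cost = a * capacity^b; fit in log-log space
b, log_a = np.polyfit(np.log(capacity), np.log(cost), 1)
print(f"fit: cost = {np.exp(log_a):.0f} * capacity^{b:.3f}; "
      f"learning rate {1 - 2 ** b:.1%} per doubling")
```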

  12. Uncertainties in estimates of the risks of late effects from space radiation

    NASA Astrophysics Data System (ADS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits.
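
    The Monte Carlo procedure described here multiplies a point risk estimate by factors drawn from subjective uncertainty distributions. A compact sketch with illustrative distributions, not NASA's actual quantities:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Risk projection = point estimate x product of uncertain factors, each with
# a subjective uncertainty distribution. Factors and spreads are invented.
point_risk = 0.03                                  # e.g., 3% excess cancer risk
quality    = rng.lognormal(np.log(1.0), 0.6, N)    # radiation quality factor
dose_rate  = rng.lognormal(np.log(1.0), 0.3, N)    # dose-rate effectiveness
physics    = rng.normal(1.0, 0.1, N)               # transport/dosimetry factor

risk = point_risk * quality * dose_rate * physics
lo, hi = np.percentile(risk, [2.5, 97.5])
print(f"point estimate {point_risk:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```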

  13. Health technology assessment and primary data collection for reducing uncertainty in decision making.

    PubMed

    Goeree, Ron; Levin, Les; Chandra, Kiran; Bowen, James M; Blackhouse, Gord; Tarride, Jean-Eric; Burke, Natasha; Bischof, Matthias; Xie, Feng; O'Reilly, Daria

    2009-05-01

    Health care expenditures continue to escalate, and pressures for increased spending will continue. Health care decision makers from publicly financed systems, private insurance companies, or even from individual health care institutions, will continue to be faced with making difficult purchasing, access, and reimbursement decisions. As a result, decision makers are increasingly turning to evidence-based platforms to help control costs and make the most efficient use of existing resources. Most tools used to assist with evidence-based decision making focus on clinical outcomes. Health technology assessment (HTA) is increasing in popularity because it also considers other factors important for decision making, such as cost, social and ethical values, legal issues, and factors such as the feasibility of implementation. In some jurisdictions, HTAs have also been supplemented with primary data collection to help address uncertainty that may still exist after conducting a traditional HTA. The HTA process adopted in Ontario, Canada, is unique in that assessments are also made to determine what primary data research should be conducted and what should be collected in these studies. In this article, concerns with the traditional HTA process are discussed, followed by a description of the HTA process that has been established in Ontario, with a particular focus on the data collection program followed by the Programs for Assessment of Technology in Health Research Institute. An illustrative example is used to show how the Ontario HTA process works and the role that value-of-information analysis plays in addressing decision uncertainty, determining research feasibility, and determining study data collection needs.
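
    Value-of-information analysis of the kind mentioned above compares the expected net benefit of deciding now with the expected net benefit under perfect information; the difference is the expected value of perfect information (EVPI). A sketch with invented net-benefit models:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 20_000

# Net benefit of two strategies under parameter uncertainty (illustrative)
theta = rng.normal(0.0, 1.0, N)            # uncertain relative effectiveness
nb_standard = np.full(N, 10_000.0)         # current technology
nb_new = 9_000.0 + 2_500.0 * theta         # new technology, uncertain payoff

# EVPI = E[max over strategies] - max over strategies of E[net benefit]
nb = np.column_stack([nb_standard, nb_new])
evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
print(f"per-patient EVPI: ${evpi:,.0f}")
```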

  14. Uncertainty in BRCA1 cancer susceptibility testing.

    PubMed

    Baty, Bonnie J; Dudley, William N; Musters, Adrian; Kinney, Anita Y

    2006-11-15

    This study investigated uncertainty in individuals undergoing genetic counseling/testing for breast/ovarian cancer susceptibility. Sixty-three individuals from a single kindred with a known BRCA1 mutation rated uncertainty about 12 items on a five-point Likert scale before and 1 month after genetic counseling/testing. Factor analysis identified a five-item total uncertainty scale that was sensitive to changes before and after testing. The items in the scale were related to uncertainty about obtaining health care, positive changes after testing, and coping well with results. The majority of participants (76%) rated reducing uncertainty as an important reason for genetic testing. The importance of reducing uncertainty was stable across time and unrelated to anxiety or demographics. Yet, at baseline, total uncertainty was low and decreased after genetic counseling/testing (P = 0.004). Analysis of individual items showed that after genetic counseling/testing, there was less uncertainty about the participant detecting cancer early (P = 0.005) and coping well with their result (P < 0.001). Our findings support the importance to clients of genetic counseling/testing as a means of reducing uncertainty. Testing may help clients to reduce the uncertainty about items they can control, and it may be important to differentiate the sources of uncertainty that are more or less controllable. Genetic counselors can help clients by providing anticipatory guidance about the role of uncertainty in genetic testing. (c) 2006 Wiley-Liss, Inc.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, R.A.

    Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a framework for quantifying the degree to which risk is reduced as mass is removed from shallow, saturated, low-permeability, dual-porosity DNAPL source zones. Risk is defined in terms of meeting an alternate concentration level (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downstream water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phases (dissolved, sorbed, free product). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making risk-reduction calculations for specific technologies. Despite the qualitative nature of the exercise, results imply that very high mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. 17 refs., 7 figs., 6 tabs.

  16. Optimization under Uncertainty of a Biomass-integrated Renewable Energy Microgrid with Energy Storage

    NASA Astrophysics Data System (ADS)

    Zheng, Yingying

    Growing energy demand and the need to reduce carbon emissions are drawing increasing attention to the development of renewable energy technologies and management strategies. Microgrids have been developed around the world as a means to address the high penetration level of renewable generation and reduce greenhouse gas emissions while attempting to address supply-demand balancing at a more local level. This dissertation presents a model developed to optimize the design of a biomass-integrated renewable energy microgrid employing combined heat and power with energy storage. A receding-horizon optimization with Monte Carlo simulation was used to evaluate optimal microgrid design and dispatch under uncertainties in the renewable energy and utility grid energy supplies, the energy demands, and the economic assumptions, so as to generate a probability density function for the cost of energy. Case studies were examined for a conceptual utility grid-connected microgrid application in Davis, California. The results provide the most cost-effective design based on the assumed energy load profile, local climate data, utility tariff structure, and technical and financial performance of the various components of the microgrid. Sensitivity and uncertainty analyses are carried out to illuminate the key parameters that influence the energy costs. The model application provides a means to determine major risk factors associated with alternative design integration and operating strategies.
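
    A receding-horizon dispatch can be sketched as: forecast over a short look-ahead window, plan storage use against that forecast, implement only the first step, then re-plan. The toy greedy planner and all profiles below are illustrative assumptions, not the dissertation's model.

```python
import numpy as np

rng = np.random.default_rng(5)
CAP, HORIZON, STEPS = 5.0, 6, 24   # storage (kWh), look-ahead steps, sim hours

def plan_first_discharge(net_fc, soc):
    """Allocate limited stored energy to the largest forecast deficits first,
    then return only the first step's discharge (receding horizon: re-plan
    every step, implement only the first decision)."""
    deficits = np.clip(net_fc, 0.0, None)
    alloc, remaining = np.zeros_like(deficits), soc
    for i in np.argsort(-deficits):        # biggest forecast deficits first
        alloc[i] = min(deficits[i], remaining)
        remaining -= alloc[i]
    return alloc[0]

soc, grid_total = 3.0, 0.0
for t in range(STEPS):
    # Forecast of net load (load - renewables) with random error; illustrative
    hours = (t + np.arange(HORIZON)) % 24
    net_fc = 2.0 - 3.0 * np.sin(np.pi * hours / 12.0) + rng.normal(0, 0.3, HORIZON)
    discharge = min(plan_first_discharge(net_fc, soc), max(net_fc[0], 0.0))
    soc -= discharge
    if net_fc[0] < 0.0:                    # surplus hour: recharge storage
        soc = min(CAP, soc - net_fc[0])
    grid_total += max(net_fc[0], 0.0) - discharge
print(f"grid energy purchased over {STEPS} h: {grid_total:.1f} kWh")
```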

  17. Mining nonterrestrial resources: Information needs and research topics

    NASA Technical Reports Server (NTRS)

    Daemen, Jaak J. K.

    1992-01-01

    An outline of topics we need to understand better in order to apply mining technology to a nonterrestrial environment is presented. The proposed list is not intended to be complete. It aims to identify representative topics that suggest productive research. Such research will reduce the uncertainties associated with extrapolating from conventional earthbound practice to nonterrestrial applications. One objective is to propose projects that should put future discussions of nonterrestrial mining on a firmer, less speculative basis.

  18. Science and policy in regulatory decision making: getting the facts right about hazardous air pollutants.

    PubMed Central

    Sexton, K

    1995-01-01

    Hazardous air pollutants are regulated under Title III of the 1990 Clean Air Act Amendments. The Amendments replace the risk-based approach mandated in the 1977 Amendments with a prescriptive, technology-based approach requiring that maximum achievable control technology (MACT) be applied to all major industrial sources of 189 hazardous air pollutants. The change reflects political, rather than scientific, consensus that the public health benefits justify the costs. The choice is put into perspective by looking at the interface between science and policy that occurs as part of regulatory decision making. Particular emphasis is given to examining the interrelationships among facts (science), judgments (science policy), and policy (values) in the context of the risk assessment paradigm. Science and policy are discussed in relation to Title III, contrasting the political consensus for action with the scientific uncertainty about risks and benefits. It is argued that a balanced research program is needed to get the facts right about hazardous air pollutants, including research to meet statutory requirements, to reduce uncertainties in risk assessment, and to address strategic issues. PMID:8549476

  19. OAST Space Theme Workshop. Volume 3: Working group summary. 9: Aerothermodynamics (M-3). A: Statement. B: Technology needs (form 1). C. Priority assessment (form 2). D. Additional assessments

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Twelve aerothermodynamic space technology needs were identified to reduce the design uncertainties in aerodynamic heating and forces experienced by heavy lift launch vehicles, orbit transfer vehicles, and advanced single stage to orbit vehicles for the space transportation system, and for probes, planetary surface landers, and sample return vehicles for solar system exploration vehicles. Research and technology needs identified include: (1) increasing the fluid dynamics capability by at least two orders of magnitude by developing an advanced computer processor for the solution of fluid dynamic problems with improved software; (2) predicting multi-engine base flow fields for launch vehicles; and (3) developing methods to conserve energy in aerothermodynamic ground test facilities.

  20. Assessment of technologies to meet a low carbon fuel standard.

    PubMed

    Yeh, Sonia; Lutsey, Nicholas P; Parker, Nathan C

    2009-09-15

    California's low carbon fuel standard (LCFS) was designed to incentivize a diverse array of available strategies for reducing transportation greenhouse gas (GHG) emissions. It provides strong incentives for fuels with lower GHG emissions, while explicitly requiring a 10% reduction in California's transportation fuel GHG intensity by 2020. This paper investigates the potential for cost-effective GHG reductions from electrification and expanded use of biofuels. The analysis indicates that fuel providers could meet the standard using a portfolio approach that employs both biofuels and electricity, which would reduce the risks and uncertainties associated with the progress of cellulosic and battery technologies, feedstock prices, land availability, and the sustainability of the various compliance approaches. Our analysis is based on the details of California's development of an LCFS; however, this research approach could be generalizable to a national U.S. standard and to similar programs in Europe and Canada.
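
    Compliance with an LCFS of this kind reduces to comparing the energy-weighted average carbon intensity (CI) of the fuel pool against a target set below the baseline. A stylized sketch, with invented CI values and energy shares:

```python
# Stylized low carbon fuel standard check: the energy-weighted average CI of
# the fuel pool must fall 10% below a gasoline-like baseline. All numbers
# below are illustrative, not California's regulatory values.
BASELINE_CI = 96.0                 # g CO2e/MJ
TARGET_CI = 0.90 * BASELINE_CI     # 10% intensity reduction

# fuel: (CI in g CO2e/MJ, energy share of the pool)
pool = {
    "gasoline":           (96.0, 0.80),
    "cellulosic ethanol": (30.0, 0.12),
    "grid electricity":   (40.0, 0.08),
}

avg_ci = sum(ci * share for ci, share in pool.values())
status = "compliant" if avg_ci <= TARGET_CI else "non-compliant"
print(f"pool CI = {avg_ci:.1f} vs target {TARGET_CI:.1f} -> {status}")
```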

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, D.W.

    In previous reports, we have identified two potentially important issues, solutions to which would increase the attractiveness of DOE-developed technologies in commercial building energy systems. One issue concerns the fact that in addition to saving energy, many new technologies offer non-energy benefits that contribute to building productivity (firm profitability). The second issue is that new technologies are typically unproven in the eyes of decision makers and must bear risk premiums that offset cost advantages resulting from laboratory calculations. Even though a compelling case can be made for the importance of these issues, for building decision makers to incorporate them in business decisions and for DOE to use them in R&D program planning, there must be robust empirical evidence of their existence and size. This paper investigates how such measurements could be made and offers recommendations as to preferred options. There is currently little systematic information on either of these concepts in the literature. Of the two, there is somewhat more information on non-energy benefits, but little as regards office buildings. Office building productivity impacts can be observed casually, but must be estimated statistically, because buildings have many interacting attributes and observations based on direct behavior can easily confuse the process of attribution. For example, absenteeism can be easily observed. However, absenteeism may be down because a more healthy space conditioning system was put into place, because the weather was milder, or because firm policy regarding sick days had changed. There is also a general dearth of appropriate information for purposes of estimation. To overcome these difficulties, we propose developing a new database and applying the technique of hedonic price analysis. This technique has been used extensively in the analysis of residential dwellings, and there is also a literature on its application to commercial and industrial buildings. Commercially available databases exist that, if supplemented with engineering surveys of equipment and materials use, could be analyzed statistically with a hedonic price model for the valuation of both the energy-saving and productivity effects of building technologies. Uncertainties about technology performance can cause investors to delay deploying new technologies. This behavior is explained by the "investment under uncertainty" literature, which suggests that under conditions of irrecoverable ("sunk") costs, uncertain outcomes, and the ability to defer deployment, decision makers focus on potential losses and demand risk premiums, the so-called "bad news principle." We describe a series of approaches to isolating buyer perceptions of uncertainty and means for reducing uncertainty.
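
    A hedonic price model of the kind proposed regresses (log) building price on a bundle of attributes, so the coefficient on a technology indicator estimates its implicit price. The sketch below fits such a model to synthetic data; all attributes, coefficients, and the efficiency dummy are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)
N = 400

# Synthetic office-building data for a hedonic model:
# log(price) = b0 + b1*log(size) + b2*age + b3*efficient + noise
size_log  = rng.normal(10.0, 0.5, N)    # log floor area
age       = rng.uniform(0, 40, N)       # building age (years)
efficient = rng.integers(0, 2, N)       # 1 if efficient HVAC/lighting

log_price = (2.0 + 0.9 * size_log - 0.004 * age
             + 0.06 * efficient + rng.normal(0, 0.1, N))

X = np.column_stack([np.ones(N), size_log, age, efficient])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
# Coefficient on the technology dummy ~ implicit price premium (log points)
print(f"estimated premium for efficient systems: {beta[3] * 100:.1f}%")
```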

  2. Primary Atomic Frequency Standards at NIST

    PubMed Central

    Sullivan, D. B.; Bergquist, J. C.; Bollinger, J. J.; Drullinger, R. E.; Itano, W. M.; Jefferts, S. R.; Lee, W. D.; Meekhof, D.; Parker, T. E.; Walls, F. L.; Wineland, D. J.

    2001-01-01

    The development of atomic frequency standards at NIST is discussed and three of the key frequency-standard technologies of the current era are described. For each of these technologies, the most recent NIST implementation of the particular type of standard is described in greater detail. The best relative standard uncertainty achieved to date for a NIST frequency standard is 1.5×10−15. The uncertainties of the most recent NIST standards are displayed relative to the uncertainties of atomic frequency standards of several other countries. PMID:27500017

  3. Uncertainties in estimates of the risks of late effects from space radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.

    2004-01-01

    Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. Published by Elsevier Ltd on behalf of COSPAR.

  4. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  5. Uncertainties in Estimates of the Risks of Late Effects from Space Radiation

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.

    2002-01-01

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, lunar station, deep space outpost, and Mars missions with durations of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.

  6. Uncertainties in Projecting Risks of Late Effects from Space Radiation

    NASA Astrophysics Data System (ADS)

    Cucinotta, F.; Schimmerling, W.; Peterson, L.; Wilson, J.; Saganti, P.; Dicello, J.

    The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, CNS risks, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which cause estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including ISS, lunar station, deep space outpost, and Mars missions with durations of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of the primary factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.

  7. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprised of a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
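
    The sketch below illustrates the residual-sampling idea above: each deterministic sub-model's prediction is perturbed by draws from an empirical residual pool, and the perturbations are propagated through the model chain. The sub-models and residual distributions are placeholders, not the report's fitted models.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 5_000

    # Placeholder empirical residual pools (in practice, from model validation data).
    poa_residuals = rng.normal(0.0, 20.0, size=500)    # W/m^2, POA irradiance model
    eff_residuals = rng.normal(0.0, 10.0, size=500)    # W/m^2, effective-irradiance model
    power_residuals = rng.normal(0.0, 3.0, size=500)   # W, DC power model

    def poa_model(ghi):              # step 1: plane-of-array irradiance
        return 1.1 * ghi
    def effective_irradiance(poa):   # step 2
        return 0.95 * poa
    def dc_power(eff):               # steps 3-4: cell temperature + DC output, collapsed
        return 0.2 * eff

    ghi = 800.0  # measured global horizontal irradiance, W/m^2
    poa = poa_model(ghi) + rng.choice(poa_residuals, N)       # sample residuals at each stage
    eff = effective_irradiance(poa) + rng.choice(eff_residuals, N)
    p_dc = dc_power(eff) + rng.choice(power_residuals, N)

    print(f"DC power: mean {p_dc.mean():.1f} W, std {p_dc.std():.1f} W "
          f"({100*p_dc.std()/p_dc.mean():.1f}% relative)")
    ```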

  8. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fredriksson, Albin, E-mail: albin.fredriksson@raysearchlabs.com; Hårdemark, Björn; Forsgren, Anders

    2015-07-15

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.
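
    One ingredient of the approach above is evaluating the probability that the setup error falls within a given uncertainty set. A minimal sketch, assuming independent Gaussian per-axis setup errors and a box-shaped set (the paper optimizes the set's shape; here we only evaluate the probability for a fixed shape, with illustrative numbers):

    ```python
    import numpy as np
    from scipy.stats import norm

    # Probability that a setup error with independent Gaussian components falls
    # inside a box [-a_i, a_i] on each axis; the probability factorizes per axis.
    def prob_in_box(half_widths, sigmas):
        half_widths = np.asarray(half_widths, dtype=float)
        sigmas = np.asarray(sigmas, dtype=float)
        per_axis = norm.cdf(half_widths / sigmas) - norm.cdf(-half_widths / sigmas)
        return per_axis.prod()

    sigmas = [2.0, 2.0, 3.0]                      # mm, per-axis setup error std (illustrative)
    print(prob_in_box([5.0, 5.0, 5.0], sigmas))   # a priori set
    print(prob_in_box([4.5, 4.5, 5.5], sigmas))   # reshaped set
    ```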

  9. Methodology to Calculate the ACE and HPQ Metrics Used in the Wave Energy Prize

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Driscoll, Frederick R; Weber, Jochem W; Jenne, Dale S

    The U.S. Department of Energy's Wave Energy Prize Competition encouraged the development of innovative deep-water wave energy conversion technologies that at least doubled device performance above the 2014 state of the art. Because levelized cost of energy (LCOE) metrics are challenging to apply equitably to new technologies where significant uncertainty exists in design and operation, the prize technical team developed a reduced metric as proxy for LCOE, which provides an equitable comparison of low technology readiness level wave energy converter (WEC) concepts. The metric is called 'ACE' which is short for the ratio of the average climate capture width to the characteristic capital expenditure. The methodology and application of the ACE metric used to evaluate the performance of the technologies that competed in the Wave Energy Prize are explained in this report.
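
    A small sketch of the ACE ratio as defined above: average climate capture width divided by a characteristic capital expenditure. The per-sea-state weighting and the CCE proxy below are illustrative, not the Prize's exact costing procedure.

    ```python
    # Capture width (m) for one sea state: absorbed power per unit wave resource.
    def capture_width(absorbed_power_kw, wave_resource_kw_per_m):
        return absorbed_power_kw / wave_resource_kw_per_m

    def ace(sea_states, cce_million_usd):
        """sea_states: list of (absorbed kW, resource kW/m, weight) tuples."""
        acw = sum(w * capture_width(p, j) for p, j, w in sea_states)
        return acw / cce_million_usd

    # Illustrative climate: three weighted sea states and a hypothetical CCE.
    states = [(120.0, 30.0, 0.5), (200.0, 40.0, 0.3), (60.0, 25.0, 0.2)]
    print(f"ACE = {ace(states, cce_million_usd=4.0):.2f} m/M$")
    ```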

  10. Technology Overview and Assessment for Small-Scale EDL Systems

    NASA Technical Reports Server (NTRS)

    Heidrich, Casey R.; Smith, Brandon P.; Braun, Robert D.

    2016-01-01

    Motivated by missions to land large rovers and humans at Mars and other bodies, high-mass EDL technologies are a prevalent trend in the research community. In contrast, EDL systems for low-mass payloads have attracted less attention. Significant potential in science and discovery exists in small-scale EDL systems. Payloads acting secondary to a flagship mission are a currently under-utilized resource. Before taking advantage of these opportunities, further development of scaled EDL technologies is required. The key limitations identified in this study are compact decelerators and deformable impact systems. Current technologies may enable rough landing of small payloads, with moderate restrictions in packaging volume. Utilization of passive descent and landing stages will greatly increase the applicability of small systems, allowing for vehicles robust to entry environment uncertainties. These architectures will provide an efficient means of achieving science and support objectives while reducing cost and risk margins of a parent mission.

  11. Reliability Impacts in Life Support Architecture and Technology Selection

    NASA Technical Reports Server (NTRS)

    Lange, Kevin E.; Anderson, Molly S.

    2012-01-01

    Quantitative assessments of system reliability and equivalent system mass (ESM) were made for different life support architectures based primarily on International Space Station technologies. The analysis was applied to a one-year deep-space mission. System reliability was increased by adding redundancy and spares, which added to the ESM. Results were thus obtained allowing a comparison of the ESM for each architecture at equivalent levels of reliability. Although the analysis contains numerous simplifications and uncertainties, the results suggest that achieving necessary reliabilities for deep-space missions will add substantially to the life support ESM and could influence the optimal degree of life support closure. Approaches for reducing reliability impacts were investigated and are discussed.
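
    The redundancy/mass trade described above can be sketched in a few lines: adding identical parallel spares raises system reliability but adds equivalent system mass. A minimal illustration, assuming any one surviving unit suffices; the component reliability and mass values are hypothetical, not ISS data.

    ```python
    # Probability that at least one of n identical, independent parallel units
    # survives the mission, given per-unit reliability r_unit.
    def reliability_with_spares(r_unit, n_units):
        return 1.0 - (1.0 - r_unit) ** n_units

    def esm(n_units, unit_mass_kg):
        return n_units * unit_mass_kg

    r_unit, mass = 0.90, 150.0  # illustrative one-year reliability and ESM of one assembly
    for n in range(1, 5):
        print(f"{n} unit(s): R = {reliability_with_spares(r_unit, n):.4f}, "
              f"ESM = {esm(n, mass):.0f} kg")
    ```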

  12. A comparison of advanced overlay technologies

    NASA Astrophysics Data System (ADS)

    Dasari, Prasad; Smith, Nigel; Goelzer, Gary; Liu, Zhuan; Li, Jie; Tan, Asher; Koh, Chin Hwee

    2010-03-01

    The extension of optical lithography to 22nm and beyond by Double Patterning Technology is often challenged by CDU and overlay control. With reduced overlay measurement error budgets in the sub-nm range, relying on traditional Total Measurement Uncertainty (TMU) estimates alone is no longer sufficient. In this paper we will report scatterometry overlay measurement data from a set of twelve test wafers, using four different target designs. The TMU of these measurements is under 0.4nm, within the process control requirements for the 22nm node. Comparing the measurement differences between DBO targets (using empirical and model based analysis) and with image-based overlay data indicates the presence of systematic and random measurement errors that exceed the TMU estimate.

  13. Application of identified sensitive physical parameters in reducing the uncertainty of numerical simulation

    NASA Astrophysics Data System (ADS)

    Sun, Guodong; Mu, Mu

    2016-04-01

    An important source of uncertainty, which then causes further uncertainty in numerical simulations, resides in the parameters describing physical processes in numerical models. There are many physical parameters in numerical models in the atmospheric and oceanic sciences, and it would cost a great deal to reduce the uncertainties in all of them. Therefore, finding a subset of relatively more sensitive and important parameters, and reducing the errors in those parameters, is a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of relatively more sensitive and important physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify relatively more sensitive and important physical parameters but also shows that it is viable to then apply "target observations" to reduce the uncertainties in model parameters.
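
    In the spirit of CNOP-P, the sketch below searches, within a box constraint, for the parameter perturbation that maximizes the departure of a model's output from its reference simulation. The toy model, cost function, and bounds are illustrative stand-ins, not the LPJ setup.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    p_ref = np.array([1.0, 0.5, 2.0])           # reference physical parameters

    def model_output(p):                        # toy nonlinear model response
        return np.sin(p[0]) * p[1] ** 2 + np.exp(0.3 * p[2])

    def neg_departure(dp):                      # maximize departure => minimize its negative
        return -abs(model_output(p_ref + dp) - model_output(p_ref))

    bounds = [(-0.1, 0.1)] * 3                  # allowed perturbation of each parameter
    res = minimize(neg_departure, x0=np.full(3, 0.01), bounds=bounds)
    print("optimal perturbation:", np.round(res.x, 3), "departure:", -res.fun)
    ```

    Parameters whose perturbations dominate the optimal departure are candidates for the "relatively more sensitive and important" subset.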

  14. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  15. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  16. Uncertainty analysis of integrated gasification combined cycle systems based on Frame 7H versus 7F gas turbines.

    PubMed

    Zhu, Yunhua; Frey, H Christopher

    2006-12-01

    Integrated gasification combined cycle (IGCC) technology is a promising alternative for clean generation of power and coproduction of chemicals from coal and other feedstocks. Advanced concepts for IGCC systems that incorporate state-of-the-art gas turbine systems, however, are not commercially demonstrated. Therefore, there is uncertainty regarding the future commercial-scale performance, emissions, and cost of such technologies. The Frame 7F gas turbine represents current state-of-practice, whereas the Frame 7H is the most recently introduced advanced commercial gas turbine. The objective of this study was to evaluate the risks and potential payoffs of IGCC technology based on different gas turbine combined cycle designs. Models of entrained-flow gasifier-based IGCC systems with Frame 7F (IGCC-7F) and 7H gas turbine combined cycles (IGCC-7H) were developed in ASPEN Plus. An uncertainty analysis was conducted. Gasifier carbon conversion and project cost uncertainty are identified as the most important uncertain inputs with respect to system performance and cost. The uncertainties in the difference of the efficiencies and costs for the two systems are characterized. Despite uncertainty, the IGCC-7H system is robustly preferred to the IGCC-7F system. Advances in gas turbine design will improve the performance, emissions, and cost of IGCC systems. The implications of this study for decision-making regarding technology selection, research planning, and plant operation are discussed.

  17. Tower-Based Greenhouse Gas Measurement Network Design---The National Institute of Standards and Technology North East Corridor Testbed.

    PubMed

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the 3rd of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurements capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model with the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm, based on a k-means clustering method, was applied to minimize the similarities between the temporal response of each site and maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion Bayesian Kalman filter was used to evaluate observing system performance. We present the performances of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly spatially compact network has decreased spatial coverage, as the spatial information added per site is then suboptimal for covering the largest possible area, whilst networks dispersed too broadly lose the capability of constraining flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. On the other hand, the drift is a bias in nature, which is added to the observations and therefore biases the retrieved fluxes.

  18. Tower-based greenhouse gas measurement network design—The National Institute of Standards and Technology North East Corridor Testbed

    NASA Astrophysics Data System (ADS)

    Lopez-Coto, Israel; Ghosh, Subhomoy; Prasad, Kuldeep; Whetstone, James

    2017-09-01

    The North-East Corridor (NEC) Testbed project is the 3rd of three NIST (National Institute of Standards and Technology) greenhouse gas emissions testbeds designed to advance greenhouse gas measurements capabilities. A design approach for a dense observing network combined with atmospheric inversion methodologies is described. The Advanced Research Weather Research and Forecasting Model with the Stochastic Time-Inverted Lagrangian Transport model were used to derive the sensitivity of hypothetical observations to surface greenhouse gas emissions (footprints). Unlike other network design algorithms, an iterative selection algorithm, based on a k-means clustering method, was applied to minimize the similarities between the temporal response of each site and maximize sensitivity to the urban emissions contribution. Once a network was selected, a synthetic inversion Bayesian Kalman filter was used to evaluate observing system performance. We present the performances of various measurement network configurations consisting of differing numbers of towers and tower locations. Results show that an overly spatially compact network has decreased spatial coverage, as the spatial information added per site is then suboptimal for covering the largest possible area, whilst networks dispersed too broadly lose the capability of constraining flux uncertainties. In addition, we explore the possibility of using a very high density network of lower-cost, lower-performance sensors characterized by larger uncertainties and temporal drift. Analysis convergence is faster with a large number of observing locations, reducing the response time of the filter. Larger uncertainties in the observations imply lower values of uncertainty reduction. On the other hand, the drift is a bias in nature, which is added to the observations and therefore biases the retrieved fluxes.
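
    A hedged sketch of the clustering idea above: candidate tower sites are described by the time series of their sensitivity to urban emissions (footprints), k-means groups sites with similar temporal responses, and one site per cluster is retained. The synthetic data and the per-cluster selection heuristic are illustrative, not the authors' exact algorithm.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)

    n_sites, n_hours = 40, 500
    footprints = rng.random((n_sites, n_hours))   # synthetic footprint time series
    urban_sensitivity = footprints.sum(axis=1)    # proxy for sensitivity to urban emissions

    k = 8                                         # number of towers we can afford
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(footprints)

    # Keep, from each cluster, the site most sensitive to the urban domain.
    network = [int(np.arange(n_sites)[labels == c][np.argmax(urban_sensitivity[labels == c])])
               for c in range(k)]
    print("selected sites:", sorted(network))
    ```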

  19. Digital Actuator Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ken Thomas; Ted Quinn; Jerry Mauck

    There are significant developments underway in new types of actuators for power plant active components. Many of these make use of digital technology to provide a wide array of benefits in performance of the actuators and in reduced burden to maintain them. These new product offerings have gained considerable acceptance in use in process plants. In addition, they have been used in conventional power generation very successfully. This technology has been proven to deliver the benefits promised and substantiate the claims of improved performance. The nuclear industry has been reluctant to incorporate digital actuator technology into nuclear plant designs due to a number of concerns. These could be summarized as cost, regulatory uncertainty, and a certain comfort factor with legacy analog technology. The replacement opportunity for these types of components represents a decision point for whether to invest in more modern technology that would provide superior operational and maintenance benefits. Yet, the application of digital technology has been problematic for the nuclear industry, due to qualification and regulatory issues. With some notable exceptions, the result has been a continuing reluctance to undertake the risks and uncertainties of implementing digital actuator technology when replacement opportunities present themselves. Rather, utilities would typically prefer to accept the performance limitations of the legacy analog actuator technologies to avoid impacts to project costs and schedules. The purpose of this report is to demonstrate that the benefits of digital actuator technology can be significant in terms of plant performance and that it is worthwhile to address the barriers currently holding back the widespread development and use of this technology. It addresses two important objectives in pursuit of the beneficial use of digital actuator technology for nuclear power plants: 1. To demonstrate the benefits of digital actuator technology over legacy analog actuator technology in both quantitative and qualitative ways. 2. To recognize and address the added difficulty of digital technology qualification, especially in regard to software common cause failure (SCCF), that is introduced by the use of digital actuator technology.

  20. Analysis of Connected and Automated Vehicle Technologies Highlights

    Science.gov Websites

    Uncertainty in Potential Effects on Fuel Use, Miles Traveled. December 13, 2016. A joint study from the U.S. Department of

  1. Accepting uncertainty, assessing risk: decision quality in managing wildfire, forest resource values, and new technology

    Treesearch

    Jeffrey G. Borchers

    2005-01-01

    The risks, uncertainties, and social conflicts surrounding uncharacteristic wildfire and forest resource values have defied conventional approaches to planning and decision-making. Paradoxically, the adoption of technological innovations such as risk assessment, decision analysis, and landscape simulation models by land management organizations has been limited. The...

  2. US Efforts in Support of Examinations at Fukushima Daiichi – 2016 Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amway, P.; Andrews, N.; Bixby, Willis

    Although it is clear that the accident signatures from each unit at the Fukushima Daiichi Nuclear Power Station (NPS) [Daiichi] differ, much is not known about the end-state of core materials within these units. Some of this uncertainty can be attributed to a lack of information related to cooling system operation and cooling water injection. There is also uncertainty in our understanding of phenomena affecting: a) in-vessel core damage progression during severe accidents in boiling water reactors (BWRs), and b) accident progression after vessel failure (ex-vessel progression) for BWRs and Pressurized Water Reactors (PWRs). These uncertainties arise due to limited full-scale prototypic data. Similar to what occurred after the accident at Three Mile Island Unit 2, these Daiichi units offer the international community a means to reduce such uncertainties by obtaining prototypic data from multiple full-scale BWR severe accidents. Information obtained from Daiichi is required to inform Decontamination and Decommissioning activities, improving the ability of the Tokyo Electric Power Company Holdings (TEPCO) to characterize potential hazards and to ensure the safety of workers involved with cleanup activities. This document reports recent results from the US Forensics Effort to use information obtained by TEPCO to enhance the safety of existing and future nuclear power plant designs. This Forensics Effort, which is sponsored by the Reactor Safety Technologies Pathway of the Department of Energy Office of Nuclear Energy Light Water Reactor (LWR) Sustainability Program, consists of a group of US experts in LWR safety and plant operations that have identified examination needs and are evaluating TEPCO information from Daiichi that address these needs. Examples presented in this report demonstrate that significant safety insights are being obtained in the areas of component performance, fission product release and transport, debris end-state location, and combustible gas generation and transport. In addition to reducing uncertainties related to severe accident modeling progression, these insights are being used to update guidance for severe accident prevention, mitigation, and emergency planning. Furthermore, reduced uncertainties in modeling the events at Daiichi will improve the realism of reactor safety evaluations and inform future D&D activities by improving the capability for characterizing potential hazards to workers involved with cleanup activities.

  3. Global reverse supply chain design for solid waste recycling under uncertainties and carbon emission constraint.

    PubMed

    Xu, Zhitao; Elomri, Adel; Pokharel, Shaligram; Zhang, Qin; Ming, X G; Liu, Wenjie

    2017-06-01

    The emergence of concerns over environmental protection, resource conservation as well as the development of logistics operations and manufacturing technology has led several countries to implement formal collection and recycling systems of solid waste. Such a recycling system has the benefits of reducing environmental pollution, boosting the economy by creating new jobs, and generating income from trading the recyclable materials. This leads to the formation of a global reverse supply chain (GRSC) of solid waste. In this paper, we investigate the design of such a GRSC with a special emphasis on three aspects; (1) uncertainty of waste collection levels, (2) associated carbon emissions, and (3) challenges posed by the supply chain's global aspect, particularly the maritime transportation costs and currency exchange rates. To the best of our knowledge, this paper is the first attempt to integrate the three above-mentioned important aspects in the design of a GRSC. We have used mixed integer-linear programming method along with robust optimization to develop the model which is validated using a sample case study of e-waste management. Our results show that, by using a robust model that takes into account the complex interactions characterizing global reverse supply chain networks, we can create a better GRSC. The effect of uncertainties and carbon constraints on decisions to reduce costs and emissions are also shown. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    NASA Technical Reports Server (NTRS)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
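
    The core SROM idea above can be sketched compactly: a random input is replaced by a few optimized sample points with probability weights, so the stochastic objective becomes a deterministic weighted sum of independent model evaluations. The sample points, weights, and the stand-in model below are illustrative.

    ```python
    import numpy as np

    # SROM representation of a random load: a few samples with probability weights.
    srom_loads = np.array([0.8, 1.0, 1.3])      # sample points (illustrative)
    srom_weights = np.array([0.3, 0.5, 0.2])    # optimized probabilities, sum to 1

    def compliance(design, load):               # stand-in for a deterministic FE model call
        return load / design.sum()

    # Expected compliance becomes a deterministic weighted sum, so any existing
    # deterministic optimizer can drive the uncertainty aware problem.
    def expected_compliance(design):
        return sum(w * compliance(design, s) for s, w in zip(srom_loads, srom_weights))

    design = np.array([0.4, 0.6])
    print(f"E[compliance] = {expected_compliance(design):.3f}")
    ```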

  5. Photomask applications of traceable atomic force microscope dimensional metrology at NIST

    NASA Astrophysics Data System (ADS)

    Dixson, Ronald; Orji, Ndubuisi G.; Potzick, James; Fu, Joseph; Allen, Richard A.; Cresswell, Michael; Smith, Stewart; Walton, Anthony J.; Tsiamis, Andreas

    2007-10-01

    The National Institute of Standards and Technology (NIST) has a multifaceted program in atomic force microscope (AFM) dimensional metrology. Three major instruments are being used for traceable measurements. The first is a custom in-house metrology AFM, called the calibrated AFM (C-AFM), the second is the first generation of commercially available critical dimension AFM (CD-AFM), and the third is a current generation CD-AFM at SEMATECH - for which NIST has established the calibration and uncertainties. All of these instruments have useful applications in photomask metrology. Linewidth reference metrology is an important application of CD-AFM. We have performed a preliminary comparison of linewidths measured by CD-AFM and by electrical resistance metrology on a binary mask. For the ten selected test structures with on-mask linewidths between 350 nm and 600 nm, most of the observed differences were less than 5 nm, and all of them were less than 10 nm. The offsets were often within the estimated uncertainties of the AFM measurements, without accounting for the effect of linewidth roughness or the uncertainties of electrical measurements. The most recent release of the NIST photomask standard - which is Standard Reference Material (SRM) 2059 - was also supported by CD-AFM reference measurements. We review the recent advances in AFM linewidth metrology that will reduce the uncertainty of AFM measurements on this and future generations of the NIST photomask standard. The NIST C-AFM has displacement metrology for all three axes traceable to the 633 nm wavelength of the iodine-stabilized He-Ne laser. One of the important applications of the C-AFM is step height metrology, which has some relevance to phase shift calibration. In the current generation of the system, the approximate level of relative standard uncertainty for step height measurements at the 100 nm scale is 0.1 %. We discuss the monitor history of a 290 nm step height, originally measured on the C-AFM with a 1.9 nm (k = 2) expanded uncertainty, and describe advances that bring the step height uncertainty of recent measurements to an estimated 0.6 nm (k = 2). Based on this work, we expect to be able to reduce the topographic component of phase uncertainty in alternating aperture phase shift masks (AAPSM) by a factor of three compared to current calibrations based on earlier generation step height references.
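
    The expanded uncertainties quoted above (k = 2) follow the standard practice of combining independent standard-uncertainty components in quadrature and applying a coverage factor. A small sketch with illustrative component values, not the NIST budget:

    ```python
    import math

    # Combine independent standard-uncertainty components in quadrature, then
    # apply coverage factor k = 2 for the expanded uncertainty.
    components_nm = [0.15, 0.20, 0.10]  # e.g., scale calibration, repeatability, tip effects
    u_c = math.sqrt(sum(u**2 for u in components_nm))
    U = 2.0 * u_c
    print(f"u_c = {u_c:.2f} nm, U(k=2) = {U:.2f} nm")
    ```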

  6. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examines their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the engineering design and prototype iterative cycle, thereby dramatically reducing cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the "cloud," these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  7. New Assignment of Mass Values and Uncertainties to NIST Working Standards

    PubMed Central

    Davis, Richard S.

    1990-01-01

    For some time it had been suspected that values assigned to NIST working standards of mass were some 0.17 mg/kg larger than mass values based on artifacts representing mass in the International System of Units (SI). This relatively small offset, now confirmed, has had minimal scientific or technological significance. The discrepancy was removed on January 1, 1990. We document the history of the discrepancy, the studies which allow its removal, and the methods in place to limit its effect and prevent its recurrence. For routine calibrations, we believe that our working standards now have a long-term stability of 0.033 mg/kg (3σ) with respect to the national prototype kilograms of the United States. We provisionally admit an additional uncertainty of 0.09 mg/kg (3σ), systematic to all NIST mass measurements, which represents the possible offset of our primary standards from standards maintained by the Bureau International des Poids et Mesures (BIPM). This systematic uncertainty may be significantly reduced after analysis of results from the 3rd verification of national prototype kilograms, which is now underway.

  8. The valuation of water quality: effects of mixing different drinking water qualities.

    PubMed

    Rygaard, Martin; Arvin, Erik; Binning, Philip J

    2009-03-01

    As water supplies increasingly turn to desalination technologies, it becomes relevant to consider the options for remineralization and blending with mineral rich water resources. We present a method for analyzing economic consequences due to changes in drinking water mineral content. Included impacts are cardiovascular diseases, dental caries, atopic eczema, lifetime of dish and clothes washing machines, heat exchangers, distribution systems, bottled water consumption and soap usage. The method includes an uncertainty assessment that ranks the impacts having the highest influence on the result and associated uncertainty. Effects are calculated for a scenario where 50% of Copenhagen's water supply is substituted by desalinated water. Without remineralization the total impact is expected to be negative (€ -0.44 ± 0.2/m³) and individual impacts expected in the range of € 0.01-0.51/m³ delivered water. Health impacts have the highest contribution to impact size and uncertainty. With remineralization it is possible to reduce several negative impacts and the total impact is expected to be positive (€ 0.14 ± 0.08/m³).

  9. Markov Task Network: A Framework for Service Composition under Uncertainty in Cyber-Physical Systems.

    PubMed

    Mohammed, Abdul-Wahid; Xu, Yang; Hu, Haixiao; Agyemang, Brighter

    2016-09-21

    In novel collaborative systems, cooperative entities combine services to achieve local and global objectives. With the growing pervasiveness of cyber-physical systems, however, such collaboration is hampered by differences in the operations of the cyber and physical objects, and the dynamic formation of collaborative functionality from high-level system goals has become a practical need. In this paper, we propose a cross-layer automation and management model for cyber-physical systems. This models the dynamic formation of collaborative services pursuing laid-down system goals as an ontology-oriented hierarchical task network. Ontological intelligence provides the semantic technology of this model, and through semantic reasoning, primitive tasks can be dynamically composed from high-level system goals. In dealing with uncertainty, we further propose a novel bridge between hierarchical task networks and Markov logic networks, called the Markov task network. This leverages the efficient inference algorithms of Markov logic networks to reduce both computational and inferential loads in task decomposition. From the results of our experiments, high-precision service composition under uncertainty can be achieved using this approach.

  10. Benefits of on-wafer calibration standards fabricated in membrane technology

    NASA Astrophysics Data System (ADS)

    Rohland, M.; Arz, U.; Büttgenbach, S.

    2011-07-01

    In this work we compare on-wafer calibration standards fabricated in membrane technology with standards built in conventional thin-film technology. We perform this comparison by investigating the propagation of uncertainties in the geometry and material properties to the broadband electrical properties of the standards. For coplanar waveguides used as line standards the analysis based on Monte Carlo simulations demonstrates an up to tenfold reduction in uncertainty depending on the electromagnetic waveguide property we look at.

  11. Reducing Risk in CO2 Sequestration: A Framework for Integrated Monitoring of Basin Scale Injection

    NASA Astrophysics Data System (ADS)

    Seto, C. J.; Haidari, A. S.; McRae, G. J.

    2009-12-01

    Geological sequestration of CO2 is an option for stabilization of atmospheric CO2 concentrations. Technical ability to safely store CO2 in the subsurface has been demonstrated through pilot projects and a long history of enhanced oil recovery and acid gas disposal operations. To address climate change, current injection operations must be scaled up by a factor of 100, raising issues of safety and security. Monitoring and verification is an essential component in ensuring safe operations and managing risk. Monitoring provides assurance that CO2 is securely stored in the subsurface, and the mechanisms governing transport and storage are well understood. It also provides an early warning mechanism for identification of anomalies in performance, and a means for intervention and remediation through the ability to locate the CO2. Through theoretical studies, bench scale experiments and pilot tests, a number of technologies have demonstrated their ability to monitor CO2 in the surface and subsurface. Because the focus of these studies has been to demonstrate feasibility, individual techniques have not been integrated to provide a more robust method for monitoring. Considering the large volumes required for injection, size of the potential footprint, length of time a project must be monitored and uncertainty, operational considerations of cost and risk must balance safety and security. Integration of multiple monitoring techniques will reduce uncertainty in monitoring injected CO2, thereby reducing risk. We present a framework for risk management of large scale injection through model based monitoring network design. This framework is applied to monitoring CO2 in a synthetic reservoir where there is uncertainty in the underlying permeability field controlling fluid migration. Deformation and seismic data are used to track plume migration. A modified Ensemble Kalman filter approach is used to estimate flow properties by jointly assimilating flow and geomechanical observations. Issues of risk, cost and uncertainty are considered.
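
    The abstract describes a modified Ensemble Kalman filter for joint assimilation; the sketch below shows a textbook perturbed-observation EnKF analysis step, not the authors' modified scheme. Dimensions, the linear observation operator, and all values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    n_ens, n_state, n_obs = 50, 10, 3
    X = rng.normal(1.0, 0.2, (n_state, n_ens))   # ensemble of flow-property parameters
    H = rng.random((n_obs, n_state))             # observation operator (illustrative)
    R = 0.05 * np.eye(n_obs)                     # observation error covariance
    y = H @ np.ones(n_state) + rng.normal(0, 0.05, n_obs)  # synthetic observations

    Xm = X.mean(axis=1, keepdims=True)
    A = X - Xm                                   # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                    # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R) # Kalman gain

    # Perturbed-observation update of each ensemble member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    X_analysis = X + K @ (Y - H @ X)
    print("prior spread    :", X.std(axis=1).mean())
    print("posterior spread:", X_analysis.std(axis=1).mean())
    ```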

  12. A Summary of Lightpipe Radiation Thermometry Research at NIST

    PubMed Central

    Tsai, Benjamin K.

    2006-01-01

    During the last 10 years, research in light-pipe radiation thermometry has significantly reduced the uncertainties for temperature measurements in semiconductor processing. The National Institute of Standards and Technology (NIST) has improved the calibration of lightpipe radiation thermometers (LPRTs), the characterization procedures for LPRTs, the in situ calibration of LPRTs using thin-film thermocouple (TFTC) test wafers, and the application of model-based corrections to improve LPRT spectral radiance temperatures. Collaboration with industry on implementing techniques and ideas established at NIST has led to improvements in temperature measurements in semiconductor processing. LPRTs have been successfully calibrated at NIST for rapid thermal processing (RTP) applications using a sodium heat-pipe blackbody between 700 °C and 900 °C with an uncertainty of about 0.3 °C (k = 1) traceable to the International Temperature Scale of 1990. Employing appropriate effective emissivity models, LPRTs have been used to determine the wafer temperature in the NIST RTP Test Bed with an uncertainty of 3.5 °C. Using a TFTC wafer for calibration, the LPRT can measure the wafer temperature in the NIST RTP Test Bed with an uncertainty of 2.3 °C. Collaborations with industry in characterizing and calibrating LPRTs will be summarized, and future directions for LPRT research will be discussed.

  13. A Summary of Lightpipe Radiation Thermometry Research at NIST.

    PubMed

    Tsai, Benjamin K

    2006-01-01

    During the last 10 years, research in light-pipe radiation thermometry has significantly reduced the uncertainties for temperature measurements in semiconductor processing. The National Institute of Standards and Technology (NIST) has improved the calibration of lightpipe radiation thermometers (LPRTs), the characterization procedures for LPRTs, the in situ calibration of LPRTs using thin-film thermocouple (TFTC) test wafers, and the application of model-based corrections to improve LPRT spectral radiance temperatures. Collaboration with industry on implementing techniques and ideas established at NIST has led to improvements in temperature measurements in semiconductor processing. LPRTs have been successfully calibrated at NIST for rapid thermal processing (RTP) applications using a sodium heat-pipe blackbody between 700 °C and 900 °C with an uncertainty of about 0.3 °C (k = 1) traceable to the International Temperature Scale of 1990. Employing appropriate effective emissivity models, LPRTs have been used to determine the wafer temperature in the NIST RTP Test Bed with an uncertainty of 3.5 °C. Using a TFTC wafer for calibration, the LPRT can measure the wafer temperature in the NIST RTP Test Bed with an uncertainty of 2.3 °C. Collaborations with industry in characterizing and calibrating LPRTs will be summarized, and future directions for LPRT research will be discussed.

  14. Modelling adaptation to climate change of Ecuadorian agriculture and associated water resources: uncertainties in coastal and highland cropping systems

    NASA Astrophysics Data System (ADS)

    Ruiz-Ramos, Margarita; Bastidas, Wellington; Cóndor, Amparo; Villacís, Marcos; Calderón, Marco; Herrera, Mario; Zambrano, José Luis; Lizaso, Jon; Hernández, Carlos; Rodríguez, Alfredo; Capa-Morocho, Mirian

    2016-04-01

    Climate change threatens the sustainability of farms and associated water resources in Ecuador. Although the last IPCC report (AR5) provides a general framework for adaptation, impact assessment and especially adaptation analysis should be site-specific, taking into account both biophysical and social aspects. The objective of this study is to analyse climate change impacts and sustainable adaptations that optimize crop yield. A further aim is to weave together agronomical and hydrometeorological aspects, to improve the modelling of the coastal ("costa") and highland ("sierra") cropping systems in Ecuador from the agricultural production and water resources points of view. The final aim is to support decision makers at national and local institutions in the technological implementation of structural adaptation strategies, and to support farmers in autonomous adaptation actions that cope with climate change impacts and allow equal access to resources and appropriate technologies. A diagnosis of the current situation in terms of data availability and reliability was previously done, and the main sources of uncertainty for agricultural projections have been identified: weather data, especially precipitation projections; soil data below the upper 30 cm; and an equivalent experimental protocol for ecophysiological crop field measurements. For reducing these uncertainties, several methodologies are being discussed. This study was funded by the PROMETEO program from Ecuador through SENESCYT (M. Ruiz-Ramos contract), and by the project COOP-XV-25 funded by Universidad Politécnica de Madrid.

  15. Financial options methodology for analyzing investments in new technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wenning, B.D.

    1994-12-31

    The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options valuation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.

  16. Financial options methodology for analyzing investments in new technology

    NASA Technical Reports Server (NTRS)

    Wenning, B. D.

    1995-01-01

    The evaluation of investments in longer term research and development in emerging technologies, because of the nature of such subjects, must address inherent uncertainties. Most notably, future cash flow forecasts include substantial uncertainties. Conventional present value methodology, when applied to emerging technologies, severely penalizes cash flow forecasts, and strategic investment opportunities are at risk of being neglected. Use of options evaluation methodology adapted from the financial arena has been introduced as having applicability in such technology evaluations. Indeed, characteristics of superconducting magnetic energy storage technology suggest that it is a candidate for the use of options methodology when investment decisions are being contemplated.
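
    A standard financial-options technique that can be adapted as described above is a binomial (Cox-Ross-Rubinstein) lattice: value the option to invest in a technology whose project value V is uncertain, with investment cost K. This is a generic textbook sketch with illustrative numbers, not the author's specific valuation.

    ```python
    import math

    def binomial_option_value(V, K, sigma, r, T, steps=200):
        """Value of an option to invest: payoff max(V_T - K, 0) at horizon T."""
        dt = T / steps
        u = math.exp(sigma * math.sqrt(dt))    # up factor per step
        d = 1.0 / u                            # down factor
        p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral probability
        disc = math.exp(-r * dt)
        # Terminal payoffs, then backward induction to the present.
        values = [max(V * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
        for _ in range(steps):
            values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                      for j in range(len(values) - 1)]
        return values[0]

    # A project with negative NPV (V < K) can still carry positive option value
    # when cash flows are volatile, which is the point made in the abstract.
    print(f"option value: {binomial_option_value(V=100, K=120, sigma=0.4, r=0.05, T=5):.2f}")
    ```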

  17. Review of health and productivity gains from better IEQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, William J.

    2000-08-01

    The available scientific data suggest that existing technologies and procedures can improve indoor environmental quality (IEQ) in a manner that significantly increases productivity and health. While there is considerable uncertainty in the estimates of the magnitudes of productivity gains that may be obtained, the projected gains are very large. For the U.S., the estimated potential annual savings and productivity gains are $6 to $14 billion from reduced respiratory disease, $2 to $4 billion from reduced allergies and asthma, $10 to $30 billion from reduced sick building syndrome symptoms, and $20 to $160 billion from direct improvements in worker performance that are unrelated to health. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  18. Accounting for uncertainty in DNA sequencing data.

    PubMed

    O'Rawe, Jason A; Ferson, Scott; Lyon, Gholson J

    2015-02-01

    Science is defined in part by an honest exposition of the uncertainties that arise in measurements and propagate through calculations and inferences, so that the reliabilities of its conclusions are made apparent. The recent rapid development of high-throughput DNA sequencing technologies has dramatically increased the number of measurements made at the biochemical and molecular level. These data come from many different DNA-sequencing technologies, each with its own platform-specific errors and biases, which vary widely. Several statistical studies have tried to measure error rates for basic determinations, but there are no general schemes to project these uncertainties so as to assess the surety of the conclusions drawn about genetic, epigenetic, and more general biological questions. We review here the state of uncertainty quantification in DNA sequencing applications, describe sources of error, and propose methods that can be used for accounting and propagating these errors and their uncertainties through subsequent calculations. Copyright © 2014 Elsevier Ltd. All rights reserved.
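
    A concrete example of per-measurement uncertainty in this setting is the Phred quality convention, where a base call with quality Q has error probability p = 10^(-Q/10). The sketch below converts qualities to error probabilities and, assuming independent base calls (a simplifying assumption), the chance a short read is error-free; the quality values are illustrative.

    ```python
    import math

    def phred_to_error_prob(q):
        """Phred quality Q encodes error probability p = 10**(-Q/10)."""
        return 10.0 ** (-q / 10.0)

    def read_error_free_prob(qualities):
        # Assumes independent base-call errors, which real platforms violate.
        return math.prod(1.0 - phred_to_error_prob(q) for q in qualities)

    quals = [30, 30, 20, 35, 30]  # Q30 corresponds to ~1 error in 1000 calls
    print([round(phred_to_error_prob(q), 4) for q in quals])
    print(f"P(read error-free) = {read_error_free_prob(quals):.4f}")
    ```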

  19. Designing for Uncertainty: Three Approaches

    ERIC Educational Resources Information Center

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  20. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    PubMed

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  1. Number-phase minimum-uncertainty state with reduced number uncertainty in a Kerr nonlinear interferometer

    NASA Astrophysics Data System (ADS)

    Kitagawa, M.; Yamamoto, Y.

    1987-11-01

    An alternative scheme for generating amplitude-squeezed states of photons based on unitary evolution which can properly be described by quantum mechanics is presented. This scheme is a nonlinear Mach-Zehnder interferometer containing an optical Kerr medium. The quasi-probability density (QPD) and photon-number distribution of the output field are calculated, and it is demonstrated that the reduced photon-number uncertainty and enhanced phase uncertainty maintain the minimum-uncertainty product. A self-phase-modulation of the single-mode quantized field in the Kerr medium is described based on localized operators. The spatial evolution of the state is demonstrated by QPD in the Schroedinger picture. It is shown that photon-number variance can be reduced to a level far below the limit for an ordinary squeezed state, and that the state prepared using this scheme remains a number-phase minimum-uncertainty state until the maximum reduction of number fluctuations is surpassed.
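
    The number-phase bookkeeping behind the result above can be written schematically with the standard relations; the Kerr redistribution factor r below is illustrative of how number uncertainty is traded against phase uncertainty while the minimum product is preserved.

    ```latex
    \[
      \Delta n \,\Delta\phi \;\ge\; \tfrac{1}{2},
      \qquad
      \text{coherent state: } \Delta n = \sqrt{\bar n},\quad
      \Delta\phi = \frac{1}{2\sqrt{\bar n}} .
    \]
    \[
      \text{Kerr output (schematic): } \Delta n \to \frac{\sqrt{\bar n}}{r},\quad
      \Delta\phi \to \frac{r}{2\sqrt{\bar n}} \quad (r > 1),
      \qquad
      \Delta n \,\Delta\phi = \tfrac{1}{2}\ \text{(minimum preserved)}.
    \]
    ```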

  2. Precision Departure Release Capability (PDRC) Concept of Operations

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn; Capps, Richard A.; Day, Kevin Brian

    2013-01-01

    After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations, release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision, enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Concept of Operations. Companion papers include the Final Report and a Technology Description.

  3. Reducing uncertainty about objective functions in adaptive management

    USGS Publications Warehouse

    Williams, B.K.

    2012-01-01

    This paper extends the uncertainty framework of adaptive management to include uncertainty about the objectives to be used in guiding decisions. Adaptive decision making typically assumes explicit and agreed-upon objectives for management, but allows for uncertainty as to the structure of the decision process that generates change through time. Yet it is not unusual for there to be uncertainty (or disagreement) about objectives, with different stakeholders expressing different views not only about resource responses to management but also about the appropriate management objectives. In this paper I extend the treatment of uncertainty in adaptive management, describe a stochastic structure for the joint occurrence of uncertainty about objectives as well as models, and show how adaptive decision making and the assessment of post-decision monitoring data can be used to reduce uncertainties of both kinds. Different degrees of association between model and objective uncertainty lead to different patterns of learning about objectives. © 2011.
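
    The joint updating described here can be reduced to a toy Bayes step. In the sketch below, the prior over (model, objective) pairs and the likelihoods are hypothetical; a likelihood that does not factor across rows and columns is what encodes association between model and objective uncertainty.

    ```python
    import numpy as np

    # Prior weights over (model, objective) pairs: rows = models, cols = objectives.
    prior = np.array([[0.25, 0.25],
                      [0.25, 0.25]])

    # Hypothetical likelihood of one monitoring outcome under each pair. Because
    # the entries do not factor into row x column terms, learning about models
    # also shifts weight among objectives.
    like = np.array([[0.8, 0.5],
                     [0.2, 0.4]])

    post = prior * like
    post /= post.sum()                          # normalized joint posterior
    print(post)                                 # joint weights after one cycle
    print(post.sum(axis=1), post.sum(axis=0))   # marginal model / objective weights
    ```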

  4. Parameter and prediction uncertainty in an optimized terrestrial carbon cycle model: Effects of constraining variables and data record length

    NASA Astrophysics Data System (ADS)

    Ricciuto, Daniel M.; King, Anthony W.; Dragoni, D.; Post, Wilfred M.

    2011-03-01

    Many parameters in terrestrial biogeochemical models are inherently uncertain, leading to uncertainty in predictions of key carbon cycle variables. At observation sites, this uncertainty can be quantified by applying model-data fusion techniques to estimate model parameters using eddy covariance observations and associated biometric data sets as constraints. Uncertainty is reduced as data records become longer and different types of observations are added. We estimate parametric and associated predictive uncertainty at the Morgan Monroe State Forest in Indiana, USA. Parameters in the Local Terrestrial Ecosystem Carbon (LoTEC) model are estimated using both synthetic and actual constraints. These model parameters and uncertainties are then used to make predictions of carbon flux for up to 20 years. We find a strong dependence of both parametric and prediction uncertainty on the length of the data record used in the model-data fusion. In this model framework, this dependence is strongly reduced as the data record length increases beyond 5 years. If synthetic initial biomass pool constraints with realistic uncertainties are included in the model-data fusion, prediction uncertainty is reduced by more than 25% when constraining flux records are less than 3 years. If synthetic annual aboveground woody biomass increment constraints are also included, uncertainty is similarly reduced by an additional 25%. When actual observed eddy covariance data are used as constraints, there is still a strong dependence of parameter and prediction uncertainty on data record length, but the results are harder to interpret because of the inability of LoTEC to reproduce observed interannual variations and the confounding effects of model structural error.
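
    The reported dependence of uncertainty on record length has a simple analogue in a linear toy model, where the parameter covariance is sigma^2 (X'X)^(-1) and shrinks as observations accumulate. A sketch with a synthetic weekly "flux" design (not LoTEC itself); in the linear case the covariance depends only on the design and noise level:

    ```python
    import numpy as np

    def design(n_years):
        t = np.linspace(0.0, n_years, n_years * 52)  # weekly synthetic observations
        return np.column_stack([np.sin(2 * np.pi * t), np.ones_like(t)])

    sigma = 0.3                                      # assumed observation noise
    for n_years in (1, 3, 5, 10, 20):
        X = design(n_years)
        cov = sigma**2 * np.linalg.inv(X.T @ X)      # parameter covariance, linear case
        print(n_years, "yr:", np.sqrt(np.diag(cov))) # 1-sigma parameter uncertainty
    ```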

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Post, Wilfred M; King, Anthony Wayne; Dragoni, Danilo

    Many parameters in terrestrial biogeochemical models are inherently uncertain, leading to uncertainty in predictions of key carbon cycle variables. At observation sites, this uncertainty can be quantified by applying model-data fusion techniques to estimate model parameters using eddy covariance observations and associated biometric data sets as constraints. Uncertainty is reduced as data records become longer and different types of observations are added. We estimate parametric and associated predictive uncertainty at the Morgan Monroe State Forest in Indiana, USA. Parameters in the Local Terrestrial Ecosystem Carbon (LoTEC) model are estimated using both synthetic and actual constraints. These model parameters and uncertainties are then used to make predictions of carbon flux for up to 20 years. We find a strong dependence of both parametric and prediction uncertainty on the length of the data record used in the model-data fusion. In this model framework, this dependence is strongly reduced as the data record length increases beyond 5 years. If synthetic initial biomass pool constraints with realistic uncertainties are included in the model-data fusion, prediction uncertainty is reduced by more than 25% when constraining flux records are less than 3 years. If synthetic annual aboveground woody biomass increment constraints are also included, uncertainty is similarly reduced by an additional 25%. When actual observed eddy covariance data are used as constraints, there is still a strong dependence of parameter and prediction uncertainty on data record length, but the results are harder to interpret because of the inability of LoTEC to reproduce observed interannual variations and the confounding effects of model structural error.

  6. Delaying investments in sensor technology: The rationality of dairy farmers' investment decisions illustrated within the framework of real options theory.

    PubMed

    Rutten, C J; Steeneveld, W; Oude Lansink, A G J M; Hogeveen, H

    2018-05-02

    The adoption rate of sensors on dairy farms varies widely. Whereas some sensors are hardly adopted, others are adopted by many farmers. A potential rational explanation for the difference in adoption may be the expected future technological progress in the sensor technology and expected future improved decision support possibilities. For some sensors not much progress can be expected because the technology has already made enormous progress in recent years, whereas for sensors that have only recently been introduced on the market, much progress can be expected. The adoption of sensors may thus be partly explained by uncertainty about the investment decision, where the uncertainty lies in the future performance of the sensors and in whether improved decision support will become available. The overall aim was to offer a plausible example of why a sensor may not be adopted now. To explain this, the role of uncertainty about technological progress in the investment decision was illustrated for a highly adopted sensor (automated estrus detection) and a hardly adopted sensor (automated body condition scoring). This theoretical illustration uses real options theory, which accounts for the role of uncertainty in the timing of investment decisions. A discrete event model, simulating a farm of 100 dairy cows, was developed to estimate the net present value (NPV) of investing now and investing in 5 yr in both sensor systems. The results show that investing now in automated estrus detection resulted in a higher NPV than investing 5 yr from now, whereas for the automated body condition score postponing the investment resulted in a higher NPV compared with investing now. These results are in line with the observation that farmers postpone investments in sensors. Also, the current high adoption of automated estrus detection sensors can be explained because the NPV of investing now is higher than the NPV of investing in 5 yr. The results confirm that uncertainty about future sensor performance and uncertainty about whether improved decision support will become available play a role in investment decisions. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
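
    The now-versus-in-5-years comparison can be caricatured with a few lines of Monte Carlo. Every figure below (sensor price, price decline, benefit distribution, discount rate) is an invented placeholder; the point is only that waiting shortens the benefit stream but adds the option to walk away once performance is known.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    r, horizon, n = 0.05, 10, 100_000          # assumed discount rate, years, draws

    def mean_npv(invest_year):
        cost = 15_000 * 0.97 ** invest_year    # price assumed to drift down ~3%/yr
        benefit = rng.normal(3_000, 1_500, n)  # uncertain annual benefit, $/yr
        years = np.arange(invest_year, horizon)
        npv = (benefit[:, None] / (1 + r) ** years[None, :]).sum(axis=1)
        npv -= cost / (1 + r) ** invest_year
        if invest_year > 0:                    # waiting resolves the uncertainty first,
            npv = np.maximum(npv, 0.0)         # so a bad draw means simply not investing
        return npv.mean()

    print("invest now     :", round(mean_npv(0)))
    print("invest in 5 yr :", round(mean_npv(5)))
    ```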

  7. Evaluating a multispecies adaptive management framework: Must uncertainty impede effective decision-making?

    USGS Publications Warehouse

    Smith, David R.; McGowan, Conor P.; Daily, Jonathan P.; Nichols, James D.; Sweka, John A.; Lyons, James E.

    2013-01-01

    Application of adaptive management to complex natural resource systems requires careful evaluation to ensure that the process leads to improved decision-making. As part of that evaluation, adaptive policies can be compared with alternative nonadaptive management scenarios. Also, the value of reducing structural (ecological) uncertainty to achieving management objectives can be quantified. A multispecies adaptive management framework was recently adopted by the Atlantic States Marine Fisheries Commission for sustainable harvest of Delaware Bay horseshoe crabs Limulus polyphemus, while maintaining adequate stopover habitat for migrating red knots Calidris canutus rufa, the focal shorebird species. The predictive model set encompassed the structural uncertainty in the relationships between horseshoe crab spawning, red knot weight gain and red knot vital rates. Stochastic dynamic programming was used to generate a state-dependent strategy for harvest decisions given that uncertainty. In this paper, we employed a management strategy evaluation approach to evaluate the performance of this adaptive management framework. Active adaptive management was used by including model weights as state variables in the optimization and reducing structural uncertainty by model weight updating. We found that the value of information for reducing structural uncertainty is expected to be low, because the uncertainty does not appear to impede effective management. Harvest policy responded to abundance levels of both species regardless of uncertainty in the specific relationship that generated those abundances. Thus, the expected horseshoe crab harvest and red knot abundance were similar when the population generating model was uncertain or known, and harvest policy was robust to structural uncertainty as specified. Synthesis and applications: The combination of management strategy evaluation with state-dependent strategies from stochastic dynamic programming was an informative approach to evaluate adaptive management performance and value of learning. Although natural resource decisions are characterized by uncertainty, not all uncertainty will cause decisions to be altered substantially, as we found in this case. It is important to incorporate uncertainty into the decision framing and evaluate the effect of reducing that uncertainty on achieving the desired outcomes.
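
    State-dependent harvest rules of this kind come out of stochastic dynamic programming. The sketch below runs plain value iteration for a one-species caricature with logistic growth and invented parameters; the paper's active-adaptive version additionally carries model weights as state variables.

    ```python
    import numpy as np

    K, beta = 50, 0.95                               # state grid size, discount factor
    pops = np.linspace(0.0, 1.0, K)                  # normalized abundance states
    harvests = np.linspace(0.0, 0.5, 11)             # candidate harvest fractions

    def growth(x):                                   # logistic recruitment (illustrative)
        return np.clip(x + 0.6 * x * (1.0 - x), 0.0, 1.0)

    V = np.zeros(K)
    for _ in range(500):                             # value iteration to (near) convergence
        Q = np.empty((K, harvests.size))
        for j, h in enumerate(harvests):
            nxt = growth(pops * (1.0 - h))           # escapement grows to next state
            idx = np.searchsorted(pops, nxt).clip(0, K - 1)
            Q[:, j] = h * pops + beta * V[idx]       # reward = harvest taken now
        V = Q.max(axis=1)

    policy = harvests[Q.argmax(axis=1)]              # state-dependent harvest rule
    print(policy[::10])
    ```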

  8. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    NASA Astrophysics Data System (ADS)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. Also, we seek to identify data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander, in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as 'virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Then, we conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: (1) uncertainty in HETT is relatively small for early times and increases with transit times; (2) uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; (3) introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and (4) hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model ('virtual reality') is then developed based on that conceptual model. This complex model then serves as the basis to compare simpler model structures. Through this approach, predictive uncertainty can be quantified relative to a known reference solution.

  9. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Blanford; E. Keldrauk; M. Laufer

    2010-09-20

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of the next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. This report also reviews modular construction technology, particularly steel-plate/concrete construction using factory prefabricated structural modules, for application to external event shell and base isolated structures.

  10. Innovation under Regulatory Uncertainty: Evidence from Medical Technology

    PubMed Central

    Stern, Ariel Dora

    2016-01-01

    This paper explores how the regulatory approval process affects innovation incentives in medical technologies. Prior studies have found early mover regulatory advantages for drugs. I find the opposite for medical devices, where pioneer entrants spend 34 percent (7.2 months) longer than follow-on entrants in regulatory approval. Back-of-the-envelope calculations suggest that the cost of a delay of this length is upwards of 7 percent of the total cost of bringing a new high-risk device to market. Considering potential explanations, I find that approval times are largely unrelated to technological novelty, but are meaningfully reduced by the publication of objective regulatory guidelines. Finally, I consider how the regulatory process affects small firms’ market entry patterns and find that small firms are less likely to be pioneers in new device markets, a fact consistent with relatively higher costs of doing so for more financially constrained firms. PMID:28652646

  11. Impact of government regulation on health care technology

    NASA Astrophysics Data System (ADS)

    Berkowitz, Robert D.

    1994-12-01

    Increased government regulation of the medical device industry produces higher expenses, a longer time to return investment capital, and greater uncertainty. As a result there are fewer new ventures and reduced efforts to develop new technology in established companies. The current federal regulatory framework has shifted from monitoring the product to monitoring the process. The inability to reach perfect performance in such a regulated environment subject to continuous and fluid interpretation guarantees non-compliance and growing ethical tension. Without new medical technology, we may be unable to maintain quality medical coverage in the face of rising demand. The author proposes risk assessment to set regulatory priorities; the conversion of a national weapons lab to a national device testing lab; the establishment of device standards and the monitoring of in-use performance against these standards; and the education of patients and users as to the results of these examinations.

  12. Conventional engine technology. Volume 2: Status of diesel engine technology

    NASA Technical Reports Server (NTRS)

    Schneider, H. W.

    1981-01-01

    The engines of diesel cars marketed in the United States were examined. Prominent design features, performance characteristics, fuel economy and emissions data were compared. Specific problems, in particular those of NOx and smoke emissions, the effects of increasing dieselization on diesel fuel price and availability, current R&D work, and advanced diesel concepts are discussed. Diesel cars currently have a fuel economy advantage over gasoline engine powered cars. Diesel drawbacks (noise and odor) were reduced to a less objectionable level. Driveability equivalent to that of a gasoline engine was obtained with turbocharging. Diesel manufacturers see a growth in the diesel market for the next ten years. Uncertainties regarding future emission regulation may inhibit future diesel production investments. With spark ignition engine technology advancing in the direction of high compression ratios, the fuel economy advantage of the diesel car is expected to diminish. To retain its fuel economy lead, the diesel's potential for future improvement must be exploited.

  13. Public and stakeholder participation for managing and reducing the risks of shale gas development.

    PubMed

    North, D Warner; Stern, Paul C; Webler, Thomas; Field, Patrick

    2014-01-01

    Emerging technologies pose particularly strong challenges for risk governance when they have multidimensional and inequitable impacts, when there is scientific uncertainty about the technology and its risks, when there are strong value conflicts over the perceived benefits and risks, when decisions must be made urgently, and when the decision making environment is rife with mistrust. Shale gas development is one such emerging technology. Drawing on previous U.S. National Research Council committee reports that examined risk decision making for complex issues like these, we point to the benefits and challenges of applying the analytic-deliberative process recommended in those reports for stakeholder and public engagement in risk decision making about shale gas development in the United States. We discuss the different phases of such a process and conclude by noting the dangers of allowing controversy to ossify and the benefits of sound dialogue and learning among publics, stakeholders, industry, and regulatory decision makers.

  14. The large area crop inventory experiment: A major demonstration of space remote sensing

    NASA Technical Reports Server (NTRS)

    Macdonald, R. B.; Hall, F. G.

    1977-01-01

    Strategies are presented in agricultural technology to increase the resistance of crops to a wider range of meteorological conditions in order to reduce year-to-year variations in crop production. Uncertainties in agricultural production, together with the consumer demands of an increasing world population, have greatly intensified the need for early and accurate annual global crop production forecasts. These forecasts must predict fluctuations with an accuracy, timeliness and known reliability sufficient to permit necessary social and economic adjustments, with as much advance warning as possible.

  15. Uncertainties in real-world decisions on medical technologies.

    PubMed

    Lu, C Y

    2014-08-01

    Patients, clinicians, payers and policy makers face substantial uncertainties in their respective healthcare decisions as they attempt to achieve maximum value, or the greatest level of benefit possible at a given cost. Uncertainties largely come from incomplete information at the time that decisions must be made. This is true in all areas of medicine because evidence from clinical trials is often incongruent with real-world patient care. This article highlights key uncertainties around the (comparative) benefits and harms of medical technologies. Initiatives and strategies such as comparative effectiveness research and coverage with evidence development may help to generate reliable and relevant evidence for decisions on coverage and treatment. These efforts could result in better decisions that improve patient outcomes and better use of scarce medical resources. © 2014 John Wiley & Sons Ltd.

  16. Status of Duct Liner Technology for Application to Aircraft Engine Nacelles

    NASA Technical Reports Server (NTRS)

    Parrott, Tony L.; Jones, Michael G.; Watson, Willie R.

    2005-01-01

    Grazing flows and high acoustic intensities impose unusual design requirements on acoustic liner treatments used in aircraft engine nacelles. Increased sound absorption efficiency (requiring increased accuracy of liner impedance specification) is particularly critical in the face of ever decreasing nacelle wall area available for liner treatments in modern, high-bypass ratio engines. This paper reviews the strategy developed at Langley Research Center for achieving a robust measurement technology that is crucial for validating impedance models for aircraft liners. Specifically, the paper describes the current status of computational and data acquisition technologies for educing impedance in a flow duct. Comparisons of educed impedances for a "validation liner" using 1980's and 2000's measurement technology are consistent, but show significant deviations (up to 0.5ρc, exclusive of the liner anti-resonance region) from a first-principles impedance prediction model as grazing flow centerline Mach numbers increase up to 0.5. The deviations, in part, are believed to be related to uncertainty in the choice of grazing flow parameters (e.g., cross-section averaged, core-flow averaged, or centerline Mach number?). Also, there may be an issue with incorporating the impedance discontinuities corresponding to the hard wall to liner interface (i.e., leading and trailing edge of the test liner) within the discretized finite element model.

  17. Management of groundwater in-situ bioremediation system using reactive transport modelling under parametric uncertainty: field scale application

    NASA Astrophysics Data System (ADS)

    Verardo, E.; Atteia, O.; Rouvreau, L.

    2015-12-01

    In-situ bioremediation is a commonly used remediation technology to clean up the subsurface of petroleum-contaminated sites. Forecasting remedial performance (in terms of flux and mass reduction) is a challenge due to uncertainties associated with source properties and with the contribution and efficiency of concentration-reducing mechanisms. In this study, predictive uncertainty analysis of bioremediation system efficiency is carried out with the null-space Monte Carlo (NSMC) method, which combines the calibration solution-space parameters with the ensemble of null-space parameters, creating sets of calibration-constrained parameters for input to follow-on predictions of remedial efficiency. The first step in the NSMC methodology for uncertainty analysis is model calibration. The model calibration was conducted by matching simulated BTEX concentrations to a total of 48 observations from historical data before implementation of treatment. Two different bioremediation designs were then implemented in the calibrated model. The first consists of pumping/injection wells, and the second of a permeable barrier coupled with infiltration across slotted piping. The NSMC method was used to calculate 1000 calibration-constrained parameter sets for the two different models. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. The first variant implementation of the NSMC is based on a single calibrated model. In the second variant, models were calibrated from different initial parameter sets, and NSMC calibration-constrained parameter sets were sampled from these different calibrated models. We demonstrate that, in the context of a nonlinear model, the second variant avoids underestimating parameter uncertainty, which could otherwise lead to a poor quantification of predictive uncertainty. Application of the proposed approach to manage bioremediation of groundwater at a real site shows that it is effective in providing support for management of in-situ bioremediation systems. Moreover, this study demonstrates that the NSMC method provides a computationally efficient and practical methodology for utilizing model predictive uncertainty methods in environmental management.

  18. Planning low-carbon electricity systems under uncertainty considering operational flexibility and smart grid technologies.

    PubMed

    Moreno, Rodrigo; Street, Alexandre; Arroyo, José M; Mancarella, Pierluigi

    2017-08-13

    Electricity grid operators and planners need to deal with both the rapidly increasing integration of renewables and an unprecedented level of uncertainty that originates from unknown generation outputs, changing commercial and regulatory frameworks aimed to foster low-carbon technologies, the evolving availability of market information on feasibility and costs of various technologies, etc. In this context, there is a significant risk of locking-in to inefficient investment planning solutions determined by current deterministic engineering practices that neither capture uncertainty nor represent the actual operation of the planned infrastructure under high penetration of renewables. We therefore present an alternative optimization framework to plan electricity grids that deals with uncertain scenarios and represents increased operational details. The presented framework is able to model the effects of an array of flexible, smart grid technologies that can efficiently displace the need for conventional solutions. We then argue, and demonstrate via the proposed framework and an illustrative example, that proper modelling of uncertainty and operational constraints in planning is key to valuing operationally flexible solutions leading to optimal investment in a smart grid context. Finally, we review the most used practices in power system planning under uncertainty, highlight the challenges of incorporating operational aspects and advocate the need for new and computationally effective optimization tools to properly value the benefits of flexible, smart grid solutions in planning. Such tools are essential to accelerate the development of a low-carbon energy system and investment in the most appropriate portfolio of renewable energy sources and complementary enabling smart technologies. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
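
    The argument that operational flexibility displaces firm investment can be compressed into a two-stage stochastic linear program: build capacity before demand is known, then buy flexible recourse (e.g., demand response) per scenario. All numbers below are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    demand = np.array([80.0, 100.0, 130.0])   # peak-demand scenarios, MW (invented)
    prob = np.array([0.3, 0.5, 0.2])          # scenario probabilities
    c_build, c_flex = 100.0, 160.0            # $/MW: firm capacity vs. flexible recourse

    # Variables: [x, f1, f2, f3]. Minimize c_build*x + sum_s prob_s*c_flex*f_s.
    c = np.concatenate([[c_build], prob * c_flex])
    # Adequacy per scenario: x + f_s >= demand_s, written as -x - f_s <= -demand_s.
    A_ub = np.zeros((3, 4))
    A_ub[:, 0] = -1.0
    A_ub[np.arange(3), 1 + np.arange(3)] = -1.0
    res = linprog(c, A_ub=A_ub, b_ub=-demand, bounds=[(0, None)] * 4)
    print(res.x)   # firm build covers common scenarios; flexibility covers the peak
    ```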

  19. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and the quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, it is most important to analyze how the uncertainties occur and develop, and how the simulations develop from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework of QU (quantification of uncertainty) is brought forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  20. Summary Findings from the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.

  1. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
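
    Of the tools listed, the ARIMA step is the easiest to make concrete. The sketch below fits an AR(1)-type ARIMA model to a synthetic deseasonalized load residual with statsmodels and pulls out a day-ahead forecast with its uncertainty band; the cross-area correlation machinery (PCA, sequential Gaussian simulation) is beyond this fragment.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(2)
    hours = np.arange(24 * 60)                     # 60 days of synthetic hourly data
    profile = 100.0 + 20.0 * np.sin(2 * np.pi * hours / 24)  # known daily load shape
    resid = np.zeros(hours.size)                   # AR(1) stand-in for load residual
    for i in range(1, hours.size):
        resid[i] = 0.8 * resid[i - 1] + rng.normal(0.0, 2.0)

    fit = ARIMA(pd.Series(resid), order=(1, 0, 0)).fit()
    fc = fit.get_forecast(steps=24)                # next-day residual forecast
    next_day = profile[:24] + fc.predicted_mean.to_numpy()
    print(next_day[:3])                            # point forecast, first hours
    print(fc.conf_int(alpha=0.05).head(3))         # 95% uncertainty band
    ```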

  2. Assessment of Solid Sorbent Systems for Post-Combustion Carbon Dioxide Capture at Coal-Fired Power Plants

    NASA Astrophysics Data System (ADS)

    Glier, Justin C.

    In an effort to lower future CO2 emissions, a wide range of technologies are being developed to scrub CO2 from the flue gases of fossil fuel-based electric power and industrial plants. This thesis models one of several early-stage post-combustion CO2 capture technologies, the solid sorbent-based CO2 capture process, and presents performance and cost estimates of this system on pulverized coal power plants. The spreadsheet-based software package Microsoft Excel was used in conjunction with AspenPlus modelling results and the Integrated Environmental Control Model to develop performance and cost estimates for the solid sorbent-based CO2 capture technology. A reduced order model also was created to facilitate comparisons among multiple design scenarios. Assumptions about plant financing and utilization, as well as uncertainties in heat transfer and material design that affect heat exchanger and reactor design, were found to produce a wide range of cost estimates for solid sorbent-based systems. With uncertainties included, costs for a supercritical power plant with solid sorbent-based CO2 capture ranged from $167 to $533 per megawatt hour for a first-of-a-kind installation (with all costs in constant 2011 US dollars) based on a 90% confidence interval. The median cost was $209/MWh. Post-combustion solid sorbent-based CO2 capture technology is then evaluated in terms of the potential cost for a mature system, based on historic experience as technologies are improved with sequential iterations of the currently available system. The range of costs for a supercritical power plant with solid sorbent-based CO2 capture was found to be $118 to $189 per megawatt hour, with a nominal value of $163 per megawatt hour, given the expected range of technological improvement in the capital and operating costs and efficiency of the power plant after 100 GW of cumulative worldwide experience. These results suggest that the solid sorbent-based system will not be competitive with currently available liquid amine systems in the absence of significant new improvements in solid sorbent properties and process system design to reduce the heat exchange surface area in the regenerator and cross-flow heat exchanger. Finally, the importance of these estimates for policy makers is discussed.
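
    Cost ranges like these come from pushing input uncertainties through a levelized-cost calculation with Monte Carlo. The sketch below shows the mechanic only; the distributions and values are invented, not the thesis's model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    capital = rng.triangular(3000, 4200, 7000, n)   # $/kW installed (invented range)
    fcf = 0.11                                      # assumed fixed charge factor, 1/yr
    cap_factor = rng.uniform(0.70, 0.85, n)         # plant utilization
    heat_rate = rng.normal(12.0, 1.0, n)            # GJ/MWh incl. capture penalty
    fuel = rng.normal(2.0, 0.3, n)                  # $/GJ
    vom = rng.uniform(5.0, 15.0, n)                 # variable O&M, $/MWh

    mwh_per_kw_yr = 8760 * cap_factor / 1000        # annual MWh per kW of capacity
    cost = capital * fcf / mwh_per_kw_yr + heat_rate * fuel + vom
    print(np.percentile(cost, [5, 50, 95]))         # $/MWh: 90% interval and median
    ```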

  3. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (muq.mit.edu).

  4. Developing Hydrogeological Site Characterization Strategies based on Human Health Risk

    NASA Astrophysics Data System (ADS)

    de Barros, F.; Rubin, Y.; Maxwell, R. M.

    2013-12-01

    In order to provide better sustainable groundwater quality management and minimize the impact of contamination on humans, improved understanding and quantification of the interaction between hydrogeological models, geological site information and human health are needed. Considering the joint influence of these components on the overall human health risk assessment, and the corresponding sources of uncertainty, aids decision makers in better allocating resources in data acquisition campaigns. This is important to (1) achieve remediation goals in a cost-effective manner, (2) protect human health and (3) keep water supplies clean in order to comply with quality standards. Such a task is challenging since a full characterization of the subsurface is unfeasible due to financial and technological constraints. In addition, human exposure and physiological response to contamination are subject to uncertainty and variability. Normally, sampling strategies are developed with the goal of reducing uncertainty, but less often are they developed in the context of their impacts on the overall system uncertainty. Therefore, quantifying the impact of each of these components (hydrogeological, behavioral and physiological) on the final human health risk prediction can provide guidance for decision makers to best allocate resources towards minimal prediction uncertainty. In this presentation, a multi-component human health risk-based framework is presented which allows decision makers to set priorities through an information entropy-based visualization tool. Results highlight the role of characteristic length-scales characterizing flow and transport in determining data needs within an integrated hydrogeological-health framework. Conditions where uncertainty reduction in human health risk predictions may benefit from better understanding of the health component, as opposed to a more detailed hydrogeological characterization, are also discussed. Finally, results illustrate how different dose-response models can impact the probability of human health risk exceeding a regulatory threshold.

  5. Integrating uncertainty into public energy research and development decisions

    NASA Astrophysics Data System (ADS)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.

  6. A Practical Approach to Starting Fission Surface Power Development

    NASA Technical Reports Server (NTRS)

    Mason, Lee S.

    2006-01-01

    The Prometheus Power and Propulsion Program has been reformulated to address NASA needs relative to lunar and Mars exploration. Emphasis has switched from the Jupiter Icy Moons Orbiter (JIMO) flight system development to more generalized technology development addressing Fission Surface Power (FSP) and Nuclear Thermal Propulsion (NTP). Current NASA budget priorities and the deferred mission need date for nuclear systems prohibit a fully funded reactor Flight Development Program. However, a modestly funded Advanced Technology Program can and should be conducted to reduce the risk and cost of future flight systems. A potential roadmap for FSP technology development leading to possible flight applications could include three elements: 1) Conceptual Design Studies, 2) Advanced Component Technology, and 3) Non-Nuclear System Testing. The Conceptual Design Studies would expand on recent NASA and DOE analyses while increasing the depth of study in areas of greatest uncertainty such as reactor integration and human-rated shielding. The Advanced Component Technology element would address the major technology risks through development and testing of reactor fuels, structural materials, primary loop components, shielding, power conversion, heat rejection, and power management and distribution (PMAD). The Non-Nuclear System Testing would provide a modular, technology testbed to investigate and resolve system integration issues.

  7. Farm-level economics of innovative tillage technologies: the case of no-till in the Altai Krai in Russian Siberia.

    PubMed

    Bavorova, Miroslava; Imamverdiyev, Nizami; Ponkina, Elena

    2018-01-01

    In the agricultural Altai Krai in Russian Siberia, soil degradation problems are prevalent. Agronomists recommend "reduced tillage systems," especially no-till, as a sustainable way to cultivate land that is threatened by soil degradation. In the Altai Krai, little is known about the technologies in practice. In this paper, we provide information on plant cultivation technologies used in the Altai Krai and on selected factors preventing farm managers in this region from adopting no-till technology, based on our own quantitative survey conducted across 107 farms in 2015 and 2016. The results of the quantitative survey show that farm managers face high uncertainty regarding the use of no-till technology, including its economics. To close this gap, we provide a systematic analysis of factors influencing the economics of plant production systems by using a farm optimization model (linear programming) for a real farm, together with expert estimations. The farm-specific results of the optimization model show that under optimal management and climatic conditions, the expert-designed modern Canadian no-till technology outperforms the farm's min-till technology, but this is not the case for suboptimal conditions with lower yields.
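
    In miniature, a farm optimization model of this kind is one linear program per tillage technology: choose crop areas to maximize gross margin subject to land and labor. The margins and coefficients below are invented for illustration.

    ```python
    from scipy.optimize import linprog

    # Hypothetical per-hectare gross margins ($) and labor needs (h) for three crops.
    margin = {"min_till": [120, 95, 140], "no_till": [135, 90, 150]}
    labor = {"min_till": [6.0, 4.5, 7.0], "no_till": [4.0, 3.0, 5.0]}
    LAND, LABOR = 1000.0, 5200.0                  # hectares and labor hours available

    for tech in ("min_till", "no_till"):
        c = [-m for m in margin[tech]]            # linprog minimizes, so negate margins
        A_ub = [[1.0, 1.0, 1.0], labor[tech]]     # land constraint; labor constraint
        res = linprog(c, A_ub=A_ub, b_ub=[LAND, LABOR], bounds=[(0, None)] * 3)
        print(tech, "gross margin:", round(-res.fun))
    ```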

  8. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
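
    The Bayesian refinement of failure-rate estimates mentioned above is commonly done with a conjugate Gamma-Poisson update, under which operating experience narrows the epistemic interval (and a design change would, in effect, reset it toward the wide prior). The hyperparameters below are illustrative, not ISS ECLS values.

    ```python
    from scipy.stats import gamma

    a0, b0 = 2.0, 4.0       # Gamma prior: mean 0.5 failures/yr, wide epistemic spread
    for years, failures in [(0, 0), (2, 1), (10, 4), (30, 14)]:
        a, b = a0 + failures, b0 + years          # conjugate Gamma-Poisson update
        lo, hi = gamma.ppf([0.05, 0.95], a, scale=1.0 / b)
        print(f"{years:>2} yr: mean = {a/b:.3f}/yr, 90% interval = ({lo:.3f}, {hi:.3f})")
    ```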

  9. Investment under Uncertainty with Manager-Shareholder Conflict

    NASA Astrophysics Data System (ADS)

    Shibata, Takashi; Nishihara, Michi

    2009-09-01

    This paper examines investment timing by the manager in a decentralized firm in the presence of asymmetric information. In particular, we extend the agency problem in a real options model to incorporate an audit technology which allows the owner, at a cost, to verify private information. The implied investment triggers include those in three related papers: standard full information model (e.g., McDonald and Siegel, 1986); Grenadier and Wang (2005); Shibata (2009). An increase in the penalty for the manager's false report always reduces inefficiency in the investment triggers, while it does not necessarily reduce inefficiency in the total social welfare. Most importantly, however, the full information investment triggers and total social welfare can be approximated arbitrarily closely by making the penalty sufficiently large.

  10. Entry Vehicle Control System Design for the Mars Smart Lander

    NASA Technical Reports Server (NTRS)

    Calhoun, Philip C.; Queen, Eric M.

    2002-01-01

    The NASA Langley Research Center, in cooperation with the Jet Propulsion Laboratory, participated in a preliminary design study of the Entry, Descent and Landing phase for the Mars Smart Lander Project. This concept utilizes advances in Guidance, Navigation and Control technology to significantly reduce uncertainty in the vehicle's landed location on the Mars surface. A candidate entry vehicle controller, based on the Reaction Control System controller for the Apollo Lunar Excursion Module digital autopilot, is proposed for use in entry vehicle attitude control. A slight modification to the phase plane controller is used to reduce jet-firing chatter while maintaining good control response for the Martian entry probe application. The controller performance is demonstrated in a six-degree-of-freedom simulation with representative aerodynamics.
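
    A phase-plane thruster law of the Apollo lineage can be sketched as a bang-off-bang rule on a switching function with a deadband, plus release hysteresis as one way to suppress jet chatter. The inertia, torque, and gains below are invented, and the plant is a bare double integrator rather than the entry vehicle.

    ```python
    import numpy as np

    I, torque, dt = 500.0, 5.0, 0.01            # inertia (kg m^2), jet torque (N m), step (s)
    theta, omega = np.deg2rad(5.0), 0.0         # initial attitude error and rate
    db, k = np.deg2rad(0.5), 2.0                # deadband half-width, switch-line slope
    u, fires = 0.0, 0

    for _ in range(20_000):                     # 200 s of simulated time
        s = theta + k * omega                   # switching function (phase-plane line)
        if u == 0.0 and abs(s) > db:
            u = -np.sign(s) * torque            # outside deadband: fire opposing jet
            fires += 1
        elif u != 0.0 and abs(s) < 0.3 * db:    # hysteresis: release well inside band
            u = 0.0
        omega += (u / I) * dt                   # double-integrator attitude dynamics
        theta += omega * dt

    print(f"final error {np.rad2deg(theta):+.3f} deg after {fires} jet-on events")
    ```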

  11. A Global Emission Inventory of Black Carbon and Primary Organic Carbon from Fossil-Fuel and Biofuel Combustion

    NASA Astrophysics Data System (ADS)

    Bond, T. C.; Streets, D. G.; Nelson, S. M.

    2001-12-01

    Regional and global climate models rely on emission inventories of black carbon and organic carbon to determine the climatic effects of primary particulate matter (PM) from combustion. The emission of primary carbonaceous particles is highly dependent on fuel type and combustion practice. Therefore, simple categories such as "domestic" or "industrial" combustion are not sufficient to quantify emissions, and the black-carbon and organic-carbon fractions of PM vary with combustion type. We present a global inventory of primary carbonaceous particles that improves on previous "bottom-up" tabulations (e.g., Cooke et al., 1999) by considering approximately 100 technologies, each representing one combination of fuel, combustion type, and emission controls. For fossil-fuel combustion, we include several categories not found in previous inventories, including "superemitting" and two-stroke vehicles, and steel-making. We also include emissions from waste burning and biofuels used for heating and cooking. Open biomass burning is not included. Fuel use, drawn from International Energy Agency (IEA) and United Nations (UN) data, is divided into technologies on a regional basis. We suggest that emissions in developing countries are better characterized by including high-emitting technologies than by invoking emission multipliers. Due to lack of information on emission factors and technologies in use, uncertainties are high. We estimate central values and uncertainties by combining the range of emission factors found in the literature with reasonable estimates of technology divisions. We provide regional totals of central, low and high estimates, identify the sources of greatest uncertainty to be targeted for future work, and compare our results with previous emission inventories. Both central estimates and uncertainties are given on a 1° x 1° grid. As we have reported previously for the case of China (Streets et al., 2001), low-technology combustion contributes greatly to the emissions and to the uncertainties.

  12. Iron Disilicide as High-Temperature Reference Material for Traceable Measurements of Seebeck Coefficient Between 300 K and 800 K

    NASA Astrophysics Data System (ADS)

    Ziolkowski, Pawel; Stiewe, Christian; de Boor, Johannes; Druschke, Ines; Zabrocki, Knud; Edler, Frank; Haupt, Sebastian; König, Jan; Mueller, Eckhard

    2017-01-01

    Thermoelectric generators (TEGs) convert heat to electrical energy by means of the Seebeck effect. The Seebeck coefficient is a central thermoelectric material property, measuring the magnitude of the thermovoltage generated in response to a temperature difference across a thermoelectric material. Precise determination of the Seebeck coefficient provides the basis for reliable performance assessment in materials development in the field of thermoelectrics. For several reasons, measurement uncertainties of up to 14% can often be observed in interlaboratory comparisons of temperature-dependent Seebeck coefficient or in error analyses on currently employed instruments. This is still too high for an industrial benchmark and insufficient for many scientific investigations and technological developments. The TESt (thermoelectric standardization) project was launched in 2011, funded by the German Federal Ministry of Education and Research (BMBF), to reduce measurement uncertainties, engineer traceable and precise thermoelectric measurement techniques for materials and TEGs, and develop reference materials (RMs) for temperature-dependent determination of the Seebeck coefficient. We report herein the successful development and qualification of cobalt-doped β-iron disilicide (β-Fe0.95Co0.05Si2) as a RM for high-temperature thermoelectric metrology. A brief survey on technological processes for manufacturing and machining of samples is presented. Focus is placed on metrological qualification of the iron disilicide, results of an international round-robin test, and final certification as a reference material in accordance with ISO-Guide 35 and the "Guide to the expression of uncertainty in measurement" by the Physikalisch-Technische Bundesanstalt, the national metrology institute of Germany.

  13. Exploring entropic uncertainty relation in the Heisenberg XX model with inhomogeneous magnetic field

    NASA Astrophysics Data System (ADS)

    Huang, Ai-Jun; Wang, Dong; Wang, Jia-Ming; Shi, Jia-Dong; Sun, Wen-Yang; Ye, Liu

    2017-08-01

    In this work, we investigate the quantum-memory-assisted entropic uncertainty relation in a two-qubit Heisenberg XX model with an inhomogeneous magnetic field. It has been found that a larger coupling strength J between the two spin-chain qubits can effectively reduce the entropic uncertainty. Besides, we examine the mechanics of how the inhomogeneous field influences the uncertainty, and find that when the inhomogeneous field parameter b<1, the uncertainty will decrease with decreasing b; conversely, the uncertainty will increase with decreasing b under the condition that b>1. Intriguingly, the entropic uncertainty can shrink to zero when the coupling coefficients are relatively large, while the entropic uncertainty only reduces to 1 with the increase of the homogeneous magnetic field. Additionally, we examine the purity of the state and Bell non-locality and find that the entropic uncertainty is anticorrelated with both the purity and the Bell non-locality of the evolution state.

  14. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    NASA Technical Reports Server (NTRS)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  15. Forward and backward uncertainty propagation: an oxidation ditch modelling example.

    PubMed

    Abusam, A; Keesman, K J; van Straten, G

    2003-01-01

    In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. In backward uncertainty propagation, however, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure for carrying out backward uncertainty propagation is illustrated in this technical note by a worked example for an oxidation ditch wastewater treatment plant. The results obtained demonstrate that essential information can be obtained by carrying out backward uncertainty propagation analysis.
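
    The two directions can be shown with a toy response function: forward, push an assumed parameter distribution through the model to get an output distribution; backward, keep only the parameters whose outputs satisfy a stated requirement (a rejection-style stand-in for the note's procedure). All numbers are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def effluent(k):                       # toy oxidation-ditch response to a rate parameter
        return 50.0 * np.exp(-k) + rng.normal(0.0, 0.5, np.shape(k))

    # Forward: parameter subspace -> output distribution.
    k_prior = rng.uniform(0.5, 1.5, 20_000)
    y = effluent(k_prior)
    print("forward, output 5-95%:", np.percentile(y, [5, 95]))

    # Backward: output requirement -> admissible parameter subspace.
    ok = (y > 18.0) & (y < 22.0)           # hypothetical effluent target band
    print("backward, parameter 5-95%:", np.percentile(k_prior[ok], [5, 95]))
    ```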

  16. Chasing Perfection: Should We Reduce Model Uncertainty in Carbon Cycle-Climate Feedbacks

    NASA Astrophysics Data System (ADS)

    Bonan, G. B.; Lombardozzi, D.; Wieder, W. R.; Lindsay, K. T.; Thomas, R. Q.

    2015-12-01

    Earth system model simulations of the terrestrial carbon (C) cycle show large multi-model spread in the carbon-concentration and carbon-climate feedback parameters. Large differences among models are also seen in their simulation of global vegetation and soil C stocks and other aspects of the C cycle, prompting concern about model uncertainty and our ability to faithfully represent fundamental aspects of the terrestrial C cycle in Earth system models. Benchmarking analyses that compare model simulations with common datasets have been proposed as a means to assess model fidelity with observations, and various model-data fusion techniques have been used to reduce model biases. While such efforts will reduce multi-model spread, they may not help reduce uncertainty (and increase confidence) in projections of the C cycle over the twenty-first century. Many ecological and biogeochemical processes represented in Earth system models are poorly understood at both the site scale and across large regions, where biotic and edaphic heterogeneity are important. Our experience with the Community Land Model (CLM) suggests that large uncertainty in the terrestrial C cycle and its feedback with climate change is an inherent property of biological systems. The challenge of representing life in Earth system models, with the rich diversity of lifeforms and complexity of biological systems, may necessitate a multitude of modeling approaches to capture the range of possible outcomes. Such models should encompass a range of plausible model structures. We distinguish between model parameter uncertainty and model structural uncertainty. Focusing on improved parameter estimates may, in fact, limit progress in assessing model structural uncertainty associated with realistically representing biological processes. Moreover, higher confidence may be achieved through better process representation, but this does not necessarily reduce uncertainty.

  17. Development of a 300 L Calibration Bath for Oceanographic Thermometers

    NASA Astrophysics Data System (ADS)

    Baba, S.; Yamazawa, K.; Nakano, T.; Saito, I.; Tamba, J.; Wakimoto, T.; Katoh, K.

    2017-11-01

    The Japan Agency for Marine-Earth Science and Technology (JAMSTEC) has been developing a 300 L calibration bath to calibrate 24 oceanographic thermometers (OT) simultaneously and thereby reduce the calibration workload necessary to service more than 180 OT every year. This study investigated the characteristics of the developed 300 L calibration bath using an SBE 3plus thermometer produced by an OT manufacturer. We also used 11 thermistor thermometers that were calibrated, through a collaboration between JAMSTEC and NMIJ/AIST, to be traceable to the International Temperature Scale of 1990 (ITS-90) within 1 mK of standard uncertainty. Results show that the time stability of the temperature of the developed bath was within ± 1 mK. Furthermore, the temperature uniformity was ± 1.3 mK. The expanded uncertainty (k=2) for the characteristics of the developed 300 L calibration bath was estimated as 2.9 mK, which is much less than 10 mK, the required specification for the uncertainty of OT calibration. These results demonstrate the utility of this 300 L calibration bath as a device for use with a new calibration system.
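
    The sketch below illustrates, under stated assumptions, a GUM-style combination of bath uncertainty components: root-sum-of-squares of standard uncertainties, expanded with a coverage factor k = 2. Treating the quoted ± bounds as rectangular-distribution half-widths and adding a 1 mK reference-thermometer term are our assumptions; this is not the study's full budget, so the result does not reproduce the reported 2.9 mK exactly.

    ```python
    # Root-sum-of-squares combination of uncertainty components (assumed budget).
    import math

    components_mk = {
        "time stability (±1 mK, rectangular)": 1.0 / math.sqrt(3),
        "uniformity (±1.3 mK, rectangular)": 1.3 / math.sqrt(3),
        "reference thermometer (assumed 1 mK standard)": 1.0,
    }

    u_c = math.sqrt(sum(u**2 for u in components_mk.values()))
    U = 2 * u_c  # expanded uncertainty, coverage factor k = 2
    print(f"combined u_c = {u_c:.2f} mK, expanded U (k=2) = {U:.2f} mK")
    ```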

  18. Estimating Preferences for Complex Health Technologies: Lessons Learned and Implications for Personalized Medicine.

    PubMed

    Marshall, Deborah A; Gonzalez, Juan Marcos; MacDonald, Karen V; Johnson, F Reed

    2017-01-01

    We examine key study design challenges of using stated-preference methods to estimate the value of whole-genome sequencing (WGS) as a specific example of genomic testing. Assessing the value of WGS is complex because WGS provides multiple findings, some of which can be incidental in nature and unrelated to the specific health concerns that motivated the test. In addition, WGS results can include actionable findings (variants considered clinically useful that can be acted on), findings for which evidence for best clinical action is not available (variants considered clinically valid that do not meet as high a standard of clinical usefulness), and findings of unknown significance. We consider three key challenges encountered in designing our national study on the value of WGS - layers of uncertainty, potential downstream consequences with endogenous aspects, and both positive and negative utility associated with testing information - and potential solutions as strategies to address these challenges. We conceptualized the decision to acquire WGS information as a series of sequential choices that are resolved separately. To determine the value of WGS information at the initial decision to undergo WGS, we used contingent valuation questions, and to elicit respondent preferences for reducing risks of health problems and the consequences of taking the steps to reduce these risks, we used a discrete-choice experiment. We conclude by considering the implications for evaluating the value of other complex health technologies that involve multiple forms of uncertainty. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.

  20. Using Large-Scale Cooperative Control to Manage Operational Uncertainties for Aquifer Thermal Energy Storage

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, M.; Rostampour, V.; Kwakkel, J. H.; Bloemendal, M.

    2017-12-01

    Seasonal Aquifer Thermal Energy Storage (ATES) technology can help reduce the demand for energy for heating and cooling in buildings, and has become a popular option for larger buildings in northern Europe. However, the larger-scale deployment of this technology has evidenced some issues of concern for policymakers; in particular, recent research shows that operational uncertainties contribute to inefficient outcomes under current planning methods for ATES. For instance, systems in the Netherlands typically use less than half of their permitted pumping volume on an annual basis. This overcapacity gives users more flexibility to operate their systems in response to the uncertainties which drive building energy demand; these include short-term operational factors such as weather and occupancy, and longer-term, deeply uncertain factors such as changes in climate and aquifer conditions over the lifespan of the buildings. However, as allocated subsurface volume remains unused, this situation limits the adoption of the technology in dense areas. Previous work using coupled agent-based/geohydrological simulation has shown that the cooperative operation of neighbouring ATES systems can support more efficient spatial planning, by dynamically managing thermal interactions in response to uncertain operating conditions. An idealized case study with centralized ATES control thus showed significant improvements in the energy savings which could be obtained per unit of allocated subsurface volume, without degrading the recovery performance of systems. This work will extend this cooperative approach for a realistic case study of ATES planning in the city of Utrecht, in the Netherlands. This case was previously simulated under different scenarios for individual ATES operation. The poster will compare these results with a cooperative case under which neighbouring systems can coordinate their operation to manage interactions. Furthermore, a cooperative game-theoretical framework will be used to analyze the theoretical conditions under which cooperation between ATES operators could be assumed to be stable and beneficial, under a range of scenarios for climate trends and ATES adoption pathways.

  1. The option value of delay in health technology assessment.

    PubMed

    Eckermann, Simon; Willan, Andrew R

    2008-01-01

    Processes of health technology assessment (HTA) inform decisions under uncertainty about whether to invest in new technologies based on evidence of incremental effects, incremental cost, and incremental net monetary benefit (INMB). An option value of delaying such decisions to wait for further evidence is suggested in the usual case of interest, in which the prior distribution of INMB is positive but uncertain. Methods of estimating the option value of delaying decisions to invest have previously been developed for settings in which investments are irreversible with an uncertain payoff over time and information is assumed fixed. However, in HTA, decision uncertainty relates to information (evidence) on the distribution of INMB. This article demonstrates that the option value of delaying decisions to allow collection of further evidence can be estimated as the expected value of sample information (EVSI). For irreversible decisions, delay and trial (DT) is demonstrated to be preferred to adopt and no trial (AN) when the EVSI exceeds the expected costs of information, including the expected opportunity costs of not treating patients with the new therapy. For reversible decisions, adopt and trial (AT) becomes a potentially optimal strategy, but costs of reversal are shown to reduce the EVSI of this strategy due to both a lower probability of reversal being optimal and lower payoffs when reversal is optimal. Hence, decision makers generally face joint research and reimbursement decisions (AN, DT, and AT), with the optimal choice dependent on the costs of reversal as well as the opportunity costs of delay and the distribution of prior INMB.
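
    A minimal sketch of the irreversible-decision rule stated in the abstract: prefer delay and trial (DT) over adopt and no trial (AN) when the EVSI exceeds the expected costs of information, including the opportunity cost of withholding the new therapy during the trial. All monetary values are hypothetical.

    ```python
    # Decision rule for irreversible adoption decisions (hypothetical values).
    evsi = 1_200_000            # expected value of sample information ($)
    trial_cost = 400_000        # direct research cost ($)
    opportunity_cost = 500_000  # expected net benefit forgone by delaying ($)

    if evsi > trial_cost + opportunity_cost:
        print("Delay and trial (DT) preferred")
    else:
        print("Adopt and no trial (AN) preferred")
    ```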

  2. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply, and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty, and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate methods for decision making under uncertainty from the state of the art. We will compare the efficiency of alternative approaches to the two case studies. Finally, we will present a hybrid decision analytic tool to address the synthesis of uncertainties.

  3. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  4. Classifying the Sizes of Explosive Eruptions using Tephra Deposits: The Advantages of a Numerical Inversion Approach

    NASA Astrophysics Data System (ADS)

    Connor, C.; Connor, L.; White, J.

    2015-12-01

    Explosive volcanic eruptions are often classified by deposit mass and eruption column height. How well are these eruption parameters determined in older deposits, and how well can we reduce uncertainty using robust numerical and statistical methods? We describe an efficient and effective inversion and uncertainty quantification approach for estimating eruption parameters given a dataset of tephra deposit thickness and granulometry. The inversion and uncertainty quantification are implemented using the open-source PEST++ code. Inversion with PEST++ can be used with a variety of forward models and here is applied using Tephra2, a code that simulates advective and dispersive tephra transport and deposition. The Levenberg-Marquardt algorithm is combined with formal Tikhonov and subspace regularization to invert eruption parameters; a linear equation for conditional uncertainty propagation is used to estimate posterior parameter uncertainty. Both the inversion and uncertainty analysis support simultaneous analysis of the full eruption and wind-field parameterization. The combined inversion/uncertainty-quantification approach is applied to the 1992 eruption of Cerro Negro (Nicaragua), the 2011 Kirishima-Shinmoedake (Japan), and the 1913 Colima (Mexico) eruptions. These examples show that although eruption mass uncertainty is reduced by inversion against tephra isomass data, considerable uncertainty remains for many eruption and wind-field parameters, such as eruption column height. Supplementing the inversion dataset with tephra granulometry data is shown to further reduce the uncertainty of most eruption and wind-field parameters. We think the use of such robust models provides a better understanding of uncertainty in eruption parameters, and hence eruption classification, than is possible with more qualitative methods that are widely used.
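
    As a hedged illustration of the inversion step, the sketch below fits eruption parameters to deposit data with a Levenberg-Marquardt least-squares solver. The forward model is a toy exponential thinning law with synthetic observations, standing in for Tephra2, and no Tikhonov regularization is included.

    ```python
    # Levenberg-Marquardt inversion of a toy deposit-thinning model.
    import numpy as np
    from scipy.optimize import least_squares

    rng = np.random.default_rng(0)
    distance_km = np.linspace(5, 50, 20)
    true_mass, true_decay = 10.0, 0.08
    thickness = true_mass * np.exp(-true_decay * distance_km)
    observed = thickness * rng.lognormal(0.0, 0.1, distance_km.size)

    def residuals(p):
        mass, decay = p
        return mass * np.exp(-decay * distance_km) - observed

    fit = least_squares(residuals, x0=[5.0, 0.05], method="lm")
    print("estimated (mass, decay):", fit.x)
    ```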

  5. Long-Term Planning for Nuclear Energy Systems Under Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Kim, Lance Kyungwoo

    Long-term planning for nuclear energy systems has been an area of interest for policy planners and systems designers to assess and manage the complexity of the system and the long-term, wide-ranging societal impacts of decisions. However, traditional planning tools are often poorly equipped to cope with the deep parametric, structural, and value uncertainties in long-term planning. A more robust, multiobjective decision-making method is applied to a model of the nuclear fuel cycle to address the many sources of complexity, uncertainty, and ambiguity inherent to long-term planning. Unlike prior studies that rely on assessing the outcomes of a limited set of deployment strategies, solutions in this study arise from optimizing behavior against multiple incommensurable objectives, utilizing goal-seeking multiobjective evolutionary algorithms to identify minimax regret solutions across various demand scenarios. By excluding inferior and infeasible solutions, the choice between the Pareto optimal solutions depends on a decision-maker's preferences for the defined outcomes---limiting analyst bias and increasing transparency. Though simplified by the necessity of reducing computational burdens, the nuclear fuel cycle model captures important phenomena governing the behavior of the nuclear energy system relevant to the decision to close the fuel cycle---incorporating reactor population dynamics, material stocks and flows, constraints on material flows, and outcomes of interest to decision-makers. Technology-neutral performance criteria are defined consistent with the Generation IV International Forum goals of improved security and proliferation resistance based on structural features of the nuclear fuel cycle, natural resource sustainability, and waste production. A review of safety risks and the economic history of the development of nuclear technology suggests that safety and economic criteria may not be decisive, as the safety risks posed by alternative fuel cycles may be comparable in aggregate and economic performance is uncertain and path dependent. Technology strategies impacting reactor lifetimes and advanced reactor introduction dates are evaluated against high, medium, and phaseout scenarios of nuclear energy demand. Non-dominated, minimax regret solutions are found with the NSGA-II multiobjective evolutionary algorithm. Results suggest that more aggressive technology strategies featuring the early introduction of breeder and burner reactors, possibly combined with lifetime extension of once-through systems, tend to dominate less aggressive strategies under more demanding growth scenarios over the next century. Less aggressive technology strategies that delay burning and breeding tend to be clustered in the minimax regret space, suggesting greater sensitivity to shifts in preferences. Lifetime extension strategies can unexpectedly result in fewer deployments of once-through systems, permitting the growth of advanced systems to meet demand. Both breeders and burners are important for controlling plutonium inventories, with breeders achieving lower inventories in storage by locking material in reactor cores while burners can reduce the total inventory in the system. Other observations include the indirect impacts of some performance measures, the relatively small impact of technology strategies on the waste properties of all material in the system, and the difficulty of phasing out nuclear energy while meeting all objectives with the specified technology options.
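
    A minimal sketch of the minimax-regret screening described above: given a cost matrix of technology strategies against demand scenarios, compute each strategy's regret relative to the best choice in every scenario and select the strategy with the smallest worst-case regret. The cost values are hypothetical.

    ```python
    # Minimax-regret selection over a hypothetical strategy/scenario cost matrix.
    import numpy as np

    # rows: technology strategies; columns: high / medium / phaseout demand
    cost = np.array([
        [ 8.0, 6.0, 5.0],   # aggressive early breeder/burner introduction
        [10.0, 5.5, 3.0],   # moderate strategy
        [14.0, 7.0, 2.5],   # delayed introduction
    ])

    regret = cost - cost.min(axis=0)   # regret vs best choice in each scenario
    worst_regret = regret.max(axis=1)  # worst case for each strategy
    print("minimax-regret strategy index:", int(worst_regret.argmin()))
    ```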

  6. One Strategy for Reducing Uncertainty in Climate Change Communications

    NASA Astrophysics Data System (ADS)

    Romm, J.

    2011-12-01

    Future impacts of climate change are invariably presented with a very wide range of impacts reflecting two different sets of uncertainties. The first concerns our uncertainty about precisely how much greenhouse gas emissions humanity will emit into the atmosphere. The second concerns our uncertainty about precisely what impact those emissions will have on the climate. By failing to distinguish between these two types of uncertainties, climate scientists have not clearly explained to the public and policymakers what the scientific literature suggests is likely to happen if we don't substantially alter our current emissions path. Indeed, much of climate communications has been built around describing the range of impacts from emissions paths that are increasingly implausible given political and technological constraints, such as stabilization at 450 or 550 parts per million of atmospheric carbon dioxide. For the past decade, human emissions of greenhouse gases have trended near the worst-case scenarios of the Intergovernmental Panel on Climate Change, emissions paths that reach 800 ppm or even 1000 ppm. The current policies of the two biggest emitters, the United States and China, coupled with the ongoing failure of international negotiations to come to an agreement on restricting emissions, suggest that recent trends will continue for the foreseeable future. This in turn suggests that greater clarity in climate change communications could be achieved by explaining more clearly to the public what range of impacts the scientific literature suggests for our current high-emissions path. It also suggests that more focus should be given in the scientific literature to better constraining the range of impacts from the high emissions scenarios.

  7. On the Application of Science Systems Engineering and Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael

    2017-04-01

    Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  8. Potential uncertainty reduction in model-averaged benchmark dose estimates informed by an additional dose study.

    PubMed

    Shao, Kan; Small, Mitchell J

    2011-10-01

    A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models, using Markov chain Monte Carlo (MCMC) simulation for parameter estimation, and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for the benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
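
    A minimal sketch of the model-averaging step: pool posterior BMD draws from the two dose-response models in proportion to assumed posterior model probabilities, then read the BMDL from the lower tail of the pooled distribution. The draws and weights are synthetic stand-ins for MCMC output, not the TCDD results.

    ```python
    # Bayesian model averaging of BMD posteriors (synthetic draws and weights).
    import numpy as np

    rng = np.random.default_rng(42)
    n = 20_000
    bmd_logistic = rng.lognormal(mean=0.0, sigma=0.25, size=n)
    bmd_quantal = rng.lognormal(mean=0.15, sigma=0.35, size=n)
    w_logistic = 0.6  # assumed posterior model probability (w_quantal = 0.4)

    pick = rng.random(n) < w_logistic
    bma_draws = np.where(pick, bmd_logistic, bmd_quantal)

    bmd = np.median(bma_draws)
    bmdl = np.percentile(bma_draws, 5)  # lower 5th-percentile bound
    print(f"BMA BMD = {bmd:.3f}, BMDL = {bmdl:.3f}")
    ```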

  9. Periodic reference tracking control approach for smart material actuators with complex hysteretic characteristics

    NASA Astrophysics Data System (ADS)

    Sun, Zhiyong; Hao, Lina; Song, Bo; Yang, Ruiguo; Cao, Ruimin; Cheng, Yu

    2016-10-01

    Micro/nano positioning technologies have been attractive for decades for their various applications in both industrial and scientific fields. The actuators employed in these technologies are typically smart material actuators, which possess inherent hysteresis that may cause systems to behave unexpectedly. Periodic reference tracking capability is fundamental for apparatuses such as the scanning probe microscope, which employs smart material actuators to generate periodic scanning motion. However, traditional controllers such as the PID method cannot guarantee accurate fast periodic scanning motion. To tackle this problem and to conduct practical implementation in digital devices, this paper proposes a novel control method named the discrete extended unparallel Prandtl-Ishlinskii model based internal model (d-EUPI-IM) control approach. To tackle modeling uncertainties, the robust d-EUPI-IM control approach is investigated, and the associated sufficient stabilizing conditions are derived. The advantages of the proposed controller are: it is designed and represented in discrete form, and is thus practical for implementation in digital devices; the extended unparallel Prandtl-Ishlinskii model can precisely represent forward/inverse complex hysteretic characteristics, reducing modeling uncertainties and benefiting controller design; in addition, the internal model principle based control module can be utilized as a natural oscillator for tackling the periodic reference tracking problem. The proposed controller was verified through comparative experiments on a piezoelectric actuator platform, and convincing results were achieved.

  10. Still Elegantly Muddling Through? NICE and Uncertainty in Decision Making About the Rationing of Expensive Medicines in England.

    PubMed

    Calnan, Michael; Hashem, Ferhana; Brown, Patrick

    2017-07-01

    This article examines the "technological appraisals" carried out by the National Institute for Health and Care Excellence as it regulates the provision of expensive new drugs within the English National Health Service on cost-effectiveness grounds. Ostensibly this is a highly rational process by which the regulatory mechanisms absorb uncertainty, but in practice, decision making remains highly complex and uncertain. This article draws on ethnographic data - interviews with a range of stakeholders and decision makers (n = 41), observations of public and closed appraisal meetings, and documentary analysis - regarding the decision-making processes involving three pharmaceutical products. The study explores the various ways in which different forms of uncertainty are perceived and tackled within these Single Technology Appraisals. Difficulties of dealing with the various levels of uncertainty were manifest and often rendered straightforward decision making problematic. Uncertainties associated with epistemology, procedures, interpersonal relations, and technicality were particularly evident. The need to exercise discretion within a more formal institutional framework shaped a pragmatic combining of strategies and tactics - explicit and informal, collective and individual - to navigate through the layers of complexity and uncertainty in making decisions.

  11. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    NASA Astrophysics Data System (ADS)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.

  12. Stochastic Technology Choice Model for Consequential Life Cycle Assessment.

    PubMed

    Kätelhön, Arne; Bardow, André; Suh, Sangwon

    2016-12-06

    Discussions on Consequential Life Cycle Assessment (CLCA) have relied largely on partial or general equilibrium models. Such models are useful for integrating market effects into CLCA, but also have well-recognized limitations such as the poor granularity of the sectoral definition and the assumption of perfect oversight by all economic agents. Building on the Rectangular-Choice-of-Technology (RCOT) model, this study proposes a new modeling approach for CLCA, the Technology Choice Model (TCM). In this approach, the RCOT model is adapted for its use in CLCA and extended to incorporate parameter uncertainties and suboptimal decisions due to market imperfections and information asymmetry in a stochastic setting. In a case study on rice production, we demonstrate that the proposed approach allows modeling of complex production technology mixes and their expected environmental outcomes under uncertainty, at a high level of detail. Incorporating the effect of production constraints, uncertainty, and suboptimal decisions by economic agents significantly affects technology mixes and associated greenhouse gas (GHG) emissions of the system under study. The case study also shows the model's ability to determine both the average and marginal environmental impacts of a product in response to changes in the quantity of final demand.

  13. RACE pulls for shared control

    NASA Astrophysics Data System (ADS)

    Leahy, M. B., Jr.; Cassiday, B. K.

    1993-02-01

    Maintaining and supporting an aircraft fleet, in a climate of reduced manpower and financial resources, dictates effective utilization of robotics and automation technologies. To help develop a winning robotics and automation program, the Air Force Logistics Command created the Robotics and Automation Center of Excellence (RACE). RACE is a command-wide focal point and an organic source of expertise to assist the Air Logistic Center (ALC) product directorates in improving process productivity through the judicious insertion of robotics and automation technologies. RACE is a champion for pulling emerging technologies into the aircraft logistic centers. One of those technology pulls is shared control. Small batch sizes, feature uncertainty, and varying workload conspire to make classic industrial robotic solutions impractical. One can view ALC process problems in the context of space robotics without the time delay. The ALCs will benefit greatly from the implementation of a common architecture that supports a range of control actions from fully autonomous to teleoperated. Working with national laboratories and private industry, we hope to transition shared control technology to the depot floor. This paper provides an overview of the RACE internal initiatives and customer support, with particular emphasis on production processes that will benefit from shared control technology.

  14. RACE pulls for shared control

    NASA Astrophysics Data System (ADS)

    Leahy, Michael B., Jr.; Cassiday, Brian K.

    1992-11-01

    Maintaining and supporting an aircraft fleet, in a climate of reduced manpower and financial resources, dictates effective utilization of robotics and automation technologies. To help develop a winning robotics and automation program, the Air Force Logistics Command created the Robotics and Automation Center of Excellence (RACE). RACE is a command-wide focal point and an organic source of expertise to assist the Air Logistic Center (ALC) product directorates in improving process productivity through the judicious insertion of robotics and automation technologies. RACE is a champion for pulling emerging technologies into the aircraft logistic centers. One of those technology pulls is shared control. Small batch sizes, feature uncertainty, and varying workload conspire to make classic industrial robotic solutions impractical. One can view ALC process problems in the context of space robotics without the time delay. The ALCs will benefit greatly from the implementation of a common architecture that supports a range of control actions from fully autonomous to teleoperated. Working with national laboratories and private industry, we hope to transition shared control technology to the depot floor. This paper provides an overview of the RACE internal initiatives and customer support, with particular emphasis on production processes that will benefit from shared control technology.

  15. RACE pulls for shared control

    NASA Technical Reports Server (NTRS)

    Leahy, M. B., Jr.; Cassiday, B. K.

    1993-01-01

    Maintaining and supporting an aircraft fleet, in a climate of reduced manpower and financial resources, dictates effective utilization of robotics and automation technologies. To help develop a winning robotics and automation program, the Air Force Logistics Command created the Robotics and Automation Center of Excellence (RACE). RACE is a command-wide focal point and an organic source of expertise to assist the Air Logistic Center (ALC) product directorates in improving process productivity through the judicious insertion of robotics and automation technologies. RACE is a champion for pulling emerging technologies into the aircraft logistic centers. One of those technology pulls is shared control. Small batch sizes, feature uncertainty, and varying workload conspire to make classic industrial robotic solutions impractical. One can view ALC process problems in the context of space robotics without the time delay. The ALCs will benefit greatly from the implementation of a common architecture that supports a range of control actions from fully autonomous to teleoperated. Working with national laboratories and private industry, we hope to transition shared control technology to the depot floor. This paper provides an overview of the RACE internal initiatives and customer support, with particular emphasis on production processes that will benefit from shared control technology.

  16. Data worth and prediction uncertainty for pesticide transport and fate models in Nebraska and Maryland, United States

    USGS Publications Warehouse

    Nolan, Bernard T.; Malone, Robert W.; Doherty, John E.; Barbash, Jack E.; Ma, Liwang; Shaner, Dale L.

    2015-01-01

    CONCLUSIONS: Although the observed data were sparse, they substantially reduced prediction uncertainty in unsampled regions of pesticide breakthrough curves. Nitrate evidently functioned as a surrogate for soil hydraulic data in well-drained loam soils conducive to conservative transport of nitrogen. Pesticide properties and macropore parameters could benefit most from improved characterization to further reduce model misfit and prediction uncertainty.

  17. We have the technology, but can we use it? Building flood risk capacity amongst property owners in England.

    NASA Astrophysics Data System (ADS)

    White, Iain; Connelly, Angela; O'Hare, Paul; Lawson, Nigel

    2013-04-01

    The UK's Meteorological Office has provisionally confirmed 2012 to be the second wettest year recorded in the country (The Met Office, 2013). Volatile weather patterns resulted in much social and economic disruption and damage from floods. The UK's Flood and Water Management Act (2010) has placed responsibility for flood risk management primarily at the local level. In reality, various agencies are responsible for managing flood risk, resulting in a fragmented system that communities struggle to make sense of. Strengthening emergency response during a flood event is one strategy to build capacity. However, resilience has emerged as an operative policy, and points to a need for anticipatory approaches. These should extend beyond large-scale flood defenses or measures that reduce the vulnerability of infrastructures and buildings in order to incorporate social vulnerability through the establishment of warning systems and capacity building (White 2010). To this end, small-scale innovative technologies - from automatic door guards to 'smart' air bricks - hold the potential to manage the uncertainty around flood risk before an event occurs. However, innovative technologies are often resisted by institutions, technical systems, cultural preferences, and legislation, which requires a multifaceted approach that addresses the social, cultural, economic and technical domains (De Graaf 2009). We present a case study that explores the barriers that inhibit the uptake of property level technologies in England by various actors: from property owners and manufacturers, to municipal authorities and built environment professionals. Through the case study, we demonstrate how these various stakeholders were involved in identifying the procedural principles to overcome these barriers and to integrate property level technologies more fully into an overall flood risk management system. Following this, best practice guidance was designed, and we show the means by which such guidance can improve social capacity even where there is much uncertainty. The paper ends by describing the transferable lessons learned through the development of this tool and concludes on the potential of property level protection to manage flood risk across Europe. References de Graaf, R. E. (2009). Urban water innovations to reduce the vulnerability of cities. Feasibility and mainstreaming of technologies in society, Ph. D thesis, Delft University of Technology. Available at: www.deltasync.nl/reports/De_Graaf_thesis.pdf [Accessed 29 December 2012]. The Met Office. (2013) Statistics for December and 2012 - is the UK getting wetter? [Online resource]. Available at: http://www.metoffice.gov.uk/news/releases/archive/2013/2012-weather-statistics [Accessed 6 January 2012]. White, I. (2010). Water and the city: Risk resilience and planning for a sustainable future. London: Routledge.

  18. Space tug aerobraking study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Corso, C. J.; Eyer, C. L.

    1972-01-01

    The feasibility and practicality of employing an aerobraking trajectory for return of the reusable space tug from geosynchronous orbit was investigated. The aerobraking return trajectory modes employ transfer ellipses from high orbits which have low perigee altitudes wherein the earth's sensible atmosphere provides drag to reduce the tug return delta velocity requirements and thus decrease the required return trip propulsive energy. Aerodynamics, aerothermodynamics, trajectories, guidance and control, configuration concepts, materials, weights and performance were considered. Sensitivities to trajectory uncertainties, atmospheric anomalies and reentry environments were determined. New technology requirements and future studies required to further enhance the aerobraking potential were identified.

  19. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
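
    A hedged sketch of the localized straight-line fit: regress the I-V points near short circuit and take the Isc uncertainty from the regression covariance of the intercept at V = 0. The data are synthetic, and the evidence-based window selection and Bayesian treatment from the abstract are not reproduced.

    ```python
    # Straight-line fit near Isc with intercept uncertainty from the covariance.
    import numpy as np

    rng = np.random.default_rng(7)
    v = np.linspace(0.0, 0.1, 15)                 # voltages near V = 0 (V)
    i = 5.0 - 0.8 * v + rng.normal(0, 0.002, 15)  # noisy currents (A)

    coeffs, cov = np.polyfit(v, i, deg=1, cov=True)
    isc = np.polyval(coeffs, 0.0)   # intercept at V = 0 estimates Isc
    u_isc = np.sqrt(cov[1, 1])      # standard uncertainty of the intercept
    print(f"Isc = {isc:.4f} A ± {2 * u_isc:.4f} A (k=2)")
    ```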

  20. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  1. Process Design and Techno-economic Analysis for Materials to Treat Produced Waters.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimer, Brandon Walter; Paap, Scott M; Sasan, Koroush

    Significant quantities of water are produced during enhanced oil recovery, making these "produced water" streams attractive candidates for treatment and reuse. However, high concentrations of dissolved silica raise the propensity for fouling. In this paper, we report the design and economic analysis for a new ion exchange process using calcined hydrotalcite (HTC) to remove silica from water. This process improves upon known technologies by minimizing sludge product, reducing process fouling, and lowering energy use. Process modeling outputs included raw material requirements, energy use, and the minimum water treatment price (MWTP). Monte Carlo simulations quantified the impact of uncertainty and variability in process inputs on MWTP. These analyses showed that cost can be significantly reduced if the HTC materials are optimized. Specifically, R&D improving HTC reusability, silica binding capacity, and raw material price can reduce MWTP by 40%, 13%, and 20%, respectively. Optimizing geographic deployment further improves cost competitiveness.
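
    A minimal sketch of the Monte Carlo step, under assumed input distributions for HTC reusability, silica-binding capacity, and raw-material price: propagate samples through a pricing function and summarize the spread of the minimum water treatment price (MWTP). The pricing function is a toy placeholder, not the study's process model.

    ```python
    # Monte Carlo propagation of assumed input uncertainty into a toy MWTP.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 50_000
    reuse_cycles = rng.triangular(5, 10, 20, n)  # assumed reuse distribution
    capacity = rng.normal(1.0, 0.15, n)          # relative silica capacity
    htc_price = rng.uniform(0.8, 1.4, n)         # relative raw-material price

    # Toy pricing relation: material cost amortized over reuse, plus fixed cost.
    mwtp = 2.0 * htc_price / (reuse_cycles * np.clip(capacity, 0.5, None)) + 0.5
    print(f"MWTP median = {np.median(mwtp):.3f}, P10-P90 = "
          f"({np.percentile(mwtp, 10):.3f}, {np.percentile(mwtp, 90):.3f})")
    ```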

  2. Study of aerodynamic technology for single-cruise-engine V/STOL fighter/attack aircraft

    NASA Technical Reports Server (NTRS)

    Hess, J. R.; Bear, R. L.

    1982-01-01

    A viable, single engine, supersonic V/STOL fighter/attack aircraft concept was defined. This vectored thrust, canard wing configuration utilizes an advanced technology separated flow engine with fan stream burning. The aerodynamic characteristics of this configuration were estimated and performance evaluated. Significant aerodynamic and aerodynamic propulsion interaction uncertainties requiring additional investigation were identified. A wind tunnel model concept and test program to resolve these uncertainties and validate the aerodynamic prediction methods were defined.

  3. Uncertainty and stress: Why it causes diseases and how it is mastered by the brain.

    PubMed

    Peters, Achim; McEwen, Bruce S; Friston, Karl

    2017-09-01

    The term 'stress' - coined in 1936 - has many definitions, but until now has lacked a theoretical foundation. Here we present an information-theoretic approach - based on the 'free energy principle' - defining the essence of stress; namely, uncertainty. We address three questions: What is uncertainty? What does it do to us? What are our resources to master it? Mathematically speaking, uncertainty is entropy or 'expected surprise'. The 'free energy principle' rests upon the fact that self-organizing biological agents resist a tendency to disorder and must therefore minimize the entropy of their sensory states. Applied to our everyday life, this means that we feel uncertain when we anticipate that outcomes will turn out to be something other than expected - and that we are unable to avoid surprise. As all cognitive systems strive to reduce their uncertainty about future outcomes, they face a critical constraint: Reducing uncertainty requires cerebral energy. The characteristic of the vertebrate brain to prioritize its own high energy is captured by the notion of the 'selfish brain'. Accordingly, in times of uncertainty, the selfish brain demands extra energy from the body. If, despite all this, the brain cannot reduce uncertainty, a persistent cerebral energy crisis may develop, burdening the individual by 'allostatic load' that contributes to systemic and brain malfunction (impaired memory, atherogenesis, diabetes and subsequent cardio- and cerebrovascular events). Based on the basic tenet that stress originates from uncertainty, we discuss the strategies our brain uses to avoid surprise and thereby resolve uncertainty. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  4. Reducing Spatial Uncertainty Through Attentional Cueing Improves Contrast Sensitivity in Regions of the Visual Field With Glaucomatous Defects

    PubMed Central

    Phu, Jack; Kalloniatis, Michael; Khuu, Sieu K.

    2018-01-01

    Purpose Current clinical perimetric test paradigms present stimuli randomly to various locations across the visual field (VF), inherently introducing spatial uncertainty, which reduces contrast sensitivity. In the present study, we determined the extent to which spatial uncertainty affects contrast sensitivity in glaucoma patients by minimizing spatial uncertainty through attentional cueing. Methods Six patients with open-angle glaucoma and six healthy subjects underwent laboratory-based psychophysical testing to measure contrast sensitivity at preselected locations at two eccentricities (9.5° and 17.5°) with two stimulus sizes (Goldmann sizes III and V) under different cueing conditions: 1, 2, 4, or 8 points verbally cued. Method of Constant Stimuli and a single-interval forced-choice procedure were used to generate frequency of seeing (FOS) curves at locations with and without VF defects. Results At locations with VF defects, cueing minimizes spatial uncertainty and improves sensitivity under all conditions. The effect of cueing was maximal when one point was cued, and rapidly diminished when more points were cued (no change to baseline with 8 points cued). The slope of the FOS curve steepened with reduced spatial uncertainty. Locations with normal sensitivity in glaucomatous eyes had similar performance to that of healthy subjects. There was a systematic increase in uncertainty with the depth of VF loss. Conclusions Sensitivity measurements across the VF are negatively affected by spatial uncertainty, which increases with greater VF loss. Minimizing uncertainty can improve sensitivity at locations of deficit. Translational Relevance Current perimetric techniques introduce spatial uncertainty and may therefore underestimate sensitivity in regions of VF loss. PMID:29600116
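
    A hedged sketch of FOS-curve fitting: model the frequency of seeing as a cumulative Gaussian of stimulus contrast, where the spread parameter captures the slope changes that the study links to spatial uncertainty. The contrast levels and seen fractions are synthetic.

    ```python
    # Fitting a frequency-of-seeing curve as a cumulative Gaussian (toy data).
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    contrast_db = np.array([18, 21, 24, 27, 30, 33])
    seen_frac = np.array([0.05, 0.15, 0.40, 0.75, 0.92, 0.99])

    def fos(c, mu, sigma):
        # mu: threshold; sigma: spread (shallower slope = more uncertainty)
        return norm.cdf(c, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(fos, contrast_db, seen_frac, p0=[26.0, 3.0])
    print(f"threshold = {mu:.1f} dB, slope parameter = {sigma:.1f} dB")
    ```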

  5. Uncertainty in solid precipitation and snow depth prediction for Siberia using the Noah and Noah-MP land surface models

    NASA Astrophysics Data System (ADS)

    Suzuki, Kazuyoshi; Zupanski, Milija

    2018-01-01

    In this study, we investigate the uncertainties associated with land surface processes in an ensemble prediction context. Specifically, we compare the uncertainties produced by a coupled atmosphere-land modeling system with two different land surface models, the Noah-MP land surface model (LSM) and the Noah LSM, by using the Maximum Likelihood Ensemble Filter (MLEF) data assimilation system as a platform for ensemble prediction. We carried out 24-hour prediction simulations in Siberia with 32 ensemble members beginning at 00:00 UTC on 5 March 2013. We then compared the model prediction uncertainty of snow depth and solid precipitation with observation-based research products and evaluated the standard deviation of the ensemble spread. The prediction skill and ensemble spread exhibited high positive correlation for both LSMs, indicating a realistic uncertainty estimation. The inclusion of a multi-layer snow model in the Noah-MP LSM was beneficial for reducing the uncertainties of snow depth and snow depth change compared to the Noah LSM, but the uncertainty in daily solid precipitation showed minimal difference between the two LSMs. The impact of LSM choice in reducing temperature uncertainty was limited to surface layers of the atmosphere. In summary, we found that the more sophisticated Noah-MP LSM reduces uncertainties associated with land surface processes compared to the Noah LSM. Thus, using prediction models with improved skill implies improved predictability and greater certainty of prediction.

  6. Cloud fraction at the ARM SGP site: reducing uncertainty with self-organizing maps

    NASA Astrophysics Data System (ADS)

    Kennedy, Aaron D.; Dong, Xiquan; Xi, Baike

    2016-04-01

    Instrument downtime leads to uncertainty in the monthly and annual record of cloud fraction (CF), making it difficult to perform time series analyses of cloud properties and perform detailed evaluations of model simulations. As cloud occurrence is partially controlled by the large-scale atmospheric environment, this knowledge is used to reduce uncertainties in the instrument record. Synoptic patterns diagnosed from the North American Regional Reanalysis (NARR) during the period 1997-2010 are classified using a competitive neural network known as the self-organizing map (SOM). The classified synoptic states are then compared to the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) instrument record to determine the expected CF. A number of SOMs are tested to understand how the number of classes and the period of classifications impact the relationship between classified states and CFs. Bootstrapping is utilized to quantify the uncertainty of the instrument record when statistical information from the SOM is included. Although all SOMs significantly reduce the uncertainty of the CF record calculated in Kennedy et al. (Theor Appl Climatol 115:91-105, 2014), SOMs with a large number of classes and separated by month are required to produce the lowest uncertainty and best agreement with the annual cycle of CF. This result may be due to a manifestation of seasonally dependent biases in NARR. With use of the SOMs, the average uncertainty in monthly CF is reduced in half from the values calculated in Kennedy et al. (Theor Appl Climatol 115:91-105, 2014).
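
    A minimal sketch of the bootstrapping step: resample the available daily cloud-fraction values within a month (synthetic here, with missing days) to put a confidence interval on the monthly mean. Conditioning on SOM-classified synoptic states, which is what reduces the uncertainty in the study, is not reproduced.

    ```python
    # Bootstrap confidence interval for a monthly cloud fraction (synthetic data).
    import numpy as np

    rng = np.random.default_rng(3)
    daily_cf = rng.beta(2, 2, size=22)  # 22 of ~30 days observed (assumed gaps)

    boot_means = np.array([
        rng.choice(daily_cf, size=daily_cf.size, replace=True).mean()
        for _ in range(5_000)
    ])
    lo, hi = np.percentile(boot_means, [2.5, 97.5])
    print(f"monthly CF = {daily_cf.mean():.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
    ```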

  7. Essential information: Uncertainty and optimal control of Ebola outbreaks

    USGS Publications Warehouse

    Li, Shou-Li; Bjornstad, Ottar; Ferrari, Matthew J.; Mummah, Riley; Runge, Michael C.; Fonnesbeck, Christopher J.; Tildesley, Michael J.; Probert, William J. M.; Shea, Katriona

    2017-01-01

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.
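
    A minimal sketch of the value-of-information logic: with model structure unresolved, one intervention must be chosen against the probability-weighted average model, whereas resolving model structure allows the best intervention per model. The caseload matrix and model probabilities are hypothetical, not the 37 published Ebola models.

    ```python
    # Expected value of resolving model-structure uncertainty (hypothetical).
    import numpy as np

    # rows: candidate interventions; columns: alternative model structures
    caseload = np.array([
        [120.0,  90.0, 150.0],   # reduce funeral transmission
        [110.0, 130.0, 100.0],   # reduce community transmission
        [180.0, 160.0, 210.0],   # another candidate intervention
    ])
    model_prob = np.array([0.5, 0.3, 0.2])

    expected_now = (caseload @ model_prob).min()      # best single intervention
    expected_resolved = caseload.min(axis=0) @ model_prob  # best per model
    print(f"expected caseload saved by resolving model uncertainty: "
          f"{expected_now - expected_resolved:.1f}")
    ```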

  8. Essential information: Uncertainty and optimal control of Ebola outbreaks.

    PubMed

    Li, Shou-Li; Bjørnstad, Ottar N; Ferrari, Matthew J; Mummah, Riley; Runge, Michael C; Fonnesbeck, Christopher J; Tildesley, Michael J; Probert, William J M; Shea, Katriona

    2017-05-30

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeze, R.A.; McWhorter, D.B.

    Many emerging remediation technologies are designed to remove contaminant mass from source zones at DNAPL sites in response to regulatory requirements. There is often concern in the regulated community as to whether mass removal actually reduces risk, or whether the small risk reductions achieved warrant the large costs incurred. This paper sets out a proposed framework for quantifying the degree to which risk is reduced as mass is removed from DNAPL source areas in shallow, saturated, low-permeability media. Risk is defined in terms of meeting an alternate concentration limit (ACL) at a compliance well in an aquifer underlying the source zone. The ACL is back-calculated from a carcinogenic health-risk characterization at a downgradient water-supply well. Source-zone mass-removal efficiencies are heavily dependent on the distribution of mass between media (fractures, matrix) and phase (aqueous, sorbed, NAPL). Due to the uncertainties in currently available technology performance data, the scope of the paper is limited to developing a framework for generic technologies rather than making specific risk-reduction calculations for individual technologies. Despite the qualitative nature of the exercise, results imply that very high total mass-removal efficiencies are required to achieve significant long-term risk reduction with technology applications of finite duration. This paper is not an argument for no action at contaminated sites. Rather, it provides support for the conclusions of Cherry et al. (1992) that the primary goal of current remediation should be short-term risk reduction through containment, with the aim to pass on to future generations site conditions that are well-suited to the future applications of emerging technologies with improved mass-removal capabilities.

  10. Method to Calculate Uncertainty Estimate of Measuring Shortwave Solar Irradiance using Thermopile and Semiconductor Solar Radiometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reda, I.

    2011-07-01

    The uncertainty of measuring solar irradiance is fundamentally important for solar energy and atmospheric science applications. Without an uncertainty statement, the quality of a result, model, or testing method cannot be quantified, the chain of traceability is broken, and confidence cannot be maintained in the measurement. Measurement results are incomplete and meaningless without a statement of the estimated uncertainty with traceability to the International System of Units (SI) or to another internationally recognized standard. This report explains how to use International Guidelines of Uncertainty in Measurement (GUM) to calculate such uncertainty. The report also shows that without appropriate corrections to solar measuring instruments (solar radiometers), the uncertainty of measuring shortwave solar irradiance can exceed 4% using present state-of-the-art pyranometers and 2.7% using present state-of-the-art pyrheliometers. Finally, the report demonstrates that by applying the appropriate corrections, uncertainties may be reduced by at least 50%. The uncertainties, with or without the appropriate corrections, might not be compatible with the needs of solar energy and atmospheric science applications; yet, this report may shed some light on the sources of uncertainties and the means to reduce overall uncertainty in measuring solar irradiance.
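
    As a hedged sketch of the GUM procedure the report applies, the fragment below combines independent standard uncertainties in quadrature with sensitivity coefficients and expands the result with a coverage factor k = 2. The budget entries are invented placeholders, not the report's actual radiometer budget.

        import math

        # Hypothetical budget: (component, standard uncertainty u_i in %,
        # sensitivity coefficient c_i); all components assumed independent.
        budget = [
            ("calibration",       1.0, 1.0),
            ("cosine response",   1.5, 1.0),
            ("thermal offset",    0.8, 1.0),
            ("spectral response", 0.5, 1.0),
        ]

        u_c = math.sqrt(sum((c * u) ** 2 for _, u, c in budget))
        U = 2.0 * u_c    # expanded uncertainty, coverage factor k = 2 (~95%)
        print(f"u_c = {u_c:.2f}%, U(k=2) = {U:.2f}%")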

  11. Wind effects on long-span bridges: Probabilistic wind data format for buffeting and VIV load assessments

    NASA Astrophysics Data System (ADS)

    Hoffmann, K.; Srouji, R. G.; Hansen, S. O.

    2017-12-01

    The technology development within the structural design of long-span bridges in Norwegian fjords has created a need for reformulating the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind is presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.

  12. Reusable Launch Vehicle Control in Multiple Time Scale Sliding Modes

    NASA Technical Reports Server (NTRS)

    Shtessel, Yuri

    1999-01-01

    A reusable launch vehicle control problem during ascent is addressed via multiple-time-scaled continuous sliding mode control. The proposed sliding mode controller utilizes a two-loop structure and provides robust, decoupled tracking of both orientation-angle command profiles and angular-rate command profiles in the presence of bounded external disturbances and plant uncertainties. Sliding mode control constrains the angular-rate and orientation-angle tracking-error dynamics to linear, decoupled, homogeneous, vector-valued differential equations with desired eigenvalue placement. The dual-time-scale sliding mode controller was designed for the X-33 technology demonstration sub-orbital launch vehicle in the launch mode. 6DOF simulation results show that the designed controller provides robust, accurate, decoupled tracking of the orientation-angle command profiles in the presence of external disturbances and vehicle inertia uncertainties. This creates the possibility of operating the X-33 vehicle in an aircraft-like mode with reduced pre-launch adjustment of the control system.
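
    A hedged single-axis illustration of the sliding-mode idea (not the X-33 two-loop design): for a double-integrator attitude channel with a bounded disturbance, the surface s = edot + lam*e is driven to zero, after which the tracking error obeys the chosen linear dynamics. Gains and the disturbance model are invented.

        import numpy as np

        # Track a step angle command theta_c = 1 rad for thetaddot = u + d(t).
        lam, K, dt = 2.0, 5.0, 0.001
        theta, omega = 0.0, 0.0
        for k in range(5000):                       # simulate 5 s
            t = k * dt
            e, edot = theta - 1.0, omega            # tracking errors
            s = edot + lam * e                      # sliding surface
            # smooth sign() via tanh to limit chattering; K > |d| ensures reaching
            u = -lam * edot - K * np.tanh(s / 0.05)
            d = 0.5 * np.sin(3 * t)                 # bounded disturbance
            omega += (u + d) * dt
            theta += omega * dt
        print(f"tracking error after 5 s: {theta - 1.0:.4f} rad")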

  13. Research on effect of rough surface on FMCW laser radar range accuracy

    NASA Astrophysics Data System (ADS)

    Tao, Huirong

    2018-03-01

    The large-scale measurement of non-cooperative targets based on frequency-modulated continuous-wave (FMCW) laser detection and ranging technology has broad application prospects, as measurement is easily automated without cooperative targets. However, the complexity and diversity of the characteristics of the measured surface directly affect the measurement accuracy. First, a theoretical analysis of range accuracy for an FMCW laser radar was carried out, yielding the relationship between surface reflectivity and accuracy. Then, to verify the effect of surface reflectance on ranging accuracy, a standard tool ball and three standard roughness samples were measured at distances from 7 m to 24 m, and the uncertainty of each target was obtained. The results show that measurement accuracy increases with surface reflectivity, and good agreement was obtained between the theoretical analysis and the measurements from rough surfaces. Furthermore, when the laser spot diameter is smaller than the surface correlation length, a multi-point averaged measurement can reduce the measurement uncertainty; the experimental results show that this method is feasible.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kress, Joel David

    The development and scale-up of cost-effective carbon capture processes is of paramount importance to enable the widespread deployment of these technologies to significantly reduce greenhouse gas emissions. The U.S. Department of Energy initiated the Carbon Capture Simulation Initiative (CCSI) in 2011 with the goal of developing a computational toolset that would enable industry to more effectively identify, design, scale up, operate, and optimize promising concepts. The first half of the presentation will introduce the CCSI Toolset consisting of basic data submodels, steady-state and dynamic process models, process optimization and uncertainty quantification tools, an advanced dynamic process control framework, and high-resolution filtered computational fluid dynamics (CFD) submodels. The second half of the presentation will describe a high-fidelity model of a mesoporous silica supported, polyethylenimine (PEI)-impregnated solid sorbent for CO2 capture. The sorbent model includes a detailed treatment of transport and amine-CO2-H2O interactions based on quantum chemistry calculations. Using a Bayesian approach for uncertainty quantification, we calibrate the sorbent model to thermogravimetric (TGA) data.
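
    As a hedged sketch of the Bayesian calibration step mentioned above (entirely illustrative, not the CCSI sorbent model), the fragment below fits one parameter of a toy Langmuir-type CO2 uptake model to synthetic TGA-like data with a random-walk Metropolis sampler.

        import numpy as np

        rng = np.random.default_rng(0)
        P = np.linspace(0.05, 1.0, 10)                   # CO2 partial pressure (bar)
        model = lambda qmax: qmax * P / (0.3 + P)        # toy uptake isotherm
        data = model(2.0) + rng.normal(0, 0.05, P.size)  # synthetic "TGA" points

        def log_post(qmax, sigma=0.05):
            if not 0.0 < qmax < 10.0:                    # flat prior on (0, 10)
                return -np.inf
            return -0.5 * np.sum((data - model(qmax)) ** 2) / sigma ** 2

        q, lp, chain = 1.0, log_post(1.0), []
        for _ in range(20000):
            q_new = q + rng.normal(0, 0.1)               # random-walk proposal
            lp_new = log_post(q_new)
            if np.log(rng.random()) < lp_new - lp:       # Metropolis accept/reject
                q, lp = q_new, lp_new
            chain.append(q)
        post = np.array(chain[5000:])                    # discard burn-in
        print(f"q_max = {post.mean():.2f} +/- {post.std():.2f}")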

  15. Intelligence by design in an entropic power grid

    NASA Astrophysics Data System (ADS)

    Negrete-Pincetic, Matias Alejandro

    In this work, the term Entropic Grid is coined to describe a power grid with increased levels of uncertainty and dynamics. These new features will require the reconsideration of well-established paradigms in the way of planning and operating the grid and its associated markets. New tools and models able to handle uncertainty and dynamics will form the required scaffolding to properly capture the behavior of the physical system, along with the value of new technologies and policies. The leverage of this knowledge will facilitate the design of new architectures to organize power and energy systems and their associated markets. This work presents several results, tools and models with the goal of contributing to that design objective. A central idea of this thesis is that the definition of products is critical in electricity markets. When markets are constructed with appropriate product definitions in mind, the interference between the physical and the market/financial systems seen in today's markets can be reduced. A key element of evaluating market designs is understanding the impact that salient features of an entropic grid---uncertainty, dynamics, constraints---can have on the electricity markets. Dynamic electricity market models tailored to capture such features are developed in this work. Using a multi-settlement dynamic electricity market, the impact of volatility is investigated. The results show the need to implement policies and technologies able to cope with the volatility of renewable sources. Similarly, using a dynamic electricity market model in which ramping costs are considered, the impacts of those costs on electricity markets are investigated. The key conclusion is that those additional ramping costs, in average terms, are not reflected in electricity prices. These results reveal several difficulties with today's real-time markets. Elements of an alternative architecture to organize these markets are also discussed.

  16. The Global Aerosol Synthesis and Science Project (GASSP): Measurements and Modeling to Reduce Uncertainty

    DOE PAGES

    Reddington, C. L.; Carslaw, K. S.; Stier, P.; ...

    2017-09-01

    The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by combining new methodologies for quantifying model uncertainty, to create an extensive global dataset of aerosol in situ microphysical and chemical measurements, and to develop new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.

  17. The Global Aerosol Synthesis and Science Project (GASSP): Measurements and Modeling to Reduce Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddington, C. L.; Carslaw, K. S.; Stier, P.

    The largest uncertainty in the historical radiative forcing of climate is caused by changes in aerosol particles due to anthropogenic activity. Sophisticated aerosol microphysics processes have been included in many climate models in an effort to reduce the uncertainty. However, the models are very challenging to evaluate and constrain because they require extensive in situ measurements of the particle size distribution, number concentration, and chemical composition that are not available from global satellite observations. The Global Aerosol Synthesis and Science Project (GASSP) aims to improve the robustness of global aerosol models by combining new methodologies for quantifying model uncertainty, to create an extensive global dataset of aerosol in situ microphysical and chemical measurements, and to develop new ways to assess the uncertainty associated with comparing sparse point measurements with low-resolution models. GASSP has assembled over 45,000 hours of measurements from ships and aircraft as well as data from over 350 ground stations. The measurements have been harmonized into a standardized format that is easily used by modelers and nonspecialist users. Available measurements are extensive, but they are biased to polluted regions of the Northern Hemisphere, leaving large pristine regions and many continental areas poorly sampled. The aerosol radiative forcing uncertainty can be reduced using a rigorous model–data synthesis approach. Nevertheless, our research highlights significant remaining challenges because of the difficulty of constraining many interwoven model uncertainties simultaneously. Although the physical realism of global aerosol models still needs to be improved, the uncertainty in aerosol radiative forcing will be reduced most effectively by systematically and rigorously constraining the models using extensive syntheses of measurements.

  18. A strategy for investment in space resource utilization

    NASA Technical Reports Server (NTRS)

    Mendell, Wendell W.

    1992-01-01

    Considerations governing a strategy for investment in the utilization of space resources are discussed. It is suggested on the basis of an examination of current trends in terms of historical processes which operate on new frontiers that the limited markets and unfamiliar technologies associated with space commercialization today may change dramatically in 20 years when lunar resources are accessible. It is argued that the uncertainty of such projections discourages investment at a useful scale unless a strategy for technology development can be implemented which provides tangible and marketable benefits in the intermediate term. At present, technologies can be identified which will be required (and therefore valuable) at the time of lunar settlement, and whose development can be planned to yield marketable intermediate products on earth. It is concluded that the formation of precompetitive collaborative research consortia in the industrial sector could reduce technical and economic risk in the early stages and could promote a favorable political environment for the future growth of space activities.

  19. Funding the unfundable: mechanisms for managing uncertainty in decisions on the introduction of new and innovative technologies into healthcare systems.

    PubMed

    Stafinski, Tania; McCabe, Christopher J; Menon, Devidas

    2010-01-01

    As tensions between payers, responsible for ensuring prudent and principled use of scarce resources, and both providers and patients, who legitimately want access to technologies from which they could benefit, continue to mount, interest in approaches to managing the uncertainty surrounding the introduction of new health technologies has heightened. The purpose of this project was to compile an inventory of various types of 'access with evidence development' (AED) schemes, examining characteristics of the technologies to which they have been applied, the uncertainty they sought to address, the terms of arrangements of each scheme, and the policy outcomes. It also aimed to identify issues related to such schemes, including advantages and disadvantages from the perspectives of various stakeholder groups. A comprehensive search, review and appraisal of peer-reviewed and 'grey' literature were performed, followed by a facilitated workshop of academics and decision makers with expertise in AED schemes. Information was extracted and compiled in tabular form to identify patterns or trends. To enhance the validity of interpretations made, member checking was performed. Although the concept of AED is not new, evaluative data are sparse. Despite varying opinions on the 'right' answers to some of the questions raised, there appears to be consensus on a 'way forward'--development of methodological guidelines. All stakeholders seemed to share the view that AEDs offer the potential to facilitate patient access to promising new technologies and encourage innovation while ensuring effective use of scarce healthcare resources. There is no agreement on what constitutes 'sufficient evidence', and it depends on the specific uncertainty in question. There is agreement on the need for 'best practice' guidelines around the implementation and evaluation of AED schemes. This is the first attempt at a comprehensive analysis of methods that have been used to address uncertainty concerning a new drug or other technology. The analysis reveals that, although various approaches have been experimented with, many of them have not achieved the ostensible goal of the approach. This article outlines challenges related to AED schemes and issues that remain unresolved.

  20. The Need for Governance by Experimentation: The Case of Biofuels.

    PubMed

    Asveld, Lotte

    2016-06-01

    The policies of the European Union concerning the development of biofuels can be termed a lock-in. Biofuels were initially hailed as a green, sustainable technology. However, evidence to the contrary quickly emerged. The European Commission proposed to alter its policies to accommodate these effects but met with fierce resistance from a considerable number of member states that have an economic interest in these first-generation biofuels. In this paper I argue that such a lock-in might have been avoided if an experimental approach to governance had been adopted. Existing approaches such as anticipation and niche management either do not reduce uncertainty sufficiently or fail to explicitly address conflicts between the values motivating political and economic support for new technologies. In this paper, I suggest applying an experimental framework to the development of sustainable biobased technologies. Such an approach builds on insights from adaptive management and transition management in that it has the stimulation of learning effects at its core. I argue that these learning effects should concern the actual impacts of new technologies, the institutionalisation of new technologies and, most specifically, the norms and values that underlie policies supporting new technologies. This approach can be relevant for other emerging technologies.

  1. Crossing the Technology Adoption Chasm: Implications for DoD

    DTIC Science & Technology

    2008-06-30

    technologies (where, given uncertainty of evaluation, adoption is driven by mimicry processes) and those technologies that exhibit network...

  2. Quantifying model uncertainty in seasonal Arctic sea-ice forecasts

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin

    2017-04-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
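
    A hedged sketch of the variance partition described above: treating forecasts[i, j] as member j of model i, model uncertainty is the spread of the model means and irreducible error growth is the within-model ensemble spread; a simple bias correction (one form of post-processing) removes the former. The numbers below are synthetic.

        import numpy as np

        rng = np.random.default_rng(1)
        model_means = np.array([4.2, 4.9, 4.5, 5.1])       # per-model biases
        forecasts = model_means[:, None] + rng.normal(0, 0.3, (4, 10))

        model_uncertainty = forecasts.mean(axis=1).var(ddof=1)   # across models
        irreducible = forecasts.var(axis=1, ddof=1).mean()       # within models

        # "Post-processing" here is a bias correction removing each model mean.
        corrected = forecasts - forecasts.mean(axis=1, keepdims=True)
        print(f"model var: {model_uncertainty:.3f}, "
              f"within-model var: {irreducible:.3f}")
        print(f"model var after bias correction: "
              f"{corrected.mean(axis=1).var(ddof=1):.3f}")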

  3. Nuclear Data Uncertainty Quantification: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  4. Development of Large-Scale Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Ruff, Gary A.; Urban, David; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Cowlard, Adam J.

    2013-01-01

    The status is presented of a spacecraft fire safety research project that is under development to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low gravity. Future crewed missions are expected to be more complex and longer in duration than previous exploration missions outside of low-earth orbit. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Given our fundamental uncertainty about the behavior of fires in low gravity, the need for realistic scale testing at reduced gravity has been demonstrated. To address this gap in knowledge, a project has been established under the NASA Advanced Exploration Systems Program under the Human Exploration and Operations Mission Directorate with the goal of substantially advancing our understanding of the spacecraft fire safety risk. Associated with the project is an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The experiments are under development to be conducted in an Orbital Sciences Corporation Cygnus vehicle after it has undocked from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. The tests will be fully automated, with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the atmosphere. A computer modeling effort will complement the experimental effort. The international topical team is collaborating with the NASA team in the definition of the experiment requirements and performing supporting analysis, experimentation and technology development. The status of the overall experiment and the associated international technology development efforts are summarized.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Liu, Xiaobing

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This paper highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects, a ground-source variable refrigerant flow (GS-VRF) system installed at the Human Health Building at Oakland University in Rochester, Michigan. This case study is based on the analysis of measured performance data, maintenance records, construction costs, and simulations of the energy consumption of conventional central heating, ventilation, and air-conditioning (HVAC) systems providing the same level of space conditioning as the demonstrated GS-VRF system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GS-VRF system, pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the GS-VRF system compared with conventional HVAC systems. This case study also identified opportunities for reducing uncertainties in the performance evaluation, improving the operational efficiency, and reducing the installed cost of similar GSHP systems in the future.

  6. Rumsey and Walker_AMT_2016_Figure 2.xlsx

    EPA Pesticide Factsheets

    Figure summarizes uncertainty (error) in hourly gradient flux measurements by individual analyte. Flux uncertainty is derived from estimates of uncertainty in chemical gradients and turbulent transfer velocity. This dataset is associated with the following publication: Rumsey, I. Application of an online ion chromatography-based instrument for gradient flux measurements of speciated nitrogen and sulfur. ENVIRONMENTAL SCIENCE & TECHNOLOGY. American Chemical Society, Washington, DC, USA, 9(6): 2581-2592, (2016).
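
    As a hedged sketch of how such flux uncertainties combine, note that a gradient flux has the form F = v_t * dC (turbulent transfer velocity times chemical gradient), so for independent errors the relative uncertainties add in quadrature. The values below are invented, not taken from the dataset.

        import math

        dC, u_dC = 0.40, 0.06    # chemical gradient and its uncertainty (ug m-3)
        vt, u_vt = 1.2, 0.15     # transfer velocity and its uncertainty (cm s-1)

        F = vt * dC
        rel_u = math.hypot(u_dC / dC, u_vt / vt)   # quadrature of relative errors
        print(f"F = {F:.3f} +/- {F * rel_u:.3f} (relative {100 * rel_u:.0f}%)")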

  7. Parameter Uncertainties for a 10-Meter Ground-Based Optical Reception Station

    NASA Technical Reports Server (NTRS)

    Shaik, K.

    1990-01-01

    Performance uncertainties for a 10-m optical reception station may arise from the nature of the communications channel or from a specific technology choice. Both types of uncertainties are described in this article to develop an understanding of the limitations imposed by them and to provide a rational basis for making technical decisions. The performance at night will be considerably higher than for daytime reception.

  8. Methodology for conceptual remote sensing spacecraft technology insertion analysis balancing performance, cost, and risk

    NASA Astrophysics Data System (ADS)

    Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.

    1997-12-01

    Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems as well as systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost-effective manner. This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.

  9. Global emission projections of particulate matter (PM): II. Uncertainty analyses of on-road vehicle exhaust emissions

    NASA Astrophysics Data System (ADS)

    Yan, Fang; Winijkul, Ekbordin; Bond, Tami C.; Streets, David G.

    2014-04-01

    Estimates of future emissions are necessary for understanding the future health of the atmosphere, designing national and international strategies for air quality control, and evaluating mitigation policies. Emission inventories are uncertain, and future projections even more so; thus it is important to quantify the uncertainty inherent in emission projections. This paper is the second in a series that seeks to establish a more mechanistic understanding of future air pollutant emissions based on changes in technology. The first paper in this series (Yan et al., 2011) described a model that projects emissions based on dynamic changes of the vehicle fleet, the Speciated Pollutant Emission Wizard-Trend, or SPEW-Trend. In this paper, we explore the underlying uncertainties of global and regional exhaust PM emission projections from on-road vehicles in the coming decades using sensitivity analysis and Monte Carlo simulation. This work examines the emission sensitivities due to uncertainties in retirement rate, timing of emission standards, transition rate of high-emitting vehicles called “superemitters”, and emission factor degradation rate. It is concluded that global emissions are most sensitive to parameters in the retirement rate function. Monte Carlo simulations show that emission uncertainty caused by lack of knowledge about technology composition is comparable to the uncertainty demonstrated by alternative economic scenarios, especially during the period 2010-2030.
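
    A hedged toy version of the Monte Carlo exercise described above (not SPEW-Trend): sample two of the uncertain quantities named in the abstract, a fleet median retirement age and a superemitter fraction, and examine the spread of a single-cohort 2030 PM emission estimate. All numbers are invented.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 10000
        median_age = rng.uniform(12, 20, n)       # yr, retirement-rate parameter
        super_frac = rng.uniform(0.05, 0.20, n)   # fraction of high emitters

        fleet = 1e6                               # vehicles in the cohort
        survive_2030 = np.exp(-np.log(2) * 15 / median_age)  # 15-yr survival share
        ef_normal, ef_super = 0.05, 1.0           # g/km emission factors
        km = 12000.0                              # annual km per vehicle
        emis = fleet * survive_2030 * km * (
            (1 - super_frac) * ef_normal + super_frac * ef_super) / 1e6  # tonnes

        print(f"2030 PM emissions: median {np.median(emis):.0f} t, "
              f"90% range [{np.percentile(emis, 5):.0f}, "
              f"{np.percentile(emis, 95):.0f}] t")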

  10. Technical report: The design and evaluation of a basin-scale wireless sensor network for mountain hydrology

    NASA Astrophysics Data System (ADS)

    Zhang, Ziran; Glaser, Steven D.; Bales, Roger C.; Conklin, Martha; Rice, Robert; Marks, Danny G.

    2017-05-01

    A network of sensors for spatially representative water-balance measurements was developed and deployed across the 2000 km2 snow-dominated portion of the upper American River basin, primarily to measure changes in snowpack and soil-water storage, air temperature, and humidity. This wireless sensor network (WSN) consists of 14 sensor clusters, each with 10 measurement nodes that were strategically placed within a 1 km2 area, across different elevations, aspects, slopes, and canopy covers. Compared to existing operational sensor installations, the WSN reduces hydrologic uncertainty in at least three ways. First, redundant measurements improved estimation of lapse rates for air and dew-point temperature. Second, distributed measurements captured local variability and constrained uncertainty in air and dew-point temperature, snow accumulation, and derived hydrologic attributes important for modeling and prediction. Third, the distributed relative-humidity measurements offer a unique capability to monitor upper-basin patterns in dew-point temperature and characterize the elevation gradient of water vapor-pressure deficit across steep, variable topography. Network statistics during the first year of operation demonstrated that the WSN was robust for cold, wet, and windy conditions in the basin. The electronic technology used in the WSN reduced adverse effects seen in previous remote WSNs, such as high current consumption, multipath signal fading, and clock drift.
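
    As a hedged sketch of the first benefit listed above, the fragment below estimates an air-temperature lapse rate by regressing synthetic node temperatures on elevation; with 140 distributed nodes the slope's standard error is small, which is the sense in which redundant measurements improve lapse-rate estimation. Data are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)
        elev = rng.uniform(1200, 2400, 140)       # m, ~14 clusters x 10 nodes
        true_lapse = -6.5e-3                      # degC per m
        temp = 20.0 + true_lapse * elev + rng.normal(0, 0.8, elev.size)

        # Ordinary least squares: temp = slope * elev + intercept
        A = np.vstack([elev, np.ones_like(elev)]).T
        (slope, intercept), res, *_ = np.linalg.lstsq(A, temp, rcond=None)
        sigma2 = res[0] / (elev.size - 2)         # residual variance
        se_slope = np.sqrt(sigma2 * np.linalg.inv(A.T @ A)[0, 0])
        print(f"lapse rate: {slope * 1000:.2f} +/- {se_slope * 1000:.2f} degC/km")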

  11. An application of multiattribute decision analysis to the Space Station Freedom program. Case study: Automation and robotics technology evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Levin, Richard R.; Carpenter, Elisabeth J.

    1990-01-01

    The results are described of an application of multiattribute analysis to the evaluation of high leverage prototyping technologies in the automation and robotics (A and R) areas that might contribute to the Space Station (SS) Freedom baseline design. An implication is that high leverage prototyping is beneficial to the SS Freedom Program as a means for transferring technology from the advanced development program to the baseline program. The process also highlights the tradeoffs to be made between subsidizing high value, low risk technology development versus high value, high risk technology developments. Twenty-one A and R technology tasks spanning a diverse array of technical concepts were evaluated using multiattribute decision analysis. Because of large uncertainties associated with characterizing the technologies, the methodology was modified to incorporate uncertainty. The rankings were affected by eight attributes, including initial cost, operations cost, crew productivity, safety, resource requirements, growth potential, and spinoff potential. The four attributes of initial cost, operations cost, crew productivity, and safety affected the rankings the most.
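
    A hedged sketch of multiattribute scoring under uncertainty in the spirit described above: each task receives uncertain 0-10 attribute scores, and Monte Carlo sampling yields both mean utilities and how often each task ranks first. The tasks, weights, and scores are invented, and the simple weighted sum stands in for the full multiattribute methodology.

        import numpy as np

        rng = np.random.default_rng(4)
        weights = np.array([0.3, 0.2, 0.3, 0.2])   # initial cost, ops cost,
                                                   # crew productivity, safety
        # mean score and score uncertainty (std) per task, per attribute
        means = np.array([[6, 7, 5, 8], [8, 5, 7, 6], [5, 6, 9, 7]], float)
        stds  = np.array([[1, 1, 2, 1], [2, 1, 1, 1], [1, 2, 1, 2]], float)

        draws = rng.normal(means, stds, (5000, *means.shape))  # (trial, task, attr)
        utilities = draws @ weights                            # weighted sums
        win_rate = np.bincount(utilities.argmax(axis=1), minlength=3) / 5000
        for i, w in enumerate(win_rate):
            print(f"task {i}: mean utility {utilities[:, i].mean():.2f}, "
                  f"ranked first {100 * w:.0f}% of trials")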

  12. Comprehensive analysis of proton range uncertainties related to stopping-power-ratio estimation using dual-energy CT imaging

    NASA Astrophysics Data System (ADS)

    Li, B.; Lee, H. C.; Duan, X.; Shen, C.; Zhou, L.; Jia, X.; Yang, M.

    2017-09-01

    The dual-energy CT-based (DECT) approach holds promise in reducing the overall uncertainty in proton stopping-power-ratio (SPR) estimation as compared to the conventional stoichiometric calibration approach. The objective of this study was to analyze the factors contributing to uncertainty in SPR estimation using the DECT-based approach and to derive a comprehensive estimate of the range uncertainty associated with SPR estimation in treatment planning. Two state-of-the-art DECT-based methods were selected and implemented on a Siemens SOMATOM Force DECT scanner. The uncertainties were first divided into five independent categories. The uncertainty associated with each category was estimated for lung, soft and bone tissues separately. A single composite uncertainty estimate was eventually determined for three tumor sites (lung, prostate and head-and-neck) by weighting the relative proportion of each tissue group for that specific site. The uncertainties associated with the two selected DECT methods were found to be similar; therefore, the following results apply to both methods. The overall uncertainty (1σ) in SPR estimation with the DECT-based approach was estimated to be 3.8%, 1.2% and 2.0% for lung, soft and bone tissues, respectively. The dominant factor contributing to uncertainty in the DECT approach was the imaging uncertainties, followed by the DECT modeling uncertainties. Our study showed that the DECT approach can reduce the overall range uncertainty to approximately 2.2% (2σ) in clinical scenarios, in contrast to the previously reported 1%.
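
    A hedged arithmetic sketch of the final composition step: combine the per-tissue SPR uncertainties quoted above into a site-level estimate by weighting each tissue group's share of the beam path. The path shares are invented, and the proportion-weighted quadrature rule is an assumption for illustration, not the paper's exact procedure.

        import math

        u_tissue = {"lung": 3.8, "soft": 1.2, "bone": 2.0}       # 1-sigma, %
        path_share = {"lung": 0.05, "soft": 0.80, "bone": 0.15}  # hypothetical site

        u_site = math.sqrt(sum((path_share[t] * u_tissue[t]) ** 2
                               for t in u_tissue))
        print(f"composite SPR uncertainty: {u_site:.2f}% (1 sigma)")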

  13. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    NASA Astrophysics Data System (ADS)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

    Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise under an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits a non-monotonic dynamical behavior—that is, the amount of uncertainty first inflates and subsequently decreases as the decoherence strength grows in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initial state shared beforehand. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer an insight into the dynamics and steering of entropic uncertainty in open systems.
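
    For reference, the bound such studies build on is the quantum-memory-assisted entropic uncertainty relation of Berta et al. (2010), stated here in common notation rather than this paper's own, for two measurements Q and R on a qubit A held jointly with a memory B:

        \[
        S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
        \qquad
        c = \max_{i,j} \bigl| \langle \psi_i | \phi_j \rangle \bigr|^2 ,
        \]

    where c measures the complementarity of the two observables and S(A|B) is the conditional von Neumann entropy, which is negative for entangled states and thereby tightens the bound.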

  14. Blade tip timing (BTT) uncertainties

    NASA Astrophysics Data System (ADS)

    Russhard, Pete

    2016-06-01

    Blade Tip Timing (BTT) is an alternative technique for characterising blade vibration in which non-contact timing probes (e.g. capacitance or optical probes), typically mounted on the engine casing, are used to measure the time at which a blade passes each probe. This time is compared with the time at which the blade would have passed the probe had it been undergoing no vibration. For a number of years the aerospace industry has been sponsoring research into Blade Tip Timing technologies, developed as tools to obtain rotor-blade tip deflections. These have been successful in demonstrating the potential of the technology, but have rarely produced quantitative data together with a demonstrated, traceable value for measurement uncertainty. BTT technologies have been developed under a cloak of secrecy by the gas turbine OEMs because of the competitive advantage offered if the technique could be shown to work. BTT measurements are sensitive to many variables, and there is a need to quantify the measurement uncertainty of the complete technology and to define a set of guidelines for how BTT should be applied to different vehicles. Data from a US government-sponsored program that brought together four different tip-timing systems and a gas turbine engine test showed that the systems were just capable of obtaining measurements within a +/-25% uncertainty band when compared with strain gauges, even when using the same input data sets.
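
    As a hedged sketch of the basic tip-timing conversion implied above, the fragment below turns an arrival-time offset at a casing probe into a tip deflection via the tip speed; all values are invented.

        import math

        rpm = 12000.0
        radius = 0.35                        # m, blade tip radius
        omega = rpm * 2 * math.pi / 60.0     # rad/s
        tip_speed = omega * radius           # m/s

        t_expected = 1.0e-3                  # s, arrival time with no vibration
        t_measured = 1.0021e-3               # s, observed arrival time
        deflection = tip_speed * (t_measured - t_expected)
        print(f"tip deflection: {deflection * 1000:.3f} mm")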

  15. Biogeosystem technique as the way to certainty of soil, hydrosphere, environment and climate

    NASA Astrophysics Data System (ADS)

    Kalinitchenko, Valery; Batukaev, Abdulmalik; Zarmaev, Ali; Startsev, Viktor; Chernenko, Vladimir; Dikaev, Zaurbek; Sushkova, Svetlana

    2016-04-01

    The modern technological platform awkwardly imitates Nature. Teaching the geosciences, developing technology, and overcoming the problem of uncertainty of the geospheres cannot be done on the basis of outdated knowledge. The emphasis should be placed not on natural analogues but on our new technologies: Biogeosystem Technique (BGT*). BGT* is a transcendental approach (one that does not imitate natural processes) to soil processing and to the regulation of fluxes of energy, gas, water, matter, and the biological productivity of the biosphere. Intrasoil milling of the 20-50 cm soil layer provides a new soil disperse system and the best conditions for stable evolution of the techno-soil and plant growth for up to 40 years after a single processing. Pulse intrasoil discrete irrigation injects small discrete doses of water that distribute in a vertical soil cylinder; the lateral distance between successive injections is 10-15 cm. Within 5-10 min after injection, the water spreads in a cylinder 2-4 cm in diameter at depths from 5 to 50 cm, while the soil carcass around the cylinder remains dry and mechanically stable. The mean thermodynamic soil water potential after watering is about -0.2 MPa. The stomatal apparatus stays in a regulation mode, the transpiration rate is reduced, the soil solution concentration is increased, and the plant nutrition rate and biological productivity are high, with no excessive plant transpiration, evaporation, or seepage of water from the soil. Environmentally safe intrasoil waste return during intrasoil milling and/or pulse discrete watering with nutrition provides medically, veterinarily, and environmentally safe recycling of municipal, industrial, biological, and agricultural wastes into the soil continuum; all applied substances are transformed into plant nutrients rather than degrading to greenhouse gases or becoming deposited waste. The capabilities of BGT* intrasoil technologies to correct and sustain Nature include: corrected soil evolution, with the long-term biological productivity of intrasoil-processed soil 150% higher than the initial level; savings of fresh water by intrasoil irrigation of up to 20 times; and biological return of matter and high biological productivity of soil through environmentally safe intrasoil waste recycling. On the basis of BGT*, opportunities are opened for: controlled, stable, safe, and biologically effective soil, environment, and landscape; improved equilibria in soil, environment, and landscape; reduced water consumption; improved waste management; reduced flux of nutrients to water systems; transformation of carbon in the soil into elements of plant nutrition; reduced degradation of biological matter to greenhouse gases; increased biological consumption of carbon dioxide by photosynthesis in terrestrial systems; prolongation of the carbon phase in terrestrial biological systems for greenhouse-gas sequestration; extension of the active area of the biosphere on the terrestrial part of the Earth; high-rate oxidation of methane and hydrogen sulfide by oxygen that is ionized in photosynthesis and is thus biologically active; and high biological product output of the biosphere. The more biomass on Earth, the more ecologically safe food, raw material, and biofuel can be produced, and the better the conditions for the technologies of the Noosphere. The uncertainty of soil, hydrosphere, environment, and climate will be reduced by the BGT* methods. BGT* robotic systems of low cost and minimal consumption of energy and materials are available.

  16. Estimation of Vickers hardness uncertainty for a heterogeneous welded joint (S235JR+AR and X2CrNiMo17-12-2)

    NASA Astrophysics Data System (ADS)

    Dijmărescu, M. C.; Dijmărescu, M. R.

    2017-08-01

    When talking about tests that include measurements, the uncertainty of measurement is an essential element, because it is important to know the limits within which the obtained results may be assumed to lie and the influence that the elements of the measurement technological system have on these results. The research presented in this paper focuses on the estimation of the Vickers hardness uncertainty of measurement for the heterogeneous welded joint between S235JR+AR and X2CrNiMo17-12-2 materials in order to establish the relevance of the results and the quality assessment of this joint. The paper contents are structured in three main parts. In the first part, the initial data necessary for the experiment are presented in terms of the welded joint and the characterisation of the technological means. The second part presents the development of the physical experiment and its results, and in the third part the uncertainty of the measurements is calculated and the results are discussed.
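
    A hedged GUM-style sketch of the kind of propagation such an estimation involves: Vickers hardness is HV = 0.1891 F / d^2 (F in N, d the mean indentation diagonal in mm), so the diagonal's relative uncertainty enters with a factor of two. The input values are invented, not the paper's measurements on the S235JR+AR / X2CrNiMo17-12-2 joint.

        import math

        F, u_F = 98.07, 0.20     # applied force for HV10 and its std uncertainty, N
        d, u_d = 0.430, 0.002    # mean indent diagonal and its std uncertainty, mm

        HV = 0.1891 * F / d ** 2
        rel_u = math.hypot(u_F / F, 2 * u_d / d)   # d enters squared -> factor 2
        U = 2 * HV * rel_u                         # expanded uncertainty, k = 2
        print(f"HV = {HV:.0f} +/- {U:.0f} (k=2)")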

  17. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron M; Sengupta, Manajit

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  18. Rejoinder: Certainty, Doubt, and the Reduction of Uncertainty

    ERIC Educational Resources Information Center

    Kauffman, James M.; Sasso, Gary M.

    2006-01-01

    Postmodern arguments about doubt, certainty, and objectivity are both old and unsound. All philosophical relativity, or postmodernism by whatever name it is known, denies the possibility of objective truth. Postmodernists' arguments for reducing uncertainty or approximating truth are apparently nonexistent, and their method of reducing uncertainty…

  19. Practical uncertainty reduction and quantification in shock physics measurements

    DOE PAGES

    Akin, M. C.; Nguyen, J. H.

    2015-04-20

    We report the development of a simple error analysis sampling method for identifying intersections and inflection points to reduce total uncertainty in experimental data. This technique was used to reduce uncertainties in sound speed measurements by 80% over conventional methods. Here, we focused on its impact on a previously published set of Mo sound speed data and possible implications for phase transition and geophysical studies. However, this technique's application can be extended to a wide range of experimental data.

  20. Strategic Technology Investment Analysis: An Integrated System Approach

    NASA Technical Reports Server (NTRS)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make in a way that satisfies the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations, along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.

  1. Investment, regulation, and uncertainty: managing new plant breeding techniques.

    PubMed

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline.

  2. Investment, regulation, and uncertainty

    PubMed Central

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  3. Quantifying the Impact of Technological Trends and Spatiotemporal Variability in Hydraulic Fracturing Water Intensity

    NASA Astrophysics Data System (ADS)

    Montgomery, J.; O'sullivan, F.

    2016-12-01

    An important metric for comparing the environmental impact of hydraulically fractured oil and gas wells to other energy technologies is the water intensity, or water usage normalized to energy production. Due to varying hydraulic fracturing practices, immense variability in short-term well performance, and uncertainty about lifetime production from wells, the water intensity of wells is difficult to predict and should be modeled statistically using field data. We analyzed public production and hydraulic fracturing data for 3497 wells drilled in the North Dakota Williston Basin between 2012 and 2015 to identify technology and sweet-spotting trends and assess their impact on well productivity and water intensity. We found that the water used per well increased by an average of 43% per year over this period while the water intensity of wells increased by 32% per year. The difference in these rates was due to a trend of increasing production rates, which we found to be associated equally with changes in technology and sweet-spotting. The prevalent role of sweet-spotting means that as future drilling activity shifts into less productive areas than are presently being exploited, the water intensity of new wells will predictably increase. Although some of the variability in well productivity and water intensity is attributable to spatial heterogeneity and technology practices, a substantial amount of uncertainty is irreducible due to unobservable factors. This uncertainty can best be represented and updated with new information, such as initial rates of production, using a Bayesian decline curve model. We demonstrate how this approach can be used to forecast uncertainty of water intensity at different locations and points in time, making it a useful tool for a range of stakeholders, including regulatory agencies assessing the environmental impact of drilling activity within particular watersheds.
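
    The statistical backbone described above can be sketched in a few lines: a hyperbolic decline curve q(t) = qi / (1 + b*Di*t)^(1/b) gives each well's estimated ultimate recovery (EUR), and water intensity is frac-water volume divided by that recovery. The Monte Carlo below uses invented parameter ranges (not the Williston Basin fits) and omits the Bayesian updating step for brevity.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 2000
        qi = rng.normal(15000, 3000, n).clip(3000)  # bbl/month initial rate
        b  = rng.uniform(0.8, 1.4, n)               # hyperbolic exponent
        Di = rng.uniform(0.05, 0.15, n)             # 1/month initial decline

        t = np.arange(240.0)                        # 20-year life, monthly steps
        q = qi[:, None] / (1 + b[:, None] * Di[:, None] * t) ** (1 / b[:, None])
        eur = q.sum(axis=1)                         # bbl over the well life

        water = 8e6                                 # gal of frac water per well
        intensity = water / eur                     # gal/bbl
        print(f"water intensity: median {np.median(intensity):.0f} gal/bbl, "
              f"P10-P90 [{np.percentile(intensity, 10):.0f}, "
              f"{np.percentile(intensity, 90):.0f}] gal/bbl")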

  4. When is enough evidence enough? - Using systematic decision analysis and value-of-information analysis to determine the need for further evidence.

    PubMed

    Siebert, Uwe; Rochau, Ursula; Claxton, Karl

    2013-01-01

    Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now given the best available evidence (and its uncertainty)?; 2) Subsequent to the current decision and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the currently best available evidence, VoI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton et al. summarise: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VoI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis. To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples for the use of the decision-analytic VoI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders in determining whether additional evidence is needed to make decisions and, if so, which evidence. Copyright © 2013. Published by Elsevier GmbH.
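
    As a hedged numeric sketch of the central VoI quantity described in the tutorial, the fragment below computes the expected value of perfect information as the gap between deciding after uncertainty resolves and deciding now. The net-benefit numbers and scenario probabilities are invented.

        import numpy as np

        p = np.array([0.6, 0.4])          # P(new therapy effective / not effective)
        nb = np.array([[120.0, -40.0],    # net benefit of "adopt new" per scenario
                       [ 50.0,  50.0]])   # net benefit of "keep standard"

        decide_now = (nb @ p).max()                 # best action under uncertainty
        decide_after = (p * nb.max(axis=0)).sum()   # best action in each scenario
        print(f"EVPI = {decide_after - decide_now:.1f} per patient")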

  5. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents the status of testing in the BR&T water tunnel, analysis of the resulting data, and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  6. Fatigue damage prognosis of internal delamination in composite plates under cyclic compression loadings using affine arithmetic as uncertainty propagation tool

    NASA Astrophysics Data System (ADS)

    Gbaguidi, Audrey J.-M.

    Structural health monitoring (SHM) has become indispensable for reducing maintenance costs and increasing the in-service capacity of a structure. The increased use of lightweight composite materials in aircraft structures has drastically increased the effects of fatigue-induced damage on their critical structural components and thus the necessity to predict the remaining life of those components. Damage prognosis, one of the least investigated fields in SHM, uses the current damage state of the system to forecast its future performance by estimating the expected loading environments. A successful damage prediction model requires the integration of technologies in areas like measurements, materials science, mechanics of materials, and probability theories, but most importantly the quantification of uncertainty in all these areas. In this study, Affine Arithmetic is used as a method for incorporating the uncertainties due to the material properties into the fatigue life prognosis of composite plates subjected to cyclic compressive loadings. When loadings are compressive in nature, the composite plates undergo repeated buckling-unloading of the delaminated layer, which induces mixed mode I and mode II states of stress at the tip of the delamination in the plates. The Kardomateas model-based prediction law is used to predict the growth of the delamination, while the integration of the effects of the uncertainties for the mode I and mode II coefficients into the fatigue life prediction model is handled using Affine Arithmetic. The mode I and mode II interlaminar fracture toughness and fatigue characterization of the composite plates are first experimentally studied to obtain the material coefficients and fracture toughness, respectively. Next, the obtained coefficients are used in the Kardomateas law to predict the delamination lengths in the composite plates while using Affine Arithmetic to handle their uncertainties. Finally, the fatigue characterization of the composite plates during compressive-buckling loadings is experimentally studied, and the delamination lengths obtained are compared with the predicted values to check the performance of Affine Arithmetic as an uncertainty propagation tool.
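
    Affine arithmetic represents each uncertain quantity as a center value plus a sum of bounded noise terms, so correlations between intermediate quantities are carried through the computation rather than discarded as in plain interval arithmetic. A self-contained sketch follows; the AffineForm class and the Paris-type growth coefficients are illustrative stand-ins, not the thesis's Kardomateas model:

      import itertools

      _counter = itertools.count()

      class AffineForm:
          """x = x0 + sum_i xi * eps_i, with each eps_i in [-1, 1]."""
          def __init__(self, x0, terms=None):
              self.x0 = float(x0)
              self.terms = dict(terms or {})

          @classmethod
          def from_interval(cls, lo, hi):
              return cls((lo + hi) / 2, {next(_counter): (hi - lo) / 2})

          def rad(self):
              return sum(abs(c) for c in self.terms.values())

          def interval(self):
              return self.x0 - self.rad(), self.x0 + self.rad()

          def __add__(self, other):
              t = dict(self.terms)
              for k, c in other.terms.items():
                  t[k] = t.get(k, 0.0) + c
              return AffineForm(self.x0 + other.x0, t)

          def __mul__(self, other):
              t = {k: self.x0 * c for k, c in other.terms.items()}
              for k, c in self.terms.items():
                  t[k] = t.get(k, 0.0) + other.x0 * c
              # Nonlinear remainder folded into a fresh noise symbol.
              t[next(_counter)] = self.rad() * other.rad()
              return AffineForm(self.x0 * other.x0, t)

      # Illustrative growth step, da/dN = C * (dG)^2, i.e. one cycle of a
      # Paris-type law (coefficients are made up, not measured values).
      C = AffineForm.from_interval(0.9e-6, 1.1e-6)   # uncertain coefficient
      dG = AffineForm.from_interval(180.0, 220.0)    # uncertain driving force
      da = C * dG * dG
      print("delta a per cycle in", da.interval())

    Because shared noise symbols cancel exactly, differences of correlated quantities stay tight in affine arithmetic, which is what makes it attractive over naive interval propagation for iterated growth laws.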

  7. Uncertainty of Polarized Parton Distributions

    NASA Astrophysics Data System (ADS)

    Hirai, M.; Goto, Y.; Horaguchi, T.; Kobayashi, H.; Kumano, S.; Miyama, M.; Saito, N.; Shibata, T.-A.

    Polarized parton distribution functions are determined by a χ2 analysis of polarized deep inelastic scattering data. In this paper, the uncertainty of the obtained distribution functions is investigated with a Hessian method. We find that the uncertainty of the polarized gluon distribution is fairly large. We then estimate the gluon uncertainty by including fake data generated from the prompt-photon process at RHIC, and we observe that the uncertainty could be reduced with these data.
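
    In the Hessian method, the uncertainty of any observable O(a) built from the fitted parameters a is [δO]² = Δχ² ∇Oᵀ H⁻¹ ∇O, where H_ij = ½ ∂²χ²/∂a_i∂a_j at the minimum and Δχ² is the chosen tolerance. A generic numerical sketch with a toy two-parameter fit, not the actual polarized-PDF analysis:

      import numpy as np

      # Toy model y = a0 * x**a1 fit to pseudo-data (stand-in for the fit).
      x = np.linspace(0.1, 1.0, 20)
      a_best = np.array([2.0, 0.5])            # pretend best-fit parameters
      rng = np.random.default_rng(1)
      sigma = 0.05
      y = a_best[0] * x**a_best[1] + rng.normal(0, sigma, x.size)

      def chi2(a):
          return np.sum((y - a[0] * x**a[1])**2) / sigma**2

      def hessian_half(f, a, h=1e-4):
          """H_ij = (1/2) d2f/da_i da_j, so dchi2 ~= da^T H da."""
          n = a.size
          H = np.zeros((n, n))
          I = np.eye(n) * h
          for i in range(n):
              for j in range(n):
                  H[i, j] = (f(a + I[i] + I[j]) - f(a + I[i] - I[j])
                             - f(a - I[i] + I[j]) + f(a - I[i] - I[j])) / (8 * h * h)
          return H

      def grad(f, a, h=1e-6):
          I = np.eye(a.size) * h
          return np.array([(f(a + I[i]) - f(a - I[i])) / (2 * h)
                           for i in range(a.size)])

      H = hessian_half(chi2, a_best)
      obs = lambda a: a[0] * 0.3**a[1]         # observable: model at x = 0.3
      g = grad(obs, a_best)
      dchi2 = 1.0                              # tolerance defining 1 sigma
      dO = np.sqrt(dchi2 * g @ np.linalg.solve(H, g))
      print(f"O = {obs(a_best):.3f} +/- {dO:.3f}")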

  8. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2017-08-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
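
    The variance partition described here can be sketched in a few lines: treat the variance of the per-model ensemble means as model uncertainty and the mean within-model variance as irreducible error growth, then repeat after removing each model's climatological bias. All numbers below are synthetic stand-ins, not SIO data:

      import numpy as np

      rng = np.random.default_rng(2)
      n_models, n_runs = 5, 10            # hypothetical multi-model ensemble

      # Synthetic September-extent forecasts: per-model bias + chaotic spread.
      model_bias = rng.normal(0.0, 0.8, n_models)              # model error
      forecasts = (4.5 + model_bias[:, None]
                   + rng.normal(0.0, 0.3, (n_models, n_runs)))  # error growth

      def partition(f):
          return (f.mean(axis=1).var(ddof=1),      # across-model variance
                  f.var(axis=1, ddof=1).mean())    # within-model variance

      mv, gv = partition(forecasts)
      print(f"raw            : model var {mv:.2f}, growth var {gv:.2f}")

      # Post-processing: subtract each model's bias estimated from a
      # (here: synthetic) hindcast climatology, as SIO bias correction does.
      hindcast_bias = model_bias + rng.normal(0.0, 0.05, n_models)
      mv2, gv2 = partition(forecasts - hindcast_bias[:, None])
      print(f"bias-corrected : model var {mv2:.2f}, growth var {gv2:.2f}")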

  9. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    NASA Astrophysics Data System (ADS)

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Until now, no detailed assessment of the uncertainties in the estimates of emissions had been made. We used Monte Carlo simulation to perform such an analysis. We collated information on the uncertainties of each of the model inputs. These uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, whereas in Wales and Northern Ireland the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in any one of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. Uncertainty in the emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as the UK's is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards disaggregated inventory models, it is important that the IPCC give firm guidance on this topic.
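
    The propagation step itself is compact. A schematic Monte Carlo sketch for a single Tier-1-style term (activity data times emission factor), with illustrative distributions rather than the inventory's values, including the paper's style of sensitivity test:

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000

      # Tier-1-style direct N2O from N inputs: E = N_input * EF1.
      # Both distributions are illustrative, not the UK inventory's.
      n_input = rng.normal(1.0e9, 0.1e9, n)        # kg N applied per year
      ef1 = rng.lognormal(np.log(0.01), 0.4, n)    # kg N2O-N per kg N

      def cv(x):
          return x.std() / x.mean()

      print(f"CV of emissions           : {cv(n_input * ef1):.2%}")

      # Sensitivity test in the paper's spirit: halve the EF1 spread.
      ef1_half = rng.lognormal(np.log(0.01), 0.2, n)
      print(f"CV with EF1 spread halved : {cv(n_input * ef1_half):.2%}")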

  10. Technology Systems Analysis | Energy Analysis | NREL

    Science.gov Websites

    RD&D areas in terms of potential costs, benefits, risks, uncertainties, and timeframes. For examples of our technology systems analysis work, see these research areas: Bioenergy, Buildings, and Grid.

  11. Information transduction capacity reduces the uncertainties in annotation-free isoform discovery and quantification

    PubMed Central

    Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong

    2017-01-01

    The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity of information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm is an open-source implementation of MaxInfo. PMID:28911101

  12. Transportation Energy Futures Series: Non-Cost Barriers to Consumer Adoption of New Light-Duty Vehicle Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, T.

    2013-03-01

    Consumer preferences are key to the adoption of new vehicle technologies. Barriers to consumer adoption include price and other obstacles, such as limited driving range and charging infrastructure; unfamiliarity with the technology and uncertainty about direct benefits; limited makes and models with the technology; reputation or perception of the technology; standardization issues; and regulations. For each of these non-cost barriers, this report estimates an effective cost, summarizes the underlying influences on consumer preferences and the barrier's approximate magnitude and relative severity, and assesses potential actions, based on a comprehensive literature review. While the report concludes that non-cost barriers are significant, effective cost and potential market share are very uncertain. Policies and programs including opportunities for drivers to test drive advanced vehicles, general public outreach and information programs, incentives for providing charging and fueling infrastructure, and development of technology standards were examined for their ability to address barriers, but little quantitative data exist on the effectiveness of these measures. This is one in a series of reports produced as a result of the Transportation Energy Futures project, a Department of Energy-sponsored multi-agency effort to pinpoint underexplored strategies for reducing GHGs and petroleum dependence related to transportation.

  13. Can hydraulic-modelled rating curves reduce uncertainty in high flow data?

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Lam, Norris; Lyon, Steve W.

    2017-04-01

    Flood risk assessments rely on accurate discharge data records. Establishing a reliable rating curve for calculating discharge from stage at a gauging station normally takes years of data collection efforts. Estimation of high flows is particularly difficult as high flows occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived based on as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions. This means that a reliable rating curve can, potentially, be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. In this study we compared the uncertainty in discharge data that resulted from these two rating curve modelling approaches. We applied both methods to a Swedish catchment, accounting for uncertainties in the stage-discharge gauging and water-surface slope data for the hydraulic model and in the stage-discharge gauging data and rating-curve parameters for the traditional method. We focused our analyses on high-flow uncertainty and the factors that could reduce this uncertainty. In particular, we investigated which data uncertainties were most important, and at what flow conditions the gaugings should preferably be taken. First results show that the hydraulically-modelled rating curves were more sensitive to uncertainties in the calibration measurements of discharge than in water-surface slope. The uncertainty of the hydraulically-modelled rating curves was lowest within the range of the three calibration stage-discharge gaugings (i.e. between median and two-times-median flow), whereas uncertainties were higher outside this range. For instance, at the highest observed stage of the 24-year stage record, the 90% uncertainty band was -15% to +40% of the official rating curve. Additional gaugings at high flows (i.e. four to five times median flow) would likely substantially reduce those uncertainties. These results show the potential of hydraulically-modelled rating curves, particularly where the calibration gaugings are of high quality and cover a wide range of flow conditions.
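
    The traditional approach can be sketched with a standard power-law rating curve, Q = a(h - h0)^b, refit repeatedly to gaugings perturbed by their assumed measurement error; the spread of extrapolated discharges at a high stage is the kind of uncertainty band reported above. All values below are synthetic:

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(4)

      # A handful of stage-discharge gaugings (synthetic, ~8% gauging error).
      stage = np.array([0.4, 0.6, 0.8, 1.0, 1.3])            # m
      q_true = lambda h: 12.0 * (h - 0.2) ** 1.8             # "true" rating
      q_obs = q_true(stage) * (1 + rng.normal(0, 0.08, stage.size))

      def rating(h, a, h0, b):
          return a * np.clip(h - h0, 1e-6, None) ** b

      # Monte Carlo: refit to perturbed gaugings, then examine the spread
      # of predicted discharge at a high, ungauged stage (extrapolation).
      preds = []
      for _ in range(500):
          q_mc = q_obs * (1 + rng.normal(0, 0.08, stage.size))
          try:
              p, _ = curve_fit(rating, stage, q_mc, p0=[10.0, 0.1, 1.5],
                               bounds=([0.1, 0.0, 0.5], [100.0, 0.39, 3.0]))
          except RuntimeError:
              continue                                       # skip failed fits
          preds.append(rating(2.0, *p))
      lo, hi = np.percentile(preds, [5, 95])
      print(f"90% band for Q at h = 2.0 m: {lo:.1f} to {hi:.1f} m3/s")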

  14. Air traffic management as principled negotiation between intelligent agents

    NASA Technical Reports Server (NTRS)

    Wangermann, J. P.

    1994-01-01

    The major challenge facing the world's aircraft/airspace system (AAS) today is the need to provide increased capacity, while reducing delays, increasing the efficiency of flight operations, and improving safety. Technologies are emerging that should improve the performance of the system, but which could also introduce uncertainty, disputes, and inefficiency if not properly implemented. The aim of our research is to apply techniques from intelligent control theory and decision-making theory to define an Intelligent Aircraft/Airspace System (IAAS) for the year 2025. The IAAS would make effective use of the technical capabilities of all parts of the system to meet the demand for increased capacity with improved performance.

  15. Feasibility of an orbital simulator of stratospheric photochemistry

    NASA Technical Reports Server (NTRS)

    Matloff, G. L.; Hoffert, M. I.

    1978-01-01

    It is proposed that a stratospheric photochemistry simulator could be created in sun-synchronous orbit, so that diffusion and photochemistry could be decoupled and uncertainties in photochemical reaction rates could be substantially reduced. The proposed test chamber is described, and it is suggested that the technology of superpressure balloons seems to be the best short-term solution to the construction of the proposed facility. Both unreinforced polyester films and gelatin films are considered as candidate chamber coatings. It is noted that the experiments can be performed early in the space-manufacturing era and that at least three dedicated Shuttle launches will be required to establish the proposed facility.

  16. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
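
    Combining such components is standard quadrature propagation: for the power factor PF = S²/ρ, the relative uncertainty is sqrt((2·u_S/S)² + (u_ρ/ρ)²), with each of u_S and u_ρ itself a root-sum-square of systematic and statistical parts. A sketch with illustrative component values, not ZEM-3 specifications:

      import numpy as np

      # Illustrative measured values (not instrument data).
      S, rho = 180e-6, 2.0e-5          # Seebeck (V/K), resistivity (ohm m)
      pf = S**2 / rho                  # power factor, W/(m K^2)

      # Relative component uncertainties, combined in quadrature per type
      # (geometry, probe placement, cold finger; values are made up).
      u_S_sys = [0.02, 0.015, 0.01]
      u_S_stat = [0.005]
      u_rho = [0.03, 0.004]

      rss = lambda parts: np.sqrt(np.sum(np.square(parts)))
      u_S, u_r = rss(u_S_sys + u_S_stat), rss(u_rho)

      # First-order propagation for PF = S^2 / rho.
      u_pf = pf * np.sqrt((2 * u_S)**2 + u_r**2)
      print(f"PF = {pf:.3e} +/- {u_pf:.3e} W m^-1 K^-2 "
            f"({u_pf / pf:.1%} relative)")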

  17. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  18. Challenges and opportunities of cloud computing for atmospheric sciences

    NASA Astrophysics Data System (ADS)

    Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.

    2016-04-01

    Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from access to a large local cyberinfrastructure to fund or perform a research project. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving funding bodies flexibility in allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related. Usually uncertainty can be reduced with the availability of computational resources to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources to climate modeling, using the cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.

  19. Designing better methane mitigation policies: the challenge of distributed small sources in the natural gas sector

    NASA Astrophysics Data System (ADS)

    Ravikumar, Arvind P.; Brandt, Adam R.

    2017-04-01

    Methane—a short-lived and potent greenhouse gas—presents a unique challenge: it is emitted from a large number of highly distributed and diffuse sources. In this regard, the United States Environmental Protection Agency (EPA) has recommended periodic leak detection and repair surveys at oil and gas facilities using optical gas imaging technology. This regulation requires an operator to fix all detected leaks within a set time period. Whether such 'find-all-fix-all' policies are effective depends on significant uncertainties in the character of emissions. In this work, we systematically analyze the effect of facility-related and mitigation-related uncertainties on regulation effectiveness. Drawing from multiple publicly available datasets, we find that: (1) highly skewed leak-size distributions strongly influence emissions reduction potential; (2) variations in emissions estimates across facilities lead to large variability in mitigation effectiveness; (3) emissions reductions from optical gas imaging-based leak detection programs can range from 15% to over 70%; and (4) while implementation costs are uniformly lower than EPA estimates, benefits from saved gas are highly variable. Combining empirical evidence with model results, we propose four policy options for effective methane mitigation: performance-oriented targets for accelerated emission reductions, flexible policy mechanisms to account for regional variation, technology-agnostic regulations to encourage adoption of the most cost-effective measures, and coordination with other greenhouse gas mitigation policies to reduce unintended spillover effects.

  1. Characterization of Tactical Departure Scheduling in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Capps, Alan; Engelland, Shawn A.

    2011-01-01

    This paper discusses and analyzes current-day utilization and performance of the tactical departure scheduling process in the National Airspace System (NAS) to understand the benefits of improving this process. The analysis used operational air traffic data from over 1,082,000 flights during January 2011. Specific metrics included the frequency of tactical departure scheduling, site-specific variances in the technology's utilization, departure time prediction compliance in the tactical scheduling process, and the accuracy with which the current system can predict the airborne slot into which aircraft are scheduled from the airport surface. The operational data analysis indicates that significant room for improvement exists in the current system, primarily in reducing departure time prediction uncertainty. Results indicate that a significant number of tactically scheduled aircraft did not meet their scheduled departure slot due to departure time uncertainty. In addition to missed slots, the analysis identified increased controller workload associated with tactical departures that were subject to manual traffic-management re-scheduling or controller swaps. An analysis of achievable levels of departure time prediction accuracy, as obtained by a new integrated surface and tactical scheduling tool, is provided to assess the benefit it may offer as a solution to the identified shortfalls. A list of NAS facilities likely to receive the greatest benefit from the integrated surface and tactical scheduling technology is provided.

  2. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of $3.5 billion. Our analysis shows that a flexible design, in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future, can mitigate uncertainty and reduce the expected lifetime costs by up to $1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and desalination. Intense withdrawals for urban and agricultural use will lower the water table in the aquifer at rapid but uncertain rates due to poor groundwater characterization. We assess the potential for additional groundwater data collection and a flexible infrastructure approach similar to that in Melbourne to mitigate risk.
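
    The value of the staged, modular strategy can be illustrated with a simple two-stage Monte Carlo comparison of an up-front build against deferred modules added only when demand materializes. Capacities and costs below are illustrative placeholders, not the Melbourne figures, and discounting of deferred modules is omitted for brevity:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 50_000

      # Uncertain long-run demand (MCM/y); the distribution is made up.
      demand = rng.normal(100.0, 30.0, n).clip(min=0.0)

      # Inflexible plan: build 150 MCM/y of capacity up front.
      cost_big = 3.5e9

      # Flexible plan: build 50 MCM/y now; add 50 MCM/y modules later
      # only if demand materializes.
      cost_first, cost_module = 1.2e9, 1.2e9
      modules = np.ceil((demand - 50.0).clip(min=0.0) / 50.0)
      cost_flex = cost_first + modules * cost_module

      print(f"E[cost] up-front build: {cost_big / 1e9:.2f} bn")
      print(f"E[cost] modular build : {cost_flex.mean() / 1e9:.2f} bn")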

  3. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  4. Impact of rocket propulsion technology on the radiation risk in missions to Mars

    NASA Astrophysics Data System (ADS)

    Durante, M.; Bruno, C.

    2010-10-01

    Exposure to cosmic radiation is today acknowledged as a major obstacle to human missions to Mars. In fact, in addition to the poor knowledge of the late effects of heavy ions in the cosmic rays, simple countermeasures are apparently not available. Shielding is very problematic in space because of mass constraints and the high energy of the cosmic rays, and radio-protective drugs or dietary supplements are not effective. However, the simplest countermeasure for reducing radiation risk is to shorten the mission duration, particularly the transit time to Mars, where the dose rate is higher than on the planet's surface. Here we show that using nuclear electric propulsion (NEP) rockets, the transit time could be substantially reduced to a point where radiation risk could be considered acceptable, even with the current uncertainty on late effects.

  5. Towards Bridging the Gaps in Holistic Transition Prediction via Numerical Simulations

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan M.; Li, Fei; Duan, Lian; Chang, Chau-Lyan; Carpenter, Mark H.; Streett, Craig L.; Malik, Mujeeb R.

    2013-01-01

    The economic and environmental benefits of laminar flow technology via reduced fuel burn of subsonic and supersonic aircraft cannot be realized without minimizing the uncertainty in drag prediction in general and transition prediction in particular. Transition research under NASA's Aeronautical Sciences Project seeks to develop a validated set of variable fidelity prediction tools with known strengths and limitations, so as to enable "sufficiently" accurate transition prediction and practical transition control for future vehicle concepts. This paper provides a summary of selected research activities targeting the current gaps in high-fidelity transition prediction, specifically those related to the receptivity and laminar breakdown phases of crossflow induced transition in a subsonic swept-wing boundary layer. The results of direct numerical simulations are used to obtain an enhanced understanding of the laminar breakdown region as well as to validate reduced order prediction methods.

  6. Optimal Objective-Based Experimental Design for Uncertain Dynamical Gene Networks with Experimental Error.

    PubMed

    Mohsenizadeh, Daniel N; Dehghannasiri, Roozbeh; Dougherty, Edward R

    2018-01-01

    In systems biology, network models are often used to study interactions among cellular components, a salient aim being to develop drugs and therapeutic mechanisms to change the dynamical behavior of the network to avoid undesirable phenotypes. Owing to limited knowledge, model uncertainty is commonplace and network dynamics can be updated in different ways, thereby giving multiple dynamic trajectories, that is, dynamics uncertainty. In this manuscript, we propose an experimental design method that can effectively reduce the dynamics uncertainty and improve performance in an interaction-based network. Both dynamics uncertainty and experimental error are quantified with respect to the modeling objective, herein, therapeutic intervention. The aim of experimental design is to select among a set of candidate experiments the experiment whose outcome, when applied to the network model, maximally reduces the dynamics uncertainty pertinent to the intervention objective.

  7. Enterprise Information Technology Organizational Flexibility: Managing Uncertainty and Change

    ERIC Educational Resources Information Center

    Patten, Karen Prast

    2009-01-01

    Chief Information Officers (CIOs) lead enterprise information technology organizations (EITOs) in today's dynamic competitive business environment. CIOs deal with external and internal environmental changes, changing internal customer needs, and rapidly changing technology. New models for the organization include flexibility and suggest that CIOs…

  8. How Do Science and Technology Affect International Affairs?

    ERIC Educational Resources Information Center

    Weiss, Charles

    2015-01-01

    Science and technology influence international affairs by many different mechanisms. Both create new issues, risks and uncertainties. Advances in science alert the international community to new issues and risks. New technological capabilities transform war, diplomacy, commerce, intelligence, and investment. This paper identifies six basic…

  9. TESTING, PERFORMANCE VALIDATION AND QUALITY ASSURANCE/QUALITY CONTROL OF FIELD-PORTABLE INSTRUMENTATION

    EPA Science Inventory

    New technologies for field-portable monitoring instruments often have a long lead time in development and authorization. Some obstacles to the acceptance of these pilot technologies include concern about liabilities, reluctance to take risks on new technologies, and uncertainty a...

  10. Lidar Measurements of Atmospheric CO2 From Regional to Global Scales

    NASA Technical Reports Server (NTRS)

    Lin, Bing; Harrison, F. Wallace; Nehrir, Amin; Browell, Edward; Dobler, Jeremy; Campbell, Joel; Meadows, Byron; Obland, Michael; Ismail, Syed; Kooi, Susan

    2015-01-01

    Atmospheric CO2 is a critical forcing for the Earth's climate, and knowledge of its distributions and variations influences predictions of the Earth's future climate. Large uncertainties in the predictions persist due to limited observations. This study uses the airborne Intensity-Modulated Continuous-Wave (IM-CW) lidar developed at NASA Langley Research Center to measure regional atmospheric CO2 spatio-temporal variations. Further lidar development and demonstration will provide the capability of global atmospheric CO2 estimation from space, which will significantly advance our knowledge of atmospheric CO2 and reduce the uncertainties in the predictions of future climate. In this presentation, atmospheric CO2 column measurements from airborne flight campaigns and lidar system simulations for space missions are discussed. A measurement precision of approximately 0.3 ppmv for a 10-s average over desert and vegetated surfaces has been achieved. Data analysis also shows that airborne lidar CO2 column measurements over these surfaces agree well with in-situ measurements. Even when thin cirrus clouds are present, consistent CO2 column measurements between clear and thin-cirrus skies are obtained. Airborne flight campaigns have demonstrated that precise atmospheric column CO2 values can be measured with current IM-CW lidar systems, which will enable the use of this airborne technique for monitoring CO2 sinks and sources at regional and continental scales, as proposed by the NASA Atmospheric Carbon and Transport - America project. Furthermore, analyses of space CO2 measurements show that by applying the current IM-CW lidar technology and approach in space, the CO2 science goals of space missions can be achieved, and uncertainties in CO2 distributions and variations will be reduced.

  11. CO2 loss by permafrost thawing implies additional emissions reductions to limit warming to 1.5 or 2 °C

    NASA Astrophysics Data System (ADS)

    Burke, Eleanor J.; Chadburn, Sarah E.; Huntingford, Chris; Jones, Chris D.

    2018-02-01

    Large amounts of carbon are stored in the permafrost of the northern high-latitude land. As permafrost degrades under a warming climate, some of this carbon will decompose and be released to the atmosphere. This positive climate-carbon feedback will reduce the natural carbon sinks and thus lower the anthropogenic CO2 emissions compatible with the goals of the Paris Agreement. Simulations using an ensemble of the JULES-IMOGEN intermediate complexity climate model (including climate response and process uncertainty) and a stabilization target of 2 °C show that including the permafrost carbon pool in the model increases the land carbon emissions at stabilization by between 0.09 and 0.19 Gt C year-1 (10th to 90th percentile). These emissions are only slightly reduced, to between 0.08 and 0.16 Gt C year-1 (10th to 90th percentile), when considering 1.5 °C stabilization targets. This suggests that uncertainties caused by the differences in stabilization target are small compared with those associated with model parameterisation uncertainty. Inertia means that permafrost carbon loss may continue for many years after anthropogenic emissions have stabilized. Simulations suggest that between 225 and 345 Gt C (10th to 90th percentile) are in thawed permafrost and may eventually be released to the atmosphere for a stabilization target of 2 °C. This value is 60-100 Gt C less for a 1.5 °C target. The inclusion of permafrost carbon will add to the demands on negative emission technologies, which are already present in most low-emissions scenarios.

  12. Prediction uncertainty and optimal experimental design for learning dynamical systems.

    PubMed

    Letham, Benjamin; Letham, Portia A; Rudin, Cynthia; Browne, Edward P

    2016-06-01

    Dynamical systems are frequently used to model biological systems. When these models are fit to data, it is necessary to ascertain the uncertainty in the model fit. Here, we present prediction deviation, a metric of uncertainty that determines the extent to which observed data have constrained the model's predictions. This is accomplished by solving an optimization problem that searches for a pair of models that each provides a good fit for the observed data, yet has maximally different predictions. We develop a method for estimating a priori the impact that additional experiments would have on the prediction deviation, allowing the experimenter to design a set of experiments that would most reduce uncertainty. We use prediction deviation to assess uncertainty in a model of interferon-alpha inhibition of viral infection, and to select a sequence of experiments that reduces this uncertainty. Finally, we prove a theoretical result which shows that prediction deviation provides bounds on the trajectories of the underlying true model. These results show that prediction deviation is a meaningful metric of uncertainty that can be used for optimal experimental design.
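
    Prediction deviation reduces to a small constrained optimization: find two parameter vectors that both fit the data within a tolerance yet disagree as much as possible at an unobserved condition. A sketch using a hypothetical exponential-decay model (not the interferon-alpha model) and a penalty formulation:

      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(6)

      # Observed data from a decay model y = A * exp(-k t) (synthetic).
      t = np.array([0.0, 1.0, 2.0, 3.0])
      y = 5.0 * np.exp(-0.7 * t) + rng.normal(0, 0.1, t.size)

      model = lambda p, tt: p[0] * np.exp(-p[1] * tt)
      sse = lambda p: np.sum((model(p, t) - y)**2)
      tol = 2.0 * sse(minimize(sse, [4.0, 0.5]).x)   # "good fit" threshold

      t_new = 6.0                                    # unobserved condition

      def neg_deviation(q):
          p1, p2 = q[:2], q[2:]
          # Penalize fits worse than the tolerance, reward disagreement.
          penalty = 1e4 * (max(sse(p1) - tol, 0) + max(sse(p2) - tol, 0))
          return -abs(model(p1, t_new) - model(p2, t_new)) + penalty

      res = minimize(neg_deviation, [5.0, 0.6, 5.0, 0.8],
                     method="Nelder-Mead", options={"maxiter": 5000})
      p1, p2 = res.x[:2], res.x[2:]
      print(f"prediction deviation at t = {t_new}: "
            f"{abs(model(p1, t_new) - model(p2, t_new)):.3f}")

    A large deviation flags a prediction that the data have not yet constrained, which is exactly the situation the proposed experimental-design step targets.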

  13. Evaluation of a novel ultra small target technology supporting on-product overlay measurements

    NASA Astrophysics Data System (ADS)

    Smilde, Henk-Jan H.; den Boef, Arie; Kubis, Michael; Jak, Martin; van Schijndel, Mark; Fuchs, Andreas; van der Schaar, Maurits; Meyer, Steffen; Morgan, Stephen; Wu, Jon; Tsai, Vincent; Wang, Cathy; Bhattacharyya, Kaustuve; Chen, Kai-Hsiung; Huang, Guo-Tsai; Ke, Chih-Ming; Huang, Jacky

    2012-03-01

    Reducing the size of metrology targets is essential for in-die overlay metrology in advanced semiconductor manufacturing. In this paper, μ-diffraction-based overlay (μDBO) measurements with a YieldStar metrology tool are presented for target sizes down to 10 × 10 μm2. The μDBO technology enables selection of only the diffraction-efficiency information from the grating by efficiently separating it from product-structure reflections. Therefore, μDBO targets, even when located adjacent to a product environment, give excellent correlation with 40 × 160 μm2 reference targets. Although significantly smaller than standard scribe-line targets, they can achieve total-measurement-uncertainty values below 0.5 nm on a wide range of product layers. This shows that the new μDBO technique allows for accurate metrology on ultra-small in-die targets, while retaining the excellent TMU performance of diffraction-based overlay metrology.

  14. The Role of Nuclear Power in Reducing Risk of the Fossil Fuel Prices and Diversity of Electricity Generation in Tunisia: A Portfolio Approach

    NASA Astrophysics Data System (ADS)

    Abdelhamid, Mohamed Ben; Aloui, Chaker; Chaton, Corinne; Souissi, Jomâa

    2010-04-01

    This paper applies real options and mean-variance portfolio theories to analyze electricity generation planning in the presence of a nuclear power plant for the Tunisian case. First, we analyze the choice between fossil fuel and nuclear production. A dynamic model is presented to illustrate the impact of fossil fuel cost uncertainty on the optimal timing to switch from gas to nuclear. Next, we use portfolio theory to manage the risk of the electricity generation portfolio and to determine the optimal fuel mix with the nuclear alternative. The results show that there is an optimal mix other than the one fixed for Tunisia for the horizon 2010-2020, with lower cost for the same degree of risk. In the presence of nuclear technology, we find that the optimal generating portfolio must include a 13% share of nuclear power.
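
    The mean-variance step is a small quadratic program: minimize the generation mix's cost variance subject to a target expected cost and shares that sum to one. Costs, spreads, and correlations below are illustrative, not the Tunisian study's data:

      import numpy as np
      from scipy.optimize import minimize

      # Illustrative levelized costs (cents/kWh) and cost covariance for
      # gas, imports, renewables, nuclear (made-up numbers).
      mean_cost = np.array([7.0, 6.0, 9.0, 5.5])
      sd = np.array([2.5, 1.8, 0.8, 0.7])
      corr = np.array([[1.0, 0.6, 0.0, 0.1],
                       [0.6, 1.0, 0.0, 0.1],
                       [0.0, 0.0, 1.0, 0.0],
                       [0.1, 0.1, 0.0, 1.0]])
      cov = np.outer(sd, sd) * corr

      def min_risk_mix(target_cost):
          n = mean_cost.size
          cons = [{"type": "eq", "fun": lambda w: w.sum() - 1.0},
                  {"type": "eq",
                   "fun": lambda w: w @ mean_cost - target_cost}]
          res = minimize(lambda w: w @ cov @ w, np.full(n, 1 / n),
                         bounds=[(0, 1)] * n, constraints=cons)
          return res.x, np.sqrt(res.fun)

      w, risk = min_risk_mix(6.5)
      print("optimal shares:", np.round(w, 2), f"cost sd = {risk:.2f}")

    Sweeping target_cost traces the efficient frontier, from which a mix dominating a fixed reference portfolio (same risk, lower cost) can be read off.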

  15. Managing uncertainty in collaborative robotics engineering projects: The influence of task structure and peer interaction

    NASA Astrophysics Data System (ADS)

    Jordan, Michelle

    Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and proceeded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. Students experienced uncertainty from a greater variety of sources and used more, and more varied, uncertainty management strategies in the less structured task setting than in the more structured one. Peer interaction was influential because students relied on supportive social response to enact most of their uncertainty management strategies. When students could not garner socially supportive response from their peers, their options for managing uncertainty were greatly reduced.

  16. A technique for reducing patient setup uncertainties by aligning and verifying daily positioning of a moving tumor using implanted fiducials

    PubMed Central

    Balter, Peter; Morice, Rodolfo C.; Choi, Bum; Kudchadker, Rajat J.; Bucci, Kara; Chang, Joe Y.; Dong, Lei; Tucker, Susan; Vedam, Sastry; Briere, Tina; Starkschall, George

    2008-01-01

    This study aimed to validate and implement a methodology in which fiducials implanted in the periphery of lung tumors can be used to reduce uncertainties in tumor location. Alignment software that matches marker positions on two‐dimensional (2D) kilovoltage portal images to positions on three‐dimensional (3D) computed tomography data sets was validated using static and moving phantoms. This software also was used to reduce uncertainties in tumor location in a patient with fiducials implanted in the periphery of a lung tumor. Alignment of fiducial locations in orthogonal projection images with corresponding fiducial locations in 3D data sets can position both static and moving phantoms with an accuracy of 1 mm. In a patient, alignment based on fiducial locations reduced systematic errors in the left–right direction by 3 mm and random errors by 2 mm, and random errors in the superior–inferior direction by 3 mm as measured by anterior–posterior cine images. Software that matches fiducial markers on 2D and 3D images is effective for aligning both static and moving fiducials before treatment and can be implemented to reduce patient setup uncertainties. PACS number: 81.40.Wx
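
    The geometric core of such 2D/3D matching can be illustrated with a translation-only least-squares fit: every fiducial coordinate seen in an orthogonal projection contributes one linear equation in the unknown couch shift. A simplified sketch assuming ideal orthogonal AP and lateral projections (real systems also model magnification and rotation):

      import numpy as np

      # Planned 3D fiducial positions from the CT (mm, illustrative).
      plan = np.array([[12.0, 40.0, -8.0],
                       [-5.0, 55.0,  3.0],
                       [20.0, 35.0, 10.0]])

      # True setup error we want to recover.
      shift_true = np.array([3.0, -2.0, 4.0])
      setup = plan + shift_true

      # Ideal projections: AP image sees (x, z), lateral sees (y, z).
      rng = np.random.default_rng(7)
      ap = setup[:, [0, 2]] + 0.3 * rng.normal(size=(3, 2))
      lat = setup[:, [1, 2]] + 0.3 * rng.normal(size=(3, 2))

      # Least squares: each projected coordinate gives one linear
      # equation in the shift components (x, y, z).
      rows, rhs = [], []
      for img, axes in ((ap, (0, 2)), (lat, (1, 2))):
          for m in range(plan.shape[0]):
              for col, axis in enumerate(axes):
                  row = np.zeros(3)
                  row[axis] = 1.0
                  rows.append(row)
                  rhs.append(img[m, col] - plan[m, axis])
      shift_est, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs),
                                      rcond=None)
      print("estimated couch shift (mm):", np.round(shift_est, 2))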

  17. Parallel Computing and Model Evaluation for Environmental Systems: An Overview of the Supermuse and Frames Software Technologies

    EPA Science Inventory

    ERD’s Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) is a key to enhancing quality assurance in environmental models and applications. Uncertainty analysis and sensitivity analysis remain critical, though often overlooked steps in the development and e...

  18. Conflict or Caveats? Effects of Media Portrayals of Scientific Uncertainty on Audience Perceptions of New Technologies.

    PubMed

    Binder, Andrew R; Hillback, Elliott D; Brossard, Dominique

    2016-04-01

    Research indicates that uncertainty in science news stories affects public assessment of risk and uncertainty. However, the form in which uncertainty is presented may also affect people's risk and uncertainty assessments. For example, a news story that features an expert discussing both what is known and what is unknown about a topic may convey a different form of scientific uncertainty than a story that features two experts who hold conflicting opinions about the status of scientific knowledge of the topic, even when both stories contain the same information about knowledge and its boundaries. This study focuses on audience uncertainty and risk perceptions regarding the emerging science of nanotechnology by manipulating whether uncertainty in a news story about potential risks is attributed to expert sources in the form of caveats (individual uncertainty) or conflicting viewpoints (collective uncertainty). Results suggest that the type of uncertainty portrayed does not impact audience feelings of uncertainty or risk perceptions directly. Rather, the presentation of the story influences risk perceptions only among those who are highly deferent to scientific authority. Implications for risk communication theory and practice are discussed. © 2015 Society for Risk Analysis.

  19. CFCl3 (CFC-11): UV Absorption Spectrum Temperature Dependence Measurements and the Impact on Atmospheric Lifetime and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mcgillen, Max R.; Fleming, Eric L.; Jackman, Charles H.; Burkholder, James B.

    2014-01-01

    CFCl3 (CFC-11) is both an atmospheric ozone-depleting and potent greenhouse gas that is removed primarily via stratospheric UV photolysis. Uncertainty in the temperature dependence of its UV absorption spectrum is a significant contributing factor to the overall uncertainty in its global lifetime and, thus, in model calculations of stratospheric ozone recovery and climate change. In this work, the CFC-11 UV absorption spectrum was measured over a range of wavelengths (184.95-230 nm) and temperatures (216-296 K). We report a spectrum temperature dependence that is weaker than currently recommended for use in atmospheric models. The impact on its atmospheric lifetime was quantified using a 2-D model and the spectrum parameterization developed in this work. The obtained global annually averaged lifetime was 58.1 ± 0.7 years (2σ uncertainty due solely to the spectrum uncertainty). The lifetime is slightly reduced, and the uncertainty significantly reduced, from that obtained using current spectrum recommendations.

  20. Reducing uncertainties in decadal variability of the global carbon budget with multiple datasets

    PubMed Central

    Li, Wei; Ciais, Philippe; Wang, Yilong; Peng, Shushi; Broquet, Grégoire; Ballantyne, Ashley P.; Canadell, Josep G.; Cooper, Leila; Friedlingstein, Pierre; Le Quéré, Corinne; Myneni, Ranga B.; Peters, Glen P.; Piao, Shilong; Pongratz, Julia

    2016-01-01

    Conventional calculations of the global carbon budget infer the land sink as a residual between emissions, atmospheric accumulation, and the ocean sink. Thus, the land sink accumulates the errors from the other flux terms and bears the largest uncertainty. Here, we present a Bayesian fusion approach that combines multiple observations in different carbon reservoirs to optimize the land (B) and ocean (O) carbon sinks, land use change emissions (L), and indirectly fossil fuel emissions (F) from 1980 to 2014. Compared with the conventional approach, Bayesian optimization decreases the uncertainties in B by 41% and in O by 46%. The L uncertainty decreases by 47%, whereas F uncertainty is marginally improved through the knowledge of natural fluxes. Both ocean and net land uptake (B + L) rates have positive trends of 29 ± 8 and 37 ± 17 Tg C y⁻² since 1980, respectively. Our Bayesian fusion of multiple observations reduces uncertainties, thereby allowing us to isolate important variability in global carbon cycle processes. PMID:27799533
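
    In its simplest linear-Gaussian form, the fusion idea is a Kalman-style update: priors on fossil (F), land-use (L), ocean-sink (O), and land-sink (B) fluxes are conditioned on a well-measured atmospheric growth rate that must equal F + L - O - B. A schematic sketch with illustrative numbers (the paper's optimization uses many more observations and years):

      import numpy as np

      # Prior means (Gt C/y) and sd's for F, L, O, B; numbers are
      # illustrative only, not the paper's values.
      mu = np.array([9.0, 1.3, 2.3, 3.0])
      sd = np.array([0.5, 0.7, 0.5, 0.9])
      P = np.diag(sd**2)

      # Observation: atmospheric growth G = F + L - O - B, well measured.
      H = np.array([[1.0, 1.0, -1.0, -1.0]])
      y = np.array([4.9])
      R = np.array([[0.2**2]])

      # Standard linear-Gaussian (Kalman) update.
      S = H @ P @ H.T + R
      K = P @ H.T @ np.linalg.inv(S)
      mu_post = mu + (K @ (y - H @ mu)).ravel()
      P_post = P - K @ H @ P

      for name, m, s0, s1 in zip("FLOB", mu_post, sd,
                                 np.sqrt(np.diag(P_post))):
          print(f"{name}: {m:5.2f} Gt C/y, sd {s0:.2f} -> {s1:.2f}")

    As in the paper, the flux with the largest prior variance (here B) absorbs the largest share of the uncertainty reduction.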

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Chanyoung; Kim, Nam H.

    Structural elements, such as stiffened panels and lap joints, are basic components of aircraft structures. For aircraft structural design, designers select predesigned elements satisfying the design load requirement based on their load-carrying capabilities. Therefore, estimating the safety envelope of structural elements for load tolerances would be a good investment for design purposes. In this article, a method of estimating the safety envelope is presented using probabilistic classification, which can estimate a specific level of failure probability under both aleatory and epistemic uncertainties. An important contribution of this article is that the calculation uncertainty is reflected in building a safety envelope using a Gaussian process, and the effect of element test data on reducing the calculation uncertainty is incorporated by updating the Gaussian process model with the element test data. It is shown that even one element test can significantly reduce the calculation uncertainty due to lacking knowledge of the actual physics, so that conservativeness in the safety envelope is significantly reduced. The proposed approach was demonstrated with a cantilever beam example, which represents a structural element. The example shows that calculation uncertainty provides about 93% conservativeness against the uncertainty due to a few element tests. As a result, it is shown that even a single element test can increase the load tolerance modeled with the safety envelope by 20%.
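
    The variance-reduction mechanism can be seen in a minimal Gaussian-process regression: adding a single element-test-like observation near a query load collapses the predictive standard deviation there. A numpy-only sketch with made-up margin data, standing in for the article's probabilistic-classification machinery:

      import numpy as np

      def rbf(a, b, ell=0.3, s2=1.0):
          d = a[:, None] - b[None, :]
          return s2 * np.exp(-0.5 * (d / ell)**2)

      def gp_posterior(x_train, y_train, x_test, noise=1e-4):
          K = rbf(x_train, x_train) + noise * np.eye(x_train.size)
          Ks = rbf(x_train, x_test)
          sol = np.linalg.solve(K, Ks)
          mean = sol.T @ y_train
          cov = rbf(x_test, x_test) - Ks.T @ sol
          return mean, np.sqrt(np.diag(cov))

      # Computed failure margins at a few load levels (illustrative).
      x = np.array([0.2, 0.5, 0.8])
      y = np.sin(3 * x)                  # stand-in for computed margin
      x_q = np.array([0.65])             # load level of interest

      _, sd0 = gp_posterior(x, y, x_q)
      # One element test near the query point joins the training set;
      # its value differs slightly from the calculation, as a test would.
      x2, y2 = np.append(x, 0.6), np.append(y, np.sin(1.8) + 0.02)
      _, sd1 = gp_posterior(x2, y2, x_q)
      print(f"predictive sd at query load: {sd0[0]:.3f} -> {sd1[0]:.3f}")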

  2. Reducing patients' anxiety and uncertainty, and improving recall in bad news consultations.

    PubMed

    van Osch, Mara; Sep, Milou; van Vliet, Liesbeth M; van Dulmen, Sandra; Bensing, Jozien M

    2014-11-01

    Patients' recall of provided information during bad news consultations is poor. According to the attentional narrowing hypothesis, the emotional arousal caused by the bad news might be responsible for this hampered information processing. Because affective communication has proven to be effective in tempering patients' emotional reactions, the current study used an experimental design to explore whether physician's affective communication in bad news consultations decreases patients' anxiety and uncertainty and improves information recall. Two scripted video-vignettes of a bad news consultation were used in which the physician's verbal communication was manipulated (standard vs. affective condition). Fifty healthy women (i.e., analogue patients) randomly watched 1 of the 2 videos. The effect of communication on participants' anxiety, uncertainty, and recall was assessed by self-report questionnaires. Additionally, a moderator analysis was performed. Affective communication reduced anxiety (p = .01) and uncertainty (p = .04), and improved recall (p = .05), especially for information about prognosis (p = .04) and, to some extent, for treatment options (p = .07). The moderating effect of (reduced) anxiety and uncertainty on recall could not be confirmed and showed a trend for uncertainty. Physicians' affective communication can temper patients' anxiety and uncertainty during bad news consultations, and enhance their ability to recall medical information. The reduction of anxiety and uncertainty could not explain patients' enhanced recall, which leaves the underlying mechanism unspecified. Our findings underline the importance of addressing patients' emotions and provide empirical support to incorporate this in clinical guidelines and recommendations. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  3. Value of Information Analysis Applied to the Economic Evaluation of Interventions Aimed at Reducing Juvenile Delinquency: An Illustration.

    PubMed

    Eeren, Hester V; Schawo, Saskia J; Scholte, Ron H J; Busschbach, Jan J V; Hakkaart, Leona

    2015-01-01

    To investigate whether a value of information analysis, commonly applied in health care evaluations, is feasible and meaningful in the field of crime prevention. Interventions aimed at reducing juvenile delinquency are increasingly being evaluated according to their cost-effectiveness. Results of cost-effectiveness models are subject to uncertainty in their cost and effect estimates. Further research can reduce that parameter uncertainty. The value of such further research can be estimated using a value of information analysis, as illustrated in the current study. We built upon an earlier published cost-effectiveness model that demonstrated the comparison of two interventions aimed at reducing juvenile delinquency. Outcomes were presented as costs per criminal-activity-free year. At a societal willingness-to-pay of €71,700 per criminal-activity-free year, further research to eliminate parameter uncertainty was valued at €176 million. Therefore, in this illustrative analysis, the value of information analysis determined that society should be willing to spend a maximum of €176 million on reducing decision uncertainty in the cost-effectiveness of the two interventions. Moreover, the results suggest that reducing uncertainty in some specific model parameters might be more valuable than in others. Using a value of information framework to assess the value of conducting further research in the field of crime prevention proved to be feasible. The results were meaningful and can be interpreted according to health care evaluation studies. This analysis can be helpful in justifying additional research funds to further inform the reimbursement decision in regard to interventions for juvenile delinquents.

  4. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    USGS Publications Warehouse

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decisionmaking. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in these highly parameterized modeling contexts. Availability of these utilities is particularly important because, in many cases, a significant proportion of the uncertainty associated with model parameters-and the predictions that depend on them-arises from differences between the complex properties of the real world and the simplified representation of those properties that is expressed by the calibrated model. This report is intended to guide intermediate to advanced modelers in the use of capabilities available with the PEST suite of programs for evaluating model predictive error and uncertainty. A brief theoretical background is presented on sources of parameter and predictive uncertainty and on the means for evaluating this uncertainty. Applications of PEST tools are then discussed for overdetermined and underdetermined problems, both linear and nonlinear. PEST tools for calculating contributions to model predictive uncertainty, as well as optimization of data acquisition for reducing parameter and predictive uncertainty, are presented. The appendixes list the relevant PEST variables, files, and utilities required for the analyses described in the document.
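
    The linear ("Bayes-linear") analysis underlying PEST's error-variance and uncertainty utilities fits in a few lines of generic linear algebra: calibration constraints update a prior parameter covariance C through the observation Jacobian J, and a prediction's variance is sᵀCs for its sensitivity vector s. A schematic numpy sketch with random matrices standing in for a real model:

      import numpy as np

      rng = np.random.default_rng(9)

      n_par, n_obs = 8, 5
      C = np.eye(n_par)                       # prior parameter covariance
      J = rng.normal(size=(n_obs, n_par))     # observation Jacobian
      Ceps = 0.1 * np.eye(n_obs)              # observation noise covariance

      # Linear Bayes update: calibration constraints shrink the prior.
      G = C @ J.T @ np.linalg.inv(J @ C @ J.T + Ceps)
      C_post = C - G @ J @ C

      # Predictive uncertainty for a prediction with sensitivity vector s.
      s = rng.normal(size=n_par)
      print(f"predictive sd: prior {np.sqrt(s @ C @ s):.2f} "
            f"-> post-calibration {np.sqrt(s @ C_post @ s):.2f}")

    Components of s lying in the null space of J keep their full prior variance, which is the linear-algebra expression of the report's point that calibration constrains only some combinations of parameters.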

  5. Bayesian methodology incorporating expert judgment for ranking countermeasure effectiveness under uncertainty: example applied to at grade railroad crossings in Korea.

    PubMed

    Washington, Simon; Oh, Jutaek

    2006-03-01

    Transportation professionals are sometimes required to make difficult transportation safety investment decisions in the face of uncertainty. In particular, an engineer may be expected to choose among an array of technologies and/or countermeasures to remediate perceived safety problems when: (1) little information is known about the countermeasure effects on safety; (2) information is known but comes from different regions, states, or countries, so that a direct generalization may not be appropriate; (3) the technologies and/or countermeasures are relatively untested; or (4) costs prohibit the full and careful testing of each of the candidate countermeasures via before-after studies. The importance of an informed and well-considered decision based on the best possible engineering knowledge and information is imperative due to the potential impact on the numbers of human injuries and deaths that may result from these investments. This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving safety of at-grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to "stated preference" methods in travel survey research, the methodology applies random selection and the laws of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (the data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain 'best' estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended based on the largest safety returns with minimum risk (uncertainty). To the authors' knowledge the complete methodology is new and has not previously been applied or reported in the literature. The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 countermeasures considered in this analysis, the top three for reducing crashes at AGRXs were found to be in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
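
    The core Bayesian step, combining elicited expert opinions (the data likelihood) with prior knowledge to obtain a posterior credible interval for an AMF, can be sketched with a conjugate normal model on log(AMF). The numbers and the conjugate simplification are illustrative assumptions, not the paper's full machinery.

        import numpy as np

        # Hedged sketch: combine expert opinions on an accident modification factor
        # (AMF) with a prior via a conjugate normal model on log(AMF). All numbers
        # below are invented.
        expert_log_amf = np.log([0.70, 0.85, 0.60, 0.75, 0.80])  # elicited AMFs
        mu0, tau0 = np.log(0.9), 0.5      # prior: weakly favors a small effect
        sigma = 0.3                       # assumed spread of the expert "likelihood"

        n = len(expert_log_amf)
        prec = 1 / tau0**2 + n / sigma**2                          # posterior precision
        mu_post = (mu0 / tau0**2 + expert_log_amf.sum() / sigma**2) / prec
        sd_post = prec ** -0.5

        lo = np.exp(mu_post - 1.96 * sd_post)
        hi = np.exp(mu_post + 1.96 * sd_post)
        print(f"posterior AMF {np.exp(mu_post):.2f}, 95% credible interval ({lo:.2f}, {hi:.2f})")
        # Ranking countermeasures by posterior mean AMF and interval width mirrors
        # the paper's "largest safety returns with minimum risk" criterion.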

  6. Visualising uncertainty: Examining women's views on the role of Magnetic Resonance Imaging (MRI) in late pregnancy.

    PubMed

    Reed, Kate; Kochetkova, Inna; Whitby, Elspeth

    2016-09-01

    Prenatal screening occupies a prominent role within sociological debates on medical uncertainty. A particular issue concerns the limitations of routine screening, which tends to be based on risk prediction. Computer-assisted visual technologies such as Magnetic Resonance Imaging (MRI) are now starting to be applied to the prenatal realm to assist in the diagnosis of a range of fetal and maternal disorders (from problems with the fetal brain to the placenta). MRI is often perceived in popular and medical discourse as a technology of certainty and truth. However, little is known about the use of MRI as a tool to confirm or refute the diagnosis of a range of disorders in pregnancy. Drawing on qualitative research with pregnant women attending a fetal medicine clinic in the North of England, this paper examines the potential role that MRI can play in mediating pregnancy uncertainty. The paper argues that MRI can both create and manage women's feelings of uncertainty during pregnancy. While MRI may not always provide women with unequivocal answers, the detailed information provided by MR images, combined with the interpretation and communication skills of the radiologist, in many ways enables women to navigate the issue. Our analysis of empirical data therefore highlights the value of this novel technological application for women and their partners. It also seeks to stress the merit of taking a productive approach to the study of diagnostic uncertainty, an approach which recognises the concept's dual nature. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.

  7. An audit of the global carbon budget: identifying and reducing sources of uncertainty

    NASA Astrophysics Data System (ADS)

    Ballantyne, A. P.; Tans, P. P.; Marland, G.; Stocker, B. D.

    2012-12-01

    Uncertainties in our carbon accounting practices may limit our ability to objectively verify emission reductions on regional scales. Furthermore, uncertainties in the global C budget must be reduced to benchmark Earth System Models that incorporate carbon-climate interactions. Here we present an audit of the global C budget in which we try to identify the sources of uncertainty for its major terms. The atmospheric growth rate of CO2 has increased significantly over the last 50 years, while the uncertainty in calculating the global atmospheric growth rate has been reduced from 0.4 ppm/yr to 0.2 ppm/yr (95% confidence). Although we have greatly reduced global CO2 growth rate uncertainties, there remain regions, such as the Southern Hemisphere, Tropics and Arctic, where changes in regional sources/sinks will remain difficult to detect without additional observations. Increases in fossil fuel (FF) emissions are the primary factor driving the increase in the global CO2 growth rate; however, our confidence in FF emission estimates has actually gone down. Based on a comparison of multiple estimates, FF emissions have increased from 2.45 ± 0.12 PgC/yr in 1959 to 9.40 ± 0.66 PgC/yr in 2010. Major sources of increasing FF emission uncertainty are increased emissions from emerging economies, such as China and India, as well as subtle differences in accounting practices. Lastly, we evaluate emission estimates from Land Use Change (LUC). Although relative errors in emission estimates from LUC are quite high (2 sigma ~ 50%), LUC emissions have remained fairly constant in recent decades. We evaluate the three commonly used approaches to estimating LUC emissions (bookkeeping, satellite imagery, and model simulations) to identify their main sources of error and their ability to detect net emissions from LUC.

  8. Using cooperative control to manage uncertainties for Aquifer Thermal Energy Storage (ATES)

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, Marc; Rostampour, Vahab; Kwakkel, Jan; Bloemendal, Martin

    2017-04-01

    Aquifer Thermal Energy Storage (ATES) technology can lead to major reductions in energy demand for heating and cooling in buildings. ATES systems rely on shallow aquifers to seasonally store thermal energy and have become popular in the Netherlands, where a combination of easily accessible aquifers and strict energy regulations makes the technology especially relevant. However, this rapid adoption has made their management in dense urban areas more challenging. For instance, thermal interferences between neighboring systems can degrade storage efficiency. Policies for the permitting and spatial layout of ATES thus tend to be conservative to ensure the performance of individual systems, but this limits the space available for new systems - leading to a trade-off between individual system performance, and the overall energy savings obtained from ATES in a given area. Furthermore, recent studies show that operational uncertainties contribute to poor outcomes under current planning practices; systems in the Netherlands typically use less than half of their permitted water volume. This further reduces energy savings compared to expectations and also leads to an over-allocation of subsurface space. In this context, this work investigates the potential of a more flexible approach for ATES planning and operation, under which neighboring systems coordinate their operation. This is illustrated with a three-building idealized case, using a model predictive control approach for two control schemes: a decoupled formulation, and a centralized scheme that aims to avoid interferences between neighboring systems (assuming perfect information exchange). These control schemes are compared across a range of scenarios for spatial layout, building energy demand, and climate, using a coupled agent-based/geohydrological simulation. The simulation indicates that centralized operation could significantly improve the spatial layout efficiency of ATES systems, by allowing systems to be placed more densely without penalizing their individual performance. This effectively relaxes the trade-off between individual system performance and collective energy savings as observed in the decoupled case. The continued adoption of ATES technology provides a window of opportunity to revisit existing practices for the layout and operation of urban ATES systems, as information exchange - supported by appropriate spatial planning - could offer significant potential towards improved performance under operational uncertainties.

  9. Essays on competition in electricity markets

    NASA Astrophysics Data System (ADS)

    Bustos Salvagno, Ricardo Javier

    The first chapter shows how technology decisions affect entry in commodity markets with oligopolistic competition, such as the electricity market. I demonstrate an entry deterrence effect that works through cost uncertainty. A technology's cost uncertainty affects expected spot market profits through forward market trades. Therefore, incentives to engage in forward trading shape firms' decisions on production technologies. I show that high-cost but low-risk technologies are adopted by risk-averse incumbents to deter entry. Strategic technology adoption can end in an equilibrium where high-cost technologies prevail over low-cost but riskier ones. In the case of incumbents who are less risk-averse than entrants, entry deterrence is achieved by choosing riskier technologies. The main results do not depend on who chooses their technology first. Chapter two examines the Chilean experience with auctions for long-term supply contracts in electricity markets from 2006 to 2011. Using a divisible-good auction model, I provide a theoretical framework that explains bidding behavior in terms of expected spot prices and contracting positions. The model is extended to include potential strategic behavior in contracting decisions. Empirical estimations confirm the main determinants of bidding behavior and show heterogeneity in the marginal cost of over-contracting depending on size and incumbency. Chapter three analyzes the lag in capacity expansion in the Chilean electricity market from 2000 to 2004. Although this lag is usually regarded as a result of regulatory uncertainty, the role of delays in the construction of a large hydro-power plant has been overlooked by the literature. We argue that those delays postponed projected investment and gave small windows of opportunity that only incumbents could take advantage of. We retrace the history of investments through real-time information from the regulator's reports, and a simple model enables us to explain the effect of those delays on suggested and under-construction investments.

  10. Engineering with uncertainty: monitoring air bag performance.

    PubMed

    Wetmore, Jameson M

    2008-06-01

    Modern engineering is complicated by an enormous number of uncertainties. Engineers know a great deal about the material world and how it works. But due to the inherent limits of testing and the complexities of the world outside the lab, engineers will never be able to fully predict how their creations will behave. One way the uncertainties of engineering can be dealt with is by actively monitoring technologies once they have left the development and production stage. This article uses an episode in the history of automobile air bags as an example of engineers who had the foresight and initiative to carefully track the technology on the road to discover problems as early as possible. Not only can monitoring help engineers identify problems that surface in the field, it can also assist them in their efforts to mobilize resources to resolve problems.

  11. Studies of aerodynamic technology for VSTOL fighter/attack aircraft

    NASA Technical Reports Server (NTRS)

    Nelms, W. P.

    1978-01-01

    The paper summarizes several studies to develop aerodynamic technology for high performance VSTOL aircraft anticipated after 1990. A contracted study jointly sponsored by NASA-Ames and the David Taylor Naval Ship Research and Development Center is emphasized. Four contractors analyzed two vertical-attitude and three horizontal-attitude takeoff and landing concepts with gross weights ranging from about 10,433 kg (23,000 lb) to 17,236 kg (38,000 lb). The aircraft have supersonic capability, high maneuver performance (sustained load factor of 6.2 at Mach 0.6 and 3,048 m (10,000 ft)), and a 4,536 kg (10,000 lb) STO overload capability. The contractors estimated the aerodynamics and identified aerodynamic uncertainties associated with their concepts. Example uncertainties relate to propulsion-induced flows, canard-wing interactions, and top inlets. Wind-tunnel research programs were proposed to investigate these uncertainties.

  12. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. (Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)
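
    A hedged sketch of the general pattern described above (not the paper's exact algorithm): fit an autoregressive model to baseline vibration data, then use the growth of its prediction residuals on newly monitored data as a user-defined damage indicator.

        import numpy as np

        def fit_ar(x, p=4):
            """Least-squares fit of an order-p autoregressive model."""
            X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
            y = x[p:]
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            return coef

        def damage_indicator(baseline, monitored, p=4):
            """Ratio of residual variances: ~1 when healthy, >1 when dynamics shift."""
            coef = fit_ar(baseline, p)
            def resid(x):
                X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
                return x[p:] - X @ coef
            return resid(monitored).var() / resid(baseline).var()

        rng = np.random.default_rng(3)
        healthy = np.sin(0.20 * np.arange(2000)) + 0.1 * rng.normal(size=2000)
        damaged = np.sin(0.23 * np.arange(2000)) + 0.1 * rng.normal(size=2000)  # shifted dynamics
        print(damage_indicator(healthy, healthy[:1000]), damage_indicator(healthy, damaged))
        # An info-gap robustness analysis would then ask how large the environmental
        # uncertainty can grow before this indicator's bounds cross a damage threshold.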

  13. An inferentialist perspective on the coordination of actions and reasons involved in making a statistical inference

    NASA Astrophysics Data System (ADS)

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-12-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.
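
    The reference change value (RCV) that proved decisive has a standard form in clinical chemistry: the smallest difference between two serial results that exceeds combined analytical and within-subject biological variation. A small worked sketch, with illustrative coefficients of variation rather than the case study's values:

        from math import sqrt

        def rcv(cv_analytical, cv_within_subject, z=1.96):
            """Reference change value (%): the smallest difference between two
            serial results that exceeds analytical plus biological variability."""
            return sqrt(2) * z * sqrt(cv_analytical**2 + cv_within_subject**2)

        # Illustrative values for a blood component (not from the case study):
        print(f"RCV = {rcv(cv_analytical=3.0, cv_within_subject=5.0):.1f}%")
        # A pre/post pneumatic-post difference smaller than the RCV cannot be
        # distinguished from ordinary measurement and biological variation.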

  14. A new approach to identify the sensitivity and importance of physical parameters combination within numerical models using the Lund-Potsdam-Jena (LPJ) model as an example

    NASA Astrophysics Data System (ADS)

    Sun, Guodong; Mu, Mu

    2017-05-01

    An important source of uncertainty, which causes further uncertainty in numerical simulations, is that residing in the parameters describing physical processes in numerical models. Therefore, identifying the subset of the numerous physical parameters in atmospheric and oceanic models that is relatively more sensitive and important, and reducing the errors in the parameters of this subset, would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of those relatively more sensitive and important parameters among the physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach in China. The results imply that nonlinear interactions among parameters play a key role in the identification of sensitive parameters in arid and semi-arid regions of China compared to those in northern, northeastern, and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identifying relatively more sensitive and important physical parameters but also shows that "target observations" can then be applied to reduce the uncertainties in model parameters.
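
    The CNOP-P idea can be sketched as a constrained optimization: find the parameter perturbation, within a prescribed amplitude, that maximally displaces the simulation from its reference trajectory. The toy model and all numbers below are stand-ins for LPJ, chosen only to make the sketch runnable.

        import numpy as np
        from scipy.optimize import minimize

        def model(params, nsteps=100):
            """Deliberately simple stand-in for LPJ: toy vegetation dynamics."""
            a, b = params
            x, out = 0.5, []
            for _ in range(nsteps):
                x = x + 0.1 * (a * x * (1 - x) - b * x)
                out.append(x)
            return np.asarray(out)

        p_ref = np.array([1.0, 0.3])
        ref = model(p_ref)
        delta = 0.1                      # prescribed perturbation amplitude bound

        def neg_departure(dp):
            # Negative squared departure from the reference trajectory (to minimize)
            return -np.sum((model(p_ref + dp) - ref) ** 2)

        cons = {"type": "ineq", "fun": lambda dp: delta - np.linalg.norm(dp)}
        res = minimize(neg_departure, x0=np.array([0.05, -0.05]), constraints=cons)
        print("CNOP-P perturbation:", res.x, "departure:", -res.fun)
        # Parameters picking up large components in res.x across regions would be
        # flagged as the "relatively more sensitive and important" subset.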

  15. Designing Technology: An Exploration of the Relationship between Technological Literacy and Design Capability

    ERIC Educational Resources Information Center

    Hope, Gill

    2013-01-01

    The aim of this article is to contribute to the debate on the nature of technology education. This is especially pertinent at times of curriculum change and uncertainty, such as currently exist in relation to the Primary school curriculum in England and Wales. Two phrases ("technological literacy" and "design capability") have…

  16. Improved Wavelengths and Oscillator Strengths of Cr III, Co III, and Fe III

    NASA Astrophysics Data System (ADS)

    Smith, Peter L.; Smillie, D. G.; Pickering, J. C.; Blackwell-Whitehead, R. J.

    2008-05-01

    Improvements in the resolution, accuracy, and range of spectra obtained by state-of-the-art space- and ground-based astronomical spectrographs have demonstrated a need for corresponding improvements in atomic data. Transition wavelengths with uncertainties of 1 part in 10^7 and oscillator strengths (f-values) with uncertainties of 10 to 15% are needed to accurately interpret modern astrophysical spectra. Our focus has been on spectra of doubly ionized iron group elements that dominate the UV spectra of hot B stars. We report here completion of measurements on Cr III, Co III, and Fe III made with a UV high resolution Fourier transform spectrometer (FTS) [J. C. Pickering, Vibrational Spectrosc. 29, 27 (2002)], with a typical wavelength/wavenumber uncertainty of a few parts in 10^8, supplemented by measurements carried out at the US National Institute of Standards & Technology using their FTS and the Normal Incidence Vacuum (grating) Spectrograph (NIVS). The spectra were analyzed and line lists were produced to give calibrated line wavelengths and relative intensities. Measured wavelengths are, in many cases, an order of magnitude more accurate than previous measurements, and the energy level uncertainties are typically reduced by a factor of 3 or more. Summaries of submitted papers on Cr III and Co III will be presented, as will work on improved wavelengths, energy levels, and oscillator strengths for Fe III. Limitations of the method and possible solutions will be discussed. This work is, or has been, supported in part by NASA Grant NAG5-12668; NASA inter-agency agreement W-10255; PPARC; the Royal Society of the UK; and by the Leverhulme Trust.

  17. Regulating interface science healthcare products: myths and uncertainties.

    PubMed

    Bravery, Christopher A

    2010-12-06

    Whenever new technology emerges it brings with it concerns and uncertainties about whether or how it will need to be regulated, particularly when it is applied to human healthcare. Drawing on the recent history in the European Union (EU) of the regulation of cell-based medicinal products, and in particular tissue-engineered products, this paper explores the myths that persist around their regulation and speculates on whether the existing regulatory landscape in the EU is flexible enough to incorporate nanotechnology and other new technologies into healthcare products. By untangling these myths a number of clear conclusions are revealed that, when considered in the context of risk-benefit, make it clear that what hinders the uptake of new technology is not regulatory process but basic science.

  18. Risk analysis and technology assessment in support of technology development: Putting responsible innovation in practice in a case study for nanotechnology.

    PubMed

    van Wezel, Annemarie P; van Lente, Harro; van de Sandt, Johannes Jm; Bouwmeester, Hans; Vandeberg, Rens Lj; Sips, Adrienne Jam

    2018-01-01

    Governments invest in "key enabling technologies," such as nanotechnology, to solve societal challenges and boost the economy. At the same time, governmental agencies demand risk reduction to prevent adverse effects, which are often unknown, and industrial parties demand smart approaches to reduce uncertainties. Responsible research and innovation (RRI) is therefore a central theme in policy making. Risk analysis and technology assessment, together referred to as "RATA," can provide a basis to assess human, environmental, and societal risks of new technological developments during the various stages of technological development. This assessment can help both governmental authorities and innovative industry to move forward in a sustainable manner. Here we describe the developed procedures and products and our experiences in bringing RATA into practice within a large Dutch nanotechnology consortium. This is an example of how to put responsible innovation in practice as an integrated part of a research program, how to increase awareness of RATA, and how to help technology developers perform and use RATA. Integr Environ Assess Manag 2018;14:9-16. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).

  19. A methodology for computing uncertainty bounds of multivariable systems based on sector stability theory concepts

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.

    1992-01-01

    The application of a sector-based stability theory approach to the formulation of useful uncertainty descriptions for linear, time-invariant, multivariable systems is explored. A review of basic sector properties and the sector-based approach is presented first. The sector-based approach is then applied to several general forms of parameter uncertainty to investigate its advantages and limitations. The results indicate that the sector uncertainty bound can be used effectively to evaluate the impact of parameter uncertainties on the frequency response of the design model. Inherent conservatism is a potential limitation of the sector-based approach, especially for highly dependent uncertain parameters. In addition, the representation of the system dynamics can affect the amount of conservatism reflected in the sector bound. Careful application of the model can help to reduce this conservatism, however, and the solution approach has some degrees of freedom that may be further exploited to reduce it.

  20. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge in reducing forecast uncertainty. FOSM is used here to optimize yet-to-be-completed geophysical surveying so as to reduce model forecast uncertainty. The main objective of geophysical surveying is assumed to be estimating values and spatial variation in hydrologic parameters (i.e., hydraulic conductivity) as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the groundwater availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates. The resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process optimizes flight-line locations.
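
    A minimal sketch of the FOSM data worth calculation, assuming the linearized (Jacobian) form described above; all matrices are invented stand-ins for the MERAS sensitivities. Hypothetical observations along a candidate flight line are appended, the forecast variance is recomputed, and the reduction measures that line's worth.

        import numpy as np

        def forecast_var(J, Cp, Ce, y):
            """FOSM forecast variance after conditioning the prior Cp on
            observations with Jacobian J and noise covariance Ce."""
            G = J @ Cp @ J.T + Ce
            Cpost = Cp - Cp @ J.T @ np.linalg.solve(G, J @ Cp)
            return y @ Cpost @ y

        rng = np.random.default_rng(7)
        npar = 20
        Cp = np.eye(npar)                     # prior uncertainty at pilot points
        y = rng.normal(size=npar)             # forecast sensitivity vector
        J_base = rng.normal(size=(10, npar))  # existing head/flux observations
        J_line = rng.normal(size=(5, npar))   # candidate flight line's observations

        base = forecast_var(J_base, Cp, 0.1 * np.eye(10), y)
        with_line = forecast_var(np.vstack([J_base, J_line]), Cp, 0.1 * np.eye(15), y)
        print(f"forecast variance: {base:.2f} -> {with_line:.2f}")
        # Ranking candidate lines by this reduction and iterating approximates the
        # flight-line optimization described in the abstract.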

  1. Commentary: ambiguity and uncertainty: neglected elements of medical education curricula?

    PubMed

    Luther, Vera P; Crandall, Sonia J

    2011-07-01

    Despite significant advances in scientific knowledge and technology, ambiguity and uncertainty are still intrinsic aspects of contemporary medicine. To practice confidently and competently, a physician must learn rational approaches to complex and ambiguous clinical scenarios and must possess a certain degree of tolerance of ambiguity. In this commentary, the authors discuss the role that ambiguity and uncertainty play in medicine and emphasize why openly addressing these topics in the formal medical education curriculum is critical. They discuss key points from original research by Wayne and colleagues and their implications for medical education. Finally, the authors offer recommendations for increasing medical student tolerance of ambiguity and uncertainty, including dedicating time to attend candidly to ambiguity and uncertainty as a formal part of every medical school curriculum.

  2. Health and productivity gains from better indoor environments and their relationship with building energy efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, William J.

    2000-04-01

    Theoretical considerations and empirical data suggest that existing technologies and procedures can improve indoor environments in a manner that significantly increases productivity and health. Existing literature contains moderate to strong evidence that characteristics of buildings and indoor environments significantly influence rates of communicable respiratory illness, allergy and asthma symptoms, sick building symptoms, and worker performance. While there is considerable uncertainty in the estimates of the magnitudes of productivity gains that may be obtained by providing better indoor environments, the projected gains are very large. For the U.S., the estimated potential annual savings and productivity gains are $6 to $14 billion from reduced respiratory disease, $2 to $4 billion from reduced allergies and asthma, $10 to $30 billion from reduced sick building syndrome symptoms, and $20 to $160 billion from direct improvements in worker performance that are unrelated to health. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  3. Simulating evolution of technology: An aid to energy policy analysis. A case study of strategies to control greenhouse gases in Canada

    NASA Astrophysics Data System (ADS)

    Nyboer, John

    Issues related to the reduction of greenhouse gases are encumbered with uncertainties for decision makers. Unfortunately, conventional analytical tools generate widely divergent forecasts of the effects of actions designed to mitigate these emissions. "Bottom-up" models show the costs of reducing emissions attained through the penetration of efficient technologies to be low or negative. In contrast, more aggregate "top-down" models show costs of reduction to be high. The methodological approaches of the different models used to simulate energy consumption generate, in part, the divergence found in model outputs. To address this uncertainty and bring convergence, I use a technology-explicit model that simulates turnover of equipment stock as a function of detailed data on equipment costs and stock characteristics and of verified behavioural data related to equipment acquisition and retrofitting. Such detail can inform the decision maker of the effects of actions to reduce greenhouse gases due to changes in (1) technology stocks, (2) products or services, or (3) the mix of fuels used. This thesis involves two main components: (1) the development of a quantitative model to analyse energy demand and (2) the application of this tool to a policy issue, abatement of CO2 emissions. The analysis covers all of Canada by sector (8 industrial subsectors, residential, commercial) and region. An electricity supply model to provide local electricity prices supplemented the quantitative model. Forecasts of growth and structural change were provided by national macroeconomic models. Seven different simulations were applied to each sector in each region, including a base case run and three runs simulating emissions charges of 75/tonne, 150/tonne and 225/tonne CO2. The analysis reveals that there is significant variation in the costs and quantity of emissions reduction by sector and region. Aggregated results show that Canada can meet both stabilisation targets (1990 levels of emissions by 2000) and reduction targets (20% less than 1990 by 2010), but the cost of meeting reduction targets exceeds 225/tonne. After a review of the results, I provide several reasons for concluding that the costs are overestimated and the emissions reduction underestimated. I also provide several future research options.

  4. 78 FR 41731 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-11

    ... Federal Implementation Plan for Implementing Best Available Retrofit Technology for Four Corners Power... Implementation Plan (FIP) to implement the Best Available Retrofit Technology (BART) requirement of the Regional... given the uncertainties in the electrical market in Arizona, EPA is proposing to extend the date by...

  5. Engagement and Uncertainty: Emerging Technologies Challenge the Work of Engagement

    ERIC Educational Resources Information Center

    Eaton, Weston; Wright, Wynne; Whyte, Kyle; Gasteyer, Stephen P.; Gehrke, Pat J.

    2014-01-01

    Universities' increasing applications of science and technology to address a wide array of societal problems may serve to thwart democratic engagement strategies. For emerging technologies, such challenges are particularly salient, as knowledge is incomplete and application and impact are uncertain or contested. Insights from science and…

  6. Our Brains Extended

    ERIC Educational Resources Information Center

    Prensky, Marc

    2013-01-01

    Technology is an extension of the brain; it is a new way of thinking. It is the solution humans have created to deal with the difficult new context of variability, uncertainty, complexity, and ambiguity. Wise integration of evolving and powerful technology demands a rethinking of the curriculum. This article discusses technology as the new way of…

  7. The Impact of Iranian Teachers Cultural Values on Computer Technology Acceptance

    ERIC Educational Resources Information Center

    Sadeghi, Karim; Saribagloo, Javad Amani; Aghdam, Samad Hanifepour; Mahmoudi, Hojjat

    2014-01-01

    This study was conducted with the aim of testing the technology acceptance model and the impact of Hofstede cultural values (masculinity/femininity, uncertainty avoidance, individualism/collectivism, and power distance) on computer technology acceptance among teachers at Urmia city (Iran) using the structural equation modeling approach. From among…

  8. Risk-Aversion: Understanding Teachers' Resistance to Technology Integration

    ERIC Educational Resources Information Center

    Howard, Sarah K.

    2013-01-01

    Teachers who do not integrate technology are often labelled as "resistant" to change. Yet, considerable uncertainties remain about appropriate uses and actual value of technology in teaching and learning, which can make integration and change seem risky. The purpose of this article is to explore the nature of teachers' analytical and…

  9. Research strategies for addressing uncertainties

    USGS Publications Warehouse

    Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah

    2013-01-01

    Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.

  10. Section summary: Uncertainty and design considerations

    Treesearch

    Stephen Hagen

    2013-01-01

    Well planned sampling designs and robust approaches to estimating uncertainty are critical components of forest monitoring. The importance of uncertainty estimation increases as deforestation and degradation issues become more closely tied to financing incentives for reducing greenhouse gas emissions in the forest sector. Investors like to know risk and risk is tightly...

  11. Global Crop Yields, Climatic Trends and Technology Enhancement

    NASA Astrophysics Data System (ADS)

    Najafi, E.; Devineni, N.; Khanbilvardi, R.; Kogan, F.

    2016-12-01

    During recent decades, global agricultural production has soared, and technology enhancement is still making a positive contribution to yield growth. However, continuing population growth, water crises, deforestation and climate change threaten global food security. Attempts to predict future food availability around the world can be partly informed by the impact of the changes observed to date. A new multilevel model for yield prediction at the country scale using climate covariates and a technology trend is presented in this paper. The structural relationships between average yield and climate attributes, as well as trends, are estimated simultaneously. All countries are modeled in a single multilevel model with partial pooling and/or clustering to automatically group countries and reduce estimation uncertainties. El Niño Southern Oscillation (ENSO), the Palmer Drought Severity Index (PDSI), geopotential height (GPH), historical CO2 levels and a time trend, used as a relatively reliable approximation of technology, serve as predictors to estimate annual agricultural crop yields for each country from 1961 to 2007. Results show that these indicators can explain the variability in historical crop yields for most countries and that the model performs well under out-of-sample verification.
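
    The partial-pooling mechanism that reduces estimation uncertainty for countries with short or noisy records can be sketched with a simple shrinkage estimator on synthetic data; this is a cartoon of the idea, not the paper's full covariate model.

        import numpy as np

        rng = np.random.default_rng(11)
        n_countries, n_years = 30, 47                     # roughly 1961-2007
        true_trend = rng.normal(0.05, 0.02, n_countries)  # yield trend per year
        t = np.arange(n_years, dtype=float)
        yields = true_trend[:, None] * t + rng.normal(0, 0.5, (n_countries, n_years))

        tc = t - t.mean()
        yc = yields - yields.mean(axis=1, keepdims=True)
        sxx = (tc ** 2).sum()
        slope = (yc * tc).sum(axis=1) / sxx               # per-country OLS trend
        resid = yc - slope[:, None] * tc
        se2 = resid.var(axis=1) / sxx                     # sampling variance of each slope

        mu = slope.mean()                                 # global mean trend
        tau2 = max(slope.var() - se2.mean(), 1e-9)        # between-country variance
        w = tau2 / (tau2 + se2)                           # shrinkage weight per country
        pooled = w * slope + (1 - w) * mu                 # partially pooled estimates
        print(np.round(pooled[:5], 4))
        # Adding ENSO, PDSI, GPH and CO2 as covariates turns this into a model closer
        # to the paper's; pooling is what reduces uncertainty for noisy records.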

  12. Integrated Vehicle Health Management (IVHM) for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Baroth, Edmund C.; Pallix, Joan

    2006-01-01

    To achieve NASA's ambitious Integrated Space Transportation Program objectives, aerospace systems will implement a variety of new concepts in health management. System-level integration of IVHM technologies for real-time control and system maintenance will have a significant impact on system safety and lifecycle costs. IVHM technologies will enhance the safety and success of complex missions despite component failures, degraded performance, operator errors, and environmental uncertainty. IVHM also has the potential to reduce, or even eliminate, many of the costly inspections and operations activities required by current and future aerospace systems. This presentation will describe the array of NASA programs participating in the development of IVHM technologies for NASA missions. Future vehicle systems will use models of the system, its environment, and other intelligent agents with which they may interact. IVHM will be incorporated into future mission planners, reasoning engines, and adaptive control systems that can recommend or execute commands enabling the system to respond intelligently in real time. In the past, software errors and/or faulty sensors have been identified as significant contributors to mission failures. This presentation will also address the development and utilization of highly dependable software and sensor technologies, which are key components to ensure the reliability of IVHM systems.

  13. Uncertainties in global aerosols and climate effects due to biofuel emissions

    NASA Astrophysics Data System (ADS)

    Kodros, J. K.; Scott, C. E.; Farina, S. C.; Lee, Y. H.; L'Orange, C.; Volckens, J.; Pierce, J. R.

    2015-04-01

    Aerosol emissions from biofuel combustion impact both health and climate; however, while reducing emissions through improvements to combustion technologies will improve health, the net effect on climate is largely unconstrained. In this study, we examine sensitivities in global aerosol concentration, direct radiative climate effect, and cloud-albedo aerosol indirect climate effect to uncertainties in biofuel emission factors, optical mixing-state, and model nucleation and background SOA. We use the Goddard Earth Observing System global chemical-transport model (GEOS-Chem) with TwO Moment Aerosol Sectional (TOMAS) microphysics. The emission factors include: amount, composition, size and hygroscopicity, as well as optical mixing-state properties. We also evaluate emissions from domestic coal use, which is not biofuel but is also frequently emitted from homes. We estimate the direct radiative effect assuming different mixing states (internal, core-shell, and external) with and without absorptive organic aerosol (brown carbon). We find the global-mean direct radiative effect of biofuel emissions ranges from -0.02 to +0.06 W m-2 across all simulation/mixing state combinations with regional effects in source regions ranging from -0.2 to +1.2 W m-2. The global-mean cloud-albedo aerosol indirect effect ranges from +0.01 to -0.02 W m-2 with regional effects in source regions ranging from -1.0 to -0.05 W m-2. The direct radiative effect is strongly dependent on uncertainties in emissions mass, composition, emissions aerosol size distributions and assumed optical mixing state, while the indirect effect is dependent on the emissions mass, emissions aerosol size distribution and the choice of model nucleation and secondary organic aerosol schemes. The sign and magnitude of these effects have a strong regional dependence. We conclude that the climate effects of biofuel aerosols are largely unconstrained, and the overall sign of the aerosol effects is unclear due to uncertainties in model inputs. This uncertainty limits our ability to introduce mitigation strategies aimed at reducing biofuel black carbon emissions in order to counter warming effects from greenhouse-gases. To better understand the climate impact of particle emissions from biofuel combustion, we recommend field/laboratory measurements to narrow constraints on: (1) emissions mass, (2) emission size distribution, (3) mixing state, and (4) ratio of black carbon to organic aerosol.

  14. Uncertainties in global aerosols and climate effects due to biofuel emissions

    NASA Astrophysics Data System (ADS)

    Kodros, J. K.; Scott, C. E.; Farina, S. C.; Lee, Y. H.; L'Orange, C.; Volckens, J.; Pierce, J. R.

    2015-08-01

    Aerosol emissions from biofuel combustion impact both health and climate; however, while reducing emissions through improvements to combustion technologies will improve health, the net effect on climate is largely unconstrained. In this study, we examine sensitivities in global aerosol concentration, direct radiative climate effect, and cloud-albedo aerosol indirect climate effect to uncertainties in biofuel emission factors, optical mixing state, and model nucleation and background secondary organic aerosol (SOA). We use the Goddard Earth Observing System global chemical-transport model (GEOS-Chem) with TwO Moment Aerosol Sectional (TOMAS) microphysics. The emission factors include amount, composition, size, and hygroscopicity, as well as optical mixing-state properties. We also evaluate emissions from domestic coal use, which is not biofuel but is also frequently emitted from homes. We estimate the direct radiative effect assuming different mixing states (homogeneous, core-shell, and external) with and without absorptive organic aerosol (brown carbon). We find the global-mean direct radiative effect of biofuel emissions ranges from -0.02 to +0.06 W m-2 across all simulation/mixing-state combinations with regional effects in source regions ranging from -0.2 to +0.8 W m-2. The global-mean cloud-albedo aerosol indirect effect (AIE) ranges from +0.01 to -0.02 W m-2 with regional effects in source regions ranging from -1.0 to -0.05 W m-2. The direct radiative effect is strongly dependent on uncertainties in emissions mass, composition, emissions aerosol size distributions, and assumed optical mixing state, while the indirect effect is dependent on the emissions mass, emissions aerosol size distribution, and the choice of model nucleation and secondary organic aerosol schemes. The sign and magnitude of these effects have a strong regional dependence. We conclude that the climate effects of biofuel aerosols are largely unconstrained, and the overall sign of the aerosol effects is unclear due to uncertainties in model inputs. This uncertainty limits our ability to introduce mitigation strategies aimed at reducing biofuel black carbon emissions in order to counter warming effects from greenhouse gases. To better understand the climate impact of particle emissions from biofuel combustion, we recommend field/laboratory measurements to narrow constraints on (1) emissions mass, (2) emission size distribution, (3) mixing state, and (4) ratio of black carbon to organic aerosol.

  15. Perceived Uncertainty and Organizational Health in Public Schools: The Mediating Effect of School Principals' Transformational Leadership Style

    ERIC Educational Resources Information Center

    Hameiri, Lior; Nir, Adam

    2016-01-01

    Purpose: Public schools operate in a changing and dynamic environment evident in technological innovations, increased social heterogeneity and competition, all contributing to school leaders' uncertainty. Such changes inevitably influence schools' inner dynamic and may therefore undermine schools' organizational health. School leaders have a…

  16. Uncertainty Exposed: A Field Lab Exercise Where GIS Meets the Real World

    ERIC Educational Resources Information Center

    Prisley, Stephen P.; Luebbering, Candice

    2011-01-01

    Students in natural resources programs commonly take courses in geospatial technologies. An awareness of the uncertainty of spatial data and algorithms can be an important outcome of such courses. This article describes a laboratory exercise in a graduate geographic information system (GIS) class that involves collection of data for the assessment…

  17. Intellectual Skills and Competitive Strength: Is a Radical Change Necessary?

    ERIC Educational Resources Information Center

    Koike, Kazuo

    2002-01-01

    Data from a study of Toyota production workshops show the most important worker intellectual skills are problem-solving know-how and ability to handle change. Introduction of information technology elevates the need for intellectual skills because of uncertainty. Development of skills for dealing with uncertainty and change in both blue- and…

  18. NICE technology appraisals: working with multiple levels of uncertainty and the potential for bias.

    PubMed

    Brown, Patrick; Calnan, Michael

    2013-05-01

    One of the key roles of the English National Institute for Health and Clinical Excellence (NICE) is technology appraisal. This essentially involves evaluating the cost effectiveness of pharmaceutical products and other technologies for use within the National Health Service. Based on a content analysis of key documents which shed light on the nature of appraisals, this paper draws attention to the multiple layers of uncertainty and complexity which are latent within the appraisal process, and the often socially constructed mechanisms for tackling these. Epistemic assumptions, bounded rationality and more explicitly relational forms of managing knowledge are applied to this end. These findings are discussed in the context of the literature highlighting the inherently social process of regulation. A framework is developed which posits the various forms of uncertainty, and responses to these, as potential conduits of regulatory bias, in need of further research. That NICE's authority is itself regulated by other actors within the regulatory regime, particularly the pharmaceutical industry, exposes it to the threat of regulatory capture. Following Lehoux, it is concluded that a more transparent and reflexive format for technological appraisals is necessary. This would enable a more robust, defensible form of decision-making and moreover enable NICE to preserve its legitimacy in the midst of pressures which threaten this.

  19. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relevant to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
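
    The venturi uncertainty propagation mentioned in the appendices follows the standard bias/precision bookkeeping of experimental uncertainty analysis; a minimal sketch with invented sensitivity factors and uncertainty values (not the TTB program's documented numbers):

        from math import sqrt

        def rss(*terms):
            """Root-sum-square combination of uncertainty components."""
            return sqrt(sum(x * x for x in terms))

        # Relative contributions to mass-flow uncertainty for a venturi,
        # m_dot ~ Cd * A * sqrt(2 * rho * dP), so rho and dP carry a 1/2 sensitivity.
        bias = rss(0.005,          # discharge-coefficient calibration bias
                   0.5 * 0.004,    # density bias (under the square root)
                   0.5 * 0.006)    # differential-pressure transducer bias
        precision = rss(0.002, 0.5 * 0.003)   # 2-sigma scatter terms
        total = rss(bias, precision)
        print(f"bias {bias:.3%}, precision {precision:.3%}, total {total:.3%}")
        # The report's venturi analysis follows this same pattern, with documented
        # bias limits and measured precision indices for each input.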

  20. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, when resources are limited for an ideal exploration of uncertainty, or when evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that most influence results are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.

  1. An Inferentialist Perspective on the Coordination of Actions and Reasons Involved in Making a Statistical Inference

    ERIC Educational Resources Information Center

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-01-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical…

  2. Impact of Online User-Generated Content on Retailers and Manufacturers

    ERIC Educational Resources Information Center

    Kwark, Young

    2013-01-01

    Online user-generated content has been ubiquitous. Consumers are eager to not only listen to the other consumers but provide opinion to the public. The former reduces the consumers' uncertainty about products and the latter reduces the firms' uncertainty about consumers. We examine the effect of online product reviews, a most common form of…

  3. A New Combined Stepwise-Based High-Order Decoupled Direct and Reduced-Form Method To Improve Uncertainty Analysis in PM2.5 Simulations.

    PubMed

    Huang, Zhijiong; Hu, Yongtao; Zheng, Junyu; Yuan, Zibing; Russell, Armistead G; Ou, Jiamin; Zhong, Zhuangmin

    2017-04-04

    The traditional reduced-form model (RFM), based on the high-order decoupled direct method (HDDM), is an efficient uncertainty analysis approach for air quality models, but it has large biases in uncertainty propagation due to the limitation of the HDDM in predicting nonlinear responses to large perturbations of model inputs. To overcome this limitation, a new stepwise RFM that combines several sets of local sensitivity coefficients under different conditions is proposed. Evaluations reveal that the new RFM improves the prediction of nonlinear responses. The new method is applied to quantify uncertainties in simulated PM2.5 concentrations in the Pearl River Delta (PRD) region of China as a case study. Results show that the average uncertainty range of hourly PM2.5 concentrations is -28% to 57%, which can cover approximately 70% of the observed PM2.5 concentrations, while the traditional RFM underestimates the upper bound of the uncertainty range by 1-6%. Using a variance-based method, the PM2.5 boundary conditions and primary PM2.5 emissions are found to be the two major uncertainty sources in PM2.5 simulations. The new RFM better quantifies the uncertainty range in model simulations and can be applied to improve applications that rely on uncertainty information.
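
    The reduced-form idea can be sketched as a Taylor-series response surface built from HDDM-style first- and second-order sensitivity coefficients, with the stepwise variant switching coefficient sets by perturbation range; all coefficients below are invented for illustration.

        import numpy as np

        def rfm(eps, s1, s2, c0=50.0):
            """Second-order Taylor response of concentration to emission scaling eps."""
            return c0 + s1 * eps + 0.5 * s2 * eps**2

        # Local coefficient sets computed at different base points (invented values):
        coef = {(-0.3, 0.3): (20.0, -6.0),    # near the base case
                (0.3, 1.0): (16.0, -3.0),     # large increases
                (-1.0, -0.3): (23.0, -8.0)}   # large reductions

        def stepwise_rfm(eps):
            # Pick the coefficient set fitted for this perturbation range
            for (lo, hi), (s1, s2) in coef.items():
                if lo <= eps <= hi:
                    return rfm(eps, s1, s2)
            raise ValueError("perturbation outside fitted range")

        samples = np.random.default_rng(5).uniform(-0.8, 0.8, 10_000)  # input uncertainty
        pm = np.array([stepwise_rfm(e) for e in samples])
        print(np.percentile(pm, [2.5, 97.5]))   # propagated uncertainty range
        # Using one global (s1, s2) pair instead reproduces the traditional RFM and
        # its bias for large perturbations, which the stepwise scheme corrects.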

  4. Analysis of recent projections of electric power demand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hudson, Jr, D V

    1993-08-01

    This report reviews the changes and potential changes in the outlook for electric power demand since the publication of Review and Analysis of Electricity Supply Market Projections (B. Swezey, SERI/MR-360-3322, National Renewable Energy Laboratory). Forecasts of the following organizations were reviewed: DOE/Energy Information Administration, DOE/Policy Office, DRI/McGraw-Hill, North American Electric Reliability Council, and Gas Research Institute. Supply uncertainty was briefly reviewed to place the uncertainties of the demand outlook in perspective. Also discussed were opportunities for modular technologies, such as renewable energy technologies, to fill a potential gap in energy demand and supply.

  5. Cryogenic Fluid Technologies for Long Duration In-Space Operations

    NASA Technical Reports Server (NTRS)

    Motil, Susan M.; Tramel, Terri L.

    2008-01-01

    Reliable knowledge of low-gravity cryogenic fluid management behavior is lacking and yet is critical in the areas of storage, distribution, and low-gravity propellant management. The Vision for Space Exploration mission objectives will require the use of high performance cryogenic propellants (hydrogen, oxygen, and methane). Additionally, lunar missions will require success in storing and transferring liquid and gas commodities on the surface. The fundamental challenges associated with the in-space use of cryogens are their susceptibility to environmental heat, their complex thermodynamic and fluid dynamic behavior in low gravity and the uncertainty of the position of the liquid-vapor interface if the propellants are not settled. The Cryogenic Fluid Management (CFM) project is addressing these issues through ground testing and analytical model development, and has crosscutting applications and benefits to virtually all missions requiring in-space operations with cryogens. Such knowledge can significantly reduce or even eliminate tank fluid boil-off losses for long term missions, reduce propellant launch mass and on-orbit margins, and simplify vehicle operations. The Cryogenic Fluid Management (CFM) Project is conducting testing and performing analytical evaluation of several areas to enable NASA's Exploration Vision. This paper discusses the content and progress of the technology focus areas within CFM.

  6. The Role of Trust and Interaction in Global Positioning System Related Accidents

    NASA Technical Reports Server (NTRS)

    Johnson, Chris W.; Shea, Christine; Holloway, C. Michael

    2008-01-01

    The Global Positioning System (GPS) uses a network of satellites to calculate the position of a receiver over time. This technology has revolutionized a wide range of safety-critical industries and leisure applications. These systems provide diverse benefits: supplementing the user's existing navigation skills and reducing the uncertainty that characterizes many route-planning tasks. GPS applications can also help to reduce workload by automating tasks that would otherwise require finite cognitive and perceptual resources. However, the operation of these systems has been identified as a contributory factor in a range of recent accidents. Users often come to rely on GPS applications and, therefore, fail to notice when they develop faults or when errors occur in the other systems that use their data. Further accidents can stem from the overconfidence that arises when users assume automated warnings will be issued when they stray from an intended route. Unless greater attention is paid to the role of trust and interaction in GPS applications, there is a danger that we will see an increasing number of these failures as positioning technologies become integral to the functioning of an increasing number of applications.

  7. Real-Time Capabilities of a Digital Analyzer for Mixed-Field Assay Using Scintillation Detectors

    NASA Astrophysics Data System (ADS)

    Aspinall, M. D.; Joyce, M. J.; Lavietes, A.; Plenteda, R.; Cave, F. D.; Parker, H.; Jones, A.; Astromskas, V.

    2017-03-01

    Scintillation detectors offer a single-step detection method for fast neutrons and necessitate real-time acquisition, whereas this is redundant in two-stage thermal detection systems using helium-3 and lithium-6, where the fast neutrons need to be thermalized prior to detection. The relative affordability of scintillation detectors and the associated fast digital acquisition systems have enabled entirely new measurement setups that can consist of sizeable detector arrays. These detectors in most cases rely on photomultiplier tubes, which have significant tolerances and result in variations in detector response functions. The detector tolerances and other environmental instabilities must be accounted for in measurements that depend on matched detector performance. This paper presents recent advances made to a high-speed FPGA-based digitizer. The technology described offers a complete solution for fast-neutron scintillation detectors by integrating multichannel high-speed data acquisition technology with dedicated detector high-voltage supplies. This configuration has significant advantages for large detector arrays that require uniform detector responses. We report on bespoke control software and firmware techniques that exploit real-time functionality to reduce setup and acquisition time, increase repeatability, and reduce statistical uncertainties.

  8. The contents of visual working memory reduce uncertainty during visual search.

    PubMed

    Cosman, Joshua D; Vecera, Shaun P

    2011-05-01

    Information held in visual working memory (VWM) influences the allocation of attention during visual search, with targets matching the contents of VWM receiving processing benefits over those that do not. Such an effect could arise from multiple mechanisms: First, it is possible that the contents of working memory enhance the perceptual representation of the target. Alternatively, it is possible that when a target is presented among distractor items, the contents of working memory operate postperceptually to reduce uncertainty about the location of the target. In both cases, a match between the contents of VWM and the target should lead to facilitated processing. However, each account makes distinct predictions regarding set-size manipulations; whereas perceptual enhancement accounts predict processing benefits regardless of set size, uncertainty reduction accounts predict benefits only with set sizes larger than 1, when there is uncertainty regarding the target location. In the present study, when briefly presented, masked targets appeared in isolation, there was a negligible effect of the information held in VWM on target discrimination. However, in displays containing multiple masked items, information held in VWM strongly affected target discrimination. These results argue that working memory representations act at a postperceptual level to reduce uncertainty during visual search.

  9. Fuels for urban transit buses: a cost-effectiveness analysis.

    PubMed

    Cohen, Joshua T; Hammitt, James K; Levy, Jonathan I

    2003-04-15

    Public transit agencies have begun to adopt alternative propulsion technologies to reduce urban transit bus emissions associated with conventional diesel (CD) engines. Among the most popular alternatives are emission controlled diesel buses (ECD), defined here to be buses with continuously regenerating diesel particle filters burning low-sulfur diesel fuel, and buses burning compressed natural gas (CNG). This study uses a series of simplifying assumptions to arrive at first-order estimates for the incremental cost-effectiveness (CE) of ECD and CNG relative to CD. The CE ratio numerator reflects acquisition and operating costs. The denominator reflects health losses (mortality and morbidity) due to primary particulate matter (PM), secondary PM, and ozone exposure, measured in quality-adjusted life years (QALYs). We find that CNG provides larger health benefits than does ECD (nine vs. six QALYs annually per 1000 buses), but that ECD is more cost-effective than CNG ($270,000 per QALY for ECD vs. $1.7 million to $2.4 million per QALY for CNG). These estimates are subject to much uncertainty. We identify the assumptions that contribute most to this uncertainty and propose potential research directions to refine our estimates.
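
    The arithmetic behind these CE ratios is a simple incremental ratio. A hedged sketch in Python: the QALY gains come from the abstract, while the incremental cost figures below are hypothetical values back-solved from the reported ratios, used only to show the calculation.

        # Incremental cost-effectiveness ratio (ICER) relative to conventional diesel:
        #     ICER = (annual incremental cost) / (annual QALYs gained)
        # QALY gains per 1000 buses are from the abstract (6 for ECD, 9 for CNG);
        # the incremental costs are illustrative, back-solved from the reported ratios.
        def icer(incremental_cost, qalys_gained):
            return incremental_cost / qalys_gained

        ecd = icer(1.62e6, 6)    # ~ $270,000 per QALY
        cng = icer(15.3e6, 9)    # ~ $1.7 million per QALY (low end)
        print(f"ECD: ${ecd:,.0f}/QALY   CNG: ${cng:,.0f}/QALY")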

  10. Life-cycle analysis of shale gas and natural gas.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, C.E.; Han, J.; Burnham, A.

    2012-01-27

    The technologies and practices that have enabled the recent boom in shale gas production have also brought attention to the environmental impacts of its use. Using the current state of knowledge of the recovery, processing, and distribution of shale gas and conventional natural gas, we have estimated up-to-date, life-cycle greenhouse gas emissions. In addition, we have developed distribution functions for key parameters in each pathway to examine uncertainty and identify data gaps - such as methane emissions from shale gas well completions and conventional natural gas liquid unloadings - that need to be addressed further. Our base case results show that shale gas life-cycle emissions are 6% lower than those of conventional natural gas. However, the ranges of values for shale and conventional gas overlap, so there is statistical uncertainty regarding whether shale gas emissions are indeed lower than conventional gas emissions. This life-cycle analysis provides insight into the critical stages in the natural gas industry where emissions occur and where opportunities exist to reduce the greenhouse gas footprint of natural gas.
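
    A hedged sketch of the overlap comparison: Monte Carlo draws from two placeholder emission-factor distributions (invented, not the study's parameter set) give the probability that shale emissions are lower, plus an interval for the difference.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 50_000

        # Hypothetical life-cycle emission factors (g CO2e/MJ); the means echo
        # the "shale ~6% lower" base case, the spreads are invented.
        shale = rng.normal(66.0, 6.0, N)
        conventional = rng.normal(70.0, 6.5, N)

        diff = conventional - shale
        print(f"P(shale < conventional) = {(diff > 0).mean():.2f}; "
              f"95% interval of difference: {np.percentile(diff, [2.5, 97.5]).round(1)}")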

  11. Reliability Issues in Stirling Radioisotope Power Systems

    NASA Technical Reports Server (NTRS)

    Schreiber, Jeffrey; Shah, Ashwin

    2005-01-01

    Stirling power conversion is a potential candidate for use in a Radioisotope Power System (RPS) for space science missions because it offers a multifold increase in the efficiency of heat-to-electric-power conversion and a reduced requirement for radioactive material. Reliability of an RPS that utilizes Stirling power conversion technology is important to ensure successful long-term performance. Owing to the long lifetime requirement (14 years), it is difficult to perform long-term tests that encompass all the uncertainties involved in the design variables of the components and subsystems comprising the RPS. The requirement for uninterrupted performance reliability and related issues are discussed, and some of the critical areas of concern are identified. An overview of current ongoing efforts to understand component life, design variables at the component and system levels, and the related sources and nature of uncertainties is also given. The current status of the 110-watt Stirling Radioisotope Generator (SRG110) reliability efforts is described. Additionally, an approach that uses past experience with other successfully deployed power systems to develop a reliability plan for the SRG110 design is outlined.

  12. A controllable sensor management algorithm capable of learning

    NASA Astrophysics Data System (ADS)

    Osadciw, Lisa A.; Veeramacheneni, Kalyan K.

    2005-03-01

    Sensor management technology progress is challenged by the geographic space it spans, the heterogeneity of the sensors, and the real-time timeframes within which plans controlling the assets are executed. This paper presents a new sensor management paradigm and demonstrates its application in a sensor management algorithm designed for a biometric access control system. The approach consists of an artificial intelligence (AI) algorithm focused on uncertainty measures, which makes the high-level decisions to reduce uncertainties and interfaces with the user, integrated cohesively with a bottom-up evolutionary algorithm, which optimizes the sensor network's operation as determined by the AI algorithm. The sensor management algorithm presented is composed of a Bayesian network (the AI component) and a swarm optimization algorithm (the evolutionary component). Thus, the algorithm can change its own performance goals in real time and will modify its own decisions based on observed measures within the sensor network. The definition of the measures, as well as the Bayesian network, determines the robustness of the algorithm and its utility in reacting dynamically to changes in the global system.

  13. Reliability Issues in Stirling Radioisotope Power Systems

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Schreiber, Jeffrey G.

    2004-01-01

    Stirling power conversion is a potential candidate for use in a Radioisotope Power System (RPS) for space science missions because it offers a multifold increase in the conversion efficiency of heat to electric power and reduced requirement of radioactive material. Reliability of an RPS that utilizes Stirling power conversion technology is important in order to ascertain long term successful performance. Owing to long life time requirement (14 years), it is difficult to perform long-term tests that encompass all the uncertainties involved in the design variables of components and subsystems comprising the RPS. The requirement for uninterrupted performance reliability and related issues are discussed, and some of the critical areas of concern are identified. An overview of the current on-going efforts to understand component life, design variables at the component and system levels, and related sources and nature of uncertainties are also discussed. Current status of the 110 watt Stirling Radioisotope Generator (SRG110) reliability efforts is described. Additionally, an approach showing the use of past experience on other successfully used power systems to develop a reliability plan for the SRG110 design is outlined.

  14. Modeling cascading diffusion of new energy technologies: case study of residential solid oxide fuel cells in the US and internationally.

    PubMed

    Herron, Seth; Williams, Eric

    2013-08-06

    Subsidy programs for new energy technologies are motivated by the experience curve: increased adoption of a technology leads to learning and economies of scale that lower costs. Geographic differences in fuel prices and climate lead to large variability in the economic performance of energy technologies. The notion of cascading diffusion is that regions with favorable economic conditions serve as the basis to build scale and reduce costs so that the technology becomes attractive in new regions. We develop a model of cascading diffusion and implement it via a case study of residential solid oxide fuel cells (SOFCs) for combined heating and power, considering diffusion paths within the U.S. and internationally. We construct market willingness-to-pay curves and estimate future manufacturing costs via an experience curve. Combining market and cost results, we find that for rapid cost reductions (learning rate = 25%), a modest public subsidy can make SOFC investment profitable for 20-160 million households. If cost reductions are slow, however (learning rate = 15%), residential SOFCs may not become economically competitive. Due to higher energy prices in some countries, international diffusion is more favorable than domestic diffusion, mitigating much of the uncertainty in the learning rate.
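
    The experience curve named here has a standard closed form: each doubling of cumulative production cuts unit cost by the learning rate. A short sketch, with illustrative numbers only:

        import numpy as np

        def experience_curve(c0, cum0, cum, learning_rate):
            # A learning rate of 0.25 means each doubling of cumulative
            # production cuts unit cost by 25%.
            b = np.log2(1.0 - learning_rate)      # progress exponent (negative)
            return c0 * (cum / cum0) ** b

        # Illustrative only: a $10,000 system after 10 doublings of production.
        for lr in (0.25, 0.15):
            c = experience_curve(10_000, 1.0, 2.0 ** 10, lr)
            print(f"learning rate {lr:.0%}: ~${c:,.0f} per system")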

  15. Correlated Uncertainties in Radiation Shielding Effectiveness

    NASA Technical Reports Server (NTRS)

    Werneth, Charles M.; Maung, Khin Maung; Blattnig, Steve R.; Clowdsley, Martha S.; Townsend, Lawrence W.

    2013-01-01

    The space radiation environment is composed of energetic particles that can deliver harmful doses of radiation, which may lead to acute radiation sickness, cancer, and even death for insufficiently shielded crew members. Spacecraft shielding must provide structural integrity and minimize the risk associated with radiation exposure. The risk of radiation exposure-induced death (REID) is a measure of the risk of dying from cancer induced by radiation exposure. Uncertainties in the risk projection model, quality factor, and spectral fluence are folded into the calculation of the REID by sampling from probability distribution functions. Consequently, determining optimal shielding materials that reduce the REID in a statistically significant manner has been found to be difficult. In this work, the difference of the REID distributions for different materials is used to study the effect of composition on shielding effectiveness. It is shown that the use of correlated uncertainties allows for the determination of statistically significant differences between materials despite the large uncertainties in the quality factor. This is in contrast to previous methods, in which uncertainties have generally been treated as uncorrelated. It is concluded that the use of correlated quality factor uncertainties greatly reduces the uncertainty in the assessment of shielding effectiveness for the mitigation of radiation exposure.
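
    Why correlation helps: for two materials A and B, Var(A - B) = Var(A) + Var(B) - 2 Cov(A, B), so a shared quality-factor uncertainty largely cancels in the difference. A toy sketch with invented REID values:

        import numpy as np

        rng = np.random.default_rng(2)
        N = 100_000

        # Hypothetical REID (%) for two shield materials, each scaled by an
        # uncertain quality factor; all numbers are illustrative.
        base_a, base_b = 3.0, 2.8
        q = rng.lognormal(0.0, 0.4, N)     # shared (correlated) quality factor

        # Correlated: the same quality-factor draw multiplies both materials.
        diff_corr = base_a * q - base_b * q
        # Uncorrelated: independent draws for each material.
        diff_unco = base_a * q - base_b * rng.lognormal(0.0, 0.4, N)

        for name, d in [("correlated", diff_corr), ("uncorrelated", diff_unco)]:
            print(f"{name:>12s}: mean diff {d.mean():.2f}, std {d.std():.2f}")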

  16. How uncertain is the future of electric vehicle market: Results from Monte Carlo simulations using a nested logit model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Changzheng; Oak Ridge National Lab.; Lin, Zhenhong

    Plug-in electric vehicles (PEVs) are widely regarded as an important component of the technology portfolio designed to accomplish policy goals in sustainability and energy security. However, the future market acceptance of PEVs remains largely uncertain from today's perspective. By integrating a consumer choice model based on a nested multinomial logit with Monte Carlo simulation, this study analyzes the uncertainty of PEV market penetration. Results suggest that the future market for PEVs is highly uncertain and that there is a substantial risk of low penetration in the early and midterm market. The top factors contributing to market share variability are price sensitivities, energy cost, range limitation, and charging availability. The results also illustrate the potential effect of public policies in promoting PEVs through investment in battery technology and infrastructure deployment. Continued improvement of battery technologies and deployment of charging infrastructure alone do not necessarily reduce the spread of the market share distributions, but may shift the distributions toward the right, i.e., increase the probability of great market success.

  17. How uncertain is the future of electric vehicle market: Results from Monte Carlo simulations using a nested logit model

    DOE PAGES

    Liu, Changzheng; Oak Ridge National Lab.; Lin, Zhenhong; ...

    2016-12-08

    Plug-in electric vehicles (PEVs) are widely regarded as an important component of the technology portfolio designed to accomplish policy goals in sustainability and energy security. However, the future market acceptance of PEVs remains largely uncertain from today's perspective. By integrating a consumer choice model based on a nested multinomial logit with Monte Carlo simulation, this study analyzes the uncertainty of PEV market penetration. Results suggest that the future market for PEVs is highly uncertain and that there is a substantial risk of low penetration in the early and midterm market. The top factors contributing to market share variability are price sensitivities, energy cost, range limitation, and charging availability. The results also illustrate the potential effect of public policies in promoting PEVs through investment in battery technology and infrastructure deployment. Continued improvement of battery technologies and deployment of charging infrastructure alone do not necessarily reduce the spread of the market share distributions, but may shift the distributions toward the right, i.e., increase the probability of great market success.
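
    A hedged sketch of the simulation logic: a simple binary logit stands in for the paper's nested multinomial logit, and Monte Carlo draws over two of the influential factors propagate into a market-share distribution. All coefficients and values are invented.

        import numpy as np

        rng = np.random.default_rng(3)
        N = 20_000

        # Hypothetical draws for two of the top-ranked uncertainty sources.
        price_sens = rng.normal(0.5, 0.15, N)    # disutility per $1k of price gap
        energy_save = rng.normal(4.0, 2.0, N)    # PEV operating-cost advantage ($k)

        price_gap = 8.0                          # PEV price premium ($k), illustrative
        utility = price_sens * (energy_save - price_gap)

        # Binary logit stand-in for the nested specification in the paper.
        share = 1.0 / (1.0 + np.exp(-utility))
        print(f"median PEV share {np.median(share):.1%}, "
              f"90% interval {np.percentile(share, [5, 95]).round(3)}")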

  18. Portfolio Optimization of Nanomaterial Use in Clean Energy Technologies.

    PubMed

    Moore, Elizabeth A; Babbitt, Callie W; Gaustad, Gabrielle; Moore, Sean T

    2018-04-03

    While engineered nanomaterials (ENMs) are increasingly incorporated in diverse applications, risks of ENM adoption remain difficult to predict and mitigate proactively. Current decision-making tools do not adequately account for ENM uncertainties including varying functional forms, unique environmental behavior, economic costs, unknown supply and demand, and upstream emissions. The complexity of the ENM system necessitates a novel approach: in this study, the adaptation of an investment portfolio optimization model is demonstrated for optimization of ENM use in renewable energy technologies. Where a traditional investment portfolio optimization model maximizes return on investment through optimal selection of stock, ENM portfolio optimization maximizes the performance of energy technology systems by optimizing selective use of ENMs. Cumulative impacts of multiple ENM material portfolios are evaluated in two case studies: organic photovoltaic cells (OPVs) for renewable energy and lithium-ion batteries (LIBs) for electric vehicles. Results indicate ENM adoption is dependent on overall performance and variance of the material, resource use, environmental impact, and economic trade-offs. From a sustainability perspective, improved clean energy applications can help extend product lifespans, reduce fossil energy consumption, and substitute ENMs for scarce incumbent materials.
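
    A minimal sketch of the adapted mean-variance idea, assuming hypothetical per-ENM performance scores and a covariance matrix standing in for the correlated cost, supply, and impact variability the study describes; this is not the authors' model.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical expected performance per ENM and its covariance.
        mu = np.array([1.0, 0.8, 0.6])
        cov = np.array([[0.10, 0.02, 0.00],
                        [0.02, 0.05, 0.01],
                        [0.00, 0.01, 0.02]])

        def variance(w):
            # Portfolio variance, the quantity to minimize.
            return w @ cov @ w

        cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},      # shares sum to 1
                {"type": "ineq", "fun": lambda w: w @ mu - 0.8})     # performance floor
        res = minimize(variance, x0=np.full(3, 1 / 3),
                       bounds=[(0, 1)] * 3, constraints=cons)
        print("optimal ENM shares:", res.x.round(3))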

  19. Design of forging process variables under uncertainties

    NASA Astrophysics Data System (ADS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2005-02-01

    Forging is a complex nonlinear process that is vulnerable to various manufacturing anomalies, such as variations in billet geometry, billet/die temperatures, material properties, and workpiece and forging equipment positional errors. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion, and reduced productivity. Identifying, quantifying, and controlling the uncertainties will reduce variability risk in a manufacturing environment, which will minimize the overall production cost. In this article, various uncertainties that affect the forging process are identified, and their cumulative effect on the forging tool life is evaluated. Because the forging process simulation is time-consuming, a response surface model is used to reduce computation time by establishing a relationship between the process performance and the critical process variables. A robust design methodology is developed by incorporating reliability-based optimization techniques to obtain sound forging components. A case study of an automotive-component forging-process design is presented to demonstrate the applicability of the method.
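
    A toy illustration of the response-surface step: a quadratic surrogate fitted to a handful of expensive simulation outputs (all values invented) can then be queried cheaply inside a reliability-based optimization loop.

        import numpy as np

        # Stand-in for expensive forging simulations: die life vs. billet
        # temperature (all numbers illustrative).
        temps = np.array([950., 1000., 1050., 1100., 1150.])
        die_life = np.array([4.1, 5.6, 6.2, 5.9, 4.8])

        # Quadratic response surface: life ~ a*T^2 + b*T + c.
        coeffs = np.polyfit(temps, die_life, deg=2)
        surrogate = np.poly1d(coeffs)

        # The cheap surrogate replaces the full simulation during optimization.
        t_best = -coeffs[1] / (2 * coeffs[0])
        print(f"surrogate optimum near T = {t_best:.0f}, life ~ {surrogate(t_best):.2f}")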

  20. Electro-optical equivalent calibration technology for high-energy laser energy meters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Ji Feng, E-mail: wjfcom2000@163.com; Institute of Applied Electronics, China Academy of Engineering Physics, Mianyang 621900; Graduate School of China Academy of Engineering Physics, Beijing 100088

    Electro-optical equivalent calibration with high calibration power and high equivalence is particularly well-suited to the calibration of high-energy laser energy meters. A large amount of energy is reserved during this process, however, which continues to radiate after power-off. This study measured the radiation efficiency of a halogen tungsten lamp during power-on and after power-off in order to calculate the total energy irradiated by a lamp until the high-energy laser energy meter reaches thermal equilibrium. A calibration system was designed based on the measurement results, and the calibration equivalence of the system was analyzed in detail. Results show that measurement precision is significantly affected by the absorption factor of the absorption chamber and by heat loss in the energy meter. Calibration precision is successfully improved by enhancing the equivalent power and reducing the power-on time. The electro-optical equivalent calibration system, whose measurement uncertainty was evaluated as 2.4% (k = 2), was used to calibrate a graphite-cone-absorption-cavity absolute energy meter, yielding a calibration coefficient of 1.009 and a measurement uncertainty of 3.5% (k = 2). A water-absorption-type high-energy laser energy meter with a measurement uncertainty of 4.8% (k = 2) was used as the reference standard; comparison with the energy meter calibrated in this study yielded a correction factor of 0.995 (standard deviation of 1.4%).

  1. The impact of shale gas on the cost and feasibility of meeting climate targets—A global energy system model analysis and an exploration of uncertainties

    DOE PAGES

    Few, Sheridan; Gambhir, Ajay; Napp, Tamaryn; ...

    2017-01-27

    There exists considerable uncertainty over both shale and conventional gas resource availability and extraction costs, as well as over the fugitive methane emissions associated with shale gas extraction and its possible role in mitigating climate change. This study uses a multi-region energy system model, TIAM (TIMES integrated assessment model), to consider the impact of a range of conventional and shale gas cost and availability assessments on mitigation scenarios aimed at limiting global warming to below 2 °C in 2100 with a 50% likelihood. When shale gas is added to the global energy mix, the reduction in the global energy system cost is relatively small (up to 0.4%), and the mitigation cost increases by 1%–3% under all cost assumptions. The impacts of a “dash for shale gas”, of unavailability of carbon capture and storage, of increased barriers to investment in low-carbon technologies, and of higher-than-expected leakage rates are also considered; each is found to have the potential to increase the cost and reduce the feasibility of meeting global temperature goals. We conclude that the extraction of shale gas is not likely to significantly reduce the effort required to mitigate climate change under globally coordinated action, but could increase the required mitigation effort if not handled sufficiently carefully.

  2. The sequestration switch: removing industrial CO2 by direct ocean absorption.

    PubMed

    Ametistova, Lioudmila; Twidell, John; Briden, James

    2002-04-22

    This review paper considers direct injection of industrial CO2 emissions into the mid-water oceanic column below 500 m depth. Such a process is a potential candidate for switching atmospheric carbon emissions directly to long-term sequestration, thereby relieving the intermediate atmospheric burden. Given sufficient research justification, the argument is that harmful impacts on both the atmosphere and the biologically rich upper marine layer could be reduced. The paper aims to estimate the role that active intervention, through direct ocean CO2 storage, could play and to outline the further research and assessment needed for the strategy to be a viable option for climate change mitigation. The attractiveness of direct ocean injection lies in its bypassing of the atmosphere and upper marine region, its relative permanence, its practicability using existing technologies, and its quantifiability. The difficulties relate to the uncertainty of some fundamental scientific issues, such as plume dynamics, lowered pH of the exposed waters and the associated ecological impact, the significant energy penalty associated with the necessary engineering plant, and the uncertain costs. Moreover, there are considerable uncertainties regarding related international marine law. Development of the process would require acceptance of the evidence for climate change, strict requirements for large industrial consumers of fossil fuel to reduce CO2 emissions into the atmosphere, and scientific evidence for the overall beneficial impact of ocean sequestration.

  3. The impact of shale gas on the cost and feasibility of meeting climate targets—A global energy system model analysis and an exploration of uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Few, Sheridan; Gambhir, Ajay; Napp, Tamaryn

    There exists considerable uncertainty over both shale and conventional gas resource availability and extraction costs, as well as the fugitive methane emissions associated with shale gas extraction and its possible role in mitigating climate change. This study uses a multi-region energy system model, TIAM (TIMES integrated assessment model), to consider the impact of a range of conventional and shale gas cost and availability assessments on mitigation scenarios aimed at achieving a limit to global warming of below 2 °C in 2100, with a 50% likelihood. When adding shale gas to the global energy mix, the reduction to the global energymore » system cost is relatively small (up to 0.4%), and the mitigation cost increases by 1%–3% under all cost assumptions. The impact of a “dash for shale gas”, of unavailability of carbon capture and storage, of increased barriers to investment in low carbon technologies, and of higher than expected leakage rates, are also considered; and are each found to have the potential to increase the cost and reduce feasibility of meeting global temperature goals. Finally, we conclude that the extraction of shale gas is not likely to significantly reduce the effort required to mitigate climate change under globally coordinated action, but could increase required mitigation effort if not handled sufficiently carefully.« less

  4. A probabilistic approach to emissions from transportation sector in the coming decades

    NASA Astrophysics Data System (ADS)

    Yan, F.; Winijkul, E.; Bond, T. C.; Streets, D. G.

    2010-12-01

    Future emission estimates are necessary for understanding climate change, designing national and international strategies for air quality control, and evaluating mitigation policies. Emission inventories are uncertain and future projections even more so. Most current emission projection models are deterministic; in other words, there is only a single answer for each scenario. As a result, uncertainties have not been included in the estimation of climate forcing or other environmental effects, yet it is important to quantify the uncertainty inherent in emission projections. We explore uncertainties of emission projections from the transportation sector in the coming decades via sensitivity analysis and Monte Carlo simulations. These projections are based on a technology-driven model, the Speciated Pollutants Emission Wizard (SPEW)-Trend, which responds to socioeconomic conditions in different economic and mitigation scenarios. The model contains detail about the technology stock, including consumption growth rates, retirement rates, timing of emission standards, deterioration rates, and transition rates from normal vehicles to vehicles with extremely high emission factors (termed “superemitters”). However, understanding of these parameters, as well as of their relationships with socioeconomic conditions, is uncertain. We project emissions from the transportation sector under four different IPCC scenarios (A1B, A2, B1, and B2). Due to the later implementation of advanced emission standards, Africa has the highest annual growth rate (1.2-3.1%) from 2010 to 2050. Superemitters begin producing more than 50% of global emissions around year 2020. We estimate uncertainties in the relationships between technological change and socioeconomic conditions and examine their impact on future emissions. Sensitivities to parameters governing retirement rates are highest, causing changes in global emissions from -26% to +55% on average from 2010 to 2050. We perform Monte Carlo simulations to examine how these uncertainties affect total emissions when each uncertain input parameter is represented by a probability distribution of values and all are varied simultaneously; the resulting 95% confidence interval of the global emission annual growth rate is -1.9% to +0.2% per year.

  5. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as on various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th degree spatial resolution, from two different downscaling procedures, are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e., precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as the future period of 2010-2099, simulated with representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. A historical comparison of long-term attributes of the GCMs and observations suggests a more accurate representation by the BMA than by individual models. Furthermore, the BMA projections are used to investigate future seasonal-to-annual changes of climate variables. Projections indicate significant increases in annual precipitation and temperature, with a varied degree of change across different sub-basins of the CRB. We then characterized the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the main source of uncertainty, among others. However, downscaling uncertainty contributes considerably to the total uncertainty of future projections, especially in summer. By contrast, downscaling uncertainty appears to be higher than scenario uncertainty for precipitation.
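
    A minimal sketch of the BMA weighting idea, assuming Gaussian error models and synthetic "observations" and "GCMs"; operational BMA implementations (e.g., EM-based weight estimation) are more involved.

        import numpy as np

        rng = np.random.default_rng(4)
        obs = rng.normal(10.0, 1.0, 30)                # toy observed climatology
        models = [obs + rng.normal(b, s, 30)           # three hypothetical GCMs
                  for b, s in [(0.2, 0.5), (1.5, 1.0), (-0.5, 0.8)]]

        # BMA weight ~ likelihood of the observations under each model's errors.
        loglik = []
        for m in models:
            err = obs - m
            sigma = err.std()
            loglik.append((-0.5 * (err / sigma) ** 2 - np.log(sigma)).sum())
        loglik = np.array(loglik)

        w = np.exp(loglik - loglik.max())
        w /= w.sum()
        bma_mean = sum(wi * m for wi, m in zip(w, models))   # weighted projection
        print("BMA weights:", w.round(3))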

  6. Emissions Benefits From Renewable Fuels and Other Alternatives for Heavy-Duty Vehicles

    NASA Astrophysics Data System (ADS)

    Hajbabaei, Maryam

    There is a global effort to expand the use of alternative fuels due to benefits such as improving air quality by reducing criteria emissions, reducing dependency on fossil fuels, and reducing greenhouse gases such as carbon dioxide. This dissertation investigates the impact of two popular alternative fuels, biodiesel and natural gas (NG), on emissions from heavy-duty engines. Biodiesel is one of the most popular renewable fuels for diesel applications. Although biodiesel blends are reported to reduce particulate matter, carbon monoxide, and total hydrocarbon emissions, there is uncertainty about their impact on nitrogen oxide (NOx) emissions. This dissertation evaluated the effect of biodiesel feedstock, biodiesel blend level, engine technology, and driving conditions on NOx emissions. The results showed that NOx emissions increase with 20% and higher biodiesel blends. This study also proposed strategies and identified fuel formulations for mitigating the NOx emission increases associated with biodiesel. The impact of 5% biodiesel on criteria emissions, specifically NOx, was also fully studied in this thesis. As a result of this work, 5% animal-based biodiesel was certified for use in California under the California Air Resources Board emissions-equivalence procedure. NG is one of the most prominent alternative fuels, with larger reserves than crude oil. However, the quality of NG depends on both its source and the degree to which it is processed. The current study explored the impact of various NG fuels, ranging from low-methane/high-energy gases to high-methane/low-energy gases, on criteria and toxic emissions from NG engines with different combustion and aftertreatment technologies. The results showed stronger fuel effects for the lean-burn technology bus. Finally, this thesis investigated the impact of changing diesel fuel composition on criteria emissions from a variety of heavy-duty engine technologies. Emissions from an average diesel fuel used throughout the U.S. were compared with those from a 10% aromatic, ultra-low-sulfur diesel fuel used in California, which has more stringent air quality regulations. The results showed that emerging aftertreatment technologies eventually eliminate the benefits of lower-aromatic-content/higher-cetane-number diesel fuels.

  7. Cloud fraction at the ARM SGP site: Reducing uncertainty with self-organizing maps

    DOE PAGES

    Kennedy, Aaron D.; Dong, Xiquan; Xi, Baike

    2015-02-15

    Instrument downtime leads to uncertainty in the monthly and annual record of cloud fraction (CF), making it difficult to perform time series analyses of cloud properties and detailed evaluations of model simulations. Because cloud occurrence is partially controlled by the large-scale atmospheric environment, this knowledge can be used to reduce uncertainties in the instrument record. Synoptic patterns diagnosed from the North American Regional Reanalysis (NARR) during the period 1997-2010 are classified using a competitive neural network known as the self-organizing map (SOM). The classified synoptic states are then compared to the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) instrument record to determine the expected CF. A number of SOMs are tested to understand how the number of classes and the period of classification affect the relationship between classified states and CFs. Bootstrapping is utilized to quantify the uncertainty of the instrument record when statistical information from the SOM is included. Although all SOMs significantly reduce the uncertainty of the CF record calculated in Kennedy et al. (Theor Appl Climatol 115:91-105, 2014), SOMs with a large number of classes, separated by month, are required to produce the lowest uncertainty and the best agreement with the annual cycle of CF. This result may reflect seasonally dependent biases in NARR.
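
    A minimal competitive-learning SOM sketch on synthetic pattern vectors, to show the classification step only; the study's SOMs operate on NARR synoptic fields, and dedicated SOM packages are normally used in practice.

        import numpy as np

        rng = np.random.default_rng(5)
        X = rng.normal(size=(2000, 4))        # synthetic "synoptic pattern" vectors

        # A tiny 1-D self-organizing map: k nodes, Gaussian neighbourhood.
        k, epochs = 6, 10
        W = rng.normal(size=(k, X.shape[1]))  # node weight vectors
        for epoch in range(epochs):
            lr = 0.5 * (1 - epoch / epochs)                     # decaying learning rate
            width = max(k / 2 * (1 - epoch / epochs), 0.5)      # shrinking neighbourhood
            for x in X:
                bmu = np.argmin(((W - x) ** 2).sum(axis=1))     # best-matching unit
                h = np.exp(-((np.arange(k) - bmu) ** 2) / (2 * width ** 2))
                W += lr * h[:, None] * (x - W)                  # pull nodes toward x

        # Each day is assigned to its best-matching node ("synoptic state");
        # the CF climatology of that state can then fill instrument gaps.
        states = np.array([np.argmin(((W - x) ** 2).sum(axis=1)) for x in X])
        print(np.bincount(states, minlength=k))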

  8. Safety envelope for load tolerance of structural element design based on multi-stage testing

    DOE PAGES

    Park, Chanyoung; Kim, Nam H.

    2016-09-06

    Structural elements, such as stiffened panels and lap joints, are basic components of aircraft structures. For aircraft structural design, designers select predesigned elements that satisfy the design load requirement based on their load-carrying capabilities. Estimating the safety envelope of structural elements for load tolerance would therefore be a good investment for design purposes. In this article, a method of estimating the safety envelope is presented using probabilistic classification, which can estimate a specific level of failure probability under both aleatory and epistemic uncertainties. An important contribution of this article is that the calculation uncertainty is reflected in building the safety envelope using a Gaussian process, and the effect of element test data on reducing the calculation uncertainty is incorporated by updating the Gaussian process model with the element test data. It is shown that even one element test can significantly reduce the calculation uncertainty due to a lack of knowledge of the actual physics, so that conservativeness in the safety envelope is significantly reduced. The proposed approach was demonstrated with a cantilever beam example, which represents a structural element. The example shows that calculation uncertainty provides about 93% conservativeness against the uncertainty due to a few element tests. As a result, even a single element test can increase the load tolerance modeled with the safety envelope by 20%.
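
    A hedged sketch of the Gaussian-process update step using scikit-learn: a GP fitted to a few invented calibration points is refitted after one additional "element test" observation, and its predictive standard deviation at the design load shrinks. All data values are placeholders.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        # Toy "model-vs-test discrepancy vs. normalized load" data.
        X = np.array([[0.2], [0.4], [0.6]])
        y = np.array([0.01, 0.03, 0.02])

        gp = GaussianProcessRegressor(kernel=RBF(0.3) + WhiteKernel(1e-4))
        gp.fit(X, y)
        _, s_before = gp.predict(np.array([[0.9]]), return_std=True)

        # Add a single element test near the design load and refit.
        gp.fit(np.vstack([X, [[0.9]]]), np.append(y, 0.025))
        _, s_after = gp.predict(np.array([[0.9]]), return_std=True)
        print(f"predictive std at design load: {s_before[0]:.4f} -> {s_after[0]:.4f}")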

  9. Quantifying the value of redundant measurements at GCOS Reference Upper-Air Network sites

    DOE PAGES

    Madonna, F.; Rosoldi, M.; Güldner, J.; ...

    2014-11-19

    The potential for measurement redundancy to reduce uncertainty in atmospheric variables has not been investigated comprehensively for climate observations. We evaluated the usefulness of entropy and mutual correlation concepts, as defined in information theory, for quantifying random uncertainty and redundancy in time series of integrated water vapour (IWV) and water vapour mixing ratio profiles provided by five highly instrumented GRUAN (GCOS, Global Climate Observing System, Reference Upper-Air Network) stations in 2010-2012. Results show that the random uncertainties in the IWV measured with radiosondes, global positioning system, microwave and infrared radiometers, and Raman lidar differed by less than 8%. Comparisons of IWV time series from ground-based remote sensing instruments with in situ soundings showed that microwave radiometers have the highest redundancy with the IWV time series measured by radiosondes and therefore the highest potential to reduce the random uncertainty of the radiosonde time series. Moreover, the random uncertainty of a time series from one instrument can be reduced by ~60% by constraining the measurements with those from another instrument. The best reduction of random uncertainty is achieved by conditioning Raman lidar measurements with microwave radiometer measurements. In conclusion, specific instruments are recommended for atmospheric water vapour measurements at GRUAN sites. This approach can be applied to the study of redundant measurements for other climate variables.
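
    A minimal sketch of the redundancy measure: a histogram estimate of mutual information between two synthetic IWV time series, one strongly and one weakly redundant with the reference series. The series and noise levels are invented.

        import numpy as np

        def mutual_information(x, y, bins=20):
            # Histogram estimate of I(X;Y) in nats.
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)
            nz = pxy > 0
            return (pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum()

        rng = np.random.default_rng(6)
        iwv_sonde = rng.gamma(4.0, 5.0, 5000)              # toy IWV series (mm)
        iwv_mwr = iwv_sonde + rng.normal(0, 1.0, 5000)     # highly redundant instrument
        iwv_noisy = iwv_sonde + rng.normal(0, 10.0, 5000)  # weakly redundant instrument

        print(f"MI(sonde, mwr)   = {mutual_information(iwv_sonde, iwv_mwr):.2f} nats")
        print(f"MI(sonde, noisy) = {mutual_information(iwv_sonde, iwv_noisy):.2f} nats")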

  10. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach, with the aim of improving sampling efficiency for multiple-metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII-based sampling is demonstrated against Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple-metrics performance, parameter uncertainty, and flood forecasting uncertainty in a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of ɛ-NSGAII-based sampling over LHS: (1) it performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is nine times shorter; (2) the Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII-based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets; (3) the parameter posterior distributions from ɛ-NSGAII-based sampling are concentrated in appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly; (4) the forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB), and average deviation amplitude (D). Flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII-based sampling. This study provides a new sampling approach to improve multiple-metrics uncertainty analysis under the GLUE framework, and it could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.

  11. Reducing equifinality using isotopes in a process-based stream nitrogen model highlights the flux of algal nitrogen from agricultural streams

    NASA Astrophysics Data System (ADS)

    Ford, William I.; Fox, James F.; Pollock, Erik

    2017-08-01

    The fate of bioavailable nitrogen species transported through agricultural landscapes remains highly uncertain given complexities of measuring fluxes impacting the fluvial N cycle. We present and test a new numerical model named Technology for Removable Annual Nitrogen in Streams For Ecosystem Restoration (TRANSFER), which aims to reduce model uncertainty due to erroneous parameterization, i.e., equifinality, in stream nitrogen cycle assessment and quantify the significance of transient and permanent removal pathways. TRANSFER couples nitrogen elemental and stable isotope mass-balance equations with existing hydrologic, hydraulic, sediment transport, algal biomass, and sediment organic matter mass-balance subroutines and a robust GLUE-like uncertainty analysis. We test the model in an agriculturally impacted, third-order stream reach located in the Bluegrass Region of Central Kentucky. Results of the multiobjective model evaluation for the model application highlight the ability of sediment nitrogen fingerprints including elemental concentrations and stable N isotope signatures to reduce equifinality of the stream N model. Advancements in the numerical simulations allow for illumination of the significance of algal sloughing fluxes for the first time in relation to denitrification. Broadly, model estimates suggest that denitrification is slightly greater than algal N sloughing (10.7% and 6.3% of dissolved N load on average), highlighting the potential for overestimation of denitrification by 37%. We highlight the significance of the transient N pool given the potential for the N store to be regenerated to the water column in downstream reaches, leading to harmful and nuisance algal bloom development.

  12. On the Directional Dependence and Null Space Freedom in Uncertainty Bound Identification

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    1997-01-01

    In previous work, the determination of uncertainty models via minimum norm model validation was based on a single set of input and output measurement data. Since uncertainty bounds at each frequency are directionally dependent for multivariable systems, this leads to optimistic uncertainty levels. In addition, the design freedom in the uncertainty model had not been utilized to further reduce uncertainty levels. These issues are addressed by formulating a min-max problem. An analytical solution to the min-max problem is given to within a generalized eigenvalue problem, thus avoiding a direct numerical approach. This result leads to less conservative and more realistic uncertainty models for use in robust control.

  13. Relatively certain! Comparative thinking reduces uncertainty.

    PubMed

    Mussweiler, Thomas; Posten, Ann-Christin

    2012-02-01

    Comparison is one of the most ubiquitous and versatile mechanisms in human information processing. Previous research demonstrates that one consequence of comparative thinking is increased judgmental efficiency: comparison allows for quicker judgments without a loss in accuracy. We hypothesised that a second potential consequence of comparative thinking is reduced judgmental uncertainty. We examined this possibility in three experiments using three different domains of judgment and three different measures of uncertainty. Results consistently demonstrate that procedurally priming participants to rely more heavily on comparative thinking during judgment induces them to feel more certain about their judgment. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Adaptive neural network motion control for aircraft under uncertainty conditions

    NASA Astrophysics Data System (ADS)

    Efremov, A. V.; Tiaglik, M. S.; Tiumentsev, Yu V.

    2018-02-01

    Motion control of modern and advanced aircraft must be provided under diverse uncertainty conditions. This problem can be solved by using adaptive control laws. We analyze the capabilities of these laws for adaptive systems such as MRAC (Model Reference Adaptive Control) and MPC (Model Predictive Control). In the case of a nonlinear control object, the most efficient solution to the adaptive control problem is the use of neural network technologies. These technologies are suitable for developing both a model of the control object and a control law for it. The approximate nature of the ANN model is taken into account by introducing additional compensating feedback into the control system. The capabilities of adaptive control laws under uncertainty in the source data are considered, and simulations are conducted to assess the contribution of adaptivity to the behavior of the system.

  15. Scientific rationality, uncertainty and the governance of human genetics: an interview study with researchers at deCODE genetics.

    PubMed

    Hjörleifsson, Stefán; Schei, Edvin

    2006-07-01

    Technology development in human genetics is fraught with uncertainty, controversy and unresolved moral issues, and industry scientists are sometimes accused of neglecting the implications of their work. The present study was carried out to elicit industry scientists' reflections on the relationship between commercial, scientific and ethical dimensions of present day genetics and the resources needed for robust governance of new technologies. Interviewing scientists of the company deCODE genetics in Iceland, we found that in spite of optimism, the informants revealed ambiguity and uncertainty concerning the use of human genetic technologies for the prevention of common diseases. They concurred that uncritical marketing of scientific success might cause exaggerated public expectations of health benefits from genetics, with the risk of backfiring and causing resistance to genetics in the population. On the other hand, the scientists did not address dilemmas arising from the commercial nature of their own employer. Although the scientists tended to describe public fear as irrational, they identified issues where scepticism might be well founded and explored examples where they, despite expert knowledge, held ambiguous or tentative personal views on the use of predictive genetic technologies. The rationality of science was not seen as sufficient to ensure beneficial governance of new technologies. The reflexivity and suspension of judgement demonstrated in the interviews exemplify productive features of moral deliberation in complex situations. Scientists should take part in dialogues concerning the governance of genetic technologies, acknowledge any vested interests, and use their expertise to highlight, not conceal the technical and moral complexity involved.

  16. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    NASA Astrophysics Data System (ADS)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Micro-devices are now commonly applied, especially in the automotive industry, taking advantage of having both the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices, based on the theory of reliability-based robust design optimization, that takes into consideration the performance of a micro-device and its reliability assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with a meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by sensitivity analysis to establish the design and uncertain domains. The genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function while simultaneously satisfying the constraint on material strength. The restriction on the maximum equivalent stresses was introduced through a conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.

  17. Active subspace uncertainty quantification for a polydomain ferroelectric phase-field model

    NASA Astrophysics Data System (ADS)

    Leon, Lider S.; Smith, Ralph C.; Miles, Paul; Oates, William S.

    2018-03-01

    Quantum-informed ferroelectric phase field models capable of predicting material behavior, are necessary for facilitating the development and production of many adaptive structures and intelligent systems. Uncertainty is present in these models, given the quantum scale at which calculations take place. A necessary analysis is to determine how the uncertainty in the response can be attributed to the uncertainty in the model inputs or parameters. A second analysis is to identify active subspaces within the original parameter space, which quantify directions in which the model response varies most dominantly, thus reducing sampling effort and computational cost. In this investigation, we identify an active subspace for a poly-domain ferroelectric phase-field model. Using the active variables as our independent variables, we then construct a surrogate model and perform Bayesian inference. Once we quantify the uncertainties in the active variables, we obtain uncertainties for the original parameters via an inverse mapping. The analysis provides insight into how active subspace methodologies can be used to reduce computational power needed to perform Bayesian inference on model parameters informed by experimental or simulated data.
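
    A minimal active-subspace sketch under the standard construction: estimate C = E[∇f ∇fᵀ] by Monte Carlo over the input distribution and take the leading eigenvectors. The toy function below stands in for the phase-field model.

        import numpy as np

        rng = np.random.default_rng(7)
        dim, M = 5, 2000

        # Toy model whose response varies along one dominant direction:
        # f(x) = sin(a·x), so grad f(x) = a * cos(a·x).
        a = np.array([1.0, 0.8, 0.1, 0.05, 0.02])
        grad = lambda x: a * np.cos(a @ x)

        # C = E[grad f grad f^T], estimated by Monte Carlo.
        X = rng.uniform(-1, 1, size=(M, dim))
        C = sum(np.outer(g, g) for g in map(grad, X)) / M

        eigval, eigvec = np.linalg.eigh(C)             # ascending eigenvalues
        print("eigenvalues:", eigval[::-1].round(4))
        print("active direction:", eigvec[:, -1].round(3))   # ~ a / ||a|| (up to sign)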

  18. Uncertainty quantification of crustal scale thermo-chemical properties in Southeast Australia

    NASA Astrophysics Data System (ADS)

    Mather, B.; Moresi, L. N.; Rayner, P. J.

    2017-12-01

    The thermo-chemical properties of the crust are essential to understanding the mechanical and thermal state of the lithosphere. The uncertainties associated with these parameters depend on the available geophysical observations and the a priori information used to constrain the objective function. Often, it is computationally efficient to reduce the parameter space by mapping large portions of the crust into lithologies that are assumed homogeneous. However, the boundaries of these lithologies are themselves uncertain and should also be included in the inverse problem. We assimilate geological uncertainties from an a priori geological model of Southeast Australia with geophysical uncertainties from S-wave tomography and 174 heat flow observations within an adjoint inversion framework. This reduces the computational cost of inverting high-dimensional probability spaces, compared to probabilistic inversion techniques that operate in the 'forward' mode, but at the sacrifice of uncertainty and covariance information. We overcome this restriction using a sensitivity analysis that perturbs our observations and a priori information within their probability distributions, to estimate the posterior uncertainty of thermo-chemical parameters in the crust.

  19. Proceedings of the Annual Technology Literacy Conference. (7th, Alexandria, Virginia, February 6-9, 1992).

    ERIC Educational Resources Information Center

    Cheek, Dennis, Ed.

    The following papers are included in these proceedings: "Weaving Technology and Human Affairs" (B. Hazeltine); "Positivist and Constructivist Understandings about Science and Their Implications for STS Teaching and Learning" (B. Reeves; C. Ney); "A Modular Conceptual Framework for Technology and Work" (D. Blandow); "A Time of Uncertainty: The…

  20. Reducing uncertainties for short lived cumulative fission product yields

    DOE PAGES

    Stave, Sean; Prinke, Amanda; Greenwood, Larry; ...

    2015-09-05

    Uncertainties associated with short-lived (half-lives less than 1 day) fission product yields listed in databases such as the National Nuclear Data Center's ENDF/B-VII are, for certain isotopes, large enough that new precision measurements can offer significant uncertainty reductions. A series of experiments has begun in which small samples of 235U are irradiated with a pulsed fission neutron spectrum at the Nevada National Security Site and placed between two broad-energy germanium detectors. The amount of each isotope present immediately following the irradiation can be determined given the total counts and the calibrated properties of the detector system. The uncertainty on the fission yields for multiple isotopes has been reduced by nearly an order of magnitude.

  1. Research of Uncertainty Reasoning in Pineapple Disease Identification System

    NASA Astrophysics Data System (ADS)

    Liu, Liqun; Fan, Haifeng

    In order to deal with the uncertainty of evidence that is common in pineapple disease identification systems, a reasoning model based on evidence credibility factors was established. The uncertainty reasoning method is discussed, including uncertain representation of knowledge, uncertain representation of rules, uncertain representation of multiple pieces of evidence, and updating of reasoning rules. The reasoning can fully reflect the uncertainty in disease identification and reduce the influence of subjective factors on the accuracy of the system.
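
    As one concrete instance of a credibility-factor scheme, the sketch below uses MYCIN-style certainty-factor combination; the paper's exact rules may differ, and the evidence values are invented.

        def combine_cf(cf1, cf2):
            # MYCIN-style parallel combination of two certainty factors in [-1, 1].
            if cf1 >= 0 and cf2 >= 0:
                return cf1 + cf2 * (1 - cf1)
            if cf1 < 0 and cf2 < 0:
                return cf1 + cf2 * (1 + cf1)
            return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

        # Two hypothetical pieces of evidence for one disease:
        # yellowing lesions (0.6) and concentric rings (0.5).
        print(combine_cf(0.6, 0.5))   # -> 0.8: belief grows, stays below certainty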

  2. JUPITER PROJECT - MERGING INVERSE PROBLEM FORMULATION TECHNOLOGIES

    EPA Science Inventory

    The JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) project seeks to enhance and build on the technology and momentum behind two of the most popular sensitivity analysis, data assessment, calibration, and uncertainty analysis programs used in envi...

  3. Analyzing the greenhouse gas impact potential of smallholder development actions across a global food security program

    NASA Astrophysics Data System (ADS)

    Grewer, Uwe; Nash, Julie; Gurwick, Noel; Bockel, Louis; Galford, Gillian; Richards, Meryl; Costa Junior, Ciniro; White, Julianna; Pirolli, Gillian; Wollenberg, Eva

    2018-04-01

    This article analyses the greenhouse gas (GHG) impact potential of improved management practices and technologies for smallholder agriculture promoted under a global food security development program. Under ‘business-as-usual’ development, global studies on the future of agriculture to 2050 project considerable increases in total food production and cultivated area. Conventional cropland intensification and conversion of natural vegetation typically result in increased GHG emissions and loss of carbon stocks. There is a strong need to understand the potential greenhouse gas impacts of agricultural development programs intended to achieve large-scale change, and to identify pathways of smallholder agricultural development that can achieve food security and agricultural production growth without drastic increases in GHG emissions. In an analysis of 134 crop and livestock production systems in 15 countries, with reported impacts on 4.8 million ha, improved management practices and technologies adopted by smallholder farmers significantly reduce the GHG emission intensity of agricultural production, increase yields, and reduce post-harvest losses, while either decreasing or only moderately increasing net GHG emissions per area. Investments in both the production and post-harvest stages meaningfully reduced GHG emission intensity, contributing to low-emission development. We present average impacts on net GHG emissions per hectare and GHG emission intensity, but do not provide detailed statistics of GHG impacts at scale, which are associated with additional uncertainties. While the reported improvements in smallholder systems effectively reduce future GHG emissions compared with business-as-usual development, these contributions are insufficient to significantly reduce net GHG emissions in agriculture beyond current levels, particularly if future agricultural production grows at projected rates.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Heng, E-mail: hengli@mdanderson.org; Zhu, X. Ronald; Zhang, Xiaodong

    Purpose: To develop and validate a novel delivery strategy for reducing the respiratory motion–induced dose uncertainty of spot-scanning proton therapy. Methods and Materials: The spot delivery sequence was optimized to reduce dose uncertainty. The effectiveness of the delivery sequence optimization was evaluated using measurements and patient simulation. One hundred ninety-one 2-dimensional measurements using different delivery sequences of a single-layer uniform pattern were obtained with a detector array on a 1-dimensional moving platform. Intensity modulated proton therapy plans were generated for 10 lung cancer patients, and dose uncertainties for different delivery sequences were evaluated by simulation. Results: Without delivery sequence optimization, the maximum absolute dose error can be up to 97.2% in a single measurement, whereas the optimized delivery sequence results in a maximum absolute dose error of ≤11.8%. In patient simulation, the optimized delivery sequence reduces the mean fractional maximum absolute dose error compared with the regular delivery sequence by 3.3% to 10.6% (32.5-68.0% relative reduction) for different patients. Conclusions: Optimizing the delivery sequence can reduce dose uncertainty due to respiratory motion in spot-scanning proton therapy, assuming the 4-dimensional CT is a true representation of the patients' breathing patterns.

  5. Transportation Energy Futures Series. Non-Cost Barriers to Consumer Adoption of New Light-Duty Vehicle Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, Thomas

    2013-03-01

    Consumer preferences are key to the adoption of new vehicle technologies. Barriers to consumer adoption include price and other obstacles, such as limited driving range and charging infrastructure; unfamiliarity with the technology and uncertainty about direct benefits; limited makes and models with the technology; reputation or perception of the technology; standardization issues; and regulations. For each of these non-cost barriers, this report estimates an effective cost, summarizes underlying influences on consumer preferences and their approximate magnitude and relative severity, and assesses potential actions, based on a comprehensive literature review. While the report concludes that non-cost barriers are significant, effective cost and potential market share are very uncertain. Policies and programs including opportunities for drivers to test drive advanced vehicles, general public outreach and information programs, incentives for providing charging and fueling infrastructure, and development of technology standards were examined for their ability to address barriers, but little quantitative data exist on the effectiveness of these measures. This is one in a series of reports produced as a result of the Transportation Energy Futures project, a Department of Energy-sponsored multi-agency effort to pinpoint underexplored strategies for reducing GHGs and petroleum dependence related to transportation. View all reports on the TEF Web page, http://www.eere.energy.gov/analysis/transportationenergyfutures/index.html.

  6. Electrical Stimulation for Pressure Injuries: A Health Technology Assessment.

    PubMed

    2017-01-01

    Pressure injuries (bedsores) are common and reduce quality of life. They are also costly and difficult to treat. This health technology assessment evaluates the effectiveness, cost-effectiveness, budget impact, and lived experience of adding electrical stimulation to standard wound care for pressure injuries. We conducted a systematic search for studies published to December 7, 2016, limited to randomized and non-randomized controlled trials examining the effectiveness of electrical stimulation plus standard wound care versus standard wound care alone for patients with pressure injuries. We assessed the quality of evidence through Grading of Recommendations Assessment, Development, and Evaluation (GRADE). In addition, we conducted an economic literature review and a budget impact analysis to assess the cost-effectiveness and affordability of electrical stimulation for treatment of pressure ulcers in Ontario. Given uncertainties in clinical evidence and resource use, we did not conduct a primary economic evaluation. Finally, we conducted qualitative interviews with patients and caregivers about their experiences with pressure injuries, currently available treatments, and (if applicable) electrical stimulation. Nine randomized controlled trials and two non-randomized controlled trials were found from the systematic search. There was no significant difference in complete pressure injury healing between adjunct electrical stimulation and standard wound care. There was a significant difference in wound surface area reduction favouring electrical stimulation compared with standard wound care. The only study on cost-effectiveness of electrical stimulation was partially applicable to the patient population of interest; therefore, the cost-effectiveness of electrical stimulation cannot be determined. We estimate that the cost of publicly funding electrical stimulation for pressure injuries would be $0.77 to $3.85 million yearly for the next 5 years. Patients and caregivers reported that pressure injuries were burdensome and reduced their quality of life. Patients and caregivers also noted that electrical stimulation seemed to reduce the time it took the wounds to heal. While electrical stimulation is safe to use (GRADE quality of evidence: high), there is uncertainty about whether it improves wound healing (GRADE quality of evidence: low). In Ontario, publicly funding electrical stimulation for pressure injuries could result in extra costs of $0.77 to $3.85 million yearly for the next 5 years.

  7. Risk assessment and risk management of mycotoxins.

    PubMed

    2012-01-01

    Risk assessment is the process of quantifying the magnitude and exposure, or probability, of a harmful effect to individuals or populations from certain agents or activities. Here, we summarize the four steps of risk assessment: hazard identification, dose-response assessment, exposure assessment, and risk characterization. Risk assessments using these principles have been conducted on the major mycotoxins (aflatoxins, fumonisins, ochratoxin A, deoxynivalenol, and zearalenone) by various regulatory agencies for the purpose of setting food safety guidelines. We critically evaluate the impact of these risk assessment parameters on the estimated global burden of the associated diseases as well as the impact of regulatory measures on food supply and international trade. Apart from the well-established risk posed by aflatoxins, many uncertainties still exist about risk assessments for the other major mycotoxins, often reflecting a lack of epidemiological data. Differences exist in the risk management strategies and in the ways different governments impose regulations and technologies to reduce levels of mycotoxins in the food-chain. Regulatory measures have very little impact on remote rural and subsistence farming communities in developing countries, in contrast to developed countries, where regulations are strictly enforced to reduce and/or remove mycotoxin contamination. However, in the absence of the relevant technologies or the necessary infrastructure, we highlight simple intervention practices to reduce mycotoxin contamination in the field and/or prevent mycotoxin formation during storage.
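
    The exposure-assessment and risk-characterization steps can be sketched as an estimated daily intake compared against a tolerable daily intake (TDI) to give a hazard quotient; the contamination level, consumption rate, body weight, and TDI below are illustrative assumptions, not regulatory values.

```python
def hazard_quotient(conc_ug_per_kg_food, intake_kg_per_day,
                    body_weight_kg, tdi_ug_per_kg_bw_day):
    """Estimated daily intake relative to a TDI; a quotient above 1
    flags potential concern for the exposed population."""
    edi = conc_ug_per_kg_food * intake_kg_per_day / body_weight_kg
    return edi / tdi_ug_per_kg_bw_day

# Hypothetical example: 500 ug/kg of a mycotoxin in maize, 0.3 kg/day
# consumption, 60 kg adult, TDI of 1 ug per kg body weight per day.
print(f"Hazard quotient: {hazard_quotient(500.0, 0.3, 60.0, 1.0):.2f}")
```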

  8. Earth-Science Research for Addressing the Water-Energy Nexus

    NASA Astrophysics Data System (ADS)

    Healy, R. W.; Alley, W. M.; Engle, M.; McMahon, P. B.; Bales, J. D.

    2013-12-01

    In the coming decades, the United States will face two significant and sometimes competing challenges: preserving sustainable supplies of fresh water for humans and ecosystems, and ensuring available sources of energy. This presentation provides an overview of the earth-science data collection and research needed to address these challenges. Uncertainty limits our understanding of many aspects of the water-energy nexus. These aspects include availability of water, water requirements for energy development, energy requirements for treating and delivering fresh water, effects of emerging energy development technologies on water quality and quantity, and effects of future climates and land use on water and energy needs. Uncertainties can be reduced with an integrated approach that includes assessments of water availability and energy resources; monitoring of surface water and groundwater quantity and quality, water use, and energy use; research on impacts of energy waste streams, hydraulic fracturing, and other fuel-extraction processes on water quality; and research on the viability and environmental footprint of new technologies such as carbon capture and sequestration and conversion of cellulosic material to ethanol. Planning for water and energy development requires consideration of factors such as economics, population trends, human health, and societal values; however, sound resource management must be grounded on a clear understanding of the earth-science aspects of the water-energy nexus. Information gained from an earth-science data-collection and research program can improve our understanding of water and energy issues and lay the groundwork for informed resource management.

  9. Uncertainty in Bioenergy Scenarios for California: Lessons Learned in Communicating with Different Stakeholder Groups

    NASA Astrophysics Data System (ADS)

    Youngs, H.

    2013-12-01

    Projecting future bioenergy use involves incorporating several critical inter-related parameters with high uncertainty. Among these are: technology adoption, infrastructure and capacity building, investment, political will, and public acceptance. How, when, where, and to what extent the various bioenergy options are implemented has profound effects on the environmental impacts incurred. California serves as an interesting case study for bioenergy implementation because it has very strong competing forces that can influence these critical factors. The state has aggressive greenhouse gas reduction goals, which will require some biofuels, and has invested accordingly in new technology. At the same time, political will and public acceptance of bioenergy have wavered, seriously stalling bioenergy expansion efforts. We have constructed scenarios for bioenergy implementation in California to 2050, in conjunction with efforts to reach AB32 GHG reduction goals of 80% below 1990 emissions. The state has the potential to produce 3 to 10 TJ of biofuels and electricity; however, this potential will be severely limited in some scenarios. This work examines sources of uncertainty in bioenergy implementation, how uncertainty is or is not incorporated into future bioenergy scenarios, and what this means for assessing environmental impacts. How uncertainty is communicated and perceived also affects future scenarios. Often, there is a disconnect between scenarios for widespread implementation and the actual development of individual projects, resulting in "artificial uncertainty" with very real impacts. Bringing stakeholders to the table is only the first step. Strategies to tailor and stage discussions of uncertainty to stakeholder groups are equally important. Lessons learned in the process of communicating California's Energy Future biofuels assessment will be discussed.

  10. Effects of relational uncertainty in heightening national identification and reactive approach motivation of Japanese.

    PubMed

    Terashima, Yuto; Takai, Jiro

    2017-03-23

    This study investigated whether relational uncertainty poses uncertainty threat, which causes compensatory behaviours among Japanese. We hypothesised that Japanese, as collectivists, would perceive relational uncertainty to pose uncertainty threat. In two experiments, we manipulated relational uncertainty and confirmed that participants exhibited compensatory reactions to reduce the aversive feelings caused by it. In Study 1, we conducted a direct comparison between relational uncertainty, independent self-uncertainty, and control conditions. The results revealed that participants who were instructed to imagine events pertaining to relational uncertainty showed greater national identification as compensation than did participants in the control condition, but independent self-uncertainty did not provoke such effects. In Study 2, we again manipulated relational uncertainty; however, we also manipulated participants' individualism-collectivism cultural orientation through priming, and the analyses yielded a significant interaction effect between these variables. Relational uncertainty evoked reactive approach motivation, a cause of compensatory behaviours, among participants primed with collectivism, but not among those primed with individualism. It was concluded that the effect of uncertainty on compensatory behaviour is influenced by cultural priming, and that relational uncertainty is important to Japanese. © 2017 International Union of Psychological Science.

  11. Cultured Construction: Global Evidence of the Impact of National Values on Piped-to-Premises Water Infrastructure Development.

    PubMed

    Kaminsky, Jessica A

    2016-07-19

    In 2016, the global community undertook the Sustainable Development Goals. One of these goals seeks to achieve universal and equitable access to safe and affordable drinking water for all people by the year 2030. In support of this undertaking, this paper seeks to discover the cultural work done by piped water infrastructure across 33 nations with developed and developing economies that have experienced change in the percentage of population served by piped-to-premises water infrastructure at the national level of analysis. To do so, I regressed the 1990-2012 change in piped-to-premises water infrastructure coverage against Hofstede's cultural dimensions, controlling for per capita GDP, the 1990 baseline level of coverage, percent urban population, overall 1990-2012 change in improved sanitation (all technologies), and per capita freshwater resources. Separate analyses were carried out for the urban, rural, and aggregate national contexts. Hofstede's dimensions provide a measure of cross-cultural difference; high or low scores are not in any way intended to represent better or worse but rather serve as a quantitative way to compare aggregate preferences for ways of being and doing. High scores in the cultural dimensions of Power Distance, Individualism-Collectivism, and Uncertainty Avoidance explain increased access to piped-to-premises water infrastructure in the rural context. Higher Power Distance and Uncertainty Avoidance scores are also statistically significant for increased coverage in the urban and national aggregate contexts. These results indicate that, as presently conceived, piped-to-premises water infrastructure fits best with spatial contexts that prefer hierarchy and centralized control. Furthermore, water infrastructure is understood to reduce uncertainty regarding the provision of individually valued benefits. The results of this analysis identify global trends that enable engineers and policy makers to design and manage more culturally appropriate and socially sustainable water infrastructure by better fitting technologies to user preferences.
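
    A minimal sketch of the regression design, assuming a hypothetical per-nation dataset; the file and column names are invented stand-ins for the paper's variables.

```python
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("water_infrastructure.csv")  # one row per nation (hypothetical)
y = df["piped_coverage_change_1990_2012"]
X = sm.add_constant(df[[
    "power_distance", "individualism", "uncertainty_avoidance",  # Hofstede
    "gdp_per_capita", "baseline_coverage_1990", "pct_urban",     # controls
    "sanitation_change_1990_2012", "freshwater_per_capita",
]])
print(sm.OLS(y, X).fit().summary())  # coverage change on culture + controls
```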

  12. Educational Technology Research in a VUCA World

    ERIC Educational Resources Information Center

    Reeves, Thomas C.; Reeves, Patricia M.

    2015-01-01

    The status of educational technology research in a VUCA world is examined. The acronym, VUCA, stands for "Volatility" (rapidly changing contexts and conditions), "Uncertainty" (information missing that is critical to problem solving), "Complexity" (multiple factors difficult to categorize or control), and…

  13. Averaging business cycles vs. myopia: Do we need a long term vision when developing IRP?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, C.; Gupta, P.C.

    1995-05-01

    Utility demand forecasting is inherently imprecise due to the number of uncertainties resulting from business cycles, policy making, technology breakthroughs, national and international political upheavals, and the limitations of forecasting tools. This implies that revisions based primarily on recent experience could lead to unstable forecasts. Moreover, new planning tools are required that provide an explicit consideration of uncertainty and lead to flexible and robust planning decisions.

  14. A framework to quantify uncertainties of seafloor backscatter from swath mapping echosounders

    NASA Astrophysics Data System (ADS)

    Malik, Mashkoor; Lurton, Xavier; Mayer, Larry

    2018-06-01

    Multibeam echosounders (MBES) have become a widely used acoustic remote sensing tool to map and study the seafloor, providing co-located bathymetry and seafloor backscatter. Although the uncertainty associated with MBES-derived bathymetric data has been studied extensively, the question of backscatter uncertainty has been addressed only minimally, which hinders the quantitative use of MBES seafloor backscatter. This paper explores approaches to identifying uncertainty sources associated with MBES-derived backscatter measurements. The major sources of uncertainty are catalogued and the magnitudes of their relative contributions to the backscatter uncertainty budget are evaluated. These major uncertainty sources include seafloor insonified area (1-3 dB), absorption coefficient (up to > 6 dB), random fluctuations in echo level (5.5 dB for a Rayleigh distribution), and sonar calibration (device dependent). The magnitudes of these uncertainty sources vary based on how these effects are compensated for during data acquisition and processing. Various cases (no compensation, partial compensation, and full compensation) for seafloor insonified area, transmission losses, and random fluctuations were modeled to estimate their uncertainties in different scenarios. Uncertainty related to the seafloor insonified area can be reduced significantly by accounting for seafloor slope during backscatter processing, while transmission losses can be constrained by collecting full water column absorption coefficient profiles (temperature and salinity profiles). To reduce random fluctuations to below 1 dB, averaging at least 20 samples is recommended when computing mean values. The estimation of uncertainty in backscatter measurements is constrained by the fact that not all instrumental components are characterized and documented sufficiently for commercially available MBES. Further involvement from manufacturers in providing this essential information is critically required.
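
    The recommendation on random fluctuations can be checked numerically: a Rayleigh-distributed echo amplitude implies an exponentially distributed intensity, a single sample of which fluctuates by roughly 5.6 dB, while averaging about 20 intensity samples before converting to decibels brings the spread below 1 dB. A minimal Monte Carlo sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
N_TRIALS = 100_000

def backscatter_db_std(n_samples):
    """Std dev (dB) of the backscatter level when n_samples intensity
    samples (Rayleigh amplitude = exponential intensity) are averaged
    before conversion to decibels."""
    intensity = rng.exponential(scale=1.0, size=(N_TRIALS, n_samples))
    return (10.0 * np.log10(intensity.mean(axis=1))).std()

print(f"1 sample:   {backscatter_db_std(1):.2f} dB")   # ~5.6 dB
print(f"20 samples: {backscatter_db_std(20):.2f} dB")  # ~1 dB
```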

  15. A taxonomy of medical uncertainties in clinical genome sequencing.

    PubMed

    Han, Paul K J; Umstead, Kendall L; Bernhardt, Barbara A; Green, Robert C; Joffe, Steven; Koenig, Barbara; Krantz, Ian; Waterston, Leo B; Biesecker, Leslie G; Biesecker, Barbara B

    2017-08-01

    Clinical next-generation sequencing (CNGS) is introducing new opportunities and challenges into the practice of medicine. Simultaneously, these technologies are generating uncertainties of an unprecedented scale that laboratories, clinicians, and patients are required to address and manage. We describe in this report the conceptual design of a new taxonomy of uncertainties around the use of CNGS in health care. Interviews to delineate the dimensions of uncertainty in CNGS were conducted with genomics experts, and themes were extracted in order to expand on a previously published three-dimensional taxonomy of medical uncertainty. In parallel, we developed an interactive website to disseminate the CNGS taxonomy to researchers and engage them in its continued refinement. The proposed taxonomy divides uncertainty along three axes (source, issue, and locus) and further discriminates the uncertainties into five layers with multiple domains. Using a hypothetical clinical example, we illustrate how the taxonomy can be applied to findings from CNGS and used to guide stakeholders through interpretation and implementation of variant results. The utility of the proposed taxonomy lies in promoting consistency in describing dimensions of uncertainty in publications and presentations, to facilitate research design and management of the uncertainties inherent in the implementation of CNGS. Genet Med advance online publication, 19 January 2017.

  16. A Taxonomy of Medical Uncertainties in Clinical Genome Sequencing

    PubMed Central

    Han, Paul K. J.; Umstead, Kendall L.; Bernhardt, Barbara A.; Green, Robert C.; Joffe, Steven; Koenig, Barbara; Krantz, Ian; Waterston, Leo B.; Biesecker, Leslie G.; Biesecker, Barbara B.

    2017-01-01

    Purpose Clinical next generation sequencing (CNGS) is introducing new opportunities and challenges into the practice of medicine. Simultaneously, these technologies are generating uncertainties of unprecedented scale that laboratories, clinicians, and patients are required to address and manage. We describe in this report the conceptual design of a new taxonomy of uncertainties around the use of CNGS in health care. Methods Interviews to delineate the dimensions of uncertainty in CNGS were conducted with genomics experts, and themes were extracted in order to expand upon a previously published three-dimensional taxonomy of medical uncertainty. In parallel we developed an interactive website to disseminate the CNGS taxonomy to researchers and engage them in its continued refinement. Results The proposed taxonomy divides uncertainty along three axes: source, issue, and locus, and further discriminates the uncertainties into five layers with multiple domains. Using a hypothetical clinical example, we illustrate how the taxonomy can be applied to findings from CNGS and used to guide stakeholders through interpretation and implementation of variant results. Conclusion The utility of the proposed taxonomy lies in promoting consistency in describing dimensions of uncertainty in publications and presentations, to facilitate research design and management of the uncertainties inherent in the implementation of CNGS. PMID:28102863

  17. Quantifying the uncertainties of China's emission inventory for industrial sources: From national to provincial and city scales

    NASA Astrophysics Data System (ADS)

    Zhao, Yu; Zhou, Yaduan; Qiu, Liping; Zhang, Jie

    2017-09-01

    A comprehensive uncertainty analysis was conducted on emission inventories for industrial sources at national (China), provincial (Jiangsu), and city (Nanjing) scales for 2012. Based on various methods and data sources, Monte-Carlo simulation was applied at sector level for the national inventory, and at plant level (whenever possible) for the provincial and city inventories. The uncertainties of the national inventory were estimated at -17-37% (expressed as 95% confidence intervals, CIs), -21-35%, -19-34%, -29-40%, -22-47%, -21-54%, -33-84%, and -32-92% for SO2, NOX, CO, TSP (total suspended particles), PM10, PM2.5, black carbon (BC), and organic carbon (OC) emissions respectively for the whole country. At provincial and city levels, the uncertainties of corresponding pollutant emissions were estimated at -15-18%, -18-33%, -16-37%, -20-30%, -23-45%, -26-50%, -33-79%, and -33-71% for Jiangsu, and -17-22%, -10-33%, -23-75%, -19-36%, -23-41%, -28-48%, -45-82%, and -34-96% for Nanjing, respectively. Emission factors (or associated parameters) were identified as the biggest contributors to the uncertainties of emissions for most source categories except iron & steel production in the national inventory. Compared to the national inventory, uncertainties of total emissions in the provincial and city-scale inventories were not significantly reduced for most species, with the exception of SO2. For power and other industrial boilers, the uncertainties were reduced, and plant-specific parameters played more important roles in the uncertainties. Much larger PM10 and PM2.5 emissions for Jiangsu were estimated in this provincial inventory than in other studies, implying large discrepancies in the data sources of emission factors and activity data between local and national inventories. Although the uncertainty analysis of bottom-up emission inventories at national and local scales partly supported the 'top-down' estimates using observation and/or chemistry transport models, detailed investigations and field measurements are recommended for further improving the emission estimates and reducing the uncertainty of inventories at local and regional scales, for both industrial and other sectors.
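
    A minimal sketch of the Monte Carlo approach for one hypothetical sector, with emissions propagated as activity times emission factor times (1 minus removal efficiency), and the 95% confidence interval expressed, as in the abstract, as percentage deviations from the central estimate; the distribution shapes and parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

activity = rng.normal(1.0e6, 0.05e6, size=n)               # t coal burned
emission_factor = rng.lognormal(np.log(8.0), 0.3, size=n)   # kg SO2 per t
removal_eff = rng.uniform(0.80, 0.95, size=n)               # FGD efficiency

emissions = activity * emission_factor * (1.0 - removal_eff) / 1000.0  # t SO2
central = np.median(emissions)
lo, hi = np.percentile(emissions, [2.5, 97.5])
print(f"median: {central:,.0f} t SO2")
print(f"95% CI: {100 * (lo / central - 1):.0f}% to +{100 * (hi / central - 1):.0f}%")
```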

  18. Perceptions of risk from nanotechnologies and trust in stakeholders: a cross sectional study of public, academic, government and business attitudes.

    PubMed

    Capon, Adam; Gillespie, James; Rolfe, Margaret; Smith, Wayne

    2015-04-26

    Policy makers and regulators are constantly required to make decisions despite the existence of substantial uncertainty regarding the outcomes of their proposed decisions. Understanding stakeholder views is an essential part of addressing this uncertainty, which provides insight into the possible social reactions and tolerance of unpredictable risks. In the field of nanotechnology, large uncertainties exist regarding the real and perceived risks this technology may have on society. Better evidence is needed to confront this issue. We undertook a computer assisted telephone interviewing (CATI) survey of the Australian public and a parallel survey of those involved in nanotechnology from the academic, business and government sectors. Analysis included comparisons of proportions and logistic regression techniques. We explored perceptions of nanotechnology risks both to health and in a range of products. We examined views on four trust actors. The general public's perception of risk was significantly higher than that expressed by other stakeholders. The public bestows less trust in certain trust actors than do academics or government officers, giving its greatest trust to scientists. Higher levels of public trust were generally associated with lower perceptions of risk. Nanotechnology in food and cosmetics/sunscreens were considered riskier applications irrespective of stakeholder, while familiarity with nanotechnology was associated with a reduced risk perception. Policy makers should consider the disparities in risk and trust perceptions between the public and influential stakeholders, placing greater emphasis on risk communication and the uncertainties of risk assessment in these areas of higher concern. Scientists being the highest trusted group are well placed to communicate the risks of nanotechnologies to the public.

  19. Value based pricing, research and development, and patient access schemes. Will the United Kingdom get it right or wrong?

    PubMed Central

    Towse, Adrian

    2010-01-01

    The National Health Service (NHS) should reward innovation it values. This will enable the NHS and the United Kingdom (UK) economy to benefit and impact positively on the Research and Development (R&D) decision making of companies. The National Institute for Health and Clinical Excellence (NICE) currently seeks to do this on behalf of the NHS. Yet the Office of Fair Trading proposals for Value Based Pricing add price setting powers – initially for the Department of Health (DH) and then for NICE. This introduces an additional substantial uncertainty that will impact on R&D and, conditional on R&D proceeding, on launch (or not) in the UK. Instead of adding to uncertainty the institutional arrangements for assessing value should seek to be predictable and science based, building on NICE's current arrangements. The real challenge is to increase understanding of the underlying cost-effectiveness of the technology itself by collecting evidence alongside use. The 2009 Pharmaceutical Price Regulation Scheme sought to help do this with Flexible Pricing (FP) and Patient Access Schemes (PASs). The PASs to date have increased access to medicines, but no schemes proposed to date have yet helped to tackle outcomes uncertainty. The 2010 Innovation Pass can also be seen as a form of ‘coverage with evidence development.’ The NHS is understandably concerned about the costs of running such evidence collection schemes. Enabling the NHS to deliver on such schemes will impact favourably on R&D decisions. Increasing the uncertainty in the UK NHS market through government price setting will reduce incentives for R&D and for early UK launch. PMID:20716236

  20. Value based pricing, research and development, and patient access schemes. Will the United Kingdom get it right or wrong?

    PubMed

    Towse, Adrian

    2010-09-01

    The National Health Service (NHS) should reward innovation it values. This will enable the NHS and the United Kingdom (UK) economy to benefit and impact positively on the Research and Development (R&D) decision making of companies. The National Institute for Health and Clinical Excellence (NICE) currently seeks to do this on behalf of the NHS. Yet the Office of Fair Trading proposals for Value Based Pricing add price setting powers--initially for the Department of Health (DH) and then for NICE. This introduces an additional substantial uncertainty that will impact on R&D and, conditional on R&D proceeding, on launch (or not) in the UK. Instead of adding to uncertainty the institutional arrangements for assessing value should seek to be predictable and science based, building on NICE's current arrangements. The real challenge is to increase understanding of the underlying cost-effectiveness of the technology itself by collecting evidence alongside use. The 2009 Pharmaceutical Price Regulation Scheme sought to help do this with Flexible Pricing (FP) and Patient Access Schemes (PASs). The PASs to date have increased access to medicines, but no schemes proposed to date have yet helped to tackle outcomes uncertainty. The 2010 Innovation Pass can also be seen as a form of 'coverage with evidence development.' The NHS is understandably concerned about the costs of running such evidence collection schemes. Enabling the NHS to deliver on such schemes will impact favourably on R&D decisions. Increasing the uncertainty in the UK NHS market through government price setting will reduce incentives for R&D and for early UK launch.

  1. Socializing Identity Through Practice: A Mixed Methods Approach to Family Medicine Resident Perspectives on Uncertainty.

    PubMed

    Ledford, Christy J W; Cafferty, Lauren A; Seehusen, Dean A

    2015-01-01

    Uncertainty is a central theme in the practice of medicine and particularly primary care. This study explored how family medicine resident physicians react to uncertainty in their practice. The study incorporated a two-phase mixed methods approach, including semi-structured personal interviews (n=21) and longitudinal self-report surveys (n=21) with family medicine residents. Qualitative analysis showed that though residents described uncertainty as an implicit part of their identity, they still developed tactics to minimize or manage uncertainty in their practice. Residents described increasing comfort with uncertainty the longer they practiced and anticipated that this growth would continue throughout their careers. Quantitative surveys showed that reactions to uncertainty were more positive over time; however, the difference was not statistically significant. Qualitative and quantitative results show that as family medicine residents practice medicine, their perception of uncertainty changes. To reduce uncertainty, residents use relational information-seeking strategies. From a broader view of practice, residents describe uncertainty neutrally, asserting that uncertainty is simply part of the practice of family medicine.

  2. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. LSST was conceived in the 1990s; the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about the future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on the reuse potential of software and on enhancing the replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  3. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, covering what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
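
    A minimal sketch of this kind of Monte Carlo propagation, assuming the Mach number is derived from total and static pressure via the isentropic relation; the pressures, uncertainties, and distributions are invented, and the actual facility analysis involves many more inputs and their correlations.

```python
import numpy as np

GAMMA = 1.4
rng = np.random.default_rng(1)
n = 200_000

def mach_from_pressures(p0, p):
    """Isentropic Mach number from total (p0) and static (p) pressure."""
    return np.sqrt((2.0 / (GAMMA - 1.0)) *
                   ((p0 / p) ** ((GAMMA - 1.0) / GAMMA) - 1.0))

# Hypothetical measurements (psia) with Gaussian 1-sigma uncertainties.
p0 = rng.normal(24.0, 0.05, size=n)
p = rng.normal(3.4, 0.02, size=n)

mach = mach_from_pressures(p0, p)
print(f"Mach = {mach.mean():.4f} +/- {1.96 * mach.std():.4f} (95%)")
```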

  4. The Development of a Diagnostic-Prescriptive Tool for Undergraduates Seeking Information for a Social Science/Humanities Assignment. III. Enabling Devices.

    ERIC Educational Resources Information Center

    Cole, Charles; Cantero, Pablo; Ungar, Andras

    2000-01-01

    This article focuses on a study of undergraduates writing an essay for a remedial writing course that tested two devices, an uncertainty expansion device and an uncertainty reduction device. Highlights include Kuhlthau's information search process model, and enabling technology devices for the information needs of information retrieval system…

  5. Annual Report: Carbon Capture Simulation Initiative (CCSI) (30 September 2012)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David C.; Syamlal, Madhava; Cottrell, Roger

    2012-09-30

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that is developing and deploying state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models, with uncertainty quantification (UQ), optimization, risk analysis and decision making capabilities. The CCSI Toolset incorporates commercial and open-source software currently in use by industry and is also developing new software tools as necessary to fill technology gaps identified during execution of the project. Ultimately, the CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. CCSI is organized into 8 technical elements that fall under two focus areas. The first focus area (Physicochemical Models and Data) addresses the steps necessary to model and simulate the various technologies and processes needed to bring a new Carbon Capture and Storage (CCS) technology into production. The second focus area (Analysis & Software) is developing the software infrastructure to integrate the various components and implement the tools that are needed to make quantifiable decisions regarding the viability of new CCS technologies. CCSI also has an Industry Advisory Board (IAB). By working closely with industry from the inception of the project to identify industrial challenge problems, CCSI ensures that the simulation tools are developed for the carbon capture technologies of most relevance to industry. CCSI is led by the National Energy Technology Laboratory (NETL) and leverages the Department of Energy (DOE) national laboratories' core strengths in modeling and simulation, bringing together the best capabilities at NETL, Los Alamos National Laboratory (LANL), Lawrence Berkeley National Laboratory (LBNL), Lawrence Livermore National Laboratory (LLNL), and Pacific Northwest National Laboratory (PNNL). The CCSI's industrial partners provide representation from the power generation industry, equipment manufacturers, technology providers and engineering and construction firms. The CCSI's academic participants (Carnegie Mellon University, Princeton University, West Virginia University, and Boston University) bring unparalleled expertise in multiphase flow reactors, combustion, process synthesis and optimization, planning and scheduling, and process control techniques for energy processes. During Fiscal Year (FY) 12, CCSI released its first set of computational tools and models. This pre-release, a year ahead of the originally planned first release, is the result of intense industry interest in getting early access to the tools and the phenomenal progress of the CCSI technical team.
    These initial components of the CCSI Toolset provide new models and computational capabilities that will accelerate the commercial development of carbon capture technologies as well as related technologies, such as those found in the power, refining, chemicals, and gas production industries. The release consists of new tools for process synthesis and optimization to help identify promising concepts more quickly, new physics-based models of potential capture equipment and processes that will reduce the time to design and troubleshoot new systems, a framework to quantify the uncertainty of model predictions, and various enabling tools that provide new capabilities such as creating reduced order models (ROMs) from reacting multiphase flow simulations and running thousands of process simulations concurrently for optimization and UQ.

  6. Application of stochastic multiattribute analysis to assessment of single walled carbon nanotube synthesis processes.

    PubMed

    Canis, Laure; Linkov, Igor; Seager, Thomas P

    2010-11-15

    The unprecedented uncertainty associated with engineered nanomaterials greatly expands the need for research regarding their potential environmental consequences. However, decision-makers such as regulatory agencies, product developers, or other nanotechnology stakeholders may not find the results of such research directly informative of decisions intended to mitigate environmental risks. To help interpret research findings and prioritize new research needs, there is an acute need for structured decision-analytic aids that are operable in a context of extraordinary uncertainty. Whereas existing stochastic decision-analytic techniques explore uncertainty only in decision-maker preference information, this paper extends model uncertainty to technology performance. As an illustrative example, the framework is applied to the case of single-wall carbon nanotubes. Four different synthesis processes (arc, high pressure carbon monoxide, chemical vapor deposition, and laser) are compared based on five salient performance criteria. A probabilistic rank ordering of preferred processes is determined using outranking normalization and a linear-weighted sum for different weighting scenarios including completely unknown weights and four fixed-weight sets representing hypothetical stakeholder views. No single process pathway dominates under all weight scenarios, but it is likely that some inferior process technologies could be identified as low priorities for further research.
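
    A minimal sketch of the completely-unknown-weights scenario: weights are sampled uniformly from the simplex, alternatives are scored with a linear-weighted sum, and the probability that each process ranks first is tallied. The normalized score matrix is invented for illustration, and the outranking normalization step is omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

processes = ["arc", "HiPco", "CVD", "laser"]
# Rows: processes; columns: five performance criteria, higher is better.
scores = np.array([
    [0.6, 0.4, 0.7, 0.5, 0.3],
    [0.5, 0.7, 0.6, 0.6, 0.6],
    [0.7, 0.6, 0.4, 0.7, 0.5],
    [0.4, 0.5, 0.5, 0.4, 0.8],
])

n = 100_000
weights = rng.dirichlet(np.ones(scores.shape[1]), size=n)  # uniform on simplex
totals = weights @ scores.T                                # linear-weighted sums
first = np.bincount(totals.argmax(axis=1), minlength=len(processes))
for name, count in zip(processes, first):
    print(f"P({name} ranked first | unknown weights) = {count / n:.3f}")
```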

  7. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts on pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Uncertainty after treatment for prostate cancer: definition, assessment, and management.

    PubMed

    Yu Ko, Wellam F; Degner, Lesley F

    2008-10-01

    Prostate cancer is the second most common type of cancer in men living in the United States and the most common type of malignancy in Canadian men, accounting for 186,320 new cases in the United States and 24,700 in Canada in 2008. Uncertainty, a component of all illness experiences, influences how men perceive the processes of treatment and adaptation. The Reconceptualized Uncertainty in Illness Theory explains the chronic nature of uncertainty in cancer survivorship by describing a shift from an emergent acute phase of uncertainty in survivors to a new level of uncertainty that is no longer acute and becomes a part of daily life. Proper assessment of certainty and uncertainty may allow nurses to maximize the effectiveness of patient-provider communication, cognitive reframing, and problem-solving interventions to reduce uncertainty after cancer treatment.

  9. Honesty-humility under threat: Self-uncertainty destroys trust among the nice guys.

    PubMed

    Pfattheicher, Stefan; Böhm, Robert

    2018-01-01

    Recent research on humans' prosociality has highlighted the crucial role of Honesty-Humility, a basic trait in the HEXACO personality model. There is overwhelming evidence that Honesty-Humility predicts prosocial behavior across a vast variety of situations. In the present contribution, we cloud this rosy picture, examining a condition under which individuals high in Honesty-Humility reduce prosocial behavior. Specifically, we propose that under self-uncertainty, it is particularly those individuals high in Honesty-Humility who reduce trust in unknown others and become less prosocial. In 5 studies, we assessed Honesty-Humility, manipulated self-uncertainty, and measured interpersonal trust or trust in social institutions using behavioral or questionnaire measures. In Study 1, individuals high (vs. low) in Honesty-Humility showed higher levels of trust. This relation was mediated by their positive social expectations about the trustworthiness of others. Inducing self-uncertainty decreased trust, particularly in individuals high in Honesty-Humility (Studies 2-5). Making use of measuring the mediator (Studies 2 and 3) and applying a causal chain design (Studies 4a and 4b), it is shown that individuals high in Honesty-Humility reduced trust because self-uncertainty decreased positive social expectations about others. We end with an applied perspective, showing that Honesty-Humility is predictive of trust in social institutions (e.g., trust in the police; Study 5a), and that self-uncertainty undermined trust in the police especially for individuals high in Honesty-Humility (Study 5b). By these means, the present research shows that individuals high in Honesty-Humility are not unconditionally prosocial. Further implications for Honesty-Humility as well as for research on self-uncertainty and trust are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  10. An uncertainty analysis of the hydrogen source term for a station blackout accident in Sequoyah using MELCOR 1.8.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Bixler, Nathan E.; Wagner, Kenneth Charles

    2014-03-01

    A methodology for using the MELCOR code with the Latin Hypercube Sampling method was developed to estimate uncertainty in various predicted quantities such as hydrogen generation or release of fission products under severe accident conditions. In this case, the emphasis was on estimating the range of hydrogen sources in station blackout conditions in the Sequoyah Ice Condenser plant, taking into account uncertainties in the modeled physics known to affect hydrogen generation. The method uses user-specified likelihood distributions for uncertain model parameters, which may include uncertainties of a stochastic nature, to produce a collection of code calculations, or realizations, characterizing the range of possible outcomes. Forty MELCOR code realizations of Sequoyah were conducted that included 10 uncertain parameters, producing a range of in-vessel hydrogen quantities. The range of total hydrogen produced was approximately 583 kg ± 131 kg. Sensitivity analyses revealed expected trends with respect to the parameters of greatest importance; however, considerable scatter was observed when results were plotted against any single uncertain parameter, with no parameter manifesting dominant effects on hydrogen generation. It is concluded that, with respect to the physics parameters investigated, in order to further reduce predicted hydrogen uncertainty, it would be necessary to reduce all physics parameter uncertainties similarly, bearing in mind that some parameters are inherently uncertain within a range. It is suspected that some residual uncertainty, associated with modeling complex, coupled, and synergistic phenomena, is an inherent aspect of complex systems and cannot be reduced to point-value estimates. Probabilistic analyses such as the one demonstrated in this work are important to properly characterize the response of complex systems such as severe accident progression in nuclear power plants.
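
    A minimal sketch of the Latin Hypercube Sampling step, with hypothetical parameter names and ranges and a toy algebraic response standing in for a full MELCOR realization:

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical uncertain inputs (names and ranges are illustrative only):
# oxidation-rate multiplier, debris porosity, fuel failure temperature (K).
lower = np.array([0.5, 0.25, 2200.0])
upper = np.array([2.0, 0.55, 2700.0])

sampler = qmc.LatinHypercube(d=3, seed=0)
samples = qmc.scale(sampler.random(n=40), lower, upper)  # 40 realizations

def surrogate_h2_kg(x):
    """Toy response standing in for a MELCOR run."""
    return 450.0 + 120.0 * x[0] + 300.0 * x[1] - 0.05 * (x[2] - 2400.0)

h2 = np.apply_along_axis(surrogate_h2_kg, 1, samples)
print(f"H2 produced: {h2.mean():.0f} kg +/- {h2.std():.0f} kg (1 sigma)")
```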

  11. HIT or miss: the application of health care information technology to managing uncertainty in clinical decision making.

    PubMed

    Kazandjian, Vahé A; Lipitz-Snyderman, Allison

    2011-12-01

    To discuss the usefulness of health care information technology (HIT) in assisting care providers to minimize uncertainty while simultaneously increasing the efficiency of the care provided. The design of this study is an ongoing examination of HIT, performance measurement (clinical and production efficiency), and their implications for the payment for care. Since 2006, all Maryland hospitals have embarked on a multi-faceted study of performance measures and HIT adoption surveys, which will shape the health care payment model in Maryland, the last of the all-payor states, in 2011. This paper focuses on the HIT component of the Maryland care payment initiative. While the payment model is still under review and discussion, 'appropriateness' of care has been discussed as an important dimension of measurement. Within this dimension, the 'uncertainty' concept has been identified as associated with variation in care practices. Hence, the methods of this paper define how HIT can assist care providers in addressing the concept of uncertainty, and then provide findings from the first HIT survey in Maryland to infer the readiness of Maryland hospitals to address uncertainty of care, in part through the use of HIT. Maryland hospitals show noteworthy variation in their adoption and use of HIT. While computerized, electronic patient records are not commonly used among and across Maryland hospitals, many of the uses of HIT internally in each hospital could significantly assist in communicating better practices to minimize uncertainty of care and enhance the efficiency of its production. © 2010 Blackwell Publishing Ltd.

  12. Steering vaccinomics innovations with anticipatory governance and participatory foresight.

    PubMed

    Ozdemir, Vural; Faraj, Samer A; Knoppers, Bartha M

    2011-09-01

    Vaccinomics is the convergence of vaccinology and population-based omics sciences. The success of knowledge-based innovations such as vaccinomics is not only contingent on access to new biotechnologies. It also requires new ways of governance of science, knowledge production, and management. This article presents a conceptual analysis of the anticipatory and adaptive approaches that are crucial for the responsible design and sustainable transition of vaccinomics to public health practice. Anticipatory governance is a new approach to manage the uncertainties embedded on an innovation trajectory with participatory foresight, in order to devise governance instruments for collective "steering" of science and technology. As a contrast to hitherto narrowly framed "downstream impact assessments" for emerging technologies, anticipatory governance adopts a broader and interventionist approach that recognizes the social construction of technology design and innovation. It includes in its process explicit mechanisms to understand the factors upstream to the innovation trajectory such as deliberation and cocultivation of the aims, motives, funding, design, and direction of science and technology, both by experts and publics. This upstream shift from a consumer "product uptake" focus to "participatory technology design" on the innovation trajectory is an appropriately radical and necessary departure in the field of technology assessment, especially given that considerable public funds are dedicated to innovations. Recent examples of demands by research funding agencies to anticipate the broad impacts of proposed research--at a very upstream stage at the time of research funding application--suggest that anticipatory governance with foresight may be one way in which postgenomics scientific practice might transform in the future toward responsible innovation. Moreover, the present context of knowledge production in vaccinomics is such that policy making for vaccines of the 21st century is occurring in the face of uncertainties where the "facts are uncertain, values in dispute, stakes high and decisions urgent and where no single one of these dimensions can be managed in isolation from the rest." This article concludes, however, that uncertainty is not an accident of the scientific method, but its very substance. Anticipatory governance with participatory foresight offers a mechanism to respond to such inherent sociotechnical uncertainties in the emerging field of vaccinomics by making the coproduction of scientific knowledge by technology and the social systems explicit. Ultimately, this serves to integrate scientific and social knowledge, thereby steering innovations to coproduce results and outputs that are socially robust and context sensitive.

  13. Application of Risk Management and Uncertainty Concepts and Methods for Ecosystem Restoration: Principles and Best Practice

    DTIC Science & Technology

    2012-08-01

    habitats for specific species of trout. The report noted that these uncertainties — and the SMEs, who had past experience in such topic areas — were...reduce uncertainty in HREP projects is reflected in the completion of the Pool 11 Islands (UMRS RM 583-593) HREP in 2003. In 1989 the Browns Lake

  14. Managing Uncertainty during a Corporate Acquisition: A Longitudinal Study of Communication During an Airline Acquisition

    ERIC Educational Resources Information Center

    Kramer, Michael W.; Dougherty, Debbie S.; Pierce, Tamyra A.

    2004-01-01

    This study examined pilots' (N at T1 = 140; N at T2 = 126; N at T3 = 104) reactions to communication and uncertainty during the acquisition of their airline by another airline. Quantitative results indicate that communication helped to reduce uncertainty and was predictive of affective responses to the acquisition. However, contrary to…

  15. Introducing nonpoint source transferable quotas in nitrogen trading: The effects of transaction costs and uncertainty.

    PubMed

    Zhou, Xiuru; Ye, Weili; Zhang, Bing

    2016-03-01

    Transaction costs and uncertainty are considered to be significant obstacles in the emissions trading market, especially for including nonpoint sources in water quality trading. This study develops a nonlinear programming model to simulate how uncertainty and transaction costs affect the performance of point/nonpoint source (PS/NPS) water quality trading in the Lake Tai watershed, China. The results demonstrate that PS/NPS water quality trading is a highly cost-effective instrument for emissions abatement in the Lake Tai watershed, which can save 89.33% on pollution abatement costs compared to trading only between nonpoint sources. However, uncertainty can significantly reduce the cost-effectiveness by reducing trading volume. In addition, transaction costs from bargaining and decision making raise total pollution abatement costs directly and cause the offset system to deviate from the optimal state, while proper investment in monitoring and measuring of nonpoint emissions can decrease uncertainty and save on total abatement costs. Finally, we show that the dispersed ownership of China's farmland will bring high uncertainty and transaction costs into the PS/NPS offset system, even if the pollution abatement cost is lower than for point sources. Copyright © 2015 Elsevier Ltd. All rights reserved.
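
    A minimal sketch of a trading model in this spirit: one point source and one nonpoint source with quadratic abatement costs, a watershed cap, a trading ratio that discounts uncertain nonpoint reductions, and a per-unit transaction cost on nonpoint credits. All parameters are invented.

```python
import numpy as np
from scipy.optimize import minimize

c = np.array([40.0, 10.0])            # cost_i = c_i * abatement_i**2
baseline = np.array([100.0, 200.0])   # unabated loads (t/yr): PS, NPS
required = baseline.sum() - 240.0     # reduction needed to meet the cap
trading_ratio = 2.0                   # NPS credits discounted 2:1 for uncertainty
transaction_cost = 5.0                # $ per tonne of NPS credit used

def total_cost(a):
    # Abatement costs plus transaction costs on the discounted NPS credits.
    return c[0] * a[0]**2 + c[1] * a[1]**2 + transaction_cost * a[1] / trading_ratio

cons = [{"type": "ineq",  # PS abatement + discounted NPS credits meet the cap
         "fun": lambda a: a[0] + a[1] / trading_ratio - required}]
res = minimize(total_cost, x0=[30.0, 30.0],
               bounds=[(0, baseline[0]), (0, baseline[1])], constraints=cons)
print(f"PS abatement {res.x[0]:.1f} t, NPS abatement {res.x[1]:.1f} t, "
      f"total cost ${res.fun:,.0f}")
```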

  16. Maximum warming occurs about one decade after a carbon dioxide emission

    NASA Astrophysics Data System (ADS)

    Ricke, Katharine L.; Caldeira, Ken

    2014-12-01

    It is known that carbon dioxide emissions cause the Earth to warm, but no previous study has focused on examining how long it takes to reach maximum warming following a particular CO2 emission. Using conjoined results of carbon-cycle and physical-climate model intercomparison projects (Taylor et al 2012, Joos et al 2013), we find the median time between an emission and maximum warming is 10.1 years, with a 90% probability range of 6.6-30.7 years. We evaluate uncertainties in timing and amount of warming, partitioning them into three contributing factors: carbon cycle, climate sensitivity and ocean thermal inertia. If uncertainty in any one factor is reduced to zero without reducing uncertainty in the other factors, the majority of overall uncertainty remains. Thus, narrowing uncertainty in century-scale warming depends on narrowing uncertainty in all contributing factors. Our results indicate that benefit from avoided climate damage from avoided CO2 emissions will be manifested within the lifetimes of people who acted to avoid that emission. While such avoidance could be expected to benefit future generations, there is potential for emissions avoidance to provide substantial benefit to current generations.
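
    The partitioning result lends itself to a toy Monte Carlo illustration. The response function and lognormal spreads below are invented for illustration only (they are not the paper's carbon-cycle or climate models); the point is that removing any single factor's uncertainty still leaves most of the overall spread intact.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    carbon  = rng.lognormal(0.0, 0.25, n)   # carbon-cycle factor
    sens    = rng.lognormal(0.0, 0.25, n)   # climate-sensitivity factor
    inertia = rng.lognormal(0.0, 0.25, n)   # ocean thermal-inertia factor

    def warming(c, s, i):
        return c * s / i                    # toy multiplicative response

    full = np.var(np.log(warming(carbon, sens, inertia)))
    for name, args in [("carbon cycle", (1.0, sens, inertia)),
                       ("sensitivity ", (carbon, 1.0, inertia)),
                       ("inertia     ", (carbon, sens, 1.0))]:
        frac = np.var(np.log(warming(*args))) / full
        print(f"fix {name}: {frac:.0%} of log-variance remains")   # ~67% each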

  17. Solar thermal technologies benefits assessment: Objectives, methodologies and results for 1981

    NASA Technical Reports Server (NTRS)

    Gates, W. R.

    1982-01-01

    The economic and social benefits of developing cost competitive solar thermal technologies (STT) were assessed. The analysis was restricted to STT in electric applications for 16 high insolation/high energy price states. Three fuel price scenarios and three 1990 STT system costs were considered, reflecting uncertainty over fuel prices and STT cost projections. After considering the numerous benefits of introducing STT into the energy market, three primary benefits were identified and evaluated: (1) direct energy cost savings were estimated to range from zero to $50 billion; (2) oil imports may be reduced by up to 9 percent, improving national security; and (3) significant environmental benefits can be realized in air basins where electric power plant emissions create substantial air pollution problems. STT research and development was found to be unacceptably risky for private industry in the absence of federal support. The normal risks associated with investments in research and development are accentuated because the OPEC cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources.

  18. Evaluation of heterotrophic plate and chromogenic agar colony counting in water quality laboratories.

    PubMed

    Hallas, Gary; Monis, Paul

    2015-01-01

    The enumeration of bacteria using plate-based counts is a core technique used by food and water microbiology testing laboratories. However, manual counting of bacterial colonies is both time and labour intensive, can vary between operators and also requires manual entry of results into laboratory information management systems (LIMS), which can be a source of data entry error. An alternative is to use automated digital colony counters, but there is a lack of peer-reviewed validation data to allow incorporation into standards. We compared the performance of digital counting technology (ProtoCOL3) against manual counting using criteria defined in internationally recognized standard methods. Digital colony counting provided a robust, standardized system suitable for adoption in a commercial testing environment. The digital technology has several advantages:
    • Improved measurement of uncertainty by using a standard and consistent counting methodology with less operator error.
    • Efficiency for labour and time (reduced cost).
    • Elimination of manual entry of data onto LIMS.
    • Faster result reporting to customers.

  19. The prefabricated building risk decision research of DM technology on the basis of Rough Set

    NASA Astrophysics Data System (ADS)

    Guo, Z. L.; Zhang, W. B.; Ma, L. H.

    2017-08-01

    With growing resource crises and increasingly serious pollution, green building has been strongly advocated by most countries and has become a new building style in the construction field. Compared with traditional building, prefabricated building has its own irreplaceable advantages but is influenced by many uncertainties. So far, most scholars worldwide have approached the topic through qualitative research. This paper expounds the significance of prefabricated building and, building on existing research methods combined with rough set theory, redefines the factors that affect prefabricated building risk. Moreover, it quantifies the risk factors and establishes an expert knowledge base through expert assessment. Redundant attributes and attribute values are then reduced to form the simplest decision rules. These rules, based on the DM technology of rough set theory, provide prefabricated building with a controllable new decision-making method.
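
    As a concrete illustration of the attribute-reduction step, the sketch below finds minimal attribute subsets (reducts) that keep a toy expert decision table consistent. The risk factors, ratings and decisions are hypothetical, and this shows only the core idea of rough-set reduction, not the paper's procedure.

    from itertools import combinations

    # rows: (cost_risk, schedule_risk, quality_risk) -> decision
    table = [
        (("high", "low",  "low"),  "accept"),
        (("high", "high", "low"),  "reject"),
        (("low",  "high", "high"), "reject"),
        (("low",  "low",  "high"), "accept"),
    ]
    attrs = range(3)

    def consistent(subset):
        """True if projecting onto `subset` never maps one condition to two decisions."""
        seen = {}
        for cond, dec in table:
            key = tuple(cond[i] for i in subset)
            if seen.setdefault(key, dec) != dec:
                return False
        return True

    reducts = []
    for k in range(1, len(attrs) + 1):
        for subset in combinations(attrs, k):
            if consistent(subset):
                reducts.append(subset)
        if reducts:          # keep only the minimal-size reducts
            break
    print("minimal reducts:", reducts)   # here: [(1,)] -- schedule risk alone decides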

  20. Solar thermal technologies benefits assessment: Objectives, methodologies and results for 1981

    NASA Astrophysics Data System (ADS)

    Gates, W. R.

    1982-07-01

    The economic and social benefits of developing cost competitive solar thermal technologies (STT) were assessed. The analysis was restricted to STT in electric applications for 16 high insolation/high energy price states. Three fuel price scenarios and three 1990 STT system costs were considered, reflecting uncertainty over fuel prices and STT cost projections. After considering the numerous benefits of introducing STT into the energy market, three primary benefits were identified and evaluated: (1) direct energy cost savings were estimated to range from zero to $50 billion; (2) oil imports may be reduced by up to 9 percent, improving national security; and (3) significant environmental benefits can be realized in air basins where electric power plant emissions create substantial air pollution problems. STT research and development was found to be unacceptably risky for private industry in the absence of federal support. The normal risks associated with investments in research and development are accentuated because the OPEC cartel can artificially manipulate oil prices and undercut the growth of alternative energy sources.

  1. Preliminary noise tradeoff study of a Mach 2.7 cruise aircraft

    NASA Technical Reports Server (NTRS)

    Mascitti, V. R.; Maglieri, D. J. (Editor); Raney, J. P. (Editor)

    1979-01-01

    NASA computer codes in the areas of preliminary sizing and enroute performance, takeoff and landing performance, aircraft noise prediction, and economics were used in a preliminary noise tradeoff study for a Mach 2.7 design supersonic cruise concept. Aerodynamic configuration data were based on wind-tunnel model tests and related analyses. Aircraft structural characteristics and weight were based on advanced structural design methodologies, assuming conventional titanium technology. The most advanced noise prediction techniques available were used, and aircraft operating costs were estimated using accepted industry methods. The four engine cycles included in the study were based on assumed 1985 technology levels. Propulsion data were provided by aircraft manufacturers. Additional empirical data are needed to define both the noise reduction features and other operating characteristics of all engine cycles under study. Data on VCE design parameters, coannular nozzle inverted flow noise reduction and advanced mechanical suppressors are urgently needed to reduce the present uncertainties in studies of this type.

  2. Maximum Entropy/Optimal Projection (MEOP) control design synthesis: Optimal quantification of the major design tradeoffs

    NASA Technical Reports Server (NTRS)

    Hyland, D. C.; Bernstein, D. S.

    1987-01-01

    The underlying philosophy and motivation of the optimal projection/maximum entropy (OP/ME) stochastic modeling and reduced control design methodology for high order systems with parameter uncertainties are discussed. The OP/ME design equations for reduced-order dynamic compensation including the effect of parameter uncertainties are reviewed. The application of the methodology to several Large Space Structures (LSS) problems of representative complexity is illustrated.

  3. Use of Atmospheric Budget to Reduce Uncertainty in Estimated Water Availability over South Asia from Different Reanalyses

    NASA Astrophysics Data System (ADS)

    Sebastian, Dawn Emil; Pathak, Amey; Ghosh, Subimal

    2016-07-01

    Disagreements across different reanalyses over South Asia result in uncertainty in the assessment of water availability, which is computed as the difference between Precipitation and Evapotranspiration (P-E). Here, we compute P-E directly from the atmospheric budget with the divergence of moisture flux for different reanalyses and find improved correlation with observed values of P-E, acquired from station and satellite data. We also find reduced closure terms for the water cycle computed with the atmospheric budget, analysed over the South Asian landmass, when compared to that obtained with individual values of P and E. The P-E value derived from the atmospheric budget is also more consistent with the energy budget, when top-of-atmosphere radiation is used. For analysing the water cycle, we use runoff from the Global Land Data Assimilation System and water storage from the Gravity Recovery and Climate Experiment. We find improved agreement across different reanalyses, in terms of inter-annual cross correlation, when the atmospheric budget is used to estimate P-E, and hence recommend this approach for estimating water availability in South Asia with reduced uncertainty. Our results on water availability with reduced uncertainty over highly populated, monsoon-driven South Asia will be useful for water management and agricultural decision making.
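
    The atmospheric-budget estimate of P-E used here rests on the vertically integrated moisture balance, which in standard form reads (generic notation, not copied from the paper: P precipitation, E evapotranspiration, W column precipitable water, Q vertically integrated horizontal moisture flux, q specific humidity, v horizontal wind, p_s surface pressure, g gravitational acceleration):

    \[
      P - E \;=\; -\,\nabla \cdot \mathbf{Q} \;-\; \frac{\partial W}{\partial t},
      \qquad
      \mathbf{Q} \;=\; \frac{1}{g}\int_{0}^{p_s} q\,\mathbf{v}\,\mathrm{d}p .
    \]

    Estimating P-E from the flux divergence on the right-hand side, rather than differencing separately analysed P and E fields, is what reduces the closure error discussed in the abstract.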

  4. Use of Atmospheric Budget to Reduce Uncertainty in Estimated Water Availability over South Asia from Different Reanalyses.

    PubMed

    Sebastian, Dawn Emil; Pathak, Amey; Ghosh, Subimal

    2016-07-08

    Disagreements across different reanalyses over South Asia result in uncertainty in the assessment of water availability, which is computed as the difference between Precipitation and Evapotranspiration (P-E). Here, we compute P-E directly from the atmospheric budget with the divergence of moisture flux for different reanalyses and find improved correlation with observed values of P-E, acquired from station and satellite data. We also find reduced closure terms for the water cycle computed with the atmospheric budget, analysed over the South Asian landmass, when compared to that obtained with individual values of P and E. The P-E value derived from the atmospheric budget is also more consistent with the energy budget, when top-of-atmosphere radiation is used. For analysing the water cycle, we use runoff from the Global Land Data Assimilation System and water storage from the Gravity Recovery and Climate Experiment. We find improved agreement across different reanalyses, in terms of inter-annual cross correlation, when the atmospheric budget is used to estimate P-E, and hence recommend this approach for estimating water availability in South Asia with reduced uncertainty. Our results on water availability with reduced uncertainty over highly populated, monsoon-driven South Asia will be useful for water management and agricultural decision making.

  5. Use of Atmospheric Budget to Reduce Uncertainty in Estimated Water Availability over South Asia from Different Reanalyses

    PubMed Central

    Sebastian, Dawn Emil; Pathak, Amey; Ghosh, Subimal

    2016-01-01

    Disagreements across different reanalyses over South Asia result in uncertainty in the assessment of water availability, which is computed as the difference between Precipitation and Evapotranspiration (P–E). Here, we compute P–E directly from the atmospheric budget with the divergence of moisture flux for different reanalyses and find improved correlation with observed values of P–E, acquired from station and satellite data. We also find reduced closure terms for the water cycle computed with the atmospheric budget, analysed over the South Asian landmass, when compared to that obtained with individual values of P and E. The P–E value derived from the atmospheric budget is also more consistent with the energy budget, when top-of-atmosphere radiation is used. For analysing the water cycle, we use runoff from the Global Land Data Assimilation System and water storage from the Gravity Recovery and Climate Experiment. We find improved agreement across different reanalyses, in terms of inter-annual cross correlation, when the atmospheric budget is used to estimate P–E, and hence recommend this approach for estimating water availability in South Asia with reduced uncertainty. Our results on water availability with reduced uncertainty over highly populated, monsoon-driven South Asia will be useful for water management and agricultural decision making. PMID:27388837

  6. Incorporating Land-Use Mapping Uncertainty in Remote Sensing Based Calibration of Land-Use Change Models

    NASA Astrophysics Data System (ADS)

    Cockx, K.; Van de Voorde, T.; Canters, F.; Poelmans, L.; Uljee, I.; Engelen, G.; de Jong, K.; Karssenberg, D.; van der Kwast, J.

    2013-05-01

    Building urban growth models typically involves a process of historic calibration based on time series of land-use maps, usually obtained from satellite imagery. Both the remote sensing data analysis to infer land use and the subsequent modelling of land-use change are subject to uncertainties, which may have an impact on the accuracy of future land-use predictions. Our research aims to quantify and reduce these uncertainties by means of a particle filter data assimilation approach that incorporates uncertainty in land-use mapping and land-use model parameter assessment into the calibration process. This paper focuses on part of this work, in particular the modelling of uncertainties associated with the impervious surface cover estimation and urban land-use classification adopted in the land-use mapping approach. Both stages are subjected to a Monte Carlo simulation to assess their relative contribution to, and their combined impact on, the uncertainty in the derived land-use maps. The approach was applied to the central part of the Flanders region (Belgium), using a time series of Landsat/SPOT-HRV data covering the years 1987, 1996, 2005 and 2012. Although the most likely land-use map obtained from the simulation is very similar to the original classification, it is shown that the errors related to the impervious surface sub-pixel fraction estimation have a strong impact on the land-use map's uncertainty. Hence, incorporating uncertainty in the land-use change model calibration through particle filter data assimilation is proposed to address the uncertainty observed in the derived land-use maps and to reduce uncertainty in future land-use predictions.
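
    The Monte Carlo treatment of mapping uncertainty can be sketched in a few lines: perturb the sub-pixel impervious fractions with an assumed error model, re-classify, and record how often each pixel's label flips. Everything below (error magnitude, class thresholds, grid) is hypothetical and stands in for the Landsat/SPOT-HRV processing chain.

    import numpy as np

    rng = np.random.default_rng(42)
    frac = rng.uniform(0.0, 1.0, size=(100, 100))   # "estimated" impervious fraction

    def classify(f):
        # hypothetical rule: non-urban / low-density urban / dense urban
        return np.digitize(f, bins=[0.3, 0.7])

    base = classify(frac)
    agree = np.zeros(frac.shape)
    n_runs = 500
    for _ in range(n_runs):
        noisy = np.clip(frac + rng.normal(0.0, 0.1, frac.shape), 0.0, 1.0)
        agree += classify(noisy) == base
    stability = agree / n_runs                       # 1.0 = label never flips
    print("share of pixels stable in >95% of runs:", (stability > 0.95).mean())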

  7. The efficiency of asset management strategies to reduce urban flood risk.

    PubMed

    ten Veldhuis, J A E; Clemens, F H L R

    2011-01-01

    In this study, three asset management strategies were compared with respect to their efficiency in reducing flood risk. Data from call centres at two municipalities were used to quantify urban flood risks associated with three causes of urban flooding: gully pot blockage, sewer pipe blockage and sewer overloading. The efficiency of three flood reduction strategies was assessed based on their effect on the causes contributing to flood risk. The sensitivity of the results to uncertainty in the data source, citizens' calls, was analysed through incorporation of uncertainty ranges taken from the customer complaint literature. Based on the available data it could be shown that increasing the frequency of gully pot cleaning is the most efficient action to reduce flood risk, given data uncertainty. If differences between cause incidences are large, as in the presented case study, call data are sufficient to decide how flood risk can be most efficiently reduced. According to the results of this analysis, enlargement of sewer pipes is not an efficient strategy to reduce flood risk, because the flood risk associated with sewer overloading is small compared to other failure mechanisms.

  8. The assessment and appraisal of regenerative medicines and cell therapy products: an exploration of methods for review, economic evaluation and appraisal.

    PubMed

    Hettle, Robert; Corbett, Mark; Hinde, Sebastian; Hodgson, Robert; Jones-Diette, Julie; Woolacott, Nerys; Palmer, Stephen

    2017-02-01

    The National Institute for Health and Care Excellence (NICE) commissioned a 'mock technology appraisal' to assess whether changes to its methods and processes are needed. This report presents the findings of independent research commissioned to inform this appraisal and the deliberations of a panel convened by NICE to evaluate the mock appraisal. Our research included reviews to identify issues, analysis methods, conceptual differences and the relevance of alternative decision frameworks, alongside the development of an exemplar case study of chimeric antigen receptor (CAR) T-cell therapy for treating acute lymphoblastic leukaemia. An assessment of previous evaluations of regenerative medicines found that, although there were a number of evidential challenges, none was unique to regenerative medicines or beyond the scope of existing methods used to conceptualise decision uncertainty. Regarding the clinical evidence for regenerative medicines, the issues were those associated with a limited evidence base but were not unique to regenerative medicines: small non-randomised studies, high variation in response and interventions subject to continuing development. The relative treatment effects generated from single-arm trials are likely to be optimistic unless it is certain that the historical data have accurately estimated the efficacy of the control agent. Pivotal trials may use surrogate end points, which, on average, overestimate treatment effects. To reduce overall uncertainty, multivariate meta-analysis of all available data should be considered. Incorporating indirectly relevant but more reliable (more mature) data into the analysis can also be considered; such data may become available as a result of the evolving regulatory pathways being developed by the European Medicines Agency. For the exemplar case of CAR T-cell therapy, target product profiles (TPPs) were developed, which considered the 'curative' and 'bridging to stem-cell transplantation' treatment approaches separately. Within each TPP, three 'hypothetical' evidence sets (minimum, intermediate and mature) were generated to simulate the impact of alternative levels of precision and maturity in the clinical evidence. Subsequent assessments of cost-effectiveness were undertaken, employing the existing NICE reference case alongside additional analyses suggested within alternative frameworks. The additional exploratory analyses were undertaken to demonstrate how assessments of cost-effectiveness and uncertainty could be affected by alternative managed entry agreements (MEAs), including price discounts, performance-related schemes and technology leasing. The panel deliberated on the range of TPPs, evidence sets and MEAs, commenting on the likely recommendations for each scenario. The panel discussed the challenges associated with the exemplar and regenerative medicines more broadly, focusing on the need for a robust quantification of the level of uncertainty in the cost-effectiveness estimates and the potential value of MEAs in limiting the exposure of the NHS to high upfront costs and the loss associated with a wrong decision. It is to be expected that there will be a significant level of uncertainty in determining the clinical effectiveness of regenerative medicines and their long-term costs and benefits, but the existing methods available to estimate the implications of this uncertainty are sufficient. The use of risk sharing and MEAs between the NHS and manufacturers of regenerative medicines should be investigated further. This study was funded by the National Institute for Health Research Health Technology Assessment programme.

  9. Twenty-first century approaches to toxicity testing, biomonitoring, and risk assessment: perspectives from the global chemical industry.

    PubMed

    Phillips, Richard D; Bahadori, Tina; Barry, Brenda E; Bus, James S; Gant, Timothy W; Mostowy, Janet M; Smith, Claudia; Willuhn, Marc; Zimmer, Ulrike

    2009-09-01

    The International Council of Chemical Associations' Long-Range Research Initiative (ICCA-LRI) sponsored a workshop, titled Twenty-First Century Approaches to Toxicity Testing, Biomonitoring, and Risk Assessment, on 16 and 17 June 2008 in Amsterdam, The Netherlands. The workshop focused on interpretation of data from the new technologies for toxicity testing and biomonitoring, and on understanding the relevance of the new data for assessment of human health risks. Workshop participants articulated their concerns that scientific approaches for interpreting and understanding the emerging data in a biologically relevant context lag behind the rapid advancements in the new technologies. Research will be needed to mitigate these lags and to develop approaches for communicating the information, even in a context of uncertainty. A collaborative, coordinated, and sustained research effort is necessary to modernize risk assessment and to significantly reduce current reliance on animal testing. In essence, this workshop was a call to action to bring together the intellectual and financial resources necessary to harness the potential of these new technologies towards improved public health decision making. Without investment in the science of interpretation, it will be difficult to realize the potential that the advanced technologies offer to modernize toxicity testing, exposure science, and risk assessment.

  10. The Uncertainty Principle in the Presence of Quantum Memory

    NASA Astrophysics Data System (ADS)

    Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato

    2010-03-01

    One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
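
    The relation described above is commonly written as follows (stated here from the published literature on this result; the notation is standard rather than quoted from the abstract). For measurements X and Z on system A, with quantum memory B, S(·|B) the conditional von Neumann entropy, and c the maximal overlap of the measurement bases:

    \[
      S(X \mid B) + S(Z \mid B) \;\ge\; \log_2 \frac{1}{c} + S(A \mid B),
      \qquad
      c \;=\; \max_{x,z} \bigl|\langle \psi_x \mid \phi_z \rangle\bigr|^{2} .
    \]

    For sufficiently entangled states S(A|B) is negative, which weakens or removes the lower bound, exactly the breakdown of the original uncertainty principle that the abstract describes.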

  11. Developing an Online Framework for Publication of Uncertainty Information in Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Etienne, E.; Piasecki, M.

    2012-12-01

    Inaccuracies in data collection and parameter estimation, and imperfections in model structure, make the predictions of hydrological models uncertain. Finding a way to communicate the uncertainty information in a model output is important for decision-making. This work aims to publish uncertainty information (computed by a project partner at Penn State) associated with hydrological predictions on catchments. To this end we have developed a DB schema (derived from the CUAHSI ODM design) which is focused on storing uncertainty information and its associated metadata. The technologies used to build the system are: OGC's Sensor Observation Service (SOS) for publication, the UncertML markup language (also developed by the OGC) to describe uncertainty information, and the Interoperability and Automated Mapping (INTAMAP) Web Processing Service (WPS), which handles part of the statistics computations. We also develop a DRUPAL-based service that provides users with the capability to exploit the full functionality of the system. Users will be able to request and visualize uncertainty data, and also publish their data in the system.

  12. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study

    PubMed Central

    Dogra, Nisha; Giordano, James; France, Nicholas

    2007-01-01

    Background There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g. – cultural factors of illness expression, value bias in diagnoses, etc) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what the attitudes toward this theme and topic were, and how these attitudes and responses reflect and inform this area of medical pedagogy. Methods A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. Results There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a perceived need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards a factual emphasis rather than reflection or a patient-centred approach. Conclusion Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls than about improving the patient experience or the patient-physician relationship. There may be pressure to imbue cultural diversity issues with levels of objectivity and certainty representative of other aspects of the medical curriculum (e.g. – biochemistry). This may reflect a particular selection bias for students with a technocentric orientation. Inadvertently, medical education may enhance this bias through training effects, accommodating disregard for subjectivity and over-reliance upon technology, and thereby fostering incorrect assumptions of objective certainty. We opine that it is important to teach students that technology cannot guarantee certainty, and that dealing with subjectivity, diversity, ambiguity and uncertainty is inseparable from the personal dimension of medicine as a moral enterprise. Uncertainty is inherent in cultural diversity, so this part of the curriculum provides an opportunity to address the issue as it relates to patient care. PMID:17462089

  13. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study.

    PubMed

    Dogra, Nisha; Giordano, James; France, Nicholas

    2007-04-26

    There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g.--cultural factors of illness expression, value bias in diagnoses, etc) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what the attitudes toward this theme and topic were, and how these attitudes and responses reflect and inform this area of medical pedagogy. A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a perceived need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards a factual emphasis rather than reflection or a patient-centred approach. Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls than about improving the patient experience or the patient-physician relationship. There may be pressure to imbue cultural diversity issues with levels of objectivity and certainty representative of other aspects of the medical curriculum (e.g.--biochemistry). This may reflect a particular selection bias for students with a technocentric orientation. Inadvertently, medical education may enhance this bias through training effects, accommodating disregard for subjectivity and over-reliance upon technology, and thereby fostering incorrect assumptions of objective certainty. We opine that it is important to teach students that technology cannot guarantee certainty, and that dealing with subjectivity, diversity, ambiguity and uncertainty is inseparable from the personal dimension of medicine as a moral enterprise. Uncertainty is inherent in cultural diversity, so this part of the curriculum provides an opportunity to address the issue as it relates to patient care.

  14. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    PubMed

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S_NH) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S_NO) uncertainty, increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ_A) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η_g (anoxic growth rate correction factor) and η_h (anoxic hydrolysis rate correction factor), becomes less important when a S_NO controller manipulating an external carbon source addition is implemented.
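
    The SRC method itself is compact enough to sketch: fit a linear model to the Monte Carlo input/output samples and scale each coefficient by std(x_i)/std(y). The toy linear "model" below stands in for BSM1, and the parameter names are only suggestive.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 2000
    X = rng.normal(size=(n, 3))                 # e.g. samples of mu_A, eta_g, eta_h
    y = 3.0 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] + 0.5 * rng.normal(size=n)

    coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), X]), y, rcond=None)
    src = coef[1:] * X.std(axis=0) / y.std()    # standardized regression coefficients
    print("SRCs:", np.round(src, 2))            # largest |SRC| = most influential input
    print("sum of SRC^2:", np.round((src**2).sum(), 2))  # ~1 when the model is near-linear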

  15. 78 FR 42991 - Self-Regulatory Organizations; the Depository Trust Company; Order Approving Proposed Rule Change...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... intraday uncertainty that may arise from reclaim transactions and any potential credit and liquidity risk... receiving Participant prior to DTC processing, thereby reducing the intraday uncertainty that may arise from...

  16. Probabilistic simulation of multi-scale composite behavior

    NASA Technical Reports Server (NTRS)

    Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.

    1993-01-01

    A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented in the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data on composite material properties at all scales fall within the scatter predicted by PICAN.

  17. A method for reducing the largest relative errors in Monte Carlo iterated-fission-source calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunter, J. L.; Sutton, T. M.

    2013-07-01

    In Monte Carlo iterated-fission-source calculations, relative uncertainties on local tallies tend to be larger in lower-power regions and smaller in higher-power regions. Reducing the largest uncertainties to an acceptable level simply by running a larger number of neutron histories is often prohibitively expensive. The uniform fission site method has been developed to yield a more spatially-uniform distribution of relative uncertainties. This is accomplished by biasing the density of fission neutron source sites while not biasing the solution. The method is integrated into the source iteration process, and does not require any auxiliary forward or adjoint calculations. For a given amount of computational effort, the use of the method results in a reduction of the largest uncertainties relative to the standard algorithm. Two variants of the method have been implemented and tested. Both have been shown to be effective. (authors)
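
    The weight correction behind this kind of fission-source biasing can be shown schematically (an illustration of the general importance-sampling idea, not the authors' implementation; the region count and source shares are invented): sample sites from a flattened spatial density, then multiply each site's weight by the ratio of true to biased density so that tallies remain unbiased.

    import numpy as np

    rng = np.random.default_rng(7)
    power_frac = np.array([0.70, 0.25, 0.05])   # true fission source share by region
    vol_frac   = np.array([1/3, 1/3, 1/3])      # biased (uniform) sampling density

    n_sites = 100_000
    region = rng.choice(3, size=n_sites, p=vol_frac)     # biased site sampling
    weight = (power_frac / vol_frac)[region]             # unbiasing weight correction

    # Mean weight per region recovers the true source share (solution unbiased)...
    print(np.bincount(region, weights=weight) / n_sites)  # ~ [0.70, 0.25, 0.05]
    # ...while the low-power region now receives ~1/3 of all sampled sites,
    # flattening the spatial distribution of tally uncertainties.
    print(np.bincount(region) / n_sites)                  # ~ [0.33, 0.33, 0.33]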

  18. Self-Uncertainty and the Influence of Alternative Goals on Self-Regulation.

    PubMed

    Light, Alysson E; Rios, Kimberly; DeMarree, Kenneth G

    2018-01-01

    The current research examines factors that facilitate or undermine goal pursuit. Past research indicates that attempts to reduce self-uncertainty can result in increased goal motivation. We explore a critical boundary condition of this effect: the presence of alternative goals. Though self-regulatory processes usually keep interest in alternative goals in check, uncertainty reduction may undermine these self-regulatory efforts by (a) reducing conflict monitoring and (b) increasing valuation of alternative goals. As such, reminders of alternative goals will draw effort away from focal goals for self-uncertain (but not self-certain) participants. Across four studies and eight supplemental studies, using different focal goals (e.g., academic achievement, healthy eating) and alternative goals (e.g., social/emotional goals, attractiveness, indulgence), we found that alternative goal salience does not negatively influence goal-directed behavior among participants primed with self-certainty, but that reminders of alternative goals undermine goal pursuit among participants primed with self-uncertainty.

  19. Towards a more open debate about values in decision-making on agricultural biotechnology.

    PubMed

    Devos, Yann; Sanvido, Olivier; Tait, Joyce; Raybould, Alan

    2014-12-01

    Regulatory decision-making over the use of products of new technology aims to be based on science-based risk assessment. In some jurisdictions, decision-making about the cultivation of genetically modified (GM) plants is blocked supposedly because of scientific uncertainty about risks to the environment. However, disagreement about the acceptability of risks is primarily a dispute over normative values, which is not resolvable through natural sciences. Natural sciences may improve the quality and relevance of the scientific information used to support environmental risk assessments and make scientific uncertainties explicit, but offer little to resolve differences about values. Decisions about cultivating GM plants will thus not necessarily be eased by performing more research to reduce scientific uncertainty in environmental risk assessments, but by clarifying the debate over values. We suggest several approaches to reveal values in decision-making: (1) clarifying policy objectives; (2) determining what constitutes environmental harm; (3) making explicit the factual and normative premises on which risk assessments are based; (4) better demarcating environmental risk assessment studies from ecological research; (5) weighing the potential for environmental benefits (i.e., opportunities) as well as the potential for environmental harms (i.e., risks); and (6) expanding participation in the risk governance of GM plants. Recognising and openly debating differences about values will not remove controversy about the cultivation of GM plants. However, by revealing what is truly in dispute, debates about values will clarify decision-making criteria.

  20. Combining Passive Microwave Sounders with CYGNSS information for improved retrievals: Observations during Hurricane Harvey

    NASA Astrophysics Data System (ADS)

    Schreier, M. M.

    2017-12-01

    The launch of CYGNSS (Cyclone Global Navigation Satellite System) has added an interesting component to satellite observations: it can provide wind speeds in the tropics with a high repetition rate. Passive microwave sounders overpassing the same region can benefit from this information when it comes to the retrieval of temperature or water profiles: uncertainty about wind speeds has a strong impact on emissivity and reflectivity calculations with respect to surface temperature. This strongly influences the uncertainty of retrieved temperature and water content, especially under extreme weather conditions. Adding CYGNSS information to the retrieval can help to reduce errors and provide a significantly better sounder retrieval. Based on observations during Hurricane Harvey, we want to show the impact of CYGNSS data on the retrieval of passive microwave sensors. We will show examples of the impact on retrievals from polar-orbiting instruments, such as the Advanced Technology Microwave Sounder (ATMS) and AMSU-A/B on NOAA-18 and 19. In addition, we will show the impact on retrievals from HAMSR (High Altitude MMIC Sounding Radiometer), which flew on the Global Hawk during the EPOCH campaign. We will compare the results with other observations and estimate the impact of additional CYGNSS information on the microwave retrieval, especially on error and uncertainty reduction. We think that a synergistic use of these different data sources could significantly help to produce better products for forecast assimilation.

  1. Carbon Capture Multidisciplinary Simulation Center Trilab Support Team (TST) Fall Meeting 2016 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draeger, Erik W.

    The theme of this year’s meeting was “Predictivity: Now and in the Future”. After welcoming remarks, Erik Draeger gave a talk on the NNSA Labs’ history of predictive simulation and the new challenges faced by upcoming architecture changes. He described an example where the volume of analysis data produced by a set of inertial confinement fusion (ICF) simulations on the Trinity machine was too large to store or transfer, and the steps needed to reduce it to a manageable size. He also described the software re-engineering plan for LLNL’s suite of multiphysics codes and physics packages with a new push toward common components, making collaboration with teams like the CCMSC who already have experience trying to architect complex multiphysics code infrastructure on next-generation architectures all the more important. Phil Smith then gave an overview outlining the goals of the project, namely to accelerate development of new technology in the form of high efficiency carbon capture pulverized coal power generation as well as further optimize existing state of the art designs. He then presented a summary of the Center’s top-down uncertainty quantification approach, in which ultimate target predictivity informs uncertainty targets for lower-level components, and gave data on how close all the different components currently are to their targets. Most components still need an approximately two-fold reduction in uncertainty to hit the ultimate predictivity target, but the current accuracy is already rather impressive.

  2. Quantum-memory-assisted entropic uncertainty in spin models with Dzyaloshinskii-Moriya interaction

    NASA Astrophysics Data System (ADS)

    Huang, Zhiming

    2018-02-01

    In this article, we investigate the dynamics and correlations of quantum-memory-assisted entropic uncertainty, the tightness of the uncertainty, entanglement, quantum correlation and mixedness for various spin chain models with Dzyaloshinskii-Moriya (DM) interaction, including the XXZ model with DM interaction, the XY model with DM interaction and the Ising model with DM interaction. We find that the uncertainty grows to a stable value with increasing temperature but decreases as the coupling coefficient, anisotropy parameter and DM interaction strength increase. It is found that the entropic uncertainty is closely correlated with the mixedness of the system. Increasing quantum correlation can result in a decrease in the uncertainty, and quantum correlation is more robust than entanglement, since entanglement undergoes sudden death and birth. The tightness of the uncertainty drops to zero, apart from slight fluctuations, as the various parameters increase. Furthermore, we propose an effective approach to steering the uncertainty via weak measurement reversal.

  3. Combining uncertainty factors in deriving human exposure levels of noncarcinogenic toxicants.

    PubMed

    Kodell, R L; Gaylor, D W

    1999-01-01

    Acceptable levels of human exposure to noncarcinogenic toxicants in environmental and occupational settings generally are derived by reducing experimental no-observed-adverse-effect levels (NOAELs) or benchmark doses (BDs) by a product of uncertainty factors (Barnes and Dourson, Ref. 1). These factors are presumed to ensure safety by accounting for uncertainty in dose extrapolation, uncertainty in duration extrapolation, differential sensitivity between humans and animals, and differential sensitivity among humans. The common default value for each uncertainty factor is 10. This paper shows how estimates of means and standard deviations of the approximately log-normal distributions of individual uncertainty factors can be used to estimate percentiles of the distribution of the product of uncertainty factors. An appropriately selected upper percentile, for example, 95th or 99th, of the distribution of the product can be used as a combined uncertainty factor to replace the conventional product of default factors.
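
    A worked example under assumed numbers: if each ln(UF_i) is normal with mean mu_i and standard deviation sigma_i, the product of the factors is lognormal with parameters (sum of the mu_i, square root of the sum of the sigma_i^2), so an upper percentile of the product can replace the conventional product of default 10s. The (mu, sigma) pairs below are hypothetical, not the paper's estimates.

    import numpy as np
    from scipy.stats import norm

    # hypothetical (mu, sigma) of ln(UF) for: interspecies, intraspecies,
    # duration extrapolation, dose extrapolation
    factors = [(np.log(3.0), 0.6), (np.log(3.0), 0.6),
               (np.log(2.0), 0.5), (np.log(2.0), 0.5)]

    mu = sum(m for m, s in factors)                    # lognormal product: sum of means
    sigma = np.sqrt(sum(s**2 for m, s in factors))     # and root-sum-square of sigmas
    for p in (0.95, 0.99):
        combined = np.exp(mu + norm.ppf(p) * sigma)
        print(f"{p:.0%} combined factor: {combined:,.0f}")   # ~221 and ~469 here
    print("default product of four 10s:", 10**4)

    With these placeholder values the 95th or 99th percentile of the product is far below the default 10,000, which is the paper's motivation for combining the factors probabilistically rather than multiplying defaults.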

  4. Confronting the Uncertainty in Aerosol Forcing Using Comprehensive Observational Data

    NASA Astrophysics Data System (ADS)

    Johnson, J. S.; Regayre, L. A.; Yoshioka, M.; Pringle, K.; Sexton, D.; Lee, L.; Carslaw, K. S.

    2017-12-01

    The effect of aerosols on cloud droplet concentrations and radiative properties is the largest uncertainty in the overall radiative forcing of climate over the industrial period. In this study, we take advantage of a large perturbed parameter ensemble of simulations from the UK Met Office HadGEM-UKCA model (the aerosol component of the UK Earth System Model) to comprehensively sample uncertainty in aerosol forcing. Uncertain aerosol and atmospheric parameters cause substantial aerosol forcing uncertainty in climatically important regions. As the aerosol radiative forcing itself is unobservable, we investigate the potential for observations of aerosol and radiative properties to act as constraints on the large forcing uncertainty. We test how eight different theoretically perfect aerosol and radiation observations can constrain the forcing uncertainty over Europe. We find that the achievable constraint is weak unless many diverse observations are used simultaneously. This is due to the complex relationships between model output responses and the multiple interacting parameter uncertainties: compensating model errors mean there are many ways to produce the same model output (known as model equifinality), which limits the achievable constraint. However, using all eight observable quantities together, we show that the aerosol forcing uncertainty can potentially be reduced by around 50%. This reduction occurs as we reduce a large sample of model variants (over 1 million) that cover the full parametric uncertainty to the roughly 1% that are observationally plausible. Constraining the forcing uncertainty using real observations is a more complex undertaking, in which we must account for multiple further uncertainties, including measurement uncertainties, structural model uncertainties and the model's discrepancy from reality. Here, we make a first attempt to determine the true potential constraint on the forcing uncertainty from our model that is achievable using a comprehensive set of real aerosol and radiation observations taken from ground stations, flight campaigns and satellites. This research has been supported by the UK-China Research & Innovation Partnership Fund through the Met Office Climate Science for Service Partnership (CSSP) China as part of the Newton Fund, and by the NERC-funded GASSP project.
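
    The constraint mechanism can be sketched as simple history matching: draw a large parameter ensemble, discard members whose simulated observables fall outside observational error, and compare the spread in the unobservable forcing before and after. The toy linear relationships and error bars below are invented; real aerosol models and equifinality make the actual problem far harder, as the abstract notes.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000                               # "model variants"
    param = rng.normal(size=(n, 2))             # two uncertain parameters

    forcing = -1.0 + 0.8 * param[:, 0] + 0.5 * param[:, 1]           # unobservable
    obs_a = 2.0 * param[:, 0] - 1.0 * param[:, 1] + rng.normal(0, 0.1, n)
    obs_b = 1.0 * param[:, 0] + 1.5 * param[:, 1] + rng.normal(0, 0.1, n)

    # pretend the real observations sit at 0 with +/-0.3 total uncertainty
    plausible = (np.abs(obs_a) < 0.3) & (np.abs(obs_b) < 0.3)
    print(f"retained: {plausible.mean():.1%} of variants")           # ~1%
    print(f"forcing spread: {forcing.std():.2f} -> {forcing[plausible].std():.2f}")

    Using either observable alone leaves a long ridge of compensating parameter pairs (equifinality); only the combination pins down the parameters and, with them, the forcing.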

  5. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    NASA Astrophysics Data System (ADS)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measure to reduce the impact of floods. However, flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty, a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a 'three dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the larger public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.), each with its advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is probabilistic flood mapping. These maps give a representation of the probability of flooding of a certain area, based on the uncertainty assessment of the flood forecasts. By using this type of map, water managers can focus their attention on the areas with the highest flood probability. The larger public can also consult these maps for information on the probability of flooding at their specific location, so that they can take pro-active measures to reduce personal damage. The method of quantifying the uncertainty was implemented in the operational flood forecasting system for the navigable rivers in the Flanders region of Belgium. The method has shown clear benefits during the floods of the last two years.
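
    The 'three dimensional error matrix' can be sketched directly: bin historical residuals by forecasted level and lead time, store selected percentiles per bin, and interpolate for new forecasts. The bin edges, percentiles and synthetic residuals below are all hypothetical.

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    rng = np.random.default_rng(5)
    n = 20_000
    level = rng.uniform(0.0, 5.0, n)            # forecasted water level (m)
    lead  = rng.uniform(1.0, 48.0, n)           # lead time (h)
    resid = rng.normal(0.0, 0.05 * level * np.sqrt(lead))   # synthetic residuals

    lev_bins, lead_bins = np.linspace(0, 5, 6), np.linspace(1, 48, 5)
    pcts = np.array([5.0, 50.0, 95.0])
    lev_c  = 0.5 * (lev_bins[:-1] + lev_bins[1:])    # bin centres for interpolation
    lead_c = 0.5 * (lead_bins[:-1] + lead_bins[1:])

    matrix = np.empty((len(lev_c), len(lead_c), len(pcts)))
    for i in range(len(lev_c)):
        for j in range(len(lead_c)):
            m = ((np.digitize(level, lev_bins) - 1 == i)
                 & (np.digitize(lead, lead_bins) - 1 == j))
            matrix[i, j] = np.percentile(resid[m], pcts)

    interp = RegularGridInterpolator((lev_c, lead_c, pcts), matrix)
    # 5-95% residual band for a new forecast of 3.2 m at 12 h lead time:
    print(interp([(3.2, 12.0, 5.0), (3.2, 12.0, 95.0)]))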

  6. Marine and Hydrokinetic Technology Development Risk Management Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snowberg, David; Weber, Jochem

    2015-09-01

    Over the past decade, the global marine and hydrokinetic (MHK) industry has suffered a number of serious technological and commercial setbacks. To help reduce the risks of industry failures and advance the development of new technologies, the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) developed an MHK Risk Management Framework. By addressing uncertainties, the MHK Risk Management Framework increases the likelihood of successful development of an MHK technology. It covers projects of any technical readiness level (TRL) or technical performance level (TPL) and all risk types (e.g. technological risk, regulatory risk, commercial risk) over the development cycle. This framework is intended for the development and deployment of a single MHK technology—not for multiple device deployments within a plant. This risk framework is intended to meet DOE’s risk management expectations for the MHK technology research and development efforts of the Water Power Program (see Appendix A). It also provides an overview of other relevant risk management tools and documentation. This framework emphasizes design and risk reviews as formal gates to ensure risks are managed throughout the technology development cycle. Section 1 presents the recommended technology development cycle, Sections 2 and 3 present tools to assess the TRL and TPL of the project, respectively. Section 4 presents a risk management process with design and risk reviews for actively managing risk within the project, and Section 5 presents a detailed description of a risk registry to collect the risk management information into one living document. Section 6 presents recommendations for collecting and using lessons learned throughout the development process.

  7. Cost, energy, global warming, eutrophication and local human health impacts of community water and sanitation service options.

    PubMed

    Schoen, Mary E; Xue, Xiaobo; Wood, Alison; Hawkins, Troy R; Garland, Jay; Ashbolt, Nicholas J

    2017-02-01

    We compared water and sanitation system options for a coastal community across selected sustainability metrics, including environmental impact (i.e., life cycle eutrophication potential, energy consumption, and global warming potential), equivalent annual cost, and local human health impact. We computed normalized metric scores, which we used to discuss the options' strengths and weaknesses, and conducted sensitivity analysis of the scores to changes in variable and uncertain input parameters. The alternative systems, which combined centralized drinking water with sanitation services based on the concepts of energy and nutrient recovery as well as on-site water reuse, had lower environmental and local human health impacts and costs than the conventional, centralized option. Of the selected sustainability metrics, the greatest advantages of the alternative community water systems (compared to the conventional system) were in terms of local human health impact and eutrophication potential, despite large, outstanding uncertainties. Of the alternative options, the systems with on-site water reuse and energy recovery technologies had the least local human health impact; however, the cost of these options was highly variable and their energy consumption was comparable to on-site alternatives without water reuse or energy recovery, due to on-site reuse treatment. Future work should aim to reduce the uncertainty in the energy recovery process and explore the health risks associated with less costly, on-site water treatment options. Copyright © 2016 Elsevier Ltd. All rights reserved.
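
    Normalized metric scoring of this kind reduces to putting each sustainability metric on a common [0, 1] scale before comparison. The option names and metric values below are invented for illustration; they are not the study's results.

    import numpy as np

    options = ["conventional", "nutrient-recovery", "on-site reuse"]
    # rows: options; columns: eutrophication, energy, GWP, annual cost, health impact
    metrics = np.array([
        [1.00, 1.00, 1.00, 1.00, 1.00],
        [0.45, 0.80, 0.70, 1.10, 0.30],
        [0.35, 0.85, 0.75, 1.25, 0.20],
    ])

    # min-max normalize each metric to [0, 1], where 0 = best (lowest impact/cost)
    norm = (metrics - metrics.min(axis=0)) / (metrics.max(axis=0) - metrics.min(axis=0))
    for name, row in zip(options, norm.round(2)):
        print(f"{name:18s}", row)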

  8. NASA new technology identification and evaluation

    NASA Technical Reports Server (NTRS)

    Lizak, R. M.

    1983-01-01

    Before disclosure in NASA Tech Briefs, reports of new technology are transmitted to the cognizant NASA Field Center Technology Utilization Office (TUO) where they are evaluated for novelty, technical validity and significance, and nonaerospace utility. If uncertainty exists regarding these criteria, the documentation may be forwarded to SRI International for evaluation before recommending publication. From November 1980 to November 1983, some 3,103 technologies were evaluated by SRI. Activities performed and progress made are summarized.

  9. Final Technical Report: Advanced Measurement and Analysis of PV Derate Factors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Bruce Hardison; Burton, Patrick D.; Hansen, Clifford

    2015-12-01

    The Advanced Measurement and Analysis of PV Derate Factors project focuses on improving the accuracy and reducing the uncertainty of PV performance model predictions by addressing a common element of all PV performance models referred to as “derates”. Widespread use of “rules of thumb”, combined with significant uncertainty regarding appropriate values for these factors contribute to uncertainty in projected energy production.
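
    A small sketch of how "derate" factors typically enter a PV performance model: the overall derate is the product of component factors, so rule-of-thumb values and their uncertainties compound. The component values below are illustrative placeholders, not measured results from this project.

    from math import prod

    derates = {
        "soiling": 0.95,
        "shading": 0.97,
        "mismatch": 0.98,
        "wiring": 0.98,
        "inverter efficiency": 0.96,
        "availability": 0.98,
    }
    overall = prod(derates.values())
    print(f"overall derate: {overall:.3f}")   # ~0.83 with these placeholders

    # Uncertainty compounds the same way: +/-2% on each of six factors moves
    # the overall derate by roughly +/-12% relative in the worst case.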

  10. Potential Nationwide Improvements in Productivity and Health from Better Indoor Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, W.J.; Rosenfeld, A.H.

    1998-05-01

    Theoretical considerations and empirical data suggest that existing technologies and procedures can improve indoor environments in a manner that significantly increases productivity and health. Existing literature contains moderate to strong evidence that characteristics of buildings and indoor environments significantly influence rates of respiratory disease, allergy and asthma symptoms, sick building symptoms, and worker performance. While there is considerable uncertainty in our estimates of the magnitudes of productivity gains that may be obtained by providing better indoor environments, the projected gains are very large. For the U.S., we estimate potential annual savings and productivity gains of $6 to $19 billion from reduced respiratory disease, $1 to $4 billion from reduced allergies and asthma, $10 to $20 billion from reduced sick building syndrome symptoms, and $12 to $125 billion from direct improvements in worker performance that are unrelated to health. In two example calculations, the potential financial benefits of improving indoor environments exceed costs by factors of 8 and 14. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  11. Potential nationwide improvements in productivity and health from better indoor environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, W.J.; Rosenfeld, A.H.

    1998-07-01

    Theoretical considerations and empirical data suggest that existing technologies and procedures can improve indoor environments in a manner that significantly increases productivity and health. Existing literature contains moderate to strong evidence that characteristics of buildings and indoor environments significantly influence rates of respiratory disease, allergy and asthma symptoms, sick building symptoms, and worker performance. While there is considerable uncertainty in their estimates of the magnitudes of productivity gains that may be obtained by providing better indoor environments, the projected gains are very large. For the US, the authors estimate potential annual savings and productivity gains of $6 to $19 billion from reduced respiratory disease, $1 to $4 billion from reduced allergies and asthma, $10 to $20 billion from reduced sick building syndrome symptoms, and $12 to $125 billion from direct improvements in worker performance that are unrelated to health. In two example calculations, the potential financial benefits of improving indoor environments exceed costs by a factor of 8 and 14. Productivity gains that are quantified and demonstrated could serve as a strong stimulus for energy efficiency measures that simultaneously improve the indoor environment.

  12. Artificial intelligence in robot control systems

    NASA Astrophysics Data System (ADS)

    Korikov, A.

    2018-05-01

    This paper analyzes modern concepts of artificial intelligence and known definitions of the term "level of intelligence". In robotics, an artificial intelligence system is defined as a system that works intelligently and optimally. The author proposes to use optimization methods for the design of intelligent robot control systems. The article formalizes problems of robotic control system design as a class of extremum problems with constraints. Solving these problems is complicated by high dimensionality, polymodality and a priori uncertainty. Decomposition of the extremum problems according to the method suggested by the author reduces them to a sequence of simpler problems that can be solved successfully by modern computing technology. Several possible approaches to solving such problems are considered in the article.

  13. The atmospheric effects of stratospheric aircraft: A current consensus

    NASA Technical Reports Server (NTRS)

    Douglass, A. R.; Carroll, M. A.; Demore, W. B.; Holton, J. R.; Isaksen, I. S. A.; Johnston, H. S.; Ko, M. K. W.

    1991-01-01

    In the early 1970's, a fleet of supersonic aircraft flying in the lower stratosphere was proposed. A large fleet was never built for economic, political, and environmental reasons. Technological improvements may make it economically feasible to develop supersonic aircraft for current markets. Some key results of earlier scientific programs designed to assess the impact of aircraft emissions on stratospheric ozone are reviewed, and factors that must be considered to assess the environmental impact of aircraft exhaust are discussed, including the amount of nitrogen oxides injected into the stratosphere and their horizontal transport; results from stratosphere/troposphere assessment models are presented. Areas in which improvements in scientific understanding and model representation must be made to reduce the uncertainty in model calculations are identified.

  14. Neural Network Control of a Magnetically Suspended Rotor System

    NASA Technical Reports Server (NTRS)

    Choi, Benjamin; Brown, Gerald; Johnson, Dexter

    1997-01-01

    Magnetic bearings offer significant advantages because of their noncontact operation, which can reduce maintenance. Higher speeds, no friction, no lubrication, weight reduction, precise position control, and active damping make them far superior to conventional contact bearings. However, there are technical barriers that limit the application of this technology in industry. One of them is the need for a nonlinear controller that can overcome the system nonlinearity and uncertainty inherent in magnetic bearings. This paper discusses the use of a neural network as a nonlinear controller that circumvents system nonlinearity. A neural network controller was well trained and successfully demonstrated on a small magnetic bearing rig. This work demonstrated the feasibility of using a neural network to control nonlinear magnetic bearings and systems with unknown dynamics.

  15. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    PubMed

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
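
    As a purely illustrative sketch of the entropy measure referred to above, the following computes the Shannon-Wiener entropy of a cause-of-death tabulation; the category counts are hypothetical, not the Chilean vital statistics analyzed in the paper.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon-Wiener entropy (in bits) of a frequency distribution."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()  # normalize and drop empty categories
    return float(-np.sum(p * np.log2(p)))

# Hypothetical counts of deaths per registered cause category.
deaths_by_cause = [5230, 3110, 1280, 940, 410, 180]
h = shannon_entropy(deaths_by_cause)
h_max = np.log2(len(deaths_by_cause))  # entropy of a uniform distribution
print(f"entropy = {h:.3f} bits of a possible {h_max:.3f}")
```

    A value close to the maximum indicates that deaths are spread evenly across poorly discriminated categories, i.e., high uncertainty in the counting itself.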

  16. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    NASA Astrophysics Data System (ADS)

    Grassi, Giacomo; Monni, Suvi; Federici, Sandro; Achard, Frederic; Mollicone, Danilo

    2008-07-01

    A common paradigm, when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC), is that high uncertainties in input data—i.e., area change and C stock change/area—may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools—already existing in UNFCCC decisions and IPCC guidance documents—may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.

  17. Evaluating the effects of China's pollution controls on inter-annual trends and uncertainties of atmospheric mercury emissions

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Zhong, H.; Zhang, J.; Nielsen, C. P.

    2015-04-01

    China's anthropogenic emissions of atmospheric mercury (Hg) are effectively constrained by national air pollution control and energy efficiency policies. In this study, improved methods, based on available data from domestic field measurements, are developed to quantify the benefits of Hg abatement by various emission control measures. Those measures include increased use of (1) flue gas desulfurization (FGD) and selective catalyst reduction (SCR) systems in power generation; (2) precalciner kilns with fabric filters (FF) in cement production; (3) mechanized coking ovens with electrostatic precipitators (ESP) in iron and steel production; and (4) advanced production technologies in nonferrous metal smelting. Investigation reveals declining trends in emission factors for each of these sources, which together drive a much slower growth of total Hg emissions than the growth of China's energy consumption and economy, from 679 metric tons (t) in 2005 to 750 t in 2012. In particular, estimated emissions from the above-mentioned four source types declined 3% from 2005 to 2012, which can be attributed to expanded deployment of technologies with higher energy efficiencies and air pollutant removal rates. Emissions from other anthropogenic sources are estimated to have increased by 22% during the period. The species shares of total Hg emissions have been stable in recent years, with mass fractions of around 55, 39, and 6% for gaseous elemental Hg (Hg0), reactive gaseous mercury (Hg2+), and particle-bound mercury (Hgp), respectively. This estimate of total Hg emissions, higher than previous inventories, is supported by limited simulation of atmospheric chemistry and transport. With improved implementation of emission controls and energy savings, at most a 23% reduction in annual Hg emissions from 2012 to 2030, to below 600 t, is expected. While growth in Hg emissions has been gradually constrained, uncertainties quantified by Monte Carlo simulation for recent years have increased, particularly for the power sector and certain industrial sources. The uncertainty (expressed as 95% confidence intervals) of Hg emissions from coal-fired power plants, for example, increased from −48%/+73% in 2005 to −50%/+89% in 2012. This is attributed mainly to increased penetration of advanced manufacturing and pollutant control technologies; the unclear operational status and relatively small sample sizes of field measurements of those processes have resulted in lower but highly varied emission factors. To reduce uncertainty and further confirm the benefits of pollution control and energy policies, therefore, systematic investigation of specific Hg pollution sources is recommended. The variability of temporal trends and spatial distributions of Hg emissions needs to be better tracked during the ongoing dramatic changes in China's economy, energy use, and air pollution status.
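
    As an illustration of the kind of Monte Carlo uncertainty quantification described above, the following sketch propagates hypothetical (not the study's) distributions for activity data, emission factor, and control efficiency into an asymmetric 95% confidence interval for a single source category.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical inputs for a coal-fired power source: coal use (lognormal),
# Hg content of coal (lognormal), and control-device removal efficiency (beta).
coal_use = rng.lognormal(mean=np.log(200.0), sigma=0.10, size=n)   # Mt/yr
hg_content = rng.lognormal(mean=np.log(0.17), sigma=0.35, size=n)  # g/t
removal = rng.beta(a=20, b=5, size=n)                              # fraction

emissions = coal_use * hg_content * (1.0 - removal)  # tonnes Hg/yr
central = np.median(emissions)
lo, hi = np.percentile(emissions, [2.5, 97.5])
print(f"median = {central:.1f} t, 95% CI = "
      f"{100*(lo/central - 1):+.0f}% / {100*(hi/central - 1):+.0f}%")
```

    Because the inputs are multiplicative and skewed, the resulting interval is asymmetric around the central estimate, which is why inventories of this kind report bounds such as −50%/+89% rather than a single ± figure.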

  18. High voltage system: Plasma interaction summary

    NASA Technical Reports Server (NTRS)

    Stevens, N. John

    1986-01-01

    The possible interactions that could exist between a high voltage system and the space plasma environment are reviewed. A solar array is used as an example of such a system. The emphasis in this review is on the discrepancies that exist in this technology in both flight and ground experiment data. It has been found that, in ground testing, there are facility effects, cell size effects and area scaling uncertainties. For space applications there are area scaling and discharge concerns for an array as well as the influence of the large space structures on the collection process. There are still considerable uncertainties in the high voltage-space plasma interaction technology even after several years of effort.

  19. Criminal responsibility and predictability

    NASA Astrophysics Data System (ADS)

    Siccardi, F.

    2009-04-01

    The Italian Civil Protection has developed a set of technologies and rules for issuing early warnings. The right to be protected from natural disasters is felt intensely by people. The evaluation of the size of the target areas and of the severity of events is subject to inherent uncertainty, so casualties in areas and at times not covered by early warnings are possible. This leads, not always but increasingly often, to people bringing complaints in court against civil protection decision makers. The concepts of real-time uncertainty and conditional probability are difficult to convey in court, where the timeliness and effectiveness of the alert are under judgement. A reflection on scientific and technological capabilities is needed.

  20. Minimizing the health and climate impacts of emissions from heavy-duty public transportation bus fleets through operational optimization.

    PubMed

    Gouge, Brian; Dowlatabadi, Hadi; Ries, Francis J

    2013-04-16

    In contrast to capital control strategies (i.e., investments in new technology), the potential of operational control strategies (e.g., vehicle scheduling optimization) to reduce the health and climate impacts of the emissions from public transportation bus fleets has not been widely considered. This case study demonstrates that heterogeneity in the emission levels of different bus technologies and the exposure potential of bus routes can be exploited through optimization (e.g., how vehicles are assigned to routes) to minimize these impacts as well as operating costs. The magnitude of the benefits of the optimization depends on the specific transit system and region. Health impacts were found to be particularly sensitive to different vehicle assignments and varied from worst- to best-case assignment by more than a factor of 2, suggesting there is significant potential to reduce health impacts. Trade-offs between climate, health, and cost objectives were also found. Transit agencies that do not consider these objectives in an integrated framework and, for example, optimize for costs and/or climate impacts alone, risk inadvertently increasing health impacts by as much as 49%. Cost-benefit analysis was used to evaluate trade-offs between objectives, but large uncertainties make identifying an optimal solution challenging.
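
    One simple way to exploit such heterogeneity, sketched below under heavily simplified assumptions, is to treat vehicle-to-route assignment as a linear assignment problem; the emission and exposure numbers are hypothetical, and the study's actual optimization is richer than this.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Hypothetical per-bus emission intensities (g/km of a health-relevant
# pollutant) and per-route exposure potentials (population-weighted
# intake per g emitted), for 4 buses and 4 routes.
emission_rate = np.array([0.9, 0.5, 1.4, 0.7])
exposure_potential = np.array([2.0, 0.6, 1.1, 3.5])

# Health-impact proxy for assigning bus i to route j.
cost = np.outer(emission_rate, exposure_potential)

rows, cols = linear_sum_assignment(cost)  # minimize total impact
print("assignment (bus -> route):", list(zip(rows, cols)))
print("total impact:", cost[rows, cols].sum())
```

    In this toy instance the optimizer pairs the cleanest vehicles with the highest-exposure routes, which is precisely the heterogeneity the study exploits.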

  1. ISS utilization and countermeasure validation: Implementing the critical path roadmap to reduce uncertainties of extended human spaceflight expeditions

    NASA Astrophysics Data System (ADS)

    Leveton, Lauren B.; Robinson, Judith L.; Charles, John B.

    2000-01-01

    Human exploration of space requires the ability to understand and mitigate risks to crews exposed to the conditions associated with such missions. This becomes a greater imperative as we prepare for interplanetary expeditions involving humans who will be subjected to long transit periods in microgravity as they travel to a distant planet such as Mars, embark and live on the planet's surface for an extended time, and finally, return to the 1 g environment of Earth. We need to know, more definitively, what the human health, safety, and performance risks are, and how to prevent or counteract them throughout all phases of a long duration mission. The Johnson Space Center's Space and Life Sciences Directorate along with the National Space Biomedical Research Institute (NSBRI) have been engaged in a strategic planning effort that identifies the most critical risks confronting humans who will venture forth on such missions and the types of research and technology efforts required to mitigate and otherwise reduce the probability and/or severity of those risks. This paper describes the unique approach used to define, assess and prioritize the risks and presents the results of the assessment with an emphasis on the research and technology priorities that will help us to meet the challenge of long duration human spaceflight missions.

  2. ISS Utilization and Countermeasure Validation: Implementing the Critical Path Roadmap to Reduce Uncertainties of Extended Human Spaceflight Expeditions

    NASA Technical Reports Server (NTRS)

    Leveton, Lauren B.; Robinson, Judith L.; Charles, John B.

    2000-01-01

    Human exploration of space requires the ability to understand and mitigate risks to crews exposed to the conditions associated with such missions. This becomes a greater imperative as we prepare for interplanetary expeditions involving humans who will be subjected to long transit periods in microgravity as they travel to a distant planet such as Mars, embark and live on the planet's surface for an extended time, and finally, return to the 1 g environment of Earth. We need to know, more definitively, what the human health, safety, and performance risks are, and how to prevent or counteract them throughout all phases of a long duration mission. The Johnson Space Center's Space and Life Sciences Directorate along with the National Space Biomedical Research Institute (NSBRI) have been engaged in a strategic planning effort that identifies the most critical risks confronting humans who will venture forth on such missions and the types of research and technology efforts required to mitigate and otherwise reduce the probability and/or severity of those risks. This paper describes the unique approach used to define, assess and prioritize the risks and presents the results of the assessment with an emphasis on the research and technology priorities that will help us to meet the challenge of long duration human spaceflight missions.

  3. Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.

    PubMed

    Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C

    2015-01-01

    The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that only considers model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from salt water intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve more than 90% design reliability level. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While reducing the injection rate can be achieved by reducing the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
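
    A minimal sketch of the idea behind combining chance constraints with Bayesian model averaging: design reliability is evaluated under each candidate geological model and then weighted by the models' BMA weights. The weights, head distributions, and the 0.5 m constraint below are all hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical: three hydrostratigraphic models with BMA weights, each
# giving a distribution of hydraulic head at a control point (m) under
# a candidate barrier design; the chance constraint is head >= 0.5 m.
weights = np.array([0.5, 0.3, 0.2])
means, sds = np.array([0.9, 0.6, 0.3]), np.array([0.25, 0.30, 0.35])

n = 100_000
reliab_per_model = np.array([
    (rng.normal(m, s, n) >= 0.5).mean() for m, s in zip(means, sds)])

bma_reliability = float(weights @ reliab_per_model)
print(f"per-model reliability: {np.round(reliab_per_model, 3)}")
print(f"BMA reliability = {bma_reliability:.3f} "
      f"(vs {reliab_per_model[0]:.3f} under the single best model)")
```

    Evaluating reliability under only the single most plausible structure gives the higher number, mirroring the paper's finding that traditional CC programming overestimates design reliability.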

  4. The concentration-discharge slope as a tool for water quality management.

    PubMed

    Bieroza, M Z; Heathwaite, A L; Bechmann, M; Kyllmar, K; Jordan, P

    2018-07-15

    Recent technological breakthroughs of optical sensors and analysers have enabled matching the water quality measurement interval to the time scales of stream flow changes and led to an improved understanding of spatially and temporally heterogeneous sources and delivery pathways for many solutes and particulates. This new ability to match the chemograph with the hydrograph has promoted renewed interest in the concentration-discharge (c-q) relationship and its value in characterizing catchment storage, time lags and legacy effects for both weathering products and anthropogenic pollutants. In this paper we evaluated the stream c-q relationships for a number of water quality determinands (phosphorus, suspended sediments, nitrogen) in intensively managed agricultural catchments based on both high-frequency (sub-hourly) and long-term low-frequency (fortnightly-monthly) routine monitoring data. We used resampled high-frequency data to test the uncertainty in water quality parameters (e.g. mean, 95th percentile and load) derived from low-frequency sub-datasets. We showed that the uncertainty in water quality parameters increases with reduced sampling frequency as a function of the c-q slope. We also showed that different sources and delivery pathways control c-q relationship for different solutes and particulates. Secondly, we evaluated the variation in c-q slopes derived from the long-term low-frequency data for different determinands and catchments and showed strong chemostatic behaviour for phosphorus and nitrogen due to saturation and agricultural legacy effects. The c-q slope analysis can provide an effective tool to evaluate the current monitoring networks and the effectiveness of water management interventions. This research highlights how improved understanding of solute and particulate dynamics obtained with optical sensors and analysers can be used to understand patterns in long-term water quality time series, reduce the uncertainty in the monitoring data and to manage eutrophication in agricultural catchments. Copyright © 2018 Elsevier B.V. All rights reserved.
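
    For readers unfamiliar with the metric, a c-q slope can be estimated as the slope of a log-log regression of concentration on discharge; the sketch below uses hypothetical paired observations, with a slope near zero indicating the chemostatic behaviour discussed above.

```python
import numpy as np

def cq_slope(concentration, discharge):
    """Slope b of log10(c) = a + b*log10(q); b near 0 indicates
    chemostatic behaviour, b far from 0 chemodynamic behaviour."""
    b, a = np.polyfit(np.log10(discharge), np.log10(concentration), deg=1)
    return b

# Hypothetical paired observations (e.g., fortnightly total P samples).
q = np.array([0.2, 0.5, 1.1, 2.3, 4.8, 9.5])        # discharge, m^3/s
c = np.array([0.08, 0.09, 0.10, 0.09, 0.11, 0.10])  # total P, mg/L
print(f"c-q slope = {cq_slope(c, q):+.2f}")
```

    Steeper positive or negative slopes indicate transport- or dilution-dominated dynamics, and, as the paper argues, they also govern how fast uncertainty grows as sampling frequency is reduced.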

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendell, D.

    This talk addresses the impact that future price uncertainty and current low oil and gas prices have on the conduct and management of R&D in the upstream business. Uncertainty in future prices underscores the need to develop technology that will improve our ability to reduce technical uncertainties in investment decisions, to lower all costs and to operate in a flawless way. Low current prices result in a need to be more efficient and cost conscious in everything we do, including R&D. Since the price environment provides little tolerance for mistakes, we need the best possible definition of the hydrocarbon resources that we find before committing to development. Furthermore, we must find and define the resource at the lowest possible cost, and develop it in an efficient way that is cost effective, safe and environmentally acceptable. The vital role of research includes improving tools for reconstructing basin histories, predicting hydrocarbon generation, migration and trapping, and improving the quality of seismic data and attribute analysis while reducing acquisition cost. Improved methods for interpreting the data and for integrating it into the evaluation and decision making process also facilitate success. We need to continually strive for the competitive advantage provided by leading edge research, while making maximum use of outsourcing and leveraging to get the most out of every research dollar spent. Systematic prioritization and highgrading of our research portfolio is particularly important in achieving this balance. Exxon understands the importance of R&D to the upstream business, and we are committed to managing our resources to provide the value added research needed to address today's needs as well as those we know will be there down the road. Exxon has been a successful player in this industry for many decades and we believe that our future success is closely tied to our ability to continually generate key research breakthroughs in an efficient way.

  6. OAST planning model for space systems technology

    NASA Technical Reports Server (NTRS)

    Sadin, S. R.

    1978-01-01

    The NASA Office of Aeronautics and Space Technology (OAST) planning model for space systems technology is described, and some space technology forecasts of a general nature are reported. Technology forecasts are presented as a span of technology levels; uncertainties in the level of commitment to a project and in the required time are taken into account, with emphasis on differences resulting from high or low commitment. Forecasts are created by combining several types of data, including information on past technology trends, the trends of past predictions, the rate of advancement predicted by experts in the field, and technology forecasts already published.

  7. Radiation health for a Mars mission

    NASA Technical Reports Server (NTRS)

    Robbins, Donald E.

    1992-01-01

    Uncertainties in risk assessments for exposure of a Mars mission crew to space radiation place limitations on mission design and operation. Large shielding penalties are imposed in order to obtain acceptable safety margins. Galactic cosmic rays (GCR) and solar particle events (SPE) are the major concern. A warning system and 'safe haven' are needed to protect the crew from large SPE, which produce lethal doses. A model developed at NASA Johnson Space Center (JSC) to describe solar modulation of GCR intensities reduces the uncertainty in those intensities to less than 10 percent. Radiation transport models used to design spacecraft shielding have large uncertainties in nuclear fragmentation cross sections for GCR which interact with spacecraft materials. Planned space measurements of linear energy transfer (LET) spectra behind various shielding thicknesses will reduce uncertainties in dose-versus-shielding-thickness relationships to 5-10 percent. The largest remaining uncertainty is in the biological effects of space radiation. Data on the effects of energetic ions in humans are nonexistent. Experimental research on effects in animals and cells is needed to allow extrapolation to the risk of carcinogenesis in humans.

  8. The role of intolerance of uncertainty in terms of alcohol use motives among college students.

    PubMed

    Kraemer, Kristen M; McLeish, Alison C; O'Bryan, Emily M

    2015-03-01

    Hazardous drinking rates among college students are exceedingly high. Despite the link between worry and alcohol use problems, there has been a dearth of empirical work examining worry-related risk factors in terms of motivations for alcohol use. Therefore, the aim of the present investigation was to examine the unique predictive ability of intolerance of uncertainty in terms of alcohol use motives. Participants were 389 college students (72.2% female; mean age = 19.92 years, SD = 3.87, range = 18-58) who completed self-report measures for course credit. As hypothesized, after controlling for the effects of gender, smoking status, marijuana use status, alcohol consumption, negative affect, and anxiety sensitivity, greater levels of intolerance of uncertainty were significantly predictive of greater coping (1.5% unique variance) and conformity (4.7% unique variance) drinking motives, but not social or enhancement drinking motives. These results suggest that intolerance of uncertainty is associated with drinking to manage or avoid negative emotions, and interventions aimed at reducing intolerance of uncertainty may be helpful in reducing problematic alcohol consumption among college students. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Measurement time and statistics for a noise thermometer with a synthetic-noise reference

    NASA Astrophysics Data System (ADS)

    White, D. R.; Benz, S. P.; Labenski, J. R.; Nam, S. W.; Qu, J. F.; Rogalla, H.; Tew, W. L.

    2008-08-01

    This paper describes methods for reducing the statistical uncertainty in measurements made by noise thermometers using digital cross-correlators and, in particular, for thermometers using pseudo-random noise for the reference signal. First, a discrete-frequency expression for the correlation bandwidth for conventional noise thermometers is derived. It is shown how an alternative frequency-domain computation can be used to eliminate the spectral response of the correlator and increase the correlation bandwidth. The corresponding expressions for the uncertainty in the measurement of pseudo-random noise in the presence of uncorrelated thermal noise are then derived. The measurement uncertainty in this case is less than that for true thermal-noise measurements. For pseudo-random sources generating a frequency comb, an additional small reduction in uncertainty is possible, but at the cost of increasing the thermometer's sensitivity to non-linearity errors. A procedure is described for allocating integration times to further reduce the total uncertainty in temperature measurements. Finally, an important systematic error arising from the calculation of ratios of statistical variables is described.
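
    A toy illustration (not the paper's implementation) of the cross-correlation principle underlying such thermometers: averaging the cross-spectral density of two channels suppresses their uncorrelated amplifier noise and leaves the common thermal-noise power. All signal parameters below are hypothetical.

```python
import numpy as np
from scipy.signal import csd

rng = np.random.default_rng(7)
fs, n = 1.0e6, 2**20  # sample rate (Hz) and record length

# Common thermal-noise signal seen by both channels, plus independent
# amplifier noise in each channel (much larger than the signal).
common = rng.normal(0.0, 1.0, n)
ch_a = common + rng.normal(0.0, 3.0, n)
ch_b = common + rng.normal(0.0, 3.0, n)

# Averaged cross-spectral density: the uncorrelated amplifier noise
# averages toward zero, leaving the common (thermal) noise power.
f, pxy = csd(ch_a, ch_b, fs=fs, nperseg=4096)
print(f"recovered common-noise PSD ~ {np.real(pxy).mean():.3e} "
      f"(expected ~ {1.0**2 / (fs / 2):.3e})")
```

    The residual statistical scatter shrinks with the number of averaged segments, which is exactly the integration-time trade-off the paper's allocation procedure optimizes.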

  10. Precision Departure Release Capability (PDRC) Final Report

    NASA Technical Reports Server (NTRS)

    Engelland, Shawn A.; Capps, Richard; Day, Kevin Brian; Kistler, Matthew Stephen; Gaither, Frank; Juro, Greg

    2013-01-01

    After takeoff, aircraft must merge into en route (Center) airspace traffic flows that may be subject to constraints that create localized demand/capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool, based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents research results from the PDRC research activity. Companion papers present the Concept of Operations and a Technology Description.

  11. Flowers help bees cope with uncertainty: signal detection and the function of floral complexity

    PubMed Central

    Leonard, Anne S.; Dornhaus, Anna; Papaj, Daniel R.

    2011-01-01

    Plants often attract pollinators with floral displays composed of visual, olfactory, tactile and gustatory stimuli. Since pollinators' responses to each of these stimuli are usually studied independently, the question of why plants produce multi-component floral displays remains relatively unexplored. Here we used signal detection theory to test the hypothesis that complex displays reduce a pollinator's uncertainty about the floral signal. Specifically, we asked whether one component of the floral display, scent, improved a bee's certainty about the value of another component, color hue. We first trained two groups of bumble bees (Bombus impatiens Cresson) to discriminate between rewarding and unrewarding artificial flowers of slightly different hues in the presence vs absence of scent. In a test phase, we presented these bees with a gradient of floral hues and assessed their ability to identify the hue rewarded during training. We interpreted the extent to which bees' preferences were biased away from the unrewarding hue (‘peak shift’) as an indicator of uncertainty in color discrimination. Our data show that the presence of an olfactory signal reduces uncertainty regarding color: not only was color learning facilitated on scented flowers but also bees showed a lower amount of peak shift in the presence of scent. We explore potential mechanisms by which scent might reduce uncertainty about color, and discuss the broader significance of our results for our understanding of signal evolution. PMID:21147975

  12. Reducing uncertainty and increasing consistency: technical improvements to forest carbon pool estimation using the national forest inventory of the US

    Treesearch

    C.W. Woodall; G.M. Domke; J. Coulston; M.B. Russell; J.A. Smith; C.H. Perry; S.M. Ogle; S. Healey; A. Gray

    2015-01-01

    The FIA program does not directly measure forest C stocks. Instead, a combination of empirically derived C estimates (e.g., standing live and dead trees) and models (e.g., understory C stocks related to stand age and forest type) are used to estimate forest C stocks. A series of recent refinements in FIA estimation procedures have sought to reduce the uncertainty...

  13. Hyperspectral imaging spectroradiometer improves radiometric accuracy

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Bouchard, Robert; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc

    2013-06-01

    Reliable and accurate infrared characterization is necessary to measure the specific spectral signatures of aircraft and associated infrared countermeasure protections (i.e., flares). Infrared characterization is essential to improve countermeasure efficiency, improve friend-foe identification and reduce the risk of friendly fire. Typical infrared characterization measurement setups include a variety of panchromatic cameras and spectroradiometers. Each instrument brings essential information; cameras measure the spatial distribution of targets and spectroradiometers provide the spectral distribution of the emitted energy. However, the combination of separate instruments brings out possible radiometric errors and uncertainties that can be reduced with hyperspectral imagers, which combine spectral and spatial information in the same data, measuring both distributions of the energy at the same time and thus ensuring the temporal and spatial cohesion of the collected information. This paper presents a quantitative analysis of the main contributors to radiometric uncertainty and shows how a hyperspectral imager can reduce these uncertainties.

  14. Assessing and reducing hydrogeologic model uncertainty

    USDA-ARS?s Scientific Manuscript database

    NRC is sponsoring research that couples model abstraction techniques with model uncertainty assessment methods. Insights and information from this program will be useful in decision making by NRC staff, licensees and stakeholders in their assessment of subsurface radionuclide transport. All analytic...

  15. Bird-landscape relations in the Chihuahuan Desert: Coping with uncertainties about predictive models

    USGS Publications Warehouse

    Gutzwiller, K.J.; Barrow, W.C.

    2001-01-01

    During the springs of 1995-1997, we studied birds and landscapes in the Chihuahuan Desert along part of the Texas-Mexico border. Our objectives were to assess bird-landscape relations and their interannual consistency and to identify ways to cope with associated uncertainties that undermine confidence in using such relations in conservation decision processes. Bird distributions were often significantly associated with landscape features, and many bird-landscape models were valid and useful for predictive purposes. Differences in early spring rainfall appeared to influence bird abundance, but there was no evidence that annual differences in bird abundance affected model consistency. Model consistency for richness (42%) was higher than mean model consistency for 26 focal species (mean 30%, range 0-67%), suggesting that relations involving individual species are, on average, more subject to factors that cause variation than are richness-landscape relations. Consistency of bird-landscape relations may be influenced by such factors as plant succession, exotic species invasion, bird species' tolerances for environmental variation, habitat occupancy patterns, and variation in food density or weather. The low model consistency that we observed for most species indicates the high variation in bird-landscape relations that managers and other decision makers may encounter. The uncertainty of interannual variation in bird-landscape relations can be reduced by using projections of bird distributions from different annual models to determine the likely range of temporal and spatial variation in a species' distribution. Stochastic simulation models can be used to incorporate the uncertainty of random environmental variation into predictions of bird distributions based on bird-landscape relations and to provide probabilistic projections with which managers can weigh the costs and benefits of various decisions. Uncertainty about the true structure of bird-landscape relations (structural uncertainty) can be reduced by ensuring that models meet important statistical assumptions, designing studies with sufficient statistical power, validating the predictive ability of models, and improving model accuracy through continued field sampling and model fitting. Uncertainty associated with sampling variation (partial observability) can be reduced by ensuring that sample sizes are large enough to provide precise estimates of both bird and landscape parameters. By decreasing the uncertainty due to partial observability, managers will improve their ability to reduce structural uncertainty.
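
    As a minimal sketch of the stochastic-simulation idea mentioned above, the following propagates structural uncertainty (parameter standard errors) and random environmental variation through a hypothetical bird-landscape regression to produce a probabilistic projection; all coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fitted model: abundance ~ b0 + b1 * shrub_cover, with
# standard errors from the fit (structural uncertainty) and residual
# SD representing random environmental variation.
b0, b1 = 2.1, 0.045
se_b0, se_b1 = 0.6, 0.012
sigma_resid = 1.3

shrub_cover = 35.0  # percent cover at a candidate site
# Draw coefficients and residual noise (ignoring parameter covariance
# for simplicity) to build a predictive distribution.
draws = (rng.normal(b0, se_b0, 10_000)
         + rng.normal(b1, se_b1, 10_000) * shrub_cover
         + rng.normal(0.0, sigma_resid, 10_000))

lo, med, hi = np.percentile(draws, [5, 50, 95])
print(f"predicted abundance: {med:.1f} (90% interval {lo:.1f}-{hi:.1f})")
```

    The interval, rather than the point estimate, is what lets managers weigh the costs and benefits of decisions as the abstract describes.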

  16. Forest management under climatic and social uncertainty: trade-offs between reducing climate change impacts and fostering adaptive capacity.

    PubMed

    Seidl, Rupert; Lexer, Manfred J

    2013-01-15

    The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Learning under uncertainty in smart home environments.

    PubMed

    Zhang, Shuai; McClean, Sally; Scotney, Bryan; Nugent, Chris

    2008-01-01

    Technologies and services for the home environment can provide levels of independence for elderly people to support 'ageing in place'. Learning inhabitants' patterns of carrying out daily activities is a crucial component of these technological solutions with sensor technologies being at the core of such smart environments. Nevertheless, identifying high-level activities from low-level sensor events can be a challenge, as information may be unreliable resulting in incomplete data. Our work addresses the issues of learning in the presence of incomplete data along with the identification and the prediction of inhabitants and their activities under such uncertainty. We show via the evaluation results that our approach also offers the ability to assess the impact of various sensors in the activity recognition process. The benefit of this work is that future predictions can be utilised in a proposed intervention mechanism in a real smart home environment.

  18. Western oil shale development: a technology assessment. Volume 1. Main report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1981-11-01

    The general goal of this study is to present the prospects of shale oil within the context of (1) environmental constraints, (2) available natural and economic resources, and (3) the characteristics of existing and emerging technology. The objectives are: to review shale oil technologies objectively as a means of supplying domestically produced fuels within environmental, social, economic, and legal/institutional constraints; using available data, analyses, and experienced judgment, to examine the major points of uncertainty regarding potential impacts of oil shale development; to resolve issues where data and analyses are compelling or where conclusions can be reached on judgmental grounds; to specify issues which cannot be resolved on the bases of the data, analyses, and experienced judgment currently available; and when appropriate and feasible, to suggest ways for the removal of existing uncertainties that stand in the way of resolving outstanding issues.

  19. Estimated Bounds and Important Factors for Fuel Use and Consumer Costs of Connected and Automated Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephens, T. S.; Gonder, Jeff; Chen, Yuche

    This report details a study of the potential effects of connected and automated vehicle (CAV) technologies on vehicle miles traveled (VMT), vehicle fuel efficiency, and consumer costs. Related analyses focused on a range of light-duty CAV technologies in conventional powertrain vehicles -- from partial automation to full automation, with and without ridesharing -- compared to today's base-case scenario. Analysis results revealed widely disparate upper- and lower-bound estimates for fuel use and VMT, ranging from a tripling of fuel use to decreasing light-duty fuel use to below 40% of today's level. This wide range reflects uncertainties in the ways that CAV technologies can influence vehicle efficiency and use through changes in vehicle designs, driving habits, and travel behavior. The report further identifies the most significant potential impacting factors, the largest areas of uncertainty, and where further research is particularly needed.

  20. Solar thermal technologies - Potential benefits to U.S. utilities and industry

    NASA Technical Reports Server (NTRS)

    Terasawa, K. L.; Gates, W. R.

    1983-01-01

    Solar energy systems were investigated which complement nuclear and coal technologies as a means of reducing the U.S. dependence on imported petroleum. Solar Thermal Energy Systems (STES) represents an important category of solar energy technologies. STES can be utilized in a broad range of applications servicing a variety of economic sectors, and they can be deployed in both near-term and long-term markets. The net present value of the energy cost savings attributable to electric utility and IPH applications of STES were estimated for a variety of future energy cost scenarios and levels of R&D success. This analysis indicated that the expected net benefits of developing an STES option are significantly greater than the expected costs of completing the required R&D. In addition, transportable fuels and chemical feedstocks represent a substantial future potential market for STES. Due to the basic nature of this R&D activity, however, it is currently impossible to estimate the value of STES in these markets. Despite this fact, private investment in STES R&D is not anticipated due to the high level of uncertainty characterizing the expected payoffs. Previously announced in STAR as N83-10547

  1. Cost and Systems Analysis of Innovative Fuel Resources Concepts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schneider, Erich; Byers, M.

    Economically recovered uranium from seawater can have a transformative effect on the way policy makers view the long-term viability of uranium based fuel cycles. Seawater uranium, even when estimated to cost more than terrestrially mined uranium, is integral in establishing an economic backstop, thus reducing uncertainty in future nuclear power costs. While a passive recovery scheme relying on a field of polymer adsorbents prepared via radiation induced grafting has long been considered the leading technology for full scale deployment, non-trivial cost and logistical barriers persist. Consequently, university partners of the nation-wide consortium for seawater uranium recovery have developed variants of this technology, each aiming to address a substantial weakness. The focus of this NEUP project is the economic impacts of the proposed variant technologies. The team at University of Alabama has pursued an adsorbent synthesis method that replaces the synthetic fiber backbone with a natural waste product. Chitin fibers suitable for ligand grafting have been prepared from shrimp shell waste. These environmental benefits could be realized at a comparable cost to the reference fiber so long as the uptake can be increased or the chemical consumption cost decreased.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pless, Jacquelyn; Arent, Douglas J.; Logan, Jeffrey

    One energy policy objective in the United States is to promote the adoption of technologies that provide consumers with stable, secure, and clean energy. Recent work provides anecdotal evidence of natural gas (NG) and renewable electricity (RE) synergies in the power sector; however, few studies quantify the value of investing in NG and RE systems together as complements. This paper uses discounted cash flow analysis and real options analysis to value hybrid NG-RE systems in distributed applications, focusing on residential and commercial projects assumed to be located in the states of New York and Texas. Technology performance and operational risk profiles are modeled at the hourly level to capture variable RE output, and NG prices are modeled stochastically as geometric Ornstein-Uhlenbeck (OU) stochastic processes to capture NG price uncertainty. The findings consistently suggest that NG-RE hybrid distributed systems are more favorable investments in the applications studied relative to their single-technology alternatives when incentives for renewables are available. In some cases, NG-only systems are the favorable investments. Understanding the value of investing in NG-RE hybrid systems provides insights into one avenue towards reducing greenhouse gas emissions, given the important role of NG and RE in the power sector.
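
    For concreteness, a mean-reverting (Ornstein-Uhlenbeck) log-price model of the kind mentioned above can be simulated with a simple Euler-Maruyama scheme; the parameter values below are hypothetical, not those calibrated in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical geometric OU model for the NG log-price x = ln(price):
#   dx = theta * (mu - x) dt + sigma dW
theta, mu, sigma = 2.0, np.log(4.0), 0.45  # long-run level ~ $4/MMBtu
dt, years, n_paths = 1.0 / 252, 10, 5000

n_steps = int(years / dt)
x = np.full(n_paths, np.log(3.0))  # start at $3/MMBtu
for _ in range(n_steps):           # Euler-Maruyama time stepping
    x += theta * (mu - x) * dt + sigma * np.sqrt(dt) * rng.normal(size=n_paths)

prices = np.exp(x)
print(f"10-yr price: median ${np.median(prices):.2f}, "
      f"90% interval ${np.percentile(prices, 5):.2f}-"
      f"${np.percentile(prices, 95):.2f}")
```

    The mean-reversion term theta*(mu - x) pulls simulated prices back toward the long-run level, which is what distinguishes OU-type commodity price models from a pure random walk in valuation analyses like this one.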

  3. Time lapse imaging: is it time to incorporate this technology into routine clinical practice?

    PubMed

    Bhide, Priya; Maheshwari, Abha; Cutting, Rachel; Seenan, Susan; Patel, Anita; Khan, Khalid; Homburg, Roy

    2017-06-01

    Time-lapse imaging (TLI) systems for embryo incubation, assessment and selection are a novel technology available to in vitro fertilization (IVF) clinics. However, there is uncertainty about their clinical and cost-effectiveness and insufficient good quality evidence to warrant their routine use. Despite this, enthusiastic commercial marketing and slipping clinical equipoise have led to the widespread, hasty introduction of this technology into practice, often at considerable expense to the patient. We have reviewed the published literature and aim to summarize the strengths, weaknesses, opportunities and threats of these systems. These specialized incubators provide undisturbed embryo culture conditions and, by almost continuous monitoring of embryo development, generate morphokinetic parameters to aid embryo selection. They are thus hypothesized to improve outcomes following IVF. Although the literature reports improved reproductive outcomes, these outcomes are largely surrogate and there is a paucity of studies reporting live births. The use of time-lapse systems may reduce early pregnancy loss, increase elective single embryo transfers and limit multiple pregnancies through better embryo selection. However, the quality of the studies, and hence of the evidence so far, is low to moderate. We recommend further research producing robust high-quality evidence for and against the use of these systems.

  4. Robust Operation of Soft Open Points in Active Distribution Networks with High Penetration of Photovoltaic Integration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, Fei; Ji, Haoran; Wang, Chengshan

    Distributed generators (DGs) including photovoltaic panels (PVs) have been integrated dramatically in active distribution networks (ADNs). Due to the strong volatility and uncertainty of PV output, the high penetration of PV generation immensely exacerbates voltage violations in ADNs. However, the emerging flexible interconnection technology based on soft open points (SOPs) provides increased controllability and flexibility to system operation. To fully exploit the regulation ability of SOPs to address the problems caused by PV, this paper proposes a robust optimization method to achieve the robust optimal operation of SOPs in ADNs. A two-stage adjustable robust optimization model is built to tackle the uncertainties of PV outputs, in which robust operation strategies of SOPs are generated to eliminate the voltage violations and reduce the power losses of ADNs. A column-and-constraint generation (C&CG) algorithm is developed to solve the proposed robust optimization model, which is formulated as a second-order cone program (SOCP) to facilitate accuracy and computational efficiency. Case studies on the modified IEEE 33-node system and comparisons with the deterministic optimization approach are conducted to verify the effectiveness and robustness of the proposed method.

  5. Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn

    1993-01-01

    An uncertainty analysis of diffuse-gray enclosure problems is presented. The genesis was a diffuse-gray enclosure problem which proved to be hypersensitive to the specification of view factors. This genesis is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
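
    As a concrete illustration of enforcing closure and reciprocity, the sketch below applies one simple iterative repair to a noisy view-factor matrix; the scheme and the three-surface example are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

def enforce_view_factor_constraints(F, A, iters=50):
    """Alternately impose reciprocity (A_i F_ij = A_j F_ji) and closure
    (rows of F sum to 1) on an estimated view-factor matrix.
    One simple iterative repair; other repair schemes exist."""
    F = F.copy()
    for _ in range(iters):
        AF = A[:, None] * F
        AF = 0.5 * (AF + AF.T)             # reciprocity: symmetrize A_i F_ij
        F = AF / A[:, None]
        F /= F.sum(axis=1, keepdims=True)  # closure: rows sum to 1
    return F

# Hypothetical 3-surface enclosure: areas and a noisy view-factor estimate.
A = np.array([1.0, 2.0, 1.5])
F = np.array([[0.00, 0.62, 0.41],
              [0.28, 0.05, 0.64],
              [0.30, 0.82, 0.00]])
Fc = enforce_view_factor_constraints(F, A)
print("row sums:", Fc.sum(axis=1))
print("max reciprocity error:",
      np.abs(A[:, None] * Fc - (A[:, None] * Fc).T).max())
```

    Feeding the repaired matrix, rather than the raw estimates, into the radiosity solution is the step that greatly reduced the hypersensitivity reported above.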

  6. Experimental Test of Entropic Noise-Disturbance Uncertainty Relations for Spin-1/2 Measurements.

    PubMed

    Sulyok, Georg; Sponar, Stephan; Demirel, Bülent; Buscemi, Francesco; Hall, Michael J W; Ozawa, Masanao; Hasegawa, Yuji

    2015-07-17

    Information-theoretic definitions for noise and disturbance in quantum measurements were given in [Phys. Rev. Lett. 112, 050401 (2014)] and a state-independent noise-disturbance uncertainty relation was obtained. Here, we derive a tight noise-disturbance uncertainty relation for complementary qubit observables and carry out an experimental test. Successive projective measurements on the neutron's spin-1/2 system, together with a correction procedure which reduces the disturbance, are performed. Our experimental results saturate the tight noise-disturbance uncertainty relation for qubits when an optimal correction procedure is applied.

  7. Analogy as a strategy for supporting complex problem solving under uncertainty.

    PubMed

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  8. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on metrological testing of VIs. Key approaches and technologies for metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics with the support of the powerful computing capability of the PC. Another concern is evaluation of software features such as correctness, reliability, stability, security and real-time performance of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. In order to facilitate metrology institutions in performing metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for such an automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed automatic framework.

  9. Advanced Small Modular Reactor Economics Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    This report describes the data collection work performed for an advanced small modular reactor (AdvSMR) economics analysis activity at the Oak Ridge National Laboratory. The methodology development and analytical results are described in separate, stand-alone documents as listed in the references. The economics analysis effort for the AdvSMR program combines the technical and fuel cycle aspects of advanced (non-light water reactor [LWR]) reactors with the market and production aspects of SMRs. This requires the collection, analysis, and synthesis of multiple unrelated and potentially high-uncertainty data sets from a wide range of data sources. Further, the nature of both economic and nuclear technology analysis requires at least a minor attempt at prediction and prognostication, and the far-term horizon for deployment of advanced nuclear systems introduces more uncertainty. Energy market uncertainty, especially the electricity market, is the result of the integration of commodity prices, demand fluctuation, and generation competition, as easily seen in deregulated markets. Depending on current or projected values for any of these factors, the economic attractiveness of any power plant construction project can change yearly or quarterly. For long-lead construction projects such as nuclear power plants, this uncertainty generates an implied and inherent risk for potential nuclear power plant owners and operators. The uncertainty in nuclear reactor and fuel cycle costs is in some respects better understood and quantified than the energy market uncertainty. The LWR-based fuel cycle has a long commercial history to use as its basis for cost estimation, and the current activities in LWR construction provide a reliable baseline for estimates for similar efforts. However, for advanced systems, the estimates and their associated uncertainties are based on forward-looking assumptions for performance after the system has been built and has achieved commercial operation. Advanced fuel materials and fabrication costs have large uncertainties based on complexities of operation, such as contact-handled fuel fabrication versus remote handling, or commodity availability. Thus, this analytical work makes a good faith effort to quantify uncertainties and provide qualifiers, caveats, and explanations for the sources of these uncertainties. The overall result is that this work assembles the necessary information and establishes the foundation for future analyses using more precise data as nuclear technology advances.

  10. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use, and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.
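
    A hedged sketch of propagating parameter uncertainty into BMP performance: uncertain inputs are sampled and pushed through a watershed response to yield a distribution of sediment-load reductions. The response function below is a toy surrogate, not the SWAT equations, and all parameter ranges are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sediment_reduction(cn, k_usle, bmp_eff):
        """Toy surrogate for a SWAT-like response: fraction of sediment load
        removed by a BMP (an illustrative stand-in, not the SWAT equations)."""
        base_load = 0.8*k_usle*(cn/70.0)**2      # arbitrary relative load model
        return bmp_eff*base_load / (1 + bmp_eff*base_load)

    n = 10_000
    cn      = rng.uniform(60, 85, n)     # curve number (uncertain)
    k_usle  = rng.uniform(0.2, 0.45, n)  # soil erodibility (uncertain)
    bmp_eff = rng.uniform(0.3, 0.9, n)   # BMP removal-efficiency parameter

    red = sediment_reduction(cn, k_usle, bmp_eff)
    print(f"median reduction: {np.median(red):.2%}")
    print(f"90% uncertainty band: {np.percentile(red, 5):.2%} "
          f"to {np.percentile(red, 95):.2%}")
    ```

    The same pattern extends to discrete land-use and climate scenarios by re-running the sampling under each scenario and comparing the resulting bands.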

  11. Technology assessment: What should it be?

    NASA Technical Reports Server (NTRS)

    Black, G.

    1975-01-01

    The necessity of uncovering unsuspected relationships in proposed actions is discussed, along with the feasibility of using decision-theoretic models to cope with problems of uncertainty in the future-oriented analyses characteristic of assessments. It is shown that the results of technology assessment must be integrated with other program analyses and supplied in a form that permits integration with other information.

  12. Whither Space Weapons: A Capability in Need of an Advocate

    DTIC Science & Technology

    2005-05-17

    organizations may be particularly resistant. “Disruptive technologies introduce a very different package of attributes from the one mainstream customers...and Clayton M. Christensen, “Disruptive Technologies: Catching the Wave,” Harvard Business Review on Managing Uncertainty (Boston, MA: Harvard...way to deter and win wars in their theater. However, if fundamentally different disruptive technologies such as space weapons are to be introduced

  13. Improvements in world-wide intercomparison of PV module calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salis, E.; Pavanello, D.; Field, M.

    The calibration of the electrical performance of seven photovoltaic (PV) modules was compared between four reference laboratories on three continents. The devices included two samples in standard and two in high-efficiency crystalline silicon technology, two CI(G)S and one CdTe module. The reference value for each PV module parameter was calculated from the average of the results of all four laboratories, weighted by the respective measurement uncertainties. All single results were then analysed with respect to this reference value using the E_n number approach. For the four modules in crystalline silicon technology, the results agreed in general within +/-0.5%, with all values within +/-1% and all E_n numbers well within [-1, 1], indicating further scope for reducing quoted measurement uncertainty. Regarding the three thin-film modules, deviations were on average roughly twice as large, i.e. in general from +/-1% to +/-2%. A number of inconsistent results were observable, although within the 5% that can be statistically expected on the basis of the E_n number approach. Most inconsistencies can be traced to the preconditioning procedure of one participant, although the contribution of other factors cannot be ruled out. After removing these obviously inconsistent results, only two real outliers remained, representing less than 2% of the total number of measurands. The results presented show improved agreement for the calibration of PV modules with respect to previous international exercises. For thin-film PV modules, the preconditioning of the devices prior to calibration measurements is the most critical factor for obtaining consistent results, while the measurement processes seem consistent and repeatable.
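
    A minimal sketch of the comparison analysis: an uncertainty-weighted mean serves as the reference value, and each laboratory's E_n number is computed from its deviation and expanded uncertainty. Treating the laboratories as contributing to the reference (hence the subtraction of the reference uncertainty) is one common convention and an assumption here, as are all the numbers:

    ```python
    import numpy as np

    # Pmax results for one hypothetical module (values are illustrative).
    labs = ["A", "B", "C", "D"]
    x = np.array([100.2, 99.8, 100.5, 99.9])   # measured values
    U = np.array([0.9, 1.1, 1.0, 1.2])         # expanded uncertainties (k=2)

    u = U / 2.0                                # standard uncertainties
    w = 1.0 / u**2
    x_ref = np.sum(w*x) / np.sum(w)            # uncertainty-weighted mean
    u_ref = np.sqrt(1.0 / np.sum(w))           # its standard uncertainty

    # E_n with the reference-correlation correction (labs are in the mean).
    En = (x - x_ref) / np.sqrt(U**2 - (2*u_ref)**2)
    for lab, xi, e in zip(labs, x, En):
        flag = "consistent" if abs(e) <= 1 else "inconsistent"
        print(f"lab {lab}: x = {xi:6.2f}   E_n = {e:+.2f}   ({flag})")
    ```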

  14. Improvements in world-wide intercomparison of PV module calibration

    DOE PAGES

    Salis, E.; Pavanello, D.; Field, M.; ...

    2017-09-14

    The calibration of the electrical performance of seven photovoltaic (PV) modules was compared between four reference laboratories on three continents. The devices included two samples in standard and two in high-efficiency crystalline silicon technology, two CI(G)S and one CdTe module. The reference value for each PV module parameter was calculated from the average of the results of all four laboratories, weighted by the respective measurement uncertainties. All single results were then analysed with respect to this reference value using the E_n number approach. For the four modules in crystalline silicon technology, the results agreed in general within +/-0.5%, with all values within +/-1% and all E_n numbers well within [-1, 1], indicating further scope for reducing quoted measurement uncertainty. Regarding the three thin-film modules, deviations were on average roughly twice as large, i.e. in general from +/-1% to +/-2%. A number of inconsistent results were observable, although within the 5% that can be statistically expected on the basis of the E_n number approach. Most inconsistencies can be traced to the preconditioning procedure of one participant, although the contribution of other factors cannot be ruled out. After removing these obviously inconsistent results, only two real outliers remained, representing less than 2% of the total number of measurands. The results presented show improved agreement for the calibration of PV modules with respect to previous international exercises. For thin-film PV modules, the preconditioning of the devices prior to calibration measurements is the most critical factor for obtaining consistent results, while the measurement processes seem consistent and repeatable.

  15. Utilization of Integrated Assessment Modeling for determining geologic CO2 storage security

    NASA Astrophysics Data System (ADS)

    Pawar, R.

    2017-12-01

    Geologic storage of carbon dioxide (CO2) has been extensively studied as a potential technology to mitigate the atmospheric concentration of CO2. Multiple international research and development efforts, large-scale demonstrations, and commercial projects are helping advance the technology. One of the critical areas of active investigation is prediction of long-term CO2 storage security and risks. A quantitative methodology for predicting a storage site's long-term performance is critical for making the key decisions necessary for successful deployment of commercial-scale projects, where projects will require quantitative assessments of potential long-term liabilities. These predictions are challenging given that they require simulating CO2 and in-situ fluid movements, as well as interactions through the primary storage reservoir, potential leakage pathways (such as wellbores, faults, etc.) and shallow resources such as groundwater aquifers. They need to take into account the inherent variability and uncertainties at geologic sites. This talk will provide an overview of an approach based on integrated assessment modeling (IAM) to predict the long-term performance of a geologic storage site, including the storage reservoir, potential leakage pathways, and shallow groundwater aquifers. The approach utilizes reduced-order models (ROMs) that capture the complex physical and chemical interactions resulting from CO2 movement while remaining computationally extremely efficient. Applicability of the approach will be demonstrated through examples focused on key storage security questions, such as: What is the probability of CO2 leakage from a storage reservoir? How does storage security vary for different geologic environments and operational conditions? How do site parameter variability and uncertainties affect storage security?
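
    A toy sketch of the ROM workflow: a handful of runs of an "expensive" simulator train a cheap polynomial surrogate, which is then fast enough to Monte Carlo over an uncertain site parameter to estimate a leakage probability. The simulator stand-in, parameter distribution, and threshold are all invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def expensive_simulator(log_perm):
        """Stand-in for a full reservoir/wellbore simulation: returns a
        leakage rate as a function of leakage-pathway log-permeability."""
        return np.exp(1.8*log_perm + 0.1*log_perm**2)

    # Step 1: a few "expensive" runs train a quadratic reduced-order model.
    train_x = np.linspace(-2, 2, 9)
    rom = np.poly1d(np.polyfit(train_x, np.log(expensive_simulator(train_x)), 2))

    # Step 2: cheap Monte Carlo on the ROM over the uncertain site parameter.
    log_perm = rng.normal(0.0, 0.8, 200_000)
    leak_rate = np.exp(rom(log_perm))
    p_exceed = np.mean(leak_rate > 5.0)      # illustrative regulatory threshold
    print(f"P(leakage rate > threshold) = {p_exceed:.3f}")
    ```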

  16. Predictions of space radiation fatality risk for exploration missions

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; To, Khiet; Cacao, Eliedonna

    2017-05-01

    In this paper we describe revisions to the NASA Space Cancer Risk (NSCR) model, focusing on updates to the probability distribution functions (PDFs) representing the uncertainties in the radiation quality factor (QF) model parameters and the dose and dose-rate reduction effectiveness factor (DDREF). We integrate recent heavy ion data on liver, colorectal, intestinal, lung, and Harderian gland tumors with other data from fission neutron experiments into the model analysis. In an earlier work we introduced distinct QFs for leukemia and solid cancer risk predictions, and here we consider liver cancer risks separately because of the higher RBEs reported in mouse experiments compared to other tumor types, and the distinct risk factors for liver cancer for astronauts compared to the U.S. population. The revised model is used to make predictions of fatal cancer and circulatory disease risks for 1-year deep space and International Space Station (ISS) missions, and a 940-day Mars mission. We analyzed the contribution of the various model parameter uncertainties to the overall uncertainty, which shows that the uncertainties in relative biological effectiveness (RBE) factors at high LET, due to statistical uncertainties and differences across tissue types and mouse strains, are the dominant uncertainty. NASA's exposure limits are approached or exceeded for each mission scenario considered. Two main conclusions are made: 1) reducing the current estimate of about a 3-fold uncertainty to a 2-fold or lower uncertainty will require much more expansive animal carcinogenesis studies in order to reduce statistical uncertainties and understand tissue, sex, and genetic variations; 2) alternative model assumptions such as non-targeted effects, increased tumor lethality and decreased latency at high LET, and non-cancer mortality risks from circulatory diseases could significantly increase risk estimates to several times higher than the NASA limits.

  17. Managing Lunar and Mars Mission Radiation Risks. Part 1; Cancer Risks, Uncertainties, and Shielding Effectiveness

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei

    2005-01-01

    This document addresses calculations of probability distribution functions (PDFs) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPEs). PDFs are used to test the effectiveness of potential radiation shielding approaches. Monte Carlo techniques are used to propagate uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments. Competing mortality risks and functional correlations in radiation quality factor uncertainties are treated in the calculations. The cancer risk uncertainty is about four-fold for lunar and Mars mission risk projections. For short-stay lunar missions (<180 d), SPEs present the most significant risk, but one effectively mitigated by shielding. For long-duration (>180 d) lunar or Mars missions, GCR risks may exceed radiation risk limits. Shielding materials are only marginally effective in reducing GCR cancer risks because of the penetrating nature of GCR and the secondary radiation produced in tissue by relativistic particles; polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding. Therefore, improving our knowledge of space radiobiology to narrow the uncertainties that lead to wide PDFs is the best approach to ensure radiation protection goals are met for space exploration.
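
    A hedged sketch of the Monte Carlo propagation step described above: independent PDFs for a low-LET risk coefficient, the DDREF, and a quality factor are sampled and combined into a risk distribution whose spread gives the fold-uncertainty. All distributions, parameters, and the mission dose are illustrative placeholders, not NASA's values:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 200_000

    dose_gy   = 0.4                                  # mission absorbed dose (assumed)
    risk_coef = rng.lognormal(np.log(0.05), 0.3, n)  # fatal risk per Sv, low-LET
    ddref     = rng.lognormal(np.log(1.5), 0.25, n)  # dose-rate reduction factor
    qf        = rng.lognormal(np.log(3.0), 0.4, n)   # mission-averaged quality factor

    risk = risk_coef / ddref * qf * dose_gy          # propagated risk samples
    lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
    print(f"median risk {med:.3%}, 95% interval [{lo:.3%}, {hi:.3%}]")
    print(f"fold-uncertainty (95th/5th percentile): "
          f"{np.percentile(risk, 95)/np.percentile(risk, 5):.1f}")
    ```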

  18. Comparison of Calibration Methods for Tristimulus Colorimeters.

    PubMed

    Gardner, James L

    2007-01-01

    Uncertainties in source color measurements with a tristimulus colorimeter are estimated for calibration factors determined either from a known source spectral distribution or from accurate measurements of the spectral responsivities of the colorimeter channels. The application is to the National Institute of Standards and Technology (NIST) colorimeter and an International Commission on Illumination (CIE) Illuminant A calibration. Detector-based calibration factors generally have lower uncertainties than source-based calibration factors. Uncertainties are also estimated for calculations of spectral mismatch factors. Where both the spectral responsivities of the colorimeter channels and the spectral power distributions of the calibration and test sources are known, uncertainties are lowest if the colorimeter calibration factors are recalculated for the test source; this process also avoids correlations between the CIE Source A calibration factors and the spectral mismatch factors.
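
    A small sketch of why recalculating calibration factors for the test source matters: with measured channel responsivities, the factor mapping a channel signal to a tristimulus value can be recomputed for any source, and the ratio between sources is the spectral mismatch factor. The responsivity and the x-bar stand-in below are crude synthetic curves, not CIE or NIST data:

    ```python
    import numpy as np

    wl = np.arange(380, 781, 5)                      # wavelength grid, nm

    def gauss(mu, sig):
        return np.exp(-0.5*((wl - mu)/sig)**2)

    xbar   = 1.06*gauss(599, 38) + 0.36*gauss(446, 19)  # crude CIE x-bar stand-in
    s_chan = 1.00*gauss(610, 45) + 0.30*gauss(450, 22)  # measured channel responsivity

    planck = lambda T: (wl*1e-9)**-5 / (np.exp(1.4388e-2/(wl*1e-9*T)) - 1)
    S_cal, S_test = planck(2856), planck(4000)          # Illuminant-A-like vs test

    def cal_factor(S):
        """Factor mapping the channel signal to tristimulus X for source S."""
        return np.trapz(S*xbar, wl) / np.trapz(S*s_chan, wl)

    k_A, k_test = cal_factor(S_cal), cal_factor(S_test)
    print(f"spectral mismatch factor k_test/k_A = {k_test/k_A:.4f}")
    ```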

  19. Comparison of Calibration Methods for Tristimulus Colorimeters

    PubMed Central

    Gardner, James L.

    2007-01-01

    Uncertainties in source color measurements with a tristimulus colorimeter are estimated for calibration factors determined either from a known source spectral distribution or from accurate measurements of the spectral responsivities of the colorimeter channels. The application is to the National Institute of Standards and Technology (NIST) colorimeter and an International Commission on Illumination (CIE) Illuminant A calibration. Detector-based calibration factors generally have lower uncertainties than source-based calibration factors. Uncertainties are also estimated for calculations of spectral mismatch factors. Where both the spectral responsivities of the colorimeter channels and the spectral power distributions of the calibration and test sources are known, uncertainties are lowest if the colorimeter calibration factors are recalculated for the test source; this process also avoids correlations between the CIE Source A calibration factors and the spectral mismatch factors. PMID:27110460

  20. Cost effectiveness and value of information analyses of islet cell transplantation in the management of 'unstable' type 1 diabetes mellitus.

    PubMed

    Wallner, Klemens; Shapiro, A M James; Senior, Peter A; McCabe, Christopher

    2016-04-09

    Islet cell transplantation is a method to stabilize type 1 diabetes patients with hypoglycemia unawareness and unstable blood glucose levels by reducing insulin dependency and protecting against severe hypoglycemia through restoring endogenous insulin secretion. This study analyses the current cost-effectiveness of this technology and estimates the value of further research to reduce uncertainty around cost-effectiveness. We performed a cost-utility analysis using a Markov cohort model with a mean patient age of 49 to simulate costs and health outcomes over a life-time horizon. Our analysis used intensive insulin therapy (IIT) as the comparator and took the provincial healthcare provider perspective. Cost and effectiveness data for up to four transplantations per patient came from the University of Alberta hospital. Costs are expressed in 2012 Canadian dollars and effectiveness in quality-adjusted life-years (QALYs) and life years. To characterize the uncertainty around expected outcomes, we carried out a probabilistic sensitivity analysis within the Bayesian decision-analytic framework. We performed a value-of-information analysis to identify priority areas for future research under various scenarios. We applied a structural sensitivity analysis to assess the dependence of outcomes on model characteristics. Compared to IIT, islet cell transplantation using non-generic (generic) immunosuppression had additional costs of $150,006 ($112,023) per additional QALY, an average gain of 3.3 life years, and a probability of being cost-effective of 0.5% (28.3%) at a willingness-to-pay threshold of $100,000 per QALY. At this threshold the non-generic technology has an expected value of perfect information (EVPI) of $260,744 for Alberta. This increases substantially in cost-reduction scenarios. The research areas with the highest partial EVPI are costs, followed by natural history, and effectiveness and safety. Current transplantation technology provides substantial improvements in health outcomes over conventional therapy for highly selected patients with 'unstable' type 1 diabetes. However, it is much more costly and so is not cost-effective. The value of further research into cost-effectiveness depends on treatment costs. Further, we suggest that the value of information should not be derived from current data alone, given that these data will most likely change in the future.
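
    A minimal sketch of how the probabilistic sensitivity analysis and the value-of-information calculation fit together: EVPI is the gap between deciding with perfect information in each simulation and committing once to the strategy that is best on average. The distributions and the per-patient framing are illustrative assumptions, not the paper's Markov model:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    wtp = 100_000                                 # willingness to pay, $/QALY

    # PSA draws (illustrative distributions, not the paper's model).
    d_cost = rng.normal(300_000, 60_000, n)       # incremental cost of transplant
    d_qaly = rng.normal(2.0, 0.8, n)              # incremental QALYs vs. IIT

    nb_iit = np.zeros(n)                          # comparator net benefit (baseline)
    nb_tx  = wtp*d_qaly - d_cost                  # incremental net benefit, transplant
    nb = np.column_stack([nb_iit, nb_tx])

    best_on_average = nb.mean(axis=0).max()       # choose once, under uncertainty
    perfect_info    = nb.max(axis=1).mean()       # choose per realisation
    evpi = perfect_info - best_on_average
    print(f"P(transplant cost-effective) = {(nb_tx > 0).mean():.1%}")
    print(f"per-patient EVPI = ${evpi:,.0f}")
    ```

    Population EVPI would scale this per-patient value by the discounted number of patients affected by the decision.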

  1. Reducing risk and increasing confidence of decision making at a lower cost: In-situ pXRF assessment of metal-contaminated sites.

    PubMed

    Rouillon, Marek; Taylor, Mark P; Dong, Chenyin

    2017-10-01

    This study evaluates the in-situ use of field-portable X-ray fluorescence (pXRF) for metal-contaminated site assessments, and assesses the advantages of increased sampling to reduce risk and increase confidence of decision making at a lower cost. Five metal-contaminated sites were assessed using both in-situ pXRF and ex-situ inductively coupled plasma mass spectrometry (ICP-MS) analyses at various sampling resolutions. Twenty-second in-situ pXRF measurements of Mn, Zn, and Pb were corrected using a subset of parallel ICP-MS measurements taken at each site. Field and analytical duplicates revealed sampling as the major contributor (>95% of variation) to measurement uncertainties. This study shows that increased sampling led to several benefits, including more representative site characterisation, higher soil-metal mapping resolution, reduced uncertainty around the site mean, and reduced sampling uncertainty. Real-time pXRF data enabled efficient, on-site decision making for further judgemental sampling, without the need to return to the site. Additionally, in-situ pXRF was more cost-effective than the current approach of ex-situ sampling and ICP-MS analysis, even with higher sampling at each site. Lastly, a probabilistic site assessment approach was applied to demonstrate the advantages of integrating estimated measurement uncertainties into site reporting. Copyright © 2017 Elsevier Ltd. All rights reserved.
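
    A minimal sketch of the correction step: a subset of co-located ICP-MS results calibrates the pXRF readings via ordinary least squares, and the fitted line is applied to every field measurement. The data are synthetic and the bias/noise model is an assumption:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic site: true soil Pb (mg/kg); pXRF reads with bias plus noise.
    true_pb = rng.lognormal(np.log(300), 0.8, 60)
    pxrf    = 0.85*true_pb + 15 + rng.normal(0, 0.10*true_pb)

    # A subset (20 samples) also gets ICP-MS analysis ("ground truth").
    idx = rng.choice(true_pb.size, 20, replace=False)
    icpms = true_pb[idx]*rng.normal(1.0, 0.03, idx.size)

    # OLS correction from the pXRF scale to the ICP-MS scale.
    slope, intercept = np.polyfit(pxrf[idx], icpms, 1)
    corrected = slope*pxrf + intercept

    rmse_raw = np.sqrt(np.mean((pxrf - true_pb)**2))
    rmse_cor = np.sqrt(np.mean((corrected - true_pb)**2))
    print(f"RMSE raw pXRF: {rmse_raw:.0f} mg/kg, corrected: {rmse_cor:.0f} mg/kg")
    ```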

  2. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Keith

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
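
    A small sketch of a GUM-style uncertainty budget: components are combined in quadrature (root sum of squares) after weighting by sensitivity coefficients, then expanded with a coverage factor k = 2. The component list and values are illustrative, not NREL's actual budget:

    ```python
    import math

    # (component, standard uncertainty in %, sensitivity coefficient)
    budget = [
        ("reference cell calibration",      0.25, 1.0),
        ("spectral mismatch",               0.30, 1.0),
        ("irradiance non-uniformity",       0.20, 1.0),
        ("temperature correction",          0.10, 1.0),
        ("data acquisition",                0.05, 1.0),
    ]

    u_c = math.sqrt(sum((u*c)**2 for _, u, c in budget))
    for name, u, c in budget:
        print(f"{name:30s} u = {u:.2f} %")
    print(f"combined standard uncertainty u_c = {u_c:.2f} %")
    print(f"expanded uncertainty U (k=2)      = {2*u_c:.2f} %")
    ```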

  3. Innovative Clean Coal Technology (ICCT). Demonstration of Selective Catalytic Reduction (SCR) technology for the control of nitrogen oxide (NO{sub x}) emissions from high-sulfur coal-fired boilers: Volume 1. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-10-01

    The objective of this project is to demonstrate and evaluate commercially available Selective Catalytic Reduction (SCR) catalysts from U.S., Japanese, and European catalyst suppliers on a high-sulfur U.S. coal-fired boiler. SCR is a post-combustion nitrogen oxide (NO{sub x}) control technology that involves injecting ammonia into the flue gas generated from coal combustion in an electric utility boiler. The flue gas containing ammonia is then passed through a reactor that contains a specialized catalyst. In the presence of the catalyst, the ammonia reacts with NO{sub x} to convert it to nitrogen and water vapor. Although SCR is widely practiced in Japan and Europe on gas-, oil-, and low-sulfur coal-fired boilers, there are several technical uncertainties associated with applying SCR to U.S. coals. These uncertainties include: 1) potential catalyst deactivation due to poisoning by trace metal species present in U.S. coals that are not present in other fuels; 2) performance of the technology and effects on the balance-of-plant equipment in the presence of high amounts of SO{sub 2} and SO{sub 3}; and 3) performance of a wide variety of SCR catalyst compositions, geometries, and methods of manufacture under typical high-sulfur coal-fired utility operating conditions. These uncertainties were explored by operating nine small-scale SCR reactors and simultaneously exposing different SCR catalysts to flue gas derived from the combustion of high-sulfur U.S. coal. In addition, the test facility operating experience provided a basis for an economic study investigating the implementation of SCR technology.

  4. Applications of Advanced Technology for Monitoring Forest Carbon to Support Climate Change Mitigation

    NASA Astrophysics Data System (ADS)

    Birdsey, R.; Hurtt, G. C.; Dubayah, R.; Hagen, S. C.; Vargas, R.; Nehrkorn, T.; Domke, G. M.; Houghton, R. A.

    2015-12-01

    Measurement, Reporting, and Verification (MRV) is a broad concept guiding the application of monitoring technology to the needs of countries or entities for reporting and verifying reductions in greenhouse gas emissions or increases in greenhouse gas sinks. Credibility, cost-effectiveness, and compatibility are important features of global MRV efforts that can support implementation of climate change mitigation programs such as Reducing Emissions from Deforestation and Forest Degradation and Sustainable Forest Management (REDD+). Applications of MRV technology may be tailored to individual country circumstances following guidance provided by the Intergovernmental Panel on Climate Change; hence, there is no single approach that is uniquely viable but rather a range of ways to integrate new MRV methods. MRV technology is advancing rapidly with new remote sensing and advanced measurement of atmospheric CO2, and in-situ terrestrial and ocean measurements, coupled with improvements in data analysis, modeling, and uncertainty assessment. Here we briefly summarize some of the most application-ready MRV technologies being developed under NASA's Carbon Monitoring System (CMS) program, and illustrate how these technologies may be applied for monitoring forests using several case studies that span a range of scales, country circumstances, and stakeholder reporting requirements. We also include remarks about the potential role of advanced monitoring technology in the context of the global climate accord expected to result from the 21st session of the Conference of the Parties to the United Nations Framework Convention on Climate Change, taking place in December 2015 in Paris, France.

  5. A project-based system for including farmers in the EU ETS.

    PubMed

    Brandt, Urs Steiner; Svendsen, Gert Tinggaard

    2011-04-01

    Farmers in the EU do not trade greenhouse gases under the Kyoto agreement. This is an empirical puzzle because agriculture is a significant contributor of greenhouse gases (GHG) in the EU and may harvest private net gains from trade. Furthermore, the US has strongly advocated land-use practices as 'the missing link' in past climate negotiations. We argue that farmers have relatively low marginal reduction costs and that the consequences for permit prices and technology in the EU Emission Trading System (ETS) are positive overall. Thus, we propose a project-based system for including farming practices in the EU ETS that reduces the uncertainty from measuring emission reductions in this sector. The system encourages GHG reduction either by introducing a new and less polluting practice or by reducing the polluting activity. When doing so, farmers will receive GHG permits corresponding to the amount of reduction, which can be stored for later use or sold in the EU ETS. Copyright © 2010 Elsevier Ltd. All rights reserved.

  6. Drug-eluting versus bare-metal coronary stents: where are we now?

    PubMed

    Amoroso, Nicholas S; Bangalore, Sripal

    2012-11-01

    Drug-eluting stents have dramatically reduced the risk of restenosis, but concerns about an increased risk of stent thrombosis have created uncertainty about their use. Recent studies have continued to show improved procedural and clinical outcomes with drug-eluting stents, both in the setting of acute coronary syndromes and in stable coronary artery disease. Newer-generation drug-eluting stents (especially everolimus-eluting stents) have been shown to be not only efficacious but also safe, with a reduced risk of stent thrombosis when compared with bare-metal stents, potentially changing the benchmark for stent safety from bare-metal stents to everolimus-eluting stents. While much progress is being made in the development of bioabsorbable polymer stents, nonpolymer stents, and bioabsorbable stent technology, it remains to be seen whether these stents will have superior safety and efficacy outcomes compared with the already much-improved rates of revascularization and stent thrombosis seen with newer-generation stents (everolimus-eluting stents and resolute zotarolimus-eluting stents).

  7. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, G.E.; Dawson, J.W.

    1983-10-04

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2, varying between two input terminals and representative of a series of single events where t_1 <= t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal-selecting means and one or more time delays to raise the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800. 6 figs.

  8. Reduction in maximum time uncertainty of paired time signals

    DOEpatents

    Theodosiou, George E.; Dawson, John W.

    1983-01-01

    Reduction in the maximum time uncertainty (t_max - t_min) of a series of paired time signals t_1 and t_2, varying between two input terminals and representative of a series of single events where t_1 <= t_2 and t_1 + t_2 equals a constant, is carried out with a circuit utilizing a combination of OR and AND gates as signal-selecting means and one or more time delays to raise the minimum value (t_min) of the first signal t_1 closer to t_max and thereby reduce the difference. The circuit may utilize a plurality of stages to reduce the uncertainty by factors of 20-800.

  9. Back-stepping active disturbance rejection control design for integrated missile guidance and control system via reduced-order ESO.

    PubMed

    Xingling, Shao; Honglun, Wang

    2015-07-01

    This paper proposes a novel composite integrated guidance and control (IGC) law for a missile intercepting an unknown maneuvering target under multiple uncertainties and control constraints. First, by using the back-stepping technique, the proposed IGC law design is separated into a guidance loop and a control loop. The unknown target maneuvers and variations of the aerodynamic parameters in the guidance and control loops are viewed as uncertainties, which are estimated and compensated by a model-assisted reduced-order extended state observer (ESO). Second, based on the principle of active disturbance rejection control (ADRC), an enhanced feedback linearization (FL) based control law is implemented for the IGC model using the estimates generated by the reduced-order ESO. In addition, performance analysis and comparisons between the ESO and the reduced-order ESO are examined. A nonlinear tracking differentiator is employed to construct the derivative of the virtual control command in the control loop. Third, the closed-loop stability of the considered system is established. Finally, the effectiveness of the proposed IGC law in enhancing interception performance, such as a smooth interception course, improved robustness against multiple uncertainties, and reduced control consumption during the initial phase, is demonstrated through simulations. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
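
    A toy sketch of the reduced-order ESO idea on a first-order plant x_dot = d(t) + b*u: a single observer state reconstructs the lumped disturbance d, which an ADRC/FL-style law then cancels. The plant, gains, and disturbance are illustrative assumptions; the paper's missile IGC model is far richer:

    ```python
    import numpy as np

    dt, T = 1e-3, 5.0
    omega_o, b = 20.0, 1.0            # observer bandwidth, input gain (assumed)
    kp = 4.0                          # proportional gain for the outer loop

    x, eta = 0.0, 0.0                 # plant state, reduced-order ESO state
    log = []
    for k in range(int(T/dt)):
        t = k*dt
        d = np.sin(2*t) + 0.5         # unknown lumped disturbance (for simulation)
        d_hat = eta + omega_o*x       # reduced-order ESO disturbance estimate
        u = (-kp*x - d_hat)/b         # cancel the estimate, stabilize x
        # plant: x_dot = d + b*u ; observer: eta_dot = -omega_o*(eta + omega_o*x + b*u)
        x   += dt*(d + b*u)
        eta += dt*(-omega_o*(eta + omega_o*x + b*u))
        log.append((t, d, d_hat, x))

    t, d, d_hat, xs = np.array(log).T
    print(f"final |d - d_hat| = {abs(d[-1] - d_hat[-1]):.4f},  |x| = {abs(xs[-1]):.4f}")
    ```

    The estimation error decays with the observer pole at -omega_o, so the residual tracking error scales roughly with the disturbance rate divided by omega_o.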

  10. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    PubMed

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relations among uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations, and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty, with ambiguity scoring highest among the four uncertainty subdomains. Scores for uncertainty danger and opportunity appraisals were below the midpoint. The participants were found to perform a high level of self-management, including diet control, management of the arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  11. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.
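
    A compact sketch of the paper's three-way split, on a toy insulation example: parameter uncertainty is sampled by Monte Carlo, while scenario and model uncertainty are handled by re-evaluating the same draws under discrete alternative choices. All numbers are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 50_000

    # Parameter uncertainty: Monte Carlo on inputs (illustrative lognormals).
    insulation_kg = rng.lognormal(np.log(120), 0.10, n)   # material mass
    gwp_per_kg    = rng.lognormal(np.log(2.5), 0.20, n)   # kg CO2-eq per kg material
    heat_saved_gj = rng.lognormal(np.log(180), 0.15, n)   # lifetime heating saved

    # Scenario uncertainty: discrete choice of heating emission factor (kg CO2/GJ).
    ef_scenarios = {"gas boiler": 63.0, "heat pump, cleaner grid": 25.0}
    # Model uncertainty: two characterisation-model variants (toy multiplier).
    model_factor = {"model A": 1.00, "model B": 1.08}

    for sc, ef in ef_scenarios.items():
        for m, f in model_factor.items():
            net = f*(insulation_kg*gwp_per_kg - heat_saved_gj*ef)
            print(f"{sc:24s}/{m}: median net GWP {np.median(net):9.0f} kg CO2-eq, "
                  f"P(net benefit) = {(net < 0).mean():.2f}")
    ```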

  12. Overview of the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.

  13. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; temporal investment recommendations; distinctions between enhancing and enabling capabilities; analysis of partial funding for enhancing capabilities; and sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
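
    A minimal sketch of expected-utility portfolio selection under a budget limit, the core selection problem a tool like START addresses; exhaustive search is used here for clarity, and the technology data are invented:

    ```python
    from itertools import combinations

    # (technology, cost $M, success probability, mission value if it succeeds)
    techs = [("A", 30, 0.8, 100), ("B", 45, 0.5, 220),
             ("C", 25, 0.9,  70), ("D", 60, 0.6, 260)]
    budget = 100

    best = (0.0, ())
    for r in range(len(techs) + 1):
        for combo in combinations(techs, r):
            cost = sum(c for _, c, _, _ in combo)
            if cost <= budget:
                eu = sum(p*v for _, _, p, v in combo)   # expected utility
                best = max(best, (eu, tuple(n for n, *_ in combo)))

    print(f"optimal portfolio: {best[1]}, expected utility = {best[0]:.0f}")
    ```

    Real tools replace the exhaustive search with integer programming or heuristics and add scheduling, partial funding, and enabling-versus-enhancing logic on top of this basic trade-off.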

  14. Nanotechnology risk perceptions and communication: emerging technologies, emerging challenges.

    PubMed

    Pidgeon, Nick; Harthorn, Barbara; Satterfield, Terre

    2011-11-01

    Nanotechnology involves the fabrication, manipulation, and control of materials at the atomic level and may also bring novel uncertainties and risks. Potential parallels with other controversial technologies mean there is a need to develop a comprehensive understanding of processes of public perception of nanotechnology uncertainties, risks, and benefits, alongside related communication issues. Study of perceptions, at so early a stage in the development trajectory of a technology, is probably unique in the risk perception and communication field. As such it also brings new methodological and conceptual challenges. These include: dealing with the inherent diversity of the nanotechnology field itself; the unfamiliar and intangible nature of the concept, with few analogies to anchor mental models or risk perceptions; and the ethical and value questions underlying many nanotechnology debates. Utilizing the lens of social amplification of risk, and drawing upon the various contributions to this special issue of Risk Analysis on Nanotechnology Risk Perceptions and Communication, nanotechnology may at present be an attenuated hazard. The generic idea of "upstream public engagement" for emerging technologies such as nanotechnology is also discussed, alongside its importance for future work with emerging technologies in the risk communication field. © 2011 Society for Risk Analysis.

  15. Effects of Phasor Measurement Uncertainty on Power Line Outage Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chen; Wang, Jianhui; Zhu, Hao

    2014-12-01

    Phasor measurement unit (PMU) technology provides an effective tool to enhance the wide-area monitoring systems (WAMSs) in power grids. Although extensive studies have been conducted to develop several PMU applications in power systems (e.g., state estimation, oscillation detection and control, voltage stability analysis, and line outage detection), the uncertainty aspects of PMUs have not been adequately investigated. This paper focuses on quantifying the impact of PMU uncertainty on power line outage detection and identification, in which a limited number of PMUs installed at a subset of buses are utilized to detect and identify the line outage events. Specifically, the line outage detection problem is formulated as a multi-hypothesis test, and a general Bayesian criterion is used for the detection procedure, in which the PMU uncertainty is analytically characterized. We further apply the minimum detection error criterion for the multi-hypothesis test and derive the expected detection error probability in terms of PMU uncertainty. The framework proposed provides fundamental guidance for quantifying the effects of PMU uncertainty on power line outage detection. Case studies are provided to validate our analysis and show how PMU uncertainty influences power line outage detection.
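
    A hedged sketch of a Bayesian multi-hypothesis outage test: each hypothesis predicts a vector of phase-angle changes at the PMU buses, measurement uncertainty is modeled as Gaussian, and the minimum-error rule picks the maximum-posterior hypothesis. The signatures, priors, and noise level are illustrative, not derived from a real network:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Predicted phase-angle changes (rad) at 3 PMU buses for each hypothesis
    # (H0 = no outage; H1..H3 = outage of line 1..3). Values are illustrative.
    signatures = np.array([[ 0.00,  0.00,  0.00],
                           [ 0.03, -0.01,  0.02],
                           [-0.02,  0.04,  0.01],
                           [ 0.01,  0.02, -0.03]])
    sigma = 0.01                      # PMU angle uncertainty (std dev, rad)
    prior = np.array([0.7, 0.1, 0.1, 0.1])

    true_h = 2
    z = signatures[true_h] + rng.normal(0, sigma, 3)   # noisy PMU measurement

    # Gaussian log-likelihood per hypothesis; posterior by Bayes' rule.
    loglik = -np.sum((z - signatures)**2, axis=1) / (2*sigma**2)
    post = prior*np.exp(loglik - loglik.max())
    post /= post.sum()
    print("posterior:", np.round(post, 3), "-> decide H%d" % post.argmax())
    ```

    Raising sigma toward the scale of the signature differences drives the posteriors together, which is exactly how PMU uncertainty inflates the expected detection error.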

  16. Structured Uncertainty Bound Determination From Data for Control and Performance Validation

    NASA Technical Reports Server (NTRS)

    Lim, Kyong B.

    2003-01-01

    This report attempts to document the broad scope of issues that must be satisfactorily resolved before one can expect to methodically obtain, with reasonable confidence, near-optimal robust closed-loop performance in physical applications. These include elements of signal processing, noise identification, system identification, model validation, and uncertainty modeling. Based on a recently developed methodology involving a parameterization of all model-validating uncertainty sets for a given linear fractional transformation (LFT) structure and noise allowance, a new software tool, the Uncertainty Bound Identification (UBID) toolbox, which conveniently executes model validation tests and determines uncertainty bounds from data, has been designed and is currently available. This toolbox also serves to benchmark the current state of the art in uncertainty bound determination and, in turn, facilitates benchmarking of robust control technology. To help clarify the methodology and the use of the new software, two tutorial examples are provided. The first involves the uncertainty characterization of flexible structure dynamics, and the second example involves a closed-loop performance validation of a ducted fan based on an uncertainty bound from data. These examples, along with other simulation and experimental results, also help describe the many factors and assumptions that determine the degree of success in applying robust control theory to practical problems.

  17. The Interplay between Uncertainty Monitoring and Working Memory: Can Metacognition Become Automatic?

    PubMed Central

    Coutinho, Mariana V. C.; Redford, Joshua S.; Church, Barbara A.; Zakrzewski, Alexandria C.; Couchman, Justin J.; Smith, J. David

    2016-01-01

    The uncertainty response has grounded the study of metacognition in nonhuman animals. Recent research has explored the processes supporting uncertainty monitoring in monkeys. It revealed that uncertainty responding in contrast to perceptual responding depends on significant working memory resources. The aim of the present study was to expand this research by examining whether uncertainty monitoring is also working memory demanding in humans. To explore this issue, human participants were tested with or without a cognitive load on a psychophysical discrimination task including either an uncertainty response (allowing the decline of difficult trials) or a middle-perceptual response (labeling the same intermediate trial levels). The results demonstrated that cognitive load reduced uncertainty responding, but increased middle responding. However, this dissociation between uncertainty and middle responding was only observed when participants either lacked training or had very little training with the uncertainty response. If more training was provided, the effect of load was small. These results suggest that uncertainty responding is resource demanding, but with sufficient training, human participants can respond to uncertainty either by using minimal working memory resources or effectively sharing resources. These results are discussed in relation to the literature on animal and human metacognition. PMID:25971878

  18. A risk based approach for SSTO/TSTO comparisons

    NASA Astrophysics Data System (ADS)

    Greenberg, Joel S.

    1996-03-01

    An approach has been developed for performing early comparisons of transportation architectures explicitly taking into account quantitative measures of uncertainty and resulting risk. Risk considerations are necessary since the transportation systems are likely to have significantly different levels of risk, both because of differing degrees of freedom in achieving desired performance levels and their different states of development and utilization. The approach considers the uncertainty of achievement of technology goals, effect that the achieved technology level will have on transportation system performance and the relationship between system performance/capability and the ability to accommodate variations in payload mass. The consequences of system performance are developed in terms of nonrecurring, recurring, and the present value of transportation system life cycle costs.

  19. Crossing the Technology Adoption Chasm in the Presence of Network Externalities: Implications for DoD

    DTIC Science & Technology

    2007-06-01

    Innovations. New York: The Free Press. Rohlfs, J. (2001). Bandwagon Effects in High-Technology Industries. Massachusetts Institute of Technology...adoption. It focuses on cost and benefit uncertainty as well as network effects applied to end-users and their organizations. Specifically, it explores Department of Defense (DoD) acquisition programs

  20. A Model for Determining Optimal Governance Structure in DoD Acquisition Projects in a Performance-Based Environment

    DTIC Science & Technology

    2011-03-09

    task stability, technology application certainty, risk, and transaction-specific investments impact the selection of the optimal mode of governance. Our model views...U.S. Defense Industry. The 1990s were a perfect storm of technological change, consolidation, budget downturns, environmental uncertainty, and the
