Science.gov

Sample records for address remaining uncertainties

  1. Intolerance of uncertainty in emotional disorders: What uncertainties remain?

    PubMed

    Shihata, Sarah; McEvoy, Peter M; Mullan, Barbara Ann; Carleton, R Nicholas

    2016-06-01

    The current paper presents a future research agenda for intolerance of uncertainty (IU), which is a transdiagnostic risk and maintaining factor for emotional disorders. In light of the accumulating interest and promising research on IU, it is timely to emphasize the theoretical and therapeutic significance of IU, as well as to highlight what remains unknown about IU across areas such as development, assessment, behavior, threat and risk, and relationships to cognitive vulnerability factors and emotional disorders. The present paper was designed to provide a synthesis of what is known and unknown about IU, and, in doing so, proposes broad and novel directions for future research to address the remaining uncertainties in the literature.

  2. Research strategies for addressing uncertainties

    USGS Publications Warehouse

    Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah

    2013-01-01

    Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.

  3. Remaining Useful Life Estimation in Prognosis: An Uncertainty Propagation Problem

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    The estimation of remaining useful life is significant in the context of prognostics and health monitoring, and the prediction of remaining useful life is essential for online operations and decision-making. However, it is challenging to accurately predict the remaining useful life in practical aerospace applications due to the presence of various uncertainties that affect prognostic calculations, and in turn, render the remaining useful life prediction uncertain. It is challenging to identify and characterize the various sources of uncertainty in prognosis, understand how each of these sources of uncertainty affects the uncertainty in the remaining useful life prediction, and thereby compute the overall uncertainty in the remaining useful life prediction. In order to achieve these goals, this paper proposes that the task of estimating the remaining useful life must be approached as an uncertainty propagation problem. In this context, uncertainty propagation methods which are available in the literature are reviewed, and their applicability to prognostics and health monitoring is discussed.
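
    The propagation problem the abstract describes can be sketched with plain sampling. The linear degradation model and every parameter value below are illustrative assumptions for the sketch, not taken from the paper:

```python
import random

# Hypothetical linear degradation model: damage(t) = d0 + rate * t,
# with failure when damage reaches a fixed threshold. Both the current
# damage estimate (d0) and the degradation rate are uncertain, so the
# remaining useful life (RUL) is uncertain too.
THRESHOLD = 1.0

def sample_rul(rng, d0_mean=0.4, d0_sd=0.05, rate_mean=0.01, rate_sd=0.002):
    """Draw one RUL realization by sampling the uncertain inputs."""
    d0 = rng.gauss(d0_mean, d0_sd)
    rate = max(rng.gauss(rate_mean, rate_sd), 1e-6)  # keep the rate positive
    return max((THRESHOLD - d0) / rate, 0.0)

rng = random.Random(42)
ruls = sorted(sample_rul(rng) for _ in range(10_000))

median_rul = ruls[len(ruls) // 2]
p5 = ruls[int(0.05 * len(ruls))]  # conservative 5th-percentile bound
print(f"median RUL ~ {median_rul:.0f} cycles, 5th percentile ~ {p5:.0f}")
```

    Propagating both input uncertainties through the model yields a full RUL distribution, from which percentile bounds can be read off for decision-making.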

  4. Programmatic methods for addressing contaminated volume uncertainties.

    SciTech Connect

    DURHAM, L.A.; JOHNSON, R.L.; RIEMAN, C.R.; SPECTOR, H.L.; Environmental Science Division; U.S. ARMY CORPS OF ENGINEERS BUFFALO DISTRICT

    2007-01-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in pre-design data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in pre-design characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland 1, Ashland 2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate pre-design contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District.

  5. Programmatic methods for addressing contaminated volume uncertainties

    SciTech Connect

    Rieman, C.R.; Spector, H.L.; Durham, L.A.; Johnson, R.L.

    2007-07-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The U.S. Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in pre-design data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in pre-design characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland 1, Ashland 2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate pre-design contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District. (authors)

  6. Addressing biological uncertainties in engineering gene circuits.

    PubMed

    Zhang, Carolyn; Tsoi, Ryan; You, Lingchong

    2016-04-18

    Synthetic biology has grown tremendously over the past fifteen years. It represents a new strategy to develop biological understanding and holds great promise for diverse practical applications. Engineering of a gene circuit typically involves computational design of the circuit, selection of circuit components, and testing and optimization of circuit function. A fundamental challenge in this process is the predictable control of circuit function due to multiple layers of biological uncertainties. These uncertainties can arise from different sources. We categorize them into incomplete quantification of parts, interactions between heterologous components and the host, and stochastic dynamics of chemical reactions, and we outline potential design strategies to minimize or exploit them.
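
    The third uncertainty category, stochastic dynamics of chemical reactions, is conventionally simulated with Gillespie's stochastic simulation algorithm. A minimal birth-death sketch of gene expression follows; the model and rate values are illustrative assumptions, not drawn from the paper:

```python
import random

# Minimal Gillespie simulation of a birth-death gene expression model:
# protein produced at constant rate k, degraded at rate g * n.
def gillespie(k=10.0, g=0.1, t_end=200.0, seed=1):
    rng = random.Random(seed)
    t, n = 0.0, 0
    while t < t_end:
        a_prod, a_deg = k, g * n          # reaction propensities
        a_total = a_prod + a_deg
        t += rng.expovariate(a_total)     # waiting time to next reaction
        if rng.random() < a_prod / a_total:
            n += 1                        # production event
        else:
            n -= 1                        # degradation event
    return n

final_counts = [gillespie(seed=s) for s in range(200)]
mean_n = sum(final_counts) / len(final_counts)
print(f"mean protein copy number ~ {mean_n:.1f} (deterministic value k/g = 100)")
```

    Individual trajectories fluctuate around the deterministic steady state k/g, which is exactly the intrinsic-noise uncertainty the abstract refers to.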

  7. Addressing uncertainty in adaptation planning for agriculture

    PubMed Central

    Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.

    2013-01-01

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

  8. Addressing uncertainty in adaptation planning for agriculture.

    PubMed

    Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R

    2013-05-21

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.

  9. Remaining uncertainties in the kinetic mechanism of hydrogen combustion

    SciTech Connect

    Konnov, Alexander A.

    2008-03-15

    An analysis of the performance of an updated hydrogen combustion mechanism is presented. Particular attention was paid to different channels of reaction between H atoms and HO2 radicals, to pressure dependence of the recombination of HO2 radicals, and to the anomalous rate constant of reaction between OH and HO2 radicals. The contemporary choice of the reaction rate constants is presented with the emphasis on their uncertainties. Then the predictions of ignition, oxidation, flame burning velocities, and flame structure of hydrogen-oxygen-inert mixtures are shown. The modeling range covers ignition experiments from 950 to 2700 K and from subatmospheric pressures up to 87 atm; hydrogen oxidation in a flow reactor at temperatures around 900 K from 0.3 up to 15.7 atm; flame burning velocities in hydrogen-oxygen-inert mixtures from 0.35 up to 4 atm; and hydrogen flame structure at 1 and 10 atm. Comparison of the modeling and experiments is discussed in terms of the range of applicability of the present detailed mechanism. The necessity for analysis of the mechanism to have an exhaustive list of reactions is emphasized. (author)
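
    Rate-constant uncertainties of the kind surveyed here are usually expressed as a multiplicative uncertainty factor F around a modified Arrhenius fit. The sketch below uses invented parameter values, not Konnov's recommendations:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def arrhenius(T, A, n, Ea):
    """Modified Arrhenius rate constant k(T) = A * T**n * exp(-Ea / (R*T))."""
    return A * T**n * math.exp(-Ea / (R * T))

# Illustrative parameters only (not values from the mechanism).
A, n_exp, Ea = 1.0e8, 0.5, 60_000.0
F = 2.0  # uncertainty factor: k is judged to lie within [k/F, k*F]

T = 1500.0
k = arrhenius(T, A, n_exp, Ea)
k_lo, k_hi = k / F, k * F
print(f"k({T:.0f} K) = {k:.3e}, bracketed by [{k_lo:.3e}, {k_hi:.3e}]")
```

    Running a mechanism with rate constants pushed to their F-bounds is a common way to test which of the remaining uncertainties actually matter for ignition or flame-speed predictions.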

  10. Addressing structural and observational uncertainty in resource management.

    PubMed

    Fackler, Paul; Pacifici, Krishna

    2014-01-15

    Most natural resource management and conservation problems are plagued with high levels of uncertainties, which make good decision making difficult. Although some kinds of uncertainties are easily incorporated into decision making, two types of uncertainty present more formidable difficulties. The first, structural uncertainty, represents our imperfect knowledge about how a managed system behaves. The second, observational uncertainty, arises because the state of the system must be inferred from imperfect monitoring systems. The former type of uncertainty has been addressed in ecology using Adaptive Management (AM) and the latter using the Partially Observable Markov Decision Processes (POMDP) framework. Here we present a unifying framework that extends standard POMDPs and encompasses both standard POMDPs and AM. The approach allows any system variable to be observed or not observed and uses any relevant observed variable to update beliefs about unknown variables and parameters. This extends standard AM, which only uses realizations of the state variable to update beliefs and extends standard POMDP by allowing more general stochastic dependence among the observable variables and the state variables. This framework enables both structural and observational uncertainty to be simultaneously modeled. We illustrate the features of the extended POMDP framework with an example.
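
    The structural-uncertainty side of the framework reduces, in its simplest form, to a Bayesian belief update over competing system models, which is also the core of adaptive management. The two-model example below is an illustrative sketch with invented likelihoods:

```python
# Belief update over two competing structural models of a managed
# system. P(observation "high" | model): model A predicts high
# outcomes more often than model B. Numbers are illustrative.
likelihood = {"A": 0.8, "B": 0.3}

def update_belief(prior_a, obs_is_high):
    """One Bayes update of P(model A) from a binary observation."""
    la = likelihood["A"] if obs_is_high else 1 - likelihood["A"]
    lb = likelihood["B"] if obs_is_high else 1 - likelihood["B"]
    return prior_a * la / (prior_a * la + (1 - prior_a) * lb)

belief = 0.5                            # start undecided between models
for obs in [True, True, False, True]:   # a short observation sequence
    belief = update_belief(belief, obs)
print(f"P(model A | data) = {belief:.3f}")
```

    The extended POMDP framework in the paper generalizes exactly this step: any observed variable, not just the state realization, can drive the belief update.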

  11. Remaining uncertainties in the use of Rn-222 as a quantitative tracer of submarine groundwater discharge

    USGS Publications Warehouse

    Burnett, W.C.; Santos, I.R.; Weinstein, Y.; Swarzenski, P.W.; Herut, B.

    2007-01-01

    Research performed in many locations over the past decade has shown that radon is an effective tracer for quantifying submarine groundwater discharge (SGD). The technique works because both fresh and saline groundwaters acquire radon from the subterranean environment and display activities that are typically orders of magnitude greater than those found in coastal seawaters. However, some uncertainties and unanswered problems remain. We focus here on three components of the mass balance, each of which has some unresolved issues: (1) End-member radon - what to do if groundwater Rn measurements are highly variable? (2) Atmospheric evasion - do the standard gas exchange equations work under high-energy coastal mixing scenarios? And (3) "mixing" losses - are there other significant radon losses (e.g. recharge of coastal waters into the aquifer) besides those attributed to mixing with lower-activity waters offshore? We address these issues using data sets collected from several different types of coastal environment. Copyright © 2007 IAHS Press.
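
    The mass balance behind the tracer technique can be sketched in a few lines: at steady state, Rn losses must be matched by inputs, and the Rn flux attributable to groundwater, divided by the groundwater end-member activity, yields the SGD rate. All values below are illustrative, not from the cited study:

```python
# Steady-state radon mass balance for submarine groundwater discharge.
# Fluxes in dpm m-2 d-1 (illustrative values).
atm_evasion = 50.0      # Rn loss to the atmosphere
mixing_loss = 120.0     # Rn loss by mixing with offshore water
decay_loss = 30.0       # radioactive decay of excess Rn
diffusive_input = 10.0  # Rn input by diffusion from sediments

gw_endmember = 2000.0   # groundwater Rn activity, dpm m-3

# Rn flux that must be supplied by SGD to close the balance:
rn_from_sgd = atm_evasion + mixing_loss + decay_loss - diffusive_input
sgd_rate = rn_from_sgd / gw_endmember  # m3 of water per m2 per day = m/d
print(f"SGD ~ {sgd_rate * 100:.1f} cm/day")
```

    The three open questions in the abstract map directly onto three of these terms: the end-member activity, the evasion term, and the mixing-loss term.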

  12. Adaptively Addressing Uncertainty in Estuarine and Near Coastal Restoration Projects

    SciTech Connect

    Thom, Ronald M.; Williams, Greg D.; Borde, Amy B.; Southard, John A.; Sargeant, Susan L.; Woodruff, Dana L.; Laufle, Jeffrey C.; Glasoe, Stuart

    2005-03-01

    Restoration projects have an uncertain outcome because of a lack of information about current site conditions, historical disturbance levels, effects of landscape alterations on site development, unpredictable trajectories or patterns of ecosystem structural development, and many other factors. A poor understanding of the factors that control the development and dynamics of a system, such as hydrology, salinity, or wave energies, can also lead to an unintended outcome. Finally, lack of experience in restoring certain types of systems (e.g., rare or very fragile habitats) or systems in highly modified situations (e.g., highly urbanized estuaries) makes project outcomes uncertain. Because of these uncertainties, project costs can rise dramatically in an attempt to come closer to project goals. All of the potential sources of error can be addressed to a certain degree through adaptive management. The first step is admitting that these uncertainties can exist, and addressing as many of them as possible with planning and directed research prior to implementing the project. The second step is to evaluate uncertainties through hypothesis-driven experiments during project implementation. The third step is to use the monitoring program to evaluate and adjust the project as needed to improve the probability that the project will reach its goal. The fourth and final step is to use the information gained in the project to improve future projects. A framework that includes a clear goal statement, a conceptual model, and an evaluation framework can help in this adaptive restoration process. Projects and programs vary in their application of adaptive management in restoration, and it is very difficult to be highly prescriptive in applying adaptive management to projects that necessarily vary widely in scope, goal, ecosystem characteristics, and uncertainties.
Very large ecosystem restoration programs in the Mississippi River delta (Coastal Wetlands Planning, Protection, and Restoration

  13. Uncertainty Quantification in Remaining Useful Life of Aerospace Components using State Space Models and Inverse FORM

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    This paper investigates the use of the inverse first-order reliability method (inverse-FORM) to quantify the uncertainty in the remaining useful life (RUL) of aerospace components. The prediction of remaining useful life is an integral part of system health prognosis, and directly helps in online health monitoring and decision-making. However, the prediction of remaining useful life is affected by several sources of uncertainty, and therefore it is necessary to quantify the uncertainty in the remaining useful life prediction. While system parameter uncertainty and physical variability can be easily included in inverse-FORM, this paper extends the methodology to include: (1) future loading uncertainty; (2) process noise; and (3) uncertainty in the state estimate. The inverse-FORM method has been used in this paper to (1) quickly obtain probability bounds on the remaining useful life prediction; and (2) calculate the entire probability distribution of the remaining useful life prediction, and the results are verified against Monte Carlo sampling. The proposed methodology is illustrated using a numerical example.
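
    The inverse-FORM idea can be sketched for the simplest special case, where RUL is Gaussian: the reliability index at a candidate time t is beta(t) = (mu - t)/sigma, and the inverse problem asks which t attains a target failure probability. The mean and standard deviation below are invented for illustration:

```python
import math

def std_normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

MU, SIGMA = 100.0, 15.0  # illustrative RUL mean and sd (cycles)

def prob_failure_by(t):
    """P(RUL <= t) for the Gaussian special case: Phi(-beta(t))."""
    beta = (MU - t) / SIGMA
    return std_normal_cdf(-beta)

def inverse_form(target_p, lo=0.0, hi=300.0):
    """Inverse question: find t with P(RUL <= t) = target_p (bisection)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if prob_failure_by(mid) < target_p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

t05 = inverse_form(0.05)  # 5% probability bound on RUL
print(f"5% bound on RUL ~ {t05:.1f} cycles")
```

    Sweeping the target probability from 0 to 1 recovers the entire RUL distribution, which is the second use of inverse-FORM the abstract mentions.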

  14. Addressing uncertainty in rock properties through geostatistical simulation

    SciTech Connect

    McKenna, S.A.; Rautman, A.; Cromer, M.V.; Zelinski, W.P.

    1996-09-01

    Fracture and matrix properties in a sequence of unsaturated, welded tuffs at Yucca Mountain, Nevada, are modeled in two-dimensional cross-sections through geostatistical simulation. In the absence of large amounts of sample data, an interpretive, deterministic stratigraphic model is coupled with a Gaussian simulation algorithm to constrain realizations of both matrix porosity and fracture frequency. Use of the deterministic, stratigraphic model imposes scientific judgment, in the form of a conceptual geologic model, onto the property realizations. Linear coregionalization and a regression relationship between matrix porosity and matrix hydraulic conductivity are used to generate realizations of matrix hydraulic conductivity. Fracture-frequency simulations conditioned on the stratigraphic model represent one class of fractures (cooling fractures) in the conceptual model of the geology. A second class of fractures (tectonic fractures) is conceptualized as fractures that cut across strata vertically and includes discrete features such as fault zones. Indicator geostatistical simulation provides locations of this second class of fractures. The indicator realizations are combined with the realizations of fracture spacing to create realizations of fracture frequency that are a combination of both classes of fractures. Evaluations of the resulting realizations include comparing vertical profiles of rock properties within the model to those observed in boreholes and checking intra-unit property distributions against collected data. Geostatistical simulation provides an efficient means of addressing spatial uncertainty in dual continuum rock properties.
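
    As a toy stand-in for the Gaussian simulation step, one can generate an unconditional 1-D correlated property profile by Cholesky-factorizing an exponential covariance model. The correlation length and porosity statistics below are assumed for illustration:

```python
import math
import random

def chol(a):
    """Cholesky factor (lower triangular) of a symmetric PD matrix."""
    n = len(a)
    l = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(l[i][k] * l[j][k] for k in range(j))
            if i == j:
                l[i][j] = math.sqrt(a[i][i] - s)
            else:
                l[i][j] = (a[i][j] - s) / l[j][j]
    return l

n, dz, corr_len = 40, 1.0, 10.0  # 40 points, 1 m spacing, 10 m range
cov = [[math.exp(-abs(i - j) * dz / corr_len) for j in range(n)]
       for i in range(n)]
L = chol(cov)

rng = random.Random(0)
z = [rng.gauss(0, 1) for _ in range(n)]
# Correlated standard-normal field (L @ z), then back-transform.
field = [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]
porosity = [0.12 + 0.03 * f for f in field]  # mean 12%, sd 3% (assumed)
print(f"simulated porosity, top/bottom: {porosity[0]:.3f}, {porosity[-1]:.3f}")
```

    Each new random vector z yields another equally probable realization, which is what makes the approach useful for exploring spatial uncertainty.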

  15. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false How will NIOSH address uncertainty about dose... § 82.19 How will NIOSH address uncertainty about dose levels? The estimate of each annual dose will be characterized with a probability distribution that accounts for the uncertainty of the estimate....

  16. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false How will NIOSH address uncertainty about dose... § 82.19 How will NIOSH address uncertainty about dose levels? The estimate of each annual dose will be characterized with a probability distribution that accounts for the uncertainty of the estimate....

  17. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false How will NIOSH address uncertainty about dose... § 82.19 How will NIOSH address uncertainty about dose levels? The estimate of each annual dose will be characterized with a probability distribution that accounts for the uncertainty of the estimate....

  18. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false How will NIOSH address uncertainty about dose... § 82.19 How will NIOSH address uncertainty about dose levels? The estimate of each annual dose will be characterized with a probability distribution that accounts for the uncertainty of the estimate....

  19. Analytical algorithms to quantify the uncertainty in remaining useful life prediction

    NASA Astrophysics Data System (ADS)

    Sankararaman, S.; Daigle, M.; Saxena, A.; Goebel, K.

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (Inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.

  20. Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai

    2013-01-01

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (Inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
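
    Of the three analytical methods named, FOSM is the simplest to sketch: propagate input means and standard deviations through the model via a first-order Taylor expansion. The degradation model and all numbers below are illustrative assumptions:

```python
import math

def g(d0, rate):
    """RUL of a linear degradation model with failure threshold 1.0."""
    return (1.0 - d0) / rate

means = {"d0": 0.4, "rate": 0.01}   # illustrative input means
sds = {"d0": 0.05, "rate": 0.002}   # illustrative input standard deviations

def fosm(model, means, sds, h=1e-6):
    """First-order second-moment approximation of mean and sd of model."""
    mu = model(**means)
    var = 0.0
    for name in means:
        bumped = dict(means)
        bumped[name] += h
        grad = (model(**bumped) - mu) / h   # numerical partial derivative
        var += (grad * sds[name]) ** 2      # first-order variance term
    return mu, math.sqrt(var)

mu_rul, sd_rul = fosm(g, means, sds)
print(f"RUL ~ {mu_rul:.1f} +/- {sd_rul:.1f} cycles (FOSM)")
```

    One model evaluation per input replaces thousands of Monte Carlo samples, which is why such approximations suit online decision-making, at the cost of accuracy for strongly nonlinear or non-Gaussian problems.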

  1. Addressing Uncertainty in Signal Propagation and Sensor Performance Predictions

    DTIC Science & Technology

    2008-11-01

    Chris L. Pettit, Sean Mackay, Matthew S. Lewis, and Peter M. Seman. November 2008. Cold Regions Research and Engineering Lab...U.S. Army Engineer Research and Development Center, 72 Lyme Road, Hanover, NH 03755-1290. Chris L. Pettit, U.S. Naval Academy, Aerospace Engineering...outcome. Uncertainty by itself is not a concern unless there is associated, significant risk. Although uncertainty is emphasized in this report, we

  2. Addressing uncertainty in fecal indicator bacteria dark inactivation rates.

    PubMed

    Gronewold, Andrew D; Myers, Luke; Swall, Jenise L; Noble, Rachel T

    2011-01-01

    Assessing the potential threat of fecal contamination in surface water often depends on model forecasts which assume that fecal indicator bacteria (FIB, a proxy for the concentration of pathogens found in fecal contamination from warm-blooded animals) are lost or removed from the water column at a certain rate (often referred to as an "inactivation" rate). In efforts to reduce human health risks in these water bodies, regulators enforce limits on easily-measured FIB concentrations, commonly reported as most probable number (MPN) and colony forming unit (CFU) values. Accurate assessment of the potential threat of fecal contamination, therefore, depends on propagating uncertainty surrounding "true" FIB concentrations into MPN and CFU values, inactivation rates, model forecasts, and management decisions. Here, we explore how empirical relationships between FIB inactivation rates and extrinsic factors might vary depending on how uncertainty in MPN values is expressed. Using water samples collected from the Neuse River Estuary (NRE) in eastern North Carolina, we compare Escherichia coli (EC) and Enterococcus (ENT) dark inactivation rates derived from two statistical models of first-order loss; a conventional model employing ordinary least-squares (OLS) regression with MPN values, and a novel Bayesian model utilizing the pattern of positive wells in an IDEXX Quanti-Tray®/2000 test. While our results suggest that EC dark inactivation rates tend to decrease as initial EC concentrations decrease and that ENT dark inactivation rates are relatively consistent across different ENT concentrations, we find these relationships depend upon model selection and model calibration procedures. We also find that our proposed Bayesian model provides a more defensible approach to quantifying uncertainty in microbiological assessments of water quality than the conventional MPN-based model, and that our proposed model represents a new strategy for developing robust relationships between
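
    The conventional model the abstract compares against can be sketched directly: under first-order loss, C(t) = C0 * exp(-k t), so ln C is linear in time and the OLS slope estimates -k. The measurements below are synthetic stand-ins for dark-incubation FIB data:

```python
import math

# Synthetic dark-incubation time series (MPN per 100 mL), roughly
# following first-order decay; not real NRE data.
times = [0.0, 1.0, 2.0, 3.0, 4.0]            # days
conc = [1000.0, 740.0, 545.0, 400.0, 300.0]  # FIB concentrations

# OLS slope of ln(C) vs t gives -k under the first-order loss model.
y = [math.log(c) for c in conc]
n = len(times)
t_bar = sum(times) / n
y_bar = sum(y) / n
slope = (sum((t - t_bar) * (yy - y_bar) for t, yy in zip(times, y))
         / sum((t - t_bar) ** 2 for t in times))
k = -slope  # first-order dark inactivation rate, per day
print(f"dark inactivation rate k ~ {k:.3f} /day")
```

    The paper's point is that MPN values carry their own uncertainty, so the Bayesian alternative fits the decay model to the raw pattern of positive wells rather than to these point estimates.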

  3. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    NASA Astrophysics Data System (ADS)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  4. Addressing Uncertainty in Fecal Indicator Bacteria Dark Inactivation Rates

    EPA Science Inventory

    Fecal contamination is a leading cause of surface water quality degradation. Roughly 20% of all total maximum daily load assessments approved by the United States Environmental Protection Agency since 1995, for example, address water bodies with unacceptably high fecal indicator...

  5. Designing Technology to Address Parent Uncertainty in Childhood Cancer.

    PubMed

    Morrison, Caroline F; Szulczewski, Lauren; Strahlendorf, Laura F; Lane, J Blake; Mullins, Larry L; Pai, Ahna L H

    2016-01-01

    The stress and uncertainty created by a child's cancer diagnosis and treatment can affect parent and child functioning. Health technology provides a potential avenue for intervention delivery. Interviews were conducted with parents of children diagnosed with cancer to discover their needs following diagnosis and design a relevant mobile application. Treatment experience was the overarching theme. Subthemes included the emotional response, use of information, and environmental factors. Technology was used primarily to seek out information and communicate with others. Health technologies are gaining popularity and have the potential to be beneficial for patients and families throughout the treatment experience.

  6. Addressing sources of uncertainty in a global terrestrial carbon model

    NASA Astrophysics Data System (ADS)

    Exbrayat, J.; Pitman, A. J.; Zhang, Q.; Abramowitz, G.; Wang, Y.

    2013-12-01

    Several sources of uncertainty exist in the parameterization of the land carbon cycle in current Earth System Models (ESMs). For example, recently implemented interactions between the carbon (C), nitrogen (N) and phosphorus (P) cycles lead to diverse changes in land-atmosphere C fluxes simulated by different models. Further, although soil organic matter decomposition is commonly parameterized as a first-order decay process, the formulation of the microbial response to changes in soil moisture and soil temperature varies tremendously between models. Here, we examine the sensitivity of historical land-atmosphere C fluxes simulated by an ESM to these two major sources of uncertainty. We implement three soil moisture (SMRF) and three soil temperature (STRF) respiration functions in the CABLE-CASA-CNP land biogeochemical component of the coarse resolution CSIRO Mk3L climate model. Simulations are undertaken using three degrees of biogeochemical nutrient limitation: C-only, C and N, and C and N and P. We first bring all 27 possible combinations of a SMRF with a STRF and a biogeochemical mode to a steady-state in their biogeochemical pools. Then, transient historical (1850-2005) simulations are driven by prescribed atmospheric CO2 concentrations used in the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Similarly to some previously published results, representing N and P limitation on primary production reduces the global land carbon sink while some regions become net C sources over the historical period (1850-2005). However, the uncertainty due to the SMRFs and STRFs does not decrease relative to the inter-annual variability in net uptake when N and P limitations are added. Differences in the SMRFs and STRFs and their effect on the soil C balance can also change the sign of some regional sinks. We show that this response is mostly driven by the pool size achieved at the end of the spin-up procedure. Further, there exists a six-fold range in the level
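    The kind of respiration-function alternatives at issue can be illustrated with a minimal sketch. The Q10-type temperature response and quadratic moisture response below are generic textbook forms with illustrative constants, not CABLE-CASA-CNP's actual formulations:

    ```python
    import numpy as np

    def strf_q10(soil_temp_c, q10=2.0, ref_temp_c=25.0):
        """Soil temperature respiration function (STRF): rate multiplier
        relative to a reference temperature; doubles every 10 degrees C for q10=2."""
        return q10 ** ((soil_temp_c - ref_temp_c) / 10.0)

    def smrf_quadratic(rel_moisture):
        """Soil moisture respiration function (SMRF): peaks at intermediate wetness,
        suppressed when the soil is very dry or saturated."""
        return np.clip(4.0 * rel_moisture * (1.0 - rel_moisture), 0.0, 1.0)

    def decay_rate(base_rate, soil_temp_c, rel_moisture):
        """First-order soil C decay rate scaled by both environmental modifiers."""
        return base_rate * strf_q10(soil_temp_c) * smrf_quadratic(rel_moisture)
    ```

    Swapping in a different STRF or SMRF changes the equilibrium pool sizes reached during spin-up, which is why the study finds the spin-up state driving much of the divergence.
    
    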

  7. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false How will NIOSH address uncertainty about dose levels? 82.19 Section 82.19 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES METHODS FOR CONDUCTING DOSE RECONSTRUCTION...

  8. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    SciTech Connect

    Cooke, Roger; MacDonell, Margaret

    2007-07-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
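    The critical-fractile rule behind the newsboy dilemma can be sketched as follows; the demand distribution and cost figures are illustrative, not values from the study:

    ```python
    from statistics import NormalDist

    def newsvendor_quantity(mu, sigma, cost_under, cost_over):
        """Order quantity minimizing expected cost for normally distributed demand:
        choose q such that P(demand <= q) = Cu / (Cu + Co)."""
        critical_fractile = cost_under / (cost_under + cost_over)
        return NormalDist(mu, sigma).inv_cdf(critical_fractile)

    # Losing more per missed sale ($1.00) than per unsold paper ($0.50),
    # the newsboy stocks above mean demand (mean 100, sd 20 -- illustrative):
    q = newsvendor_quantity(mu=100.0, sigma=20.0, cost_under=1.0, cost_over=0.5)
    ```

    Setting a cleanup limit works analogously: the "order quantity" is the target concentration, and the asymmetry between the regret of over- and under-protection sets the fractile.
    
    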

  9. Addressing rainfall data selection uncertainty using connections between rainfall and streamflow.

    PubMed

    Levy, Morgan C; Cohn, Avery; Lopes, Alan Vaz; Thompson, Sally E

    2017-03-16

    Studies of the hydroclimate at regional scales rely on spatial rainfall data products, derived from remotely-sensed (RS) and in-situ (IS, rain gauge) observations. Because regional rainfall cannot be directly measured, spatial data products are biased. These biases pose a source of uncertainty in environmental analyses, attributable to the choices made by data-users in selecting a representation of rainfall. We use the rainforest-savanna transition region in Brazil to show differences in the statistics describing rainfall across nine RS and interpolated-IS daily rainfall datasets covering the period of 1998-2013. These differences propagate into estimates of temporal trends in monthly rainfall and descriptive hydroclimate indices. Rainfall trends from different datasets are inconsistent at river basin scales, and the magnitude of index differences is comparable to the estimated bias in global climate model projections. To address this uncertainty, we evaluate the correspondence of different rainfall datasets with streamflow from 89 river basins. We demonstrate that direct empirical comparisons between rainfall and streamflow provide a method for evaluating rainfall dataset performance across multiple areal (basin) units. These results highlight the need for users of rainfall datasets to quantify this "data selection uncertainty" problem, and either justify data use choices, or report the uncertainty in derived results.

  10. Key outcomes and addressing remaining challenges--perspectives from a final evaluation of the China GAVI project.

    PubMed

    Yang, Weizhong; Liang, Xiaofeng; Cui, Fuqiang; Li, Li; Hadler, Stephen C; Hutin, Yvan J; Kane, Mark; Wang, Yu

    2013-12-27

    During the China GAVI project, implemented between 2002 and 2010, more than 25 million children received hepatitis B vaccine with the support of the project, and the vaccine proved to be safe and effective. With careful consideration for project savings, China and GAVI continually adjusted the budget, additionally allowing the project to spend operational funds to support demonstration projects to improve timely birth dose (TBD), to conduct training of EPI staff, and to monitor the project impact. Results from the final evaluation indicated the achievement of key outcomes. As a result of government co-investment, human resources at county level engaged in hepatitis B vaccination increased from 29 per county on average in 2002 to 66 in 2009. All project counties funded by the GAVI project use auto-disable syringes for hepatitis B vaccination and other vaccines. Surveyed hepatitis B vaccine coverage increased from 71% in 2002 to 93% in 2009 among infants. The HBsAg prevalence declined from 9.67% in 1992 to 0.96% in 2006 among children under 5 years of age. However, several important issues remain: (1) China still accounts for the largest annual number of perinatal HBV infections (estimated 84,121) in the WHO WPR region; (2) China still lacks a clear national policy for safe injection of vaccines; (3) vaccination of high risk adults and protection of health care workers are still not implemented; (4) hepatitis B surveillance needs to be refined to more accurately monitor acute hepatitis B; and (5) a program for treatment of persons with chronic HBV infection is needed. Recommendations for future hepatitis B control include: using the lessons learned from the China GAVI project for future introductions of new vaccines; addressing unmet needs with a second generation hepatitis B program to reach every infant, including screening mothers, and providing HBIG for infants born to HBsAg positive mothers; expanding vaccination to high risk adults; addressing remaining unsafe

  11. Empirical estimates to reduce modeling uncertainties of soil organic carbon in permafrost regions: a review of recent progress and remaining challenges

    USGS Publications Warehouse

    Mishra, U.; Jastrow, J.D.; Matamala, R.; Hugelius, G.; Koven, C.D.; Harden, J.W.; Ping, S.L.; Michaelson, G.J.; Fan, Z.; Miller, R.M.; McGuire, A.D.; Tarnocai, C.; Kuhry, P.; Riley, W.J.; Schaefer, K.; Schuur, E.A.G.; Jorgenson, M.T.; Hinzman, L.D.

    2013-01-01

    The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon–climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges.

  12. Simulating microbial systems: addressing model uncertainty/incompleteness via multiscale and entropy methods.

    PubMed

    Singharoy, A; Joshi, H; Cheluvaraja, S; Miao, Y; Brown, D; Ortoleva, P

    2012-01-01

    Most systems of interest in the natural and engineering sciences are multiscale in character. Typically available models are incomplete or uncertain. Thus, a probabilistic approach is required. We present a deductive multiscale approach to address such problems, focusing on virus and cell systems to demonstrate the ideas. There is usually an underlying physical model, all factors in which (e.g., particle masses, charges, and force constants) are known. For example, the underlying model can be cast in terms of a collection of N atoms evolving via Newton's equations. When the number of atoms is 10^6 or more, these physical models cannot be simulated directly. However, one may only be interested in a coarse-grained description, e.g., in terms of molecular populations or overall system size, shape, position, and orientation. The premise of this chapter is that the coarse-grained equations should be derived from the underlying model so that a deductive calibration-free methodology is achieved. We consider a reduction in resolution from a description for the state of N atoms to one in terms of coarse-grained variables. This implies a degree of uncertainty in the underlying microstates. We present a methodology for modeling microbial systems that integrates equations for coarse-grained variables with a probabilistic description of the underlying fine-scale ones. The implementation of our strategy as a general computational platform (SimEntropics™) for microbial modeling and prospects for developments and applications are discussed.

  13. Research approaches to address uncertainties in the risk assessment of arsenic in drinking water

    SciTech Connect

    Hughes, Michael F.; Kenyon, Elaina M.; Kitchin, Kirk T.

    2007-08-01

    Inorganic arsenic (iAs), an environmental drinking water contaminant, is a human toxicant and carcinogen. The public health community has developed recommendations and regulations that limit human exposure to iAs in drinking water. Although there is a vast amount of information available to regulators on the exposure, disposition and the health-related effects of iAs, there is still critical information about the toxicology of this metalloid that is needed. This necessary information includes identification of the chemical species of arsenic that is (are) the active toxicant(s), the mode(s) of action for its various toxicities and information on potentially susceptible populations. Because of these unknown factors, the risk assessment of iAs still incorporates default assumptions, leading to uncertainties in the overall assessment. The characteristics of a scientifically defensible risk assessment for iAs are that it must: (1) quantitatively link exposure and target tissue dose of active metabolites to key events in the mode of action for major health effects and (2) identify sources of variation in susceptibility to arsenic-induced health effects and quantitatively evaluate their impact wherever possible. Integration of research to address these goals will better protect the health of iAs-exposed populations.

  14. Application of fuzzy system theory in addressing the presence of uncertainties

    SciTech Connect

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-03

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Accounting for uncertainty is necessary to prevent failure of engineering materials. There are three types of uncertainty: stochastic, epistemic, and error. In this paper, epistemic uncertainty is considered; it arises from incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than a statistical approach for interpreting uncertainty when data are scarce. Fuzzy analysis comprises several steps, starting with the conversion of crisp inputs to fuzzy inputs through a fuzzification process, followed by the main step, known as the mapping process, where mapping denotes the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically integrated based on the extension principle. In the final stage, defuzzification is applied; this important step converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
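    The fuzzification → extension-principle → defuzzification pipeline described above can be sketched with a triangular fuzzy input and alpha-cut interval arithmetic. The membership function, response function, and defuzzification rule here are simplified assumptions, not the paper's finite element formulation:

    ```python
    import numpy as np

    levels = np.linspace(0.0, 1.0, 11)  # alpha levels from 0 to 1

    def alpha_cuts(a, b, c, levels):
        """Triangular fuzzy number (a, b, c): interval [lo, hi] at each alpha level."""
        return [(a + alpha * (b - a), c - alpha * (c - b)) for alpha in levels]

    def propagate(f, cuts):
        """Extension principle for a monotone crisp function f: map interval endpoints."""
        return [tuple(sorted((f(lo), f(hi)))) for lo, hi in cuts]

    def defuzzify(cuts, levels):
        """Simple defuzzification: alpha-weighted average of interval midpoints."""
        mids = [0.5 * (lo + hi) for lo, hi in cuts]
        return float(np.average(mids, weights=levels))

    # A fuzzy load "about 10" (support 8..13) pushed through a linear response:
    load = alpha_cuts(8.0, 10.0, 13.0, levels)
    stress = propagate(lambda x: 2.0 * x, load)
    crisp_stress = defuzzify(stress, levels)
    ```

    The asymmetric input produces a crisp output slightly off the doubled peak, reflecting the skew that a fuzzy analysis carries through where a single crisp run would not.
    
    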

  15. An integrated approach for addressing uncertainty in the delineation of groundwater management areas

    NASA Astrophysics Data System (ADS)

    Sousa, Marcelo R.; Frind, Emil O.; Rudolph, David L.

    2013-05-01

    Uncertainty is a pervasive but often poorly understood factor in the delineation of wellhead protection areas (WHPAs), which can discourage water managers and practitioners from relying on model results. To make uncertainty more understandable and thereby remove a barrier to the acceptance of models in the WHPA context, we present a simple approach for dealing with uncertainty. The approach considers two spatial scales for representing uncertainty: local and global. At the local scale, uncertainties are assumed to be due to heterogeneities, and a capture zone is expressed in terms of a capture probability plume. At the global scale, uncertainties are expressed through scenario analysis, using a limited number of physically realistic scenarios. The two scales are integrated by using the precautionary principle to merge the individual capture probability plumes corresponding to the different scenarios. The approach applies to both wellhead protection and the mitigation of contaminated aquifers, or in general, to groundwater management areas. An example relates to the WHPA for a supply well located in a complex glacial aquifer system in southwestern Ontario, where we focus on uncertainty due to the spatial distributions of recharge. While different recharge scenarios calibrate equally well to the same data, they result in different capture probability plumes. Using the precautionary approach, the different plumes are merged into two types of maps delineating groundwater management areas for either wellhead protection or aquifer mitigation. The study shows that calibrations may be non-unique, and that finding a "best" model on the basis of the calibration fit may not be possible.
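    The precautionary merging described above amounts to taking, cell by cell, the highest capture probability any scenario assigns. A minimal sketch on a hypothetical grid (the plume values are illustrative, not from the Ontario case study):

    ```python
    import numpy as np

    # Capture-probability plumes (0..1) for three recharge scenarios on the same
    # 2x2 grid; in practice each comes from a separate calibrated model run.
    scenario_plumes = np.array([
        [[0.9, 0.4], [0.2, 0.0]],
        [[0.7, 0.6], [0.1, 0.3]],
        [[0.8, 0.5], [0.4, 0.1]],
    ])

    # Precautionary merge: protect each cell at the highest capture probability
    # that any physically plausible scenario assigns to it.
    merged = scenario_plumes.max(axis=0)

    # Delineate a management area as all cells at or above a protection threshold.
    management_area = merged >= 0.5
    ```

    Because the merge never discards a scenario's high-probability cells, the delineated area is robust to the non-uniqueness of calibration that the study highlights.
    
    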

  16. An integrated approach for addressing uncertainty in the delineation of groundwater management areas.

    PubMed

    Sousa, Marcelo R; Frind, Emil O; Rudolph, David L

    2013-05-01

    Uncertainty is a pervasive but often poorly understood factor in the delineation of wellhead protection areas (WHPAs), which can discourage water managers and practitioners from relying on model results. To make uncertainty more understandable and thereby remove a barrier to the acceptance of models in the WHPA context, we present a simple approach for dealing with uncertainty. The approach considers two spatial scales for representing uncertainty: local and global. At the local scale, uncertainties are assumed to be due to heterogeneities, and a capture zone is expressed in terms of a capture probability plume. At the global scale, uncertainties are expressed through scenario analysis, using a limited number of physically realistic scenarios. The two scales are integrated by using the precautionary principle to merge the individual capture probability plumes corresponding to the different scenarios. The approach applies to both wellhead protection and the mitigation of contaminated aquifers, or in general, to groundwater management areas. An example relates to the WHPA for a supply well located in a complex glacial aquifer system in southwestern Ontario, where we focus on uncertainty due to the spatial distributions of recharge. While different recharge scenarios calibrate equally well to the same data, they result in different capture probability plumes. Using the precautionary approach, the different plumes are merged into two types of maps delineating groundwater management areas for either wellhead protection or aquifer mitigation. The study shows that calibrations may be non-unique, and that finding a "best" model on the basis of the calibration fit may not be possible.

  17. Fiscal Year 2015 U.S. Government Financial Statements: Need to Address the Governments Remaining Financial Management Challenges and Long Term Fiscal Path

    DTIC Science & Technology

    2016-04-06

    FINANCIAL STATEMENTS Need to Address the Government’s Remaining Financial Management Challenges and Long-Term Fiscal Path Statement of Gene L. Dodaro... Management Challenges and Long-Term Fiscal Path Why GAO Did This Study Congress and the President need reliable, useful, and timely financial and... discusses the federal government’s remaining financial management challenges and long-term fiscal path, specifically in the context of GAO’s report on

  18. Addressing Uncertainty in Contaminant Transport in Groundwater Using the Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Dwivedi, D.; Mohanty, B. P.

    2011-12-01

    Nitrate in groundwater shows significant uncertainty which arises from sparse data and interaction among multiple geophysical factors such as source availability (land use), thickness and composition of the vadose zone, types of aquifers (confined or unconfined), aquifer heterogeneity (geological and alluvial), precipitation characteristics, etc. This work presents the fusion of the ensemble Kalman filter (EnKF) with the numerical groundwater flow model MODFLOW and the solute transport model MT3DMS. The EnKF is a sequential data assimilation approach, which is applied to quantify and reduce the uncertainty of groundwater flow and solute transport models. We conducted numerical simulation experiments for the period January 1990 to December 2005 with MODFLOW and MT3DMS models for variably saturated groundwater flow in various aquifers across Texas. The EnKF was used to update the model parameters, hydraulic conductivity, hydraulic head and solute concentration. Results indicate that the EnKF method notably improves the estimation of the hydraulic conductivity distribution and solute transport prediction by assimilating piezometric head measurements with a known nitrate initial condition. A better estimation of hydraulic conductivity and assimilation of continuous measurements of solute concentrations resulted in reduced uncertainty in MODFLOW and MT3DMS models. It was found that the observation locations and locations in spatial proximity were appropriately corrected by the EnKF. The knowledge of nitrate plume evolution provided an insight into model structure, parameters, and sources of uncertainty.
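    The EnKF analysis step that fuses a forecast ensemble with a measurement, as applied above, can be sketched for a scalar observation. The two-variable state (log conductivity and head) and all numbers are illustrative, not the Texas application:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(ensemble, obs, obs_operator_row, obs_var):
        """One EnKF analysis step for a scalar observation.

        ensemble: (n_members, n_state) forecast ensemble
        obs_operator_row: row of H picking the observed quantity out of the state
        """
        n = ensemble.shape[0]
        predicted = ensemble @ obs_operator_row           # H x for each member
        anomalies = ensemble - ensemble.mean(axis=0)
        pred_anom = predicted - predicted.mean()
        cov_xy = anomalies.T @ pred_anom / (n - 1)        # cross-covariance P H^T
        var_yy = pred_anom @ pred_anom / (n - 1)          # H P H^T
        gain = cov_xy / (var_yy + obs_var)                # Kalman gain
        perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=n)
        return ensemble + np.outer(perturbed_obs - predicted, gain)

    # State = [log hydraulic conductivity, hydraulic head]; assimilate a head
    # measurement of 12.0 with observation variance 0.25 (illustrative values).
    prior = rng.normal([0.0, 10.0], [1.0, 2.0], size=(200, 2))
    posterior = enkf_update(prior, obs=12.0,
                            obs_operator_row=np.array([0.0, 1.0]), obs_var=0.25)
    ```

    In the coupled MODFLOW/MT3DMS setting, the same update simultaneously corrects unobserved state components (here, log conductivity) through the ensemble cross-covariance.
    
    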

  19. Life-cycle greenhouse gas assessment of Nigerian liquefied natural gas addressing uncertainty.

    PubMed

    Safaei, Amir; Freire, Fausto; Henggeler Antunes, Carlos

    2015-03-17

    Natural gas (NG) has been regarded as a bridge fuel toward renewable sources and is expected to play a greater role in future global energy mix; however, a high degree of uncertainty exists concerning upstream (well-to-tank, WtT) greenhouse gas (GHG) emissions of NG. In this study, a life-cycle (LC) model is built to assess uncertainty in WtT GHG emissions of liquefied NG (LNG) supplied to Europe by Nigeria. The 90% prediction interval of GHG intensity of Nigerian LNG was found to range between 14.9 and 19.3 g CO2 eq/MJ, with a mean value of 16.8 g CO2 eq/MJ. This intensity was estimated considering no venting practice in Nigerian fields. The mean estimation can shift up to 25 g CO2 eq when considering a scenario with a higher rate of venting emissions. A sensitivity analysis of the time horizon to calculate GHG intensity was also performed showing that higher GHG intensity and uncertainty are obtained for shorter time horizons, due to the higher impact factor of methane. The uncertainty calculated for Nigerian LNG, specifically regarding the gap of data for methane emissions, recommends initiatives to measure and report emissions and further LC studies to identify hotspots to reduce the GHG intensity of LNG chains.

  20. Cost effectiveness of antimicrobial catheters in the intensive care unit: addressing uncertainty in the decision

    PubMed Central

    Halton, Kate A; Cook, David A; Whitby, Michael; Paterson, David L; Graves, Nicholas

    2009-01-01

    Introduction Some types of antimicrobial-coated central venous catheters (A-CVC) have been shown to be cost effective in preventing catheter-related bloodstream infection (CR-BSI). However, not all types have been evaluated, and there are concerns over the quality and usefulness of these earlier studies. There is uncertainty amongst clinicians over which, if any, A-CVCs to use. We re-evaluated the cost effectiveness of all commercially available A-CVCs for prevention of CR-BSI in adult intensive care unit (ICU) patients. Methods We used a Markov decision model to compare the cost effectiveness of A-CVCs relative to uncoated catheters. Four catheter types were evaluated: minocycline and rifampicin (MR)-coated catheters, silver, platinum and carbon (SPC)-impregnated catheters, and two chlorhexidine and silver sulfadiazine-coated catheters; one coated on the external surface (CH/SSD (ext)) and the other coated on both surfaces (CH/SSD (int/ext)). The incremental cost per quality-adjusted life year gained and the expected net monetary benefits were estimated for each. Uncertainty arising from data estimates, data quality and heterogeneity was explored in sensitivity analyses. Results The baseline analysis, with no consideration of uncertainty, indicated all four types of A-CVC were cost-saving relative to uncoated catheters. MR-coated catheters prevented 15 infections per 1,000 catheters and generated the greatest health benefits, 1.6 quality-adjusted life years, and cost savings (AUD $130,289). After considering uncertainty in the current evidence, the MR-coated catheters returned the highest incremental monetary net benefits of AUD $948 per catheter; however there was a 62% probability of error in this conclusion. Although the MR-coated catheters had the highest monetary net benefits across multiple scenarios, the decision was always associated with high uncertainty. Conclusions Current evidence suggests that the cost effectiveness of using A-CVCs within the ICU is
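    The net-monetary-benefit criterion underlying this comparison is simple to state: an option is preferred when willingness to pay times QALYs gained, minus incremental cost, is largest. The figures below are illustrative, not the study's estimates:

    ```python
    def net_monetary_benefit(qaly_gain, incremental_cost, wtp_per_qaly):
        """NMB = lambda * QALYs gained - incremental cost; positive means the
        intervention is worth adopting at that willingness-to-pay threshold."""
        return wtp_per_qaly * qaly_gain - incremental_cost

    # A coated catheter gaining 0.0016 QALYs per catheter while *saving* $130
    # per catheter (negative incremental cost) has positive NMB at a
    # willingness to pay of AUD $50,000 per QALY:
    nmb = net_monetary_benefit(qaly_gain=0.0016, incremental_cost=-130.0,
                               wtp_per_qaly=50_000)
    ```

    The "probability of error" reported in the abstract comes from evaluating this quantity across simulated parameter draws and counting how often a different option wins.
    
    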

  1. LCA of emerging technologies: addressing high uncertainty on inputs' variability when performing global sensitivity analysis.

    PubMed

    Lacirignola, Martino; Blanc, Philippe; Girard, Robin; Pérez-López, Paula; Blanc, Isabelle

    2017-02-01

    In the life cycle assessment (LCA) context, global sensitivity analysis (GSA) has been identified by several authors as a relevant practice to enhance the understanding of the model's structure and ensure reliability and credibility of the LCA results. GSA allows establishing a ranking among the input parameters, according to their influence on the variability of the output. Such feature is of high interest in particular when aiming at defining parameterized LCA models. When performing a GSA, the description of the variability of each input parameter may affect the results. This aspect is critical when studying new products or emerging technologies, where data regarding the model inputs are very uncertain and may cause misleading GSA outcomes, such as inappropriate input rankings. A systematic assessment of this sensitivity issue is now proposed. We develop a methodology to analyze the sensitivity of the GSA results (i.e. the stability of the ranking of the inputs) with respect to the description of such inputs of the model (i.e. the definition of their inherent variability). With this research, we aim at enriching the debate on the application of GSA to LCAs affected by high uncertainties. We illustrate its application with a case study, aiming at the elaboration of a simple model expressing the life cycle greenhouse gas emissions of enhanced geothermal systems (EGS) as a function of few key parameters. Our methodology allows identifying the key inputs of the LCA model, taking into account the uncertainty related to their description.
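    One common GSA screening, ranking inputs by squared standardized regression coefficients, illustrates how the assumed variability of an input can flip the ranking. The toy model below is an assumption for demonstration, not the EGS model of the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def src_ranking(model, samplers, n=10_000):
        """Rank inputs by squared standardized regression coefficients (SRC),
        a common variance-based screening for near-linear models."""
        X = np.column_stack([draw(n) for draw in samplers])
        y = model(X)
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        ys = (y - y.mean()) / y.std()
        beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        return np.argsort(beta**2)[::-1]  # most influential input first

    # Toy emission model: output = 2*x0 + 1*x1. Widening the assumed range of
    # x1 flips the ranking -- exactly the instability the paper studies.
    model = lambda X: 2.0 * X[:, 0] + 1.0 * X[:, 1]
    narrow = [lambda n: rng.uniform(0.0, 1.0, n), lambda n: rng.uniform(0.0, 1.0, n)]
    wide = [lambda n: rng.uniform(0.0, 1.0, n), lambda n: rng.uniform(0.0, 4.0, n)]
    ```

    Under the narrow description x0 dominates; quadrupling x1's assumed range makes x1 dominate, even though the model itself is unchanged.
    
    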

  2. Towards a common oil spill risk assessment framework – Adapting ISO 31000 and addressing uncertainties.

    PubMed

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio; Janeiro, Joao; Samaras, Achilleas; Zodiatis, George; De Dominicis, Michela

    2015-08-15

    Oil spills are a transnational problem, and establishing a common standard methodology for Oil Spill Risk Assessments (OSRAs) is thus paramount in order to protect marine environments and coastal communities. In this study we firstly identified the strengths and weaknesses of the OSRAs carried out in various parts of the globe. We then searched for a generic and recognized standard, i.e. ISO 31000, in order to design a method to perform OSRAs in a scientific and standard way. The new framework was tested for the Lebanon oil spill that occurred in 2006 employing ensemble oil spill modeling to quantify the risks and uncertainties due to unknown spill characteristics. The application of the framework generated valuable visual instruments for the transparent communication of the risks, replacing the use of risk tolerance levels, and thus highlighting the priority areas to protect in case of an oil spill.

  3. Educated Guesses and Other Ways to Address the Pharmacological Uncertainty of Designer Drugs

    PubMed Central

    Berning, Moritz

    2016-01-01

    This study examines how experimentation with designer drugs is mediated by the Internet. We selected a popular drug forum that presents reports on self-experimentation with little or even completely unexplored designer drugs to examine: (1) how participants report their “trying out” of new compounds and (2) how participants reduce the pharmacological uncertainty associated with using these substances. Our methods included passive observation online, engaging more actively with the online community using an avatar, and off-line interviews with key interlocutors to validate our online findings. This article reflects on how forum participants experiment with designer drugs, their trust in suppliers and the testimonials of others, the use of ethno-scientific techniques that involve numerical weighing, “allergy dosing,” and the use of standardized trip reports. We suggest that these techniques contribute to a sense of control in the face of the possible toxicity of unknown or little-known designer drugs. The online reporting of effects allows users to experience not only the thrill of a new kind of high but also connection with others in the self-experimenting drug community. PMID:27721526

  4. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
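    A minimal statistics-based analysis of the kind contrasted here with heuristic assessment propagates parameter uncertainty through the model by Monte Carlo and reports a prediction interval. The model and prior below are hypothetical, chosen only to show the mechanics:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def drawdown(K, pumping_rate=100.0):
        """Toy groundwater response to pumping; stands in for a full model run."""
        return pumping_rate / K

    # Uncertain hydraulic conductivity described by a lognormal prior:
    K_samples = rng.lognormal(mean=2.0, sigma=0.5, size=10_000)
    predictions = drawdown(K_samples)

    # 90% prediction interval: the uncertainty context decision-makers need.
    lo, hi = np.percentile(predictions, [5, 95])
    ```

    Reporting [lo, hi] alongside the central prediction is what turns a single model answer into the risk-management input the chapter describes.
    
    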

  5. The Role of Health Education in Addressing Uncertainty about Health and Cell Phone Use--A Commentary

    ERIC Educational Resources Information Center

    Ratnapradipa, Dhitinut; Dundulis, William P., Jr.; Ritzel, Dale O.; Haseeb, Abdul

    2012-01-01

    Although the fundamental principles of health education remain unchanged, the practice of health education continues to evolve in response to the rapidly changing lifestyles and technological advances. Emerging health risks are often associated with these lifestyle changes. The purpose of this article is to address the role of health educators…

  6. Addressing Stability Robustness, Period Uncertainties, and Startup of Multiple-Period Repetitive Control for Spacecraft Jitter Mitigation

    NASA Astrophysics Data System (ADS)

    Ahn, Edwin S.

    Repetitive Control (RC) is a relatively new form of control that seeks to converge to zero tracking error when executing a periodic command, or when executing a constant command in the presence of a periodic disturbance. The design makes use of knowledge of the period of the disturbance or command, and makes use of the error observed in the previous period to update the command in the present period. The usual RC approaches address one period, and this means that potentially they can simultaneously address DC or constant error, the fundamental frequency for that period, and all harmonics up to Nyquist frequency. Spacecraft often have multiple sources of periodic excitation. Slight imbalance in reaction wheels used for attitude control creates three disturbance periods. A special RC structure was developed to allow one to address multiple unrelated periods which is referred to as Multiple-Period Repetitive Control (MPRC). MPRC in practice faces three main challenges for hardware implementation. One is instability due to model errors or parasitic high frequency modes, the second is degradation of the final error level due to period uncertainties or fluctuations, and the third is bad transients due to issues in startup. Regarding these three challenges, the thesis develops a series of methods to enhance the performance of MPRC or to assist in analyzing its performance for mitigating optical jitter induced by mechanical vibration within the structure of a spacecraft testbed. Experimental analysis of MPRC shows contrasting advantages over existing adaptive control algorithms, such as Filtered-X LMS, Adaptive Model Predictive Control, and Adaptive Basis Method, for mitigating jitter within the transmitting beam of Laser Communication (LaserCom) satellites.
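    The core RC update, in which next period's command equals this period's command plus a learning gain times the error observed one period earlier, can be sketched on a trivial unity plant. The gain, period, and disturbance are illustrative, not the testbed's values:

    ```python
    import numpy as np

    period = 50
    t = np.arange(period)
    disturbance = 0.3 * np.sin(2 * np.pi * t / period)  # one periodic source
    u = np.zeros(period)                                # command, updated each period
    phi = 0.5                                           # learning gain, 0 < phi <= 1

    errors = []
    for _ in range(30):                 # 30 repetitions of the period
        y = u + disturbance             # trivial unity plant: output = command + disturbance
        e = 0.0 - y                     # track a zero reference (jitter mitigation)
        errors.append(np.abs(e).max())
        u = u + phi * e                 # RC update from last period's error
    ```

    On this plant the peak error shrinks by a factor of (1 - phi) each period; the stability and period-uncertainty issues the thesis addresses arise when the real plant dynamics and disturbance periods are imperfectly known.
    
    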

  7. Addressing the impact of environmental uncertainty in plankton model calibration with a dedicated software system: the Marine Model Optimization Testbed (MarMOT)

    NASA Astrophysics Data System (ADS)

    Hemmings, J. C. P.; Challenor, P. G.

    2011-08-01

    A wide variety of different marine plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. The Marine Model Optimization Testbed is a new software tool designed for rigorous analysis of plankton models in a multi-site 1-D framework, in particular to address uncertainty issues in model assessment. A flexible user interface ensures its suitability to more general inter-comparison, sensitivity and uncertainty analyses, including model comparison at the level of individual processes, and to state estimation for specific locations. The principal features of MarMOT are described and its application to model calibration is demonstrated by way of a set of twin experiments, in which synthetic observations are assimilated in an attempt to recover the true parameter values of a known system. The experimental aim is to investigate the effect of different misfit weighting schemes on parameter recovery in the presence of error in the plankton model's environmental input data. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergences of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error over an annual cycle, indicating

  8. Addressing the impact of environmental uncertainty in plankton model calibration with a dedicated software system: the Marine Model Optimization Testbed (MarMOT 1.1 alpha)

    NASA Astrophysics Data System (ADS)

    Hemmings, J. C. P.; Challenor, P. G.

    2012-04-01

    A wide variety of different plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. Parameter uncertainty has been widely addressed by calibrating models at data-rich ocean sites. However, relatively little attention has been given to quantifying uncertainty in the physical fields required by the plankton models at these sites, and tendencies in the biogeochemical properties due to the effects of horizontal processes are often neglected. Here we use model twin experiments, in which synthetic data are assimilated to estimate a system's known "true" parameters, to investigate the impact of error in a plankton model's environmental input data. The experiments are supported by a new software tool, the Marine Model Optimization Testbed, designed for rigorous analysis of plankton models in a multi-site 1-D framework. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergence tendencies of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error variance over an annual cycle, indicating variation in the significance attributable to individual model-data differences. An inverse scheme using ensemble-based estimates of the

  9. Eliciting climate experts' knowledge to address model uncertainties in regional climate projections: a case study of Guanacaste, Northwest Costa Rica

    NASA Astrophysics Data System (ADS)

    Grossmann, I.; Steyn, D. G.

    2014-12-01

    Global general circulation models typically cannot provide the detailed and accurate regional climate information required by stakeholders for climate adaptation efforts, given their limited capacity to resolve the regional topography and changes in local sea surface temperature, wind and circulation patterns. The study region in Northwest Costa Rica has a tropical wet-dry climate with a double-peak wet season. During the dry season the central Costa Rican mountains prevent tropical Atlantic moisture from reaching the region. Most of the annual precipitation is received following the northward migration of the ITCZ in May, which allows the region to benefit from moist southwesterly flow from the tropical Pacific. The wet season begins with a short period of "early rains" and is interrupted by the mid-summer drought associated with the intensification and westward expansion of the North Atlantic subtropical high in late June. Model projections for the 21st century indicate a lengthening and intensification of the mid-summer drought and a weakening of the early rains on which current crop cultivation practices rely. We developed an expert elicitation to systematically address uncertainties in the available model projections of changes in the seasonal precipitation pattern. Our approach extends an elicitation method developed previously at Carnegie Mellon University. Experts in the climate of the study region or Central American climate were asked to assess the mechanisms driving precipitation during each part of the season, uncertainties regarding these mechanisms, expected changes in each mechanism in a warming climate, and the capacity of current models to reproduce these processes. To avoid overconfidence bias, a step-by-step procedure was followed to estimate changes in the timing and intensity of precipitation during each part of the season. The questions drew upon interviews conducted with the region's stakeholders to assess their climate information needs.

  10. Binary variable multiple-model multiple imputation to address missing data mechanism uncertainty: application to a smoking cessation trial.

    PubMed

    Siddique, Juned; Harel, Ofer; Crespi, Catherine M; Hedeker, Donald

    2014-07-30

    The true missing data mechanism is never known in practice. We present a method for generating multiple imputations for binary variables, which formally incorporates missing data mechanism uncertainty. Imputations are generated from a distribution of imputation models rather than a single model, with the distribution reflecting subjective notions of missing data mechanism uncertainty. Parameter estimates and standard errors are obtained using rules for nested multiple imputation. Using simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal smoking cessation trial where nonignorably missing data were a concern. Our method provides a simple approach for formalizing subjective notions regarding nonresponse and can be implemented using existing imputation software.
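    The nested multiple-imputation combining rules mentioned above take roughly the following form (after the Shen/Harel rules; a sketch, not the authors' code). Here `q[m, n]` and `u[m, n]` are hypothetical point estimates and squared standard errors from imputation n under imputation model m:

```python
import numpy as np

def nested_mi_combine(q, u):
    """Combine nested multiple-imputation results.

    q, u : arrays of shape (M, N) -- point estimates and their squared
    standard errors from N imputations under each of M imputation models.
    Returns the overall estimate and its total variance.
    """
    M, N = q.shape
    qbar_m = q.mean(axis=1)                  # per-model mean estimates
    qbar = qbar_m.mean()                     # overall estimate
    ubar = u.mean()                          # average within-imputation variance
    b = qbar_m.var(ddof=1)                   # between-model variance
    w = np.mean([q[m].var(ddof=1) for m in range(M)])  # within-model, between-imputation
    total_var = ubar + (1 + 1 / M) * b + (1 - 1 / N) * w
    return qbar, total_var
```

    The between-model term `b` is what carries the missing-data-mechanism uncertainty into the standard errors; with a single model (M = 1) that term vanishes and the uncertainty is understated.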

  11. Addressing Missing Data Mechanism Uncertainty using Multiple-Model Multiple Imputation: Application to a Longitudinal Clinical Trial.

    PubMed

    Siddique, Juned; Harel, Ofer; Crespi, Catherine M

    2012-12-01

    We present a framework for generating multiple imputations for continuous data when the missing data mechanism is unknown. Imputations are generated from more than one imputation model in order to incorporate uncertainty regarding the missing data mechanism. Parameter estimates based on the different imputation models are combined using rules for nested multiple imputation. Through the use of simulation, we investigate the impact of missing data mechanism uncertainty on post-imputation inferences and show that incorporating this uncertainty can increase the coverage of parameter estimates. We apply our method to a longitudinal clinical trial of low-income women with depression where nonignorably missing data were a concern. We show that different assumptions regarding the missing data mechanism can have a substantial impact on inferences. Our method provides a simple approach for formalizing subjective notions regarding nonresponse so that they can be easily stated, communicated, and compared.

  12. Optimal regeneration planning for old-growth forest: addressing scientific uncertainty in endangered species recovery through adaptive management

    USGS Publications Warehouse

    Moore, C.T.; Conroy, M.J.

    2006-01-01

    Stochastic and structural uncertainties about forest dynamics present challenges in the management of ephemeral habitat conditions for endangered forest species. Maintaining critical foraging and breeding habitat for the endangered red-cockaded woodpecker (Picoides borealis) requires an uninterrupted supply of old-growth forest. We constructed and optimized a dynamic forest growth model for the Piedmont National Wildlife Refuge (Georgia, USA) with the objective of perpetuating a maximum stream of old-growth forest habitat. Our model accommodates stochastic disturbances and hardwood succession rates, and uncertainty about model structure. We produced a regeneration policy that was indexed by current forest state and by current weight of evidence among alternative model forms. We used adaptive stochastic dynamic programming, which anticipates that model probabilities, as well as forest states, may change through time, with consequent evolution of the optimal decision for any given forest state. In light of considerable uncertainty about forest dynamics, we analyzed a set of competing models incorporating extreme, but plausible, parameter values. Under any of these models, forest silviculture practices currently recommended for the creation of woodpecker habitat are suboptimal. We endorse fully adaptive approaches to the management of endangered species habitats in which predictive modeling, monitoring, and assessment are tightly linked.

  13. MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental de...

  14. MEETING IN TUCSON: MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental dec...

  15. Effectiveness and Tradeoffs between Portfolios of Adaptation Strategies Addressing Future Climate and Socioeconomic Uncertainties in California's Central Valley

    NASA Astrophysics Data System (ADS)

    Tansey, M. K.; Van Lienden, B.; Das, T.; Munevar, A.; Young, C. A.; Flores-Lopez, F.; Huntington, J. L.

    2013-12-01

    The Central Valley of California is one of the major agricultural areas in the United States. The Central Valley Project (CVP) is operated by the Bureau of Reclamation to serve multiple purposes including generating approximately 4.3 million gigawatt hours of hydropower and providing, on average, 5 million acre-feet of water per year to irrigate approximately 3 million acres of land in the Sacramento, San Joaquin, and Tulare Lake basins, 600,000 acre-feet per year of water for urban users, and 800,000 acre-feet of annual supplies for environmental purposes. The development of effective adaptation and mitigation strategies requires assessing multiple risks including potential climate changes as well as uncertainties in future socioeconomic conditions. In this study, a scenario-based analytical approach was employed by combining three potential 21st century socioeconomic futures with six representative climate and sea level change projections developed using a transient hybrid delta ensemble method from an archive of 112 bias corrected spatially downscaled CMIP3 global climate model simulations to form 18 future socioeconomic-climate scenarios. To better simulate the effects of climate changes on agricultural water demands, analyses of historical agricultural meteorological station records were employed to develop estimates of future changes in solar radiation and atmospheric humidity from the GCM simulated temperature and precipitation. Projected changes in atmospheric carbon dioxide were computed directly by weighting SRES emissions scenarios included in each representative climate projection. These results were used as inputs to a calibrated crop water use, growth and yield model to simulate the effects of climate changes on the evapotranspiration and yields of major crops grown in the Central Valley. 
Existing hydrologic, reservoir operations, water quality, hydropower, greenhouse gas (GHG) emissions and both urban and agricultural economic models were integrated

  16. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes

    PubMed Central

    Curtis, Janelle M.R.

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. 
Our results underscore the importance of considering habitat attributes along

  17. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes.

    PubMed

    Naujokaitis-Lewis, Ilona; Curtis, Janelle M R

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. 
Our results underscore the importance of considering habitat attributes along

  18. Addressing solar modulation and long-term uncertainties in scaling secondary cosmic rays for in situ cosmogenic nuclide applications [rapid communication

    NASA Astrophysics Data System (ADS)

    Lifton, Nathaniel A.; Bieber, John W.; Clem, John M.; Duldig, Marc L.; Evenson, Paul; Humble, John E.; Pyle, Roger

    2005-10-01

    Solar modulation affects the secondary cosmic rays responsible for in situ cosmogenic nuclide (CN) production the most at the high geomagnetic latitudes to which CN production rates are traditionally referenced. While this has long been recognized (e.g., D. Lal, B. Peters, Cosmic ray produced radioactivity on the Earth, in: K. Sitte (Ed.), Handbuch Der Physik XLVI/2, Springer-Verlag, Berlin, 1967, pp. 551-612 and D. Lal, Theoretically expected variations in the terrestrial cosmic ray production rates of isotopes, in: G.C. Castagnoli (Ed.), Proceedings of the Enrico Fermi International School of Physics 95, Italian Physical Society, Varenna 1988, pp. 216-233), these variations can lead to potentially significant scaling model uncertainties that have not been addressed in detail. These uncertainties include the long-term (millennial-scale) average solar modulation level to which secondary cosmic rays should be referenced, and short-term fluctuations in cosmic ray intensity measurements used to derive published secondary cosmic ray scaling models. We have developed new scaling models for spallogenic nucleons, slow-muon capture and fast-muon interactions that specifically address these uncertainties. Our spallogenic nucleon scaling model, which includes data from portions of 5 solar cycles, explicitly incorporates a measure of solar modulation (S), and our fast- and slow-muon scaling models (based on more limited data) account for solar modulation effects through increased uncertainties. These models improve on previously published models by better sampling the observed variability in measured cosmic ray intensities as a function of geomagnetic latitude, altitude, and solar activity. Furthermore, placing the spallogenic nucleon data in a consistent time-space framework allows for a more realistic assessment of uncertainties in our model than in earlier ones. We demonstrate here that our models reasonably account for the effects of solar modulation on measured

  19. Procedures for addressing uncertainty and variability in exposure to characterize potential health risk from trichloroethylene contaminated groundwater at Beale Air Force Base in California

    SciTech Connect

    Bogen, K T; Daniels, J I; Hall, L C

    1999-09-01

    This study was designed to accomplish two objectives. The first was to provide the US Air Force and the regulatory community with quantitative procedures for addressing uncertainty and variability in exposure, to better characterize potential health risk. Such methods could be used at sites where populations may now, or in the future, be faced with using groundwater contaminated with low concentrations of the chemical trichloroethylene (TCE). The second was to illustrate and explain the application of these procedures with respect to available data for TCE in groundwater beneath an inactive landfill site undergoing remediation at Beale Air Force Base in California. The results from this illustration provide more detail than the traditional conservative deterministic, screening-level calculations of risk, which were also computed for purposes of comparison. Application of the procedures described in this report can lead to more reasonable and equitable risk-acceptability criteria for potentially exposed populations at specific sites.

  20. Procedures for addressing uncertainty and variability in exposure to characterize potential health risk from trichloroethylene contaminated ground water at Beale Air Force Base in California

    SciTech Connect

    Daniels, J I; Bogen, K T; Hall, L C

    1999-10-05

    Conservative deterministic, screening-level calculations of exposure and risk commonly are used in quantitative assessments of potential human-health consequences from contaminants in environmental media. However, these calculations generally are based on multiple upper-bound point estimates of input parameters, particularly for exposure attributes, and can therefore produce results for decision makers that actually overstate the need for costly remediation. Alternatively, a more informative and quantitative characterization of health risk can be obtained by quantifying uncertainty and variability in exposure. This process is illustrated in this report for a hypothetical population at a specific site at Beale Air Force Base in California, where there is trichloroethylene (TCE) contaminated ground water and a potential for future residential use. When uncertainty and variability in exposure were addressed jointly for this case, the 95th-percentile upper-bound value of individual excess lifetime cancer risk was a factor approaching 10 lower than the most conservative deterministic estimate. Additionally, the probability of more than zero additional cases of cancer can be estimated, and in this case it is less than 0.5 for a hypothetical future residential population of up to 26,900 individuals present for any 7.6-y interval of a 70-y time period. Clearly, the results from application of this probabilistic approach can provide reasonable and equitable risk-acceptability criteria for a contaminated site.
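    The contrast the abstract draws, between a joint probabilistic upper bound and a deterministic estimate built by stacking per-input upper bounds, can be illustrated with a generic Monte Carlo sketch. The lognormal input distributions below are hypothetical placeholders, not the Beale site data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# hypothetical input distributions: chronic daily intake and cancer potency
intake = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=n)   # mg/kg-day
slope = rng.lognormal(mean=np.log(1e-2), sigma=0.5, size=n)    # per (mg/kg-day)
risk = intake * slope                                          # excess lifetime risk

p95 = np.quantile(risk, 0.95)                                  # joint probabilistic bound
deterministic = np.quantile(intake, 0.95) * np.quantile(slope, 0.95)
# multiplying per-input 95th percentiles overstates the joint 95th percentile,
# because the inputs rarely sit at their upper bounds simultaneously
```

    This compounding of per-parameter conservatism is why the probabilistic 95th-percentile risk in the study came out roughly an order of magnitude below the deterministic screening value.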

  1. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk from Trichloroethylene-Contaminated Ground Water at Beale Air Force Base in California:Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    SciTech Connect

    Bogen, K T

    2001-05-24

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability within a systematic probabilistic framework to integrate the joint effects on risk of distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such a framework was used to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_C) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_C based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^-6 and 10^-4, respectively, while corresponding estimates based on traditional deterministic methods were >10^-5 and 10^-4, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The systematic probabilistic framework illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  2. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk From Trichloroethylene-Contaminated Ground Water Beale Air Force Base in California: Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    SciTech Connect

    Bogen, K.T.

    1999-09-29

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability after applying a unified probabilistic approach to the distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such an approach was applied to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_C) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_C based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^-6 and <10^-4, respectively, while corresponding estimates based on traditional deterministic methods were >10^-5 and >10^-4, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The unified approach illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  3. Propellant-remaining modeling

    NASA Technical Reports Server (NTRS)

    Torgovitsky, S.

    1991-01-01

    A successful satellite mission is predicated upon the proper maintenance of the spacecraft's orbit and attitude. One requirement for planning and predicting the orbit and attitude is the accurate estimation of the propellant remaining onboard the spacecraft. The focus is on three methods that were developed for calculating the propellant budget, the errors associated with each method, and the uncertainties in the variables required to determine the propellant remaining that contribute to these errors. Based on these findings, a strategy is developed for improved propellant-remaining estimation. The first method is based on Boyle's law, which relates the pressure, volume, and temperature (PVT) of an ideal gas. The PVT method is used for both the monopropellant and bipropellant engines. The second method is based on engine performance tests, which provide data relating the thrust and specific impulse associated with a propellant tank to that tank's pressure. Two curves representing thrust and specific impulse as functions of pressure are then generated using a polynomial fit on the engine performance data. The third method involves a computer simulation of the propulsion system. The propellant flow is modeled by creating a conceptual model of the propulsion system configuration, taking into account such factors as the propellant and pressurant tank characteristics, thruster functionality, and piping layout. Finally, a thrust calibration technique is presented that uses differential correction with the computer-simulation method of propellant-remaining modeling. Thrust calibration provides a better assessment of thruster performance and therefore enables a more accurate estimation of the propellant consumed during a given maneuver.
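    The PVT bookkeeping described above can be sketched in a few lines: the pressurant gas obeys the ideal-gas relation, so its current volume follows from the loading conditions, and liquid propellant fills the rest of the tank. The function and the loading numbers below are hypothetical illustrations, not the report's actual budget code:

```python
def propellant_remaining_pvt(p0, v_gas0, t0, p_now, t_now, v_tank, rho):
    """Estimate propellant mass remaining (kg) via the PVT method.

    p0, v_gas0, t0 : pressurant pressure (Pa), volume (m^3), temperature (K) at loading
    p_now, t_now   : current pressurant pressure and temperature
    v_tank, rho    : total tank volume (m^3) and propellant density (kg/m^3)
    """
    v_gas_now = p0 * v_gas0 * t_now / (t0 * p_now)  # ideal gas: p*v/t = const
    v_prop = v_tank - v_gas_now                     # liquid volume remaining
    return rho * v_prop

# hypothetical tank: loaded at 20 bar with 0.05 m^3 ullage at 293 K,
# now reading 10 bar at the same temperature
mass = propellant_remaining_pvt(
    p0=20e5, v_gas0=0.05, t0=293.0,
    p_now=10e5, t_now=293.0, v_tank=0.2, rho=1000.0,
)
```

    The method's error budget follows directly from this formula: uncertainties in the pressure and temperature transducer readings propagate into `v_gas_now`, and hence into the estimated remaining mass, which is why the report compares PVT against the engine-performance and simulation methods.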

  4. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  5. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a given traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can introduce uncertainties into the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. It is therefore essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, network planning problems with traffic requirements that vary over time were studied. This kind of problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty, studied actively in the past decade, addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework, under more general uncertainty conditions, that allows a more systematic way to solve them. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  6. Changing language, remaining pygmy.

    PubMed

    Bahuchet, Serge

    2012-02-01

    In this article I illustrate the linguistic diversity of African Pygmy populations in order to better address their anthropological diversity and history. I also introduce a new method, based on the analysis of specialized vocabulary, to reconstruct the substratum of some of the languages they speak. I show that Pygmy identity is not based on their languages, which have often been borrowed from the neighboring non-Pygmy farmer communities with whom each Pygmy group is linked. Understanding the nature of this partnership, quite variable across history, is essential to addressing Pygmy languages, identity, and history. Finally, I show that only a multidisciplinary approach is likely to push forward the understanding of African Pygmy societies, as genetic, archeological, anthropological, and ethnological evidence suggests.

  7. Chemical Loss of Polar Ozone: Present Understanding and Remaining Uncertainties

    NASA Technical Reports Server (NTRS)

    Salawitch, Ross; Canty, Tim; Cunnold, Derek; Dorf, Marcel; Frieler, Katja; Godin-Beekman, Sophie; Newchurch, Michael; Pfeilsticker, Klaus; Rex, Markus; Stimpfle, Rick; Streibel, Martin; vonderGathen, Peter; Weisenstein, Debra; Yan, Eun-Su

    2005-01-01

    Not long after the discovery of the Antarctic ozone hole, it was established that halogen compounds, supplied to the atmosphere mainly by anthropogenic activities, are the primary driver of polar ozone loss. We will briefly review the chemical mechanisms that cause polar ozone loss and the early evidence showing the key role played by anthropogenic halogens. Recently, stratospheric halogen loading has leveled off, due to adherence to the Montreal Protocol and its amendments, which have essentially banned CFCs (chlorofluorocarbons) and other halocarbons. We will describe recent reports of the first stage of recovery of the Antarctic ozone hole (e.g., a statistically significant slowing of the downward trend), associated with the leveling off of stratospheric halogens. Despite this degree of understanding, we will discuss the tendency of photochemical models to underestimate the observed rate of polar ozone loss and a hypothesis that has recently been put forth that might resolve this discrepancy. Finally, we will briefly discuss chemical loss of Arctic ozone, which

  8. Chemical Loss of Polar Ozone: Present Understanding and Remaining Uncertainties

    NASA Astrophysics Data System (ADS)

    Salawitch, R. J.; Canty, T.; Cunnold, D.; Dorf, M.; Frieler, K.; Godin-Beekman, S.; Newchurch, M. J.; Pfeilsticker, K.; Rex, M.; Stimpfle, R. M.; Streibel, M.; von der Gathen, P.; Weisenstein, D. K.; Yang, E.

    2005-12-01

    Not long after the discovery of the Antarctic ozone hole, it was established that halogen compounds, supplied to the atmosphere mainly by anthropogenic activities, are the primary driver of polar ozone loss. We will briefly review the chemical mechanisms that cause polar ozone loss and the early evidence showing the key role played by anthropogenic halogens. Recently, stratospheric halogen loading has leveled off, due to adherence to the Montreal Protocol and its amendments, which have essentially banned CFCs (chlorofluorocarbons) and other halocarbons. We will describe recent reports of the first stage of recovery of the Antarctic ozone hole (e.g., a statistically significant slowing of the downward trend), associated with the leveling off of stratospheric halogens. Despite this degree of understanding, we will discuss the tendency of photochemical models to underestimate the observed rate of polar ozone loss and a hypothesis that has recently been put forth that might resolve this discrepancy. Finally, we will briefly discuss chemical loss of Arctic ozone, which also involves anthropogenic halogens but is governed primarily by year-to-year variability in stratospheric temperature.

  9. [PALEOPATHOLOGY OF HUMAN REMAINS].

    PubMed

    Minozzi, Simona; Fornaciari, Gino

    2015-01-01

    Many diseases induce alterations in the human skeleton, leaving traces of their presence in ancient remains. Paleopathological examination of human remains not only allows the study of the history and evolution of disease, but also the reconstruction of health conditions in past populations. This paper describes the most interesting diseases observed in skeletal samples from the Roman Imperial Age necropolises found in urban and suburban areas of Rome during archaeological excavations in recent decades. The diseases observed were grouped into the following categories: articular diseases, traumas, infections, metabolic or nutritional diseases, congenital diseases, and tumours, and some examples are reported for each group. Although extensive epidemiological investigation in ancient skeletal records is impossible, the paleopathological study made it possible to highlight the spread of numerous illnesses, many of which can be related to the living and health conditions of the Roman population.

  10. [What remains of arthrography?].

    PubMed

    Morvan, G

    1994-06-15

    In the era of MRI, arthrography can appear old-fashioned. However this examination, which is enjoying a second youth thanks to the addition of CT scanning (arthro-CT), remains the gold standard for the exploration of many pathologic situations: intra-articular foreign bodies, tears of the glenoid or acetabular labrum, precise assessment of chondral or ligamentous lesions (especially of the ankle), subscapularis tendon tears, adhesive capsulitis, complications of prostheses, and verification of the intra-articular position of the needle tip before injection of a therapeutic drug. Arthrography, whether or not completed by CT slices, gives in these indications excellent spatial-resolution images that are easy to acquire, read, understand, and transmit to clinicians, at a reasonable cost and with minor risk. MRI is an increasingly used alternative, especially for the study of the menisci and ligaments of the knee and the rotator cuff of the shoulder. With further increases in MRI image quality, other common indications will no doubt shift toward that technique; nevertheless, for now (and, it seems to me, for a long time to come) arthrography and arthro-CT will remain excellent diagnostic tools with a very favorable benefit/risk ratio.

  11. Addressing healthcare.

    PubMed

    Daly, Rich

    2013-02-11

    Though President Barack Obama has rarely made healthcare references in his State of the Union addresses, health policy experts are hoping he changes that strategy this year. "The question is: Will he say anything? You would hope that he would, given that that was the major issue he started his presidency with," says Dr. James Weinstein, left, of the Dartmouth-Hitchcock health system.

  12. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty, and their relationships, are addressed for the case where the uncertainty is not a probability of each realization. A well known model that can handle…

  13. Inaugural address

    NASA Astrophysics Data System (ADS)

    Joshi, P. S.

    2014-03-01

    From jets to cosmos to cosmic censorship P S Joshi Tata Institute of Fundamental Research, Homi Bhabha Road, Colaba, Mumbai 400005, India E-mail: psj@tifr.res.in 1. Introduction At the outset, I should like to acknowledge that part of the title above, which tries to capture the main flavour of this meeting, has been borrowed from one of the plenary talks at the conference. When we set out to make the programme for the conference, we thought of beginning with observations on the Universe, but we certainly wanted to go further and address deeper questions, which were at the very foundations of our inquiry and understanding of the nature and structure of the Universe. I believe we succeeded to a good extent, and it is all here for you in the form of these Conference Proceedings, which have been aptly titled 'Vishwa Mimansa', which could possibly be translated as 'Analysis of the Universe'! It is my great pleasure and privilege to welcome you all to the ICGC-2011 meeting at Goa. The International Conference on Gravitation and Cosmology (ICGC) series of meetings is organized by the Indian Association for General Relativity and Gravitation (IAGRG); the first such meeting was planned and conducted in Goa in 1987, with subsequent meetings taking place at intervals of about four years at various locations in India. So it was thought appropriate to return to Goa to celebrate 25 years of the ICGC meetings. The recollections from that first meeting have been recorded elsewhere in these Proceedings. Research and teaching on gravitation and cosmology were initiated quite early in India, by V V Narlikar at the Banares Hindu University, and by N R Sen in Kolkata, in the 1930s. In course of time this activity grew and gained momentum, and in early 1969, at the felicitation held for the 60th birthday of V V Narlikar at a conference in Ahmedabad, P C Vaidya proposed the formation of the IAGRG society, with V V Narlikar being its first President. 
This

  14. Convocation address.

    PubMed

    Kakodkar, A

    1999-07-01

    This convocation address by Dr. Anil Kakodkar focuses on the challenges faced by graduating students. In his speech, he emphasized the high level of excellence achieved by the industrial sector; however, he noted that there has been a loss of initiative in maximizing value addition, worsened by increasing population pressure. Facing stiff competition in external and domestic markets, it is imperative to maximize value addition within the country in a competitive manner and capture the highest possible market share. To achieve this, high-quality human resources are central. Likewise, family planning programs should become more effective and direct available resources toward national advantage. To boost the domestic market, he suggested the need to search for strengths to achieve a leadership position in those areas. First, an insight into the relationship between the lifestyles and needs of our people and the natural resource endowment must be gained. Second, remodeling of the education system must be undertaken to prepare people for adding the necessary innovative content to our value addition activities. Lastly, Dr. Kakodkar emphasized the significance of developing a strong bond between parents and children to provide a sound foundation and allow the education system to grow upon it.

  15. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are described briefly in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
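The chapter summarized above concerns the GUM framework, in which input uncertainties are combined through sensitivity coefficients. As a minimal sketch only (the measurand, quantities, and numerical values below are invented for illustration and do not come from this record), the law of propagation of uncertainty for uncorrelated inputs can be written as:

```python
import math

def combined_uncertainty(sensitivities, uncertainties):
    """GUM-style combined standard uncertainty for uncorrelated inputs:
    u_c = sqrt(sum_i (c_i * u_i)^2), where c_i = df/dx_i."""
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, uncertainties)))

# Invented example: concentration C = m / V
m, u_m = 10.0, 0.02      # mass in mg, with its standard uncertainty
V, u_V = 0.100, 0.0005   # volume in L, with its standard uncertainty
c_m = 1 / V              # sensitivity coefficient dC/dm
c_V = -m / V ** 2        # sensitivity coefficient dC/dV
u_C = combined_uncertainty([c_m, c_V], [u_m, u_V])
print(round(u_C, 3))     # combined standard uncertainty of C, in mg/L
```

For correlated inputs the GUM adds covariance terms, which this sketch omits.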

  16. Opening address

    NASA Astrophysics Data System (ADS)

    Castagnoli, C.

    1994-01-01

    Ladies and Gentlemen My cordial thanks to you for participating in our workshop and to all those who have sponsored it. When in 1957 I attended the International Congress on Fundamental Constants held in Turin on the occasion of the first centenary of the death of Amedeo Avogadro, I did not expect that about thirty-five years later a small but representative number of distinguished scientists would meet here again, to discuss how to go beyond the sixth decimal figure of the Avogadro constant. At that time, the uncertainty of the value of this constant was linked to the fourth decimal figure, as reported in the book by DuMond and Cohen. The progress made in the meantime is universally acknowledged to be due to the discovery of x-ray interferometry. We are honoured that one of the two founding fathers, Prof. Ulrich Bonse, is here with us, but we regret that the other, Prof. Michael Hart, is not present. After Bonse and Hart's discovery, the x-ray crystal density method triggered, as in a chain reaction, the investigation of two other quantities related to the Avogadro constant—density and molar mass. Scientists became, so to speak, resonant and since then have directed their efforts, just to mention a few examples, to producing near-perfect silicon spheres and determining their density, to calibrating, with increasing accuracy, mass spectrometers, and to studying the degree of homogeneity of silicon specimens. Obviously, I do not need to explain to you why the Avogadro constant is important. I wish, however, to underline that it is not only because of its position among fundamental constants, as we all know very well its direct links with the fine structure constant, the Boltzmann and Faraday constants, the h/e ratio, but also because when a new value of NA is obtained, the whole structure of the fundamental constants is shaken to a lesser or greater extent. Let me also remind you that the second part of the title of this workshop concerns the silicon

  17. Opening Address

    NASA Astrophysics Data System (ADS)

    Yamada, T.

    2014-12-01

    Ladies and Gentlemen, it is my great honor and pleasure to present the opening address of the 3rd International Workshop on "State of the Art in Nuclear Cluster Physics" (SOTANCP3). On behalf of the organizing committee, I welcome you all to the KGU Kannai Media Center of Kanto Gakuin University and to Yokohama. In particular, to those who have come from abroad, from more than 17 countries, I appreciate your participation after long trips from your homelands to Yokohama. The first international workshop on "State of the Art in Nuclear Cluster Physics", called SOTANCP, was held in Strasbourg, France, in 2008, and the second one was held in Brussels, Belgium, in 2010. The third workshop is now held in Yokohama. In this period, we had the traditional 10th cluster conference in Debrecen, Hungary, in 2012. Thus we have the traditional cluster conference and SOTANCP, one after another, every two years. This clearly shows that our field of nuclear cluster physics is very active and flourishing. This is the first time in about 10 years that an international workshop on nuclear cluster physics has been held in Japan; the last cluster conference held in Japan was in Nara in 2003, about 10 years ago. The president of the Nara conference was Prof. K. Ikeda, and the chairpersons were Prof. H. Horiuchi and Prof. I. Tanihata. I think quite a lot of people in this room participated in the Nara conference. Since then, about ten years have passed, so this workshop has profound significance for our Japanese colleagues. The subjects of this workshop are to discuss "the state of the art in nuclear cluster physics" and the prospects of this field. In the past couple of years, we have seen significant progress in this field both in theory and in experiment, which has brought better and new understanding of the clustering aspects of stable and unstable nuclei. I think the concept of clustering has become more important than ever. This is true also in the

  18. Presidential address.

    PubMed

    Vohra, U

    1993-07-01

    The Secretary of India's Ministry of Health and Family Welfare serves as Chair of the Executive Council of the International Institute for Population Sciences in Bombay. She addressed its 35th convocation in 1993. Global population stands at 5.43 billion and increases by about 90 million people each year. 84 million of these new people are born in developing countries. India contributes 17 million new people annually. The annual population growth rate in India is about 2%. Its population size will probably surpass 1 billion by the year 2000. High population growth rates are a leading obstacle to socioeconomic development in developing countries. Governments of many developing countries recognize this problem and have expanded their family planning programs to stabilize population growth. Asian countries that have done so and have completed the fertility transition include China, Japan, Singapore, South Korea, and Thailand. Burma, Malaysia, North Korea, Sri Lanka, and Vietnam have not yet completed the transition. Afghanistan, Bangladesh, Iran, Nepal, and Pakistan are half-way through the transition. High population growth rates put pressure on land by fragmenting finite land resources, increasing the number of landless laborers and unemployment, and by causing considerable rural-urban migration. All these factors bring about social stress and burden civic services. India has reduced its total fertility rate from 5.2 to 3.9 between 1971 and 1991. Some Indian states have already achieved replacement fertility. Considerable disparity in socioeconomic development exists among states and districts. For example, the states of Bihar, Madhya Pradesh, Rajasthan, and Uttar Pradesh have female literacy rates lower than 27%, while that for Kerala is 87%. Overall, infant mortality has fallen from 110 to 80 between 1981 and 1990. In Uttar Pradesh, it has fallen from 150 to 98, while it is at 17 in Kerala. India needs innovative approaches to increase contraceptive prevalence rates

  19. Welcome Address

    NASA Astrophysics Data System (ADS)

    Kiku, H.

    2014-12-01

    Ladies and Gentlemen, it is an honor for me to present my welcome address at the 3rd International Workshop on "State of the Art in Nuclear Cluster Physics" (SOTANCP3), as the president of Kanto Gakuin University. Particularly to those who have come from abroad, from more than 17 countries, I am very grateful for your participation after long trips from your homes to Yokohama. On behalf of Kanto Gakuin University, we warmly welcome your visit to our university and your stay in Yokohama. First I would like to introduce Kanto Gakuin University briefly. Kanto Gakuin University, known as KGU, traces its roots back to the Yokohama Baptist Seminary founded in 1884 in Yamate, Yokohama. The seminary's founder was Albert Arnold Bennett, an alumnus of Brown University, who came to Japan from the United States to establish a theological seminary for cultivating and training Japanese missionaries. Now KGU is a major member of the Kanto Gakuin School Corporation, which is composed of two kindergartens, two primary schools, two junior high schools, and two senior high schools as well as KGU. In this university, we have eight faculties with graduate schools, including Humanities, Economics, Law, Sciences and Engineering, Architecture and Environmental Design, Human and Environmental Studies, Nursing, and the Law School. Over eleven thousand students are currently studying at our university. By the way, my major is geotechnical engineering, and I belong to the Faculty of Sciences and Engineering of my university. Prof. T. Yamada, here, is my colleague in the same faculty. I know that nuclear physics is one of the most active academic fields in the world. In fact, about half of the participants, namely more than 50 scientists, have come from abroad to this conference. Moreover, I know that nuclear physics is related not only to other fundamental physics such as elementary particle physics and astrophysics but also to chemistry, medical sciences, medical care, and radiation metrology

  20. President's Address

    PubMed Central

    Craig, Maurice

    1928-01-01

    patients may remain in good health and full mental activity for many years under treatment. Research made from this standpoint may be of much value in the prevention and treatment of functional nervous disorder. PMID:19986761

  1. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  2. SU(2) uncertainty limits

    NASA Astrophysics Data System (ADS)

    Shabbir, Saroosh; Björk, Gunnar

    2016-05-01

    Although progress has been made recently in defining nontrivial uncertainty limits for the SU(2) group, a description of the intermediate states bounded by these limits remains lacking. In this paper we enumerate possible uncertainty relations for the SU(2) group that involve all three observables and that are, moreover, invariant under SU(2) transformations. We demonstrate that these relations, however, even taken as a group, do not provide sharp, saturable bounds. To find sharp bounds, we systematically calculate the variance of the SU(2) operators for all pure states belonging to the N = 2 and N = 3 polarization excitation manifolds (corresponding to spin 1 and spin 3/2). Lastly, and perhaps counter to expectation, we note that even pure states can reach the maximum uncertainty limit.

  3. Parasite remains in archaeological sites.

    PubMed

    Bouchet, Françoise; Guidon, Niéde; Dittmar, Katharina; Harter, Stephanie; Ferreira, Luiz Fernando; Chaves, Sergio Miranda; Reinhard, Karl; Araújo, Adauto

    2003-01-01

    Organic remains can be found in many different environments. They are the most significant source for paleoparasitological studies as well as for other paleoecological reconstructions. Preserved paleoparasitological remains are found from the driest to the moistest conditions. They help us to understand past and present diseases and therefore contribute to understanding the evolution of present human sociality, biology, and behavior. In this paper, the scope of the surviving evidence is briefly surveyed, and the great variety of ways it has been preserved in different environments is discussed. This is done to develop the most appropriate techniques for recovering parasite remains. Different techniques applied to the study of paleoparasitological remains, preserved in different environments, are presented. The most common materials used to analyze prehistoric human groups are reviewed, and their potential for reconstructing ancient environments and disease is emphasized. This paper also urges increased cooperation among archaeologists, paleontologists, and paleoparasitologists.

  4. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information, and the approach taken to quantify uncertainty in metrology, are addressed. This paper then discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement, as well as of characteristic limits according to ISO 11929, are described, and the need for a revision of the latter standard is explained.

  5. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  6. Uncertainty quantification in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Rizzi, Francesco

    This dissertation focuses on uncertainty quantification (UQ) in molecular dynamics (MD) simulations. The application of UQ to molecular dynamics is motivated by the broad uncertainty characterizing MD potential functions and by the complexity of the MD setting, where even small uncertainties can be amplified to yield large uncertainties in the model predictions. Two fundamental, distinct sources of uncertainty are investigated in this work, namely parametric uncertainty and intrinsic noise. Intrinsic noise is inherently present in the MD setting, due to fluctuations originating from thermal effects. Averaging methods can be exploited to reduce the fluctuations, but due to finite sampling, this effect cannot be completely filtered, thus yielding a residual uncertainty in the MD predictions. Parametric uncertainty, on the contrary, is introduced in the form of uncertain potential parameters, geometry, and/or boundary conditions. We address the UQ problem in both its main components, namely the forward propagation, which aims at characterizing how uncertainty in model parameters affects selected observables, and the inverse problem, which involves the estimation of target model parameters based on a set of observations. The dissertation highlights the challenges arising when parametric uncertainty and intrinsic noise combine to yield non-deterministic, noisy MD predictions of target macroscale observables. Two key probabilistic UQ methods, namely Polynomial Chaos (PC) expansions and Bayesian inference, are exploited to develop a framework that enables one to isolate the impact of parametric uncertainty on the MD predictions and, at the same time, properly quantify the effect of the intrinsic noise. Systematic applications to a suite of problems of increasing complexity lead to the observation that an uncertain PC representation built via Bayesian regression is the most suitable model for the representation of uncertain MD predictions of target observables in the
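The dissertation's two uncertainty sources can be mimicked in a few lines. As an illustrative sketch only (a toy scalar observable, not an MD code or anything from the dissertation; all constants are invented), forward propagation of parametric uncertainty in the presence of intrinsic noise amounts to nesting finite-sample time-averaging inside Monte Carlo sampling of the uncertain parameter:

```python
import random
import statistics

random.seed(0)

def noisy_observable(eps, n_steps=200):
    """Toy 'MD' observable: linear in the potential parameter eps, plus
    zero-mean fluctuations standing in for thermal (intrinsic) noise.
    Averaging over n_steps reduces, but cannot remove, the noise."""
    samples = [2.0 * eps + random.gauss(0.0, 0.5) for _ in range(n_steps)]
    return statistics.mean(samples)

# Parametric uncertainty: eps is only known as a distribution, eps ~ N(1.0, 0.1).
predictions = [noisy_observable(random.gauss(1.0, 0.1)) for _ in range(500)]

# The spread of the predictions mixes both sources: roughly (2 * 0.1)^2
# from the parameter plus 0.5^2 / n_steps of residual intrinsic noise.
print(round(statistics.mean(predictions), 2), round(statistics.stdev(predictions), 2))
```

Polynomial Chaos and Bayesian regression, as used in the dissertation, replace this brute-force sampling with a functional representation of the parameter dependence, but the nesting of the two uncertainty sources is the same.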

  7. Content and Access Remain Key

    ERIC Educational Resources Information Center

    Johnson, Linda B.

    2007-01-01

    It is impossible to review the year's outstanding government publication landscape without acknowledging that change remains paramount. Just as striking, however, is that these changes go hand in hand with some familiar constants. Within this shifting environment, there are the consistency and dependability of government information itself,…

  8. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
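Since the record compares Latin Hypercube Sampling with conventional designs, a minimal sketch of how an LHS design stratifies each input dimension may help. The implementation below is a generic textbook construction, not the one evaluated in the report:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    """One LHS design on the unit hypercube: each dimension is split into
    n_samples equal strata and each stratum is sampled exactly once."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one random point inside each stratum, then shuffle the strata order
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))  # n_samples points, each of dimension n_dims

points = latin_hypercube(10, 2)
# Stratification property: every dimension has exactly one point per decile.
for d in range(2):
    assert sorted(int(p[d] * 10) for p in points) == list(range(10))
```

The per-dimension stratification is what lets LHS estimate output distributions with fewer runs than simple random sampling, while the rank transformations and stepwise regression mentioned above are still needed to attribute output uncertainty to individual inputs.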

  9. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

    In this paper, we review some highlights from the String theory, the black hole physics and the doubly special relativity and some thought experiments which were suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP) assuming modified dispersion relation, and therefore are allowed for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Saleker-Wigner inequalities, entropic nature of gravitational laws, Friedmann equations, minimal time measurement and thermodynamics of the high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty, simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles including the universality of the gravitational redshift and the free fall and law of reciprocal action and on the kinetic energy of composite system. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. The concern about the compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action should be addressed. We conclude that the value of the GUP parameters remain a puzzle to be verified.

  10. Becoming and remaining homeless: a qualitative investigation.

    PubMed

    Morrell-Bellai, T; Goering, P N; Boydell, K M

    2000-09-01

    This article reports the qualitative findings of a multimethod study of the homeless population in Toronto, Canada. The qualitative component sought to identify how people become homeless and why some individuals remain homeless for an extended period of time or cycle in and out of homelessness (the chronically homeless). In-depth, semistructured interviews were conducted with 29 homeless adults. The findings suggest that people both become and remain homeless due to a combination of macro level factors (poverty, lack of employment, low welfare wages, lack of affordable housing) and personal vulnerability (childhood abuse or neglect, mental health symptoms, impoverished support networks, substance abuse). Chronically homeless individuals often reported experiences of severe childhood trauma and tended to attribute their continued homelessness to a substance abuse problem. It is concluded that both macro and individual level factors must be considered in planning programs and services to address the issue of homelessness in Canada.

  11. Decomposition Technique for Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)

    2014-01-01

    The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational-conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from them), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of the different sources of uncertainty present in the process. Next, the maps are used in an on-line mode, where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both the current damage state and future damage accumulation. Remaining life is computed by subtracting the instant when the prediction is made from the instant when the extrapolated damage reaches the failure threshold.
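A hedged sketch of the decomposition idea follows. The two linear maps and all constants below are invented placeholders; in the disclosed tool these maps are fitted off-line by regression against ground-truth damage data:

```python
# Illustrative-only maps standing in for the two fitted regressions.
def feature_to_damage(feature):
    return 0.1 * feature          # map 1: sensor feature -> damage state

def conditions_to_damage_rate(load):
    return 0.002 * load           # map 2: operating condition -> damage rate

def remaining_useful_life(feature, load, failure_threshold=1.0):
    """Estimate current damage (map 1), extrapolate it forward at the
    damage rate implied by the operating conditions (map 2), and return
    the number of cycles until the failure threshold is crossed."""
    damage = feature_to_damage(feature)
    rate = conditions_to_damage_rate(load)
    if damage >= failure_threshold:
        return 0.0
    return (failure_threshold - damage) / rate

print(round(remaining_useful_life(feature=4.0, load=3.0), 6))  # → 100.0
```

Keeping the two maps separate is what allows the uncertainty of each (feature noise versus operating-condition variability) to be quantified and managed independently.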

  12. Uncertainty Can Increase Explanatory Credibility

    DTIC Science & Technology

    2013-08-01

metacognitive cue to infer their conversational partner’s depth of processing. Keywords: explanations, confidence, uncertainty, collaborative reasoning...scope, i.e., those that account for only observed phenomena (Khemlani, Sussman, & Oppenheimer, 2011). These preferences show that properties intrinsic...Fischhoff, & Phillips, 1982; Lindley, 1982; McClelland & Bolger, 1994). Much of the research on subjective confidence addresses how individuals

  13. When Does the Uncertainty Become Non-Gaussian

    NASA Astrophysics Data System (ADS)

    Alfriend, K.; Park, I.

    2016-09-01

The orbit state covariance is used in the conjunction assessment/probability of collision calculation. It can also be a valuable tool in track association, maneuver detection and sensor tasking. These uses all assume that the uncertainty is Gaussian. Studies have shown that the uncertainty at epoch (time of last observation) is reasonably Gaussian, but the neglected nonlinearities in the covariance propagation eventually result in the uncertainty becoming non-Gaussian. Numerical studies have shown that for space objects in low Earth orbit the covariance remains Gaussian the longest in orbital element space. It has been shown that the covariance remains Gaussian for up to 10 days in orbital element space, but becomes non-Gaussian after 2-3 days in Cartesian coordinates for a typical LEO orbit. The fundamental questions are when the uncertainty becomes non-Gaussian, and how one can determine, given the orbit state and covariance at epoch, when this occurs. A tool that an operator could use to compute the approximate time at which the uncertainty becomes non-Gaussian would be useful. This paper addresses the development of such a tool.

  14. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

This work provides a comprehensive technique to evaluate uncertainty, resulting in a more realistic evaluation of PI, thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated variable dependence in the equation for PI.
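The aleatory part of such an evaluation can be sketched with a toy path model. Everything below is an illustrative assumption rather than the report's actual model: a three-element adversary path, Beta-distributed detection probabilities, lognormal delays, and a fixed response time.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3-element adversary path: mean detection probabilities
# and mean delay times (seconds) for each protection element.
P_DET_MEAN = np.array([0.5, 0.7, 0.9])
DELAY_MEAN = np.array([60.0, 90.0, 30.0])
RESPONSE_TIME = 120.0  # assumed fixed guard response time, seconds

def estimate_pi(n_trials=20_000):
    """Monte Carlo estimate of PI: per trial, sample the detection
    probabilities (Beta) and delays (lognormal) once and reuse those
    same values throughout the trial, which is how sampling handles
    the repeated-variable dependence in the PI equation."""
    interrupted = 0
    for _ in range(n_trials):
        p_det = rng.beta(8 * P_DET_MEAN, 8 * (1 - P_DET_MEAN))
        delays = rng.lognormal(np.log(DELAY_MEAN), 0.2)
        for i in range(3):
            if rng.random() < p_det[i]:  # first detection at element i
                # Interruption requires the delay remaining after
                # detection to exceed the guard response time.
                if delays[i:].sum() > RESPONSE_TIME:
                    interrupted += 1
                break
    return interrupted / n_trials
```

Running the estimator many times with resampled distribution parameters would then expose the epistemic layer: how PI itself shifts as assumptions about adversary resources change.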

  15. Silicon photonics: some remaining challenges

    NASA Astrophysics Data System (ADS)

    Reed, G. T.; Topley, R.; Khokhar, A. Z.; Thompson, D. J.; Stanković, S.; Reynolds, S.; Chen, X.; Soper, N.; Mitchell, C. J.; Hu, Y.; Shen, L.; Martinez-Jimenez, G.; Healy, N.; Mailis, S.; Peacock, A. C.; Nedeljkovic, M.; Gardes, F. Y.; Soler Penades, J.; Alonso-Ramos, C.; Ortega-Monux, A.; Wanguemert-Perez, G.; Molina-Fernandez, I.; Cheben, P.; Mashanovich, G. Z.

    2016-03-01

    This paper discusses some of the remaining challenges for silicon photonics, and how we at Southampton University have approached some of them. Despite phenomenal advances in the field of Silicon Photonics, there are a number of areas that still require development. For short to medium reach applications, there is a need to improve the power consumption of photonic circuits such that inter-chip, and perhaps intra-chip applications are viable. This means that yet smaller devices are required as well as thermally stable devices, and multiple wavelength channels. In turn this demands smaller, more efficient modulators, athermal circuits, and improved wavelength division multiplexers. The debate continues as to whether on-chip lasers are necessary for all applications, but an efficient low cost laser would benefit many applications. Multi-layer photonics offers the possibility of increasing the complexity and effectiveness of a given area of chip real estate, but it is a demanding challenge. Low cost packaging (in particular, passive alignment of fibre to waveguide), and effective wafer scale testing strategies, are also essential for mass market applications. Whilst solutions to these challenges would enhance most applications, a derivative technology is emerging, that of Mid Infra-Red (MIR) silicon photonics. This field will build on existing developments, but will require key enhancements to facilitate functionality at longer wavelengths. In common with mainstream silicon photonics, significant developments have been made, but there is still much left to do. Here we summarise some of our recent work towards wafer scale testing, passive alignment, multiplexing, and MIR silicon photonics technology.

  16. Uncertainty and equipoise: at interplay between epistemology, decision making and ethics.

    PubMed

    Djulbegovic, Benjamin

    2011-10-01

In recent years, various authors have proposed that the concept of equipoise be abandoned because it conflates the practice of clinical care with clinical research. At the same time, the opponents of equipoise acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. As equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this article, I show how uncertainty (equipoise) lies at the intersection between epistemology, decision making and the ethics of clinical research. In particular, I show how our formulation of responses to uncertainties about hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to dual-processing theory, which postulates that a rational approach to (clinical research) decision making depends on both the analytical, deliberative processes embodied in the scientific method (system II) and good human intuition (system I). Ultimately, our choices can only become wiser if we understand the close and intertwined relationship between irreducible uncertainty, inevitable errors and unavoidable injustice.

  17. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    PubMed

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
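The Shannon entropy adaptation described above can be illustrated directly. The cause-of-death tallies below are hypothetical, not the Chilean vital statistics the study used:

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a cause-of-death tally: higher
    values mean the deaths are spread over more categories, i.e. the
    registry conveys less certainty about any single cause."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

precise = [900, 50, 30, 20]     # most deaths assigned to one well-defined cause
diffuse = [250, 250, 250, 250]  # deaths spread evenly across four causes

h_precise = shannon_entropy(precise)
h_diffuse = shannon_entropy(diffuse)  # uniform over 4 categories: exactly 2 bits
```

A registry that concentrates deaths in well-defined causes carries lower entropy than one that spreads them across many (or ill-defined) codes, which is one way to quantify the uncertainty the measurement itself contains.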

  18. Overcoming the Uncertainty Barrier to Adaptation

    EPA Pesticide Factsheets

    The second in a three-part webinar series about climate change adaptation for state and local governments, this webinar addressed the challenge of planning for climate change in the face of uncertainty.

  19. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  20. The Role of Uncertainty in Climate Science

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    Scientific discussions of climate change place considerable weight on uncertainty. The research frontier, by definition, rests at the interface between the known and the unknown and our scientific investigations necessarily track this interface. Yet, other areas of active scientific research are not necessarily characterized by a similar focus on uncertainty; previous assessments of science for policy, for example, do not reveal such extensive efforts at uncertainty quantification. Why has uncertainty loomed so large in climate science? This paper argues that the extensive discussions of uncertainty surrounding climate change are at least in part a response to the social and political context of climate change. Skeptics and contrarians focus on uncertainty as a political strategy, emphasizing or exaggerating uncertainties as a means to undermine public concern about climate change and delay policy action. The strategy works in part because it appeals to a certain logic: if our knowledge is uncertain, then it makes sense to do more research. Change, as the tobacco industry famously realized, requires justification; doubt favors the status quo. However, the strategy also works by pulling scientists into an "uncertainty framework," inspiring them to respond to the challenge by addressing and quantifying the uncertainties. The problem is that all science is uncertain—nothing in science is ever proven absolutely, positively—so as soon as one uncertainty is addressed, another can be raised, which is precisely what contrarians have done over the past twenty years.

  1. Important Questions Remain to Be Addressed before Adopting a Dimensional Classification of Mental Disorders

    ERIC Educational Resources Information Center

    Ruscio, Ayelet Meron

    2008-01-01

    Comments on the original article "Plate tectonics in the classification of personality disorder: Shifting to a dimensional model," by T. A. Widiger and T. J. Trull (2007). Widiger and Trull raised important nosological issues that warrant serious consideration not only for the personality disorders but for all mental disorders as the Diagnostic…

  2. Characterizing Uncertainty for Regional Climate Change Mitigation and Adaptation Decisions

    SciTech Connect

    Unwin, Stephen D.; Moss, Richard H.; Rice, Jennie S.; Scott, Michael J.

    2011-09-30

    This white paper describes the results of new research to develop an uncertainty characterization process to help address the challenges of regional climate change mitigation and adaptation decisions.

  3. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  4. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should be always conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular attention must be paid to the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  5. The uncertainties in estimating measurement uncertainties

    SciTech Connect

    Clark, J.P.; Shull, A.H.

    1994-07-01

All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns and estimates of some true value plus uncertainty; they are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from the calibration standards' matrix). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties.
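The distinction between precision (random error) and bias (accuracy), and one common way of combining them, can be sketched as follows. The replicate values, the certified reference, and the GUM-style quadrature combination are illustrative assumptions, not the paper's worked example:

```python
import statistics as st

def combined_uncertainty(replicates, reference_value, u_standard):
    """Combine random error (precision of replicate measurements of a
    standard) with the uncertainty of the standard itself in
    quadrature, the usual root-sum-square of an uncertainty budget.
    Also report the bias relative to the certified reference value."""
    n = len(replicates)
    mean = st.mean(replicates)
    u_random = st.stdev(replicates) / n ** 0.5  # standard error of the mean
    bias = mean - reference_value
    return mean, bias, (u_random ** 2 + u_standard ** 2) ** 0.5

# Five replicate assays of a standard certified at 10.00 +/- 0.02 units.
mean, bias, u_c = combined_uncertainty(
    [10.04, 10.02, 10.06, 10.01, 10.07], 10.00, 0.02)
```

The point the paper makes survives the simplification: the combined uncertainty here is dominated not by the replicate scatter alone, and an uncorrected bias of 0.04 would swamp both terms if it were ignored.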

  6. Uncertainty estimates for electron probe X-ray microanalysis measurements.

    PubMed

    Ritchie, Nicholas W M; Newbury, Dale E

    2012-11-20

    It has been over 60 years since Castaing (Castaing, R. Application of Electron Probes to Local Chemical and Crystallographic Analysis. Ph.D. Thesis, University of Paris, Paris, France, 1951; translated by P. Duwez and D. Wittry, California Institute of Technology, 1955) introduced the technique of electron probe X-ray microanalysis (EPMA), yet the community remains unable to quantify some of the largest terms in the technique's uncertainty budget. Historically, the EPMA community has assigned uncertainties to its measurements which reflect the measurement precision portion of the uncertainty budget and omitted terms related to the measurement accuracy. Yet, in many cases, the precision represents only a small fraction of the total budget. This paper addresses this shortcoming by considering two significant sources of uncertainty in the quantitative matrix correction models--the mass absorption coefficient, [μ/ρ], and the backscatter coefficient, η. Understanding the influence of these sources provides insight into the utility of EPMA measurements, and equally important, it allows practitioners to develop strategies to optimize measurement accuracy by minimizing the influence of poorly known model parameters.

  7. Assessment of SFR Wire Wrap Simulation Uncertainties

    SciTech Connect

    Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David; Swiler, Laura P.

    2016-09-30

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility.

  8. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
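The mixed representation can be sketched for a single attacker payoff: aleatory spread modeled as a Gaussian, epistemic ignorance of its mean as an interval, and the resulting probability box as bounds on the CDF. The distribution family and all numbers below are illustrative assumptions:

```python
import math

def norm_cdf(x, mu, sigma):
    """Standard Gaussian CDF evaluated via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

def payoff_pbox(x, mu_interval=(5.0, 8.0), sigma=2.0):
    """Probability box for a payoff whose mean is only known to lie in
    an interval (epistemic) while its spread is Gaussian (aleatory):
    returns lower and upper bounds on P(payoff <= x)."""
    lo_cdf = norm_cdf(x, mu_interval[1], sigma)  # largest mean -> smallest CDF
    hi_cdf = norm_cdf(x, mu_interval[0], sigma)  # smallest mean -> largest CDF
    return lo_cdf, hi_cdf

lo, hi = payoff_pbox(6.0)
```

The gap between the two bounds is exactly the epistemic component: collecting more information about the attacker narrows the mean interval and collapses the box toward a single CDF, while the aleatory spread remains.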

  9. 32 CFR 806.26 - Addressing FOIA requests.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... determining the correct Air Force element to address their requests. If there is uncertainty as to the... Reserve Command (AFRC): HQ AFRC/SCSM, 155 2nd Street, Robins AFB, GA 31098-1635. (5) Air Force...

  10. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
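The ray-count cost-benefit idea can be sketched with a toy estimator whose true value is known. Doubling the ray count until successive estimates agree within a tolerance is one simple convergence test; the smooth integrand below is a stand-in for an actual ray trace, not the radiation tools discussed in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def shield_estimate(n_rays):
    """Toy stand-in for a ray trace: the Monte Carlo mean of a smooth
    function of ray direction (its true value is exactly 1.0)."""
    u = rng.random(n_rays)
    return float(np.mean(1.0 + 0.5 * np.sin(2 * np.pi * u)))

def converge(tol=1e-3, start=1000, n_max=4_000_000):
    """Double the ray count until successive estimates differ by less
    than tol: more rays cost more compute but shrink the discretization
    uncertainty, which is the trade-off the paper's analysis optimizes."""
    n, prev = start, shield_estimate(start)
    while n < n_max:
        n *= 2
        cur = shield_estimate(n)
        if abs(cur - prev) < tol:
            return n, cur
        prev = cur
    return n, prev
```

The same loop structure applies to the interpolation question: refine the shield-thickness grid instead of the ray count, and stop when the interpolated dose stops changing within tolerance.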

  11. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

The paper addresses the reliability of loss assessment following strong earthquakes, based on the application of worldwide systems in emergency mode. Timely and correct action just after an event can yield significant benefits in saving lives; in this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and the offer of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of the shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); while a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is not precisely known, by far. 
The paper analyzes the influence of uncertainties in the determination of strong event parameters by Alert Seismological Surveys, and of the simulation models used at all stages from estimating shaking intensity

  12. Addressing Ozone Layer Depletion

    EPA Pesticide Factsheets

    Access information on EPA's efforts to address ozone layer depletion through regulations, collaborations with stakeholders, international treaties, partnerships with the private sector, and enforcement actions under Title VI of the Clean Air Act.

  13. Structural Damage Assessment under Uncertainty

    NASA Astrophysics Data System (ADS)

    Lopez Martinez, Israel

Structural damage assessment has applications in the majority of engineering structures and mechanical systems, ranging from aerospace vehicles to manufacturing equipment. The primary goals of any structural damage assessment and health monitoring system are to ascertain the condition of a structure, to provide an evaluation of changes as a function of time, and to provide an early warning of an unsafe condition. There are many structural health monitoring and assessment techniques developed for research using numerical simulations and scaled structural experiments. However, the transition from research to real-world structures has been rather slow. One major reason for this slow progress is the existence of uncertainty in every step of the damage assessment process. This dissertation research involved the experimental and numerical investigation of uncertainty in vibration-based structural health monitoring and the development of robust detection and localization methods. The basic premise of vibration-based structural health monitoring is that changes in structural characteristics, such as stiffness, mass and damping, will affect the global vibration response of the structure. The diagnostic performance of a vibration-based monitoring system is affected by uncertainty sources such as measurement errors, environmental disturbances and parametric modeling uncertainties. To address diagnostic errors due to irreducible uncertainty, a pattern recognition framework for damage detection has been developed to be used for continuous monitoring of structures. The robust damage detection approach developed is based on an ensemble of dimensional reduction algorithms for improved damage-sensitive feature extraction. For damage localization, the determination of an experimental structural model was performed based on output-only modal analysis. 
An experimental model correlation technique is developed in which the discrepancies between the undamaged and damaged modal data are
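A toy version of the vibration-based premise: learn a low-dimensional baseline from healthy-state features and flag departures from it. The natural frequencies, noise levels, and single PCA model below are illustrative assumptions (the dissertation uses an ensemble of dimension-reduction algorithms, not one PCA):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training features from the healthy structure (n x d):
# e.g. three natural frequencies identified by output-only modal analysis.
healthy = rng.normal([12.0, 34.0, 71.0], [0.1, 0.2, 0.3], size=(200, 3))

mu = healthy.mean(axis=0)
_, _, vt = np.linalg.svd(healthy - mu, full_matrices=False)
basis = vt[:2]  # keep the two dominant principal directions

def novelty(x):
    """Reconstruction error outside the PCA subspace learned from
    healthy data; stiffness loss shifts the frequencies and inflates it."""
    r = (x - mu) - (x - mu) @ basis.T @ basis
    return float(np.linalg.norm(r))

# Alarm threshold: worst reconstruction error seen in the healthy set.
threshold = max(novelty(row) for row in healthy)
damaged = np.array([11.2, 33.1, 69.0])  # frequencies after a stiffness loss
```

Measurement noise and environmental variability widen the healthy baseline, which is exactly how the uncertainty sources discussed above erode diagnostic performance: the threshold must rise, and small damage hides beneath it.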

  14. Space Acquisitions: Some Programs Have Overcome Past Problems, but Challenges and Uncertainty Remain for the Future

    DTIC Science & Technology

    2015-04-29

    are being conducted for the SpaceX Falcon 9 v1.1 launch system. In addition, in its fiscal year 2016 President’s Budget request, DOD requested funding...DOD expects SpaceX to be certified by June 2015. Additionally, the department has faced unexpected complications, such as challenges to its

  15. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty, although comparable to uncertainty arising from some individual properties.
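The sensitivity-times-uncertainty bookkeeping can be sketched as follows. The sensitivity and uncertainty values are illustrative placeholders, not the paper's measured values, and the quadrature total assumes independent errors:

```python
import math

# Hypothetical sensitivities of diurnally averaged DRF (W m^-2 per unit
# property change) and one-sigma measurement uncertainties per property.
sensitivity = {"aod": 25.0, "ssa": 60.0, "asym": 10.0, "albedo": 8.0}
meas_unc    = {"aod": 0.01, "ssa": 0.02, "asym": 0.02, "albedo": 0.02}

# Per-property DRF uncertainty is sensitivity x measurement uncertainty;
# the total combines the independent contributions in quadrature.
per_property = {k: sensitivity[k] * meas_unc[k] for k in sensitivity}
total = math.sqrt(sum(v ** 2 for v in per_property.values()))
largest = max(per_property, key=per_property.get)
```

Ranking `per_property` identifies which measurement most limits the forcing estimate; with these placeholder numbers, as in the paper's findings, single scattering albedo dominates the budget.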

  16. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
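Descriptive statistics over interval data can be illustrated with the interval mean, the simplest of the statistics the report covers; the data are hypothetical, and harder statistics (variance, percentiles) require more careful algorithms whose computability depends on how the intervals overlap:

```python
# Each measurement is an interval [lo, hi] rather than a point estimate,
# representing epistemic uncertainty about the true value.
data = [(1.0, 1.4), (2.1, 2.3), (0.8, 1.5), (1.9, 2.0)]

def interval_mean(intervals):
    """The sample mean of interval data is itself an interval: the mean
    of the lower endpoints up to the mean of the upper endpoints."""
    n = len(intervals)
    lo = sum(a for a, _ in intervals) / n
    hi = sum(b for _, b in intervals) / n
    return lo, hi

mean_lo, mean_hi = interval_mean(data)
```

The width of the resulting interval carries the measurement imprecision forward instead of discarding it, which is the trade-off against sample size that the report explores.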

  17. Spatial uncertainty and ecological models

    SciTech Connect

    Jager, Yetta; King, Anthony Wayne

    2004-07-01

    Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

  18. Key findings and remaining questions in the areas of core-concrete interaction and debris coolability

    DOE PAGES

    Farmer, M. T.; Gerardi, C.; Bremer, N.; ...

    2016-10-31

    The reactor accidents at Fukushima-Dai-ichi have rekindled interest in late-phase severe accident behavior involving reactor pressure vessel breach and discharge of molten core melt into the containment. Two technical issues of interest in this area include core-concrete interaction and the extent to which the core debris may be quenched and rendered coolable by top flooding. The OECD-sponsored Melt Coolability and Concrete Interaction (MCCI) programs at Argonne National Laboratory included the conduct of large-scale reactor material experiments and associated analysis with the objectives of resolving the ex-vessel debris coolability issue and addressing remaining uncertainties related to long-term two-dimensional molten core-concrete interactions under both wet and dry cavity conditions. These tests provided a broad database to support accident management planning, as well as the development and validation of models and codes that can be used to extrapolate the experiment results to plant conditions. This paper provides a high-level overview of the key experiment results obtained during the program. Finally, a discussion is also provided that describes technical gaps that remain in this area, several of which have arisen based on the sequence of events and operator actions during Fukushima.

  19. Key findings and remaining questions in the areas of core-concrete interaction and debris coolability

    SciTech Connect

    Farmer, M. T.; Gerardi, C.; Bremer, N.; Basu, S.

    2016-10-31

    The reactor accidents at Fukushima-Dai-ichi have rekindled interest in late-phase severe accident behavior involving reactor pressure vessel breach and discharge of molten core melt into the containment. Two technical issues of interest in this area include core-concrete interaction and the extent to which the core debris may be quenched and rendered coolable by top flooding. The OECD-sponsored Melt Coolability and Concrete Interaction (MCCI) programs at Argonne National Laboratory included the conduct of large-scale reactor material experiments and associated analysis with the objectives of resolving the ex-vessel debris coolability issue and addressing remaining uncertainties related to long-term two-dimensional molten core-concrete interactions under both wet and dry cavity conditions. These tests provided a broad database to support accident management planning, as well as the development and validation of models and codes that can be used to extrapolate the experiment results to plant conditions. This paper provides a high-level overview of the key experiment results obtained during the program. Finally, a discussion is also provided that describes technical gaps that remain in this area, several of which have arisen based on the sequence of events and operator actions during Fukushima.

  20. Uncertainty in environmental health impact assessment: quantitative methods and perspectives.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Vanni, Tazio; Foss, Anna M

    2013-01-01

    Environmental health impact assessment models are subject to great uncertainty due to the complex associations between environmental exposures and health. Quantifying the impact of uncertainty is important if the models are used to support health policy decisions. We conducted a systematic review to identify and appraise current methods used to quantify the uncertainty in environmental health impact assessment. In the 19 studies meeting the inclusion criteria, several methods were identified. These were grouped into random sampling methods, second-order probability methods, Bayesian methods, fuzzy sets, and deterministic sensitivity analysis methods. All 19 studies addressed the uncertainty in the parameter values, but only 5 of the studies also addressed the uncertainty in the structure of the models. None of the articles reviewed considered conceptual sources of uncertainty associated with the framing assumptions or the conceptualisation of the model. Future research should attempt to broaden the way uncertainty is taken into account in environmental health impact assessments.
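    Of the method families identified in this review, deterministic sensitivity analysis is the simplest to illustrate: vary each parameter over its range while holding the others at base values. A minimal one-way sketch with a hypothetical impact model and parameter ranges (none taken from the reviewed studies):

```python
# One-way (deterministic) sensitivity analysis on a toy health-impact
# model: attributable cases = exposure x risk-per-unit x population.
# The model, base values, and ranges are all hypothetical.
def health_impact(exposure, risk_per_unit, population):
    return exposure * risk_per_unit * population

base = {"exposure": 10.0, "risk_per_unit": 1e-4, "population": 1e5}
ranges = {"exposure": (5.0, 20.0),
          "risk_per_unit": (5e-5, 2e-4),
          "population": (8e4, 1.2e5)}

for name, (lo_val, hi_val) in ranges.items():
    # Perturb one parameter at a time, keeping the others at base.
    impact_lo = health_impact(**{**base, name: lo_val})
    impact_hi = health_impact(**{**base, name: hi_val})
    print(f"{name}: impact ranges {impact_lo:.0f} to {impact_hi:.0f}")
```

The parameters with the widest resulting impact ranges are the ones most worth refining, which is the usual tornado-diagram use of this method.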

  1. Uncertainties in Arctic Precipitation

    NASA Astrophysics Data System (ADS)

    Majhi, I.; Alexeev, V. A.; Cherry, J. E.; Cohen, J. L.; Groisman, P. Y.

    2012-12-01

    Arctic precipitation measurements are riddled with biases, and addressing the problem is imperative. Our study focuses on comparing various datasets and analyzing their biases for the region of Siberia, and on the caution needed when using them. Five sources of data were used: NOAA's products (raw and with Bogdanova's correction), Yang's correction technique, and two reanalysis products (ERA-Interim and NCEP). The reanalysis datasets performed better for some months in comparison to Yang's product, which tends to overestimate precipitation, and the raw dataset, which tends to underestimate. The sources of bias range from topography to wind to missing data. The final three products chosen show higher biases during the winter and spring seasons. Emphasis on equations which incorporate blizzards, blowing snow, and higher wind speeds is necessary for regions influenced by any or all of these factors; Bogdanova's correction technique is the most robust of all the datasets analyzed and gives the most reasonable results. One of our future goals is to analyze the impact of precipitation uncertainties on water budget analysis for the Siberian Rivers.

  2. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing body of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  3. Addressing Social Issues.

    ERIC Educational Resources Information Center

    Schoebel, Susan

    1991-01-01

    Maintains that advertising can help people become more aware of social responsibilities. Describes a successful nationwide newspaper advertising competition for college students in which ads address social issues such as literacy, drugs, teen suicide, and teen pregnancy. Notes how the ads have helped grassroots programs throughout the United…

  4. Invitational Addresses, 1965.

    ERIC Educational Resources Information Center

    Gates, Arthur I.; And Others

    The full texts of invitational addresses given at the 1965 International Reading Association (IRA) Convention in Detroit, Michigan, by six recipients of IRA citation awards are presented. Gates suggests steps IRA should take to revive and redirect reading research. McCallister discusses the implications of the changing and expanding vocabulary of…

  5. States Address Achievement Gaps.

    ERIC Educational Resources Information Center

    Christie, Kathy

    2002-01-01

    Summarizes 2 state initiatives to address the achievement gap: North Carolina's report by the Advisory Commission on Raising Achievement and Closing Gaps, containing an 11-point strategy, and Kentucky's legislation putting in place 10 specific processes. The North Carolina report is available at www.dpi.state.nc.us.closingthegap; Kentucky's…

  6. Addressing Sexual Harassment

    ERIC Educational Resources Information Center

    Young, Ellie L.; Ashbaker, Betty Y.

    2008-01-01

    This article discusses ways on how to address the problem of sexual harassment in schools. Sexual harassment--simply defined as any unwanted and unwelcome sexual behavior--is a sensitive topic. Merely providing students, parents, and staff members with information about the school's sexual harassment policy is insufficient; schools must take…

  7. Where do those remains come from?

    PubMed

    Nociarová, Dominika; Adserias, M Jose; Malgosa, Assumpció; Galtés, Ignasi

    2014-12-01

    Part of the study of skeletal remains or corpses in advanced decay found in the field involves determining their origin. They may be the result of criminal activity or accident, may have been unearthed by erosion, or may have originated from a cemetery. The discovery site, the condition of the remains, and the associated artifacts are factors that can help the forensic anthropologist identify the origin of the remains. To contribute to this recognition, an analysis was made of the exhumations of 168 unclaimed human remains from the cemetery of Terrassa (Catalonia, Spain). This investigation presents a description of the artifacts and conditions of remains that could indicate that human remains originated from a cemetery.

  8. Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.

    ERIC Educational Resources Information Center

    Bartell, Lawrence S.

    1985-01-01

    Explicates an approach that not only makes the uncertainty seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)

  9. Uncertainty Analysis in the Decadal Survey Era: A Hydrologic Application using the Land Information System (LIS)

    NASA Astrophysics Data System (ADS)

    Harrison, K.; Kumar, S.; Peters-Lidard, C. D.; Santanello, J. A.

    2010-12-01

    Computing and algorithmic advancements are making possible a more complete accounting of errors and uncertainties in earth science modeling. Knowledge of uncertainty can be critical in many application areas and can help to guide scientific research efforts. Here, we describe a plan and progress to date for a fuller accounting of hydrologic modeling uncertainties that addresses the challenges posed by decadal survey missions. These challenges include the need to account for a wide range of error sources (e.g., model error, stochastically varying inputs, observational error, downscaling) and uncertainties (model parameters, error parameters, model selection). In addition, there is a need to incorporate into an assessment all available data, which for decadal survey missions includes the wealth of data from ground, air and satellite observing systems. Our core tool is NASA’s Land Information System (LIS), a high-resolution, high-performance, land surface modeling and data assimilation system that supports a wide range of land surface research and applications. Support for parameter and uncertainty estimation was recently incorporated into the software architecture, and to date three optimization algorithms (Levenberg-Marquardt, Genetic Algorithm, and SCE-UA) and two Markov chain Monte Carlo algorithms for Bayesian analysis (random walk, Differential Evolution-Monte Carlo) have been added. Results and discussion center on a case study that was the focus of Santanello et al. (2007) who demonstrated the use of remotely sensed soil moisture for hydrologic parameter estimation in the Walnut Gulch Experimental Watershed. We contrast results from uncertainty estimation to those from parameter estimation alone. We demonstrate considerable but not complete uncertainty reduction. From this analysis, we identify remaining challenges to a more complete accounting of uncertainties.
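    The random-walk Markov chain Monte Carlo algorithm mentioned above can be sketched generically. This is not LIS code; the one-parameter model, data, and error level here are hypothetical:

```python
import math
import random

random.seed(1)

# Random-walk Metropolis: sample the posterior of one model parameter
# given noisy observations (a generic sketch, not the LIS implementation).
obs = [1.1, 0.9, 1.3, 1.0, 0.7]          # observations of a constant signal
sigma = 0.3                               # assumed observation-error std. dev.

def log_post(theta):
    # Flat prior, Gaussian likelihood.
    return -sum((y - theta) ** 2 for y in obs) / (2 * sigma ** 2)

theta, chain = 0.0, []
for _ in range(20000):
    prop = theta + random.gauss(0, 0.5)   # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                      # Metropolis accept step
    chain.append(theta)

post = chain[5000:]                       # discard burn-in
mean = sum(post) / len(post)
print(f"posterior mean: {mean:.2f}")      # near the sample mean of obs
```

The chain samples quantify parameter uncertainty directly, which is what distinguishes this from the optimization algorithms (Levenberg-Marquardt, GA, SCE-UA) that return a single best-fit parameter set.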

  10. Pandemic influenza: certain uncertainties

    PubMed Central

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying constant (if incompletely understood) rules such as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  11. Maintaining Realistic Uncertainty in Model and Forecast

    DTIC Science & Technology

    2000-09-30

    Maintaining Realistic Uncertainty in Model and Forecast. Leonard Smith, Pembroke College, Oxford University, St. Aldates, Oxford OX1 1DW, United Kingdom. [Report documentation form fields omitted.] Cited works: l-shadowing, probabilistic prediction and weather forecasting, D.Phil thesis, Oxford University; Lorenz, E. (1995) Predictability-a Partially

  12. Content Addressable Memory Project

    DTIC Science & Technology

    1990-11-01

    The Content Addressable Memory Project consists of the development of several experimental software systems on an AMT Distributed Array Processor (DAP), including searching (database), compiler algorithms, memory management, and other systems software. Linear C is an unlovely hybrid language which imports the CAM... memory from AMT’s operating system for the DAP; however, other than this limitation, the memory management routines work exactly as their C counterparts

  13. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
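    The Monte Carlo approach described above can be sketched for a single signature. A minimal example with a hypothetical flow series and a simple 10% multiplicative error model (an illustration of the idea, not the paper's full method):

```python
import random

random.seed(0)

# Monte Carlo propagation of flow-data uncertainty into a hydrological
# signature (here a 5th-percentile low-flow index). The synthetic flow
# series and the 10% multiplicative error model are hypothetical.
flows = [random.lognormvariate(0, 1) for _ in range(365)]

def q05(series):
    # 5th-percentile flow, a basic low-flow signature.
    return sorted(series)[int(0.05 * len(series))]

samples = []
for _ in range(1000):
    # Perturb each observation by a multiplicative measurement error.
    perturbed = [q * random.gauss(1.0, 0.10) for q in flows]
    samples.append(q05(perturbed))

ss = sorted(samples)
lo, hi = ss[25], ss[975]                  # approximate 95% interval
print(f"Q05 signature 95% interval: [{lo:.3f}, {hi:.3f}]")
```

Repeating this for each signature gives the per-signature uncertainty intervals the paper reports; signatures built on averages would show narrower intervals than those built on extremes.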

  14. [Ethics, empiricism and uncertainty].

    PubMed

    Porz, R; Zimmermann, H; Exadaktylos, A K

    2011-01-01

    Accidents can lead to difficult boundary situations. Such situations often take place in the emergency units. The medical team thus often and inevitably faces professional uncertainty in their decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions. Thus uncertainty can have ethical value in treatment or withdrawal of treatment. It does not need to be covered in evidence-based arguments, especially as some singular situations of individual tragedies cannot be grasped in terms of evidence-based medicine.

  15. A review of uncertainty research in impact assessment

    SciTech Connect

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  16. Wellhead protection area delineation under uncertainty

    SciTech Connect

    Jacobson, E.; Andricevic, R.; Hultin, T.

    1994-09-01

    A program to protect groundwater resources used for water supply from all potential threats due to contamination was established in the Amendments to the Safe Drinking Water Act (SDWA). The goal of the State Wellhead Protection (WHP) Program is to "protect wellhead areas within their jurisdiction from contaminants which may have any adverse effect on the health of persons." A major component of WHP is the determination of zones around water-supply wells, called Wellhead Protection Areas (WHPAs), within which contaminant source assessment and management should be addressed. WHPAs are defined in the SDWA as "the surface and subsurface area surrounding a water well or wellfield, supplying a public water system, through which contaminants are reasonably likely to move toward and reach such water well or wellfield." A total of 14 water-supply wells are currently being used at the Nevada Test Site (NTS). Eleven of the wells are used for potable water supplies and the remaining three wells are used for construction purposes only. The purpose of this study is to estimate WHPAs for each water-supply well at the NTS. Due to the limited information about the hydraulic properties needed for estimating the WHPAs, an approach that considers the uncertainty in the estimates of the hydraulic properties was developed and implemented.

  17. Calibration and uncertainty issues of a hydrological model (SWAT) applied to West Africa

    NASA Astrophysics Data System (ADS)

    Schuol, J.; Abbaspour, K. C.

    2006-09-01

    Distributed hydrological models like SWAT (Soil and Water Assessment Tool) are often highly over-parameterized, making parameter specification and parameter estimation inevitable steps in model calibration. Manual calibration is almost infeasible due to the complexity of large-scale models with many objectives. Therefore we used a multi-site semi-automated inverse modelling routine (SUFI-2) for calibration and uncertainty analysis. Nevertheless, the question of when a model is sufficiently calibrated remains open, and requires a project-dependent definition. Due to the non-uniqueness of effective parameter sets, parameter calibration and prediction uncertainty of a model are intimately related. We address some calibration and uncertainty issues using SWAT to model a four million km2 area in West Africa, including mainly the basins of the rivers Niger, Volta and Senegal. This model is a case study in a larger project with the goal of quantifying the amount of global country-based available freshwater. Annual and monthly simulations with the "calibrated" model for West Africa show promising results with respect to the freshwater quantification but also point out the importance of evaluating the conceptual model uncertainty as well as the parameter uncertainty.

  18. Bioreactors Addressing Diabetes Mellitus

    PubMed Central

    Minteer, Danielle M.; Gerlach, Jorg C.

    2014-01-01

    The concept of bioreactors in biochemical engineering is a well-established process; however, the idea of applying bioreactor technology to biomedical and tissue engineering issues is relatively novel and has been rapidly accepted as a culture model. Tissue engineers have developed and adapted various types of bioreactors in which to culture many different cell types and therapies addressing several diseases, including diabetes mellitus types 1 and 2. With a rising world of bioreactor development and an ever increasing diagnosis rate of diabetes, this review aims to highlight bioreactor history and emerging bioreactor technologies used for diabetes-related cell culture and therapies. PMID:25160666

  19. Bioreactors addressing diabetes mellitus.

    PubMed

    Minteer, Danielle M; Gerlach, Jorg C; Marra, Kacey G

    2014-11-01

    The concept of bioreactors in biochemical engineering is a well-established process; however, the idea of applying bioreactor technology to biomedical and tissue engineering issues is relatively novel and has been rapidly accepted as a culture model. Tissue engineers have developed and adapted various types of bioreactors in which to culture many different cell types and therapies addressing several diseases, including diabetes mellitus types 1 and 2. With a rising world of bioreactor development and an ever increasing diagnosis rate of diabetes, this review aims to highlight bioreactor history and emerging bioreactor technologies used for diabetes-related cell culture and therapies.

  20. Content addressable memory project

    NASA Technical Reports Server (NTRS)

    Hall, J. Storrs; Levy, Saul; Smith, Donald E.; Miyake, Keith M.

    1992-01-01

    A parameterized version of the tree processor was designed and tested (by simulation). The leaf processor design is 90 percent complete. We expect to complete and test a combination of tree and leaf cell designs in the next period. Work is proceeding on algorithms for the content addressable memory (CAM), and once the design is complete we will begin simulating algorithms for large problems. The following topics are covered: (1) the practical implementation of content addressable memory; (2) design of a LEAF cell for the Rutgers CAM architecture; (3) a circuit design tool user's manual; and (4) design and analysis of efficient hierarchical interconnection networks.

  1. Aeroservoelastic Uncertainty Model Identification from Flight Data

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.

    2001-01-01

    Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. To date, aeroservoelastic data analysis has given insufficient attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge in this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

  2. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The Commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
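    The 95/95 criterion described above is commonly met in best-estimate analyses by sizing the statistical case set with Wilks' order-statistics formula (not named in the abstract). A minimal sketch of the first-order, one-sided version:

```python
import math

# Wilks' formula (first order, one-sided): the smallest number of code
# runs n such that the maximum of n samples bounds the true beta-quantile
# of the output with confidence gamma, i.e. 1 - beta**n >= gamma.
def wilks_n(beta=0.95, gamma=0.95):
    return math.ceil(math.log(1 - gamma) / math.log(beta))

print(wilks_n())        # 59 runs for the classic 95/95 criterion
```

The appeal of this order-statistics approach is that the required run count is independent of how many uncertain inputs are sampled, which is what makes it practical for expensive reactor codes.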

  3. Uncertainty and global climate change research

    SciTech Connect

    Tonn, B.E.; Weiher, R.

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics, and decision making. The magnitude and complexity of uncertainty surrounding global climate change have made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessment using decision-analytic techniques as a foundation is key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision-analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  4. Climate change, uncertainty, and natural resource management

    USGS Publications Warehouse

    Nichols, J.D.; Koneff, M.D.; Heglund, P.J.; Knutson, M.G.; Seamans, M.E.; Lyons, J.E.; Morton, J.M.; Jones, M.T.; Boomer, G.S.; Williams, B.K.

    2011-01-01

    Climate change and its associated uncertainties are of concern to natural resource managers. Although aspects of climate change may be novel (e.g., system change and nonstationarity), natural resource managers have long dealt with uncertainties and have developed corresponding approaches to decision-making. Adaptive resource management is an application of structured decision-making for recurrent decision problems with uncertainty, focusing on management objectives, and the reduction of uncertainty over time. We identified 4 types of uncertainty that characterize problems in natural resource management. We examined ways in which climate change is expected to exacerbate these uncertainties, as well as potential approaches to dealing with them. As a case study, we examined North American waterfowl harvest management and considered problems anticipated to result from climate change and potential solutions. Despite challenges expected to accompany the use of adaptive resource management to address problems associated with climate change, we conclude that adaptive resource management approaches will be the methods of choice for managers trying to deal with the uncertainties of climate change. © 2010 The Wildlife Society.

  5. Solving navigational uncertainty using grid cells on robots.

    PubMed

    Milford, Michael J; Wiles, Janet; Wyeth, Gordon F

    2010-11-11

    To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. 
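
    RatSLAM itself uses attractor-network dynamics rather than particle filtering, but the core idea the abstract describes, maintaining multiple probabilistic pose estimates under perceptual ambiguity, can be sketched with a minimal particle filter. The two-landmark arena, noise levels, and all names below are illustrative assumptions, not the RatSLAM implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two visually identical landmarks -> perceptual ambiguity.
LANDMARKS = np.array([[0.0, 0.0], [10.0, 0.0]])
N = 2000

# Pose hypotheses: (x, y, heading), spread over the arena.
particles = rng.uniform([-2.0, -2.0, -np.pi], [12.0, 2.0, np.pi], size=(N, 3))
weights = np.full(N, 1.0 / N)

def motion_update(p, v, w, dt=1.0, noise=(0.1, 0.05)):
    """Path integration: advance every hypothesis, accumulating odometry noise."""
    v_n = v + rng.normal(0.0, noise[0], N)
    p[:, 0] += v_n * dt * np.cos(p[:, 2])
    p[:, 1] += v_n * dt * np.sin(p[:, 2])
    p[:, 2] += (w + rng.normal(0.0, noise[1], N)) * dt
    return p

def observe_update(p, wts, measured_range, sigma=0.5):
    """Landmark calibration: since the landmarks cannot be told apart,
    weight each hypothesis by its likelihood under the best-matching one."""
    d = np.linalg.norm(p[:, None, :2] - LANDMARKS[None], axis=2)
    lik = np.exp(-0.5 * ((d - measured_range) / sigma) ** 2).max(axis=1)
    wts = wts * lik
    return wts / wts.sum()

particles = motion_update(particles, v=0.5, w=0.0)
# A range reading of 2.0 is consistent with being near *either* landmark,
# so after resampling the cloud stays bimodal: two coexisting pose hypotheses.
weights = observe_update(particles, weights, measured_range=2.0)
idx = rng.choice(N, size=N, p=weights)
particles, weights = particles[idx].copy(), np.full(N, 1.0 / N)

dist = np.linalg.norm(particles[:, None, :2] - LANDMARKS[None], axis=2)
near_left, near_right = (np.abs(dist - 2.0) < 1.0).T
```

    With an observation consistent with either landmark, both clusters of `particles` survive resampling; the ambiguity is only resolved once a later, distinguishing cue reweights one cluster away, mirroring the multiple pose estimates the abstract attributes to conjunctive grid cells.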

  6. Addressing Environmental Health Inequalities

    PubMed Central

    Gouveia, Nelson

    2016-01-01

    Environmental health inequalities refer to health hazards disproportionately or unfairly distributed among the most vulnerable social groups, which are generally the most discriminated, poor populations and minorities affected by environmental risks. Although it has been known for a long time that health and disease are socially determined, only recently has this idea been incorporated into the conceptual and practical framework for the formulation of policies and strategies regarding health. In this Special Issue of the International Journal of Environmental Research and Public Health (IJERPH), “Addressing Environmental Health Inequalities—Proceedings from the ISEE Conference 2015”, we incorporate nine papers that were presented at the 27th Conference of the International Society for Environmental Epidemiology (ISEE), held in Sao Paulo, Brazil, in 2015. This small collection of articles provides a brief overview of the different aspects of this topic. Addressing environmental health inequalities is important for the transformation of our reality and for changing the actual development model towards more just, democratic, and sustainable societies driven by another form of relationship between nature, economy, science, and politics. PMID:27618906

  7. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  8. Intolerance of Uncertainty

    PubMed Central

    Beier, Meghan L.

    2015-01-01

    Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of “intolerance of uncertainty” has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700

  9. Economic uncertainty and econophysics

    NASA Astrophysics Data System (ADS)

    Schinckus, Christophe

    2009-10-01

    The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields: uncertainty and the ways of thinking about it developed by the two disciplines. After having presented the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework; econophysics, in contrast, does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporally reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.

  10. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  11. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments.

  12. Content addressable memory project

    NASA Technical Reports Server (NTRS)

    Hall, Josh; Levy, Saul; Smith, D.; Wei, S.; Miyake, K.; Murdocca, M.

    1991-01-01

    The progress on the Rutgers CAM (Content Addressable Memory) Project is described. The overall design of the system is completed at the architectural level and described. The machine is composed of two kinds of cells: (1) the CAM cells, which include both memory and processor, and support local processing within each cell; and (2) the tree cells, which have a smaller instruction set and provide global processing over the CAM cells. A parameterized design of the basic CAM cell is completed. Progress was made on the final specification of the CPS. The machine architecture was driven by the design of algorithms whose requirements are reflected in the resulting instruction set(s). A few of these algorithms are described.

  13. Estimating the magnitude of prediction uncertainties for the APLE model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  14. Integration of Data Assimilation, Stochastic Optimization and Uncertainty Modeling within NASA's Land Information System (LIS)

    NASA Astrophysics Data System (ADS)

    Harrison, K.; Peters-Lidard, C. D.; Kumar, S.; Santanello, J. A.; Kirschbaum, D. B.

    2011-12-01

    Recent advances in information systems technology have significantly improved our ability to fully exploit the information content of remote sensing data. In this talk, we discuss a range of applications for the optimization and uncertainty tools recently incorporated into the NASA Land Information System (LIS) to address this challenge. LIS is a high-resolution, high-performance, land surface modeling and data assimilation system that supports a wide range of land surface research and applications. The applications of the new optimization and uncertainty tools involve several LIS land-coupled models, including the Weather Research and Forecasting Model (WRF), models of land microwave emission (the Soil Moisture and Ocean Salinity (SMOS) Community Microwave Emission Model (CMEM)), radiative transfer (Joint Center for Satellite Data Assimilation's Community Radiative Transfer Model (CRTM)), landslide and streamflow simulation. The impact of parameter estimation on land surface modeling is investigated for a range of studies, including soil moisture modeling in the Walnut Gulch experimental watershed, land data assimilation over the continental United States, and coupled land-atmosphere forecasts using WRF for the Southern Great Plains. In addition, the uncertainty in the outputs of such coupled systems is investigated. The uncertainty methods include Monte Carlo for propagating parameter uncertainties and model errors as well as Markov chain Monte Carlo methods that enable the updating of parameter uncertainties with remote sensing data. The tradeoffs between uncertainty estimation and parameter estimation are also highlighted. Finally, remaining challenges for the development of information systems of this kind are identified, including challenges in their use as part of mission simulation experiments.
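
    The Monte Carlo propagation mentioned above follows a generic pattern: draw parameters from their assumed prior uncertainty, run the model once per draw, and summarize the spread of the outputs. The sketch below uses a hypothetical toy soil-moisture model and invented priors; it is not LIS code.

```python
import numpy as np

rng = np.random.default_rng(42)

def soil_moisture(porosity, decay, t):
    """Toy stand-in for a land-surface model run (hypothetical, not LIS)."""
    return porosity * (0.3 + 0.7 * np.exp(-decay * t))

# Hypothetical prior uncertainty on two land-surface parameters.
n = 5000
porosity = rng.normal(0.45, 0.05, n)
decay = rng.lognormal(np.log(0.1), 0.3, n)

# Propagation: one model evaluation per joint parameter draw.
outputs = soil_moisture(porosity, decay, t=10.0)

# Summarize the induced predictive uncertainty.
mean = outputs.mean()
lo, hi = np.percentile(outputs, [5, 95])
print(f"mean={mean:.3f}, 90% band=[{lo:.3f}, {hi:.3f}]")
```

    Updating the parameter distributions against observations, as the abstract describes for remote sensing data, would replace the fixed priors here with a Markov chain Monte Carlo posterior before the same propagation step.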

  15. Luminescence of thermally altered human skeletal remains.

    PubMed

    Krap, Tristan; Nota, Kevin; Wilk, Leah S; van de Goot, Franklin R W; Ruijter, Jan M; Duijst, Wilma; Oostra, Roelof-Jan

    2017-02-23

    Literature on the luminescent properties of thermally altered human remains is scarce and contradictory. Therefore, the luminescence of heated bone was systematically reinvestigated. A heating experiment was conducted on fresh human bone, in two different media, and cremated human remains were recovered from a modern crematory. Luminescence was excited with light sources within the range of 350 to 560 nm. The excitation light was filtered out by using different long pass filters, and the luminescence was analysed by means of a scoring method. The results show that temperature, duration and surrounding medium determine the observed emission intensity and bandwidth. It is concluded that the luminescent characteristic of bone can be useful for identifying thermally altered human remains in a difficult context as well as yield information on the perimortem and postmortem events.

  16. Constructing the Uncertainty of Due Dates

    PubMed Central

    Vos, Sarah C.; Anthony, Kathryn E.; O'Hair, H. Dan

    2015-01-01

    By its nature, the date that a baby is predicted to be born, or the due date, is uncertain. How women construct the uncertainty of their due dates may have implications for when and how women give birth. In the United States as many as 15% of births occur before 39 weeks because of elective inductions or cesarean sections, putting these babies at risk for increased medical problems after birth and later in life. This qualitative study employs a grounded theory approach to understand the decisions women make about how and when to give birth. Thirty-three women who were pregnant or had given birth within the past two years participated in key informant or small group interviews. The results suggest that women interpret the uncertainty of their due dates both as a reason to wait on birth and as a reason to start the process early; however, information about a baby's brain development in the final weeks of pregnancy may persuade women to remain pregnant longer. The uncertainties of due dates are analyzed using Babrow's problematic integration, which distinguishes between epistemological and ontological uncertainty. The results point to a third type of uncertainty, axiological uncertainty. Axiological uncertainty is rooted in the values and ethics of outcomes. PMID:24266788

  17. Evaluation of uncertainties associated with the determination of community drug use through the measurement of sewage drug biomarkers.

    PubMed

    Castiglioni, Sara; Bijlsma, Lubertus; Covaci, Adrian; Emke, Erik; Hernández, Félix; Reid, Malcolm; Ort, Christoph; Thomas, Kevin V; van Nuijs, Alexander L N; de Voogt, Pim; Zuccato, Ettore

    2013-02-05

    The aim of this study was to integrally address the uncertainty associated with all the steps used to estimate community drug consumption through the chemical analysis of sewage biomarkers of illicit drugs. Uncertainty has been evaluated for sampling, chemical analysis, stability of drug biomarkers in sewage, back-calculation of drug use (specific case of cocaine), and estimation of population size in a catchment, using data collected from a recent Europe-wide investigation and from the available literature. The quality of sampling protocols and analytical measurements has been evaluated by analyzing standardized questionnaires collected from 19 sewage treatment plants (STPs) and the results of an interlaboratory study (ILS), respectively. Extensive reviews of the available literature have been used to evaluate stability of drug biomarkers in sewage and the uncertainty related to back-calculation of cocaine use. Different methods for estimating population size in a catchment have been compared and the variability among the collected data was very high (7-55%). A reasonable strategy to reduce uncertainty was therefore to choose the most reliable estimation case by case. In the other cases, the highest uncertainties are related to the analysis of sewage drug biomarkers (uncertainty as relative standard deviation; RSD: 6-26% from ILS) and to the back-calculation of cocaine use (uncertainty; RSD: 26%). Uncertainty can be kept below 10% in the remaining steps, if specific requirements outlined in this work are considered. For each step, a best practice protocol has been suggested and discussed to reduce and keep to a minimum the uncertainty of the entire procedure and to improve the reliability of the estimates of drug use.
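
    If the steps of the estimation chain are treated as independent, their per-step relative standard deviations combine in quadrature to gauge the uncertainty of the whole procedure. The two large RSDs below come from the abstract; the independence assumption and the use of the 10% ceiling for the remaining steps are simplifications for this illustration.

```python
import math

# Relative standard deviations (as fractions) for each step of the chain.
rsd = {
    "chemical_analysis": 0.26,    # upper end of the 6-26% interlaboratory range
    "back_calculation": 0.26,     # back-calculation of cocaine use
    "sampling": 0.10,             # remaining steps taken at their 10% ceiling
    "biomarker_stability": 0.10,
    "population_size": 0.10,
}

# For independent multiplicative steps, relative uncertainties add in quadrature.
total_rsd = math.sqrt(sum(v ** 2 for v in rsd.values()))
print(f"overall RSD ≈ {total_rsd:.0%}")
```

    The overall figure is dominated by the two 26% terms, which is why the study's best-practice recommendations focus on the analytical and back-calculation steps.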

  18. The challenges on uncertainty analysis for pebble bed HTGR

    SciTech Connect

    Hao, C.; Li, F.; Zhang, H.

    2012-07-01

    Uncertainty analysis is popular and important, and much work has been done for Light Water Reactors (LWRs), although experience with uncertainty analysis in High Temperature Gas cooled Reactor (HTGR) modeling is still at an early stage. The IAEA will soon launch a Coordinated Research Project (CRP) on this topic. This paper addresses some challenges for uncertainty analysis in HTGR modeling, based on the experience of the OECD LWR Uncertainty Analysis in Modeling (UAM) activities, and taking into account the peculiarities of pebble bed HTGR designs. The main challenges for HTGR UAM are: the lack of experience, the totally different code packages, and the coupling of power distribution, temperature distribution, and burnup distribution through temperature feedback and pebble flow. The most serious challenge is how to deal with the uncertainty in pebble flow and in pebble bed flow modeling, and their contribution to the uncertainty of the maximum fuel temperature, which is the parameter of greatest interest for the modular HTGR. (authors)

  19. Fragility, uncertainty, and healthcare.

    PubMed

    Rogers, Wendy A; Walker, Mary J

    2016-02-01

    Medicine seeks to overcome one of the most fundamental fragilities of being human, the fragility of good health. No matter how robust our current state of health, we are inevitably susceptible to future illness and disease, while current disease serves to remind us of various frailties inherent in the human condition. This article examines the relationship between fragility and uncertainty with regard to health, and argues that there are reasons to accept rather than deny at least some forms of uncertainty. In situations of current ill health, both patients and doctors seek to manage this fragility through diagnoses that explain suffering and provide some certainty about prognosis as well as treatment. However, both diagnosis and prognosis are inevitably uncertain to some degree, leading to questions about how much uncertainty health professionals should disclose, and how to manage when diagnosis is elusive, leaving patients in uncertainty. We argue that patients can benefit when they are able to acknowledge, and appropriately accept, some uncertainty. Healthy people may seek to protect the fragility of their good health by undertaking preventative measures including various tests and screenings. However, these attempts to secure oneself against the onset of biological fragility can cause harm by creating rather than eliminating uncertainty. Finally, we argue that there are good reasons for accepting the fragility of health, along with the associated uncertainties.

  20. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
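
    For a single standard-normal input, the Polynomial Chaos projection described above reduces to a few lines: expand the model in probabilists' Hermite polynomials via Gauss-Hermite quadrature and read the mean and variance off the coefficients. The quadratic `model` below is a cheap stand-in for an expensive climate-model run; it is not CLM code.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def model(x):
    """Stand-in for an expensive simulation with one uncertain input x ~ N(0, 1)."""
    return x ** 2 + 2 * x + 3

# Probabilists' Gauss-Hermite quadrature (weight function exp(-x^2/2)).
order = 4
nodes, wts = He.hermegauss(order + 1)
norm = math.sqrt(2.0 * math.pi)   # total mass of the weight function

# Spectral projection: c_n = E[f(x) He_n(x)] / E[He_n(x)^2], with E[He_n^2] = n!
fvals = model(nodes)
coeffs = [
    float((wts * fvals * He.hermeval(nodes, [0.0] * n + [1.0])).sum())
    / (norm * math.factorial(n))
    for n in range(order + 1)
]

# Output statistics follow directly from the PC coefficients.
mean = coeffs[0]
var = sum(c ** 2 * math.factorial(n) for n, c in enumerate(coeffs) if n > 0)
print(mean, var)   # analytically 4 and 6 for this model
```

    Once the coefficients are known, the polynomial surrogate replaces the expensive model for further propagation, which is what makes the approach attractive when only sparse model runs are affordable.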

  1. Mutually Exclusive Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan

    2016-11-01

    The uncertainty principle is one of the characteristic properties of quantum theory based on incompatibility. Apart from the incompatible relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as their multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds.

  2. Mutually Exclusive Uncertainty Relations.

    PubMed

    Xiao, Yunlong; Jing, Naihuan

    2016-11-08

    The uncertainty principle is one of the characteristic properties of quantum theory based on incompatibility. Apart from the incompatible relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as their multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds.

  3. Mutually Exclusive Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan

    2016-01-01

    The uncertainty principle is one of the characteristic properties of quantum theory based on incompatibility. Apart from the incompatible relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as their multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds. PMID:27824161

  4. Optimal Universal Uncertainty Relations

    PubMed Central

    Li, Tao; Xiao, Yunlong; Ma, Teng; Fei, Shao-Ming; Jing, Naihuan; Li-Jost, Xianqing; Wang, Zhi-Xi

    2016-01-01

    We study universal uncertainty relations and present a method called the joint probability distribution diagram to improve the majorization bounds constructed independently in [Phys. Rev. Lett. 111, 230401 (2013)] and [J. Phys. A 46, 272002 (2013)]. The results give rise to state-independent uncertainty relations satisfied by any nonnegative Schur-concave functions. On the other hand, a remarkable recent result on entropic uncertainty relations is the direct-sum majorization relation. In this paper, we illustrate our bounds by showing how they provide a complement to the one in [Phys. Rev. A 89, 052115 (2014)]. PMID:27775010

  5. Odor analysis of decomposing buried human remains

    SciTech Connect

    Vass, Arpad Alexander; Smith, Rob R; Thompson, Cyril V; Burnett, Michael N; Dulgerian, Nishan; Eckenrode, Brian A

    2008-01-01

    This study, conducted at the University of Tennessee's Anthropological Research Facility (ARF), lists and ranks the primary chemical constituents which define the odor of decomposition of human remains as detected at the soil surface of shallow burial sites. Triple sorbent traps were used to collect air samples in the field and revealed eight major classes of chemicals which now contain 478 specific volatile compounds associated with burial decomposition. Samples were analyzed using gas chromatography-mass spectrometry (GC-MS) and were collected below and above the body, and at the soil surface of 1.5-3.5 ft. (0.46-1.07 m) deep burial sites of four individuals over a 4-year time span. New data were incorporated into the previously established Decompositional Odor Analysis (DOA) Database providing identification, chemical trends, and semi-quantitation of chemicals for evaluation. This research identifies the 'odor signatures' unique to the decomposition of buried human remains with projected ramifications on human remains detection canine training procedures and in the development of field portable analytical instruments which can be used to locate human remains in shallow burial sites.

  6. Juveniles' Motivations for Remaining in Prostitution

    ERIC Educational Resources Information Center

    Hwang, Shu-Ling; Bedford, Olwen

    2004-01-01

    Qualitative data from in-depth interviews were collected in 1990-1991, 1992, and 2000 with 49 prostituted juveniles remanded to two rehabilitation centers in Taiwan. These data are analyzed to explore Taiwanese prostituted juveniles' feelings about themselves and their work, their motivations for remaining in prostitution, and their difficulties…

  7. Identification of ancient remains through genomic sequencing

    PubMed Central

    Blow, Matthew J.; Zhang, Tao; Woyke, Tanja; Speller, Camilla F.; Krivoshapkin, Andrei; Yang, Dongya Y.; Derevianko, Anatoly; Rubin, Edward M.

    2008-01-01

    Studies of ancient DNA have been hindered by the preciousness of remains, the small quantities of undamaged DNA accessible, and the limitations associated with conventional PCR amplification. In these studies, we developed and applied a genomewide adapter-mediated emulsion PCR amplification protocol for ancient mammalian samples estimated to be between 45,000 and 69,000 yr old. Using 454 Life Sciences (Roche) and Illumina sequencing (formerly Solexa sequencing) technologies, we examined over 100 megabases of DNA from amplified extracts, revealing unbiased sequence coverage with substantial amounts of nonredundant nuclear sequences from the sample sources and negligible levels of human contamination. We consistently recorded over 500-fold increases, such that nanogram quantities of starting material could be amplified to microgram quantities. Application of our protocol to a 50,000-yr-old uncharacterized bone sample that was unsuccessful in mitochondrial PCR provided sufficient nuclear sequences for comparison with extant mammals and subsequent phylogenetic classification of the remains. The combined use of emulsion PCR amplification and high-throughput sequencing allows for the generation of large quantities of DNA sequence data from ancient remains. Using such techniques, even small amounts of ancient remains with low levels of endogenous DNA preservation may yield substantial quantities of nuclear DNA, enabling novel applications of ancient DNA genomics to the investigation of extinct phyla. PMID:18426903

  8. The case for fencing remains intact.

    PubMed

    Packer, C; Swanson, A; Canney, S; Loveridge, A; Garnett, S; Pfeifer, M; Burton, A C; Bauer, H; MacNulty, D

    2013-11-01

    Creel et al. argue against the conservation effectiveness of fencing based on a population measure that ignores the importance of top predators to ecosystem processes. Their statistical analyses consider, first, only a subset of fenced reserves and, second, an incomplete examination of 'costs per lion.' Our original conclusions remain unaltered.

  9. Predicting the remaining service life of concrete

    SciTech Connect

    Clifton, J.F.

    1991-11-01

    Nuclear power plants are providing, currently, about 17 percent of the U.S. electricity and many of these plants are approaching their licensed life of 40 years. The U.S. Nuclear Regulatory Commission and the Department of Energy's Oak Ridge National Laboratory are carrying out a program to develop a methodology for assessing the remaining safe-life of the concrete components and structures in nuclear power plants. This program has the overall objective of identifying potential structural safety issues, as well as acceptance criteria, for use in evaluations of nuclear power plants for continued service. The National Institute of Standards and Technology (NIST) is contributing to this program by identifying and analyzing methods for predicting the remaining life of in-service concrete materials. This report examines the basis for predicting the remaining service lives of concrete materials of nuclear power facilities. Methods for predicting the service life of new and in-service concrete materials are analyzed. These methods include (1) estimates based on experience, (2) comparison of performance, (3) accelerated testing, (4) stochastic methods, and (5) mathematical modeling. New approaches for predicting the remaining service lives of concrete materials are proposed and recommendations for their further development given. Degradation processes are discussed based on considerations of their mechanisms, likelihood of occurrence, manifestations, and detection. They include corrosion, sulfate attack, alkali-aggregate reactions, frost attack, leaching, radiation, salt crystallization, and microbiological attack.

  10. Odor analysis of decomposing buried human remains.

    PubMed

    Vass, Arpad A; Smith, Rob R; Thompson, Cyril V; Burnett, Michael N; Dulgerian, Nishan; Eckenrode, Brian A

    2008-03-01

    This study, conducted at the University of Tennessee's Anthropological Research Facility (ARF), lists and ranks the primary chemical constituents which define the odor of decomposition of human remains as detected at the soil surface of shallow burial sites. Triple sorbent traps were used to collect air samples in the field and revealed eight major classes of chemicals which now contain 478 specific volatile compounds associated with burial decomposition. Samples were analyzed using gas chromatography-mass spectrometry (GC-MS) and were collected below and above the body, and at the soil surface of 1.5-3.5 ft. (0.46-1.07 m) deep burial sites of four individuals over a 4-year time span. New data were incorporated into the previously established Decompositional Odor Analysis (DOA) Database providing identification, chemical trends, and semi-quantitation of chemicals for evaluation. This research identifies the "odor signatures" unique to the decomposition of buried human remains with projected ramifications on human remains detection canine training procedures and in the development of field portable analytical instruments which can be used to locate human remains in shallow burial sites.

  11. Why Agricultural Educators Remain in the Classroom

    ERIC Educational Resources Information Center

    Crutchfield, Nina; Ritz, Rudy; Burris, Scott

    2013-01-01

    The purpose of this study was to identify and describe factors that are related to agricultural educator career retention and to explore the relationships between work engagement, work-life balance, occupational commitment, and personal and career factors as related to the decision to remain in the teaching profession. The target population for…

  12. [Keynote address: Climate change]

    SciTech Connect

    Forrister, D.

    1994-12-31

    Broadly speaking, the climate issue is moving from talk to action both in the United States and internationally. While few nations have adopted strict controls or stiff new taxes, a number of them are developing action plans that make clear their intention to ramp up activity between now and the year 2000... and beyond. There are sensible, economically efficient strategies to be undertaken in the near term that offer the possibility, in many countries, of avoiding more draconian measures. These strategies are by and large the same measures that the National Academy of Sciences recommended in a 1991 report called Policy Implications of Greenhouse Warming. The author thinks the Academy's most important policy contribution was how it recommended that nations act in the face of uncertain science and high risks--that cost-effective measures be adopted as cheap insurance... just as nations insure against other high-risk, low-certainty possibilities through catastrophic health insurance, auto insurance, and fire insurance. This insurance theme is still right. First, the author addresses how the international climate change negotiations are beginning to produce insurance measures. Next, the author discusses some of the key issues to watch in those negotiations that relate to longer-term insurance. And finally, the author reports on progress in the United States on the climate insurance plan--the President's Climate Action Plan.

  13. Communicating scientific uncertainty.

    PubMed

    Fischhoff, Baruch; Davis, Alex L

    2014-09-16

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science.

  14. Uncertainty in chemistry.

    PubMed

    Menger, Fredric M

    2010-09-01

    It might come as a disappointment to some chemists, but just as there are uncertainties in physics and mathematics, there are some chemistry questions we may never know the answer to either, suggests Fredric M. Menger.

  15. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  16. Conundrums with uncertainty factors.

    PubMed

    Cooke, Roger

    2010-03-01

    The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets.
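    The probabilistic reading criticized above can be made concrete with a small sketch. Assuming (purely for illustration; the factor magnitudes and lognormal spreads below are hypothetical, not values from the IRIS database) that each uncertainty factor is an independent lognormal random variable, a Monte Carlo simulation turns a point reference value into a distribution:

    ```python
    import random
    import statistics

    random.seed(42)

    NOAEL = 100.0  # mg/kg-day point of departure, illustrative only

    def sample_reference_value():
        # Each factor is modeled as lognormal; independence between the
        # factors is assumed here, which is exactly the kind of strong
        # assumption the abstract calls into question.
        uf_interspecies = random.lognormvariate(1.15, 0.4)  # animal -> human
        uf_intraspecies = random.lognormvariate(1.15, 0.4)  # human variability
        uf_subchronic = random.lognormvariate(0.7, 0.3)     # subchronic -> chronic
        return NOAEL / (uf_interspecies * uf_intraspecies * uf_subchronic)

    samples = [sample_reference_value() for _ in range(20000)]
    median_rfd = statistics.median(samples)
    print(f"median reference value: {median_rfd:.2f} mg/kg-day")
    ```

    Note that the output distribution is entirely determined by the assumed factor distributions and the independence assumption; the abstract's point is that these assumptions deserve scrutiny before such a calculation is trusted.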

  17. Uncertainty in QSAR predictions.

    PubMed

    Sahlin, Ullrika

    2013-03-01

    It is relevant to consider uncertainty in individual predictions when quantitative structure-activity (or property) relationships (QSARs) are used to support decisions of high societal concern. Successful communication of uncertainty in the integration of QSARs in chemical safety assessment under the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system can be facilitated by a common understanding of how to define, characterise, assess and evaluate uncertainty in QSAR predictions. A QSAR prediction is, compared to experimental estimates, subject to added uncertainty that comes from the use of a model instead of empirically-based estimates. A framework is provided to aid the distinction between different types of uncertainty in a QSAR prediction: quantitative, i.e. for regressions related to the error in a prediction and characterised by a predictive distribution; and qualitative, by expressing our confidence in the model for predicting a particular compound based on a quantitative measure of predictive reliability. It is possible to assess a quantitative (i.e. probabilistic) predictive distribution, given the supervised learning algorithm, the underlying QSAR data, a probability model for uncertainty and a statistical principle for inference. The integration of QSARs into risk assessment may be facilitated by the inclusion of the assessment of predictive error and predictive reliability into the "unambiguous algorithm", as outlined in the second OECD principle.

  18. Explosives remain preferred methods for platform abandonment

    SciTech Connect

    Pulsipher, A.; Daniel, W. IV; Kiesler, J.E.; Mackey, V. III

    1996-05-06

    Economics and safety concerns indicate that methods involving explosives remain the most practical and cost-effective means for abandoning oil and gas structures in the Gulf of Mexico. A decade has passed since 51 dead sea turtles, many endangered Kemp's ridleys, washed ashore on the Texas coast shortly after explosives helped remove several offshore platforms. Although no relationship between the explosions and the dead turtles was ever established, in response to widespread public concern, the US Minerals Management Service (MMS) and National Marine Fisheries Service (NMFS) implemented regulations limiting the size and timing of explosive charges. Also, more importantly, they required that operators pay for observers to survey waters surrounding platforms scheduled for removal for 48 hr before any detonations. If observers spot sea turtles or marine mammals within the danger zone, the platform abandonment is delayed until the turtles leave or are removed. However, concern about the effects of explosives on marine life remains.

  19. Mill and the right to remain uninformed.

    PubMed

    Strasser, M

    1986-08-01

    In a recent article in the Journal of Medicine and Philosophy, David Ost (1984) claims that patients do not have a right to waive their right to information. He argues that patients cannot make informed rational decisions without full information, and thus a right to waive information would involve a right to avoid one's responsibility to act as an autonomous moral agent. In support of his position, Ost cites a passage from Mill. Yet a correct interpretation of the passage in question would support one's right to remain uninformed in certain situations. If the information would hurt one's chances of survival or one's ability to make calm, rational decisions, then not only does one have no duty to seek out the information, but exercising one's right to remain uninformed may be the only rational course of action.

  20. [Professional confidentiality: speak out or remain silent?].

    PubMed

    Daubigney, Jean-claude

    2014-01-01

    People who work with children, in their daily tasks, must choose whether to disclose information entrusted to them. However, they are subject to the law, which authorises or imposes speaking out or remaining silent. In terms of ethics, they can seek the best possible response while respecting professional secrecy when meeting an individual, in a situation, in a place or at a particular time. They must then take responsibility for that decision.

  1. 13 percent remain AIDS-free.

    PubMed

    1995-01-01

    Researchers predict that approximately thirteen percent of homosexual/bisexual men infected with HIV at an early age will be long-term survivors, remaining free of disease for more than twenty years. Researchers with the Multicenter AIDS Cohort Study based their predictions on data from the ongoing study of 1,809 HIV-positive men. Stable immune markers and no use of antiretrovirals were the criteria used to define long-term survival.

  2. Direct Dating of Hominids Remains In Eurasia

    NASA Astrophysics Data System (ADS)

    Yokoyama, Y.; Falguères, C.

    When archaeological sites are associated with human remains, it is important to be able to date those valuable remains directly, for several reasons. The main one is that direct dating avoids the stratigraphical problems that can arise from intrusive burials in the sequence. The other is that human bones may be encountered out of an established stratigraphical context. On the other hand, the majority of dating methods currently used are destructive and cannot be applied to these precious samples, particularly when they are older than 40,000 years and cannot be dated by radiocarbon. For several years, we have developed a completely non-destructive method that consists of measuring human remains using gamma-ray spectrometry. This technique has recently been used by other laboratories as well. We present here two important cases for the knowledge of human evolution in Eurasia. The first example is the Qafzeh site in Israel, where many human skeletons have been unearthed from burials associated with fauna and lithic artefacts. This site has been dated by several independent radiometric methods, so it was possible to compare our gamma results with the results yielded by the different methods. The second case concerns the most evolved Homo erectus found in Java, Indonesia, at the Ngandong site, close to the Solo river. A recent debate has focused on the age of these fossils, and their direct dating is of utmost importance for the knowledge of the settlement of Modern Humans in South-East Asia.

  3. Distribution of albatross remains in the Far East regions during the Holocene, based on zooarchaeological remains.

    PubMed

    Eda, Masaki; Higuchi, Hiroyoshi

    2004-07-01

    Many albatross remains have been found in the Japanese Islands and the surrounding areas, such as Sakhalin and South Korea. These remains are interesting for two reasons: numerous sites from which albatross remains have been found are located in coastal regions of the Far East where no albatrosses have been distributed recently, and there are some sites in which albatross remains represent a large portion of avian remains, although albatrosses are not easily preyed upon by human beings. We collected data on albatross remains from archaeological sites in the Far East regions during the Holocene and arranged the remains geographically, temporally and in terms of quantity. Based on these results, we showed that coastal areas along the Seas of Okhotsk and Japan have rarely been used by albatrosses in Modern times, though formerly there were many albatrosses. We proposed two explanations for the shrinkage of their distributional range: excessive hunting in the breeding areas, and distributional changes of prey for albatrosses.

  4. USING CONDITION MONITORING TO PREDICT REMAINING LIFE OF ELECTRIC CABLES.

    SciTech Connect

    LOFARO,R.; SOO,P.; VILLARAN,M.; GROVE,E.

    2001-03-29

    Electric cables are passive components used extensively throughout nuclear power stations to perform numerous safety and non-safety functions. It is known that the polymers commonly used to insulate the conductors on these cables can degrade with time; the rate of degradation being dependent on the severity of the conditions in which the cables operate. Cables do not receive routine maintenance and, since it can be very costly, they are not replaced on a regular basis. Therefore, to ensure their continued functional performance, it would be beneficial if condition monitoring techniques could be used to estimate the remaining useful life of these components. A great deal of research has been performed on various condition monitoring techniques for use on electric cables. In a research program sponsored by the U.S. Nuclear Regulatory Commission, several promising techniques were evaluated and found to provide trendable information on the condition of low-voltage electric cables. These techniques may be useful for predicting remaining life if well defined limiting values for the aging properties being measured can be determined. However, each technique has advantages and limitations that must be addressed in order to use it effectively, and the necessary limiting values are not always easy to obtain. This paper discusses how condition monitoring measurements can be used to predict the remaining useful life of electric cables. The attributes of an appropriate condition monitoring technique are presented, and the process to be used in estimating the remaining useful life of a cable is discussed along with the difficulties that must be addressed.
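    The remaining-life approach described above amounts to fitting a trend to a monitored aging property and extrapolating to a limiting value. A minimal sketch (the measurements and the end-of-life criterion below are invented for illustration; real programs would use validated limiting values and account for measurement uncertainty):

    ```python
    # Fit a linear degradation trend to condition-monitoring data and
    # extrapolate to an assumed end-of-life limit.

    def fit_line(xs, ys):
        """Ordinary least-squares fit; returns (slope, intercept)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        return slope, my - slope * mx

    years = [0, 5, 10, 15, 20]
    elongation = [500, 440, 385, 320, 260]  # % elongation-at-break, hypothetical

    slope, intercept = fit_line(years, elongation)

    LIMIT = 50.0  # assumed limiting value for the aging property
    end_of_life = (LIMIT - intercept) / slope
    remaining = end_of_life - years[-1]
    print(f"estimated remaining life: {remaining:.1f} years")
    ```

    The difficulty the paper highlights is precisely the inputs this sketch takes for granted: a trendable property and a well-defined limiting value.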

  5. Spectral optimization and uncertainty quantification in combustion modeling

    NASA Astrophysics Data System (ADS)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. 
Frequently, new data will
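    The propagation step at the heart of the polynomial-chaos approach described above can be sketched in miniature. Here a single normalized rate-parameter uncertainty (a standard normal variable) is pushed through a toy model response, and the expansion coefficients are estimated by Monte Carlo projection onto probabilists' Hermite polynomials; the exponential response and the 0.3 log-uncertainty are illustrative assumptions, not values from the work:

    ```python
    import math
    import random

    random.seed(0)

    def model(xi):
        # Toy model: prediction depends exponentially on one log-rate
        # parameter with assumed log-uncertainty 0.3.
        return math.exp(0.3 * xi)

    # Probabilists' Hermite polynomials He_0..He_2, orthogonal under N(0,1)
    he = [lambda x: 1.0, lambda x: x, lambda x: x * x - 1.0]
    norms = [1.0, 1.0, 2.0]  # E[He_k(Z)^2]

    # Estimate the expansion coefficients by Monte Carlo projection
    N = 200_000
    xs = [random.gauss(0.0, 1.0) for _ in range(N)]
    ys = [model(x) for x in xs]
    coeffs = [
        sum(y * he[k](x) for x, y in zip(xs, ys)) / (N * norms[k])
        for k in range(3)
    ]

    # Mean and standard deviation of the prediction follow directly
    # from the coefficients of the (truncated) expansion.
    mean = coeffs[0]
    std = math.sqrt(sum(c * c * n for c, n in zip(coeffs[1:], norms[1:])))
    print(f"PCE mean = {mean:.3f}, std = {std:.3f}")
    ```

    In a real kinetic model the expansion is multivariate over many rate parameters, but the principle is the same: once the coefficients are known, prediction uncertainty is read off the expansion rather than re-sampled for every condition.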

  6. Why Do Some Cores Remain Starless?

    NASA Astrophysics Data System (ADS)

    Anathpindika, S.

    2016-08-01

    Prestellar cores, by definition, are gravitationally bound but starless pockets of dense gas. The physical conditions that could render a core starless (in the local Universe) are the subject of investigation in this work. To this end, we studied the evolution of five cores: four starless cores (B68, L694-2, L1517B, L1689) and L1521F, a VeLLO. We demonstrate: (i) cores contracted in a quasistatic manner over a timescale on the order of ~ 10^5 yr. Those that remained starless briefly acquired a centrally concentrated density configuration that mimicked the profile of an unstable Bonnor-Ebert sphere before rebounding; (ii) three cores, viz. L694-2, L1689-SMM16, and L1521F, remained starless despite becoming thermally super-critical. By contrast, B68 and L1517B remained sub-critical; L1521F collapsed to become a VeLLO only when gas cooling was enhanced by increasing the size of dust grains. This result is robust, for the other starless cores, viz. B68, L694-2, L1517B, and L1689, could also be similarly induced to collapse. The temperature profiles of starless cores and of those that collapsed were found to be radically different. While in the former type there was evidence of a decline in gas temperature only very close to the centre of a core, a core of the latter type developed a more uniformly cold interior. Our principal conclusions are: (a) thermal super-criticality of a core is insufficient to ensure it will become protostellar; (b) potential star-forming cores (the VeLLO L1521F here) could be experiencing dust coagulation, which must enhance gas-dust coupling and in turn lower gas temperature, thereby assisting collapse. This also suggests that mere gravitational/virial boundedness of a core is insufficient to ensure it will form stars.

  7. 51-L Challenger Crew Remains Transferred

    NASA Technical Reports Server (NTRS)

    1988-01-01

    The Challenger crewmember remains are being transferred from 7 hearse vehicles to a MAC C-141 transport plane at the Kennedy Space Center's Shuttle Landing Facility for transport to Dover Air Force Base, Delaware. The STS-51L crew consisted of Mission Specialist Ellison S. Onizuka, Teacher in Space Participant Sharon Christa McAuliffe, Payload Specialist Greg Jarvis, and Mission Specialist Judy Resnik; in the front row, from left to right: Pilot Mike Smith, Commander Dick Scobee, and Mission Specialist Ron McNair.

  8. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications.

  9. Model development and data uncertainty integration

    SciTech Connect

    Swinhoe, Martyn Thomas

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross-section perturbations and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity -- both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated; the most significant parameters are the basic emission rates of the spontaneous fission and (α,n) processes, and the uncertainties and important data depend on the analysis technique chosen.
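    The rule stated above, that each data effect is the product of the parameter's uncertainty and the model's sensitivity to it, can be sketched with hypothetical numbers (the parameter names, uncertainties, and sensitivities below are illustrative, not values from the report):

    ```python
    # Per-parameter (relative 1-sigma uncertainty, sensitivity d ln R / d ln p)
    params = {
        "spontaneous_fission_rate": (0.02, 1.0),
        "alpha_n_intensity":        (0.10, 0.3),
        "fission_spectrum_shape":   (0.05, 0.1),
    }

    # Each contribution is uncertainty * sensitivity; assuming the
    # parameters are independent, combine the contributions in quadrature.
    total_rel_unc = sum((u * s) ** 2 for u, s in params.values()) ** 0.5
    print(f"total relative uncertainty: {total_rel_unc:.3%}")
    ```

    The quadrature combination assumes uncorrelated parameters; the abstract's mention of cross-section correlations is a reminder that off-diagonal covariance terms would have to be added in a real analysis.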

  10. So close: remaining challenges to eradicating polio.

    PubMed

    Toole, Michael J

    2016-03-14

    The Global Polio Eradication Initiative, launched in 1988, is close to achieving its goal. In 2015, reported cases of wild poliovirus were limited to just two countries - Afghanistan and Pakistan. Africa has been polio-free for more than 18 months. Remaining barriers to global eradication include insecurity in areas such as Northwest Pakistan and Eastern and Southern Afghanistan, where polio cases continue to be reported. Hostility to vaccination stems variously from extreme ideologies (as in Pakistan), vaccination fatigue among parents whose children have received more than 15 doses, and misunderstandings about the vaccine's safety and effectiveness (as in Ukraine). A further challenge is the continued circulation of vaccine-derived poliovirus in populations with low immunity, with 28 cases reported in 2015 in countries as diverse as Madagascar, Ukraine, Laos, and Myanmar. This paper summarizes the current epidemiology of wild and vaccine-derived poliovirus, and describes the remaining challenges to eradication and the innovative approaches being taken to overcome them.

  11. Shotgun microbial profiling of fossil remains.

    PubMed

    Der Sarkissian, C; Ermini, L; Jónsson, H; Alekseev, A N; Crubezy, E; Shapiro, B; Orlando, L

    2014-04-01

    Millions to billions of DNA sequences can now be generated from ancient skeletal remains thanks to the massive throughput of next-generation sequencing platforms. Except in cases of exceptional endogenous DNA preservation, most of the sequences isolated from fossil material do not originate from the specimen of interest, but instead reflect environmental organisms that colonized the specimen after death. Here, we characterize the microbial diversity recovered from seven c. 200- to 13 000-year-old horse bones collected from northern Siberia. We use a robust, taxonomy-based assignment approach to identify the microorganisms present in ancient DNA extracts and quantify their relative abundance. Our results suggest that molecular preservation niches exist within ancient samples that can potentially be used to characterize the environments from which the remains are recovered. In addition, microbial community profiling of the seven specimens revealed site-specific environmental signatures. These microbial communities appear to comprise mainly organisms that colonized the fossils recently. Our approach significantly extends the amount of useful data that can be recovered from ancient specimens using a shotgun sequencing approach. In future, it may be possible to correlate, for example, the accumulation of postmortem DNA damage with the presence and/or abundance of particular microbes.

  12. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.

  13. Recent progress toward reducing the uncertainty in tropical low cloud feedback and climate sensitivity: a review

    NASA Astrophysics Data System (ADS)

    Kamae, Youichi; Ogura, Tomoo; Shiogama, Hideo; Watanabe, Masahiro

    2016-12-01

    Equilibrium climate sensitivity (ECS) to a doubling of atmospheric CO2 concentration is a key index for understanding the Earth's climate history and predicting future climate changes. Tropical low cloud feedback, the predominant factor in the uncertainty of modeled ECS, diverges in both sign and magnitude among climate models. Despite its importance, the uncertainty in ECS and low cloud feedback remains a challenge. Recently, research based on observations and climate models has demonstrated the possibility that the tropical low cloud feedback in a perturbed climate can be constrained by the observed relationship between cloud, sea surface temperature, and atmospheric dynamic and thermodynamic structures. The observational constraint on the tropical low cloud feedback suggests a higher ECS range than the raw range obtained from climate model simulations. In addition, newly devised modeling frameworks that address both spreads among different model structures and parameter settings have contributed to evaluating the possible ranges of the uncertainty in ECS and low cloud feedback. Further observational and modeling approaches, and their combinations, may help advance toward dispelling the clouds of uncertainty.

  14. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C has emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known, but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), which technologies will be available (technological uncertainty and choices), when we choose to start acting globally on climate change (political choices), and how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  15. Low uncertainty method for inertia tensor identification

    NASA Astrophysics Data System (ADS)

    Barreto, J. P.; Muñoz, L. E.

    2016-02-01

    The uncertainty associated with the experimental identification of the inertia tensor can be reduced by implementing adequate rotational and translational motions in the experiment. This paper proposes a particular 3D trajectory that improves the experimental measurement of the inertia tensor of rigid bodies. Such a trajectory corresponds to a motion in which the object is rotated around a large number of instantaneous axes, while the center of gravity remains static. The uncertainty in the inertia tensor components obtained with this practice is reduced by 45% on average, compared with those calculated using simple rotations around three perpendicular axes (roll, pitch, yaw).

  16. Uncertainty in ecological risk assessment: A statistician's view

    SciTech Connect

    Smith, E.P.

    1995-12-31

    Uncertainty is a topic that has different meanings for researchers, modelers, managers, and policy makers. The perspective of this presentation is the modeling view of uncertainty and its quantitative assessment. The goal is to provide some insight into how a statistician visualizes and addresses the issue of uncertainty in ecological risk assessment problems. In ecological risk assessment, uncertainty arises from many sources and is of different types depending on what is studied, where it is studied, and how it is studied. Some major sources and their impacts are described. A variety of quantitative approaches to modeling uncertainty are characterized and a general taxonomy is given. Examples of risk assessments of lake acidification, power plant impact assessment, and the setting of standards for chemicals are used to discuss approaches to the quantitative assessment of uncertainty and some of the potential difficulties.

  17. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
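For context, the "more familiar preparation uncertainty relations" referred to here are, in their standard quadratic-mean (Kennard) form for position and momentum:

```latex
\Delta(Q,\rho)\,\Delta(P,\rho) \;\ge\; \frac{\hbar}{2},
\qquad
\Delta(A,\rho) \;=\; \Bigl(\operatorname{tr}\bigl(\rho A^{2}\bigr) - \bigl(\operatorname{tr}(\rho A)\bigr)^{2}\Bigr)^{1/2}.
```

The paper generalizes these quadratic means to means of order α; the result that the optimal constants coincide for preparation and measurement uncertainty is the abstract's own claim, stated here only for orientation.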

  18. Some remaining problems in HCDA analysis. [LMFBR

    SciTech Connect

    Chang, Y.W.

    1981-01-01

The safety assessment and licensing of liquid-metal fast breeder reactors (LMFBRs) requires an analysis of the capability of the reactor primary system to sustain the consequences of a hypothetical core-disruptive accident (HCDA). Although computational methods and computer programs developed for HCDA analyses can predict reasonably well the response of the primary containment system, and follow the phenomena of an HCDA from the start of the excursion to the time of dynamic equilibrium in the system, there remain areas in HCDA analysis that merit further analytical and experimental study. These are the analysis of fluid impact on the reactor cover, three-dimensional analysis, the treatment of perforated plates, material properties under high strain rates and high temperatures, the treatment of multifield flows, and the treatment of prestressed concrete reactor vessels. The purpose of this paper is to discuss the structural mechanics of HCDA analysis in these areas where improvements are needed.

  19. Traffic-Related Air Pollution and Childhood Asthma: Recent Advances and Remaining Gaps in the Exposure Assessment Methods.

    PubMed

    Khreis, Haneen; Nieuwenhuijsen, Mark J

    2017-03-17

Background: Current levels of traffic-related air pollution (TRAP) are associated with the development of childhood asthma, although some inconsistencies and heterogeneity remain. An important part of the uncertainty in studies of TRAP-associated asthma originates from uncertainties in the TRAP exposure assessment and assignment methods. In this work, we aim to systematically review the exposure assessment methods used in the epidemiology of TRAP and childhood asthma, highlight recent advances and remaining research gaps, and make suggestions for further research. Methods: We systematically reviewed epidemiological studies published up until 8 September 2016 and available in Embase, Ovid MEDLINE (R), and "Transport database". We included studies which examined the association between children's exposure to TRAP metrics and their risk of "asthma" incidence or lifetime prevalence, from birth to the age of 18 years. Results: We found 42 studies which examined the associations between TRAP and subsequent childhood asthma incidence or lifetime prevalence, published since 1999. Land-use regression modelling was the most commonly used method and nitrogen dioxide (NO₂) was the most commonly used pollutant in the exposure assessments. Most studies estimated TRAP exposure at the residential address and only a few considered the participants' mobility. TRAP exposure was mostly assessed at the birth year and only a few studies considered different and/or multiple exposure time windows. We recommend that further work is needed, including, e.g., the use of new exposure metrics such as the composition of particulate matter, oxidative potential and ultra-fine particles; improved modelling, e.g., by combining different exposure assessment models; including the mobility of the participants; and systematically investigating different exposure time windows. Conclusions: Although our previous meta-analysis found statistically significant associations for various TRAP exposures and

  20. Traffic-Related Air Pollution and Childhood Asthma: Recent Advances and Remaining Gaps in the Exposure Assessment Methods

    PubMed Central

    Khreis, Haneen; Nieuwenhuijsen, Mark J.

    2017-01-01

Background: Current levels of traffic-related air pollution (TRAP) are associated with the development of childhood asthma, although some inconsistencies and heterogeneity remain. An important part of the uncertainty in studies of TRAP-associated asthma originates from uncertainties in the TRAP exposure assessment and assignment methods. In this work, we aim to systematically review the exposure assessment methods used in the epidemiology of TRAP and childhood asthma, highlight recent advances and remaining research gaps, and make suggestions for further research. Methods: We systematically reviewed epidemiological studies published up until 8 September 2016 and available in Embase, Ovid MEDLINE (R), and “Transport database”. We included studies which examined the association between children’s exposure to TRAP metrics and their risk of “asthma” incidence or lifetime prevalence, from birth to the age of 18 years. Results: We found 42 studies which examined the associations between TRAP and subsequent childhood asthma incidence or lifetime prevalence, published since 1999. Land-use regression modelling was the most commonly used method and nitrogen dioxide (NO2) was the most commonly used pollutant in the exposure assessments. Most studies estimated TRAP exposure at the residential address and only a few considered the participants’ mobility. TRAP exposure was mostly assessed at the birth year and only a few studies considered different and/or multiple exposure time windows. We recommend that further work is needed, including, e.g., the use of new exposure metrics such as the composition of particulate matter, oxidative potential and ultra-fine particles; improved modelling, e.g., by combining different exposure assessment models; including the mobility of the participants; and systematically investigating different exposure time windows. Conclusions: Although our previous meta-analysis found statistically significant associations for various TRAP exposures and

  1. Does hypertension remain after kidney transplantation?

    PubMed

    Pourmand, Gholamreza; Dehghani, Sanaz; Rahmati, Mohamad Reza; Mehrsai, Abdolrasoul; Gooran, Shahram; Alizadeh, Farimah; Khaki, Siavash; Mortazavi, Seyede Hamideh; Pourmand, Naghmeh

    2015-01-01

Hypertension is a common complication of kidney transplantation, with a prevalence of 80%. Studies in adults have shown a high prevalence of hypertension (HTN) in the first three months after transplantation, while this rate is reduced to 50-60% at the end of the first year. HTN remains a major risk factor for cardiovascular diseases, lower graft survival rates and poor function of the transplanted kidney in adults and children. In this retrospective study, medical records of 400 kidney transplantation patients of Sina Hospital were evaluated. Patients were followed monthly for the 1st year, every two months in the 2nd year and every three months after that. In this study 244 (61%) patients were male. Mean ± SD age of recipients was 39.3 ± 13.8 years. In most patients (40.8%) the cause of end-stage renal disease (ESRD) was unknown, followed by HTN (26.3%). A total of 166 (41.5%) patients had been hypertensive before transplantation and 234 (58.5%) had normal blood pressure. Among these 234 individuals, 94 (40.2%) developed post-transplantation HTN. On the other hand, among 166 pre-transplant hypertensive patients, 86 patients (56.8%) remained hypertensive after transplantation. In total, 180 (45%) patients had post-transplantation HTN and 220 patients (55%) did not develop HTN. Based on the findings, the incidence of post-transplantation hypertension is high, and kidney transplantation does not lead to remission of hypertension. On the other hand, hypertension is one of the main causes of ESRD. Thus, early screening of hypertension can prevent kidney damage and reduce further problems in renal transplant recipients.

  2. The Reach Address Database (RAD)

    EPA Pesticide Factsheets

The Reach Address Database (RAD) stores reach address information for each Water Program feature that has been linked to the underlying surface water features (streams, lakes, etc.) in the National Hydrography Dataset Plus (NHDPlus).

  3. Uncertainty and Dimensional Calibrations

    PubMed Central

    Doiron, Ted; Stoup, John

    1997-01-01

The calculation of uncertainty for a measurement is an effort to set reasonable bounds for the measurement result according to standardized rules. Since every measurement produces only an estimate of the answer, the primary requisite of an uncertainty statement is to inform the reader of how sure the writer is that the answer is in a certain range. This report explains how we have implemented these rules for dimensional calibrations of nine different types of gages: gage blocks, gage wires, ring gages, gage balls, roundness standards, optical flats, indexing tables, angle blocks, and sieves. PMID:27805114
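The combination step behind such uncertainty statements is, for uncorrelated input components, the root-sum-square rule of the GUM. A minimal sketch with illustrative numbers (not NIST's actual budget values):

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard-uncertainty
    components (GUM rule for uncorrelated inputs with unit sensitivity
    coefficients)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Illustrative gage-block budget (nm): reference block, thermal, probing.
u_c = combined_standard_uncertainty([5.0, 3.0, 2.0])
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2 (~95 % coverage)
```

Correlated components would instead require the full law of propagation of uncertainty with covariance terms.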

  4. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  5. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  6. Uncertainty and Dimensional Calibrations.

    PubMed

    Doiron, Ted; Stoup, John

    1997-01-01

The calculation of uncertainty for a measurement is an effort to set reasonable bounds for the measurement result according to standardized rules. Since every measurement produces only an estimate of the answer, the primary requisite of an uncertainty statement is to inform the reader of how sure the writer is that the answer is in a certain range. This report explains how we have implemented these rules for dimensional calibrations of nine different types of gages: gage blocks, gage wires, ring gages, gage balls, roundness standards, optical flats, indexing tables, angle blocks, and sieves.

  7. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The proper characterization of uncertainty in the orbital state of a space object is a common requirement of many SSA functions, including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments such as air and missile defense make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state probability density function (PDF) is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants, which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the same computational cost as the traditional unscented Kalman filter, with the former able to maintain a proper characterization of the uncertainty for up to ten

  8. Weighted Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-01-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given and an optimal lower bound for the weighted sum of the variances is obtained in general quantum situation. PMID:26984295

  9. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect

    Smith, F.; Phifer, M.

    2011-06-30

sand and clay), (b) Dose Parameters (34 parameters), (c) Material Properties (20 parameters), (d) Surface Water Flows (6 parameters), and (e) Vadose and Aquifer Flow (4 parameters). Results provided an assessment of which group of parameters is most significant in the dose uncertainty. It was found that Kd and the vadose/aquifer flow parameters, both of which impact transport timing, had the greatest impact on dose uncertainty. Dose parameters had an intermediate level of impact while material properties and surface water flows had little impact on dose uncertainty. Results of the importance analysis are discussed further in Section 7 of this report. The objectives of this work were to address comments received during the CA review on the uncertainty analysis and to demonstrate an improved methodology for CA uncertainty calculations as part of CA maintenance. This report partially addresses the LFRG Review Team issue of producing an enhanced CA sensitivity and uncertainty analysis. This is described in Table 1-1 which provides specific responses to pertinent CA maintenance items extracted from Section 11 of the SRS CA (2009). As noted above, the original uncertainty analysis looked at each POA separately and only included the effects from at most five sources giving the highest peak doses at each POA. Only 17 of the 152 CA sources were used in the original uncertainty analysis and the simulation time was reduced from 10,000 to 2,000 years. A major constraint on the original uncertainty analysis was the limitation of only being able to use at most four distributed processes. This work expanded the analysis to 10,000 years using 39 of the CA sources, included cumulative dose effects at downstream POAs, with more realizations (1,000) and finer time steps. This was accomplished by using the GoldSim DP-Plus module and the 36 processors available on a new windows cluster. The last part of the work looked at the contribution to overall uncertainty from the main

  10. Uncertainties in the North Korean Nuclear Threat

    DTIC Science & Technology

    2010-01-01

providing objective analysis and effective solutions that address the challenges facing the public and private sectors around the world. RAND's ... stigmatize Korean goods, further complicating problems for the Korean economy. And economic disruption can ripple through an economy in devastating ways ... the outside world. ("N. Korean Poster Seems to Confirm Succession," 2009.)

  11. Optimal test selection for prediction uncertainty reduction

    DOE PAGES

    Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel

    2016-12-02

Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.

  12. Optimal test selection for prediction uncertainty reduction

    SciTech Connect

    Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel

    2016-12-02

    Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.

  13. Uncertainty in NIST Force Measurements.

    PubMed

    Bartel, Tom

    2005-01-01

    This paper focuses upon the uncertainty of force calibration measurements at the National Institute of Standards and Technology (NIST). The uncertainty of the realization of force for the national deadweight force standards at NIST is discussed, as well as the uncertainties associated with NIST's voltage-ratio measuring instruments and with the characteristics of transducers being calibrated. The combined uncertainty is related to the uncertainty of dissemination for force transfer standards sent to NIST for calibration.

  14. Smart Point Cloud: Definition and Remaining Challenges

    NASA Astrophysics Data System (ADS)

    Poux, F.; Hallot, P.; Neuville, R.; Billen, R.

    2016-10-01

    Dealing with coloured point cloud acquired from terrestrial laser scanner, this paper identifies remaining challenges for a new data structure: the smart point cloud. This concept arises with the statement that massive and discretized spatial information from active remote sensing technology is often underused due to data mining limitations. The generalisation of point cloud data associated with the heterogeneity and temporality of such datasets is the main issue regarding structure, segmentation, classification, and interaction for an immediate understanding. We propose to use both point cloud properties and human knowledge through machine learning to rapidly extract pertinent information, using user-centered information (smart data) rather than raw data. A review of feature detection, machine learning frameworks and database systems indexed both for mining queries and data visualisation is studied. Based on existing approaches, we propose a new 3-block flexible framework around device expertise, analytic expertise and domain base reflexion. This contribution serves as the first step for the realisation of a comprehensive smart point cloud data structure.

  15. Organic Remains in Finnish Subglacial Sediments

    NASA Astrophysics Data System (ADS)

    Punkari, Mikko; Forsström, Lars

    1995-05-01

    Many sites in Fennoscandia contain pre-Late Weichselian beds of organic matter, located mostly in the flanks of eskers. It is a matter of debate whether these fragmentary beds were deposited in situ, or whether they were deposited elsewhere and then picked up and moved by glacial ice. The till-mantled esker of Harrinkangas includes a shallow depression filled with sand and silt containing, for example, several tightly packed laminar sheets of brown moss ( Bryales) remains. It is argued that these thin peat sheets were transported at the base of the ice sheet, or englacially, and were deposited together with the silt and sand on the side of a subglacial meltwater tunnel. Subglacial meltout till subsequently covered the flanks of the esker near the receding ice margin. Information about the depositional and climatic environments was obtained from biostratigraphic analysis of the organic matter. Pollen spectra for the peat represent an open birch forest close to the tundra zone. A thin diamicton beneath the peat contains charred pine wood, recording the former presence of pine forests in western Finland. The unhumified, extremely well-preserved peat evidently originated during the final phase of an ice-free period, most probably the end of the Eemian Interglaciation. It was redeposited in the esker by the last ice sheet. Reconstructions of the Pleistocene chronology and stratigraphy of central Fennoscandia that rely on such redeposited organic matter should be viewed with caution.

  16. Spatial patterning of vulture scavenged human remains.

    PubMed

    Spradley, M Katherine; Hamilton, Michelle D; Giordano, Alberto

    2012-06-10

This article presents the results of a pilot study on the effects of vulture modification to human remains. A donated body from the Willed Body Donation Program was placed at the Forensic Anthropology Research Facility (FARF), an outdoor human decomposition laboratory located at Texas State University-San Marcos. The effects of vulture scavenging on the timing and sequence, and the rate of skeletonization, disarticulation, and dispersal were observed via a motion sensing camera and direct observation. Using GIS (Geographic Information Systems) and GPS (Global Positioning System) technologies and spatial analytical methods, the transport of skeletal elements was mapped in order to analyze dispersal and terrain-influenced patterns of active vulture scavenging. Results showed that the initial scavenging took place 37 days after placement at FARF. This delay in scavenging differs from previous research. After the initial appearance of the vultures, the body was reduced from a fully-fleshed individual to a skeleton within only 5 h. This underscores the potential for errors in postmortem interval estimations made at vulture scavenged scenes. Additionally, spatial analysis showed that skeletal elements were dispersed by vultures to lower elevations, and that the disarticulation and dispersal of the skeletal elements occurs early in the scavenging sequence.

  17. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore, PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  18. Uncertainties in successive measurements

    NASA Astrophysics Data System (ADS)

    Distler, Jacques; Paban, Sonia

    2013-06-01

When you measure an observable, A, in quantum mechanics, the state of the system changes. This, in turn, affects the quantum-mechanical uncertainty in some noncommuting observable, B. The standard uncertainty relation puts a lower bound on the uncertainty of B in the initial state. What is relevant for a subsequent measurement of B, however, is the uncertainty of B in the postmeasurement state. We re-examine this problem, both in the case where A has a pure point spectrum and in the case where A has a continuous spectrum. In the latter case, the need to include a finite detector resolution, as part of what it means to measure such an observable, has dramatic implications for the result of successive measurements. Ozawa [Phys. Rev. A 67, 042105 (2003)] proposed an inequality satisfied in the case of successive measurements. Among our results, we show that his inequality is ineffective (can never come close to being saturated). For the cases of interest, we compute a sharper lower bound.
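The "standard uncertainty relation" invoked here is Robertson's, which bounds the product of standard deviations of two observables in the same (initial) state:

```latex
\sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle\psi|\,[A,B]\,|\psi\rangle\bigr|
```

The abstract's point is precisely that this bound constrains only the premeasurement state |ψ⟩ and says nothing directly about the uncertainty of B in the state left behind after A is measured.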

  19. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  20. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  1. Reducing the uncertainty in subtropical cloud feedback

    NASA Astrophysics Data System (ADS)

    Myers, Timothy A.; Norris, Joel R.

    2016-03-01

    Large uncertainty remains on how subtropical clouds will respond to anthropogenic climate change and therefore whether they will act as a positive feedback that amplifies global warming or negative feedback that dampens global warming by altering Earth's energy budget. Here we reduce this uncertainty using an observationally constrained formulation of the response of subtropical clouds to greenhouse forcing. The observed interannual sensitivity of cloud solar reflection to varying meteorological conditions suggests that increasing sea surface temperature and atmospheric stability in the future climate will have largely canceling effects on subtropical cloudiness, overall leading to a weak positive shortwave cloud feedback (0.4 ± 0.9 W m-2 K-1). The uncertainty of this observationally based approximation of the cloud feedback is narrower than the intermodel spread of the feedback produced by climate models. Subtropical cloud changes will therefore complement positive cloud feedbacks identified by previous work, suggesting that future global cloud changes will amplify global warming.

  2. Uncertainty Quantification for Quantitative Imaging Holdup Measurements

    SciTech Connect

    Bevill, Aaron M; Bledsoe, Keith C

    2016-01-01

    In nuclear fuel cycle safeguards, special nuclear material "held up" in pipes, ducts, and glove boxes causes significant uncertainty in material-unaccounted-for estimates. Quantitative imaging is a proposed non-destructive assay technique with potential to estimate the holdup mass more accurately and reliably than current techniques. However, uncertainty analysis for quantitative imaging remains a significant challenge. In this work we demonstrate an analysis approach for data acquired with a fast-neutron coded aperture imager. The work includes a calibrated forward model of the imager. Cross-validation indicates that the forward model predicts the imager data typically within 23%; further improvements are forthcoming. A new algorithm based on the chi-squared goodness-of-fit metric then uses the forward model to calculate a holdup confidence interval. The new algorithm removes geometry approximations that previous methods require, making it a more reliable uncertainty estimator.
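A goodness-of-fit confidence interval of the kind described can be sketched by scanning candidate holdup masses and thresholding the chi-squared statistic against the best fit. This is a minimal illustration, not the authors' algorithm: `forward(m)` stands in for the calibrated imager forward model, and Gaussian pixel noise with known sigma is assumed.

```python
import numpy as np

def holdup_confidence_interval(masses, forward, measured, sigma, delta=3.84):
    """Chi-squared goodness-of-fit interval for a holdup mass.

    Scans candidate masses (a NumPy array), computes chi-squared between
    the forward-model prediction and the measured image data, and keeps
    masses within `delta` of the best fit. delta = 3.84 is the 95%
    quantile of chi-squared with 1 degree of freedom (one fitted mass).
    """
    chi2 = np.array([np.sum(((forward(m) - measured) / sigma) ** 2)
                     for m in masses])
    accepted = masses[chi2 <= chi2.min() + delta]
    return accepted.min(), accepted.max()
```

Unlike interval estimates tied to a specific source geometry, this scan only requires the ability to evaluate the forward model at each candidate mass.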

  3. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  4. Ghost Remains After Black Hole Eruption

    NASA Astrophysics Data System (ADS)

    2009-05-01

NASA's Chandra X-ray Observatory has found a cosmic "ghost" lurking around a distant supermassive black hole. This is the first detection of such a high-energy apparition, and scientists think it is evidence of a huge eruption produced by the black hole. This discovery presents astronomers with a valuable opportunity to observe phenomena that occurred when the Universe was very young. The X-ray ghost, so-called because a diffuse X-ray source has remained after other radiation from the outburst has died away, is in the Chandra Deep Field-North, one of the deepest X-ray images ever taken. The source, a.k.a. HDF 130, is over 10 billion light years away and existed at a time 3 billion years after the Big Bang, when galaxies and black holes were forming at a high rate. "We'd seen this fuzzy object a few years ago, but didn't realize until now that we were seeing a ghost", said Andy Fabian of the University of Cambridge in the United Kingdom. "It's not out there to haunt us, rather it's telling us something - in this case what was happening in this galaxy billions of years ago." Fabian and colleagues think the X-ray glow from HDF 130 is evidence for a powerful outburst from its central black hole in the form of jets of energetic particles traveling at almost the speed of light. When the eruption was ongoing, it produced prodigious amounts of radio and X-radiation, but after several million years, the radio signal faded from view as the electrons radiated away their energy. [Figure: Chandra X-ray image of HDF 130] However, less energetic electrons can still produce X-rays by interacting with the pervasive sea of photons remaining from the Big Bang - the cosmic background radiation. Collisions between these electrons and the background photons can impart enough energy to the photons to boost them into the X-ray energy band. This process produces an extended X-ray source that lasts for another 30 million years or so.
"This ghost tells us about the black hole's eruption long after

  5. An Integrated Bayesian Uncertainty Estimator: fusion of Input, Parameter and Model Structural Uncertainty Estimation in Hydrologic Prediction System

    NASA Astrophysics Data System (ADS)

    Ajami, N. K.; Duan, Q.; Sorooshian, S.

    2005-12-01

To date, single conceptual hydrologic models have often been applied to interpret physical processes within a watershed. Nevertheless, hydrologic models, regardless of their sophistication and complexity, are simplified representations of a complex, spatially distributed and highly nonlinear real-world system. Consequently, their hydrologic predictions contain considerable uncertainty from different sources, including hydrometeorological forcing inputs, boundary/initial conditions, model structure and model parameters, all of which need to be accounted for. Thus far, efforts have addressed these sources of uncertainty separately, making an implicit assumption that uncertainties from different sources are additive. However, because of the nonlinear nature of hydrologic systems, it is not feasible to account for these uncertainties independently. Here we present the Integrated Bayesian Uncertainty Estimator (IBUNE), which accounts for total uncertainty from all major sources: forcing inputs, model structure and model parameters. The algorithm uses a multi-model framework to tackle model structural uncertainty while applying Bayesian rules to estimate parameter and input uncertainty within individual models. Three hydrologic models, the SACramento Soil Moisture Accounting (SAC-SMA) model, the Hydrologic MODel (HYMOD) and the Simple Water Balance (SWB) model, were considered within the IBUNE framework for this study. The results, presented for the Leaf River Basin, MS, indicate that IBUNE gives a better quantification of uncertainty through the hydrological modeling process, and therefore provides more reliable, less biased predictions with realistic uncertainty bounds.
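The multi-model part of such a framework rests on Bayesian model combination. The following is a generic sketch of likelihood-weighted model averaging, not the actual IBUNE algorithm or the SAC-SMA/HYMOD/SWB implementation; all data values are made up for illustration.

```python
import numpy as np

def bma_combine(preds, obs, sigma):
    """Weight each model's prediction by its Gaussian likelihood against
    the observations, then average (Bayesian model averaging with equal
    prior model probabilities)."""
    preds = np.asarray(preds)               # shape: (n_models, n_times)
    # Log-likelihood of each model given the observations.
    ll = -0.5 * np.sum(((preds - obs) / sigma) ** 2, axis=1)
    w = np.exp(ll - ll.max())
    w /= w.sum()                            # posterior model weights
    return w, w @ preds                     # weights and combined prediction

obs = np.array([1.0, 2.0, 3.0])
preds = [[1.1, 2.1, 2.9],                   # close to obs -> high weight
         [0.0, 0.5, 1.0]]                   # far from obs -> low weight
w, combined = bma_combine(preds, obs, sigma=0.5)
print(w)
```

Here the first model's weight dominates, so the combined prediction tracks it closely; a full treatment would also sample parameter and input uncertainty within each model.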

  6. Population growth of Yellowstone grizzly bears: Uncertainty and future monitoring

    USGS Publications Warehouse

    Harris, R.B.; White, Gary C.; Schwartz, C.C.; Haroldson, M.A.

    2007-01-01

Grizzly bears (Ursus arctos) in the Greater Yellowstone Ecosystem of the US Rocky Mountains have recently increased in numbers, but remain vulnerable due to isolation from other populations and predicted reductions in favored food resources. Harris et al. (2006) projected how this population might fare in the future under alternative survival rates, and in doing so estimated the rate of population growth, 1983–2002. We address issues that remain from that earlier work: (1) the degree of uncertainty surrounding our estimates of the rate of population change (λ); (2) the effect of correlation among demographic parameters on these estimates; and (3) how a future monitoring system using counts of females accompanied by cubs might usefully differentiate between short-term, expected, and inconsequential fluctuations versus a true change in system state. We used Monte Carlo re-sampling of beta distributions derived from the demographic parameters used by Harris et al. (2006) to derive distributions of λ during 1983–2002 given our sampling uncertainty. Approximate 95% confidence intervals were 0.972–1.096 (assuming females with unresolved fates died) and 1.008–1.115 (with unresolved females censored at last contact). We used well-supported models of Haroldson et al. (2006) and Schwartz et al. (2006a,b,c) to assess the strength of correlations among demographic processes and the effect of omitting them in projection models. Incorporating correlations among demographic parameters yielded point estimates of λ that were nearly identical to those from the earlier model that omitted correlations, but yielded wider confidence intervals surrounding λ. Finally, we suggest that fitting linear and quadratic curves to the trend suggested by the estimated number of females with cubs in the ecosystem, and using AICc model weights to infer population sizes and λ, provides an objective means of monitoring approximate population trajectories in addition to demographic
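The Monte Carlo procedure described above can be sketched compactly: draw demographic rates from beta distributions, compute λ for each draw, and take percentiles of the resulting distribution. The beta shape parameters and the scalar growth model below are purely illustrative stand-ins, not the study's stage-structured model or its fitted values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical beta distributions for adult female survival and
# recruitment (illustrative shape parameters, not the study's).
survival = rng.beta(95, 5, n)        # mean ~0.95
recruitment = rng.beta(30, 70, n)    # mean ~0.30

# Toy scalar growth-rate model: lambda = survival + recruitment.
lam = survival + recruitment

lo, hi = np.percentile(lam, [2.5, 97.5])
print(f"approx. 95% CI for lambda: [{lo:.3f}, {hi:.3f}]")
```

Resampling propagates the sampling uncertainty of each demographic parameter into an interval for λ, which is exactly how the wide confidence intervals quoted in the abstract arise.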

  7. Computations of uncertainty mediate acute stress responses in humans.

    PubMed

    de Berker, Archy O; Rutledge, Robb B; Mathys, Christoph; Marshall, Louise; Cross, Gemma F; Dolan, Raymond J; Bestmann, Sven

    2016-03-29

    The effects of stress are frequently studied, yet its proximal causes remain unclear. Here we demonstrate that subjective estimates of uncertainty predict the dynamics of subjective and physiological stress responses. Subjects learned a probabilistic mapping between visual stimuli and electric shocks. Salivary cortisol confirmed that our stressor elicited changes in endocrine activity. Using a hierarchical Bayesian learning model, we quantified the relationship between the different forms of subjective task uncertainty and acute stress responses. Subjective stress, pupil diameter and skin conductance all tracked the evolution of irreducible uncertainty. We observed a coupling between emotional and somatic state, with subjective and physiological tuning to uncertainty tightly correlated. Furthermore, the uncertainty tuning of subjective and physiological stress predicted individual task performance, consistent with an adaptive role for stress in learning under uncertain threat. Our finding that stress responses are tuned to environmental uncertainty provides new insight into their generation and likely adaptive function.
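The paper separates several forms of task uncertainty with a hierarchical Bayesian learner. As a minimal illustration of the simplest of these, the irreducible uncertainty of a probabilistic shock can be written as the entropy of the Bernoulli outcome, which peaks when the shock probability is 0.5; this closed form is a stand-in for the paper's full model.

```python
import numpy as np

def irreducible_uncertainty(p):
    """Outcome entropy (in bits) of a Bernoulli shock probability p:
    highest at p = 0.5, zero when the outcome is certain."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

for p in (0.1, 0.5, 0.9):
    print(p, irreducible_uncertainty(p))
```

Under this operationalization, stress responses tracking irreducible uncertainty would be largest when the stimulus-shock mapping is closest to 50/50, the regime in which no amount of learning can make the outcome predictable.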

  9. Science, Uncertainty, and Adaptive Management in Large River Restoration Programs: Trinity River example

    NASA Astrophysics Data System (ADS)

    McBain, S.

    2002-12-01

Following construction of Trinity and Lewiston dams on the upper Trinity River in 1964, dam-induced changes to streamflow and sediment regimes severely simplified channel morphology and aquatic habitat downstream of the dams. This habitat change, combined with blocked access to over 100 miles of salmon and steelhead habitat upstream of the dams, caused salmon and steelhead populations to plummet quickly. An instream flow study was initiated in 1984 to address the flow needs for restoring the fishery, and this study relied on the Physical Habitat Simulation (PHABSIM) Model to quantify instream flow needs. In 1992, geomorphic and riparian studies were integrated into the instream flow study, with the overall study completed in 1999 (USFWS 1999). This 13-year process continued through three presidential administrations, several agency managers, and many turnovers of the agency technical staff responsible for conducting the study. It culminated in 1996-1998, when a group of scientists was convened to integrate all the studies and data to produce the final instream flow study document. This 13-year, non-linear process resulted in many uncertainties that could not be resolved in the short amount of time allowed for completing the instream flow study document. Shortly after its completion, the Secretary of the Interior issued a Record of Decision to implement the recommendations contained in the instream flow study document. The uncertainties encountered as the report was prepared were highlighted in the report, and the Record of Decision initiated an Adaptive Environmental Assessment and Management program to address these existing uncertainties and improve future river management. There have been many lessons learned going through this process, and the presentation will summarize: 1) The progression of science used to develop the instream flow study report; 2) How the scientists preparing the report addressed

  10. Ciguatera: recent advances but the risk remains.

    PubMed

    Lehane, L; Lewis, R J

    2000-11-01

Ciguatera is an important form of human poisoning caused by the consumption of seafood. The disease is characterised by gastrointestinal, neurological and cardiovascular disturbances. In cases of severe toxicity, paralysis, coma and death may occur. There is no immunity, and the toxins are cumulative. Symptoms may persist for months or years, or recur periodically. The epidemiology of ciguatera is complex and of central importance to the management and future use of marine resources. Ciguatera is an important medical entity in tropical and subtropical Pacific and Indian Ocean regions, and in the tropical Caribbean. As reef fish are increasingly exported to other areas, it has become a world health problem. The disease is under-reported and often misdiagnosed. Lipid-soluble, polyether toxins known as ciguatoxins accumulated in the muscles of certain subtropical and tropical marine finfish cause ciguatera. Ciguatoxins arise from biotransformation in the fish of less polar ciguatoxins (gambiertoxins) produced by Gambierdiscus toxicus, a marine dinoflagellate that lives on macroalgae, usually attached to dead coral. The toxins and their metabolites are concentrated in the food chain when carnivorous fish prey on smaller herbivorous fish. Humans are exposed at the end of the food chain. More than 400 species of fish can be vectors of ciguatoxins, but generally only a relatively small number of species are regularly incriminated in ciguatera. Ciguateric fish look, taste and smell normal, and detection of toxins in fish remains a problem. More than 20 precursor gambiertoxins and ciguatoxins have been identified in G. toxicus and in herbivorous and carnivorous fish. The toxins become more polar as they undergo oxidative metabolism and pass up the food chain. The main Pacific ciguatoxin (P-CTX-1) causes ciguatera at levels ≥0.1 microg/kg in the flesh of carnivorous fish. The main Caribbean ciguatoxin (C-CTX-1) is less polar and 10-fold less toxic than P-CTX-1. Ciguatoxins

  11. On the Directional Dependence and Null Space Freedom in Uncertainty Bound Identification

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    1997-01-01

In previous work, the determination of uncertainty models via minimum norm model validation is based on a single set of input and output measurement data. Since uncertainty bounds at each frequency are directionally dependent for multivariable systems, this will lead to optimistic uncertainty levels. In addition, the design freedom in the uncertainty model has not been utilized to further reduce uncertainty levels. The above issues are addressed by formulating a min-max problem. An analytical solution to the min-max problem is given to within a generalized eigenvalue problem, thus avoiding a direct numerical approach. This result will lead to less conservative and more realistic uncertainty models for use in robust control.
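The computational core the abstract points to, a generalized eigenvalue problem A v = λ B v, can be solved without a direct numerical min-max search. A minimal numpy sketch for the symmetric-definite case, with toy matrices unrelated to the paper's data, reduces it to a standard eigenproblem via the Cholesky factor of B:

```python
import numpy as np

# Generalized eigenvalue problem A v = lambda B v, with A symmetric
# and B symmetric positive-definite.
A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[1.0, 0.0], [0.0, 2.0]])

L = np.linalg.cholesky(B)          # B = L L^T
Linv = np.linalg.inv(L)
C = Linv @ A @ Linv.T              # standard symmetric problem C w = lambda w
vals, w = np.linalg.eigh(C)
vecs = Linv.T @ w                  # map eigenvectors back: v = L^{-T} w

v = vecs[:, -1]                    # eigenvector of the largest eigenvalue
print(np.allclose(A @ v, vals[-1] * (B @ v)))
```

The largest generalized eigenvalue plays the role of the worst-case (max) direction, which is what makes the eigenproblem formulation a natural fit for directionally dependent uncertainty bounds.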

  12. Atomic data for stellar spectroscopy: recent successes and remaining needs

    NASA Astrophysics Data System (ADS)

    Sneden, Christopher; Lawler, James E.; Wood, Michael P.; Den Hartog, Elizabeth A.; Cowan, John J.

    2014-11-01

Stellar chemical composition analyses provide vital insights into galactic nucleosynthesis. Atomic line data are critical inputs to stellar abundance computations. Recent lab studies have made significant progress in refining and extending knowledge of transition probabilities, isotopic wavelength shifts, and hyperfine substructure patterns for the absorption lines that are of most interest to stellar spectroscopists. The observable neutron-capture (n-capture) element species (Z > 30) have been scrutinized in lab studies by several groups. For many species the uncertainties in experimental oscillator strengths are ≤ 10%, which permits detailed assessment of rapid and slow n-capture nucleosynthesis contributions. In this review, extreme examples of r-process-enriched stars in the galactic halo will be shown, which suggest that the description of observable n-capture abundances in these stars is nearly complete. Unfortunately, there are serious remaining concerns about the reliability of observed abundances of lighter elements. In particular, it is not clear that line formation in real stellar atmospheres is being modeled correctly. But for many elements with Z < 30 the atomic transition data are not yet settled. Highlights will be given of some recent large improvements, with suggestions for the most important needs for the near future.

  13. CONTENT-ADDRESSABLE MEMORY SYSTEMS,

    DTIC Science & Technology

The utility of content-addressable memories (CAM's) within a general-purpose computing system is investigated. Word cells within CAM may be...addressed by the character of all or a part of cell contents. Multimembered sets of word cells may be addressed simultaneously. The distributed logical...package is developed which allows simulation of CAM commands within job programs run on the IBM 7090 and derives tallies of execution times corresponding to a particular realization of a CAM system. (Author)

  14. Radar stage uncertainty

    USGS Publications Warehouse

    Fulford, J.M.; Davies, W.J.

    2005-01-01

    The U.S. Geological Survey is investigating the performance of radars used for stage (or water-level) measurement. This paper presents a comparison of estimated uncertainties and data for radar water-level measurements with float, bubbler, and wire weight water-level measurements. The radar sensor was also temperature-tested in a laboratory. The uncertainty estimates indicate that radar measurements are more accurate than uncorrected pressure sensors at higher water stages, but are less accurate than pressure sensors at low stages. Field data at two sites indicate that radar sensors may have a small negative bias. Comparison of field radar measurements with wire weight measurements found that the radar tends to measure slightly lower values as stage increases. Copyright ASCE 2005.

  15. How Uncertain is Uncertainty?

    NASA Astrophysics Data System (ADS)

    Vámos, Tibor

The gist of the paper is the fundamentally uncertain nature of all kinds of uncertainty and, consequently, a critical epistemic review of historical and recent approaches, computational methods, and algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic views, medieval nominalism, and the influential pioneering metaphors of ancient India and Persia, to the birth of modern mathematical disciplinary reasoning. Discussing the models of uncertainty, e.g. the statistical and other physical and psychological backgrounds, we reach a pragmatic, model-related estimation perspective: a balanced application orientation for different problem areas. Data mining, game theories and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.

  16. Uncertainties in transpiration estimates.

    PubMed

    Coenders-Gerrits, A M J; van der Ent, R J; Bogaard, T A; Wang-Erlandsson, L; Hrachowitz, M; Savenije, H H G

    2014-02-13

Arising from S. Jasechko et al. Nature 496, 347-350 (2013); doi:10.1038/nature11983. How best to assess the respective importance of plant transpiration over evaporation from open waters, soils and short-term storage such as tree canopies and understories (interception) has long been debated. On the basis of data from lake catchments, Jasechko et al. conclude that transpiration accounts for 80-90% of total land evaporation globally (Fig. 1a). However, another choice of input data, together with more conservative accounting of the related uncertainties, reduces and widens the transpiration ratio estimate to 35-80%. Hence, climate models do not necessarily conflict with observations, but more measurements on the catchment scale are needed to reduce the uncertainty range. There is a Reply to this Brief Communications Arising by Jasechko, S. et al. Nature 506, http://dx.doi.org/10.1038/nature12926 (2014).

  17. Aggregating and Communicating Uncertainty.

    DTIC Science & Technology

    1980-04-01

means for identifying and communicating uncertainty. APPENDIX A, BIBLIOGRAPHY: 1. Ajzen, Icek; "Intuitive Theories of Events and the Effects of Base-Rate Information on Prediction" ... "to the criterion while disregarding valid but noncausal information." (Icek Ajzen, "Intuitive Theories of Events and the Effects of Base-Rate Information on Prediction")

  18. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2 °C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  19. Variants of Uncertainty

    DTIC Science & Technology

    1981-05-15

Variants of Uncertainty. Daniel Kahneman, University of British Columbia; Amos Tversky, Stanford University. May 15, 1981. ...(Dennett, 1979) in which different parts have access to different data, assign them different weights and hold different views of the situation... The probable and the provable. Oxford: Clarendon Press, 1977. Dennett, D.C. Brainstorms. Hassocks: Harvester, 1979. Donchin, E., Ritter, W. & McCallum, W.C.

  20. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
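The classical least-squares formulation that the report takes as its starting point fits in a few lines. The linear model, noise level, and grid search below are a toy illustration, not the report's application; CUU would additionally carry an explicit model-error term rather than treating the model as exact.

```python
import numpy as np

# Toy model: y = k * x. Classical calibration finds the k minimizing the
# squared difference between model output and experimental data.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = 2.0 * x + rng.normal(0, 0.05, x.size)   # synthetic "experimental" data

sse = lambda k: np.sum((y - k * x) ** 2)    # sum of squared errors
ks = np.linspace(0, 4, 4001)
k_hat = ks[np.argmin([sse(k) for k in ks])]
print(f"calibrated k = {k_hat:.2f}")
```

The point of the report is that this deterministic fit ignores the error bars on both sides; a Bayesian treatment would instead return a posterior distribution over k.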

  1. Uncertainty Quantification in Aeroelasticity

    NASA Astrophysics Data System (ADS)

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  2. Facing uncertainty in ecosystem services-based resource management.

    PubMed

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes.

  3. Multi-scenario modelling of uncertainty in stochastic chemical systems

    SciTech Connect

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-09-15

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo.
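The composite-state idea, averaging stochastic simulation outcomes over samples from the uncertain parameter distribution, can be sketched with a Gillespie simulation of the isomerization example. The rate constants, the normal distribution on the forward rate, and the sample counts below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def gillespie_isomerization(kf, kr, a0, b0, t_end, rng):
    """Stochastic simulation (Gillespie SSA) of reversible isomerization
    A <-> B; returns the count of A at time t_end."""
    a, b, t = a0, b0, 0.0
    while True:
        rates = (kf * a, kr * b)
        total = rates[0] + rates[1]
        if total == 0:
            return a
        t += rng.exponential(1.0 / total)   # time to next reaction
        if t > t_end:
            return a
        if rng.random() < rates[0] / total: # pick which reaction fired
            a, b = a - 1, b + 1
        else:
            a, b = a + 1, b - 1

rng = np.random.default_rng(7)
# Sample the uncertain forward rate from its distribution, simulate, and
# average the outcomes: a "composite state" over the parameter uncertainty.
samples = [gillespie_isomerization(rng.normal(1.0, 0.1), 0.5, 100, 0, 5.0, rng)
           for _ in range(200)]
print(f"mean A at t=5: {np.mean(samples):.1f}")
```

As the abstract notes, direct Kinetic Monte Carlo sampling like this is expensive; the proposed Chemical Master Equation approach reaches the composite state more efficiently.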

  4. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches.

  5. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  6. Addressing spatial scales and new mechanisms in climate impact ecosystem modeling

    NASA Astrophysics Data System (ADS)

    Poulter, B.; Joetzjer, E.; Renwick, K.; Ogunkoya, G.; Emmett, K.

    2015-12-01

Climate change impacts on vegetation distributions are typically addressed using either an empirical approach, such as a species distribution model (SDM), or with process-based methods, for example, dynamic global vegetation models (DGVMs). Each approach has its own benefits and disadvantages. For example, an SDM is constrained by data and few parameters, but does not include adaptation or acclimation processes or other ecosystem feedbacks that may act to mitigate or enhance climate effects. Alternatively, a DGVM includes many mechanisms relating plant growth and disturbance to climate, but simulations are costly to perform at high spatial resolution and large uncertainty remains in a variety of fundamental physical processes. To address these issues, here we present two DGVM-based case studies in which i) high-resolution (1 km) simulations are being performed for vegetation in the Greater Yellowstone Ecosystem using a biogeochemical forest gap model, LPJ-GUESS, and ii) new mechanisms for simulating tropical tree mortality are being introduced. High-resolution DGVM simulations require not only additional computing and code reorganization but also consideration of how scaling affects vegetation dynamics and stochasticity, as well as disturbance and migration. New mechanisms for simulating forest mortality must consider hydraulic limitations and carbon reserves, their interactions with source-sink dynamics, and their role in controlling water potentials. Improving DGVM approaches by addressing spatial-scale challenges and integrating new approaches for estimating forest mortality will provide insights more relevant for land management, and may reduce uncertainty by representing physical processes in a way more directly comparable to experimental and observational evidence.

  7. CO2 studies remain key to understanding a future world.

    PubMed

    Becklin, Katie M; Walker, S Michael; Way, Danielle A; Ward, Joy K

    2017-04-01

SUMMARY: Characterizing plant responses to past, present and future changes in atmospheric carbon dioxide concentration ([CO2]) is critical for understanding and predicting the consequences of global change over evolutionary and ecological timescales. Previous CO2 studies have provided great insights into the effects of rising [CO2] on leaf-level gas exchange, carbohydrate dynamics and plant growth. However, scaling CO2 effects across biological levels, especially in field settings, has proved challenging. Moreover, many questions remain about the fundamental molecular mechanisms driving plant responses to [CO2] and other global change factors. Here we discuss three examples of topics in which significant questions in CO2 research remain unresolved: (1) mechanisms of CO2 effects on plant developmental transitions; (2) implications of rising [CO2] for integrated plant-water dynamics and drought tolerance; and (3) CO2 effects on symbiotic interactions and eco-evolutionary feedbacks. Addressing these and other key questions in CO2 research will require collaborations across scientific disciplines and new approaches that link molecular mechanisms to complex physiological and ecological interactions across spatiotemporal scales.

  8. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics...

  9. Propagation of radar rainfall uncertainty in urban flood simulations

    NASA Astrophysics Data System (ADS)

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

This work discusses the results of the implementation of a novel probabilistic system designed to improve ensemble sewer flow predictions for the drainage network of a small urban area in the North of England. The probabilistic system has been developed to model the uncertainty associated with radar rainfall estimates and propagate it through radar-based ensemble sewer flow predictions. The assessment of this system aims at outlining the benefits of addressing the uncertainty associated with radar rainfall estimates in a probabilistic framework, to be potentially implemented in the real-time management of the sewer network in the study area. Radar rainfall estimates are affected by uncertainty due to various factors [1-3], and quality control and correction techniques have been developed in order to improve their accuracy. However, the hydrological use of radar rainfall estimates and forecasts remains challenging. A significant effort has been devoted by the international research community to the assessment of uncertainty propagation through probabilistic hydro-meteorological forecast systems [4-5], and various approaches have been implemented for the purpose of characterizing the uncertainty in radar rainfall estimates and forecasts [6-11]. A radar-based ensemble stochastic approach, similar to the one implemented for use in the Southern Alps by the REAL system [6], has been developed for the purpose of this work. An ensemble generator has been calibrated on the basis of the spatial-temporal characteristics of the residual error in radar estimates, assessed with reference to rainfall records from around 200 rain gauges available for the year 2007, previously post-processed and corrected by the UK Met Office [12-13]. Each ensemble member is determined by summing a perturbation field to the unperturbed radar rainfall field. The perturbations are generated by imposing the radar error spatial and temporal correlation structure on purely stochastic fields...
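The ensemble-generation step described above (a spatially correlated perturbation added to the unperturbed radar field) can be sketched as follows. The Gaussian-smoothing approach, correlation length, and error magnitude are illustrative assumptions, not the calibrated REAL-style generator used in the study:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ensemble_member(radar_field, corr_length_px=5.0, error_std=0.3, rng=None):
    """One ensemble member: the unperturbed radar field plus a perturbation
    with an imposed spatial correlation structure (illustrative values)."""
    rng = np.random.default_rng(rng)
    noise = rng.standard_normal(radar_field.shape)       # purely stochastic field
    corr = gaussian_filter(noise, sigma=corr_length_px)  # impose spatial correlation
    corr *= error_std / corr.std()                       # rescale to the error level
    return radar_field + corr

field = np.full((100, 100), 2.0)   # uniform 2 mm/h rainfall field
members = [ensemble_member(field, rng=i) for i in range(20)]
```

In the study itself the correlation structure is estimated from rain-gauge residuals and applied in time as well as space; the sketch keeps only the spatial part.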

  10. Pauli effects in uncertainty relations

    NASA Astrophysics Data System (ADS)

    Toranzo, I. V.; Sánchez-Moreno, P.; Esquivel, R. O.; Dehesa, J. S.

    2014-10-01

    In this Letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information-based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.

  11. Uncertainty Quantification in Climate Modeling and Projection

    SciTech Connect

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

...assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop's objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, there remain significant challenges still to be resolved before UQ can be applied in a convincing way to climate models and their projections.

  12. 8 CFR 213a.3 - Notice of change of address.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... SUPPORT ON BEHALF OF IMMIGRANTS § 213a.3 Notice of change of address. (a)(1) If the address of a sponsor... obligation under the affidavit of support remains in effect with respect to any sponsored immigrant, the... 213A(d)(2)(A) of the Act. (ii) If the sponsor, knowing that the sponsored immigrant has received...

  13. 8 CFR 213a.3 - Notice of change of address.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... SUPPORT ON BEHALF OF IMMIGRANTS § 213a.3 Notice of change of address. (a)(1) If the address of a sponsor... obligation under the affidavit of support remains in effect with respect to any sponsored immigrant, the... 213A(d)(2)(A) of the Act. (ii) If the sponsor, knowing that the sponsored immigrant has received...

  14. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
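A minimal sketch of dispersing a coefficient within asymmetric uncertainty bounds for a Monte Carlo-type analysis follows. The split-normal sampling scheme and the 2-sigma interpretation of the bounds are assumptions for illustration, not the expression proposed in the paper:

```python
import numpy as np

def disperse_asymmetric(nominal, upper, lower, n=10000, rng=None):
    """Disperse a coefficient within asymmetric bounds using a split
    (two-piece) normal: positive deviations scaled by 'upper', negative
    by 'lower'. Both bounds are treated as ~2-sigma half-widths, which is
    an assumption for illustration only."""
    rng = np.random.default_rng(rng)
    z = rng.standard_normal(n)
    return np.where(z >= 0.0,
                    nominal + z * (upper / 2.0),
                    nominal + z * (lower / 2.0))

# Hypothetical lift coefficient near a shock, with larger upside uncertainty
cl = disperse_asymmetric(0.45, upper=0.06, lower=0.02, rng=0)
```

The split construction keeps the median at the nominal value while skewing the dispersion toward the side with the larger bound, which is the qualitative behaviour an asymmetric uncertainty expression is meant to capture.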

  15. Climate change risk perception and communication: addressing a critical moment?

    PubMed

    Pidgeon, Nick

    2012-06-01

    Climate change is an increasingly salient issue for societies and policy-makers worldwide. It now raises fundamental interdisciplinary issues of risk and uncertainty analysis and communication. The growing scientific consensus over the anthropogenic causes of climate change appears to sit at odds with the increasing use of risk discourses in policy: for example, to aid in climate adaptation decision making. All of this points to a need for a fundamental revision of our conceptualization of what it is to do climate risk communication. This Special Collection comprises seven papers stimulated by a workshop on "Climate Risk Perceptions and Communication" held at Cumberland Lodge Windsor in 2010. Topics addressed include climate uncertainties, images and the media, communication and public engagement, uncertainty transfer in climate communication, the role of emotions, localization of hazard impacts, and longitudinal analyses of climate perceptions. Climate change risk perceptions and communication work is critical for future climate policy and decisions.

  16. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2015-01-01

Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation in burn-up calculations. One proposed approach was the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (sodium fast reactor) to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures was not carried out, and has to be performed in order to analyse the limitations of using one-group uncertainties.

  17. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    SciTech Connect

    Díez, C.J.; Cabellos, O.; Martínez, J.S.

    2015-01-15

Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation in burn-up calculations. One proposed approach was the Hybrid Method, where uncertainties in nuclear data are propagated only through the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (sodium fast reactor) to assess uncertainties in the isotopic composition. However, a comparison with multi-group energy structures was not carried out, and has to be performed in order to analyse the limitations of using one-group uncertainties.
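The collapse from multi-group to one-group quantities that the Hybrid Method relies on can be illustrated with a flux-weighted average and the standard sandwich rule for the covariance; the three-group numbers below are invented for illustration:

```python
import numpy as np

def collapse_one_group(sigma_g, flux_g):
    """Flux-weighted collapse of a multi-group cross section to one group."""
    return float(np.dot(sigma_g, flux_g) / flux_g.sum())

def collapse_variance(cov_g, flux_g):
    """Collapse a multi-group covariance matrix to a one-group variance
    using the same flux weights (sandwich rule)."""
    w = flux_g / flux_g.sum()
    return float(w @ cov_g @ w)

sigma_g = np.array([1.2, 0.8, 0.5])     # barns; invented 3-group data
flux_g = np.array([0.2, 0.3, 0.5])      # group fluxes (already normalised)
cov_g = np.diag((0.05 * sigma_g) ** 2)  # 5% uncorrelated per-group errors

sig1 = collapse_one_group(sigma_g, flux_g)   # 0.73 barns
var1 = collapse_variance(cov_g, flux_g)
```

With uncorrelated group errors the collapsed relative uncertainty comes out below 5%, which hints at the limitation the abstract raises: whether one-group uncertainties faithfully represent the multi-group covariance depends on the correlation structure that the collapse averages away.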

  18. Addressing health literacy in patient decision aids

    PubMed Central

    2013-01-01

Background Effective use of a patient decision aid (PtDA) can be affected by the user’s health literacy and the PtDA’s characteristics. Systematic reviews of the relevant literature can guide PtDA developers to attend to the health literacy needs of patients. The reviews reported here aimed to assess: 1. a) the effects of health literacy/numeracy on selected decision-making outcomes, and b) the effects of interventions designed to mitigate the influence of lower health literacy on decision-making outcomes, and 2. the extent to which existing PtDAs a) account for health literacy, and b) are tested in lower health literacy populations. Methods We reviewed literature for evidence relevant to these two aims. When high-quality systematic reviews existed, we summarized their evidence. When reviews were unavailable, we conducted our own systematic reviews. Results Aim 1: In an existing systematic review of PtDA trials, lower health literacy was associated with lower patient health knowledge (14 of 16 eligible studies). Fourteen studies reported practical design strategies to improve knowledge for lower health literacy patients. In our own systematic review, no studies reported on values clarity per se, but in 2, lower health literacy was related to higher decisional uncertainty and regret. Lower health literacy was associated with less desire for involvement in 3 studies, less question-asking in 2, and less patient-centered communication in 4 studies; its effects on other measures of patient involvement were mixed. Only one study assessed the effects of a health literacy intervention on outcomes; it showed that using video to improve the salience of health states reduced decisional uncertainty. Aim 2: In our review of 97 trials, only 3 PtDAs overtly addressed the needs of lower health literacy users. In 90% of trials, user health literacy and readability of the PtDA were not reported. However, increases in knowledge and informed choice were reported in those studies...

  19. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary information for multimodal distributions. The uncertainty of the multi...
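The per-cell density estimation and peak-count ("roughness") summary described above can be sketched with a Gaussian kernel density estimate; the bandwidth choice and grid resolution here are illustrative defaults, not the paper's exact settings:

```python
import numpy as np
from scipy.stats import gaussian_kde

def roughness(samples, n_grid=256):
    """Smooth density estimate for one grid cell's multi-valued data,
    plus the number of peaks (the 'roughness' summary)."""
    grid = np.linspace(samples.min() - 1.0, samples.max() + 1.0, n_grid)
    pdf = gaussian_kde(samples)(grid)
    # A peak is an interior point higher than both of its neighbours
    n_peaks = int(((pdf[1:-1] > pdf[:-2]) & (pdf[1:-1] > pdf[2:])).sum())
    return n_peaks, grid, pdf

rng = np.random.default_rng(0)
bimodal = np.concatenate([rng.normal(0.0, 0.5, 500),
                          rng.normal(5.0, 0.5, 500)])
n_peaks, grid, pdf = roughness(bimodal)   # two well-separated modes
```

Stacking the `pdf` curves for every cell along a row or transect, offset vertically above the map, yields the "PDF wall" described in the abstract.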

  20. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1972-01-01

Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite and from the longer-range viewpoint afforded by the Geopause concept. Attention is focused on methods for short-arc tracking which are essentially geometric in nature. One method uses combinations of lasers and collocated cameras; the other relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

  1. Methods for handling uncertainty within pharmaceutical funding decisions

    NASA Astrophysics Data System (ADS)

    Stevenson, Matt; Tappenden, Paul; Squires, Hazel

    2014-01-01

This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health, with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty, and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area was ongoing at the time of this review.

  2. Efficient inversion and uncertainty quantification of a tephra fallout model

    NASA Astrophysics Data System (ADS)

    White, J. T.; Connor, C. B.; Connor, L.; Hasenaka, T.

    2017-01-01

An efficient and effective inversion and uncertainty quantification approach is proposed for estimating eruption parameters given a data set collected from a tephra deposit. The approach is model independent and here is applied using Tephra2, a code that simulates advective and dispersive tephra transport and deposition. The Levenberg-Marquardt algorithm is combined with formal Tikhonov and subspace regularization to invert eruption parameters; a linear equation for conditional uncertainty propagation is used to estimate posterior parameter uncertainty. Both the inversion and uncertainty analysis support simultaneous analysis of the full eruption and wind field parameterization. The combined inversion/uncertainty quantification approach is applied to the 1992 eruption of Cerro Negro and the 2011 Kirishima-Shinmoedake eruption. While eruption mass uncertainty is reduced by inversion against tephra isomass data, considerable uncertainty remains for many eruption and wind field parameters, such as plume height. Supplementing the inversion data set with tephra granulometry data is shown to further reduce the uncertainty of most eruption and wind field parameters. The eruption mass of the 2011 Kirishima-Shinmoedake eruption is 0.82 × 10^10 kg to 2.6 × 10^10 kg, with 95% confidence; total eruption mass for the 1992 Cerro Negro eruption is 4.2 × 10^10 kg to 7.3 × 10^10 kg, with 95% confidence. These results indicate that eruption classification and characterization of eruption parameters can be significantly improved through this uncertainty quantification approach.
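The linear conditional uncertainty propagation used to obtain a posterior parameter covariance can be illustrated on a toy linear forward model. The matrices below are invented for illustration and merely stand in for the Jacobian of a code like Tephra2:

```python
import numpy as np

# Toy linear "forward model": deposit mass at 3 sites as a linear function
# of 2 eruption parameters (values are illustrative, not Tephra2 internals).
G = np.array([[1.0, 0.5],
              [0.8, 1.2],
              [0.3, 0.9]])
obs = np.array([2.1, 3.0, 1.6])          # observed isomass data (invented)
obs_var = 0.1 ** 2                       # observation noise variance
prior_cov = np.diag([4.0, 4.0])          # broad Tikhonov-style prior

# Regularized normal equations; one Gauss-Newton step suffices because
# this toy model is linear (Levenberg-Marquardt would iterate this).
A = G.T @ G / obs_var + np.linalg.inv(prior_cov)
post_cov = np.linalg.inv(A)              # linear conditional uncertainty
params = post_cov @ (G.T @ obs / obs_var)

post_sd = np.sqrt(np.diag(post_cov))
prior_sd = np.sqrt(np.diag(prior_cov))   # posterior spread < prior spread
```

The contraction from `prior_sd` to `post_sd` is the same effect the abstract reports: inversion against deposit data reduces, but does not eliminate, parameter uncertainty.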

  3. Uncertainty bounds using sector theory

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Schmidt, David K.

    1989-01-01

    An approach based on sector-stability theory can furnish a description of the uncertainty associated with the frequency response of a model, given sector-bounds on the individual parameters of the model. The application of the sector-based approach to the formulation of useful uncertainty descriptions for linear, time-invariant multivariable systems is presently explored, and the approach is applied to two generic forms of parameter uncertainty in order to investigate its advantages and limitations. The results obtained show that sector-uncertainty bounds can be used to evaluate the impact of parameter uncertainties on the frequency response of the design model.

  4. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.

  5. Address tracing for parallel machines

    NASA Technical Reports Server (NTRS)

    Stunkel, Craig B.; Janssens, Bob; Fuchs, W. Kent

    1991-01-01

    Recently implemented parallel system address-tracing methods based on several metrics are surveyed. The issues specific to collection of traces for both shared and distributed memory parallel computers are highlighted. Five general categories of address-trace collection methods are examined: hardware-captured, interrupt-based, simulation-based, altered microcode-based, and instrumented program-based traces. The problems unique to shared memory and distributed memory multiprocessors are examined separately.

  6. Teaching Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

An earlier paper [2] introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields [2]. Both the Schroedinger field, or "matter field," and the EM field are made of "quanta"—spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

  7. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high-latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid- to low-latitude clean air. One reason is the lower temperatures, which result in increased imprecision in kinetic data, assumed to be best characterized at 298 K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.
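One common way temperature-dependent kinetic imprecision is expressed is a JPL-style uncertainty factor that grows away from 298 K, which is why Antarctic (cold) model uncertainties exceed those of warmer regions. The functional form below follows that convention; the numerical values are illustrative, not evaluated data:

```python
import math

def rate_uncertainty_factor(f298, g, T):
    """Temperature-dependent uncertainty factor for a rate constant,
    in the JPL-evaluation style: f(T) = f(298) * exp(|g| * |1/T - 1/298|).
    f298 and g here are assumed, illustrative values."""
    return f298 * math.exp(abs(g) * abs(1.0 / T - 1.0 / 298.0))

f_warm = rate_uncertainty_factor(1.2, 100.0, 298.0)   # baseline at 298 K
f_cold = rate_uncertainty_factor(1.2, 100.0, 230.0)   # Antarctic-like T
```

Sampling rate constants within such factors in a Monte Carlo run is one standard way to translate kinetic imprecision into model output uncertainty.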

  8. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
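The idea of dispersing a single-point TBE with a maturity-dependent growth distribution can be sketched as follows; the lognormal form and its parameters are illustrative assumptions, not values derived from CADRe:

```python
import numpy as np

def grow_mass(tbe_kg, median_growth=1.2, sigma=0.15, n=10000, rng=None):
    """Disperse a single-point Technical Baseline Estimate (TBE) with a
    lognormal mass-growth factor. The median growth and spread are
    illustrative assumptions, not CADRe-derived values."""
    rng = np.random.default_rng(rng)
    return tbe_kg * rng.lognormal(np.log(median_growth), sigma, n)

early = grow_mass(250.0, sigma=0.25, rng=1)   # early concept: wide spread
late = grow_mass(250.0, sigma=0.05, rng=1)    # near launch: narrow spread
```

Tightening `sigma` as the estimate matures reproduces, qualitatively, the paper's finding that mass growth uncertainty decreases as mass estimate maturity increases.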

  9. Uncertainty in adaptive capacity

    NASA Astrophysics Data System (ADS)

    Adger, W. Neil; Vincent, Katharine

    2005-03-01

The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).

  10. Models in animal collective decision-making: information uncertainty and conflicting preferences.

    PubMed

    Conradt, Larissa

    2012-04-06

    Collective decision-making plays a central part in the lives of many social animals. Two important factors that influence collective decision-making are information uncertainty and conflicting preferences. Here, I bring together, and briefly review, basic models relating to animal collective decision-making in situations with information uncertainty and in situations with conflicting preferences between group members. The intention is to give an overview about the different types of modelling approaches that have been employed and the questions that they address and raise. Despite the use of a wide range of different modelling techniques, results show a coherent picture, as follows. Relatively simple cognitive mechanisms can lead to effective information pooling. Groups often face a trade-off between decision accuracy and speed, but appropriate fine-tuning of behavioural parameters could achieve high accuracy while maintaining reasonable speed. The right balance of interdependence and independence between animals is crucial for maintaining group cohesion and achieving high decision accuracy. In conflict situations, a high degree of decision-sharing between individuals is predicted, as well as transient leadership and leadership according to needs and physiological status. Animals often face crucial trade-offs between maintaining group cohesion and influencing the decision outcome in their own favour. Despite the great progress that has been made, there remains one big gap in our knowledge: how do animals make collective decisions in situations when information uncertainty and conflict of interest operate simultaneously?

  11. Uncertainty and sensitivity analyses of a decision analytic model for posteradication polio risk management.

    PubMed

    Duintjer Tebbens, Radboud J; Pallansch, Mark A; Kew, Olen M; Sutter, Roland W; Bruce Aylward, R; Watkins, Margaret; Gary, Howard; Alexander, James; Jafari, Hamid; Cochi, Stephen L; Thompson, Kimberly M

    2008-08-01

    Decision analytic modeling of polio risk management policies after eradication may help inform decisionmakers about the quantitative tradeoffs implied by various options. Given the significant dynamic complexity and uncertainty involving posteradication decisions, this article aims to clarify the structure of a decision analytic model developed to help characterize the risks, costs, and benefits of various options for polio risk management after eradication of wild polioviruses and analyze the implications of different sources of uncertainty. We provide an influence diagram of the model with a description of each component, explore the impact of different assumptions about model inputs, and present probability distributions of model outputs. The results show that choices made about surveillance, response, and containment for different income groups and immunization policies play a major role in the expected final costs and polio cases. While the overall policy implications of the model remain robust to the variations of assumptions and input uncertainty we considered, the analyses suggest the need for policymakers to carefully consider tradeoffs and for further studies to address the most important knowledge gaps.

  12. Organizational uncertainty and stress among teachers in Hong Kong: work characteristics and organizational justice.

    PubMed

    Hassard, Juliet; Teoh, Kevin; Cox, Tom

    2016-03-30

A growing literature now exists examining the relationship between organizational justice and employees' experience of stress. Despite the growth in this field of enquiry, gaps in knowledge remain, in particular concerning the contribution of perceptions of justice to employees' stress within an organizational context of uncertainty and change, and in relation to the new and emerging concept of procedural-voice justice. The aim of the current study was to examine the main, interaction and additive effects of work characteristics and organizational justice perceptions on employees' experience of stress (as measured by their feelings of helplessness and perceived coping) during an acknowledged period of organizational uncertainty. Questionnaires were distributed among teachers in seven public primary schools in Hong Kong that were under threat of closure (n = 212). Work characteristics were measured using the demand-control-support model. Hierarchical regression analyses observed perceptions of job demands and procedural-voice justice to predict both teachers' feelings of helplessness and perceived coping ability. Furthermore, teachers' perceived coping was predicted by job control and a significant interaction between procedural-voice justice and distributive justice. The addition of organizational justice variables did account for unique variance, but only in relation to the measure of perceived coping. The study concludes that in addition to 'traditional' work characteristics, health promotion strategies should also address perceptions of organizational justice during times of organizational uncertainty; and, in particular, the value and importance of enhancing employees' perceived 'voice' in influencing and shaping justice-related decisions.

  13. Are models, uncertainty, and dispute resolution compatible?

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model, or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  14. Uncertainty Analysis of the Three Pagodas Fault-Source Geometry

    NASA Astrophysics Data System (ADS)

    Haller, K. M.

    2015-12-01

Probabilistic seismic-hazard assessment generally relies on an earthquake catalog (to estimate future seismicity from the locations and rates of past earthquakes) and fault sources (to estimate future seismicity from the known paleoseismic history of surface rupture). The paleoseismic history of potentially active faults in Southeast Asia is addressed at few locations and spans only a few complete recurrence intervals; many faults remain unstudied. Even where the timing of a surface-rupturing earthquake is known, the extent of rupture may not be well constrained. Therefore, subjective judgment of experts is often used to define the three-dimensional size of future ruptures; limited paleoseismic data can lead to large uncertainties in ground-motion hazard from fault sources due to the preferred models that underlie these judgments. The 300-km-long, strike-slip Three Pagodas fault in western Thailand is possibly one of the most active faults in the country. The fault parallels the plate boundary and may be characterized by a slip rate high enough to result in measurable ground motion at periods of interest for building design. The known paleoseismic history is limited and likely does not include the largest possible earthquake on the fault. This lack of knowledge begs the question: what sizes of earthquakes are expected? Preferred rupture models constrain possible magnitude-frequency distributions, and alternative rupture models can result in different ground-motion hazard near the fault. This analysis includes alternative rupture models for the Three Pagodas fault, a first-level check against gross modeling assumptions to assure the source model is a reasonable reflection of observed data, and resulting ground-motion hazard for each alternative. Inadequate paleoseismic data is an important source of uncertainty that could be compensated for by considering alternative rupture models for poorly known seismic sources.

  15. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification to the uncertainty bound for a particular observer, which could therefore be witnessed experimentally by a proper uncertainty game. We first investigate an uncertainty game between a free falling observer and his static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which is dependent on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that quantum information of a qubit can be transferred to quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that the Hawking decoherence cannot counteract entanglement generation after the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion on the black hole firewall paradox in the context of the entropic uncertainty relation is given.

  16. Forensic Entomology: Evaluating Uncertainty Associated With Postmortem Interval (PMI) Estimates With Ecological Models.

    PubMed

    Faris, A M; Wang, H-H; Tarone, A M; Grant, W E

    2016-05-31

Estimates of insect age can be informative in death investigations and, when certain assumptions are met, can be useful for estimating the postmortem interval (PMI). Currently, the accuracy and precision of PMI estimates are unknown, as error can arise from sources of variation such as measurement error, environmental variation, or genetic variation. Ecological models are an abstract, mathematical representation of an ecological system that can make predictions about the dynamics of the real system. To quantify the variation associated with the pre-appearance interval (PAI), we developed an ecological model that simulates the colonization of vertebrate remains by Cochliomyia macellaria (Fabricius) (Diptera: Calliphoridae), a primary colonizer in the southern United States. The model is based on a development data set derived from a local population and represents the uncertainty in local temperature variability to address PMI estimates at local sites. After a PMI estimate is calculated for each individual, the model calculates the maximum, minimum, and mean PMI, as well as the range and standard deviation for the stadia collected. The model framework presented here is one manner by which errors in PMI estimates can be addressed in court when no empirical data are available for the parameter of interest. We show that PAI is a potentially important source of error and that an ecological model is one way to evaluate its impact. Such models can be re-parameterized with any development data set, PAI function, temperature regime, assumption of interest, etc., to estimate PMI and quantify uncertainty that arises from specific prediction systems.
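
    The per-individual summary step described above can be sketched in a few lines. This is a hypothetical simplification of the model's reporting stage, and the `pmi_estimates` values are illustrative, not data from the study:

```python
import statistics

def summarize_pmi(pmi_estimates):
    """Summarize per-individual PMI estimates (hours) the way the
    model's output is described: max, min, mean, range, and std dev."""
    return {
        "max": max(pmi_estimates),
        "min": min(pmi_estimates),
        "mean": statistics.mean(pmi_estimates),
        "range": max(pmi_estimates) - min(pmi_estimates),
        "stdev": statistics.stdev(pmi_estimates),
    }

# Illustrative PMI estimates (hours), one per collected specimen:
print(summarize_pmi([52.0, 49.5, 61.0, 55.5, 58.0]))
```

    Reporting a range and standard deviation alongside the mean is what lets the spread of the estimates, rather than a single point value, be presented in court.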

  17. Risk communication: Uncertainties and the numbers game

    SciTech Connect

    Ortigara, M.

    1995-08-30

The science of risk assessment seeks to characterize the potential risk in situations that may pose hazards to human health or the environment. However, the conclusions reached by the scientists and engineers are not an end in themselves - they are passed on to the involved companies, government agencies, legislators, and the public. All interested parties must then decide what to do with the information. Risk communication is a type of technical communication that involves some unique challenges. This paper first defines the relationships between risk assessment, risk management, and risk communication and then explores two issues in risk communication: addressing uncertainty and putting risk numbers into perspective.

  18. Uncertainties in container failure time predictions

    SciTech Connect

    Williford, R.E.

    1990-01-01

Stochastic variations in the local chemical environment of a geologic waste repository can cause corresponding variations in container corrosion rates and failure times, and thus in radionuclide release rates. This paper addresses how well the future variations in repository chemistries must be known in order to predict container failure times that are bounded by a finite time period within the repository lifetime. Preliminary results indicate that a 5000 year scatter in predicted container failure times requires that repository chemistries be known to within ±10% over the repository lifetime. These are small uncertainties compared to current estimates. 9 refs., 3 figs.

  19. Communicating Storm Surge Forecast Uncertainty

    NASA Astrophysics Data System (ADS)

    Troutman, J. A.; Rhome, J.

    2015-12-01

When it comes to tropical cyclones, storm surge is often the greatest threat to life and property along the coastal United States. The coastal population density has dramatically increased over the past 20 years, putting more people at risk. Informing emergency managers, decision-makers and the public about the potential for wind-driven storm surge, however, has been extremely difficult. Recently, the Storm Surge Unit at the National Hurricane Center in Miami, Florida has developed a prototype experimental storm surge watch/warning graphic to help communicate this threat more effectively by identifying areas most at risk for life-threatening storm surge. This prototype is the initial step in the transition toward a NWS storm surge watch/warning system and highlights the inundation levels that have a 10% chance of being exceeded. The guidance for this product is the Probabilistic Hurricane Storm Surge (P-Surge) model, which predicts the probability of various storm surge heights by statistically evaluating numerous SLOSH model simulations. Questions remain, however, as to whether exceedance values in addition to the 10% may be of equal importance to forecasters. P-Surge data from Hurricane Arthur (2014) are used to ascertain the practicality of incorporating other exceedance data into storm surge forecasts. Extracting forecast uncertainty information through analyzing P-Surge exceedances overlaid with track and wind intensity forecasts proves to be beneficial for forecasters and decision support.
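
    The 10%-exceedance inundation level highlighted by the prototype graphic corresponds to the 90th percentile of an ensemble of simulated surge heights. A minimal sketch, assuming an unweighted ensemble (real P-Surge weights SLOSH members by track and intensity probabilities, which this ignores; all heights are invented):

```python
def exceedance_level(surge_heights, exceed_prob):
    """Surge height exceeded with probability exceed_prob by an
    (unweighted) ensemble of simulated heights."""
    s = sorted(surge_heights)
    # Empirical quantile at (1 - exceed_prob).
    idx = int((1.0 - exceed_prob) * (len(s) - 1))
    return s[idx]

# Illustrative ensemble of simulated peak surge heights (ft):
heights = [1.2, 2.0, 2.4, 2.9, 3.1, 3.5, 4.0, 4.6, 5.2, 6.0]
print(exceedance_level(heights, 0.10))  # → 5.2
```

    Evaluating the same function at other `exceed_prob` values (e.g. 0.5) is the kind of additional exceedance guidance the abstract asks about.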

  20. DOD ELAP Lab Uncertainties

    DTIC Science & Technology

    2012-03-01

IPV6, NLLAP, NEFAP training programs; accreditation for management system certification bodies (ISO/IEC 17021) that certify to ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US automotive), etc.; the DoD QSM 4.2 standard; and ISO/IEC 17025:2005. Each has uncertainty... Presented at the 9th Annual DoD Environmental Monitoring and Data Quality (EDMQ) Workshop, held 26-29 March 2012 in La Jolla, CA, U.S.

  1. Generalized uncertainty relations

    NASA Astrophysics Data System (ADS)

    Herdegen, Andrzej; Ziobro, Piotr

    2017-04-01

    The standard uncertainty relations (UR) in quantum mechanics are typically used for unbounded operators (like the canonical pair). This implies the need for the control of the domain problems. On the other hand, the use of (possibly bounded) functions of basic observables usually leads to more complex and less readily interpretable relations. In addition, UR may turn trivial for certain states if the commutator of observables is not proportional to a positive operator. In this letter we consider a generalization of standard UR resulting from the use of two, instead of one, vector states. The possibility to link these states to each other in various ways adds additional flexibility to UR, which may compensate some of the above-mentioned drawbacks. We discuss applications of the general scheme, leading not only to technical improvements, but also to interesting new insight.

2. Uncertainty as Certainty

    NASA Astrophysics Data System (ADS)

    Petzinger, Tom

I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  3. Medical decisions under uncertainty.

    PubMed

    Carmi, A

    1993-01-01

    The court applies the criteria of the reasonable doctor and common practice in order to consider the behaviour of a defendant physician. The meaning of our demand that the doctor expects that his or her acts or omissions will bring about certain implications is that, according to the present circumstances and subject to the limited knowledge of the common practice, the course of certain events or situations in the future may be assumed in spite of the fog of uncertainty which surrounds us. The miracles and wonders of creation are concealed from us, and we are not aware of the way and the nature of our bodily functioning. Therefore, there seems to be no way to avoid mistakes, because in several cases the correct diagnosis cannot be determined even with the most advanced application of all information available. Doctors find it difficult to admit that they grope in the dark. They wish to form clear and accurate diagnoses for their patients. The fact that their profession is faced with innumerable and unavoidable risks and mistakes is hard to swallow, and many of them claim that in their everyday work this does not happen. They should not content themselves by changing their style. A radical metamorphosis is needed. They should not be tempted to formulate their diagnoses in 'neutral' statements in order to be on the safe side. Uncertainty should be accepted and acknowledged by the profession and by the public at large as a human phenomenon, as an integral part of any human decision, and as a clear characteristic of any legal or medical diagnosis.(ABSTRACT TRUNCATED AT 250 WORDS)

  4. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    SciTech Connect

    Amoroso, Richard L.

    2010-12-22

For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy dependent spacetime metric, M4±C4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.

  5. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Amoroso, Richard L.

    2010-12-01

For a hundred years, violating or surmounting the Quantum Uncertainty Principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of Quantum Theory cast in a string theoretic Higher Dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy dependent spacetime metric, M̂4±C4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac Effect RF pulsed Laser Oscillated Vacuum Energy Resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer Transactional Calabi-Yau mirror symmetric spacetime backcloth.

  6. Incorporating Model Parameter Uncertainty into Prostate IMRT Treatment Planning

    DTIC Science & Technology

    2005-04-01

Distribution Unlimited. The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an... Incorporating Model Parameter Uncertainty into Prostate IMRT Treatment Planning (DAMD17-03-1-0019). Author: David Y. Yang, Ph.D. Performing organization: Stanford University, Stanford, California 94305-5401. E-Mail: yong@reyes.stanford

  7. Identifying the crystal graveyards remaining after large silicic eruptions

    NASA Astrophysics Data System (ADS)

    Gelman, Sarah E.; Deering, Chad D.; Bachmann, Olivier; Huber, Christian; Gutiérrez, Francisco J.

    2014-10-01

    The formation of crystal-poor high-silica rhyolite via extraction of interstitial melt from an upper crustal mush predicts the complementary formation of large amounts of (typically unerupted) silicic cumulates. However, identification of these cumulates remains controversial. One hindrance to our ability to identify them is a lack of clear predictions for complementary chemical signatures between extracted melts and their residues. To address this discrepancy, we present a generalized geochemical model tracking the evolution of trace elements in a magma reservoir concurrently experiencing crystallization and extraction of interstitial melt. Our method uses a numerical solution rather than analytical, thereby allowing for various dependencies between crystallinity, partition coefficients for variably compatible and/or incompatible elements, and melt extraction efficiency. Results reveal unambiguous fractionation signatures for the extracted melts, while those signatures are muted for their cumulate counterparts. Our model is first applied to a well-constrained example (Searchlight pluton, USA), and provides a good fit to geochemical data. We then extrapolate our results to understanding the relationship between volcanic and plutonic silicic suites on a global scale. Utilizing the NAVDAT database to identify crystal accumulation or depletion signatures for each suite, we suggest that many large granitoids are indeed silicic cumulates, although their crystal accumulation signature is expected to be subtle.

  8. Coalition Formation under Uncertainty

    DTIC Science & Technology

    2010-03-01

Air Force Institute of Technology, Air University, Air Education and Training Command; in partial fulfillment of the requirements for the degree of... Accepted: M.U. Thomas, PhD, Dean, Graduate School of Engineering and Management (AFIT/DEE/ENG/10-05). Abstract: Many multiagent systems require allocation... and artificial intelligence. The first direct address of cooperative game theory is in von Neumann and Morgenstern's book, Theory of Games and Economic Behavior

  9. Impact of discharge data uncertainty on nutrient load uncertainty

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

    Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis to take important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorous and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorous and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
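
    The propagation chain described above (sample a rating curve, convert stage to discharge, combine with concentrations into a load, repeat) can be sketched with a toy power-law rating curve q = a·h^b. All parameter distributions and series here are invented for illustration and are far simpler than the Voting Point method:

```python
import random
import statistics

random.seed(1)

stages = [0.8, 1.1, 1.5, 1.3, 0.9]        # water levels h (m), illustrative
concs = [0.05, 0.04, 0.06, 0.05, 0.05]    # phosphorus concentrations, illustrative

def sample_rating_curve():
    # One plausible (a, b) pair for q = a * h**b; the spreads are invented.
    return random.gauss(10.0, 1.0), random.gauss(1.8, 0.1)

loads = []
for _ in range(5000):
    a, b = sample_rating_curve()
    # Load ~ sum of discharge * concentration over the series.
    loads.append(sum(a * h ** b * c for h, c in zip(stages, concs)))

# The spread of `loads` is the load uncertainty induced by the rating curve.
print(statistics.mean(loads), statistics.stdev(loads))
```

    The study's 40,000 sampled curves play the role of the 5,000 (a, b) draws here: one load realisation per curve, yielding a distribution of yearly loads rather than a single estimate.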

  10. Uncertainty and Surprise: An Introduction

    NASA Astrophysics Data System (ADS)

    McDaniel, Reuben R.; Driebe, Dean J.

    Much of the traditional scientific and applied scientific work in the social and natural sciences has been built on the supposition that the unknowability of situations is the result of a lack of information. This has led to an emphasis on uncertainty reduction through ever-increasing information seeking and processing, including better measurement and observational instrumentation. Pending uncertainty reduction through better information, efforts are devoted to uncertainty management and hierarchies of controls. A central goal has been the avoidance of surprise.

  11. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  12. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  13. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  14. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  15. New approaches to uncertainty analysis for use in aggregate and cumulative risk assessment of pesticides.

    PubMed

    Kennedy, Marc C; van der Voet, Hilko; Roelofs, Victoria J; Roelofs, Willem; Glass, C Richard; de Boer, Waldo J; Kruisselbrink, Johannes W; Hart, Andy D M

    2015-05-01

    Risk assessments for human exposures to plant protection products (PPPs) have traditionally focussed on single routes of exposure and single compounds. Extensions to estimate aggregate (multi-source) and cumulative (multi-compound) exposure from PPPs present many new challenges and additional uncertainties that should be addressed as part of risk analysis and decision-making. A general approach is outlined for identifying and classifying the relevant uncertainties and variabilities. The implementation of uncertainty analysis within the MCRA software, developed as part of the EU-funded ACROPOLIS project to address some of these uncertainties, is demonstrated. An example is presented for dietary and non-dietary exposures to the triazole class of compounds. This demonstrates the chaining of models, linking variability and uncertainty generated from an external model for bystander exposure with variability and uncertainty in MCRA dietary exposure assessments. A new method is also presented for combining pesticide usage survey information with limited residue monitoring data, to address non-detect uncertainty. The results show that incorporating usage information reduces uncertainty in parameters of the residue distribution but that in this case quantifying uncertainty is not a priority, at least for UK grown crops. A general discussion of alternative approaches to treat uncertainty, either quantitatively or qualitatively, is included.

  16. Higher-order uncertainty relations

    NASA Astrophysics Data System (ADS)

    Wünsche, A.

    2006-07-01

Using the non-negativity of Gram determinants of arbitrary order, we derive higher-order uncertainty relations for the symmetric uncertainty matrices of corresponding order n > 2 to n Hermitean operators (n = 2 is the usual case). The special cases of third-order and fourth-order uncertainty relations are considered in detail. The obtained third-order uncertainty relations are applied to the Lie groups SU(1,1) with three Hermitean basis operators (K1,K2,K0) and SU(2) with three Hermitean basis operators (J1,J2,J3) where, in particular, the group-coherent states of Perelomov type and of Barut-Girardello type for SU(1,1) and the spin or atomic coherent states for SU(2) are investigated. The uncertainty relations for the determinant of the third-order uncertainty matrix are satisfied with the equality sign for coherent states and this determinant becomes vanishing for the Perelomov type of coherent states for SU(1,1) and SU(2). As an example of the application of fourth-order uncertainty relations, we consider the canonical operators (Q1,P1,Q2,P2) of two boson modes and the corresponding uncertainty matrix formed by the operators of the corresponding mean deviations, taking into account the correlations between the two modes. In two mathematical appendices, we prove the non-negativity of the determinant of correlation matrices of arbitrary order and clarify the principal structure of higher-order uncertainty relations.
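
    The starting point of the abstract, the non-negativity of Gram determinants, can be written out for the third-order case (notation mine; a sketch of the general idea rather than the paper's exact formulation):

```latex
% Gram matrix of the mean-deviation vectors
% |\psi_j\rangle = (A_j - \langle A_j \rangle)\,|\psi\rangle
% for Hermitean operators A_1, A_2, A_3:
G_{jk} = \bigl\langle (A_j - \langle A_j\rangle)(A_k - \langle A_k\rangle) \bigr\rangle ,
\qquad j,k = 1,2,3 .
% As a Gram matrix, G is positive semidefinite, hence
\det G \ge 0 ,
% which for n = 2 reduces to the familiar Robertson-Schroedinger relation.
```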

  17. Simplified propagation of standard uncertainties

    SciTech Connect

    Shull, A.H.

    1997-06-09

An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or can one calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
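
    The shortcut the abstract describes, grouping components into absolute and relative subgroups and combining each in quadrature, can be sketched as follows (the function name and the example numbers are mine, not from the paper):

```python
import math

def combine_quadrature(components):
    """Root-sum-square combination of independent uncertainty components."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical prepared standard with value 100.0 units:
value = 100.0
rel = combine_quadrature([0.002, 0.001])   # relative components (fractions)
abs_u = combine_quadrature([0.05, 0.02])   # absolute components (units)
# Convert the relative subgroup to units, then combine the two subgroups.
total = math.sqrt((rel * value) ** 2 + abs_u ** 2)
print(round(total, 4))  # → 0.23
```

    Keeping the two subgroups separate until the final step is what makes the calculation spreadsheet-friendly: no partial derivatives are needed as long as the components are independent.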

  18. Every Other Day. Keynote Address.

    ERIC Educational Resources Information Center

    Tiller, Tom

    Schools need to be reoriented and restructured so that what is taught and learned, and the way in which it is taught and learned, are better integrated with young people's real-world experiences. Many indicators suggest that the meaningful aspects of school have been lost in the encounter with modern times. The title of this address--"Every…

  19. Agenda to address climate change

    SciTech Connect

    1998-10-01

    This document looks at addressing climate change in the 21st century. Topics covered are: Responding to climate change; exploring new avenues in energy efficiency; energy efficiency and alternative energy; residential sector; commercial sector; industrial sector; transportation sector; communities; renewable energy; understanding forests to mitigate and adapt to climate change; the Forest Carbon budget; mitigation and adaptation.

  20. Addressing Phonological Questions with Ultrasound

    ERIC Educational Resources Information Center

    Davidson, Lisa

    2005-01-01

    Ultrasound can be used to address unresolved questions in phonological theory. To date, some studies have shown that results from ultrasound imaging can shed light on how differences in phonological elements are implemented. Phenomena that have been investigated include transitional schwa, vowel coalescence, and transparent vowels. A study of…

  1. Keynote Address: Rev. Mark Massa

    ERIC Educational Resources Information Center

    Massa, Mark S.

    2011-01-01

    Rev. Mark S. Massa, S.J., is the dean and professor of Church history at the School of Theology and Ministry at Boston College. He was invited to give a keynote to begin the third Catholic Higher Education Collaborative Conference (CHEC), cosponsored by Boston College and Fordham University. Fr. Massa's address posed critical questions about…

  2. State of the Lab Address

    ScienceCinema

    King, Alex

    2016-07-12

    In his third-annual State of the Lab address, Ames Laboratory Director Alex King called the past year one of "quiet but strong progress" and called for Ames Laboratory to continue to build on its strengths while responding to changing expectations for energy research.

  3. Matching Alternative Addresses: a Semantic Web Approach

    NASA Astrophysics Data System (ADS)

    Ariannamazi, S.; Karimipour, F.; Hakimpour, F.

    2015-12-01

    Rapid development of crowd-sourcing or volunteered geographic information (VGI) provides opportunities for authorities that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has increased the number of POIs in OpenStreetMap, a well-known VGI source, causing its datasets to become outdated in short periods of time. These changes to spatial and aspatial attributes of features, such as names and addresses, can cause confusion or ambiguity in processes that rely on features' literal information, such as addressing and geocoding. VGI sources will neither conform to specific vocabularies nor remain in a specific schema for long periods of time. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology, pursuing an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. A vast amount of literal-spatial data demonstrates the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration and lexical annotation matching and disambiguation.

  4. Uncertainty analysis for a field-scale P loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  5. Methods for exploring uncertainty in groundwater management predictions

    USGS Publications Warehouse

    Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S; Jakeman, Anthony J; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew

    2016-01-01

    Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.

  6. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    SciTech Connect

    Gregory, Julie J.; Harper, Frederick T.

    1999-07-28

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry.

  7. Qualitative Representation and Reasoning with Uncertainty in Space and Time

    NASA Astrophysics Data System (ADS)

    El-Geresy, Baher A.; Abdelmoty, Alia I.

    Imprecision, indeterminacy and vagueness are all terms that have recently been studied in the representation of entities in space and time. The interest has arisen from the fact that, in many cases, precise information about objects in space is not available. In this paper a study of spatial uncertainty is presented and extended to temporal uncertainty. Different types and modes of uncertainty are identified. A unified framework is presented for representation and reasoning over uncertain qualitative domains. The method addresses some of the main limitations of current approaches. It is shown to apply to different types of entities of arbitrary complexity with total or partial uncertainty. The approach is part of a comprehensive research program aimed at developing a unified, complete theory for qualitative spatial and temporal domains.

  8. Analysis and Reduction of Complex Networks Under Uncertainty

    SciTech Connect

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.
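A toy non-intrusive polynomial chaos projection (illustrative only; it does not reproduce the project's CSP machinery) shows how PC coefficients encode the statistics of an uncertain quantity: expanding f(ξ) = ξ² for ξ ~ N(0,1) in probabilists' Hermite polynomials recovers mean 1 and variance 2 exactly.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Gauss-HermiteE quadrature nodes/weights; normalize weights so that
# sum(w) = 1, turning the quadrature into an expectation E[.].
nodes, weights = He.hermegauss(10)
weights = weights / np.sqrt(2 * np.pi)

def pc_coeff(f, k):
    # Spectral projection: c_k = E[f(xi) * He_k(xi)] / k!
    basis = He.hermeval(nodes, np.eye(k + 1)[k])
    return float(np.sum(weights * f(nodes) * basis)) / math.factorial(k)

f = lambda x: x ** 2
c = [pc_coeff(f, k) for k in range(4)]
mean = c[0]                                                   # = 1
var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, 4)) # = 2
print(round(mean, 6), round(var, 6))
```

Because ξ² = He₂(ξ) + 1, the expansion is exact with coefficients c₀ = 1, c₂ = 1 and all others zero, so the quadrature recovers the true statistics to machine precision.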

  9. Assessing Uncertainty in Subsurface Transport Predictions Using the ASCEM Toolset

    NASA Astrophysics Data System (ADS)

    Freedman, V.; Chen, X.; Keating, E. H.; Higdon, D. M.; Rockhold, M. L.; Schuchardt, K. L.; Finsterle, S.; Gorton, I.; Freshley, M.

    2011-12-01

    Transport simulation of nonreactive solutes can be used to identify potential pathways of contaminants in the vadose zone and the effectiveness of site remediation technologies. At the BC Cribs site at Hanford in southeastern Washington State, innovative remedial technologies are being explored to address recalcitrant contamination in the deep (~100 m) vadose zone. To identify the effectiveness of the technologies, the impacts of a "no-action" alternative must also be explored. Because only sparse information is available for the geologic conceptual model and the physical and chemical properties of the sediments, there is considerable uncertainty in subsurface transport predictions. In this contribution, the uncertainty of the technetium-99 mass flux to the water table due to parameter uncertainty and variations in the conceptual model is investigated using a newly developed toolset for performing an uncertainty quantification (UQ) analysis. This toolset is part of ASCEM (Advanced Simulation Capability for Environmental Management), a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. Using the Akuna user environment currently under development, the uncertainty in technetium-99 transport through a two-dimensional, heterogeneous vadose-zone system is quantified with Monte Carlo simulation. Results show that, within a single conceptual model, uncertainty in hydraulic properties can produce significant uncertainty in simulated mass fluxes, and that conceptual model variation can introduce significant additional uncertainty.
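The Monte Carlo step can be sketched generically. The surrogate below is hypothetical (a simple advective travel-time model, not the ASCEM simulators), but it illustrates how sampled hydraulic-property uncertainty propagates into spread in a predicted outcome:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000
# Uncertain inputs: hydraulic conductivity (log-normal) and porosity
K = rng.lognormal(mean=np.log(1e-6), sigma=1.0, size=n)  # m/s
phi = rng.uniform(0.25, 0.40, size=n)                    # porosity (-)
depth, gradient = 100.0, 0.01                            # m, m/m

# Advective travel time to the water table for each parameter sample
t_years = depth * phi / (K * gradient) / (365.25 * 24 * 3600)
lo, hi = np.percentile(t_years, [5, 95])
print(f"90% interval for arrival time: {lo:.0f}-{hi:.0f} years")
```

The spread of the output ensemble (here a 5th-95th percentile interval) is the parameter-uncertainty contribution; repeating the exercise for each alternative conceptual model exposes the additional, model-structure contribution.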

  10. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  11. Uncertainties in nuclear fission data

    NASA Astrophysics Data System (ADS)

    Talou, Patrick; Kawano, Toshihiko; Chadwick, Mark B.; Neudecker, Denise; Rising, Michael E.

    2015-03-01

    We review the current status of our knowledge of nuclear fission data, and quantify uncertainties related to each fission observable whenever possible. We also discuss the roles that theory and experiment play in reducing those uncertainties, contributing to the improvement of our fundamental understanding of the nuclear fission process as well as of evaluated nuclear data libraries used in nuclear applications.

  12. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  13. A framework for modeling anthropogenic impacts on waterbird habitats: addressing future uncertainty in conservation planning

    USGS Publications Warehouse

    Matchett, Elliott L.; Fleskes, Joseph P.; Young, Charles A.; Purkey, David R.

    2015-01-01

    The amount and quality of natural resources available for terrestrial and aquatic wildlife habitats are expected to decrease throughout the world in areas that are intensively managed for urban and agricultural uses. Changes in climate and management of increasingly limited water supplies may further impact water resources essential for sustaining habitats. In this report, we document adapting a Water Evaluation and Planning (WEAP) system model for the Central Valley of California. We demonstrate using this adapted model (WEAP-CVwh) to evaluate impacts produced from plausible future scenarios on agricultural and wetland habitats used by waterbirds and other wildlife. Processed output from WEAP-CVwh indicated varying levels of impact caused by projected climate, urbanization, and water supply management in scenarios used to exemplify this approach. Among scenarios, the NCAR-CCSM3 A2 climate projection had a greater impact than the CNRM-CM3 B1 climate projection, whereas expansive urbanization had a greater impact than strategic urbanization, on annual availability of waterbird habitat. Scenarios including extensive rice-idling or substantial instream flow requirements on important water supply sources produced large impacts on annual availability of waterbird habitat. In the year corresponding with the greatest habitat reduction for each scenario, the scenario including instream flow requirements resulted in the greatest decrease in habitats throughout all months of the wintering period relative to other scenarios. This approach provides a new and useful tool for habitat conservation planning in the Central Valley and a model to guide similar research investigations aiming to inform conservation, management, and restoration of important wildlife habitats.

  14. Evaluating Health Risks from Inhaled Polychlorinated Biphenyls: Research Needs for Addressing Uncertainty

    EPA Science Inventory

    Indoor air polychlorinated biphenyl (PCB) concentrations in some U.S. schools are one or more orders of magnitude higher than background levels. In response to this, efforts have been made to assess the potential health risk posed by inhaled PCBs. These efforts are hindered by un...

  15. Towards Algorithmic Advances for Solving Stackelberg Games: Addressing Model Uncertainties and Massive Game Scale-up

    DTIC Science & Technology

    2015-02-04

    This project opens up a brand new area of research that fuses two separate subareas of game theory: algorithmic game theory and behavioral game theory. More specifically, game-theoretic algorithms have been deployed by several security agencies, allowing them to generate optimal randomized schedules against adversaries who may exploit predictability. However, one key challenge in applying game theory to solving real…

  16. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program-approved MGA depletion schedule in the Mass Properties Control Plan. If the specification is an NTE (not-to-exceed) value, MGA is inclusive of actual MGA (A5 & A6). If the specification is not an NTE value (e.g., nominal), then MGA values are reduced by A5 values and A5 represents the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with NO embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
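The bookkeeping above can be made concrete with a small hypothetical example (subsystem names and values invented for illustration):

```python
# Predicted Mass = Basic Mass * (1 + MGA fraction), with no margin
# embedded in the Basic (engineering-estimate) mass.
subsystems = {            # name: (basic_mass_kg, mga_fraction)
    "structure": (120.0, 0.12),
    "avionics":  (35.0,  0.25),
    "harness":   (18.0,  0.30),
}

basic_total = sum(b for b, _ in subsystems.values())
predicted_total = sum(b * (1 + m) for b, m in subsystems.values())
aggregate_mga = (predicted_total - basic_total) / basic_total
print(f"aggregate MGA: {aggregate_mga:.1%}")
```

Note that the aggregate MGA percentage is a mass-weighted average of the subsystem fractions, not a simple mean, which is why it must be computed from the aggregate predicted and basic masses.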

  17. Atomic clusters with addressable complexity

    NASA Astrophysics Data System (ADS)

    Wales, David J.

    2017-02-01

    A general formulation for constructing addressable atomic clusters is introduced, based on one or more reference structures. By modifying the well depths in a given interatomic potential in favour of nearest-neighbour interactions that are defined in the reference(s), the potential energy landscape can be biased to make a particular permutational isomer the global minimum. The magnitude of the bias changes the resulting potential energy landscape systematically, providing a framework to produce clusters that should self-organise efficiently into the target structure. These features are illustrated for small systems, where all the relevant local minima and transition states can be identified, and for the low-energy regions of the landscape for larger clusters. For a 55-particle cluster, it is possible to design a target structure from a transition state of the original potential and to retain this structure in a doubly addressable landscape. Disconnectivity graphs based on local minima that have no direct connections to a lower minimum provide a helpful way to visualise the larger databases. These minima correspond to the termini of monotonic sequences, which always proceed downhill in terms of potential energy, and we identify them as a class of biminimum. Multiple copies of the target cluster are treated by adding a repulsive term between particles with the same address to maintain distinguishable targets upon aggregation. By tuning the magnitude of this term, it is possible to create assemblies of the target cluster corresponding to a variety of structures, including rings and chains.

  18. Uncertainty in Integrated Assessment Scenarios

    SciTech Connect

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. 
Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean
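The approach described — using historical variability to set the spread and expert judgment to set the central path — can be sketched as follows, with invented growth-rate data:

```python
import numpy as np

# Hypothetical historical labor-productivity growth rates (fractions)
historical_growth = np.array([0.021, 0.015, 0.030, 0.008, 0.019,
                              0.025, 0.012, 0.027, 0.017, 0.022])
sigma = historical_growth.std(ddof=1)     # observed variability
expert_median = 0.018                     # expert-judged central path

# Probability distribution for future growth: observed spread,
# re-centered on the expert-judged median
rng = np.random.default_rng(1)
samples = rng.normal(expert_median, sigma, size=5000)
print(f"5th-95th percentile growth: "
      f"{np.percentile(samples, 5):.3f} to {np.percentile(samples, 95):.3f}")
```

This separates the two judgments the abstract describes: the variance comes from data (optionally adjusted if future variability is expected to differ from the past), while the central tendency comes from expert elicitation.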

  19. Equivalence theorem of uncertainty relations

    NASA Astrophysics Data System (ADS)

    Li, Jun-Li; Qiao, Cong-Feng

    2017-01-01

    We present an equivalence theorem to unify the two classes of uncertainty relations, i.e. the variance-based ones and the entropic forms, showing that the entropy of an operator in a quantum system can be built from the variances of a set of commutative operators. This means that an uncertainty relation in the language of entropy may be mapped onto a variance-based one, and vice versa. Employing the equivalence theorem, alternative formulations of entropic uncertainty relations are obtained for the qubit system that are stronger than the existing ones in the literature, and variance-based uncertainty relations for spin systems are reached from the corresponding entropic uncertainty relations.

  20. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-03

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropy form, related to entropic quantities. Both of these forms are inequalities involving pairwise observables, and they are nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for variances of observables are quantum-state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.
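The two pairwise classes referred to above can be made concrete: the variance-based (Robertson) relation and the entropic (Maassen-Uffink) relation for a pair of observables read

```latex
% Variance-based (Robertson) form for observables A, B:
\sigma_A \, \sigma_B \;\geq\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|
% Entropic (Maassen--Uffink) form, where
% c = \max_{i,j} |\langle a_i | b_j \rangle| over the eigenbases of A, B:
H(A) + H(B) \;\geq\; -2 \log c
```

Both bound only a pair of observables at a time — the Robertson bound is also state dependent and can vanish trivially — which is the limitation the new relation is designed to overcome.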

  1. Photoferrotrophy: Remains of an Ancient Photosynthesis in Modern Environments.

    PubMed

    Camacho, Antonio; Walter, Xavier A; Picazo, Antonio; Zopfi, Jakob

    2017-01-01

    Photoferrotrophy, the process by which inorganic carbon is fixed into organic matter using light as an energy source and reduced iron [Fe(II)] as an electron donor, has been proposed as one of the oldest photoautotrophic metabolisms on Earth. Under the iron-rich (ferruginous) but sulfide poor conditions dominating the Archean ocean, this type of metabolism could have accounted for most of the primary production in the photic zone. Here we review the current knowledge of biogeochemical, microbial and phylogenetic aspects of photoferrotrophy, and evaluate the ecological significance of this process in ancient and modern environments. From the ferruginous conditions that prevailed during most of the Archean, the ancient ocean evolved toward euxinic (anoxic and sulfide rich) conditions and, finally, much after the advent of oxygenic photosynthesis, to a predominantly oxic environment. Under these new conditions photoferrotrophs lost importance as primary producers, and now photoferrotrophy remains as a vestige of a formerly relevant photosynthetic process. Apart from the geological record and other biogeochemical markers, modern environments resembling the redox conditions of these ancient oceans can offer insights into the past significance of photoferrotrophy and help to explain how this metabolism operated as an important source of organic carbon for the early biosphere. Iron-rich meromictic (permanently stratified) lakes can be considered as modern analogs of the ancient Archean ocean, as they present anoxic ferruginous water columns where light can still be available at the chemocline, thus offering suitable niches for photoferrotrophs. A few bacterial strains of purple bacteria as well as of green sulfur bacteria have been shown to possess photoferrotrophic capacities, and hence, could thrive in these modern Archean ocean analogs. Studies addressing the occurrence and the biogeochemical significance of photoferrotrophy in ferruginous environments have been

  2. Future Remains: Industrial Heritage at the Hanford Plutonium Works

    NASA Astrophysics Data System (ADS)

    Freer, Brian

    This dissertation argues that U.S. environmental and historic preservation regulations, industrial heritage projects, history, and art only provide partial frameworks for successfully transmitting an informed story into the long range future about nuclear technology and its related environmental legacy. This argument is important because plutonium from nuclear weapons production is toxic to humans in very small amounts, threatens environmental health, has a half-life of 24,110 years and because the industrial heritage project at Hanford is the first time an entire U.S. Department of Energy weapons production site has been designated a U.S. Historic District. This research is situated within anthropological interest in industrial heritage studies, environmental anthropology, applied visual anthropology, as well as wider discourses on nuclear studies. However, none of these disciplines is really designed or intended to be a completely satisfactory frame of reference for addressing this perplexing challenge of documenting and conveying an informed story about nuclear technology and its related environmental legacy into the long range future. Others have thought about this question and have made important contributions toward a potential solution. Examples here include: future generations movements concerning intergenerational equity as evidenced in scholarship, law, and amongst Native American groups; Nez Perce and Confederated Tribes of the Umatilla Indian Reservation responses to the Hanford End State Vision and Hanford's Canyon Disposition Initiative; as well as the findings of organizational scholars on the advantages realized by organizations that have a long term future perspective. While these ideas inform the main line inquiry of this dissertation, the principal approach put forth by the researcher of how to convey an informed story about nuclear technology and waste into the long range future is implementation of the proposed Future Remains clause, as

  3. Photoferrotrophy: Remains of an Ancient Photosynthesis in Modern Environments

    PubMed Central

    Camacho, Antonio; Walter, Xavier A.; Picazo, Antonio; Zopfi, Jakob

    2017-01-01

    Photoferrotrophy, the process by which inorganic carbon is fixed into organic matter using light as an energy source and reduced iron [Fe(II)] as an electron donor, has been proposed as one of the oldest photoautotrophic metabolisms on Earth. Under the iron-rich (ferruginous) but sulfide poor conditions dominating the Archean ocean, this type of metabolism could have accounted for most of the primary production in the photic zone. Here we review the current knowledge of biogeochemical, microbial and phylogenetic aspects of photoferrotrophy, and evaluate the ecological significance of this process in ancient and modern environments. From the ferruginous conditions that prevailed during most of the Archean, the ancient ocean evolved toward euxinic (anoxic and sulfide rich) conditions and, finally, much after the advent of oxygenic photosynthesis, to a predominantly oxic environment. Under these new conditions photoferrotrophs lost importance as primary producers, and now photoferrotrophy remains as a vestige of a formerly relevant photosynthetic process. Apart from the geological record and other biogeochemical markers, modern environments resembling the redox conditions of these ancient oceans can offer insights into the past significance of photoferrotrophy and help to explain how this metabolism operated as an important source of organic carbon for the early biosphere. Iron-rich meromictic (permanently stratified) lakes can be considered as modern analogs of the ancient Archean ocean, as they present anoxic ferruginous water columns where light can still be available at the chemocline, thus offering suitable niches for photoferrotrophs. A few bacterial strains of purple bacteria as well as of green sulfur bacteria have been shown to possess photoferrotrophic capacities, and hence, could thrive in these modern Archean ocean analogs. Studies addressing the occurrence and the biogeochemical significance of photoferrotrophy in ferruginous environments have been

  4. Robustness and uncertainties in global water scarcity projections

    NASA Astrophysics Data System (ADS)

    Floerke, Martina; Eisner, Stephanie; Hanasaki, Naota; Wada, Yoshihide

    2014-05-01

    Water scarcity is both a natural and human-made phenomenon and defined as the condition where there are insufficient water resources to satisfy long-term average requirements. Many regions of the world are affected by this chronic imbalance between renewable water resources and water demand leading to depletion of surface water and groundwater stocks. Total freshwater abstraction today amounts to 3856 km³, of which 70% is withdrawn by the agricultural sector, followed by the industry (19%) and domestic sectors (11%) (FAO 2010). Population growth and consumption change have led to a threefold increase in total water withdrawals in the last 60 years through a rising demand for electricity, industrial and agricultural products, and thus for water (Flörke et al. 2013). The newly developed "Shared Socio-Economic Pathways" (SSPs) project global population to increase up to 7.2 or even 14 billion people by 2100 (O'Neill et al. 2012); and meeting future water demand in sufficient quantity and quality is seen as one of the key challenges of the 21st century. So far, the assessment of regional and global water-scarcity patterns has mostly focused on climate change impacts by driving global hydrological models with climate projections from different GCMs, while little emphasis has been put on the water demand side. Changes in future water scarcity, however, are found to be mainly driven by changes in water withdrawals (Alcamo et al. 2007, Hanasaki et al. 2012), i.e. sensitivity to climate change outweighs exposure. Likewise, uncertainties have mainly been assessed in relation to the spread among climate scenarios and from global hydrological models (GHMs) (Haddeland et al. 2011, 2013; Schewe et al. 2013, Wada et al. 2013), while the contribution of water-use modelling to total uncertainty remains largely unstudied. The main objective of this study is to address the main uncertainties related to both climate and socio-economic impacts on global and regional water scarcity

  5. Perturbed dynamics of discrete-time switched nonlinear systems with delays and uncertainties.

    PubMed

    Liu, Xingwen; Cheng, Jun

    2016-05-01

    This paper addresses the dynamics of a class of discrete-time switched nonlinear systems with time-varying delays and uncertainties and subject to perturbations. It is assumed that the nominal switched nonlinear system is robustly uniformly exponentially stable. It is shown that there exists a maximal Lipschitz constant: if the perturbation satisfies a Lipschitz condition with any Lipschitz constant less than this maximum, then the perturbed system preserves the stability property of the nominal system. In situations where the perturbations are known, it is proved that there exists an upper bound on a scaling coefficient such that the perturbed system remains exponentially stable provided the perturbation is scaled by any coefficient below this bound. A numerical example is provided to illustrate the proposed theoretical results.

  6. Dynamic diagnostic and decision procedures under uncertainty

    SciTech Connect

    Baranov, V.V.

    1995-01-01

    In this paper, we consider uncertainty that arises when the true state x ∈ E is not accessible to direct observation and remains unknown. Instead, we observe some features θ ∈ Θ that carry a certain information about the true state. This information is described by the conditional distribution P(Θ|E), which we call the linkage distribution. Regarding this distribution we assume that it exists but is unknown. This leads to uncertainty with respect to states from E and the linkage distribution P(Θ|E), which we denote by NEP. The substantive problem can be stated as follows: from observations of the features θ ∈ Θ made at each time instant n = 1, 2, ..., recognize the state x ∈ E, identify the linkage distribution P, and use the results of recognition and identification to choose a decision y ∈ Y so that the decision process is optimal in some sense. State recognition is the subject of diagnostics. The uncertainty NEP thus generates a problem of diagnostics and dynamic decision making.

  7. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, the quantum amplification limit refers to the property of inevitable noise addition on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition under which probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  8. Detectability and Interpretational Uncertainties: Considerations in Gauging the Impacts of Land Disturbance on Streamflow

    EPA Science Inventory

    Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...

  9. New Evidence Links Stellar Remains to Oldest Recorded Supernova

    NASA Astrophysics Data System (ADS)

    2006-09-01

    Recent observations have uncovered evidence that helps to confirm the identification of the remains of one of the earliest stellar explosions recorded by humans. The new study shows that the supernova remnant RCW 86 is much younger than previously thought. As such, the formation of the remnant appears to coincide with a supernova observed by Chinese astronomers in 185 A.D. The study used data from NASA's Chandra X-ray Observatory and the European Space Agency's XMM-Newton Observatory. "There have been previous suggestions that RCW 86 is the remains of the supernova from 185 A.D.," said Jacco Vink of University of Utrecht, the Netherlands, and lead author of the study. "These new X-ray data greatly strengthen the case." When a massive star runs out of fuel, it collapses on itself, creating a supernova that can outshine an entire galaxy. The intense explosion hurls the outer layers of the star into space and produces powerful shock waves. The remains of the star and the material it encounters are heated to millions of degrees and can emit intense X-ray radiation for thousands of years. In their stellar forensic work, Vink and colleagues studied the debris in RCW 86 to estimate when its progenitor star originally exploded. They calculated how quickly the shocked, or energized, shell is moving in RCW 86 by studying one part of the remnant. They combined this expansion velocity with the size of the remnant and a basic understanding of how supernovas expand to estimate the age of RCW 86. "Our new calculations tell us the remnant is about 2,000 years old," said Aya Bamba, a coauthor from the Institute of Physical and Chemical Research (RIKEN), Japan. "Previously astronomers had estimated an age of 10,000 years." The younger age for RCW 86 may explain an astronomical event observed almost 2,000 years ago. In 185 A.D., Chinese astronomers (and possibly the Romans) recorded the appearance of a new

  10. Back to the future: The Grassroots of Hydrological Uncertainty

    NASA Astrophysics Data System (ADS)

    Smith, K. A.

    2013-12-01

    Uncertainties are widespread within hydrological science, and as society looks to models for answers as to how climate change may affect our future water resources, the performance of hydrological models should be evaluated. With uncertainties introduced by input data, parameterisation, model structure, validation data, and 'unknown unknowns', it is easy to be pessimistic about model outputs. But uncertainties are an opportunity for scientific endeavour, not a threat. Investigation and suitable presentation of uncertainties, which results in a range of potential outcomes, provides more insight into model projections than just one answer. This paper aims to demonstrate the feasibility of conducting computationally demanding parameter uncertainty estimation experiments on global hydrological models (GHMs). Presently, individual GHMs tend to present their one best projection, but this leads to spurious precision, a false impression of certainty, which can be misleading to decision makers. Whilst uncertainty estimation is firmly established in catchment hydrology, GHM uncertainty, and parameter uncertainty in particular, has remained largely overlooked. Model inter-comparison studies that investigate model structure uncertainty have been undertaken (e.g. ISI-MIP, EU-WATCH), but these studies seem premature when the uncertainties within each individual model have not yet been considered. This study takes a few steps back, going down to one of the first introductions of assumptions in model development: the assignment of model parameter values. Making use of the University of Nottingham's High Performance Computer Cluster (HPC), the Mac-PDM.09 GHM has been subjected to rigorous uncertainty experiments. The Generalised Likelihood Uncertainty Estimation (GLUE) method with Latin Hypercube Sampling has been applied to a GHM for the first time, to produce 100,000 simultaneous parameter perturbations. The results of this ensemble of 100
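    The GLUE-with-Latin-hypercube workflow described here can be sketched in a few lines. The toy linear model, parameter bounds, and behavioural threshold below are hypothetical stand-ins for illustration, not Mac-PDM.09 settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, bounds):
    """One stratified draw per equal-probability bin in each dimension."""
    d = len(bounds)
    strata = np.tile(np.arange(n_samples), (d, 1))  # shape (d, n)
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, d))) / n_samples
    lo, hi = np.array(bounds, dtype=float).T
    return lo + u * (hi - lo)

def toy_model(params, x):
    # Hypothetical stand-in for a hydrological model: runoff = a * forcing + b
    a, b = params
    return a * x + b

x_obs = np.linspace(0.0, 1.0, 20)
y_obs = 2.0 * x_obs + 0.5 + rng.normal(0.0, 0.1, x_obs.size)  # synthetic "observations"

samples = latin_hypercube(10_000, bounds=[(0.0, 5.0), (-1.0, 2.0)])

# GLUE: score every parameter set with an informal likelihood
# (Nash-Sutcliffe efficiency here) and keep the "behavioural" sets.
sse = np.array([np.sum((toy_model(p, x_obs) - y_obs) ** 2) for p in samples])
nse = 1.0 - sse / np.sum((y_obs - y_obs.mean()) ** 2)
behavioural = samples[nse > 0.7]
```

    The spread of the behavioural parameter sets, rather than a single best fit, is what carries forward into the range of model projections.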

  11. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...

  12. Sensitivity and uncertainty analysis for the annual phosphorus loss estimator model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  13. Estimating the magnitude of prediction uncertainties for field-scale P loss models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, an uncertainty analysis for the Annual P Loss Estima...

  14. Sensitivity and uncertainty analysis for a field-scale P loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  15. Sources of Uncertainty in Climate Change Projections of Precipitation

    NASA Astrophysics Data System (ADS)

    Gutmann, Ethan; Clark, Martyn; Eidhammer, Trude; Ikeda, Kyoko; Deser, Clara; Brekke, Levi; Arnold, Jeffrey; Rasmussen, Roy

    2016-04-01

    Predicting the likely changes in precipitation due to anthropogenic climate influences is one of the most important problems in earth science today. This problem is complicated by the enormous uncertainty in current predictions. Until all such sources of uncertainty are adequately addressed and quantified, we cannot know which changes may be predictable and which are masked by the internal variability of the climate system itself. Here we assess multiple sources of uncertainty, including those due to internal variability, climate model selection, emissions scenario, regional climate model physics, and statistical downscaling methods. This work focuses on the Colorado Rocky Mountains because these mountains serve as the water towers for much of the western United States, but the results are more broadly applicable, and results will be presented covering the Columbia River Basin and the California Sierra Nevada as well. Internal variability is assessed using 30 members of the CESM Large Ensemble. Uncertainty due to the choice of climate models is assessed using 100 climate projections from the CMIP5 archive, including multiple emissions scenarios. Uncertainty due to regional climate model physics is assessed using a limited set of high-resolution Weather Research and Forecasting (WRF) model simulations in comparison to a larger multi-physics ensemble using the Intermediate Complexity Atmospheric Research (ICAR) model. Finally, statistical downscaling uncertainty is assessed using multiple statistical downscaling models. In near-term projections (25-35 years) internal variability is the largest source of uncertainty; however, over longer time scales (70-80 years) other sources of uncertainty become more important, with the importance of different sources varying depending on the metric assessed.
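    The kind of variance partitioning such an assessment relies on can be illustrated with a toy ensemble; the effect sizes, dimensions, and the crude partition below are invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble of projected precipitation change (%), indexed by
# (climate model, emissions scenario, initial-condition member).
n_models, n_scenarios, n_members = 10, 3, 5
change = (
    rng.normal(0.0, 4.0, (n_models, 1, 1))                      # model choice
    + rng.normal(0.0, 2.0, (1, n_scenarios, 1))                 # emissions scenario
    + rng.normal(0.0, 3.0, (n_models, n_scenarios, n_members))  # internal variability
)

# Crude variance partition: spread of the per-source means vs. the
# member-to-member spread within each (model, scenario) cell.
model_var = change.mean(axis=(1, 2)).var()
scenario_var = change.mean(axis=(0, 2)).var()
internal_var = change.var(axis=2).mean()
total_var = change.var()
```

    Comparing these components as a fraction of the total variance is what reveals which source of uncertainty dominates at a given lead time.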

  16. Pragmatic aspects of uncertainty propagation: A conceptual review

    NASA Astrophysics Data System (ADS)

    Thacker, W. Carlisle; Iskandarani, Mohamed; Gonçalves, Rafael C.; Srinivasan, Ashwanth; Knio, Omar M.

    2015-11-01

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
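    A minimal sketch of this surrogate idea, with a hypothetical one-parameter "expensive" model standing in for an ocean or weather simulation and a polynomial interpolant as the cheap surrogate:

```python
import numpy as np

rng = np.random.default_rng(2)

def expensive_model(theta):
    # Hypothetical stand-in for a costly simulation's scalar response
    return np.sin(theta) + 0.5 * theta

# A handful of "expensive" runs at chosen design points...
design = np.linspace(-2.0, 2.0, 7)
responses = expensive_model(design)

# ...interpolated by a polynomial surrogate.
surrogate = np.poly1d(np.polyfit(design, responses, deg=4))

# Input uncertainty is then propagated through the cheap surrogate,
# not through thousands of expensive simulations.
theta_samples = rng.normal(0.0, 0.6, 50_000)
approx = surrogate(theta_samples)
exact = expensive_model(theta_samples)  # only available here because the toy is cheap
```

    The choice of the seven design points, and whether a polynomial or a Gaussian process interpolates them, is exactly the decision the abstract argues deserves attention.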

  17. Multiphysics modeling and uncertainty quantification for an active composite reflector

    NASA Astrophysics Data System (ADS)

    Peterson, Lee D.; Bradford, S. C.; Schiermeier, John E.; Agnes, Gregory S.; Basinger, Scott A.

    2013-09-01

    A multiphysics, high resolution simulation of an actively controlled, composite reflector panel is developed to extrapolate from ground test results to flight performance. The subject test article has previously demonstrated sub-micron corrected shape under a controlled laboratory thermal load. This paper develops a model of the on-orbit performance of the panel under realistic thermal loads, with an active heater control system, and performs an uncertainty quantification of the predicted response. The primary contribution of this paper is the first reported application of the Sandia-developed Sierra mechanics simulation tools to a spacecraft multiphysics simulation of a closed-loop system, including uncertainty quantification. The simulation was developed so as to have sufficient resolution to capture the residual panel shape error that remains after the thermal and mechanical control loops are closed. An uncertainty quantification analysis was performed to assess the predicted tolerance in the closed-loop wavefront error. Key tools used for the uncertainty quantification are also described.

  18. Probability expression for changeable and changeless uncertainties: an implicit test

    PubMed Central

    Wang, Yun; Du, Xue-Lei; Rao, Li-Lin; Li, Shu

    2014-01-01

    “Everything changes and nothing remains still.” We designed three implicit studies to understand how people react or adapt to a rapidly changing world by testing whether verbal probability is better at expressing changeable uncertainty while numerical probability is better at expressing unchangeable uncertainty. We found that the “verbal-changeable” combination in implicit tasks was more compatible than the “numerical-changeable” combination. Furthermore, the “numerical-changeless” combination was more compatible than the “verbal-changeless” combination. Thus, a novel feature called “changeability” was proposed to describe the changeable nature of verbal probability. However, numerical probability is a better carrier of changeless uncertainty than verbal probability. These results extend the domain of probability predictions and enrich our general understanding of communication with verbal and numerical probabilities. Given that the world around us is constantly changing, this “changeability” feature may play a major role in preparing for uncertainty. PMID:25431566

  19. Uncertainties Quantification and Propagation of Multiple Correlated Variables with Limited Samples

    NASA Astrophysics Data System (ADS)

    Zhanpeng, Shen; Xueqian, Chen; Xinen, Liu; Chaoping, Zang

    2016-09-01

    In order to estimate the reliability of an engineering structure from limited test data, it is distinctly important to address both the epistemic uncertainty arising from the lack of samples and the correlations between uncertain input variables. Probability box theory and copula theory are utilized in the proposed method to represent the uncertainty and the correlation of the input variables, respectively. The uncertainty of the response of interest is then obtained by uncertainty propagation of the correlated input variables. A nested sampling technique is adopted to ensure that the propagation is always feasible, and the response's uncertainty is characterized by a probability box. Finally, a numerical example illustrates the validity and effectiveness of the method. The results indicate that epistemic uncertainty cannot be conveniently ignored when the available samples are very limited and that correlations among input variables may significantly affect the uncertainty of the responses.
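    The double-loop (nested) propagation behind a probability box can be sketched as follows; the interval on the input mean and the squared-response model are hypothetical, and the copula machinery is omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(3)

# Epistemic uncertainty: the input's mean is only known to lie in [9, 11].
# Aleatory uncertainty: given that mean, the input is Normal(mean, 1).
def response(x):
    return x ** 2  # hypothetical structural response

grid = np.linspace(60.0, 180.0, 400)
cdfs = []
for mu in np.linspace(9.0, 11.0, 21):         # outer (epistemic) loop
    y = response(rng.normal(mu, 1.0, 5_000))  # inner (aleatory) sampling
    cdfs.append([(y <= g).mean() for g in grid])
cdfs = np.array(cdfs)

# The envelope of the sampled CDFs is a probability box on the response.
lower, upper = cdfs.min(axis=0), cdfs.max(axis=0)
```

    The gap between the lower and upper CDF bounds is the visible footprint of the epistemic uncertainty; with more samples constraining the mean, the box tightens toward a single CDF.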

  20. Estimation of Measurement Uncertainties for the DGT Passive Sampler Used for Determination of Copper in Water

    PubMed Central

    Rauch, Sebastien; Morrison, Gregory M.

    2014-01-01

    Diffusion-based passive samplers are increasingly used for water quality monitoring. While the overall method robustness and reproducibility for passive samplers in water are widely reported, there has been a lack of a detailed description of uncertainty sources. In this paper an uncertainty budget for the determination of fully labile Cu in water using a DGT passive sampler is presented. Uncertainty from the estimation of effective cross-sectional diffusion area and the instrumental determination of accumulated mass of analyte are the most significant sources of uncertainty, while uncertainties from contamination and the estimation of diffusion coefficient are negligible. The results presented highlight issues with passive samplers which are important to address if overall method uncertainty is to be reduced and effective strategies to reduce overall method uncertainty are presented. PMID:25258629
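    For a multiplicative measurement model such as the DGT equation C = M·Δg/(D·A·t), an uncertainty budget combines the relative standard uncertainties of the inputs in quadrature; the component values below are illustrative placeholders, not those reported in the paper:

```python
import math

# DGT concentration: C = M * dg / (D * A * t), a purely multiplicative model.
# Relative standard uncertainties below are illustrative, not the paper's values.
u_rel = {
    "M  (accumulated mass of Cu)": 0.05,
    "A  (effective diffusion area)": 0.04,
    "dg (diffusive layer thickness)": 0.02,
    "D  (diffusion coefficient)": 0.01,
    "t  (deployment time)": 0.001,
}

# For a product/quotient model, relative uncertainties combine in quadrature (GUM).
u_c_rel = math.sqrt(sum(u ** 2 for u in u_rel.values()))
U_rel = 2.0 * u_c_rel  # expanded uncertainty with coverage factor k = 2
```

    Because the components add in quadrature, the largest terms dominate, which is why reducing the uncertainty in the effective diffusion area and the mass determination pays off far more than refining the deployment-time measurement.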

  1. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Reidel, S.P.; Gardner, M.G.

    2007-07-01

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase of up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and the adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra. Because of the

  2. Identifying and Addressing Vaccine Hesitancy

    PubMed Central

    Kestenbaum, Lori A.; Feemster, Kristen A.

    2015-01-01

    In the 20th century, the introduction of multiple vaccines significantly reduced childhood morbidity, mortality, and disease outbreaks. Despite, and perhaps because of, their public health impact, an increasing number of parents and patients are choosing to delay or refuse vaccines. These individuals are described as vaccine hesitant. This phenomenon has developed due to the confluence of multiple social, cultural, political and personal factors. As immunization programs continue to expand, understanding and addressing vaccine hesitancy will be crucial to their successful implementation. This review explores the history of vaccine hesitancy, its causes, and suggested approaches for reducing hesitancy and strengthening vaccine acceptance. PMID:25875982

  3. Identifying and addressing vaccine hesitancy.

    PubMed

    Kestenbaum, Lori A; Feemster, Kristen A

    2015-04-01

    In the 20th century, the introduction of multiple vaccines significantly reduced childhood morbidity, mortality, and disease outbreaks. Despite, and perhaps because of, their public health impact, an increasing number of parents and patients are choosing to delay or refuse vaccines. These individuals are described as "vaccine hesitant." This phenomenon has developed due to the confluence of multiple social, cultural, political, and personal factors. As immunization programs continue to expand, understanding and addressing vaccine hesitancy will be crucial to their successful implementation. This review explores the history of vaccine hesitancy, its causes, and suggested approaches for reducing hesitancy and strengthening vaccine acceptance.

  4. Nanoscale content-addressable memory

    NASA Technical Reports Server (NTRS)

    Davis, Bryan (Inventor); Principe, Jose C. (Inventor); Fortes, Jose (Inventor)

    2009-01-01

    A combined content addressable memory device and memory interface is provided. The combined device and interface includes one or more molecular wire crossbar memories having spaced-apart key nanowires, spaced-apart value nanowires adjacent to the key nanowires, and configurable switches between the key nanowires and the value nanowires. The combination further includes a key microwire-nanowire grid (key MNG) electrically connected to the spaced-apart key nanowires, and a value microwire-nanowire grid (value MNG) electrically connected to the spaced-apart value nanowires. A key or value MNG selects multiple nanowires for a given key or value.

  5. Addressing inequities in healthy eating.

    PubMed

    Friel, Sharon; Hattersley, Libby; Ford, Laura; O'Rourke, Kerryn

    2015-09-01

    What, when, where and how much people eat is influenced by a complex mix of factors at societal, community and individual levels. These influences operate both directly through the food system and indirectly through political, economic, social and cultural pathways that cause social stratification and influence the quality of conditions in which people live their lives. These factors are the social determinants of inequities in healthy eating. This paper provides an overview of the current evidence base for addressing these determinants and for the promotion of equity in healthy eating.

  6. Addressing the workforce pipeline challenge

    SciTech Connect

    Leonard Bond; Kevin Kostelnik; Richard Holman

    2006-11-01

    A secure and affordable energy supply is essential to achieving U.S. national security, to continuing U.S. prosperity, and to laying the foundations for future economic growth. To meet this goal, the next-generation energy workforce in the U.S., in particular those needed to support instrumentation, controls, and advanced operations and maintenance, is a critical element. The workforce is aging, and a new workforce pipeline, to support both the current generation and new builds, has yet to be established. The paper reviews the challenges and some actions being taken to address this need.

  7. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations in the Mayak worker epidemiological study, quantitative estimates of the uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24-h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done in which the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed, and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.

  8. Sequestration: Documenting and Assessing Lessons Learned Would Assist DOD in Planning for Future Budget Uncertainty

    DTIC Science & Technology

    2015-05-01

    SEQUESTRATION: Documenting and Assessing Lessons Learned Would Assist DOD in Planning for Future Budget Uncertainty. Performing organization: U.S. Government Accountability Office, 441 G Street NW, Washington, DC 20548.

  9. Credible Computations: Standard and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)

    1995-01-01

    The discipline of computational fluid dynamics (CFD) is at a crossroad. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is consistency and uniformity in this process and in the interpretation of its outcome. The key element for establishing credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products - computer codes and computed results - expects the following: a computer code, in terms of its logic, numerics, and fluid dynamics, and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled by verification and validation of these requirements. The verification process assesses whether the problem is solved correctly, and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and its results highlight which computed and integrated quantities need to be determined accurately and which do not require such attention. Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties
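    The sensitivity-analysis step described here can be sketched as normalized (logarithmic) sensitivities obtained by finite differences; the two-parameter model is a hypothetical stand-in for a CFD code, not a method from the paper:

```python
# Normalized sensitivities d(ln f)/d(ln p): percent change in output per
# percent change in input, via forward finite differences.
def model(params):
    mach, aoa = params  # hypothetical stand-in for a CFD output quantity
    return mach ** 2 * (1.0 + 0.1 * aoa)

def normalized_sensitivities(f, params, rel_step=1e-6):
    base = f(params)
    sens = []
    for i, p in enumerate(params):
        bumped = list(params)
        bumped[i] = p * (1.0 + rel_step)
        sens.append((f(bumped) - base) / base / rel_step)
    return sens

s = normalized_sensitivities(model, [0.8, 2.0])
# The Mach-number sensitivity dominates, so that input must be known accurately.
```

    Ranking inputs by these sensitivities is what identifies which quantities "need to be determined accurately and which do not require such attention."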

  10. Method and apparatus to predict the remaining service life of an operating system

    DOEpatents

    Greitzer, Frank L.; Kangas, Lars J.; Terrones, Kristine M.; Maynard, Melody A.; Pawlowski, Ronald A.; Ferryman, Thomas A.; Skorpik, James R.; Wilson, Bary W.

    2008-11-25

    A method and computer-based apparatus for monitoring the degradation of, predicting the remaining service life of, and/or planning maintenance for, an operating system are disclosed. Diagnostic information on degradation of the operating system is obtained through measurement of one or more performance characteristics by one or more sensors onboard and/or proximate the operating system. Though not required, it is preferred that the sensor data are validated to improve the accuracy and reliability of the service life predictions. The condition or degree of degradation of the operating system is presented to a user by way of one or more calculated, numeric degradation figures of merit that are trended against one or more independent variables using one or more mathematical techniques. Furthermore, more than one trendline and uncertainty interval may be generated for a given degradation figure of merit/independent variable data set. The trendline(s) and uncertainty interval(s) are subsequently compared to one or more degradation figure of merit thresholds to predict the remaining service life of the operating system. The present invention enables multiple mathematical approaches in determining which trendline(s) to use to provide the best estimate of the remaining service life.
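    The trend-and-threshold idea in this patent can be sketched with a single linear trendline; the degradation data, rate, and threshold below are synthetic illustrations, not values from the patent:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic degradation figure of merit trended against operating hours.
hours = np.arange(0, 1000, 50, dtype=float)
fom = 100.0 - 0.03 * hours + rng.normal(0.0, 1.5, hours.size)
threshold = 60.0  # hypothetical end-of-service criterion

# Fit a linear trendline to the degradation figure of merit.
slope, intercept = np.polyfit(hours, fom, 1)

# Remaining service life: time until the trendline crosses the threshold.
t_cross = (threshold - intercept) / slope
remaining_life = t_cross - hours[-1]
```

    The patent's approach goes further, fitting several candidate trendlines with uncertainty intervals and selecting among them, but the threshold-crossing extrapolation above is the core of the prediction.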

  11. Uncertainty relations for characteristic functions

    NASA Astrophysics Data System (ADS)

    Rudnicki, Łukasz; Tasca, D. S.; Walborn, S. P.

    2016-02-01

    We present the uncertainty relation for the characteristic functions (ChUR) of the quantum mechanical position and momentum probability distributions. This inequality is more general than the Heisenberg uncertainty relation and is saturated in two extreme cases for wave functions described by periodic Dirac combs. We further discuss a broad spectrum of applications of the ChUR; in particular, we constrain quantum optical measurements involving general detection apertures and provide the uncertainty relation that is relevant for loop quantum cosmology. A method to measure the characteristic function directly using an auxiliary qubit is also briefly discussed.

  12. Assessing uncertainty in physical constants

    NASA Astrophysics Data System (ADS)

    Henrion, Max; Fischhoff, Baruch

    1986-09-01

    Assessing the uncertainty due to possible systematic errors in a physical measurement unavoidably involves an element of subjective judgment. Examination of historical measurements and recommended values for the fundamental physical constants shows that the reported uncertainties have a consistent bias towards underestimating the actual errors. These findings are comparable to findings of persistent overconfidence in psychological research on the assessment of subjective probability distributions. Awareness of these biases could help in interpreting the precision of measurements, as well as provide a basis for improving the assessment of uncertainty in measurements.

  13. On the formulation of a minimal uncertainty model for robust control with structured uncertainty

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1991-01-01

    In the design and analysis of robust control systems for uncertain plants, representing the system transfer matrix in the form of what has come to be termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents a transfer function matrix M(s) of the nominal closed loop system, and the delta represents an uncertainty matrix acting on M(s). The nominal closed loop system M(s) results from closing the feedback control system, K(s), around a nominal plant interconnection structure P(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, but for real parameter variations delta is a diagonal matrix of real elements. Conceptually, the M-delta structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the currently available literature addresses computational methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty, where the term minimal refers to the dimension of the delta matrix. Since having a minimally dimensioned delta matrix would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta would be useful. Hence, a method of obtaining the interconnection system P(s) is required. A generalized procedure for obtaining a minimal P-delta structure for systems with real parameter variations is presented. Using this model, the minimal M-delta model can then be easily obtained by closing the feedback loop.
The procedure involves representing the system in a cascade-form state-space realization, determining the minimal uncertainty matrix
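A minimal sketch of why a low-dimensional delta matters (not the paper's algorithm; the matrices here are invented): a rank-one real parameter variation A(delta) = A0 + delta·bcᵀ can be pulled out as a single scalar uncertainty channel w = delta·z with z = cᵀx, giving a 1×1 Delta instead of an n×n one.

```python
# Sketch: a rank-one parameter variation extracted as a 1x1 uncertainty
# channel, illustrating the "minimal Delta" idea. All numbers are invented.

def mat_add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def outer(b, c, scale=1.0):
    return [[scale * bi * cj for cj in c] for bi in b]

A0 = [[0.0, 1.0],
      [-2.0, -3.0]]          # nominal dynamics
b = [0.0, 1.0]               # how the uncertain parameter enters
c = [1.0, 0.0]               # which state it multiplies
delta = 0.3                  # real parameter variation

# Direct perturbed matrix: A0 + delta * b c^T
A_direct = mat_add(A0, outer(b, c, delta))

# Closure of the 1x1 uncertainty channel: x' = A0 x + b w, z = c^T x,
# w = delta * z  =>  x' = (A0 + delta b c^T) x, recovered column by column.
def lft_apply(x):
    z = sum(ci * xi for ci, xi in zip(c, x))
    w = delta * z
    return [sum(a * xi for a, xi in zip(row, x)) + bi * w
            for row, bi in zip(A0, b)]

A_lft = [list(row) for row in
         zip(*[lft_apply(e) for e in ([1.0, 0.0], [0.0, 1.0])])]
print(A_direct == A_lft)  # the two representations agree
```

The same parameter entering as a full n×n diagonal Delta would multiply the cost of structured-singular-value computations, which is the efficiency argument the abstract makes.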

  14. Space Surveillance Network Scheduling Under Uncertainty: Models and Benefits

    NASA Astrophysics Data System (ADS)

    Valicka, C.; Garcia, D.; Staid, A.; Watson, J.; Rintoul, M.; Hackebeil, G.; Ntaimo, L.

    2016-09-01

Advances in space technologies continue to reduce the cost of placing satellites in orbit. With more entities operating space vehicles, the number of orbiting vehicles and debris has reached unprecedented levels and the number continues to grow. Sensor operators responsible for maintaining the space catalog and providing space situational awareness face increasingly complex and demanding scheduling requirements. Despite these trends, a lack of advanced tools continues to prevent sensor planners and operators from fully utilizing space surveillance resources. One key challenge involves optimally selecting sensors from a network of varying capabilities for missions with differing requirements. Another open challenge, the primary focus of our work, is building robust schedules that effectively plan for uncertainties associated with weather, ad hoc collections, and other target uncertainties. Existing tools and techniques are not amenable to rigorous analysis of schedule optimality and do not adequately address the presented challenges. Building on prior research, we have developed stochastic mixed-integer linear optimization models to address uncertainty due to weather's effect on collection quality. By making use of the open source Pyomo optimization modeling software, we have posed and solved sensor network scheduling models addressing both forms of uncertainty. We present herein models that allow for concurrent scheduling of collections with the same sensor configuration and for proactively scheduling against uncertain ad hoc collections. The suitability of stochastic mixed-integer linear optimization for building sensor network schedules under different run-time constraints will be discussed.
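The core idea of scenario-based (two-stage stochastic) scheduling can be shown without a MILP solver: pick the assignment that maximizes the probability-weighted collection quality across weather scenarios. The sensors, targets, scenario probabilities, and quality values below are invented for illustration, and the tiny problem is solved by brute force rather than by the stochastic mixed-integer models of the abstract.

```python
# Toy deterministic equivalent of a scenario-based sensor-scheduling problem.
# All data are illustrative; optical sensors are degraded when it is cloudy.
from itertools import permutations

sensors = ["radar_A", "optical_B", "optical_C"]
targets = ["sat_1", "sat_2", "sat_3"]
scenarios = {"clear": 0.6, "cloudy": 0.4}      # scenario probabilities

base = {  # nominal collection quality for each (sensor, target) pair
    ("radar_A", "sat_1"): 0.9, ("radar_A", "sat_2"): 0.5, ("radar_A", "sat_3"): 0.4,
    ("optical_B", "sat_1"): 0.8, ("optical_B", "sat_2"): 0.9, ("optical_B", "sat_3"): 0.3,
    ("optical_C", "sat_1"): 0.4, ("optical_C", "sat_2"): 0.6, ("optical_C", "sat_3"): 0.8,
}

def quality(scenario, sensor, target):
    q = base[(sensor, target)]
    if scenario == "cloudy" and sensor.startswith("optical"):
        q *= 0.5   # clouds halve optical collection quality
    return q

def expected_value(assignment):
    return sum(p * sum(quality(s, sen, tgt) for sen, tgt in assignment)
               for s, p in scenarios.items())

# Brute-force search over one-to-one sensor-target assignments.
best = max((tuple(zip(sensors, perm)) for perm in permutations(targets)),
           key=expected_value)
print(dict(best), round(expected_value(best), 3))
```

A real instance would replace the brute-force search with the stochastic MILP formulation (e.g., in Pyomo) the abstract describes, but the objective, expected quality over scenarios, is the same.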

  15. Addressing Asthma Health Disparities: A Multilevel Challenge

    PubMed Central

    Canino, Glorisa; McQuaid, Elizabeth L.; Rand, Cynthia S.

    2009-01-01

    Substantial research has documented pervasive disparities in the prevalence, severity, and morbidity of asthma among minority populations compared to non-Latino whites. The underlying causes of these disparities are not well understood, and as a result, the leverage points to address them remain unclear. A multilevel framework for integrating research in asthma health disparities is proposed in order to advance both future research and clinical practice. The components of the proposed model include health care policies and regulations, operation of the health care system, provider/clinician-level factors, social/environmental factors, and individual/family attitudes and behaviors. The body of research suggests that asthma disparities have multiple, complex and inter-related sources. Disparities occur when individual, environmental, health system, and provider factors interact with one another over time. Given that the causes of asthma disparities are complex and multilevel, clinical strategies to address these disparities must therefore be comparably multilevel and target many aspects of asthma care. Clinical Implications: Several strategies that could be applied in clinical settings to reduce asthma disparities are described including the need for routine assessment of the patient’s beliefs, financial barriers to disease management, and health literacy, and the provision of cultural competence training and communication skills to health care provider groups. PMID:19447484

  16. 7 CFR 160.29 - Containers to remain intact.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Containers to remain intact. 160.29 Section 160.29... STANDARDS FOR NAVAL STORES Analysis, Inspection, and Grading on Request § 160.29 Containers to remain intact... the containers holding such naval stores remain intact as sampled until the analysis,...

  17. Estimations of uncertainties of frequencies

    NASA Astrophysics Data System (ADS)

    Eyer, Laurent; Nicoletti, Jean-Marc; Morgenthaler, Stephan

    2016-10-01

Diverse variable phenomena in the Universe are periodic. Astonishingly many of the periodic signals present in stars have timescales coinciding with human ones (from minutes to years). The periods of signals often have to be deduced from time series which are irregularly sampled and sparse; furthermore, correlations between the brightness measurements and their estimated uncertainties are common. The uncertainty on the frequency estimation is reviewed. We explore the astronomical and statistical literature for both regular and irregular samplings. The frequency uncertainty depends on the signal-to-noise ratio, the frequency, and the observational timespan. The shape of the light curve also plays a role, since sharp features such as exoplanet transits, stellar eclipses, and the rising branches of pulsating stars give stringent constraints. We propose several procedures (parametric and nonparametric) to estimate the uncertainty on the frequency, which are subsequently tested against simulated data to assess their performance.
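For evenly sampled data, a commonly quoted approximation for the frequency uncertainty of a fitted sinusoid is sigma_f ≈ sqrt(6/N)·sigma/(pi·A·T); constants in the literature vary by small factors depending on conventions. The Monte Carlo sketch below (all signal parameters invented) checks that a crude grid-search periodogram estimator scatters at that order of magnitude.

```python
# Monte Carlo check of an approximate frequency-uncertainty formula for a
# noisy, evenly sampled sinusoid. Indicative only: quoted constants differ
# by small factors across references.
import math
import random

random.seed(1)
N, T, A, f0, sigma = 48, 1.0, 1.0, 10.0, 0.2
t = [T * k / N for k in range(N)]

def best_frequency(y):
    # crude periodogram: grid-search the frequency maximizing the
    # least-squares sinusoid power C^2 + S^2
    best_f, best_p = None, -1.0
    f = 9.0
    while f <= 11.0:
        C = sum(yi * math.cos(2 * math.pi * f * ti) for yi, ti in zip(y, t))
        S = sum(yi * math.sin(2 * math.pi * f * ti) for yi, ti in zip(y, t))
        p = C * C + S * S
        if p > best_p:
            best_f, best_p = f, p
        f += 0.002
    return best_f

estimates = []
for _ in range(40):
    y = [A * math.sin(2 * math.pi * f0 * ti) + random.gauss(0.0, sigma)
         for ti in t]
    estimates.append(best_frequency(y))

mean = sum(estimates) / len(estimates)
emp = math.sqrt(sum((e - mean) ** 2 for e in estimates) / (len(estimates) - 1))
approx = math.sqrt(6.0 / N) * sigma / (math.pi * A * T)
print(round(emp, 4), round(approx, 4))  # same order of magnitude
```

Irregular sampling, correlated noise, and sharp light-curve features, the situations the abstract targets, break this simple formula, which is exactly why the proposed parametric and nonparametric procedures are needed.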

  18. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections.

  19. Content-addressable holographic databases

    NASA Astrophysics Data System (ADS)

    Grawert, Felix; Kobras, Sebastian; Burr, Geoffrey W.; Coufal, Hans J.; Hanssen, Holger; Riedel, Marc; Jefferson, C. Michael; Jurich, Mark C.

    2000-11-01

    Holographic data storage allows the simultaneous search of an entire database by performing multiple optical correlations between stored data pages and a search argument. We have recently developed fuzzy encoding techniques for this fast parallel search and demonstrated a holographic data storage system that searches digital data records with high fidelity. This content-addressable retrieval is based on the ability to take the two-dimensional inner product between the search page and each stored data page. We show that this ability is lost when the correlator is defocussed to avoid material oversaturation, but can be regained by the combination of a random phase mask and beam confinement through total internal reflection. Finally, we propose an architecture in which spatially multiplexed holograms are distributed along the path of the search beam, allowing parallel search of large databases.
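The retrieval principle the abstract relies on, scoring every stored data page by its inner product with a search page, can be sketched digitally. The records and binary page encodings below are invented; the optics (defocus, phase mask, total internal reflection) are of course not modeled.

```python
# Inner-product search over binary "data pages", mimicking the parallel
# correlation at the heart of content-addressable holographic storage.
# Records and encodings are invented for illustration.

records = {
    "alpha": [[1, 0, 1],
              [0, 1, 0],
              [1, 0, 1]],
    "beta":  [[0, 1, 0],
              [1, 1, 1],
              [0, 1, 0]],
    "gamma": [[1, 1, 0],
              [1, 0, 0],
              [0, 0, 1]],
}

def inner_product(page_a, page_b):
    # two-dimensional inner product between pages
    return sum(a * b for row_a, row_b in zip(page_a, page_b)
               for a, b in zip(row_a, row_b))

# Fuzzy search: only part of the page is specified (don't-care cells = 0),
# so several stored pages partially match and the scores rank them.
search_page = [[0, 1, 0],
               [1, 1, 0],
               [0, 0, 0]]

scores = {name: inner_product(page, search_page)
          for name, page in records.items()}
print(max(scores, key=scores.get), scores)
```

In the holographic system all of these inner products are computed simultaneously by optical correlation, which is what makes the database search parallel.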

  20. Addressing viral resistance through vaccines

    PubMed Central

    Laughlin, Catherine; Schleif, Amanda; Heilman, Carole A

    2015-01-01

    Antimicrobial resistance is a serious healthcare concern affecting millions of people around the world. Antiviral resistance has been viewed as a lesser threat than antibiotic resistance, but it is important to consider approaches to address this growing issue. While vaccination is a logical strategy, and has been shown to be successful many times over, next generation viral vaccines with a specific goal of curbing antiviral resistance will need to clear several hurdles including vaccine design, evaluation and implementation. This article suggests that a new model of vaccination may need to be considered: rather than focusing on public health, this model would primarily target sectors of the population who are at high risk for complications from certain infections. PMID:26604979

  1. Addressing Failures in Exascale Computing

    SciTech Connect

    Snir, Marc; Wisniewski, Robert; Abraham, Jacob; Adve, Sarita; Bagchi, Saurabh; Balaji, Pavan; Belak, J.; Bose, Pradip; Cappello, Franck; Carlson, Bill; Chien, Andrew; Coteus, Paul; DeBardeleben, Nathan; Diniz, Pedro; Engelmann, Christian; Erez, Mattan; Fazzari, Saverio; Geist, Al; Gupta, Rinku; Johnson, Fred; Krishnamoorthy, Sriram; Leyffer, Sven; Liberty, Dean; Mitra, Subhasish; Munson, Todd; Schreiber, Rob; Stearley, Jon; Van Hensbergen, Eric

    2014-01-01

We present here a report produced by a workshop on "Addressing Failures in Exascale Computing" held in Park City, Utah, 4-11 August 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system, discuss existing knowledge on resilience across the various hardware and software layers of an exascale system, and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia, and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  2. Addressing failures in exascale computing

    SciTech Connect

    Snir, Marc; Wisniewski, Robert W.; Abraham, Jacob A.; Adve, Sarita; Bagchi, Saurabh; Balaji, Pavan; Belak, Jim; Bose, Pradip; Cappello, Franck; Carlson, William; Chien, Andrew A.; Coteus, Paul; Debardeleben, Nathan A.; Diniz, Pedro; Engelmann, Christian; Erez, Mattan; Saverio, Fazzari; Geist, Al; Gupta, Rinku; Johnson, Fred; Krishnamoorthy, Sriram; Leyffer, Sven; Liberty, Dean; Mitra, Subhasish; Munson, Todd; Schreiber, Robert; Stearly, Jon; Van Hensbergen, Eric

    2014-05-01

    We present here a report produced by a workshop on “Addressing Failures in Exascale Computing” held in Park City, Utah, August 4–11, 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system; discuss existing knowledge on resilience across the various hardware and software layers of an exascale system; and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia; and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  3. Light addressable photoelectrochemical cyanide sensor

    SciTech Connect

    Licht, S.; Myung, N.; Sun, Y.

    1996-03-15

A sensor is demonstrated that is capable of spatial discrimination of cyanide with use of only a single stationary sensing element. Different spatial regions of the sensing element are light activated to reveal the solution cyanide concentration only at the point of illumination. In this light addressable photoelectrochemical (LAP) sensor the sensing element consists of an n-CdSe electrode immersed in solution, with the open-circuit potential determined under illumination. In alkaline ferro-ferri-cyanide solution, the open-circuit photopotential is highly responsive to cyanide, with a linear response of (120 mV) log [KCN]. LAP detection with a spatial resolution of ±1 mm for cyanide detection is demonstrated. The response is almost linear for 0.001-0.100 m cyanide with a resolution of 5 mV. 38 refs., 7 figs., 1 tab.
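The reported linear response of about 120 mV per decade of cyanide concentration implies a simple calibration curve. A minimal sketch (the intercept E0 below is an invented placeholder, not a value from the paper):

```python
# Calibration sketch for a ~120 mV/decade sensor response.
# E0_MV is hypothetical; only the slope comes from the abstract.
import math

SLOPE_MV_PER_DECADE = 120.0
E0_MV = 300.0  # hypothetical potential at [KCN] = 1 m

def photopotential_mV(conc_molal):
    return E0_MV + SLOPE_MV_PER_DECADE * math.log10(conc_molal)

def concentration_from_potential(e_mV):
    return 10 ** ((e_mV - E0_MV) / SLOPE_MV_PER_DECADE)

e = photopotential_mV(0.010)            # 10^-2 m cyanide -> E0 - 240 mV
print(round(e, 1))
print(concentration_from_potential(e))  # recovers ~0.010 m
```

The quoted 5 mV resolution then corresponds to a concentration resolution of a factor 10^(5/120) ≈ 1.10, i.e., roughly 10 % in concentration.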

  4. Significant predictors of patients' uncertainty in primary brain tumors.

    PubMed

    Lin, Lin; Chien, Lung-Chang; Acquaye, Alvina A; Vera-Bolanos, Elizabeth; Gilbert, Mark R; Armstrong, Terri S

    2015-05-01

Patients with primary brain tumors (PBT) face uncertainty related to prognosis, symptoms and treatment response and toxicity. Uncertainty is correlated to negative mood states and symptom severity and interference. This study identified predictors of uncertainty during different treatment stages (newly-diagnosed, on treatment, followed-up without active treatment). One hundred eighty-six patients with PBT were accrued at various points in the illness trajectory. Data collection tools included a clinical checklist, a demographic data sheet, and the Mishel Uncertainty in Illness Scale-Brain Tumor Form. The structured additive regression model was used to identify significant demographic and clinical predictors of illness-related uncertainty. Participants were primarily white (80 %) males (53 %). They ranged in age from 19-80 (mean = 44.2 ± 12.6). Thirty-two of the 186 patients were newly-diagnosed, 64 were on treatment at the time of clinical visit with MRI evaluation, 21 were without MRI, and 69 were not on active treatment. Three subscales (ambiguity/inconsistency; unpredictability-disease prognoses; unpredictability-symptoms and other triggers) differed amongst the treatment groups (P < .01). However, patients' uncertainty during active treatment was as high as in the newly-diagnosed period. Other than treatment stage, change of employment status due to the illness was the most significant predictor of illness-related uncertainty. The illness trajectory of PBT remains ambiguous, complex, and unpredictable, leading to a high incidence of uncertainty. There was variation in the subscales of uncertainty depending on treatment status. Although patients who were newly diagnosed reported the highest scores on most of the subscales, patients on treatment felt more uncertain about the unpredictability of symptoms than other groups. Due to the complexity and impact of the disease, associated symptoms, and interference with functional status, comprehensive assessment of patients

  5. Dynamical Realism and Uncertainty Propagation

    NASA Astrophysics Data System (ADS)

    Park, Inkwan

In recent years, Space Situational Awareness (SSA) has become increasingly important as the number of tracked Resident Space Objects (RSOs) continues to grow. One of the most significant technical discussions in SSA is how to propagate state uncertainty in a manner consistent with the highly nonlinear dynamical environment. To keep pace with this situation, various methods have been proposed to propagate uncertainty accurately by capturing the nonlinearity of the dynamical system. We notice that all of these methods focus on describing the dynamical system as precisely as possible from a mathematical perspective. This study proposes a new perspective based on understanding the dynamics of the evolution of uncertainty itself. We expect that deeper insight into the dynamical system could open the way to a new method for accurate uncertainty propagation. These considerations lead naturally to the goals of the study. First, we investigate the most dominant factors in the evolution of uncertainty in order to characterize the dynamical system more rigorously. Second, we aim to develop, building on that investigation, a new method that propagates orbit uncertainty efficiently while maintaining accuracy. We eliminate the short-period variations from the dynamical system, yielding what we call a simplified dynamical system (SDS), to investigate the most dominant factors. To achieve this, the Lie transformation method is introduced, since this transformation can define the solutions for each variation separately. From the first investigation, we conclude that the secular variations, including the long-period variations, dominate the propagation of uncertainty, i.e., short-period variations are negligible. We then develop the new method by combining the SDS with a higher-order nonlinear expansion method, state transition tensors (STTs). The new method retains the advantages of the SDS and the STTs and propagates
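The first-order flavor of the state transition tensor machinery is ordinary linearized covariance propagation: push the mean through the nonlinear map and the covariance through its Jacobian, P' = J P Jᵀ. The toy discrete dynamics below are invented; the higher-order STT terms of the abstract would add corrections on top of this.

```python
# First-order (Jacobian) covariance propagation, the base case that
# higher-order state transition tensors refine. Dynamics are invented.
import math

def f(x):
    # toy nonlinear discrete-time step
    return [x[0] + 0.1 * x[1], x[1] - 0.1 * math.sin(x[0])]

def jacobian(x):
    return [[1.0, 0.1],
            [-0.1 * math.cos(x[0]), 1.0]]

def propagate(mean, cov):
    J = jacobian(mean)
    new_mean = f(mean)
    # P' = J P J^T
    JP = [[sum(J[i][k] * cov[k][j] for k in range(2)) for j in range(2)]
          for i in range(2)]
    new_cov = [[sum(JP[i][k] * J[j][k] for k in range(2)) for j in range(2)]
               for i in range(2)]
    return new_mean, new_cov

mean, cov = [0.5, 0.0], [[1e-4, 0.0], [0.0, 1e-4]]
for _ in range(10):
    mean, cov = propagate(mean, cov)
print(mean, cov)
```

This linearization degrades over long arcs in strongly nonlinear dynamics, which is precisely the regime where the abstract's combination of averaged (SDS) dynamics and higher-order tensors is meant to help.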

  6. Thermodynamic and relativistic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Artamonov, A. A.; Plotnikov, E. M.

    2017-01-01

The thermodynamic uncertainty relation (UR) was verified experimentally. The experiments have shown the validity of the quantum analogue of the zeroth law of stochastic thermodynamics in the form of the saturated Schrödinger UR. We have also proposed a new type of UR for relativistic mechanics. These relations allow us to consider macroscopic phenomena within the limits of the ratio of the uncertainty relations for different physical quantities.

  7. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  8. Pharmacological Fingerprints of Contextual Uncertainty

    PubMed Central

    Ruge, Diane; Stephan, Klaas E.

    2016-01-01

    Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses. PMID:27846219

  9. Uncertainty of empirical correlation equations

    NASA Astrophysics Data System (ADS)

    Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.

    2016-08-01

    The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
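The mechanics of GLS propagation can be shown on the simplest possible case: a straight-line fit y ≈ b0 + b1·x with a diagonal data covariance (uncorrelated, unequal uncertainties), for which the parameter covariance is (XᵀWX)⁻¹, the object the abstract propagates into derived quantities. The data points below are invented; TEOS-10/IAPWS-95 involves the same algebra at vastly larger scale with full covariance matrices.

```python
# Weighted least squares for y ~ b0 + b1*x with per-point uncertainties,
# including the parameter covariance (X^T W X)^{-1}. Data are illustrative.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.1, 4.9, 7.0]
sigmas = [0.1, 0.2, 0.1, 0.3]       # per-point standard uncertainties

w = [1.0 / s**2 for s in sigmas]
# entries of the normal-equation matrix X^T W X for X rows [1, x_i]
S00 = sum(w)
S01 = sum(wi * x for wi, x in zip(w, xs))
S11 = sum(wi * x * x for wi, x in zip(w, xs))
b0_rhs = sum(wi * y for wi, y in zip(w, ys))
b1_rhs = sum(wi * x * y for wi, x, y in zip(w, xs, ys))

det = S00 * S11 - S01 * S01
b0 = (S11 * b0_rhs - S01 * b1_rhs) / det
b1 = (S00 * b1_rhs - S01 * b0_rhs) / det
param_cov = [[S11 / det, -S01 / det],
             [-S01 / det, S00 / det]]
print(round(b0, 3), round(b1, 3))
print(param_cov)  # propagate this into any quantity derived from b0, b1
```

Any derived quantity g(b0, b1) then inherits an uncertainty via the usual sandwich rule, var(g) ≈ ∇gᵀ · param_cov · ∇g, which is how the paper reaches uncertainties for the virial coefficients.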

  10. Sky-view factor visualization for detection of archaeological remains

    NASA Astrophysics Data System (ADS)

    Kokalj, Žiga; Oštir, Krištof; Zakšek, Klemen

    2013-04-01

Many archaeological remains are covered by sand or vegetation, but it is still possible to detect them by remote sensing techniques. One of them is airborne laser scanning, which enables production of digital elevation models (DEM) of very high resolution (better than 1 m) with high relative elevation accuracy (centimetre level), even under forest. Thus, it has become well established in archaeological applications. However, effective interpretation of digital elevation models requires appropriate data visualization. Analytical relief shading is used in most cases. Although widely accepted, this method has two major drawbacks: identifying details in deep shades and inability to properly represent linear features lying parallel to the light beam. Several authors have tried to overcome these limitations by changing the position of the light source or by filtering. This contribution addresses the DEM visualization problem with the sky-view factor, a visualization technique based on diffuse light that overcomes the directional problems of hill-shading. Sky-view factor is a parameter that describes the portion of visible sky limited by relief. It can be used as a general relief visualization technique to show relief characteristics. In particular, we show that this visualization is a very useful tool in archaeology. Applying the sky-view factor for visualization purposes gives advantages over other techniques because it reveals small (or large, depending on the scale of the observed phenomenon and consequential algorithm settings) relief features while preserving the perception of general topography. In the case study (DEM visualization of a fortified enclosure of Tonovcov grad in Slovenia) we show that for archaeological purposes the sky-view factor is the optimal DEM visualization method. Its ability to consider the neighborhood context makes it an outstanding tool when compared to other visualization techniques. One can choose a large search radius and the most important

  11. The NIST Simple Guide for Evaluating and Expressing Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio

    2016-11-01

NIST has recently published guidance on the evaluation and expression of the uncertainty of NIST measurement results [1, 2], supplementing but not replacing B. N. Taylor and C. E. Kuyatt's (1994) Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Technical Note 1297) [3], which tracks closely the Guide to the expression of uncertainty in measurement (GUM) [4], originally published in 1995 by the Joint Committee for Guides in Metrology of the International Bureau of Weights and Measures (BIPM). The scope of this Simple Guide, however, is much broader than the scope of both NIST Technical Note 1297 and the GUM, because it attempts to address several of the uncertainty evaluation challenges that have arisen at NIST since the 1990s, for example in molecular biology, greenhouse gas and climate science measurements, and forensic science. The Simple Guide also expands the scope of those two other guidance documents by recognizing observation equations (that is, statistical models) as bona fide measurement models. These models are indispensable to reduce data from interlaboratory studies, to combine measurement results for the same measurand obtained by different methods, and to characterize the uncertainty of calibration and analysis functions used in the measurement of force, temperature, or composition of gas mixtures. This presentation reviews the salient aspects of the Simple Guide, illustrates the use of models and methods for uncertainty evaluation not contemplated in the GUM, and also demonstrates the NIST Uncertainty Machine [5] and the NIST Consensus Builder, which are web-based applications accessible worldwide that facilitate evaluations of measurement uncertainty and the characterization of consensus values in interlaboratory studies.

  12. The Democratic Imperative to Address Sexual Equality Rights in Schools

    ERIC Educational Resources Information Center

    Gereluk, Dianne

    2013-01-01

    Issues of sexual orientation elicit ethical debates in schools and society. In jurisdictions where a legal right has not yet been established, one argument commonly rests on whether schools ought to address issues of same-sex relationships and marriage on the basis of civil equality, or whether such controversial issues ought to remain in the…

  13. Climate change adaptation under uncertainty in the developing world: A case study of sea level rise in Kiribati

    NASA Astrophysics Data System (ADS)

    Donner, S. D.; Webber, S.

    2011-12-01

    Climate change is expected to have the greatest impact in parts of the developing world. At the 2010 meeting of U.N. Framework Convention on Climate Change in Cancun, industrialized countries agreed in principle to provide US$100 billion per year by 2020 to assist the developing world respond to climate change. This "Green Climate Fund" is a critical step towards addressing the challenge of climate change. However, the policy and discourse on supporting adaptation in the developing world remains highly idealized. For example, the efficacy of "no regrets" adaptation efforts or "mainstreaming" adaptation into decision-making are rarely evaluated in the real world. In this presentation, I will discuss the gap between adaptation theory and practice using a multi-year case study of the cultural, social and scientific obstacles to adapting to sea level rise in the Pacific atoll nation of Kiribati. Our field research reveals how scientific and institutional uncertainty can limit international efforts to fund adaptation and lead to spiraling costs. Scientific uncertainty about hyper-local impacts of sea level rise, though irreducible, can at times limit decision-making about adaptation measures, contrary to the notion that "good" decision-making practices can incorporate scientific uncertainty. Efforts to improve institutional capacity must be done carefully, or they risk inadvertently slowing the implementation of adaptation measures and increasing the likelihood of "mal"-adaptation.

  14. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    SciTech Connect

    Van Dyk, J; Palta, J; Bortfeld, T; Mijnheer, B

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.

  15. Uncertainty in Measured Data and Model Predictions: Essential Components for Mobilizing Environmental Data and Modeling

    NASA Astrophysics Data System (ADS)

    Harmel, D.

    2014-12-01

    In spite of pleas for uncertainty analysis - such as Beven's (2006) "Should it not be required that every paper in both field and modeling studies attempt to evaluate the uncertainty in the results?" - the uncertainty associated with hydrology and water quality data is rarely quantified and rarely considered in model evaluation. This oversight, justified in the past by mainly tenuous philosophical concerns, diminishes the value of measured data and ignores the environmental and socio-economic benefits of improved decisions and policies based on data with estimated uncertainty. This oversight extends to researchers, who typically fail to estimate uncertainty in measured discharge and water quality data because of additional effort required, lack of adequate scientific understanding on the subject, and fear of negative perception if data with "high" uncertainty are reported; however, the benefits are certain. Furthermore, researchers have a responsibility for scientific integrity in reporting what is known and what is unknown, including the quality of measured data. In response we produced an uncertainty estimation framework and the first cumulative uncertainty estimates for measured water quality data (Harmel et al., 2006). From that framework, DUET-H/WQ was developed (Harmel et al., 2009). Application to several real-world data sets indicated that substantial uncertainty can be contributed by each data collection procedural category and that uncertainties typically occur in order discharge < sediment < dissolved N and P < total N and P. Similarly, modelers address certain aspects of model uncertainty but ignore others, such as the impact of uncertainty in discharge and water quality data. Thus, we developed methods to incorporate prediction uncertainty as well as calibration/validation data uncertainty into model goodness-of-fit evaluation (Harmel and Smith, 2007; Harmel et al., 2010). These enhance model evaluation by: appropriately sharing burden with "data
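When per-category uncertainties are independent, a cumulative estimate like the one the framework produces combines them in quadrature (root-sum-square). The category names and percentage values below are illustrative, not taken from Harmel et al.

```python
# Root-sum-square combination of per-category measurement uncertainties,
# in the spirit of a cumulative uncertainty framework. Values are invented.
import math

categories_pct = {          # contribution of each procedural step, in %
    "streamflow_measurement": 6.0,
    "sample_collection": 10.0,
    "sample_preservation_storage": 4.0,
    "laboratory_analysis": 8.0,
}

cumulative_pct = math.sqrt(sum(u**2 for u in categories_pct.values()))
print(round(cumulative_pct, 1))  # smaller than the simple sum of the parts
```

The quadrature total (here ≈ 14.7 %) is dominated by the largest contributors, which is why the abstract's ordering of typical uncertainties (discharge < sediment < dissolved N and P < total N and P) is useful for prioritizing data-collection improvements.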

  16. A region addresses patient safety.

    PubMed

    Feinstein, Karen Wolk; Grunden, Naida; Harrison, Edward I

    2002-06-01

    The Pittsburgh Regional Healthcare Initiative (PRHI) is a coalition of 35 hospitals, 4 major insurers, more than 30 major and small-business health care purchasers, dozens of corporate and civic leaders, organized labor, and partnerships with state and federal government all working together to deliver perfect patient care throughout Southwestern Pennsylvania. PRHI believes that in pursuing perfection, many of the challenges facing today's health care delivery system (eg, waste and error in the delivery of care, rising costs, frustration and shortage among clinicians and workers, financial distress, overcapacity, and lack of access to care) will be addressed. PRHI has identified patient safety (nosocomial infections and medication errors) and 5 clinical areas (obstetrics, orthopedic surgery, cardiac surgery, depression, and diabetes) as ideal starting points. In each of these areas of work, PRHI partners have assembled multifacility/multidisciplinary groups charged with defining perfection, establishing region-wide reporting systems, and devising and implementing recommended improvement strategies and interventions. Many design and conceptual elements of the PRHI strategy are adapted from the Toyota Production System and its Pittsburgh derivative, the Alcoa Business System. PRHI is in the proof-of-concept phase of development.

  17. Long-time uncertainty propagation using generalized polynomial chaos and flow map composition

    SciTech Connect

    Luchtenburg, Dirk M.; Brunton, Steven L.; Rowley, Clarence W.

    2014-10-01

    We present an efficient and accurate method for long-time uncertainty propagation in dynamical systems. Uncertain initial conditions and parameters are both addressed. The method approximates the intermediate short-time flow maps by spectral polynomial bases, as in the generalized polynomial chaos (gPC) method, and uses flow map composition to construct the long-time flow map. In contrast to the gPC method, this approach has spectral error convergence for both short and long integration times. The short-time flow map is characterized by small stretching and folding of the associated trajectories and hence can be well represented by a relatively low-degree basis. The composition of these low-degree polynomial bases then accurately describes the uncertainty behavior for long integration times. The key to the method is that the degree of the resulting polynomial approximation increases exponentially in the number of time intervals, while the number of polynomial coefficients either remains constant (for an autonomous system) or increases linearly in the number of time intervals (for a non-autonomous system). The findings are illustrated on several numerical examples including a nonlinear ordinary differential equation (ODE) with an uncertain initial condition, a linear ODE with an uncertain model parameter, and a two-dimensional, non-autonomous double gyre flow.
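
    The composition idea can be sketched with a scalar ODE whose flow map is known analytically: fit a low-degree polynomial surrogate to the short-time flow map only, then compose it many times to reach long integration times. This toy uses an ordinary polynomial fit in place of a gPC basis; the ODE, degree, and step count are illustrative assumptions.

```python
import numpy as np

def exact_flow(x0, t):
    # analytic flow map of dx/dt = -x**3
    return x0 / np.sqrt(1.0 + 2.0 * t * x0 ** 2)

dt, n_steps = 0.1, 50
xs = np.linspace(-1.0, 1.0, 200)

# Low-degree polynomial surrogate of the SHORT-time flow map only
coeffs = np.polyfit(xs, exact_flow(xs, dt), deg=5)

def composed_flow(x, n):
    # long-time map = n-fold composition of the short-time surrogate
    for _ in range(n):
        x = np.polyval(coeffs, x)
    return x

x0 = 0.8  # one sample of an uncertain initial condition
approx = composed_flow(x0, n_steps)
exact = exact_flow(x0, dt * n_steps)
```

    A direct polynomial fit of the long-time map would need a much higher degree to stay accurate; composing the short-time surrogate keeps the stored coefficient count fixed, mirroring the scaling argument in the abstract.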

  18. Decisions on new product development under uncertainties

    NASA Astrophysics Data System (ADS)

    Huang, Yeu-Shiang; Liu, Li-Chen; Ho, Jyh-Wen

    2015-04-01

    In an intensively competitive market, developing a new product has become a valuable strategy for companies to establish their market positions and enhance their competitive advantages. Therefore, it is essential to effectively manage the process of new product development (NPD). However, since various problems may arise in NPD projects, managers should set up milestones and construct evaluative mechanisms to assess their feasibility. This paper employed the approach of Bayesian decision analysis to deal with the two crucial uncertainties for NPD: the future market share and the responses of competitors. The proposed decision process provides a systematic analytical procedure to determine whether an NPD project should be continued or not, while considering whether effective use is being made of organisational resources. Accordingly, the proposed decision model can assist managers in effectively addressing NPD decisions in a competitive market.
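
    A minimal sketch of the kind of Bayesian continue/stop milestone rule described above, using a conjugate Beta-Bernoulli update of the market-share belief. The conjugate model, payoff numbers, and trial data are all illustrative assumptions, not the paper's model:

```python
# Prior belief about market share: Beta(2, 8), mean 0.2 (assumed)
a_prior, b_prior = 2, 8
adopters, trial_users = 30, 100        # hypothetical evidence from a market trial

# Conjugate Beta-Bernoulli posterior update
a_post = a_prior + adopters
b_post = b_prior + trial_users - adopters
share = a_post / (a_post + b_post)     # posterior mean market share

revenue_per_unit_share = 1_000_000     # hypothetical payoff scale
remaining_dev_cost = 150_000
expected_profit = share * revenue_per_unit_share - remaining_dev_cost
decision = "continue" if expected_profit > 0 else "stop"
```

    The milestone decision then reduces to comparing the posterior expected payoff against the cost of committing further organisational resources.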

  19. Communicating uncertainties in natural hazard forecasts

    NASA Astrophysics Data System (ADS)

    Stein, Seth; Geller, Robert J.

    2012-09-01

    Natural hazards research seeks to help society develop strategies that appropriately balance risks and mitigation costs in addressing potential imminent threats and possible longer-term hazards. However, because scientists have only limited knowledge of the future, they must also communicate the uncertainties in what they know about the hazards. How to do so has been the subject of extensive recent discussion [Sarewitz et al., 2000; Oreskes, 2000; Pilkey and Pilkey-Jarvis, 2006]. One approach is General Colin Powell's charge to intelligence officers [Powell, 2012]: "Tell me what you know. Tell me what you don't know. Then tell me what you think. Always distinguish which is which." In dealing with natural hazards, the last point can be modified to "which is which and why." To illustrate this approach, it is helpful to consider some successful and unsuccessful examples [Stein, 2010; Stein et al., 2012].

  20. Reducing long-term reservoir performance uncertainty

    SciTech Connect

    Lippmann, M.J.

    1988-04-01

    Reservoir performance is one of the key issues that have to be addressed before going ahead with the development of a geothermal field. In order to select the type and size of the power plant and design other surface installations, it is necessary to know the characteristics of the production wells and of the produced fluids, and to predict the changes over a 10--30 year period. This is not a straightforward task, as in most cases the calculations have to be made on the basis of data collected before significant fluid volumes have been extracted from the reservoir. The paper describes the methodology used in predicting the long-term performance of hydrothermal systems, as well as DOE/GTD-sponsored research aimed at reducing the uncertainties associated with these predictions. 27 refs., 1 fig.

  1. Optimal fingerprinting under multiple sources of uncertainty

    NASA Astrophysics Data System (ADS)

    Hannart, Alexis; Ribes, Aurélien; Naveau, Philippe

    2014-02-01

    Detection and attribution studies routinely use linear regression methods referred to as optimal fingerprinting. Within this methodological paradigm, it is usually recognized that multiple sources of uncertainty affect both the observations and the simulated climate responses used as regressors. These include, for instance, internal variability, climate model error, and observational error. When all errors share the same covariance, the statistical inference is usually performed with the so-called total least squares procedure, but to date no inference procedure is readily available in the climate literature to treat the general case where this assumption does not hold. Here we address this deficiency. After a brief overview of the errors-in-variables models literature, we describe an inference procedure based on likelihood maximization, inspired by a recent article dealing with a similar situation in geodesy. We evaluate the performance of our approach via an idealized test bed. We find the procedure outperforms existing procedures when the latter wrongly neglect some sources of uncertainty.

  2. Uncertainty quantification in reacting flow modeling.

    SciTech Connect

    Le Maître, Olivier P.; Reagan, Matthew T.; Knio, Omar M.; Ghanem, Roger Georges; Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
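
    The non-intrusive construction can be sketched for a single Gaussian uncertain parameter: run the deterministic model at quadrature nodes and project onto probabilists' Hermite polynomials to recover the PC mode strengths. The toy model function and truncation order are assumptions for illustration:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

def model(xi):
    # Stand-in for one deterministic solver run with uncertain input xi ~ N(0, 1)
    return np.exp(0.3 * xi)

# Gauss-Hermite (probabilists') quadrature for the projection integrals
nodes, weights = hermegauss(20)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the standard normal measure

P = 6  # truncation order
coeffs = []
for k in range(P + 1):
    he_k = hermeval(nodes, [0.0] * k + [1.0])          # He_k evaluated at the nodes
    # c_k = E[f(xi) He_k(xi)] / E[He_k(xi)^2], with E[He_k^2] = k!
    coeffs.append(np.sum(weights * model(nodes) * he_k) / math.factorial(k))

mean_pc = coeffs[0]                                     # PC mean = zeroth mode
var_pc = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, P + 1))
```

    For this lognormal toy case the PC mean and variance can be checked against the closed-form moments, which is exactly the kind of verification the intrusive and non-intrusive variants are compared on.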

  3. Visualizing uncertainty in biological expression data

    NASA Astrophysics Data System (ADS)

    Holzhüter, Clemens; Lex, Alexander; Schmalstieg, Dieter; Schulz, Hans-Jörg; Schumann, Heidrun; Streit, Marc

    2012-01-01

    Expression analysis of ~omics data using microarrays has become a standard procedure in the life sciences. However, microarrays are subject to technical limitations and errors, which render the gathered data uncertain. While a number of approaches exist to treat this uncertainty statistically, it is hardly ever shown when the data are visualized, for example in clustered heatmaps. Yet, doing so is highly useful when trying not to omit data that are "good enough" for an analysis but would otherwise be discarded as too unreliable by established conservative thresholds. Our approach addresses this shortcoming by first identifying the margin above the error threshold of uncertain, yet possibly still useful, data. It then displays this uncertain data in the context of the valid data by enhancing a clustered heatmap, employing different visual representations for the different kinds of uncertainty involved. Finally, it lets the user interactively adjust the thresholds, giving visual feedback in the heatmap representation, so that an informed choice of thresholds can be made instead of applying the usual rule-of-thumb cut-offs. We exemplify the usefulness of our concept with a concrete use case from our partners at the Medical University of Graz, thereby demonstrating our implementation of the general approach.

  4. Uncertainty in Vs30-based site response

    USGS Publications Warehouse

    Thompson, Eric; Wald, David J.

    2016-01-01

    Methods that account for site response range in complexity from simple linear categorical adjustment factors to sophisticated nonlinear constitutive models. Seismic‐hazard analysis usually relies on ground‐motion prediction equations (GMPEs); within this framework site response is modeled statistically with simplified site parameters that include the time‐averaged shear‐wave velocity to 30 m (VS30) and basin depth parameters. Because VS30 is not known in most locations, it must be interpolated or inferred through secondary information such as geology or topography. In this article, we analyze a subset of stations for which VS30 has been measured to address effects of VS30 proxies on the uncertainty in the ground motions as modeled by GMPEs. The stations we analyze also include multiple recordings, which allow us to compute the repeatable site effects (or empirical amplification factors [EAFs]) from the ground motions. Although all methods exhibit similar bias, the proxy methods only reduce the ground‐motion standard deviations at long periods when compared to GMPEs without a site term, whereas measured VS30 values reduce the standard deviations at all periods. The standard deviations of the ground motions are much lower when the EAFs are used, indicating that future refinements of the site term in GMPEs have the potential to substantially reduce the overall uncertainty in the prediction of ground motions by GMPEs.

  5. Scheduling Future Water Supply Investments Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2014-12-01

    Uncertain hydrological impacts of climate change, population growth and institutional changes pose a major challenge to the planning of water supply systems. Planners seek optimal portfolios of supply and demand management schemes, but must also decide when to activate assets whilst considering many system goals and plausible futures. Incorporating scheduling into the planning-under-uncertainty problem strongly increases its complexity. We investigate some approaches to scheduling with many-objective heuristic search. We apply a multi-scenario many-objective scheduling approach to the Thames River basin water supply system planning problem in the UK. Decisions include which new supply and demand schemes to implement, at what capacity and when. The impact of different system uncertainties on scheme implementation schedules is explored, i.e. how the choice of future scenarios affects the search process and its outcomes. The activation of schemes is influenced by the occurrence of extreme hydrological events in the ensemble of plausible scenarios, among other factors. The approach and results are compared with a previous study in which only the portfolio problem was addressed (without scheduling).

  6. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially in non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  7. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making.

  8. Economic Value Of Accurate Assessments Of Hydrological Uncertainty

    NASA Astrophysics Data System (ADS)

    Ajami, N. K.; Sunding, D. L.; Hornberger, G. M.

    2008-12-01

    The improvement of techniques to assist in the sustainable management of water resource systems is a crucial issue, since our limited resources are under ever-increasing pressure. A proper understanding of the sources and effects of uncertainty is needed to achieve goals related to improvements in reliability and sustainability in water resource management and planning. To date, many hydrological techniques have been developed to improve the quality and accuracy of hydrological forecasts and to assess the uncertainty associated with these forecasts. However, the economic value of improvements in calculations of uncertainty associated with hydrological forecasts, from the water supply and demand management perspective, remains largely unknown. We first explore the effect of more accurate assessments of hydrological uncertainty on the management of water resources by using an integrated approach to identify and quantify the sources of uncertainty. Subsequently, we analyze the value of a more reliable water supply forecast by studying the change in moments of the distribution of final surface water deliveries. This allows us to calculate the economic value of improving the information about uncertainty provided to stakeholders, especially during drought spells.

  9. Seabed variability and its influence on acoustic prediction uncertainty

    NASA Astrophysics Data System (ADS)

    Holland, Charles W.; Calder, Brian; Kraft, Barbara; Mayer, Larry; Goff, John; Harrison, Chris

    2005-09-01

    Additional collaborators: Kevin LePage (Naval Research Laboratory, Washington, DC), Robert I. Odom (University of Washington, Applied Physics Laboratory), Irina Overeem, James Syvitski (University of Colorado, INSTAAR, Boulder, CO), and Lincoln Pratson (Duke University, Durham, NC). The weakest link in performance prediction for naval systems operating in coastal regions is the environmental data that drive the models. In shallow-water downward-refracting environments, the seabed properties and morphology often are the controlling environmental factors. In order to address the issue of uncertainty in seabed properties, we focused on two overarching goals: (1) assess and characterize seafloor variability in shelf environments, and (2) determine the impact of the seafloor variability on acoustic prediction uncertainty. Our inherently multidisciplinary approach brought marine geology/geophysics and ocean acoustics together at the intersection of geoacoustic modeling. This talk will review results from a 3-year collaboration under the ONR Capturing Uncertainty DRI. [Work supported by the Office of Naval Research.]

  10. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2015-11-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from [Berta et al., Nat. Phys. 6, 659 (2010)] is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the "uncertainty witness" lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from [Coles et al., Phys. Rev. Lett. 108, 210405 (2012)] makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM Quantum Experience and find reasonable agreement between our predictions and experimental outcomes.
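
    For reference, the bipartite EUR-QSI discussed above is commonly stated as follows (a sketch: X and Z are the outcomes of two incompatible measurements on system A, B is the quantum memory, and c is the maximum overlap of the two measurement bases):

```latex
H(X|B) + H(Z|B) \ge \log_2 \frac{1}{c} + H(A|B),
\qquad
c = \max_{x,z} \left| \langle \phi_x | \psi_z \rangle \right|^2
```

    Entanglement makes the conditional entropy H(A|B) negative, which is how the relation ties measurement incompatibility to entanglement; the tightening described in this work adds a further state-dependent term to the right-hand side.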

  11. Uncertainty Quantification for Airfoil Icing

    NASA Astrophysics Data System (ADS)

    DeGennaro, Anthony Matteo

    Ensuring the safety of airplane flight in icing conditions is an important and active arena of research in the aerospace community. Notwithstanding the research, development, and legislation aimed at certifying airplanes for safe operation, an analysis of the effects of icing uncertainties on certification quantities of interest is generally lacking. The central objective of this thesis is to examine and analyze problems in airfoil ice accretion from the standpoint of uncertainty quantification. We focus on three distinct areas: user-informed, data-driven, and computational uncertainty quantification. In the user-informed approach to uncertainty quantification, we discuss important canonical icing classifications and show how these categories can be modeled using a few shape parameters. We then investigate the statistical effects of these parameters. In the data-driven approach, we build statistical models of airfoil ice shapes from databases of actual ice shapes, and quantify the effects of these parameters. Finally, in the computational approach, we investigate the effects of uncertainty in the physics of the ice accretion process, by perturbing the input to an in-house numerical ice accretion code that we develop in this thesis.

  12. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.

  13. Cascading rainfall uncertainty into flood inundation impact models

    NASA Astrophysics Data System (ADS)

    Souvignet, Maxime; Freer, Jim E.; de Almeida, Gustavo A. M.; Coxon, Gemma; Neal, Jeffrey C.; Champion, Adrian J.; Cloke, Hannah L.; Bates, Paul D.

    2014-05-01

    Observed and numerical weather prediction (NWP) simulated precipitation products typically show differences in their spatial and temporal distribution. These differences can considerably influence the ability to predict hydrological responses. For flood inundation impact studies, as in forecast situations, an atmospheric-hydrologic-hydraulic model chain is needed to quantify the extent of flood risk. Uncertainties cascaded through the model chain are seldom explored and, more importantly, how potential input uncertainties propagate through this cascade, and how best to approach this, is still poorly understood. This requires a combination of modelling capabilities: the non-linear transformation of rainfall to river flow using rainfall-runoff models, and finally the hydraulic flood wave propagation based on the runoff predictions. Improving the characterisation of uncertainty in each component, and identifying what is important to include, is key to quantifying impacts and understanding flood risk for different return periods. In this paper, we address this issue by i) exploring the effects of errors in rainfall on inundation predictive capacity within an uncertainty framework, testing inundation uncertainty against different comparable meteorological conditions (i.e. using different rainfall products), and ii) testing different techniques to cascade uncertainties (e.g. bootstrapping, PPU envelope) within the GLUE (generalised likelihood uncertainty estimation) framework. Our method cascades rainfall uncertainties into multiple rainfall-runoff model structures using the Framework for Understanding Structural Errors (FUSE). The resultant prediction uncertainties in upstream discharge provide uncertain boundary conditions that are cascaded into a simplified shallow water hydraulic model (LISFLOOD-FP). Rainfall data captured by three different measurement techniques - rain gauges, gridded radar data and numerical weather prediction (NWP) models - are evaluated.
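
    The GLUE step of such a cascade can be sketched in a few lines: sample parameter sets, weight them by an informal likelihood against an observation, keep the "behavioural" sets, and read weighted prediction bounds off the resulting distribution. The toy rainfall-runoff model, likelihood, and threshold here are illustrative assumptions, not the FUSE/LISFLOOD-FP setup:

```python
import numpy as np

rng = np.random.default_rng(2)

def rr_model(rain_mm, k):
    # Toy rainfall-runoff model: runoff coefficient k times rainfall
    return k * rain_mm

obs_rain, obs_flow = 10.0, 6.0
k_samples = rng.uniform(0.1, 1.0, 5000)        # sampled parameter sets
sim = rr_model(obs_rain, k_samples)
lik = np.exp(-0.5 * (sim - obs_flow) ** 2)     # informal GLUE likelihood
behavioural = lik > 0.1 * lik.max()            # keep 'behavioural' parameter sets
w = lik[behavioural] / lik[behavioural].sum()

# Weighted 5-95% prediction bounds for a new rainfall input
new_sim = rr_model(12.0, k_samples[behavioural])
order = np.argsort(new_sim)
cdf = np.cumsum(w[order])
lower = new_sim[order][np.searchsorted(cdf, 0.05)]
upper = new_sim[order][np.searchsorted(cdf, 0.95)]
```

    The resulting (lower, upper) envelope is what would be passed downstream as uncertain boundary conditions for the hydraulic model.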

  14. Uncertainty Analysis for RELAP5-3D

    SciTech Connect

    Aaron J. Pawel; Dr. George L. Mesina

    2011-08-01

    In its current state, RELAP5-3D is a 'best-estimate' code; it is one of our most reliable programs for modeling what occurs within reactor systems in transients from given initial conditions. This code, however, remains an estimator. A statistical analysis has been performed that begins to lay the foundation for a full uncertainty analysis. By varying the inputs over assumed probability density functions, the output parameters were shown to vary. Using statistical tools such as means, variances, and tolerance intervals, a picture has been obtained of how uncertain the results are, given the uncertainty in the inputs.
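
    A generic sketch of this style of analysis (not the actual RELAP5-3D study): propagate assumed input PDFs through a stand-in response function, then summarize the outputs with a mean, a sample standard deviation, and a nonparametric tolerance bound. The response surface, input distributions, and use of Wilks' 59-run rule are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_code(power, flow):
    # Stand-in for one best-estimate code run (hypothetical linear response)
    return 600.0 + 2.0 * power - 1.5 * flow

n = 59  # Wilks' formula: 59 runs give a one-sided 95%/95% tolerance bound
power = rng.normal(100.0, 2.0, n)   # assumed input PDFs
flow = rng.normal(50.0, 1.0, n)
peaks = surrogate_code(power, flow)

mean, std = peaks.mean(), peaks.std(ddof=1)
# The maximum of 59 runs bounds the 95th percentile with 95% confidence
bound_95_95 = peaks.max()
```

    The appeal of the nonparametric bound is that it needs no assumption about the shape of the output distribution, only the number of runs.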

  15. An address geocoding solution for Chinese cities

    NASA Astrophysics Data System (ADS)

    Zhang, Xuehu; Ma, Haoming; Li, Qi

    2006-10-01

    We introduce the challenges of address geocoding for Chinese cities and present a potential solution, along with a prototype system, that deals with these challenges by combining and extending current geocoding solutions developed for the United States and Japan. The proposed solution starts by separating city addresses into "standard" addresses, which meet a predefined address model, and non-standard ones. The standard addresses are stored in a structured relational database in their normalized forms, while a selected portion of the non-standard addresses are stored as aliases to the standard addresses. An in-memory address index is then constructed from the address database and serves as the basis for real-time address matching. Test results were obtained from two trials conducted in the city of Beijing; on average, an 80% matching rate was achieved. Possible improvements to the current design are also discussed.
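
    The normalized-form-plus-aliases lookup described above can be sketched as a two-tier in-memory index. All addresses, coordinates, and the normalization scheme below are hypothetical illustrations of the idea, not the prototype's actual data model:

```python
# Standard addresses stored in normalized form (hypothetical entries)
standard = {
    "beijing/haidian/zhongguancun dajie/1": (39.98, 116.31),
}
# Selected non-standard forms stored as aliases to standard addresses
aliases = {
    "beijing/haidian/zgc street/1": "beijing/haidian/zhongguancun dajie/1",
}

def normalize(raw):
    # Stand-in for parsing a raw address against the address model
    return "/".join(part.strip().lower() for part in raw.split(","))

def geocode(raw):
    key = normalize(raw)
    key = aliases.get(key, key)   # resolve an alias to its standard form first
    return standard.get(key)      # None signals no match
```

    Real-time matching then reduces to two dictionary lookups per query, which is what makes an in-memory index attractive for this workload.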

  16. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. This Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. This uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.
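
    The Monte Carlo idea here, sampling over a library of damage functions to expose the spread they induce, can be sketched as follows. The depth-damage curves and asset value below are hypothetical stand-ins for the 272-function library used in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical library of depth-damage functions: damage fraction vs depth (m)
damage_functions = [
    lambda d, a=a: 1.0 - np.exp(-a * d) for a in np.linspace(0.2, 1.2, 7)
]

def mc_damage(depth_m, max_damage_eur=200_000, n=10_000):
    # Each Monte Carlo draw picks one damage function at random
    picks = rng.integers(0, len(damage_functions), n)
    return np.array([damage_functions[i](depth_m) for i in picks]) * max_damage_eur

estimates = mc_damage(depth_m=1.0)
spread = estimates.max() / estimates.min()  # factor between extreme curve choices
```

    Even this crude library reproduces a spread of a few times between the most and least damaging curves at a 1 m depth, the same order of magnitude as the factor 2 to 5 reported above.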

  17. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.

    2015-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify this uncertainty with a Monte Carlo Analysis. As input the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties in the order of magnitude of a factor 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economic optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investments.

  18. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  19. Sub-Heisenberg phase uncertainties

    NASA Astrophysics Data System (ADS)

    Pezzé, Luca

    2013-12-01

    Phase shift estimation with uncertainty below the Heisenberg limit, ΔϕHL∝1/N¯T, where N¯T is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty Δϕ∝1/N¯Tk, with k an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility to overcome ΔϕHL at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.
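The role of bias can be illustrated with a toy estimator (not the Rivas and Luis scheme, and with all numbers hypothetical): shrinking estimates toward a fixed guess reduces their spread well below the nominal noise level, but the apparent gain vanishes once the bias enters the total (RMSE) error, mirroring the abstract's point that sub-Heisenberg spreads appear only for strongly biased estimators.

```python
import math
import random

rng = random.Random(0)
true_phase = 0.30
sigma = 0.10  # standard deviation of an unbiased phase estimate

def shrunk_estimate():
    # Strong pull toward the fixed guess 0.25 lowers variance but adds bias.
    raw = rng.gauss(true_phase, sigma)
    return 0.2 * raw + 0.8 * 0.25

samples = [shrunk_estimate() for _ in range(50_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
rmse = math.sqrt(var + (mean - true_phase) ** 2)  # bias-aware total error

print(math.sqrt(var), rmse)  # the spread looks "sub-sigma"; the RMSE is larger
```

Because the bias depends on the unknown true phase, the small spread cannot be cashed in as genuine precision once bias correction is applied.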

  20. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  1. Uncertainty and precaution in environmental management.

    PubMed

    Krayer von Krauss, M; van Asselt, M B A; Henze, M; Ravetz, J; Beck, M B

    2005-01-01

    In this paper, two different visions of the relationship between science and policy are contrasted with one another: the "modern" vision and the "precautionary" vision. Conditions which must apply in order to invoke the Precautionary Principle are presented, as are some of the main challenges posed by the principle. The following central question remains: If scientific certainty cannot be provided, what may then justify regulatory interventions, and what degree of intervention is justifiable? The notion of "quality of information" is explored, and it is emphasized that there can be no absolute definition of good or bad quality. Collective judgments of quality are only possible through deliberation on the characteristics of the information, and on the relevance of the information to the policy context. Reference to a relative criterion therefore seems inevitable and legal complexities are to be expected. Uncertainty is presented as a multidimensional concept, reaching far beyond the conventional statistical interpretation of the concept. Of critical importance is the development of methods for assessing qualitative categories of uncertainty. Model quality assessment should observe the following rationale: identify a model that is suited to the purpose, yet bears some reasonable resemblance to the "real" phenomena. In this context, "purpose" relates to the policy and societal contexts in which the assessment results are to be used. It is therefore increasingly agreed that judgment of the quality of assessments necessarily involves the participation of non-modellers and non-scientists. A challenging final question is: How to use uncertainty information in policy contexts? More research is required in order to answer this question.

  2. Inverse covariance simplification for efficient uncertainty management

    NASA Astrophysics Data System (ADS)

    Jalobeanu, A.; Gutiérrez, J. A.

    2007-11-01

    When it comes to manipulating uncertain knowledge such as noisy observations of physical quantities, one may ask how to do it in a simple way. Processing corrupted signals or images always propagates the uncertainties from the data to the final results, whether these errors are explicitly computed or not. When such error estimates are provided, it is crucial to handle them in such a way that their interpretation, or their use in subsequent processing steps, remain user-friendly and computationally tractable. A few authors follow a Bayesian approach and provide uncertainties as an inverse covariance matrix. Despite its apparent sparsity, this matrix contains many small terms that carry little information. Methods have been developed to select the most significant entries, through the use of information-theoretic tools for instance. One has to find a Gaussian pdf that is close enough to the posterior pdf, and with a small number of non-zero coefficients in the inverse covariance matrix. We propose to restrict the search space to Markovian models (where only neighbors can interact), well-suited to signals or images. The originality of our approach is in conserving the covariances between neighbors while setting to zero the entries of the inverse covariance matrix for all other variables. This fully constrains the solution, and the computation is performed via a fast, alternate minimization scheme involving quadratic forms. The Markovian structure advantageously reduces the complexity of Bayesian updating (where the simplified pdf is used as a prior). Moreover, uncertainties exhibit the same temporal or spatial structure as the data.
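The value of restricting the search to Markovian models can be seen in a standard example (not from the paper): a stationary Gaussian AR(1) signal has a dense covariance but an exactly tridiagonal inverse covariance, so only neighbouring variables interact directly and non-neighbour precision entries carry no information.

```python
import numpy as np

# AR(1) covariance: every pair of variables is correlated (dense matrix),
# yet the inverse covariance (precision) is tridiagonal, i.e. Markovian.
rho, n = 0.8, 6
cov = np.array([[rho ** abs(i - j) for j in range(n)] for i in range(n)])
prec = np.linalg.inv(cov)

# Precision entries beyond the first off-diagonal vanish (up to round-off),
# while the covariance itself still couples the most distant variables.
off = np.triu(np.abs(prec), k=2)
print(off.max(), cov[0, -1])
```

For such signals, zeroing the non-neighbour entries of the inverse covariance, as the proposed scheme does, discards nothing.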

  3. Groundwater management under sustainable yield uncertainty

    NASA Astrophysics Data System (ADS)

    Delottier, Hugo; Pryet, Alexandre; Dupuy, Alain

    2015-04-01

    groundwater systems. For predictive analysis of the SY to be realistic for real world problems, we test a calibration method based on the Gauss-Levenberg-Marquardt algorithm. Our results highlight that the analysis of the SY predictive uncertainty is essential for groundwater management. This uncertainty is expected to be large and can be addressed with better a priori information on model parameter values.

  4. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.

  5. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  6. Awe, uncertainty, and agency detection.

    PubMed

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.

  7. Adaptive Control Law Design for Model Uncertainty Compensation

    DTIC Science & Technology

    1989-06-14

    AD-A211 712; WRDC-TR-89-3061. ADAPTIVE CONTROL LAW DESIGN FOR MODEL UNCERTAINTY COMPENSATION. J. E. Sorrells, Dynetics, Inc., 1000 Explorer Blvd. Monitoring organization: Wright Research and Development Center, Flight Dynamics Laboratory, AFSC. Controllers designed using Dynetics' innovative approach were able to equal or surpass the STR and MRAC controllers in terms of performance robustness.

  8. Estimating Uncertainties in the Multi-Instrument SBUV Profile Ozone Merged Data Set

    NASA Technical Reports Server (NTRS)

    Frith, Stacey; Stolarski, Richard

    2015-01-01

    The MOD data set is uniquely qualified for use in long-term ozone analysis because of its long record, high spatial coverage, and consistent instrument design and algorithm. The estimated MOD uncertainty term significantly increases the uncertainty over the statistical error alone. Trends in the post-2000 period are generally positive in the upper stratosphere, but only significant at 1-1.6 hPa. Remaining uncertainties not yet included in the Monte Carlo model are: smoothing error (~1 from 10 to 1 hPa); relative calibration uncertainty between N11 and N17; and seasonal cycle differences between SBUV records.

  9. Uncertainty in Bohr's response to the Heisenberg microscope

    NASA Astrophysics Data System (ADS)

    Tanona, Scott

    2004-09-01

    In this paper, I analyze Bohr's account of the uncertainty relations in Heisenberg's gamma-ray microscope thought experiment and address the question of whether Bohr thought uncertainty was epistemological or ontological. Bohr's account seems to allow that the electron being investigated has definite properties which we cannot measure, but other parts of his Como lecture seem to indicate that he thought that electrons are wave-packets which do not have well-defined properties. I argue that his account merges the ontological and epistemological aspects of uncertainty. However, Bohr reached this conclusion not from positivism, as perhaps Heisenberg did, but because he was led to that conclusion by his understanding of the physics in terms of nonseparability and the correspondence principle. Bohr argued that the wave theory from which he derived the uncertainty relations was not to be taken literally, but rather symbolically, as an expression of the limited applicability of classical concepts to parts of entangled quantum systems. Complementarity and uncertainty are consequences of the formalism, properly interpreted, and not something brought to the physics from external philosophical views.

  10. An Uncertainty-Aware Approach for Exploratory Microblog Retrieval.

    PubMed

    Liu, Mengchen; Liu, Shixia; Zhu, Xizhou; Liao, Qinying; Wei, Furu; Pan, Shimei

    2016-01-01

    Although there has been a great deal of interest in analyzing customer opinions and breaking news in microblogs, progress has been hampered by the lack of an effective mechanism to discover and retrieve data of interest from microblogs. To address this problem, we have developed an uncertainty-aware visual analytics approach to retrieve salient posts, users, and hashtags. We extend an existing ranking technique to compute a multifaceted retrieval result: the mutual reinforcement rank of a graph node, the uncertainty of each rank, and the propagation of uncertainty among different graph nodes. To illustrate the three facets, we have also designed a composite visualization with three visual components: a graph visualization, an uncertainty glyph, and a flow map. The graph visualization with glyphs, the flow map, and the uncertainty analysis together enable analysts to effectively find the most uncertain results and interactively refine them. We have applied our approach to several Twitter datasets. Qualitative evaluation and two real-world case studies demonstrate the promise of our approach for retrieving high-quality microblog data.
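The mutual-reinforcement rank at the core of the approach can be sketched as power-iteration-style updates on a small, hypothetical user-post graph (the paper's method additionally attaches an uncertainty to each rank and propagates it between nodes):

```python
import math

# Hypothetical bipartite user-post graph: a user's score feeds the posts
# they authored, and a post's score feeds back to its authors.
posts_by_user = {"u1": ["p1", "p2"], "u2": ["p2", "p3"], "u3": ["p3"]}
users = sorted(posts_by_user)
posts = sorted({p for ps in posts_by_user.values() for p in ps})

u_score = {u: 1.0 for u in users}
for _ in range(50):  # iterate the mutual updates until scores stabilize
    p_score = {p: sum(u_score[u] for u in users if p in posts_by_user[u])
               for p in posts}
    u_score = {u: sum(p_score[p] for p in posts_by_user[u]) for u in users}
    norm = math.hypot(*u_score.values())  # normalize to keep scores bounded
    u_score = {u: s / norm for u, s in u_score.items()}

print(max(p_score, key=p_score.get))  # the most mutually reinforced post
```

This is plain mutual-reinforcement ranking; extending it with per-rank uncertainty glyphs and uncertainty propagation is what the visual-analytics approach adds.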

  11. Modelling ecosystem service flows under uncertainty with stochastic SPAN

    USGS Publications Warehouse

    Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.

    2012-01-01

    Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.

  12. Estimation of uncertainty for fatigue growth rate at cryogenic temperatures

    NASA Astrophysics Data System (ADS)

    Nyilas, Arman; Weiss, Klaus P.; Urbach, Elisabeth; Marcinek, Dawid J.

    2014-01-01

    Fatigue crack growth rate (FCGR) measurement data for high-strength austenitic alloys in cryogenic environments generally suffer from a high degree of scatter, in particular in the ΔK regime below 25 MPa√m. Standard mathematical smoothing techniques ultimately force a linear relationship in the stage II regime (crack propagation rate versus ΔK) in a double-log plot, known as the Paris law; however, the bandwidth of uncertainty depends somewhat arbitrarily on the researcher's interpretation. The present paper applies to FCGR data the uncertainty concept of the GUM (Guide to the Expression of Uncertainty in Measurement), which since 1993 has been the recommended procedure for avoiding subjective estimation of error bands. In the absence of a true value, the best estimate is evaluated by a statistical method that uses the crack propagation law as the mathematical measurement model equation and identifies all input parameters. Each parameter entering the measurement was processed under a Gaussian distribution law, with partial differentiation of the model terms yielding the sensitivity coefficients. The standard uncertainty determined for each term, combined with its computed sensitivity coefficient, finally yields the measurement uncertainty of the FCGR test result. The described uncertainty procedure has been applied, within the framework of ITER, to a recent FCGR measurement of a high-strength, high-toughness Type 316LN material tested at 7 K using a standard ASTM proportional compact tension specimen. The determined best estimates of the Paris law constants, such as C0 and the exponent m, along with their uncertainty values, may serve as a realistic basis for the life expectancy of cyclically loaded members.
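The combination step described here (Gaussian inputs, sensitivity coefficients from partial derivatives) can be sketched for the Paris law model; the numbers below are illustrative placeholders, not the paper's measured values, and the often strong correlation between C0 and m is neglected for simplicity:

```python
import math

def paris_rate(C0, m, dK):
    """Paris law: da/dN = C0 * (ΔK)^m."""
    return C0 * dK ** m

def combined_uncertainty(C0, u_C0, m, u_m, dK):
    """GUM-style combined standard uncertainty of da/dN, from analytic
    sensitivity coefficients, assuming uncorrelated Gaussian inputs."""
    c_C0 = dK ** m                      # ∂f/∂C0
    c_m = C0 * dK ** m * math.log(dK)   # ∂f/∂m
    return math.hypot(c_C0 * u_C0, c_m * u_m)

# Illustrative (not measured) values for a cryogenic austenitic steel:
C0, u_C0 = 1.0e-11, 2.0e-12   # m/cycle, with ΔK in MPa·√m
m, u_m = 3.0, 0.1
dK = 20.0
rate = paris_rate(C0, m, dK)
u_c = combined_uncertainty(C0, u_C0, m, u_m, dK)
print(rate, u_c)
```

A full GUM evaluation would also carry the covariance term between C0 and m, which typically dominates because the two Paris constants are fitted jointly.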

  13. Quantifying relative uncertainties in the detection and attribution of human-induced climate change on winter streamflow

    NASA Astrophysics Data System (ADS)

    Ahn, Kuk-Hyun; Merwade, Venkatesh; Ojha, C. S. P.; Palmer, Richard N.

    2016-11-01

    Despite recent interest in investigating human-induced climate change at regional scales, the contributors to the relative uncertainties in the process remain unclear. To remedy this, this study presents a statistical framework to quantify relative uncertainties in a detection and attribution study. Primary uncertainty contributors are categorized into three types: climate data, hydrologic, and detection uncertainties. While an ensemble of climate models is used to define climate data uncertainty, hydrologic uncertainty is defined using a Bayesian approach. Before relative uncertainties in the detection and attribution study are quantified, an optimal fingerprint-based detection and attribution analysis is employed to investigate changes in winter streamflow in the Connecticut River Basin, which is located in the Eastern United States. Results indicate that winter streamflow over a period of 64 years (1950-2013) lies outside the range expected from natural variability of climate alone with a 90% confidence interval in the climate models. Investigation of relative uncertainties shows that the uncertainty linked to the climate data is greater than the uncertainty induced by hydrologic modeling. Detection uncertainty, defined as the uncertainty related to the time evolution of the anthropogenic climate change signal in the historical data above the natural internal climate variability (noise), shows that uncertainties in natural internal climate variability (piControl) scenarios may be the source of a significant degree of uncertainty in the regional detection and attribution study.

  14. Experimental warming in a dryland community reduced plant photosynthesis and soil CO2 efflux although the relationship between the fluxes remained unchanged

    USGS Publications Warehouse

    Wertin, Timothy M.; Belnap, Jayne; Reed, Sasha C.

    2016-01-01

    1. Drylands represent our planet's largest terrestrial biome and, due to their extensive area, maintain large stocks of carbon (C). Accordingly, understanding how dryland C cycling will respond to climate change is imperative for accurately forecasting global C cycling and future climate. However, it remains difficult to predict how increased temperature will affect dryland C cycling, as substantial uncertainties surround the potential responses of the two main C fluxes: plant photosynthesis and soil CO2 efflux. In addition to a need for an improved understanding of climate effects on individual dryland C fluxes, there is also notable uncertainty regarding how climate change may influence the relationship between these fluxes. 2. To address this important knowledge gap, we measured a growing season's in situ photosynthesis, plant biomass accumulation, and soil CO2 efflux of mature Achnatherum hymenoides (a common and ecologically important C3 bunchgrass growing throughout western North America) exposed to ambient or elevated temperature (+2°C above ambient, warmed via infrared lamps) for three years. 3. The 2°C increase in temperature caused a significant reduction in photosynthesis, plant growth, and soil CO2 efflux. Of important note, photosynthesis and soil respiration appeared tightly coupled and the relationship between these fluxes was not altered by the elevated temperature treatment, suggesting C fixation's strong control of both above-ground and below-ground dryland C cycling. Leaf water use efficiency was substantially increased in the elevated temperature treatment compared to the control treatment. 4. Taken together, our results suggest notable declines in photosynthesis with relatively subtle warming, reveal strong coupling between above- and below-ground C fluxes in this dryland, and highlight temperature's strong effect on fundamental components of dryland C and water cycles.

  15. An analysis of uncertainties and skill in forecasts of winter storm losses

    NASA Astrophysics Data System (ADS)

    Pardowitz, Tobias; Osinski, Robert; Kruschke, Tim; Ulbrich, Uwe

    2016-11-01

    This paper describes an approach to derive probabilistic predictions of local winter storm damage occurrences from a global medium-range ensemble prediction system (EPS). Predictions of storm damage occurrences are subject to large uncertainty due to meteorological forecast uncertainty (typically addressed by means of ensemble predictions) and uncertainties in modelling weather impacts. The latter uncertainty arises from the fact that local vulnerabilities are not known in sufficient detail to allow for a deterministic prediction of damages, even if the forecasted gust wind speed contains no uncertainty. Thus, to estimate the damage model uncertainty, a statistical model based on logistic regression analysis is employed, relating meteorological analyses to historical damage records. A quantification of the two individual contributions (meteorological and damage model uncertainty) to the total forecast uncertainty is achieved by neglecting individual uncertainty sources and analysing resulting predictions. Results show an increase in forecast skill measured by means of a reduced Brier score if both meteorological and damage model uncertainties are taken into account. It is demonstrated that skilful predictions on district level (dividing the area of Germany into 439 administrative districts) are possible on lead times of several days. Skill is increased through the application of a proper ensemble calibration method, extending the range of lead times for which skilful damage predictions can be made.
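The Brier score used here to measure forecast skill can be computed directly; the district-level forecasts below are hypothetical, with a constant climatological base rate as the reference forecast:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and the
    0/1 occurrence of damage (lower is better)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, outcomes, ref_probs):
    """Skill relative to a reference forecast such as climatology;
    positive values mean the forecast beats the reference."""
    return 1.0 - brier_score(probs, outcomes) / brier_score(ref_probs, outcomes)

# Hypothetical probabilistic forecasts of storm-damage occurrence
# in five districts, with the observed 0/1 outcomes:
probs    = [0.9, 0.2, 0.7, 0.1, 0.4]
outcomes = [1,   0,   1,   0,   1]
clim     = [0.4] * 5   # climatological base rate as reference
print(brier_score(probs, outcomes), brier_skill_score(probs, outcomes, clim))
```

The paper's "reduced Brier score" when both uncertainty sources are modelled corresponds to a higher skill score against the same reference.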

  16. Addressing Underrepresentation: Physics Teaching for All

    NASA Astrophysics Data System (ADS)

    Rifkin, Moses

    2016-02-01

    Every physics teacher wants to give his or her students the opportunity to learn physics well. Despite these intentions, certain groups of students—including women and underrepresented minorities (URMs)—are not taking and not remaining in physics. In many cases, these disturbing trends are more significant in physics than in any other science. This is a missed opportunity for our discipline because demographic diversity strengthens science. The question is what we can do about these trends in our classrooms, as very few physics teachers have been explicitly prepared to address them. In this article, I will share some steps that I've taken in my classroom that have moved my class in the right direction. In the words of Nobel Prize-winning physicist Carl Wieman and psychologists Lauren Aguilar and Gregory Walton: "By investing a small amount of class time in carefully designed and implemented interventions, physics teachers can promote greater success among students from diverse backgrounds. Ultimately, we hope such efforts will indeed improve the diversity and health of the physics profession."

  17. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

  18. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  19. Saccade Adaptation and Visual Uncertainty

    PubMed Central

    Souto, David; Gegenfurtner, Karl R.; Schütz, Alexander C.

    2016-01-01

    Visual uncertainty may affect saccade adaptation in two complementary ways. First, an ideal adaptor should take into account the reliability of visual information for determining the amount of correction, predicting that increasing visual uncertainty should decrease adaptation rates. We tested this by comparing observers' direction discrimination and adaptation rates in an intra-saccadic-step paradigm. Second, clearly visible target steps may generate a slower adaptation rate since the error can be attributed to an external cause, instead of an internal change in the visuo-motor mapping that needs to be compensated. We tested this prediction by measuring saccade adaptation to different step sizes. Most remarkably, we found little correlation between estimates of visual uncertainty and adaptation rates and no slower adaptation rates with more visible step sizes. Additionally, we show that for low contrast targets backward steps are perceived as stationary after the saccade, but that adaptation rates are independent of contrast. We suggest that the saccadic system uses different position signals for adapting dysmetric saccades and for generating a trans-saccadic stable visual percept, explaining that saccade adaptation is found to be independent of visual uncertainty. PMID:27252635
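The ideal-adaptor prediction in the first point can be written as a reliability weighting; this minimal sketch (function name and variances hypothetical) shows why higher visual uncertainty should lower adaptation rates:

```python
def ideal_adaptation_gain(var_internal, var_visual):
    """Fraction of a post-saccadic error that an ideal adaptor should
    correct: the share attributable to the internal visuo-motor mapping
    rather than to visual measurement noise (a Kalman-gain-style weight)."""
    return var_internal / (var_internal + var_visual)

# As visual uncertainty grows, the predicted adaptation rate falls:
for var_visual in (0.5, 1.0, 4.0):
    print(round(ideal_adaptation_gain(1.0, var_visual), 2))  # 0.67, 0.5, 0.2
```

The study's finding of little correlation between visual uncertainty and adaptation rates is what makes this otherwise natural prediction remarkable.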

  20. The face of uncertainty eats.

    PubMed

    Corwin, Rebecca L W

    2011-09-01

    The idea that foods rich in fat and sugar may be addictive has generated much interest, as well as controversy, among both scientific and lay communities. Recent research indicates that fatty and sugary food in-and-of itself is not addictive. Rather, the food and the context in which it is consumed interact to produce an addiction-like state. One of the contexts that appears to be important is the intermittent opportunity to consume foods rich in fat and sugar in environments where food is plentiful. Animal research indicates that, under these conditions, intake of the fatty sugary food escalates across time and binge-type behavior develops. However, the mechanisms that account for the powerful effect of intermittency on ingestive behavior have only begun to be elucidated. In this review, it is proposed that intermittency stimulates appetitive behavior that is associated with uncertainty regarding what, when, and how much of the highly palatable food to consume. Uncertainty may stimulate consumption of optional fatty and sugary treats due to differential firing of midbrain dopamine neurons, activation of the stress axis, and involvement of orexin signaling. In short, uncertainty may produce an aversive state that bingeing on palatable food can alleviate, however temporarily. "Food addiction" may not be "addiction" to food at all; it may be a response to uncertainty within environments of food abundance.

  1. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are among the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based one. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments express flood risk in monetary terms (damage estimated for specific situations, or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties, the result of uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties in flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty is attributed to the different input parameters using a variance-based sensitivity analysis.
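A variance-based sensitivity attribution of this kind can be sketched as a first-order index, Var(E[Y|X_i])/Var(Y), estimated by Monte Carlo; the product damage model and the uniform input ranges below are hypothetical illustrations, not values from the study:

```python
import random
import statistics

rng = random.Random(1)

# Hypothetical monetary damage model: damage = depth-damage factor × exposed value.
def damage(depth_factor, value):
    return depth_factor * value

def draw_factor():
    return rng.uniform(0.1, 0.9)      # wide depth-damage uncertainty

def draw_value():
    return rng.uniform(0.8e6, 1.2e6)  # narrower exposure uncertainty (EUR)

def first_order_index(for_factor, n_outer=200, n_inner=200):
    """Estimate Var(E[Y|X_i]) / Var(Y): the share of output variance
    attributable to one input (a first-order sensitivity index)."""
    all_y, cond_means = [], []
    for _ in range(n_outer):
        fixed = draw_factor() if for_factor else draw_value()
        ys = [damage(fixed, draw_value()) if for_factor
              else damage(draw_factor(), fixed) for _ in range(n_inner)]
        all_y.extend(ys)
        cond_means.append(statistics.fmean(ys))
    return statistics.pvariance(cond_means) / statistics.pvariance(all_y)

# With these ranges, the depth-damage relation dominates the total uncertainty:
print(first_order_index(True), first_order_index(False))
```

The indices of all inputs sum to roughly one for such an additive-in-variance setting, which is what lets the total uncertainty be attributed across input parameters.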

  2. The taphonomy of human remains in a glacial environment.

    PubMed

    Pilloud, Marin A; Megyesi, Mary S; Truffer, Martin; Congram, Derek

    2016-04-01

    A glacial environment is a unique setting that can alter human remains in characteristic ways. This study describes glacial dynamics and how glaciers can be understood as taphonomic agents. Using a case study of human remains recovered from Colony Glacier, Alaska, a glacial taphonomic signature is outlined that includes: (1) movement of remains, (2) dispersal of remains, (3) altered bone margins, (4) splitting of skeletal elements, and (5) extensive soft tissue preservation and adipocere formation. As global glacier area is declining in the current climate, there is the potential for more materials of archaeological and medicolegal significance to be exposed. It is therefore important for the forensic anthropologist to have an idea of the taphonomy in this setting and to be able to differentiate glacial effects from other taphonomic agents.

  3. 13. View South, showing the remaining pier footings for the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. View South, showing the remaining pier footings for the steam engine water tower for the Chesapeake and Ohio Railroad. - Cotton Hill Station Bridge, Spanning New River at State Route 16, Cotton Hill, Fayette County, WV

  4. 53. INTERIOR VIEW LOOKING NORTH NORTHEAST SHOWING THE REMAINS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    53. INTERIOR VIEW LOOKING NORTH NORTHEAST SHOWING THE REMAINS OF A WOODEN SETTLING BOX IN THE BACKGROUND RIGHT. AMALGAMATING PANS IN THE FOREGROUND. - Standard Gold Mill, East of Bodie Creek, Northeast of Bodie, Bodie, Mono County, CA

  5. View of submerged remains of Read Sawmill, showing floor boards, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill, showing floor boards, cross beams and notches for wall post beams. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  6. View of submerged remains of Read Sawmill, showing floor boards, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill, showing floor boards, wall boards, tenoned uprights and mortised sill beams. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  7. View of submerged remains of Read Sawmill, with floor boards ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill, with floor boards removed, showing cross beams, foundation sill and mortises, and horizontal wall boards. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  8. View of submerged remains of Read Sawmill with most floorboards ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill with most floorboards removed, showing cross beams with mortises, vertical wall boards, and horizontal floor boards. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  9. [The craniofacial identification of the remains from the Yekaterinburg burial].

    PubMed

    Abramov, S S

    1998-01-01

    Based on expert evaluation of the remains of 7 members of the Imperial Romanov family and 4 persons in their attendance, the author demonstrates methodological approaches to craniocephalic identification studies in cases of group burials.

  10. 4. Band Wheel and Walking Beam Mechanism, Including Remains of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Band Wheel and Walking Beam Mechanism, Including Remains of Frame Belt House, Looking Southeast - David Renfrew Oil Rig, East side of Connoquenessing Creek, 0.4 mile North of confluence with Thorn Creek, Renfrew, Butler County, PA

  11. Looking east inside of casthouse no. 6 at the remains ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Looking east inside of casthouse no. 6 at the remains of slag runner and slag notch of blast furnace no. 6. - U.S. Steel Edgar Thomson Works, Blast Furnace Plant, Along Monongahela River, Braddock, Allegheny County, PA

  12. 7. REMAINS OF PLANK WALL WITHIN CANAL CONSTRUCTED TO PROTECT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. REMAINS OF PLANK WALL WITHIN CANAL CONSTRUCTED TO PROTECT OUTSIDE CANAL BANK, LOOKING SOUTHWEST. NOTE CROSS SUPPORT POLES EXTENDING TO HILLSIDE. - Snake River Ditch, Headgate on north bank of Snake River, Dillon, Summit County, CO

  13. 6. REMAINS OF PLANK WALL NAILED TO POSTS WITHIN CANAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. REMAINS OF PLANK WALL NAILED TO POSTS WITHIN CANAL CONSTRUCTED TO PROTECT OUTSIDE CANAL BANK. VIEW IS TO THE WEST. - Snake River Ditch, Headgate on north bank of Snake River, Dillon, Summit County, CO

  14. 1. SOUTHWEST FRONT AND SOUTHEAST SIDE OF BLACKSMITH SHOP REMAINS, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. SOUTHWEST FRONT AND SOUTHEAST SIDE OF BLACKSMITH SHOP REMAINS, TENANT HOUSE IN BACKGROUND - Mount Etna Iron Works, Blacksmith Shop, East of U.S. Route 22 on T.R. 463, Williamsburg, Blair County, PA

  15. 21. REMAINS OF HOP BAILING CHUTE ON SECOND FLOOR; THIS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. REMAINS OF HOP BAILING CHUTE ON SECOND FLOOR; THIS CHUTE EXTENDS TO THE GROUND FLOOR. - James W. Seavey Hop Driers, 0.6 mile East from junction of Highway 99 & Alexander Avenue, Corvallis, Benton County, OR

  16. 20. REMAINS OF HOP BAILING CHUTE ON GROUND FLOOR; THIS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    20. REMAINS OF HOP BAILING CHUTE ON GROUND FLOOR; THIS CHUTE EXTENDS TO THE SECOND FLOOR. - James W. Seavey Hop Driers, 0.6 mile East from junction of Highway 99 & Alexander Avenue, Corvallis, Benton County, OR

  17. 21. Detail of remains of machinery house viewed from below ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. Detail of remains of machinery house viewed from below anchor-span deck, showing drawspan cable running back to the winding drum of the winch; view to northeast. - Summer Street Bridge, Spanning Reserved Channel, Boston, Suffolk County, MA

  18. Attempted Suicide Rates in U.S. Remain Unchanged

    MedlinePlus

    ... U.S. Remain Unchanged Men more often resorted to violent means, while women turned to poisoning, drowning, study ... likely to attempt suicide, but males used more violent methods. And all attempts were most common in ...

  19. 11. LOOKING SOUTH AT THE ONLY REMAINING PART OF THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. LOOKING SOUTH AT THE ONLY REMAINING PART OF THE NORTH SIDE OF ORIGINAL LAB, FROM COURTYARD. - U.S. Geological Survey, Rock Magnetics Laboratory, 345 Middlefield Road, Menlo Park, San Mateo County, CA

  20. 11. Remains of Douglasfir cordwood abandoned when kilns ceased operation, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. Remains of Douglas-fir cordwood abandoned when kilns ceased operation, looking northeast. - Warren King Charcoal Kilns, 5 miles west of Idaho Highway 28, Targhee National Forest, Leadore, Lemhi County, ID

  1. 25. CAFETERIA Note remains of tile floor in foreground. Food ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. CAFETERIA Note remains of tile floor in foreground. Food cooked on the stove was served to workers in the eating area to the left of the counter (off picture). - Hovden Cannery, 886 Cannery Row, Monterey, Monterey County, CA

  2. 3. INTERIOR OF THE WATER FILTRATION PLANT SHOWING REMAINS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. INTERIOR OF THE WATER FILTRATION PLANT SHOWING REMAINS OF THE FILTRATION APPARATUS. - Tower Hill No. 2 Mine, Approximately 0.47 mile Southwest of intersection of Stone Church Road & Township Route 561, Hibbs, Fayette County, PA

  3. 60. NORTHEASTERN VIEW OF THE REMAINS OF THE DOROTHY SIX ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    60. NORTHEASTERN VIEW OF THE REMAINS OF THE DOROTHY SIX BLAST FURNACE COMPLEX. (Martin Stupich) - U.S. Steel Duquesne Works, Blast Furnace Plant, Along Monongahela River, Duquesne, Allegheny County, PA

  4. 59. REMAINS OF THE DOROTHY SIX BLAST FURNACE COMPLEX LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    59. REMAINS OF THE DOROTHY SIX BLAST FURNACE COMPLEX LOOKING NORTHEAST. THE LADLE HOUSE IS ON THE RIGHT. (Martin Stupich) - U.S. Steel Duquesne Works, Blast Furnace Plant, Along Monongahela River, Duquesne, Allegheny County, PA

  5. 1. VIEW SHOWING REMAINS OF CAMOUFLAGE COVERING CONCRETE FOOTING FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW SHOWING REMAINS OF CAMOUFLAGE COVERING CONCRETE FOOTING FOR A GENERATOR PAD - Fort Cronkhite, Anti-Aircraft Battery No. 1, Concrete Footing-Generator Pad, Wolf Road, Sausalito, Marin County, CA

  6. 15. DETAIL VIEW, AT STREET LEVEL, OF REMAINING STONE POST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. DETAIL VIEW, AT STREET LEVEL, OF REMAINING STONE POST ON NORTH SIDE, STONE WALL AND METAL RAILING ON SOUTH SIDE, LOOKING SOUTHEAST - Lake Street Bridge, Spanning Ruddiman Creek at Lake Shore Drive, Muskegon, Muskegon County, MI

  7. The uncertainty principle - A simplified review of the four versions

    NASA Astrophysics Data System (ADS)

    Jijnasu, Vasudeva

    2016-08-01

    The complexity of the historical confusions around different versions of the uncertainty principle, together with the increasing technicality of physics in general, has made the subject accessible predominantly to specialists. Consequently, the clarity that has dawned on physicists over the decades regarding quantum uncertainty remains mostly imperceptible to general readers, students, philosophers and even non-expert scientists. In an attempt to weaken this barrier, the article presents a summary of this technical subject, focusing on the prime case of the position-momentum pair, as modestly and informatively as possible. This includes a crisp analysis of the historical as well as the latest developments. In the process the article provides arguments to show that the usually sidelined version of uncertainty, the intrinsic 'unsharpness' or 'indeterminacy', forms the basis for the other three versions, and subsequently presents its hard philosophical implications.

  8. The Slopes Remain the Same: Reply to Wolfe (2016)

    PubMed Central

    2016-01-01

    Wolfe (2016) responds to my article (Kristjánsson, 2015), arguing, among other things, that the differences in slope by response method in my data reflect speed-accuracy trade-offs. But when reaction times and errors are combined into one score (inverse efficiency) to sidestep speed-accuracy trade-offs, slope differences still remain. The problem that slopes, which are thought to measure search speed, differ by response type therefore remains. PMID:27872743

  9. 52. VIEW OF REMAINS OF ORIGINAL 1907 CONTROL PANEL, LOCATED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    52. VIEW OF REMAINS OF ORIGINAL 1907 CONTROL PANEL, LOCATED ON NORTH WALL OF EAST END OF CONTROL ROOM. PORTIONS OF THIS PANEL REMAINED IN USE UNTIL THE PLANT CLOSED. THE METERS AND CONTROLS ARE MOUNTED ON SOAPSTONE PANELS. THE INSTRUMENT IN THE LEFT CENTER OF THE PHOTOGRAPH IS A TIRRILL VOLTAGE REGULATOR. - New York, New Haven & Hartford Railroad, Cos Cob Power Plant, Sound Shore Drive, Greenwich, Fairfield County, CT

  10. A non-destructive method for dating human remains

    USGS Publications Warehouse

    Lail, Warren K.; Sammeth, David; Mahan, Shannon; Nevins, Jason

    2013-01-01

    The skeletal remains of several Native Americans were recovered in an eroded state from a creek bank in northeastern New Mexico. Subsequently stored in a nearby museum, the remains became lost for almost 36 years. In a recent effort to repatriate the remains, it was necessary to fit them into a cultural chronology in order to determine the appropriate tribe(s) for consultation pursuant to the Native American Grave Protection and Repatriation Act (NAGPRA). Because the remains were found in an eroded context with no artifacts or funerary objects, their age was unknown. Having been asked to avoid destructive dating methods such as radiocarbon dating, the authors used Optically Stimulated Luminescence (OSL) to date the sediments embedded in the cranium. The OSL analyses yielded reliable dates between A.D. 1415 and A.D. 1495. Accordingly, we conclude that the remains were interred somewhat earlier than A.D. 1415, but no later than A.D. 1495. We believe the remains are from individuals ancestral to the Ute Mouache Band, which is now being contacted for repatriation efforts. Not only do our methods contribute to the immediate repatriation efforts, they provide archaeologists with a versatile, non-destructive, numerical dating method that can be used in many burial contexts.

  11. Stereo-particle image velocimetry uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and estimating the overall measurement uncertainty is therefore challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. 
This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
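    The combination of planar and angular uncertainties via a first-order propagation equation, as the abstract describes, can be sketched for a simplified stereo reconstruction. The reconstruction formula for the out-of-plane component, w = (u1 - u2) / (tan θ1 + tan θ2), is a standard geometric simplification; the variable names and example numbers are assumptions, and the paper's additional calibration-coefficient uncertainty term is omitted here.

```python
import numpy as np

def stereo_w_uncertainty(u1, u2, su1, su2, th1, th2, sth1, sth2):
    """First-order uncertainty propagation for the out-of-plane component
    w = (u1 - u2) / (tan(th1) + tan(th2)) of a simplified stereo reconstruction.

    u1, u2   : dewarped planar displacements seen by cameras 1 and 2
    su1, su2 : their standard uncertainties (from a planar PIV estimator)
    th1, th2 : camera viewing angles [rad], with angle uncertainties sth1, sth2
    """
    denom = np.tan(th1) + np.tan(th2)
    w = (u1 - u2) / denom
    # Partial derivatives of w with respect to each uncertain input
    dw_du = 1.0 / denom                                 # same magnitude for u1 and u2
    dw_dth1 = -(u1 - u2) / denom**2 / np.cos(th1) ** 2
    dw_dth2 = -(u1 - u2) / denom**2 / np.cos(th2) ** 2
    var_w = ((dw_du * su1) ** 2 + (dw_du * su2) ** 2
             + (dw_dth1 * sth1) ** 2 + (dw_dth2 * sth2) ** 2)
    return w, np.sqrt(var_w)

# Example: symmetric 30-degree cameras, 0.05 px planar uncertainty,
# 0.2-degree angle uncertainty (all numbers illustrative)
w, sw = stereo_w_uncertainty(1.2, 0.8, 0.05, 0.05,
                             np.radians(30), np.radians(30),
                             np.radians(0.2), np.radians(0.2))
print(w, sw)
```

With these numbers the planar term dominates the combined variance; the angle term only becomes significant when the displacement difference (the disparity) is large, mirroring the sensitivity result reported in the abstract.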

  12. Portfolios as Evidence of Reflective Practice: What Remains "Untold"

    ERIC Educational Resources Information Center

    Orland-Barak, Lily

    2005-01-01

    Addressing recent calls for investigating the specific quality of reflection associated with the uses of portfolios in teacher education, this paper describes and interprets the "practice of portfolio construction" as revealed in the construction and presentation of two kinds of portfolio in two in-service courses for mentors of teachers…

  13. A stochastic optimization model under modeling uncertainty and parameter certainty for groundwater remediation design--part I. Model development.

    PubMed

    He, L; Huang, G H; Lu, H W

    2010-04-15

    Solving groundwater remediation optimization problems based on proxy simulators can yield optimal solutions that differ from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM), and an associated solution method, for simultaneously addressing modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt, distinct from previous modeling efforts: whereas earlier work focused on uncertainty in physical parameters (e.g. soil porosity), this study deals with uncertainty in the mathematical simulator itself (arising from model residuals). Compared to existing modeling approaches (in which only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering a confidence level for optimal remediation strategies to system designers, and reducing computational cost in the optimization process.

  14. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in
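    The propagation of distributions discussed in connection with the GUM can be sketched with a toy measurement model. The model R = V/I and both input distributions are illustrative assumptions, not taken from the paper (whose own examples are in R; a Monte Carlo sketch in Python is shown here), with the first-order law-of-propagation result computed alongside as a check.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 200_000  # number of Monte Carlo draws

# Toy measurement model (an assumption, not from the paper): R = V / I
V = rng.normal(5.0, 0.02, M)    # voltage input quantity, normal with u(V) = 0.02
I = rng.normal(0.10, 0.001, M)  # current input quantity, normal with u(I) = 0.001

R = V / I
y, u_y = R.mean(), R.std(ddof=1)          # estimate and standard uncertainty
lo, hi = np.quantile(R, [0.025, 0.975])   # 95% coverage interval

# First-order (law of propagation of uncertainty) check:
# u_c^2 = (dR/dV * u_V)^2 + (dR/dI * u_I)^2, evaluated at the best estimates
u_lin = np.sqrt((0.02 / 0.10) ** 2 + (5.0 * 0.001 / 0.10**2) ** 2)
print(y, u_y, u_lin, (lo, hi))
```

For this nearly linear, well-behaved model the Monte Carlo and first-order results agree closely; the propagation of distributions becomes the more informative option when the model is strongly non-linear or the input distributions are markedly non-Gaussian.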

  15. A violation of the uncertainty principle implies a violation of the second law of thermodynamics.

    PubMed

    Hänggi, Esther; Wehner, Stephanie

    2013-01-01

    Uncertainty relations state that there exist certain incompatible measurements whose outcomes cannot be simultaneously predicted. While the exact incompatibility of quantum measurements dictated by such uncertainty relations can be inferred from the mathematical formalism of quantum theory, the question remains whether there is any more fundamental reason for the uncertainty relations to have this exact form. What, if any, would be the operational consequences if we were able to go beyond any of these uncertainty relations? Here we give a strong argument that justifies uncertainty relations in quantum theory by showing that violating them implies that it is also possible to violate the second law of thermodynamics. More precisely, we show that violating the uncertainty relations in quantum mechanics leads to a thermodynamic cycle with positive net work gain, which is very unlikely to exist in nature.

  16. 16 CFR 0.2 - Official address.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Official address. The principal office of the Commission is at Washington, DC. All communications to the Commission should be addressed to the Federal Trade Commission, 600 Pennsylvania Avenue, NW, Washington,...

  17. 16 CFR 0.2 - Official address.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Official address. The principal office of the Commission is at Washington, DC. All communications to the Commission should be addressed to the Federal Trade Commission, 600 Pennsylvania Avenue, NW, Washington,...

  18. 16 CFR 0.2 - Official address.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Official address. The principal office of the Commission is at Washington, DC. All communications to the Commission should be addressed to the Federal Trade Commission, 600 Pennsylvania Avenue, NW, Washington,...

  19. Searching for resilience: addressing the impacts of changing disturbance regimes on forest ecosystem services

    PubMed Central

    Seidl, Rupert; Spies, Thomas A.; Peterson, David L.; Stephens, Scott L.; Hicke, Jeffrey A.

    2016-01-01

    Summary 1. The provisioning of ecosystem services to society is increasingly under pressure from global change. Changing disturbance regimes are of particular concern in this context due to their high potential impact on ecosystem structure, function and composition. Resilience-based stewardship is advocated to address these changes in ecosystem management, but its operational implementation has remained challenging. 2. We review observed and expected changes in disturbance regimes and their potential impacts on provisioning, regulating, cultural and supporting ecosystem services, concentrating on temperate and boreal forests. Subsequently, we focus on resilience as a powerful concept to quantify and address these changes and their impacts, and present an approach towards its operational application using established methods from disturbance ecology. 3. We suggest using the range of variability concept – characterizing and bounding the long-term behaviour of ecosystems – to locate and delineate the basins of attraction of a system. System recovery in relation to its range of variability can be used to measure resilience of ecosystems, allowing inferences on both engineering resilience (recovery rate) and monitoring for regime shifts (directionality of recovery trajectory). 4. It is important to consider the dynamic nature of these properties in ecosystem analysis and management decision-making, as both disturbance processes and mechanisms of resilience will be subject to changes in the future. Furthermore, because ecosystem services are at the interface between natural and human systems, the social dimension of resilience (social adaptive capacity and range of variability) requires consideration in responding to changing disturbance regimes in forests. 5. Synthesis and applications. Based on examples from temperate and boreal forests we synthesize principles and pathways for fostering resilience to changing disturbance regimes in ecosystem management. We

  20. Searching for resilience: addressing the impacts of changing disturbance regimes on forest ecosystem services.

    PubMed

    Seidl, Rupert; Spies, Thomas A; Peterson, David L; Stephens, Scott L; Hicke, Jeffrey A

    2016-02-01

    1. The provisioning of ecosystem services to society is increasingly under pressure from global change. Changing disturbance regimes are of particular concern in this context due to their high potential impact on ecosystem structure, function and composition. Resilience-based stewardship is advocated to address these changes in ecosystem management, but its operational implementation has remained challenging. 2. We review observed and expected changes in disturbance regimes and their potential impacts on provisioning, regulating, cultural and supporting ecosystem services, concentrating on temperate and boreal forests. Subsequently, we focus on resilience as a powerful concept to quantify and address these changes and their impacts, and present an approach towards its operational application using established methods from disturbance ecology. 3. We suggest using the range of variability concept - characterizing and bounding the long-term behaviour of ecosystems - to locate and delineate the basins of attraction of a system. System recovery in relation to its range of variability can be used to measure resilience of ecosystems, allowing inferences on both engineering resilience (recovery rate) and monitoring for regime shifts (directionality of recovery trajectory). 4. It is important to consider the dynamic nature of these properties in ecosystem analysis and management decision-making, as both disturbance processes and mechanisms of resilience will be subject to changes in the future. Furthermore, because ecosystem services are at the interface between natural and human systems, the social dimension of resilience (social adaptive capacity and range of variability) requires consideration in responding to changing disturbance regimes in forests. 5. Synthesis and applications. Based on examples from temperate and boreal forests we synthesize principles and pathways for fostering resilience to changing disturbance regimes in ecosystem management. We conclude that

  1. Phylogeny, extinction and conservation: embracing uncertainties in a time of urgency

    PubMed Central

    Forest, Félix; Crandall, Keith A.; Chase, Mark W.; Faith, Daniel P.

    2015-01-01

    Evolutionary studies have played a fundamental role in our understanding of life, but until recently, they had only a relatively modest involvement in addressing conservation issues. The main goal of the present discussion meeting issue is to offer a platform to present the available methods allowing the integration of phylogenetic and extinction risk data in conservation planning. Here, we identify the main knowledge gaps in biodiversity science, which include incomplete sampling, reconstruction biases in phylogenetic analyses, partly known species distribution ranges, and the difficulty in producing conservation assessments for all known species, not to mention that much of the effective biological diversity remains to be discovered. Given the impact that human activities have on biodiversity and the urgency with which we need to address these issues, imperfect assumptions need to be sanctioned and surrogates used in the race to salvage as much as possible of our natural and evolutionary heritage. We discuss some aspects of the uncertainties found in biodiversity science, such as the ideal surrogates for biodiversity, the gaps in our knowledge and the numerous available phylogenetic diversity-based methods. We also introduce a series of case studies that demonstrate how evolutionary biology can effectively contribute to biodiversity conservation science. PMID:25561663

  2. Phylogeny, extinction and conservation: embracing uncertainties in a time of urgency.

    PubMed

    Forest, Félix; Crandall, Keith A; Chase, Mark W; Faith, Daniel P

    2015-02-19

    Evolutionary studies have played a fundamental role in our understanding of life, but until recently, they had only a relatively modest involvement in addressing conservation issues. The main goal of the present discussion meeting issue is to offer a platform to present the available methods allowing the integration of phylogenetic and extinction risk data in conservation planning. Here, we identify the main knowledge gaps in biodiversity science, which include incomplete sampling, reconstruction biases in phylogenetic analyses, partly known species distribution ranges, and the difficulty in producing conservation assessments for all known species, not to mention that much of the effective biological diversity remains to be discovered. Given the impact that human activities have on biodiversity and the urgency with which we need to address these issues, imperfect assumptions need to be sanctioned and surrogates used in the race to salvage as much as possible of our natural and evolutionary heritage. We discuss some aspects of the uncertainties found in biodiversity science, such as the ideal surrogates for biodiversity, the gaps in our knowledge and the numerous available phylogenetic diversity-based methods. We also introduce a series of case studies that demonstrate how evolutionary biology can effectively contribute to biodiversity conservation science.

  3. Visualizing Flow of Uncertainty through Analytical Processes.

    PubMed

    Wu, Yingcai; Yuan, Guo-Xun; Ma, Kwan-Liu

    2012-12-01

    Uncertainty can arise in any stage of a visual analytics process, especially in data-intensive applications with a sequence of data transformations. Additionally, throughout the process of multidimensional, multivariate data analysis, uncertainty due to data transformation and integration may split, merge, increase, or decrease. This dynamic characteristic along with other features of uncertainty pose a great challenge to effective uncertainty-aware visualization. This paper presents a new framework for modeling uncertainty and characterizing the evolution of the uncertainty information through analytical processes. Based on the framework, we have designed a visual metaphor called uncertainty flow to visually and intuitively summarize how uncertainty information propagates over the whole analysis pipeline. Our system allows analysts to interact with and analyze the uncertainty information at different levels of detail. Three experiments were conducted to demonstrate the effectiveness and intuitiveness of our design.

  4. Laser triangulation: fundamental uncertainty in distance measurement.

    PubMed

    Dorsch, R G; Häusler, G; Herrmann, J M

    1994-03-01

    We discuss the uncertainty limit in distance sensing by laser triangulation. The uncertainty in distance measurement of laser triangulation sensors and other coherent sensors is limited by speckle noise. Speckle arises because of the coherent illumination in combination with rough surfaces. A minimum limit on the distance uncertainty is derived through speckle statistics. This uncertainty is a function of wavelength, observation aperture, and speckle contrast in the spot image. Surprisingly, it is the same distance uncertainty that we obtained from a single-photon experiment and from Heisenberg's uncertainty principle. Experiments confirm the theory. An uncertainty principle connecting lateral resolution and distance uncertainty is introduced. Design criteria for a sensor with minimum distance uncertainty are determined: small temporal coherence, small spatial coherence, a large observation aperture.
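    The speckle-limited distance uncertainty derived in the paper is commonly quoted in the form δz = Cλ / (2π · sin u_obs · sin θ), with C the speckle contrast, u_obs the observation aperture angle and θ the triangulation angle. A minimal sketch with illustrative numbers (the specific parameter values are assumptions):

```python
import math

def triangulation_depth_uncertainty(wavelength, contrast, sin_aperture, sin_triangulation):
    # Speckle-limited depth uncertainty in the commonly quoted form
    # delta_z = C * lambda / (2*pi * sin(u_obs) * sin(theta))
    return contrast * wavelength / (2 * math.pi * sin_aperture * sin_triangulation)

# Example: He-Ne laser (633 nm), full speckle contrast C = 1,
# observation aperture sin(u_obs) = 0.045, triangulation angle 30 degrees
dz = triangulation_depth_uncertainty(633e-9, 1.0, 0.045, math.sin(math.radians(30)))
print(f"depth uncertainty ~ {dz * 1e6:.1f} micrometres")
```

The formula makes the design criteria listed above explicit: enlarging the observation aperture or lowering the speckle contrast C (for example by reducing spatial or temporal coherence) directly shrinks δz.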

  5. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.

  6. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  7. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  8. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  9. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  10. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  11. 37 CFR 41.10 - Correspondence addresses.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Correspondence addresses. 41... Correspondence addresses. Except as the Board may otherwise direct, (a) Appeals. Correspondence in an application... correspondence in an application or a patent involved in an appeal to the Board for which an address is...

  12. 37 CFR 41.10 - Correspondence addresses.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Correspondence addresses. 41... Correspondence addresses. Except as the Board may otherwise direct, (a) Appeals. Correspondence in an application... correspondence in an application or a patent involved in an appeal to the Board for which an address is...

  13. 47 CFR 13.10 - Licensee address.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Licensee address. 13.10 Section 13.10 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMERCIAL RADIO OPERATORS General § 13.10 Licensee address. In accordance with § 1.923 of this chapter all applications must specify an address where...

  14. 32 CFR 516.7 - Mailing addresses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Mailing addresses. 516.7 Section 516.7 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION General § 516.7 Mailing addresses. Mailing addresses for organizations referenced...

  15. Mapping Soil Transmitted Helminths and Schistosomiasis under Uncertainty: A Systematic Review and Critical Appraisal of Evidence

    PubMed Central

    Hamm, Nicholas A. S.; Soares Magalhães, Ricardo J.; Stein, Alfred

    2016-01-01

    Background Spatial modelling of STH and schistosomiasis epidemiology is now commonplace. Spatial epidemiological studies help inform decisions regarding the number of people at risk as well as the geographic areas that need to be targeted with mass drug administration; however, limited attention has been given to propagated uncertainties, their interpretation, and their consequences for the mapped values. Using currently published literature on the spatial epidemiology of helminth infections, we identified: (1) the main uncertainty sources, their definition and quantification and (2) how uncertainty is informative for STH programme managers and scientists working in this domain. Methodology/Principal Findings We performed a systematic literature search using the Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) protocol. We searched Web of Knowledge and PubMed using a combination of uncertainty, geographic and disease terms. A total of 73 papers fulfilled the inclusion criteria for the systematic review. Only 9% of the studies did not address any element of uncertainty, while 91% of studies quantified uncertainty in the predicted morbidity indicators and 23% of studies mapped it. In addition, 57% of the studies quantified uncertainty in the regression coefficients but only 7% incorporated it in the regression response variable (morbidity indicator). Fifty percent of the studies discussed uncertainty in the covariates but did not quantify it. Uncertainty was mostly defined as precision, and quantified using credible intervals by means of Bayesian approaches. Conclusion/Significance None of the studies adequately considered all sources of uncertainty. We highlighted the need for uncertainty in the morbidity indicator and predictor variables to be incorporated into the modelling framework. Study design and spatial support require further attention, and uncertainty associated with Earth observation data should be quantified. Finally, more attention

  16. On genetic information uncertainty and the mutator phenotype in cancer.

    PubMed

    Chan, Jason Yongsheng

    2012-01-01

    Recent evidence supports the existence of a mutator phenotype in cancer cells, although the mechanistic basis remains unknown. In this paper, it is shown that this enhanced genetic instability is generated by an amplified measurement uncertainty in genetic information during DNA replication. At baseline, an inherent measurement uncertainty implies an imprecision in the recognition, replication and transfer of genetic information, and forms the basis for an intrinsic genetic instability in all biological cells. Genetic information is contained in the sequence of DNA bases, each existing, due to proton tunnelling, as a coherent superposition of quantum states composed of both the canonical and rare tautomeric forms until decoherence by interaction with DNA polymerase. The result of such a quantum measurement process may be interpreted classically as akin to a Bernoulli trial, whose outcome X is random and can be either of two possibilities, depending on whether the proton is tunnelled (X=1) or not (X=0). This inherent quantum uncertainty is represented by a binary entropy function and quantified in terms of the Shannon information entropy H(X) = -P(X=1) log2 P(X=1) - P(X=0) log2 P(X=0). Enhanced genetic instability may either be directly derived from amplified uncertainty induced by increases in quantum and thermodynamic fluctuation, or indirectly arise from the loss of natural uncertainty-reduction mechanisms.
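    The binary entropy function quoted above is straightforward to compute; a minimal sketch:

```python
import math

def binary_entropy(p):
    """Shannon entropy H(X) = -p*log2(p) - (1-p)*log2(1-p), in bits,
    of a Bernoulli trial with tunnelling probability p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(0.5))             # → 1.0 (maximum uncertainty)
print(round(binary_entropy(1e-4), 6))  # rare tautomeric events carry little entropy
```

Amplifying the tunnelling probability toward 0.5 raises H(X), which is the sense in which increased fluctuation translates into increased informational uncertainty.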

  17. Uncertainties in environmental radiological assessment models and their implications

    SciTech Connect

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible.

  18. Direct tests of measurement uncertainty relations: what it takes.

    PubMed

    Busch, Paul; Stevens, Neil

    2015-02-20

    The uncertainty principle being a cornerstone of quantum mechanics, it is surprising that, in nearly 90 years, there have been no direct tests of measurement uncertainty relations. This lacuna was due to the absence of two essential ingredients: appropriate measures of measurement error (and disturbance) and precise formulations of such relations that are universally valid and directly testable. We formulate two distinct forms of direct tests, based on different measures of error. We present a prototype protocol for a direct test of measurement uncertainty relations in terms of value deviation errors (hitherto considered nonfeasible), highlighting the lack of universality of these relations. This shows that the formulation of universal, directly testable measurement uncertainty relations for state-dependent error measures remains an important open problem. Recent experiments that were claimed to constitute invalidations of Heisenberg's error-disturbance relation are shown to conform with the spirit of Heisenberg's principle if interpreted as direct tests of measurement uncertainty relations for error measures that quantify distances between observables.

  19. Identification of the remains of King Richard III.

    PubMed

    King, Turi E; Fortes, Gloria Gonzalez; Balaresque, Patricia; Thomas, Mark G; Balding, David; Maisano Delser, Pierpaolo; Neumann, Rita; Parson, Walther; Knapp, Michael; Walsh, Susan; Tonasso, Laure; Holt, John; Kayser, Manfred; Appleby, Jo; Forster, Peter; Ekserdjian, David; Hofreiter, Michael; Schürer, Kevin

    2014-12-02

    In 2012, a skeleton was excavated at the presumed site of the Grey Friars friary in Leicester, the last-known resting place of King Richard III. Archaeological, osteological and radiocarbon dating data were consistent with these being his remains. Here we report DNA analyses of both the skeletal remains and living relatives of Richard III. We find a perfect mitochondrial DNA match between the sequence obtained from the remains and one living relative, and a single-base substitution when compared with a second relative. Y-chromosome haplotypes from male-line relatives and the remains do not match, which could be attributed to a false-paternity event occurring in any of the intervening generations. DNA-predicted hair and eye colour are consistent with Richard's appearance in an early portrait. We calculate likelihood ratios for the non-genetic and genetic data separately, and combined, and conclude that the evidence for the remains being those of Richard III is overwhelming.
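    The evidence-combination step described above (likelihood ratios for the genetic and non-genetic data computed separately, then combined) follows the standard rule that independent likelihood ratios multiply, and posterior odds are prior odds times the combined LR. A minimal sketch with hypothetical numbers, not the values reported in the study:

```python
def combine_lrs(lrs):
    """Combine likelihood ratios from independent lines of evidence."""
    product = 1.0
    for lr in lrs:
        product *= lr
    return product

lr_genetic = 1e5       # hypothetical LR for the DNA evidence
lr_non_genetic = 50.0  # hypothetical LR (e.g. radiocarbon, osteology)
combined = combine_lrs([lr_genetic, lr_non_genetic])

prior_odds = 1 / 1000.0  # hypothetical sceptical prior odds
posterior_odds = prior_odds * combined
print(f"combined LR = {combined:.0e}, posterior odds = {posterior_odds:.0f}:1")
```

Even a sceptical prior is overwhelmed once several independent lines of evidence each favour the identification.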

  20. Identification of the remains of King Richard III

    PubMed Central

    King, Turi E.; Fortes, Gloria Gonzalez; Balaresque, Patricia; Thomas, Mark G.; Balding, David; Delser, Pierpaolo Maisano; Neumann, Rita; Parson, Walther; Knapp, Michael; Walsh, Susan; Tonasso, Laure; Holt, John; Kayser, Manfred; Appleby, Jo; Forster, Peter; Ekserdjian, David; Hofreiter, Michael; Schürer, Kevin

    2014-01-01

    In 2012, a skeleton was excavated at the presumed site of the Grey Friars friary in Leicester, the last-known resting place of King Richard III. Archaeological, osteological and radiocarbon dating data were consistent with these being his remains. Here we report DNA analyses of both the skeletal remains and living relatives of Richard III. We find a perfect mitochondrial DNA match between the sequence obtained from the remains and one living relative, and a single-base substitution when compared with a second relative. Y-chromosome haplotypes from male-line relatives and the remains do not match, which could be attributed to a false-paternity event occurring in any of the intervening generations. DNA-predicted hair and eye colour are consistent with Richard’s appearance in an early portrait. We calculate likelihood ratios for the non-genetic and genetic data separately, and combined, and conclude that the evidence for the remains being those of Richard III is overwhelming. PMID:25463651

  1. Forensic considerations when dealing with incinerated human dental remains.

    PubMed

    Reesu, Gowri Vijay; Augustine, Jeyaseelan; Urs, Aadithya B

    2015-01-01

    The human dental identification process relies upon sufficient post-mortem data being recovered to allow for a meaningful comparison with ante-mortem records of the deceased person. Teeth are the most indestructible components of the human body and are structurally unique in their composition. They possess the highest resistance to most environmental effects, such as fire, desiccation, decomposition and prolonged immersion. In most natural as well as man-made disasters, teeth may provide the only means of positive identification of an otherwise unrecognizable body. It is imperative that dental evidence is not destroyed through erroneous handling before appropriate radiographs, photographs, or impressions can be obtained. Proper methods of physical stabilization of incinerated human dental remains should be followed. The maintenance of integrity of extremely fragile structures is crucial to the successful confirmation of identity. In such situations, the forensic dentist must stabilise these teeth before the fragile remains are transported to the mortuary to ensure preservation of possibly vital identification evidence. Thus, while dealing with any incinerated dental remains, a systematic approach must be followed through each stage of evaluation to prevent the loss of potential dental evidence. This paper presents a composite review of various studies on incinerated human dental remains, discusses their impact on the process of human identification, and suggests a step-by-step approach.

  2. Multidimensional biases, gaps and uncertainties in global plant occurrence information.

    PubMed

    Meyer, Carsten; Weigelt, Patrick; Kreft, Holger

    2016-08-01

    Plants are a hyperdiverse clade that plays a key role in maintaining ecological and evolutionary processes as well as human livelihoods. Biases, gaps and uncertainties in plant occurrence information remain a central problem in ecology and conservation, but these limitations remain largely unassessed globally. In this synthesis, we propose a conceptual framework for analysing gaps in information coverage, information uncertainties and biases in these metrics along taxonomic, geographical and temporal dimensions, and apply it to all c. 370 000 species of land plants. To this end, we integrated 120 million point-occurrence records with independent databases on plant taxonomy, distributions and conservation status. We find that different data limitations are prevalent in each dimension. Different metrics of information coverage and uncertainty are largely uncorrelated, and reducing taxonomic, spatial or temporal uncertainty by filtering out records would usually come at great costs to coverage. In light of these multidimensional data limitations, we discuss prospects for global plant ecological and biogeographical research, monitoring and conservation and outline critical next steps towards more effective information usage and mobilisation. Our study provides an empirical baseline for evaluating and improving global floristic knowledge, along with a conceptual framework that can be applied to study other hyperdiverse clades.

  3. Characterization of the volatile organic compounds present in the headspace of decomposing animal remains, and compared with human remains.

    PubMed

    Cablk, Mary E; Szelagowski, Erin E; Sagebiel, John C

    2012-07-10

    Human Remains Detection (HRD) dogs can be a useful tool to locate buried human remains because they rely on olfactory rather than visual cues. Trained specifically to locate deceased humans, it is widely believed that HRD dogs can differentiate animal remains from human remains. This study analyzed the volatile organic compounds (VOCs) present in the headspace above partially decomposed animal tissue samples and directly compared them with results published from human tissues using established solid-phase microextraction (SPME) and gas chromatography/mass spectrometry (GC/MS) methods. Volatile organic compounds present in the headspace of four different animal tissue samples (bone, muscle, fat and skin) from each of cow, pig and chicken were identified and compared to published results from human samples. Although there were compounds common to both animal and human remains, the VOC signatures of each of the animal remains differed from those of humans. Of particular interest was the difference between pigs and humans, because in some countries HRD dogs are trained on pig remains rather than human remains. Pig VOC signatures were not found to be a subset of human; in addition to sharing only seven of thirty human-specific compounds, an additional nine unique VOCs were recorded from pig samples which were not present in human samples. The VOC signatures from chicken and human samples were the most similar, sharing more compounds than those of the other animals studied. Identifying VOCs that are unique to humans may be useful to develop human-specific training aids for HRD canines, and may eventually lead to an instrument that can detect clandestine human burial sites.

  4. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    NASA Astrophysics Data System (ADS)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km²) in Belgium. We use a surface-subsurface integrated model implemented with the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between the surface and subsurface domains and further constrains the calibration by using both surface and subsurface observations. Sensitivity and uncertainty analyses were performed on the predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to the calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.

  5. Handling uncertainty in quantitative estimates in integrated resource planning

    SciTech Connect

    Tonn, B.E.; Wagner, C.G.

    1995-01-01

    This report addresses uncertainty in Integrated Resource Planning (IRP). IRP is a planning and decision-making process employed by utilities, usually at the behest of Public Utility Commissions (PUCs), to develop plans that ensure utilities have the resources necessary to meet consumer demand at reasonable cost. IRP has been used to assist utilities in developing plans that include not only traditional electricity supply options but also demand-side management (DSM) options. Uncertainty is a major issue for IRP. Future values for numerous important variables (e.g., future fuel prices, future electricity demand, stringency of future environmental regulations) cannot ever be known with certainty. Many economically significant decisions are so unique that statistically based probabilities cannot even be calculated. The entire utility strategic planning process, including IRP, encompasses different types of decisions that are made with different time horizons and at different points in time. Because of fundamental pressures for change in the industry, including competition in generation, gone is the time when utilities could easily predict increases in demand, enjoy long lead times to bring on new capacity, and bank on steady profits. The purpose of this report is to address in detail one aspect of uncertainty in IRP: dealing with uncertainty in quantitative estimates, such as the future demand for electricity or the cost to produce a megawatt (MW) of power. A theme that runs throughout the report is that every effort must be made to honestly represent what is known about a variable that can be used to estimate its value, what cannot be known, and what is not known due to operational constraints. Applying this philosophy to the representation of uncertainty in quantitative estimates, it is argued that imprecise probabilities are superior to classical probabilities for IRP.

  6. Microscopic residues of bone from dissolving human remains in acids.

    PubMed

    Vermeij, Erwin; Zoon, Peter; van Wijk, Mayonne; Gerretsen, Reza

    2015-05-01

    Dissolving bodies is a current method of disposing of human remains and has been practiced throughout the years. During the last decade in the Netherlands, two cases have emerged in which human remains were treated with acid. In the first case, the remains of a cremated body were treated with hydrofluoric acid. In the second case, two complete bodies were dissolved in a mixture of hydrochloric and sulfuric acid. In both cases, a great variety of evidence was collected at the scene of crime, part of which was embedded in resin, polished, and investigated using SEM/EDX. Apart from macroscopic findings like residual bone and artificial teeth, in both cases, distinct microscopic residues of bone were found as follows: (partly) digested bone, thin-walled structures, and recrystallized calcium phosphate. Although some may believe it is possible to dissolve a body in acid completely, at least some of these microscopic residues will always be found.

  7. Osteometric sex determination of burned human skeletal remains.

    PubMed

    Gonçalves, D; Thompson, T J U; Cunha, E

    2013-10-01

    Sex determination of human burned skeletal remains is extremely hard to achieve because of heat-related fragmentation, warping and dimensional changes. In particular, the latter impedes osteometric analyses based on references developed from unburned bones. New osteometric references were thus obtained which allow for more reliable sex determinations. The calcined remains of cremated Portuguese individuals were examined and specific standard measurements of the humerus, femur, talus and calcaneus were recorded. This allowed for the compilation of new sex-discriminating osteometric references, which were then tested on independent samples with good results. Both the use of simple section points and of logistic regression equations provided successful sex classification scores. These references may now be used for the sex determination of burned skeletons. Their reliability is highest for contemporary Portuguese remains, but these results nonetheless have important repercussions for forensic research. More conservative use of these references may also prove valuable for other populations as well as for archaeological research.
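    Both classification devices mentioned, simple section points and logistic regression equations, can be sketched as follows. The intercept, coefficient and cut-off below are hypothetical placeholders, not the published Portuguese reference values:

```python
import math

def p_male(measurement_mm, intercept=-50.0, coef=1.2):
    """Logistic regression: probability that remains are male, given
    one measurement (hypothetical intercept and coefficient)."""
    z = intercept + coef * measurement_mm
    return 1.0 / (1.0 + math.exp(-z))

def classify(measurement_mm, section_point_mm=42.0):
    """Simple section-point rule: at or above the cut-off -> male."""
    return "male" if measurement_mm >= section_point_mm else "female"

print(classify(44.0))  # → male
print(round(p_male(44.0), 3))
```

The logistic form also yields a probability rather than a hard label, which supports the more conservative use the abstract recommends for other populations.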

  8. Classification of pelvic ring fractures in skeletonized human remains.

    PubMed

    Báez-Molgado, Socorro; Bartelink, Eric J; Jellema, Lyman M; Spurlock, Linda; Sholts, Sabrina B

    2015-01-01

    Pelvic ring fractures are associated with high rates of mortality and thus can provide key information about circumstances surrounding death. These injuries can be particularly informative in skeletonized remains, yet difficult to diagnose and interpret. This study adapted a clinical system of classifying pelvic ring fractures according to their resultant degree of pelvic stability for application to gross human skeletal remains. The modified Tile criteria were applied to the skeletal remains of 22 individuals from the Cleveland Museum of Natural History and Universidad Nacional Autónoma de México that displayed evidence of pelvic injury. Because these categories are tied directly to clinical assessments concerning the severity and treatment of injuries, this approach can aid in the identification of manner and cause of death, as well as interpretations of possible mechanisms of injury, such as those typical in car-to-pedestrian and motor vehicle accidents.

  9. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  10. A Stronger Multi-observable Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Song, Qiu-Cheng; Li, Jun-Li; Peng, Guang-Xiong; Qiao, Cong-Feng

    2017-03-01

    Uncertainty relation lies at the heart of quantum mechanics, characterizing the incompatibility of non-commuting observables in the preparation of quantum states. An important question is how to improve the lower bound of uncertainty relation. Here we present a variance-based sum uncertainty relation for N incompatible observables stronger than the simple generalization of an existing uncertainty relation for two observables. Further comparisons of our uncertainty relation with other related ones for spin-1/2 and spin-1 particles indicate that the obtained uncertainty relation gives a better lower bound.
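    A numerical illustration of why fixed lower bounds arise for variance-based sum relations: for any qubit (spin-1/2) pure state, the three Pauli variances sum to exactly 2, since each Pauli observable squares to the identity and the Bloch vector of a pure state has unit length. This identity is background context only, not the stronger N-observable bound derived in the paper:

```python
import math
import random

def pauli_variance_sum(a, b):
    """Var(sx) + Var(sy) + Var(sz) for the normalized qubit state (a, b).

    Since each Pauli matrix squares to the identity, Var(si) = 1 - <si>^2,
    so the sum equals 3 - |<sigma>|^2 = 2 for any pure state.
    """
    norm = math.sqrt(abs(a) ** 2 + abs(b) ** 2)
    a, b = a / norm, b / norm
    ex = 2 * (a.conjugate() * b).real         # <sigma_x>
    ey = -2 * (a.conjugate() * b).imag        # <sigma_y>
    ez = abs(a) ** 2 - abs(b) ** 2            # <sigma_z>
    return 3 - (ex ** 2 + ey ** 2 + ez ** 2)

random.seed(0)
for _ in range(3):
    a = complex(random.random(), random.random())
    b = complex(random.random(), random.random())
    print(round(pauli_variance_sum(a, b), 10))  # → 2.0 each time
```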

  11. Adjoint-Based Uncertainty Quantification with MCNP

    NASA Astrophysics Data System (ADS)

    Seifried, Jeffrey Edwin

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  12. A Stronger Multi-observable Uncertainty Relation

    PubMed Central

    Song, Qiu-Cheng; Li, Jun-Li; Peng, Guang-Xiong; Qiao, Cong-Feng

    2017-01-01

    Uncertainty relation lies at the heart of quantum mechanics, characterizing the incompatibility of non-commuting observables in the preparation of quantum states. An important question is how to improve the lower bound of uncertainty relation. Here we present a variance-based sum uncertainty relation for N incompatible observables stronger than the simple generalization of an existing uncertainty relation for two observables. Further comparisons of our uncertainty relation with other related ones for spin-1/2 and spin-1 particles indicate that the obtained uncertainty relation gives a better lower bound. PMID:28317917

  13. Dating human skeletal remains using 90Sr and 210Pb: case studies.

    PubMed

    Schrag, Bettina; Uldin, Tanya; Mangin, Patrice; Bochud, François; Froidevaux, Pascal

    2014-01-01

    In legal medicine, the post mortem interval (PMI) of interest covers the last 50 years. When only human skeletal remains are found, determining the PMI currently relies mostly on the experience of the forensic anthropologist, with few techniques available to help. Recently, several radiometric methods have been proposed to reveal the PMI. For instance, (14)C and (90)Sr bomb pulse dating cover the last 60 years and give reliable PMIs when teeth or bones are available. (232)Th series dating has also been proposed but requires a large amount of bone. In addition, (210)Pb dating is promising but is subject to diagenesis and to individual habits, such as smoking, that must be handled carefully. Here we determine the PMI in 29 cases of forensic interest using the (90)Sr bomb pulse. In 12 cases, (210)Pb dating was added to narrow the PMI interval. In addition, anthropological investigations were carried out in 15 cases to confront anthropological expertise with the radiometric method. Results show that 10 of the 29 cases can be discarded as having no forensic interest (PMI > 50 years) based only on (90)Sr bomb pulse dating. For 10 other cases, the additional (210)Pb dating restricts the PMI uncertainty to a few years. In 15 cases, anthropological investigations corroborate the radiometric PMI. This study also shows that diagenesis and inter-individual differences in radionuclide uptake represent the main sources of uncertainty in PMI determination using radiometric methods.

  14. Decay rates of human remains in an arid environment.

    PubMed

    Galloway, A; Birkby, W H; Jones, A M; Henry, T E; Parks, B O

    1989-05-01

    The environment of southern Arizona with mild winters and hot, dry summers produces great variability in decay rates of human remains. Summer temperatures, which range well over 38 degrees C (100 degrees F), induce rapid bloating as a result of the accumulation of decompositional gases. However, in certain circumstances, the aridity can lead to extensive mummification, allowing preservation of remains for hundreds of years. A retrospective study of 189 cases, concentrating on remains found on the desert floor or in the surrounding mountains and on remains found within closed structures, outlines the time frame and sequences of the decay process. Remains can retain a fresh appearance for a considerable time in the winter, but the onset of marked decomposition is rapid in the summer months. Bloating of the body usually is present two to seven days following death. Following this, within structures, there is frequently rapid decomposition and skeletonization. With outdoor exposure, remains are more likely to pass through a long period of dehydration of outer tissues, mummification, and reduction of desiccated tissue. Exposure of large portions of the skeleton usually does not occur until four to six months after death. Bleaching and exfoliation of bone--the beginning stages of destruction of the skeletal elements--begins at about nine months' exposure. Insect activity, including that of maggot and beetle varieties, may accelerate decomposition, but this process is greatly affected by location of the body, seasonal weather, and accessibility of the soft tissues. Carnivores and other scavengers also are contributing factors, as are clothing or covering of the body, substrate, elevation, and latitude.

  15. Accelerating adaptation of natural resource management to address climate change.

    PubMed

    Cross, Molly S; McCarthy, Patrick D; Garfin, Gregg; Gori, David; Enquist, Carolyn A F

    2013-02-01

    Natural resource managers are seeking tools to help them address current and future effects of climate change. We present a model for collaborative planning aimed at identifying ways to adapt management actions to address the effects of climate change in landscapes that cross public and private jurisdictional boundaries. The Southwest Climate Change Initiative (SWCCI) piloted the Adaptation for Conservation Targets (ACT) planning approach at workshops in 4 southwestern U.S. landscapes. This planning approach successfully increased participants' self-reported capacity to address climate change by providing them with a better understanding of potential effects and guiding the identification of solutions. The workshops fostered cross-jurisdictional and multidisciplinary dialogue on climate change through active participation of scientists and managers in assessing climate change effects, discussing the implications of those effects for determining management goals and activities, and cultivating opportunities for regional coordination on adaptation of management plans. Facilitated application of the ACT framework advanced group discussions beyond assessing effects to devising options to mitigate the effects of climate change on specific species, ecological functions, and ecosystems. Participants addressed uncertainty about future conditions by considering more than one climate-change scenario. They outlined opportunities and identified next steps for implementing several actions, and local partnerships have begun implementing actions and conducting additional planning. Continued investment in adaptation of management plans and actions to address the effects of climate change in the southwestern United States and extension of the approaches used in this project to additional landscapes are needed if biological diversity and ecosystem services are to be maintained in a rapidly changing world.

  16. OVERVIEW OF REMAINS OF DEWATERING BUILDING, LOOKING SOUTH TOWARD CYANIDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    OVERVIEW OF REMAINS OF DEWATERING BUILDING, LOOKING SOUTH TOWARD CYANIDE PROCESSING AREA. WATER USED IN PROCESSING AT THE STAMP MILL WAS CIRCULATED HERE FOR RECLAMATION. SANDS WERE SETTLED OUT AND DEPOSITED IN ONE OF TWO TAILINGS HOLDING AREAS. CLEARED WATER WAS PUMPED BACK TO THE MILL FOR REUSE. THIS PROCESS WAS ACCOMPLISHED BY THE USE OF SETTLING CONES, EIGHT FEET IN DIAMETER AND SIX FEET HIGH. THE REMAINS OF FOUR CONES ARE AT CENTER, BEHIND THE TANK IN THE FOREGROUND. TO THE LEFT IS THE MAIN ACCESS ROAD BETWEEN THE MILL AND THE PARKING LOT. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  17. Uncertainty relation for mutual information

    NASA Astrophysics Data System (ADS)

    Schneeloch, James; Broadbent, Curtis J.; Howell, John C.

    2014-12-01

    We postulate the existence of a universal uncertainty relation between the quantum and classical mutual informations between pairs of quantum systems. Specifically, we propose that the sum of the classical mutual informations, determined by two mutually unbiased pairs of observables, never exceeds the quantum mutual information. We call this the complementary-quantum correlation (CQC) relation and prove its validity for pure states, for states with one maximally mixed subsystem, and for all states when one measurement is minimally disturbing. We provide results of a Monte Carlo simulation suggesting that the CQC relation is generally valid. Importantly, we also show that the CQC relation represents an improvement to an entropic uncertainty principle in the presence of a quantum memory, and that it can be used to verify an achievable secret key rate in the quantum one-time pad cryptographic protocol.
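    The proposed bound can be checked numerically for a simple case. The sketch below is not the authors' code: the function names are our own, and it assumes the two mutually unbiased observable pairs are Z and X. For a Bell state it computes the quantum mutual information and the sum of the classical mutual informations from Z-basis and X-basis measurements, which saturate the bound (both equal 2 bits):

    ```python
    import numpy as np

    def von_neumann_entropy(rho):
        """Entropy in bits: S(rho) = -Tr(rho log2 rho)."""
        vals = np.linalg.eigvalsh(rho)
        vals = vals[vals > 1e-12]
        return float(-np.sum(vals * np.log2(vals)))

    def shannon_mi(p):
        """Classical mutual information of a joint distribution p[i, j], in bits."""
        px, py = p.sum(axis=1), p.sum(axis=0)
        mask = p > 0
        return float(np.sum(p[mask] * np.log2(p[mask] / np.outer(px, py)[mask])))

    # Bell state |Phi+> = (|00> + |11>)/sqrt(2)
    psi = np.zeros(4); psi[0] = psi[3] = 1 / np.sqrt(2)
    rho = np.outer(psi, psi)
    rho_a = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # partial trace over B
    rho_b = rho.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)  # partial trace over A

    quantum_mi = (von_neumann_entropy(rho_a) + von_neumann_entropy(rho_b)
                  - von_neumann_entropy(rho))

    z = np.eye(2)                                 # Z eigenbasis
    x = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # X eigenbasis

    def joint_probs(basis):
        """Outcome distribution when both sides measure in the given basis."""
        p = np.empty((2, 2))
        for i in range(2):
            for j in range(2):
                v = np.kron(basis[:, i], basis[:, j])
                p[i, j] = abs(v @ psi) ** 2
        return p

    classical_sum = shannon_mi(joint_probs(z)) + shannon_mi(joint_probs(x))
    print(round(classical_sum, 6), "<=", round(quantum_mi, 6))  # prints "2.0 <= 2.0"
    ```

    Each basis alone extracts 1 bit of classical correlation; their sum meets, but does not exceed, the 2 bits of quantum mutual information, as the CQC relation requires for pure states.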

  18. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state-of-the-art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion on technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation from the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the on-going organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends and funding opportunities and to discuss the future of structural dynamics.

  19. Aspects of complementarity and uncertainty

    NASA Astrophysics Data System (ADS)

    Vathsan, Radhika; Qureshi, Tabish

    2016-08-01

    The two-slit experiment with quantum particles provides many insights into the behavior of quantum mechanics, including Bohr’s complementarity principle. Here, we analyze Einstein’s recoiling slit version of the experiment and show how the inevitable entanglement between the particle and the recoiling slit as a which-way detector is responsible for complementarity. We derive the Englert-Greenberger-Yasin duality from this entanglement, which can also be thought of as a consequence of sum-uncertainty relations between certain complementary observables of the recoiling slit. Thus, entanglement is an integral part of the which-way detection process, and so is uncertainty, though in a completely different way from that envisaged by Bohr and Einstein.
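    For reference, the Englert-Greenberger-Yasin duality invoked above is commonly written as a trade-off between which-way distinguishability $D$ and fringe visibility $V$:

    ```latex
    D^2 + V^2 \le 1
    ```

    with equality when the combined state of the particle and the which-way detector is pure. Perfect path knowledge ($D = 1$) thus forces $V = 0$: no interference fringes survive.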

  20. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2008-06-30

    We develop methodologies that improve location uncertainties in the presence of correlated, systematic model errors and non-Gaussian measurement errors. Rather than a parametric variogram model (such as Gaussian, spherical or exponential) typically used in geostatistics, we define the robust variogram model as the median regression curve of the residual difference squares for station pairs.
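    The "median of residual difference squares" idea can be sketched as a robust empirical variogram: per distance bin, take the median rather than the mean of squared residual differences, so a few outlying station pairs do not inflate the estimate. A minimal sketch only; the binning scheme, function name, and synthetic data are assumptions, not the report's implementation:

    ```python
    import numpy as np

    def robust_empirical_variogram(coords, residuals, bin_edges):
        """Semivariance per distance bin, using the median (not the mean)
        of squared residual differences over station pairs."""
        n = len(residuals)
        dists, sq_diffs = [], []
        for i in range(n):
            for j in range(i + 1, n):
                dists.append(np.linalg.norm(coords[i] - coords[j]))
                sq_diffs.append((residuals[i] - residuals[j]) ** 2)
        dists, sq_diffs = np.array(dists), np.array(sq_diffs)
        gamma = []
        for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
            in_bin = (dists >= lo) & (dists < hi)
            # The median resists the outliers that a mean-based estimator inherits
            gamma.append(0.5 * np.median(sq_diffs[in_bin]) if in_bin.any() else np.nan)
        return np.array(gamma)

    # Synthetic example: 50 stations with uncorrelated unit-variance residuals
    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 10, size=(50, 2))
    residuals = rng.normal(0, 1, size=50)
    print(robust_empirical_variogram(coords, residuals, np.linspace(0, 5, 6)))
    ```

    Fitting a smooth curve through these binned medians (the "median regression curve") then yields the robust variogram model used to weight correlated travel-time residuals.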