Science.gov

Sample records for "address remaining uncertainties"

  1. Remaining Useful Life Estimation in Prognosis: An Uncertainty Propagation Problem

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    The estimation of remaining useful life is significant in the context of prognostics and health monitoring, and the prediction of remaining useful life is essential for online operations and decision-making. However, it is challenging to accurately predict the remaining useful life in practical aerospace applications due to the presence of various uncertainties that affect prognostic calculations and, in turn, render the remaining useful life prediction uncertain. It is challenging to identify and characterize the various sources of uncertainty in prognosis, understand how each of these sources affects the uncertainty in the remaining useful life prediction, and thereby compute the overall uncertainty in the remaining useful life prediction. In order to achieve these goals, this paper proposes that the task of estimating the remaining useful life must be approached as an uncertainty propagation problem. In this context, uncertainty propagation methods available in the literature are reviewed, and their applicability to prognostics and health monitoring is discussed.

  2. Programmatic methods for addressing contaminated volume uncertainties.

    SciTech Connect

    DURHAM, L.A.; JOHNSON, R.L.; RIEMAN, C.R.; SPECTOR, H.L.; Environmental Science Division; U.S. ARMY CORPS OF ENGINEERS BUFFALO DISTRICT

    2007-01-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the preremedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in predesign data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in predesign characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland 1, Ashland 2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate predesign contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District.

  3. Programmatic methods for addressing contaminated volume uncertainties

    SciTech Connect

    Rieman, C.R.; Spector, H.L.; Durham, L.A.; Johnson, R.L.

    2007-07-01

    Accurate estimates of the volumes of contaminated soils or sediments are critical to effective program planning and to successfully designing and implementing remedial actions. Unfortunately, data available to support the pre-remedial design are often sparse and insufficient for accurately estimating contaminated soil volumes, resulting in significant uncertainty associated with these volume estimates. The uncertainty in the soil volume estimates significantly contributes to the uncertainty in the overall project cost estimates, especially since excavation and off-site disposal are the primary cost items in soil remedial action projects. The U.S. Army Corps of Engineers Buffalo District's experience has been that historical contaminated soil volume estimates developed under the Formerly Utilized Sites Remedial Action Program (FUSRAP) often underestimated the actual volume of subsurface contaminated soils requiring excavation during the course of a remedial activity. In response, the Buffalo District has adopted a variety of programmatic methods for addressing contaminated volume uncertainties. These include developing final status survey protocols prior to remedial design, explicitly estimating the uncertainty associated with volume estimates, investing in pre-design data collection to reduce volume uncertainties, and incorporating dynamic work strategies and real-time analytics in pre-design characterization and remediation activities. This paper describes some of these experiences in greater detail, drawing from the knowledge gained at Ashland 1, Ashland 2, Linde, and Rattlesnake Creek. In the case of Rattlesnake Creek, these approaches provided the Buffalo District with an accurate pre-design contaminated volume estimate and resulted in one of the first successful FUSRAP fixed-price remediation contracts for the Buffalo District. (authors)

  4. Addressing uncertainty in adaptation planning for agriculture

    PubMed Central

    Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.

    2013-01-01

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

  5. Uncertainty Quantification in Remaining Useful Life of Aerospace Components using State Space Models and Inverse FORM

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    This paper investigates the use of the inverse first-order reliability method (inverse-FORM) to quantify the uncertainty in the remaining useful life (RUL) of aerospace components. The prediction of remaining useful life is an integral part of system health prognosis, and directly helps in online health monitoring and decision-making. However, the prediction of remaining useful life is affected by several sources of uncertainty, and therefore it is necessary to quantify the uncertainty in the remaining useful life prediction. While system parameter uncertainty and physical variability can be easily included in inverse-FORM, this paper extends the methodology to include (1) future loading uncertainty, (2) process noise, and (3) uncertainty in the state estimate. The inverse-FORM method has been used in this paper to (1) quickly obtain probability bounds on the remaining useful life prediction and (2) calculate the entire probability distribution of the remaining useful life prediction, and the results are verified against Monte Carlo sampling. The proposed methodology is illustrated using a numerical example.

  6. Adaptively Addressing Uncertainty in Estuarine and Near Coastal Restoration Projects

    SciTech Connect

    Thom, Ronald M.; Williams, Greg D.; Borde, Amy B.; Southard, John A.; Sargeant, Susan L.; Woodruff, Dana L.; Laufle, Jeffrey C.; Glasoe, Stuart

    2005-03-01

    Restoration projects have an uncertain outcome because of a lack of information about current site conditions, historical disturbance levels, effects of landscape alterations on site development, unpredictable trajectories or patterns of ecosystem structural development, and many other factors. A poor understanding of the factors that control the development and dynamics of a system, such as hydrology, salinity, and wave energies, can also lead to an unintended outcome. Finally, lack of experience in restoring certain types of systems (e.g., rare or very fragile habitats) or systems in highly modified situations (e.g., highly urbanized estuaries) makes project outcomes uncertain. Because of these uncertainties, project costs can rise dramatically in an attempt to come closer to project goals. All of the potential sources of error can be addressed to a certain degree through adaptive management. The first step is admitting that these uncertainties can exist and addressing as many of them as possible through planning and directed research prior to implementing the project. The second step is to evaluate uncertainties through hypothesis-driven experiments during project implementation. The third step is to use the monitoring program to evaluate and adjust the project as needed to improve the probability that the project reaches its goal. The fourth and final step is to use the information gained in the project to improve future projects. A framework that includes a clear goal statement, a conceptual model, and an evaluation framework can help in this adaptive restoration process. Projects and programs vary in their application of adaptive management in restoration, and it is very difficult to be highly prescriptive in applying adaptive management to projects that necessarily vary widely in scope, goal, ecosystem characteristics, and uncertainties. Very large ecosystem restoration programs in the Mississippi River delta (Coastal Wetlands Planning, Protection, and Restoration

  7. Incretin-based drugs and adverse pancreatic events: almost a decade later and uncertainty remains.

    PubMed

    Azoulay, Laurent

    2015-06-01

    Over the past few years, substantial clinical data have been presented showing that incretin-based therapies are effective glucose-lowering agents. Specifically, glucagon-like peptide 1 receptor agonists demonstrate an efficacy comparable to insulin treatment with minimal hypoglycemia and have favorable effects on body weight. Thus, many of the unmet clinical needs noted from prior therapies are addressed by these agents. However, even after many years of use, many continue to raise concerns about the long-term safety of these agents and, in particular, the concern with pancreatitis. This clearly remains a complicated topic. Thus, in this issue of Diabetes Care, we continue to update our readers on this very important issue by presenting two studies evaluating incretin-based medications and risk of pancreatitis. Both have undergone significant revisions based on peer review that provided significant clarification of the data. We applaud both author groups for being extremely responsive in providing the additional data and revisions requested by the editorial team. As such, because of the critical peer review, we feel both articles achieve the high level we require for Diabetes Care and are pleased to now present them to our readers. In keeping with our aim to comprehensively evaluate this topic, we asked for additional commentaries to be prepared. In the narrative outlined below, Dr. Laurent Azoulay provides a commentary about the remaining uncertainty in this area and also discusses the results from a nationwide population-based case-control study. In the narrative preceding Dr. Azoulay's contribution, Prof. Edwin A.M. Gale provides a commentary on the report that focuses on clinical trials of liraglutide in the treatment of diabetes. From the journal's perspective, both of the articles on pancreatitis and incretin-based therapies reported in this issue have been well vetted, and we feel both of the commentaries are insightful. PMID:25998285

  8. Addressing uncertainty in rock properties through geostatistical simulation

    SciTech Connect

    McKenna, S.A.; Rautman, A.; Cromer, M.V.; Zelinski, W.P.

    1996-09-01

    Fracture and matrix properties in a sequence of unsaturated, welded tuffs at Yucca Mountain, Nevada, are modeled in two-dimensional cross-sections through geostatistical simulation. In the absence of large amounts of sample data, an interpretive, deterministic, stratigraphic model is coupled with a Gaussian simulation algorithm to constrain realizations of both matrix porosity and fracture frequency. Use of the deterministic, stratigraphic model imposes scientific judgment, in the form of a conceptual geologic model, onto the property realizations. Linear coregionalization and a regression relationship between matrix porosity and matrix hydraulic conductivity are used to generate realizations of matrix hydraulic conductivity. Fracture-frequency simulations conditioned on the stratigraphic model represent one class of fractures (cooling fractures) in the conceptual model of the geology. A second class of fractures (tectonic fractures) is conceptualized as fractures that cut across strata vertically and includes discrete features such as fault zones. Indicator geostatistical simulation provides locations of this second class of fractures. The indicator realizations are combined with the realizations of fracture spacing to create realizations of fracture frequency that are a combination of both classes of fractures. Evaluations of the resulting realizations include comparing vertical profiles of rock properties within the model to those observed in boreholes and checking intra-unit property distributions against collected data. Geostatistical simulation provides an efficient means of addressing spatial uncertainty in dual continuum rock properties.
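
    As a concrete illustration of the kind of workflow this abstract describes, the sketch below generates an unconditional Gaussian realization of matrix porosity along a 1-D vertical profile and maps it to hydraulic conductivity through a log-linear regression. The exponential covariance model, correlation length, and regression coefficients are illustrative assumptions, not values from the Yucca Mountain study, and the sketch omits conditioning on a stratigraphic model.

```python
import numpy as np

rng = np.random.default_rng(42)

# 1-D vertical grid (metres) standing in for one column of a cross-section
z = np.arange(0.0, 100.0, 1.0)

# Exponential covariance for porosity; range and variance are assumed values
corr_len, sigma2, mean_por = 10.0, 0.004, 0.12
dist = np.abs(z[:, None] - z[None, :])
cov = sigma2 * np.exp(-dist / corr_len)

# Unconditional Gaussian realization via Cholesky factorization
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(z)))
porosity = mean_por + L @ rng.standard_normal(len(z))
porosity = np.clip(porosity, 0.01, 0.35)  # keep values physically plausible

# Assumed log-linear regression: log10(K) = a + b * porosity + residual
a, b, resid_sd = -11.0, 20.0, 0.5
log10_K = a + b * porosity + resid_sd * rng.standard_normal(len(z))
K = 10.0 ** log10_K  # matrix hydraulic conductivity, m/s

print(f"porosity: mean={porosity.mean():.3f}, sd={porosity.std():.3f}")
print(f"log10(K): min={log10_K.min():.1f}, max={log10_K.max():.1f}")
```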

  9. Analytical algorithms to quantify the uncertainty in remaining useful life prediction

    NASA Astrophysics Data System (ADS)

    Sankararaman, S.; Daigle, M.; Saxena, A.; Goebel, K.

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and, sometimes, better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.

  10. Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai

    2013-01-01

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and, sometimes, better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (Inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
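
    A minimal sketch of the contrast drawn in this abstract, using a toy linear degradation model rather than the paper's lithium-ion battery state-space model: FOSM propagates the mean and variance of the inputs through a linearized RUL function, while a Monte Carlo run serves as the conventional sampling-based baseline. All numerical values are assumptions for illustration.

```python
import numpy as np

# Toy linear degradation model: x(t) = x0 + r * t, failure at x = x_fail.
# RUL g(x0, r) = (x_fail - x0) / r, with x0 and r independent Gaussians.
x_fail = 1.0
mu_x0, sd_x0 = 0.30, 0.05   # current state estimate and its uncertainty
mu_r,  sd_r  = 0.01, 0.002  # degradation rate and its uncertainty

# --- FOSM: first-order Taylor expansion of g about the input means ---
mu_rul = (x_fail - mu_x0) / mu_r
dg_dx0 = -1.0 / mu_r
dg_dr  = -(x_fail - mu_x0) / mu_r**2
var_rul = dg_dx0**2 * sd_x0**2 + dg_dr**2 * sd_r**2
print(f"FOSM:        RUL mean={mu_rul:.1f}, sd={np.sqrt(var_rul):.1f}")

# --- Monte Carlo verification (the sampling-based baseline) ---
rng = np.random.default_rng(0)
n = 100_000
x0 = rng.normal(mu_x0, sd_x0, n)
r = np.maximum(rng.normal(mu_r, sd_r, n), 1e-6)  # clip nonphysical rates
rul = (x_fail - x0) / r
print(f"Monte Carlo: RUL mean={rul.mean():.1f}, sd={rul.std():.1f}")
print(f"5th/95th percentile bounds: {np.percentile(rul, [5, 95])}")
```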

  11. Addressing uncertainty in fecal indicator bacteria dark inactivation rates.

    PubMed

    Gronewold, Andrew D; Myers, Luke; Swall, Jenise L; Noble, Rachel T

    2011-01-01

    Assessing the potential threat of fecal contamination in surface water often depends on model forecasts which assume that fecal indicator bacteria (FIB, a proxy for the concentration of pathogens found in fecal contamination from warm-blooded animals) are lost or removed from the water column at a certain rate (often referred to as an "inactivation" rate). In efforts to reduce human health risks in these water bodies, regulators enforce limits on easily-measured FIB concentrations, commonly reported as most probable number (MPN) and colony forming unit (CFU) values. Accurate assessment of the potential threat of fecal contamination, therefore, depends on propagating uncertainty surrounding "true" FIB concentrations into MPN and CFU values, inactivation rates, model forecasts, and management decisions. Here, we explore how empirical relationships between FIB inactivation rates and extrinsic factors might vary depending on how uncertainty in MPN values is expressed. Using water samples collected from the Neuse River Estuary (NRE) in eastern North Carolina, we compare Escherichia coli (EC) and Enterococcus (ENT) dark inactivation rates derived from two statistical models of first-order loss; a conventional model employing ordinary least-squares (OLS) regression with MPN values, and a novel Bayesian model utilizing the pattern of positive wells in an IDEXX Quanti-Tray®/2000 test. While our results suggest that EC dark inactivation rates tend to decrease as initial EC concentrations decrease and that ENT dark inactivation rates are relatively consistent across different ENT concentrations, we find these relationships depend upon model selection and model calibration procedures. We also find that our proposed Bayesian model provides a more defensible approach to quantifying uncertainty in microbiological assessments of water quality than the conventional MPN-based model, and that our proposed model represents a new strategy for developing robust relationships between
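
    The conventional approach contrasted in this abstract fits a first-order loss model to log-transformed MPN values by ordinary least squares; the paper's Bayesian alternative instead works directly from the pattern of positive Quanti-Tray wells. A minimal sketch of the OLS variant on synthetic data, with an assumed "true" rate:

```python
import numpy as np

# First-order loss model: C(t) = C0 * exp(-k * t), so
# ln C(t) = ln C0 - k * t, and OLS on log concentrations gives the dark
# inactivation rate k.  The data below are synthetic, for illustration.
rng = np.random.default_rng(1)
t = np.array([0.0, 24.0, 48.0, 72.0, 96.0])   # hours since sampling
true_k, C0 = 0.035, 2000.0                    # assumed "truth"
mpn = C0 * np.exp(-true_k * t) * rng.lognormal(0.0, 0.15, t.size)

# OLS fit of ln(MPN) against time; the slope estimates -k
slope, intercept = np.polyfit(t, np.log(mpn), deg=1)
k_hat = -slope
print(f"estimated dark inactivation rate k = {k_hat:.4f} per hour")
```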

  12. Addressing Uncertainty in Fecal Indicator Bacteria Dark Inactivation Rates

    EPA Science Inventory

    Fecal contamination is a leading cause of surface water quality degradation. Roughly 20% of all total maximum daily load assessments approved by the United States Environmental Protection Agency since 1995, for example, address water bodies with unacceptably high fecal indicator...

  13. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    NASA Astrophysics Data System (ADS)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased, and often too optimistic estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  14. An Assessment of Uncertainty in Remaining Life Estimation for Nuclear Structural Materials

    SciTech Connect

    Ramuhalli, Pradeep; Griffin, Jeffrey W.; Fricke, Jacob M.; Bond, Leonard J.

    2012-12-01

    In recent years, several operating US light-water nuclear power reactors (LWRs) have moved to extended-life operations (from 40 years to 60 years), and there is interest in the feasibility of extending plant life to 80 years. Operating experience suggests that material degradation of structural components in LWRs (such as the reactor pressure vessel) is expected to be the limiting factor for safe operation during extended life. Therefore, a need exists for assessing the condition of LWR structural components and determining their remaining useful life (RUL). The ability to estimate the RUL of degraded structural components provides a basis for determining safety margins (i.e., whether safe operation over some pre-determined time horizon is possible) and for scheduling degradation management activities (such as potentially modifying operating conditions to limit further degradation growth). A key issue in RUL estimation is the calculation of uncertainty bounds, which are dependent on the current material state as well as past and future stressor levels (such as time-at-temperature, pressure, and irradiation). This paper presents a preliminary empirical investigation into the uncertainty of RUL estimates for nuclear structural materials.

  15. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    § 82.19 How will NIOSH address uncertainty about dose levels? The estimate of each annual dose will be characterized with a probability distribution that accounts for the uncertainty of the estimate. This information will be used by DOL in the calculation of probability of causation, under HHS guidelines for calculating probability of causation estimates at 42 CFR 81. In this way, claimants will receive the benefit...

  16. Addressing the Uncertainty in Prescribing High Flows for River Restoration

    NASA Astrophysics Data System (ADS)

    Downs, P. W.; Sklar, L.; Braudrick, C. A.

    2002-12-01

    Flow prescriptions for environmental benefit in regulated rivers are commonly focused on the provision of minimum flow depths to achieve fish passage and holding habitat objectives. Assessment of these flows can be achieved readily and with reasonable confidence by using low-flow hydrological records and channel morphology data in combination with one dimensional hydraulic modeling. More recently, as understanding has increased of the critical role played by high flows in maintaining a wide range of habitats for instream and riparian flora and fauna, attention has turned to prescribing high flows to invoke the geomorphic processes that maintain suitable habitat niches. Prediction of the effects of these flows may require high-flow discharge and sediment transport data, high resolution topographic data, hydraulic and sediment transport modeling (often in two or three spatial dimensions), knowledge of the watershed historical context, and an understanding of the thresholds for channel morphological change. Not surprisingly, the associated level of uncertainty in this analysis increases tremendously. High flows are defined by a combination of magnitude, frequency, timing and duration parameters and their impact varies according to antecedent events. High flow bedload sediment transport records are rare, sediment transport equations are reliable usually to only an order of magnitude, practical applications of two and three-dimensional sediment transport models are in their infancy, the watershed historical record may be patchy with the link between cause and effect difficult to ascertain, and thresholds of channel morphological change are poorly understood. As the first step in reducing uncertainty, it is essential to state precisely the ecological target objectives of prescribed high flows, and to link these objectives to the hydraulic and geomorphic thresholds to be achieved or exceeded. Such thresholds provide the basis for a systematic classification of high flows

  17. Addressing sources of uncertainty in a global terrestrial carbon model

    NASA Astrophysics Data System (ADS)

    Exbrayat, J.; Pitman, A. J.; Zhang, Q.; Abramowitz, G.; Wang, Y.

    2013-12-01

    Several sources of uncertainty exist in the parameterization of the land carbon cycle in current Earth System Models (ESMs). For example, recently implemented interactions between the carbon (C), nitrogen (N) and phosphorus (P) cycles lead to diverse changes in land-atmosphere C fluxes simulated by different models. Further, although soil organic matter decomposition is commonly parameterized as a first-order decay process, the formulation of the microbial response to changes in soil moisture and soil temperature varies tremendously between models. Here, we examine the sensitivity of historical land-atmosphere C fluxes simulated by an ESM to these two major sources of uncertainty. We implement three soil moisture (SMRF) and three soil temperature (STRF) respiration functions in the CABLE-CASA-CNP land biogeochemical component of the coarse resolution CSIRO Mk3L climate model. Simulations are undertaken using three degrees of biogeochemical nutrient limitation: C-only, C and N, and C and N and P. We first bring all 27 possible combinations of a SMRF with a STRF and a biogeochemical mode to a steady-state in their biogeochemical pools. Then, transient historical (1850-2005) simulations are driven by prescribed atmospheric CO2 concentrations used in the fifth phase of the Coupled Model Intercomparison Project (CMIP5). Similarly to some previously published results, representing N and P limitation on primary production reduces the global land carbon sink while some regions become net C sources over the historical period (1850-2005). However, the uncertainty due to the SMRFs and STRFs does not decrease relative to the inter-annual variability in net uptake when N and P limitations are added. Differences in the SMRFs and STRFs and their effect on the soil C balance can also change the sign of some regional sinks. We show that this response is mostly driven by the pool size achieved at the end of the spin-up procedure. Further, there exists a six-fold range in the level
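
    The abstract's point about divergent microbial response formulations can be made concrete with a small sketch: two common textbook soil temperature response functions (a Q10 form and an Arrhenius-type form) combined with a parabolic soil moisture response, applied to the same first-order decay model. The functional forms and parameter values are generic assumptions, not those of CABLE-CASA-CNP.

```python
import numpy as np

# First-order soil C decay with multiplicative temperature (STRF) and
# moisture (SMRF) modifiers: dC/dt = -k_base * f(T) * g(W) * C.
# The STRFs and SMRF below are common textbook forms, used here only to
# illustrate how the choice of function changes the simulated flux.

def strf_q10(T, q10=2.0, T_ref=25.0):
    """Q10 temperature response, relative to T_ref (deg C)."""
    return q10 ** ((T - T_ref) / 10.0)

def strf_arrhenius(T, Ea=60e3, T_ref=25.0, R=8.314):
    """Arrhenius-type temperature response, relative to T_ref (deg C)."""
    TK, TrefK = T + 273.15, T_ref + 273.15
    return np.exp(-Ea / R * (1.0 / TK - 1.0 / TrefK))

def smrf_optimum(W, W_opt=0.6):
    """Parabolic moisture response, peaking at relative wetness W_opt."""
    return np.clip(1.0 - ((W - W_opt) / W_opt) ** 2, 0.0, 1.0)

k_base, C, T, W = 0.02, 1000.0, 15.0, 0.4   # assumed state and climate
for name, strf in [("Q10", strf_q10), ("Arrhenius", strf_arrhenius)]:
    flux = k_base * strf(T) * smrf_optimum(W) * C
    print(f"{name:9s} STRF: heterotrophic respiration = {flux:.2f} gC/m2/yr")
```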

  18. Remaining Uncertainties in the Causes of Past and Future Atlantic Hurricane Activity

    NASA Astrophysics Data System (ADS)

    Kossin, J. P.

    2014-12-01

    There is no debate that hurricane activity in the North Atlantic has increased substantially since the relatively quiescent period of the 1970s and 1980s, but there is still uncertainty in the dominant cause of the increase. Increases in anthropogenic greenhouse gases (aGHG) have contributed to the observed increase in tropical sea surface temperatures (SST) over the past century, while shorter-term decadal variability in regions where hurricanes form and track is generally dominated by 1) internal variability, 2) natural factors such as volcanic eruptions and mineral aerosol variability, and 3) changes in anthropogenic aerosols. Direct SST warming from globally well-mixed aGHG is understood to have a much smaller effect on hurricane formation and intensification compared to the effect of regional warming due to changes in the three factors noted above. While most recent papers implicate both internal and external anthropogenic causes for the presently heightened Atlantic hurricane activity, some show that internal variability dominates and others show that anthropogenic factors dominate. In the Atlantic, model projection-based consensus indicates no change in storm frequency over the next century but the uncertainty is large and spans -50% to +50%. Mean storm intensity and rainfall rates are projected to increase with continued warming, and the models tend to agree better when projecting these measures of activity. Models that are capable of producing very strong hurricanes usually project increases in the frequency of the most intense hurricanes. This measure is highly relevant to physical and societal impacts. In the Atlantic, model-based consensus indicates substantial increases in the strongest hurricanes, but the uncertainty is large and spans -100% to +200% change over the next century.

  19. Addressing Uncertainty in the ISCORS Multimedia Radiological Dose Assessment of Municipal Sewage Sludge and Ash

    NASA Astrophysics Data System (ADS)

    Chiu, W. A.; Bachmaier, J.; Bastian, R.; Hogan, R.; Lenhart, T.; Schmidt, D.; Wolbarst, A.; Wood, R.; Yu, C.

    2002-05-01

    Managing municipal wastewater at publicly owned treatment works (POTWs) leads to the production of considerable amounts of residual solid material, which is known as sewage sludge or biosolids. If the wastewater entering a POTW contains radioactive material, then the treatment process may concentrate radionuclides in the sludge, leading to possible exposure of the general public or the POTW workers. The Sewage Sludge Subcommittee of the Interagency Steering Committee on Radiation Standards (ISCORS), which consists of representatives from the Environmental Protection Agency, the Nuclear Regulatory Commission, the Department of Energy, and several other federal, state, and local agencies, is developing guidance for POTWs on the management of sewage sludge that may contain radioactive materials. As part of this effort, they are conducting an assessment of potential radiation exposures using the Department of Energy's RESidual RADioactivity (RESRAD) family of computer codes developed by Argonne National Laboratory. This poster describes several approaches used by the Subcommittee to address the uncertainties associated with their assessment. For instance, uncertainties in the source term are addressed through a combination of analytic and deterministic computer code calculations. Uncertainties in the exposure pathways are addressed through the specification of a number of hypothetical scenarios, some of which can be scaled to address changes in exposure parameters. In addition, uncertainty in some physical and behavioral parameters is addressed through probabilistic methods.

  20. Dry deposition of reactive nitrogen to marine environments: recent advances and remaining uncertainties.

    PubMed

    Pryor, Sara C; Sørensen, Lise Lotte

    2002-12-01

    Many highly productive marine ecosystems exhibit nitrogen limitation or co-limitation. This article is a status review of research into the exchange of nitrogen between the atmosphere and these ecosystems, with a particular focus on reactive nitrogen compounds. A summary of research conducted over the past ten years is presented, and a perspective is given as to remaining uncertainties and research needs. Looking toward the development of coastal management modeling tools, we illustrate the processes that need to be resolved in order to accurately simulate the flux from the atmosphere and provide guidance on the required resolution of such models. PMID:12523536

  21. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    SciTech Connect

    Cooke, Roger; MacDonell, Margaret

    2007-07-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
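
    The newsboy logic translates directly into a critical-fractile calculation: the optimal target sits where the cumulative probability of the uncertain threshold equals the ratio of the competing regret costs. The sketch below assumes a lognormal threshold distribution and stakeholder regret weights purely for illustration; in the paper, that distribution would come from structured expert judgment elicitation.

```python
from scipy.stats import lognorm

# Newsboy-style balancing of regrets for a cleanup limit x against an
# uncertain health-effect threshold T.  With regret c_under per unit of
# under-protection (x above T) and c_over per unit of over-cleanup
# (x below T), the expected regret is minimized where
#     F(x*) = c_over / (c_under + c_over),
# the critical fractile of the threshold distribution F.  The lognormal
# distribution and regret weights below are illustrative assumptions.
c_under, c_over = 9.0, 1.0                  # stakeholder "regret" weights
fractile = c_over / (c_under + c_over)      # = 0.10 here

threshold = lognorm(s=0.8, scale=50.0)      # uncertain threshold T
target = threshold.ppf(fractile)            # protective low percentile
print(f"critical fractile = {fractile:.2f}")
print(f"health-based target concentration = {target:.1f} (assumed units)")
```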

  22. Empirical estimates to reduce modeling uncertainties of soil organic carbon in permafrost regions: a review of recent progress and remaining challenges

    USGS Publications Warehouse

    Mishra, U.; Jastrow, J.D.; Matamala, R.; Hugelius, G.; Koven, C.D.; Harden, J.W.; Ping, S.L.; Michaelson, G.J.; Fan, Z.; Miller, R.M.; McGuire, A.D.; Tarnocai, C.; Kuhry, P.; Riley, W.J.; Schaefer, K.; Schuur, E.A.G.; Jorgenson, M.T.; Hinzman, L.D.

    2013-01-01

    The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon–climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges.

  23. Empirical estimates to reduce modeling uncertainties of soil organic carbon in permafrost regions: a review of recent progress and remaining challenges

    NASA Astrophysics Data System (ADS)

    Mishra, U.; Jastrow, J. D.; Matamala, R.; Hugelius, G.; Koven, C. D.; Harden, J. W.; Ping, C. L.; Michaelson, G. J.; Fan, Z.; Miller, R. M.; McGuire, A. D.; Tarnocai, C.; Kuhry, P.; Riley, W. J.; Schaefer, K.; Schuur, E. A. G.; Jorgenson, M. T.; Hinzman, L. D.

    2013-09-01

    The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon-climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges.

  24. Application of fuzzy system theory in addressing the presence of uncertainties

    SciTech Connect

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-03

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Addressing uncertainty is necessary to prevent the failure of engineering materials. Three types of uncertainty are distinguished: stochastic, epistemic, and error uncertainty. This paper considers epistemic uncertainty, which arises from incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than statistical approaches for interpreting uncertainty when data are lacking. Fuzzy system theory comprises several steps: the crisp inputs are first converted to fuzzy inputs through a fuzzification process, followed by the main step, a mapping process that captures the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically propagated based on the extension principle. In the final stage, a defuzzification process converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.

  25. Application of fuzzy system theory in addressing the presence of uncertainties

    NASA Astrophysics Data System (ADS)

    Yusmye, A. Y. N.; Goh, B. Y.; Adnan, N. F.; Ariffin, A. K.

    2015-02-01

    In this paper, combinations of fuzzy system theory with finite element methods are presented and discussed as a way to deal with uncertainties. Addressing uncertainty is necessary to prevent the failure of engineering materials. Three types of uncertainty are distinguished: stochastic, epistemic, and error uncertainty. This paper considers epistemic uncertainty, which arises from incomplete information and a lack of knowledge or data. Fuzzy system theory is a non-probabilistic method, and it is more appropriate than statistical approaches for interpreting uncertainty when data are lacking. Fuzzy system theory comprises several steps: the crisp inputs are first converted to fuzzy inputs through a fuzzification process, followed by the main step, a mapping process that captures the logical relationship between two or more entities. In this study, the fuzzy inputs are numerically propagated based on the extension principle. In the final stage, a defuzzification process converts the fuzzy outputs back to crisp outputs. Several illustrative examples are given, and the simulations show that the proposed method produces more conservative results than the conventional finite element method.
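
    A minimal sketch of the extension-principle mapping step described in these two records, carried out numerically with alpha-cuts: for a monotonic response function, the image of each alpha-cut interval of a triangular fuzzy input is just the interval between the images of its endpoints. The response function and fuzzy load are assumed for illustration and are not taken from the paper.

```python
import numpy as np

# Extension-principle propagation of a triangular fuzzy input through a
# monotonic response function via alpha-cuts.

def response(load):
    """Assumed monotonic structural response (e.g., tip displacement)."""
    return 0.002 * load ** 1.1

# Triangular fuzzy load (kN): support [90, 110], core at 100
lo, mode, hi = 90.0, 100.0, 110.0
for a in np.linspace(0.0, 1.0, 6):
    # The alpha-cut of a triangular fuzzy number is a closed interval
    left = lo + a * (mode - lo)
    right = hi - a * (hi - mode)
    y_left, y_right = response(left), response(right)
    print(f"alpha={a:.1f}: load in [{left:6.1f}, {right:6.1f}] kN "
          f"-> displacement in [{y_left:.4f}, {y_right:.4f}] m")
```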

  26. Addressing Uncertainty in Contaminant Transport in Groundwater Using the Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Dwivedi, D.; Mohanty, B. P.

    2011-12-01

    Nitrate concentrations in groundwater show significant uncertainty, which arises from sparse data and interaction among multiple geophysical factors such as source availability (land use), thickness and composition of the vadose zone, types of aquifers (confined or unconfined), aquifer heterogeneity (geological and alluvial), precipitation characteristics, etc. This work presents the fusion of the ensemble Kalman filter (EnKF) with the numerical groundwater flow model MODFLOW and the solute transport model MT3DMS. The EnKF is a sequential data assimilation approach, which is applied to quantify and reduce the uncertainty of groundwater flow and solute transport models. We conducted numerical simulation experiments for the period January 1990 to December 2005 with the MODFLOW and MT3DMS models for variably saturated groundwater flow in various aquifers across Texas. The EnKF was used to update the model parameters, hydraulic conductivity, hydraulic head, and solute concentration. Results indicate that the EnKF method notably improves the estimation of the hydraulic conductivity distribution and the solute transport prediction by assimilating piezometric head measurements with a known nitrate initial condition. A better estimation of hydraulic conductivity and assimilation of continuous measurements of solute concentrations resulted in reduced uncertainty in the MODFLOW and MT3DMS models. It was found that estimates at the observation locations and at locations in spatial proximity were appropriately corrected by the EnKF. The knowledge of nitrate plume evolution provided insight into model structure, parameters, and sources of uncertainty.
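
    A minimal sketch of the EnKF analysis step at the core of this approach: an ensemble of augmented state vectors (log conductivity, head, nitrate) is updated with a single head observation using the ensemble covariance. The three-variable state, covariances, and observation values are toy assumptions; in the study, the forecast ensemble would come from MODFLOW/MT3DMS runs.

```python
import numpy as np

# One EnKF analysis step with a perturbed-observation (stochastic) update.
rng = np.random.default_rng(7)
n_ens = 100

# Forecast ensemble: columns of X are members, rows are state variables
# [log10 K, head (m), nitrate (mg/L)]
X = rng.multivariate_normal(
    mean=[-4.0, 25.0, 8.0],
    cov=[[0.25, 0.10, 0.05],
         [0.10, 1.00, 0.20],
         [0.05, 0.20, 4.00]],
    size=n_ens,
).T                                      # shape (3, n_ens)

H = np.array([[0.0, 1.0, 0.0]])          # we observe head only
obs, obs_var = 24.2, 0.1                 # piezometric head measurement

# Kalman gain from ensemble covariance: K = P H^T (H P H^T + R)^-1
P = np.cov(X)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_var)

# Perturbed observations keep the analysis ensemble spread consistent
d = obs + rng.normal(0.0, np.sqrt(obs_var), n_ens)
X_a = X + K @ (d[None, :] - H @ X)

print("forecast mean:", X.mean(axis=1).round(2))
print("analysis mean:", X_a.mean(axis=1).round(2))
```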

  27. 42 CFR 82.19 - How will NIOSH address uncertainty about dose levels?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    The estimate of each annual dose will be characterized with a probability distribution that accounts for the uncertainty of the estimate. This information will be used by DOL in the calculation of probability of causation, under HHS guidelines for calculating probability of causation estimates at 42 CFR 81. In this way, claimants will receive the...

  28. DEVELOPMENTS AT U.S. EPA IN ADDRESSING UNCERTAINTY IN RISK ASSESSMENT

    EPA Science Inventory

    An emerging trend in risk assessment is to be more explicit about uncertainties, both during the analytical procedures and in communicating results. In February 1 992, then-Deputy EPA Administrator Henry Habicht set out Agency goals in a memorandum stating that the Agency will "p...

  29. Towards a common oil spill risk assessment framework – Adapting ISO 31000 and addressing uncertainties.

    PubMed

    Sepp Neves, Antonio Augusto; Pinardi, Nadia; Martins, Flavio; Janeiro, Joao; Samaras, Achilleas; Zodiatis, George; De Dominicis, Michela

    2015-08-15

    Oil spills are a transnational problem, and establishing a common standard methodology for Oil Spill Risk Assessments (OSRAs) is thus paramount in order to protect marine environments and coastal communities. In this study we firstly identified the strengths and weaknesses of the OSRAs carried out in various parts of the globe. We then searched for a generic and recognized standard, i.e. ISO 31000, in order to design a method to perform OSRAs in a scientific and standard way. The new framework was tested for the Lebanon oil spill that occurred in 2006 employing ensemble oil spill modeling to quantify the risks and uncertainties due to unknown spill characteristics. The application of the framework generated valuable visual instruments for the transparent communication of the risks, replacing the use of risk tolerance levels, and thus highlighting the priority areas to protect in case of an oil spill. PMID:26067897

  30. The Role of Health Education in Addressing Uncertainty about Health and Cell Phone Use--A Commentary

    ERIC Educational Resources Information Center

    Ratnapradipa, Dhitinut; Dundulis, William P., Jr.; Ritzel, Dale O.; Haseeb, Abdul

    2012-01-01

    Although the fundamental principles of health education remain unchanged, the practice of health education continues to evolve in response to the rapidly changing lifestyles and technological advances. Emerging health risks are often associated with these lifestyle changes. The purpose of this article is to address the role of health educators…

  31. Addressing the impact of environmental uncertainty in plankton model calibration with a dedicated software system: the Marine Model Optimization Testbed (MarMOT)

    NASA Astrophysics Data System (ADS)

    Hemmings, J. C. P.; Challenor, P. G.

    2011-08-01

    A wide variety of different marine plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. The Marine Model Optimization Testbed is a new software tool designed for rigorous analysis of plankton models in a multi-site 1-D framework, in particular to address uncertainty issues in model assessment. A flexible user interface ensures its suitability to more general inter-comparison, sensitivity and uncertainty analyses, including model comparison at the level of individual processes, and to state estimation for specific locations. The principal features of MarMOT are described and its application to model calibration is demonstrated by way of a set of twin experiments, in which synthetic observations are assimilated in an attempt to recover the true parameter values of a known system. The experimental aim is to investigate the effect of different misfit weighting schemes on parameter recovery in the presence of error in the plankton model's environmental input data. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergences of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error over an annual cycle, indicating

  32. Addressing the impact of environmental uncertainty in plankton model calibration with a dedicated software system: the Marine Model Optimization Testbed (MarMOT 1.1 alpha)

    NASA Astrophysics Data System (ADS)

    Hemmings, J. C. P.; Challenor, P. G.

    2012-04-01

    A wide variety of different plankton system models have been coupled with ocean circulation models, with the aim of understanding and predicting aspects of environmental change. However, an ability to make reliable inferences about real-world processes from the model behaviour demands a quantitative understanding of model error that remains elusive. Assessment of coupled model output is inhibited by relatively limited observing system coverage of biogeochemical components. Any direct assessment of the plankton model is further inhibited by uncertainty in the physical state. Furthermore, comparative evaluation of plankton models on the basis of their design is inhibited by the sensitivity of their dynamics to many adjustable parameters. Parameter uncertainty has been widely addressed by calibrating models at data-rich ocean sites. However, relatively little attention has been given to quantifying uncertainty in the physical fields required by the plankton models at these sites, and tendencies in the biogeochemical properties due to the effects of horizontal processes are often neglected. Here we use model twin experiments, in which synthetic data are assimilated to estimate a system's known "true" parameters, to investigate the impact of error in a plankton model's environmental input data. The experiments are supported by a new software tool, the Marine Model Optimization Testbed, designed for rigorous analysis of plankton models in a multi-site 1-D framework. Simulated errors are derived from statistical characterizations of the mixed layer depth, the horizontal flux divergence tendencies of the biogeochemical tracers and the initial state. Plausible patterns of uncertainty in these data are shown to produce strong temporal and spatial variability in the expected simulation error variance over an annual cycle, indicating variation in the significance attributable to individual model-data differences. An inverse scheme using ensemble-based estimates of the
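
    A minimal sketch of the twin-experiment idea behind these two records: synthetic observations are generated from a model run with known "true" parameters, noise is added, and the parameter is recovered by minimizing a weighted misfit. The logistic growth model stands in for a plankton model purely for illustration and is not MarMOT itself.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def simulate(growth, n_days=60, p0=0.1, capacity=1.0):
    """Logistic growth as a stand-in for a plankton state variable."""
    p = np.empty(n_days)
    p[0] = p0
    for i in range(1, n_days):
        p[i] = p[i - 1] + growth * p[i - 1] * (1.0 - p[i - 1] / capacity)
    return p

# Twin experiment: synthetic truth plus observation noise
rng = np.random.default_rng(3)
true_growth, obs_sd = 0.20, 0.05
truth = simulate(true_growth)
obs_idx = np.arange(0, 60, 5)                 # sparse observing system
obs = truth[obs_idx] + rng.normal(0.0, obs_sd, obs_idx.size)

def misfit(growth):
    """Weighted sum-of-squares misfit against the synthetic observations."""
    model = simulate(growth)[obs_idx]
    return np.sum(((model - obs) / obs_sd) ** 2)

result = minimize_scalar(misfit, bounds=(0.01, 1.0), method="bounded")
print(f"true growth rate: {true_growth}, recovered: {result.x:.3f}")
```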

  33. Eliciting climate experts' knowledge to address model uncertainties in regional climate projections: a case study of Guanacaste, Northwest Costa Rica

    NASA Astrophysics Data System (ADS)

    Grossmann, I.; Steyn, D. G.

    2014-12-01

    Global general circulation models typically cannot provide the detailed and accurate regional climate information required by stakeholders for climate adaptation efforts, given their limited capacity to resolve the regional topography and changes in local sea surface temperature, wind and circulation patterns. The study region in Northwest Costa Rica has a tropical wet-dry climate with a double-peak wet season. During the dry season the central Costa Rican mountains prevent tropical Atlantic moisture from reaching the region. Most of the annual precipitation is received following the northward migration of the ITCZ in May that allows the region to benefit from moist southwesterly flow from the tropical Pacific. The wet season begins with a short period of "early rains" and is interrupted by the mid-summer drought associated with the intensification and westward expansion of the North Atlantic subtropical high in late June. Model projections for the 21st century indicate a lengthening and intensification of the mid-summer drought and a weakening of the early rains on which current crop cultivation practices rely. We developed an expert elicitation to systematically address uncertainties in the available model projections of changes in the seasonal precipitation pattern. Our approach extends an elicitation approach developed previously at Carnegie Mellon University. Experts in the climate of the study region or Central American climate were asked to assess the mechanisms driving precipitation during each part of the season, uncertainties regarding these mechanisms, expected changes in each mechanism in a warming climate, and the capacity of current models to reproduce these processes. To avoid overconfidence bias, a step-by-step procedure was followed to estimate changes in the timing and intensity of precipitation during each part of the season. The questions drew upon interviews conducted with the region's stakeholders to assess their climate information needs. This

  34. Optimal regeneration planning for old-growth forest: addressing scientific uncertainty in endangered species recovery through adaptive management

    USGS Publications Warehouse

    Moore, C.T.; Conroy, M.J.

    2006-01-01

    Stochastic and structural uncertainties about forest dynamics present challenges in the management of ephemeral habitat conditions for endangered forest species. Maintaining critical foraging and breeding habitat for the endangered red-cockaded woodpecker (Picoides borealis) requires an uninterrupted supply of old-growth forest. We constructed and optimized a dynamic forest growth model for the Piedmont National Wildlife Refuge (Georgia, USA) with the objective of perpetuating a maximum stream of old-growth forest habitat. Our model accommodates stochastic disturbances and hardwood succession rates, and uncertainty about model structure. We produced a regeneration policy that was indexed by current forest state and by current weight of evidence among alternative model forms. We used adaptive stochastic dynamic programming, which anticipates that model probabilities, as well as forest states, may change through time, with consequent evolution of the optimal decision for any given forest state. In light of considerable uncertainty about forest dynamics, we analyzed a set of competing models incorporating extreme, but plausible, parameter values. Under any of these models, forest silviculture practices currently recommended for the creation of woodpecker habitat are suboptimal. We endorse fully adaptive approaches to the management of endangered species habitats in which predictive modeling, monitoring, and assessment are tightly linked.
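
    The core computation can be sketched compactly: value iteration in which the expected transition is a belief-weighted mixture of two competing forest-dynamics models. The states, transition matrices, rewards, and model weights below are toy assumptions; in the paper's adaptive scheme, the model weights would themselves be updated from monitoring data as management proceeds, and the optimal decision would be indexed by both forest state and current weight of evidence.

```python
import numpy as np

# Value iteration over a belief-weighted mixture of two competing models.
# States: 0 = regenerating, 1 = mature, 2 = old growth.
# Actions: 0 = no action, 1 = regenerate (harvest and replant).
P = {  # P[action][model] is a 3x3 transition matrix (toy numbers)
    0: [np.array([[0.7, 0.3, 0.0], [0.0, 0.8, 0.2], [0.1, 0.0, 0.9]]),
        np.array([[0.8, 0.2, 0.0], [0.0, 0.9, 0.1], [0.3, 0.0, 0.7]])],
    1: [np.array([[1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.9, 0.0, 0.1]]),
        np.array([[1.0, 0.0, 0.0], [0.8, 0.2, 0.0], [0.8, 0.0, 0.2]])],
}
weights = np.array([0.5, 0.5])       # current evidence for each model
reward = np.array([0.0, 0.2, 1.0])   # habitat value by forest state
gamma, n_states = 0.95, 3

V = np.zeros(n_states)
for _ in range(500):                 # value iteration to convergence
    Q = np.empty((2, n_states))
    for a in (0, 1):
        P_mix = weights[0] * P[a][0] + weights[1] * P[a][1]
        Q[a] = reward + gamma * P_mix @ V
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-9:
        break
    V = V_new

# With these toy numbers the passive action may dominate everywhere; the
# point is the mechanics of mixing models by their current weights.
print("optimal action by state (0=no action, 1=regenerate):", Q.argmax(axis=0))
```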

  35. MEETING IN TUCSON: MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental dec...

  36. MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental de...

  37. Effectiveness and Tradeoffs between Portfolios of Adaptation Strategies Addressing Future Climate and Socioeconomic Uncertainties in California's Central Valley

    NASA Astrophysics Data System (ADS)

    Tansey, M. K.; Van Lienden, B.; Das, T.; Munevar, A.; Young, C. A.; Flores-Lopez, F.; Huntington, J. L.

    2013-12-01

    The Central Valley of California is one of the major agricultural areas in the United States. The Central Valley Project (CVP) is operated by the Bureau of Reclamation to serve multiple purposes including generating approximately 4.3 million gigawatt hours of hydropower and providing, on average, 5 million acre-feet of water per year to irrigate approximately 3 million acres of land in the Sacramento, San Joaquin, and Tulare Lake basins, 600,000 acre-feet per year of water for urban users, and 800,000 acre-feet of annual supplies for environmental purposes. The development of effective adaptation and mitigation strategies requires assessing multiple risks including potential climate changes as well as uncertainties in future socioeconomic conditions. In this study, a scenario-based analytical approach was employed by combining three potential 21st century socioeconomic futures with six representative climate and sea level change projections developed using a transient hybrid delta ensemble method from an archive of 112 bias corrected spatially downscaled CMIP3 global climate model simulations to form 18 future socioeconomic-climate scenarios. To better simulate the effects of climate changes on agricultural water demands, analyses of historical agricultural meteorological station records were employed to develop estimates of future changes in solar radiation and atmospheric humidity from the GCM simulated temperature and precipitation. Projected changes in atmospheric carbon dioxide were computed directly by weighting SRES emissions scenarios included in each representative climate projection. These results were used as inputs to a calibrated crop water use, growth and yield model to simulate the effects of climate changes on the evapotranspiration and yields of major crops grown in the Central Valley. Existing hydrologic, reservoir operations, water quality, hydropower, greenhouse gas (GHG) emissions and both urban and agricultural economic models were integrated

  18. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes

    PubMed Central

    Curtis, Janelle M.R.

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along
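
    GRIP 2.0 itself is not reproduced here, but the kind of variance-based global sensitivity analysis it automates can be sketched with the standard Saltelli pick-freeze estimator of first-order Sobol' indices. The toy "extinction risk" model below and its three inputs are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    """Toy stand-in for a coupled SDM-population model: 'extinction risk'
    from habitat amount (x0), adult survival (x1), and disease pressure (x2)."""
    return 1.0 / (1.0 + np.exp(4 * x[:, 0] + 6 * x[:, 1] - 5 * x[:, 2] - 2))

n, d = 100_000, 3
A = rng.uniform(0, 1, (n, d))          # two independent input sample matrices
B = rng.uniform(0, 1, (n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                          # "freeze" all inputs except the i-th
    Si = np.mean(fB * (model(ABi) - fA)) / var   # first-order Sobol index (Saltelli 2010)
    print(f"S_{i} ~ {Si:.3f}")
```

    Unlike one-at-a-time perturbation, this estimator varies all inputs simultaneously, so interaction effects of the kind reported in the abstract show up as a gap between first-order and total-order indices.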

  19. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes.

    PubMed

    Naujokaitis-Lewis, Ilona; Curtis, Janelle M R

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along

  20. Addressing solar modulation and long-term uncertainties in scaling secondary cosmic rays for in situ cosmogenic nuclide applications [rapid communication]

    NASA Astrophysics Data System (ADS)

    Lifton, Nathaniel A.; Bieber, John W.; Clem, John M.; Duldig, Marc L.; Evenson, Paul; Humble, John E.; Pyle, Roger

    2005-10-01

    Solar modulation affects the secondary cosmic rays responsible for in situ cosmogenic nuclide (CN) production the most at the high geomagnetic latitudes to which CN production rates are traditionally referenced. While this has long been recognized (e.g., D. Lal, B. Peters, Cosmic ray produced radioactivity on the Earth, in: K. Sitte (Ed.), Handbuch Der Physik XLVI/2, Springer-Verlag, Berlin, 1967, pp. 551-612 and D. Lal, Theoretically expected variations in the terrestrial cosmic ray production rates of isotopes, in: G.C. Castagnoli (Ed.), Proceedings of the Enrico Fermi International School of Physics 95, Italian Physical Society, Varenna 1988, pp. 216-233), these variations can lead to potentially significant scaling model uncertainties that have not been addressed in detail. These uncertainties include the long-term (millennial-scale) average solar modulation level to which secondary cosmic rays should be referenced, and short-term fluctuations in cosmic ray intensity measurements used to derive published secondary cosmic ray scaling models. We have developed new scaling models for spallogenic nucleons, slow-muon capture and fast-muon interactions that specifically address these uncertainties. Our spallogenic nucleon scaling model, which includes data from portions of 5 solar cycles, explicitly incorporates a measure of solar modulation (S), and our fast- and slow-muon scaling models (based on more limited data) account for solar modulation effects through increased uncertainties. These models improve on previously published models by better sampling the observed variability in measured cosmic ray intensities as a function of geomagnetic latitude, altitude, and solar activity. Furthermore, placing the spallogenic nucleon data in a consistent time-space framework allows for a more realistic assessment of uncertainties in our model than in earlier ones. We demonstrate here that our models reasonably account for the effects of solar modulation on measured

  1. Procedures for addressing uncertainty and variability in exposure to characterize potential health risk from trichloroethylene contaminated groundwater at Beale Air Force Base in California

    SciTech Connect

    Bogen, K T; Daniels, J I; Hall, L C

    1999-09-01

    This study was designed to accomplish two objectives. The first was to provide the US Air Force and the regulatory community with quantitative procedures that they might want to consider for addressing uncertainty and variability in exposure, in order to better characterize potential health risk. Such methods could be used at sites where populations may now or in the future be faced with using groundwater contaminated with low concentrations of the chemical trichloroethylene (TCE). The second was to illustrate and explain the application of these procedures with respect to available data for TCE in ground water beneath an inactive landfill site that is undergoing remediation at Beale Air Force Base in California. The results from this illustration provide more detail than the more traditional conservative deterministic, screening-level calculations of risk, which were also computed for purposes of comparison. Application of the procedures described in this report can lead to more reasonable and equitable risk-acceptability criteria for potentially exposed populations at specific sites.

  2. Procedures for addressing uncertainty and variability in exposure to characterize potential health risk from trichloroethylene contaminated ground water at Beale Air Force Base in California

    SciTech Connect

    Daniels, J I; Bogen, K T; Hall, L C

    1999-10-05

    Conservative deterministic, screening-level calculations of exposure and risk commonly are used in quantitative assessments of potential human-health consequences from contaminants in environmental media. However, these calculations generally are based on multiple upper-bound point estimates of input parameters, particularly for exposure attributes, and can therefore produce results for decision makers that actually overstate the need for costly remediation. Alternatively, a more informative and quantitative characterization of health risk can be obtained by quantifying uncertainty and variability in exposure. This process is illustrated in this report for a hypothetical population at a specific site at Beale Air Force Base in California, where there is trichloroethylene (TCE) contaminated ground water and a potential for future residential use. When uncertainty and variability in exposure were addressed jointly for this case, the 95th-percentile upper-bound value of individual excess lifetime cancer risk was a factor approaching 10 lower than the most conservative deterministic estimate. Additionally, the probability of more than zero additional cases of cancer can be estimated, and in this case it is less than 0.5 for a hypothetical future residential population of up to 26,900 individuals present for any 7.6-y interval of a 70-y time period. Clearly, the results from application of this probabilistic approach can provide reasonable and equitable risk-acceptability criteria for a contaminated site.
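
    A minimal sketch of the two-dimensional Monte Carlo idea described above, with an outer loop for uncertainty and an inner loop for inter-individual variability. Every distribution and parameter value below is invented for illustration and is not taken from the Beale AFB assessment:

```python
import numpy as np

rng = np.random.default_rng(0)
n_unc, n_var = 1000, 2000   # outer (uncertainty) and inner (variability) sample sizes

# Outer loop: epistemic uncertainty in the cancer slope factor and TCE
# concentration (all values invented for illustration).
slope = rng.lognormal(mean=np.log(1e-2), sigma=0.7, size=n_unc)   # (mg/kg-d)^-1
conc = rng.lognormal(mean=np.log(5e-3), sigma=0.4, size=n_unc)    # mg/L in ground water

p95_risk = np.empty(n_unc)
for i in range(n_unc):
    # Inner loop: inter-individual variability in water intake per body weight.
    intake = rng.lognormal(mean=np.log(0.03), sigma=0.5, size=n_var)  # L/kg-day
    risk = slope[i] * conc[i] * intake                                # excess lifetime risk
    p95_risk[i] = np.quantile(risk, 0.95)   # 95th-percentile individual in this "world"

# Upper-bound characterization: uncertainty about the 95th-percentile individual's risk.
print(f"median of p95 risk: {np.median(p95_risk):.2e}")
print(f"95% upper bound   : {np.quantile(p95_risk, 0.95):.2e}")
```

    Separating the two loops is what lets the analysis report, for example, confidence about the risk to a highly exposed individual, rather than a single conflated bound.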

  3. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk from Trichloroethylene-Contaminated Ground Water at Beale Air Force Base in California: Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    SciTech Connect

    Bogen, K T

    2001-05-24

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability within a systematic probabilistic framework to integrate the joint effects on risk of distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such a framework was used to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_C) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_C based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10⁻⁶ and 10⁻⁴, respectively, while corresponding estimates based on traditional deterministic methods were >10⁻⁵ and 10⁻⁴, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The systematic probabilistic framework illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.
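
    One way to picture the unified framework is as a Monte Carlo mixture over the two mechanistic assumptions, with the mechanism itself sampled as an uncertain model choice. In the sketch below the mechanism weight, dose distribution, and dose-response forms are all invented for illustration and are not the report's models:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Epistemic weight on the genotoxic/linear mechanism (MA_G) vs. the
# cytotoxic/nonlinear mechanism (MA_C); all parameter values invented.
w_G = 0.5
dose = rng.lognormal(np.log(1e-3), 0.6, n)          # biologically effective dose (mg/kg-d)

risk_G = 1 - np.exp(-2.0 * dose)                    # linear-at-low-dose response
risk_C = np.where(dose > 5e-3,                      # threshold (nonlinear) response
                  1 - np.exp(-2.0 * (dose - 5e-3)), 0.0)

mechanism_G = rng.random(n) < w_G                   # sample the mechanistic assumption
risk = np.where(mechanism_G, risk_G, risk_C)        # unified (model-mixed) risk

print(f"mean risk       : {risk.mean():.2e}")
print(f"95% upper bound : {np.quantile(risk, 0.95):.2e}")
```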

  4. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk From Trichloroethylene-Contaminated Ground Water at Beale Air Force Base in California: Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response

    SciTech Connect

    Bogen, K.T.

    1999-09-29

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability after applying a unified probabilistic approach to the distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such an approach was applied to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_C) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_C based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10⁻⁶ and <10⁻⁴, respectively, while corresponding estimates based on traditional deterministic methods were >10⁻⁵ and >10⁻⁴, respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The unified approach illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  5. Propellant-remaining modeling

    NASA Technical Reports Server (NTRS)

    Torgovitsky, S.

    1991-01-01

    A successful satellite mission is predicated upon the proper maintenance of the spacecraft's orbit and attitude. One requirement for planning and predicting the orbit and attitude is the accurate estimation of the propellant remaining onboard the spacecraft. Focus is on the three methods that were developed for calculating the propellant budget: the errors associated with each method and the uncertainties in the variables required to determine the propellant remaining that contribute to these errors. Based on these findings, a strategy is developed for improved propellant-remaining estimation. The first method is based on Boyle's law, which relates the values of pressure, volume, and temperature (PVT) of an ideal gas. The PVT method is used for the monopropellant and the bipropellant engines. The second method is based on engine performance tests, which provide data that relate the thrust and specific impulse associated with a propellant tank to that tank's pressure. Two curves representing thrust and specific impulse as functions of pressure are then generated using a polynomial fit on the engine performance data. The third method involves a computer simulation of the propellant system. The propellant flow is modeled by creating a conceptual model of the propulsion system configuration, taking into account such factors as the propellant and pressurant tank characteristics, thruster functionality, and piping layout. Finally, a thrust calibration technique is presented that uses differential correction with the computer simulation method of propellant-remaining modeling. Thrust calibration provides a better assessment of thruster performance and therefore enables a more accurate estimation of propellant consumed during a given maneuver.
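
    A minimal sketch of the PVT idea: the pressurant gas load is fixed at tank fill, so the ideal-gas law converts a current pressure and temperature reading into ullage volume, and the balance of the tank is liquid propellant. All tank parameters below are invented for illustration:

```python
# Minimal PVT (pressure-volume-temperature) propellant gauging sketch.
# All tank parameters are invented for illustration.
R = 8.314          # J/(mol K)

V_TANK = 0.500     # total tank volume, m^3
RHO_PROP = 1010.0  # propellant density, kg/m^3 (hydrazine-like, assumed)

# Pressurant load determined at tank fill (known initial conditions):
P0, T0, V_ULLAGE0 = 2.2e6, 293.0, 0.050           # Pa, K, m^3
n_gas = P0 * V_ULLAGE0 / (R * T0)                 # moles of pressurant, fixed thereafter

def propellant_remaining(P, T):
    """Estimate propellant mass from current ullage pressure and temperature."""
    V_ullage = n_gas * R * T / P                  # ideal-gas law: V = nRT/P
    return (V_TANK - V_ullage) * RHO_PROP         # liquid volume -> mass, kg

# Later in the mission the ullage has expanded and pressure has dropped:
print(f"{propellant_remaining(P=1.1e6, T=290.0):.1f} kg remaining")
```

    The abstract's point about error sources follows directly: any uncertainty in the pressure and temperature transducers propagates through V = nRT/P, and the method degrades late in life, when the ullage is large and a small pressure error corresponds to a large propellant error.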

  6. Development of a physiologically-based pharmacokinetic model of 2-phenoxyethanol and its metabolite phenoxyacetic acid in rats and humans to address toxicokinetic uncertainty in risk assessment.

    PubMed

    Troutman, John A; Rick, David L; Stuard, Sharon B; Fisher, Jeffrey; Bartels, Michael J

    2015-11-01

    2-Phenoxyethanol (PhE) has been shown to induce hepatotoxicity, renal toxicity, and hemolysis at dosages ≥ 400 mg/kg/day in subchronic and chronic studies in multiple species. To reduce uncertainty associated with interspecies extrapolations and to evaluate the margin of exposure (MOE) for use of PhE in cosmetics and baby products, a physiologically-based pharmacokinetic (PBPK) model of PhE and its metabolite 2-phenoxyacetic acid (PhAA) was developed. The PBPK model incorporated key kinetic processes describing the absorption, distribution, metabolism and excretion of PhE and PhAA following oral and dermal exposures. Simulations of repeat dose rat studies facilitated the selection of systemic AUC as the appropriate dose metric for evaluating internal exposures to PhE and PhAA in rats and humans. Use of the PBPK model resulted in refinement of the total default UF for extrapolation of the animal data to humans from 100 to 25. Based on very conservative assumptions for product composition and aggregate product use, model-predicted exposures to PhE and PhAA resulting from adult and infant exposures to cosmetic products are significantly below the internal dose of PhE observed at the NOAEL dose in rats. Calculated MOEs for all exposure scenarios were above the PBPK-refined UF of 25. PMID:26188115
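
    The published PBPK model is not reproduced here; the sketch below is a generic one-compartment oral-dose model showing how a systemic AUC dose metric, the kind of internal-exposure measure selected in the abstract, is computed from simulated kinetics. All rate constants and the dose are invented:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal one-compartment oral-dose kinetics sketch (NOT the published PBPK
# model; all parameter values are invented for illustration).
ka, ke, Vd = 1.2, 0.35, 0.8     # absorption /h, elimination /h, distribution volume L/kg
dose = 10.0                     # mg/kg oral dose

def rhs(t, y):
    gut, central = y            # amount in gut (mg/kg), concentration in plasma (mg/L)
    return [-ka * gut, ka * gut / Vd - ke * central]

sol = solve_ivp(rhs, (0, 48), [dose, 0.0], dense_output=True, max_step=0.1)
t = np.linspace(0, 48, 2000)
conc = sol.sol(t)[1]

auc = np.trapz(conc, t)         # systemic AUC: the internal dose metric, mg*h/L
print(f"Cmax = {conc.max():.2f} mg/L, AUC(0-48h) = {auc:.1f} mg*h/L")
```

    Comparing such internal dose metrics between species, rather than applied doses, is what allows the default interspecies uncertainty factor to be refined, as the abstract describes.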

  7. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  8. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, a network planning problem with traffic requirements that vary over time had been studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty, studied actively in the past decade, addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network, under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remain intact up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called the Generalized Survivable Network, which tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework, valid under more general uncertainty conditions, that allows a more systematic way to solve the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  9. [PALEOPATHOLOGY OF HUMAN REMAINS].

    PubMed

    Minozzi, Simona; Fornaciari, Gino

    2015-01-01

    Many diseases induce alterations in the human skeleton, leaving traces of their presence in ancient remains. Paleopathological examination of human remains not only allows the study of the history and evolution of the disease, but also the reconstruction of health conditions in past populations. This paper describes the most interesting diseases observed in skeletal samples from the Roman Imperial Age necropolises found in urban and suburban areas of Rome during archaeological excavations in the last decades. The diseases observed were grouped into the following categories: articular diseases, traumas, infections, metabolic or nutritional diseases, congenital diseases and tumours, and some examples are reported for each group. Although extensive epidemiological investigation in ancient skeletal records is impossible, the palaeopathological study made it possible to highlight the spread of numerous illnesses, many of which can be related to the life and health conditions of the Roman population. PMID:27348992

  10. Masters change, slaves remain.

    PubMed

    Graham, Patricia; Penn, Jill K M; Schedl, Paul

    2003-01-01

    Sex determination offers an opportunity to address many classic questions of developmental biology. In addition, because sex determination evolves rapidly, it offers an opportunity to investigate the evolution of genetic hierarchies. Sex determination in Drosophila melanogaster is controlled by the master regulatory gene, Sex lethal (Sxl). DmSxl controls the alternative splicing of a downstream gene, transformer (tra), which acts with tra2 to control alternative splicing of doublesex (dsx). DmSxl also controls its own splicing, creating an autoregulatory feedback loop that ensures expression of Sxl in females, but not males. A recent paper has shown that in the dipteran Ceratitis capitata later (downstream) steps in the regulatory hierarchy are conserved, while earlier (upstream) steps are not. Cctra is regulated by alternative splicing and apparently controls the alternative splicing of Ccdsx. However, Cctra is not regulated by CcSxl. Instead it appears to autoregulate in a manner similar to the autoregulation seen with DmSxl. PMID:12508274

  11. Chemical Loss of Polar Ozone: Present Understanding and Remaining Uncertainties

    NASA Technical Reports Server (NTRS)

    Salawitch, Ross; Canty, Tim; Cunnold, Derek; Dorf, Marcel; Frieler, Katja; Godin-Beekman, Sophie; Newchurch, Michael; Pfeilsticker, Klaus; Rex, Markus; Stimpfle, Rick; Streibel, Martin; vonderGathen, Peter; Weisenstein, Debra; Yan, Eun-Su

    2005-01-01

    Not long after the discovery of the Antarctic ozone hole, it was established that halogen compounds, supplied to the atmosphere mainly by anthropogenic activities, are the primary driver of polar ozone loss. We will briefly review the chemical mechanisms that cause polar ozone loss and the early evidence showing the key role played by anthropogenic halogens. Recently, stratospheric halogen loading has leveled off, due to adherence to the Montreal Protocol and its amendments that has essentially banned CFCs (chlorofluorocarbons) and other halocarbons. We will describe recent reports of the first stage of recovery of the Antarctic ozone hole (e.g., a statistically significant slowing of the downward trend), associated with the leveling off of stratospheric halogens. Despite this degree of understanding, we will discuss the tendency of photochemical models to underestimate the observed rate of polar ozone loss and a hypothesis that has recently been put forth that might resolve this discrepancy. Finally, we will briefly discuss chemical loss of Arctic ozone, which

  12. Addressing healthcare.

    PubMed

    Daly, Rich

    2013-02-11

    Though President Barack Obama has rarely made healthcare references in his State of the Union addresses, health policy experts are hoping he changes that strategy this year. "The question is: Will he say anything? You would hope that he would, given that that was the major issue he started his presidency with," says Dr. James Weinstein of the Dartmouth-Hitchcock health system. PMID:23487896

  13. Inaugural address

    NASA Astrophysics Data System (ADS)

    Joshi, P. S.

    2014-03-01

    From jets to cosmos to cosmic censorship P S Joshi Tata Institute of Fundamental Research, Homi Bhabha Road, Colaba, Mumbai 400005, India E-mail: psj@tifr.res.in 1. Introduction At the outset, I should like to acknowledge that part of the title above, which tries to capture the main flavour of this meeting, has been borrowed from one of the plenary talks at the conference. When we set out to make the programme for the conference, we thought of beginning with observations on the Universe, but then we certainly wanted to go further and address deeper questions, which were at the very foundations of our inquiry and understanding of the nature and structure of the Universe. I believe we succeeded to a good extent, and it is all here for you in the form of these Conference Proceedings, which have been aptly titled 'Vishwa Mimansa', which could possibly be translated as 'Analysis of the Universe'! It is my great pleasure and privilege to welcome you all to the ICGC-2011 meeting at Goa. The International Conference on Gravitation and Cosmology (ICGC) series of meetings is organized by the Indian Association for General Relativity and Gravitation (IAGRG); the first such meeting was planned and conducted in Goa in 1987, with subsequent meetings taking place at intervals of about four years at various locations in India. So, it was thought appropriate to return to Goa to celebrate 25 years of the ICGC meetings. The recollections from that first meeting have been recorded elsewhere in these Proceedings. Research and teaching on gravitation and cosmology were initiated quite early in India, by V V Narlikar at the Banares Hindu University, and by N R Sen in Kolkata, in the 1930s. In the course of time, this activity grew and gained momentum, and in early 1969, at the felicitation held for the 60 years of V V Narlikar at a conference in Ahmedabad, P C Vaidya proposed the formation of the IAGRG society, with V V Narlikar being the first President. This

  14. Opening addresses.

    PubMed

    Chukudebelu, W O; Lucas, A O; Ransome-kuti, O; Akinla, O; Obayi, G U

    1988-01-01

    The theme of the 3rd International Conference of the Society of Gynecology and Obstetrics of Nigeria (SOGON) held October 26, 1986 in Enugu was maternal morbidity and mortality in Africa. The opening addresses emphasize the high maternal mortality rate in Africa and SOGON's dedication to promoting women's health and welfare. In order to reduce maternal mortality, the scope of this problem must be made evident by gathering accurate mortality rates through maternity care monitoring and auditing. Governments, health professionals, educators, behavioral scientists, and communication specialists have a responsibility to improve maternal health services in this country. By making the population aware of this problem through education, measures can be taken to reduce the presently high maternal mortality rates. Nigerian women are physically unprepared for childbirth; therefore, balanced diets and disease prevention should be promoted. Since about 40% of deliveries are unmanaged, training for traditional birth attendants should be provided. Furthermore, family planning programs should discourage teenage pregnancies, encourage birth spacing and small families, and promote the use of family planning techniques among men. The problem of child bearing and rearing accompanied by hard work should also be investigated. For practices to change so that maternal mortality rates can be reduced, attitudes must be changed such that the current rates are viewed as unacceptable. PMID:12179275

  15. Opening address

    NASA Astrophysics Data System (ADS)

    Castagnoli, C.

    1994-01-01

    Ladies and Gentlemen My cordial thanks to you for participating in our workshop and to all those who have sponsored it. When in 1957 I attended the International Congress on Fundamental Constants held in Turin on the occasion of the first centenary of the death of Amedeo Avogadro, I did not expect that about thirty-five years later a small but representative number of distinguished scientists would meet here again, to discuss how to go beyond the sixth decimal figure of the Avogadro constant. At that time, the uncertainty of the value of this constant was linked to the fourth decimal figure, as reported in the book by DuMond and Cohen. The progress made in the meantime is universally acknowledged to be due to the discovery of x-ray interferometry. We are honoured that one of the two founding fathers, Prof. Ulrich Bonse, is here with us, but we regret that the other, Prof. Michael Hart, is not present. After Bonse and Hart's discovery, the x-ray crystal density method triggered, as in a chain reaction, the investigation of two other quantities related to the Avogadro constant—density and molar mass. Scientists became, so to speak, resonant and since then have directed their efforts, just to mention a few examples, to producing near-perfect silicon spheres and determining their density, to calibrating, with increasing accuracy, mass spectrometers, and to studying the degree of homogeneity of silicon specimens. Obviously, I do not need to explain to you why the Avogadro constant is important. I wish, however, to underline that it is not only because of its position among fundamental constants, as we all know very well its direct links with the fine structure constant, the Boltzmann and Faraday constants, the h/e ratio, but also because when a new value of NA is obtained, the whole structure of the fundamental constants is shaken to a lesser or greater extent. Let me also remind you that the second part of the title of this workshop concerns the silicon

  16. Opening Address

    NASA Astrophysics Data System (ADS)

    Yamada, T.

    2014-12-01

    Ladies and Gentlemen, it is my great honor and pleasure to present the opening address of the 3rd International Workshop on "State of the Art in Nuclear Cluster Physics" (SOTANCP3). On behalf of the organizing committee, I warmly welcome you all to the KGU Kannai Media Center of Kanto Gakuin University and to your stay in Yokohama. In particular, to those who have come from abroad, from more than 17 countries, I appreciate your participation after very long trips from your homelands to Yokohama. The first international workshop on "State of the Art in Nuclear Cluster Physics", called SOTANCP, was held in Strasbourg, France, in 2008, and the second was held in Brussels, Belgium, in 2010. The third workshop is now being held in Yokohama. In this period, we also had the traditional 10th cluster conference in Debrecen, Hungary, in 2012. Thus we have the traditional cluster conference and SOTANCP, one after another, every two years. This clearly shows that our field of nuclear cluster physics is very active and flourishing. It is the first time in about 10 years that an international workshop on nuclear cluster physics has been held in Japan; the last cluster conference held in Japan was in Nara in 2003, about 10 years ago. The president of the Nara conference was Prof. K. Ikeda, and the chairpersons were Prof. H. Horiuchi and Prof. I. Tanihata. I think quite a lot of people in this room participated in the Nara conference. Since then, about ten years have passed, so this workshop has profound significance for our Japanese colleagues. The subjects of this workshop are to discuss the state of the art in nuclear cluster physics and the prospects of the field. In the past couple of years, we have seen significant progress in this field, both in theory and in experiment, which has brought better and new understandings of the clustering aspects of stable and unstable nuclei. I think the concept of clustering has become more important than ever. This is true also in the

  17. Presidential address.

    PubMed

    Vohra, U

    1993-07-01

    The Secretary of India's Ministry of Health and Family Welfare serves as Chair of the Executive Council of the International Institute for Population Sciences in Bombay. She addressed its 35th convocation in 1993. Global population stands at 5.43 billion and increases by about 90 million people each year. 84 million of these new people are born in developing countries. India contributes 17 million new people annually. The annual population growth rate in India is about 2%. Its population size will probably surpass 1 billion by the year 2000. High population growth rates are a leading obstacle to socioeconomic development in developing countries. Governments of many developing countries recognize this problem and have expanded their family planning programs to stabilize population growth. Asian countries that have done so and have completed the fertility transition include China, Japan, Singapore, South Korea, and Thailand. Burma, Malaysia, North Korea, Sri Lanka, and Vietnam have not yet completed the transition. Afghanistan, Bangladesh, Iran, Nepal, and Pakistan are half-way through the transition. High population growth rates put pressure on land by fragmenting finite land resources, increasing the number of landless laborers and unemployment, and by causing considerable rural-urban migration. All these factors bring about social stress and burden civic services. India has reduced its total fertility rate from 5.2 to 3.9 between 1971 and 1991. Some Indian states have already achieved replacement fertility. Considerable disparity in socioeconomic development exists among states and districts. For example, the states of Bihar, Madhya Pradesh, Rajasthan, and Uttar Pradesh have female literacy rates lower than 27%, while that for Kerala is 87%. Overall, infant mortality has fallen from 110 to 80 between 1981 and 1990. In Uttar Pradesh, it has fallen from 150 to 98, while it is at 17 in Kerala. India needs innovative approaches to increase contraceptive prevalence rates

  18. Welcome Address

    NASA Astrophysics Data System (ADS)

    Kiku, H.

    2014-12-01

    Ladies and Gentlemen, it is an honor for me to present my welcome address at the 3rd International Workshop on "State of the Art in Nuclear Cluster Physics" (SOTANCP3), as the president of Kanto Gakuin University. Particularly to those who have come from abroad, from more than 17 countries, I am very grateful for your participation after long trips from your homes to Yokohama. On behalf of Kanto Gakuin University, we warmly welcome your visit to our university and your stay in Yokohama. First I would like to introduce Kanto Gakuin University briefly. Kanto Gakuin University, which is called KGU, traces its roots back to the Yokohama Baptist Seminary founded in 1884 in Yamate, Yokohama. The seminary's founder was Albert Arnold Bennett, an alumnus of Brown University, who came to Japan from the United States to establish a theological seminary for cultivating and training Japanese missionaries. Now KGU is a major member of the Kanto Gakuin School Corporation, which is composed of two kindergartens, two primary schools, two junior high schools, and two senior high schools, as well as KGU. In this university, we have eight faculties with graduate schools, including Humanities, Economics, Law, Sciences and Engineering, Architecture and Environmental Design, Human and Environmental Studies, Nursing, and the Law School. Over eleven thousand students are currently studying at our university. By the way, my own field is geotechnical engineering, and I belong to the Faculty of Sciences and Engineering of my university. Prof. T. Yamada, here, is my colleague in the same faculty. I know that nuclear physics is one of the most active academic fields in the world. In fact, about half of the participants in this conference, namely more than 50 scientists, come from abroad. Moreover, I know that nuclear physics is related not only to other areas of fundamental physics, such as elementary particle physics and astrophysics, but also to chemistry, medical sciences, medical care, and radiation metrology

  19. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…
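
    One concrete instance of linear programming under a discrete, non-probabilistic uncertainty is to hedge against every realization of an uncertain cost vector via the epigraph (min-max) reformulation. The sketch below is generic, not the dissertation's formalism, and all numbers are invented:

```python
import numpy as np
from scipy.optimize import linprog

# Minimize the WORST-CASE cost of meeting a demand of 10 units when the cost
# vector has three discrete realizations (all numbers invented).
realizations = [np.array([3.0, 5.0]),
                np.array([4.0, 4.0]),
                np.array([6.0, 2.0])]

# Variables [x0, x1, t]; epigraph trick: minimize t subject to c_k . x <= t
# for every realization k, and meet demand x0 + x1 >= 10.
obj = [0.0, 0.0, 1.0]
A_ub = [[-1.0, -1.0, 0.0]] + [[c[0], c[1], -1.0] for c in realizations]
b_ub = [-10.0] + [0.0] * len(realizations)
res = linprog(obj, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None), (None, None)])
print("plan:", res.x[:2], "worst-case cost:", res.fun)   # -> [5, 5], 40.0
```

    The balanced plan (5, 5) is optimal precisely because no single realization can make it expensive, which is the sense in which the solution is insensitive to which realization turns out to be true.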

  20. Some Aspects of uncertainty in computational fluid dynamics results

    NASA Technical Reports Server (NTRS)

    Mehta, U. B.

    1991-01-01

    Uncertainties are inherent in computational fluid dynamics (CFD). These uncertainties need to be systematically addressed and managed. Sources of these uncertainties are discussed. Some recommendations are made for the quantification of CFD uncertainties. A practical method of uncertainty analysis is based on sensitivity analysis. When CFD is used to design fluid dynamic systems, sensitivity-uncertainty analysis is essential.

  1. Content and Access Remain Key

    ERIC Educational Resources Information Center

    Johnson, Linda B.

    2007-01-01

    It is impossible to review the year's outstanding government publication landscape without acknowledging that change remains paramount. Just as striking, however, is that these changes go hand in hand with some familiar constants. Within this shifting environment, there are the consistency and dependability of government information itself,…

  2. Decomposition Technique for Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)

    2014-01-01

    The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from these), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of different sources of uncertainty present in the process. Next, the maps are used in an on-line mode where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both current damage state as well as future damage accumulation. Remaining life is computed by subtracting the instance when the extrapolated damage reaches the failure threshold from the instance when the prediction is made.
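
    A toy sketch of the disclosed decomposition: fit the feature-to-damage and conditions-to-damage-rate maps off-line (here with synthetic data and linear fits standing in for the patent's regression algorithms), then use them on-line to estimate the damage state and extrapolate to the failure threshold:

```python
import numpy as np

rng = np.random.default_rng(7)

# --- Off-line mode: learn the two maps from run-to-failure data (synthetic) ---
damage_true = np.linspace(0, 0.8, 200)
feature = 2.0 * damage_true + 0.05 * rng.standard_normal(200)     # sensor feature
feat2dam = np.polyfit(feature, damage_true, 1)                    # feature -> damage map

load = rng.uniform(0.5, 2.0, 200)                                 # operational condition
rate = 0.004 * load + 0.0005 * rng.standard_normal(200)           # damage per cycle
cond2rate = np.polyfit(load, rate, 1)                             # condition -> damage-rate map

# --- On-line mode: estimate current damage, then extrapolate to threshold ---
THRESHOLD = 1.0
feature_now, expected_load = 1.1, 1.4
d_now = np.polyval(feat2dam, feature_now)          # current damage state from the feature
r_hat = np.polyval(cond2rate, expected_load)       # future damage accumulation rate

rul_cycles = (THRESHOLD - d_now) / r_hat           # cycles until damage hits threshold
print(f"damage now = {d_now:.2f}, RUL ~ {rul_cycles:.0f} cycles")
```

    Because the two maps are fit separately, the uncertainty in state estimation (the first map) can be tracked and managed independently of the uncertainty in future operating conditions (the second map), which is the point of the decomposition.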

  3. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  4. Uncertainty Assessment: What Good Does it Do? (Invited)

    NASA Astrophysics Data System (ADS)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  5. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information, and the approach to quantifying uncertainty in metrology, are addressed. This paper then discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement, as well as of characteristic limits according to ISO 11929, are described, and the need for a revision of the latter standard is explained. PMID:26688360
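
    A minimal numeric sketch of the GUM law of propagation of uncertainty for a simple measurement model (here R = V/I, with invented estimates and standard uncertainties), using finite-difference sensitivity coefficients and a k = 2 coverage factor:

```python
import numpy as np

# GUM-style propagation of uncertainty for a measurement model y = f(x),
# here R = V / I (example values invented). Sensitivity coefficients are
# obtained by central finite differences.
def f(x):
    V, I = x
    return V / I

x = np.array([5.02, 0.100])        # best estimates: volts, amperes
u = np.array([0.01, 0.002])        # standard uncertainties (assumed uncorrelated)

h = 1e-6 * np.maximum(np.abs(x), 1.0)
c = np.array([(f(x + dx) - f(x - dx)) / (2 * hi)
              for hi, dx in zip(h, np.diag(h))])   # sensitivity coefficients df/dx_i

u_c = np.sqrt(np.sum((c * u) ** 2))               # combined standard uncertainty
print(f"R = {f(x):.2f} ohm, u_c = {u_c:.2f} ohm, U(k=2) = {2*u_c:.2f} ohm")
```

    Here the current term dominates the budget, which is exactly the kind of diagnosis a combined-uncertainty calculation is meant to expose.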

  6. Silicon photonics: some remaining challenges

    NASA Astrophysics Data System (ADS)

    Reed, G. T.; Topley, R.; Khokhar, A. Z.; Thompson, D. J.; Stanković, S.; Reynolds, S.; Chen, X.; Soper, N.; Mitchell, C. J.; Hu, Y.; Shen, L.; Martinez-Jimenez, G.; Healy, N.; Mailis, S.; Peacock, A. C.; Nedeljkovic, M.; Gardes, F. Y.; Soler Penades, J.; Alonso-Ramos, C.; Ortega-Monux, A.; Wanguemert-Perez, G.; Molina-Fernandez, I.; Cheben, P.; Mashanovich, G. Z.

    2016-03-01

    This paper discusses some of the remaining challenges for silicon photonics, and how we at Southampton University have approached some of them. Despite phenomenal advances in the field of Silicon Photonics, there are a number of areas that still require development. For short to medium reach applications, there is a need to improve the power consumption of photonic circuits such that inter-chip, and perhaps intra-chip applications are viable. This means that yet smaller devices are required as well as thermally stable devices, and multiple wavelength channels. In turn this demands smaller, more efficient modulators, athermal circuits, and improved wavelength division multiplexers. The debate continues as to whether on-chip lasers are necessary for all applications, but an efficient low cost laser would benefit many applications. Multi-layer photonics offers the possibility of increasing the complexity and effectiveness of a given area of chip real estate, but it is a demanding challenge. Low cost packaging (in particular, passive alignment of fibre to waveguide), and effective wafer scale testing strategies, are also essential for mass market applications. Whilst solutions to these challenges would enhance most applications, a derivative technology is emerging, that of Mid Infra-Red (MIR) silicon photonics. This field will build on existing developments, but will require key enhancements to facilitate functionality at longer wavelengths. In common with mainstream silicon photonics, significant developments have been made, but there is still much left to do. Here we summarise some of our recent work towards wafer scale testing, passive alignment, multiplexing, and MIR silicon photonics technology.

  7. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    NASA Astrophysics Data System (ADS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-03-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found.
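
    The paper's measure-independent formalism is not reproduced here, but the flavor of an entropic uncertainty relation can be checked numerically. The sketch below verifies the well-known Maassen-Uffink bound H(X) + H(Z) >= -log2 c for a random qubit state measured in the X and Z bases (for which the overlap constant is c = 1/2, giving a bound of 1 bit):

```python
import numpy as np

# Maassen-Uffink entropic uncertainty relation for a qubit:
# H(X) + H(Z) >= -log2 max_{j,k} |<x_j|z_k>|^2  (= 1 bit for the X and Z bases).
def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(3)
psi = rng.standard_normal(2) + 1j * rng.standard_normal(2)
psi /= np.linalg.norm(psi)                           # random pure qubit state

z_basis = np.eye(2)                                  # Z eigenbasis
x_basis = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # X eigenbasis

pz = np.abs(z_basis @ psi) ** 2                      # measurement outcome distributions
px = np.abs(x_basis @ psi) ** 2

c = np.max(np.abs(x_basis.conj() @ z_basis.T)) ** 2  # overlap constant, 1/2 here
print(f"H(X)+H(Z) = {shannon(px) + shannon(pz):.3f} >= {-np.log2(c):.3f}")
```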

  8. Uncertainties in large space systems

    NASA Technical Reports Server (NTRS)

    Fuh, Jon-Shen

    1988-01-01

    Uncertainties of a large space system (LSS) can be deterministic or stochastic in nature. The former may result in, for example, an energy spillover problem by which the interaction between unmodeled modes and controls may cause system instability. The stochastic uncertainties are responsible for mode localization and estimation errors, etc. We will address the effects of uncertainties on structural model formulation, use of available test data to verify and modify analytical models before orbiting, and how the system model can be further improved in the on-orbit environment.

  9. Generalized uncertainty principle: Approaches and applications

    NASA Astrophysics Data System (ADS)

    Tawfik, A.; Diab, A.

    2014-11-01

    In this paper, we review some highlights from string theory, black hole physics, and doubly special relativity, and some thought experiments which were suggested to probe the shortest distances and/or maximum momentum at the Planck scale. Furthermore, all models developed in order to implement the minimal length scale and/or the maximum momentum in different physical systems are analyzed and compared. They entered the literature as the generalized uncertainty principle (GUP), assuming a modified dispersion relation, and therefore allow for a wide range of applications in estimating, for example, the inflationary parameters, Lorentz invariance violation, black hole thermodynamics, Salecker-Wigner inequalities, the entropic nature of gravitational laws, Friedmann equations, minimal time measurement, and the thermodynamics of high-energy collisions. One of the higher-order GUP approaches gives predictions for the minimal length uncertainty. A second one predicts a maximum momentum and a minimal length uncertainty simultaneously. An extensive comparison between the different GUP approaches is summarized. We also discuss the GUP impacts on the equivalence principles, including the universality of the gravitational redshift, the free fall, the law of reciprocal action, and the kinetic energy of a composite system. The existence of a minimal length and a maximum momentum accuracy is preferred by various physical observations. Concerns about compatibility with the equivalence principles, the universality of gravitational redshift, the free fall, and the law of reciprocal action should be addressed. We conclude that the values of the GUP parameters remain a puzzle to be verified.

  10. Realising the Uncertainty Enabled Model Web

    NASA Astrophysics Data System (ADS)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  11. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
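
    The claim about linear models can be illustrated directly: for Ax = b and a scalar response g = c·x, a single adjoint solve yields the sensitivity of g to every entry of b at once, with no need to re-solve the primal problem per parameter. A numpy sketch with invented data:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 6
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned system matrix
b = rng.standard_normal(n)
c = rng.standard_normal(n)                        # response functional g(x) = c . x

x = np.linalg.solve(A, b)
g = c @ x

# One adjoint solve yields the sensitivity of g to EVERY entry of b:
lam = np.linalg.solve(A.T, c)                     # adjoint equation: A^T lambda = c
# dg/db_i = lambda_i, since g = c . A^{-1} b = lambda . b

# Brute-force check by finite differences on one component:
i, eps = 2, 1e-6
b2 = b.copy(); b2[i] += eps
g2 = c @ np.linalg.solve(A, b2)
print(lam[i], (g2 - g) / eps)                     # the two should agree closely
```

    This one-solve-for-all-sensitivities property is why the adjoint method scales so well for programs with many input parameters, as the abstract notes.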

  12. Important Questions Remain to Be Addressed before Adopting a Dimensional Classification of Mental Disorders

    ERIC Educational Resources Information Center

    Ruscio, Ayelet Meron

    2008-01-01

    Comments on the original article "Plate tectonics in the classification of personality disorder: Shifting to a dimensional model," by T. A. Widiger and T. J. Trull (2007). Widiger and Trull raised important nosological issues that warrant serious consideration not only for the personality disorders but for all mental disorders as the Diagnostic…

  13. Awards and Addresses Summary

    PubMed Central

    2008-01-01

    Each year at the annual ASHG meeting, addresses are given in honor of the society and a number of award winners. A summary of each of these addresses is given below. On the next pages, we have printed the Presidential Address and the addresses for the William Allan Award. The other addresses, accompanied by pictures of the speakers, can be found at www.ashg.org.

  14. The uncertainty of the half-life

    NASA Astrophysics Data System (ADS)

    Pommé, S.

    2015-06-01

    Half-life measurements of radionuclides are undeservedly perceived as ‘easy’ and the experimental uncertainties are commonly underestimated. Data evaluators, scanning the literature, are faced with bad documentation, lack of traceability, incomplete uncertainty budgets and discrepant results. Poor control of uncertainties has its implications for the end-user community, varying from limitations to the accuracy and reliability of nuclear-based analytical techniques to the fundamental question whether half-lives are invariable or not. This paper addresses some issues from the viewpoints of the user community and of the decay data provider. It addresses the propagation of the uncertainty of the half-life in activity measurements and discusses different types of half-life measurements, typical parameters influencing their uncertainty, a tool to propagate the uncertainties and suggestions for a more complete reporting style. Problems and solutions are illustrated with striking examples from literature.
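
    The propagation the paper discusses can be made concrete for decay correction: for A(t) = A0 · 2^(-t/T), first-order propagation gives u(A)/A = ln(2) · (t/T) · u(T)/T, so the relative uncertainty contributed by the half-life grows with elapsed time measured in half-lives. A worked sketch (the activity and the uncertainty value are invented; the half-life is Co-60-like):

```python
import numpy as np

# Propagating half-life uncertainty into a decay-corrected activity:
# A(t) = A0 * 2**(-t / T);  first-order: u(A)/A = ln(2) * (t/T) * u(T)/T.
A0 = 1000.0          # Bq at reference time (value invented)
T, uT = 5.27, 0.01   # half-life and its standard uncertainty, years (Co-60-like)
t = 15.0             # elapsed time, years

A = A0 * 2.0 ** (-t / T)
rel_uA = np.log(2.0) * (t / T) * (uT / T)   # relative uncertainty from u(T) alone
print(f"A = {A:.1f} Bq, u(A) = {A * rel_uA:.2f} Bq ({100 * rel_uA:.2f} %)")
```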

  15. Capturing the uncertainty in adversary attack simulations.

    SciTech Connect

    Darby, John L.; Brooks, Traci N.; Berry, Robert Bruce

    2008-09-01

    This work provides a comprehensive uncertainty technique to evaluate uncertainty, resulting in a more realistic evaluation of PI, thereby requiring fewer resources to address scenarios and allowing resources to be used across more scenarios. For a given set of adversary resources, two types of uncertainty are associated with PI for a scenario: (1) aleatory (random) uncertainty for detection probabilities and time delays and (2) epistemic (state of knowledge) uncertainty for the adversary resources applied during an attack. Adversary resources consist of attributes (such as equipment and training) and knowledge about the security system; to date, most evaluations have assumed an adversary with very high resources, adding to the conservatism in the evaluation of PI. The aleatory uncertainty in PI is addressed by assigning probability distributions to detection probabilities and time delays. A numerical sampling technique is used to evaluate PI, addressing the repeated variable dependence in the equation for PI.
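
    A toy numerical sampling sketch in the spirit described above: detection probabilities and task delays for three protection layers are sampled from distributions (the aleatory part), and PI is estimated as the fraction of trials in which some layer detects the adversary while enough task time remains for the response force. The layer structure and all distributions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000
RESPONSE_TIME = 300.0        # seconds for the response force to arrive (invented)

# Three protection layers. Aleatory uncertainty: detection probability at each
# layer (beta) and the task delay after each layer (lognormal); values invented.
p_det = rng.beta(a=[8, 5, 3], b=[2, 5, 7], size=(n, 3))
delays = rng.lognormal(mean=np.log([120, 90, 60]), sigma=0.3, size=(n, 3))

detected = rng.random((n, 3)) < p_det                     # detection events per layer
time_left = np.cumsum(delays[:, ::-1], axis=1)[:, ::-1]   # task time remaining at each layer

# Interruption: some layer both detects the adversary AND leaves them more
# task time than the response force needs to arrive.
interrupted = np.any(detected & (time_left > RESPONSE_TIME), axis=1)
print(f"PI ~ {interrupted.mean():.3f}")
```

    The epistemic part described in the abstract would sit in an outer loop over assumed adversary resource levels, with the sampling above repeated for each.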

  16. Uncertainty and Equipoise: At Interplay Between Epistemology, Decision-Making and Ethics

    PubMed Central

    Djulbegovic, Benjamin

    2011-01-01

    In recent years, various authors have proposed that the concept of equipoise be abandoned since it conflates the practice of clinical care with clinical research. At the same time, the equipoise opponents acknowledge the necessity of clinical research if there are unresolved uncertainties about the effects of proposed healthcare interventions. Since equipoise represents just one measure of uncertainty, proposals to abandon equipoise while maintaining a requirement for addressing uncertainties are contradictory and ultimately not valid. As acknowledgment and articulation of uncertainties represent key scientific and moral requirements for human experimentation, the concept of equipoise remains the most useful framework to link the theory of human experimentation with the theory of rational choice. In this paper, I show how uncertainty (equipoise) is at the intersection between epistemology, decision-making and ethics of clinical research. In particular, I show how our formulation of responses to uncertainties of hoped-for benefits and unknown harms of testing is a function of the way humans cognitively process information. This approach is based on the view that considerations of ethics and rationality cannot be separated. I analyze the response to uncertainties as it relates to the dual-processing theory, which postulates that rational approach to (clinical research) decision-making depends both on analytical, deliberative processes embodied in scientific method (system II) and “good” human intuition (system I). Ultimately, our choices can only become wiser if we understand a close and intertwined relationship between irreducible uncertainty, inevitable errors, and unavoidable injustice. PMID:21817885

  17. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  18. Visualizing uncertainty about the future.

    PubMed

    Spiegelhalter, David; Pearson, Mike; Short, Ian

    2011-09-01

    We are all faced with uncertainty about the future, but we can get the measure of some uncertainties in terms of probabilities. Probabilities are notoriously difficult to communicate effectively to lay audiences, and in this review we examine current practice for communicating uncertainties visually, using examples drawn from sport, weather, climate, health, economics, and politics. Despite the burgeoning interest in infographics, there is limited experimental evidence on how different types of visualizations are processed and understood, although the effectiveness of some graphics clearly depends on the relative numeracy of an audience. Fortunately, it is increasingly easy to present data in the form of interactive visualizations and in multiple types of representation that can be adjusted to user needs and capabilities. Nonetheless, communicating deeper uncertainties resulting from incomplete or disputed knowledge--or from essential indeterminacy about the future--remains a challenge. PMID:21903802

  19. The Role of Uncertainty in Climate Science

    NASA Astrophysics Data System (ADS)

    Oreskes, N.

    2012-12-01

    Scientific discussions of climate change place considerable weight on uncertainty. The research frontier, by definition, rests at the interface between the known and the unknown and our scientific investigations necessarily track this interface. Yet, other areas of active scientific research are not necessarily characterized by a similar focus on uncertainty; previous assessments of science for policy, for example, do not reveal such extensive efforts at uncertainty quantification. Why has uncertainty loomed so large in climate science? This paper argues that the extensive discussions of uncertainty surrounding climate change are at least in part a response to the social and political context of climate change. Skeptics and contrarians focus on uncertainty as a political strategy, emphasizing or exaggerating uncertainties as a means to undermine public concern about climate change and delay policy action. The strategy works in part because it appeals to a certain logic: if our knowledge is uncertain, then it makes sense to do more research. Change, as the tobacco industry famously realized, requires justification; doubt favors the status quo. However, the strategy also works by pulling scientists into an "uncertainty framework," inspiring them to respond to the challenge by addressing and quantifying the uncertainties. The problem is that all science is uncertain—nothing in science is ever proven absolutely, positively—so as soon as one uncertainty is addressed, another can be raised, which is precisely what contrarians have done over the past twenty years.

  20. Characterizing Uncertainty for Regional Climate Change Mitigation and Adaptation Decisions

    SciTech Connect

    Unwin, Stephen D.; Moss, Richard H.; Rice, Jennie S.; Scott, Michael J.

    2011-09-30

    This white paper describes the results of new research to develop an uncertainty characterization process to help address the challenges of regional climate change mitigation and adaptation decisions.

  1. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  2. Messaging climate change uncertainty

    NASA Astrophysics Data System (ADS)

    Cooke, Roger M.

    2015-01-01

    Climate change is full of uncertainty and the messengers of climate science are not getting the uncertainty narrative right. To communicate uncertainty one must first understand it, and then avoid repeating the mistakes of the past.

  3. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the references, are meant to supplement the presentation given at this conference.

  4. Estimating uncertainties in complex joint inverse problems

    NASA Astrophysics Data System (ADS)

    Afonso, Juan Carlos

    2016-04-01

    Sources of uncertainty affecting geophysical inversions can be classified either as reflective (i.e. the practitioner is aware of her/his ignorance) or non-reflective (i.e. the practitioner does not know that she/he does not know!). Although we should always be conscious of the latter, the former are the ones that, in principle, can be estimated either empirically (by making measurements or collecting data) or subjectively (based on the experience of the researchers). For complex parameter estimation problems in geophysics, subjective estimation of uncertainty is the most common type. In this context, probabilistic (aka Bayesian) methods are commonly claimed to offer a natural and realistic platform from which to estimate model uncertainties. This is because in the Bayesian approach, errors (whatever their nature) can be naturally included as part of the global statistical model, the solution of which represents the actual solution to the inverse problem. However, although we agree that probabilistic inversion methods are the most powerful tool for uncertainty estimation, the common claim that they produce "realistic" or "representative" uncertainties is not always justified. Typically, ALL UNCERTAINTY ESTIMATES ARE MODEL DEPENDENT, and therefore, besides a thorough characterization of experimental uncertainties, particular care must be taken with the uncertainty arising from model errors and input uncertainties. We recall here two quotes by G. Box and M. Gunzburger, respectively, of special significance for inversion practitioners and for this session: "…all models are wrong, but some are useful" and "computational results are believed by no one, except the person who wrote the code". In this presentation I will discuss and present examples of some problems associated with the estimation and quantification of uncertainties in complex multi-observable probabilistic inversions, and how to address them. Although the emphasis will be on sources of uncertainty related

  5. Integrated uncertainty assessment of discharge predictions with a statistical error model

    NASA Astrophysics Data System (ADS)

    Honti, M.; Stamm, C.; Reichert, P.

    2013-08-01

    A proper uncertainty assessment of rainfall-runoff predictions has always been an important objective for modelers. Several sources of uncertainty have been identified, but their representation was limited to complicated mechanistic error propagation frameworks only. The typical statistical error models used in the modeling practice still build on outdated and invalidated assumptions like the independence and homoscedasticity of model residuals and thus result in wrong uncertainty estimates. The primary reason for the popularity of the traditional faulty methods is the enormous computational requirement of full Bayesian error propagation frameworks. We introduce a statistical error model that can account for the effect of various uncertainty sources present in conceptual rainfall-runoff modeling studies and at the same time has limited computational demand. We split the model residuals into three different components: a random noise term and two bias processes with different response characteristics. The effects of the input uncertainty are simulated with a stochastic linearized rainfall-runoff model. While the description of model bias with Bayesian statistics cannot directly help to improve on the model's deficiencies, it is still beneficial to get realistic estimates on the overall predictive uncertainty and to rank the importance of different uncertainty sources. This feature is particularly important if the error sources cannot be addressed individually, but it is also relevant for the description of remaining bias when input and structural errors are considered explicitly.
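
    A minimal sketch of such a three-component residual model is given below, assuming AR(1) forms and illustrative parameters for the two bias processes; the paper's actual bias formulation and its stochastic linearized rainfall-runoff model are not reproduced.

```python
import numpy as np

def simulate_residuals(n=1000, seed=0):
    """Generate synthetic model residuals as in a three-component error model:
    white observation noise plus two autocorrelated bias processes with
    different response (correlation) time scales. Parameters are illustrative.
    """
    rng = np.random.default_rng(seed)

    def ar1(phi, sigma):
        b = np.zeros(n)
        for t in range(1, n):
            b[t] = phi * b[t - 1] + rng.normal(0.0, sigma)
        return b

    noise = rng.normal(0.0, 0.05, n)        # random measurement noise
    bias_fast = ar1(phi=0.80, sigma=0.03)   # quickly decaying bias (e.g. input errors)
    bias_slow = ar1(phi=0.995, sigma=0.01)  # slowly varying bias (e.g. structural errors)
    return noise + bias_fast + bias_slow

print(simulate_residuals().std())
```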

  6. Davis-Besse uncertainty study

    SciTech Connect

    Davis, C B

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.

  7. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
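
    As a toy illustration of the epistemic/aleatory split, the sketch below builds a probability box for a payoff whose aleatory spread is Gaussian but whose mean is only known to lie in an epistemic interval; all numbers are placeholders, and scipy is assumed available.

```python
import numpy as np
from scipy.stats import norm

# Probability box for an attacker payoff: the aleatory spread is Gaussian with
# known sigma, but the mean is only known to lie in an epistemic interval.
# The p-box is the envelope of all CDFs consistent with that interval.
mu_lo, mu_hi, sigma = 2.0, 5.0, 1.0              # illustrative values
x = np.linspace(-2.0, 10.0, 200)
cdf_upper = norm.cdf(x, loc=mu_lo, scale=sigma)  # upper bounding CDF
cdf_lower = norm.cdf(x, loc=mu_hi, scale=sigma)  # lower bounding CDF

# For any threshold, P(payoff <= x) lies between the two envelope curves.
i = np.searchsorted(x, 4.0)
print(f"P(payoff <= 4) in [{cdf_lower[i]:.2f}, {cdf_upper[i]:.2f}]")
```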

  8. Dissociating Uncertainty Responses and Reinforcement Signals in the Comparative Study of Uncertainty Monitoring

    ERIC Educational Resources Information Center

    Smith, J. David; Redford, Joshua S.; Beran, Michael J.; Washburn, David A.

    2006-01-01

    Although researchers are exploring animals' capacity for monitoring their states of uncertainty, the use of some paradigms allows the criticism that animals map avoidance responses to error-causing stimuli not because of uncertainty monitored but because of feedback signals and stimulus aversion. The authors addressed this criticism with an…

  9. Where do those remains come from?

    PubMed

    Nociarová, Dominika; Adserias, M Jose; Malgosa, Assumpció; Galtés, Ignasi

    2014-12-01

    Part of the study of skeletal remains or corpses in advanced decay located in the field involves determining their origin. They may be the result of criminal activity or accident, unearthed because of erosion, or they may have originated from a cemetery. The discovery site, condition of the remains, and the associated artifacts are factors that can help the forensic anthropologist identify the origin of the remains. In order to contribute to this recognition, an analysis was made of the exhumations of 168 unclaimed human remains from the cemetery of Terrassa (Catalonia, Spain). This investigation presents a description of artifacts and conditions of remains that could indicate that the human remains may have originated from a cemetery. PMID:25459276

  10. Addressivity in cogenerative dialogues

    NASA Astrophysics Data System (ADS)

    Hsu, Pei-Ling

    2014-03-01

    Ashraf Shady's paper provides a first-hand reflection on how a foreign teacher used cogens as culturally adaptive pedagogy to address cultural misalignments with students. In this paper, Shady drew on several cogen sessions to showcase his journey of using different forms of cogens with his students. To improve the quality of cogens, one strategy he used was to adjust the number of participants in cogens. As a result, some cogens worked and others did not. During the course of reading his paper, I was impressed by his creative and flexible use of cogens and at the same time was intrigued by the question of why some cogens work and not others. In searching for an answer, I found that Mikhail Bakhtin's dialogism, especially the concept of addressivity, provides a comprehensive framework to address this question. In this commentary, I reanalyze the cogen episodes described in Shady's paper in the light of dialogism. My analysis suggests that addressivity plays an important role in mediating the success of cogens. Cogens with high addressivity function as internally persuasive discourse that allows diverse consciousnesses to coexist and so likely affords productive dialogues. The implications of addressivity in teaching and learning are further discussed.

  11. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
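
    The convergence-testing idea can be sketched as below, with a made-up analytic stand-in for the dose-vs-depth curve in place of a transport-code output: the thickness grid is refined until the interpolation error is acceptably small.

```python
import numpy as np

def dose_curve(depth):
    """Stand-in dose-vs-depth response (the real one comes from a transport code)."""
    return np.exp(-depth / 20.0) + 0.05 * np.exp(-depth / 80.0)

def interpolated_dose(n_thicknesses, query):
    """Tabulate the curve at n thicknesses and linearly interpolate,
    as a radiation tool would between precomputed transport solutions."""
    grid = np.linspace(0.0, 100.0, n_thicknesses)
    return np.interp(query, grid, dose_curve(grid))

query = np.linspace(0.0, 100.0, 501)
exact = dose_curve(query)
for n in (5, 10, 20, 40, 80):
    err = np.max(np.abs(interpolated_dose(n, query) - exact))
    print(f"{n:3d} thicknesses -> max interpolation error {err:.2e}")
```

    Halving the grid spacing and watching the error shrink gives a defensible estimate of the interpolation contribution to the total uncertainty.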

  12. Addressing fear, fighting complacency.

    PubMed

    Pergam, S A

    2015-10-01

    Nature has us wired instinctively to be cautious of things that are unknown or unfamiliar. Children may fear the dark, but even as adults we remain afraid of things that go bump in the night. PMID:26095023

  13. Application of uncertainty analysis to cooling tower thermal performance tests

    SciTech Connect

    Yost, J.G.; Wheeler, D.E.

    1986-01-01

    The purpose of this paper is to provide an overview of uncertainty analyses. The following topics are addressed: 1. A review and summary of the basic constituents of an uncertainty analysis with definitions and discussion of basic terms; 2. A discussion of the benefits and uses of uncertainty analysis; and 3. Example uncertainty analyses with emphasis on the problems, limitations, and site-specific complications.
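
    As a pointer to what the "basic constituents" look like in practice, here is a minimal sketch using one common convention for combining elemental bias (systematic) and precision (random) components by root-sum-square; the coverage factor and the inputs are illustrative, not taken from the paper.

```python
import math

def combined_uncertainty(bias_limits, precision_indices, t=2.0):
    """Combine elemental systematic (bias) and random (precision) components
    by root-sum-square, one common convention in test uncertainty analysis
    (coverage factor t ~ 2 for ~95 % confidence). Inputs are illustrative."""
    b = math.sqrt(sum(x * x for x in bias_limits))
    s = math.sqrt(sum(x * x for x in precision_indices))
    return math.sqrt(b * b + (t * s) ** 2)

# e.g. a cooling tower test with two elemental bias and precision terms:
print(combined_uncertainty(bias_limits=[0.5, 0.3], precision_indices=[0.2, 0.1]))
```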

  14. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses reliability issues in loss assessment following strong earthquakes, with worldwide systems applied in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case the information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offering humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the disaster scope. Uncertainties on the parameters used in the estimation process are numerous and large: knowledge about physical phenomena and uncertainties on the parameters used to describe them; global adequacy of modeling techniques to the actual physical phenomena; actual distribution of population at risk at the very time of the shaking (with respect to immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); while a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence of uncertainties in strong event parameter determination by Alert Seismological Surveys, and of simulation models used at all stages from estimating shaking intensity

  15. Toys Remain Viral Playground for 24 Hours

    MedlinePlus

    ... a toy's surface at typical indoor temperatures and humidity levels. Specifically, they tested the ability of so- ... East Respiratory Syndrome (MERS). At 60 percent relative humidity, 1 percent of the virus remained infectious on ...

  16. Addressing Social Issues.

    ERIC Educational Resources Information Center

    Schoebel, Susan

    1991-01-01

    Maintains that advertising can help people become more aware of social responsibilities. Describes a successful nationwide newspaper advertising competition for college students in which ads address social issues such as literacy, drugs, teen suicide, and teen pregnancy. Notes how the ads have helped grassroots programs throughout the United…

  17. States Address Achievement Gaps.

    ERIC Educational Resources Information Center

    Christie, Kathy

    2002-01-01

    Summarizes 2 state initiatives to address the achievement gap: North Carolina's report by the Advisory Commission on Raising Achievement and Closing Gaps, containing an 11-point strategy, and Kentucky's legislation putting in place 10 specific processes. The North Carolina report is available at www.dpi.state.nc.us.closingthegap; Kentucky's…

  18. Address of the President

    ERIC Educational Resources Information Center

    Ness, Frederic W.

    1976-01-01

    The president of the Association of American Colleges addresses at the 62nd annual meeting the theme of the conference: "Looking to the Future--Liberal Education in a Radically Changing Society." Contributions to be made by AAC are examined. (LBH)

  19. Addressing Sexual Harassment

    ERIC Educational Resources Information Center

    Young, Ellie L.; Ashbaker, Betty Y.

    2008-01-01

    This article discusses ways on how to address the problem of sexual harassment in schools. Sexual harassment--simply defined as any unwanted and unwelcome sexual behavior--is a sensitive topic. Merely providing students, parents, and staff members with information about the school's sexual harassment policy is insufficient; schools must take…

  20. Space sciences - Keynote address

    NASA Technical Reports Server (NTRS)

    Alexander, Joseph K.

    1990-01-01

    The present status and projected future developments of the NASA Space Science and Applications Program are addressed. Emphasis is given to biochemistry experiments that are planned for the Space Station. Projects for the late 1990s which will study the sun, the earth's magnetosphere, and the geosphere are briefly discussed.

  1. The Peter Shaw Award Acceptance Address: An Immigrant Sociologist

    ERIC Educational Resources Information Center

    Hollander, Paul

    2003-01-01

    This article presents the author's acceptance address for receiving the Peter Shaw award. In this address, the author, an immigrant sociologist, tells how this award helps to resolve questions and uncertainties he has as to the degree to which he can or should consider himself an American--about the extent to which he has become a part, a member…

  2. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
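
    A minimal sketch of the flavor of such interval statistics: bounds on the sample mean and median follow directly from the interval endpoints, whereas exact bounds on quantities like the variance are harder and in general computationally expensive. The data below are made up.

```python
import statistics

def interval_mean(intervals):
    """Bounds on the sample mean when each datum is an interval [lo, hi]."""
    n = len(intervals)
    return (sum(lo for lo, hi in intervals) / n,
            sum(hi for lo, hi in intervals) / n)

def interval_median(intervals):
    """Bounds on the sample median: medians of the endpoint samples."""
    return (statistics.median(lo for lo, hi in intervals),
            statistics.median(hi for lo, hi in intervals))

data = [(1.0, 1.4), (2.1, 2.2), (0.8, 1.9), (1.5, 1.6)]  # illustrative intervals
print(interval_mean(data), interval_median(data))
```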

  3. Assessing uncertainty in stormwater quality modelling.

    PubMed

    Wijesiri, Buddhi; Egodawatta, Prasanna; McGree, James; Goonetilleke, Ashantha

    2016-10-15

    Designing effective stormwater pollution mitigation strategies is a challenge in urban stormwater management. This is primarily due to the limited reliability of catchment scale stormwater quality modelling tools. As such, assessing the uncertainty associated with the information generated by stormwater quality models is important for informed decision making. Quantitative assessment of build-up and wash-off process uncertainty, which arises from the variability associated with these processes, is a major concern as typical uncertainty assessment approaches do not adequately account for process uncertainty. The research study undertaken found that the variability of build-up and wash-off processes for different particle size ranges leads to process uncertainty. After variability and resulting process uncertainties are accurately characterised, they can be incorporated into catchment stormwater quality predictions. Accounting for process uncertainty influences the uncertainty limits associated with predicted stormwater quality. The impact of build-up process uncertainty on stormwater quality predictions is greater than that of wash-off process uncertainty. Accordingly, decision making should facilitate the design of mitigation strategies which specifically address variations in load and composition of pollutants accumulated during dry weather periods. Moreover, the study found that the influence of process uncertainty differs among stormwater quality predictions corresponding to storm events with different intensity, duration and runoff volume generated. These storm events were also found to be significantly different in terms of the Runoff-Catchment Area ratio. As such, the selection of storm events in the context of designing stormwater pollution mitigation strategies needs to take into consideration not only the storm event characteristics, but also the influence of process uncertainty on stormwater quality predictions. PMID:27423532

  4. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
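
    The sensitivity-times-uncertainty bookkeeping described above can be sketched as follows; the sensitivities and measurement uncertainties are placeholders, not the values from the study.

```python
import math

# Uncertainty in aerosol direct radiative forcing (DRF): each contribution is
# (sensitivity dDRF/dx) * (measurement uncertainty u_x), combined in
# quadrature assuming independent errors. All numbers are illustrative.
contributions = {
    "aerosol_optical_depth": (25.0, 0.01),  # W m-2 per unit AOD, u(AOD)
    "single_scatter_albedo": (8.0, 0.03),   # W m-2 per unit SSA, u(SSA)
    "asymmetry_parameter":   (5.0, 0.02),
    "surface_albedo":        (10.0, 0.01),
}
terms = {name: s * u for name, (s, u) in contributions.items()}
total = math.sqrt(sum(t * t for t in terms.values()))

print(terms)                                # per-property contributions
print(f"total DRF uncertainty ~ {total:.2f} W m-2")
```

    Ranking the individual terms immediately shows which property measurement most limits the accuracy of the computed forcing.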

  5. Universal Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Gour, Gilad

    2014-03-01

    Uncertainty relations are a distinctive characteristic of quantum theory that imposes intrinsic limitations on the precision with which physical properties can be simultaneously determined. The modern work on uncertainty relations employs entropic measures to quantify the lack of knowledge associated with measuring non-commuting observables. However, I will show here that there is no fundamental reason for using entropies as quantifiers; in fact, any functional relation that characterizes the uncertainty of the measurement outcomes can be used to define an uncertainty relation. Starting from a simple assumption that any measure of uncertainty is non-decreasing under mere relabeling of the measurement outcomes, I will show that Schur-concave functions are the most general uncertainty quantifiers. I will then introduce a novel fine-grained uncertainty relation written in terms of a majorization relation, which generates an infinite family of distinct scalar uncertainty relations via the application of arbitrary measures of uncertainty. This infinite family of uncertainty relations includes all the known entropic uncertainty relations, but is not limited to them. In this sense, the relation is universally valid and captures the essence of the uncertainty principle in quantum theory. This talk is based on a joint work with Shmuel Friedland and Vlad Gheorghiu. This research is supported by the Natural Sciences and Engineering Research Council (NSERC) of Canada and by the Pacific Institute for Mathematical Sciences (PIMS).
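
    The majorization machinery can be illustrated with a short sketch: a helper that tests x ≺ y, plus the fact that any Schur-concave function (Shannon entropy being one choice among infinitely many) then converts the majorization relation into a scalar uncertainty relation. The vectors below are arbitrary examples, not outcomes of a specific quantum measurement.

```python
import numpy as np

def majorizes(y, x):
    """Return True if probability vector y majorizes x (x < y in majorization
    order): every partial sum of decreasingly sorted y dominates that of x."""
    xs = np.sort(x)[::-1].cumsum()
    ys = np.sort(y)[::-1].cumsum()
    return bool(np.all(ys >= xs - 1e-12))

def shannon_entropy(p):
    """Shannon entropy in bits, a Schur-concave uncertainty quantifier."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.9, 0.05, 0.05])
# x < y implies f(x) >= f(y) for any Schur-concave f, so both print True:
print(majorizes(q, p), shannon_entropy(p) >= shannon_entropy(q))
```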

  6. Fission Spectrum Related Uncertainties

    SciTech Connect

    G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores

    2007-10-01

    The paper presents a preliminary uncertainty analysis related to potential uncertainties on the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.

  7. Excerpts from keynote address

    SciTech Connect

    Creel, G.C.

    1995-06-01

    Excerpts from the keynote principally address emissions issues in the fossil power industry as related to heat rate improvements. Stack emissions of both sulfur and nitrogen oxides are discussed, and a number of examples are given: (1) PEPCO's Potomac River Station, and (2) Morgantown station's NOX reduction efforts. Circulating water emissions are also briefly discussed, as are O & M costs of emission controls.

  8. Holographic content addressable storage

    NASA Astrophysics Data System (ADS)

    Chao, Tien-Hsin; Lu, Thomas; Reyes, George

    2015-03-01

    We have developed a Holographic Content Addressable Storage (HCAS) architecture. The HCAS system consists of a DMD (Digital Micromirror Array) as the input Spatial Light Modulator (SLM), a CMOS (Complementary Metal-oxide Semiconductor) sensor as the output photodetector, and a photorefractive crystal as the recording medium. The HCAS system is capable of performing optical correlation of an input image/feature against a massive reference data set stored in the holographic memory. Detailed system analysis will be reported in this paper.

  9. Pore Velocity Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Devary, J. L.; Doctor, P. G.

    1982-08-01

    Geostatistical data analysis techniques were used to stochastically model the spatial variability of groundwater pore velocity in a potential waste repository site. Kriging algorithms were applied to Hanford Reservation data to estimate hydraulic conductivities, hydraulic head gradients, and pore velocities. A first-order Taylor series expansion for pore velocity was used to statistically combine hydraulic conductivity, hydraulic head gradient, and effective porosity surfaces and uncertainties to characterize the pore velocity uncertainty. Use of these techniques permits the estimation of pore velocity uncertainties when pore velocity measurements do not exist. Large pore velocity estimation uncertainties were found to be located in the region where the hydraulic head gradient relative uncertainty was maximal.
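
    A minimal sketch of the first-order Taylor propagation for pore velocity v = K·i/n (hydraulic conductivity times head gradient over effective porosity) is given below, assuming independent inputs; in the kriging setting described above, spatially correlated estimation errors would add covariance terms.

```python
import math

def pore_velocity_uncertainty(K, uK, i, ui, n, un):
    """First-order Taylor propagation for pore velocity v = K * i / n.

    Assumes independent inputs, so relative variances simply add; kriging
    cross-covariances between the surfaces would contribute extra terms."""
    v = K * i / n
    rel = math.sqrt((uK / K) ** 2 + (ui / i) ** 2 + (un / n) ** 2)
    return v, v * rel

# Illustrative values: K = 1e-5 m/s (+-30 %), gradient 0.002 (+-20 %),
# effective porosity 0.2 (+-10 %).
print(pore_velocity_uncertainty(1e-5, 3e-6, 2e-3, 4e-4, 0.2, 0.02))
```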

  10. Catholic Identity Remains a Public Relations Asset

    ERIC Educational Resources Information Center

    Wirth, Eileen

    2004-01-01

    The massive sex scandal that rocked the Roman Catholic Church raises a question as to whether Catholic identity remains an asset that the nation's 8,000 Catholic schools should continue to promote. This case study found that continuing to promote Catholic identity has had no adverse effect on recruitment and enrollment at four Omaha, Nebraska,…

  11. Essential Qualities of Math Teaching Remain Unknown

    ERIC Educational Resources Information Center

    Cavanagh, Sean

    2008-01-01

    According to a new federal report, the qualities of an effective mathematics teacher remain frustratingly elusive. The report of the National Mathematics Advisory Panel does not show what college math content and coursework are most essential for teachers. While the report offered numerous conclusions about math curriculum, cognition, and…

  12. Juveniles' Motivations for Remaining in Prostitution

    ERIC Educational Resources Information Center

    Hwang, Shu-Ling; Bedford, Olwen

    2004-01-01

    Qualitative data from in-depth interviews were collected in 1990-1991, 1992, and 2000 with 49 prostituted juveniles remanded to two rehabilitation centers in Taiwan. These data are analyzed to explore Taiwanese prostituted juveniles' feelings about themselves and their work, their motivations for remaining in prostitution, and their difficulties…

  13. Predicting the remaining service life of concrete

    SciTech Connect

    Clifton, J.F.

    1991-11-01

    Nuclear power plants currently provide about 17 percent of the U.S. electricity, and many of these plants are approaching their licensed life of 40 years. The U.S. Nuclear Regulatory Commission and the Department of Energy's Oak Ridge National Laboratory are carrying out a program to develop a methodology for assessing the remaining safe life of the concrete components and structures in nuclear power plants. This program has the overall objective of identifying potential structural safety issues, as well as acceptance criteria, for use in evaluations of nuclear power plants for continued service. The National Institute of Standards and Technology (NIST) is contributing to this program by identifying and analyzing methods for predicting the remaining life of in-service concrete materials. This report examines the basis for predicting the remaining service lives of concrete materials of nuclear power facilities. Methods for predicting the service life of new and in-service concrete materials are analyzed. These methods include (1) estimates based on experience, (2) comparison of performance, (3) accelerated testing, (4) stochastic methods, and (5) mathematical modeling. New approaches for predicting the remaining service lives of concrete materials are proposed and recommendations for their further development given. Degradation processes are discussed based on considerations of their mechanisms, likelihood of occurrence, manifestations, and detection. They include corrosion, sulfate attack, alkali-aggregate reactions, frost attack, leaching, radiation, salt crystallization, and microbiological attack.
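
    As a toy example of the "mathematical modeling" category, the sketch below uses the common square-root-of-time carbonation model x(t) = k·√t to extrapolate a remaining life; it is purely illustrative and not the methodology proposed in the report.

```python
import math

def remaining_life_carbonation(cover_mm, age_yr, depth_now_mm):
    """Toy degradation model: carbonation depth is often taken to grow as
    x(t) = k * sqrt(t). Fit k from the current depth and age, then solve for
    when the front reaches the rebar cover. Purely illustrative."""
    k = depth_now_mm / math.sqrt(age_yr)   # mm per sqrt(year)
    t_end = (cover_mm / k) ** 2            # total life until front hits rebar
    return t_end - age_yr

# 40 mm cover, 30-year-old structure, 12 mm carbonation measured today:
print(f"remaining life ~ {remaining_life_carbonation(40.0, 30.0, 12.0):.0f} years")
```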

  14. Odor analysis of decomposing buried human remains

    SciTech Connect

    Vass, Arpad Alexander; Smith, Rob R; Thompson, Cyril V; Burnett, Michael N; Dulgerian, Nishan; Eckenrode, Brian A

    2008-01-01

    This study, conducted at the University of Tennessee's Anthropological Research Facility (ARF), lists and ranks the primary chemical constituents which define the odor of decomposition of human remains as detected at the soil surface of shallow burial sites. Triple sorbent traps were used to collect air samples in the field and revealed eight major classes of chemicals which now contain 478 specific volatile compounds associated with burial decomposition. Samples were analyzed using gas chromatography-mass spectrometry (GC-MS) and were collected below and above the body, and at the soil surface of 1.5-3.5 ft. (0.46-1.07 m) deep burial sites of four individuals over a 4-year time span. New data were incorporated into the previously established Decompositional Odor Analysis (DOA) Database providing identification, chemical trends, and semi-quantitation of chemicals for evaluation. This research identifies the 'odor signatures' unique to the decomposition of buried human remains with projected ramifications on human remains detection canine training procedures and in the development of field portable analytical instruments which can be used to locate human remains in shallow burial sites.

  15. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty are still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  16. Uncertainties in radiation flow experiments

    NASA Astrophysics Data System (ADS)

    Fryer, C. L.; Dodd, E.; Even, W.; Fontes, C. J.; Greeff, C.; Hungerford, A.; Kline, J.; Mussack, K.; Tregillis, I.; Workman, J. B.; Benstead, J.; Guymer, T. M.; Moore, A. S.; Morton, J.

    2016-03-01

    Although the fundamental physics behind radiation and matter flow is understood, many uncertainties remain in the exact behavior of macroscopic fluids in systems ranging from pure turbulence to coupled radiation hydrodynamics. Laboratory experiments play an important role in studying this physics to allow scientists to test their macroscopic models of these phenomena. However, because the fundamental physics is well understood, precision experiments are required to validate existing codes already tested by a suite of analytic, manufactured and convergence solutions. To conduct such high-precision experiments requires a detailed understanding of the experimental errors and the nature of their uncertainties on the observed diagnostics. In this paper, we study the uncertainties plaguing many radiation-flow experiments, focusing on those using a hohlraum (dynamic or laser-driven) source and a foam-density target. This study focuses on the effect these uncertainties have on the breakout time of the radiation front. We find that, even if the errors in the initial conditions and numerical methods are Gaussian, the errors in the breakout time are asymmetric, leading to a systematic bias in the observed data. We must understand these systematics to produce the high-precision experimental results needed to study this physics.
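
    The asymmetry mechanism is easy to reproduce in miniature: push a Gaussian input error through a nonlinear breakout relation and the output error becomes skewed and biased. The power-law form and the numbers below are invented for illustration, not taken from the experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy radiation-front model: breakout time scales as a strong inverse power
# of the drive temperature (exponent and spread are illustrative).
T = rng.normal(1.0, 0.03, 200_000)   # Gaussian error on normalized drive temp
t_breakout = T ** -4.0               # nonlinear map -> asymmetric output errors

lo, med, hi = np.percentile(t_breakout, [16, 50, 84])
print(f"median {med:.3f}, -{med - lo:.3f}/+{hi - med:.3f}  (asymmetric)")
print(f"mean bias vs nominal: {t_breakout.mean() - 1.0:+.4f}")
```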

  17. Content addressable memory project

    NASA Technical Reports Server (NTRS)

    Hall, J. Storrs; Levy, Saul; Smith, Donald E.; Miyake, Keith M.

    1992-01-01

    A parameterized version of the tree processor was designed and tested (by simulation). The leaf processor design is 90 percent complete. We expect to complete and test a combination of tree and leaf cell designs in the next period. Work is proceeding on algorithms for the content addressable memory (CAM), and once the design is complete we will begin simulating algorithms for large problems. The following topics are covered: (1) the practical implementation of content addressable memory; (2) design of a LEAF cell for the Rutgers CAM architecture; (3) a circuit design tool user's manual; and (4) design and analysis of efficient hierarchical interconnection networks.

  18. Bioreactors Addressing Diabetes Mellitus

    PubMed Central

    Minteer, Danielle M.; Gerlach, Jorg C.

    2014-01-01

    The concept of bioreactors in biochemical engineering is a well-established process; however, the idea of applying bioreactor technology to biomedical and tissue engineering issues is relatively novel and has been rapidly accepted as a culture model. Tissue engineers have developed and adapted various types of bioreactors in which to culture many different cell types and therapies addressing several diseases, including diabetes mellitus types 1 and 2. With a rising world of bioreactor development and an ever increasing diagnosis rate of diabetes, this review aims to highlight bioreactor history and emerging bioreactor technologies used for diabetes-related cell culture and therapies. PMID:25160666

  19. Addressing Environmental Health Inequalities.

    PubMed

    Gouveia, Nelson

    2016-01-01

    Environmental health inequalities refer to health hazards disproportionately or unfairly distributed among the most vulnerable social groups, which are generally the most discriminated, poor populations and minorities affected by environmental risks. Although it has been known for a long time that health and disease are socially determined, only recently has this idea been incorporated into the conceptual and practical framework for the formulation of policies and strategies regarding health. In this Special Issue of the International Journal of Environmental Research and Public Health (IJERPH), "Addressing Environmental Health Inequalities-Proceedings from the ISEE Conference 2015", we incorporate nine papers that were presented at the 27th Conference of the International Society for Environmental Epidemiology (ISEE), held in Sao Paulo, Brazil, in 2015. This small collection of articles provides a brief overview of the different aspects of this topic. Addressing environmental health inequalities is important for the transformation of our reality and for changing the actual development model towards more just, democratic, and sustainable societies driven by another form of relationship between nature, economy, science, and politics. PMID:27618906

  20. Mill and the right to remain uninformed.

    PubMed

    Strasser, M

    1986-08-01

    In a recent article in the Journal of Medicine and Philosophy, David Ost (1984) claims that patients do not have a right to waive their right to information. He argues that patients cannot make informed rational decisions without full information and thus, a right to waive information would involve a right to avoid one's responsibility to act as an autonomous moral agent. In support of his position, Ost cites a passage from Mill. Yet, a correct interpretation of the passage in question would support one's right to remain uninformed in certain situations. If the information would hurt one's chances for survival or hurt one's ability to make calm, rational decisions, then one not only does not have a duty to find out the information, but one's exercising one's right to remain uninformed may be the only rational course of action to take. PMID:3540171

  1. Explosives remain preferred methods for platform abandonment

    SciTech Connect

    Pulsipher, A.; Daniel, W. IV; Kiesler, J.E.; Mackey, V. III

    1996-05-06

    Economics and safety concerns indicate that methods involving explosives remain the most practical and cost-effective means for abandoning oil and gas structures in the Gulf of Mexico. A decade has passed since 51 dead sea turtles, many endangered Kemp's Ridleys, washed ashore on the Texas coast shortly after explosives helped remove several offshore platforms. Although no relationship between the explosions and the dead turtles was ever established, in response to widespread public concern, the US Minerals Management Service (MMS) and National Marine Fisheries Service (NMFS) implemented regulations limiting the size and timing of explosive charges. Also, more importantly, they required that operators pay for observers to survey waters surrounding platforms scheduled for removal for 48 hr before any detonations. If observers spot sea turtles or marine mammals within the danger zone, the platform abandonment is delayed until the turtles leave or are removed. However, concern about the effects of explosives on marine life remains.

  2. Remains of Comet-Shoemaker/Levy

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This illustration of the Comet-Shoemaker/Levy collision shows the first piece of the remains of the comet crashing into Jupiter. This event occurred in 1994 after tidal forces from Jupiter caused the comet to break up into 21 separate pieces. Although on a very different scale, the physical mechanism for the breakup of Shoemaker/Levy also caused the tidal disruption of the star in RX J1242-11. (Illustration: SEDS/D. Seal, edited by CXC/M. Weiss)

  3. Direct Dating of Hominids Remains In Eurasia

    NASA Astrophysics Data System (ADS)

    Yokoyama, Y.; Falguères, C.

    When archaeological sites are associated with human remains, it is relevant to be able to date those valuable remains for different reasons. The main one is that it avoids the stratigraphical problems which can be due to intrusive burials in the sequence. The other reason is that human bones may be encountered out of an established stratigraphical context. On the other hand, the majority of dating methods currently used are destructive and cannot be applied to these precious samples, particularly when they are older than 40,000 years and cannot be dated by radiocarbon. For several years, we have been developing a completely non-destructive method which consists in measuring human remains using gamma-ray spectrometry. This technique has been used recently by other laboratories. We present here two important cases for the knowledge of human evolution in Eurasia. The first example is the Qafzeh site in Israel where many human skeletons have been unearthed from burials associated with fauna and lithic artefacts. This site has been dated by several independent radiometric methods. So, it was possible to compare our gamma results with the other results yielded by the different methods. The second case concerns the most evolved Homo erectus found in Java, Indonesia, at the Ngandong site, close to the Solo river. A recent debate has been focused on the age of these fossils, and their direct dating is of utmost importance for the knowledge of the settlement of Modern Humans in South-East Asia.

  4. Chemical Principles Revisited: Perspectives on the Uncertainty Principle and Quantum Reality.

    ERIC Educational Resources Information Center

    Bartell, Lawrence S.

    1985-01-01

    Explicates an approach that not only makes the uncertainty principle seem more useful to introductory students but also helps convey the real meaning of the term "uncertainty." General topic areas addressed include probability amplitudes, rationale behind the uncertainty principle, applications of uncertainty relations, and quantum processes. (JN)

  5. Human errors and measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Kuselman, Ilya; Pennecchi, Francesca

    2015-04-01

    Evaluating the residual risk of human errors in a measurement and testing laboratory, remaining after the error reduction by the laboratory quality system, and quantifying the consequences of this risk for the quality of the measurement/test results are discussed based on expert judgments and Monte Carlo simulations. A procedure for evaluation of the contribution of the residual risk to the measurement uncertainty budget is proposed. Examples are provided using earlier published sets of expert judgments on human errors in pH measurement of groundwater, elemental analysis of geological samples by inductively coupled plasma mass spectrometry, and multi-residue analysis of pesticides in fruits and vegetables. The human error contribution to the measurement uncertainty budget in the examples was not negligible, yet also not dominant. This was assessed as a good risk management result.
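
    A minimal sketch of such a Monte Carlo evaluation is shown below: a human error slips past the quality system with a small residual probability, and the standard deviation of its simulated effect is entered into the uncertainty budget. The probability and magnitude are placeholders rather than elicited expert judgments.

```python
import numpy as np

def human_error_component(n=100_000, p_residual=0.02, err_sd=0.10, seed=3):
    """Monte Carlo sketch of a human-error term in an uncertainty budget.

    With small residual probability an error of random magnitude escapes the
    laboratory quality system; the standard deviation of its effect over many
    simulated measurements is the contribution to the combined uncertainty."""
    rng = np.random.default_rng(seed)
    occurs = rng.random(n) < p_residual
    effect = np.where(occurs, rng.normal(0.0, err_sd, n), 0.0)
    return effect.std()

u_human = human_error_component()
u_analytical = 0.05   # ordinary analytical standard uncertainty (placeholder)
print(f"combined: {np.hypot(u_analytical, u_human):.4f} "
      f"(human-error part {u_human:.4f})")
```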

  6. Uncertainty Analysis of Model Coupling

    NASA Astrophysics Data System (ADS)

    Held, H.; Knopf, B.; Schneider von Deimling, T.; Schellnhuber, H.-J.

    The Earth System is a highly complex system that is often modelled by coupling several nonlinear submodules. For predicting the climate with these models, the following uncertainties play an essential role: parameter uncertainty, uncertainty in initial conditions or model uncertainty. Here we will address uncertainty in initial conditions as well as model uncertainty. As the process of coupling is an important part of modeling, the main aspect of this work is the investigation of uncertainties that are due to the coupling process. For this study we use conceptual models that, compared to GCMs, have the advantage that the model itself as well as the output can be treated in a mathematically elaborated way. As the time for running the model is much shorter, the investigation is also possible for a longer period, e.g. for paleo runs. In consideration of these facts it is feasible to analyse the whole phase space of the model. The process of coupling is investigated by using different methods of examining low order coupled atmosphere-ocean systems. In the dynamical approach a fully coupled system of the two submodules can be compared to a system where one submodule forces the other. For a particular atmosphere-ocean system, based on the Lorenz model for the atmosphere, significant differences in the predictability of a forced system can be shown depending on whether the subsystems are coupled in a linear or a non-linear way. In [1] it is shown that in the linear case the forcing cannot represent the coupling, but in the nonlinear case, that we investigated in our study, the variability and the statistics of the coupled system can be reproduced by the forcing. Another approach to analyse the coupling is to carry out a bifurcation analysis. Here the bifurcation diagram of a single atmosphere system is compared to that of a coupled atmosphere-ocean system. Again it can be seen from the different behaviour of the coupled and the uncoupled system, that the
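
    In the spirit of the dynamical approach described above, here is a compact caricature: a Lorenz-63 "atmosphere" either fully coupled to a slow scalar "ocean" mode or driven with that mode frozen, so the statistics of the two runs can be compared. This is an illustrative toy, not the study's actual coupled model.

```python
import numpy as np

def run(coupled, n=100_000, dt=0.005, c=2.0, eps=0.01):
    """Euler integration of a Lorenz-63 'atmosphere' (x, y, z) plus a slow
    scalar 'ocean' mode w. If coupled is False, w stays frozen at its initial
    value, i.e. the ocean merely forces the atmosphere. A caricature for
    comparing coupled vs. forced statistics."""
    s, r, b = 10.0, 28.0, 8.0 / 3.0
    x, y, z, w = 1.0, 1.0, 25.0, 0.0
    xs = np.empty(n)
    for i in range(n):
        dx = s * (y - x)
        dy = x * (r - z) - y + c * w
        dz = x * y - b * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        if coupled:
            w += dt * eps * (x - w)   # slow ocean relaxes toward the atmosphere
        xs[i] = x
    return xs.mean(), xs.std()

print("coupled:", run(True))
print("forced :", run(False))
```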

  7. Content addressable memory project

    NASA Technical Reports Server (NTRS)

    Hall, Josh; Levy, Saul; Smith, D.; Wei, S.; Miyake, K.; Murdocca, M.

    1991-01-01

    The progress on the Rutgers CAM (Content Addressable Memory) Project is described. The overall design of the system is completed at the architectural level and described. The machine is composed of two kinds of cells: (1) the CAM cells, which include both memory and processor and support local processing within each cell; and (2) the tree cells, which have a smaller instruction set and provide global processing over the CAM cells. A parameterized design of the basic CAM cell is completed. Progress was made on the final specification of the CPS. The machine architecture was driven by the design of algorithms whose requirements are reflected in the resulting instruction set(s). A few of these algorithms are described.

  8. Bax: Addressed to kill.

    PubMed

    Renault, Thibaud T; Manon, Stéphen

    2011-09-01

    The pro-apoptotic protein Bax (Bcl-2 Associated protein X) plays a central role in the mitochondria-dependent apoptotic pathway. In healthy mammalian cells, Bax is essentially cytosolic and inactive. Following a death signal, the protein is translocated to the outer mitochondrial membrane, where it promotes a permeabilization that favors the release of different apoptogenic factors, such as cytochrome c. The regulation of Bax translocation is associated with conformational changes that are under the control of different factors. The evidence showing the involvement of different Bax domains in its mitochondrial localization is presented. The interactions between Bax and its different partners are described in relation to their ability to promote (or prevent) Bax conformational changes leading to mitochondrial addressing and to the acquisition of the capacity to permeabilize the outer mitochondrial membrane. PMID:21641962

  9. Micro-Pulse Lidar Signals: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; Campbell, James R.; Starr, David OC. (Technical Monitor)

    2002-01-01

    Micro-pulse lidar (MPL) systems are small, autonomous, eye-safe lidars used for continuous observations of the vertical distribution of cloud and aerosol layers. Since the construction of the first MPL in 1993, procedures have been developed to correct for various instrument effects present in MPL signals. The primary instrument effects include afterpulse, laser-detector cross-talk, and overlap, i.e. poor near-range (less than 6 km) focusing. The accurate correction of both afterpulse and overlap effects is required to study both clouds and aerosols. Furthermore, the outgoing energy of the laser pulses and the statistical uncertainty of the MPL detector must also be correctly determined in order to assess the accuracy of MPL observations. The uncertainties associated with the afterpulse, overlap, pulse energy, detector noise, and all remaining quantities affecting measured MPL signals are determined in this study. The uncertainties are propagated through the entire MPL correction process to give a net uncertainty on the final corrected MPL signal. The results show that in the near range, the overlap uncertainty dominates. At altitudes above the overlap region, the dominant source of uncertainty is caused by uncertainty in the pulse energy. However, if the laser energy is low, then during mid-day, high solar background levels can significantly reduce the signal-to-noise of the detector. In such a case, the statistical uncertainty of the detector count rate becomes dominant at altitudes above the overlap region.
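
    A minimal sketch of the final propagation step, assuming a simplified normalized-relative-backscatter form NRB = (P − A)·r²/(O·E) with background subtraction omitted for brevity; the signal values and uncertainties below are invented for illustration.

```python
import numpy as np

def nrb_and_uncertainty(P, uP, A, uA, O, uO, E, uE, r):
    """Normalized relative backscatter for an MPL-style signal,
    NRB = (P - A) * r^2 / (O * E), with first-order propagation of the
    count-rate (P), afterpulse (A), overlap (O) and pulse-energy (E)
    uncertainties assuming independent errors."""
    S = P - A
    nrb = S * r**2 / (O * E)
    rel = np.sqrt((uP**2 + uA**2) / S**2 + (uO / O) ** 2 + (uE / E) ** 2)
    return nrb, nrb * rel

r = np.array([0.5, 2.0, 8.0])                              # range, km
P = np.array([5.0, 1.2, 0.08]); uP = 0.01 * np.sqrt(P)     # detector statistics
A = np.array([0.05, 0.02, 0.01]); uA = 0.2 * A             # afterpulse
O = np.array([0.2, 0.9, 1.0]);   uO = np.array([0.05, 0.03, 0.0])  # overlap

nrb, unrb = nrb_and_uncertainty(P, uP, A, uA, O, uO, E=5.0, uE=0.25, r=r)
print(np.round(nrb, 4), np.round(unrb, 4))
```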

  10. Accounting for uncertainty in marine reserve design.

    PubMed

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals. PMID:16958861

  11. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
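
    A minimal sketch of the Monte Carlo approach for one simple signature, the runoff ratio, assuming multiplicative data errors; the ±10% rainfall and ±15% flow standard errors below are illustrative choices, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic daily rainfall and flow series (mm); real records would be used here.
    days = 365
    rain = rng.gamma(shape=0.4, scale=8.0, size=days)
    flow = 0.45 * rain + rng.normal(0.0, 0.2, size=days).clip(min=0.0)

    def runoff_ratio(p, q):
        return q.sum() / p.sum()

    # Monte Carlo realizations of data uncertainty (hypothetical multiplicative
    # errors: 10% on rainfall, 15% on flow, at one standard deviation).
    n = 5000
    ratios = np.empty(n)
    for i in range(n):
        p_err = rain * rng.normal(1.0, 0.10)
        q_err = flow * rng.normal(1.0, 0.15)
        ratios[i] = runoff_ratio(p_err, q_err)

    lo, mid, hi = np.percentile(ratios, [2.5, 50, 97.5])
    print(f"runoff ratio = {mid:.3f} (95% interval {lo:.3f}-{hi:.3f})")
    ```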

  12. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-04-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, including for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.

  13. Why Do Some Cores Remain Starless?

    NASA Astrophysics Data System (ADS)

    Anathpindika, S.

    2016-08-01

    Prestellar cores, by definition, are gravitationally bound but starless pockets of dense gas. The physical conditions that could render a core starless (in the local Universe) are the subject of investigation in this work. To this end, we studied the evolution of four starless cores (B68, L694-2, L1517B, and L1689) and of L1521F, a VeLLO. We demonstrate: (i) cores contracted in a quasistatic manner over a timescale on the order of ~10^5 yr. Those that remained starless briefly acquired a centrally concentrated density configuration that mimicked the profile of an unstable Bonnor-Ebert sphere before rebounding; (ii) three cores, viz. L694-2, L1689-SMM16, and L1521F, remained starless despite becoming thermally super-critical. By contrast, B68 and L1517B remained sub-critical; L1521F collapsed to become a VeLLO only when gas cooling was enhanced by increasing the size of dust grains. This result is robust, for the other starless cores, viz. B68, L694-2, L1517B, and L1689, could also be similarly induced to collapse. The temperature profiles of starless cores and of those that collapsed were found to be radically different. While in the former type only the region very close to the centre of a core showed any evidence of a decline in gas temperature, a core of the latter type developed a more uniformly cold interior. Our principal conclusions are: (a) thermal super-criticality of a core is insufficient to ensure it will become protostellar; (b) potential star-forming cores (the VeLLO L1521F here) could be experiencing dust coagulation, which must enhance gas-dust coupling and in turn lower the gas temperature, thereby assisting collapse. This also suggests that mere gravitational/virial boundedness of a core is insufficient to ensure it will form stars.

  14. USING CONDITION MONITORING TO PREDICT REMAINING LIFE OF ELECTRIC CABLES.

    SciTech Connect

    LOFARO,R.; SOO,P.; VILLARAN,M.; GROVE,E.

    2001-03-29

    Electric cables are passive components used extensively throughout nuclear power stations to perform numerous safety and non-safety functions. It is known that the polymers commonly used to insulate the conductors on these cables can degrade with time; the rate of degradation being dependent on the severity of the conditions in which the cables operate. Cables do not receive routine maintenance and, since it can be very costly, they are not replaced on a regular basis. Therefore, to ensure their continued functional performance, it would be beneficial if condition monitoring techniques could be used to estimate the remaining useful life of these components. A great deal of research has been performed on various condition monitoring techniques for use on electric cables. In a research program sponsored by the U.S. Nuclear Regulatory Commission, several promising techniques were evaluated and found to provide trendable information on the condition of low-voltage electric cables. These techniques may be useful for predicting remaining life if well defined limiting values for the aging properties being measured can be determined. However, each technique has advantages and limitations that must be addressed in order to use it effectively, and the necessary limiting values are not always easy to obtain. This paper discusses how condition monitoring measurements can be used to predict the remaining useful life of electric cables. The attributes of an appropriate condition monitoring technique are presented, and the process to be used in estimating the remaining useful life of a cable is discussed along with the difficulties that must be addressed.
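
    The prediction step can be illustrated with a short sketch: fit a trend to a condition-monitoring indicator and extrapolate it to a limiting value. The elongation-at-break history, the linear degradation model, and the 50% end-of-life criterion below are all illustrative assumptions, not data from the program described.

    ```python
    import numpy as np

    # Hypothetical condition-monitoring history: elongation-at-break (%) vs service years.
    years = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
    elongation = np.array([400.0, 340.0, 290.0, 230.0, 180.0])

    limit = 50.0  # assumed limiting value for the aging property (not from the paper)

    # Fit a linear degradation trend and extrapolate to the limiting value.
    slope, intercept = np.polyfit(years, elongation, 1)
    end_of_life = (limit - intercept) / slope
    remaining = end_of_life - years[-1]
    print(f"degradation rate: {slope:.1f} %/yr")
    print(f"estimated remaining useful life: {remaining:.1f} years")
    ```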

  15. Measurement Uncertainty and Probability

    NASA Astrophysics Data System (ADS)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  16. Uncertainty of decibel levels.

    PubMed

    Taraldsen, Gunnar; Berge, Truls; Haukland, Frode; Lindqvist, Bo Henry; Jonasson, Hans

    2015-09-01

    The mean sound exposure level from a source is routinely estimated by the mean of the observed sound exposures from repeated measurements. A formula for the standard uncertainty based on the Guide to the expression of Uncertainty in Measurement (GUM) is derived. An alternative formula is derived for the case where the GUM method fails. The formulas are applied to several examples and compared with a Monte Carlo calculation of the standard uncertainty. The recommended formula can be seen simply as a convenient translation of the uncertainty on an energy scale into the decibel level scale, but with a theoretical foundation. PMID:26428824
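
    A sketch of the translation described above, assuming it reduces to first-order (GUM-style) propagation of the energy-scale uncertainty of the mean onto the decibel scale; the measurement values are hypothetical, and a simple bootstrap stands in for the paper's Monte Carlo calculation.

    ```python
    import numpy as np

    # Repeated sound exposure level measurements in dB (hypothetical values).
    levels_db = np.array([78.2, 79.1, 77.6, 78.9, 78.4, 79.5, 77.9, 78.7])

    # Work on the energy scale, then average.
    energy = 10.0 ** (levels_db / 10.0)
    e_mean = energy.mean()
    u_e_mean = energy.std(ddof=1) / np.sqrt(len(energy))  # standard uncertainty of the mean

    L_mean = 10.0 * np.log10(e_mean)
    # First-order translation of the energy-scale uncertainty to the decibel scale:
    u_L = (10.0 / np.log(10.0)) * u_e_mean / e_mean
    print(f"mean level = {L_mean:.2f} dB, standard uncertainty = {u_L:.2f} dB")

    # Bootstrap cross-check by resampling the energies with replacement.
    rng = np.random.default_rng(0)
    boot = [10.0 * np.log10(rng.choice(energy, size=len(energy)).mean())
            for _ in range(10_000)]
    print(f"bootstrap standard uncertainty = {np.std(boot, ddof=1):.2f} dB")
    ```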

  17. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was carried out for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty

  18. The identification of submerged skeletonized remains.

    PubMed

    Byard, Roger W; Both, Katrin; Simpson, Ellie

    2008-03-01

    Examination was undertaken of skeletonized remains contained within 2 rubber boots dredged by a fishing boat from a depth of 145 m, approximately 185 km off the southern Australian coast in the Great Australian Bight. The boots had been manufactured in Australia in July 1993 and were of a type commonly used by local fishermen. Examination of the lower legs and feet revealed well-preserved bones with arthritic changes in keeping with an older male. DNA analyses using reference samples taken from relatives of fishermen who had disappeared in the area resulted in the identification of the victim as a 52-year-old prawn fisherman who had been swept off a boat over a decade earlier. DNA stability had been maintained by the low light, cold temperatures, and alkaline pH of the ocean floor. Integration of pathologic, anthropologic, and biologic analyses with police investigations enabled a positive identification to be made despite the unusual nature of the location of the remains and the time lapse since the disappearance of the victim. PMID:19749621

  19. Shotgun microbial profiling of fossil remains.

    PubMed

    Der Sarkissian, C; Ermini, L; Jónsson, H; Alekseev, A N; Crubezy, E; Shapiro, B; Orlando, L

    2014-04-01

    Millions to billions of DNA sequences can now be generated from ancient skeletal remains thanks to the massive throughput of next-generation sequencing platforms. Except in cases of exceptional endogenous DNA preservation, most of the sequences isolated from fossil material do not originate from the specimen of interest, but instead reflect environmental organisms that colonized the specimen after death. Here, we characterize the microbial diversity recovered from seven c. 200- to 13 000-year-old horse bones collected from northern Siberia. We use a robust, taxonomy-based assignment approach to identify the microorganisms present in ancient DNA extracts and quantify their relative abundance. Our results suggest that molecular preservation niches exist within ancient samples that can potentially be used to characterize the environments from which the remains are recovered. In addition, microbial community profiling of the seven specimens revealed site-specific environmental signatures. These microbial communities appear to comprise mainly organisms that colonized the fossils recently. Our approach significantly extends the amount of useful data that can be recovered from ancient specimens using a shotgun sequencing approach. In future, it may be possible to correlate, for example, the accumulation of postmortem DNA damage with the presence and/or abundance of particular microbes. PMID:24612293

  20. So close: remaining challenges to eradicating polio.

    PubMed

    Toole, Michael J

    2016-01-01

    The Global Polio Eradication Initiative, launched in 1988, is close to achieving its goal. In 2015, reported cases of wild poliovirus were limited to just two countries - Afghanistan and Pakistan. Africa has been polio-free for more than 18 months. Remaining barriers to global eradication include insecurity in areas such as Northwest Pakistan and Eastern and Southern Afghanistan, where polio cases continue to be reported. Hostility to vaccination stems variously from extreme ideologies (as in Pakistan), vaccination fatigue among parents whose children have received more than 15 doses, and misunderstandings about the vaccine's safety and effectiveness (as in Ukraine). A further challenge is continued circulation of vaccine-derived poliovirus in populations with low immunity, with 28 cases reported in 2015 in countries as diverse as Madagascar, Ukraine, Laos, and Myanmar. This paper summarizes the current epidemiology of wild and vaccine-derived poliovirus, and describes the remaining challenges to eradication and innovative approaches being taken to overcome them. PMID:26971523

  1. Magnetic content addressable memories

    NASA Astrophysics Data System (ADS)

    Jiang, Zhenye

    Content Addressable Memories (CAMs) are designed with comparison circuits built into every bit cell. This parallel structure increases the speed of searching from O(n) (as with Random Access Memories) to O(1), where n is the number of entries being searched. The high hardware cost limits the application of CAM to situations where high search speed is essential. Spintronics technology can build non-volatile Magnetic RAM with only one device per bit cell. Various technologies are involved, such as Magnetic Tunnel Junctions (MTJs), the off-easy-axis programming method, Synthetic Anti-Ferromagnetic (SAF) tri-layers, Domain Wall (DW) displacement, and Spin Transfer Torque (STT) tri-layers. With these, particularly the Tunnel Magneto-Resistance (TMR) variation in an MTJ due to the difference in magnetization polarity of its two magnetic layers, Magnetic CAM (MCAM) can be developed with reduced hardware cost, as the discussion in this dissertation demonstrates. Six MCAM designs are discussed. In the first design, the comparand (C), local information (S), and their complements are stored in 4 MTJs connected in an XOR-gate pattern. The other five designs use one or two stacks for both information storage and comparison, so the full TMR ratio can be exploited. Two challenges for these five designs are programming C specifically without changing S, and selectively programming one cell out of an array. The solutions to specific programming are: confining the programming field for C in a ring-structure design; using field programming and spin-polarized current programming respectively for C and S in the SAF+DW and SAF+STT tri-layer designs; and making use of the difference in thresholds between direct-mode and toggle-mode switching in the SAF+SAF design. The problem of selective programming is addressed by the off-easy-axis method and by including SAF tri-layers. A cell with STT tri-layers for both C and S can completely avoid the problems of specific and selective programming, but is subject to the limit of
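
    The search semantics that motivate CAM can be sketched behaviourally in Python; this models only the O(1) content-in, address-out operation (vectorized here, parallel match lines in real hardware), not any of the dissertation's circuit designs.

    ```python
    import numpy as np

    # Behavioural model of a content addressable memory: every stored word is
    # compared against the search key at once.
    stored = np.array([0b10110010, 0b01100111, 0b11110000, 0b01100111], dtype=np.uint8)

    def cam_search(key: int) -> np.ndarray:
        """Return the addresses whose content matches the key."""
        match_lines = stored == np.uint8(key)  # one comparison per cell, conceptually parallel
        return np.flatnonzero(match_lines)

    print(cam_search(0b01100111))  # -> [1 3]: content in, addresses out
    print(cam_search(0b00000001))  # -> []: no match
    ```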

  2. A review of uncertainty research in impact assessment

    SciTech Connect

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  3. Calibration and uncertainty issues of a hydrological model (SWAT) applied to West Africa

    NASA Astrophysics Data System (ADS)

    Schuol, J.; Abbaspour, K. C.

    2006-09-01

    Distributed hydrological models like SWAT (Soil and Water Assessment Tool) are often highly over-parameterized, making parameter specification and parameter estimation inevitable steps in model calibration. Manual calibration is almost infeasible due to the complexity of large-scale models with many objectives. Therefore we used a multi-site semi-automated inverse modelling routine (SUFI-2) for calibration and uncertainty analysis. Nevertheless, the question of when a model is sufficiently calibrated remains open, and requires a project-dependent definition. Due to the non-uniqueness of effective parameter sets, parameter calibration and prediction uncertainty of a model are intimately related. We address some calibration and uncertainty issues using SWAT to model a four million km2 area in West Africa, including mainly the basins of the rivers Niger, Volta and Senegal. This model is a case study in a larger project with the goal of quantifying the amount of global country-based available freshwater. Annual and monthly simulations with the "calibrated" model for West Africa show promising results with respect to freshwater quantification, but also point out the importance of evaluating the conceptual model uncertainty as well as the parameter uncertainty.

  4. Balancing Certainty and Uncertainty in Clinical Practice

    ERIC Educational Resources Information Center

    Kamhi, Alan G.

    2011-01-01

    Purpose: In this epilogue, I respond to each of the five commentaries, discussing in some depth a central issue raised in each commentary. In the final section, I discuss how my thinking about certainty and uncertainty in clinical practice has evolved since I wrote the initial article. Method: Topics addressed include the similarities/differences…

  5. Solving navigational uncertainty using grid cells on robots.

    PubMed

    Milford, Michael J; Wiles, Janet; Wyeth, Gordon F

    2010-01-01

    To successfully navigate their habitats, many mammals use a combination of two mechanisms, path integration and calibration using landmarks, which together enable them to estimate their location and orientation, or pose. In large natural environments, both these mechanisms are characterized by uncertainty: the path integration process is subject to the accumulation of error, while landmark calibration is limited by perceptual ambiguity. It remains unclear how animals form coherent spatial representations in the presence of such uncertainty. Navigation research using robots has determined that uncertainty can be effectively addressed by maintaining multiple probabilistic estimates of a robot's pose. Here we show how conjunctive grid cells in dorsocaudal medial entorhinal cortex (dMEC) may maintain multiple estimates of pose using a brain-based robot navigation system known as RatSLAM. Based both on rodent spatially-responsive cells and functional engineering principles, the cells at the core of the RatSLAM computational model have similar characteristics to rodent grid cells, which we demonstrate by replicating the seminal Moser experiments. We apply the RatSLAM model to a new experimental paradigm designed to examine the responses of a robot or animal in the presence of perceptual ambiguity. Our computational approach enables us to observe short-term population coding of multiple location hypotheses, a phenomenon which would not be easily observable in rodent recordings. We present behavioral and neural evidence demonstrating that the conjunctive grid cells maintain and propagate multiple estimates of pose, enabling the correct pose estimate to be resolved over time even without uniquely identifying cues. While recent research has focused on the grid-like firing characteristics, accuracy and representational capacity of grid cells, our results identify a possible critical and unique role for conjunctive grid cells in filtering sensory uncertainty. We anticipate our

  6. Tularemia vaccines: recent developments and remaining hurdles.

    PubMed

    Conlan, J Wayne

    2011-04-01

    Francisella tularensis subsp. tularensis is a facultative intracellular bacterial pathogen of humans and other mammals. Its inhaled infectious dose is very low and can result in very high mortality. Historically, subsp. tularensis was developed as a biological weapon and there are now concerns about its abuse as such by terrorists. A live attenuated vaccine developed pragmatically more than half a century ago from the less virulent holarctica subsp. is the sole prophylactic available, but it remains unlicensed. In recent years several other potential live, killed and subunit vaccine candidates have been developed and tested in mice for their efficacy against respiratory challenge with subsp. tularensis. This article will review these vaccine candidates and the development hurdles they face. PMID:21526941

  7. Some remaining problems in HCDA analysis. [LMFBR

    SciTech Connect

    Chang, Y.W.

    1981-01-01

    The safety assessment and licensing of liquid-metal fast breeder reactors (LMFBRs) requires an analysis of the capability of the reactor primary system to sustain the consequences of a hypothetical core-disruptive accident (HCDA). Although computational methods and computer programs developed for HCDA analyses can predict reasonably well the response of the primary containment system, and follow the phenomena of an HCDA from the start of the excursion to the time of dynamic equilibrium in the system, there remain areas in HCDA analysis that merit further analytical and experimental study. These are the analysis of fluid impact on the reactor cover, three-dimensional analysis, the treatment of perforated plates, material properties under high strain rates and high temperatures, the treatment of multifield flows, and the treatment of prestressed concrete reactor vessels. The purpose of this paper is to discuss the structural mechanics of HCDA analysis in these areas where improvements are needed.

  8. Climate change, uncertainty, and natural resource management

    USGS Publications Warehouse

    Nichols, J.D.; Koneff, M.D.; Heglund, P.J.; Knutson, M.G.; Seamans, M.E.; Lyons, J.E.; Morton, J.M.; Jones, M.T.; Boomer, G.S.; Williams, B.K.

    2011-01-01

    Climate change and its associated uncertainties are of concern to natural resource managers. Although aspects of climate change may be novel (e.g., system change and nonstationarity), natural resource managers have long dealt with uncertainties and have developed corresponding approaches to decision-making. Adaptive resource management is an application of structured decision-making for recurrent decision problems with uncertainty, focusing on management objectives, and the reduction of uncertainty over time. We identified 4 types of uncertainty that characterize problems in natural resource management. We examined ways in which climate change is expected to exacerbate these uncertainties, as well as potential approaches to dealing with them. As a case study, we examined North American waterfowl harvest management and considered problems anticipated to result from climate change and potential solutions. Despite challenges expected to accompany the use of adaptive resource management to address problems associated with climate change, we conclude that adaptive resource management approaches will be the methods of choice for managers trying to deal with the uncertainties of climate change. ?? 2010 The Wildlife Society.

  9. Uncertainty quantification approaches for advanced reactor analyses.

    SciTech Connect

    Briggs, L. L.; Nuclear Engineering Division

    2009-03-24

    The original approach to nuclear reactor design or safety analyses was to make very conservative modeling assumptions so as to ensure meeting the required safety margins. Traditional regulation, as established by the U.S. Nuclear Regulatory Commission, required conservatisms which have subsequently been shown to be excessive. The commission has therefore moved away from excessively conservative evaluations and has determined best-estimate calculations to be an acceptable alternative to conservative models, provided the best-estimate results are accompanied by an uncertainty evaluation which can demonstrate that, when a set of analysis cases which statistically account for uncertainties of all types are generated, there is a 95% probability that at least 95% of the cases meet the safety margins. To date, nearly all published work addressing uncertainty evaluations of nuclear power plant calculations has focused on light water reactors and on large-break loss-of-coolant accident (LBLOCA) analyses. However, there is nothing in the uncertainty evaluation methodologies that is limited to a specific type of reactor or to specific types of plant scenarios. These same methodologies can be equally well applied to analyses for high-temperature gas-cooled reactors and to liquid metal reactors, and they can be applied to steady-state calculations, operational transients, or severe accident scenarios. This report reviews and compares both statistical and deterministic uncertainty evaluation approaches. Recommendations are given for selection of an uncertainty methodology and for considerations to be factored into the process of evaluating uncertainties for advanced reactor best-estimate analyses.
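
    The 95/95 criterion quoted above is commonly met with order-statistics (Wilks) sample sizes rather than exhaustive sampling. A short sketch, assuming the standard one-sided Wilks formulation (one common statistical approach; the report itself surveys several methodologies):

    ```python
    import math

    def wilks_runs(coverage=0.95, confidence=0.95, order=1):
        """Smallest n such that the `order`-th largest of n runs bounds the
        `coverage` quantile with the given confidence (one-sided Wilks formula)."""
        n = order
        while True:
            # Probability that fewer than `order` of the n samples land above
            # the coverage quantile (i.e., the tolerance bound fails).
            beta = sum(math.comb(n, k) * coverage**k * (1 - coverage)**(n - k)
                       for k in range(n - order + 1, n + 1))
            if 1 - beta >= confidence:
                return n
            n += 1

    print(wilks_runs())         # 59 runs for a first-order 95/95 statement
    print(wilks_runs(order=2))  # 93 runs if the second-largest value is used
    ```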

  10. Aeroservoelastic Uncertainty Model Identification from Flight Data

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.

    2001-01-01

    Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. To date, aeroservoelastic data analysis has given insufficient attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge in this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to obtain input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

  11. Uncertainty and global climate change research

    SciTech Connect

    Tonn, B.E.; Weiher, R.

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics, and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessment using decision analytic techniques as a foundation is key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and policy recommendations for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  12. MOUSE UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with li...

  13. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  14. When prostate cancer remains undetectable: The dilemma.

    PubMed

    Mustafa, Mahmoud Othman; Pisters, Louis

    2015-03-01

    Since the first report on the efficacy of sextant biopsy under transrectal ultrasound guidance, there have been many modifications related to the total number of cores and the localization of biopsies to improve the prostate cancer (PCa) detection rate. The 2010 National Comprehensive Cancer Network Early PCa Detection Guidelines noted the 12-core biopsy scheme as the standard. However, this extended biopsy scheme still fails to detect 20% of high-grade PCa that can be detected by detailed pathological evaluation of radical prostatectomy; therefore, there is need for saturation biopsies. The existence of suspicions of PCa after previous negative biopsy or biopsies represents a valid indication for saturation biopsy. There has been no significant increment in morbidity or in insignificant PCa detection rates when a saturation biopsy scheme was used with an extended biopsy scheme. Along with the improvement in the PCa detection rate, accurate oncological mapping of PCa is another important consideration of saturation biopsies. The ideal number of cores and the diagnostic value of saturation biopsy after the failure of initial therapy are some of the issues that need to be addressed. Preliminary reports have shown that magnetic resonance imaging can improve the PCa detection rate, save patients from unnecessary biopsies, and decrease the need for a high number of cores; however, multiple limitations continue to exist. PMID:26328196

  15. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450°C, 600°C, and 800°C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments. In both the experiments and simulations, the

  16. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  17. Economic uncertainty and econophysics

    NASA Astrophysics Data System (ADS)

    Schinckus, Christophe

    2009-10-01

    The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields, uncertainty, and the ways of thinking about it developed by the two disciplines. After presenting the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework; econophysics, in contrast, does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporarily reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.

  18. PIV uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard

    2016-08-01

    This paper discusses the propagation of the instantaneous uncertainty of PIV measurements to statistical and instantaneous quantities of interest derived from the velocity field. The expression of the uncertainty of vorticity, velocity divergence, mean value and Reynolds stresses is derived. It is shown that the uncertainty of vorticity and velocity divergence requires the knowledge of the spatial correlation between the error of the x and y particle image displacement, which depends upon the measurement spatial resolution. The uncertainty of statistical quantities is often dominated by the random uncertainty due to the finite sample size and decreases with the square root of the effective number of independent samples. Monte Carlo simulations are conducted to assess the accuracy of the uncertainty propagation formulae. Furthermore, three experimental assessments are carried out. In the first experiment, a turntable is used to simulate a rigid rotation flow field. The estimated uncertainty of the vorticity is compared with the actual vorticity error root-mean-square, with differences between the two quantities within 5–10% for different interrogation window sizes and overlap factors. A turbulent jet flow is investigated in the second experimental assessment. The reference velocity, which is used to compute the reference value of the instantaneous flow properties of interest, is obtained with an auxiliary PIV system, which features a higher dynamic range than the measurement system. Finally, the uncertainty quantification of statistical quantities is assessed via PIV measurements in a cavity flow. The comparison between estimated uncertainty and actual error demonstrates the accuracy of the proposed uncertainty propagation methodology.
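
    The statistical-quantity result above can be illustrated with a short sketch: the random uncertainty of the mean velocity scales with the sample standard deviation divided by the square root of the effective number of independent samples. The synthetic time series and the autocorrelation-based estimate of N_eff below are illustrative assumptions, not the paper's data or exact estimator.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic velocity time series at one PIV vector location (m/s).
    N = 2000
    u = 5.0 + rng.normal(0.0, 0.6, size=N)

    # Effective number of independent samples: N_eff = N / (1 + 2 * sum(rho_k)),
    # summing the sample autocorrelation up to the first non-positive lag.
    def n_eff(x):
        xc = x - x.mean()
        acf = np.correlate(xc, xc, mode="full")[len(x) - 1:] / (xc @ xc)
        s = 0.0
        for rho in acf[1:]:
            if rho <= 0:
                break
            s += rho
        return len(x) / (1.0 + 2.0 * s)

    Ne = n_eff(u)
    u_unc = u.std(ddof=1) / np.sqrt(Ne)   # random uncertainty of the mean
    print(f"mean = {u.mean():.3f} m/s, uncertainty = {u_unc:.4f} m/s (N_eff = {Ne:.0f})")
    ```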

  19. Critical evaluation of parameter consistency and predictive uncertainty in hydrological modeling: A case study using Bayesian total error analysis

    NASA Astrophysics Data System (ADS)

    Thyer, Mark; Renard, Benjamin; Kavetski, Dmitri; Kuczera, George; Franks, Stewart William; Srikanthan, Sri

    2009-12-01

    The lack of a robust framework for quantifying the parametric and predictive uncertainty of conceptual rainfall-runoff (CRR) models remains a key challenge in hydrology. The Bayesian total error analysis (BATEA) methodology provides a comprehensive framework to hypothesize, infer, and evaluate probability models describing input, output, and model structural error. This paper assesses the ability of BATEA and standard calibration approaches (standard least squares (SLS) and weighted least squares (WLS)) to address two key requirements of uncertainty assessment: (1) reliable quantification of predictive uncertainty and (2) reliable estimation of parameter uncertainty. The case study presents a challenging calibration of the lumped GR4J model to a catchment with ephemeral responses and large rainfall gradients. Postcalibration diagnostics, including checks of predictive distributions using quantile-quantile analysis, suggest that while still far from perfect, BATEA satisfied its assumed probability models better than SLS and WLS. In addition, WLS/SLS parameter estimates were highly dependent on the selected rain gauge and calibration period. This will obscure potential relationships between CRR parameters and catchment attributes and prevent the development of meaningful regional relationships. Conversely, BATEA provided consistent, albeit more uncertain, parameter estimates and thus overcomes one of the obstacles to parameter regionalization. However, significant departures from the calibration assumptions remained even in BATEA, e.g., systematic overestimation of predictive uncertainty, especially in validation. This is likely due to the inferred rainfall errors compensating for simplified treatment of model structural error.
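
    The quantile-quantile diagnostic mentioned above can be sketched as a check of predictive reliability: compute the predictive quantile (PIT value) of each observation and compare the sorted values with uniform quantiles. The Gaussian predictive distributions and synthetic data below are illustrative stand-ins, not BATEA output; the example is deliberately constructed so that predictive uncertainty is overestimated, mirroring the kind of departure the paper reports.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)

    # Synthetic observations and Gaussian predictive distributions per time step.
    T = 300
    pred_mean = rng.normal(10.0, 2.0, size=T)
    pred_sd = np.full(T, 1.5)
    obs = pred_mean + rng.normal(0.0, 1.0, size=T)   # truth narrower than predicted

    # Predictive quantile (PIT value) of each observation.
    p = stats.norm.cdf(obs, loc=pred_mean, scale=pred_sd)

    # QQ data: sorted PIT values against uniform quantiles. Points on the 1:1
    # line indicate reliable predictive uncertainty; the S-shape produced here
    # indicates overestimated predictive uncertainty.
    theoretical = (np.arange(1, T + 1) - 0.5) / T
    print(np.column_stack([theoretical, np.sort(p)])[:5])
    ```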

  20. Body size prediction from juvenile skeletal remains.

    PubMed

    Ruff, Christopher

    2007-05-01

    There are currently no methods for predicting body mass from juvenile skeletal remains and only a very limited number for predicting stature. In this study, stature and body mass prediction equations are generated for each year from 1 to 17 years of age using a subset of the Denver Growth Study sample, followed longitudinally (n = 20 individuals, 340 observations). Radiographic measurements of femoral distal metaphyseal and head breadth are used to predict body mass and long bone lengths are used to predict stature. In addition, pelvic bi-iliac breadth and long bone lengths are used to predict body mass in older adolescents. Relative prediction errors are equal to or smaller than those associated with similar adult estimation formulae. Body proportions change continuously throughout growth, necessitating age-specific formulae. Adult formulae overestimate stature and body mass in younger juveniles, but work well in 17-year-olds from the sample, indicating that in terms of body proportions they are representative of the general population. To illustrate use of the techniques, they are applied to the juvenile Homo erectus (ergaster) KNM-WT 15000 skeleton. New body mass and stature estimates for this specimen are similar to previous estimates derived using other methods. Body mass estimates range from 50 to 53 kg, and stature was probably slightly under 157 cm, although a precise stature estimate is difficult to determine due to differences in linear body proportions between KNM-WT 15000 and the Denver reference sample. PMID:17295297

  1. Chandra Reveals Remains of Giant Eruption

    NASA Technical Reports Server (NTRS)

    2002-01-01

    This is a photo taken by NASA's Chandra X-ray Observatory that reveals the remains of an explosion in the form of two enormous arcs of multimillion-degree gas in the galaxy Centaurus A that appear to be part of a ring 25,000 light years in diameter. The size and location of the ring suggest that it could have been an explosion that occurred about 10 million years ago. A composite image made with radio (red and green), optical (yellow-orange), and X-ray data (blue) presents a stunning tableau of a turbulent galaxy. A broad band of dust and cold gas is bisected at an angle by opposing jets of high-energy particles blasting away from the supermassive black hole in the nucleus. Lying in a plane perpendicular to the jets are the two large arcs of X-ray emitting multimillion-degree gas. This discovery can help astronomers better understand the cause and effect of violent outbursts from the vicinity of supermassive black holes of active galaxies. The Chandra program is managed by the Marshall Space Flight Center in Huntsville, Alabama.

  2. Estimating the magnitude of prediction uncertainties for the APLE model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  3. Constructing the Uncertainty of Due Dates

    PubMed Central

    Vos, Sarah C.; Anthony, Kathryn E.; O'Hair, H. Dan

    2015-01-01

    By its nature, the date that a baby is predicted to be born, or the due date, is uncertain. How women construct the uncertainty of their due dates may have implications for when and how women give birth. In the United States as many as 15% of births occur before 39 weeks because of elective inductions or cesarean sections, putting these babies at risk for increased medical problems after birth and later in life. This qualitative study employs a grounded theory approach to understand the decisions women make about how and when to give birth. Thirty-three women who were pregnant or had given birth within the past two years participated in key informant or small group interviews. The results suggest that women interpret the uncertainty of their due dates both as a reason to wait for birth and as a reason to start the process early; however, information about a baby's brain development in the final weeks of pregnancy may persuade women to remain pregnant longer. The uncertainties of due dates are analyzed using Babrow's problematic integration, which distinguishes between epistemological and ontological uncertainty. The results point to a third type of uncertainty, axiological uncertainty. Axiological uncertainty is rooted in the values and ethics of outcomes. PMID:24266788

  4. Ghost Remains After Black Hole Eruption

    NASA Astrophysics Data System (ADS)

    2009-05-01

    NASA's Chandra X-ray Observatory has found a cosmic "ghost" lurking around a distant supermassive black hole. This is the first detection of such a high-energy apparition, and scientists think it is evidence of a huge eruption produced by the black hole. This discovery presents astronomers with a valuable opportunity to observe phenomena that occurred when the Universe was very young. The X-ray ghost, so-called because a diffuse X-ray source has remained after other radiation from the outburst has died away, is in the Chandra Deep Field-North, one of the deepest X-ray images ever taken. The source, a.k.a. HDF 130, is over 10 billion light years away and existed at a time 3 billion years after the Big Bang, when galaxies and black holes were forming at a high rate. "We'd seen this fuzzy object a few years ago, but didn't realize until now that we were seeing a ghost", said Andy Fabian of the Cambridge University in the United Kingdom. "It's not out there to haunt us, rather it's telling us something - in this case what was happening in this galaxy billions of years ago." Fabian and colleagues think the X-ray glow from HDF 130 is evidence for a powerful outburst from its central black hole in the form of jets of energetic particles traveling at almost the speed of light. When the eruption was ongoing, it produced prodigious amounts of radio and X-radiation, but after several million years, the radio signal faded from view as the electrons radiated away their energy. HDF 130 Chandra X-ray Image of HDF 130 However, less energetic electrons can still produce X-rays by interacting with the pervasive sea of photons remaining from the Big Bang - the cosmic background radiation. Collisions between these electrons and the background photons can impart enough energy to the photons to boost them into the X-ray energy band. This process produces an extended X-ray source that lasts for another 30 million years or so. "This ghost tells us about the black hole's eruption long after

  6. The challenges on uncertainty analysis for pebble bed HTGR

    SciTech Connect

    Hao, C.; Li, F.; Zhang, H.

    2012-07-01

    Uncertainty analysis is very popular and important, and much work has been done for Light Water Reactors (LWRs), although experience with uncertainty analysis in High Temperature Gas cooled Reactor (HTGR) modeling is still at an early stage. IAEA will soon launch a Coordinated Research Project (CRP) on this topic. This paper addresses some challenges for uncertainty analysis in HTGR modeling, based on the experience of the OECD LWR Uncertainty Analysis in Modeling (UAM) activities, and taking into account the peculiarities of pebble bed HTGR designs. The main challenges for HTGR UAM are: the lack of experience, the totally different code packages, and the coupling of power distribution, temperature distribution and burnup distribution through the temperature feedback and pebble flow. The most serious challenge is how to deal with the uncertainty in pebble flow, the uncertainty in pebble bed flow modeling, and their contribution to the uncertainty of the maximum fuel temperature, which is the parameter of greatest interest for the modular HTGR. (authors)

  7. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
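
    The Polynomial Chaos idea mentioned above can be sketched in one dimension: project a toy model of a single standard-normal parameter onto probabilists' Hermite polynomials using Gauss-Hermite quadrature, then read the output mean and variance off the coefficients. The toy model is an assumption for illustration; the CLM application is vastly larger, but the mechanics are the same.

    ```python
    import math
    import numpy as np

    # Toy "model" of a single uncertain parameter xi ~ N(0, 1).
    def model(xi):
        return np.exp(0.3 * xi) + 0.1 * xi**2

    # Gauss-Hermite nodes/weights (physicists' convention), mapped to
    # expectations under N(0, 1) via x -> sqrt(2) * x and w -> w / sqrt(pi).
    nodes, weights = np.polynomial.hermite.hermgauss(30)
    xi = np.sqrt(2.0) * nodes
    w = weights / np.sqrt(np.pi)

    # Probabilists' Hermite polynomials: He_k = x * He_{k-1} - (k-1) * He_{k-2}.
    order = 6
    He = [np.ones_like(xi), xi]
    for k in range(2, order + 1):
        He.append(xi * He[k - 1] - (k - 1) * He[k - 2])

    # Spectral projection: c_k = E[y * He_k] / k!  (He_k has norm k! under N(0,1)).
    y = model(xi)
    c = [np.sum(w * y * He[k]) / math.factorial(k) for k in range(order + 1)]

    mean = c[0]
    var = sum(math.factorial(k) * c[k]**2 for k in range(1, order + 1))
    print(f"PC mean = {mean:.4f}, PC std = {var**0.5:.4f}")
    ```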

  8. Estimating uncertainty of streamflow simulation using Bayesian neural networks

    NASA Astrophysics Data System (ADS)

    Zhang, Xuesong; Liang, Faming; Srinivasan, Raghavan; van Liew, Michael

    2009-02-01

    Recent studies have shown that Bayesian neural networks (BNNs) are powerful tools for providing reliable hydrologic prediction and quantifying the prediction uncertainty. The reasonable estimation of the prediction uncertainty, a valuable tool for decision making to address water resources management and design problems, is influenced by the techniques used to deal with different uncertainty sources. In this study, four types of BNNs with different treatments of the uncertainties related to parameters (neural network's weights) and model structures were applied for uncertainty estimation of streamflow simulation in two U.S. Department of Agriculture Agricultural Research Service watersheds (Little River Experimental Watershed in Georgia and Reynolds Creek Experimental Watershed in Idaho). An advanced Markov chain Monte Carlo algorithm, evolutionary Monte Carlo, was used to train the BNNs and to estimate uncertainty limits of streamflow simulation. The results obtained in these two case study watersheds show that the 95% uncertainty limits estimated by different types of BNNs are different from each other. The BNNs that only consider the parameter uncertainty with noninformative prior knowledge contain the least number of observed streamflow data in their 95% uncertainty bound. By considering variable model structure and informative prior knowledge, the BNNs can provide more reasonable quantification of the uncertainty of streamflow simulation. This study stresses the need for improving understanding and quantifying methods of different uncertainty sources for effective estimation of uncertainty of hydrologic simulation using BNNs.
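
    The comparison metric used in the study, how many observations fall inside the 95% uncertainty limits, can be sketched as follows; the posterior predictive samples below are synthetic stand-ins for BNN output, not results from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)

    # Stand-in for BNN output: posterior predictive samples of streamflow at
    # each of T time steps (rows: posterior draws, columns: time steps).
    T, draws = 200, 1000
    truth = rng.gamma(2.0, 5.0, size=T)
    samples = truth * rng.lognormal(0.0, 0.25, size=(draws, T))

    lower, upper = np.percentile(samples, [2.5, 97.5], axis=0)
    obs = truth * rng.lognormal(0.0, 0.25, size=T)   # synthetic observations

    # Coverage should be near 0.95 if the predictive uncertainty is reliable;
    # band width measures how informative the limits are.
    coverage = np.mean((obs >= lower) & (obs <= upper))
    avg_width = np.mean(upper - lower)
    print(f"95% bound coverage = {coverage:.2%}, mean band width = {avg_width:.1f}")
    ```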

  9. Ciguatera: recent advances but the risk remains.

    PubMed

    Lehane, L; Lewis, R J

    2000-11-01

    Ciguatera is an important form of human poisoning caused by the consumption of seafood. The disease is characterised by gastrointestinal, neurological and cardiovascular disturbances. In cases of severe toxicity, paralysis, coma and death may occur. There is no immunity, and the toxins are cumulative. Symptoms may persist for months or years, or recur periodically. The epidemiology of ciguatera is complex and of central importance to the management and future use of marine resources. Ciguatera is an important medical entity in tropical and subtropical Pacific and Indian Ocean regions, and in the tropical Caribbean. As reef fish are increasingly exported to other areas, it has become a world health problem. The disease is under-reported and often misdiagnosed. Lipid-soluble, polyether toxins known as ciguatoxins accumulated in the muscles of certain subtropical and tropical marine finfish cause ciguatera. Ciguatoxins arise from biotransformation in the fish of less polar ciguatoxins (gambiertoxins) produced by Gambierdiscus toxicus, a marine dinoflagellate that lives on macroalgae, usually attached to dead coral. The toxins and their metabolites are concentrated in the food chain when carnivorous fish prey on smaller herbivorous fish. Humans are exposed at the end of the food chain. More than 400 species of fish can be vectors of ciguatoxins, but generally only a relatively small number of species are regularly incriminated in ciguatera. Ciguateric fish look, taste and smell normal, and detection of toxins in fish remains a problem. More than 20 precursor gambiertoxins and ciguatoxins have been identified in G. toxicus and in herbivorous and carnivorous fish. The toxins become more polar as they undergo oxidative metabolism and pass up the food chain. The main Pacific ciguatoxin (P-CTX-1) causes ciguatera at levels ≥0.1 microg/kg in the flesh of carnivorous fish. The main Caribbean ciguatoxin (C-CTX-1) is less polar and 10-fold less toxic than P-CTX-1. Ciguatoxins

  10. Cascading rainfall uncertainties into 2D inundation impact models

    NASA Astrophysics Data System (ADS)

    Souvignet, Maxime; de Almeida, Gustavo; Champion, Adrian; Garcia Pintado, Javier; Neal, Jeff; Freer, Jim; Cloke, Hannah; Odoni, Nick; Coxon, Gemma; Bates, Paul; Mason, David

    2013-04-01

    Existing precipitation products show differences in their spatial and temporal distribution, and several studies have shown how these differences influence the ability to predict hydrological responses. However, the atmospheric-hydrologic-hydraulic uncertainty cascade is seldom explored, and how input uncertainties propagate through this cascade is still poorly understood. Such a cascade requires a combination of modelling capabilities: rainfall forecasts, runoff-generation predictions based on those forecasts, and hydraulic flood-wave propagation based on the runoff predictions. Accounting for uncertainty in each component is important in decision making for issuing flood warnings, monitoring or planning. We suggest that a better understanding of uncertainties in inundation impact modelling must consider these differences in rainfall products; this will improve our understanding of the effect of input uncertainties on our predictive capability. In this paper, we propose to address this issue by exploring the effects of errors in rainfall on inundation predictive capacity within an uncertainty framework, i.e. testing inundation uncertainty against different comparable meteorological conditions (i.e. using different rainfall products). Our method cascades rainfall uncertainties into a lumped hydrologic model (FUSE) within the GLUE uncertainty framework. The resultant prediction uncertainties in discharge provide uncertain boundary conditions, which are cascaded into a simplified shallow-water 2D hydraulic model (LISFLOOD-FP). Rainfall data captured by three different measurement techniques - rain gauges, gridded data and numerical weather prediction (NWP) models - are used to assess the combined input data and model parameter uncertainty. The study is performed in the Severn catchment over the period between June and July 2007, when a series of rainfall events caused record floods in the study area. Changes in flood area extent are compared and the uncertainty envelope is
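    The GLUE step of the cascade can be caricatured as follows (a toy Python sketch under stated assumptions: toy_hydrologic_model is a stand-in for FUSE, and Nash-Sutcliffe efficiency serves as an informal likelihood; real GLUE applications weight quantiles by the likelihood measure):

        # Sketch: behavioural-run selection and an uncertainty band on discharge.
        import numpy as np

        rng = np.random.default_rng(2)

        def toy_hydrologic_model(theta, rain):
            k, gain = theta                      # recession and gain parameters
            q, store = np.zeros_like(rain), 0.0
            for t, r in enumerate(rain):
                store += gain * r
                q[t] = k * store
                store -= q[t]
            return q

        rain = rng.gamma(2.0, 2.0, size=100)
        q_obs = toy_hydrologic_model((0.3, 0.8), rain) + rng.normal(0, 0.3, 100)

        theta = rng.uniform([0.05, 0.2], [0.9, 1.5], size=(5000, 2))
        sims = np.array([toy_hydrologic_model(th, rain) for th in theta])

        nse = 1 - ((sims - q_obs) ** 2).sum(1) / ((q_obs - q_obs.mean()) ** 2).sum()
        q_ens = sims[nse > 0]                    # behavioural runs only
        band = np.percentile(q_ens, [5, 95], axis=0)
        print(f"{len(q_ens)} behavioural runs; mean band width "
              f"{np.mean(band[1] - band[0]):.2f} m^3/s")

    The resulting discharge band is what would be handed to the hydraulic model as an uncertain boundary condition.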

  11. Bayesian calibration of coarse-grained forces: Efficiently addressing transferability

    NASA Astrophysics Data System (ADS)

    Patrone, Paul N.; Rosch, Thomas W.; Phelan, Frederick R.

    2016-04-01

    Generating and calibrating forces that are transferable across a range of state-points remains a challenging task in coarse-grained (CG) molecular dynamics. In this work, we present a coarse-graining workflow, inspired by ideas from uncertainty quantification and numerical analysis, to address this problem. The key idea behind our approach is to introduce a Bayesian correction algorithm that uses functional derivatives of CG simulations to rapidly and inexpensively recalibrate initial estimates f0 of forces anchored by standard methods such as force-matching. Taking density-temperature relationships as a running example, we demonstrate that this algorithm, in concert with various interpolation schemes, can be used to efficiently compute physically reasonable force curves on a fine grid of state-points. Importantly, we show that our workflow is robust to several choices available to the modeler, including the interpolation schemes and tools used to construct f0. In a related vein, we also demonstrate that our approach can speed up coarse-graining by reducing the number of atomistic simulations needed as inputs to standard methods for generating CG forces.
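    In one dimension the correction idea can be sketched as a cheap Bayesian update of a single force parameter anchored at a force-matching estimate f0 (hypothetical throughout: cg_density stands in for a short CG simulation, whereas the paper's method operates on full force curves via functional derivatives):

        # Sketch: grid posterior for a force-scaling parameter at one state point.
        import numpy as np

        def cg_density(f):
            # Stand-in for a short CG run returning density for force scale f.
            return 0.90 + 0.25 * (f - 1.0) - 0.05 * (f - 1.0) ** 2

        f0 = 1.00                       # initial estimate from force-matching
        rho_target, sigma = 0.97, 0.01  # target density and its tolerance

        f_grid = np.linspace(0.5, 1.5, 2001)
        log_post = (-0.5 * ((f_grid - f0) / 0.2) ** 2           # prior around f0
                    - 0.5 * ((cg_density(f_grid) - rho_target) / sigma) ** 2)
        post = np.exp(log_post - log_post.max())
        post /= post.sum() * (f_grid[1] - f_grid[0])            # normalize

        f_map = f_grid[np.argmax(post)]
        print(f"recalibrated parameter: {f_map:.3f} (started from {f0:.2f})")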

  12. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  13. Evaluating prediction uncertainty

    SciTech Connect

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
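    The variance-ratio indicator can be sketched for a toy nonlinear model (illustrative Python, not McKay's code; the simple level/replicate design below only approximates replicated Latin hypercube sampling):

        # Sketch: estimate Var(E[Y|X_i]) / Var(Y) for each input.
        import numpy as np

        rng = np.random.default_rng(3)

        def model(x):
            # Nonlinear test model; no linearity assumption is needed.
            return np.sin(x[:, 0]) + 0.3 * x[:, 1] ** 2 + 0.01 * x[:, 2]

        n_levels, n_reps, n_inputs = 50, 40, 3
        levels = rng.uniform(-np.pi, np.pi, size=(n_levels, n_inputs))

        for i in range(n_inputs):
            # Hold X_i at each level; resample the other inputs (replicates).
            x = rng.uniform(-np.pi, np.pi, size=(n_levels, n_reps, n_inputs))
            x[:, :, i] = levels[:, [i]]
            y = model(x.reshape(-1, n_inputs)).reshape(n_levels, n_reps)
            ratio = y.mean(axis=1).var() / y.var()
            print(f"input {i}: estimated variance ratio = {ratio:.2f}")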

  14. Conundrums with uncertainty factors.

    PubMed

    Cooke, Roger

    2010-03-01

    The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets. PMID:20030767
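    The probabilistic reading that the abstract questions can be stated in a few lines (an illustrative Monte Carlo with hypothetical lognormal factors whose 95th percentiles sit near the traditional value of 10; not an endorsement of the assumptions the abstract criticizes):

        # Sketch: reference value as a random variable built from uncertainty factors.
        import numpy as np

        rng = np.random.default_rng(4)
        n = 100_000
        noael = 10.0                  # mg/kg-day, illustrative point of departure

        # Lognormal factors with median 3 and 95th percentile near 10.
        uf_animal_to_human = rng.lognormal(np.log(3.0), 0.73, size=n)
        uf_human_variability = rng.lognormal(np.log(3.0), 0.73, size=n)

        rfd = noael / (uf_animal_to_human * uf_human_variability)
        print(f"median RfD = {np.median(rfd):.3f} mg/kg-day, "
              f"5th percentile = {np.percentile(rfd, 5):.3f} mg/kg-day")

    The independence assumed in those two sampling lines is exactly the kind of assumption the abstract argues should be scrutinized.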

  15. Atomic data for stellar spectroscopy: recent successes and remaining needs

    NASA Astrophysics Data System (ADS)

    Sneden, Christopher; Lawler, James E.; Wood, Michael P.; Den Hartog, Elizabeth A.; Cowan, John J.

    2014-11-01

    Stellar chemical composition analyses provide vital insights into galactic nucleosynthesis. Atomic line data are critical inputs to stellar abundance computations. Recent lab studies have made significant progress in refining and extending knowledge of transition probabilities, isotopic wavelength shifts, and hyperfine substructure patterns for the absorption lines that are of most interest to stellar spectroscopists. The observable neutron-capture (n-capture) element species (Z > 30) have been scrutinized in lab studies by several groups. For many species the uncertainties in experimental oscillator strengths are ≤ 10%, which permits detailed assessment of rapid and slow n-capture nucleosynthesis contributions. In this review, extreme examples of r-process-enriched stars in the galactic halo will be shown, which suggest that the description of observable n-capture abundances in these stars is nearly complete. Unfortunately, there are serious remaining concerns about the reliability of observed abundances of lighter elements. In particular, it is not clear that line formation in real stellar atmospheres is being modeled correctly. But for many elements with Z < 30 the atomic transition data are not yet settled. Highlights will be given of some recent large improvements, with suggestions for the most important needs for the near future.

  16. Uncertainty, conflict and consent: revisiting the futility debate in neurotrauma.

    PubMed

    Honeybul, Stephen; Gillett, Grant R; Ho, Kwok M

    2016-07-01

    The concept of futility has been debated for many years, and a precise definition remains elusive. This is not entirely surprising given the increasingly complex and evolving nature of modern medicine. Progressively more complex decisions are required when considering increasingly sophisticated diagnostic and therapeutic interventions. Allocating resources appropriately amongst a population whose expectations continue to increase raises a number of ethical issues, not least of which are the difficulties encountered when consideration is being given to withholding "life-preserving" treatment. In this discussion we have used decompressive craniectomy for severe traumatic brain injury as a clinical example with which to frame an approach to the concept. We have defined those issues that initially lead us to consider futility and thereafter actually provoke a significant discussion. We contend that these issues are uncertainty, conflict and consent. We then examine recent scientific advances in outcome prediction that may address some of the uncertainty and perhaps help achieve consensus amongst stakeholders. Whilst we do not anticipate that this re-framing of the idea of futility is applicable to all medical situations, the approach to specify patient-centred benefit may assist those making such decisions when patients are incompetent to participate. PMID:27143027

  17. Physics and Operational Research: measure of uncertainty via Nonlinear Programming

    NASA Astrophysics Data System (ADS)

    Davizon-Castillo, Yasser A.

    2008-03-01

    Physics and Operational Research present an interdisciplinary interaction in problems such as Quantum Mechanics, Classical Mechanics and Statistical Mechanics. The nonlinear nature of the physical phenomena in single-well and double-well quantum systems is resolved via Nonlinear Programming (NLP) techniques (Kuhn-Tucker conditions, Dynamic Programming), subject to the Heisenberg Uncertainty Principle and an extended equality uncertainty relation that exploits the NLP Lagrangian method. This review also addresses problems in Kinematics and Thermal Physics, developing uncertainty relations for each case of study as a novel way to quantify uncertainty.
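    As an illustration of the Lagrangian treatment described above (a worked toy example, not taken from the paper), one can minimize a joint-spread objective subject to an equality form of the Heisenberg relation:

        % Illustrative only: equality-constrained NLP with the uncertainty
        % relation as the constraint, solved by stationarity of the Lagrangian.
        \begin{align*}
          \min_{\Delta x,\,\Delta p > 0} \; & \Delta x^{2} + \Delta p^{2}
          \quad \text{s.t.} \quad \Delta x\,\Delta p = \tfrac{\hbar}{2}, \\
          \mathcal{L} &= \Delta x^{2} + \Delta p^{2}
            - \lambda\!\left(\Delta x\,\Delta p - \tfrac{\hbar}{2}\right), \\
          \partial_{\Delta x}\mathcal{L} = \partial_{\Delta p}\mathcal{L} = 0
          \;\Rightarrow\; 2\Delta x = \lambda\,\Delta p,\;\; 2\Delta p = \lambda\,\Delta x
          \;\Rightarrow\; \Delta x = \Delta p = \sqrt{\hbar/2},\;\; \lambda = 2.
        \end{align*}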

  18. 2014 ASHG Awards and Addresses

    PubMed Central

    2015-01-01

    Each year at the annual meeting of The American Society of Human Genetics (ASHG), addresses are given in honor of The Society and a number of award winners. A summary of each of these addresses is given below. On the following pages, we have printed the presidential address and the addresses for the William Allan Award, the Curt Stern Award, and the Victor A. McKusick Leadership Award. Webcasts of these addresses, as well as those of many other presentations, can be found at http://www.ashg.org.

  19. 2013 ASHG Awards and Addresses

    PubMed Central

    2014-01-01

    Each year at the annual meeting of The American Society of Human Genetics (ASHG), addresses are given in honor of The Society and a number of award winners. A summary of each of these addresses is given below. On the following pages, we have printed the Presidential Address and the addresses for the William Allan Award, the Curt Stern Award, and the Victor A. McKusick Leadership Award. Webcasts of these addresses, as well as those of many other presentations, can be found at http://www.ashg.org.

  20. Uncertainty of Pyrometers in a Casting Facility

    SciTech Connect

    Mee, D.K.; Elkins, J.E.; Fleenor, R.M.; Morrision, J.M.; Sherrill, M.W.; Seiber, L.E.

    2001-12-07

    This work has established uncertainty limits for the EUO filament pyrometers, digital pyrometers, two-color automatic pyrometers, and the standards used to certify these instruments (Table 1). If symmetrical limits are used, filament pyrometers calibrated in Production have certification uncertainties of not more than ±20.5 °C traceable to NIST over the certification period. Uncertainties of these pyrometers were roughly ±14.7 °C before introduction of the working standard that allowed certification in the field. Digital pyrometers addressed in this report have symmetrical uncertainties of not more than ±12.7 °C or ±18.1 °C when certified on a Y-12 Standards Laboratory strip lamp or in a production area tube furnace, respectively. Uncertainty estimates for automatic two-color pyrometers certified in Production are ±16.7 °C. Additional uncertainty and bias are introduced when measuring production melt temperatures. A -19.4 °C bias was measured in a large 1987 data set; it is believed to be caused primarily by the use of Pyrex™ windows (not present in the current configuration) and window fogging. Large variability (2σ = 28.6 °C) exists in the first 10 min of the hold period. This variability is attributed to emissivity variation across the melt and reflection from hot surfaces. For runs with hold periods extending to 20 min, the uncertainty approaches the calibration uncertainty of the pyrometers. When certifying pyrometers on a strip lamp at the Y-12 Standards Laboratory, it is important to limit ambient temperature variation (23 ± 4 °C), to order calibration points from high to low temperatures, to allow 6 min for the lamp to reach thermal equilibrium (12 min for certifications below 1200 °C) to minimize pyrometer bias, and to calibrate the pyrometer if error exceeds vendor specifications. A procedure has been written to assure conformance.
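    The quoted certification limits are the kind of numbers produced by a standard root-sum-square error budget; schematically (a generic GUM-style formula with placeholder component names, not the report's actual budget):

        % Independent components add in quadrature; k = 2 gives ~95% coverage.
        \begin{equation*}
          u_c = \sqrt{\,u_{\mathrm{std}}^{2} + u_{\mathrm{cal}}^{2}
                      + u_{\mathrm{drift}}^{2} + u_{\mathrm{repeat}}^{2}\,},
          \qquad U = k\,u_c,\quad k = 2.
        \end{equation*}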

  1. Classification images with uncertainty

    PubMed Central

    Tjan, Bosco S.; Nandy, Anirvan S.

    2009-01-01

    Classification image and other similar noise-driven linear methods have found increasingly wider applications in revealing psychophysical receptive field structures or perceptual templates. These techniques are relatively easy to deploy, and the results are simple to interpret. However, being a linear technique, the utility of the classification-image method is believed to be limited. Uncertainty about the target stimuli on the part of an observer will result in a classification image that is the superposition of all possible templates for all the possible signals. In the context of a well-established uncertainty model, which pools the outputs of a large set of linear frontends with a max operator, we show analytically, in simulations, and with human experiments that the effect of intrinsic uncertainty can be limited or even eliminated by presenting a signal at a relatively high contrast in a classification-image experiment. We further argue that the subimages from different stimulus-response categories should not be combined, as is conventionally done. We show that when the signal contrast is high, the subimages from the error trials contain a clear high-contrast image that is negatively correlated with the perceptual template associated with the presented signal, relatively unaffected by uncertainty. The subimages also contain a “haze” that is of a much lower contrast and is positively correlated with the superposition of all the templates associated with the erroneous response. In the case of spatial uncertainty, we show that the spatial extent of the uncertainty can be estimated from the classification subimages. We link intrinsic uncertainty to invariance and suggest that this signal-clamped classification-image method will find general applications in uncovering the underlying representations of high-level neural and psychophysical mechanisms. PMID:16889477

  2. Quantification of the uncertainties in the prediction of extinction of hydrogen-air diffusion flames

    NASA Astrophysics Data System (ADS)

    Kseib, Nicolas; Urzay, Javier; Iaccarino, Gianluca

    2011-11-01

    The study of the physical processes that lead to extinction of flames in gaseous hydrogen-air non-premixed combustion is of paramount importance for the reliable design of power plants and advanced propulsion systems in automobiles and hypersonic aircraft. However, several uncertainties remain in the experimental quantification of the reaction rates of elementary steps in most hydrogen-air mechanisms, which can lead to hazards in hydrogen handling and to engine malfunction. In this study, the effects of aleatory uncertainties in the chemical reaction-rate constants on hydrogen-air counterflow diffusion-flame extinction processes are addressed, with a probabilistic representation of the uncertain parameters sampled with a Markov-Chain Monte Carlo algorithm. Measurements of the reaction-rate constants and their associated uncertainty factors, reported earlier for the Stanford hydrogen-air detailed chemical mechanism, are used to study the propagation of uncertainties in the calculation of scalar dissipation rates at extinction. Non-intrusive methods are used to analyze the variabilities, with the probability density function of the scalar dissipation rate being sampled around regions involving flame extinction and global sensitivity indices being computed by Monte Carlo sampling.

  3. Visualization of Uncertainty

    NASA Astrophysics Data System (ADS)

    Jones, P. W.; Strelitz, R. A.

    2012-12-01

    The output of a simulation is best comprehended through the agency and methods of visualization, but a vital component of good science is knowledge of uncertainty. While great strides have been made in the quantification of uncertainty, especially in simulation, there is still a notable gap: there is no widely accepted means of simultaneously viewing the data and the associated uncertainty in one pane. Visualization saturates the screen, using the full range of color, shadow, opacity and tricks of perspective to display even a single variable. There is no room left in the visualization expert's repertoire for uncertainty. We present a method of visualizing uncertainty without sacrificing the clarity and power of the underlying visualization, one that works as well in 3-D and time-varying visualizations as it does in 2-D. At its heart, it relies on a principal tenet of continuum mechanics, replacing the notion of value at a point with a more diffuse notion of density as a measure of content in a region. First, the uncertainties calculated or tabulated at each point are transformed into a piecewise continuous field of uncertainty density. We next compute a weighted Voronoi tessellation of a user-specified number N of convex polygonal/polyhedral cells such that each cell contains the same amount of uncertainty as defined by that density field; the problem thus devolves into a constrained minimization over the tessellation. Computation of such a spatial decomposition is O(N²), and it can be performed iteratively, making it both fast and easy to update over time. The polygonal mesh does not interfere with the visualization of the data and can be easily toggled on or off. In this representation, a small cell implies a great concentration of uncertainty, and conversely. The content-weighted polygons are identical to the cartogram familiar to the information visualization community from the depiction of quantities such as voting results per state. Furthermore, one can dispense with the mesh or edges entirely, replacing them with symbols or glyphs

  4. Uncertainty in quantum mechanics: faith or fantasy?

    PubMed

    Penrose, Roger

    2011-12-13

    The word 'uncertainty', in the context of quantum mechanics, usually evokes an impression of an essential unknowability of what might actually be going on at the quantum level of activity, as is made explicit in Heisenberg's uncertainty principle, and in the fact that the theory normally provides only probabilities for the results of quantum measurement. These issues limit our ultimate understanding of the behaviour of things, if we take quantum mechanics to represent an absolute truth. But they do not cause us to put that very 'truth' into question. This article addresses the issue of quantum 'uncertainty' from a different perspective, raising the question of whether this term might be applied to the theory itself, despite its unrefuted huge success over an enormously diverse range of observed phenomena. There are, indeed, seeming internal contradictions in the theory that lead us to infer that a total faith in it at all levels of scale leads us to almost fantastical implications. PMID:22042902

  5. Model development and data uncertainty integration

    SciTech Connect

    Swinhoe, Martyn Thomas

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross-section perturbations, and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity; both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated; the most significant parameters are the basic emission rates of the spontaneous fission and (α,n) processes, and the uncertainties and important data depend on the analysis technique chosen.

  6. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on. PMID:25090127

  7. Low uncertainty method for inertia tensor identification

    NASA Astrophysics Data System (ADS)

    Barreto, J. P.; Muñoz, L. E.

    2016-02-01

    The uncertainty associated with the experimental identification of the inertia tensor can be reduced by implementing adequate rotational and translational motions in the experiment. This paper proposes a particular 3D trajectory that improves the experimental measurement of the inertia tensor of rigid bodies. Such a trajectory corresponds to a motion in which the object is rotated around a large number of instantaneous axes while the center of gravity remains static. The uncertainty in the inertia tensor components obtained with this practice is reduced by 45% on average, compared with those calculated using simple rotations around three perpendicular axes (Roll, Pitch, Yaw).

  8. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), which technologies will be available (technological uncertainty and choices), when we choose to start acting globally on climate change (political choices), and how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  9. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.

  10. Measurement uncertainty relations

    NASA Astrophysics Data System (ADS)

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-01

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
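    Schematically, and under the paper's setting of canonically conjugate observables (the precise error measures are defined in the paper itself), the preparation and measurement relations share the same form and, per the abstract, the same optimal constant:

        \begin{equation*}
          \Delta(Q)\,\Delta(P) \;\ge\; \frac{\hbar}{2}
          \qquad\text{and}\qquad
          \varepsilon(Q)\,\varepsilon(P) \;\ge\; \frac{\hbar}{2},
        \end{equation*}
        % where \Delta denotes preparation spreads and \varepsilon the errors of
        % an approximate joint measurement of Q and P.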

  11. Serenity in political uncertainty.

    PubMed

    Doumit, Rita; Afifi, Rema A; Devon, Holli A

    2015-01-01

    College students are often faced with academic and personal stressors that threaten their well-being. Added to that may be political and environmental stressors such as acts of violence on the streets, interruptions in schooling, car bombings, targeted religious intimidations, financial hardship, and uncertainty of obtaining a job after graduation. Research on how college students adapt to the latter stressors is limited. The aims of this study were (1) to investigate the associations between stress, uncertainty, resilience, social support, withdrawal coping, and well-being for Lebanese youth during their first year of college and (2) to determine whether these variables predicted well-being. A sample of 293 first-year students enrolled in a private university in Lebanon completed a self-reported questionnaire in the classroom setting. The mean age of sample participants was 18.1 years, with nearly an equal percentage of males and females (53.2% vs 46.8%), who lived with their family (92.5%), and whose family reported high income levels (68.4%). Multiple regression analyses revealed that best determinants of well-being are resilience, uncertainty, social support, and gender that accounted for 54.1% of the variance. Despite living in an environment of frequent violence and political uncertainty, Lebanese youth in this study have a strong sense of well-being and are able to go on with their lives. This research adds to our understanding on how adolescents can adapt to stressors of frequent violence and political uncertainty. Further research is recommended to understand the mechanisms through which young people cope with political uncertainty and violence. PMID:25658930

  12. Weighted Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-03-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given and an optimal lower bound for the weighted sum of the variances is obtained in general quantum situation.

  13. Weighted Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-01-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given and an optimal lower bound for the weighted sum of the variances is obtained in general quantum situation. PMID:26984295
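    For orientation, one commonly quoted Maccone-Pati sum-of-variance relation that these weighted relations generalize reads as follows (stated from the literature, with the sign chosen so the right-hand side is nontrivial and |ψ⊥⟩ any state orthogonal to |ψ⟩):

        \begin{equation*}
          \Delta A^{2} + \Delta B^{2} \;\ge\;
          \pm\, i\,\langle [A,B] \rangle
          + \bigl|\langle \psi | A \pm i B | \psi^{\perp} \rangle\bigr|^{2}.
        \end{equation*}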

  14. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  15. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The *proper characterization of uncertainty* in the orbital state of a space object is a common requirement to many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism"; the proper characterization of both the state and covariance and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state *probability density function (PDF)* is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs are formulated which more accurately characterize the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the *same computational cost as the traditional unscented Kalman filter* with the former able to maintain a proper characterization of the uncertainty for up to *ten

  16. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  17. Methods for Assessing Uncertainties in Climate Change, Impacts and Responses (Invited)

    NASA Astrophysics Data System (ADS)

    Manning, M. R.; Swart, R.

    2009-12-01

    Assessing the scientific uncertainties or confidence levels for the many different aspects of climate change is particularly important because of the seriousness of potential impacts and the magnitude of economic and political responses that are needed to mitigate climate change effectively. This has made the treatment of uncertainty and confidence a key feature in the assessments carried out by the Intergovernmental Panel on Climate Change (IPCC). Because climate change is very much a cross-disciplinary area of science, adequately dealing with uncertainties requires recognition of their wide range and different perspectives on assessing and communicating those uncertainties. The structural differences that exist across disciplines are often embedded deeply in the corresponding literature that is used as the basis for an IPCC assessment. The assessment of climate change science by the IPCC has from its outset tried to report the levels of confidence and uncertainty in the degree of understanding in both the underlying multi-disciplinary science and in projections for future climate. The growing recognition of the seriousness of this led to the formation of a detailed approach for consistent treatment of uncertainties in the IPCC’s Third Assessment Report (TAR) [Moss and Schneider, 2000]. However, in completing the TAR there remained some systematic differences between the disciplines raising concerns about the level of consistency. So further consideration of a systematic approach to uncertainties was undertaken for the Fourth Assessment Report (AR4). The basis for the approach used in the AR4 was developed at an expert meeting of scientists representing many different disciplines. This led to the introduction of a broader way of addressing uncertainties in the AR4 [Manning et al., 2004] which was further refined by lengthy discussions among many IPCC Lead Authors, for over a year, resulting in a short summary of a standard approach to be followed for that

  18. Population growth of Yellowstone grizzly bears: Uncertainty and future monitoring

    USGS Publications Warehouse

    Harris, R.B.; White, Gary C.; Schwartz, C.C.; Haroldson, M.A.

    2007-01-01

    Grizzly bears (Ursus arctos) in the Greater Yellowstone Ecosystem of the US Rocky Mountains have recently increased in numbers, but remain vulnerable due to isolation from other populations and predicted reductions in favored food resources. Harris et al. (2006) projected how this population might fare in the future under alternative survival rates, and in doing so estimated the rate of population growth, 1983–2002. We address issues that remain from that earlier work: (1) the degree of uncertainty surrounding our estimates of the rate of population change (λ); (2) the effect of correlation among demographic parameters on these estimates; and (3) how a future monitoring system using counts of females accompanied by cubs might usefully differentiate between short-term, expected, and inconsequential fluctuations versus a true change in system state. We used Monte Carlo re-sampling of beta distributions derived from the demographic parameters used by Harris et al. (2006) to derive distributions of λ during 1983–2002 given our sampling uncertainty. Approximate 95% confidence intervals were 0.972–1.096 (assuming females with unresolved fates died) and 1.008–1.115 (with unresolved females censored at last contact). We used well-supported models of Haroldson et al. (2006) and Schwartz et al. (2006a,b,c) to assess the strength of correlations among demographic processes and the effect of omitting them in projection models. Incorporating correlations among demographic parameters yielded point estimates of λ that were nearly identical to those from the earlier model that omitted correlations, but yielded wider confidence intervals surrounding λ. Finally, we suggest that fitting linear and quadratic curves to the trend suggested by the estimated number of females with cubs in the ecosystem, and using AICc model weights to infer population sizes and λ provides an objective means to monitoring approximate population trajectories in addition to demographic
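    The re-sampling scheme can be caricatured with a deliberately simplified demographic model (hypothetical Python; the one-line lambda below is a toy stand-in, not the bears' actual life-cycle model or the parameter values of Harris et al.):

        # Sketch: Monte Carlo re-sampling of beta-distributed demographic rates.
        import numpy as np

        rng = np.random.default_rng(5)
        n = 10_000

        adult_survival = rng.beta(95, 5, size=n)   # mean ~0.95, illustrative
        recruitment = rng.beta(20, 80, size=n)     # mean ~0.20, illustrative

        lam = adult_survival + recruitment         # toy stand-in for lambda
        lo, hi = np.percentile(lam, [2.5, 97.5])
        print(f"lambda ~ {lam.mean():.3f}, 95% interval ({lo:.3f}, {hi:.3f})")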

  19. Uncertainty Quantification for Quantitative Imaging Holdup Measurements

    SciTech Connect

    Bevill, Aaron M; Bledsoe, Keith C

    2016-01-01

    In nuclear fuel cycle safeguards, special nuclear material "held up" in pipes, ducts, and glove boxes causes significant uncertainty in material-unaccounted-for estimates. Quantitative imaging is a proposed non-destructive assay technique with potential to estimate the holdup mass more accurately and reliably than current techniques. However, uncertainty analysis for quantitative imaging remains a significant challenge. In this work we demonstrate an analysis approach for data acquired with a fast-neutron coded aperture imager. The work includes a calibrated forward model of the imager. Cross-validation indicates that the forward model predicts the imager data typically within 23%; further improvements are forthcoming. A new algorithm based on the chi-squared goodness-of-fit metric then uses the forward model to calculate a holdup confidence interval. The new algorithm removes geometry approximations that previous methods require, making it a more reliable uncertainty estimator.
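    The confidence-interval construction can be sketched as a one-parameter scan (hypothetical Python; forward_model and all numbers are stand-ins for the calibrated imager model):

        # Sketch: chi-squared interval for a holdup mass from imager counts.
        import numpy as np

        rng = np.random.default_rng(6)

        def forward_model(mass, n_pix=64):
            # Stand-in for the imager forward model (expected counts per pixel).
            response = np.linspace(0.5, 2.0, n_pix)
            return 50.0 + mass * response

        data = rng.poisson(forward_model(12.0))    # synthetic measurement

        masses = np.linspace(0.0, 30.0, 601)
        chi2 = np.array([(((data - forward_model(m)) ** 2)
                          / forward_model(m)).sum() for m in masses])

        # 95% interval for one fitted parameter: chi2 <= chi2_min + 3.84.
        ok = chi2 <= chi2.min() + 3.84
        print(f"mass interval: [{masses[ok].min():.1f}, {masses[ok].max():.1f}]")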

  20. Reducing the uncertainty in subtropical cloud feedback

    NASA Astrophysics Data System (ADS)

    Myers, Timothy A.; Norris, Joel R.

    2016-03-01

    Large uncertainty remains on how subtropical clouds will respond to anthropogenic climate change, and therefore whether they will act as a positive feedback that amplifies global warming or a negative feedback that dampens global warming by altering Earth's energy budget. Here we reduce this uncertainty using an observationally constrained formulation of the response of subtropical clouds to greenhouse forcing. The observed interannual sensitivity of cloud solar reflection to varying meteorological conditions suggests that increasing sea surface temperature and atmospheric stability in the future climate will have largely canceling effects on subtropical cloudiness, overall leading to a weak positive shortwave cloud feedback (0.4 ± 0.9 W m⁻² K⁻¹). The uncertainty of this observationally based approximation of the cloud feedback is narrower than the intermodel spread of the feedback produced by climate models. Subtropical cloud changes will therefore complement positive cloud feedbacks identified by previous work, suggesting that future global cloud changes will amplify global warming.

  1. Uncertainty in NIST Force Measurements

    PubMed Central

    Bartel, Tom

    2005-01-01

    This paper focuses upon the uncertainty of force calibration measurements at the National Institute of Standards and Technology (NIST). The uncertainty of the realization of force for the national deadweight force standards at NIST is discussed, as well as the uncertainties associated with NIST’s voltage-ratio measuring instruments and with the characteristics of transducers being calibrated. The combined uncertainty is related to the uncertainty of dissemination for force transfer standards sent to NIST for calibration. PMID:27308181

  2. Uncertainty of measurement and clinical value of semen analysis: has standardisation through professional guidelines helped or hindered progress?

    PubMed

    Tomlinson, M J

    2016-09-01

    This article suggests that diagnostic semen analysis has no more clinical value today than it had 25-30 years ago, and that both the confusion surrounding its evidence base (in terms of its relationship with conception) and the low level of confidence in the clinical setting are attributable to an associated high level of 'uncertainty'. Consideration of the concept of measurement uncertainty is mandatory for medical laboratories applying for the ISO15189 standard. It is evident that the entire semen analysis process is prone to error at every step, from specimen collection to the reporting of results, which serves to compound the uncertainty associated with diagnosis or prognosis. Perceived adherence to published guidelines for the assessment of sperm concentration, motility and morphology does not guarantee a reliable and reproducible test result. Moreover, the high level of uncertainty associated with manual assessment of sperm motility and morphology can be attributed to subjectivity and the lack of a traceable standard. This article describes where and why uncertainty exists and suggests that semen analysis will continue to be of limited value until it is more adequately considered and addressed. Although professional guidelines for good practice have provided the foundations for testing procedures for many years, the risk in following rather prescriptive guidance to the letter is that, unless they are based on an overwhelmingly firm evidence base, the quality of semen analysis will remain poor and progress towards the development of more innovative methods for investigating male infertility will be slow. PMID:27529487

  3. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  4. Coping with Uncertainty.

    ERIC Educational Resources Information Center

    Wargo, John

    1985-01-01

    Draws conclusions on the scientific uncertainty surrounding most chemical use regulatory decisions, examining the evolution of law and science, benefit analysis, and improving information. Suggests: (1) rapid development of knowledge of chemical risks and (2) a regulatory system which is flexible to new scientific knowledge. (DH)

  5. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, due to uncertainties in dating, calibration, and modeling.

  6. Uncertainty and nonseparability

    NASA Astrophysics Data System (ADS)

    de La Torre, A. C.; Catuogno, P.; Ferrando, S.

    1989-06-01

    A quantum covariance function is introduced whose real and imaginary parts are related to the independent contributions to the uncertainty principle: noncommutativity of the operators and nonseparability. It is shown that factorizability of states is a sufficient but not necessary condition for separability. It is suggested that all quantum effects could be considered to be a consequence of nonseparability alone.

  7. Asymptotic entropic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Adamczak, Radosław; Latała, Rafał; Puchała, Zbigniew; Życzkowski, Karol

    2016-03-01

    We analyze entropic uncertainty relations for two orthogonal measurements on a N-dimensional Hilbert space, performed in two generic bases. It is assumed that the unitary matrix U relating both bases is distributed according to the Haar measure on the unitary group. We provide lower bounds on the average Shannon entropy of probability distributions related to both measurements. The bounds are stronger than those obtained with use of the entropic uncertainty relation by Maassen and Uffink, and they are optimal up to additive constants. We also analyze the case of a large number of measurements and obtain strong entropic uncertainty relations, which hold with high probability with respect to the random choice of bases. The lower bounds we obtain are optimal up to additive constants and allow us to prove a conjecture by Wehner and Winter on the asymptotic behavior of constants in entropic uncertainty relations as the dimension tends to infinity. As a tool we develop estimates on the maximum operator norm of a submatrix of a fixed size of a random unitary matrix distributed according to the Haar measure, which are of independent interest.
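    The benchmark being strengthened is the Maassen-Uffink bound; for measurements in the bases {|a_i>} and {|b_j>} related by the unitary U it reads:

        \begin{equation*}
          H(p) + H(q) \;\ge\; -2 \ln c, \qquad
          c \;=\; \max_{i,j}\,\bigl|\langle a_i | b_j \rangle\bigr|
            \;=\; \max_{i,j} |U_{ij}|,
        \end{equation*}
        % where H is the Shannon entropy of the outcome distributions p and q.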

  8. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  9. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy. PMID:10174798

  10. Science, Uncertainty, and Adaptive Management in Large River Restoration Programs: Trinity River example

    NASA Astrophysics Data System (ADS)

    McBain, S.

    2002-12-01

    Following construction of Trinity and Lewiston dams on the upper Trinity River in 1964, dam-induced changes to streamflows and sediment regime had severely simplified channel morphology and aquatic habitat downstream of the dams. This habitat change, combined with blocked access to over 100 miles of salmon and steelhead habitat upstream of the dams, caused salmon and steelhead populations to plummet quickly. An instream flow study was initiated in 1984 to address the flow needs to restore the fishery, and this study relied on the Physical Habitat Simulation (PHABSIM) Model to quantify instream flow needs. In 1992, geomorphic and riparian studies were integrated into the instream flow study, with the overall study completed in 1999 (USFWS 1999). This 13-year process continued through three presidential administrations, several agency managers, and many turnovers of the agency technical staff responsible for conducting the study. The process culminated in 1996-1998, when a group of scientists was convened to integrate all the studies and data to produce the final instream flow study document. This 13-year, non-linear process resulted in many uncertainties that could not be resolved in the short amount of time allowed for completing the instream flow study document. Shortly after completion of the instream flow study document, the Secretary of Interior issued a Record of Decision to implement its recommendations. The uncertainties encountered as the instream flow study report was prepared were highlighted in the report, and the Record of Decision initiated an Adaptive Environmental Assessment and Management program to address these existing uncertainties and improve future river management. Many lessons have been learned going through this process, and the presentation will summarize: 1) the progression of science used to develop the instream flow study report; 2) how the scientists preparing the report addressed

  11. Computations of uncertainty mediate acute stress responses in humans

    PubMed Central

    de Berker, Archy O.; Rutledge, Robb B.; Mathys, Christoph; Marshall, Louise; Cross, Gemma F.; Dolan, Raymond J.; Bestmann, Sven

    2016-01-01

    The effects of stress are frequently studied, yet its proximal causes remain unclear. Here we demonstrate that subjective estimates of uncertainty predict the dynamics of subjective and physiological stress responses. Subjects learned a probabilistic mapping between visual stimuli and electric shocks. Salivary cortisol confirmed that our stressor elicited changes in endocrine activity. Using a hierarchical Bayesian learning model, we quantified the relationship between the different forms of subjective task uncertainty and acute stress responses. Subjective stress, pupil diameter and skin conductance all tracked the evolution of irreducible uncertainty. We observed a coupling between emotional and somatic state, with subjective and physiological tuning to uncertainty tightly correlated. Furthermore, the uncertainty tuning of subjective and physiological stress predicted individual task performance, consistent with an adaptive role for stress in learning under uncertain threat. Our finding that stress responses are tuned to environmental uncertainty provides new insight into their generation and likely adaptive function. PMID:27020312

  12. Addressing spatial scales and new mechanisms in climate impact ecosystem modeling

    NASA Astrophysics Data System (ADS)

    Poulter, B.; Joetzjer, E.; Renwick, K.; Ogunkoya, G.; Emmett, K.

    2015-12-01

    Climate change impacts on vegetation distributions are typically addressed using either an empirical approach, such as a species distribution model (SDM), or with process-based methods, for example, dynamic global vegetation models (DGVMs). Each approach has its own benefits and disadvantages. For example, an SDM is constrained by data and few parameters, but does not include adaptation or acclimation processes or other ecosystem feedbacks that may act to mitigate or enhance climate effects. Alternatively, a DGVM includes many mechanisms relating plant growth and disturbance to climate, but simulations are costly to perform at high spatial resolution, and large uncertainty remains in a variety of fundamental physical processes. To address these issues, we present two DGVM-based case studies in which i) high-resolution (1 km) simulations are being performed for vegetation in the Greater Yellowstone Ecosystem using a biogeochemical forest gap model, LPJ-GUESS, and ii) new mechanisms for simulating tropical tree mortality are being introduced. High-resolution DGVM simulations require not only additional computing and code reorganization but also consideration of how scaling affects vegetation dynamics, stochasticity, disturbance, and migration. New mechanisms for simulating forest mortality must consider hydraulic limitations and carbon reserves, their interactions in source-sink dynamics, and their role in controlling water potentials. Improving DGVM approaches by addressing spatial-scale challenges and integrating new approaches for estimating forest mortality will provide new insights more relevant to land management, and may reduce uncertainty by representing physical processes in ways more directly comparable to experimental and observational evidence.

  13. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect

    Smith, F.; Phifer, M.

    2011-06-30

    sand and clay), (b) Dose Parameters (34 parameters), (c) Material Properties (20 parameters), (d) Surface Water Flows (6 parameters), and (e) Vadose and Aquifer Flow (4 parameters). Results provided an assessment of which group of parameters is most significant in the dose uncertainty. It was found that K_d and the vadose/aquifer flow parameters, both of which impact transport timing, had the greatest impact on dose uncertainty. Dose parameters had an intermediate level of impact while material properties and surface water flows had little impact on dose uncertainty. Results of the importance analysis are discussed further in Section 7 of this report. The objectives of this work were to address comments received during the CA review on the uncertainty analysis and to demonstrate an improved methodology for CA uncertainty calculations as part of CA maintenance. This report partially addresses the LFRG Review Team issue of producing an enhanced CA sensitivity and uncertainty analysis. This is described in Table 1-1 which provides specific responses to pertinent CA maintenance items extracted from Section 11 of the SRS CA (2009). As noted above, the original uncertainty analysis looked at each POA separately and only included the effects from at most five sources giving the highest peak doses at each POA. Only 17 of the 152 CA sources were used in the original uncertainty analysis and the simulation time was reduced from 10,000 to 2,000 years. A major constraint on the original uncertainty analysis was the limitation of only being able to use at most four distributed processes. This work expanded the analysis to 10,000 years using 39 of the CA sources, included cumulative dose effects at downstream POAs, with more realizations (1,000) and finer time steps. This was accomplished by using the GoldSim DP-Plus module and the 36 processors available on a new Windows cluster. The last part of the work looked at the contribution to overall uncertainty from the main
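
    The grouped importance analysis described above can be illustrated with a toy Monte Carlo. The parameter-group names echo the report's categories, but the distributions and the closed-form dose function below are invented stand-ins, not the SRS GoldSim model.

      import numpy as np

      rng = np.random.default_rng(1)
      n = 1000                                    # realizations, as in the enhanced analysis
      kd      = rng.lognormal(0.0, 1.0, n)        # sorption coefficient group (toy)
      flow    = rng.lognormal(0.0, 0.8, n)        # vadose/aquifer flow group (toy)
      dosepar = rng.normal(1.0, 0.1, n)           # dose-parameter group (toy)

      dose = dosepar * np.exp(-0.5 * kd) / flow   # hypothetical dose response

      for name, x in [("K_d", kd), ("flow", flow), ("dose params", dosepar)]:
          r = np.corrcoef(x, dose)[0, 1]
          print(f"{name:12s} explains ~{r**2:.0%} of dose variance")

    Pearson R^2 is used only for brevity; rank-based measures would suit skewed inputs better, and the actual analysis ranks groups through the full transport model rather than a closed-form dose function.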

  14. Temporal uncertainty of geographical information

    NASA Astrophysics Data System (ADS)

    Shu, Hong; Qi, Cuihong

    2005-10-01

    Temporal uncertainty is a crossing point of temporal and error-aware geographical information systems. In Geoinformatics, temporal uncertainty is of the same importance as spatial and thematic uncertainty of geographical information. However, only recently have the standards organizations ISO/TC 211 and FGDC recognized temporal uncertainty as one of the geospatial data quality elements. Over the past decades, temporal uncertainty of geographical information has been modeled insufficiently. To lay a foundation for logically or physically modeling temporal uncertainty, this paper aims to clarify the semantics of temporal uncertainty. The general notion of uncertainty is conceptualized with a taxonomy of uncertainty. Semantically, temporal uncertainty is progressively classified into uncertainty of time coordinates, changes, and dynamics. Uncertainty of multidimensional time (valid time, database time, and conceptual time, etc.) is emphasized. It is observed that time-scale (granularity) transitions may introduce temporal uncertainty because transition details are lost. It is dialectically concluded that temporal uncertainty is caused by the complexity of the human-machine-earth system.

  15. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  16. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  17. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  18. 43 CFR 4730.2 - Disposal of remains.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... HORSES AND BURROS Destruction of Wild Horses or Burros and Disposal of Remains § 4730.2 Disposal of remains. Remains of wild horses or burros that die after capture shall be disposed of in accordance...

  19. On the Directional Dependence and Null Space Freedom in Uncertainty Bound Identification

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    1997-01-01

    In previous work, the determination of uncertainty models via minimum norm model validation is based on a single set of input and output measurement data. Since uncertainty bounds at each frequency are directionally dependent for multivariable systems, this leads to optimistic uncertainty levels. In addition, the design freedom in the uncertainty model has not been utilized to further reduce uncertainty levels. The above issues are addressed by formulating a min-max problem. An analytical solution to the min-max problem is given to within a generalized eigenvalue problem, thus avoiding a direct numerical approach. This result leads to less conservative and more realistic uncertainty models for use in robust control.
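
    A schematic of the end point of the analysis, under the assumption that the min-max problem has already been reduced to a symmetric-definite generalized eigenvalue problem A v = lambda B v: the largest generalized eigenvalue then gives the worst-case uncertainty level. The matrices below are random stand-ins, not the paper's validation matrices.

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(2)
      M = rng.standard_normal((4, 4))
      A = M @ M.T                     # symmetric PSD stand-in for the misfit matrix
      N = rng.standard_normal((4, 4))
      B = N @ N.T + 4 * np.eye(4)     # symmetric positive definite weighting matrix

      w, V = eigh(A, B)               # solves A v = w B v, eigenvalues ascending
      print("worst-case level:", w[-1])
      print("worst-case direction:", V[:, -1])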

  20. Coal's role in electrical power generation: Will it remain competitive?

    SciTech Connect

    Vogel, C.

    1999-07-01

    Coal is the most abundant fossil fuel worldwide. In the US, coal represents 95% of fossil energy reserves. US coal resources represent more energy than either proven oil or natural gas reserves and can be expected to last more than 250 years at current consumption rates. Coal-fired power plants currently produce 56% of electrical generation in the US and 36% worldwide, and forecasts show coal use increasing. Impressive statistics such as these, along with the direct correlation between growth in electricity use and GDP, should indicate that coal has a bright future. There are some clouds on the horizon, however, that could dim this seemingly rosy picture. Potentially the greatest challenge to coal's future is CO2 emission restrictions to address global climate change. Realistically, coal has to be part of the generation mix of developing nations, particularly those with abundant coal resources such as China and India. If electrification of these countries and corresponding economic growth are to take place, there are presently few cost-effective alternatives. This paper presents a discussion of what the coal industry is doing to remain competitive. It looks at environmental and competitive issues facing coal use.

  1. Addressing health literacy in patient decision aids

    PubMed Central

    2013-01-01

    Background Effective use of a patient decision aid (PtDA) can be affected by the user's health literacy and the PtDA's characteristics. Systematic reviews of the relevant literature can guide PtDA developers to attend to the health literacy needs of patients. The reviews reported here aimed to assess: 1. a) the effects of health literacy / numeracy on selected decision-making outcomes, and b) the effects of interventions designed to mitigate the influence of lower health literacy on decision-making outcomes, and 2. the extent to which existing PtDAs a) account for health literacy, and b) are tested in lower health literacy populations. Methods We reviewed the literature for evidence relevant to these two aims. When high-quality systematic reviews existed, we summarized their evidence. When reviews were unavailable, we conducted our own systematic reviews. Results Aim 1: In an existing systematic review of PtDA trials, lower health literacy was associated with lower patient health knowledge (14 of 16 eligible studies). Fourteen studies reported practical design strategies to improve knowledge for lower health literacy patients. In our own systematic review, no studies reported on values clarity per se, but in 2 studies lower health literacy was related to higher decisional uncertainty and regret. Lower health literacy was associated with less desire for involvement in 3 studies, less question-asking in 2, and less patient-centered communication in 4 studies; its effects on other measures of patient involvement were mixed. Only one study assessed the effects of a health literacy intervention on outcomes; it showed that using video to improve the salience of health states reduced decisional uncertainty. Aim 2: In our review of 97 trials, only 3 PtDAs overtly addressed the needs of lower health literacy users. In 90% of trials, user health literacy and readability of the PtDA were not reported. However, increases in knowledge and informed choice were reported in those studies

  2. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2°C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  3. Uncertainty quantified trait predictions

    NASA Astrophysics Data System (ADS)

    Fazayeli, Farideh; Kattge, Jens; Banerjee, Arindam; Schrodt, Franziska; Reich, Peter

    2015-04-01

    Functional traits of organisms are key to understanding and predicting biodiversity and ecological change, which motivates continuous collection of traits and their integration into global databases. Such composite trait matrices are inherently sparse, severely limiting their usefulness for further analyses. On the other hand, traits are characterized by the phylogenetic trait signal, trait-trait correlations and environmental constraints, all of which provide information that could be used to statistically fill gaps. We propose the application of probabilistic models which, for the first time, utilize all three characteristics to fill gaps in trait databases and predict trait values at larger spatial scales. For this purpose we introduce BHPMF, a hierarchical Bayesian extension of Probabilistic Matrix Factorization (PMF). PMF is a machine learning technique which exploits the correlation structure of sparse matrices to impute missing entries. BHPMF additionally utilizes the taxonomic hierarchy for trait prediction. Implemented in the context of a Gibbs sampler MCMC approach, BHPMF provides uncertainty estimates for each trait prediction. We present comprehensive experimental results on the problem of plant trait prediction using the largest database of plant traits, where BHPMF shows strong empirical performance in uncertainty quantified trait prediction, outperforming the state of the art based on point estimates. Further, we show that BHPMF is more accurate when it is confident, whereas the error is high when the uncertainty is high.
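
    A stripped-down sketch of the PMF idea that BHPMF builds on, applied to a toy trait matrix: ridge-regularized alternating least squares factors the observed entries, and the reconstruction fills the gaps. The Bayesian hierarchy and Gibbs sampling that give BHPMF its per-prediction uncertainties are omitted here.

      import numpy as np

      rng = np.random.default_rng(3)
      X = rng.standard_normal((30, 6)) @ rng.standard_normal((6, 5))  # toy trait matrix
      mask = rng.random(X.shape) < 0.6            # True where a trait was measured
      k, lam = 3, 0.1                             # latent rank and ridge penalty
      U = rng.standard_normal((X.shape[0], k))    # species factors
      V = rng.standard_normal((X.shape[1], k))    # trait factors

      for _ in range(50):
          for i in range(X.shape[0]):             # update species factors
              m = mask[i]
              U[i] = np.linalg.solve(V[m].T @ V[m] + lam * np.eye(k), V[m].T @ X[i, m])
          for j in range(X.shape[1]):             # update trait factors
              m = mask[:, j]
              V[j] = np.linalg.solve(U[m].T @ U[m] + lam * np.eye(k), U[m].T @ X[m, j])

      X_hat = U @ V.T                             # gap-filled matrix
      print(np.abs((X_hat - X)[~mask]).mean())    # error on the unobserved entries

    BHPMF additionally shrinks the latent factors toward their taxonomic parents and samples them by MCMC, which is what turns each imputed value into a distribution rather than a point estimate.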

  4. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
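
    The deterministic formulation the report starts from can be sketched in a few lines; the model, data, and noise level below are made up. A crude residual bootstrap is added to expose the parameter uncertainty that a single best fit hides, whereas CUU proper would treat model error and data error jointly, e.g., in a Bayesian posterior.

      import numpy as np
      from scipy.optimize import least_squares

      rng = np.random.default_rng(4)
      x = np.linspace(0, 1, 40)
      y_obs = 2.0 * np.exp(-1.5 * x) + rng.normal(0, 0.05, x.size)  # synthetic noisy data

      def residuals(theta, x, y):
          a, b = theta
          return y - a * np.exp(-b * x)          # data minus model

      best = least_squares(residuals, [1.0, 1.0], args=(x, y_obs)).x  # squared-error fit
      e = residuals(best, x, y_obs)              # residuals around the best fit
      fitted = y_obs - e

      # Residual bootstrap: refit on resampled synthetic data sets to expose
      # the parameter scatter hidden by the single deterministic calibration.
      boot = np.array([least_squares(residuals, best,
                                     args=(x, fitted + rng.choice(e, e.size))).x
                       for _ in range(200)])
      print("best fit:", best, "bootstrap std:", boot.std(axis=0))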

  5. Uncertainty and error in computational simulations

    SciTech Connect

    Oberkampf, W.L.; Diegert, K.V.; Alvin, K.F.; Rutherford, B.M.

    1997-10-01

    The present paper addresses the question: "What are the general classes of uncertainty and error sources in complex, computational simulations?" This is the first step of a two-step process to develop a general methodology for quantitatively estimating the global modeling and simulation uncertainty in computational modeling and simulation. The second step is to develop a general mathematical procedure for representing, combining and propagating all of the individual sources through the simulation. The authors develop a comprehensive view of the general phases of modeling and simulation. The phases proposed are: conceptual modeling of the physical system, mathematical modeling of the system, discretization of the mathematical model, computer programming of the discrete model, numerical solution of the model, and interpretation of the results. This new view is built upon combining phases recognized in the disciplines of operations research and numerical solution methods for partial differential equations. The characteristics and activities of each of these phases are discussed in general, but examples are given for the fields of computational fluid dynamics and heat transfer. They argue that a clear distinction should be made between uncertainty and error that can arise in each of these phases. The present definitions for uncertainty and error are inadequate and, therefore, they propose comprehensive definitions for these terms. Specific classes of uncertainty and error sources are then defined that can occur in each phase of modeling and simulation. The numerical sources of error considered apply regardless of whether the discretization procedure is based on finite elements, finite volumes, or finite differences. To better explain the broad types of sources of uncertainty and error, and the utility of their categorization, they discuss a coupled-physics example simulation.

  6. Multi-scenario modelling of uncertainty in stochastic chemical systems

    SciTech Connect

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-09-15

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small-scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems because they are stochastic in nature and carry a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two-gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems depends on both the uncertain distribution and the system under investigation. -- Highlights: •A method to model uncertainty in stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo.
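
    A toy version of the proposed approach on an isomerization system A <-> B: rate constants are drawn from their (here invented) uncertain distributions, each draw is propagated with a Gillespie stochastic simulation, and the uncertain solution is approximated by the averaged composite state.

      import numpy as np

      rng = np.random.default_rng(5)

      def ssa_isomerization(k_ab, k_ba, n_a=100, n_b=0, t_end=5.0):
          """Gillespie SSA for A <-> B; returns the copy number of A at t_end."""
          t = 0.0
          while True:
              rates = np.array([k_ab * n_a, k_ba * n_b])
              total = rates.sum()
              if total == 0:
                  return n_a
              t += rng.exponential(1.0 / total)
              if t > t_end:
                  return n_a
              if rng.random() < rates[0] / total:
                  n_a, n_b = n_a - 1, n_b + 1      # A -> B fires
              else:
                  n_a, n_b = n_a + 1, n_b - 1      # B -> A fires

      # Uncertain parameters (made up): k_ab ~ N(1.0, 0.2), k_ba ~ N(0.5, 0.1), truncated at 0.
      samples = [ssa_isomerization(max(rng.normal(1.0, 0.2), 1e-6),
                                   max(rng.normal(0.5, 0.1), 1e-6))
                 for _ in range(200)]
      print("composite mean of A at t_end:", np.mean(samples))
      print("spread from uncertainty plus intrinsic noise:", np.std(samples))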

  7. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  8. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    PubMed

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management. PMID:27450905

  9. Propagation of radar rainfall uncertainty in urban flood simulations

    NASA Astrophysics Data System (ADS)

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

    This work discusses the results of the implementation of a novel probabilistic system designed to improve ensemble sewer flow predictions for the drainage network of a small urban area in the North of England. The probabilistic system has been developed to model the uncertainty associated with radar rainfall estimates and propagate it through radar-based ensemble sewer flow predictions. The assessment of this system aims to outline the benefits of addressing the uncertainty associated with radar rainfall estimates in a probabilistic framework, to be potentially implemented in the real-time management of the sewer network in the study area. Radar rainfall estimates are affected by uncertainty due to various factors [1-3] and quality control and correction techniques have been developed in order to improve their accuracy. However, the hydrological use of radar rainfall estimates and forecasts remains challenging. A significant effort has been devoted by the international research community to the assessment of the uncertainty propagation through probabilistic hydro-meteorological forecast systems [4-5], and various approaches have been implemented for the purpose of characterizing the uncertainty in radar rainfall estimates and forecasts [6-11]. A radar-based ensemble stochastic approach, similar to the one implemented for use in the Southern-Alps by the REAL system [6], has been developed for the purpose of this work. An ensemble generator has been calibrated on the basis of the spatial-temporal characteristics of the residual error in radar estimates assessed with reference to rainfall records from around 200 rain gauges available for the year 2007, previously post-processed and corrected by the UK Met Office [12-13]. Each ensemble member is determined by summing a perturbation field to the unperturbed radar rainfall field. The perturbations are generated by imposing the radar error spatial and temporal correlation structure on purely stochastic fields. A
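
    A simplified sketch of such an ensemble generator: spatial correlation is imposed on purely stochastic fields (here with a plain Gaussian filter rather than the error covariance structure calibrated against the rain gauge records), and each perturbation is applied to the unperturbed radar field. All magnitudes are invented.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(6)
      radar = rng.gamma(2.0, 1.5, size=(100, 100))      # stand-in radar rainfall field

      def perturbation(shape, corr_len_px=10, sigma_db=2.0):
          noise = rng.standard_normal(shape)
          field = gaussian_filter(noise, corr_len_px)   # impose spatial correlation
          field *= sigma_db / field.std()               # rescale to error magnitude (dB)
          return field

      ensemble = [radar * 10 ** (perturbation(radar.shape) / 10)  # multiplicative error
                  for _ in range(20)]
      print("ensemble spread at one pixel:", np.std([m[50, 50] for m in ensemble]))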

  10. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for phase uncertainty, whose magnitude is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
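
    One plausible way, not necessarily the paper's exact method, to disperse data within asymmetric bounds is a two-piece (split) normal that uses different scales above and below the nominal value; all coefficients below are invented.

      import numpy as np

      rng = np.random.default_rng(7)

      def split_normal(nominal, sigma_lo, sigma_hi, size):
          """Two-piece normal draw: mass below the nominal is sigma_lo/(sigma_lo+sigma_hi)."""
          z = np.abs(rng.standard_normal(size))
          below = rng.random(size) < sigma_lo / (sigma_lo + sigma_hi)
          return np.where(below, nominal - sigma_lo * z, nominal + sigma_hi * z)

      # Hypothetical coefficient with wider lower than upper 1-sigma bound (-0.008/+0.002):
      samples = split_normal(nominal=-0.015, sigma_lo=0.008, sigma_hi=0.002, size=10000)
      print(np.percentile(samples, [2.5, 50, 97.5]))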

  11. New Programming Environments for Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E. P.; Banta, E. R.; Christensen, S.; Cooley, R. L.; Ely, D. M.; Babendreier, J.; Leavesley, G.; Tonkin, M.; Julich, R.

    2005-12-01

    We live in a world of faster computers, better GUI's and visualization technology, increasing international cooperation made possible by new digital infrastructure, new agreements between US federal agencies (such as ISCMEM), new European Union programs (such as Harmoniqua), and greater collaboration between US university scientists through CUAHSI. These changes provide new resources for tackling the difficult job of quantifying how well our models perform. This talk introduces new programming environments that take advantage of these new developments and will change the paradigm of how we develop methods for uncertainty evaluation. For example, the programming environments provided by COSU API, JUPITER API, and Sensitivity/Optimization Toolbox provide enormous opportunities for faster and more meaningful evaluation of uncertainties. Instead of waiting years for ideas and theories to be compared in the complex circumstances of interest to resource managers, these new programming environments will expedite the process. In the new paradigm, unproductive ideas and theories will be revealed more quickly, productive ideas and theories will more quickly be used to address our increasingly difficult water resources problems. As examples, two ideas in JUPITER API applications are presented: uncertainty correction factors that account for system complexities not represented in models, and PPR and OPR statistics used to identify new data needed to reduce prediction uncertainty.

  12. Benefits of dealing with uncertainty in greenhouse gas inventories: introduction

    SciTech Connect

    Jonas, Matthias; Winiwarter, Wilfried; Marland, Gregg; White, Thomas; Nahorski, Zbigniew; Bun, Rostyslav

    2010-01-01

    The assessment of greenhouse gases emitted to and removed from the atmosphere is high on the international political and scientific agendas. Growing international concern and cooperation regarding the climate change problem have increased the need for policy-oriented solutions to the issue of uncertainty in, and related to, inventories of greenhouse gas (GHG) emissions. The approaches to addressing uncertainty discussed in this Special Issue reflect attempts to improve national inventories, not only for their own sake but also from a wider, systems analytical perspective-a perspective that seeks to strengthen the usefulness of national inventories under a compliance and/or global monitoring and reporting framework. These approaches demonstrate the benefits of including inventory uncertainty in policy analyses. The authors of the contributed papers show that considering uncertainty helps avoid situations that can, for example, create a false sense of certainty or lead to invalid views of subsystems. This may eventually prevent related errors from showing up in analyses. However, considering uncertainty does not come for free. Proper treatment of uncertainty is costly and demanding because it forces us to make the step from 'simple to complex' and only then to discuss potential simplifications. Finally, comprehensive treatment of uncertainty does not offer policymakers quick and easy solutions. The authors of the papers in this Special Issue do, however, agree that uncertainty analysis must be a key component of national GHG inventory analysis. Uncertainty analysis helps to provide a greater understanding and better science helps us to reduce and deal with uncertainty. By recognizing the importance of identifying and quantifying uncertainties, great strides can be made in ongoing discussions regarding GHG inventories and accounting for climate change. The 17 papers in this Special Issue deal with many aspects of analyzing and dealing with uncertainty in emissions

  13. Using Models that Incorporate Uncertainty

    ERIC Educational Resources Information Center

    Caulkins, Jonathan P.

    2002-01-01

    In this article, the author discusses the use in policy analysis of models that incorporate uncertainty. He believes that all models should consider incorporating uncertainty, but that at the same time it is important to understand that sampling variability is not usually the dominant driver of uncertainty in policy analyses. He also argues that…

  14. Sensitivity of Flow Uncertainty to Radar Rainfall Uncertainty in the Context of Operational Distributed Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Carpenter, T. M.; Georgakakos, K. P.; Georgakakos, K. P.

    2001-12-01

    The current study focuses on the sensitivity of distributed model flow forecast uncertainty to the uncertainty in the radar rainfall input. Various studies estimate a 30 to 100% uncertainty in radar rainfall estimates from the operational NEXRAD radars. This study addresses the following questions: How does this uncertainty in rainfall input impact the flow simulations produced by a hydrologic model? How does this effect compare to the uncertainty in flow forecasts resulting from initial condition and model parametric uncertainty? The hydrologic model used, HRCDHM, is a catchment-based, distributed hydrologic model and accepts hourly precipitation input from the operational WSR-88D weather radar. A GIS is used to process digital terrain data, delineate sub-catchments of a given large watershed, and supply sub-catchment characteristics (subbasin area, stream length, stream slope and channel-network topology) to the hydrologic model components. HRCDHM uses an adaptation of the U.S. NWS operational Sacramento soil moisture accounting model to produce runoff for each sub-catchment within the larger study watershed. Kinematic or Muskingum-Cunge channel routing is implemented to combine and route sub-catchment flows through the channel network. Available spatial soils information is used to vary hydrologic model parameters from sub-catchment to sub-catchment. HRCDHM was applied to the 2,500 km2 Illinois River watershed in Arkansas and Oklahoma with outlet at Tahlequah, Oklahoma. The watershed is under the coverage of the operational WSR-88D radar at Tulsa, Oklahoma. For distributed modeling, the watershed area has been subdivided into sub-catchments with an average area of 80 km2. Flow simulations are validated at various gauged locations within the watershed. A Monte Carlo framework was used to assess the sensitivity of the simulated flows to uncertainty in radar input for different radar error distributions (uniform or exponential), and to make comparisons to the flow
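
    A toy analogue of the sensitivity experiment, using a single linear reservoir in place of HRCDHM and the Sacramento model: rainfall is perturbed with multiplicative errors drawn from two different distributions and the resulting spread in simulated flow is compared. Series, recession constant, and error magnitudes are made up.

      import numpy as np

      rng = np.random.default_rng(8)
      rain = rng.gamma(0.4, 5.0, size=240)           # hourly rainfall stand-in, mm

      def reservoir_flow(p, k=0.95):
          q, s = np.empty_like(p), 0.0
          for i, pi in enumerate(p):
              s = k * s + pi                         # storage with recession constant k
              q[i] = (1 - k) * s                     # outflow
          return q

      def flow_spread(error_sampler, n=500):
          flows = np.array([reservoir_flow(rain * error_sampler()) for _ in range(n)])
          return flows.std(axis=0).mean()            # average ensemble spread of flow

      uniform_err = lambda: rng.uniform(0.7, 1.3, rain.size)     # ~30% uniform error
      exponential_err = lambda: rng.exponential(1.0, rain.size)  # heavy-tailed error
      print("uniform:", flow_spread(uniform_err),
            "exponential:", flow_spread(exponential_err))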

  15. Future Remains: Industrial Heritage at the Hanford Plutonium Works

    NASA Astrophysics Data System (ADS)

    Freer, Brian

    This dissertation argues that U.S. environmental and historic preservation regulations, industrial heritage projects, history, and art only provide partial frameworks for successfully transmitting an informed story into the long range future about nuclear technology and its related environmental legacy. This argument is important because plutonium from nuclear weapons production is toxic to humans in very small amounts, threatens environmental health, has a half-life of 24, 110 years and because the industrial heritage project at Hanford is the first time an entire U.S. Department of Energy weapons production site has been designated a U.S. Historic District. This research is situated within anthropological interest in industrial heritage studies, environmental anthropology, applied visual anthropology, as well as wider discourses on nuclear studies. However, none of these disciplines is really designed or intended to be a completely satisfactory frame of reference for addressing this perplexing challenge of documenting and conveying an informed story about nuclear technology and its related environmental legacy into the long range future. Others have thought about this question and have made important contributions toward a potential solution. Examples here include: future generations movements concerning intergenerational equity as evidenced in scholarship, law, and amongst Native American groups; Nez Perce and Confederated Tribes of the Umatilla Indian Reservation responses to the Hanford End State Vision and Hanford's Canyon Disposition Initiative; as well as the findings of organizational scholars on the advantages realized by organizations that have a long term future perspective. While these ideas inform the main line inquiry of this dissertation, the principal approach put forth by the researcher of how to convey an informed story about nuclear technology and waste into the long range future is implementation of the proposed Future Remains clause, as

  16. Addressing spectroscopic quality of covariant density functional theory

    NASA Astrophysics Data System (ADS)

    Afanasjev, A. V.

    2015-03-01

    The spectroscopic quality of covariant density functional theory has been accessed by analyzing the accuracy and theoretical uncertainties in the description of spectroscopic observables. Such analysis is first presented for the energies of the single-particle states in spherical and deformed nuclei. It is also shown that the inclusion of particle-vibration coupling improves the description of the energies of predominantly single-particle states in medium and heavy-mass spherical nuclei. However, the remaining differences between theory and experiment clearly indicate missing physics and missing terms in covariant energy density functionals. The uncertainties in the predictions of the position of two-neutron drip line sensitively depend on the uncertainties in the prediction of the energies of the single-particle states. On the other hand, many spectroscopic observables in well deformed nuclei at ground state and finite spin only weakly depend on the choice of covariant energy density functional.

  17. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    SciTech Connect

    Díez, C.J.; Cabellos, O.; Martínez, J.S.

    2015-01-15

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One approach proposed was the Hybrid Method, where uncertainties in nuclear data are propagated only on the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (Sodium fast reactor) to assess uncertainties on the isotopic composition. However, a comparison with using multi-group energy structures was not carried out, and has to be performed in order to analyse the limitations of using one-group uncertainties.
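
    The one-group collapse the Hybrid Method relies on, together with the matching uncertainty collapse, can be sketched as follows; the fluxes, cross sections, and diagonal relative covariance below are illustrative numbers, not evaluated nuclear data.

      import numpy as np

      phi   = np.array([1.0, 3.0, 5.0, 2.0])            # group fluxes (arbitrary units)
      sigma = np.array([10.0, 4.0, 1.5, 0.8])           # group cross sections, barns
      rel_cov = np.diag([0.05, 0.04, 0.03, 0.08]) ** 2  # relative covariance (diagonal toy)

      w = phi / phi.sum()                               # flux-weighted collapse weights
      sigma_1g = w @ sigma                              # one-group cross section
      cov = rel_cov * np.outer(sigma, sigma)            # absolute covariance
      var_1g = w @ cov @ w                              # sandwich rule for the collapse
      print(f"sigma_1g = {sigma_1g:.3f} b +/- {np.sqrt(var_1g):.3f} b")

    Off-diagonal covariance terms, which real evaluations carry, enter through the same sandwich expression; dropping them is part of what a one-group versus multi-group comparison would probe.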

  18. Nuclear Data Uncertainty Propagation in Depletion Calculations Using Cross Section Uncertainties in One-group or Multi-group

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Cabellos, O.; Martínez, J. S.

    2015-01-01

    Several approaches have been developed in recent decades to tackle nuclear data uncertainty propagation problems in burn-up calculations. One approach proposed was the Hybrid Method, where uncertainties in nuclear data are propagated only on the depletion part of a burn-up problem. Because only depletion is addressed, only one-group cross sections are necessary, and hence, their collapsed one-group uncertainties. This approach has been applied successfully in several advanced reactor systems like EFIT (ADS-like reactor) or ESFR (Sodium fast reactor) to assess uncertainties on the isotopic composition. However, a comparison with using multi-group energy structures was not carried out, and has to be performed in order to analyse the limitations of using one-group uncertainties.

  19. Addressing problems of employee performance.

    PubMed

    McConnell, Charles R

    2011-01-01

    Employee performance problems are essentially of 2 kinds: those that are motivational in origin and those resulting from skill deficiencies. Both kinds of problems are the province of the department manager. Performance problems differ from problems of conduct in that traditional disciplinary processes ordinarily do not apply. Rather, performance problems are addressed through educational and remedial processes. The manager has a basic responsibility in ensuring that everything reasonable is done to help each employee succeed. There are a number of steps the manager can take to address employee performance problems. PMID:21537142

  20. Methods for handling uncertainty within pharmaceutical funding decisions

    NASA Astrophysics Data System (ADS)

    Stevenson, Matt; Tappenden, Paul; Squires, Hazel

    2014-01-01

    This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area is ongoing at the time of this review.

  1. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful summary statistic for multimodal distributions. The uncertainty of the multi
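
    A minimal version of the PDF-wall computation on synthetic multi-valued data: a smooth density estimate per grid cell along a transect, with roughness reported as the number of peaks. The field, grid size, and mixing of unimodal and bimodal cells are all made up.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(9)
      n_cells, n_members = 20, 200
      # Synthetic multi-valued field along one transect: some cells unimodal, some bimodal.
      data = np.where(rng.random((n_cells, 1)) < 0.3,
                      np.concatenate([rng.normal(-2, 0.5, (n_cells, n_members // 2)),
                                      rng.normal(2, 0.5, (n_cells, n_members // 2))], axis=1),
                      rng.normal(0, 1.0, (n_cells, n_members)))

      grid = np.linspace(-5, 5, 200)
      for i, cell in enumerate(data):
          pdf = gaussian_kde(cell)(grid)        # smooth density estimate for this cell
          peaks = np.sum((pdf[1:-1] > pdf[:-2]) & (pdf[1:-1] > pdf[2:]))
          print(f"cell {i:2d}: roughness (peak count) = {peaks}")

    Stacking the pdf arrays side by side and rendering them vertically above the transect would give the wall itself; the loop above only extracts the roughness summary.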

  2. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1972-01-01

    Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite and from the longer-range viewpoint afforded by the Geopause concept. Attention is focused on methods for short-arc tracking which are essentially geometric in nature. One method uses combinations of lasers and collocated cameras. The other relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

  3. Uncertainty and complexity in personal health records.

    PubMed

    Hudson, Donna L; Cohen, Maurice E

    2010-01-01

    New technologies in medicine have led to an explosion in the number of parameters that must be considered when diagnosing and treating a patient. Because of this high volume of data it is not possible for the human decision maker to take all information into account in arriving at a decision. Automated methods are needed to effectively evaluate electronic information in many formats and provide summaries to the medical professional. The task is complicated by the complexity of the data and the potential uncertainty of some of the results. In this article complexity and uncertainty in medical data are discussed in terms of both representation and types of analysis. Methods that can address multiple complex data types are illustrated and examples are provided for specific medical problems. These methods are particularly important for automated trend analysis in the personal health record as small errors can be propagated through the complex system resulting in incorrect diagnosis and treatment. PMID:21095837

  4. Theoretical uncertainties in proton lifetime estimates

    NASA Astrophysics Data System (ADS)

    Kolešová, Helena; Malinský, Michal; Mede, Timon

    2016-06-01

    We recapitulate the primary sources of theoretical uncertainties in proton lifetime estimates in renormalizable, four-dimensional & non-supersymmetric grand unifications that represent the most conservative framework in which this question may be addressed at the perturbative level. We point out that many of these uncertainties are so severe and often even irreducible that there are only very few scenarios in which an NLO approach, as crucial as it is for a real testability of any specific model, is actually sensible. Among these, the most promising seems to be the minimal renormalizable SO(10) GUT whose high-energy gauge symmetry is spontaneously broken by the adjoint and the five-index antisymmetric irreducible representations.

  5. A review of the generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Nasser Tawfik, Abdel; Magied Diab, Abdel

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some ‘thought’ experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed.

  6. A review of the generalized uncertainty principle.

    PubMed

    Tawfik, Abdel Nasser; Diab, Abdel Magied

    2015-12-01

    Based on string theory, black hole physics, doubly special relativity and some 'thought' experiments, minimal distance and/or maximum momentum are proposed. As alternatives to the generalized uncertainty principle (GUP), the modified dispersion relation, the space noncommutativity, the Lorentz invariance violation, and the quantum-gravity-induced birefringence effects are summarized. The origin of minimal measurable quantities and the different GUP approaches are reviewed and the corresponding observations are analysed. Bounds on the GUP parameter are discussed and implemented in the understanding of recent PLANCK observations of cosmic inflation. The higher-order GUP approaches predict minimal length uncertainty with and without maximum momenta. Possible arguments against the GUP are discussed; for instance, the concern about its compatibility with the equivalence principles, the universality of gravitational redshift and the free fall and law of reciprocal action are addressed. PMID:26512022

  7. Models in animal collective decision-making: information uncertainty and conflicting preferences.

    PubMed

    Conradt, Larissa

    2012-04-01

    Collective decision-making plays a central part in the lives of many social animals. Two important factors that influence collective decision-making are information uncertainty and conflicting preferences. Here, I bring together, and briefly review, basic models relating to animal collective decision-making in situations with information uncertainty and in situations with conflicting preferences between group members. The intention is to give an overview about the different types of modelling approaches that have been employed and the questions that they address and raise. Despite the use of a wide range of different modelling techniques, results show a coherent picture, as follows. Relatively simple cognitive mechanisms can lead to effective information pooling. Groups often face a trade-off between decision accuracy and speed, but appropriate fine-tuning of behavioural parameters could achieve high accuracy while maintaining reasonable speed. The right balance of interdependence and independence between animals is crucial for maintaining group cohesion and achieving high decision accuracy. In conflict situations, a high degree of decision-sharing between individuals is predicted, as well as transient leadership and leadership according to needs and physiological status. Animals often face crucial trade-offs between maintaining group cohesion and influencing the decision outcome in their own favour. Despite the great progress that has been made, there remains one big gap in our knowledge: how do animals make collective decisions in situations when information uncertainty and conflict of interest operate simultaneously? PMID:23565335

  8. Models in animal collective decision-making: information uncertainty and conflicting preferences

    PubMed Central

    Conradt, Larissa

    2012-01-01

    Collective decision-making plays a central part in the lives of many social animals. Two important factors that influence collective decision-making are information uncertainty and conflicting preferences. Here, I bring together, and briefly review, basic models relating to animal collective decision-making in situations with information uncertainty and in situations with conflicting preferences between group members. The intention is to give an overview about the different types of modelling approaches that have been employed and the questions that they address and raise. Despite the use of a wide range of different modelling techniques, results show a coherent picture, as follows. Relatively simple cognitive mechanisms can lead to effective information pooling. Groups often face a trade-off between decision accuracy and speed, but appropriate fine-tuning of behavioural parameters could achieve high accuracy while maintaining reasonable speed. The right balance of interdependence and independence between animals is crucial for maintaining group cohesion and achieving high decision accuracy. In conflict situations, a high degree of decision-sharing between individuals is predicted, as well as transient leadership and leadership according to needs and physiological status. Animals often face crucial trade-offs between maintaining group cohesion and influencing the decision outcome in their own favour. Despite the great progress that has been made, there remains one big gap in our knowledge: how do animals make collective decisions in situations when information uncertainty and conflict of interest operate simultaneously? PMID:23565335

  9. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration, as sketched below. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
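
    Of the methods listed, block bootstrap time-series sampling is the most self-contained to sketch: contiguous blocks are resampled so that short-range dependence in the input series is preserved. The series and block length below are made up, not the model's data.

      import numpy as np

      rng = np.random.default_rng(10)
      inflow = np.cumsum(rng.normal(0, 1, 600)) + 100   # autocorrelated stand-in series
      block = 24                                        # block length in time steps

      def block_bootstrap(x, block_len):
          """Moving-block bootstrap: concatenate randomly chosen contiguous blocks."""
          n = x.size
          starts = rng.integers(0, n - block_len + 1, size=int(np.ceil(n / block_len)))
          out = np.concatenate([x[s:s + block_len] for s in starts])
          return out[:n]

      replicates = np.array([block_bootstrap(inflow, block) for _ in range(500)])
      print("bootstrap 90% band of the series mean:",
            np.percentile(replicates.mean(axis=1), [5, 95]))

    Each replicate series can then be fed through the coupled model to see how sensitive the assessed extraction rules are to the particular historical sequence observed.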

  10. Matching Alternative Addresses: a Semantic Web Approach

    NASA Astrophysics Data System (ADS)

    Ariannamazi, S.; Karimipour, F.; Hakimpour, F.

    2015-12-01

    Rapid development of crowd-sourcing, or volunteered geographic information (VGI), provides opportunities for authoritative bodies that deal with geospatial information. Heterogeneity of multiple data sources and inconsistency of data types are key characteristics of VGI datasets. The expansion of cities has resulted in a growing number of POIs in OpenStreetMap, a well-known VGI source, which causes the datasets to become outdated in short periods of time. Changes to the spatial and aspatial attributes of features, such as names and addresses, can cause confusion or ambiguity in processes that rely on a feature's literal information, such as addressing and geocoding. VGI sources will neither conform to specific vocabularies nor remain in a specific schema for long. As a result, the integration of VGI sources is crucial and inevitable in order to avoid duplication and the waste of resources. Information integration can be used to match features and qualify different annotation alternatives for disambiguation. This study enhances the search capabilities of geospatial tools with applications able to understand user terminology, in pursuit of an efficient way of finding desired results. The Semantic Web is a capable tool for developing technologies that deal with lexical and numerical calculations and estimations. There is a vast amount of literal-spatial data representing the capability of linguistic information in knowledge modeling, but these resources need to be harmonized based on Semantic Web standards. The process of making addresses homogeneous yields a helpful tool based on spatial data integration, lexical annotation matching, and disambiguation.

  11. New Evidence Links Stellar Remains to Oldest Recorded Supernova

    NASA Astrophysics Data System (ADS)

    2006-09-01

    Recent observations have uncovered evidence that helps to confirm the identification of the remains of one of the earliest stellar explosions recorded by humans. The new study shows that the supernova remnant RCW 86 is much younger than previously thought. As such, the formation of the remnant appears to coincide with a supernova observed by Chinese astronomers in 185 A.D. The study used data from NASA's Chandra X-ray Observatory and the European Space Agency's XMM-Newton Observatory. "There have been previous suggestions that RCW 86 is the remains of the supernova from 185 A.D.," said Jacco Vink of University of Utrecht, the Netherlands, and lead author of the study. "These new X-ray data greatly strengthen the case." When a massive star runs out of fuel, it collapses on itself, creating a supernova that can outshine an entire galaxy. The intense explosion hurls the outer layers of the star into space and produces powerful shock waves. The remains of the star and the material it encounters are heated to millions of degrees and can emit intense X-ray radiation for thousands of years. In their stellar forensic work, Vink and colleagues studied the debris in RCW 86 to estimate when its progenitor star originally exploded. They calculated how quickly the shocked, or energized, shell is moving in RCW 86, by studying one part of the remnant. They combined this expansion velocity with the size of the remnant and a basic understanding of how supernovas expand to estimate the age of RCW 86. "Our new calculations tell us the remnant is about 2,000 years old," said Aya Bamba, a coauthor from the Institute of Physical and Chemical Research (RIKEN), Japan. "Previously astronomers had estimated an age of 10,000 years." The younger age for RCW 86 may explain an astronomical event observed almost 2,000 years ago. In 185 A.D., Chinese astronomers (and possibly the Romans) recorded the appearance of a new

  12. Addressing Phonological Questions with Ultrasound

    ERIC Educational Resources Information Center

    Davidson, Lisa

    2005-01-01

    Ultrasound can be used to address unresolved questions in phonological theory. To date, some studies have shown that results from ultrasound imaging can shed light on how differences in phonological elements are implemented. Phenomena that have been investigated include transitional schwa, vowel coalescence, and transparent vowels. A study of…

  13. Communities Address Barriers to Connectivity.

    ERIC Educational Resources Information Center

    Byers, Anne

    1996-01-01

    Rural areas lag behind urban areas in access to information technologies. Public institutions play a critical role in extending the benefits of information technologies to those who would not otherwise have access. The most successful rural telecommunications plans address barriers to use, such as unawareness of the benefits, technophobia, the…

  14. Keynote Address: Rev. Mark Massa

    ERIC Educational Resources Information Center

    Massa, Mark S.

    2011-01-01

    Rev. Mark S. Massa, S.J., is the dean and professor of Church history at the School of Theology and Ministry at Boston College. He was invited to give a keynote to begin the third Catholic Higher Education Collaborative Conference (CHEC), cosponsored by Boston College and Fordham University. Fr. Massa's address posed critical questions about…

  15. State of the Lab Address

    ScienceCinema

    King, Alex

    2013-03-01

    In his third-annual State of the Lab address, Ames Laboratory Director Alex King called the past year one of "quiet but strong progress" and called for Ames Laboratory to continue to build on its strengths while responding to changing expectations for energy research.

  16. State of the Lab Address

    SciTech Connect

    King, Alex

    2010-01-01

    In his third-annual State of the Lab address, Ames Laboratory Director Alex King called the past year one of "quiet but strong progress" and called for Ames Laboratory to continue to build on its strengths while responding to changing expectations for energy research.

  17. Teaching Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

    An earlier paper introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It's ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as quantized fields. Both the Schroedinger field, or "matter field," and the EM field are made of "quanta"—spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems such as the viewing screen by collapsing into an atom instantaneously and randomly in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.
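    The time-lapse experiments above have a simple numerical analogue: draw each quantum's arrival position from the squared-amplitude distribution and watch fringes emerge only in the aggregate. A sketch with an idealized two-slit pattern and made-up geometry:

```python
import numpy as np

# Idealized two-slit pattern: P(x) proportional to cos^2(pi*d*x/(lam*L)),
# with placeholder slit separation d, wavelength lam, screen distance L.
rng = np.random.default_rng(0)
d, lam, L = 1e-4, 5e-7, 1.0            # metres; illustrative values only
x = np.linspace(-0.02, 0.02, 2001)     # screen positions (m)
p = np.cos(np.pi * d * x / (lam * L)) ** 2
p /= p.sum()                           # discrete probability distribution

# Each "quantum" collapses at a single random position drawn from P.
hits = rng.choice(x, size=5000, p=p)
counts, _ = np.histogram(hits, bins=80)
print(counts)                          # fringes appear only in the aggregate
```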

  18. The maintenance of uncertainty

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    Section headings: Introduction; Preliminaries; State-space dynamics; Linearized dynamics of infinitesimal uncertainties; Instantaneous infinitesimal dynamics; Finite-time evolution of infinitesimal uncertainties; Lyapunov exponents and predictability; The Baker's apprentice map; Infinitesimals and predictability; Dimensions; The Grassberger-Procaccia algorithm; Towards a better estimate from Takens' estimators; Space-time-separation diagrams; Intrinsic limits to the analysis of geometry; Takens' theorem; The method of delays; Noise; Prediction, prophecy, and pontification; Introduction; Simulations, models and physics; Ground rules; Data-based models: dynamic reconstructions; Analogue prediction; Local prediction; Global prediction; Accountable forecasts of chaotic systems; Evaluating ensemble forecasts; The annulus; Prophecies; Aids for more reliable nonlinear analysis; Significant results: surrogate data, synthetic data and self-deception; Surrogate data and the bootstrap; Surrogate predictors: Is my model any good?; Hints for the evaluation of new techniques; Avoiding simple straw men; Feasibility tests for the identification of chaos; On detecting "tiny" data sets; Building models consistent with the observations; Cost functions; ι-shadowing: Is my model any good? (reprise); Casting infinitely long shadows (out-of-sample); Distinguishing model error and system sensitivity; Forecast error and model sensitivity; Accountability; Residual predictability; Deterministic or stochastic dynamics?; Using ensembles to distinguish the expectation from the expected; Numerical Weather Prediction; Probabilistic prediction with a deterministic model; The analysis; Constructing and interpreting ensembles; The outlook(s) for today; Conclusion; Summary.

  19. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere, and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high-latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid- to low-latitude clean air. One reason is the lower temperatures, which result in increased imprecision in kinetic data, assumed to be best characterized at 298 K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than are rates in the standard chemistry used in many stratospheric and tropospheric models.
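    The growth of kinetic imprecision away from room temperature can be made concrete with the standard JPL-style uncertainty factor for a single rate constant, f(T) = f(298 K) * exp(g * |1/T - 1/298|). A minimal sketch; the parameter values are placeholders, not evaluated data:

```python
import numpy as np

def uncertainty_factor(T, f298, g):
    """JPL-style factor f(T) = f298 * exp(g * |1/T - 1/298|).
    f298 is the uncertainty factor at 298 K; g (in kelvin) encodes the
    extra uncertainty in the activation temperature E/R. Placeholders."""
    return f298 * np.exp(g * abs(1.0 / T - 1.0 / 298.0))

for T in (298.0, 260.0, 230.0):   # down to Antarctic boundary-layer temperatures
    print(f"{T:5.0f} K -> f = {uncertainty_factor(T, f298=1.2, g=100.0):.3f}")
```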

  20. Uncertainty in adaptive capacity

    NASA Astrophysics Data System (ADS)

    Adger, W. Neil; Vincent, Katharine

    2005-03-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and the theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as depending on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).

  1. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBEs) of the masses of space instruments and spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. The paper also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. The analysis, based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database, provides quantitative results showing that mass growth uncertainties decrease as mass estimate maturity increases.

  2. Using a Meniscus to Teach Uncertainty in Measurement

    NASA Astrophysics Data System (ADS)

    Backman, Philip

    2008-02-01

    I have found that students easily understand that a measurement cannot be exact, but they often seem to lack an understanding of why it is important to know something about the magnitude of the uncertainty. This tends to promote an attitude that almost any uncertainty value will do. Such indifference may exist because once an uncertainty is determined or calculated, it remains only a number, without a concrete physical connection back to the experiment. For the activity described here—presented as a challenge—groups of students are given a container and asked to make certain measurements and to estimate the uncertainty in each of those measurements. They are then challenged to complete a particular task involving the container and a volume of water. Whether the assigned task is actually achievable, however, slowly comes into question once the magnitude of the uncertainties in the original measurements is compared to the specific requirements of the challenge.

  3. Uncertainty Analysis of the Three Pagodas Fault-Source Geometry

    NASA Astrophysics Data System (ADS)

    Haller, K. M.

    2015-12-01

    Probabilistic seismic-hazard assessment generally relies on an earthquake catalog (to estimate future seismicity from the locations and rates of past earthquakes) and fault sources (to estimate future seismicity from the known paleoseismic history of surface rupture). The paleoseismic history of potentially active faults in Southeast Asia has been addressed at few locations and spans only a few complete recurrence intervals; many faults remain unstudied. Even where the timing of a surface-rupturing earthquake is known, the extent of rupture may not be well constrained. Therefore, the subjective judgment of experts is often used to define the three-dimensional size of future ruptures; limited paleoseismic data can lead to large uncertainties in ground-motion hazard from fault sources due to the preferred models that underlie these judgments. The 300-km-long, strike-slip Three Pagodas fault in western Thailand is possibly one of the most active faults in the country. The fault parallels the plate boundary and may be characterized by a slip rate high enough to result in measurable ground motion at periods of interest for building design. The known paleoseismic history is limited and likely does not include the largest possible earthquake on the fault. This lack of knowledge raises the question: what sizes of earthquakes are expected? Preferred rupture models constrain the possible magnitude-frequency distributions, and alternative rupture models can result in different ground-motion hazard near the fault. This analysis includes alternative rupture models for the Three Pagodas fault, a first-level check against gross modeling assumptions to assure that the source model is a reasonable reflection of observed data, and the resulting ground-motion hazard for each alternative. Inadequate paleoseismic data are an important source of uncertainty that could be compensated for by considering alternative rupture models for poorly known seismic sources.
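    One common way to bound expected magnitudes for a long strike-slip fault with sparse paleoseismic data is an empirical rupture-length regression such as Wells and Coppersmith (1994). A sketch using their published strike-slip coefficients and hypothetical partial-rupture scenarios for a 300-km-long fault; treat the numbers as illustrative:

```python
import math

def magnitude_from_srl(length_km: float, a: float = 5.16, b: float = 1.12) -> float:
    """Moment magnitude from surface rupture length (km), in the form
    M = a + b*log10(L); defaults are Wells & Coppersmith (1994)
    strike-slip coefficients, used here purely for illustration."""
    return a + b * math.log10(length_km)

for frac in (0.25, 0.5, 1.0):   # alternative (hypothetical) rupture models
    length = 300.0 * frac       # fractions of the ~300-km-long fault
    print(f"rupture {length:5.0f} km -> M {magnitude_from_srl(length):.1f}")
```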

  4. Are models, uncertainty, and dispute resolution compatible?

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

    Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition, whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated, and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that "certainty" is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  5. 7 CFR 160.29 - Containers to remain intact.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 7 (Agriculture), Code of Federal Regulations, Part 160, Regulations and Standards for Naval Stores; Analysis, Inspection, and Grading on Request; § 160.29, Containers to remain intact: the containers holding such naval stores must remain intact as sampled until the analysis,...

  6. Communicating Storm Surge Forecast Uncertainty

    NASA Astrophysics Data System (ADS)

    Troutman, J. A.; Rhome, J.

    2015-12-01

    When it comes to tropical cyclones, storm surge is often the greatest threat to life and property along the coastal United States. The coastal population density has dramatically increased over the past 20 years, putting more people at risk. Informing emergency managers, decision-makers, and the public about the potential for wind-driven storm surge, however, has been extremely difficult. Recently, the Storm Surge Unit at the National Hurricane Center in Miami, Florida has developed a prototype experimental storm surge watch/warning graphic to help communicate this threat more effectively by identifying areas most at risk for life-threatening storm surge. This prototype is the initial step in the transition toward a NWS storm surge watch/warning system and highlights the inundation levels that have a 10% chance of being exceeded. The guidance for this product is the Probabilistic Hurricane Storm Surge (P-Surge) model, which predicts the probability of various storm surge heights by statistically evaluating numerous SLOSH model simulations. Questions remain, however, as to whether exceedance values in addition to the 10% level may be of equal importance to forecasters. P-Surge data from Hurricane Arthur (2014) are used to ascertain the practicality of incorporating other exceedance data into storm surge forecasts. Extracting forecast-uncertainty information by analyzing P-Surge exceedances overlaid with track and wind-intensity forecasts proves beneficial for forecasters and decision support.
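    As a toy illustration of the exceedance levels that P-Surge-type guidance provides, the value exceeded with a given probability can be read directly from an ensemble of surge heights; the ensemble below is synthetic, not SLOSH or P-Surge output:

```python
import numpy as np

rng = np.random.default_rng(1)
surge_ft = rng.lognormal(mean=1.0, sigma=0.5, size=10_000)  # synthetic ensemble

for p in (0.5, 0.1, 0.01):       # exceedance probabilities of interest
    level = np.quantile(surge_ft, 1.0 - p)
    print(f"{p:4.0%} chance of exceeding {level:4.1f} ft")
```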

  7. QUANTIFYING UNCERTAINTY IN LONG RANGE TRANSPORT MODELS: WORKSHOP REPORT ON SOURCES AND EVALUATION OF UNCERTAINTY IN LONG-RANGE TRANSPORT MODELS

    EPA Science Inventory

    The quantification of uncertainty in long-range transport model predictions and the implications of these uncertainties on formulations of control policy have been the subject of investigations by both the United States and Canada. To more fully address these topics, the American...

  8. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer; it could therefore be witnessed in a properly designed uncertainty game. We first investigate an uncertainty game between a freely falling observer and his static partner, who holds a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty in the measurement outcomes as seen by the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract the entanglement generated by the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log_2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, we discuss the black hole firewall paradox in the context of the entropic uncertainty relation.
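    For reference, the memory-assisted entropic uncertainty relation behind the "intrinsic limit" mentioned above is conventionally written (following Berta et al.) as:

```latex
% Entropic uncertainty relation with quantum memory (Berta et al., 2010):
% Q and R are measurements on system A, B is the quantum memory, and
% c = max_{i,j} |<psi_i|phi_j>|^2 is the overlap of the two measurement bases.
\[
  S(Q \mid B) + S(R \mid B) \;\ge\; \log_2 \frac{1}{c} + S(A \mid B) .
\]
```

    With enough entanglement, S(A|B) is negative, so the right-hand side can drop below the memoryless limit log_2(1/c); that is the kind of violation the abstract describes.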

  9. Risk communication: Uncertainties and the numbers game

    SciTech Connect

    Ortigara, M.

    1995-08-30

    The science of risk assessment seeks to characterize the potential risk in situations that may pose hazards to human health or the environment. However, the conclusions reached by the scientists and engineers are not an end in themselves; they are passed on to the involved companies, government agencies, legislators, and the public. All interested parties must then decide what to do with the information. Risk communication is a type of technical communication that involves some unique challenges. This paper first defines the relationships between risk assessment, risk management, and risk communication, and then explores two issues in risk communication: addressing uncertainty and putting risk numbers into perspective.

  10. Uncertainties in container failure time predictions

    SciTech Connect

    Williford, R.E.

    1990-01-01

    Stochastic variations in the local chemical environment of a geologic waste repository can cause corresponding variations in container corrosion rates and failure times, and thus in radionuclide release rates. This paper addresses how well the future variations in repository chemistries must be known in order to predict container failure times that are bounded by a finite time period within the repository lifetime. Preliminary results indicate that a 5000 year scatter in predicted container failure times requires that repository chemistries be known to within ±10% over the repository lifetime. These are small uncertainties compared to current estimates. 9 refs., 3 figs.

  11. Method and apparatus to predict the remaining service life of an operating system

    DOEpatents

    Greitzer, Frank L.; Kangas, Lars J.; Terrones, Kristine M.; Maynard, Melody A.; Pawlowski, Ronald A. , Ferryman; Thomas A.; Skorpik, James R.; Wilson, Bary W.

    2008-11-25

    A method and computer-based apparatus for monitoring the degradation of, predicting the remaining service life of, and/or planning maintenance for, an operating system are disclosed. Diagnostic information on degradation of the operating system is obtained through measurement of one or more performance characteristics by one or more sensors onboard and/or proximate the operating system. Though not required, it is preferred that the sensor data are validated to improve the accuracy and reliability of the service life predictions. The condition or degree of degradation of the operating system is presented to a user by way of one or more calculated, numeric degradation figures of merit that are trended against one or more independent variables using one or more mathematical techniques. Furthermore, more than one trendline and uncertainty interval may be generated for a given degradation figure of merit/independent variable data set. The trendline(s) and uncertainty interval(s) are subsequently compared to one or more degradation figure of merit thresholds to predict the remaining service life of the operating system. The present invention enables multiple mathematical approaches in determining which trendline(s) to use to provide the best estimate of the remaining service life.
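    A minimal sketch of the trendline-and-threshold idea (an illustration of the general approach, not the patented method): fit a linear trend to a synthetic degradation figure of merit, extrapolate to a failure threshold, and bracket the remaining service life with a crude band from the fit residuals:

```python
import numpy as np

rng = np.random.default_rng(2)
hours = np.arange(0.0, 500.0, 10.0)    # operating time
fom = 1.0 - 0.0008 * hours + rng.normal(0.0, 0.01, hours.size)  # synthetic metric

slope, intercept = np.polyfit(hours, fom, 1)          # degradation trendline
sigma = np.std(fom - (slope * hours + intercept))     # residual scatter

threshold = 0.5                                        # assumed failure criterion
t_fail = (threshold - intercept) / slope               # trendline crossing
t_early = (threshold + 2 * sigma - intercept) / slope  # pessimistic crossing
t_late = (threshold - 2 * sigma - intercept) / slope   # optimistic crossing
print(f"RUL ~ {t_fail - hours[-1]:.0f} h "
      f"(range {t_early - hours[-1]:.0f} to {t_late - hours[-1]:.0f} h)")
```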

  12. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1971-01-01

    Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite. GEOS-C will be tracked by a number of the conventional satellite tracking systems, as well as by two advanced systems; a satellite-to-satellite tracking system and lasers capable of decimeter accuracies which are being developed in connection with the Goddard Earth and Ocean Dynamics Applications program. The discussion is organized in terms of a specific type of GEOS-C orbit which would satisfy a number of scientific objectives including the study of the gravitational field by means of both the altimeter and the satellite-to-satellite tracking system, studies of tides, and the Gulf Stream meanders.

  13. Uncertainty as Certainty

    NASA Astrophysics Data System (ADS)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  14. Living with uncertainty

    SciTech Connect

    Rau, N.; Fong, C.C.; Grigg, C.H.; Silverstein, B.

    1994-11-01

    In the electric utility industry, only one thing can be guaranteed with absolute certainty: one lives and works with many unknowns. Thus, the industry has embraced probability methods to varying degrees over the last 25 years. These techniques aid decision makers in planning, operations, and maintenance by quantifying uncertainty. Examples include power system reliability, production costing simulation, and assessment of environmental factors. A series of brainstorming sessions was conducted by the Application of Probability Methods (APM) Subcommittee of the IEEE Power Engineering Society to identify research and development needs and to ask the question, "Where should we go from here?" The subcommittee examined areas of need in data development, applications, and methods for decision making. The purpose of this article is to share the findings with a broader audience and to invite comments and participation.

  15. Simple Resonance Hierarchy for Surmounting Quantum Uncertainty

    SciTech Connect

    Amoroso, Richard L.

    2010-12-22

    For a hundred years, violating or surmounting the quantum uncertainty principle has remained a Holy Grail of both theoretical and empirical physics. Utilizing an operationally completed form of quantum theory, cast in a string-theoretic higher-dimensional (HD) form of Dirac covariant polarized vacuum with a complex Einstein energy-dependent spacetime metric, M_4±C_4, with sufficient degrees of freedom to be causally free of the local quantum state, we present a simple empirical model for ontologically surmounting the phenomenology of uncertainty through a Sagnac-effect, RF-pulsed, laser-oscillated vacuum energy resonance hierarchy cast within an extended form of a Wheeler-Feynman-Cramer transactional Calabi-Yau mirror-symmetric spacetime backcloth.

  16. Propagation of radar rainfall uncertainty in urban flood simulations

    NASA Astrophysics Data System (ADS)

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

    This work discusses the results of the implementation of a novel probabilistic system designed to improve ensemble sewer flow predictions for the drainage network of a small urban area in the North of England. The probabilistic system has been developed to model the uncertainty associated with radar rainfall estimates and propagate it through radar-based ensemble sewer flow predictions. The assessment of this system aims to outline the benefits of addressing the uncertainty associated with radar rainfall estimates in a probabilistic framework, to be potentially implemented in the real-time management of the sewer network in the study area. Radar rainfall estimates are affected by uncertainty due to various factors [1-3], and quality control and correction techniques have been developed in order to improve their accuracy. However, the hydrological use of radar rainfall estimates and forecasts remains challenging. A significant effort has been devoted by the international research community to the assessment of uncertainty propagation through probabilistic hydro-meteorological forecast systems [4-5], and various approaches have been implemented for the purpose of characterizing the uncertainty in radar rainfall estimates and forecasts [6-11]. A radar-based ensemble stochastic approach, similar to the one implemented for use in the Southern Alps by the REAL system [6], has been developed for the purpose of this work. An ensemble generator has been calibrated on the basis of the spatial-temporal characteristics of the residual error in radar estimates, assessed with reference to rainfall records from around 200 rain gauges available for the year 2007, previously post-processed and corrected by the UK Met Office [12-13]. Each ensemble member is determined by summing a perturbation field to the unperturbed radar rainfall field. The perturbations are generated by imposing the radar error spatial and temporal correlation structure on purely stochastic fields. A
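    A heavily simplified sketch of the ensemble-generation step described above: impose spatial correlation on stochastic noise (a stand-in for the calibrated radar-error structure) and add the perturbation to the unperturbed field. All parameters are invented:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
radar = rng.gamma(shape=2.0, scale=1.5, size=(100, 100))  # stand-in field (mm/h)

def perturbation(shape, corr_px=8.0, sigma_mm=0.4):
    """Correlated error field: smooth white noise with a Gaussian kernel
    (a crude proxy for the calibrated error-correlation structure)."""
    noise = gaussian_filter(rng.standard_normal(shape), corr_px)
    return sigma_mm * noise / noise.std()

# Twenty equiprobable rainfall scenarios: unperturbed field + error field.
ensemble = [np.clip(radar + perturbation(radar.shape), 0.0, None)
            for _ in range(20)]
print(len(ensemble), round(float(ensemble[0].mean()), 2))
```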

  17. The Study of Address Tree Coding Based on the Maximum Matching Algorithm in Courier Business

    NASA Astrophysics Data System (ADS)

    Zhou, Shumin; Tang, Bin; Li, Wen

    As an important component of an EMS monitoring system, an address, unlike a user name, carries great uncertainty because there are many ways to represent it. Address standardization is therefore a difficult task. Address tree coding has been trying to resolve that issue for many years. The zip code, its most widely used algorithm, can only subdivide an address down to a designated post office, not to the recipient's address, so accurate delivery still requires manual identification. This paper puts forward a new encoding algorithm for the address tree, the maximum matching algorithm, to solve the problem. This algorithm combines the characteristics of the address tree with best-matching theory, and brings in associated layers of tree nodes to improve matching efficiency. Taking the variability of addresses into account, the address-tree thesaurus should be updated promptly by adding new nodes automatically through intelligent tools.
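    The core of a maximum-matching segmenter is straightforward to sketch: greedily take the longest thesaurus entry that prefixes the remaining address string. Everything below, including the tiny thesaurus, is invented for illustration and is not the paper's implementation:

```python
def max_match(address: str, thesaurus: set[str], max_len: int = 20) -> list[str]:
    """Forward maximum matching: repeatedly strip the longest known prefix
    of the remaining string; unknown characters fall through as single
    tokens (candidates for new thesaurus nodes)."""
    tokens, i = [], 0
    while i < len(address):
        for j in range(min(len(address), i + max_len), i, -1):
            if address[i:j] in thesaurus or j == i + 1:
                tokens.append(address[i:j])
                i = j
                break
    return tokens

thesaurus = {"Hubei Province", "Wuhan", "Luoyu Road", "No. 129"}
print(max_match("Hubei ProvinceWuhanLuoyu RoadNo. 129", thesaurus))
# -> ['Hubei Province', 'Wuhan', 'Luoyu Road', 'No. 129']
```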

  18. Sky-view factor visualization for detection of archaeological remains

    NASA Astrophysics Data System (ADS)

    Kokalj, Žiga; Oštir, Krištof; Zakšek, Klemen

    2013-04-01

    Many archaeological remains are covered by sand or vegetation, but it is still possible to detect them with remote sensing techniques. One of these is airborne laser scanning, which enables the production of digital elevation models (DEMs) of very high resolution (better than 1 m) with high relative elevation accuracy (centimetre level), even under forest; it has therefore become well established in archaeological applications. However, effective interpretation of digital elevation models requires appropriate data visualization. Analytical relief shading is used in most cases. Although widely accepted, this method has two major drawbacks: identifying details in deep shades and an inability to properly represent linear features lying parallel to the light beam. Several authors have tried to overcome these limitations by changing the position of the light source or by filtering. This contribution addresses the DEM visualization problem with the sky-view factor, a visualization technique based on diffuse light that overcomes the directional problems of hill-shading. The sky-view factor is a parameter that describes the portion of visible sky limited by relief. It can be used as a general relief visualization technique to show relief characteristics. In particular, we show that this visualization is a very useful tool in archaeology. Applying the sky-view factor for visualization purposes gives advantages over other techniques because it reveals small (or large, depending on the scale of the observed phenomenon and the consequent algorithm settings) relief features while preserving the perception of general topography. In a case study (DEM visualization of the fortified enclosure of Tonovcov grad in Slovenia) we show that the sky-view factor is the optimal DEM visualization method for archaeological purposes. Its ability to consider the neighborhood context makes it an outstanding tool when compared to other visualization techniques. One can choose a large search radius and the most important
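    In its simplest discrete form, the sky-view factor at a DEM cell is one minus the mean sine of the horizon angle over n azimuthal directions. A brute-force sketch on a synthetic grid; the direction count and search radius are arbitrary choices:

```python
import numpy as np

def sky_view_factor(dem, cell_size=1.0, n_dirs=8, radius_px=10):
    """SVF ~ 1 - mean(sin(max horizon angle)) over n_dirs azimuths.
    Brute-force version for small grids; border cells are skipped."""
    rows, cols = dem.shape
    svf = np.ones_like(dem, dtype=float)
    dirs = [(int(round(np.cos(a))), int(round(np.sin(a))))
            for a in np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)]
    for r in range(radius_px, rows - radius_px):
        for c in range(radius_px, cols - radius_px):
            sines = []
            for dr, dc in dirs:
                best = 0.0
                for step in range(1, radius_px + 1):
                    dist = step * np.hypot(dr, dc) * cell_size
                    rise = dem[r + dr * step, c + dc * step] - dem[r, c]
                    best = max(best, rise / dist)
                sines.append(np.sin(np.arctan(best)))
            svf[r, c] = 1.0 - float(np.mean(sines))
    return svf

dem = np.zeros((30, 30)); dem[14:16, 14:16] = 5.0  # a small synthetic mound
print(sky_view_factor(dem)[15, 12])                # SVF depressed beside the mound
```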

  19. Communicating uncertainty in hydrological forecasts: mission impossible?

    NASA Astrophysics Data System (ADS)

    Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian

    2010-05-01

    scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission, and the suspense kept should instead act as a catalyst for overcoming the remaining challenges.

  20. Impact of discharge data uncertainty on nutrient load uncertainty

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

    Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis for important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorus and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorus and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
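    A stripped-down version of that propagation chain, using a made-up power-law rating curve Q = a*(h - h0)**b (here with h0 = 0) and invented parameter uncertainty, just to show the mechanics of sampling curves and propagating them to annual loads:

```python
import numpy as np

rng = np.random.default_rng(4)
stage = rng.uniform(0.5, 2.5, size=365)   # daily water levels (m), synthetic
conc = rng.uniform(0.02, 0.08, size=365)  # nutrient concentration (mg/L), synthetic

n = 5000                                  # rating-curve realisations
a = rng.normal(12.0, 1.0, n)              # invented parameter uncertainty
b = rng.normal(1.8, 0.1, n)

loads = np.empty(n)
for k in range(n):
    q = a[k] * stage ** b[k]              # discharge series (m3/s) for curve k
    # Q [m3/s] * C [mg/L] = g/s; times 86.4 gives kg/day; summing days -> kg/yr
    loads[k] = np.sum(q * conc * 86.4)

print(f"median {np.median(loads):.0f} kg/yr, 90% interval "
      f"[{np.quantile(loads, 0.05):.0f}, {np.quantile(loads, 0.95):.0f}]")
```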

  1. A framework for modeling anthropogenic impacts on waterbird habitats: addressing future uncertainty in conservation planning

    USGS Publications Warehouse

    Matchett, Elliott L.; Fleskes, Joseph P.; Young, Charles A.; Purkey, David R.

    2015-01-01

    The amount and quality of natural resources available for terrestrial and aquatic wildlife habitats are expected to decrease throughout the world in areas that are intensively managed for urban and agricultural uses. Changes in climate and in the management of increasingly limited water supplies may further impact water resources essential for sustaining habitats. In this report, we document the adaptation of a Water Evaluation and Planning (WEAP) system model to the Central Valley of California, and demonstrate the use of this adapted model (WEAP-CVwh) to evaluate the impacts of plausible future scenarios on agricultural and wetland habitats used by waterbirds and other wildlife. Processed output from WEAP-CVwh indicated varying levels of impact caused by projected climate, urbanization, and water supply management in the scenarios used to exemplify this approach. Among scenarios, the NCAR-CCSM3 A2 climate projection had a greater impact than the CNRM-CM3 B1 climate projection, whereas expansive urbanization had a greater impact than strategic urbanization, on annual availability of waterbird habitat. Scenarios including extensive rice-idling or substantial instream flow requirements on important water supply sources produced large impacts on annual availability of waterbird habitat. In the year corresponding to the greatest habitat reduction for each scenario, the scenario including instream flow requirements resulted in the greatest decrease in habitats throughout all months of the wintering period relative to other scenarios. This approach provides a new and useful tool for habitat conservation planning in the Central Valley and a model to guide similar research investigations aiming to inform conservation, management, and restoration of important wildlife habitats.

  2. Using Robust Decision Making to Address Climate Change Uncertainties in Water Quality Management

    EPA Science Inventory

    Results of robust decision making simulations show that both climate and land-use change will need to be taken into account in order to implement BMP strategies that are more likely to meet the goals for the Patuxent River for both phosphorus and nitrogen.

  3. Evaluating Health Risks from Inhaled Polychlorinated Biphenyls: Research Needs for Addressing Uncertainty

    EPA Science Inventory

    Indoor air polychlorinated biphenyl (PCB) concentrations in some U.S. schools are one or more orders of magnitude higher than background levels. In response to this, efforts have been made to assess the potential health risk posed by inhaled PCBs. These efforts are hindered by un...

  4. New approaches to uncertainty analysis for use in aggregate and cumulative risk assessment of pesticides.

    PubMed

    Kennedy, Marc C; van der Voet, Hilko; Roelofs, Victoria J; Roelofs, Willem; Glass, C Richard; de Boer, Waldo J; Kruisselbrink, Johannes W; Hart, Andy D M

    2015-05-01

    Risk assessments for human exposures to plant protection products (PPPs) have traditionally focussed on single routes of exposure and single compounds. Extensions to estimate aggregate (multi-source) and cumulative (multi-compound) exposure from PPPs present many new challenges and additional uncertainties that should be addressed as part of risk analysis and decision-making. A general approach is outlined for identifying and classifying the relevant uncertainties and variabilities. The implementation of uncertainty analysis within the MCRA software, developed as part of the EU-funded ACROPOLIS project to address some of these uncertainties, is demonstrated. An example is presented for dietary and non-dietary exposures to the triazole class of compounds. This demonstrates the chaining of models, linking variability and uncertainty generated from an external model for bystander exposure with variability and uncertainty in MCRA dietary exposure assessments. A new method is also presented for combining pesticide usage survey information with limited residue monitoring data, to address non-detect uncertainty. The results show that incorporating usage information reduces uncertainty in parameters of the residue distribution but that in this case quantifying uncertainty is not a priority, at least for UK grown crops. A general discussion of alternative approaches to treat uncertainty, either quantitatively or qualitatively, is included. PMID:25688423
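    One standard way to keep variability and uncertainty separate in such assessments is a nested (two-dimensional) Monte Carlo: an outer loop samples uncertain parameters, and an inner loop samples variability across individuals. A toy sketch with invented distributions, not the MCRA implementation:

```python
import numpy as np

rng = np.random.default_rng(5)
n_outer, n_inner = 200, 2000        # uncertainty samples x variability samples

p99s = []
for _ in range(n_outer):
    # Outer loop: uncertain parameters (invented hyper-distributions).
    mu = rng.normal(-1.0, 0.2)      # log-mean of residue concentration
    sd = abs(rng.normal(0.6, 0.1))  # log-standard deviation
    # Inner loop: variability across individuals.
    residue = rng.lognormal(mu, sd, n_inner)    # mg/kg food
    intake = rng.lognormal(-2.0, 0.5, n_inner)  # kg food per kg body weight per day
    exposure = residue * intake                 # mg per kg body weight per day
    p99s.append(np.quantile(exposure, 0.99))

print(f"P99 exposure: median {np.median(p99s):.4f}, 95% uncertainty interval "
      f"[{np.quantile(p99s, 0.025):.4f}, {np.quantile(p99s, 0.975):.4f}]")
```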

  5. Uncertainty and Anticipation in Anxiety

    PubMed Central

    Grupe, Dan W.; Nitschke, Jack B.

    2014-01-01

    Uncertainty about a possible future threat disrupts our ability to avoid it or to mitigate its negative impact and thus results in anxiety. Here, we review the broad literature on the neurobiology of anxiety through the lens of uncertainty. We identify five processes essential for adaptive anticipatory responses to future threat uncertainty, and propose that alterations in the neural instantiation of these processes result in maladaptive responses to uncertainty in pathological anxiety. This framework has the potential to advance the classification, diagnosis, and treatment of clinical anxiety. PMID:23783199

  6. Nanoscale content-addressable memory

    NASA Technical Reports Server (NTRS)

    Davis, Bryan (Inventor); Principe, Jose C. (Inventor); Fortes, Jose (Inventor)

    2009-01-01

    A combined content-addressable memory device and memory interface is provided. The combined device and interface includes one or more molecular wire crossbar memories having spaced-apart key nanowires, spaced-apart value nanowires adjacent to the key nanowires, and configurable switches between the key nanowires and the value nanowires. The combination further includes a key microwire-nanowire grid (key MNG) electrically connected to the spaced-apart key nanowires, and a value microwire-nanowire grid (value MNG) electrically connected to the spaced-apart value nanowires. The key and value MNGs each select multiple nanowires for a given key or value.

  7. Addressing the workforce pipeline challenge

    SciTech Connect

    Leonard Bond; Kevin Kostelnik; Richard Holman

    2006-11-01

    A secure and affordable energy supply is essential for achieving U.S. national security, continuing U.S. prosperity, and laying the foundations for future economic growth. To meet this goal, the next-generation energy workforce in the U.S., in particular the workforce needed to support instrumentation, controls, and advanced operations and maintenance, is a critical element. The workforce is aging, and a new workforce pipeline, to support both the current generation of plants and new builds, has yet to be established. This paper reviews the challenges and some actions being taken to address this need.

  8. Identifying and Addressing Vaccine Hesitancy

    PubMed Central

    Kestenbaum, Lori A.; Feemster, Kristen A.

    2015-01-01

    In the 20th century, the introduction of multiple vaccines significantly reduced childhood morbidity, mortality, and disease outbreaks. Despite, and perhaps because of, their public health impact, an increasing number of parents and patients are choosing to delay or refuse vaccines. These individuals are described as vaccine hesitant. This phenomenon has developed due to the confluence of multiple social, cultural, political and personal factors. As immunization programs continue to expand, understanding and addressing vaccine hesitancy will be crucial to their successful implementation. This review explores the history of vaccine hesitancy, its causes, and suggested approaches for reducing hesitancy and strengthening vaccine acceptance. PMID:25875982

  9. Identifying and addressing vaccine hesitancy.

    PubMed

    Kestenbaum, Lori A; Feemster, Kristen A

    2015-04-01

    In the 20th century, the introduction of multiple vaccines significantly reduced childhood morbidity, mortality, and disease outbreaks. Despite, and perhaps because of, their public health impact, an increasing number of parents and patients are choosing to delay or refuse vaccines. These individuals are described as "vaccine hesitant." This phenomenon has developed due to the confluence of multiple social, cultural, political, and personal factors. As immunization programs continue to expand, understanding and addressing vaccine hesitancy will be crucial to their successful implementation. This review explores the history of vaccine hesitancy, its causes, and suggested approaches for reducing hesitancy and strengthening vaccine acceptance. PMID:25875982

  10. Understanding Uncertainty in Physics-based Snow Models

    NASA Astrophysics Data System (ADS)

    Clark, M. P.

    2011-12-01

    In spite of considerable efforts dedicated to understanding and simulating the accumulation and ablation of the snowpack, there is arguably still a colossal amount of uncertainty in physics-based snow models. The fidelity of model representations of snow processes remains compromised by limited process understanding and limited data. Many modeling decisions are often represented using empirical formulations with model parameters defined based either on order-of-magnitude considerations or by fitting a curve through a limited amount of experimental data. An outstanding challenge for the snow modeling community is to understand the impact of differences among existing snow models on their physical validity and predictive performance, and to use this to identify and address fundamental model weaknesses. We introduce an alternative methodology for understanding differences among snow models. We first define a single set of governing model equations - a "master modeling template" - from which many existing models can be reproduced and new models derived. The master template is then implemented within a single robust numerical framework and used to explore the impact of differences in the choice of modeling approaches and the choice of model parameter values. To keep the study tractable, we focus on a subset of modeling options available within the template, restricting attention to a one-dimensional snow model applied over non-vegetated surfaces. Assessments of forest snow processes and spatial variability are deferred to a separate study. The differences among existing snow models can be broadly classified into three categories: (i) estimation of fluxes at the snow-atmosphere interface, including the approach used to estimate the surface albedo, the turbulent fluxes of sensible and latent heat, and the partitioning of precipitation between rain and snow (one such decision is sketched after this abstract); (ii) internal processes within the snowpack, including heat conduction, penetration of shortwave radiation, vertical
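    To make the "modeling decisions" point concrete, here is one of the simplest such decisions, the partitioning of precipitation between rain and snow, in two interchangeable empirical forms (an all-or-nothing threshold versus a linear ramp); the parameter values are typical choices, not the paper's:

```python
def snow_fraction_threshold(t_air_c: float, t_crit: float = 1.0) -> float:
    """All-or-nothing partitioning at a critical air temperature (deg C)."""
    return 1.0 if t_air_c <= t_crit else 0.0

def snow_fraction_ramp(t_air_c: float, t_snow: float = -1.0, t_rain: float = 3.0) -> float:
    """Linear ramp: all snow below t_snow, all rain above t_rain."""
    if t_air_c <= t_snow:
        return 1.0
    if t_air_c >= t_rain:
        return 0.0
    return (t_rain - t_air_c) / (t_rain - t_snow)

for t in (-2.0, 0.5, 2.0, 4.0):
    print(f"{t:5.1f} C  threshold={snow_fraction_threshold(t):.2f}  "
          f"ramp={snow_fraction_ramp(t):.2f}")
```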

  11. Combined hepatocellular cholangiocarcinoma: Controversies to be addressed

    PubMed Central

    Wang, An-Qiang; Zheng, Yong-Chang; Du, Juan; Zhu, Cheng-Pei; Huang, Han-Chun; Wang, Shan-Shan; Wu, Liang-Cai; Wan, Xue-Shuai; Zhang, Hao-Hai; Miao, Ruo-Yu; Sang, Xin-Ting; Zhao, Hai-Tao

    2016-01-01

    Combined hepatocellular cholangiocarcinoma (CHC) accounts for 0.4%-14.2% of primary liver cancer cases and possesses pathological features of both hepatocellular carcinoma and cholangiocarcinoma. Since this disease was first described and classified in 1949, the classification of CHC has continuously evolved. The latest definition and classification of CHC by the World Health Organization is based on the speculation that CHC arises from hepatic progenitor cells. However, there is no evidence demonstrating a common origin for the different components of CHC. Furthermore, the definition of CHC subtypes is still ambiguous, and the identification of the subtype when a single tumor contains many components remains unresolved. In addition, there is no summary of the newly recognized histopathological features or of the contribution of CHC components to the prognosis and outcome of this disease. Here we provide a review of the current literature to address these questions. PMID:27182157

  12. Remediation tradeoffs addressed with simulated annealing optimization

    SciTech Connect

    Rogers, L. L., LLNL

    1998-02-01

    Escalation of groundwater remediation costs has encouraged both advances in optimization techniques that balance remediation objectives and economics, and the development of innovative technologies to expedite source-region cleanups. We present an optimization application building on a pump-and-treat model, assuming prior removal of different portions of the source area, to address the evolving management issue of more aggressive source remediation. Separate economic estimates of in-situ thermal remediation are combined with the economic estimates of the subsequent optimal pump-and-treat remediation to observe tradeoff relationships of cost versus the highest remaining contamination level (hot spot). The simulated annealing algorithm calls the flow and transport model to evaluate the success of a proposed remediation scenario at a U.S. Superfund site contaminated with volatile organic compounds (VOCs).
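    The optimization loop itself follows the generic simulated annealing recipe; a compact sketch in which an invented one-dimensional cost function stands in for the expensive flow-and-transport evaluation:

```python
import math, random

random.seed(6)

def cost(x: float) -> float:
    """Stand-in for the flow-and-transport evaluation: total cost of
    removing fraction x of the source plus pump-and-treat for the rest."""
    return 8.0 * x + 40.0 * math.exp(-5.0 * x)

x = 0.1                                    # initial source-removal fraction
best = (cost(x), x)
T = 10.0                                   # initial "temperature"
while T > 1e-3:
    x_new = min(1.0, max(0.0, x + random.uniform(-0.1, 0.1)))
    delta = cost(x_new) - cost(x)
    if delta < 0 or random.random() < math.exp(-delta / T):
        x = x_new                          # accept downhill, or uphill with p = exp(-d/T)
        best = min(best, (cost(x), x))
    T *= 0.995                             # geometric cooling schedule

print(f"best cost {best[0]:.2f} at removal fraction {best[1]:.2f}")
```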

  13. Climate model uncertainty versus conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-09-01

    Projections of climate change impact are associated with a cascade of uncertainties including those in CO2 emission scenarios, climate models, downscaling and impact models. The relative importance of the individual uncertainty sources is expected to depend on several factors, including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context-dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty due to the climate models is more important for groundwater hydraulic heads and stream flow.
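    With a 6-geology by 11-climate ensemble of the kind described, the relative importance of the two uncertainty sources for any projected quantity can be summarized with a simple variance decomposition. The numbers below are synthetic, chosen only to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic change signals for 6 geological models x 11 climate projections.
geo = rng.normal(0.0, 4.0, size=(6, 1))    # strong geology signal (invented)
clim = rng.normal(0.0, 1.5, size=(1, 11))  # weaker climate signal (invented)
change = 10.0 + geo + clim + rng.normal(0.0, 0.5, size=(6, 11))

total_var = change.var()
var_geo = change.mean(axis=1).var()   # spread across geologies, climate-averaged
var_clim = change.mean(axis=0).var()  # spread across climates, geology-averaged
print(f"geology share ~ {var_geo / total_var:.0%}, "
      f"climate share ~ {var_clim / total_var:.0%}")
```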

  14. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    NASA Astrophysics Data System (ADS)

    Sonnenborg, T. O.; Seifert, D.; Refsgaard, J. C.

    2015-04-01

    Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991-2010) to the future period (2081-2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty on the climate models is more important for groundwater hydraulic heads and stream flow.

  15. Analysis of Infiltration Uncertainty

    SciTech Connect

    R. McCurley

    2003-10-27

    The primary objectives of this uncertainty analysis are: (1) to develop and justify a set of uncertain parameters along with associated distributions; and (2) to use the developed uncertain parameter distributions and the results from selected analog site calculations done in "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]) to obtain the net infiltration weighting factors for the glacial transition climate. These weighting factors are applied to unsaturated zone (UZ) flow fields in Total System Performance Assessment (TSPA), as outlined in the "Total System Performance Assessment-License Application Methods and Approach" (BSC 2002 [160146], Section 3.1), as a method for the treatment of uncertainty. This report is a scientific analysis because no new mathematical or physical models are developed herein; it is based on the use of the models developed in or for "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). Any use of the term model refers to those developed in the infiltration numerical model report. TSPA License Application (LA) has included three distinct climate regimes in the comprehensive repository performance analysis for Yucca Mountain: present-day, monsoon, and glacial transition. Each climate regime was characterized using three infiltration-rate maps, including a lower- and upper-bound and a mean value (equal to the average of the two boundary values). For each of these maps, which were obtained based on analog site climate data, a spatially averaged value was also calculated by the USGS. For a more detailed discussion of these infiltration-rate maps, see "Simulation of Net Infiltration for Modern and Potential Future Climates" (USGS 2001 [160355]). For this Scientific Analysis Report, spatially averaged values were calculated for the lower-bound, mean, and upper-bound climate analogs only for the glacial transition climate regime, within the

  16. Uncertainty analysis for a field-scale P loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  17. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    SciTech Connect

    Gregory, Julie J.; Harper, Frederick T.

    1999-07-28

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry.
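    A common way to turn such multi-expert elicitations into a single distribution is equal-weight pooling of per-expert distributions fitted to the elicited quantiles. A toy sketch with three invented experts; the quantile-to-lognormal fit is one simple choice among many:

```python
import numpy as np

# Each expert's 5th/50th/95th percentiles for one uncertain quantity
# (units arbitrary); all values invented for illustration.
experts = {"A": (0.05, 0.30, 1.20),
           "B": (0.10, 0.50, 2.00),
           "C": (0.02, 0.20, 0.80)}

rng = np.random.default_rng(8)
samples = []
for lo, med, hi in experts.values():
    mu = np.log(med)                                 # lognormal matched to the median
    sigma = (np.log(hi) - np.log(lo)) / (2 * 1.645)  # ...and to the 90% range
    samples.append(rng.lognormal(mu, sigma, 10_000))

pooled = np.concatenate(samples)                     # equal-weight mixture of experts
print(np.quantile(pooled, [0.05, 0.5, 0.95]).round(3))
```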

  18. Methods for exploring uncertainty in groundwater management predictions

    USGS Publications Warehouse

    Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S

    2016-01-01

    Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.

  19. Analysis and Reduction of Complex Networks Under Uncertainty

    SciTech Connect

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  20. Mass Uncertainty and Application For Space Systems

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey

    2013-01-01

    Expected development maturity under contract (spec) should correlate with the Project/Program Approved MGA Depletion Schedule in the Mass Properties Control Plan. If the specification is an NTE, MGA is inclusive of Actual MGA (A5 & A6). If the specification is not an NTE (e.g., nominal), then MGA values are reduced by A5 values and A5 represents the remaining uncertainty. Basic Mass = engineering estimate based on design and construction principles with no embedded margin. MGA Mass = Basic Mass * assessed % from the approved MGA schedule. Predicted Mass = Basic + MGA. Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic.
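
    The bookkeeping above reduces to simple arithmetic. A minimal sketch, with hypothetical subsystem masses and assessed MGA percentages (none of these numbers come from the record):

      # Hypothetical mass ledger: (basic mass [kg], assessed MGA fraction)
      items = {
          "structure":  (120.0, 0.12),
          "avionics":   ( 45.0, 0.20),
          "propulsion": ( 80.0, 0.08),
      }

      basic_total = sum(m for m, _ in items.values())
      predicted_total = sum(m * (1.0 + pct) for m, pct in items.values())

      # Aggregate MGA % = (Aggregate Predicted - Aggregate Basic) / Aggregate Basic
      aggregate_mga = (predicted_total - basic_total) / basic_total
      print(f"basic {basic_total:.1f} kg, predicted {predicted_total:.1f} kg, "
            f"aggregate MGA {aggregate_mga:.1%}")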

  1. Evaluation and management of measurement uncertainty in the new generation Geometrical Product Specification (GPS)

    NASA Astrophysics Data System (ADS)

    Peng, Heping; Liu, Xiaojun; Jiang, Xiangqian; Xu, Zhengao

    2006-11-01

    Because measurement errors objectively exist, measurement results always deviate from the "true value" of the measurand. Measurement uncertainty, as a quality index characterizing measurement results, is attracting increasing attention worldwide; its evaluation and expression help in understanding measurement results and promote international technical and commercial communication. Following the ISO/BIPM "Guide to the Expression of Uncertainty in Measurement" (the GUM), this paper deals with the evaluation and expression of measurement uncertainties and describes the Procedure for Uncertainty of Measurement MAnagement (PUMA) in the new-generation Geometrical Product Specification (GPS). An example of uncertainty evaluation with the PUMA method is given to illustrate its validity. Since the concept of uncertainty in the new-generation GPS has been expanded beyond measurement uncertainty alone, a way of evaluating and managing uncertainty throughout the whole GPS system remains to be developed.

  2. Varieties of uncertainty in health care: a conceptual taxonomy

    PubMed Central

    Han, Paul K.J.; Klein, William M.P.; Arora, Neeraj K.

    2011-01-01

    Uncertainty is a pervasive and important problem that has attracted increasing attention in health care, given the growing emphasis on evidence-based medicine, shared decision making, and patient-centered care. However, our understanding of this problem is limited, due in part to the absence of a unified, coherent concept of uncertainty. There are multiple meanings and varieties of uncertainty in health care, which are not often distinguished or acknowledged although each may have unique effects or warrant different courses of action. The literature on uncertainty in health care is thus fragmented, and existing insights have been incompletely translated to clinical practice. In this paper we attempt to address this problem by synthesizing diverse theoretical and empirical literature from the fields of communication, decision science, engineering, health services research, and psychology, and developing a new integrative conceptual taxonomy of uncertainty. We propose a three-dimensional taxonomy that characterizes uncertainty in health care according to its fundamental sources, issues, and locus. We show how this new taxonomy facilitates an organized approach to the problem of uncertainty in health care by clarifying its nature and prognosis, and suggesting appropriate strategies for its analysis and management. PMID:22067431

  3. Knowledge, consensus and uncertainty.

    PubMed

    Cavell, M

    1999-12-01

    Some months ago the editors of this journal asked me if I would undertake a series of short entries of a general sort on philosophical topics germane to current discussions in psychoanalysis. Both authors and topics were left to my discretion. I thought the series was a good idea and gladly agreed to do it. To my surprise and pleasure, all the philosophers I invited accepted. I am only sorry that the series could not be longer, as there are other philosophers who would have been splendid participants, and other topics I would like to have addressed. The essays that will follow in subsequent issues represent by and large the tradition of analytic philosophy, though this has come in the last few decades to comprise many of the themes we used to associate with the Continental tradition. Future entries, by James Conant, Donald Davidson, Pascal Engel, Dagfinn Føllesdal, James Hopkins, Ernest Le Pore, Jeffrey Malpas, Jerome Neu, Brian O'Shaughnessy, Richard Rorty and Richard Wollheim, will address the following topics: intersubjectivity, meaning and language, consciousness and perception, pragmatism, knowledge and belief, norms and nature, metaphor, hermeneutics, truth, self-deception, and the emotions. The essay below on knowledge, which will also be the topic of another entry by a different author later on, is the only one in the series that I will write. PMID:10669971

  4. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    PubMed Central

    2012-01-01

    Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of the TE model predictions
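
    A minimal sketch of the propagation idea, assuming Gaussian primary-measurement errors and a toy cost relation standing in for the full techno-economic flowsheet (all numbers are illustrative, not the paper's):

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Hypothetical primary-measurement distributions (mean, std); the real TE
      # model solves full mass and energy balances, so this toy only shows how
      # measurement uncertainty propagates through to a cost estimate.
      xylose  = rng.normal(0.75, 0.02, n)  # pretreatment xylose yield
      glucose = rng.normal(0.85, 0.03, n)  # enzymatic-hydrolysis glucose yield
      ethanol = rng.normal(0.90, 0.02, n)  # fermentation ethanol yield

      # Toy stand-in for the TE model: cost scales inversely with conversion
      overall = xylose * glucose * ethanol
      mesp = 1.20 / overall                # $/gal, hypothetical constant
      print(f"MESP = {mesp.mean():.2f} +/- {mesp.std():.2f} $/gal")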

  5. Hydrology, society, change and uncertainty

    NASA Astrophysics Data System (ADS)

    Koutsoyiannis, Demetris

    2014-05-01

    Heraclitus, who predicated that "panta rhei", also proclaimed that "time is a child playing, throwing dice". Indeed, change and uncertainty are tightly connected: the type of change that can be predicted with accuracy is usually trivial, and decision making under certainty is mostly trivial too. The current acceleration of change, due to unprecedented human achievements in technology, inevitably results in increased uncertainty. In turn, the increased uncertainty makes society apprehensive about the future, insecure, and credulous toward a developing future-telling industry. Several scientific disciplines, including hydrology, tend to become part of this industry. The social demand for certainties, even delusional ones, is compounded by a misconception within the scientific community that confuses science with uncertainty elimination. However, recognizing that uncertainty is inevitable and tightly connected with change will help us appreciate the positive sides of both. Hence, uncertainty becomes an important object to study, understand and model. Decision making under uncertainty, developing adaptability and resilience for an uncertain future, and using technology and engineering means for planned change to control the environment are important and feasible tasks, all of which will benefit from advancements in the Hydrology of Uncertainty.

  6. Housing Uncertainty and Childhood Impatience

    ERIC Educational Resources Information Center

    Anil, Bulent; Jordan, Jeffrey L.; Zahirovic-Herbert, Velma

    2011-01-01

    The study demonstrates a direct link between housing uncertainty and children's time preferences, or patience. We show that students who face housing uncertainties through mortgage foreclosures and eviction learn impatient behavior and are therefore at greater risk of making poor intertemporal choices such as dropping out of school. We find that…

  7. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  8. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  9. A Challenge for School Leaders: Gender Equity Issues Remain

    ERIC Educational Resources Information Center

    Ragland, Joyce C.; Hatcher, Denise L.; Thomas, Jerald A., Jr.

    2005-01-01

    Gender roles in North American education remain a pertinent and dynamic source of discourse. Many questions concerning gender bias remain. This study attempts to characterize a nine-year period of college students' recall of episodes of gender bias from their pre-college experiences. The survey instrument used in this research consisted of a nine…

  10. 49 CFR 845.51 - Investigation to remain open.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 7 2012-10-01 2012-10-01 false Investigation to remain open. 845.51 Section 845.51 Transportation Other Regulations Relating to Transportation (Continued) NATIONAL TRANSPORTATION... § 845.51 Investigation to remain open. Accident investigations are never officially closed but are...

  11. Addressing viral resistance through vaccines

    PubMed Central

    Laughlin, Catherine; Schleif, Amanda; Heilman, Carole A

    2015-01-01

    Antimicrobial resistance is a serious healthcare concern affecting millions of people around the world. Antiviral resistance has been viewed as a lesser threat than antibiotic resistance, but it is important to consider approaches to address this growing issue. While vaccination is a logical strategy, and has been shown to be successful many times over, next generation viral vaccines with a specific goal of curbing antiviral resistance will need to clear several hurdles including vaccine design, evaluation and implementation. This article suggests that a new model of vaccination may need to be considered: rather than focusing on public health, this model would primarily target sectors of the population who are at high risk for complications from certain infections. PMID:26604979

  12. Addressing failures in exascale computing

    SciTech Connect

    Snir, Marc; Wisniewski, Robert W.; Abraham, Jacob A.; Adve, Sarita; Bagchi, Saurabh; Balaji, Pavan; Belak, Jim; Bose, Pradip; Cappello, Franck; Carlson, William; Chien, Andrew A.; Coteus, Paul; Debardeleben, Nathan A.; Diniz, Pedro; Engelmann, Christian; Erez, Mattan; Saverio, Fazzari; Geist, Al; Gupta, Rinku; Johnson, Fred; Krishnamoorthy, Sriram; Leyffer, Sven; Liberty, Dean; Mitra, Subhasish; Munson, Todd; Schreiber, Robert; Stearly, Jon; Van Hensbergen, Eric

    2014-05-01

    We present here a report produced by a workshop on “Addressing Failures in Exascale Computing” held in Park City, Utah, August 4–11, 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system; discuss existing knowledge on resilience across the various hardware and software layers of an exascale system; and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia; and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  13. Light addressable photoelectrochemical cyanide sensor

    SciTech Connect

    Licht, S.; Myung, N.; Sun, Y.

    1996-03-15

    A sensor is demonstrated that is capable of spatial discrimination of cyanide with use of only a single stationary sensing element. Different spatial regions of the sensing element are light activated to reveal the solution cyanide concentration only at the point of illumination. In this light addressable photoelectrochemical (LAP) sensor the sensing element consists of an n-CdSe electrode immersed in solution, with the open-circuit potential determined under illumination. In alkaline ferro-ferri-cyanide solution, the open-circuit photopotential is highly responsive to cyanide, with a linear response of (120 mV) log [KCN]. LAP detection with a spatial resolution of ±1 mm for cyanide detection is demonstrated. The response is almost linear for 0.001-0.100 m cyanide with a resolution of 5 mV. 38 refs., 7 figs., 1 tab.

  14. Addressing Failures in Exascale Computing

    SciTech Connect

    Snir, Marc; Wisniewski, Robert; Abraham, Jacob; Adve, Sarita; Bagchi, Saurabh; Balaji, Pavan; Belak, J.; Bose, Pradip; Cappello, Franck; Carlson, Bill; Chien, Andrew; Coteus, Paul; DeBardeleben, Nathan; Diniz, Pedro; Engelmann, Christian; Erez, Mattan; Fazzari, Saverio; Geist, Al; Gupta, Rinku; Johnson, Fred; Krishnamoorthy, Sriram; Leyffer, Sven; Liberty, Dean; Mitra, Subhasish; Munson, Todd; Schreiber, Rob; Stearley, Jon; Van Hensbergen, Eric

    2014-01-01

    We present here a report produced by a workshop on 'Addressing failures in exascale computing' held in Park City, Utah, 4-11 August 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system, discuss existing knowledge on resilience across the various hardware and software layers of an exascale system, and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia, and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  15. Uncertainty in Integrated Assessment Scenarios

    SciTech Connect

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in the autonomous energy efficiency improvement (AEEI). The observed variability then provides a basis for constructing probability distributions for these drivers. The estimated variance of growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean
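
    The statistical construction can be sketched as fitting a multivariate normal to historical growth rates and sampling from it; the data below are hypothetical placeholders for the study's country panel.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical historical annual labor-productivity growth rates
      # (fractions); rows are years, columns are countries.
      hist = np.array([
          [0.021, 0.034, 0.015],
          [0.018, 0.029, 0.012],
          [0.025, 0.041, 0.019],
          [0.012, 0.022, 0.008],
          [0.019, 0.031, 0.014],
      ])

      mu  = hist.mean(axis=0)           # expected growth per country
      cov = np.cov(hist, rowvar=False)  # variability and cross-country correlation

      # Sample correlated future growth rates; expert judgment could rescale
      # cov if future variability is expected to differ from the past.
      draws = rng.multivariate_normal(mu, cov, size=10_000)
      print("sampled std per country:", draws.std(axis=0).round(4))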

  16. Investments in energy technological change under uncertainty

    NASA Astrophysics Data System (ADS)

    Shittu, Ekundayo

    2009-12-01

    This dissertation addresses the crucial problem of how environmental policy uncertainty influences investments in energy technological change. The rising level of carbon emissions due to increasing global energy consumption calls for policy shift. In order to stem the negative consequences on the climate, policymakers are concerned with carving an optimal regulation that will encourage technology investments. However, decision makers are facing uncertainties surrounding future environmental policy. The first part considers the treatment of technological change in theoretical models. This part has two purposes: (1) to show--through illustrative examples--that technological change can lead to quite different, and surprising, impacts on the marginal costs of pollution abatement. We demonstrate an intriguing and uncommon result that technological change can increase the marginal costs of pollution abatement over some range of abatement; (2) to show the impact, on policy, of this uncommon observation. We find that under the assumption of technical change that can increase the marginal cost of pollution abatement over some range, the ranking of policy instruments is affected. The second part builds on the first by considering the impact of uncertainty in the carbon tax on investments in a portfolio of technologies. We determine the response of energy R&D investments as the carbon tax increases both in terms of overall and technology-specific investments. We determine the impact of risk in the carbon tax on the portfolio. We find that the response of the optimal investment in a portfolio of technologies to an increasing carbon tax depends on the relative costs of the programs and the elasticity of substitution between fossil and non-fossil energy inputs. In the third part, we zoom-in on the portfolio model above to consider how uncertainty in the magnitude and timing of a carbon tax influences investments. Under a two-stage continuous-time optimal control model, we

  17. Robustness and uncertainties in global water scarcity projections

    NASA Astrophysics Data System (ADS)

    Floerke, Martina; Eisner, Stephanie; Hanasaki, Naota; Wada, Yoshihide

    2014-05-01

    Water scarcity is both a natural and a human-made phenomenon, defined as the condition in which there are insufficient water resources to satisfy long-term average requirements. Many regions of the world are affected by this chronic imbalance between renewable water resources and water demand, leading to depletion of surface water and groundwater stocks. Total freshwater abstraction today amounts to 3856 km³, of which 70% is withdrawn by the agricultural sector, followed by the industrial (19%) and domestic (11%) sectors (FAO 2010). Population growth and changes in consumption have led to a threefold increase in total water withdrawals in the last 60 years through a rising demand for electricity and for industrial and agricultural products, and thus for water (Flörke et al. 2013). The newly developed "Shared Socio-Economic Pathways" (SSPs) project global population to increase to 7.2 or even 14 billion people by 2100 (O'Neill et al. 2012), and meeting future water demand in sufficient quantity and quality is seen as one of the key challenges of the 21st century. So far, assessments of regional and global water-scarcity patterns have mostly focused on climate change impacts by driving global hydrological models with climate projections from different GCMs, while little emphasis has been put on the water demand side. Changes in future water scarcity, however, are found to be mainly driven by changes in water withdrawals (Alcamo et al. 2007, Hanasaki et al. 2012), i.e. sensitivity to climate change outweighs exposure. Likewise, uncertainties have mainly been assessed in relation to the spread among climate scenarios and among global hydrological models (GHMs) (Haddeland et al. 2011, 2013; Schewe et al. 2013, Wada et al. 2013), while the contribution of water-use modelling to total uncertainty remains largely unstudied. The main objective of this study is to address the main uncertainties related to both climate and socio-economic impacts on global and regional water scarcity

  18. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-01-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropic form, involving entropic quantities. Both forms are inequalities between pairwise observables, and extending them to incorporate multiple observables has proved nontrivial. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for variances of observables are quantum-state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space. PMID:26234197
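
    For reference, the two standard forms the abstract contrasts are the Robertson variance relation and the Maassen-Uffink entropic relation; these are textbook statements, not the authors' new state-independent bound.

      % Operator (variance) form: Robertson's uncertainty relation
      \Delta A \,\Delta B \;\ge\; \tfrac{1}{2}\bigl|\langle [A,B] \rangle\bigr|

      % Entropic form: Maassen--Uffink relation, with c the largest
      % overlap between the eigenbases of A and B
      H(A) + H(B) \;\ge\; -2\log c,
      \qquad c = \max_{i,j} \bigl|\langle a_i | b_j \rangle\bigr|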

  19. Reformulating the Quantum Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-01

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropic form, involving entropic quantities. Both forms are inequalities between pairwise observables, and extending them to incorporate multiple observables has proved nontrivial. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for variances of observables are quantum-state independent and immune from the “triviality” problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.

  20. Quantifying Uncertainty in the Net Hydrologic Flux of Calcium at the Hubbard Brook Experimental Forest, New Hampshire, USA

    NASA Astrophysics Data System (ADS)

    Campbell, J. L.; Yanai, R. D.; Green, M.; Likens, G. E.; Buso, D. C.; See, C.; Barr, B.

    2013-12-01

    Small watersheds are hydrologically distinct ecological units that integrate chemical, physical and biological processes. The basic premise of the small watershed approach is that the flux of chemical elements in and out of watersheds can be used to evaluate nutrient gains or losses. In paired watershed studies, following a pre-treatment calibration period, a treated watershed is compared with a reference watershed enabling evaluation of the treatment on nutrient flux and cycling. This approach has provided invaluable insight into how ecosystems function and respond to both natural and human disturbances. Despite the great advances that have been made using this approach, the method is often criticized because the treatments are usually not replicated. The reason for this lack of replication is that it is often difficult to identify suitable replicate watersheds and is expensive due to the large scale of these studies. In cases where replication is not possible, traditional statistical approaches cannot be applied. Uncertainty analysis can help address this issue because it enables reporting of statistical confidence even when replicates are not used. However, estimating uncertainty can be challenging because it is difficult to identify and quantify sources of uncertainty, there are many different possible approaches, and the methods can be computationally challenging. In this study, we used uncertainty analysis to evaluate changes in the net hydrologic flux (inputs in precipitation minus outputs in stream water) of calcium following a whole-tree harvest at the Hubbard Brook Experimental Forest in New Hampshire, USA. In the year following the harvest, there was a large net loss of calcium (20 kg/ha/yr) in the treated watershed compared to the reference (5 kg/ha/yr). Net losses in the treated watershed have declined over the 26 years after the harvest, but still remain elevated compared to the reference. We used uncertainty analysis to evaluate whether the

  1. Elucidating Dicke superradiance by quantum uncertainty

    NASA Astrophysics Data System (ADS)

    dos Santos, Eduardo M.; Duzzioni, Eduardo I.

    2016-08-01

    Recently it was shown by Wolfe and Yelin [E. Wolfe and S. F. Yelin, Phys. Rev. Lett. 112, 140402 (2014), 10.1103/PhysRevLett.112.140402] that in the idealized Dicke model of superradiance there is no entanglement among any partitions of the system during the total evolution time of the system. This result immediately raises the question of whether other measures from quantum information theory can explain the characteristic release of energy in a short time interval. In this work we identify uncertainty of purely quantum origin as the property responsible for Dicke superradiance. The quantum uncertainty in the population of each emitter of the sample, captured by the Wigner-Yanase skew information (WYSI), is proportional to the correlation radiation rate, the part of the total radiated power that comes from dipole correlations and is responsible for releasing a great intensity of radiation energy in a short time. We also show that the correlation measure called local quantum uncertainty, which is the minimization of the WYSI over all local observables, presents a double sudden change induced by the environment. The time window between these two sudden changes defines the interval in which symmetric global observables of the system behave classically for N → ∞, although the emitters remain strongly quantum correlated.
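
    For reference, the Wigner-Yanase skew information invoked here has the standard closed form below (the textbook definition, not a result of this paper):

      % Wigner--Yanase skew information of state rho with respect to observable K
      I(\rho, K) \;=\; -\tfrac{1}{2}\,\mathrm{Tr}\!\left(\bigl[\sqrt{\rho},\, K\bigr]^{2}\right)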

  2. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, the quantum amplification limit refers to the inevitable addition of noise on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable, and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition under which probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  3. Detectability and Interpretational Uncertainties: Considerations in Gauging the Impacts of Land Disturbance on Streamflow

    EPA Science Inventory

    Hydrologic impacts of land disturbance and management can be confounded by rainfall variability. As a consequence, attempts to gauge and quantify these effects through streamflow monitoring are typically subject to uncertainties. This paper addresses the quantification and deline...

  4. The Democratic Imperative to Address Sexual Equality Rights in Schools

    ERIC Educational Resources Information Center

    Gereluk, Dianne

    2013-01-01

    Issues of sexual orientation elicit ethical debates in schools and society. In jurisdictions where a legal right has not yet been established, one argument commonly rests on whether schools ought to address issues of same-sex relationships and marriage on the basis of civil equality, or whether such controversial issues ought to remain in the…

  5. Back to the future: The Grassroots of Hydrological Uncertainty

    NASA Astrophysics Data System (ADS)

    Smith, K. A.

    2013-12-01

    Uncertainties are widespread within hydrological science, and as society is looking to models to provide answers as to how climate change may affect our future water resources, the performance of hydrological models should be evaluated. With uncertainties being introduced from input data, parameterisation, model structure, validation data, and 'unknown unknowns' it is easy to be pessimistic about model outputs. But uncertainties are an opportunity for scientific endeavour, not a threat. Investigation and suitable presentation of uncertainties, which results in a range of potential outcomes, provides more insight into model projections than just one answer. This paper aims to demonstrate the feasibility of conducting computationally demanding parameter uncertainty estimation experiments on global hydrological models (GHMs). Presently, individual GHMs tend to present their one, best projection, but this leads to spurious precision - a false impression of certainty - which can be misleading to decision makers. Whilst uncertainty estimation is firmly established in catchment hydrology, GHM uncertainty, and parameter uncertainty in particular, has remained largely overlooked. Model inter-comparison studies that investigate model structure uncertainty have been undertaken (e.g. ISI-MIP, EU-WATCH etc.), but these studies seem premature when the uncertainties within each individual model itself have not yet been considered. This study takes a few steps back, going down to one of the first introductions of assumptions in model development, the assignment of model parameter values. Making use of the University of Nottingham's High Performance Computer Cluster (HPC), the Mac-PDM.09 GHM has been subjected to rigorous uncertainty experiments. The Generalised Likelihood Uncertainty Estimation method (GLUE) with Latin Hypercube Sampling has been applied to a GHM for the first time, to produce 100,000 simultaneous parameter perturbations. The results of this ensemble of 100
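
    A minimal sketch of the GLUE-with-Latin-hypercube recipe, using a two-parameter toy model and an informal sum-of-squares likelihood in place of a real GHM run (ranges, threshold, and data are all hypothetical):

      import numpy as np

      rng = np.random.default_rng(1)
      n, k = 100_000, 2                  # samples, parameters

      # Latin hypercube on [0,1]^k: one stratified, shuffled column per parameter
      strata = np.tile(np.arange(n), (k, 1))
      u = (rng.permuted(strata, axis=1).T + rng.random((n, k))) / n
      lo, hi = np.array([0.1, 0.0]), np.array([0.9, 5.0])  # hypothetical ranges
      params = lo + u * (hi - lo)

      forcing = np.array([20.0, 55.0, 38.0])  # hypothetical forcing series
      obs = np.array([12.0, 30.0, 21.0])      # hypothetical observed runoff

      def model(p):
          """Two-parameter stand-in for a global hydrological model run."""
          return p[:, :1] * forcing + p[:, 1:2]

      # GLUE: informal likelihood; retain 'behavioural' sets above a threshold
      sse = ((model(params) - obs) ** 2).sum(axis=1)
      like = 1.0 - sse / sse.max()
      behavioural = like > np.quantile(like, 0.95)
      print(f"{behavioural.sum()} behavioural parameter sets out of {n}")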

  6. Sensitivity and uncertainty analysis for the annual P loss estimator (APLE) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  7. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...

  8. Estimating the magnitude of prediction uncertainties for field-scale P loss models

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, an uncertainty analysis for the Annual P Loss Estima...

  9. Sensitivity and uncertainty analysis for a field-scale P loss model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  10. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems we are working on a three part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data and assessment of sites using EPA and state protocols.

  11. Quantification of Uncertainty in Full-Waveform Moment Tensor Inversion for Regional Seismicity

    NASA Astrophysics Data System (ADS)

    Jian, P.; Hung, S.; Tseng, T.

    2013-12-01

    Routinely and instantaneously determined moment tensor solutions deliver basic information for investigating the faulting nature of earthquakes and regional tectonic structure. The accuracy of full-waveform moment tensor inversion mostly relies on azimuthal coverage of stations, data quality, and previously known earth structure (i.e., impulse responses or Green's functions). However, intrinsically imperfect station distribution, noise-contaminated waveform records, and uncertain earth structure can often result in large deviations of the retrieved source parameters from the true ones, which prohibits the use of routinely reported earthquake catalogs for further structural and tectonic inferences. Duputel et al. (2012) first systematically addressed the significance of statistical uncertainty estimation in earthquake source inversion and exemplified that the data covariance matrix, if prescribed properly to account for data dependence and uncertainty due to incomplete and erroneous data and hypocenter mislocation, can not only be mapped onto the uncertainty estimate of the resulting source parameters, but also aids in obtaining more stable and reliable results. Over the past decade, BATS (Broadband Array in Taiwan for Seismology) has been steadily devoted to building up a database of good-quality centroid moment tensor (CMT) solutions for moderate to large magnitude earthquakes that occurred in the Taiwan area. Because of the lack of uncertainty quantification and reliability analysis, it remains controversial to use the reported CMT catalog directly for further investigation of regional tectonics, near-source strong ground motions, and seismic hazard assessment. In this study, we develop a statistical procedure to make quantitative and reliable estimates of uncertainty in regional full-waveform CMT inversion. The linearized inversion scheme adapting efficient estimation of the covariance matrices associated with oversampled noisy waveform data and errors of biased centroid
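
    The role of the data covariance can be sketched for a linearized inversion d = Gm + noise; the matrix sizes and noise levels below are hypothetical placeholders for the real Green's-function machinery.

      import numpy as np

      rng = np.random.default_rng(7)
      n_data, n_param = 200, 6                # hypothetical dimensions

      G = rng.normal(size=(n_data, n_param))  # stand-in Green's-function matrix
      m_true = rng.normal(size=n_param)       # "true" moment-tensor components
      sd = rng.uniform(0.5, 2.0, n_data)      # per-datum noise std deviations
      d = G @ m_true + rng.normal(0.0, sd)    # noisy waveform data

      # Generalized least squares: the data covariance Cd = diag(sd^2) both
      # weights the fit and maps onto the posterior covariance of the parameters.
      Cd_inv = np.diag(1.0 / sd**2)
      Cm = np.linalg.inv(G.T @ Cd_inv @ G)    # parameter covariance estimate
      m_hat = Cm @ G.T @ Cd_inv @ d

      print("recovered - true:", np.round(m_hat - m_true, 3))
      print("1-sigma uncertainties:", np.round(np.sqrt(np.diag(Cm)), 3))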

  12. Reducing Uncertainty in the Seismic Design Basis for the Waste Treatment Plant, Hanford, Washington

    SciTech Connect

    Brouns, T.M.; Rohay, A.C.; Reidel, S.P.; Gardner, M.G.

    2007-07-01

    The seismic design basis for the Waste Treatment Plant (WTP) at the Department of Energy's (DOE) Hanford Site near Richland was re-evaluated in 2005, resulting in an increase by up to 40% in the seismic design basis. The original seismic design basis for the WTP was established in 1999 based on a probabilistic seismic hazard analysis completed in 1996. The 2005 analysis was performed to address questions raised by the Defense Nuclear Facilities Safety Board (DNFSB) about the assumptions used in developing the original seismic criteria and adequacy of the site geotechnical surveys. The updated seismic response analysis used existing and newly acquired seismic velocity data, statistical analysis, expert elicitation, and ground motion simulation to develop interim design ground motion response spectra which enveloped the remaining uncertainties. The uncertainties in these response spectra were enveloped at approximately the 84th percentile to produce conservative design spectra, which contributed significantly to the increase in the seismic design basis. A key uncertainty identified in the 2005 analysis was the velocity contrasts between the basalt flows and sedimentary interbeds below the WTP. The velocity structure of the upper four basalt flows (Saddle Mountains Basalt) and the inter-layered sedimentary interbeds (Ellensburg Formation) produces strong reductions in modeled earthquake ground motions propagating through them. Uncertainty in the strength of velocity contrasts between these basalts and interbeds primarily resulted from an absence of measured shear wave velocities (Vs) in the interbeds. For this study, Vs in the interbeds was estimated from older, limited compressional wave velocity (Vp) data using estimated ranges for the ratio of the two velocities (Vp/Vs) based on analogues in similar materials. A range of possible Vs for the interbeds and basalts was used and produced additional uncertainty in the resulting response spectra. Because of the
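
    The interbed shear-velocity estimate reduces to dividing Vp by an assumed Vp/Vs range, with the spread in the ratio carrying through directly as uncertainty in Vs; the numbers below are hypothetical, not the study's.

      # Hypothetical interbed compressional velocities (m/s) and an assumed
      # range of Vp/Vs ratios taken from analogue materials.
      vp_interbed = [1900.0, 2100.0, 2300.0]
      vp_vs_low, vp_vs_high = 1.8, 2.4

      for vp in vp_interbed:
          vs_high, vs_low = vp / vp_vs_low, vp / vp_vs_high
          print(f"Vp {vp:.0f} m/s -> Vs {vs_low:.0f}-{vs_high:.0f} m/s")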

  13. Lack of data drives uncertainty in PCB health risk assessments.

    PubMed

    Cogliano, Vincent James

    2016-02-01

    Health risk assessments generally involve many extrapolations: for example, from animals to humans or from high doses to lower doses. Health risk assessments for PCBs involve all the usual uncertainties, plus additional uncertainties due to the nature of PCBs as a dynamic, complex mixture. Environmental processes alter PCB mixtures after release into the environment, so that people are exposed to mixtures that might not resemble the mixtures where there are toxicity data. This paper discusses the evolution of understanding in assessments of the cancer and noncancer effects of PCBs. It identifies where a lack of data in the past contributed to significant uncertainty and where new data subsequently altered the prevailing understanding of the toxicity of PCB mixtures, either qualitatively or quantitatively. Finally, the paper identifies some uncertainties remaining for current PCB health assessments, particularly those that result from a lack of data on exposure through nursing or on effects from inhalation of PCBs. PMID:26347413

  14. Treatment of uncertainties associated with PRAs in risk-informed decision making (NUREG-1855).

    SciTech Connect

    Wheeler, Timothy A.

    2010-06-01

    This document provides guidance on how to treat uncertainties associated with probabilistic risk assessment (PRA) in risk-informed decisionmaking. The objectives of this guidance include fostering an understanding of the uncertainties associated with PRA and their impact on the results of PRA and providing a pragmatic approach to addressing these uncertainties in the context of the decisionmaking. In implementing risk-informed decisionmaking, the U.S. Nuclear Regulatory Commission expects that appropriate consideration of uncertainty will be given in the analyses used to support the decision and in the interpretation of the findings of those analyses. To meet the objective of this document, it is necessary to understand the role that PRA results play in the context of the decision process. To define this context, this document provides an overview of the risk-informed decisionmaking process itself. With the context defined, this document describes the characteristics of a risk model and, in particular, a PRA. This description includes recognition that a PRA, being a probabilistic model, characterizes aleatory uncertainty that results from randomness associated with the events of the model. Because the focus of this document is epistemic uncertainty (i.e., uncertainties in the formulation of the PRA model), it provides guidance on identifying and describing the different types of sources of epistemic uncertainty and the different ways that they are treated. The different types of epistemic uncertainty are parameter, model, and completeness uncertainties. The final part of the guidance addresses the uncertainty in PRA results in the context of risk-informed decisionmaking and, in particular, the interpretation of the results of the uncertainty analysis when comparing PRA results with the acceptance criteria established for a specified application. In addition, guidance is provided for addressing completeness uncertainty in risk-informed decision making. Such

  15. Imprecise probabilistic estimation of design floods with epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Qi, Wei; Zhang, Chi; Fu, Guangtao; Zhou, Huicheng

    2016-06-01

    An imprecise probabilistic framework for design flood estimation is proposed on the basis of the Dempster-Shafer theory to handle different epistemic uncertainties from data, probability distribution functions, and probability distribution parameters. These uncertainties are incorporated in cost-benefit analysis to generate the lower and upper bounds of the total cost for flood control, thus presenting improved information for decision making on design floods. Within the total cost bounds, a new robustness criterion is proposed to select a design flood that can tolerate higher levels of uncertainty. A variance decomposition approach is used to quantify individual and interactive impacts of the uncertainty sources on total cost. Results from three case studies, with 127, 104, and 54 year flood data sets, respectively, show that the imprecise probabilistic approach effectively combines aleatory and epistemic uncertainties from the various sources and provides upper and lower bounds of the total cost. A clear trade-off between the total cost and the robustness of design floods is identified, going beyond the information that the conventional minimum-cost criterion can provide. The interactions among data, distributions, and parameters contribute much more to the estimate of the total cost than parameters alone. It is found that the contributions of the various uncertainty sources and their interactions vary with flood magnitude, but remain roughly the same across return periods. This study demonstrates that the proposed methodology can effectively incorporate epistemic uncertainties in cost-benefit analysis of design floods.
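
    One way to picture the lower/upper-bound idea: evaluate the same cost model under each epistemic alternative and take the envelope. In the sketch below, hypothetical Gumbel parameter sets stand in for the Dempster-Shafer focal elements and a toy damage function replaces the real cost-benefit model.

      import numpy as np

      rng = np.random.default_rng(9)

      # Hypothetical epistemic alternatives: candidate flood models differing
      # in Gumbel location/scale (stand-ins for Dempster-Shafer focal sets).
      candidates = [(1200.0, 250.0), (1250.0, 300.0), (1150.0, 280.0)]

      def total_cost(design_q, mu, beta, n=200_000):
          q = rng.gumbel(mu, beta, n)                      # annual flood peaks
          damage = np.clip(q - design_q, 0.0, None) * 0.5  # toy damage function
          return 0.02 * design_q + damage.mean()           # construction + losses

      design_q = 2000.0
      costs = [total_cost(design_q, mu, beta) for mu, beta in candidates]
      print(f"total cost bounds: [{min(costs):.1f}, {max(costs):.1f}]")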

  16. Uncertainty in Regional Air Quality Modeling

    NASA Astrophysics Data System (ADS)

    Digar, Antara

    Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process to inform pollution control strategy remains uncertain. Traditionally deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because models are uncertain, they yield a false sense of precision that pollutant response to emission controls is perfectly known and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis explores beyond the conventional practice of deterministic attainment demonstration and presents novel approaches to yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally-efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor

  17. Sources of Uncertainty in Climate Change Projections of Precipitation

    NASA Astrophysics Data System (ADS)

    Gutmann, Ethan; Clark, Martyn; Eidhammer, Trude; Ikeda, Kyoko; Deser, Clara; Brekke, Levi; Arnold, Jeffrey; Rasmussen, Roy

    2016-04-01

    Predicting the likely changes in precipitation due to anthropogenic climate influences is one of the most important problems in earth science today. This problem is complicated by the enormous uncertainty in current predictions. Until all such sources of uncertainty are adequately addressed and quantified, we cannot know which changes may be predictable and which are masked by the internal variability of the climate system itself. Here we assess multiple sources of uncertainty, including those due to internal variability, climate model selection, emissions scenario, regional climate model physics, and statistical downscaling methods. This work focuses on the Colorado Rocky Mountains because these mountains serve as the water towers for much of the western United States, but the results are more broadly applicable, and results will be presented covering the Columbia River Basin and the California Sierra Nevada as well. Internal variability is assessed using 30 members of the CESM Large Ensemble. Uncertainty due to the choice of climate models is assessed using 100 climate projections from the CMIP5 archive, including multiple emissions scenarios. Uncertainty due to regional climate model physics is assessed using a limited set of high-resolution Weather Research and Forecasting (WRF) model simulations in comparison to a larger multi-physics ensemble using the Intermediate Complexity Atmospheric Research (ICAR) model. Finally, statistical downscaling uncertainty is assessed using multiple statistical downscaling models. In near-term projections (25-35 years) internal variability is the largest source of uncertainty; however, over longer time scales (70-80 years) other sources of uncertainty become more important, with the importance of different sources of uncertainty varying depending on the metric assessed.

  18. Quantifying Uncertainty in Velocity Models using Bayesian Methods

    NASA Astrophysics Data System (ADS)

    Hobbs, R.; Caiado, C.; Majdański, M.

    2008-12-01

    Quantifying uncertainty in models derived from observed data is a major issue. Public and political understanding of uncertainty is poor, and for industry, inadequate assessment of risk costs money. In this talk we will examine the geological structure of the subsurface; however, our principal exploration tool, controlled source seismology, gives its data in time. Inversion tools exist to map these data into a depth model, but a full exploration of the uncertainty of the model is rarely done because robust strategies do not exist for large non-linear complex systems. There are two principal sources of uncertainty: the first comes from the input data, which is noisy and bandlimited; the second, and more sinister, is from the model parameterisation and forward algorithms themselves, which approximate the physics to make the problem tractable. To address these issues we propose a Bayesian approach. One philosophy is to estimate the uncertainty in a possible model derived using standard inversion tools. During the inversion stage we can use our geological prejudice to derive an acceptable model. Then we use a local random walk based on the Metropolis-Hastings algorithm to explore the model space immediately around a possible solution. For models with a limited number of parameters we can use the forward modeling step from the inversion code. However, as the number of parameters increases and/or the cost of the forward modeling step becomes significant, we need to use fast emulators to act as proxies so a sufficient number of iterations can be performed on which to base our statistical measures of uncertainty. In this presentation we show examples of uncertainty estimation using both pre- and post-critical seismic data. In particular, we will demonstrate uncertainty introduced by the approximation of the physics by using a tomographic inversion of bandlimited data and show that uncertainty increases as the central frequency of the data decreases. This is consistent with the
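
    A minimal sketch of the local random-walk idea, assuming a Gaussian data likelihood and a toy two-parameter forward model in place of the seismic forward-modeling step:

      import numpy as np

      rng = np.random.default_rng(3)

      def misfit(m, d_obs):
          """Stand-in for the forward-modeling step of the inversion code."""
          d_pred = np.array([m[0] + m[1], m[0] * m[1]])  # hypothetical physics
          return np.sum((d_pred - d_obs) ** 2)

      d_obs = np.array([3.0, 2.0])
      m = np.array([1.2, 1.9])       # model from a standard inversion
      sigma2, step = 0.05, 0.05      # noise variance and proposal width
      samples = []

      for _ in range(20_000):        # local random walk around the solution
          m_prop = m + rng.normal(0.0, step, size=m.size)
          log_alpha = (misfit(m, d_obs) - misfit(m_prop, d_obs)) / (2 * sigma2)
          if np.log(rng.random()) < log_alpha:
              m = m_prop
          samples.append(m.copy())

      print("posterior std:", np.array(samples)[5000:].std(axis=0))  # drop burn-in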

  19. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed, and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
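
    One rough way to estimate ln(GSD) from such replicate data is to pool log-residuals about each case's geometric mean; the numbers below are hypothetical, and this sketch ignores the counting-statistics component that the paper's method accounts for separately.

      import numpy as np

      # Hypothetical replicate 24-h urine results for a few cases; within a
      # case the true excretion is taken as roughly constant.
      cases = [
          [0.82, 1.10, 0.95],
          [2.40, 1.95, 2.10, 2.60],
          [0.40, 0.52, 0.47],
      ]

      # Pool log-residuals about each case's geometric mean; their sample
      # standard deviation estimates ln(GSD) of the normalization uncertainty.
      resid = []
      for c in cases:
          logs = np.log(c)
          resid.extend(logs - logs.mean())
      ln_gsd = np.std(resid, ddof=len(cases))  # lose one dof per case mean
      print(f"ln(GSD) ~ {ln_gsd:.2f}")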

  20. Addressing neurological disorders with neuromodulation.

    PubMed

    Oluigbo, Chima O; Rezai, Ali R

    2011-07-01

    Neurological disorders are becoming increasingly common in developed countries as a result of the aging population. In spite of medications, these disorders can result in progressive loss of function as well as chronic physical, cognitive, and emotional disability that ultimately places an enormous emotional and economic burden on the patient, caretakers, and society in general. Neuromodulation is emerging as a therapeutic option for these patients. Neuromodulation involves implantable devices that allow the reversible, adjustable application of electrical, chemical, or biological agents to the central or peripheral nervous system, with the objective of altering its functioning to achieve a therapeutic or clinically beneficial effect. It is a rapidly evolving field that brings together many different specialties in the fields of medicine, materials science, computer science and technology, biomedical and neural engineering, as well as the surgical or interventional specialties. It has multiple current and emerging indications and an enormous potential for growth. The main challenge before it is the need for effective collaboration between engineers, basic scientists, and clinicians to develop innovations that address specific problems, resulting in new devices and clinical applications. PMID:21193369

  1. Gender: addressing a critical focus.

    PubMed

    Thornton, L; Wegner, M N

    1995-01-01

    The definition of gender was addressed at the Fourth World Conference on Women (Beijing, China). After extensive debate, the definition developed by the UN Population Fund in 1995 was adopted: "a set of qualities and behaviors expected from a female or male by society." The sustainability of family planning (FP) programs depends on acknowledgment of the role gender plays in contraceptive decision-making and use. For example, programs must consider the fact that women in many cultures do not make FP decisions without the consent of their spouse. AVSC is examining providers' gender-based ideas about clients and the effects of these views on the quality of reproductive health services. Questions such as how service providers can encourage joint responsibility for contraception without requiring spousal consent or how they can make men feel comfortable about using a male method in a society where FP is considered a woman's issue are being discussed. Also relevant is how service providers can discuss sexual matters openly with female clients in cultures that do not allow women to enjoy their sexuality. Another concern is the potential for physical violence to a client as a result of the provision of FP services. PMID:12294397

  2. For Medicare's New Approach To Physician Payment, Big Questions Remain.

    PubMed

    Wynne, Billy

    2016-09-01

    The Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) established a new framework for Medicare physician payment. Designed to stabilize uncertain payment rates for Medicare's fee-for-service (FFS) system and incentivize physicians to move into new alternative payment systems, MACRA contains several uncertainties of its own. In a textbook illustration of why it's important to be careful what you wish for, it's increasingly easy to predict that implementation of MACRA will be delayed as a result of both regulatory and legislative breaches of its statutory timeline. This article traces the contemporary history of the Medicare physician payment system and efforts to implement additional changes. PMID:27605645

  3. UncertWeb: chaining web services accounting for uncertainty

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Jones, Richard; Bastin, Lucy; Williams, Matthew; Pebesma, Edzer; Nativi, Stefano

    2010-05-01

    The development of interoperable services that permit access to data and processes, typically using web service based standards opens up the possibility for increasingly complex chains of data and processes, which might be discovered and composed in increasingly automatic ways. This concept, sometimes referred to as the "Model Web", offers the promise of integrated (Earth) system models, with pluggable web service based components which can be discovered, composed and evaluated dynamically. A significant issue with such service chains, indeed in any composite model composed of coupled components, is that in all interesting (non-linear) cases the effect of uncertainties on inputs, or components within the chain will have complex, potentially unexpected effects on the outputs. Within the FP7 UncertWeb project we will be developing a mechanism and an accompanying set of tools to enable rigorous uncertainty management in web based service chains involving both data and processes. The project will exploit and extend the UncertML candidate standard to flexibly propagate uncertainty through service chains, including looking at mechanisms to develop uncertainty enabled profiles of existing Open Geospatial Consortium services. To facilitate the use of such services we will develop tools to address the definition of the input uncertainties (elicitation), manage the uncertainty propagation (emulation), undertake uncertainty and sensitivity analysis and visualise the output uncertainty. In this talk we will outline the challenges of the UncertWeb project, illustrating this with a prototype service chain we have created for correcting station level pressure to sea-level pressure, which accounts for the various uncertainties involved. In particular we will discuss some of the challenges of chaining Open Geospatial Consortium services using the Business Process Execution Language. We will also address the issue of computational cost and communication bandwidth requirements for
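
    The station-to-sea-level pressure chain mentioned above illustrates the pattern: each link consumes and emits samples (or an UncertML description of a distribution) so that uncertainty survives the chain. A minimal Monte Carlo sketch, with hypothetical inputs and a standard barometric reduction formula that is only assumed to resemble the prototype's:

      import numpy as np

      rng = np.random.default_rng(5)
      n = 50_000

      # Hypothetical uncertain inputs: measured station pressure (hPa),
      # station elevation (m), and air temperature (degrees C).
      p_st = rng.normal(951.3, 0.4, n)
      h    = rng.normal(520.0, 5.0, n)
      t_c  = rng.normal(12.0, 1.5, n)

      # One link in the service chain: a common barometric reduction formula.
      # Samples in, samples out, so the next service receives a full
      # uncertainty description rather than a single number.
      p_sl = p_st * (1.0 - 0.0065 * h / (t_c + 0.0065 * h + 273.15)) ** -5.257
      print(f"sea-level pressure: {p_sl.mean():.1f} +/- {p_sl.std():.1f} hPa")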

  4. Detail of roofline with view of remaining cupola in background; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of roofline with view of remaining cupola in background; camera facing southwest. - Mare Island Naval Shipyard, Old Administrative Offices, Eighth Street, north side between Railroad Avenue & Walnut Avenue, Vallejo, Solano County, CA

  5. 53. INTERIOR VIEW LOOKING NORTH NORTHEAST SHOWING THE REMAINS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    53. INTERIOR VIEW LOOKING NORTH NORTHEAST SHOWING THE REMAINS OF A WOODEN SETTLING BOX IN THE BACKGROUND RIGHT. AMALGAMATING PANS IN THE FOREGROUND. - Standard Gold Mill, East of Bodie Creek, Northeast of Bodie, Bodie, Mono County, CA

  6. 7. REMAINS OF PLANK WALL WITHIN CANAL CONSTRUCTED TO PROTECT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. REMAINS OF PLANK WALL WITHIN CANAL CONSTRUCTED TO PROTECT OUTSIDE CANAL BANK, LOOKING SOUTHWEST. NOTE CROSS SUPPORT POLES EXTENDING TO HILLSIDE. - Snake River Ditch, Headgate on north bank of Snake River, Dillon, Summit County, CO

  7. 6. REMAINS OF PLANK WALL NAILED TO POSTS WITHIN CANAL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. REMAINS OF PLANK WALL NAILED TO POSTS WITHIN CANAL CONSTRUCTED TO PROTECT OUTSIDE CANAL BANK. VIEW IS TO THE WEST. - Snake River Ditch, Headgate on north bank of Snake River, Dillon, Summit County, CO

  8. 25. CAFETERIA Note remains of tile floor in foreground. Food ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. CAFETERIA Note remains of tile floor in foreground. Food cooked on the stove was served to workers in the eating area to the left of the counter (off picture). - Hovden Cannery, 886 Cannery Row, Monterey, Monterey County, CA

  9. The taphonomy of human remains in a glacial environment.

    PubMed

    Pilloud, Marin A; Megyesi, Mary S; Truffer, Martin; Congram, Derek

    2016-04-01

    A glacial environment is a unique setting that can alter human remains in characteristic ways. This study describes glacial dynamics and how glaciers can be understood as taphonomic agents. Using a case study of human remains recovered from Colony Glacier, Alaska, a glacial taphonomic signature is outlined that includes: (1) movement of remains, (2) dispersal of remains, (3) altered bone margins, (4) splitting of skeletal elements, and (5) extensive soft tissue preservation and adipocere formation. As global glacier area is declining in the current climate, there is the potential for more materials of archaeological and medicolegal significance to be exposed. It is therefore important for the forensic anthropologist to have an idea of the taphonomy in this setting and to be able to differentiate glacial effects from other taphonomic agents. PMID:26917542

  10. 11. DOUBLE CURVED RACK. UPPER PORTION ROTATES; LOWER PORTION REMAINS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. DOUBLE CURVED RACK. UPPER PORTION ROTATES; LOWER PORTION REMAINS STATIONARY. DISCARDED ROLLER NEAR CENTER OF FRAME. - Chicago, Milwaukee & St. Paul Railway, Bridge No. Z-6, Spanning North Branch of Chicago River, South of Cortland Street, Chicago, Cook County, IL

  11. View of Feature 1, the remains of an administration building, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Feature 1, the remains of an administration building, view to the southwest - Orphan Lode Mine, North of West Rim Road between Powell Point and Maricopa Point, South Rim, Grand Canyon Village, Coconino County, AZ

  12. View of Feature 1, the remains of an administration building, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Feature 1, the remains of an administration building, view to the west-northwest - Orphan Lode Mine, North of West Rim Road between Powell Point and Maricopa Point, South Rim, Grand Canyon Village, Coconino County, AZ

  13. View of Feature 1, the remains of an administration building, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Feature 1, the remains of an administration building, view to the south - Orphan Lode Mine, North of West Rim Road between Powell Point and Maricopa Point, South Rim, Grand Canyon Village, Coconino County, AZ

  14. View of remains of Feature 17, a cottage, view to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of remains of Feature 17, a cottage, view to the northwest - Orphan Lode Mine, North of West Rim Road between Powell Point and Maricopa Point, South Rim, Grand Canyon Village, Coconino County, AZ

  15. View of Feature 1, the remains of an administration building, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Feature 1, the remains of an administration building, view to the north - Orphan Lode Mine, North of West Rim Road between Powell Point and Maricopa Point, South Rim, Grand Canyon Village, Coconino County, AZ

  16. View of the remains of Feature 19, a cottage, view ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of the remains of Feature 19, a cottage, view to the west-northwest - Orphan Lode Mine, North of West Rim Road between Powell Point and Maricopa Point, South Rim, Grand Canyon Village, Coconino County, AZ

  17. View of Feature 3, the remains of an administration building, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Feature 3, the remains of an administration building, view to the south - Orphan Lode Mine, North of West Rim Road between Powell Point and Maricopa Point, South Rim, Grand Canyon Village, Coconino County, AZ

  18. 7. Detail view: east side of north end, showing remains ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. Detail view: east side of north end, showing remains of Fort San Antonio - Puente Guillermo Esteves, Spanning San Antonio Channel at PR-25 (Juan Ponce de Leon Avenue), San Juan, San Juan Municipio, PR

  19. Cellar: Detail of paired relieving arch and remains of herringbone ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Cellar: Detail of paired relieving arch and remains of herringbone brick pattern from earlier cooking fireplace at back, southeast wall looking southeast - Kingston-Upon-Hill, Kitts Hummock Road, Dover, Kent County, DE

  20. 4. Band Wheel and Walking Beam Mechanism, Including Remains of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Band Wheel and Walking Beam Mechanism, Including Remains of Frame Belt House, Looking Southeast - David Renfrew Oil Rig, East side of Connoquenessing Creek, 0.4 mile North of confluence with Thorn Creek, Renfrew, Butler County, PA

  1. 32. Interior view, encased fireplace and remains of the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    32. Interior view, encased fireplace and remains of the hearth against the north wall, with scale (note: hole punched through plaster allows access to the flues) - Kiskiack, Naval Mine Depot, State Route 238 vicinity, Yorktown, York County, VA

  2. 3. VIEW OF POWER PLANT LOOKING SOUTH INTO THE REMAINS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VIEW OF POWER PLANT LOOKING SOUTH INTO THE REMAINS OF THE TURBINE FLUMES. - Potomac Power Plant, On West Virginia Shore of Potomac River, about 1 mile upriver from confluence with Shenandoah River, Harpers Ferry, Jefferson County, WV

  3. View of submerged remains of Read Sawmill, with floor boards ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill, with floor boards removed, showing cross beams, foundation sill and mortises, and horizontal wall boards. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  4. View of submerged remains of Read Sawmill, showing floor boards, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill, showing floor boards, wall boards, tenoned uprights and mortised sill beams. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  5. View of submerged remains of Read Sawmill with most floorboards ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill with most floorboards removed, showing cross beams with mortises, vertical wall boards, and horizontal floor boards. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  6. 11. Remains of Douglas-fir cordwood abandoned when kilns ceased operation, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. Remains of Douglas-fir cordwood abandoned when kilns ceased operation, looking northeast. - Warren King Charcoal Kilns, 5 miles west of Idaho Highway 28, Targhee National Forest, Leadore, Lemhi County, ID

  7. View of submerged remains of Read Sawmill, showing floor boards, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of submerged remains of Read Sawmill, showing floor boards, cross beams and notches for wall post beams. - Silas C. Read Sawmill, Outlet of Maxwell Lake near North Range Road, Fort Gordon, Richmond County, GA

  8. 13. View South, showing the remaining pier footings for the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. View South, showing the remaining pier footings for the steam engine water tower for the Chesapeake and Ohio Railroad. - Cotton Hill Station Bridge, Spanning New River at State Route 16, Cotton Hill, Fayette County, WV

  9. 1. VIEW SHOWING REMAINS OF CAMOUFLAGE COVERING CONCRETE FOOTING FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    1. VIEW SHOWING REMAINS OF CAMOUFLAGE COVERING CONCRETE FOOTING FOR A GENERATOR PAD - Fort Cronkhite, Anti-Aircraft Battery No. 1, Concrete Footing-Generator Pad, Wolf Road, Sausalito, Marin County, CA

  10. 13. REMAINING TOP PART OF SOUTH ELEVATION, HAMMER BUILDING, SINCE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. REMAINING TOP PART OF SOUTH ELEVATION, HAMMER BUILDING, SINCE JOINED TO BUILDING 6. - Hughes Aircraft Company, Assembly & Manufacturing Building, 6775 Centinela Avenue, Los Angeles, Los Angeles County, CA

  11. 11. LOOKING SOUTH AT THE ONLY REMAINING PART OF THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. LOOKING SOUTH AT THE ONLY REMAINING PART OF THE NORTH SIDE OF ORIGINAL LAB, FROM COURTYARD. - U.S. Geological Survey, Rock Magnetics Laboratory, 345 Middlefield Road, Menlo Park, San Mateo County, CA

  12. 7. VIEW OF VESSEL FROM PORT BON, SHOWING REMAINS OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. VIEW OF VESSEL FROM PORT BON, SHOWING REMAINS OF MAIN CABIN. AFT CABIN STILL STANDS ON STERN IN BACKGROUND - Motorized Sailing Vessel "Fox", Beached on East Bank of Bayou Lafourche, Larose, Lafourche Parish, LA

  13. 6. VIEW SOUTHWEST, COOLING TROUGH REMAINS Imperial Carbon Black ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. VIEW SOUTHWEST, COOLING TROUGH REMAINS - Imperial Carbon Black Plant (Ruin), North side of North Fork of Hughes River along Bunnell Run Road just over 0.5 mile from its intersection with State Route 16, Harrisville, Ritchie County, WV

  14. 3. VIEW NORTH, COOLING TANK REMAINS Imperial Carbon Black ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. VIEW NORTH, COOLING TANK REMAINS - Imperial Carbon Black Plant (Ruin), North side of North Fork of Hughes River along Bunnell Run Road just over 0.5 mile from its intersection with State Route 16, Harrisville, Ritchie County, WV

  15. Credible Computations: Standard and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)

    1995-01-01

    The discipline of computational fluid dynamics (CFD) is at a crossroads. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is consistency and uniformity in this process and in the interpretation of its outcome. The key element for establishing credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products - computer codes and computed results - expects the following: a computer code, in terms of its logic, numerics, and fluid dynamics, and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled by verification and validation of these requirements. The verification process assesses whether the problem is solved correctly and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and the results of this analysis highlight which computed quantities and integrated quantities in computations need to be determined accurately and which quantities do not require such attention. Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties
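
    The sensitivity-uncertainty analysis described above can be sketched as finite-difference sensitivities followed by first-order propagation; the model function, nominal inputs, and input uncertainties below are hypothetical stand-ins for a real CFD quantity of interest.

        import numpy as np

        def model(x):
            # Hypothetical stand-in for a computed quantity of interest,
            # e.g. a drag coefficient returned by a CFD code.
            mach, alpha = x
            return 0.02 + 0.005 * mach**2 + 0.001 * alpha**2

        x0 = np.array([0.8, 2.0])    # nominal inputs (Mach, angle of attack)
        u_x = np.array([0.01, 0.1])  # standard uncertainties of the inputs

        # Finite-difference sensitivities dy/dx_i at the nominal point.
        eps = 1e-6
        y0 = model(x0)
        sens = np.array([(model(x0 + eps * e) - y0) / eps for e in np.eye(len(x0))])

        # First-order (linear) uncertainty propagation.
        u_y = np.sqrt(np.sum((sens * u_x) ** 2))
        print(f"y = {y0:.5f} +/- {u_y:.5f}")

    The sensitivities also reveal which inputs dominate the output uncertainty, which is the screening role the abstract assigns to sensitivity analysis.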

  16. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    SciTech Connect

    Li, J.; McNelis, D.; Yim, M.S.

    2013-07-01

    This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four different fuel cycle options were compared in the analysis, including the once-through cycle (OT), the DUPIC cycle, the MOX cycle and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches with and without the time value of money consideration. The relative ratios of FCC in comparison to OT did not change much across the different modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges of unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominating contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC. But depending on the uranium price, the FR cycle was found to have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
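
    A minimal sketch of the parameter-uncertainty side of such a comparison: sample unit costs from assumed ranges, compute a simplified levelized FCC, and estimate each input's share of the output variance. The unit-cost ranges and mass-flow coefficients are illustrative assumptions, not the paper's data.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 50_000

        # Illustrative unit-cost ranges (not the paper's data).
        u_price = rng.uniform(50, 250, N)   # uranium [$ / kg U]
        enrich = rng.uniform(80, 160, N)    # enrichment [$ / SWU]
        fabric = rng.uniform(200, 400, N)   # fabrication [$ / kg HM]

        # Rough once-through cost aggregation per MWh(e) (illustrative weights).
        fcc = 0.02 * u_price + 0.013 * enrich + 0.003 * fabric

        # Variance share of each independent input (squared correlation).
        for name, x in [("uranium", u_price), ("enrichment", enrich),
                        ("fabrication", fabric)]:
            print(f"{name:12s} variance share ~ {np.corrcoef(x, fcc)[0, 1] ** 2:.2f}")

    With these assumed ranges the uranium price dominates the output variance, mirroring the sensitivity result reported in the abstract.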

  17. 52. VIEW OF REMAINS OF ORIGINAL 1907 CONTROL PANEL, LOCATED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    52. VIEW OF REMAINS OF ORIGINAL 1907 CONTROL PANEL, LOCATED ON NORTH WALL OF EAST END OF CONTROL ROOM. PORTIONS OF THIS PANEL REMAINED IN USE UNTIL THE PLANT CLOSED. THE METERS AND CONTROLS ARE MOUNTED ON SOAPSTONE PANELS. THE INSTRUMENT IN THE LEFT CENTER OF THE PHOTOGRAPH IS A TIRRILL VOLTAGE REGULATOR. - New York, New Haven & Hartford Railroad, Cos Cob Power Plant, Sound Shore Drive, Greenwich, Fairfield County, CT

  18. Headspace constituents of the tree remain of Cinnamomum camphora.

    PubMed

    Miyazawa, M; Hashimoto, Y; Taniguchi, Y; Kubota, K

    2001-01-01

    The volatile ingredients isolated from a fresh tree of Cinnamomum camphora (camphor tree) and from a tree remain of C. camphora were collected by using headspace techniques and analyzed by means of gas chromatography/mass spectrometry (GC/MS). Of the constituents, 99.77% (23 components) from the fresh tree and 98.68% (24 components) from the tree remain were identified. Of these ingredients, camphor was obtained as the most abundant component. PMID:11547425

  19. Managing uncertainty in family practice.

    PubMed Central

    Biehn, J.

    1982-01-01

    Because patients present in the early stages of undifferentiated problems, the family physician often faces uncertainty, especially in diagnosis and management. The physician's uncertainty may be unacceptable to the patient and may lead to inappropriate use of diagnostic procedures. The problem is intensified by the physician's hospital training, which emphasizes mastery of available knowledge and decision-making based on certainty. Strategies by which a physician may manage uncertainty include (a) a more open doctor-patient relationship, (b) understanding the patient's reason for attending the office, (c) a thorough assessment of the problem, (d) a commitment to reassessment and (e) appropriate consultation. PMID:7074488

  20. Quantum Cryptography Without Quantum Uncertainties

    NASA Astrophysics Data System (ADS)

    Durt, Thomas

    2002-06-01

    Quantum cryptography aims at transmitting a random key in such a way that the presence of a spy eavesdropping on the communication would be revealed by disturbances in the transmission of the message. In standard quantum cryptography, this unavoidable disturbance is a consequence of the uncertainty principle of Heisenberg. We propose in this paper to replace quantum uncertainties by generalised, technological uncertainties, and discuss the realisability of such an idea. The proposed protocol can be considered as a simplification, but also as a generalisation, of the standard quantum cryptographic protocols.

  1. Assessing uncertainty in physical constants

    NASA Astrophysics Data System (ADS)

    Henrion, Max; Fischhoff, Baruch

    1986-09-01

    Assessing the uncertainty due to possible systematic errors in a physical measurement unavoidably involves an element of subjective judgment. Examination of historical measurements and recommended values for the fundamental physical constants shows that the reported uncertainties have a consistent bias towards underestimating the actual errors. These findings are comparable to findings of persistent overconfidence in psychological research on the assessment of subjective probability distributions. Awareness of these biases could help in interpreting the precision of measurements, as well as provide a basis for improving the assessment of uncertainty in measurements.

  2. A non-destructive method for dating human remains

    USGS Publications Warehouse

    Lail, Warren K.; Sammeth, David; Mahan, Shannon; Nevins, Jason

    2013-01-01

    The skeletal remains of several Native Americans were recovered in an eroded state from a creek bank in northeastern New Mexico. Subsequently stored in a nearby museum, the remains became lost for almost 36 years. In a recent effort to repatriate the remains, it was necessary to fit them into a cultural chronology in order to determine the appropriate tribe(s) for consultation pursuant to the Native American Grave Protection and Repatriation Act (NAGPRA). Because the remains were found in an eroded context with no artifacts or funerary objects, their age was unknown. Having been asked to avoid destructive dating methods such as radiocarbon dating, the authors used Optically Stimulated Luminescence (OSL) to date the sediments embedded in the cranium. The OSL analyses yielded reliable dates between A.D. 1415 and A.D. 1495. Accordingly, we conclude that the remains were interred somewhat earlier than A.D. 1415, but no later than A.D. 1495. We believe the remains are from individuals ancestral to the Ute Mouache Band, which is now being contacted for repatriation efforts. Not only do our methods contribute to the immediate repatriation efforts, they provide archaeologists with a versatile, non-destructive, numerical dating method that can be used in many burial contexts.

  3. On the formulation of a minimal uncertainty model for robust control with structured uncertainty

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Chang, B.-C.; Fischl, Robert

    1991-01-01

    In the design and analysis of robust control systems for uncertain plants, representing the system transfer matrix in the form of what has come to be termed an M-delta model has become widely accepted and applied in the robust control literature. The M represents a transfer function matrix M(s) of the nominal closed loop system, and the delta represents an uncertainty matrix acting on M(s). The nominal closed loop system M(s) results from closing the feedback control system, K(s), around a nominal plant interconnection structure P(s). The uncertainty can arise from various sources, such as structured uncertainty from parameter variations or unstructured uncertainties from unmodeled dynamics and other neglected phenomena. In general, delta is a block diagonal matrix, but for real parameter variations delta is a diagonal matrix of real elements. Conceptually, the M-delta structure can always be formed for any linear interconnection of inputs, outputs, transfer functions, parameter variations, and perturbations. However, very little of the currently available literature addresses computational methods for obtaining this structure, and none of this literature addresses a general methodology for obtaining a minimal M-delta model for a wide class of uncertainty, where the term minimal refers to the dimension of the delta matrix. Since having a minimally dimensioned delta matrix would improve the efficiency of structured singular value (or multivariable stability margin) computations, a method of obtaining a minimal M-delta would be useful. Hence, a method of obtaining the interconnection system P(s) is required. A generalized procedure for obtaining a minimal P-delta structure for systems with real parameter variations is presented. Using this model, the minimal M-delta model can then be easily obtained by closing the feedback loop. The procedure involves representing the system in a cascade-form state-space realization, determining the minimal uncertainty matrix
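
    For reference, the structure the abstract describes is the standard linear fractional one: the nominal closed loop is the lower LFT of the interconnection P(s) with the controller K(s), and the uncertainty re-enters through an upper LFT with a block-diagonal Delta,

        \[
          M(s) = F_{\ell}\bigl(P(s),\,K(s)\bigr), \qquad
          \Delta = \operatorname{diag}\bigl(\delta_1 I_{r_1},\,\dots,\,\delta_m I_{r_m}\bigr),
        \]
        \[
          F_{u}\bigl(M(s),\,\Delta\bigr)
          = M_{22}(s) + M_{21}(s)\,\Delta\,\bigl(I - M_{11}(s)\,\Delta\bigr)^{-1} M_{12}(s).
        \]

    A "minimal" M-delta model is then one that minimizes the total dimension r_1 + ... + r_m of Delta, which is what keeps structured singular value computations cheap.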

  4. Preliminary assessment of the impact of conceptual model uncertainty on site performance

    SciTech Connect

    Gallegos, D.P.; Pohl, P.I.; Olague, N.E.; Knowlton, R.G.; Updegraff, C.D.

    1990-10-01

    The US Department of Energy is responsible for the design, construction, operation, and decommissioning of a site for the deep geologic disposal of high-level radioactive waste (HLW). This involves site characterization and the use of performance assessment to demonstrate compliance with regulations for HLW disposal from the US Environmental Protection Agency (EPA) and the US Nuclear Regulatory Commission. The EPA standard states that a performance assessment should consider the associated uncertainties involved in estimating cumulative release of radionuclides to the accessible environment. To date, the majority of the efforts in uncertainty analysis have been directed toward data and parameter uncertainty, whereas little effort has been made to treat model uncertainty. Model uncertainty includes conceptual model uncertainty, mathematical model uncertainty, and any uncertainties derived from implementing the mathematical model in a computer code. Currently there is no systematic approach that is designed to address the uncertainty in conceptual models. The purpose of this investigation is to take a first step at addressing conceptual model uncertainty. This will be accomplished by assessing the relative impact of alternative conceptual models on the integrated release of radionuclides to the accessible environment for an HLW repository site located in unsaturated, fractured tuff. 4 refs., 2 figs.

  5. Portfolios as Evidence of Reflective Practice: What Remains "Untold"

    ERIC Educational Resources Information Center

    Orland-Barak, Lily

    2005-01-01

    Addressing recent calls for investigating the specific quality of reflection associated with the uses of portfolios in teacher education, this paper describes and interprets the "practice of portfolio construction" as revealed in the construction and presentation of two kinds of portfolio in two in-service courses for mentors of teachers in…

  6. Non-scalar uncertainty: Uncertainty in dynamic systems

    NASA Technical Reports Server (NTRS)

    Martinez, Salvador Gutierrez

    1992-01-01

    The following point is stated throughout the paper: dynamic systems are usually subject to uncertainty, be it the unavoidable quantum uncertainty that arises when working at sufficiently small scales, uncertainty deliberately allowed by the researcher at large scales in order to simplify the problem, or uncertainty introduced by nonlinear interactions. Even though non-quantum uncertainty can generally be dealt with by using the ordinary probability formalisms, it can also be studied with the proposed non-scalar formalism. Thus, non-scalar uncertainty is a more general theoretical framework giving insight into the nature of uncertainty and providing a practical tool in those cases in which scalar uncertainty is not enough, such as when studying highly nonlinear dynamic systems. This paper's specific contribution is the general concept of non-scalar uncertainty and a first proposal for a methodology. Applications should be based upon this methodology. The advantage of this approach is to provide simpler mathematical models for prediction of the system states. Present conventional tools for dealing with uncertainty prove insufficient for an effective description of some dynamic systems. The main limitations are overcome by abandoning ordinary scalar algebra in the real interval (0, 1) in favor of a tensor field with a much richer structure and generality. This approach gives insight into the interpretation of Quantum Mechanics and will have its most profound consequences in the fields of elementary particle physics and nonlinear dynamic systems. Concepts like 'interfering alternatives' and 'discrete states' have an elegant explanation in this framework in terms of properties of dynamic systems such as strange attractors and chaos. The tensor formalism proves especially useful to describe the mechanics of representing dynamic systems with models that are closer to reality and have relatively much simpler solutions. It was found to be wise to get an approximate solution to an

  7. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. PMID:26695995

  8. Significant predictors of patients' uncertainty in primary brain tumors.

    PubMed

    Lin, Lin; Chien, Lung-Chang; Acquaye, Alvina A; Vera-Bolanos, Elizabeth; Gilbert, Mark R; Armstrong, Terri S

    2015-05-01

    Patients with primary brain tumors (PBT) face uncertainty related to prognosis, symptoms and treatment response and toxicity. Uncertainty is correlated to negative mood states and symptom severity and interference. This study identified predictors of uncertainty during different treatment stages (newly-diagnosed, on treatment, followed-up without active treatment). One hundred eighty-six patients with PBT were accrued at various points in the illness trajectory. Data collection tools included a clinical checklist, a demographic data sheet, and the Mishel Uncertainty in Illness Scale-Brain Tumor Form. The structured additive regression model was used to identify significant demographic and clinical predictors of illness-related uncertainty. Participants were primarily white (80 %) males (53 %). They ranged in age from 19-80 (mean = 44.2 ± 12.6). Thirty-two of the 186 patients were newly-diagnosed, 64 were on treatment at the time of clinical visit with MRI evaluation, 21 were without MRI, and 69 were not on active treatment. Three subscales (ambiguity/inconsistency; unpredictability-disease prognoses; unpredictability-symptoms and other triggers) were different amongst the treatment groups (P < .01). However, patients' uncertainty during active treatment was as high as in the newly-diagnosed period. Other than treatment stage, change of employment status due to the illness was the most significant predictor of illness-related uncertainty. The illness trajectory of PBT remains ambiguous, complex, and unpredictable, leading to a high incidence of uncertainty. There was variation in the subscales of uncertainty depending on treatment status. Although patients who are newly diagnosed reported the highest scores on most of the subscales, patients on treatment felt more uncertain about unpredictability of symptoms than other groups. Due to the complexity and impact of the disease, associated symptoms, and interference with functional status, comprehensive assessment of patients

  9. An address geocoding solution for Chinese cities

    NASA Astrophysics Data System (ADS)

    Zhang, Xuehu; Ma, Haoming; Li, Qi

    2006-10-01

    We introduce the challenges of address geocoding for Chinese cities and present a potential solution, along with a prototype system, that deals with these challenges by combining and extending current geocoding solutions developed for the United States and Japan. The proposed solution starts by separating city addresses into "standard" addresses, which meet a predefined address model, and non-standard ones. The standard addresses are stored in a structured relational database in their normalized forms, while a selected portion of the non-standard addresses are stored as aliases to the standard addresses. An in-memory address index is then constructed from the address database and serves as the basis for real-time address matching. Test results were obtained from two trials conducted in the city of Beijing. On average, an 80% match rate was achieved. Possible improvements to the current design are also discussed.
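
    A minimal sketch of the described architecture: normalized "standard" addresses in a lookup table, selected non-standard forms kept as aliases, and matching done against an in-memory index. All names, addresses, coordinates, and the normalization rule are hypothetical.

        # Hypothetical standard-address index and alias table.
        standard = {
            "beijing/haidian/zhongguancun st/27": (39.984, 116.318),
        }
        aliases = {
            "zhongguancun street no. 27": "beijing/haidian/zhongguancun st/27",
        }

        def normalize(addr: str) -> str:
            # A real system would parse city/district/street/number with a
            # grammar for Chinese addresses; lowercase-and-strip stands in here.
            return addr.strip().lower()

        def geocode(addr: str):
            key = normalize(addr)
            key = aliases.get(key, key)  # map a known alias to its standard form
            return standard.get(key)     # None if no match

        print(geocode("Zhongguancun Street No. 27"))  # -> (39.984, 116.318)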

  10. Uncertainty-induced quantum nonlocality

    NASA Astrophysics Data System (ADS)

    Wu, Shao-xiong; Zhang, Jun; Yu, Chang-shui; Song, He-shan

    2014-01-01

    Based on the skew information, we present a quantity, uncertainty-induced quantum nonlocality (UIN), to measure quantum correlation. It can be considered an updated version of the original measurement-induced nonlocality (MIN), preserving its good computability while eliminating the non-contractivity problem. For 2×d-dimensional states, it is shown that UIN can be given in a closed form. In addition, we also investigate the maximal uncertainty-induced nonlocality.
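
    For reference, the Wigner-Yanase skew information on which the measure is built, and the schematic form of UIN as a maximization over observables on one party (the paper specifies the precise admissible class of local observables), are

        \[
          I(\rho, K) = -\tfrac{1}{2}\,\operatorname{Tr}\bigl([\sqrt{\rho},\,K]^{2}\bigr),
          \qquad
          \mathcal{U}(\rho^{AB}) = \max_{K^{A}} \; I\bigl(\rho^{AB},\,K^{A}\otimes I^{B}\bigr).
        \]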

  11. Dynamical Realism and Uncertainty Propagation

    NASA Astrophysics Data System (ADS)

    Park, Inkwan

    In recent years, Space Situational Awareness (SSA) has become increasingly important as the number of tracked Resident Space Objects (RSOs) continues their growth. One of the most significant technical discussions in SSA is how to propagate state uncertainty in a consistent way with the highly nonlinear dynamical environment. In order to keep pace with this situation, various methods have been proposed to propagate uncertainty accurately by capturing the nonlinearity of the dynamical system. We notice that all of the methods commonly focus on a way to describe the dynamical system as precisely as possible based on a mathematical perspective. This study proposes a new perspective based on understanding dynamics of the evolution of uncertainty itself. We expect that profound insights of the dynamical system could present the possibility to develop a new method for accurate uncertainty propagation. These approaches are naturally concluded in goals of the study. At first, we investigate the most dominant factors in the evolution of uncertainty to realize the dynamical system more rigorously. Second, we aim at developing the new method based on the first investigation enabling orbit uncertainty propagation efficiently while maintaining accuracy. We eliminate the short-period variations from the dynamical system, called a simplified dynamical system (SDS), to investigate the most dominant factors. In order to achieve this goal, the Lie transformation method is introduced since this transformation can define the solutions for each variation separately. From the first investigation, we conclude that the secular variations, including the long-period variations, are dominant for the propagation of uncertainty, i.e., short-period variations are negligible. Then, we develop the new method by combining the SDS and the higher-order nonlinear expansion method, called state transition tensors (STTs). The new method retains advantages of the SDS and the STTs and propagates

  12. Natural Uncertainty Measure for Forecasting Floods in Ungauged Basins

    NASA Astrophysics Data System (ADS)

    Mantilla, Ricardo; Krajewski, Witold F.; Gupta, Vijay K.; Ayalew, Tibebu B.

    2015-04-01

    Recent data analyses have shown that peak flows for individual Rainfall-Runoff (RF-RO) events exhibit power law scaling with respect to drainage area, but the scaling slopes and intercepts change from one event to the next. We test this feature in the 32,400 km2 Iowa River basin, and give supporting evidence for our hypothesis that the scaling slope and intercept incorporate all the pertinent physical processes that produce floods. These developments serve as the foundations for the key question that is addressed here: how to define uncertainty bounds for flood prediction for each event? We theoretically introduce the concept of Natural Uncertainty Measure for peak discharge (NUMPD) and test it using data from the Iowa River basin. We conjecture that NUMPD puts a limit to predictive uncertainty using measurements and modeling. In other words, the best any amount of data collection combined with any model can do is to come close to predicting NUMPD, but it cannot match or reduce it any further. For the applications of flood predictions, the concepts of Type-I and Type-II uncertainties in flood prediction are explained. We demonstrate Type-I uncertainty using the concept of NUMPD. Our results offer a context for Type-II uncertainty. Our results make a unique contribution to the International Association of Hydrologic Sciences (IAHS) decade-long initiative on Predictions in Ungauged Basins (PUB) (2003-2012).
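
    The event-specific power law referred to above can be written with an event-indexed intercept and slope, estimated by log-log regression of observed peak discharges against drainage area,

        \[
          Q_{p}(A) = \alpha_{e}\,A^{\theta_{e}}
          \quad\Longleftrightarrow\quad
          \log Q_{p} = \log\alpha_{e} + \theta_{e}\,\log A ,
        \]

    where e indexes the rainfall-runoff event; the hypothesis is that the pair (alpha_e, theta_e) summarizes the physical processes producing that event's flood.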

  13. Identification of the remains of King Richard III.

    PubMed

    King, Turi E; Fortes, Gloria Gonzalez; Balaresque, Patricia; Thomas, Mark G; Balding, David; Maisano Delser, Pierpaolo; Neumann, Rita; Parson, Walther; Knapp, Michael; Walsh, Susan; Tonasso, Laure; Holt, John; Kayser, Manfred; Appleby, Jo; Forster, Peter; Ekserdjian, David; Hofreiter, Michael; Schürer, Kevin

    2014-01-01

    In 2012, a skeleton was excavated at the presumed site of the Grey Friars friary in Leicester, the last-known resting place of King Richard III. Archaeological, osteological and radiocarbon dating data were consistent with these being his remains. Here we report DNA analyses of both the skeletal remains and living relatives of Richard III. We find a perfect mitochondrial DNA match between the sequence obtained from the remains and one living relative, and a single-base substitution when compared with a second relative. Y-chromosome haplotypes from male-line relatives and the remains do not match, which could be attributed to a false-paternity event occurring in any of the intervening generations. DNA-predicted hair and eye colour are consistent with Richard's appearance in an early portrait. We calculate likelihood ratios for the non-genetic and genetic data separately, and combined, and conclude that the evidence for the remains being those of Richard III is overwhelming. PMID:25463651

  14. Identification of the remains of King Richard III

    PubMed Central

    King, Turi E.; Fortes, Gloria Gonzalez; Balaresque, Patricia; Thomas, Mark G.; Balding, David; Delser, Pierpaolo Maisano; Neumann, Rita; Parson, Walther; Knapp, Michael; Walsh, Susan; Tonasso, Laure; Holt, John; Kayser, Manfred; Appleby, Jo; Forster, Peter; Ekserdjian, David; Hofreiter, Michael; Schürer, Kevin

    2014-01-01

    In 2012, a skeleton was excavated at the presumed site of the Grey Friars friary in Leicester, the last-known resting place of King Richard III. Archaeological, osteological and radiocarbon dating data were consistent with these being his remains. Here we report DNA analyses of both the skeletal remains and living relatives of Richard III. We find a perfect mitochondrial DNA match between the sequence obtained from the remains and one living relative, and a single-base substitution when compared with a second relative. Y-chromosome haplotypes from male-line relatives and the remains do not match, which could be attributed to a false-paternity event occurring in any of the intervening generations. DNA-predicted hair and eye colour are consistent with Richard’s appearance in an early portrait. We calculate likelihood ratios for the non-genetic and genetic data separately, and combined, and conclude that the evidence for the remains being those of Richard III is overwhelming. PMID:25463651

  15. Uncertainty in measurements by counting

    NASA Astrophysics Data System (ADS)

    Bich, Walter; Pennecchi, Francesca

    2012-02-01

    Counting is at the base of many high-level measurements, such as frequency measurements. In some instances the measurand itself is a number of events, such as spontaneous decays in activity measurements, or objects, such as colonies of bacteria in microbiology. Countings also play a fundamental role in everyday life. In any case, a counting is a measurement. A measurement result, according to its present definition, as given in the 'International Vocabulary of Metrology—Basic and general concepts and associated terms (VIM)', must include a specification concerning the estimated uncertainty. For measurements by counting, this specification is not easy to encompass in the well-known framework of the 'Guide to the Expression of Uncertainty in Measurement', known as GUM, in which there is no guidance on the topic. Furthermore, the issue of uncertainty in countings has received little or no attention in the literature, so that it is commonly accepted that this category of measurements constitutes an exception in which the concept of uncertainty is not applicable, or, alternatively, that results of measurements by counting have essentially no uncertainty. In this paper we propose a general model for measurements by counting which allows an uncertainty evaluation compliant with the general framework of the GUM.
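
    As one common illustration (a textbook model, not necessarily the general model the paper proposes), if the counted events are taken as Poisson distributed, an observed count n carries a standard uncertainty equal to its square root,

        \[
          n \sim \operatorname{Poisson}(\lambda), \qquad
          u(n) = \sqrt{n}, \qquad
          \frac{u(n)}{n} = \frac{1}{\sqrt{n}} ,
        \]

    so the relative uncertainty shrinks only as the inverse square root of the count.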

  16. Uncertainty of empirical correlation equations

    NASA Astrophysics Data System (ADS)

    Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.

    2016-08-01

    The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
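
    For a linear (or linearized) fit y ≈ Xβ with input covariance V, the GLS machinery the abstract refers to takes the standard form, and uncertainties of derived quantities follow by first-order propagation through the parameter covariance:

        \[
          \hat{\beta} = (X^{\mathsf T} V^{-1} X)^{-1} X^{\mathsf T} V^{-1} y,
          \qquad
          \operatorname{Cov}(\hat{\beta}) = (X^{\mathsf T} V^{-1} X)^{-1},
        \]
        \[
          u^{2}\bigl(f(\hat{\beta})\bigr) \approx g^{\mathsf T}\,\operatorname{Cov}(\hat{\beta})\,g,
          \qquad
          g = \left.\frac{\partial f}{\partial \beta}\right|_{\hat{\beta}} .
        \]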

  17. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  18. Microscopic residues of bone from dissolving human remains in acids.

    PubMed

    Vermeij, Erwin; Zoon, Peter; van Wijk, Mayonne; Gerretsen, Reza

    2015-05-01

    Dissolving bodies is a current method of disposing of human remains and has been practiced throughout the years. During the last decade in the Netherlands, two cases have emerged in which human remains were treated with acid. In the first case, the remains of a cremated body were treated with hydrofluoric acid. In the second case, two complete bodies were dissolved in a mixture of hydrochloric and sulfuric acid. In both cases, a great variety of evidence was collected at the scene of crime, part of which was embedded in resin, polished, and investigated using SEM/EDX. Apart from macroscopic findings like residual bone and artificial teeth, in both cases, distinct microscopic residues of bone were found as follows: (partly) digested bone, thin-walled structures, and recrystallized calcium phosphate. Although some may believe it is possible to dissolve a body in acid completely, at least some of these microscopic residues will always be found. PMID:25677640

  19. Field contamination of skeletonized human remains with exogenous DNA.

    PubMed

    Edson, Suni M; Christensen, Alexander F

    2013-01-01

    The Armed Forces DNA Identification Laboratory reports the mitochondrial DNA (mtDNA) sequences of over 800 skeletal samples a year for the Joint POW/MIA Accounting Command-Central Identification Laboratory. These sequences are generated from degraded skeletal remains that are presumed to belong to U.S. service members missing from past military conflicts. In the laboratory, it is possible to control for contamination of remains; however, in the field, it can be difficult to prevent modern DNA from being transferred to skeletal elements and being carried forward through the analysis process. Four such cases are described here along with the controls in place in the laboratory to eliminate the possibility of the exogenous DNA being reported as authentic. In each case, the controls implemented by the laboratories prevented the false reporting of contaminant exogenous DNA from remains that were either faunal or human, but lacked endogenous DNA. PMID:22994903

  20. OVERVIEW OF REMAINS OF DEWATERING BUILDING, LOOKING SOUTH TOWARD CYANIDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    OVERVIEW OF REMAINS OF DEWATERING BUILDING, LOOKING SOUTH TOWARD CYANIDE PROCESSING AREA. WATER USED IN PROCESSING AT THE STAMP MILL WAS CIRCULATED HERE FOR RECLAMATION. SANDS WERE SETTLED OUT AND DEPOSITED IN ONE OF TWO TAILINGS HOLDING AREAS. CLEARED WATER WAS PUMPED BACK TO THE MILL FOR REUSE. THIS PROCESS WAS ACCOMPLISHED BY THE USE OF SETTLING CONES, EIGHT FEET IN DIAMETER AND SIX FEET HIGH. THE REMAINS OF FOUR CONES ARE AT CENTER, BEHIND THE TANK IN THE FOREGROUND. TO THE LEFT IS THE MAIN ACCESS ROAD BETWEEN THE MILL AND THE PARKING LOT. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  1. Climate change adaptation under uncertainty in the developing world: A case study of sea level rise in Kiribati

    NASA Astrophysics Data System (ADS)

    Donner, S. D.; Webber, S.

    2011-12-01

    Climate change is expected to have the greatest impact in parts of the developing world. At the 2010 meeting of U.N. Framework Convention on Climate Change in Cancun, industrialized countries agreed in principle to provide US$100 billion per year by 2020 to assist the developing world respond to climate change. This "Green Climate Fund" is a critical step towards addressing the challenge of climate change. However, the policy and discourse on supporting adaptation in the developing world remains highly idealized. For example, the efficacy of "no regrets" adaptation efforts or "mainstreaming" adaptation into decision-making are rarely evaluated in the real world. In this presentation, I will discuss the gap between adaptation theory and practice using a multi-year case study of the cultural, social and scientific obstacles to adapting to sea level rise in the Pacific atoll nation of Kiribati. Our field research reveals how scientific and institutional uncertainty can limit international efforts to fund adaptation and lead to spiraling costs. Scientific uncertainty about hyper-local impacts of sea level rise, though irreducible, can at times limit decision-making about adaptation measures, contrary to the notion that "good" decision-making practices can incorporate scientific uncertainty. Efforts to improve institutional capacity must be done carefully, or they risk inadvertently slowing the implementation of adaptation measures and increasing the likelihood of "mal"-adaptation.

  2. Planning ATES systems under uncertainty

    NASA Astrophysics Data System (ADS)

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions

  3. Long-time uncertainty propagation using generalized polynomial chaos and flow map composition

    SciTech Connect

    Luchtenburg, Dirk M.; Brunton, Steven L.; Rowley, Clarence W.

    2014-10-01

    We present an efficient and accurate method for long-time uncertainty propagation in dynamical systems. Uncertain initial conditions and parameters are both addressed. The method approximates the intermediate short-time flow maps by spectral polynomial bases, as in the generalized polynomial chaos (gPC) method, and uses flow map composition to construct the long-time flow map. In contrast to the gPC method, this approach has spectral error convergence for both short and long integration times. The short-time flow map is characterized by small stretching and folding of the associated trajectories and hence can be well represented by a relatively low-degree basis. The composition of these low-degree polynomial bases then accurately describes the uncertainty behavior for long integration times. The key to the method is that the degree of the resulting polynomial approximation increases exponentially in the number of time intervals, while the number of polynomial coefficients either remains constant (for an autonomous system) or increases linearly in the number of time intervals (for a non-autonomous system). The findings are illustrated on several numerical examples including a nonlinear ordinary differential equation (ODE) with an uncertain initial condition, a linear ODE with an uncertain model parameter, and a two-dimensional, non-autonomous double gyre flow.
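
    The core idea (approximate the short-time flow map in a low-degree polynomial basis, then reach long times by composition) can be sketched as follows; the ODE, domain, degree, and step are illustrative choices, and iterating the fitted surrogate over Monte Carlo samples stands in for the paper's composition of spectral expansions.

        import numpy as np
        from numpy.polynomial import chebyshev as C
        from scipy.integrate import solve_ivp

        # Illustrative autonomous ODE (not from the paper): dx/dt = x - x^3.
        f = lambda t, x: x - x**3

        def short_map(x0, dt=0.1):
            # Numerically integrated short-time flow map.
            return solve_ivp(f, (0.0, dt), np.atleast_1d(x0), rtol=1e-10).y[0, -1]

        # Fit a low-degree Chebyshev surrogate of the short-time flow map
        # on a domain covering the uncertain initial condition.
        xs = np.linspace(-1.5, 1.5, 200)
        ys = np.array([short_map(x) for x in xs])
        phi = C.Chebyshev.fit(xs, ys, deg=9)

        # Long-time behavior: compose the short-time surrogate many times.
        rng = np.random.default_rng(0)
        x = rng.normal(0.5, 0.05, 10_000)  # uncertain initial condition
        for _ in range(50):                # 50 steps of 0.1 = 5 time units
            x = phi(x)

        print(f"mean {x.mean():.4f}, std {x.std(ddof=1):.4f}")

    Only the cheap surrogate is evaluated inside the long loop; the expensive integrator is called once per fitting node over a single short interval.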

  4. Structural model uncertainty in stochastic simulation

    SciTech Connect

    McKay, M.D.; Morrison, J.D.

    1997-09-01

    Prediction uncertainty in stochastic simulation models can be described by a hierarchy of components: stochastic variability at the lowest level, input and parameter uncertainty at a higher level, and structural model uncertainty at the top. It is argued that a usual paradigm for analysis of input uncertainty is not suitable for application to structural model uncertainty. An approach more likely to produce an acceptable methodology for analyzing structural model uncertainty is one that uses characteristics specific to the particular family of models.

  5. The relationship between aerosol model uncertainty and radiative forcing uncertainty

    NASA Astrophysics Data System (ADS)

    Carslaw, Ken; Lee, Lindsay; Reddington, Carly

    2016-04-01

    There has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated aerosol-cloud forcing between pre-industrial and present day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the pre-industrial aerosol state. But the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are "equally acceptable" compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty, but this hides a range of very different aerosol models. These multiple so-called "equifinal" model variants predict a wide range of forcings. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.

  6. Addressing Underrepresentation: Physics Teaching for All

    NASA Astrophysics Data System (ADS)

    Rifkin, Moses

    2016-02-01

    Every physics teacher wants to give his or her students the opportunity to learn physics well. Despite these intentions, certain groups of students—including women and underrepresented minorities (URMs)—are not taking and not remaining in physics. In many cases, these disturbing trends are more significant in physics than in any other science. This is a missed opportunity for our discipline because demographic diversity strengthens science. The question is what we can do about these trends in our classrooms, as very few physics teachers have been explicitly prepared to address them. In this article, I will share some steps that I've taken in my classroom that have moved my class in the right direction. In the words of Nobel Prize-winning physicist Carl Wieman and psychologists Lauren Aguilar and Gregory Walton: "By investing a small amount of class time in carefully designed and implemented interventions, physics teachers can promote greater success among students from diverse backgrounds. Ultimately, we hope such efforts will indeed improve the diversity and health of the physics profession."

  7. Teacher Retention: Why Do Beginning Teachers Remain in the Profession?

    ERIC Educational Resources Information Center

    Inman, Duane; Marlow, Leslie

    2004-01-01

    As beginning teachers continue to leave the profession within the first several years of entering, educators must identify factors which cause teachers to remain in the profession, as well as factors related to attrition if the current teacher shortage is to be remedied. The purpose of this study was to examine the reported attitudes of beginning…

  8. 5. View of remaining rock ledge from construction of passage ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. View of remaining rock ledge from construction of passage to enter mill (Riverdale Cotton Mill was built into the side of a hill). Partially subterranean area was popular with employees trying to escape the heat of the mill, now an unofficial smoking area. - Riverdale Cotton Mill, Corner of Middle & Lower Streets, Valley, Chambers County, AL

  9. 11. ENTRY STAIRWELL TO CABLE TUNNEL. REMAINS OF ELECTRICAL DISTRIBUTION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. ENTRY STAIRWELL TO CABLE TUNNEL. REMAINS OF ELECTRICAL DISTRIBUTION STATIONS AT LEFT, TRACKSIDE CAMERA STAND AT FAR RIGHT. Looking northeast toward launch pad. - Edwards Air Force Base, South Base Sled Track, Firing Control Blockhouse, South of Sled Track at east end, Lancaster, Los Angeles County, CA

  10. Remains of abutments for Bridge No. 1575 at MD Rt. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Remains of abutments for Bridge No. 1575 at MD Rt. 51 in Spring Gap, Maryland, looking northeast. (Compare with HAER MD-115 photos taken 1988). - Western Maryland Railway, Cumberland Extension, Pearre to North Branch, from WM milepost 125 to 160, Pearre, Washington County, MD

  11. 3. GENERAL VIEW OF REMAINS OF 40" BLOOMING MILL; THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. GENERAL VIEW OF REMAINS OF 40" BLOOMING MILL; THE ENGINE ROOM CONTAINING THE MESTA-CORLISS STEAM ENGINE, IS LOCATED AT THE FAR END OF THE MILL AS SEEN TO THE FAR RIGHT (THE BUILDING WITH THE SHED ROOF). - Republic Iron & Steel Company, Youngstown Works, Blooming Mill & Blooming Mill Engines, North of Poland Avenue, Youngstown, Mahoning County, OH

  12. Interior of control house showing remains of controller. Moving the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior of control house showing remains of controller. Moving the handle rotated the vertical shaft and porcelain cams to engage various electrical switches and activate the lift mechanism. All electrical components have been removed. - Potomac Edison Company, Chesapeake & Ohio Canal Bridge, Spanning C & O Canal South of U.S. 11, Williamsport, Washington County, MD

  13. 18. A view looking southeast at the remains of the ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. A view looking southeast at the remains of the director's office, his reception room and a portion of the elevator lobby. These two rooms were equipped with their own air conditioners. - John T. Beasley Building, 632 Cherry Street (between Sixth & Seventh Streets), Terre Haute, Vigo County, IN

  14. 6. Remains Beneath Collapsed Engine House Roof, Showing Foundation Timbers ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Remains Beneath Collapsed Engine House Roof, Showing Foundation Timbers and Automobile Engine Connected to Pulley Wheel, Looking Southwest - David Renfrew Oil Rig, East side of Connoquenessing Creek, 0.4 mile North of confluence with Thorn Creek, Renfrew, Butler County, PA

  15. DETAIL VIEW OF FILTER PRESS REMAINS, BOILER, SECONDARY ORE BIN, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF FILTER PRESS REMAINS, BOILER, SECONDARY ORE BIN, TRAM TRESTLE AND WATER TANK, LOOKING NORTHWEST. THIS VIEW IS TAKEN FROM THE THIRD LEVEL OF THE MILL, NEARBY THE BLACKSMITH'S FORGE. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  16. Neanderthal infant and adult infracranial remains from Marillac (Charente, France).

    PubMed

    Garralda, María Dolores; Maureille, Bruno; Vandermeersch, Bernard

    2014-09-01

    At the site of Marillac, near the Ligonne River in Marillac-le-Franc (Charente, France), a remarkable stratigraphic sequence has yielded a wealth of archaeological information, palaeoenvironmental data, as well as faunal and human remains. Marillac must have been a sinkhole used by Neanderthal groups as a hunting camp during MIS 4 (TL date 57,600 ± 4,600 BP), where Quina Mousterian lithics and fragmented bones of reindeer predominate. This article describes three infracranial skeleton fragments. Two of them are from adults and consist of the incomplete shafts of a right radius (Marillac 24) and a left fibula (Marillac 26). The third fragment is the diaphysis of the right femur of an immature individual (Marillac 25), the size and shape of which resemble those from Teshik-Tash and could be assigned to a child of a similar age. The three fossils have been compared with the remains of other Neanderthals or anatomically Modern Humans (AMH). Furthermore, the comparison of the infantile femora, Marillac 25 and Teshik-Tash, with the remains of several European children from the early Middle Ages clearly demonstrates the robustness and rounded shape of both Neanderthal diaphyses. Evidence of peri-mortem manipulations has been identified on all three bones, with spiral fractures, percussion pits and, in the case of the radius and femur, unquestionable cutmarks made with flint implements, probably during defleshing. Traces of periostosis appear on the fibula fragment and on the immature femoral diaphysis, although their aetiology remains unknown. PMID:24919796

  17. 5. VIEW LOOKING NORTHEAST AT UPPER GUIDE WALL REMAINS AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VIEW LOOKING NORTHEAST AT UPPER GUIDE WALL REMAINS AND LAND WALL FROM THE OHIO RIVER. - Ohio Slack Water Dams, Lock & Dam No. 4, East bank of Ohio River at mile point 18.6, along State Route 65, Ambridge, Beaver County, PA

  18. Robotics to Enable Older Adults to Remain Living at Home

    PubMed Central

    Pearce, Alan J.; Adair, Brooke; Ozanne, Elizabeth; Said, Catherine; Santamaria, Nick; Morris, Meg E.

    2012-01-01

    Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effective in enabling independent living in community-dwelling older people? Following database searches for relevant literature, an initial yield of 161 articles was obtained. Titles and abstracts of articles were then reviewed by 2 independent people to determine suitability for inclusion. Forty-two articles met the criteria for question 1. Of these, 4 articles met the criteria for question 2. Results showed that robotics is currently available to assist older healthy people and people with disabilities to remain independent and to monitor their safety and social connectedness. Most studies were conducted in laboratories and hospital clinics. Currently, limited evidence demonstrates that robots can be used to enable people to remain living at home, although this is an emerging smart technology that is rapidly evolving. PMID:23304507

  19. As Year Ends, Questions Remain for New Orleans

    ERIC Educational Resources Information Center

    Maxwell, Lesli A.

    2008-01-01

    In rebuilding public schooling in New Orleans after Hurricane Katrina, education reformers have managed to hire energetic teachers, break ground on a few new school buildings, raise public confidence, and show progress on test scores. But fundamental questions remain as the 2007-08 academic year draws to a close, including how the city's…

  20. Plans and objectives of the remaining Apollo missions.

    NASA Technical Reports Server (NTRS)

    Scherer, L. R.

    1972-01-01

    The three remaining Apollo missions will have significantly increased scientific capabilities. These result from increased payload, more time on the surface, improved range, and more sophisticated experiments on the surface and in orbit. Landing sites for the last three missions will be carefully selected to maximize the total scientific return.

  1. Administrative Climate and Novices' Intent to Remain Teaching

    ERIC Educational Resources Information Center

    Pogodzinski, Ben; Youngs, Peter; Frank, Kenneth A.; Belman, Dale

    2012-01-01

    Using survey data from novice teachers at the elementary and middle school level across 11 districts, multilevel logistic regressions were estimated to examine the association between novices' perceptions of the administrative climate and their desire to remain teaching within their schools. We find that the probability that a novice teacher…

  2. Robotics to enable older adults to remain living at home.

    PubMed

    Pearce, Alan J; Adair, Brooke; Miller, Kimberly; Ozanne, Elizabeth; Said, Catherine; Santamaria, Nick; Morris, Meg E

    2012-01-01

    Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effective in enabling independent living in community-dwelling older people? Following database searches for relevant literature, an initial yield of 161 articles was obtained. Titles and abstracts of articles were then reviewed by 2 independent people to determine suitability for inclusion. Forty-two articles met the criteria for question 1. Of these, 4 articles met the criteria for question 2. Results showed that robotics is currently available to assist older healthy people and people with disabilities to remain independent and to monitor their safety and social connectedness. Most studies were conducted in laboratories and hospital clinics. Currently, limited evidence demonstrates that robots can be used to enable people to remain living at home, although this is an emerging smart technology that is rapidly evolving. PMID:23304507

  3. 15. CYLINDRICAL FISH SCALER Remnants of the wire screen remain, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. CYLINDRICAL FISH SCALER Remnants of the wire screen remain, through which the fish tumbled as the cylinder revolved. Note geared ring around cylinder, and the small drive shaft by which it was driven. - Hovden Cannery, 886 Cannery Row, Monterey, Monterey County, CA

  4. REAR DETAIL OF RIGHT ENGINE AND WING. THRUST REVERSER REMAINS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    REAR DETAIL OF RIGHT ENGINE AND WING. THRUST REVERSER REMAINS OPEN. MECHANICS JONI BAINE (R) AND BILL THEODORE (L) OPEN FLAP CARRIAGE ACCESS WITH AN IMPACT GUN. THEY WILL CHECK TRANSMISSION FLUID AND OIL THE JACK SCREW. AT FAR LEFT UTILITY MECHANICS BEGIN BODY POLISHING. - Greater Buffalo International Airport, Maintenance Hangar, Buffalo, Erie County, NY

  5. Aftermath. The remains of the southwest end of the bridge ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Aftermath. The remains of the southwest end of the bridge lie next to the southwest pier. View is south-southeast from confluence of Trinity and South Fork Trinity Rivers - South Fork Trinity River Bridge, State Highway 299 spanning South Fork Trinity River, Salyer, Trinity County, CA

  6. Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing

    SciTech Connect

    Datta, D.

    2010-10-26

    Hazardous radionuclides are released as pollutants in the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact on the ATAQE of radionuclide releases from any nuclear facility or of hazardous chemicals from any chemical plant. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of the risk assessment. The paper focuses on uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing. Soft computing is adopted because of the lack of information on the parameters that represent the corresponding models. Soft computing in this domain basically means using fuzzy set theory to explore the uncertainty of the model parameters; this type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
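
    As a worked illustration of the triangular-membership approach, the Python sketch below (not the author's code; the fuzzy source term Q and wind speed u are invented) propagates two triangular fuzzy parameters through a toy dispersion relation C = Q/u by interval arithmetic on alpha-cuts.

        # Alpha-cut of a triangular fuzzy number (a, m, b): the interval on
        # which the membership function mu(x) is at least alpha.
        def alpha_cut(a, m, b, alpha):
            return a + alpha * (m - a), b - alpha * (b - m)

        Q = (0.8, 1.0, 1.3)  # fuzzy source term (hypothetical units)
        u = (1.0, 2.0, 3.5)  # fuzzy wind speed (m/s)

        # C = Q/u is increasing in Q and decreasing in u, so the alpha-cut
        # bounds of C follow directly from the input interval endpoints.
        for alpha in (0.0, 0.5, 1.0):
            q_lo, q_hi = alpha_cut(*Q, alpha)
            u_lo, u_hi = alpha_cut(*u, alpha)
            print(f"alpha={alpha:.1f}: C in [{q_lo / u_hi:.3f}, {q_hi / u_lo:.3f}]")

    At alpha = 1 the interval collapses to the modal value; decreasing alpha widens the concentration interval, which is how the epistemic uncertainty of the inputs shows up in the output.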

  7. Uncertainty relations for angular momentum

    NASA Astrophysics Data System (ADS)

    Dammeier, Lars; Schwonnek, René; Werner, Reinhard F.

    2015-09-01

    In this work we study various notions of uncertainty for angular momentum in the spin-s representation of SU(2). We characterize the ‘uncertainty regions’ given by all vectors, whose components are specified by the variances of the three angular momentum components. A basic feature of this set is a lower bound for the sum of the three variances. We give a method for obtaining optimal lower bounds for uncertainty regions for general operator triples, and evaluate these for small s. Further lower bounds are derived by generalizing the technique by which Robertson obtained his state-dependent lower bound. These are optimal for large s, since they are saturated by states taken from the Holstein-Primakoff approximation. We show that, for all s, all variances are consistent with the so-called vector model, i.e., they can also be realized by a classical probability measure on a sphere of radius √(s(s+1)). Entropic uncertainty relations can be discussed similarly, but are minimized by different states than those minimizing the variances for small s. For large s the Maassen-Uffink bound becomes sharp and we explicitly describe the extremalizing states. Measurement uncertainty, as recently discussed by Busch, Lahti and Werner for position and momentum, is introduced and a generalized observable (POVM) which minimizes the worst case measurement uncertainty of all angular momentum components is explicitly determined, along with the minimal uncertainty. The output vectors for the optimal measurement all have the same length r(s), where r(s)/s → 1 as s → ∞.
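
    The lower bound on the sum of the three variances can be checked numerically. The sketch below is a standard construction, not code from the paper: it builds the spin-s matrices from the ladder operator and evaluates Var(Jx) + Var(Jy) + Var(Jz) over random pure states. Since ⟨Jx² + Jy² + Jz²⟩ = s(s+1) for every state, the sum of variances equals s(s+1) − |⟨J⟩|² ≥ s, with equality for spin coherent states.

        import numpy as np

        def spin_matrices(s):
            """Spin-s angular momentum matrices (hbar = 1), basis m = s..-s."""
            m = np.arange(s, -s - 1, -1)
            jp = np.zeros((len(m), len(m)), dtype=complex)  # ladder operator J+
            for i in range(1, len(m)):
                jp[i - 1, i] = np.sqrt(s * (s + 1) - m[i] * (m[i] + 1))
            jx = (jp + jp.conj().T) / 2
            jy = (jp - jp.conj().T) / 2j
            jz = np.diag(m).astype(complex)
            return jx, jy, jz

        def variance_sum(psi, ops):
            psi = psi / np.linalg.norm(psi)
            tot = 0.0
            for J in ops:
                mean = np.real(psi.conj() @ J @ psi)
                tot += np.real(psi.conj() @ J @ J @ psi) - mean**2
            return tot

        s = 2
        ops = spin_matrices(s)
        rng = np.random.default_rng(1)
        dim = int(2 * s + 1)
        worst = min(variance_sum(rng.normal(size=dim) + 1j * rng.normal(size=dim), ops)
                    for _ in range(2000))
        print(f"smallest variance sum found: {worst:.3f} (lower bound s = {s})")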

  8. Decisions on new product development under uncertainties

    NASA Astrophysics Data System (ADS)

    Huang, Yeu-Shiang; Liu, Li-Chen; Ho, Jyh-Wen

    2015-04-01

    In an intensively competitive market, developing a new product has become a valuable strategy for companies to establish their market positions and enhance their competitive advantages. Therefore, it is essential to effectively manage the process of new product development (NPD). However, since various problems may arise in NPD projects, managers should set up some milestones and subsequently construct evaluative mechanisms to assess their feasibility. This paper employed the approach of Bayesian decision analysis to deal with the two crucial uncertainties for NPD, which are the future market share and the responses of competitors. The proposed decision process provides a systematic analytical procedure to determine whether an NPD project should be continued or not, while considering whether effective use is being made of organisational resources. Accordingly, the proposed decision model can assist managers in effectively addressing the NPD issue in a competitive market.
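
    A minimal sketch of the Bayesian-decision idea, with invented numbers (the payoff, cost, and survey data are hypothetical, and the actual paper also models competitor responses, which are omitted here): a Beta prior on attainable market share is updated with milestone evidence, and the continue/stop decision compares expected profit with the value of redeploying resources.

        # Prior belief about attainable market share: Beta(a, b) with mean 0.2.
        a, b = 2.0, 8.0

        # Milestone evidence: 30 of 100 surveyed customers would adopt.
        adopt, n = 30, 100
        a_post, b_post = a + adopt, b + n - adopt
        mean_share = a_post / (a_post + b_post)   # Beta posterior mean

        # Decision rule: continue if expected profit beats the alternative use
        # of the organisation's resources (all figures hypothetical, $M).
        payoff_per_share = 50.0   # profit at 100% market share
        cost_to_complete = 8.0    # remaining NPD cost
        alternative_value = 2.0   # value of redeploying the team

        e_profit = payoff_per_share * mean_share - cost_to_complete
        print(f"posterior mean share: {mean_share:.3f}")
        print("decision:", "continue" if e_profit > alternative_value else "stop")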

  9. Visualizing uncertainty in biological expression data

    NASA Astrophysics Data System (ADS)

    Holzhüter, Clemens; Lex, Alexander; Schmalstieg, Dieter; Schulz, Hans-Jörg; Schumann, Heidrun; Streit, Marc

    2012-01-01

    Expression analysis of ~omics data using microarrays has become a standard procedure in the life sciences. However, microarrays are subject to technical limitations and errors, which render the gathered data uncertain. While a number of approaches exist to target this uncertainty statistically, it is hardly ever shown when the data is visualized using, for example, clustered heatmaps. Yet, this is highly useful when trying not to omit data that is "good enough" for an analysis, which otherwise would be discarded as too unreliable by established conservative thresholds. Our approach addresses this shortcoming by first identifying the margin above the error threshold of uncertain, yet possibly still useful data. It then displays this uncertain data in the context of the valid data by enhancing a clustered heatmap. We employ different visual representations for the different kinds of uncertainty involved. Finally, it lets the user interactively adjust the thresholds, giving visual feedback in the heatmap representation, so that an informed choice on which thresholds to use can be made instead of applying the usual rule-of-thumb cut-offs. We exemplify the usefulness of our concept by giving details for a concrete use case from our partners at the Medical University of Graz, thereby demonstrating our implementation of the general approach.
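
    A minimal matplotlib sketch of the core display idea, not the authors' implementation: values whose error exceeds a conservative threshold but stays within the margin are kept visible and hatched rather than discarded, while values beyond the margin are masked out.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(2)
        expr = rng.normal(size=(12, 8))          # expression values
        err = rng.uniform(0, 1, size=(12, 8))    # per-entry error estimate

        strict, margin = 0.4, 0.7   # conservative cut-off and the margin above it

        fig, ax = plt.subplots()
        ax.imshow(np.ma.masked_where(err > margin, expr), cmap="RdBu_r")
        # Render "uncertain but possibly still useful" cells with hatching instead
        # of dropping them, so an analyst can revisit the threshold choice.
        for (i, j) in zip(*np.where((err > strict) & (err <= margin))):
            ax.add_patch(plt.Rectangle((j - .5, i - .5), 1, 1, fill=False,
                                       hatch="//", edgecolor="gray"))
        plt.savefig("uncertain_heatmap.png")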

  10. Scheduling Future Water Supply Investments Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Huskova, I.; Matrosov, E. S.; Harou, J. J.; Kasprzyk, J. R.; Reed, P. M.

    2014-12-01

    Uncertain hydrological impacts of climate change, population growth and institutional changes pose a major challenge to planning of water supply systems. Planners seek optimal portfolios of supply and demand management schemes, but must also decide when to activate assets, whilst considering many system goals and plausible futures. Incorporation of scheduling into the planning under uncertainty problem strongly increases its complexity. We investigate some approaches to scheduling with many-objective heuristic search. We apply a multi-scenario many-objective scheduling approach to the Thames River basin water supply system planning problem in the UK. Decisions include which new supply and demand schemes to implement, at what capacity and when. The impact of different system uncertainties on scheme implementation schedules is explored, i.e. how the choice of future scenarios affects the search process and its outcomes. The activation of schemes is influenced by the occurrence of extreme hydrological events in the ensemble of plausible scenarios and other factors. The approach and results are compared with a previous study where only the portfolio problem was addressed (without scheduling).

  11. Uncertainty quantification in reacting flow modeling.

    SciTech Connect

    Le Maître, Olivier P.; Reagan, Matthew T.; Knio, Omar M.; Ghanem, Roger Georges; Najm, Habib N.

    2003-10-01

    Uncertainty quantification (UQ) in the computational modeling of physical systems is important for scientific investigation, engineering design, and model validation. In this work we develop techniques for UQ based on spectral and pseudo-spectral polynomial chaos (PC) expansions, and we apply these constructions in computations of reacting flow. We develop and compare both intrusive and non-intrusive spectral PC techniques. In the intrusive construction, the deterministic model equations are reformulated using Galerkin projection into a set of equations for the time evolution of the field variable PC expansion mode strengths. The mode strengths relate specific parametric uncertainties to their effects on model outputs. The non-intrusive construction uses sampling of many realizations of the original deterministic model, and projects the resulting statistics onto the PC modes, arriving at the PC expansions of the model outputs. We investigate and discuss the strengths and weaknesses of each approach, and identify their utility under different conditions. We also outline areas where ongoing and future research are needed to address challenges with both approaches.
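
    The non-intrusive construction can be illustrated in a few lines. The sketch below is a generic one-variable example, not the reacting-flow code: the PC coefficients of a model output f(ξ), with ξ standard normal, are estimated by Monte Carlo projection onto probabilists' Hermite polynomials, c_k = E[f(ξ)He_k(ξ)]/k!, after which the mean and variance follow from the coefficients alone.

        import numpy as np
        from numpy.polynomial.hermite_e import hermeval
        from math import factorial

        # Stand-in deterministic model with one uncertain input xi ~ N(0, 1).
        f = lambda xi: np.exp(0.3 * xi)

        # Non-intrusive projection: sample the model, then project the samples
        # onto each Hermite mode (He_k are orthogonal with <He_k^2> = k!).
        rng = np.random.default_rng(3)
        xi = rng.normal(size=200_000)
        y = f(xi)

        order = 6
        coeffs = np.array([np.mean(y * hermeval(xi, np.eye(order + 1)[k])) / factorial(k)
                           for k in range(order + 1)])

        # The PC expansion gives output statistics without re-running the model.
        print("mean:", coeffs[0], "vs exact", np.exp(0.045))
        print("var :", sum(factorial(k) * coeffs[k]**2 for k in range(1, order + 1)))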

  12. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. PMID:21489684

  13. MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy

    SciTech Connect

    Van Dyk, J; Palta, J; Bortfeld, T; Mijnheer, B

    2014-06-15

    Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “how do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.
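
    As a small worked example of propagating uncertainty through the total treatment process, the sketch below combines per-step standard uncertainties in quadrature under an independence assumption. The step names and percentage values are illustrative only, not recommendations from the symposium or the IAEA report.

        import numpy as np

        # Hypothetical one-standard-uncertainty estimates (%) for each step of
        # a radiotherapy chain; the values are for illustration only.
        steps = {
            "beam calibration": 1.5,
            "treatment planning dose": 2.0,
            "patient setup / geometry": 2.5,
            "delivery": 1.0,
        }

        # Assuming independent steps, standard uncertainties add in quadrature;
        # the expanded uncertainty uses a coverage factor k = 2.
        combined = np.sqrt(sum(u**2 for u in steps.values()))
        print(f"combined: {combined:.2f}%  expanded (k=2): {2 * combined:.2f}%")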

  14. Uncertainty in Measured Data and Model Predictions: Essential Components for Mobilizing Environmental Data and Modeling

    NASA Astrophysics Data System (ADS)

    Harmel, D.

    2014-12-01

    In spite of pleas for uncertainty analysis - such as Beven's (2006) "Should it not be required that every paper in both field and modeling studies attempt to evaluate the uncertainty in the results?" - the uncertainty associated with hydrology and water quality data is rarely quantified and rarely considered in model evaluation. This oversight, justified in the past by mainly tenuous philosophical concerns, diminishes the value of measured data and ignores the environmental and socio-economic benefits of improved decisions and policies based on data with estimated uncertainty. This oversight extends to researchers, who typically fail to estimate uncertainty in measured discharge and water quality data because of the additional effort required, lack of adequate scientific understanding on the subject, and fear of negative perception if data with "high" uncertainty are reported; however, the benefits are certain. Furthermore, researchers have a responsibility for scientific integrity in reporting what is known and what is unknown, including the quality of measured data. In response, we produced an uncertainty estimation framework and the first cumulative uncertainty estimates for measured water quality data (Harmel et al., 2006). From that framework, DUET-H/WQ was developed (Harmel et al., 2009). Application to several real-world data sets indicated that substantial uncertainty can be contributed by each data collection procedural category and that uncertainties typically occur in order discharge < sediment < dissolved N and P < total N and P. Similarly, modelers address certain aspects of model uncertainty but ignore others, such as the impact of uncertainty in discharge and water quality data. Thus, we developed methods to incorporate prediction uncertainty as well as calibration/validation data uncertainty into model goodness-of-fit evaluation (Harmel and Smith, 2007; Harmel et al., 2010). These enhance model evaluation by: appropriately sharing burden with "data
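
    One plausible implementation of folding measurement uncertainty into goodness-of-fit, in the spirit of (though not necessarily identical to) Harmel and Smith (2007): predictions falling inside the observation's uncertainty band contribute zero residual, and those outside are penalized only by their distance to the nearest bound.

        import numpy as np

        def nse_with_data_uncertainty(obs, pred, u):
            """Nash-Sutcliffe efficiency where a prediction inside the
            observation's uncertainty band (obs +/- u) counts as a zero
            residual; outside the band, only the distance to the nearest
            bound is penalized."""
            resid = np.maximum(np.abs(pred - obs) - u, 0.0)
            return 1.0 - np.sum(resid**2) / np.sum((obs - obs.mean())**2)

        obs = np.array([10., 14., 9., 22., 17.])   # measured flows (illustrative)
        u = 0.15 * obs                             # e.g. 15% measurement uncertainty
        pred = np.array([11., 12., 10., 26., 15.])
        print(f"uncertainty-aware NSE: {nse_with_data_uncertainty(obs, pred, u):.3f}")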

  15. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for nonroutine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  16. We underestimate uncertainties in our predictions.

    SciTech Connect

    Pilch, Martin M.

    2010-04-01

    Prediction is defined in the American Heritage Dictionary as follows: 'To state, tell about, or make known in advance, especially on the basis of special knowledge.' What special knowledge do we demand of modeling and simulation to assert that we have a predictive capability for high consequence applications? The 'special knowledge' question can be answered in two dimensions: the process and rigor by which modeling and simulation is executed and assessment results for the specific application. Here we focus on the process and rigor dimension and address predictive capability in terms of six attributes: (1) geometric and representational fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) validation, and (6) uncertainty quantification. This presentation will demonstrate through mini-tutorials, simple examples, and numerous case studies how each attribute creates opportunities for errors, biases, or uncertainties to enter into simulation results. The demonstrations will motivate a set of practices that minimize the risk in using modeling and simulation for high-consequence applications while defining important research directions. It is recognized that there are cultural, technical, infrastructure, and resource barriers that prevent analysts from performing all analyses at the highest levels of rigor. Consequently, the audience for this talk is (1) analysts, so they can know what is expected of them, (2) decision makers, so they can know what to expect from modeling and simulation, and (3) the R&D community, so they can address the technical and infrastructure issues that prevent analysts from executing analyses in a practical, timely, and quality manner.

  17. A solution (data architecture) for handling time-series data - sensor data (4D), its visualisation and the questions around uncertainty of this data

    NASA Astrophysics Data System (ADS)

    Nayembil, Martin; Barkwith, Andrew

    2016-04-01

    Geo-environmental research is increasingly in the age of data-driven research. It has become necessary to collect, store, integrate and visualise more subsurface data for environmental research. The information required to facilitate data-driven research is often characterised by its variability, volume, complexity and frequency. This has necessitated the development of suitable data workflows, hybrid data architectures, and multiple visualisation solutions to provide the proper context to scientists and to enable their understanding of the different trends that the data displays for their many scientific interpretations. However, this data, predominantly time-series (4D) data acquired through sensors and mostly telemetered, poses significant challenges in quantifying its uncertainty. To validate the research answers, including the data methodologies, the following open questions around uncertainty need addressing - uncertainty generated from: • the instruments used for data capture; • the transfer process of the data, often from remote locations through telemetry; • the data processing techniques used for harmonising and integrating data from multiple sensor outlets; • the approximations applied to visualise such data, from various conversion factors to units standardisation. The main question remains: how do we deal with the issues around uncertainty when it comes to the large and variable amounts of time-series data we collect, harmonise and visualise for the data-driven geo-environmental research that we undertake today?

  18. An active learning approach with uncertainty, representativeness, and diversity.

    PubMed

    He, Tianxu; Zhang, Shukui; Xin, Jie; Zhao, Pengpeng; Wu, Jian; Xian, Xuefeng; Li, Chunhua; Cui, Zhiming

    2014-01-01

    Big data from the Internet of Things may create big challenges for data classification. Most active learning approaches select either uncertain or representative unlabeled instances to query their labels. Although several active learning algorithms have been proposed to combine the two criteria for query selection, they are usually ad hoc in finding unlabeled instances that are both informative and representative and fail to take the diversity of instances into account. We address this challenge by presenting a new active learning framework which considers uncertainty, representativeness, and diversity. The proposed approach provides a systematic way for measuring and combining the uncertainty, representativeness, and diversity of an instance. First, instances' uncertainty and representativeness are used to constitute the most informative set. Then, the kernel k-means clustering algorithm filters out redundant samples, and the resulting samples are queried for labels. Extensive experimental results show that the proposed approach outperforms several state-of-the-art active learning approaches. PMID:25180208
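
    A compact sketch of the selection pipeline described above, using scikit-learn with synthetic data. It substitutes plain k-means for the paper's kernel k-means and uses an invented uncertainty-times-representativeness score, so it illustrates the structure rather than reproducing the published algorithm.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics.pairwise import rbf_kernel

        rng = np.random.default_rng(4)
        X_lab = rng.normal(size=(20, 2)); y_lab = (X_lab[:, 0] > 0).astype(int)
        X_pool = rng.normal(size=(500, 2))
        budget = 10

        clf = LogisticRegression().fit(X_lab, y_lab)

        # 1) Uncertainty: closeness of the predicted probability to 0.5.
        unc = 1.0 - 2.0 * np.abs(clf.predict_proba(X_pool)[:, 1] - 0.5)

        # 2) Representativeness: mean kernel similarity to the rest of the pool.
        rep = rbf_kernel(X_pool).mean(axis=1)

        # Most informative set: top candidates by the combined score.
        cand = np.argsort(unc * rep)[-5 * budget:]

        # 3) Diversity: cluster the candidates (plain k-means stand-in for the
        # paper's kernel k-means) and query one instance per cluster.
        labels = KMeans(n_clusters=budget, n_init=10).fit_predict(X_pool[cand])
        query = [cand[labels == k][0] for k in range(budget)]
        print("indices to query:", sorted(int(i) for i in query))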

  19. An Active Learning Approach with Uncertainty, Representativeness, and Diversity

    PubMed Central

    He, Tianxu; Zhang, Shukui; Xin, Jie; Xian, Xuefeng; Li, Chunhua; Cui, Zhiming

    2014-01-01

    Big data from the Internet of Things may create big challenges for data classification. Most active learning approaches select either uncertain or representative unlabeled instances to query their labels. Although several active learning algorithms have been proposed to combine the two criteria for query selection, they are usually ad hoc in finding unlabeled instances that are both informative and representative and fail to take the diversity of instances into account. We address this challenge by presenting a new active learning framework which considers uncertainty, representativeness, and diversity. The proposed approach provides a systematic way for measuring and combining the uncertainty, representativeness, and diversity of an instance. First, instances' uncertainty and representativeness are used to constitute the most informative set. Then, the kernel k-means clustering algorithm filters out redundant samples, and the resulting samples are queried for labels. Extensive experimental results show that the proposed approach outperforms several state-of-the-art active learning approaches. PMID:25180208

  20. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM Quantum Experience and find reasonable agreement between our predictions and experimental outcomes.
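
    For reference, the baseline EUR-QSI of Berta et al. (2010) that this work tightens can be written as follows (standard form, in LaTeX; the new state-dependent reversibility term introduced by the paper is not reproduced here):

        % X and Z are outcomes of two incompatible measurements on system A,
        % B is a quantum memory, and c is the maximal overlap of the two
        % measurement bases {|phi_x>} and {|psi_z>}.
        H(X|B) + H(Z|B) \;\ge\; \log_2 \frac{1}{c} + H(A|B),
        \qquad c = \max_{x,z} \left| \langle \phi_x | \psi_z \rangle \right|^2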

  1. Uncertainty Quantification in Solidification Modelling

    NASA Astrophysics Data System (ADS)

    Fezi, K.; Krane, M. J. M.

    2015-06-01

    Numerical models have been used to simulate solidification processes, to gain insight into physical phenomena that cannot be observed experimentally. Often validation of such models has been done through comparison to a few or single experiments, in which agreement is dependent on both model and experimental uncertainty. As a first step to quantifying the uncertainty in the models, sensitivity and uncertainty analyses were performed on a simple steady state 1D solidification model of continuous casting of weld filler rod. This model, which includes conduction, advection, and release of latent heat, was developed for use in uncertainty quantification of the calculated liquidus and solidus positions and the solidification time. Using this model, a Smolyak sparse grid algorithm constructed a response surface that fit model outputs based on the range of uncertainty in the inputs to the model. The response surface was then used to determine the probability density functions (PDFs) of the model outputs and sensitivities of the inputs. This process was done for a linear fraction solid and temperature relationship, for which there is an analytical solution, and a Scheil relationship. Similar analysis was also performed on a transient 2D model of solidification in a rectangular domain.
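
    The response-surface workflow can be miniaturized as follows. This sketch uses a small full tensor grid and a quadratic least-squares fit as a stand-in for the paper's Smolyak sparse grid, and a two-input toy model invented for illustration; the point is that the cheap surrogate, not the expensive model, is sampled to obtain the output PDF.

        import numpy as np

        # Stand-in "expensive" model: solidus position as a function of two
        # uncertain inputs (thermal conductivity k, heat-transfer coefficient h).
        def model(k, h):
            return 0.8 * k / (h + 0.5 * k)

        # Run the model on a small tensor grid and fit a quadratic surface.
        ks = np.linspace(20, 40, 5); hs = np.linspace(100, 300, 5)
        K, H = np.meshgrid(ks, hs)
        design = np.column_stack([np.ones(K.size), K.ravel(), H.ravel(),
                                  K.ravel()**2, H.ravel()**2, (K * H).ravel()])
        beta, *_ = np.linalg.lstsq(design, model(K, H).ravel(), rcond=None)

        # Sample the cheap surrogate to estimate the output distribution.
        rng = np.random.default_rng(5)
        k, h = rng.uniform(20, 40, 100_000), rng.uniform(100, 300, 100_000)
        surr = np.column_stack([np.ones_like(k), k, h, k**2, h**2, k * h]) @ beta
        print(f"output mean {surr.mean():.4f}, 95% interval "
              f"[{np.quantile(surr, 0.025):.4f}, {np.quantile(surr, 0.975):.4f}]")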

  2. Oscillator Strengths and Their Uncertainties

    NASA Astrophysics Data System (ADS)

    Wahlgren, G. M.

    2010-11-01

    The oscillator strength is a key parameter in the description of the line absorption coefficient. It can be determined through experiment, ab initio and semi-empirical calculations, and backward analysis of line profiles. Each method has its advantages, and the uncertainty attached to its determination can range from low to indeterminable. For analysis of line profiles or equivalent widths, the uncertainty in the oscillator strength can rival or surpass the difference between the derived element abundance from classical LTE and non-LTE analyses. It is therefore important to understand the nature of oscillator strength uncertainties and to assess whether this uncertainty can be a factor in choosing to initiate a non-LTE analysis or in the interpretation of its results. Methods for the determination of the oscillator strength are presented, prioritizing experiments, along with commentary about the sources and impact of the uncertainties. The Se I spectrum is used to illustrate how gf-values can be constructed from published data on atomic lifetimes and line intensities.
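
    A worked example of the lifetime route to gf-values, using the standard relation g_i·f_ik = 1.4992e-16 λ²[Å] g_k A_ki and invented measurement values: the transition probability comes from a branching fraction divided by the upper-level lifetime, and the relative uncertainties of the two measurements combine in quadrature.

        import numpy as np

        # Hypothetical measured quantities (not Se I data):
        tau, dtau = 4.8e-9, 0.2e-9   # upper-level lifetime and uncertainty (s)
        bf, dbf = 0.62, 0.03         # branching fraction of this line
        lam, g_k = 4200.0, 5         # wavelength (Angstrom), upper-level degeneracy

        # Transition probability from lifetime and branching fraction,
        # then gf from the standard conversion (lambda in Angstrom, A in 1/s).
        A = bf / tau
        gf = 1.4992e-16 * lam**2 * g_k * A

        # Relative uncertainties of BF and tau add in quadrature for gf;
        # the log gf uncertainty in dex follows by dividing by ln(10).
        rel = np.hypot(dbf / bf, dtau / tau)
        print(f"gf = {gf:.3f} +/- {gf * rel:.3f} "
              f"(log gf uncertainty {rel / np.log(10):.3f} dex)")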

  3. Uncertainty Analysis for RELAP5-3D

    SciTech Connect

    Aaron J. Pawel; Dr. George L. Mesina

    2011-08-01

    In its current state, RELAP5-3D is a 'best-estimate' code; it is one of our most reliable programs for modeling what occurs within reactor systems in transients from given initial conditions. This code, however, remains an estimator. A statistical analysis has been performed that begins to lay the foundation for a full uncertainty analysis. By varying the inputs over assumed probability density functions, the output parameters were shown to vary. Using statistical tools such as means, variances, and tolerance intervals, a picture has been obtained of how uncertain the results are, given the uncertainty of the inputs.
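
    One standard way to put such a tolerance-interval statement on a quantitative footing is Wilks' nonparametric bound, widely used with best-estimate thermal-hydraulics codes; the abstract mentions tolerance intervals but does not say this is the method used, so the sketch below is illustrative.

        import math

        # First-order one-sided Wilks bound: with N random code runs, taking
        # the maximum output as the bound covers at least a fraction gamma of
        # the output population with confidence beta when 1 - gamma**N >= beta.
        def wilks_runs(gamma=0.95, beta=0.95):
            return math.ceil(math.log(1.0 - beta) / math.log(gamma))

        print("runs for a 95%/95% one-sided tolerance bound:", wilks_runs())  # 59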

  4. Cascading rainfall uncertainty into flood inundation impact models

    NASA Astrophysics Data System (ADS)

    Souvignet, Maxime; Freer, Jim E.; de Almeida, Gustavo A. M.; Coxon, Gemma; Neal, Jeffrey C.; Champion, Adrian J.; Cloke, Hannah L.; Bates, Paul D.

    2014-05-01

    Observed and numerical weather prediction (NWP) simulated precipitation products typically show differences in their spatial and temporal distribution. These differences can considerably influence the ability to predict hydrological responses. For flood inundation impact studies, as in forecast situations, an atmospheric-hydrologic-hydraulic model chain is needed to quantify the extent of flood risk. Uncertainties cascaded through the model chain are seldom explored, and more importantly, how potential input uncertainties propagate through this cascade, and how best to approach this, is still poorly understood. This requires a combination of modelling capabilities, the non-linear transformation of rainfall to river flow using rainfall-runoff models, and finally the hydraulic flood wave propagation based on the runoff predictions. Improving the characterisation of uncertainty, and what is important to include, in each component is important for quantifying impacts and understanding flood risk for different return periods. In this paper, we propose to address this issue by i) exploring the effects of errors in rainfall on inundation predictive capacity within an uncertainty framework by testing inundation uncertainty against different comparable meteorological conditions (i.e. using different rainfall products) and ii) testing different techniques to cascade uncertainties (e.g. bootstrapping, PPU envelope) within the GLUE (generalised likelihood uncertainty estimation) framework. Our method cascades rainfall uncertainties into multiple rainfall-runoff model structures using the Framework for Understanding Structural Errors (FUSE). The resultant prediction uncertainties in upstream discharge provide uncertain boundary conditions that are cascaded into a simplified shallow water hydraulic model (LISFLOOD-FP). Rainfall data captured by three different measurement techniques - rain gauges, gridded radar data, and NWP models - are evaluated.
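
    A minimal GLUE sketch in Python, with an invented one-parameter rainfall-runoff model standing in for the FUSE structures: parameter sets are sampled, variants above an informal likelihood threshold are kept as "behavioural", and their likelihood-weighted predictions give the uncertain discharge bounds that would be cascaded into the hydraulic model.

        import numpy as np

        rng = np.random.default_rng(6)
        q_obs = np.array([5., 9., 14., 11., 7.])   # observed flows (illustrative)

        # Toy rainfall-runoff model with one uncertain parameter per variant.
        def simulate(theta):
            return theta * np.array([0.5, 0.9, 1.4, 1.1, 0.7]) * 10.0

        # GLUE: sample parameters, keep behavioural variants above a threshold,
        # and weight their predictions by the (informal) likelihood measure.
        thetas = rng.uniform(0.5, 1.5, 5000)
        sims = np.array([simulate(t) for t in thetas])
        nse = 1 - ((sims - q_obs)**2).sum(1) / ((q_obs - q_obs.mean())**2).sum()
        behavioural = nse > 0.7
        w = nse[behavioural] - 0.7
        w /= w.sum()

        # Weighted 5-95% bounds on the peak flow (time step 2) become the
        # uncertain boundary condition for the hydraulic model.
        peak = sims[behavioural, 2]
        order = np.argsort(peak)
        cdf = np.cumsum(w[order])
        lo = peak[order][np.searchsorted(cdf, 0.05)]
        hi = peak[order][np.searchsorted(cdf, 0.95)]
        print(f"peak-flow 5-95% bounds: [{lo:.1f}, {hi:.1f}]")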

  5. Identifying and Reducing Remaining Stocks of Rinderpest Virus.

    PubMed

    Hamilton, Keith; Visser, Dawid; Evans, Brian; Vallat, Bernard

    2015-12-01

    In 2011, the world was declared free from rinderpest, one of the most feared and devastating infectious diseases of animals. Rinderpest is the second infectious disease, after smallpox, to have been eradicated. However, potentially infectious rinderpest virus material remains widely disseminated among research and diagnostic facilities across the world and poses a risk for disease recurrence should it be released. Member Countries of the World Organisation for Animal Health and the Food and Agricultural Organization of the United Nations are committed to destroying remaining stocks of infectious material or ensuring that it is stored under international supervision in a limited number of approved facilities. To facilitate this commitment and maintain global freedom from rinderpest, World Organisation for Animal Health Member Countries must report annually on rinderpest material held in their countries. The first official surveys, conducted during 2013-2015, revealed that rinderpest material was stored in an unacceptably high number of facilities and countries. PMID:26584400

  6. Mineral remains of early life on Earth? On Mars?

    USGS Publications Warehouse

    Iberall, Robbins E.; Iberall, A.S.

    1991-01-01

    The oldest sedimentary rocks on Earth, the 3.8-Ga Isua Iron-Formation in southwestern Greenland, are metamorphosed past the point where organic-walled fossils would remain. Acid residues and thin sections of these rocks reveal ferric microstructures that have filamentous, hollow rod, and spherical shapes not characteristic of crystalline minerals. Instead, they resemble ferric-coated remains of bacteria. Because there are no earlier sedimentary rocks to study on Earth, it may be necessary to expand the search elsewhere in the solar system for clues to any biotic precursors or other types of early life. A study of morphologies of iron oxide minerals collected in the southern highlands during a Mars sample return mission may therefore help to fill in important gaps in the history of Earth's earliest biosphere. -from Authors

  7. Mandibular remains support taxonomic validity of Australopithecus sediba.

    PubMed

    de Ruiter, Darryl J; DeWitt, Thomas J; Carlson, Keely B; Brophy, Juliet K; Schroeder, Lauren; Ackermann, Rebecca R; Churchill, Steven E; Berger, Lee R

    2013-04-12

    Since the announcement of the species Australopithecus sediba, questions have been raised over whether the Malapa fossils represent a valid taxon or whether inadequate allowance was made for intraspecific variation, in particular with reference to the temporally and geographically proximate species Au. africanus. The morphology of mandibular remains of Au. sediba, including newly recovered material discussed here, shows that it is not merely a late-surviving morph of Au. africanus. Rather, as is seen elsewhere in the cranium, dentition, and postcranial skeleton, these mandibular remains share similarities with other australopiths but can be differentiated from the hypodigm of Au. africanus in both size and shape as well as in their ontogenetic growth trajectory. PMID:23580533

  8. Dental DNA fingerprinting in identification of human remains

    PubMed Central

    Girish, KL; Rahman, Farzan S; Tippu, Shoaib R

    2010-01-01

    The recent advances in molecular biology have revolutionized all aspects of dentistry. DNA, the language of life, yields information beyond our imagination, both in health and disease. DNA fingerprinting is a tool used to unravel all the mysteries associated with the oral cavity and its manifestations during diseased conditions. It is being increasingly used in analyzing various scenarios related to forensic science. The technical advances in molecular biology have propelled the analysis of DNA into routine usage in crime laboratories for rapid and early diagnosis. DNA is an excellent means for identification of unidentified human remains. As dental pulp is surrounded by dentin and enamel, which form a dental armor, it offers the best source of DNA for reliable genetic typing in forensic science. This paper summarizes the recent literature on the use of this technique in the identification of unidentified human remains. PMID:21731342

  9. Late Pleistocene human remains from Wezmeh Cave, western Iran.

    PubMed

    Trinkaus, Erik; Biglari, Fereidoun; Mashkour, Marjan; Monchot, Hervé; Reyss, Jean-Louis; Rougier, Hélène; Heydari, Saman; Abdi, Kamyar

    2008-04-01

    Paleontological analysis of remains from Wezmeh Cave in western Iran have yielded a Holocene Chalcolithic archeological assemblage, a rich Late Pleistocene carnivore faunal assemblage, and an isolated unerupted human maxillary premolar (P(3) or possibly P(4)). Species representation and U-series dating of faunal teeth place the carnivore assemblage during oxygen isotope stages (OIS) 3 and 2, and noninvasive gamma spectrometry dating of the human premolar places it at least as old as early OIS 2. The human premolar crown morphology is not diagnostic of late archaic versus early modern human affinities, but its buccolingual diameter places it at the upper limits of Late Pleistocene human P(3) and P(4) dimensions and separate from a terminal Pleistocene regional sample. Wezmeh Cave therefore provides additional Paleolithic human remains from the Zagros Mountains and further documents Late Pleistocene human association with otherwise carnivore-dominated cave assemblages. PMID:18000894

  10. Belarus ratifies START I pact; Ukraine remains last holdout

    SciTech Connect

    Lockwood, D.

    1993-03-01

    The Belarus Parliament ratified START I by a vote of 218 to 1 on February 4, 1993. The Parliament also voted to accede to the nuclear Non-Proliferation Treaty as a non-nuclear weapon state. It also passed two companion accords with Russia to coordinate the withdrawal of the ICBMs now in Belarus and to define the legal status of those weapons. Ukraine remains the only party to START I that has not yet approved the treaty.

  11. Direct dating of Early Upper Palaeolithic human remains from Mladec.

    PubMed

    Wild, Eva M; Teschler-Nicola, Maria; Kutschera, Walter; Steier, Peter; Trinkaus, Erik; Wanek, Wolfgang

    2005-05-19

    The human fossil assemblage from the Mladec Caves in Moravia (Czech Republic) has been considered to derive from a middle or later phase of the Central European Aurignacian period on the basis of archaeological remains (a few stone artefacts and organic items such as bone points, awls, perforated teeth), despite questions of association between the human fossils and the archaeological materials and concerning the chronological implications of the limited archaeological remains. The morphological variability in the human assemblage, the presence of apparently archaic features in some specimens, and the assumed early date of the remains have made this fossil assemblage pivotal in assessments of modern human emergence within Europe. We present here the first successful direct accelerator mass spectrometry radiocarbon dating of five representative human fossils from the site. We selected sample materials from teeth and from one bone for 14C dating. The four tooth samples yielded uncalibrated ages of approximately 31,000 14C years before present, and the bone sample (an ulna) provided an uncertain, more recent age. These data are sufficient to confirm that the Mladec human assemblage is the oldest cranial, dental and postcranial assemblage of early modern humans in Europe and is therefore central to discussions of modern human emergence in the northwestern Old World and the fate of the Neanderthals. PMID:15902255

  12. Microsatellites identify depredated waterfowl remains from glaucous gull stomachs

    USGS Publications Warehouse

    Scribner, K.T.; Bowman, T.D.

    1998-01-01

    Prey remains can provide valuable sources of information regarding causes of predation and the species composition of a predator's diet. Unfortunately, the highly degraded state of many prey samples from gastrointestinal tracts often precludes unambiguous identification. We describe a procedure by which PCR amplification of taxonomically informative microsatellite loci were used to identify species of waterfowl predated by glaucous gulls (Larus hyperboreus). We found that one microsatellite locus unambiguously distinguished between species of the subfamily Anserinae (whistling ducks, geese and swans) and those of the subfamily Anatidae (all other ducks). An additional locus distinguished the remains of all geese and swan species known to nest on the Yukon-Kuskokwim delta in western Alaska. The study focused on two waterfowl species which have experienced precipitous declines in population numbers: emperor geese (Chen canagica) and spectacled eiders (Somateria fischeri). No evidence of predation on spectacled eiders was observed. Twenty-six percent of all glaucous gull stomachs examined contained the remains of juvenile emperor geese.

  13. Osteometric sex determination of burned human skeletal remains.

    PubMed

    Gonçalves, D; Thompson, T J U; Cunha, E

    2013-10-01

    Sex determination of human burned skeletal remains is extremely hard to achieve because of heat-related fragmentation, warping and dimensional changes. In particular, the latter impedes osteometric analyses that are based on references developed on unburned bones. New osteometric references were thus obtained which allow for more reliable sex determinations. The calcined remains of cremated Portuguese individuals were examined and specific standard measurements of the humerus, femur, talus and calcaneus were recorded. This allowed for the compilation of new sex-discriminating osteometric references, which were then tested on independent samples with good results. Both the use of simple section points and of logistic regression equations provided successful sex classification scores. These references may now be used for the sex determination of burned skeletons. Their reliability is highest for contemporary Portuguese remains, but the results nonetheless have important repercussions for forensic research. More conservative use of these references may also prove valuable for other populations as well as for archaeological research. PMID:24112343
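
    A sketch of the logistic-regression route to a section point, using scikit-learn and synthetic measurements (the values below are invented, not the Portuguese reference data):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Hypothetical training data: one measurement (calcaneus length, mm)
        # with documented sex (0 = female, 1 = male); illustrative only.
        rng = np.random.default_rng(7)
        f = rng.normal(74, 4, 120); m = rng.normal(83, 4, 120)
        X = np.concatenate([f, m]).reshape(-1, 1)
        y = np.array([0] * 120 + [1] * 120)

        clf = LogisticRegression().fit(X, y)

        # The "section point" is where the predicted probability crosses 0.5,
        # i.e. -intercept / slope for a single measurement.
        section = -clf.intercept_[0] / clf.coef_[0, 0]
        print(f"section point: {section:.1f} mm "
              f"(classify above as male, below as female)")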

  14. Skeletal preservation of children's remains in the archaeological record.

    PubMed

    Manifold, B M

    2015-12-01

    Taphonomy is an important consideration in the reconstruction of past environments and events. Taphonomic alterations and processes are commonly encountered on human skeletal remains in both archaeological and forensic contexts. It is these processes that can alter the appearance of bone after death, and the properties of the bones influence their reaction to these processes, thus leading to differential preservation within a skeletal sample, none more so than the remains of children. This study investigates the skeletal preservation of 790 child and adolescent skeletons from six contrasting early and late medieval cemeteries from Britain in an attempt to assess whether geographical location and geology had an effect on the overall preservation of the skeletons. Skeletons were examined from six cemeteries, namely: Auldhame in Scotland, Edix Hill and Great Chesterford from Cambridgeshire; St Oswald's Priory from Gloucester and Wharram Percy from Yorkshire, and finally, the site of Llandough in Wales. The state of preservation was assessed using the anatomical preservation index (API), qualitative bone index (QBI) and the bone representation index (BRI). The presence of natural and artificial taphonomic processes was also recorded for each skeleton. The results show a specific pattern of preservation and representation for non-adult remains across all sites, with some differences in the states of preservation from different geographical locations and geological influences. Children under two years of age were found to be less affected by taphonomic processes than their older counterparts. PMID:26391374

  15. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  16. Sub-Heisenberg phase uncertainties

    NASA Astrophysics Data System (ADS)

    Pezzé, Luca

    2013-12-01

    Phase shift estimation with uncertainty below the Heisenberg limit, Δϕ_HL ∝ 1/N̄_T, where N̄_T is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty Δϕ ∝ 1/N̄_T^k, with k an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility to overcome Δϕ_HL at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.

  17. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  18. Uncertainties in hydrocarbon charge prediction

    NASA Astrophysics Data System (ADS)

    Visser, W.; Bell, A.

    Computer simulations allow the prediction of hydrocarbon volumes, composition and charge timing in undrilled petroleum prospects. Whereas different models may give different hydrocarbon charge predictions, it has now become evident that a dominant cause of erroneous predictions is the poor quality of input data. The main culprit for prediction errors is the uncertainty in the initial hydrogen index (H/C) of the source rock. A 10% uncertainty in the H/C may lead to 50% error in the predicted hydrocarbon volumes, and associated gas-oil ratio. Similarly, uncertainties in the maximum burial temperature and the kinetics of hydrocarbon generation may lead to 20-50% error. Despite this, charge modelling can have great value for the ranking of prospects in the same area with comparable geological histories.
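
    A toy Monte Carlo illustration of this error amplification; the cubic yield curve below is purely hypothetical (not the authors' simulator), chosen only to show how a nonlinear generation model can turn a 10% input spread into roughly a 50% output spread:

      import numpy as np

      rng = np.random.default_rng(1)
      hc_mean, hc_rel_err = 1.2, 0.10       # assumed initial H/C and its 10% uncertainty
      hc = rng.normal(hc_mean, hc_rel_err * hc_mean, size=20_000)

      def hc_to_volume(hc):
          # Hypothetical nonlinear yield curve: volumes rise steeply with H/C.
          return np.maximum(hc - 0.4, 0.0) ** 3

      vol = hc_to_volume(hc)
      print(f"10% H/C uncertainty -> ~{100 * vol.std() / vol.mean():.0f}% volume uncertainty")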

  19. 32 CFR 516.7 - Mailing addresses.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 3 2011-07-01 2009-07-01 true Mailing addresses. 516.7 Section 516.7 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION General § 516.7 Mailing addresses. Mailing addresses for organizations referenced...

  20. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 5 2012-10-01 2012-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  1. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 5 2014-10-01 2014-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  2. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 5 2013-10-01 2013-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  3. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  4. 47 CFR 97.23 - Mailing address.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 5 2011-10-01 2011-10-01 false Mailing address. 97.23 Section 97.23 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) SAFETY AND SPECIAL RADIO SERVICES AMATEUR RADIO... name and mailing address. The mailing address must be in an area where the amateur service is...

  5. 47 CFR 13.10 - Licensee address.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Licensee address. 13.10 Section 13.10 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL COMMERCIAL RADIO OPERATORS General § 13.10 Licensee address. In accordance with § 1.923 of this chapter all applications must specify an address where...

  6. CCCC Chair's Address: Representing Ourselves, 2008

    ERIC Educational Resources Information Center

    Glenn, Cheryl

    2008-01-01

    This article presents the text of the author's address at the fifty-ninth annual convention of the Conference on College Composition and Communication (CCCC) in March 2008. In her address, the author picks up strands of previous Chairs' addresses and weaves them through the fabric of her remarks. What she hopes will give sheen to the fabric is her…

  7. 75 FR 49813 - Change of Address

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ... COMMISSION 11 CFR Parts 9405, 9407, 9409, 9410, 9420, and 9428 Change of Address AGENCY: United States... Assistance Commission (EAC) is amending its regulations to reflect a change of address for its headquarters. This technical amendment is a nomenclature change that updates and corrects the address for...

  8. 32 CFR 516.7 - Mailing addresses.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 3 2010-07-01 2010-07-01 true Mailing addresses. 516.7 Section 516.7 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS LITIGATION General § 516.7 Mailing addresses. Mailing addresses for organizations referenced...

  9. Assessment of the intrinsic uncertainty of the k0-based NAA

    NASA Astrophysics Data System (ADS)

    Bučar, Tinkara; Smodiš, Borut

    2006-08-01

    This paper addresses the intrinsic uncertainty of k0 neutron activation analysis (NAA) by evaluating the partial uncertainties of the nuclear parameters and parameters given by the irradiation conditions. Uncertainty propagation factors are determined from the basic equations of the k0-NAA and the combined uncertainties are calculated using a software package specially developed for this purpose. The nuclear parameter values and respective uncertainties are taken from an IUPAC database. The uncertainties are calculated for specific conditions given at the TRIGA Mark II reactor of the Jožef Stefan Institute, for all reactions where data is available. On average, neutron reaction-specific values in the range of 1-2% were obtained for 44 elements. For 23 elements, some data are missing in the database, so the values should be obtained elsewhere. The developed approach is generally applicable to other neutron flux conditions.
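
    The combination step is ordinarily a summation in quadrature of propagation factors Z_i times the partial relative uncertainties u_i. A minimal sketch with illustrative numbers (not the IUPAC database values):

      import math

      # (propagation factor Z_i, relative uncertainty u_i in %), illustrative only
      components = {
          "k0 factor":                    (1.0, 0.8),
          "Q0 resonance ratio":           (0.3, 1.5),
          "f (thermal/epithermal flux)":  (0.2, 2.0),
          "alpha (epithermal shape)":     (0.1, 3.0),
      }

      u_combined = math.sqrt(sum((z * u) ** 2 for z, u in components.values()))
      print(f"combined relative uncertainty ~ {u_combined:.2f}%")  # lands in the 1-2% range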

  10. Evaluating uncertainty in estimates of soil moisture memory with a reverse ensemble approach

    NASA Astrophysics Data System (ADS)

    MacLeod, Dave; Cloke, Hannah; Pappenberger, Florian; Weisheimer, Antje

    2016-07-01

    Soil moisture memory is a key component of seasonal predictability. However, uncertainty in current memory estimates is not clear and it is not obvious to what extent these are dependent on model uncertainties. To address this question, we perform a global sensitivity analysis of memory to key hydraulic parameters, using an uncoupled version of the H-TESSEL land surface model. Results show significant dependency of estimates of memory and its uncertainty on these parameters, suggesting that operational seasonal forecasting models using deterministic hydraulic parameter values are likely to display a narrower range of memory than exists in reality. Explicitly incorporating hydraulic parameter uncertainty into models may then give improvements in forecast skill and reliability, as has been shown elsewhere in the literature. Our results also show significant differences with previous estimates of memory uncertainty, warning against placing too much confidence in a single quantification of uncertainty.
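
    Soil moisture memory itself is commonly summarized as a lagged-autocorrelation timescale. A minimal sketch under that common convention (synthetic red-noise series, not H-TESSEL output):

      import numpy as np

      def memory_efolding(series, max_lag=120):
          # Lag (in time steps) at which autocorrelation first drops below 1/e.
          x = series - series.mean()
          for lag in range(1, max_lag):
              r = np.corrcoef(x[:-lag], x[lag:])[0, 1]
              if r < 1.0 / np.e:
                  return lag
          return max_lag

      # Synthetic daily soil moisture: AR(1) noise with a ~20-day decorrelation time.
      rng = np.random.default_rng(0)
      phi = np.exp(-1.0 / 20.0)
      sm = np.zeros(3000)
      for t in range(1, sm.size):
          sm[t] = phi * sm[t - 1] + rng.normal()
      print("memory =", memory_efolding(sm), "days")  # should be close to 20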

  11. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify the uncertainty with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. The uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investment.
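
    A minimal sketch of that Monte Carlo: for each realization, draw one depth-damage function at random from a function library and evaluate it. The library below is synthetic (power-law curves), not the 272 published functions:

      import numpy as np

      rng = np.random.default_rng(7)

      # Synthetic stand-in for a damage-function library: depth (m) -> damage fraction.
      library = [lambda d, a=a: np.clip((d / 6.0) ** a, 0.0, 1.0)
                 for a in np.linspace(0.4, 1.6, 25)]

      def simulate(depth_m, value_eur, n=10_000):
          draws = rng.integers(0, len(library), size=n)
          return np.array([library[i](depth_m) * value_eur for i in draws])

      dmg = simulate(depth_m=0.5, value_eur=200_000)  # spread widens at small depths
      print(f"P5-P95 damage: {np.percentile(dmg, 5):,.0f} to {np.percentile(dmg, 95):,.0f} EUR")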

  12. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.

    2015-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify the uncertainty with a Monte Carlo analysis, which takes as input a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investment.

  13. The visualization of spatial uncertainty

    SciTech Connect

    Srivastava, R.M.

    1994-12-31

    Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools. 3-D visualization tools allow them to present very realistic images of this single outcome as if it were reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays, and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge to such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different, so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir is used to demonstrate that such animation is possible and to show that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.

  14. Groundwater management under sustainable yield uncertainty

    NASA Astrophysics Data System (ADS)

    Delottier, Hugo; Pryet, Alexandre; Dupuy, Alain

    2015-04-01

    groundwater systems. For predictive analysis of the SY to be realistic for real world problems, we test a calibration method based on the Gauss-Levenberg-Marquardt algorithm. Our results highlight that the analysis of the SY predictive uncertainty is essential for groundwater management. This uncertainty is expected to be large and can be addressed with better a priori information on model parameter values.

  15. Resolving Key Uncertainties in Subsurface Energy Recovery: One Role of In Situ Experimentation and URLs (Invited)

    NASA Astrophysics Data System (ADS)

    Elsworth, D.

    2013-12-01

    Significant uncertainties remain and influence the recovery of energy from the subsurface. These uncertainties include the fate and transport of long-lived radioactive wastes that result from the generation of nuclear power and have been the focus of an active network of international underground research laboratories dating back at least 35 years. However, other nascent carbon-free energy technologies including conventional and EGS geothermal methods, carbon-neutral methods such as carbon capture and sequestration and the utilization of reduced-carbon resources such as unconventional gas reservoirs offer significant challenges in their effective deployment. We illustrate the important role that in situ experiments may play in resolving behaviors at extended length- and time-scales for issues related to chemical-mechanical interactions. Significantly, these include the evolution of transport and mechanical characteristics of stress-sensitive fractured media and their influence of the long-term behavior of the system. Importantly, these interests typically relate to either creating reservoirs (hydroshearing in EGS reservoirs, artificial fractures in shales and coals) or maintaining seals at depth where the permeating fluids may include mixed brines, CO2, methane and other hydrocarbons. Critical questions relate to the interaction of these various fluid mixtures and compositions with the fractured substrate. Important needs are in understanding the roles of key processes (transmission, dissolution, precipitation, sorption and dynamic stressing) on the modification of effective stresses and their influence on the evolution of permeability, strength and induced seismicity on the resulting development of either wanted or unwanted fluid pathways. In situ experimentation has already contributed to addressing some crucial issues of these complex interactions at field scale. Important contributions are noted in understanding the fate and transport of long-lived wastes

  16. Estimating European carbon balance and its uncertainties

    NASA Astrophysics Data System (ADS)

    Churkina, G.; Vetter, M.; Jung, M.; Tomelleri, E.; Trusilova, K.

    2007-05-01

    A globally significant carbon sink in the 1980s-1990s in northern extratropical regions was inferred from variations in atmospheric CO2 concentrations. Although this sink was attributed mostly to forest ecosystems, its magnitude and cause remain uncertain. We aim to understand the role of the European continent in this carbon sink and the associated uncertainties. Our analysis is based on simulations of European net carbon flux, gross primary productivity, and ecosystem respiration with the BIOME-BGC model and a few other vegetation models. All model simulations were performed with the same soil texture, digital elevation map, fractional vegetation classification, climate, and atmospheric CO2 concentrations. We discuss uncertainties in the estimates of gross primary productivity of Europe associated with different land covers, meteorological data, and vegetation models. We also compare the ability of BIOME-BGC to simulate annual gross primary production of forest ecosystems across Europe with that of two other global biogeochemical models. The latter analysis is based on site-level model simulations at 37 eddy covariance EUROFLUX sites representing climate zones from boreal to Mediterranean.

  17. Heisenberg uncertainty in reduced power algebras

    NASA Astrophysics Data System (ADS)

    Rosinger, Elemér E.

    2012-12-01

    The Heisenberg uncertainty relation is known to be obtainable by a purely mathematical argument. Based on that fact, here it is shown that the Heisenberg uncertainty relation remains valid when Quantum Mechanics is re-formulated within far wider frameworks of scalars, namely, within one or another of the infinitely many reduced power algebras which can replace the usual real numbers R, or complex numbers C. Three possible major advantages in Physics of such a reformulation are: 1) the disappearance of the well-known and hard-to-deal-with problem of the so-called "infinities in Physics", 2) the possibility to have infinitely many "levels of precision" instead of the only one existing at present, 3) the possibility to model "hierarchies of Planck constants", [2]. Last but not least, the scalars given by reduced power algebras contain as a particular case those obtained by Nonstandard Analysis, yet they are far simpler and easier to deal with, being in fact on the level of a first course in Algebra. A detailed version of this paper can be found at arxiv:0901.4825.

  18. Uncertainty assessment in watershed-scale water quality modeling and management: 1. Framework and application of generalized likelihood uncertainty estimation (GLUE) approach

    NASA Astrophysics Data System (ADS)

    Zheng, Yi; Keller, Arturo A.

    2007-08-01

    Watershed-scale water quality models involve substantial uncertainty in model output because of sparse water quality observations and other sources of uncertainty. Assessing the uncertainty is very important for those who use the models to support management decision making. Systematic uncertainty analysis for these models has rarely been done and remains a major challenge. This study aimed (1) to develop a framework to characterize all important sources of uncertainty and their interactions in management-oriented watershed modeling, (2) to apply the generalized likelihood uncertainty estimation (GLUE) approach for quantifying simulation uncertainty for complex watershed models, and (3) to investigate the influence of subjective choices (especially the likelihood measure) in a GLUE analysis, as well as the availability of observational data, on the outcome of the uncertainty analysis. A two-stage framework was first established as the basis for uncertainty assessment and probabilistic decision-making. A watershed model (watershed analysis risk management framework (WARMF)) was implemented using data from the Santa Clara River Watershed in southern California. A typical catchment was constructed on which a series of experiments was conducted. The results show that GLUE can be implemented with affordable computational cost, yielding insights into the model behavior. However, in complex watershed water quality modeling, the uncertainty results highly depend on the subjective choices made by the modeler as well as the availability of observational data. The importance of considering management concerns in the uncertainty estimation was also demonstrated. Overall, this study establishes guidance for uncertainty assessment in management-oriented watershed modeling. The study results have suggested future efforts we could make in a GLUE-based uncertainty analysis, which has led to the development of a new method, as will be introduced in a companion paper. Eventually, the
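
    A compact sketch of the GLUE recipe as commonly implemented (toy exponential model in place of WARMF): sample parameter sets, score each run with an informal likelihood, keep the "behavioral" runs above a subjective cutoff, and form likelihood-weighted prediction bounds.

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.linspace(0.0, 10.0, 25)

      def model(theta, t):
          k, c = theta                     # toy stand-in for the watershed model
          return c * np.exp(-k * t)

      obs = model((0.3, 5.0), t) + rng.normal(0.0, 0.3, t.size)

      n = 5000
      thetas = np.column_stack([rng.uniform(0.05, 1.0, n), rng.uniform(1.0, 10.0, n)])
      sims = np.array([model(th, t) for th in thetas])

      L = 1.0 / np.mean((sims - obs) ** 2, axis=1)  # one common informal likelihood
      keep = L > np.percentile(L, 90)               # subjective behavioral threshold
      w = L[keep] / L[keep].sum()

      def weighted_quantile(vals, weights, q):
          order = np.argsort(vals)
          cw = np.cumsum(weights[order])
          return np.interp(q, cw / cw[-1], vals[order])

      band = [(weighted_quantile(sims[keep][:, j], w, 0.05),
               weighted_quantile(sims[keep][:, j], w, 0.95)) for j in range(t.size)]
      print("5-95% GLUE band at t=5:", band[12])    # t[12] == 5.0

    The subjective choices the study highlights (the likelihood measure and the behavioral cutoff) are exactly the two commented lines above; changing either reshapes the prediction band.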

  19. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
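
    The mechanism is captured by the first-order delta method: for a response y = f(x), Var[f(X)] is approximately f'(mu)^2 Var[X]. A quick numeric check (generic example, not from the presentation):

      import numpy as np

      rng = np.random.default_rng(5)
      mu, sigma = 2.0, 0.1
      x = rng.normal(mu, sigma, 100_000)

      mc_var = np.exp(x).var()                   # transmitted variation, by simulation
      delta_var = np.exp(mu) ** 2 * sigma ** 2   # f'(mu)^2 * Var[X] with f = exp
      print(f"Monte Carlo: {mc_var:.4f}, delta method: {delta_var:.4f}")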

  20. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this on-going joint effort.

  1. Geographic Uncertainty in Environmental Security

    NASA Astrophysics Data System (ADS)

    Ahlquist, Jon

    2008-06-01

    This volume contains 17 papers presented at the NATO Advanced Research Workshop on Fuzziness and Uncertainty held in Kiev, Ukraine, 28 June to 1 July 2006. Eleven of the papers deal with fuzzy set concepts, while the other six (papers 5, 7, 13, 14, 15, and 16) do not. A reader with no prior exposure to fuzzy set theory would benefit from having an introductory text at hand, but the papers are accessible to a wide audience. In general, the papers deal with broad issues of classification and uncertainty in geographic information.

  2. Awe, uncertainty, and agency detection.

    PubMed

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events. PMID:24247728

  3. Estimating Uncertainties in the Multi-Instrument SBUV Profile Ozone Merged Data Set

    NASA Technical Reports Server (NTRS)

    Frith, Stacey; Stolarski, Richard

    2015-01-01

    The MOD data set is uniquely qualified for use in long-term ozone analysis because of its long record, high spatial coverage, and consistent instrument design and algorithm. The estimated MOD uncertainty term significantly increases the uncertainty over the statistical error alone. Trends in the post-2000 period are generally positive in the upper stratosphere, but only significant at 1-1.6 hPa. Remaining uncertainties not yet included in the Monte Carlo model are the smoothing error (~1% from 10 to 1 hPa), the relative calibration uncertainty between N11 and N17, and seasonal cycle differences between the SBUV records.

  4. Effect of corrosion and stress-corrosion cracking on pipe integrity and remaining life

    SciTech Connect

    Jaske, C.E.; Beavers, J.A.

    1996-07-01

    Process piping is often exposed to corrosive fluids. During service, such exposure may cause localized corrosion or stress-corrosion cracking that affects structural integrity. This paper presents a model that quantifies the effect of localized corrosion and stress-corrosion cracking on pipe failure stress. The model is an extension of those that have been developed for oil and gas pipelines. It accounts for both axial and hoop stress. Cracks are modeled using inelastic fracture mechanics. Both flow-stress and fracture-toughness dependent failure modes are addressed. Corrosion and crack-growth rates are used to predict remaining service life.
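
    A highly simplified sketch of the remaining-life logic for a part-wall corrosion flaw, using a generic Modified-B31G-style flow-stress estimate (illustrative only, not the authors' extended model with axial stress and crack growth):

      import math

      def failure_pressure(D, t, d, L, smys):
          # Modified-B31G-style estimate; D, t, d, L in mm, smys in MPa.
          flow = smys + 69.0                                   # flow stress
          z = L ** 2 / (D * t)
          M = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z ** 2)  # Folias factor
          ratio = (1.0 - 0.85 * d / t) / (1.0 - 0.85 * d / (t * M))
          return 2.0 * flow * t * ratio / D                    # MPa

      def remaining_life(d_now, rate, D, t, L, smys, p_op):
          # Years until the predicted failure pressure falls to operating pressure.
          for years in range(200):
              if failure_pressure(D, t, d_now + rate * years, L, smys) <= p_op:
                  return years
          return None

      print(remaining_life(d_now=2.0, rate=0.2, D=500.0, t=10.0,
                           L=100.0, smys=350.0, p_op=10.0), "years")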

  5. A General Uncertainty Quantification Methodology for Cloud Microphysical Property Retrievals

    NASA Astrophysics Data System (ADS)

    Tang, Q.; Xie, S.; Chen, X.; Zhao, C.

    2014-12-01

    The US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program provides long-term (~20 years) ground-based cloud remote sensing observations. However, there are large uncertainties in the retrieval products of cloud microphysical properties based on the active and/or passive remote-sensing measurements. To address this uncertainty issue, a DOE Atmospheric System Research scientific focus study, Quantification of Uncertainties in Cloud Retrievals (QUICR), has been formed. In addition to an overview of recent progress of QUICR, we will demonstrate the capacity of an observation-based general uncertainty quantification (UQ) methodology via the ARM Climate Research Facility baseline cloud microphysical properties (MICROBASE) product. This UQ method utilizes the Karhunen-Loève expansion (KLE) and Central Limit Theorems (CLT) to quantify the retrieval uncertainties from observations and algorithm parameters. The input perturbations are imposed on major modes to take into account the cross correlations between input data, which greatly reduces the dimension of random variables (by up to a factor of 50) and quantifies vertically resolved full probability distribution functions of retrieved quantities. Moreover, this KLE/CLT approach has the capability of attributing the uncertainties in the retrieval output to individual uncertainty sources and thus sheds light on improving the retrieval algorithm and observations. We will present the results of a case study for the ice water content at the Southern Great Plains during an intensive observing period on March 9, 2000. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
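
    The KLE step amounts to an eigendecomposition of the input covariance so that perturbations are imposed on a few leading modes while preserving cross-level correlation. A minimal sketch with synthetic profiles (not the MICROBASE inputs):

      import numpy as np

      rng = np.random.default_rng(11)
      n_prof, n_lev = 500, 40
      z = np.linspace(0.0, 1.0, n_lev)

      # Synthetic vertically correlated input profiles.
      cov = np.exp(-np.abs(z[:, None] - z[None, :]) / 0.2)
      profiles = rng.multivariate_normal(np.zeros(n_lev), cov, size=n_prof)

      # Karhunen-Loeve expansion: keep the modes explaining 95% of the variance.
      C = np.cov(profiles, rowvar=False)
      vals, vecs = np.linalg.eigh(C)
      vals, vecs = vals[::-1], vecs[:, ::-1]                  # descending order
      k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), 0.95)) + 1
      print(f"{k} of {n_lev} modes carry 95% of the variance")

      # Perturb only the leading modes (this is the dimension reduction).
      xi = rng.normal(size=(1000, k))
      perturbations = xi @ (vecs[:, :k] * np.sqrt(vals[:k])).T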

  6. Modelling ecosystem service flows under uncertainty with stochastic SPAN

    USGS Publications Warehouse

    Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.

    2012-01-01

    Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.

  7. Long-term market brisk, spot remains sluggish

    SciTech Connect

    1996-05-01

    Spot market activity totaled almost 54,000 lbs of U3O8 equivalent. The restricted uranium spot market price range had a slight increase from a high last month of $15.60/lb U3O8 to a high this month of $16.00/lb U3O8. The unrestricted uranium spot market price range remained at last month's prices for the first time in recent weeks. Spot prices for conversion and SWU also held steady at their March levels.

  8. New fossil cercopithecid remains from the Humpata Plateau, southern Angola.

    PubMed

    Jablonski, N G

    1994-08-01

    The aim of the present investigation was to describe and identify the well-preserved cranial remains of a fossil cercopithecid recently recovered from sites on the Humpata Plateau in southern Angola. In the past, papionin fossils recovered from the Angolan site of Tchiua (Leba) have been referred to various taxa, including Dinopithecus ingens, Parapapio sp., and Papio (Dinopithecus) quadratirostris. Comparison of the new Angolan papionin cranial remains with those previously described from the Humpata Plateau and a large range of living and fossil Papionini revealed that the range of metrical and morphological variation present in the Humpata papionin sample was consistent with that found in a single extant papionin species. The Humpata cranial remains bear the largest number of similarities to Theropithecus baringensis R. Leakey, 1969, and it is to this species that the remains are hereby referred. This assignment is based on a suite of 11 shared attributes of the Humpata papionin fossils and the type specimen of T. baringensis, KNM BC2, which include: large molar teeth of relatively low relief with pinched cusps and with a prominent distal fovea on M3; a small, low cranial vault with little mid-parietal expansion; a bow-shaped supraorbital torus; trapezoidal, inferiorly tapering orbits; a functional complex related to the presence of a large and vertically oriented anterior temporalis muscle; a large infratemporal fossa with an anteromedially oriented posterior border; a long muzzle with a steep interorbital drop, shallow incisive arc, flattened dorsum, and rounded maxillary ridges; nasal bones that extend across the breadth of the posterior margin of the nasal aperture and then taper markedly as they approach nasion; prominent, inferiorly divergent mental ridges; and relatively shallow mandibular fossae that are long, elliptical in shape, and extend to the level of the M3. The results of the current study suggest that T. baringensis (now including the Humpata

  9. "Recent" macrofossil remains from the Lomonosov Ridge, central Arctic Ocean

    NASA Astrophysics Data System (ADS)

    Le Duc, Cynthia; de Vernal, Anne; Archambault, Philippe; Brice, Camille; Roberge, Philippe

    2016-04-01

    The examination of surface sediment samples collected from 17 sites along the Lomonosov Ridge at water depths ranging from 737 to 3339 meters during Polarstern Expedition PS87 in 2014 (Stein, 2015) indicates a rich biogenic content almost exclusively dominated by calcareous remains. Amongst biogenic remains, microfossils (planktic and benthic foraminifers, pteropods, ostracods, etc.) dominate, but millimetric to centimetric macrofossils occurred frequently at the surface of the sediment. The macrofossil remains consist of a large variety of taxa, including gastropods, bivalves, polychaete tubes, scaphopods, echinoderm plates and spines, and fish otoliths. Among the Bivalvia, the most abundant taxa are Portlandia arctica, Hyalopecten frigidus, Cuspidaria glacilis, Policordia densicostata, Bathyarca spp., and Yoldiella spp. Whereas a few specimens are well preserved and apparently pristine, most mollusk shells displayed extensive alteration features. Moreover, most shells were covered by millimeter-scale tubes of the serpulid polychaete Spirorbis sp., suggesting transport from the low intertidal or subtidal zone. Both the ecological affinity and the known geographic distribution of the bivalves named above support the hypothesis of transportation rather than local development. In addition to mollusk shells, more than a hundred fish otoliths were recovered in the surface sediments. The otoliths mostly belong to the Gadidae family. Most of them are well preserved and without serpulid tubes attached to their surface, suggesting a local/regional origin, unlike the shell remains. Although recovered at the surface, the macrofaunal assemblages of the Lomonosov Ridge do not necessarily represent "modern" environments, as they may result from reworking, and their occurrence at the surface of the sediment may also be due to winnowing of finer particles. Although the shells were not dated, we suspect that their actual ages may range from modern to several thousands of

  10. Yellow Fever Remains a Potential Threat to Public Health.

    PubMed

    Vasconcelos, Pedro F C; Monath, Thomas P

    2016-08-01

    Yellow fever (YF) remains a serious public health threat in endemic countries. The recent re-emergence in Africa, beginning in Angola and spreading to the Democratic Republic of Congo and Uganda, with imported cases in China and Kenya, is of concern. There is such a shortage of YF vaccine in the world that the World Health Organization has proposed the use of reduced (1/5) doses during emergencies. In this short communication, we discuss these and other problems, including the risk of spread of YF to areas free of YF for decades or never before affected by this arbovirus disease. PMID:27400066

  11. Leprosy: ancient disease remains a public health problem nowadays*

    PubMed Central

    Noriega, Leandro Fonseca; Chiacchio, Nilton Di; Noriega, Angélica Fonseca; Pereira, Gilmayara Alves Abreu Maciel; Vieira, Marina Lino

    2016-01-01

    Despite being an ancient disease, leprosy remains a public health problem in several countries - particularly in India, Brazil and Indonesia. The current operational guidelines emphasize the evaluation of disability from the time of diagnosis and stipulate as fundamental principles for disease control: early detection and proper treatment. Continued efforts are needed to establish and improve quality leprosy services. A qualified primary care network that is integrated into specialized service and the development of educational activities are part of the arsenal in the fight against the disease, considered neglected and stigmatizing. PMID:27579761

  12. Encephalitozoon cuniculi in Raw Cow's Milk Remains Infectious After Pasteurization.

    PubMed

    Kváč, Martin; Tomanová, Vendula; Samková, Eva; Koubová, Jana; Kotková, Michaela; Hlásková, Lenka; McEvoy, John; Sak, Bohumil

    2016-02-01

    This study describes the prevalence of Encephalitozoon cuniculi in raw cow's milk and evaluates the effect of different milk pasteurization treatments on E. cuniculi infectivity for severe combined immunodeficient (SCID) mice. Using a nested polymerase chain reaction approach, 1 of 50 milking cows was found to repeatedly shed E. cuniculi in its feces and milk. Under experimental conditions, E. cuniculi spores in milk remained infective for SCID mice following pasteurization treatments at 72 °C for 15 s or 85 °C for 5 s. Based on these findings, pasteurized cow's milk should be considered a potential source of E. cuniculi infection in humans. PMID:26650923

  13. Leprosy: ancient disease remains a public health problem nowadays.

    PubMed

    Noriega, Leandro Fonseca; Chiacchio, Nilton Di; Noriega, Angélica Fonseca; Pereira, Gilmayara Alves Abreu Maciel; Vieira, Marina Lino

    2016-01-01

    Despite being an ancient disease, leprosy remains a public health problem in several countries, particularly in India, Brazil and Indonesia. The current operational guidelines emphasize the evaluation of disability from the time of diagnosis and stipulate as fundamental principles for disease control: early detection and proper treatment. Continued efforts are needed to establish and improve quality leprosy services. A qualified primary care network that is integrated into specialized service and the development of educational activities are part of the arsenal in the fight against the disease, considered neglected and stigmatizing. PMID:27579761

  14. Studies on protozoa in ancient remains - A Review

    PubMed Central

    Frías, Liesbeth; Leles, Daniela; Araújo, Adauto

    2013-01-01

    Paleoparasitological research has made important contributions to the understanding of parasite evolution and ecology. Although parasitic protozoa exhibit a worldwide distribution, recovering these organisms from an archaeological context is still exceptional and relies on the availability and distribution of evidence, the ecology of infectious diseases and adequate detection techniques. Here, we present a review of the findings related to protozoa in ancient remains, with an emphasis on their geographical distribution in the past and the methodologies used for their retrieval. The development of more sensitive detection methods has increased the number of identified parasitic species, promising interesting insights from research in the future. PMID:23440107

  15. Remains to be transmitted: Primo Levi's traumatic dream.

    PubMed

    Blévis, Jean-Jacques

    2004-07-01

    Drawing on the writings of Primo Levi and the psychoanalysis of Jacques Lacan, the author attempts to conceive psychic trauma as a coalescence of traumas, since this is perhaps the only way to prevent a subject from being forced back into identification with the catastrophic event, whatever that may have been. A recurrent dream of Primo Levi's suggests to the author the way that traumas may have coalesced within Levi. The hope would be to restore the entire significance of what remains from that traumatic event to the speech (parole) of the Other, to the speech of every human, even the most helpless, bruised, or destroyed among us. PMID:15287444

  16. Research potential and limitations of trace analyses of cremated remains.

    PubMed

    Harbeck, Michaela; Schleuder, Ramona; Schneider, Julius; Wiechmann, Ingrid; Schmahl, Wolfgang W; Grupe, Gisela

    2011-01-30

    Human cremation is a common funeral practice all over the world and will presumably become an even more popular choice for interment in the future. Mainly for purposes of identification, there is presently a growing need to perform trace analyses such as DNA or stable isotope analyses on human remains after cremation in order to clarify pending questions in civil or criminal court cases. The aim of this study was to experimentally test the potential and limitations of DNA and stable isotope analyses when conducted on cremated remains. For this purpose, tibiae from modern cattle were experimentally cremated by incinerating the bones in increments of 100°C until a maximum of 1000°C was reached. In addition, cremated human remains were collected from a modern crematory. The samples were investigated to determine the level of DNA preservation and stable isotope values (C and N in collagen, C and O in the structural carbonate, and Sr in apatite). Furthermore, we assessed the integrity of microstructural organization, appearance under UV-light, collagen content, as well as the mineral and crystalline organization. This was conducted in order to provide a general background with which to explain observed changes in the trace-analysis data sets. The goal is to develop an efficacious screening method for determining at which degree of burning bone still retains its original biological signals. We found that stable isotope analysis of the tested light elements in bone is only possible up to a heat exposure of 300°C, while the isotopic signal from strontium remains unaltered even in bones exposed to very high temperatures. DNA analyses seem theoretically possible up to a heat exposure of 600°C but cannot be advised in every case because of the increased risk of contamination. While the macroscopic colour and UV-fluorescence of cremated bone give hints to temperature exposure of the bone's outer surface, its histological appearance can be used as a reliable indicator for the

  17. Uncertainty analysis on photogrammetry-derived national shoreline

    NASA Astrophysics Data System (ADS)

    Yao, Fang

    Photogrammetric shoreline mapping remains the primary method for mapping the national shoreline used by the National Geodetic Survey (NGS) in the National Oceanic and Atmospheric Administration (NOAA). To date, NGS has not conducted a statistical analysis of the photogrammetry-derived shoreline uncertainty. The aim of this thesis is to develop and test a rigorous total propagated uncertainty (TPU) model for shoreline compiled from both tide-coordinated and non-tide-coordinated aerial imagery using photogrammetric methods. Survey imagery collected over a study site in northeast Maine was used to test the TPU model. The TPU model developed in this thesis can easily be extended to other areas and may facilitate estimation of uncertainty in inundation models and marsh migration models.

  18. Addressing Free Radical Oxidation in Acne Vulgaris

    PubMed Central

    Criscito, Maressa C.; Schlesinger, Todd E.; Verdicchio, Robert; Szoke, Ernest

    2016-01-01

    Objective: Comparatively little attention has been paid to the role of free radical oxidation in acne vulgaris. Here, using the traditional abnormalities cited for acne, the authors address the role of free radical oxidation throughout the pathogenesis by detailing the chemistry that may contribute to clinical changes. To probe the effects of free radical oxidation and test an antioxidant, they conducted a preliminary study of topically applied vitamin E. Methods: Seventeen patients with mild-to-moderate acne vulgaris were evaluated over an eight-week period in two private dermatology practices in this open-label study. All patients enrolled were on the same baseline regimen of salicylic acid and benzoyl peroxide. This regimen was then supplemented with topical vitamin E in sunflower seed oil. Results: At the end of the eight-week period, all patients demonstrated clinical improvement, as indicated by a reduction in the number of lesions and global mean difference. A statistically significant reduction was noted as early as Week 2. Enrolled patients also expressed a positive experience due to good tolerability and easy application. Conclusion: Although the exact pathogenesis of acne vulgaris remains unknown, the presence of excessive reactive oxygen species can be implicated in each of the major abnormalities involved. This presence, along with the positive results of the authors’ preliminary study, demonstrates the need for more exploration on the use of topical antioxidants in limiting free radical oxidation in the acne model. This paper is designed to stimulate academic discussion regarding a new way of thinking about the disease state of acne. PMID:26962389

  19. Point Cloud Metrics for Separating Standing Archaeological Remains and Low Vegetation in ALS Data

    NASA Astrophysics Data System (ADS)

    Opitz, R.; Nuninger, L.

    2013-07-01

    The integration of Airborne Laser Scanning survey into archaeological research and cultural heritage management has substantially added to our knowledge of archaeological remains in forested areas, and is changing our understanding of how these landscapes functioned in the past. While many types of archaeological remains manifest as micro-topography, several important classes of features commonly appear as standing remains. The identification of these remains is important for archaeological prospection surveys based on ALS data; such standing remains typically represent structures from the Roman, Medieval and early Modern periods. Standing structures in mixed scenes with vegetation are not well addressed by standard classification approaches developed to identify bare earth (terrain), individual trees or plot characteristics, or buildings (roofed structures). In this paper we propose an approach to the identification of these structures in the point cloud based on multi-scale measures of local density, roughness, and normal orientation. We demonstrate this approach using discrete-return ALS data collected in the Franche-Comte region of France at a nominal point density of 8 pts/m², a resolution which, in coming years, will become increasingly available to archaeologists through government-supported mapping schemes.
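
    A minimal sketch of per-point, multi-scale metrics of this kind, computed with a k-d tree over a synthetic cloud (the radii and definitions are illustrative, not the authors' exact choices):

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(2)
      pts = rng.uniform(0.0, 50.0, size=(20_000, 3))   # synthetic ALS returns
      tree = cKDTree(pts)

      def local_metrics(i, radius):
          nbrs = tree.query_ball_point(pts[i], r=radius)
          q = pts[nbrs] - pts[nbrs].mean(axis=0)
          density = len(nbrs) / (4.0 / 3.0 * np.pi * radius ** 3)
          # Neighborhood PCA: the smallest eigenvalue measures roughness
          # (off-plane scatter); its eigenvector approximates the surface normal.
          w, v = np.linalg.eigh(q.T @ q / max(len(nbrs) - 1, 1))
          roughness = w[0] / max(w.sum(), 1e-12)
          normal_verticality = abs(v[2, 0])    # |z| component of the normal
          return density, roughness, normal_verticality

      for r in (0.5, 1.0, 2.0):                # the multi-scale part
          print(r, local_metrics(0, r))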

  20. Quantification of precipitation and temperature uncertainties simulated by CMIP3 and CMIP5 models

    NASA Astrophysics Data System (ADS)

    Woldemeskel, F. M.; Sharma, A.; Sivakumar, B.; Mehrotra, R.

    2016-01-01

    Assessment of climate change impacts on water resources is extremely challenging, due to the inherent uncertainties in climate projections using global climate models (GCMs). Three main sources of uncertainty can be identified in GCMs: model structure, emission scenario, and natural variability. The recently released fifth phase of the Coupled Model Intercomparison Project (CMIP5) includes a number of advances relative to its predecessor (CMIP3), in terms of the spatial resolution of models, the list of variables, and the concept of specifying future radiative forcing, among others. The question, however, is whether these modifications indeed reduce the uncertainty in the projected climate at global and/or regional scales. We address this question by quantifying and comparing uncertainty in precipitation and temperature from 6 CMIP3 and 13 CMIP5 models. Uncertainty is quantified using the square root of error variance, which specifies uncertainty as a function of time and space and decomposes the total uncertainty into its three constituents. The results indicate a visible reduction in the uncertainty of CMIP5 precipitation relative to CMIP3 but no significant change for temperature. For precipitation, the GCM uncertainty is found to be larger in regions of the world that receive heavy rainfall, as well as in mountainous and coastal areas. For temperature, however, uncertainty is larger in extratropical cold regions and lower-elevation areas.
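
    The decomposition itself can be sketched as a simple analysis of variance over a model-by-scenario-by-run ensemble; the numbers below are synthetic, not CMIP output:

      import numpy as np

      rng = np.random.default_rng(9)
      n_models, n_scen, n_runs = 13, 4, 5

      # Synthetic projections: model offsets + scenario offsets + internal noise.
      x = (rng.normal(0.0, 1.0, (n_models, 1, 1))               # model structure
           + rng.normal(0.0, 0.5, (1, n_scen, 1))               # emission scenario
           + rng.normal(0.0, 0.3, (n_models, n_scen, n_runs)))  # natural variability

      var_model = x.mean(axis=(1, 2)).var()
      var_scen = x.mean(axis=(0, 2)).var()
      var_internal = x.var(axis=2).mean()
      total = var_model + var_scen + var_internal
      for name, v in [("model", var_model), ("scenario", var_scen),
                      ("internal", var_internal)]:
          print(f"{name:9s} {100.0 * v / total:5.1f}% of total variance")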