Sample records for uncertainty analysis food

  1. Regulating genetically modified food. Policy trajectories, political culture, and risk perceptions in the U.S., Canada, and EU.

    PubMed

    Wohlers, Anton E

    2010-09-01

    This paper examines whether national differences in political culture add an explanatory dimension to the formulation of policy in the area of biotechnology, especially with respect to genetically modified food. The analysis links the formulation of protective regulatory policies governing genetically modified food to country- and region-specific differences in uncertainty tolerance levels and risk perceptions in the United States, Canada, and the European Union. Based on polling data and document analysis, the findings illustrate that these differences matter. Following a mostly opportunistic risk perception within an environment of high tolerance for uncertainty, policymakers in the United States and Canada modified the existing regulatory frameworks that govern genetically modified food in their respective countries. In contrast, the mostly cautious perception of new food technologies and low tolerance for uncertainty among European Union member states have contributed to the creation of elaborate and stringent regulatory policies governing genetically modified food.

  2. Organic food consumption in Taiwan: Motives, involvement, and purchase intention under the moderating role of uncertainty.

    PubMed

    Teng, Chih-Ching; Lu, Chi-Heng

    2016-10-01

    Despite the progressive development of the organic food sector in Taiwan, little is known about how consumers' consumption motives influence organic food decisions through various degrees of involvement, or whether consumers with various degrees of uncertainty vary in their intention to buy organic foods. The current study examines the effect of consumption motives on behavioral intention related to organic food consumption, with involvement as a mediator and uncertainty as a moderator. Research data were collected from organic food consumers in Taiwan via a questionnaire survey, yielding 457 valid questionnaires for analysis. The overall model fit and hypotheses were tested through structural equation modeling (SEM). The results show that consumer involvement significantly mediates the effects of health consciousness and ecological motives on organic food purchase intention, but not the effect of food safety concern. Moreover, the moderating effect of uncertainty is statistically significant, indicating that the relationship between involvement and purchase intention becomes weaker for consumers with a higher degree of uncertainty. Several implications and suggestions are also discussed for organic food providers and marketers.

  3. A probabilistic model for deriving soil quality criteria based on secondary poisoning of top predators. I. Model description and uncertainty analysis.

    PubMed

    Traas, T P; Luttik, R; Jongbloed, R H

    1996-08-01

    In previous studies, the risk of toxicant accumulation in food chains was used to calculate quality criteria for surface water and soil. A simple algorithm was used to calculate maximum permissible concentrations [MPC = no-observed-effect concentration/bioconcentration factor (NOEC/BCF)]. These studies were limited to simple food chains. This study presents a method to calculate MPCs for more complex food webs of predators, expanding the previous approach. First, toxicity data (NOECs) for several compounds were corrected for differences between laboratory animals and animals in the wild. Second, for each compound, these NOECs were assumed to be a sample from a log-logistic distribution of mammalian and avian NOECs. Third, bioaccumulation factors (BAFs) for major food items of predators were collected and assumed to derive from different log-logistic distributions of BAFs. Fourth, MPCs for each compound were calculated using Monte Carlo sampling from the NOEC and BAF distributions. An uncertainty analysis for cadmium was performed to identify the most uncertain parameters of the model. Model analysis indicated that most of the prediction uncertainty can be ascribed to uncertainty in species sensitivity as expressed by NOECs; a very small proportion is contributed by BAFs from food webs. Correction factors for converting NOECs from laboratory conditions to the field have some influence on the final value of the MPC5, but the total prediction uncertainty of the MPC is quite large, dominated by the uncertainty in species sensitivity. Because toxicity testing with mammalian or avian predators would be unethical, this uncertainty cannot be avoided in the proposed method for calculating MPC distributions. The fifth percentile of the MPC is suggested as a safe value for top predators.
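The Monte Carlo derivation of an MPC distribution described above can be sketched in a few lines. All distribution parameters below are hypothetical placeholders, not the paper's fitted values for cadmium; only the structure (log-logistic NOEC and BAF distributions, ratio sampling, fifth percentile) follows the method described:

```python
import random

def sample_loglogistic(scale, shape):
    """Draw one value from a log-logistic distribution via inverse-CDF sampling."""
    u = random.random()
    return scale * (u / (1.0 - u)) ** (1.0 / shape)

random.seed(1)
n_draws = 10_000
# Hypothetical scale/shape parameters (illustration only).
noec = [sample_loglogistic(scale=50.0, shape=3.0) for _ in range(n_draws)]  # mg/kg in food
baf = [sample_loglogistic(scale=0.8, shape=4.0) for _ in range(n_draws)]    # food/soil ratio
mpc = sorted(n / b for n, b in zip(noec, baf))  # MPC = NOEC / BAF, mg/kg soil
mpc5 = mpc[int(0.05 * n_draws)]  # fifth percentile: the suggested safe value
print(round(mpc5, 1))
```

Because the MPC is a ratio of two skewed random variables, its distribution is itself strongly skewed, which is why a low percentile rather than a mean is reported as the safe value.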

  4. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study focuses on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models in which a significant number of parameters and complex exposure scenarios may be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web and to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty onto model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty was addressed extensively, both in the distribution functions describing the input data and in its effect on model results, by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be influenced mainly by a combination of chemical-specific parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated that MERLIN-Expo can be successfully employed in integrated, high-tier exposure assessment.

  5. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  6. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
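The bootstrap procedure mentioned above (resampling survey answers to quantify uncertainty in a fitted quantity) can be sketched as follows; the storage-time values are invented for illustration, not taken from the survey:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05):
    """Percentile-bootstrap confidence interval for a statistic of the data."""
    reps = sorted(stat(random.choices(data, k=len(data))) for _ in range(n_boot))
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]

random.seed(7)
storage_days = [1, 2, 2, 3, 3, 3, 4, 5, 5, 7, 10, 14]  # hypothetical responses
lo, hi = bootstrap_ci(storage_days)
print(round(lo, 2), round(hi, 2))
```

The same resampling loop works for any fitted quantity (a distribution parameter, a quantile), which is what makes it convenient for describing uncertainty alongside fitted consumption distributions.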

  7. Consumer-phase Salmonella enterica serovar enteritidis risk assessment for egg-containing food products.

    PubMed

    Mokhtari, Amirhossein; Moore, Christina M; Yang, Hong; Jaykus, Lee-Ann; Morales, Roberta; Cates, Sheryl C; Cowen, Peter

    2006-06-01

    We describe a one-dimensional probabilistic model of the role of domestic food handling behaviors on salmonellosis risk associated with the consumption of eggs and egg-containing foods. Six categories of egg-containing foods were defined based on the amount of egg contained in the food, whether eggs are pooled, and the degree of cooking practiced by consumers. We used bootstrap simulation to quantify uncertainty in risk estimates due to sampling error, and sensitivity analysis to identify key sources of variability and uncertainty in the model. Because of typical model characteristics such as nonlinearity, interaction between inputs, thresholds, and saturation points, Sobol's method, a novel sensitivity analysis approach, was used to identify key sources of variability. Based on the mean probability of illness, examples of foods from the food categories ranked from most to least risk of illness were: (1) home-made salad dressings/ice cream; (2) fried eggs/boiled eggs; (3) omelettes; and (4) baked foods/breads. For food categories that may include uncooked eggs (e.g., home-made salad dressings/ice cream), consumer handling conditions such as storage time and temperature after food preparation were the key sources of variability. In contrast, for food categories associated with undercooked eggs (e.g., fried/soft-boiled eggs), the initial level of Salmonella contamination and the log10 reduction due to cooking were the key sources of variability. Important sources of uncertainty varied with both the risk percentile and the food category under consideration. This work adds to previous risk assessments focused on egg production and storage practices, and provides a science-based approach to inform consumer risk communications regarding safe egg handling practices.

  8. Challenges and regulatory considerations in the acoustic measurement of high-frequency (>20 MHz) ultrasound.

    PubMed

    Nagle, Samuel M; Sundar, Guru; Schafer, Mark E; Harris, Gerald R; Vaezy, Shahram; Gessert, James M; Howard, Samuel M; Moore, Mary K; Eaton, Richard M

    2013-11-01

    This article examines the challenges associated with making acoustic output measurements at high ultrasound frequencies (>20 MHz) in the context of regulatory considerations contained in the US Food and Drug Administration industry guidance document for diagnostic ultrasound devices. Error sources in the acoustic measurement, including hydrophone calibration and spatial averaging, nonlinear distortion, and mechanical alignment, are evaluated, and the limitations of currently available acoustic measurement instruments are discussed. An uncertainty analysis of acoustic intensity and power measurements is presented, and an example uncertainty calculation is done for a hypothetical 30-MHz high-frequency ultrasound system. This analysis concludes that the estimated measurement uncertainty of the acoustic intensity is +73%/-86%, and the uncertainty in the mechanical index is +37%/-43%. These values exceed the corresponding levels of 30% and 15% in the Food and Drug Administration guidance document, which are more representative of the measurement uncertainty associated with characterizing lower-frequency ultrasound systems. Recommendations for minimizing the measurement uncertainty include implementing a mechanical positioning system with sufficient repeatability and precision, reconstructing the time-pressure waveform via deconvolution using the hydrophone frequency response, and correcting for hydrophone spatial averaging.
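A standard way to roll individual error sources into an overall measurement uncertainty is root-sum-of-squares combination of independent components. The component values below are invented placeholders, not the paper's budget (which is asymmetric, so simple quadrature does not reproduce its +73%/-86% figure):

```python
import math

# Hypothetical k=1 component uncertainties, in percent (illustration only).
sources = {
    "hydrophone_calibration": 12.0,
    "spatial_averaging": 8.0,
    "nonlinear_distortion": 5.0,
    "mechanical_alignment": 4.0,
}

u_c = math.sqrt(sum(u * u for u in sources.values()))  # combined standard uncertainty
U = 2.0 * u_c  # expanded uncertainty at coverage factor k=2
print(round(u_c, 1), round(U, 1))  # prints 15.8 31.6
```

Quadrature shows why one dominant component (here hydrophone calibration) largely sets the total: halving a small term barely moves the combined value.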

  9. Managing uncertainty: a review of food system scenario analysis and modelling

    PubMed Central

    Reilly, Michael; Willenbockel, Dirk

    2010-01-01

    Complex socio-ecological systems like the food system are unpredictable, especially to long-term horizons such as 2050. In order to manage this uncertainty, scenario analysis has been used in conjunction with food system models to explore plausible future outcomes. Food system scenarios use a diversity of scenario types and modelling approaches determined by the purpose of the exercise and by technical, methodological and epistemological constraints. Our case studies do not suggest Malthusian futures for a projected global population of 9 billion in 2050; but international trade will be a crucial determinant of outcomes; and the concept of sustainability across the dimensions of the food system has been inadequately explored so far. The impact of scenario analysis at a global scale could be strengthened with participatory processes involving key actors at other geographical scales. Food system models are valuable in managing existing knowledge on system behaviour and ensuring the credibility of qualitative stories but they are limited by current datasets for global crop production and trade, land use and hydrology. Climate change is likely to challenge the adaptive capacity of agricultural production and there are important knowledge gaps for modelling research to address. PMID:20713402

  10. Energy prices will play an important role in determining global land use in the twenty-first century

    NASA Astrophysics Data System (ADS)

    Steinbuks, Jevgenijs; Hertel, Thomas W.

    2013-03-01

    Global land use research to date has focused on quantifying uncertainty effects of three major drivers affecting competition for land: the uncertainty in energy and climate policies affecting competition between food and biofuels, the uncertainty of climate impacts on agriculture and forestry, and the uncertainty in the underlying technological progress driving efficiency of food, bioenergy and timber production. The market uncertainty in fossil fuel prices has received relatively less attention in the global land use literature. Petroleum and natural gas prices affect both the competitiveness of biofuels and the cost of nitrogen fertilizers. High prices put significant pressure on global land supply and greenhouse gas emissions from terrestrial systems, while low prices can moderate demands for cropland. The goal of this letter is to assess and compare the effects of these core uncertainties on the optimal profile for global land use and land-based GHG emissions over the coming century. The model that we develop integrates distinct strands of agronomic, biophysical and economic literature into a single, intertemporally consistent, analytical framework, at global scale. Our analysis accounts for the value of land-based services in the production of food, first- and second-generation biofuels, timber, forest carbon and biodiversity. We find that long-term uncertainty in energy prices dominates the climate impacts and climate policy uncertainties emphasized in prior research on global land use.

  11. Uncertainty in simulating wheat yields under climate change

    USDA-ARS?s Scientific Manuscript database

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change...

  12. Statistical adjustment of culture-independent diagnostic tests for trend analysis in the Foodborne Diseases Active Surveillance Network (FoodNet), USA.

    PubMed

    Gu, Weidong; Dutta, Vikrant; Patrick, Mary; Bruce, Beau B; Geissler, Aimee; Huang, Jennifer; Fitzgerald, Collette; Henao, Olga

    2018-03-19

    Culture-independent diagnostic tests (CIDTs) are increasingly used to diagnose Campylobacter infection in the Foodborne Diseases Active Surveillance Network (FoodNet). Because CIDTs have different performance characteristics from culture, which has been used historically and is still used to diagnose campylobacteriosis, cases diagnosed by CIDT must be adjusted before they can be compared with culture-confirmed cases for monitoring incidence trends. We identified the parameters necessary for CIDT adjustment using culture as the gold standard, and derived formulas to calculate positive predictive values (PPVs). We conducted a literature review and meta-analysis to examine the variability in CIDT performance and Campylobacter prevalence applicable to FoodNet sites. We then developed a Monte Carlo method to estimate test-type- and site-specific PPVs with their associated uncertainties. The uncertainty in our estimated PPVs was largely driven by uncertainty about the specificity of CIDTs and the low prevalence of Campylobacter in tested samples. CIDT-adjusted incidence of Campylobacter remained stable from 2012 to 2015, whereas culture-confirmed incidence declined. We highlight the lack of data on the total number of tested samples as one of the main limitations for CIDT adjustment. Our results demonstrate the importance of adjusting CIDTs for understanding trends in Campylobacter incidence in FoodNet.
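The PPV at the core of this adjustment depends only on the test's sensitivity and specificity and the prevalence in tested samples; propagating uncertainty through it by Monte Carlo can be sketched as below. The ranges are hypothetical, not the paper's meta-analysis estimates:

```python
import random

def ppv(sens, spec, prev):
    """Positive predictive value: P(truly positive | test positive)."""
    return sens * prev / (sens * prev + (1.0 - spec) * (1.0 - prev))

random.seed(3)
draws = sorted(
    ppv(sens=random.uniform(0.90, 0.99),
        spec=random.uniform(0.95, 0.999),  # uncertain specificity
        prev=random.uniform(0.02, 0.08))   # low prevalence in tested samples
    for _ in range(5000)
)
median, lo, hi = draws[2500], draws[125], draws[-126]  # median and ~95% interval
print(round(median, 2), round(lo, 2), round(hi, 2))
```

With low prevalence, even a small false-positive rate (1 - specificity) yields many false positives relative to true positives, which is why specificity uncertainty dominates the PPV uncertainty here.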

  13. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides: An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    EPA Pesticide Factsheets

    Federal Guidance Report No. 13 (FGR 13) provides cancer risk coefficients for modes of environmental exposure to each of more than 800 radionuclides (EPA 1999), including inhalation of airborne activity and ingestion of activity in food or drinking water.

  14. Consumer responses to communication about food risk management.

    PubMed

    van Dijk, Heleen; Houghton, Julie; van Kleef, Ellen; van der Lans, Ivo; Rowe, Gene; Frewer, Lynn

    2008-01-01

    Recent emphasis within policy circles has been on transparent communication with consumers about food risk management decisions and practices. As a consequence, it is important to develop best practice regarding communication with the public about how food risks are managed. In the current study, the provision of information about regulatory enforcement, proactive risk management, scientific uncertainty and risk variability was manipulated in an experiment designed to examine its impact on consumer perceptions of food risk management quality. To compare consumer reactions across different cases, three food hazards were selected (mycotoxins on organically grown food, pesticide residues, and a genetically modified potato). Data were collected from representative samples of consumers in Germany, Greece, Norway and the UK. Scores on the "perceived food risk management quality" scale were subjected to a repeated-measures mixed linear model. The analysis points to a number of important findings, including the existence of cultural variation in the impact of risk communication strategies, a finding with obvious implications for pan-European risk communication approaches. For example, while communication of uncertainty had a positive impact in Germany, it had a negative impact in the UK and Norway. Results also indicate that food risk managers should inform the public about enforcement of safety laws when communicating scientific uncertainty associated with risks. This has implications for the coordination of risk communication strategies between risk assessment and risk management organizations.

  15. Contextual Uncertainties, Human Mobility, and Perceived Food Environment: The Uncertain Geographic Context Problem in Food Access Research.

    PubMed

    Chen, Xiang; Kwan, Mei-Po

    2015-09-01

    We examined the uncertainty of the contextual influences on food access through an analytic framework of the uncertain geographic context problem (UGCoP). We first examined the compounding effects of two kinds of spatiotemporal uncertainties on people's everyday efforts to procure food and then outlined three key dimensions (food access in real time, temporality of the food environment, and perceived nutrition environment) in which research on food access must improve to better represent the contributing environmental influences that operate at the individual level. Guidelines to address the UGCoP in future food access research are provided to account for the multidimensional influences of the food environment on dietary behaviors.

  16. Food plant toxicants and safety: Risk assessment and regulation of inherent toxicants in plant foods.

    PubMed

    Essers, A J; Alink, G M; Speijers, G J; Alexander, J; Bouwmeister, P J; van den Brandt, P A; Ciere, S; Gry, J; Herrman, J; Kuiper, H A; Mortby, E; Renwick, A G; Shrimpton, D H; Vainio, H; Vittozzi, L; Koeman, J H

    1998-05-01

    The ADI as a tool for risk management and regulation of food additives and pesticide residues is not readily applicable to inherent food plant toxicants: the margin between actual intake and potentially toxic levels is often small, and applying the default uncertainty factors used to derive ADI values, particularly when extrapolating from animal data, would prohibit the utilisation of foods that may have an overall beneficial health effect. Levels of inherent toxicants are difficult to control, and their complete removal is not always wanted, given their function for the plant or for human health. The health impact of an inherent toxicant is often modified by factors in the food, e.g. its bioavailability from the matrix and interaction with other inherent constituents. Risk-benefit analysis should be made for different consumption scenarios, without the use of uncertainty factors. Crucial in this approach is analysis of the toxicity of the whole foodstuff. The relationship between the whole foodstuff and the pure toxicant is expressed in the 'product correction factor' (PCF). Investigations in humans are essential so that biomarkers of exposure and effect can be used to analyse the differences between animals and humans and between the food and the pure toxicant. A grid of the variables characterising toxicity is proposed, showing their inter-relationships. A flow diagram for risk estimation is provided, using both toxicological and epidemiological studies.

  17. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
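The ensemble idea in this record (run the crop model many times with perturbed inputs, then read uncertainty bounds off the dispersion of outcomes) can be sketched with a stand-in for the crop model; the yield numbers are invented for illustration:

```python
import random
import statistics

random.seed(11)

def crop_model():
    """Stand-in for one process-based crop model run with perturbed weather/management."""
    return statistics.fmean(random.gauss(6.0, 0.6) for _ in range(5))  # yield, t/ha

ensemble = sorted(crop_model() for _ in range(200))
forecast = statistics.median(ensemble)   # central forecast
p10, p90 = ensemble[19], ensemble[179]   # dispersion-based uncertainty bounds
print(round(forecast, 2), round(p10, 2), round(p90, 2))
```

Reporting the forecast with percentile bounds, instead of a single deterministic number, is exactly what makes the output usable for quantitative risk analysis in a decision-making process.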

  18. Adaptations in a hierarchical food web of southeastern Lake Michigan

    USGS Publications Warehouse

    Krause, Ann E.; Frank, Ken A.; Jones, Michael L.; Nalepa, Thomas F.; Barbiero, Richard P.; Madenjian, Charles P.; Agy, Megan; Evans, Marlene S.; Taylor, William W.; Mason, Doran M.; Léonard, Nancy J.

    2009-01-01

    Two issues in ecological network theory are: (1) how to construct an ecological network model and (2) how do entire networks (as opposed to individual species) adapt to changing conditions? We present a novel method for constructing an ecological network model for the food web of southeastern Lake Michigan (USA), and we identify changes in key system properties that are large relative to their uncertainty as this ecological network adapts from one time point to a second in response to multiple perturbations. To construct the food web, we followed the seven recommendations outlined in Cohen et al. [Cohen, J.E., et al., 1993. Improving food webs. Ecology 74, 252–258]. We explored two interrelated extensions of hierarchical systems theory with our food web: first, that subsystems react to perturbations independently in the short term; and second, that a system's properties change at a slower rate than its subsystems' properties. We used Shannon's equations to provide quantitative versions of the basic food web properties: number of prey, number of predators, number of feeding links, and connectance (or density). We then compared these properties between the two time periods by developing, for each period, distributions of each property that took uncertainty about the property into account. Non-overlapping distributions indicated changes in a property that were large relative to its uncertainty. Two subsystems were identified within our food web system structure (p < 0.001). One subsystem had more non-overlapping distributions in food web properties between Time 1 and Time 2 than the other, while the overall system had all overlapping distributions between Time 1 and Time 2. These results supported both extensions of hierarchical systems theory. Interestingly, the subsystem with more non-overlapping distributions in food web properties was the one containing primarily benthic taxa, contrary to expectations that the identified major perturbations (lower phosphorus inputs and invasive species) would more greatly affect the subsystem containing primarily pelagic taxa. Future food-web research should employ rigorous statistical analysis and incorporate uncertainty in food web properties for a better understanding of how ecological networks adapt.
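The basic food web properties listed in this record have simple definitional forms; below is a sketch using plain counts on a made-up toy web (the paper's Shannon-equation versions are weighted analogues, and this is not the Lake Michigan web):

```python
# Toy feeding links as (prey, predator) pairs -- illustrative only.
links = {
    ("phytoplankton", "zooplankton"),
    ("zooplankton", "alewife"),
    ("alewife", "salmon"),
    ("phytoplankton", "benthos"),
    ("benthos", "whitefish"),
}
species = {s for link in links for s in link}

n_species = len(species)
n_links = len(links)
n_prey = len({prey for prey, _ in links})        # species eaten by something
n_predators = len({pred for _, pred in links})   # species that eat something
connectance = n_links / n_species ** 2           # realized fraction of possible links
print(n_species, n_links, n_prey, n_predators, round(connectance, 3))
```

Comparing bootstrap-style distributions of such properties between two periods, rather than the point values, is what lets a change be judged "large relative to its uncertainty".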

  19. Analysis of polonium-210 in food products and bioassay samples by isotope-dilution alpha spectrometry.

    PubMed

    Lin, Zhichao; Wu, Zhongyu

    2009-05-01

    A rapid and reliable radiochemical method coupled with a simple and compact plating apparatus was developed, validated, and applied for the analysis of (210)Po in a variety of food products and bioassay samples. The method performance characteristics, including accuracy, precision, robustness, and specificity, were evaluated along with a detailed measurement uncertainty analysis. With high Po recovery, improved energy resolution, and effective removal of interfering elements by chromatographic extraction, the overall method accuracy was determined to be better than 5%, with a measurement precision of 10% at the 95% confidence level.
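Isotope-dilution quantification rests on a simple ratio: a tracer of known activity (commonly (209)Po in (210)Po work) is spiked into the sample, and the analyte activity follows from the ratio of net peak counts, because both isotopes share the same chemistry, recovery, and counting geometry. A sketch with invented numbers:

```python
# Hypothetical counting data (illustration only).
tracer_activity_bq = 0.50   # known activity of the added tracer spike
counts_tracer = 1200        # net counts in the tracer alpha peak
counts_po210 = 840          # net counts in the (210)Po alpha peak

# Detection efficiency and chemical recovery cancel in the count ratio.
activity_po210_bq = tracer_activity_bq * counts_po210 / counts_tracer
print(activity_po210_bq)  # prints 0.35
```

Because recovery cancels out, losses during the radiochemical separation affect counting statistics but not the accuracy of the activity estimate itself.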

  20. Quantification of uncertainties in global grazing systems assessment

    NASA Astrophysics Data System (ADS)

    Fetzel, T.; Havlik, P.; Herrero, M.; Kaplan, J. O.; Kastner, T.; Kroisleitner, C.; Rolinski, S.; Searchinger, T.; Van Bodegom, P. M.; Wirsenius, S.; Erb, K.-H.

    2017-07-01

    Livestock systems play a key role in global sustainability challenges like food security and climate change, yet many unknowns and large uncertainties prevail. We present a systematic, spatially explicit assessment of uncertainties related to grazing intensity (GI), a key metric for assessing ecological impacts of grazing, by combining existing data sets on (a) grazing feed intake, (b) the spatial distribution of livestock, (c) the extent of grazing land, and (d) its net primary productivity (NPP). An analysis of the resulting 96 maps implies that on average 15% of grazing land NPP is consumed by livestock. GI is low in most of the world's grazing lands, but hotspots of very high GI prevail in 1% of the total grazing area. The agreement between GI maps is good on one fifth of the world's grazing area, while on the remainder it is low to very low. The largest uncertainties are found in global drylands and where grazing land bears trees (e.g., the Amazon basin or the taiga belt). In some regions, like India or Western Europe, massive uncertainties even produce GI estimates above 100%. Our sensitivity analysis indicates that the input data for NPP, animal distribution, and grazing area contribute about equally to the total variability in GI maps, while grazing feed intake is a less critical variable. We argue that a general improvement in the quality of the available global-level data sets is a precondition for improving the understanding of the role of livestock systems in the context of global environmental change and food security.
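Grazing intensity as used above is simply the share of grazing-land NPP consumed as livestock feed, computed cell by cell; mismatched input data sets can therefore push it past 100%. A minimal sketch with invented per-cell values:

```python
# Hypothetical grid cells: livestock feed intake and grazing-land NPP, gC/m2/yr.
cells = [
    {"name": "rangeland", "intake": 30.0, "npp": 400.0},
    {"name": "pasture", "intake": 120.0, "npp": 500.0},
    {"name": "mismatch", "intake": 260.0, "npp": 240.0},  # inconsistent data sets
]

for cell in cells:
    # GI: percentage of the cell's NPP consumed by grazing livestock.
    cell["gi_percent"] = 100.0 * cell["intake"] / cell["npp"]

print([(c["name"], round(c["gi_percent"], 1)) for c in cells])
```

Repeating the calculation over all combinations of candidate input data sets (96 maps in the paper) turns the spread of GI values per cell into a spatial map of uncertainty.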

  21. Sampling for Chemical Analysis.

    ERIC Educational Resources Information Center

    Kratochvil, Byron; And Others

    1984-01-01

    This review, designed to make analysts aware of uncertainties introduced into analytical measurements during sampling, is organized under these headings: general considerations; theory; standards; and applications related to mineralogy, soils, sediments, metallurgy, atmosphere, water, biology, agriculture and food, medical and clinical areas, oil…

  22. A nutritional analysis of New Zealand military food rations at Gallipoli in 1915: likely contribution to scurvy and other nutrient deficiency disorders.

    PubMed

    Wilson, Nick; Nghiem, Nhung; Summers, Jennifer A; Carter, Mary-Ann; Harper, Glyn

    2013-04-19

    Amongst New Zealand soldiers at Gallipoli in 1915 there were reports of poor food quality and cases of scurvy, but no modern analysis of the military food rations has ever been conducted to better understand potential nutritional problems in this group. We analysed the foods in the military rations for 1915 using food composition data on the closest equivalents among modern foods. We compared these results with other plausible diets and with various optimised diets derived using linear programming. Historical accounts provide evidence of poor food quality supplied to these soldiers. The nutrient analysis suggested that the military rations were below modern requirements for vitamins A, C and E; potassium; selenium; and dietary fibre. If military planners had used modest amounts of the canned vegetables and fruit available in 1915, this would probably have eliminated four of these six deficits. The results from the uncertainty analysis for vitamin C (95% uncertainty interval [UI]: 5.5 to 6.7 mg per day) were compatible with the range known to cause scurvy, but the UI for vitamin A intake was only partly in the range for causing night blindness. To indicate the gap with the ideal, an optimised diet (using foods available in 1915) could have met all nutrient requirements at under half the estimated purchase cost of the 1915 military rations. There is now both historical and analytic evidence that the military rations provided to these soldiers were nutritionally inadequate in vitamin C, and probably in other nutrients such as vitamin A. These deficits are likely to have caused cases of scurvy and may have contributed to the high rates of other illnesses experienced at Gallipoli. Such problems could have been readily prevented by providing rations that included some canned fruit or vegetables (e.g., as manufactured by New Zealand at the time).
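The core comparison in this analysis, ration nutrient totals against modern daily requirements, reduces to a dictionary diff; the numbers below are illustrative placeholders, not the paper's estimates:

```python
# Hypothetical daily totals for a ration vs. modern requirements (illustration).
ration = {"vitamin_c_mg": 6.0, "vitamin_a_ug": 500.0, "fibre_g": 30.0}
requirement = {"vitamin_c_mg": 45.0, "vitamin_a_ug": 900.0, "fibre_g": 30.0}

deficits = {nutrient: requirement[nutrient] - ration[nutrient]
            for nutrient in requirement
            if ration[nutrient] < requirement[nutrient]}
print(deficits)  # vitamins C and A fall short; fibre meets the requirement
```

Layering an uncertainty interval over each ration total (as the paper does for vitamin C) then shows whether a deficit is robust to the imprecision of historical food composition data.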

  3. Feasibility study on the use of probabilistic migration modeling in support of exposure assessment from food contact materials.

    PubMed

    Poças, Maria F; Oliveira, Jorge C; Brandsch, Rainer; Hogg, Timothy

    2010-07-01

    The use of probabilistic approaches in exposure assessments of contaminants migrating from food packages is of increasing interest, but the lack of concentration or migration data is often cited as a limitation. Data accounting for the variability and uncertainty that can be expected in migration, for example, due to heterogeneity in the packaging system, variation of the temperature along the distribution chain, and different time of consumption of each individual package, are required for probabilistic analysis. The objective of this work was to characterize quantitatively the uncertainty and variability in estimates of migration. A Monte Carlo simulation was applied to a typical solution of Fick's law with given variability in the input parameters. The analysis was performed based on experimental data of a model system (migration of Irgafos 168 from polyethylene into isooctane) and illustrates how important sources of variability and uncertainty can be identified in order to refine analyses. For long migration times and controlled conditions of temperature, the affinity of the migrant to the food can be the major factor determining the variability in the migration values (more than 70% of variance). In situations where both the time of consumption and temperature can vary, these factors can be responsible, respectively, for more than 60% and 20% of the variance in the migration estimates. The approach presented can be used with databases from consumption surveys to yield a true probabilistic estimate of exposure.
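
    A minimal sketch of this kind of Monte Carlo analysis, under assumed input distributions (not the paper's fitted values): the migrated fraction is taken as the smaller of a short-time Fickian estimate and a partition-limited ceiling, and squared correlations give a crude first-order attribution of output variance to each input:

```python
import math
import random

random.seed(1)

# Assumed input distributions (illustration only, not the paper's data).
L = 100e-6                        # polymer film thickness, m
DAY = 86_400.0                    # seconds per day

def sample():
    D = 1e-16 * math.exp(random.gauss(0.0, math.log(3.0)))  # diffusion, m^2/s
    t = random.uniform(30.0, 180.0) * DAY                   # consumption time, s
    alpha = math.exp(random.gauss(0.0, math.log(3.0)))      # migrant's affinity to food
    f_kin = (2.0 / L) * math.sqrt(D * t / math.pi)          # short-time Fick estimate
    f_eq = alpha / (1.0 + alpha)                            # partition-limited ceiling
    return (math.log(D), t, math.log(alpha)), min(f_kin, f_eq, 1.0)

draws = [sample() for _ in range(20_000)]
inputs = list(zip(*(x for x, _ in draws)))   # one column per input
fs = [f for _, f in draws]                   # migrated fraction per draw

def r2(x, y):
    """Squared Pearson correlation: a crude first-order variance share."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

shares = {name: r2(col, fs)
          for name, col in zip(["diffusion", "time", "affinity"], inputs)}
mean_f = sum(fs) / len(fs)
print(f"mean migrated fraction: {mean_f:.2f}")
print({k: round(v, 2) for k, v in shares.items()})
```

    A real analysis would use proper variance-based sensitivity indices and measured parameter distributions; the point of the sketch is only the sampling-then-attribution workflow the abstract describes.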

  4. Materiality matters: Blurred boundaries and the domestication of functional foods.

    PubMed

    Weiner, Kate; Will, Catherine

    2015-06-01

    Previous scholarship on novel foods, including functional foods, has suggested that they are difficult to categorise for both regulators and users. It is argued that they blur the boundary between 'food' and 'drug' and that uncertainties about the products create 'experimental' or 'restless' approaches to consumption. We investigate these uncertainties drawing on data about the use of functional foods containing phytosterols, which are licensed for sale in the EU for people wishing to reduce their cholesterol. We start from an interest in the products as material objects and their incorporation into everyday practices. We consider the scripts encoded in the physical form of the products through their regulation, production and packaging and find that these scripts shape but do not determine their use. The domestication of phytosterols involves bundling the products together with other objects (pills, supplements, foodstuffs). Considering their incorporation into different systems of objects offers new understandings of the products as foods or drugs. In their accounts of their practices, consumers appear to be relatively untroubled by uncertainties about the character of the products. We conclude that attending to materials and practices offers a productive way to open up and interrogate the idea of categorical uncertainties surrounding new food products.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bogen, K.T.; Conrado, C.L.; Robison, W.L.

    A detailed analysis of uncertainty and interindividual variability in estimated doses was conducted for a rehabilitation scenario for Bikini Island at Bikini Atoll, in which the top 40 cm of soil would be removed in the housing and village area, and the rest of the island is treated with potassium fertilizer, prior to an assumed resettlement date of 1999. Predicted doses were considered for the following fallout-related exposure pathways: ingested Cesium-137 and Strontium-90, external gamma exposure, and inhalation and ingestion of Americium-241 + Plutonium-239+240. Two dietary scenarios were considered: (1) imported foods are available (IA), and (2) imported foods are unavailable (only local foods are consumed) (IUA). Corresponding calculations of uncertainty in estimated population-average dose showed that after approximately 5 y of residence on Bikini, the upper and lower 95% confidence limits with respect to uncertainty in this dose are estimated to be approximately 2-fold higher and lower than its population-average value, respectively (under both IA and IUA assumptions). Corresponding calculations of interindividual variability in the expected value of dose with respect to uncertainty showed that after approximately 5 y of residence on Bikini, the upper and lower 95% confidence limits with respect to interindividual variability in this dose are estimated to be approximately 2-fold higher and lower than its expected value, respectively (under both IA and IUA assumptions). For reference, the expected values of population-average dose at age 70 were estimated to be 1.6 and 5.2 cSv under the IA and IUA dietary assumptions, respectively. Assuming that 200 Bikini resettlers would be exposed to local foods (under both IA and IUA assumptions), the maximum 1-y dose received by any Bikini resident is most likely to be approximately 2 and 8 mSv under the IA and IUA assumptions, respectively.

  6. On-line coupled high performance liquid chromatography-gas chromatography for the analysis of contamination by mineral oil. Part 2: migration from paperboard into dry foods: interpretation of chromatograms.

    PubMed

    Biedermann, Maurus; Grob, Koni

    2012-09-14

    Mineral oil hydrocarbons are complex and variable mixtures and produce correspondingly complex chromatograms (on-line HPLC-GC-FID as described in Part 1): mostly humps of unresolved components are obtained, sometimes with sharp peaks on top. Chromatograms may also contain peaks of hydrocarbons from other sources which need to be subtracted from the mineral oil components. The review focuses on the interpretation and integration of chromatograms related to food contamination by mineral oil from paperboard boxes (offset printing inks and recycled fibers), if possible distinguishing between various sources of mineral oil. Typical chromatograms are shown for relevant components and interferences as well as food samples encountered on the market. Details are pointed out which may provide relevant information. Integration is shown for examples of paperboard packaging materials as well as various foods. Finally, the uncertainty of the analysis and the limit of quantitation are discussed for specific examples. They primarily result from the interpretation of the chromatogram, manually placing the baseline and the cuts for taking off extraneous components. Without previous enrichment, the limit of quantitation is between around 0.1 mg/kg for foods with a low fat content and 2.5 mg/kg for fats and oils. The measurement uncertainty can be kept clearly below 20% for most samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Materiality matters: Blurred boundaries and the domestication of functional foods

    PubMed Central

    Weiner, Kate; Will, Catherine

    2015-01-01

    Previous scholarship on novel foods, including functional foods, has suggested that they are difficult to categorise for both regulators and users. It is argued that they blur the boundary between 'food' and 'drug' and that uncertainties about the products create 'experimental' or 'restless' approaches to consumption. We investigate these uncertainties drawing on data about the use of functional foods containing phytosterols, which are licensed for sale in the EU for people wishing to reduce their cholesterol. We start from an interest in the products as material objects and their incorporation into everyday practices. We consider the scripts encoded in the physical form of the products through their regulation, production and packaging and find that these scripts shape but do not determine their use. The domestication of phytosterols involves bundling the products together with other objects (pills, supplements, foodstuffs). Considering their incorporation into different systems of objects offers new understandings of the products as foods or drugs. In their accounts of their practices, consumers appear to be relatively untroubled by uncertainties about the character of the products. We conclude that attending to materials and practices offers a productive way to open up and interrogate the idea of categorical uncertainties surrounding new food products. PMID:26157471

  8. Assessing and reporting uncertainties in dietary exposure analysis - Part II: Application of the uncertainty template to a practical example of exposure assessment.

    PubMed

    Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne

    2017-11-01

    A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale and the results expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties to risk managers are the area where future research effort would best be placed. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. The prevalence of food allergy: a meta-analysis.

    PubMed

    Rona, Roberto J; Keil, Thomas; Summers, Colin; Gislason, David; Zuidmeer, Laurian; Sodergren, Eva; Sigurdardottir, Sigurveig T; Lindner, Titia; Goldhahn, Klaus; Dahlstrom, Jorgen; McBride, Doreen; Madsen, Charlotte

    2007-09-01

    There is uncertainty about the prevalence of food allergy in communities. To assess the prevalence of food allergy by performing a meta-analysis according to the method of assessment used. The foods assessed were cow's milk, hen's egg, peanut, fish, shellfish, and an overall estimate of food allergy. We summarized the information in 5 categories: self-reported symptoms, specific IgE positive, specific skin prick test positive, symptoms combined with sensitization, and food challenge studies. We systematically searched MEDLINE and EMBASE for publications since 1990. The meta-analysis included only original studies. They were stratified by age groups: infant/preschool, school children, and adults. A total of 934 articles were identified, but only 51 were considered appropriate for inclusion. The prevalence of self-reported food allergy was very high compared with objective measures. There was marked heterogeneity between studies regardless of type of assessment or food item considered, and in most analyses this persisted after age stratification. Self-reported prevalence of food allergy varied from 1.2% to 17% for milk, 0.2% to 7% for egg, 0% to 2% for peanuts and fish, 0% to 10% for shellfish, and 3% to 35% for any food. There is a marked heterogeneity in the prevalence of food allergy that could be a result of differences in study design or methodology, or differences between populations. We recommend that measurements be made by using standardized methods, if possible food challenge. We need to be cautious in estimates of prevalence based only on self-reported food allergy.

  10. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of 3.5 billion. Our analysis shows that flexible design in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future can mitigate uncertainty and reduce the expected lifetime costs by up to 1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and desalination. Intense withdrawals for urban and agricultural use will lead to lowering of the water table in the aquifer at rapid but uncertain rates due to poor groundwater characterization. We assess the potential for additional groundwater data collection and a flexible infrastructure approach similar to that in Melbourne to mitigate risk.
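
    The value of flexible, modular capacity can be illustrated with a stylized two-stage expected-cost comparison. Every number below is an assumption for demonstration, not a figure from the study:

```python
# All figures are assumptions for demonstration, not the study's estimates.
p_high = 0.5                 # probability that demand turns out high
cost_full_now = 3.5          # build full capacity immediately (billions)
cost_module = 2.0            # smaller first module (billions)
cost_expand = 2.0 * 0.9      # later expansion, cheaper in present-value terms

ev_inflexible = cost_full_now                      # always pay for full capacity
ev_flexible = cost_module + p_high * cost_expand   # expand only if needed

print(f"expected cost, inflexible: {ev_inflexible:.2f}")
print(f"expected cost, flexible:   {ev_flexible:.2f}")
```

    The flexible design is cheaper in expectation because the expansion cost is paid only in the high-demand branch; a real analysis would span many demand and supply scenarios and discount cash flows explicitly.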

  11. The role of food ethics in food policy.

    PubMed

    Mepham, T B

    2000-11-01

    Certain developments in the agricultural and food sciences have far-reaching implications for society and the environment, which suggest the need to examine their ethical acceptability as a standard component of technology assessment. Such considerations have led to the emergence of a new academic discipline, food ethics. The present paper describes how ethical theory may be applied to the analysis of the impacts of prospective food biotechnologies to assess potential effects on four 'interest groups', i.e. consumers, producers, treated organisms and the biota (fauna and flora). The principles which structure the framework used, i.e. the ethical matrix, are adapted to the field of agriculture and food from those applied in medical ethics. Use of the ethical matrix is illustrated by applying it to the specific case of bovine somatotrophin, the genetically-engineered protein hormone which is injected into lactating cattle to increase their milk yields. Ethical analysis is seen to depend on a number of critical requirements, i.e. scientific data, non-scientific evidence and predictions, suitably-qualified assessors ('competent moral judges'), the 'world-views' of the assessors and application of the precautionary principle to cope with 'uncertainty'.

  12. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    PubMed

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters such as diffusion and partition coefficients for the individual materials. In most cases, mass-transfer parameters are not readily available from the literature and are therefore estimated with a given uncertainty. Historically, uncertainty was first accounted for with upper-limit concepts, which turned out to be of limited applicability because they greatly overestimated migration. Probabilistic migration modelling makes it possible to account for the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters, among others, are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and applies Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration with the associated uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic migration modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts becomes possible, as does identification of the associated migration risk and potential safety concerns at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
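
    A minimal sketch of the Monte Carlo step with assumed distributions (not validated values): uncertainty in barrier thickness and diffusion coefficient is propagated into the diffusion lag time t_lag = L^2/(6D), a simple classical indicator of how long a functional barrier delays breakthrough:

```python
import math
import random

random.seed(7)

# Assumed distributions (illustration only): propagate uncertainty in barrier
# thickness L and diffusion coefficient D into the diffusion lag time
# t_lag = L^2 / (6 D), a simple indicator of functional barrier efficiency.
N = 50_000
lags = []
for _ in range(N):
    L = max(random.gauss(20e-6, 2e-6), 1e-6)                # thickness, m
    D = 1e-17 * math.exp(random.gauss(0.0, math.log(5.0)))  # diffusion, m^2/s
    lags.append(L * L / (6.0 * D) / 86_400.0)               # lag time, days

lags.sort()
lo, med, hi = (lags[int(q * N)] for q in (0.025, 0.5, 0.975))
print(f"lag time (days): median {med:.0f}, 95% interval ({lo:.1f}, {hi:.0f})")
```

    The wide interval is driven almost entirely by the assumed order-of-magnitude uncertainty in D, which is why the abstract singles out the barrier's diffusion properties and thickness as the critical inputs.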

  13. Uncertainty of Monetary Valued Ecosystem Services – Value Transfer Functions for Global Mapping

    PubMed Central

    Schmidt, Stefan; Manceur, Ameur M.; Seppelt, Ralf

    2016-01-01

    Growing demand for resources increases pressure on ecosystem services (ES) and biodiversity. Monetary valuation of ES is frequently seen as a decision-support tool by providing explicit values for unconsidered, non-market goods and services. Here we present global value transfer functions by using a meta-analytic framework for the synthesis of 194 case studies capturing 839 monetary values of ES. For 12 ES the variance of monetary values could be explained with a subset of 93 study- and site-specific variables by utilizing boosted regression trees. This provides the first global quantification of uncertainties and transferability of monetary valuations. Models explain from 18% (water provision) to 44% (food provision) of variance and provide statistically reliable extrapolations for 70% (water provision) to 91% (food provision) of the terrestrial earth surface. Although the application of different valuation methods is a source of uncertainty, we found evidence that assuming homogeneity of ecosystems is a major error in value transfer function models. Food provision is positively correlated with better life domains and variables indicating positive conditions for human well-being. Water provision and recreation service show that weak ownerships affect valuation of other common goods negatively (e.g. non-privately owned forests). Furthermore, we found support for the shifting baseline hypothesis in valuing climate regulation. Ecological conditions and societal vulnerability determine valuation of extreme event prevention. Valuation of habitat services is negatively correlated with indicators characterizing less favorable areas. Our analysis represents a stepping stone to establish a standardized integration of and reporting on uncertainties for reliable and valid benefit transfer as an important component for decision support. PMID:26938447

  14. Risk, rationality, and regret: responding to the uncertainty of childhood food anaphylaxis.

    PubMed

    Hu, W; Kerridge, I; Kemp, A

    2005-06-01

    Risk and uncertainty are unavoidable in clinical medicine. In the case of childhood food allergy, the dysphoric experience of uncertainty is heightened by the perception of unpredictable danger to young children. Medicine has tended to respond to uncertainty with forms of rational decision making. Rationality cannot, however, resolve uncertainty and provides an insufficient account of risk. This paper compares the medical and parental accounts of two peanut-allergic toddlers to highlight the value of emotions in decision making. One emotion in particular, regret, assists in explaining the actions taken to prevent allergic reactions, given the diffuse nature of responsibility for children. In this light, the assumption that doctors make rational judgments while patients have emotion-led preferences is a false dichotomy. Reconciling medical and lay accounts requires acknowledgement of the interrelationship between the rational and the emotional, and may lead to more appropriate clinical decision making under conditions of uncertainty.

  15. IsoWeb: A Bayesian Isotope Mixing Model for Diet Analysis of the Whole Food Web

    PubMed Central

    Kadoya, Taku; Osada, Yutaka; Takimoto, Gaku

    2012-01-01

    Quantitative description of food webs provides fundamental information for the understanding of population, community, and ecosystem dynamics. Recently, stable isotope mixing models have been widely used to quantify dietary proportions of different food resources to a focal consumer. Here we propose a novel mixing model (IsoWeb) that estimates diet proportions of all consumers in a food web based on stable isotope information. IsoWeb requires a topological description of a food web, and stable isotope signatures of all consumers and resources in the web. A merit of IsoWeb is that it takes into account variation in trophic enrichment factors among different consumer-resource links. Sensitivity analysis using realistic hypothetical food webs suggests that IsoWeb is applicable to a wide variety of food webs differing in the number of species, connectance, sample size, and data variability. Sensitivity analysis based on real topological webs showed that IsoWeb can allow for a certain level of topological uncertainty in target food webs, including erroneously assuming false links, omission of existent links and species, and trophic aggregation into trophospecies. Moreover, using an illustrative application to a real food web, we demonstrated that IsoWeb can compare the plausibility of different candidate topologies for a focal web. These results suggest that IsoWeb provides a powerful tool to analyze food-web structure from stable isotope data. We provide R and BUGS codes to aid efficient applications of IsoWeb. PMID:22848427
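
    IsoWeb itself is a Bayesian model fitted with the provided R and BUGS code. As a toy illustration of the mass balance underlying any isotope mixing model, here is a one-isotope, two-resource diet-proportion solve; all signatures and the trophic enrichment factor (TEF) below are assumed values, not data from the paper:

```python
# Assumed d15N signatures (permil) and trophic enrichment factor (TEF);
# IsoWeb itself estimates such proportions for a whole web by MCMC.
def mixing_proportion(d_consumer, d_a, d_b, tef):
    """Diet fraction of resource A from one-isotope mass balance:
    d_consumer = tef + p * d_a + (1 - p) * d_b."""
    p = (d_consumer - tef - d_b) / (d_a - d_b)
    return min(max(p, 0.0), 1.0)     # clamp to a valid proportion

p = mixing_proportion(d_consumer=8.2, d_a=6.0, d_b=3.0, tef=3.4)
print(f"proportion of resource A in the diet: {p:.2f}")
```

    With more resources than isotopes the system is underdetermined, which is one reason models like IsoWeb infer full posterior distributions over the diet proportions instead of point solutions.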

  16. Probability bounds analysis for nonlinear population ecology models.

    PubMed

    Enszer, Joshua A; Andrei Măceș, D; Stadtherr, Mark A

    2015-09-01

    Mathematical models in population ecology often involve parameters that are empirically determined and inherently uncertain, with probability distributions for the uncertainties not known precisely. Propagating such imprecise uncertainties rigorously through a model to determine their effect on model outputs can be a challenging problem. We illustrate here a method for the direct propagation of uncertainties represented by probability bounds through nonlinear, continuous-time, dynamic models in population ecology. This makes it possible to determine rigorous bounds on the probability that some specified outcome for a population is achieved, which can be a core problem in ecosystem modeling for risk assessment and management. Results can be obtained at a computational cost that is considerably less than that required by statistical sampling methods such as Monte Carlo analysis. The method is demonstrated using three example systems, with focus on a model of an experimental aquatic food web subject to the effects of contamination by ionic liquids, a new class of potentially important industrial chemicals. Copyright © 2015. Published by Elsevier Inc.
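
    The idea of bounding a probability when one input has no known distribution can be sketched with a toy model (assumed numbers, and a much cruder method than the paper's rigorous bound propagation): because the output below is monotone in the interval-valued parameter, evaluating at the interval endpoints brackets the probability of interest:

```python
import math
import random

random.seed(3)

# Toy probability-bounds sketch: exponential growth N(t) = N0 * exp(r * t)
# with N0 lognormal (probabilistic) and growth rate r known only as the
# interval [0.05, 0.10] (no distribution). N(t) increases monotonically in r,
# so the interval endpoints bound P(N(10) < 150) from both sides.
T, THRESHOLD, N_SAMPLES = 10.0, 150.0, 40_000

def p_below(r):
    hits = 0
    for _ in range(N_SAMPLES):
        n0 = 50.0 * math.exp(random.gauss(0.0, 0.3))   # uncertain initial abundance
        if n0 * math.exp(r * T) < THRESHOLD:
            hits += 1
    return hits / N_SAMPLES

p_hi = p_below(0.05)   # slowest growth: largest chance of staying below
p_lo = p_below(0.10)   # fastest growth: smallest chance
print(f"P(N(10) < {THRESHOLD:.0f}) lies in [{p_lo:.2f}, {p_hi:.2f}]")
```

    Probability bounds analysis generalizes this endpoint argument to whole families of distributions (p-boxes) and, as the abstract notes, can do so deterministically rather than by sampling.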

  17. Lifecycle Greenhouse Gas Analysis of an Anaerobic Codigestion Facility Processing Dairy Manure and Industrial Food Waste.

    PubMed

    Ebner, Jacqueline H; Labatut, Rodrigo A; Rankin, Matthew J; Pronto, Jennifer L; Gooch, Curt A; Williamson, Anahita A; Trabold, Thomas A

    2015-09-15

    Anaerobic codigestion (AcoD) can address food waste disposal and manure management issues while delivering clean, renewable energy. Quantifying greenhouse gas (GHG) emissions due to implementation of AcoD is important to achieve this goal. A lifecycle analysis was performed on the basis of data from an on-farm AcoD in New York, resulting in a 71% reduction in GHG emissions, or a net reduction of 37.5 kg CO2e/t influent relative to conventional treatment of manure and food waste. Displacement of grid electricity provided the largest reduction, followed by avoidance of alternative food waste disposal options and reduced impacts associated with storage of digestate vs undigested manure. These reductions offset digester emissions and the net increase in emissions associated with land application in the AcoD case relative to the reference case. Sensitivity analysis showed that using feedstock diverted from high impact disposal pathways, control of digester emissions, and managing digestate storage emissions were opportunities to improve the AcoD GHG benefits. Regional and parametrized emissions factors for the storage emissions and land application phases would reduce uncertainty.

  18. Integrated mixed methods policy analysis for sustainable food systems: trends, challenges and future research.

    PubMed

    Cuevas, Soledad

    Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated with deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempt from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.

  19. Water-energy-food nexus: concepts, questions and methodologies

    NASA Astrophysics Data System (ADS)

    Li, Y.; Chen, X.; Ding, W.; Zhang, C.; Fu, G.

    2017-12-01

    The term water-energy-food nexus has gained increasing attention in the research and policy making communities as the security of water, energy and food becomes more pressing under a changing environment. Ignoring the close interlinkages among their availability and services may result in unforeseen, adverse consequences. This paper comprehensively reviews the state-of-the-art in the field of the water-energy-food nexus, with a focus on concepts, research questions and methodologies. First, two types of nexus definition are compared and discussed to understand the essence of nexus research issues. Then, three kinds of nexus research questions are presented, including internal relationship analysis, external impact analysis, and evaluation of the nexus system. Five nexus modelling approaches are discussed in terms of their advantages, disadvantages and application, with an aim to identify research gaps in current nexus methods. Finally, future research areas and challenges are discussed, including system boundary, data uncertainty and modelling, underlying mechanisms of nexus issues and system performance evaluation. This study helps bring research efforts together to address the challenging questions in the nexus and develop the consensus on building resilient water, energy and food systems.

  20. Folic Acid Food Fortification—Its History, Effect, Concerns, and Future Directions

    PubMed Central

    Crider, Krista S.; Bailey, Lynn B.; Berry, Robert J.

    2011-01-01

    Periconceptional intake of folic acid is known to reduce a woman’s risk of having an infant affected by a neural tube birth defect (NTD). National programs to mandate fortification of food with folic acid have reduced the prevalence of NTDs worldwide. Uncertainty surrounding possible unintended consequences has led to concerns about higher folic acid intake and food fortification programs. This uncertainty emphasizes the need to continually monitor fortification programs for accurate measures of their effect and the ability to address concerns as they arise. This review highlights the history, effect, concerns, and future directions of folic acid food fortification programs. PMID:22254102

  1. Comparative analysis of the labelling of nanotechnologies across four stakeholder groups

    NASA Astrophysics Data System (ADS)

    Capon, Adam; Gillespie, James; Rolfe, Margaret; Smith, Wayne

    2015-08-01

    Societies are constantly challenged to develop policies around the introduction of new technologies, which by their very nature contain great uncertainty. This uncertainty gives prominence to varying viewpoints which are value laden and have the ability to drastically shift policy. The issue of nanotechnologies is a prime example. The labelling of products that contain new technologies has been one policy tool governments have used to address concerns around uncertainty. Our study develops evidence regarding opinions on the labelling of products made by nanotechnologies. We undertook a computer-assisted telephone interview (CATI) survey of the Australian public and those involved in nanotechnologies from the academic, business and government sectors using a standardised questionnaire. Analysis was undertaken using descriptive and logistic regression techniques. We explored reluctance to purchase as a result of labelling products which contained manufactured nanomaterials, both generally and across five broad products (food, cosmetics/sunscreens, medicines, pesticides, tennis racquets/computers) which represent the broad categories of products regulated by differing government agencies in Australia. We examined the relationship between reluctance to purchase and risk perception, trust, and familiarity. We found that irrespective of stakeholder, most supported the labelling of products which contained manufactured nanomaterials. Perception of risk was the main driver of reluctance to purchase, while trust and familiarity were likely to have an indirect effect through risk perception. Food is likely to be the product most affected by labelling. Risk perception surrounding nanotechnologies and label 'framing' on the product are key issues to be addressed in the implementation of a labelling scheme.

  2. The consistency of a species' response to press perturbations with high food web uncertainty.

    PubMed

    Tunney, Tyler D; Carpenter, Stephen R; Vander Zanden, M Jake

    2017-07-01

    Predicting species responses to perturbations is a fundamental challenge in ecology. Decision makers must often identify management perturbations that are the most likely to deliver a desirable management outcome despite incomplete information on the pattern and strength of food web links. Motivated by a current fishery decline in inland lakes of the Midwestern United States, we evaluate consistency of the responses of a target species (walleye [Sander vitreus]) to press perturbations. We represented food web uncertainty with 193 plausible topological models and applied four perturbations to each one. Frequently the direction of the focal predator response to the same perturbation is not consistent across food web topologies. Simultaneous application of management perturbations led to less consistent outcomes compared to the best single perturbation. However, direct manipulation of the adult focal predator produced a desirable outcome in 77% of 193 plausible topologies. Identifying perturbations that produce consistent outcomes in the face of food web uncertainty can have important implications for natural resource conservation and management efforts. © 2017 by the Ecological Society of America.
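As a sketch of how such consistency checks can work: in community-matrix terms, the long-term response of every species to a sustained (press) perturbation is the pressed column of the negative inverse of the Jacobian, so sampling interaction strengths over a fixed sign structure shows how often a focal species' response keeps the same sign. The three-species chain and all magnitudes below are illustrative assumptions, not the topologies or species of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def press_response(A, pressed):
    """Long-term response of every species to a press perturbation
    of one species: the pressed column of -A^-1."""
    return -np.linalg.inv(A)[:, pressed]

# Hypothetical sign structure of a 3-species chain
# (resource -> forage fish -> top predator); interaction
# magnitudes are unknown, so we sample them to mimic uncertainty.
signs = np.array([[-1, -1,  0],
                  [ 1, -1, -1],
                  [ 0,  1, -1]])

responses = []
for _ in range(200):
    A = signs * rng.uniform(0.1, 1.0, size=(3, 3))
    responses.append(press_response(A, pressed=0))  # press the resource

# Fraction of sampled webs in which the top predator (index 2)
# responds in the majority direction.
top = np.sign([r[2] for r in responses])
consistency = max(np.mean(top > 0), np.mean(top < 0))
```

The point of the exercise mirrors the abstract: the same press can produce opposite responses in the focal species depending on unresolved link strengths, and `consistency` quantifies how robust the predicted direction is.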

  3. How unpredictable access to food increases the body fat of small passerines: A mechanistic approach.

    PubMed

    Anselme, Patrick; Otto, Tobias; Güntürkün, Onur

    2017-11-01

    Unpredictable rewards increase the vigor of responses in autoshaping (a Pavlovian conditioning procedure) and are preferred to predictable rewards in free-choice tasks involving fixed- versus variable-delay schedules. The significance those behavioral properties may have in field conditions is currently unknown. However, it is noticeable that when exposed to unpredictable food, small passerines - such as robins, titmice, and starlings - get fatter than when food is abundant. In functional terms, fattening is viewed as an evolutionary strategy acting against the risk of starvation when food is in short supply. But this functional view does not explain the causal mechanisms by which small passerines come to be fatter under food uncertainty. Here, it is suggested that one of these causal mechanisms is that involved in behavioral invigoration and preference for food uncertainty in the laboratory. Based on a psychological theory of motivational changes under food uncertainty, we developed an integrative computational model to test this idea. We show that, for functional (adaptive) reasons, the excitatory property of reward unpredictability can underlie the propensity of wild birds to forage longer and/or more intensively in an unpredictable environment, with the consequence that they can put on more fat reserves. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Science, safety, and trust: the case of transgenic food.

    PubMed

    Martinelli, Lucia; Karbarz, Małgorzata; Siipi, Helena

    2013-02-01

Genetically modified (GM) food is discussed as an example of the controversial relation between the intrinsic uncertainty of the scientific approach and the demand of citizen-consumers for products of science innovation that are known to be safe. On the whole, peer-reviewed studies on GM food safety do not report significant health risks, with a few exceptions, such as the well-known "Pusztai affair" and the more recent "Seralini case." These latter studies have been dismissed by the scientific community on the grounds of flawed experimental design and statistical analysis. Such contradictory results illustrate the complexity of risk evaluation and raise concerns among citizen-consumers about GM food. Thoughtful consideration by the scientific community and by decision makers of the moral values involved in risk evaluation and risk management would be the most trustworthy answer to citizen-consumers' demand for clear and definitive answers concerning the safety of GM food.

  5. What role does evaporative demand play in driving drought in Africa?

    NASA Astrophysics Data System (ADS)

    Hobbins, M.; Shukla, S.; McNally, A.; McEvoy, D.; Huntington, J. L.; Husak, G. J.; Funk, C. C.; Senay, G. B.; Verdin, J. P.; Jansma, T.; Dewes, C.

    2016-12-01

Agricultural drought, crop stress and potential food insecurity have long been monitored hydrologically by land surface and crop models that balance moisture supply from precipitation against demand for moisture by evapotranspiration (ET). However, ET is generally derived from poor parameterizations of evaporative demand (E0). For instance, in the Palmer Drought Severity Index's hydrologic bucket model, ET is driven by an E0 estimate that is itself driven solely by variations in temperature. E0's other physical drivers—wind, humidity, net radiation—are not considered, even though these non-temperature drivers often dominate its temporal variability both regionally and seasonally. However, recent work has started to uncover E0's role as both a driver and a signal of drought. Determining the role that drivers of E0 play is essential to reduce extraneous modeling uncertainty and recognize key sources of variability. This will lead to improvements in the dependent hydrologic analyses and in drought preparedness in food-insecure countries. Here we examine E0 as a driver, or signal, of drought, concentrating on Famine Early Warning Systems Network (FEWS NET) food-insecure countries in Africa during established drought periods. To circumvent the lack of station observations, we use an ongoing reanalysis—NASA's MERRA-2—to provide forcings for the FAO-56 Penman-Monteith model of reference ET as a fully physical estimator of E0. An example use of the resulting E0 dataset is in the Water Requirement Satisfaction Index model used by the FEWS NET community. First, we decompose the temporal variability of E0 in drought into all of its physical drivers through a mean-value, second-moment uncertainty analysis in which both the sensitivity of E0 to its drivers and their observed variabilities are accounted for. Second, we explicitly attribute changes in the drought signal to the individual drivers of E0. 
We outline the methodology and results of this uncertainty analysis and attribution study across the space-time domains of established droughts, demonstrating the drivers of E0 variability and the evaporative drivers of crop stress and food insecurity. This presentation describes the analysis concept and summarizes the results across drought-affected regions and periods in Africa.
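The mean-value, second-moment analysis described above amounts to a first-order propagation of driver variances through the sensitivity of E0 to each driver: Var(E0) ≈ Σᵢ (∂E0/∂xᵢ)² Var(xᵢ). The toy E0 function, driver means, and standard deviations below are invented for illustration; the actual study uses the FAO-56 Penman-Monteith formulation.

```python
import numpy as np

# Toy stand-in for a Penman-Monteith-style E0 calculation; the real
# FAO-56 formula is more structured, but the attribution logic is the same.
def e0(temp, wind, rh, rad):
    return 0.05 * rad + 0.3 * wind * (1 - rh) * (1 + 0.02 * temp)

means = dict(temp=25.0, wind=2.0, rh=0.4, rad=20.0)   # hypothetical driver means
sigmas = dict(temp=3.0, wind=0.8, rh=0.1, rad=4.0)    # hypothetical driver std devs

def partial(f, x, name, h=1e-5):
    """Central finite-difference estimate of df/dx_name at the mean point."""
    hi, lo = dict(x), dict(x)
    hi[name] += h
    lo[name] -= h
    return (f(**hi) - f(**lo)) / (2 * h)

# First-order (mean-value, second-moment) variance attribution:
# Var(E0) ~ sum_i (dE0/dx_i)^2 * Var(x_i)
terms = {k: partial(e0, means, k) ** 2 * sigmas[k] ** 2 for k in means}
total = sum(terms.values())
shares = {k: v / total for k, v in terms.items()}  # fractional attribution
```

With these invented numbers, the non-temperature drivers carry most of the variance share, which is the kind of result the abstract describes.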

  6. Magnetic thermometry in the aseptic processing of foods containing particulates (abstract)

    NASA Astrophysics Data System (ADS)

    Ghiron, Kenneth; Litchfield, Bruce

    1997-04-01

    Aseptic processing of foods has many advantages over canning, including higher efficiency, lighter packaging, better taste, and higher nutritional value. Aseptic processing is different from canning where the food and container are sterilized together. Instead, a thin stream of food is heated and the packaging is independently sterilized before the food is placed in the package. However, no aseptic processes have been successfully filed with the FDA for foods containing sizable solid particles because of uncertainties in the thermal sterilization of the particles (e.g., soup). We have demonstrated that by inserting small paramagnetic particles in the interior of the simulated and real food particles, the local temperature can be measured. With this information, any questions about the adequate sterilization of the particles can be resolved. The measurements were done by directing the food stream through a magnetic field and sensing the voltages induced in a pickup coil by the motion of the magnetized particles. Details of the equipment design and data analysis will be discussed along with an introduction to the aseptic processing of foods.

  7. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

Insight into the risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems faced by the risk assessment of nanoparticles is a lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Given the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed the particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
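A minimal sketch of the two-dimensional (integrated probabilistic) approach: an outer Monte Carlo loop samples uncertain population parameters, an inner loop samples inter-individual variability, and an uncertainty interval is read off across outer iterations. All distributions and the threshold below are hypothetical, not the nanosilica values from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Outer loop: uncertainty about population parameters (here, the mean and
# sd of log exposure to a hypothetical substance); inner loop: variability
# between individuals. All numbers are illustrative only.
n_outer, n_inner = 500, 1000
hazard_limit = 5.0  # hypothetical threshold, mg/kg bw/day

frac_exceeding = np.empty(n_outer)
for i in range(n_outer):
    mu = rng.normal(0.0, 0.3)        # uncertain mean of log-exposure
    sd = abs(rng.normal(1.0, 0.1))   # uncertain sd of log-exposure
    exposure = rng.lognormal(mu, sd, size=n_inner)  # inter-individual variability
    frac_exceeding[i] = np.mean(exposure > hazard_limit)

# 95% uncertainty interval around the variability statistic
lo, hi = np.percentile(frac_exceeding, [2.5, 97.5])
```

Keeping the two loops separate is what lets the assessment report "the fraction of the population exceeding the limit" (variability) together with how poorly that fraction is known (uncertainty), rather than blending the two.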

  8. Uncertainty Analysis of Coupled Socioeconomic-Cropping Models: Building Confidence in Climate Change Decision-Support Tools for Local Stakeholders

    NASA Astrophysics Data System (ADS)

    Malard, J. J.; Rojas, M.; Adamowski, J. F.; Gálvez, J.; Tuy, H. A.; Melgar-Quiñonez, H.

    2015-12-01

While cropping models represent the biophysical aspects of agricultural systems, system dynamics modelling offers the possibility of representing the socioeconomic (including social and cultural) aspects of these systems. The two types of models can then be coupled in order to include the socioeconomic dimensions of climate change adaptation in the predictions of cropping models. We develop a dynamically coupled socioeconomic-biophysical model of agricultural production and its repercussions on food security in two case studies from Guatemala (a market-based, intensive agricultural system and a low-input, subsistence crop-based system). Through the specification of the climate inputs to the cropping model, the impacts of climate change on the entire system can be analysed, and the participatory nature of the system dynamics model-building process, in which stakeholders from NGOs to local governmental extension workers were included, helps ensure local trust in and use of the model. However, the analysis of climate variability's impacts on agroecosystems includes uncertainty, especially in the case of joint physical-socioeconomic modelling, and the explicit representation of this uncertainty in the participatory development of the models is important to ensure appropriate use of the models by the end users. In addition, standard model calibration, validation, and uncertainty interval estimation techniques used for physically-based models are impractical in the case of socioeconomic modelling. We present a methodology for the calibration and uncertainty analysis of coupled biophysical (cropping) and system dynamics (socioeconomic) agricultural models, using survey data and expert input to calibrate and evaluate the uncertainty of the system dynamics as well as of the overall coupled model. 
This approach offers an important tool for local decision makers to evaluate the potential impacts of climate change and their feedbacks through the associated socioeconomic system.

  9. Simplified enzymatic high-performance anion exchange chromatographic determination of total fructans in food and pet food-limitations and measurement uncertainty.

    PubMed

    Stöber, Paul; Bénet, Sylvie; Hischenhuber, Claudia

    2004-04-21

A simplified method to determine total fructans in food and pet food has been developed and validated. It follows the principle of AOAC method 997.08, i.e., high-performance anion exchange chromatographic (HPAEC) determination of total fructose released from fructans (F(f)) and total glucose released from fructans (G(f)) after enzymatic fructan hydrolysis. Unlike AOAC method 997.08, calculation of total fructans is based on the determination of F(f) alone. This is motivated by the inherent difficulty of accurately determining low amounts of G(f), since many food and pet food products contain other sources of total glucose (e.g., starch and sucrose). In this case, a correction factor g can be used (1.05 by default) to take into account the theoretical contribution of G(f). At levels >5% of total fructans and in commercial fructan ingredients, both F(f) and G(f) can and should be accurately determined; hence, no correction factor g is required. The method is suitable for quantifying total fructans in various food and pet food products at concentrations ≥0.2%, provided that the product does not contain other significant sources of total fructose such as free fructose or sucrose. Recovery rates in commercial fructan ingredients and in selected food and pet food ranged from 97 to 102%. As part of a measurement uncertainty estimation study, individual contributions to the total uncertainty (u) of the total fructan content were identified and quantified using the available validation data. As a result, a correlation between the sucrose content and the total uncertainty of the total fructan content was established, allowing us to define a limit of quantitation as a function of the sucrose content. One can conclude that this method is limited to food products where the sucrose content does not exceed about three times the total fructan content. 
Despite this limitation, which is inherent to any total fructan method based on the same approach, this procedure represents an excellent compromise with regard to accuracy, applicability, and convenience.
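The calculation rule in the abstract can be sketched as follows. Note that the hydration (anhydro-sugar) correction applied in the full AOAC scheme is omitted here for simplicity, so this is an illustration of the correction-factor logic only, not a complete implementation of the method.

```python
def total_fructans(fructose_from_fructans, glucose_from_fructans=None, g=1.05):
    """Total fructans from HPAEC results, per the simplified scheme in the
    abstract: when G_f cannot be measured reliably, its theoretical
    contribution is covered by the default factor g = 1.05; when both F_f
    and G_f are measured (fructans > 5%), no factor is needed.
    Units follow the inputs (e.g., g/100 g). Illustrative sketch only:
    the AOAC hydration correction is deliberately omitted."""
    if glucose_from_fructans is None:
        return g * fructose_from_fructans
    return fructose_from_fructans + glucose_from_fructans
```

The design point is that the default `g` trades a small, bounded bias for robustness against glucose interference from starch and sucrose, which is exactly the compromise the abstract describes.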

  10. Epoxidized soy bean oil migrating from the gaskets of lids into food packed in glass jars. Analysis by on-line liquid chromatography-gas chromatography.

    PubMed

    Fankhauser-Noti, Anja; Fiselier, Katell; Biedermann-Brem, Sandra; Grob, Koni

    2005-08-05

    The migration of epoxidized soy bean oil (ESBO) from the gasket in the lids of glass jars into foods, particularly those rich in edible oil, often far exceeds the legal limit (60 mg/kg). ESBO was determined through a methyl ester isomer of diepoxy linoleic acid. Transesterification occurred directly in the homogenized food. From the extracted methyl esters, the diepoxy components were isolated by normal-phase LC and transferred on-line to gas chromatography with flame ionization detection using the on-column interface in the concurrent solvent evaporation mode. The method involves verification elements to ensure the reliability of the results for every sample analyzed. The detection limit is 2-5 mg/kg, depending on the food. Uncertainty of the procedure is below 10%.

  11. Quantification of allyl hexanoate in pineapple beverages and yogurts as a case study to characterise a source of uncertainty in dietary exposure assessment to flavouring substances.

    PubMed

    Raffo, A; D'Aloise, A; Magrì, A D; Leclercq, C

    2012-01-01

One source of uncertainty in the estimation of dietary exposure to flavouring substances is uncertainty in the occurrence and concentration levels of these substances naturally present in or added to foodstuffs. The aim of this study was to assess the variability of concentration levels of allyl hexanoate, considered as a case study, in two main food categories to which it is often added: pineapple juice-based beverages and yogurts containing pineapple. Thirty-four beverages and 29 yogurts, with pineapple fruit or juice and added flavourings declared as ingredients on the package, were purchased from the local market (in Rome) and analysed. Analytical methods based on the stir bar sorptive extraction (SBSE) technique for the isolation of the target analyte, and on GC-MS analysis for final determination, were developed for the two food categories. In beverages, allyl hexanoate concentrations ranged from less than 0.01 to 16.71 mg l⁻¹, whereas in yogurts they ranged from 0.02 to 89.41 mg kg⁻¹. Average concentrations in beverages and yogurts with pineapple as the main fruit ingredient (1.91 mg l⁻¹ for beverages, 9.61 mg kg⁻¹ for yogurts) were in fair agreement with average use level data reported from industry surveys for the relevant food categories (4.5 and 6.0 mg kg⁻¹, respectively). Within the group of yogurts, a single product was found to contain a level of allyl hexanoate more than 10-fold higher than the average reported use level. The screening techniques developed by the European Food Safety Authority (EFSA) using use level data provided by industry gave estimates of exposure that were of the same order of magnitude as the estimates obtained for regular consumers loyal to the pineapple yogurt and beverage products containing the highest observed concentration of the substance of interest. 
In this specific case the uncertainty in the results obtained with the use of standard screening techniques for exposure assessment based on industry reported use levels is low.

  12. "It just goes against the grain." Public understandings of genetically modified (GM) food in the UK.

    PubMed

    Shaw, Alison

    2002-07-01

    This paper reports on one aspect of qualitative research on public understandings of food risks, focusing on lay understandings of genetically modified (GM) food in the UK context. A range of theoretical, conceptual, and empirical literature on food, risk, and the public understanding of science are reviewed. The fieldwork methods are outlined and empirical data from a range of lay groups are presented. Major themes include: varying "technical" knowledge of science, the relationship between knowledge and acceptance of genetic modification, the uncertainty of scientific knowledge, genetic modification as inappropriate scientific intervention in "nature", the acceptability of animal and human applications of genetic modification, the appropriate boundaries of scientific innovation, the necessity for GM foods, the uncertainty of risks in GM food, fatalism about avoiding risks, and trust in "experts" to manage potential risks in GM food. Key discussion points relating to a sociological understanding of public attitudes to GM food are raised and some policy implications are highlighted.

  13. Dietary supplementation with non-prey food enhances fitness of a predatory arthropod

    USDA-ARS?s Scientific Manuscript database

    Uncertainties exist about the value of non-prey food for natural enemies that are commonly food limited, and the dietary conditions where non-prey foods are beneficial for carnivorous species. We examined the nutritional role of a non-prey food using a ground dwelling, tangle web-building spider tha...

  14. Precision of dehydroascorbic acid quantitation with the use of the subtraction method--validation of HPLC-DAD method for determination of total vitamin C in food.

    PubMed

    Mazurek, Artur; Jamroz, Jerzy

    2015-04-15

In food analysis, a method for the determination of vitamin C should enable measurement of the total content of ascorbic acid (AA) and dehydroascorbic acid (DHAA), because both chemical forms exhibit biological activity. The aim of the work was to confirm the applicability of an HPLC-DAD method for analysis of the total content of vitamin C (TC) and of ascorbic acid in various types of food by determination of validation parameters such as selectivity, precision, accuracy, linearity and limits of detection and quantitation. The results showed that the method applied for determination of TC and AA was selective, linear and precise. Precision of DHAA determination by the subtraction method was also evaluated. It was revealed that the results of DHAA determination obtained by the subtraction method were not precise, which follows directly from the assumption of this method and the principles of uncertainty propagation. The proposed chromatographic method is recommended for routine determinations of total vitamin C in various foods. Copyright © 2014 Elsevier Ltd. All rights reserved.
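The imprecision of the subtraction method follows directly from the propagation rule for a difference of independent measurements, u(DHAA) = sqrt(u(TC)² + u(AA)²): the absolute uncertainty stays roughly constant while the difference itself can be small. A sketch with invented numbers:

```python
from math import sqrt

def dhaa_by_subtraction(tc, u_tc, aa, u_aa):
    """DHAA estimated as TC - AA, with its standard uncertainty from the
    propagation rule for a difference of independent measurements:
    u(DHAA) = sqrt(u(TC)^2 + u(AA)^2). Values below are illustrative."""
    dhaa = tc - aa
    u_dhaa = sqrt(u_tc ** 2 + u_aa ** 2)
    return dhaa, u_dhaa

# When DHAA is a small difference of two large, similar values, the
# relative uncertainty explodes even though u(TC) and u(AA) are modest:
dhaa, u = dhaa_by_subtraction(tc=50.0, u_tc=1.5, aa=48.0, u_aa=1.5)
rel = u / dhaa  # over 100% relative uncertainty for a 2 mg/100 g difference
```

Here TC and AA are each known to ±3%, yet the derived DHAA carries more than 100% relative uncertainty, which is the effect the validation study observed.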

  15. Measurement issues associated with quantitative molecular biology analysis of complex food matrices for the detection of food fraud.

    PubMed

    Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy

    2016-01-07

    Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays.

  16. Climate resilient crops for improving global food security and safety.

    PubMed

    Dhankher, Om Parkash; Foyer, Christine H

    2018-05-01

Food security and the protection of the environment are urgent issues for global society, particularly given the uncertainties of climate change. A changing climate is predicted to have a wide range of negative impacts on plant physiology and metabolism, soil fertility and carbon sequestration, and microbial activity and diversity that will limit plant growth and productivity, and ultimately food production. Ensuring global food security and food safety will require an intensive research effort across the food chain, starting with crop production and the nutritional quality of the food products. Much uncertainty remains concerning the resilience of plants, soils, and associated microbes to climate change. Intensive efforts are currently underway to improve crop yields with lower input requirements and enhance the sustainability of yield through improved biotic and abiotic stress tolerance traits. In addition, significant efforts are focused on gaining a better understanding of the root/soil interface and associated microbiomes, as well as enhancing soil properties. © 2018 The Authors Plant, Cell & Environment Published by John Wiley & Sons Ltd.

  17. Extracting additional risk managers information from a risk assessment of Listeria monocytogenes in deli meats.

    PubMed

    Pérez-Rodríguez, F; van Asselt, E D; Garcia-Gimeno, R M; Zurera, G; Zwietering, M H

    2007-05-01

    The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and stakeholders to make decisions on food safety management. The present study was conducted to investigate how detailed sensitivity analysis can be used by assessors to extract more information on risk factors and how results can be communicated to managers and stakeholders in an understandable way. The extended sensitivity analysis revealed that the extremes at the right side of the dose distribution (at consumption, 9 to 11.5 log CFU per serving) were responsible for most of the cases of listeriosis simulated. For concentration at retail, values below the detection limit of 0.04 CFU/g and the often used limit for L. monocytogenes of 100 CFU/g (also at retail) were associated with a high number of annual cases of listeriosis (about 29 and 82%, respectively). This association can be explained by growth of L. monocytogenes at both average and extreme values of temperature and time, indicating that a wide distribution can lead to high risk levels. Another finding is the importance of the maximal population density (i.e., the maximum concentration of L. monocytogenes assumed at a certain temperature) for accurately estimating the risk of infection by opportunistic pathogens such as L. monocytogenes. According to the obtained results, mainly concentrations corresponding to the highest maximal population densities caused risk in the simulation. However, sensitivity analysis applied to the uncertainty parameters revealed that prevalence at retail was the most important source of uncertainty in the model.

  18. Risk, Uncertainty and Precaution in Science: The Threshold of the Toxicological Concern Approach in Food Toxicology.

    PubMed

    Bschir, Karim

    2017-04-01

    Environmental risk assessment is often affected by severe uncertainty. The frequently invoked precautionary principle helps to guide risk assessment and decision-making in the face of scientific uncertainty. In many contexts, however, uncertainties play a role not only in the application of scientific models but also in their development. Building on recent literature in the philosophy of science, this paper argues that precaution should be exercised at the stage when tools for risk assessment are developed as well as when they are used to inform decision-making. The relevance and consequences of this claim are discussed in the context of the threshold of the toxicological concern approach in food toxicology. I conclude that the approach does not meet the standards of an epistemic version of the precautionary principle.

  19. Attribution of global foodborne disease to specific foods: Findings from a World Health Organization structured expert elicitation.

    PubMed

    Hoffmann, Sandra; Devleesschauwer, Brecht; Aspinall, Willy; Cooke, Roger; Corrigan, Tim; Havelaar, Arie; Angulo, Frederick; Gibb, Herman; Kirk, Martyn; Lake, Robin; Speybroeck, Niko; Torgerson, Paul; Hald, Tine

    2017-01-01

Recently the World Health Organization Foodborne Disease Burden Epidemiology Reference Group (FERG) estimated that 31 foodborne diseases (FBDs) resulted in over 600 million illnesses and 420,000 deaths worldwide in 2010. Knowing the relative importance of different foods as exposure routes for key hazards is critical to preventing illness. This study reports the findings of a structured expert elicitation providing globally comparable food source attribution estimates for 11 major FBDs in each of 14 world subregions. We used Cooke's Classical Model to elicit and aggregate the judgments of 73 international experts. Judgments were elicited from each expert individually and aggregated using both equal and performance weights. Performance-weighted results are reported, as they increased the informativeness of the estimates while retaining accuracy. We report measures of central tendency and uncertainty bounds on the food source attribution estimates. For some pathogens we see relatively consistent food source attribution estimates across subregions of the world; for others there is substantial regional variation. For example, for non-typhoidal salmonellosis, pork was of minor importance compared to eggs and poultry meat in the American and African subregions, whereas in the European and Western Pacific subregions the importance of these three food sources was quite similar. Our regional results broadly agree with estimates from earlier European and North American food source attribution research. As in prior food source attribution research, we find relatively wide uncertainty bounds around our median estimates. We present the first worldwide estimates of the proportion of specific foodborne diseases attributable to specific food exposure routes. 
While we find substantial uncertainty around central tendency estimates, we believe these estimates provide the best currently available basis on which to link FBDs and specific foods in many parts of the world, providing guidance for policy actions to control FBDs.
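As a sketch of how performance weighting differs from equal weighting in Cooke's Classical Model: each expert's weight is proportional to the product of a calibration score and an information score derived from seed questions (the full model also zeroes out experts below a calibration cutoff, which is omitted here). All scores and attribution estimates below are invented for illustration.

```python
import numpy as np

# Hypothetical point estimates (fraction of a disease attributed to one
# food source) from four experts, with invented calibration and
# information scores from seed questions.
estimates = np.array([0.30, 0.45, 0.25, 0.40])
calibration = np.array([0.60, 0.05, 0.30, 0.45])  # statistical accuracy
information = np.array([1.2, 2.0, 0.8, 1.5])      # narrowness of intervals

weights = calibration * information
weights /= weights.sum()  # normalise to a convex combination

equal_weight = float(estimates.mean())
performance_weight = float(weights @ estimates)
```

The effect is that a confident but poorly calibrated expert (the second one here) is down-weighted, which is how the method "increased the informativeness of estimates, while retaining accuracy".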

  20. Uncertainties in Predicting Rice Yield by Current Crop Models Under a Wide Range of Climatic Conditions

    NASA Technical Reports Server (NTRS)

    Li, Tao; Hasegawa, Toshihiro; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Adam, Myriam; Bregaglio, Simone; Buis, Samuel; Confalonieri, Roberto; Fumoto, Tamon; hide

    2014-01-01

Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but the uncertainties associated with crop models remain largely unquantified. We evaluated 13 rice models against multi-year experimental yield data at four sites with diverse climatic conditions in Asia and examined whether different modeling approaches to major physiological processes contribute to the uncertainty in predicting field-measured yields and to the uncertainty in sensitivity to changes in temperature and CO2 concentration [CO2]. We also examined whether the use of an ensemble of crop models can reduce the uncertainties. Individual models did not consistently reproduce both experimental and regional yields well, and uncertainty was larger at the warmest and coolest sites. The variation in yield projections was larger among crop models than the variation resulting from 16 global climate model-based scenarios. However, the mean of the predictions of all crop models reproduced the experimental data, with an uncertainty of less than 10 percent of measured yields. Using an ensemble of eight models calibrated only for phenology, or of five models calibrated in detail, resulted in uncertainty equivalent to that of the measured yield in well-controlled agronomic field experiments. Sensitivity analysis indicates the necessity of improving the accuracy of predictions of both biomass and harvest index in response to increasing [CO2] and temperature.
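The benefit of the multi-model mean can be illustrated with invented numbers: individual-model errors may be large and of opposite sign, while the error of the ensemble mean stays small because those errors partially cancel.

```python
import numpy as np

# Yields (t/ha) predicted by a hypothetical set of crop models at one
# site, plus an observed yield; all numbers are illustrative only.
predictions = np.array([7.1, 8.4, 6.2, 7.9, 8.8, 6.7, 7.5, 8.1])
observed = 7.6

ensemble_mean = predictions.mean()

# Relative errors of individual models vs. the ensemble mean:
individual_err = np.abs(predictions - observed) / observed
ensemble_err = abs(ensemble_mean - observed) / observed
```

With these numbers the worst single model is off by roughly 18%, while the ensemble mean lands within 1% of the observation, mirroring the "less than 10 percent" result reported for the real 13-model ensemble.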

  1. Sustainable development and next generation's health: a long-term perspective about the consequences of today's activities for food safety.

    PubMed

    Frazzoli, Chiara; Petrini, Carlo; Mantovani, Alberto

    2009-01-01

Development is defined as sustainable when it meets the needs of the present without compromising the ability of future generations to meet their own needs. Pivoting on the social, environmental and economic aspects of food chain sustainability, this paper presents the concept of sustainable food safety, based on the prevention of risks and of the burden of poor health for generations to come. In this respect, the assessment of long-term, transgenerational risks is still hampered by serious scientific uncertainties. Critical issues for the development of a sustainable food safety framework include: endocrine disrupters as emerging contaminants that specifically target developing organisms; toxicological risk assessment in countries at the turning point of development; translating knowledge into toxicity indexes to support risk management approaches, such as hazard analysis and critical control points (HACCP); and the interplay between chemical hazards and social determinants. Efforts towards comprehensive knowledge and management of the key factors of sustainable food safety appear critical to the effectiveness of overall sustainability policies.

  2. Columbia River food webs: Developing a broader scientific foundation for river restoration

    USGS Publications Warehouse

    Alldredge, J. Richard; Beauchamp, David; Bisson, Peter A.; Congleton, James; Henny, Charles; Huntly, Nancy; Lamberson, Roland; Levings, Colin; Naiman, Robert J.; Pearcy, William; Rieman, Bruce; Ruggerone, Greg; Scarnecchia, Dennis; Smouse, Peter; Wood, Chris C.

    2011-01-01

    The objectives of this report are to provide a fundamental understanding of aquatic food webs in the Columbia River Basin and to illustrate and summarize their influences on native fish restoration efforts. The spatial scope addresses tributaries, impoundments, the free-flowing Columbia and Snake rivers, as well as the estuary and plume. Achieving the Council's vision for the Columbia River Fish and Wildlife Program (NPCC 2009-09) of sustaining a "productive and diverse community" that provides "abundant" harvest is best accomplished through a time-prioritized action plan, one that complements other approaches while addressing important challenges and uncertainties related to the Basin's food webs. Note that oceanic food webs, although of immense importance in sustaining fish populations, are not considered beyond the plume, since they involve an additional set of complex and rapidly evolving issues. An analysis of oceanic food webs of relevance to the Columbia River requires a separately focused effort (e.g., Hoegh-Guldberg and Bruno 2010).

  3. Modelling the bioaccumulation of persistent organic pollutants in agricultural food chains for regulatory exposure assessment.

    PubMed

    Takaki, Koki; Wade, Andrew J; Collins, Chris D

    2017-02-01

    New models for estimating the bioaccumulation of persistent organic pollutants in the agricultural food chain were developed using recent improvements to plant uptake and cattle transfer models. One model, AgriSim, was based on KOW regressions of bioaccumulation in plants and cattle, while the other, AgriCom, was a steady-state mechanistic model. The two developed models, and the European Union System for the Evaluation of Substances (EUSES) as a benchmark, were applied to four reported food chain (soil/air-grass-cow-milk) scenarios to evaluate each model's performance against the observed data. The four scenarios were as follows: (1) polluted soil and air, (2) polluted soil, (3) highly polluted soil surface and polluted subsurface and (4) polluted soil and air at different mountain elevations. AgriCom reproduced observed milk bioaccumulation well for all four scenarios, as did AgriSim for scenarios 1 and 2, but EUSES did so only for scenario 1. The main causes of the deviation were the lack of the soil-air-plant pathway in EUSES and of the ambient air-plant pathway in AgriSim. Based on the results, it is recommended that the soil-air-plant and ambient air-plant pathways be calculated separately and that the KOW regression of the transfer factor to milk used in EUSES be avoided. AgriCom satisfied these recommendations, which led to low residual errors between the simulated and observed bioaccumulation in the agricultural food chain for the four scenarios considered. It is therefore recommended that this model be incorporated into regulatory exposure assessment tools. The model uncertainty of the three models should be noted, since the simulated concentration in milk from the 5th to the 95th percentile of the uncertainty analysis often varied over two orders of magnitude. Using a measured value of soil organic carbon content was effective in reducing this uncertainty by one order of magnitude.

  4. "Functional foods compensate for an unhealthy lifestyle". Some Swedish consumers' impressions and perceived need of functional foods.

    PubMed

    Landström, Eva; Hursti, Ulla-Kaisa Koivisto; Magnusson, Maria

    2009-08-01

    The aim of the present study was to explore some Swedish consumers' impressions and perceived need of functional foods (FF). Data were collected through 10 focus groups. A total of 46 individuals participated (31 females, 18-75 years, and 16 males, 18-78 years). The interviews were transcribed verbatim and analysed by means of content analysis. Uncertainties--e.g., whether functional foods are normal foods or medicines, whether the foods would give additional physiological effects, and/or whether the ingredients and substances could cause harm--raised questions of trustworthiness among the interviewees and a feeling of losing control. The interviewees debated the necessity of functional foods. Apart from perceiving functional foods as unnatural, the interviewees thought that functional foods would falsely compensate for an unhealthy lifestyle. The use of functional foods was considered justified only when a healthy lifestyle was incapable of improving people's health. The interviewees perceived themselves to be in no need of functional foods; they thought that the foods were meant for others, those in unquestionable need. We conclude that the impressions of FF among Swedish consumers are complex and versatile. The necessity of FF was acknowledged only when no other lifestyle changes were able to improve a person's state of health.

  5. Diving into the consumer nutrition environment: A Bayesian spatial factor analysis of neighborhood restaurant environment.

    PubMed

    Luan, Hui; Law, Jane; Lysy, Martin

    2018-02-01

    Neighborhood restaurant environment (NRE) plays a vital role in shaping residents' eating behaviors. While NRE 'healthfulness' is a multi-faceted concept, most studies evaluate it based only on restaurant type, thus largely ignoring variation in in-restaurant features. In the few studies that do account for such features, healthfulness scores are simply averaged over accessible restaurants, thereby concealing any uncertainty attributable to neighborhood size or spatial correlation. To address these limitations, this paper presents a Bayesian spatial factor analysis for assessing NRE healthfulness in the city of Kitchener, Canada. Several in-restaurant characteristics are included. By treating NRE healthfulness as a spatially correlated latent variable, the adopted modeling approach can: (i) identify the specific indicators most relevant to NRE healthfulness, (ii) provide healthfulness estimates for neighborhoods without accessible restaurants, and (iii) readily quantify uncertainty in the healthfulness index. Implications of the analysis for intervention program development and community food planning are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Matthew Reynolds | NREL

    Science.gov Websites

    … food science. Matthew's research at NREL is focused on applying uncertainty quantification techniques. Research interests: uncertainty quantification; computational multilinear algebra; approximation theory. Publications: "… and the Canonical Tensor Decomposition," Journal of Computational Physics (2017); "Randomized Alternating …".

  7. Effectiveness of the food recovery at the retailing stage under shelf life uncertainty: An application to Italian food chains

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Muriana, Cinzia, E-mail: cinzia.muriana@unipa.it

    Highlights: • Food recovery is a suitable way to manage food near its expiry date. • The variability of product shelf life must be taken into account. • The paper addresses the mathematical modeling of the profit related to food recovery. • The optimal time to withdraw the products is decisive for food recovery. - Abstract: Food losses represent a significant issue affecting food supply chains. The possibility of recovering such products can be seen as an effective way to reduce this phenomenon, improve supply chain performance and ameliorate the conditions of undernourished people. The topic has already been investigated in a previous paper under the hypothesis of a deterministic and constant Shelf Life (SL) of products. However, such a model cannot properly be extended to products whose SL is uncertain, as it does not take into account the deterioration costs and loss of profits due to the SL being exceeded within the cycle time. The present paper therefore presents an extension of the previous one under stochastic conditions of food quality. Unlike the previous publication, this work represents a general model applicable to all supply chains, especially those managing fresh products characterized by uncertain SL, such as fruits and vegetables. The deterioration costs and loss of profits are included in the model, and the optimal time at which to withdraw the products from the shelves, as well as the quantities to be shipped to each alternative destination, have been determined. A comparison of the proposed model with that reported in the previous publication has been carried out in order to underline the impact of SL variability on the optimality conditions. The results show that the food recovery strategy in the presence of uncertainty about food quality is rewarding, even if the optimal profit is lower than in the deterministic case.
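    The optimization the abstract describes (picking the withdrawal time that maximizes expected profit when shelf life is stochastic) can be sketched roughly as below. This is not the paper's model: the revenue and deterioration-cost structure, the normal shelf-life distribution, and all parameter values are invented for illustration.

```python
import random

def expected_profit(withdraw_t, unit_margin=1.0, deterioration_cost=3.0,
                    sl_mean=10.0, sl_sd=2.0, n_sim=2000, seed=42):
    """Monte Carlo estimate of profit for a given withdrawal time (days)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        sl = max(0.0, rng.gauss(sl_mean, sl_sd))  # stochastic shelf life
        if withdraw_t <= sl:
            # withdrawn (and recovered) before expiry: full selling period
            total += unit_margin * withdraw_t
        else:
            # expired on the shelf: truncated sales plus a disposal cost
            total += unit_margin * sl - deterioration_cost
    return total / n_sim

def optimal_withdrawal_time(candidates):
    """Grid search for the candidate withdrawal time with highest expected profit."""
    return max(candidates, key=expected_profit)
```

    Sweeping candidate times around the mean shelf life exposes the trade-off the abstract refers to: withdrawing too early forgoes sales, withdrawing too late incurs deterioration costs, so the stochastic optimum sits below the profit of the deterministic case.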

  8. Projecting Future Land Use Changes in West Africa Driven by Climate and Socioeconomic Factors: Uncertainties and Implications for Adaptation

    NASA Astrophysics Data System (ADS)

    Wang, G.; Ahmed, K. F.; You, L.

    2015-12-01

    Land use changes constitute an important regional climate change forcing in West Africa, a region of strong land-atmosphere coupling. At the same time, climate change can be an important driver of land use, although its importance relative to the impact of socio-economic factors may vary significantly from region to region. This study compares the contributions of climate change and socioeconomic development to potential future changes of agricultural land use in West Africa and examines various sources of uncertainty using a land use projection model (LandPro) that accounts for the impact of socioeconomic drivers on the demand side and the impact of climate-induced crop yield changes on the supply side. Future crop yield changes were simulated by a process-based crop model driven with future climate projections from a regional climate model, and future changes in food demand are projected using a model for policy analysis of agricultural commodities and trade. The impact of human decision-making on land use was explicitly considered through multiple "what-if" scenarios to examine the range of uncertainties in projecting future land use. Without agricultural intensification, the climate-induced decrease of crop yield together with the increase of food demand is found to cause a significant increase in agricultural land use at the expense of forest and grassland by mid-century, and the resulting land use and land cover changes are found to feed back on the regional climate in a way that exacerbates the negative impact of climate on crop yield. Analysis of results from multiple decision-making scenarios suggests that human adaptation, characterized by science-informed decision making to minimize land use, could be very effective in many parts of the region.

  9. [Significance of motivation balance for a choice of dog's behavior under conditions of environmental uncertainty].

    PubMed

    Chilingarian, L I; Grigor'ian, G A

    2007-01-01

    Two experimental models with a choice between two reinforcements were used to assess the individual typological features of dogs. In the first model, dogs were given the choice between homogeneous food reinforcements: a less valuable, constantly delivered reinforcement and a more valuable reinforcement delivered with low probability. In the second model, the dogs had the choice between heterogeneous reinforcements: performing alimentary or defensive reactions. Under conditions of rising uncertainty, owing to a decrease in the probability of obtaining the valuable food, two dogs continued to prefer the valuable reinforcement, while the third animal gradually shifted its behavior from the choice of a highly valuable but infrequent reward to a less valuable but easily obtained reinforcement. When choosing between the valuable food reinforcement and avoidance of electrocutaneous stimulation, the first two dogs preferred food, whereas the third animal, which had previously been oriented to the choice of the low-valuable constant reinforcement, steadily preferred the avoidance behavior. The data obtained are consistent with the hypothesis that the individual typological characteristics of animals' (and humans') behavior substantially depend on two parameters: the extent of environmental uncertainty and subjective features of reinforcement assessment.

  10. Bio-physical vs. Economic Uncertainty in the Analysis of Climate Change Impacts on World Agriculture

    NASA Astrophysics Data System (ADS)

    Hertel, T. W.; Lobell, D. B.

    2010-12-01

    Accumulating evidence suggests that agricultural production could be greatly affected by climate change, but there remains little quantitative understanding of how these agricultural impacts would affect economic livelihoods in poor countries. The recent paper by Hertel, Burke and Lobell (GEC, 2010) considers three scenarios of agricultural impacts of climate change, corresponding to the fifth, fiftieth, and ninety-fifth percentiles of projected yield distributions for the world's crops in 2030. They evaluate the resulting changes in global commodity prices, national economic welfare, and the incidence of poverty in a set of 15 developing countries. Although the small price changes under the medium scenario are consistent with previous findings, their low-productivity scenario reveals the potential for much larger food price changes than reported in recent studies, which have hitherto focused on the most likely outcomes. The poverty impacts of price changes under the extremely adverse scenario are quite heterogeneous and very significant in some population strata. They conclude that it is critical to look beyond central-case climate shocks and beyond a simple focus on yields and highly aggregated poverty impacts. In this paper, we conduct a more formal, systematic sensitivity analysis (SSA) with respect to uncertainty in the biophysical impacts of climate change on agriculture, by explicitly specifying joint distributions for global yield changes - this time focusing on 2050. This permits us to place confidence intervals on the resulting price impacts and poverty results which reflect the uncertainty inherited from the biophysical side of the analysis. We contrast this with the economic uncertainty inherited from the global general equilibrium model (GTAP), by undertaking SSA with respect to the behavioral parameters in that model. This permits us to assess which type of uncertainty is more important for regional price and poverty outcomes.
Finally, we undertake a combined SSA, wherein climate change-induced productivity shocks are permitted to interact with the uncertain economic parameters. This permits us to examine potential interactions between the two sources of uncertainty.

  11. Attribution of global foodborne disease to specific foods: Findings from a World Health Organization structured expert elicitation

    PubMed Central

    Devleesschauwer, Brecht; Aspinall, Willy; Cooke, Roger; Corrigan, Tim; Havelaar, Arie; Angulo, Frederick; Gibb, Herman; Kirk, Martyn; Lake, Robin; Speybroeck, Niko; Torgerson, Paul; Hald, Tine

    2017-01-01

    Background Recently the World Health Organization Foodborne Disease Burden Epidemiology Reference Group (FERG) estimated that 31 foodborne diseases (FBDs) resulted in over 600 million illnesses and 420,000 deaths worldwide in 2010. Knowing the relative importance of different foods as exposure routes for key hazards is critical to preventing illness. This study reports the findings of a structured expert elicitation providing globally comparable food source attribution estimates for 11 major FBDs in each of 14 world subregions. Methods and findings We used Cooke's Classical Model to elicit and aggregate the judgments of 73 international experts. Judgments were elicited from each expert individually and aggregated using both equal and performance weights. Performance-weighted results are reported, as they increased the informativeness of estimates while retaining accuracy. We report measures of central tendency and uncertainty bounds on food source attribution estimates. For some pathogens we see relatively consistent food source attribution estimates across subregions of the world; for others there is substantial regional variation. For example, for non-typhoidal salmonellosis, pork was of minor importance compared to eggs and poultry meat in the American and African subregions, whereas in the European and Western Pacific subregions the importance of these three food sources was quite similar. Our regional results broadly agree with estimates from earlier European and North American food source attribution research. As in prior food source attribution research, we find relatively wide uncertainty bounds around our median estimates. Conclusions We present the first worldwide estimates of the proportion of specific foodborne diseases attributable to specific food exposure routes.
While we find substantial uncertainty around central tendency estimates, we believe these estimates provide the best currently available basis on which to link FBDs and specific foods in many parts of the world, providing guidance for policy actions to control FBDs. PMID:28910293
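    The aggregation step of Cooke's Classical Model reduces, at its core, to a performance-weighted linear pool of expert judgments. The sketch below illustrates only that pooling step; in the real method the weights are computed from calibration and information scores on seed questions, whereas here they are simply assumed.

```python
# Simplified sketch of performance-weighted aggregation (the real
# Classical Model derives the weights from calibration questions).
def weighted_pool(estimates, weights):
    """Normalized-weight linear pool of expert point estimates."""
    total = sum(weights)
    return sum(w * e for w, e in zip(weights, estimates)) / total

# Three hypothetical experts estimate the fraction of a disease
# attributable to one food source; the second expert scored best
# on (assumed) calibration questions.
estimates = [0.30, 0.45, 0.20]
equal = weighted_pool(estimates, [1, 1, 1])          # equal weights
performance = weighted_pool(estimates, [0.1, 0.8, 0.1])  # performance weights
```

    Compared with the equal-weight pool, the performance-weighted pool is pulled toward the best-calibrated expert, which is how the method gains informativeness while retaining accuracy.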

  12. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to members of the public. Assessment of risk is routed through this dose computation, which basically depends on the underlying dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Because the governing parameters of the ingestion dose assessment model are imprecise, we applied evidence theory to compute bounds on the risk. The uncertainty is addressed by the belief and plausibility fuzzy measures.
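    The belief/plausibility bound computation mentioned above can be illustrated with a minimal Dempster-Shafer sketch: imprecise model output is represented as focal intervals carrying probability mass, and belief and plausibility bracket the probability of an event such as "the risk lies below a threshold". The intervals and masses below are illustrative, not the paper's data.

```python
# Minimal Dempster-Shafer sketch: belief (Bel) counts mass from focal
# intervals fully inside the event, plausibility (Pl) counts mass from
# intervals that merely intersect it, so Bel <= P(event) <= Pl.
def belief_plausibility(focal_elements, lo, hi):
    """Bel and Pl of the event 'value lies in [lo, hi]'.

    focal_elements: list of ((a, b), mass) interval/mass pairs.
    """
    bel = sum(m for (a, b), m in focal_elements if lo <= a and b <= hi)
    pl = sum(m for (a, b), m in focal_elements if b >= lo and a <= hi)
    return bel, pl

# Hypothetical focal intervals for a risk estimate (arbitrary units),
# with masses summing to 1.
focal = [((1.0, 2.0), 0.5), ((1.5, 3.0), 0.3), ((4.0, 5.0), 0.2)]
bel, pl = belief_plausibility(focal, 0.0, 2.5)  # is the risk below 2.5?
```

    The gap between Bel and Pl is the bound on risk the abstract refers to: with imprecise parameters, evidence theory yields an interval rather than a single probability.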

  13. Biogeoscience opportunities to address agricultural supply chain risk: observations, methods, and applications

    NASA Astrophysics Data System (ADS)

    Wolf, A.; Gaitan, C. F.; Thomas, T.; Watts, D.; Bollinger, J.

    2017-12-01

    Food and agriculture is the largest global industry, at $7.8Tn annual value, and is also the least digitized industry. As a consequence, the inefficiencies in this industry are staggering: yield gaps below potential are 20-70% worldwide, and of the crops that are produced, 20-50% are lost from the time of harvest up to consumption. Where some frame the challenges in agriculture as "grow more with less," a more useful analysis is around risk and uncertainty. In emerging markets, lack of geospatial data makes it difficult to recommend improved seeds or fertilizers for particular locales, therefore risky to make operating loans, impossible to accurately price crop insurance, and ultimately poses challenges in making contracts for delivery to processors that bring ag products into the food system. In developed markets, the ever increasing demands around immediacy, transparency, quality, crop novelty and food safety are straining the capacity of growers and processors to keep up. We have come to see this as a challenge in developing predictions joining both buyers and sellers around a shared set of facts on harvest timing, total yield, and post harvest quality. While these challenges have been met historically from government agencies and marketing boards reporting seasonal and regional forecasts, in many instances these are insufficient for making critical operational decisions on short timescales. In this talk, we will present a new set of measurements and analytical tools that enable unprecedented granularity in predictions to reduce risk and uncertainty in the food and ag supply chain, with special attention to applications that have potential to be economically self-sustaining.

  14. Effectiveness of the food recovery at the retailing stage under shelf life uncertainty: An application to Italian food chains.

    PubMed

    Muriana, Cinzia

    2015-07-01

    Food losses represent a significant issue affecting food supply chains. The possibility of recovering such products can be seen as an effective way to reduce this phenomenon, improve supply chain performance and ameliorate the conditions of undernourished people. The topic has already been investigated in a previous paper under the hypothesis of a deterministic and constant Shelf Life (SL) of products. However, such a model cannot properly be extended to products whose SL is uncertain, as it does not take into account the deterioration costs and loss of profits due to the SL being exceeded within the cycle time. The present paper therefore presents an extension of the previous one under stochastic conditions of food quality. Unlike the previous publication, this work represents a general model applicable to all supply chains, especially those managing fresh products characterized by uncertain SL, such as fruits and vegetables. The deterioration costs and loss of profits are included in the model, and the optimal time at which to withdraw the products from the shelves, as well as the quantities to be shipped to each alternative destination, have been determined. A comparison of the proposed model with that reported in the previous publication has been carried out in order to underline the impact of SL variability on the optimality conditions. The results show that the food recovery strategy in the presence of uncertainty about food quality is rewarding, even if the optimal profit is lower than in the deterministic case. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Managing uncertainty about food risks - Consumer use of food labelling.

    PubMed

    Tonkin, Emma; Coveney, John; Meyer, Samantha B; Wilson, Annabelle M; Webb, Trevor

    2016-12-01

    General consumer knowledge of and engagement with the production of food has declined resulting in increasing consumer uncertainty about, and sensitivity to, food risks. Emphasis is therefore placed on providing information for consumers to reduce information asymmetry regarding food risks, particularly through food labelling. This study examines the role of food labelling in influencing consumer perceptions of food risks. In-depth, 1-h interviews were conducted with 24 Australian consumers. Participants were recruited based on an a priori defined food safety risk scale, and to achieve a diversity of demographic characteristics. The methodological approach used, adaptive theory, was chosen to enable a constant interweaving of theoretical understandings and empirical data throughout the study. Participants discussed perceiving both traditional (food spoilage/microbial contamination) and modern (social issues, pesticide and 'chemical' contamination) risks as present in the food system. Food labelling was a symbol of the food system having managed traditional risks, and a tool for consumers to personally manage perceived modern risks. However, labelling also raised awareness of modern risks not previously considered. The consumer framing of risk presented demonstrates the need for more meaningful consumer engagement in policy decision making to ensure risk communication and management meet public expectations. This research innovatively identifies food labelling as both a symbol of, and a tool for, the management of perceived risks for consumers. Therefore it is imperative that food system actors ensure the authenticity and trustworthiness of all aspects of food labelling, not only those related to food safety. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    USDA-ARS?s Scientific Manuscript database

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  17. Uncertainty in action-value estimation affects both action choice and learning rate of the choice behaviors of rats

    PubMed Central

    Funamizu, Akihiro; Ito, Makoto; Doya, Kenji; Kanzaki, Ryohei; Takahashi, Hirokazu

    2012-01-01

    The estimation of reward outcomes for action candidates is essential for decision making. In this study, we examined whether and how uncertainty in reward outcome estimation affects action choice and learning rate. We designed a choice task in which rats selected either the left-poking or right-poking hole and stochastically received a food pellet reward. The reward probabilities of the left and right holes were chosen from six settings (high, 100% vs. 66%; mid, 66% vs. 33%; low, 33% vs. 0% for the left vs. right holes, and the opposites) every 20–549 trials. We used Bayesian Q-learning models to estimate the time course of the probability distribution of action values and tested whether they explain the behaviors of rats better than standard Q-learning models that estimate only the mean of action values. Model comparison by cross-validation revealed that a Bayesian Q-learning model with an asymmetric update for reward and non-reward outcomes fit the choice time course of the rats best. In the action-choice equation of the Bayesian Q-learning model, the estimated coefficient for the variance of action value was positive, meaning that the rats were uncertainty seeking. Further analysis of the Bayesian Q-learning model suggested that uncertainty facilitated the effective learning rate. These results suggest that rats take uncertainty in action-value estimation into account, with an uncertainty-seeking action policy and uncertainty-dependent modulation of the effective learning rate. PMID:22487046
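    A schematic version of such a Bayesian Q-learner is sketched below: each action's value is tracked as a mean and a variance, choice adds a positive uncertainty bonus (uncertainty seeking), and the effective learning rate (a Kalman-style gain) grows with the current uncertainty. The parameter values are illustrative, not the ones fitted to the rats' data, and the paper's asymmetric reward/non-reward update is omitted for brevity.

```python
# Schematic Bayesian Q-learner: value estimates carry both a mean and a
# variance, and the variance drives both exploration and learning rate.
class BayesianQ:
    def __init__(self, n_actions=2, prior_var=1.0, obs_var=1.0, bonus=0.5):
        self.mu = [0.0] * n_actions        # mean action values
        self.var = [prior_var] * n_actions  # uncertainty about each value
        self.obs_var = obs_var
        self.bonus = bonus                 # positive => uncertainty seeking

    def score(self, a):
        """Choice score: mean value plus an uncertainty bonus."""
        return self.mu[a] + self.bonus * self.var[a]

    def choose(self):
        return max(range(len(self.mu)), key=self.score)

    def update(self, a, reward):
        # Kalman-style update: the gain (effective learning rate) is
        # larger when the value estimate is more uncertain.
        gain = self.var[a] / (self.var[a] + self.obs_var)
        self.mu[a] += gain * (reward - self.mu[a])
        self.var[a] *= (1.0 - gain)
```

    Feeding repeated rewards to one action drives its mean up and its variance (and hence its learning rate) down, while an untried action keeps a large uncertainty bonus, qualitatively reproducing the uncertainty-seeking pattern described above.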

  18. "System Destroys Trust?"--Regulatory Institutions and Public Perceptions of Food Risks in Taiwan

    ERIC Educational Resources Information Center

    Chou, Kuei-tien; Liou, Hwa-meei

    2010-01-01

    This article aims to explore public perceptions of global food risk issues and public attitudes towards government capacity to respond to concerns with technological and health uncertainties in an era of rapid economic development in newly industrialized countries. From cross-national comparative research on global food risk issues in the EU, UK,…

  19. Cost Utility Analysis of Topical Steroids Compared With Dietary Elimination for Treatment of Eosinophilic Esophagitis.

    PubMed

    Cotton, Cary C; Erim, Daniel; Eluri, Swathi; Palmer, Sarah H; Green, Daniel J; Wolf, W Asher; Runge, Thomas M; Wheeler, Stephanie; Shaheen, Nicholas J; Dellon, Evan S

    2017-06-01

    Topical corticosteroids or dietary elimination are recommended as first-line therapies for eosinophilic esophagitis, but data to directly compare these therapies are scant. We performed a cost utility comparison of topical corticosteroids and the 6-food elimination diet (SFED) in treatment of eosinophilic esophagitis, from the payer perspective. We used a modified Markov model based on current clinical guidelines, in which transition between states depended on histologic response simulated at the individual cohort-member level. Simulation parameters were defined by systematic review and meta-analysis to determine the base-case estimates and bounds of uncertainty for sensitivity analysis. Meta-regression models included adjustment for differences in study and cohort characteristics. In the base-case scenario, topical fluticasone was about as effective as SFED but more expensive at a 5-year time horizon ($9261.58 vs $5719.72 per person). SFED was more effective and less expensive than topical fluticasone and topical budesonide in the base-case scenario. Probabilistic sensitivity analysis revealed little uncertainty in relative treatment effectiveness. There was somewhat greater uncertainty in the relative cost of treatments; most simulations found SFED to be less expensive. In a cost utility analysis comparing topical corticosteroids and SFED for first-line treatment of eosinophilic esophagitis, the therapies were similar in effectiveness. SFED was on average less expensive, and more cost effective in most simulations, than topical budesonide and topical fluticasone, from a payer perspective and not accounting for patient-level costs or quality of life. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
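    The structure of such a cost-utility comparison can be caricatured with a tiny two-state Markov cohort model: patients are either histologic responders (cheap maintenance) or non-responders (costly care), and per-cycle costs accumulate over the time horizon. All probabilities and costs below are hypothetical, not the study's calibrated inputs; the point is only that a higher response probability lowers expected cost.

```python
# Toy two-state Markov cohort model (responder / non-responder) for
# comparing expected cost of therapies with different response rates.
def cohort_cost(response_p, cycles=10, responder_cost=300.0,
                nonresponder_cost=900.0, relapse_p=0.05):
    """Expected cost per patient over the horizon (hypothetical inputs)."""
    responders, nonresponders = 0.0, 1.0  # cohort starts non-responding
    total = 0.0
    for _ in range(cycles):
        # transitions: non-responders respond with prob response_p,
        # responders relapse with prob relapse_p
        new_resp = responders * (1 - relapse_p) + nonresponders * response_p
        new_non = responders * relapse_p + nonresponders * (1 - response_p)
        responders, nonresponders = new_resp, new_non
        total += responders * responder_cost + nonresponders * nonresponder_cost
    return total

# A therapy with a higher histologic response rate accrues lower cost.
cost_high_response = cohort_cost(response_p=0.7)
cost_low_response = cohort_cost(response_p=0.6)
```

    In a full analysis each transition would be simulated at the individual level with sampled parameters (as in the study's probabilistic sensitivity analysis), but the cohort version already shows how response rates and per-cycle costs drive the comparison.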

  20. Applicability and feasibility of systematic review for performing evidence-based risk assessment in food and feed safety.

    PubMed

    Aiassa, E; Higgins, J P T; Frampton, G K; Greiner, M; Afonso, A; Amzal, B; Deeks, J; Dorne, J-L; Glanville, J; Lövei, G L; Nienstedt, K; O'connor, A M; Pullin, A S; Rajić, A; Verloo, D

    2015-01-01

    Food and feed safety risk assessment uses multi-parameter models to evaluate the likelihood of adverse events associated with exposure to hazards in human health, plant health, animal health, animal welfare, and the environment. Systematic review and meta-analysis are established methods for answering questions in health care, and can be implemented to minimize biases in food and feed safety risk assessment. However, no methodological frameworks exist for refining risk assessment multi-parameter models into questions suitable for systematic review, and use of meta-analysis to estimate all parameters required by a risk model may not be always feasible. This paper describes novel approaches for determining question suitability and for prioritizing questions for systematic review in this area. Risk assessment questions that aim to estimate a parameter are likely to be suitable for systematic review. Such questions can be structured by their "key elements" [e.g., for intervention questions, the population(s), intervention(s), comparator(s), and outcome(s)]. Prioritization of questions to be addressed by systematic review relies on the likely impact and related uncertainty of individual parameters in the risk model. This approach to planning and prioritizing systematic review seems to have useful implications for producing evidence-based food and feed safety risk assessment.

  1. Quantitative assessment of the risk of microbial spoilage in foods. Prediction of non-stability at 55 °C caused by Geobacillus stearothermophilus in canned green beans.

    PubMed

    Rigaux, Clémence; André, Stéphane; Albert, Isabelle; Carlin, Frédéric

    2014-02-03

    Microbial spoilage of canned foods by thermophilic and highly heat-resistant spore-forming bacteria, such as Geobacillus stearothermophilus, is a persistent problem in the food industry. An incubation test at 55 °C for 7 days, followed by validation of biological stability, is used as an indicator of compliance with good manufacturing practices. We propose a microbial risk assessment model predicting the percentage of non-stability due to G. stearothermophilus in canned green beans manufactured by a French company. The model accounts for initial microbial contamination of fresh unprocessed green beans with G. stearothermophilus, cross-contamination in the processing chain, inactivation processes, and the probability of survival and growth. The sterilization process is modeled by an equivalent heating time depending on the sterilization value F₀ and on the G. stearothermophilus resistance parameter z(T). Following the recommendations of international organizations, second-order Monte Carlo simulations are used, separately propagating uncertainty and variability in the parameters. As a result of the model, the mean predicted non-stability rate is 0.5%, with a 95% uncertainty interval of [0.1%; 1.2%], which closely matches data communicated by the French industry. A sensitivity analysis based on Sobol indices and some scenario tests underlines the importance of cross-contamination at the blanching step, in addition to inactivation due to the sterilization process. Copyright © 2013 Elsevier B.V. All rights reserved.
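    The second-order Monte Carlo idea, propagating uncertainty and variability separately, can be sketched as a two-loop simulation: the outer loop samples epistemically uncertain parameters, the inner loop samples can-to-can variability, and each outer draw yields one predicted non-stability rate. The distributions and the survival rule below are invented for illustration and are much cruder than the paper's model.

```python
import random

# Two-dimensional (second-order) Monte Carlo sketch: uncertainty in the
# outer loop, variability in the inner loop.
def second_order_mc(n_outer=100, n_inner=400, seed=7):
    rng = random.Random(seed)
    rates = []
    for _ in range(n_outer):
        # UNCERTAINTY: mean initial contamination (log10 spores/can) and
        # the log-reduction achieved by sterilization (hypothetical).
        mean_log_n0 = rng.gauss(0.0, 0.5)
        log_reduction = rng.gauss(2.5, 0.4)
        non_stable = 0
        for _ in range(n_inner):
            # VARIABILITY: an individual can's contamination.
            log_n0 = rng.gauss(mean_log_n0, 0.8)
            if log_n0 - log_reduction > 0:  # at least one surviving spore
                non_stable += 1
        rates.append(non_stable / n_inner)
    rates.sort()
    return rates

rates = second_order_mc()
mean_rate = sum(rates) / len(rates)
lo, hi = rates[int(0.025 * len(rates))], rates[int(0.975 * len(rates))]
```

    The spread of `rates` across outer draws reflects parameter uncertainty, while each individual rate already integrates over can-to-can variability; reporting the mean with a [2.5%; 97.5%] band mirrors the kind of interval the abstract quotes.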

  2. Assessing food security in water scarce regions by Life Cycle Analysis: a case study in the Gaza strip

    NASA Astrophysics Data System (ADS)

    Recanati, Francesca; Castelletti, Andrea; Melià, Paco; Dotelli, Giovanni

    2013-04-01

    Food security is a major issue in Palestine for both political and physical reasons, with direct effects on the living conditions of the local population: the nutritional level of people in Gaza is classified by FAO as "insecure". As most of the protein supply comes from irrigated agricultural production and aquaculture, freshwater availability is a limiting factor for food security, and the primary reason for frequent conflicts among food production processes (e.g. aquaculture, land livestock, or different types of crops). In this study we use Life Cycle Analysis to assess the environmental impacts associated with all the stages of water-based protein production (from agriculture and aquaculture) in the Gaza strip under different agricultural scenarios and hydroclimatic variability. As reported in several recent studies, LCA seems to be an appropriate methodology to analyze agricultural systems and assess associated food security in different socio-economic contexts. However, we argue that the inherently linear and static nature of LCA might prove inadequate to tackle the complex interaction between water cycle variability and the food production system in water-scarce regions of underdeveloped countries. Lack of sufficient and reliable data to characterize the water cycle is a further source of uncertainty affecting the robustness of the analysis. We investigate the pros and cons of LCA and LCA-based option planning for an average-sized farm in the Gaza strip, where farming and aquaculture are family-based and integrated through reuse of fish-breeding water for irrigation. Different technological solutions (drip irrigation systems, greenhouses, etc.) are evaluated to improve protein supply and reduce the pressure on freshwater, particularly during droughts. These technologies also contribute to increasing the sustainability of agricultural processes, and thereby of the economy, of the Gaza Strip (e.g. through reduced use of chemical fertilizers and pesticides).

  3. Business model configuration and dynamics for technology commercialization in mature markets.

    PubMed

    Flammini, Serena; Arcese, Gabriella; Lucchetti, Maria Claudia; Mortara, Letizia

    2017-01-01

    The food industry is a well-established and complex industry. New entrants attempting to penetrate it via the commercialization of a new technological innovation could face high uncertainty and constraints. The capability to innovate through collaboration and to identify suitable strategies and innovative business models (BMs) can be particularly important for bringing a technological innovation to this market. However, although the potential for these capabilities has been advocated, we still lack a complete understanding of how new ventures could support the technology commercialization process via the development of BMs. The paper aims to discuss these issues. To address this gap, this paper builds a conceptual framework that knits together the different bodies of extant literature (i.e. entrepreneurship, strategy and innovation) to analyze the BM innovation processes associated with the exploitation of emerging technologies; determines the suitability of the framework using data from the exploratory case study of IT IS 3D - a firm which has started to exploit 3D printing in the food industry; and improves the initial conceptual framework with the findings that emerged in the case study. From this analysis it emerged that: companies could use more than one BM at a time; hence, BM innovation processes could co-exist and be run in parallel; facing high uncertainty might lead firms to choose a closed and/or a familiar BM, while explorative strategies could be pursued with open BMs; significant changes in strategies during the technology commercialization process are not necessarily reflected in a radical change in the BM; and firms could deliberately adopt interim strategies and BMs as a means to identify the more suitable ones to reach the market. This case study illustrates how firms could innovate the processes of their BM development to cope with the uncertainties of entering a mature and highly conservative industry such as food.

  4. Learning in an exotic social wasp while relocating a food source.

    PubMed

    Lozada, Mariana; D'Adamo, Paola

    2014-01-01

    In this paper we review several studies on Vespula germanica behavioral plasticity while relocating a food source in natural environments. This exotic social wasp, which has become established in many parts of the world, displays diverse cognitive abilities when foraging. Given its successful invasiveness worldwide, our initial hypothesis was that this species has great behavioral plasticity, which enables it to face environmental uncertainty. In our work we have analyzed foraging behavior associated with undepleted resources. Throughout several experiments, rapid learning was observed in this species; after a few learning experiences, wasps associated diverse contextual cues with a food source. However, by exploring wasp behavior when food suddenly disappeared, either because it had been removed or displaced, we found that wasps continued searching a no-longer-rewarding site for a considerable period of time, suggesting that past experience can hinder new learning. Particularly surprising is the fact that when food was displaced nearby, wasps persisted in searching over the empty dish, ignoring the presence of food close by. We propose that this species could be a suitable model for studying cognitive plasticity in relation to environmental uncertainty. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. "And then you start to loose it because you think about Nutella": The significance of food for people with inflammatory bowel disease - a qualitative study.

    PubMed

    Palant, Alexander; Koschack, Janka; Rassmann, Simone; Lucius-Hoene, Gabriele; Karaus, Michael; Himmel, Wolfgang

    2015-07-30

    Many patients with inflammatory bowel disease strongly believe that food or certain food products heavily influence their symptoms or even trigger acute flare-ups. Unfortunately, there is no generalizable information for these patients, and therefore no effective diet has been identified to date. The narrative interviews we used for this study provide the basis for the German website www.krankheitserfahrungen.de . Maximum-variation sampling was used to include a broad range of experiences and a variety of different factors that might influence people's experiences. The sample included men and women of different age groups and social and ethnic backgrounds from across Germany. The interviews were analyzed using grounded theory. Four interrelated categories emerged: managing uncertainty; eating: between craving and aversion; being different; and professional help as a further source of uncertainty. The most important issue for our respondents was handling uncertainty and finding a way between desire for, and aversion to, eating. Many participants described difficulties during formal social occasions such as weddings, birthdays, or going out to a restaurant. Many of the experiences the participants reported in their daily struggle with food and their illness, such as cravings for and abstaining from certain foods, were rather unusual and often stressful. Some interviewees, having decided to no longer go out in public, experienced even more social isolation than before. Health professionals need to become more involved and not only advise about food and eating, but also help their patients find strategies for avoiding social isolation.

  6. Simultaneous analysis of historical, emerging and novel brominated flame retardants in food and feed using a common extraction and purification method.

    PubMed

    Bichon, Emmanuelle; Guiffard, Ingrid; Vénisseau, Anaïs; Lesquin, Elodie; Vaccher, Vincent; Marchand, Philippe; Le Bizec, Bruno

    2018-08-01

    Brominated Flame Retardants (BFRs) are still widely used for industrial purposes. These contaminants may enter the food chain, where they mainly occur in food of animal origin. The aim of our work was to provide a unique method able to quantify the widest range of BFRs in feed and food items. After freeze-drying and grinding, a pressurized liquid extraction was carried out. The extract was purified on acidified silica, Florisil ® and carbon columns, and the four separated fractions were analyzed by gas and liquid chromatography coupled to high-resolution and tandem mass spectrometry. Isotopic dilution was preferentially used when commercial labelled compounds were available. Analytical sensitivity was in accordance with the expectations of Recommendation 2014/118/EU for PBDEs, HBCDDs, TBBPA, TBBPA-bME, EHTBB, BEHTEBP and TBBPA-bME. Additional BFRs were included in this analytical method with the same level of performance (LOQs below 0.01 ng g-1 ww). These are PBBs, pTBX, TBCT, PBBz, PBT, PBEB, HBBz, BTBPE, OBIND and T23BPIC. However, some of the BFRs listed in Recommendation 2014/118/EU are not yet covered by our analytical method, i.e. TBBPA-bOHEE, TBBPA-bAE, TBBPA-bGE, TBBPA-bDiBPrE, TBBPS, TBBPS-bME, TDBPP, EBTEBPI, HBCYD and DBNPG. The measurement uncertainty was fully calculated for 21 of the 31 analytes monitored in the method. Reproducibility uncertainty was below 23% when isotopic dilution was used. Certified reference materials are now required to better characterize the trueness of this method, which was applied in the French National Control Plans. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Uncertainty Analysis of Two Types of Humidity Sensors by a Humidity Generator with a Divided-Flow System.

    PubMed

    Chen, Ling-Hsi; Chen, Chiachung

    2018-02-21

    Humidity measurement is an important technique for the agricultural, food, pharmaceutical, and chemical industries. For the sake of convenience, electrical relative humidity (RH) sensors have been widely used. These sensors need to be calibrated to ensure their accuracy, and the measurement uncertainty of these sensors has become a major concern. In this study, a self-made divided-flow generator was established to calibrate two types of electrical humidity sensors. The standard reference humidity was calculated from the dew-point temperature and air dry-bulb temperature measured by a chilled mirror monitor. This divided-flow generator produced consistent RH measurement results. The uncertainty of the reference standard increased with increasing RH values. With adequate calibration equations, the combined uncertainty ranged from 0.82% to 1.45% RH for resistive humidity sensors and from 0.63% to 1.4% RH for capacitive humidity sensors. This self-made divided-flow generator and calibration method are cheap, time-saving, and easy to use. Thus, the proposed approach can easily be applied in research laboratories.

  9. Reducing uncertainty in risk modeling for methylmercury exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ponce, R.; Egeland, G.; Middaugh, J.

    The biomagnification and bioaccumulation of methylmercury in marine species represents a challenge for risk assessment related to the consumption of subsistence foods in Alaska. Because of the profound impact that food consumption advisories have on indigenous peoples seeking to preserve a way of life, there is a need to reduce uncertainty in risk assessment. Thus, research was initiated to reduce the uncertainty in assessing the health risks associated with the consumption of subsistence foods. Because marine subsistence foods typically contain elevated levels of methylmercury, preliminary research efforts have focused on methylmercury as the principal chemical of concern. Of particular interest are the antagonistic effects of selenium on methylmercury toxicity. Because of this antagonism, methylmercury exposure through the consumption of marine mammal meat (with high selenium) may not be as toxic as comparable exposures through other sources of dietary intake, such as in the contaminated bread episode of Iraq (containing relatively low selenium). This hypothesis is supported by animal experiments showing reduced toxicity of methylmercury associated with marine mammal meat, by the antagonistic influence of selenium on methylmercury toxicity, and by negative clinical findings in adult populations exposed to methylmercury through a marine diet not subject to industrial contamination. Exploratory model development is underway to identify potential improvements and applications of current deterministic and probabilistic models, particularly by incorporating selenium as an antagonist in risk modeling methods.

  10. Contamination of packaged food by substances migrating from a direct-contact plastic layer: Assessment using a generic quantitative household scale methodology.

    PubMed

    Vitrac, Olivier; Challe, Blandine; Leblanc, Jean-Charles; Feigenbaum, Alexandre

    2007-01-01

    The risk of contamination of 12 packaged foods by substances released from the plastic contact layer was evaluated using a novel modeling technique that predicts migration while accounting for (i) possible variations in the time of contact between foodstuffs and packaging and (ii) uncertainty in the physico-chemical parameters used to predict migration. Contamination data, which are subject to variability and uncertainty, are derived through a stochastic resolution of the transport equations that control migration into food. Distributions of contact times between packaging materials and foodstuffs were reconstructed from the volumes and frequencies of purchases of a given panel of 6422 households, making assumptions about household storage behaviour. The risk of contamination of the packaged foods was estimated for styrene (a monomer found in polystyrene yogurt pots) and 2,6-di-tert-butyl-4-hydroxytoluene (a representative of the widely used phenolic antioxidants). The results are analysed and discussed with regard to the sensitivity of the model to the set parameters and chosen assumptions.
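
    A minimal sketch of this kind of stochastic migration estimate, combining variability in household contact times with uncertainty in the diffusion coefficient via the short-time Fickian approximation m = 2·c0·ρ·sqrt(D·t/π); all parameter values below are hypothetical, not the study's:

```python
import math, random

random.seed(1)

C0 = 200.0     # initial migrant concentration in the polymer (mg/kg), hypothetical
RHO = 1.0e3    # polymer density (kg/m^3), hypothetical

def migrated_mass(D, t):
    """Mass migrated per unit contact area (mg/m^2), short-time Fickian limit."""
    return 2 * C0 * RHO * math.sqrt(D * t / math.pi)

samples = []
for _ in range(5000):
    D = 10 ** random.gauss(-13.0, 0.5)        # uncertain diffusion coefficient (m^2/s)
    t = random.expovariate(1 / (20 * 86400))  # variable contact time, mean 20 days (s)
    samples.append(migrated_mass(D, t))

samples.sort()
median, p95 = samples[len(samples) // 2], samples[int(0.95 * len(samples))]
print(f"median {median:.1f} mg/m^2, 95th percentile {p95:.1f} mg/m^2")
```

    The stochastic resolution replaces a single worst-case migration figure with a distribution from which risk percentiles can be read.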

  11. [Genetically modified organisms: a new threat to food safety].

    PubMed

    Spendeler, Liliane

    2005-01-01

    This article analyzes the food safety aspects related to the use of genetically modified organisms in agriculture and food. A discussion is provided of the uncertainties related to the insertion of foreign genes into organisms, giving examples of unforeseen, undesirable effects and of instabilities of the organisms thus artificially fabricated. Data are then provided from both official agencies and the existing literature questioning the accuracy and reliability of the risk analyses that deem these organisms harmless to health, and the almost total lack of scientific studies analyzing the health safety or dangerousness of transgenic foods is discussed. Given all these unknowns, other factors must be taken into account, particularly genetic contamination of non-genetically modified crops, which is now starting to become widespread in some parts of the world. Not being able to reverse the situation in the event of problems is irresponsible. Other major aspects are the impacts on the environment (such as insects developing resistance, loss of biodiversity, and increased use of chemical products) with indirect repercussions on health and/or future food production. Lastly, thoughts for discussion are added concerning food safety in terms of food availability and food sovereignty, given that the market for transgenic seed and related agrochemicals is currently cornered by five large transnational companies. The conclusion entails an analysis of biotechnological agriculture's contribution to sustainability.

  12. What Risk Assessments of Genetically Modified Organisms Can Learn from Institutional Analyses of Public Health Risks

    PubMed Central

    Rajan, S. Ravi; Letourneau, Deborah K.

    2012-01-01

    The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large. PMID:23193357

  14. Modelling the anaerobic digestion of solid organic waste - Substrate characterisation method for ADM1 using a combined biochemical and kinetic parameter estimation approach.

    PubMed

    Poggio, D; Walker, M; Nimmo, W; Ma, L; Pourkashanian, M

    2016-07-01

    This work proposes a novel and rigorous substrate characterisation methodology to be used with ADM1 to simulate the anaerobic digestion of solid organic waste. The proposed method uses data from both direct substrate analysis and the methane production from laboratory-scale anaerobic digestion experiments, and involves assessment of four substrate fractionation models. The models partition the organic matter into a mixture of particulate and soluble fractions, with the decision on the most suitable model being made on the quality of fit between experimental and simulated data and the uncertainty of the calibrated parameters. The method was tested using samples of domestic green and food waste and using experimental data from both short batch tests and longer semi-continuous trials. The results showed that, in general, increased fractionation model complexity led to a better fit but with increased uncertainty. When using batch test data, the most suitable model for green waste included one particulate and one soluble fraction, whereas for food waste two particulate fractions were needed. With richer semi-continuous datasets, the parameter estimation resulted in less uncertainty, therefore allowing the description of the substrate with a more complex model. The resulting substrate characterisations and fractionation models obtained from batch test data, for both waste samples, were used to validate the method using semi-continuous experimental data and showed good prediction of methane production, biogas composition, total and volatile solids, ammonia and alkalinity. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
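
    The kinetic side of such a characterisation can be illustrated with a one-fraction first-order model B(t) = B0·(1 − e^(−kt)) fitted to cumulative methane data. The data and the grid-search fit below are illustrative only, not the study's ADM1 procedure:

```python
import numpy as np

# Hypothetical cumulative methane data (mL CH4 per g volatile solids).
t = np.array([0, 2, 5, 10, 15, 20, 30], dtype=float)        # days
B = np.array([0, 80, 150, 220, 260, 280, 300], dtype=float)

def rss(B0, k):
    """Residual sum of squares for B(t) = B0 * (1 - exp(-k t))."""
    return float(np.sum((B - B0 * (1 - np.exp(-k * t))) ** 2))

# Brute-force grid search keeps the sketch dependency-free.
best = min((rss(b0, k), b0, k)
           for b0 in np.linspace(250, 400, 151)
           for k in np.linspace(0.01, 0.5, 200))
print(f"B0 = {best[1]:.0f} mL/g VS, k = {best[2]:.3f} 1/d, RSS = {best[0]:.0f}")
```

    Adding a second particulate fraction (a second B0/k pair) would lower the residual, at the price of less certain parameter estimates, which is exactly the trade-off the model selection step weighs.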

  15. Uncertainty in action-value estimation affects both action choice and learning rate of the choice behaviors of rats.

    PubMed

    Funamizu, Akihiro; Ito, Makoto; Doya, Kenji; Kanzaki, Ryohei; Takahashi, Hirokazu

    2012-04-01

    The estimation of reward outcomes for action candidates is essential for decision making. In this study, we examined whether and how the uncertainty in reward outcome estimation affects the action choice and learning rate. We designed a choice task in which rats selected either the left-poking or right-poking hole and received a reward of a food pellet stochastically. The reward probabilities of the left and right holes were chosen from six settings (high, 100% vs. 66%; mid, 66% vs. 33%; low, 33% vs. 0% for the left vs. right holes, and the opposites) in every 20-549 trials. We used Bayesian Q-learning models to estimate the time course of the probability distribution of action values and tested if they better explain the behaviors of rats than standard Q-learning models that estimate only the mean of action values. Model comparison by cross-validation revealed that a Bayesian Q-learning model with an asymmetric update for reward and non-reward outcomes fit the choice time course of the rats best. In the action-choice equation of the Bayesian Q-learning model, the estimated coefficient for the variance of action value was positive, meaning that rats were uncertainty seeking. Further analysis of the Bayesian Q-learning model suggested that the uncertainty facilitated the effective learning rate. These results suggest that the rats consider uncertainty in action-value estimation and that they have an uncertainty-seeking action policy and uncertainty-dependent modulation of the effective learning rate. © 2012 The Authors. European Journal of Neuroscience © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.
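
    The core mechanism — Gaussian action-value estimates whose variance both adds an exploration bonus and sets the effective learning rate — can be sketched as below. This is a minimal Kalman-style caricature under assumed settings, not the authors' exact Bayesian Q-learning model:

```python
import math, random

random.seed(7)
beta = 1.0          # positive weight on variance -> uncertainty seeking
obs_noise = 0.25    # assumed observation noise variance
mu = {"L": 0.0, "R": 0.0}       # posterior means of action values
var = {"L": 1.0, "R": 1.0}      # posterior variances
p_reward = {"L": 0.66, "R": 0.33}
chosen = {"L": 0, "R": 0}

for _ in range(1000):
    # Softmax choice on mean + uncertainty bonus.
    score = {a: mu[a] + beta * var[a] for a in mu}
    p_left = 1 / (1 + math.exp(score["R"] - score["L"]))
    a = "L" if random.random() < p_left else "R"
    chosen[a] += 1
    r = 1.0 if random.random() < p_reward[a] else 0.0
    gain = var[a] / (var[a] + obs_noise)   # effective learning rate,
    mu[a] += gain * (r - mu[a])            # larger when more uncertain
    var[a] = var[a] * (1 - gain) + 0.002   # shrink, plus slow forgetting

print(f"mu_L = {mu['L']:.2f}, mu_R = {mu['R']:.2f}, chose L {chosen['L']} times")
```

    Because the gain shrinks as the variance shrinks, uncertain options are both chosen more often and learned about faster, mirroring the two effects reported in the rats.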

  16. Estimate of the uncertainty in measurement for the determination of mercury in seafood by TDA AAS.

    PubMed

    Torres, Daiane Placido; Olivares, Igor R B; Queiroz, Helena Müller

    2015-01-01

    An approach is proposed for estimating the measurement uncertainty of the determination of mercury in seafood by thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS), considering the individual sources related to the different steps of the method under evaluation as well as the uncertainties estimated from validation data. The method has been fully optimized and validated in an official laboratory of the Ministry of Agriculture, Livestock and Food Supply of Brazil, in order to comply with national and international food regulations and quality assurance, and has been accredited under the ISO/IEC 17025 norm since 2010. The estimate of measurement uncertainty was based on six sources of uncertainty for mercury determination in seafood by TDA AAS, following the validation process: linear least-squares regression, repeatability, intermediate precision, the correction factor of the analytical curve, sample mass, and the standard reference solution. Those that most influenced the measurement uncertainty were sample mass, repeatability, intermediate precision and the calibration curve. The estimated measurement uncertainty reached a value of 13.39%, which complies with European Regulation EC 836/2011. This figure represents a very realistic estimate of routine conditions, since it fairly encompasses the dispersion between the value attributed to the sample and the value measured by the laboratory analysts. From this outcome, it is possible to infer that the validation data (based on the calibration curve, recovery and precision), together with the variation in sample mass, can offer a proper estimate of measurement uncertainty.
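
    The combination step of such a GUM-style budget — independent relative standard uncertainties summed in quadrature, then expanded with a coverage factor k = 2 — can be sketched with hypothetical component values (these are not the study's figures):

```python
import math

components = {                    # relative standard uncertainties (%), hypothetical
    "calibration_curve": 4.0,
    "repeatability": 3.0,
    "intermediate_precision": 3.5,
    "curve_correction": 1.0,
    "sample_mass": 2.0,
    "reference_solution": 0.5,
}

# Quadrature (root-sum-of-squares) combination of independent sources.
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2 * u_combined       # coverage factor k = 2, ~95% confidence
print(f"combined: {u_combined:.2f}%  expanded (k=2): {U_expanded:.2f}%")
```

    Squaring before summing means the largest components dominate, which is why sample mass, repeatability, intermediate precision and the calibration curve drive the final figure.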

  17. Survey of patulin occurrence in apple juice and apple products in Catalonia, Spain, and an estimate of dietary intake.

    PubMed

    Cano-Sancho, G; Marin, S; Ramos, A J; Sanchis, V

    2009-01-01

    This study was conducted to assess patulin exposure in the Catalonian population. Patulin levels were determined in 161 apple juice samples, 77 solid apple-based food samples and 146 apple-based baby food samples obtained from six hypermarkets and supermarkets in twelve main cities of Catalonia, Spain. Patulin was analysed by a well-established validated method involving ethyl acetate extraction and direct analysis by high-performance liquid chromatography (HPLC) with ultraviolet light detection. Mean patulin levels for positive samples in apple juice, solid apple-based food and apple-based baby food were 8.05, 13.54 and 7.12 µg kg(-1), respectively. No samples exceeded the maximum permitted levels established by European Union regulation. Dietary intake was separately assessed for babies, infants and adults through a Food Frequency Questionnaire developed from 1056 individuals from Catalonia. Babies were the group most exposed to patulin; however, no risk was detected at these contamination levels. Adult and infant consumers were far from risk levels. Exposure was also estimated through Monte Carlo simulation, which distinguishes variability in exposure from uncertainty in the distributional parameter estimates.
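
    The Monte Carlo exposure step can be sketched as below: daily intake = concentration × consumption / body weight, sampled from assumed distributions and compared against the provisional maximum tolerable daily intake for patulin (0.4 µg/kg bw/day, per JECFA). All distribution parameters here are illustrative, not the survey's:

```python
import math, random

random.seed(42)
PMTDI = 0.4          # ug patulin per kg body weight per day (JECFA)
n = 20_000
intakes = []
for _ in range(n):
    conc = random.lognormvariate(math.log(8.0), 0.6)   # ug/kg in juice, assumed
    consumption = max(random.gauss(0.25, 0.10), 0.0)   # kg juice/day, assumed
    bw = max(random.gauss(70.0, 12.0), 30.0)           # kg body weight, adults
    intakes.append(conc * consumption / bw)

exceed = sum(i > PMTDI for i in intakes) / n
print(f"P(intake > PMTDI) = {exceed:.4f}")
```

    With mean contamination around 8 µg/kg, even the upper tail of simulated intakes sits far below the tolerable limit, consistent with the "no risk detected" finding.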

  18. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-10-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models, as well as greenhouse gas scenarios, are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure what is referred to here as AHEAD (Adequate Human livelihood conditions for wEll-being And Development). Based on a trans-disciplinary sample of concepts addressing human well-being and livelihoods, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows for the uncertainty of climate and impact model projections to be identified and differentiated. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that livelihood conditions are compromised by water scarcity in 34 countries. However, more often, AHEAD fulfilment is limited through other elements. The analysis shows that the water-specific uncertainty ranges of the model output are outside relevant thresholds for AHEAD for 65 out of 111 countries, and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. In 46 of the countries in the analysis, water-specific uncertainty is relevant to AHEAD. The AHEAD method presented here, together with first results, forms an important step towards making scientific results more applicable for policy decisions.
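
    The differentiation logic — model-spread uncertainty matters only where the projected range straddles a livelihood-relevant threshold — reduces to a simple interval check (country names and values below are hypothetical):

```python
THRESHOLD = 1000.0   # m^3 per capita per year, a common water-scarcity line

projections = {      # (low, high) spread across climate/impact models, hypothetical
    "A": (1500.0, 2200.0),   # entirely above threshold -> uncertainty irrelevant
    "B": (600.0, 1400.0),    # straddles threshold      -> uncertainty relevant
    "C": (300.0, 800.0),     # entirely below           -> irrelevant (scarce either way)
}

def uncertainty_relevant(low, high, threshold=THRESHOLD):
    """The model spread changes the assessment only if it crosses the threshold."""
    return low < threshold < high

relevant = [c for c, (lo, hi) in projections.items() if uncertainty_relevant(lo, hi)]
print("uncertainty relevant for:", relevant)
```

    Countries whose whole projection range falls on one side of the threshold keep the same AHEAD verdict regardless of which model is right, so their inter-model spread can be set aside.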

  19. Development of a new chlorogenic acid certified reference material for food and drug analysis.

    PubMed

    Yang, Dezhi; Jiao, LingTai; Zhang, Baoxi; Du, Guanhua; Lu, Yang

    2017-06-05

    This paper reports the preparation and characterization of a new chlorogenic acid (CHA) certified reference material (CRM), which is unavailable commercially. CHA is an active ingredient found in many geo-authentic Chinese medicinal materials and has been developed as an anti-cancer drug. In this work, trace impurities were isolated and identified through various techniques. The CHA CRM was quantified with two analytical methods, and their results were in good agreement with each other. The certified value of the CHA CRM was 99.4% with an expanded uncertainty of ±0.2%, calculated by multiplying the combined standard uncertainty by the coverage factor (k=2), at a confidence level of 95%. This CRM can be used to calibrate measurement systems, evaluate or validate measurement procedures, assign traceable property values to non-CRMs, and conduct quality control assays. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Design and Analysis of Offshore Macroalgae Biorefineries.

    PubMed

    Golberg, Alexander; Liberzon, Alexander; Vitkin, Edward; Yakhini, Zohar

    2018-03-15

    Displacing fossil fuels and their derivatives with renewables, and increasing sustainable food production, are among the major challenges facing the world in the coming decades. A possible, sustainable direction for addressing these challenges is the production of biomass and its conversion to the required products through a complex system termed a biorefinery. Terrestrial biomass and microalgae are possible sources; however, concerns over net energy balance, potable water use, environmental hazards, and uncertainty in the processing technologies raise questions about their actual potential to meet the anticipated food, feed, and energy challenges in a sustainable way. An alternative sustainable feedstock for biorefineries is macroalgae grown and processed offshore. However, implementation of offshore biorefineries requires detailed analysis of their technological, economic, and environmental performance. In this chapter, the basic principles of marine biorefinery design are presented. Methods to integrate thermodynamic efficiency, investment, and environmental aspects are discussed, as are performance improvements from new cultivation methods that fit macroalgal physiology and new fermentation methods that address the unique chemical composition of macroalgae.

  1. TEM and SP-ICP-MS analysis of the release of silver nanoparticles from decoration of pastry.

    PubMed

    Verleysen, E; Van Doren, E; Waegeneers, N; De Temmerman, P-J; Abi Daoud Francisco, M; Mast, J

    2015-04-08

    Metallic silver is an EU approved food additive referred to as E174. It is generally assumed that silver is only present in bulk form in the food chain. This work demonstrates that a simple treatment with water of "silver pearls", meant for decoration of pastry, results in the release of a subfraction of silver nanoparticles. The number-based size and shape distributions of the single, aggregated, and/or agglomerated particles released from the silver pearls were determined by combining conventional bright-field TEM imaging with semiautomatic particle detection and analysis. In addition, the crystal structure of the particles was studied by electron diffraction and chemical information was obtained by combining HAADF-STEM imaging with EDX spectroscopy and mapping. The TEM results were confirmed by SP-ICP-MS. The representative Ag test nanomaterial NM-300 K was used as a positive control to determine the uncertainty on the measurement of the size and shape of the particles.

  2. Eating nanomaterials: cruelty-free and safe? the EFSA guidance on risk assessment of nanomaterials in food and feed.

    PubMed

    Sauer, Ursula G

    2011-12-01

    Nanomaterials are increasingly being added to food handling and packaging materials, or directly, to human food and animal feed. To ensure the safety of such engineered nanomaterials (ENMs), in May 2011, the European Food Safety Authority (EFSA) published a guidance document on Risk assessment of the application of nanoscience and nanotechnologies in the food and feed chain. It states that risk assessment should be performed by following a step-wise procedure. Whenever human or animal exposure to nanomaterials is expected, the general hazard characterisation scheme requests information from in vitro genotoxicity, toxicokinetic and repeated dose 90-day oral toxicity studies in rodents. Numerous prevailing uncertainties with regard to nanomaterial characterisation and their hazard and risk assessment are addressed in the guidance document. This article discusses the impact of these knowledge gaps on meeting the goal of ensuring human safety. The EFSA's guidance on the risk assessment of ENMs in food and animal feed is taken as an example for discussion, from the point of view of animal welfare, on what level of uncertainty should be considered acceptable for human safety assessment of products with non-medical applications, and whether animal testing should be considered ethically acceptable for such products.

  3. Global, regional and national consumption of major food groups in 1990 and 2010: a systematic analysis including 266 country-specific nutrition surveys worldwide

    PubMed Central

    Micha, Renata; Khatibzadeh, Shahab; Shi, Peilin; Andrews, Kathryn G; Engell, Rebecca E; Mozaffarian, Dariush

    2015-01-01

    Objective To quantify global intakes of key foods related to non-communicable diseases in adults by region (n=21), country (n=187), age and sex, in 1990 and 2010. Design We searched and obtained individual-level intake data in 16 age/sex groups worldwide from 266 surveys across 113 countries. We combined these data with food balance sheets available in all nations and years. A hierarchical Bayesian model estimated mean food intake and associated uncertainty for each age-sex-country-year stratum, accounting for differences in intakes versus availability, survey methods and representativeness, and sampling and modelling uncertainty. Setting/population Global adult population, by age, sex, country and time. Results In 2010, global fruit intake was 81.3 g/day (95% uncertainty interval 78.9–83.7), with country-specific intakes ranging from 19.2–325.1 g/day; in only 2 countries (representing 0.4% of the world's population), mean intakes met recommended targets of ≥300 g/day. Country-specific vegetable intake ranged from 34.6–493.1 g/day (global mean=208.8 g/day); corresponding values for nuts/seeds were 0.2–152.7 g/day (8.9 g/day); for whole grains, 1.3–334.3 g/day (38.4 g/day); for seafood, 6.0–87.6 g/day (27.9 g/day); for red meats, 3.0–124.2 g/day (41.8 g/day); and for processed meats, 2.5–66.1 g/day (13.7 g/day). Mean national intakes met recommended targets in countries representing 0.4% of the global population for vegetables (≥400 g/day); 9.6% for nuts/seeds (≥4 (28.35 g) servings/week); 7.6% for whole grains (≥2.5 (50 g) servings/day); 4.4% for seafood (≥3.5 (100 g) servings/week); 20.3% for red meats (≤1 (100 g) serving/week); and 38.5% for processed meats (≤1 (50 g) serving/week). Intakes of healthful foods were generally higher and of less healthful foods generally lower at older ages. Intakes were generally similar by sex. 
Vegetable, seafood and processed meat intakes were stable over time; fruits, nuts/seeds and red meat, increased; and whole grains, decreased. Conclusions These global dietary data by nation, age and sex identify key challenges and opportunities for optimising diets, informing policies and priorities for improving global health. PMID:26408285

  4. The interplay between societal concerns and the regulatory frame on GM crops in the European Union.

    PubMed

    Devos, Yann; Reheul, Dirk; De Waele, Danny; Van Speybroeck, Linda

    2006-01-01

    Recapitulating how genetic modification technology and its agro-food products aroused strong societal opposition in the European Union, this paper demonstrates how this opposition contributed to shape the European regulatory frame on GM crops. More specifically, it describes how this opposition contributed to a de facto moratorium on the commercialization of new GM crop events at the end of the nineties. From this period onwards, the regulatory frame has been continuously revised in order to slow down further erosion of public and market confidence. Various scientific and technical reforms were made to meet societal concerns relating to the safety of GM crops. In this context, the precautionary principle, environmental post-market monitoring and traceability were adopted as ways to cope with scientific uncertainties. Labeling, traceability, co-existence and public information were introduced in an attempt to meet the general public's request for more information about GM agro-food products, and the specific demand to respect the consumers' and farmers' freedom of choice. Despite these efforts, today, the explicit role of public participation and/or ethical consultation during authorization procedures is at best minimal. Moreover, no legal room was created to progress to an integral sustainability evaluation during market procedures. It remains to be seen whether the recent policy shift towards greater transparency about value judgments, plural viewpoints and scientific uncertainties will be one step forward in integrating ethical concerns more explicitly in risk analysis. As such, the regulatory frame stands open for further interpretation, reflecting in various degrees a continued interplay with societal concerns relating to GM agro-food products. 
In this regard, both societal concerns and diversely interpreted regulatory criteria can be read as signaling a request - and even a quest - to render more explicit the broader-than-scientific dimension of the actual risk analysis.

  5. Evaluating MoE and its Uncertainty and Variability for Food Contaminants (EuroTox presentation)

    EPA Science Inventory

    Margin of Exposure (MoE) is a metric for quantifying the relationship between exposure and hazard. Ideally, it is the ratio of the dose associated with hazard and an estimate of exposure. For example, hazard may be characterized by a benchmark dose (BMD), and, for food contami...
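    The ratio described here can be written down directly. A minimal sketch; the values below are hypothetical, not drawn from any assessment:

```python
def margin_of_exposure(hazard_dose, exposure):
    """MoE: ratio of the dose associated with hazard (e.g. a benchmark
    dose, BMD) to an estimate of exposure, in the same units."""
    return hazard_dose / exposure

# Hypothetical values in mg/kg bw/day (not from any assessment):
moe = margin_of_exposure(hazard_dose=0.5, exposure=0.002)
print(moe)
```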

  6. Agriculture and food availability -- remote sensing of agriculture for food security monitoring in the developing world

    USGS Publications Warehouse

    Budde, Michael E.; Rowland, James; Funk, Christopher C.

    2010-01-01

    For one-sixth of the world’s population - roughly 1 billion children, women and men - growing, buying or receiving adequate, affordable food to eat is a daily uncertainty. The International Monetary Fund reports that food prices worldwide increased 43 percent in 2007-2008, and unpredictable growing conditions make subsistence farming, on which many depend, a risky business. Scientists with the U.S. Geological Survey (USGS) are part of a network of both private and government institutions that monitor food security in many of the poorest nations in the world.

  7. The first taste is always with the eyes: a meta-analysis on the neural correlates of processing visual food cues.

    PubMed

    van der Laan, L N; de Ridder, D T D; Viergever, M A; Smeets, P A M

    2011-03-01

    Food selection is primarily guided by the visual system. Multiple functional neuro-imaging studies have examined the brain responses to visual food stimuli. However, the results of these studies are heterogeneous and there still is uncertainty about the core brain regions involved in the neural processing of viewing food pictures. The aims of the present study were to determine the concurrence in the brain regions activated in response to viewing pictures of food and to assess the modulating effects of hunger state and the food's energy content. We performed three Activation Likelihood Estimation (ALE) meta-analyses on data from healthy normal weight subjects in which we examined: 1) the contrast between viewing food and nonfood pictures (17 studies, 189 foci), 2) the modulation by hunger state (five studies, 48 foci) and 3) the modulation by energy content (seven studies, 86 foci). The most concurrent brain regions activated in response to viewing food pictures, both in terms of ALE values and the number of contributing experiments, were the bilateral posterior fusiform gyrus, the left lateral orbitofrontal cortex (OFC) and the left middle insula. Hunger modulated the response to food pictures in the right amygdala and left lateral OFC, and energy content modulated the response in the hypothalamus/ventral striatum. Overall, the concurrence between studies was moderate: at best 41% of the experiments contributed to the clusters for the contrast between food and nonfood. Therefore, future research should further elucidate the separate effects of methodological and physiological factors on between-study variations. Copyright © 2010 Elsevier Inc. All rights reserved.

  8. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  9. Future Irrigation Requirement of Rice Under Irrigated Area - Uncertainty through GCMs and Crop Models : A Case Study of Indo-Gangetic Plains of India

    NASA Astrophysics Data System (ADS)

    Pillai, S. N.; Singh, H.; Ruane, A. C.; Boote, K. G.; Porter, C.; Rosenzweig, C.; Panwar, A. S.

    2017-12-01

    The Indo-Gangetic Plains (IGP), the food basket of South Asia, are characterised by predominantly cereal-based farming systems in which livestock is an integral part of the farm economy. Climate change is projected to have significant effects on agricultural production, and hence on food and livelihood security, because more than 90 per cent of farmers fall into the small and marginal category. Rising temperatures and uncertainties in rainfall associated with global warming may have serious direct and indirect impacts on crop production. Different researchers predict a loss of 10-40% of crop production across crops in India by the end of this century. Cereal crops (mainly rice and wheat) are crucial to ensuring food security in the region, but sustaining their productivity has become a major challenge due to climate variability and uncertainty. Under the AgMIP Project, we analysed the climate change impact on farm-level productivity of rice in Meerut District, Uttar Pradesh, using 29 GCMs under RCP4.5 and RCP8.5 for the mid-century period 2041-2070. Two crop simulation models, DSSAT4.6 and APSIM7.7, were used for the impact study. There is considerable uncertainty in yield levels across GCMs and crop models. Under RCP4.5, APSIM showed a yield decline of up to 14.5% while DSSAT showed a decline of only 6.5% compared to the baseline (1980-2010). Out of 29 GCMs, 15 showed a negative impact and 14 a positive impact under APSIM, versus 21 and 8 GCMs, respectively, under DSSAT. DSSAT and APSIM simulated future irrigation water requirements of 645±75 mm and 730±107 mm, respectively, under RCP4.5, and 626±99 mm and 749±147 mm, respectively, under RCP8.5. 
Projected irrigation water productivity ranged over 4.87-12.15 kg ha-1 mm-1 (APSIM) and 6.77-12.63 kg ha-1 mm-1 (DSSAT) under RCP4.5, compared with baseline averages of 7.81 and 8.53 kg ha-1 mm-1, respectively. Under RCP8.5 the ranges reduced to 4.22-10.64 and 6.37-12.56 kg ha-1 mm-1 for APSIM and DSSAT, respectively. This demonstrates the uncertainty in both the GCMs and the crop models for future projections. A multi-model approach with optimistic and pessimistic projections, rather than single-model projections, should be used for scenario analysis and policy planning.
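    The water productivity figures quoted above are simply yield per unit depth of irrigation water (kg ha-1 mm-1). A minimal sketch, combining a hypothetical yield with the DSSAT RCP4.5 mean irrigation requirement quoted in the abstract:

```python
def water_productivity(yield_kg_per_ha, irrigation_mm):
    """Irrigation water productivity in kg ha^-1 mm^-1:
    grain yield per unit depth of irrigation water applied."""
    return yield_kg_per_ha / irrigation_mm

# Hypothetical yield of 5000 kg/ha with the 645 mm DSSAT RCP4.5
# mean irrigation requirement from the abstract:
wp = water_productivity(5000.0, 645.0)
print(round(wp, 2))
```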

  10. Impact of communication on consumers' food choices.

    PubMed

    Verbeke, Wim

    2008-08-01

    Consumers' food choices and dietary behaviour can be markedly affected by communication and information. Whether the provided information is processed by the receiver, and thus becomes likely to be effective, depends on numerous factors. The role of selected determinants such as uncertainty, knowledge, involvement, health-related motives and trust, as well as message content variables, are discussed in the present paper based on previous empirical studies. The different studies indicate that: uncertainty about meat quality and safety does not automatically result in more active information search; subjective knowledge about fish is a better predictor of fish consumption than objective knowledge; high subjective knowledge about functional foods as a result of a low trusted information source such as mass media advertising leads to a lower probability of adopting these foods in the diet. Also, evidence of the stronger impact of negative news as compared with messages promoting positive outcomes of food choices is discussed. Finally, three audience-segmentation studies based on consumers' involvement with fresh meat, individuals' health-related-motive orientations and their use of and trust in fish information sources are presented. A clear message from these studies is that communication and information provision strategies targeted to a specific audience's needs, interests or motives stand a higher likelihood of being attended to and processed by the receiving audience, and therefore also stand a higher chance of yielding their envisaged impact in terms of food choice and dietary behaviour.

  11. So Many Brands and Varieties to Choose from: Does This Compromise the Control of Food Intake in Humans?

    PubMed

    Hardman, Charlotte A; Ferriday, Danielle; Kyle, Lesley; Rogers, Peter J; Brunstrom, Jeffrey M

    2015-01-01

    The recent rise in obesity is widely attributed to changes in the dietary environment (e.g., increased availability of energy-dense foods and larger portion sizes). However, a critical feature of our "obesogenic environment" may have been overlooked - the dramatic increase in "dietary variability" (the tendency for specific mass-produced foods to be available in numerous varieties that differ in energy content). In this study we tested the hypothesis that dietary variability compromises the control of food intake in humans. Specifically, we examined the effects of dietary variability in pepperoni pizza on two key outcome variables; i) compensation for calories in pepperoni pizza and ii) expectations about the satiating properties of pepperoni pizza (expected satiation). We reasoned that dietary variability might generate uncertainty about the postingestive effects of a food. An internet-based questionnaire was completed by 199 adults. This revealed substantial variation in exposure to different varieties of pepperoni pizza. In a follow-up study (n= 66; 65% female), high pizza variability was associated with i) poorer compensation for calories in pepperoni pizza and ii) lower expected satiation for pepperoni pizza. Furthermore, the effect of uncertainty on caloric compensation was moderated by individual differences in decision making (loss aversion). For the first time, these findings highlight a process by which dietary variability may compromise food-intake control in humans. This is important because it exposes a new feature of Western diets (processed foods in particular) that might contribute to overeating and obesity.

  12. Sampling design by the core-food approach for the Taiwan total diet study on veterinary drugs.

    PubMed

    Chen, Chien-Chih; Tsai, Ching-Lun; Chang, Chia-Chin; Ni, Shih-Pei; Chen, Yi-Tzu; Chiang, Chow-Feng

    2017-06-01

    The core-food (CF) approach, first adopted in the United States in the 1980s, has been widely used by many countries to assess exposure to dietary hazards at a population level. However, the reliability of exposure estimates (C × CR) depends critically on sampling methods that allow the detected chemical concentrations (C) of each CF to be matched with the corresponding consumption rates (CR) estimated from the surveyed intake data. To reduce the uncertainty of food matching, this study presents a sampling design scheme, namely the subsample method, for the 2016 Taiwan total diet study (TDS) on veterinary drugs. We first combined the four sets of national dietary recall data covering the entire age range (1-65+ years), and aggregated them into 307 CFs by their similarity in nutritional value and in manufacturing and cooking methods. The 40 CFs pertinent to veterinary drug residues were selected for this study, and 16 subsamples for each CF were designed by weighing their quantities in CR, product brands, and manufacturing, processing and cooking methods. The food matching rates calculated for each CF in this study were 84.3-97.3%, higher than those obtained in many previous studies using the representative food (RF) method (53.1-57.8%). The subsample method not only considers the variety of food processing and cooking methods, but also provides better food matching and reduces the uncertainty of exposure assessment.

  13. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance on preparing an uncertainty analysis of a dimensional inspection process through an uncertainty budget analysis. The uncertainty analysis follows the same methodology as the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used in a general and a specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
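    The Type A / Type B distinction above can be sketched in GUM terms: Type A components come from the statistics of repeated observations, Type B from assumed distributions (for example, rectangular limits taken from a calibration certificate), and the budget combines independent components in quadrature. All numbers below are illustrative, not from the paper:

```python
import math
import statistics

def type_a(readings):
    """Type A standard uncertainty: standard deviation of the mean
    of repeated observations."""
    return statistics.stdev(readings) / math.sqrt(len(readings))

def type_b_rectangular(half_width):
    """Type B standard uncertainty for a rectangular distribution
    of half-width a: u = a / sqrt(3)."""
    return half_width / math.sqrt(3)

def combined(components):
    """Combined standard uncertainty for independent components
    (root sum of squares)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical dimensional-inspection budget, all values in mm:
u_repeat = type_a([10.001, 10.003, 10.002, 10.004, 10.002])  # repeatability
u_cal = type_b_rectangular(0.002)   # limit from calibration certificate
u_temp = type_b_rectangular(0.001)  # thermal-expansion bound
u_c = combined([u_repeat, u_cal, u_temp])
print(f"U = {2 * u_c:.4f} mm (k=2)")
```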

  14. Food-borne disease and climate change in the United Kingdom.

    PubMed

    Lake, Iain R

    2017-12-05

    This review examined the likely impact of climate change upon food-borne disease in the UK, using Campylobacter and Salmonella as example organisms. Campylobacter is an important food-borne disease and an increasing public health threat. There is a reasonable evidence base that the environment and weather play a role in its transmission to humans. However, uncertainty as to the precise mechanisms through which weather affects disease makes it difficult to assess the likely impact of climate change. There are strong positive associations between Salmonella cases and ambient temperature, and a clear understanding of the mechanisms behind this. However, because the incidence of Salmonella disease is declining in the UK, any climate change increases are likely to be small. For both Salmonella and Campylobacter the disease incidence is greatest in older adults and young children. There are many pathways through which climate change may affect food, but only a few of these have been rigorously examined. This leaves a high degree of uncertainty as to what the impacts of climate change will be. Food is highly controlled at the national and EU level. This provides the UK with resilience to climate change as well as potential to adapt to its consequences, but it is unknown whether these are sufficient in the context of a changing climate.

  15. Application of Nitrogen and Carbon Stable Isotopes (δ15N and δ13C) to Quantify Food Chain Length and Trophic Structure

    PubMed Central

    Perkins, Matthew J.; McDonald, Robbie A.; van Veen, F. J. Frank; Kelly, Simon D.; Rees, Gareth; Bearhop, Stuart

    2014-01-01

    Increasingly, stable isotope ratios of nitrogen (δ15N) and carbon (δ13C) are used to quantify trophic structure, though relatively few studies have tested accuracy of isotopic structural measures. For laboratory-raised and wild-collected plant-invertebrate food chains spanning four trophic levels we estimated nitrogen range (NR) using δ15N, and carbon range (CR) using δ13C, which are used to quantify food chain length and breadth of trophic resources respectively. Across a range of known food chain lengths we examined how NR and CR changed within and between food chains. Our isotopic estimates of structure are robust because they were calculated using resampling procedures that propagate variance in sample means through to quantified uncertainty in final estimates. To identify origins of uncertainty in estimates of NR and CR, we additionally examined variation in discrimination (which is change in δ15N or δ13C from source to consumer) between trophic levels and among food chains. δ15N discrimination showed significant enrichment, while variation in enrichment was species and system specific, ranged broadly (1.4‰ to 3.3‰), and importantly, propagated variation to subsequent estimates of NR. However, NR proved robust to such variation and distinguished food chain length well, though some overlap between longer food chains infers a need for awareness of such limitations. δ13C discrimination was inconsistent; generally no change or small significant enrichment was observed. Consequently, estimates of CR changed little with increasing food chain length, showing the potential utility of δ13C as a tracer of energy pathways. This study serves as a robust test of isotopic quantification of food chain structure, and given global estimates of aquatic food chains approximate four trophic levels while many food chains include invertebrates, our use of four trophic level plant-invertebrate food chains makes our findings relevant for a majority of ecological systems. 
PMID:24676331
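    Nitrogen range (NR) as used above is the spread of mean δ15N across trophic levels, and the abstract's resampling procedure propagates variance in sample means into the final estimate. A sketch of that idea under assumed data; the δ15N values and the specific bootstrap details are hypothetical, not the authors' code:

```python
import random
import statistics

random.seed(1)

def nitrogen_range(groups, n_boot=2000):
    """Bootstrap estimate of NR: resample individuals within each trophic
    level, take the spread of group-mean d15N (max minus min), and report
    the mean and a 95% interval over replicates."""
    estimates = []
    for _ in range(n_boot):
        means = [statistics.mean(random.choices(g, k=len(g))) for g in groups]
        estimates.append(max(means) - min(means))
    estimates.sort()
    centre = statistics.mean(estimates)
    lo, hi = estimates[int(0.025 * n_boot)], estimates[int(0.975 * n_boot) - 1]
    return centre, (lo, hi)

# Hypothetical d15N values (permil) for a four-level plant-invertebrate chain:
chain = [[2.1, 2.4, 2.2], [4.9, 5.3, 5.1], [7.8, 8.2, 8.0], [10.6, 11.0, 10.8]]
nr, ci = nitrogen_range(chain)
print(f"NR = {nr:.2f} permil (95% interval {ci[0]:.2f}-{ci[1]:.2f})")
```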

  16. Multimedia models of human exposure to industrial emissions through food products: Identifying the critical pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKone, T.E.; Maddalena, R.L.; Dowdy, D.L.

    1995-12-31

    The magnitude of potential human exposure through foods can be, and has been, assessed through food-product samples. Reducing these exposures requires not only measuring the exposure-media (food) concentrations but also defining the processes that link various contaminant sources to food. The authors present here a general approach for quantifying human exposure to contaminants in home-grown foods using four factors: ambient environmental concentrations in air and soil; bioconcentration factors between air, soil, leaves, and roots; translocation from leaves and roots to edible plant parts; and human activity patterns associated with consumption of home-grown food products. The authors combined these factors in a general model and used sensitivity analyses to assess the important contributions to the imprecision of exposure estimates. Despite large uncertainties about human activities, they found the major source of uncertainty (variance) in exposure estimates is attributable to imprecision in quantifying bioconcentration. This is especially evident for persistent and fat-soluble compounds such as PCBs, PAHs, dioxins, and furans. The evaluation of analytical and theoretical methods used to characterize bioconcentration factors reveals that molecular connectivity indices (MCIs) based on molecular structure are better predictors of bioconcentration in plants than the solubility factors currently in use. The authors describe ongoing laboratory experiments in exposure chambers with garden foods such as leafy vegetables and root crops, and assess the ability of these methods to better define chemical mass transfers among the components of the home-garden system: air, soil organic phases, soil solution, plant roots, and plant leaves.
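    The four factors can be chained multiplicatively into a screening-level dose estimate. The structure below is a sketch of that chaining only; the factor names and values are illustrative placeholders, not the authors' parameterisation:

```python
def dose_via_homegrown_food(c_air, bcf_air_plant, transloc_to_edible,
                            intake_kg_per_day, body_weight_kg):
    """Chain the four factors into a screening-level daily dose (mg/kg-day):
    ambient air concentration -> plant tissue (bioconcentration) ->
    edible part (translocation) -> ingested dose (activity pattern)."""
    c_edible = c_air * bcf_air_plant * transloc_to_edible  # mg/kg food
    return c_edible * intake_kg_per_day / body_weight_kg

# Hypothetical values:
dose = dose_via_homegrown_food(c_air=0.01, bcf_air_plant=50.0,
                               transloc_to_edible=0.2,
                               intake_kg_per_day=0.3, body_weight_kg=70.0)
print(f"{dose:.2e} mg/kg-day")
```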

  17. The simulation of the LANFOS-H food radiation contamination detector using Geant4 package

    NASA Astrophysics Data System (ADS)

    Piotrowski, Lech Wiktor; Casolino, Marco; Ebisuzaki, Toshikazu; Higashide, Kazuhiro

    2015-02-01

    The recent incident at the Fukushima power plant caused growing concern about radiation contamination and resulted in the Japanese limit for the permitted amount of 137Cs in food being lowered to 100 Bq/kg. To increase safety and ease this concern we are developing LANFOS (Large Food Non-destructive Area Sampler), a compact, easy-to-use detector for assessing radiation in food. The LANFOS-H described in this paper has 4π coverage to assess the amount of 137Cs present, separating it from possible 40K contamination of the food. Food samples therefore do not have to be pre-processed prior to a test and can be consumed after measurement. It is designed for use by non-professionals in homes and small institutions such as schools, indicating the safety of samples, but can also be used by specialists to provide a radiation spectrum. Proper assessment of radiation in food in the apparatus requires estimation of the γ conversion factor of the detectors: how many γ photons will produce a signal. In this paper we present Monte Carlo estimates of this factor for various approximated shapes of fish, vegetables and amounts of rice, performed with the Geant4 package. We find that the conversion factor combined from all the detectors is similar for all food types, at around 37%, varying by at most 5% with sample length, much less than for individual detectors. Different inclinations and positions of samples in the detector introduce an uncertainty of 1.4%. This small uncertainty validates the concept of a 4π non-destructive apparatus.
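    At its simplest, a Monte Carlo estimate of a conversion factor is the fraction of simulated photons that produce a signal, with a binomial uncertainty on that fraction. A toy sketch; the flat p_detect below is a stand-in for the full Geant4 geometry and physics simulation, and the numbers are illustrative:

```python
import random

random.seed(42)

def conversion_factor(n_photons=100_000, p_detect=0.37):
    """Toy Monte Carlo: fraction of simulated gamma photons producing a
    signal, plus the binomial standard uncertainty on that fraction."""
    hits = sum(random.random() < p_detect for _ in range(n_photons))
    eff = hits / n_photons
    sigma = (eff * (1.0 - eff) / n_photons) ** 0.5
    return eff, sigma

eff, sigma = conversion_factor()
print(f"conversion factor = {eff:.3f} +/- {sigma:.3f}")
```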

  18. 'Traffic-light' nutrition labelling and 'junk-food' tax: a modelled comparison of cost-effectiveness for obesity prevention.

    PubMed

    Sacks, G; Veerman, J L; Moodie, M; Swinburn, B

    2011-07-01

    Cost-effectiveness analyses are important tools in efforts to prioritise interventions for obesity prevention. Modelling facilitates evaluation of multiple scenarios with varying assumptions. This study compares the cost-effectiveness of conservative scenarios for two commonly proposed policy-based interventions: front-of-pack 'traffic-light' nutrition labelling (traffic-light labelling) and a tax on unhealthy foods ('junk-food' tax). For traffic-light labelling, estimates of changes in energy intake were based on an assumed 10% shift in consumption towards healthier options in four food categories (breakfast cereals, pastries, sausages and preprepared meals) in 10% of adults. For the 'junk-food' tax, price elasticities were used to estimate a change in energy intake in response to a 10% price increase in seven food categories (including soft drinks, confectionery and snack foods). Changes in population weight and body mass index by sex were then estimated based on these changes in population energy intake, along with subsequent impacts on disability-adjusted life years (DALYs). Associated resource use was measured and costed using pathway analysis, based on a health sector perspective (with some industry costs included). Costs and health outcomes were discounted at 3%. The cost-effectiveness of each intervention was modelled for the 2003 Australian adult population. Both interventions resulted in reduced mean weight (traffic-light labelling: 1.3 kg (95% uncertainty interval (UI): 1.2; 1.4); 'junk-food' tax: 1.6 kg (95% UI: 1.5; 1.7)); and DALYs averted (traffic-light labelling: 45,100 (95% UI: 37,700; 60,100); 'junk-food' tax: 559,000 (95% UI: 459,500; 676,000)). Cost outlays were AUD81 million (95% UI: 44.7; 108.0) for traffic-light labelling and AUD18 million (95% UI: 14.4; 21.6) for 'junk-food' tax. Cost-effectiveness analysis showed both interventions were 'dominant' (effective and cost-saving). 
Policy-based population-wide interventions such as traffic-light nutrition labelling and taxes on unhealthy foods are likely to offer excellent 'value for money' as obesity prevention measures.

  19. [Trends in the utilization of food additives].

    PubMed

    Szűcs, Viktória; Bánáti, Diána

    2013-11-17

    Frequent media reports on food additives have weakened consumers' trust in food producers and food control authorities alike. Consumers' uncertainty is also raised by the fact that they obtain their information from inadequate, untrustworthy sources, and they may therefore avoid the consumption of certain foodstuffs. Food producers may react by replacing artificial components with natural ones and by emphasizing the favourable characteristics of their products. The authors describe the main trends and efforts related to food additives. On the basis of this overview it can be concluded that - besides taking consumers' needs into consideration - product development and research directions are promising. Food producers' efforts may help to restore consumer confidence and trust and support informed choice.

  20. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
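The family-of-CDFs structure described above arises from a nested sampling loop: an outer loop over epistemic realizations of poorly known fixed quantities and an inner loop over aleatory variability. A minimal sketch, with purely illustrative placeholder distributions:

```python
import random

# Outer loop: epistemic realizations of a fixed-but-unknown parameter.
# Inner loop: aleatory variability given that parameter value.
# The result is one empirical CDF per epistemic realization; the spread
# of the family displays the epistemic uncertainty.

random.seed(1)

def empirical_cdf(samples, x):
    """Fraction of samples at or below x."""
    return sum(s <= x for s in samples) / len(samples)

family = []
for _ in range(5):                      # 5 plausible epistemic values
    mu = random.uniform(0.8, 1.2)       # poorly known fixed quantity
    inner = [random.gauss(mu, 0.1) for _ in range(200)]  # aleatory draws
    family.append(inner)

# Evaluate each CDF in the family at a common point.
values_at_1 = [empirical_cdf(s, 1.0) for s in family]
print(len(family), min(values_at_1), max(values_at_1))
```

Plotting all members of `family` as CDF curves on one axis gives the graphical format the abstract refers to.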

  1. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis, but uncertainty analysis, a more recent development, gives greater insight into measurement processes and into test, experiment, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
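The interval about a measured value that the abstract mentions is conventionally formed by combining systematic (bias) and random standard uncertainties in quadrature and expanding by a coverage factor. A minimal sketch with hypothetical numbers (not from any actual PV test):

```python
import math

# Combine bias and random standard uncertainties in quadrature,
# then expand by coverage factor k to get an interval half-width.

def expanded_uncertainty(b, s, k=2.0):
    """Root-sum-square of bias (b) and random (s) components, times k."""
    return k * math.sqrt(b**2 + s**2)

measured_power_w = 250.0                  # hypothetical module measurement
u = expanded_uncertainty(b=1.5, s=2.0)    # hypothetical components, in watts
interval = (measured_power_w - u, measured_power_w + u)
print(round(u, 2), interval)  # 5.0 (245.0, 255.0)
```

We believe the true value lies within `interval` at the confidence level implied by the coverage factor.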

  2. When good news leads to bad choices.

    PubMed

    McDevitt, Margaret A; Dunn, Roger M; Spetch, Marcia L; Ludvig, Elliot A

    2016-01-01

Pigeons and other animals sometimes deviate from optimal choice behavior when given informative signals for delayed outcomes. For example, when pigeons are given a choice between an alternative that always leads to food after a delay and an alternative that leads to food only half of the time after a delay, preference changes dramatically depending on whether the stimuli during the delays are correlated with (signal) the outcomes or not. With signaled outcomes, pigeons show a much greater preference for the suboptimal alternative than with unsignaled outcomes. Key variables and research findings related to this phenomenon are reviewed, including the effects of the durations of the choice and delay periods, the probability of reinforcement, and gaps in the signal. We interpret the available evidence as reflecting a preference induced by signals for good news in a context of uncertainty. Other explanations are briefly summarized and compared. © 2016 Society for the Experimental Analysis of Behavior.

  3. Invasive alien species in the food chain: Advancing risk assessment models to address climate change, economics and uncertainty

    Treesearch

    Darren J. Kriticos; Robert C. Venette; Richard H.A. Baker; Sarah Brunel; Frank H. Koch; Trond Rafoss; Wopke van der Werf; Susan P. Worner

    2013-01-01

    Economic globalization depends on the movement of people and goods between countries. As these exchanges increase, so does the potential for translocation of harmful pests, weeds, and pathogens capable of impacting our crops, livestock and natural resources (Hulme 2009), with concomitant impacts on global food security (Cook et al. 2011).

  4. Cost-effectiveness of a complex workplace dietary intervention: an economic evaluation of the Food Choice at Work study

    PubMed Central

    Fitzgerald, Sarah; Murphy, Aileen; Kirby, Ann; Geaney, Fiona; Perry, Ivan J

    2018-01-01

    Objective To evaluate the costs, benefits and cost-effectiveness of complex workplace dietary interventions, involving nutrition education and system-level dietary modification, from the perspective of healthcare providers and employers. Design Single-study economic evaluation of a cluster-controlled trial (Food Choice at Work (FCW) study) with 1-year follow-up. Setting Four multinational manufacturing workplaces in Cork, Ireland. Participants 517 randomly selected employees (18–65 years) from four workplaces. Interventions Cost data were obtained from the FCW study. Nutrition education included individual nutrition consultations, nutrition information (traffic light menu labelling, posters, leaflets and emails) and presentations. System-level dietary modification included menu modification (restriction of fat, sugar and salt), increase in fibre, fruit discounts, strategic positioning of healthier alternatives and portion size control. The combined intervention included nutrition education and system-level dietary modification. No intervention was implemented in the control. Outcomes The primary outcome was an improvement in health-related quality of life, measured using the EuroQoL 5 Dimensions 5 Levels questionnaire. The secondary outcome measure was reduction in absenteeism, which is measured in monetary amounts. Probabilistic sensitivity analysis (Monte Carlo simulation) assessed parameter uncertainty. Results The system-level intervention dominated the education and combined interventions. When compared with the control, the incremental cost-effectiveness ratio (€101.37/quality-adjusted life-year) is less than the nationally accepted ceiling ratio, so the system-level intervention can be considered cost-effective. The cost-effectiveness acceptability curve indicates there is some decision uncertainty surrounding this, arising from uncertainty surrounding the differences in effectiveness. 
These results are reiterated when the secondary outcome measure is considered in a cost–benefit analysis, whereby the system-level intervention yields the highest net benefit (€56.56 per employee). Conclusions System-level dietary modification alone offers the best value for improving employee health-related quality of life and generating net benefit for employers by reducing absenteeism. While system-level dietary modification strategies are potentially sustainable obesity prevention interventions, future research should include long-term outcomes to determine whether improvements in outcomes persist. Trial registration number ISRCTN35108237; Post-results. PMID:29502090

  5. Two essays on environmental and food security

    NASA Astrophysics Data System (ADS)

    Jeanty, Pierre Wilner

The first essay of this dissertation, "estimating non-market economic benefits of using biodiesel fuel: a stochastic double bounded approach", is an attempt to incorporate uncertainty into double bounded dichotomous choice contingent valuation. The double bounded approach, which entails asking respondents a follow-up question after they have answered a first question, has emerged as a means to increase efficiency in willingness to pay (WTP) estimates. However, several studies have found inconsistency between WTP estimates generated by the first and second questions. In this study, it is posited that this inconsistency is due to uncertainty facing the respondents when the second question is introduced. The author seeks to understand whether using a follow-up question in a stochastic format, which allows respondents to express uncertainty, would alleviate the inconsistency problem. In a contingent valuation survey to estimate non-market economic benefits of using more biodiesel vs. petroleum diesel fuel in an airshed encompassing South Eastern and Central Ohio, it is found that the gap between WTP estimates produced by the first and the second questions narrows when respondents are allowed to express uncertainty. The proposed stochastic follow-up approach yields more efficient WTP estimates than the conventional follow-up approach while maintaining the efficiency gain over the single bounded model. From a methodological standpoint, this study is distinguished from previous research by being the first to implement a double bounded contingent valuation survey with a stochastic follow-up question. In the second essay, "analyzing the effects of civil wars and violent conflicts on food security in developing countries: an instrumental variable panel data approach", instrumental variable panel data techniques are applied to estimate the effects of civil wars and violent conflicts on food security in a sample of 73 developing countries from 1970 to 2002.
The number of hungry people in developing countries has risen sharply over the past several years, and civil wars and violent conflicts have been associated with food insecurity. The study aims to provide empirical evidence as to whether the manifest increase in the number of hungry can be ascribed to civil unrest. From a statistical standpoint, the results convincingly pinpoint the danger of using conventional panel data estimators when endogeneity is of the conventional simultaneous equation type, i.e. with respect to the idiosyncratic error term. From a policy viewpoint, it is found that, in general, civil wars and conflicts are detrimental to food security. However, countries unable to make available to their citizens the minimum dietary energy requirements, below which a country qualifies for food aid, are more vulnerable. Policies aimed at curbing food insecurity in developing countries need to take this difference into account.

  6. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

Sensitivity analysis has been an important tool in groundwater modeling for identifying influential parameters. Among the various sensitivity analysis methods, variance-based global sensitivity analysis has gained popularity for its model independence and its capability of providing accurate sensitivity measurements. However, the conventional variance-based method considers only the uncertainty contributions of individual model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework that allows flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model, and parametric. Furthermore, each layer of uncertainty source can contain multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network, with different uncertainty components represented as uncertain nodes in the network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility in the grouping strategies used for uncertainty components. Variance-based sensitivity analysis is thus extended to investigate the importance of a wider range of uncertainty sources: scenario, model, and other combinations of uncertainty components that represent key model system processes (e.g., the groundwater recharge process or the flow and reactive transport process). For test and demonstration purposes, the developed methodology was applied to a real-world groundwater reactive transport modeling case with various uncertainty sources. The results demonstrate that the new sensitivity analysis method estimates accurate importance measurements for uncertainty sources formed by different combinations of uncertainty components.
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
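The variance-based index underlying this family of methods is the first-order Sobol index, S_i = Var(E[Y | X_i]) / Var(Y). A minimal brute-force sketch on a toy two-input model (not the paper's groundwater model, and far less efficient than production estimators):

```python
import random
from statistics import mean, pvariance

# Toy model where input x1 dominates the output variance.
def model(x1, x2):
    return 4.0 * x1 + 0.5 * x2

random.seed(0)

def first_order_index(which, n_outer=200, n_inner=200):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y) by brute-force conditioning."""
    conditional_means = []
    for _ in range(n_outer):
        fixed = random.random()              # condition on one input
        ys = []
        for _ in range(n_inner):
            free = random.random()           # sample the other input
            ys.append(model(fixed, free) if which == 1 else model(free, fixed))
        conditional_means.append(mean(ys))
    total = [model(random.random(), random.random())
             for _ in range(n_outer * n_inner)]
    return pvariance(conditional_means) / pvariance(total)

s1, s2 = first_order_index(1), first_order_index(2)
print(s1 > s2)  # the dominant input gets the larger index
```

Grouping inputs (e.g., all parameters belonging to one process, or one scenario layer) and conditioning on the group instead of a single input gives the kind of flexible component-level importance measure the abstract describes.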

  7. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

While considerable advances have been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
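The PCA summarization idea can be sketched as follows: represent a sample of plausible calibration curves by their mean plus a few principal components, so new plausible curves can be drawn cheaply during fitting. The synthetic "calibration files" below are placeholders, not Chandra effective areas:

```python
import numpy as np

rng = np.random.default_rng(42)
n_files, n_bins = 100, 50

# Synthetic sample of plausible calibration curves: a base curve plus a
# smooth random perturbation (a stand-in for a calibration-file ensemble).
base = np.linspace(1.0, 2.0, n_bins)
shape = np.sin(np.linspace(0.0, np.pi, n_bins))
files = base + 0.05 * rng.standard_normal((n_files, 1)) * shape

# PCA via SVD of the mean-centered ensemble.
mean_curve = files.mean(axis=0)
centered = files - mean_curve
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

k = 1                                   # keep the leading component(s)
explained = float((s[:k] ** 2).sum() / (s ** 2).sum())

# Draw a new plausible curve: mean plus random coefficients on the
# retained components, scaled to the ensemble spread.
coeff = rng.standard_normal(k) * s[:k] / np.sqrt(n_files)
new_curve = mean_curve + coeff @ Vt[:k]
print(round(explained, 3), new_curve.shape)
```

In a fitting loop, drawing `new_curve` at each iteration propagates the calibration uncertainty without storing or re-reading the full file ensemble.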

  8. Migration as a turning point in food habits: the early phase of dietary acculturation among women from South Asian, African, and Middle Eastern Countries living in Norway.

    PubMed

    Terragni, Laura; Garnweidner, Lisa M; Pettersen, Kjell Sverre; Mosdøl, Annhild

    2014-01-01

    This article explores the early phase of dietary acculturation after migration. South Asian, African and Middle Eastern women (N = 21) living in Norway were interviewed about their early experiences with food in a new context. The findings pointed to abrupt changes in food habits in the first period after migration. To various degrees, women reported unfamiliarity with foods in shops, uncertainty about meal formats and food preparation and fear of eating food prohibited by their religion. Their food consumption tended to be restricted to food items perceived as familiar or safe. Our findings indicate that the first period after migration represents a specific phase in the process of dietary acculturation. Early initiatives aimed at enhancing confidence in food and familiarity with the new food culture are recommended.

  9. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses its specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  10. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates to be used in the development of the new model. It also presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations. A new methodology for assessing the uncertainty associated with linear regressions is also presented.

  11. Estimating the burden of disease attributable to iron deficiency anaemia in South Africa in 2000.

    PubMed

    Nojilana, Beatrice; Norman, Rosana; Dhansay, Muhammad A; Labadarios, Demetre; van Stuijvenberg, Martha E; Bradshaw, Debbie

    2007-08-01

To estimate the extent of iron deficiency anaemia (IDA) among children aged 0 - 4 years and pregnant women aged 15 - 49 years, and the burden of disease attributed to IDA in South Africa in 2000. The comparative risk assessment (CRA) methodology of the World Health Organization (WHO) was followed using local prevalence and burden estimates. IDA prevalence came from re-analysis of the South African Vitamin A Consultative Group study in the case of the children, and from a pooled estimate from several studies in the case of the pregnant women (haemoglobin level < 11 g/dl and ferritin level < 12 microg/l). Monte Carlo simulation modelling was used for the uncertainty analysis. South Africa. Children under 5 years and pregnant women 15 - 49 years. Direct sequelae of IDA, maternal and perinatal deaths and disability-adjusted life years (DALYs) from mild mental disability related to IDA. It is estimated that 5.1% of children and 9 - 12% of pregnant women had IDA and that about 7.3% of perinatal deaths and 4.9% of maternal deaths were attributed to IDA in 2000. Overall, about 174,976 (95% uncertainty interval 150,344 - 203,961) healthy years of life lost (YLLs), or between 0.9% and 1.3% of all DALYs in South Africa in 2000, were attributable to IDA. This first study in South Africa to quantify the burden from IDA suggests that it is a less serious public health problem in South Africa than in many other developing countries. Nevertheless, this burden is preventable, and the study highlights the need to disseminate the food-based dietary guidelines formulated by the National Department of Health to people who need them and to monitor the impact of the food fortification programme.
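The Monte Carlo uncertainty analysis used in comparative risk assessment can be sketched by propagating uncertain inputs through the population attributable fraction, PAF = p(RR - 1) / (p(RR - 1) + 1). The input distributions below are illustrative stand-ins, not the study's actual parameters:

```python
import random

# Propagate uncertainty in prevalence (p) and relative risk (RR)
# through the population attributable fraction.

random.seed(7)

def paf(p, rr):
    return p * (rr - 1.0) / (p * (rr - 1.0) + 1.0)

draws = []
for _ in range(10_000):
    p = random.gauss(0.051, 0.005)   # hypothetical IDA prevalence
    rr = random.gauss(1.8, 0.15)     # hypothetical relative risk
    draws.append(paf(p, rr))

draws.sort()
lo = draws[int(0.025 * len(draws))]
hi = draws[int(0.975 * len(draws))]
print(round(lo, 3), round(hi, 3))    # 95% uncertainty interval for the PAF
```

Multiplying each PAF draw by the total burden (deaths or DALYs) yields the attributable-burden distribution from which intervals like the one quoted above are read off.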

  12. Do healthier foods and diet patterns cost more than less healthy options? A systematic review and meta-analysis

    PubMed Central

    Rao, Mayuree; Afshin, Ashkan; Singh, Gitanjali; Mozaffarian, Dariush

    2013-01-01

    Objective To conduct a systematic review and meta-analysis of prices of healthier versus less healthy foods/diet patterns while accounting for key sources of heterogeneity. Data sources MEDLINE (2000–2011), supplemented with expert consultations and hand reviews of reference lists and related citations. Design Studies reviewed independently and in duplicate were included if reporting mean retail price of foods or diet patterns stratified by healthfulness. We extracted, in duplicate, mean prices and their uncertainties of healthier and less healthy foods/diet patterns and rated the intensity of health differences for each comparison (range 1–10). Prices were adjusted for inflation and the World Bank purchasing power parity, and standardised to the international dollar (defined as US$1) in 2011. Using random effects models, we quantified price differences of healthier versus less healthy options for specific food types, diet patterns and units of price (serving, day and calorie). Statistical heterogeneity was quantified using I2 statistics. Results 27 studies from 10 countries met the inclusion criteria. Among food groups, meats/protein had largest price differences: healthier options cost $0.29/serving (95% CI $0.19 to $0.40) and $0.47/200 kcal ($0.42 to $0.53) more than less healthy options. Price differences per serving for healthier versus less healthy foods were smaller among grains ($0.03), dairy (−$0.004), snacks/sweets ($0.12) and fats/oils ($0.02; p<0.05 each) and not significant for soda/juice ($0.11, p=0.64). Comparing extremes (top vs bottom quantile) of food-based diet patterns, healthier diets cost $1.48/day ($1.01 to $1.95) and $1.54/2000 kcal ($1.15 to $1.94) more. Comparing nutrient-based patterns, price per day was not significantly different (top vs bottom quantile: $0.04; p=0.916), whereas price per 2000 kcal was $1.56 ($0.61 to $2.51) more. Adjustment for intensity of differences in healthfulness yielded similar results. 
Conclusions This meta-analysis provides the best evidence to date of price differences between healthier and less healthy foods/diet patterns, highlighting the challenges and opportunities for reducing financial barriers to healthy eating. PMID:24309174
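The random-effects pooling used above can be sketched with the DerSimonian-Laird estimator; the three "studies" below are made-up (mean price difference in $/serving, standard error), chosen only to exercise the arithmetic:

```python
import math

# DerSimonian-Laird random-effects meta-analysis on hypothetical data:
# (mean difference, standard error) per study.
studies = [(0.25, 0.06), (0.35, 0.08), (0.28, 0.05)]

def dersimonian_laird(data):
    w = [1.0 / se**2 for _, se in data]                       # fixed-effect weights
    fixed = sum(wi * y for wi, (y, _) in zip(w, data)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, (y, _) in zip(w, data))
    df = len(data) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                             # between-study variance
    w_star = [1.0 / (se**2 + tau2) for _, se in data]         # random-effects weights
    pooled = sum(wi * y for wi, (y, _) in zip(w_star, data)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    i2 = max(0.0, (q - df) / q) if q > 0 else 0.0             # heterogeneity (I^2)
    return pooled, se_pooled, i2

pooled, se, i2 = dersimonian_laird(studies)
print(round(pooled, 3), round(se, 3), round(i2, 3))
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau-squared is truncated to zero and the estimate collapses to the fixed-effect pooled mean, as it does for these toy inputs.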

  13. Shell egg handling and preparation practices in food service establishments in Finland.

    PubMed

    Lievonen, S; Ranta, J; Maijala, R

    2007-10-01

    Foodborne outbreaks are often reported to be acquired at food service establishments. As a part of a quantitative risk assessment on the consumer risk of contracting Salmonella infection via shell eggs, we studied how small, medium, and large restaurants, institutional kitchens, and staff canteens (n=171) purchase, store, and use shell eggs. In addition, we estimated the fraction of raw and undercooked risky egg dishes among all egg dishes served in food service establishments of different sizes and types. The majority of establishments used shell eggs (78%), purchased eggs once per week (39%), and stored eggs at cool temperatures (82%). The size of the food service establishment had a less significant effect on shell egg preparation and handling practices than the type of the establishment. In particular, restaurants and institutional kitchens differed from each other. Restaurants purchased shell eggs more frequently, were more likely to store them at room temperature, stored shell eggs for a shorter period, and were more likely to prepare undercooked egg dishes than institutional kitchens. It was predicted that 6 to 20% of all different egg dishes prepared in a single randomly chosen food service establishment would be risky egg dishes with a 95% Bayesian credible interval of 0 to 96%, showing uncertainty because of the variability between kitchens and uncertainty in kitchen type-specific parameters. The results indicate that although most Finnish food service establishments had safe egg handling practices, a substantial minority expressed risky behavior. Compared with the egg consumption patterns in private Finnish households, however, practices in food service establishments did not prove to be more prone to risk.

  14. So Many Brands and Varieties to Choose from: Does This Compromise the Control of Food Intake in Humans?

    PubMed Central

    Hardman, Charlotte A.; Ferriday, Danielle; Kyle, Lesley; Rogers, Peter J.; Brunstrom, Jeffrey M.

    2015-01-01

    The recent rise in obesity is widely attributed to changes in the dietary environment (e.g., increased availability of energy-dense foods and larger portion sizes). However, a critical feature of our “obesogenic environment” may have been overlooked - the dramatic increase in “dietary variability” (the tendency for specific mass-produced foods to be available in numerous varieties that differ in energy content). In this study we tested the hypothesis that dietary variability compromises the control of food intake in humans. Specifically, we examined the effects of dietary variability in pepperoni pizza on two key outcome variables; i) compensation for calories in pepperoni pizza and ii) expectations about the satiating properties of pepperoni pizza (expected satiation). We reasoned that dietary variability might generate uncertainty about the postingestive effects of a food. An internet-based questionnaire was completed by 199 adults. This revealed substantial variation in exposure to different varieties of pepperoni pizza. In a follow-up study (n= 66; 65% female), high pizza variability was associated with i) poorer compensation for calories in pepperoni pizza and ii) lower expected satiation for pepperoni pizza. Furthermore, the effect of uncertainty on caloric compensation was moderated by individual differences in decision making (loss aversion). For the first time, these findings highlight a process by which dietary variability may compromise food-intake control in humans. This is important because it exposes a new feature of Western diets (processed foods in particular) that might contribute to overeating and obesity. PMID:25923118

  15. Risk assessment of dietary exposure to phytosterol oxidation products from baked food in China.

    PubMed

    Hu, Yinzhou; Wang, Mengmeng; Huang, Weisu; Yang, Guoliang; Lou, Tiantian; Lai, Shiyun; Lu, Baiyi; Zheng, Lufei

    2018-02-01

Phytosterols are nutritional phytochemicals that may undergo oxidation and be transformed into phytosterol oxidation products (POPs), thus inducing pathological and toxic effects. This work investigated four main phytosterols and 28 POPs in 104 kinds of commercial baked food by using GC-MS. The dietary exposure and hazard index (HI) values associated with POPs from baked food consumption in China were estimated by using Monte Carlo simulation. Concentrations of the total phytosterols were between 3.39 and 209.80 μg/g. The total concentrations of POPs, including 5α,6α/5β,6β-epoxysterols, 7-ketosterol, 7α/7β-hydroxysterols, 6-hydroxysterols, and triols, ranged from 0.37 to 27.81 μg/g. The median dietary exposures to POPs from baked food for four age groups in China were 10.91 (children), 6.20 (adolescents), 3.63 (adults), and 3.40 (seniors) mg/(kg×day). Risk assessment of the median HI with respect to POPs indicated no risk (HI < 1) for adolescents, adults, and seniors in rural areas of China, whereas a risk (1 < HI < 10) was indicated for baked food consumption among urban residents and among children in rural areas. Sensitivity and uncertainty analysis showed that the most significant variables for each age group in China were POP concentration, body weight, and ingestion rate.
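The hazard-index screening used in this kind of risk assessment sums the ratio of estimated daily intake to reference dose over the compounds considered, with HI < 1 read as no appreciable risk. All intakes and reference doses below are hypothetical illustrations, not the study's values:

```python
# HI = sum over compounds of (estimated daily intake / reference dose).

def hazard_index(intakes_mg_per_kg_day, reference_doses_mg_per_kg_day):
    return sum(i / rfd for i, rfd in
               zip(intakes_mg_per_kg_day, reference_doses_mg_per_kg_day))

# Hypothetical intakes and reference doses for three POP classes.
intakes = [0.002, 0.001, 0.0005]
rfds = [0.01, 0.02, 0.005]
hi = hazard_index(intakes, rfds)
print(round(hi, 2), "no appreciable risk" if hi < 1.0 else "potential risk")
```

In the study's Monte Carlo setting, intakes themselves are sampled (concentration × ingestion rate / body weight), so HI becomes a distribution rather than a single number.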

  16. Challenges and Alternatives to Sustainable Management of Agriculture and Pastoral Ecosystems in Asian Drylands

    NASA Astrophysics Data System (ADS)

    Qi, J.

    2015-12-01

There is no question that humans must produce 70% more food to feed an additional 2.2 billion people on the planet by 2050; the question is where to grow it. The demand for additional food stems not only from the basic resources needed to sustain a healthy lifestyle, but also from changing diets, especially in rapidly developing countries in the world's dryland regions. It is forecast that demand for meat will require an additional 0.2 billion tons per year by 2050, almost a doubling of present meat consumption. These new demands create mounting pressure on agricultural and pastoral ecosystems, and the projected trajectory of a warmer and drier future climate increases uncertainty in food security, adding further stress to the already stressed nations of the Asian dryland belt. Different approaches are being proposed or practiced in the region, but the question is whether current practices are sustainable or optimal for addressing the emerging issues. Given the complexity and interplay among food, water, and energy, what are the alternatives for ensuring a sustainable trajectory of regional development that meets the new food demand? This presentation reviews existing practices and proposes alternative solutions, specifically by examining the trade-offs between the different ecosystem services that drylands in Asia may provide. Preliminary analysis suggests that the current trajectory of meat and milk production is likely not on a sustainable pathway.

  17. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.

  18. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  19. Lost water and nitrogen resources due to EU consumer food waste

    NASA Astrophysics Data System (ADS)

    Vanham, D.; Bouraoui, F.; Leip, A.; Grizzetti, B.; Bidoglio, G.

    2015-08-01

    The European Parliament recently called for urgent measures to halve food waste in the EU, where consumers are responsible for a major part of total waste along the food supply chain. Due to a lack of national food waste statistics, uncertainty in (consumer) waste quantities, and in the associated quantities of natural resources, is very high, and it has never previously been assessed in studies for the EU. Here we quantify: (1) EU consumer food waste, and (2) the natural resources required for its production, in terms of water and nitrogen, as well as estimating the uncertainty of these values. Total EU consumer food waste averages 123 (min 55-max 190) kg/capita annually (kg/cap/yr), i.e. 16% (min 7-max 24%) of all food reaching consumers. Almost 80%, i.e. 97 (min 45-max 153) kg/cap/yr, is avoidable food waste, which is edible food not consumed. We have calculated the water and nitrogen (N) resources associated with avoidable food waste. The associated blue water footprint (WF) (the consumption of surface and groundwater resources) averages 27 litres per capita per day (min 13-max 40 l/cap/d), which slightly exceeds the total blue consumptive EU municipal water use. The associated green WF (consumptive rainwater use) is 294 (min 127-max 449) l/cap/d, equivalent to the total green consumptive water use for crop production in Spain. The N contained in avoidable food waste averages 0.68 (min 0.29-max 1.08) kg/cap/yr. The food production N footprint (any remaining N used in the food production process) averages 2.74 (min 1.02-max 4.65) kg/cap/yr, equivalent to the mineral fertiliser use of the UK and Germany combined. Among all the food product groups wasted, meat accounts for the highest amounts of water and N resources, followed by wasted cereals. The results of this study provide essential insights and information on sustainable consumption and resource efficiency for both EU policies and EU consumers.

  20. Understanding consumption-related sucralose emissions - A conceptual approach combining substance-flow analysis with sampling analysis.

    PubMed

    Neset, Tina-Simone Schmid; Singer, Heinz; Longrée, Philipp; Bader, Hans-Peter; Scheidegger, Ruth; Wittmer, Anita; Andersson, Jafet Clas Martin

    2010-07-15

    This paper explores the potential of combining substance-flow modelling with water and wastewater sampling to trace consumption-related substances emitted through urban wastewater. The method is exemplified using sucralose, a chemical sweetener that is 600 times sweeter than sucrose and has been on the European market since 2004. As a food additive, sucralose has recently increased in usage in a number of foods, such as soft drinks, dairy products, candy and several dietary products. In a field campaign, sucralose concentrations were measured in the inflow and outflow of the local wastewater treatment plant in Linköping, Sweden, as well as upstream and downstream in the receiving stream and in Lake Roxen. This allowed the loads emitted from the city to be estimated. A method consisting of solid-phase extraction followed by liquid chromatography and high-resolution mass spectrometry was used to quantify the sucralose in the collected surface and wastewater samples. To identify and quantify the sucralose sources, a consumption analysis of households and small business enterprises was conducted, as well as an estimation of the emissions from the local food industry. The application of a simple model including uncertainty and sensitivity analysis indicates that at present not one large source but rather several small sources contribute to the load coming from households, small business enterprises and industry. This contrasts with the consumption pattern seen two years earlier, which was dominated by one product. The load in the inflow to the wastewater treatment plant had also decreased significantly compared with measurements made two years earlier. The study shows that combining substance-flow modelling with analysis of the loads to the receiving waters helps us to understand consumption-related emissions. Copyright 2010 Elsevier B.V. All rights reserved.

  1. Bisphenol A

    MedlinePlus

    ... there is still uncertainty about some links between human health effects and exposure to endocrine disruptors. Both the ... to understand exactly how current findings relate to human health and development, according to the U.S. Food and ...

  2. Forward and backward uncertainty propagation: an oxidation ditch modelling example.

    PubMed

    Abusam, A; Keesman, K J; van Straten, G

    2003-01-01

    In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. In backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure for carrying out backward uncertainty propagation is illustrated in this technical note by a working example for an oxidation ditch wastewater treatment plant. The results demonstrate that essential information can be obtained by carrying out backward uncertainty propagation analysis.
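    A minimal sketch of the two directions. The stand-in "effluent" model, its parameter distributions, and the acceptance band are all invented for illustration; the note's actual oxidation ditch model and its backward procedure are more elaborate. Forward: parameter distribution in, output distribution out. Backward (here in a simple rejection style): prescribe an acceptable output band and keep only the parameter combinations that satisfy it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in model: effluent quality as a function of two rate
# parameters (hypothetical functional form, for illustration only).
def model(k_decay, k_aer):
    return 50.0 * np.exp(-k_decay) / (1.0 + k_aer)

# Forward propagation: sample the parameter space, push samples
# through the model, summarise the output distribution.
k_decay = rng.normal(1.0, 0.10, 20_000)
k_aer = rng.normal(0.5, 0.05, 20_000)
effluent = model(k_decay, k_aer)
fwd_mean, fwd_std = effluent.mean(), effluent.std()

# Backward propagation: prescribe an acceptable output band and keep
# only the parameter draws whose output falls inside it.
lo, hi = 11.0, 13.0
ok = (effluent >= lo) & (effluent <= hi)
k_decay_post, k_aer_post = k_decay[ok], k_aer[ok]
```

Conditioning on the output band tightens the admissible parameter subspace (the posterior spread of `k_decay` is smaller than its prior spread), which is exactly the "tighter bounding of parameter uncertainty intervals" the note refers to.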

  3. Development of decision tools to assess migration from plastic materials in contact with food.

    PubMed

    Gillet, G; Vitrac, O; Tissier, D; Saillard, P; Desobry, S

    2009-12-01

    Testing the specific migration limits of all substances intentionally added to polymer materials according to European Union (EU) regulation is a time-consuming and expensive task. Although mathematical modeling offers an interesting alternative, it can significantly overestimate migration in strongly conservative situations due to significant uncertainty in transport properties. In addition, its application is of little use for end-users or enforcement laboratories, which do not have access to the formulation. This paper revises the paradigm of migration modeling by combining it with deformulation experiments and iterative modeling in the framework of decision theory. The complete approach is illustrated for polyolefins in contact with 50% ethanol for eight typical migrants, including hindered phenolic antioxidants and low molecular weight surrogates. Results from a French ACTIA project on the identification of formulation fingerprints and on the prediction of partition coefficients with alcoholic and aqueous simulants are described. When the true migration was close to but still lower than the limit of concern, the proposed compact decision tree, including up to four sources of uncertainty, showed that the chance of demonstrating compliance was about 3:4 in the presence of one source of uncertainty, whereas it fell below 2:4 and 1:4 with two and three sources of uncertainty, respectively. Recommendations for further food packaging safety surveys and future developments are discussed.

  4. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.

  5. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrating discrete sample analysis with error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%, respectively. Meanwhile, flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%, respectively.
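    The root-sum-square combination used under the law of propagation of uncertainties can be sketched directly from the quoted component values. This assumes independent relative uncertainties; note that the study's own event-load figure (18.47%) exceeds the simple quadrature of volume and EMC, so its full combination evidently includes additional correlated terms not modelled here.

```python
import math

def combine_relative(*u_percent):
    """Root-sum-square combination of independent relative uncertainties,
    as applies to products and quotients under first-order propagation."""
    return math.sqrt(sum(u * u for u in u_percent))

# Components quoted for discrete COD samples (percent):
u_collection, u_storage, u_analysis = 13.99, 19.48, 12.28
u_cod = combine_relative(u_collection, u_storage, u_analysis)   # ~26.9%

# An event load L = V * EMC inherits, for independent factors:
u_volume, u_emc = 7.03, 10.26
u_load_independent = combine_relative(u_volume, u_emc)          # ~12.4%
```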

  6. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and, most importantly, the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate of the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of ±9-14% at high temperature and ±9% near room temperature.

  7. [Safety of food additives from a German and European point of view].

    PubMed

    Gürtler, R

    2010-06-01

    There are about 300 food additives permitted in the EU, for which a re-evaluation program was initiated recently. Occasionally, it is speculated that the use of single food additives might be of safety concern. First results of the re-evaluation could give an impression of how such concerns were taken into account by responsible authorities, such as the European Food Safety Authority (EFSA). For some of the food additives, the lowest dose resulting in adverse effects was lower in recent studies than in previous studies. Thus, the acceptable daily intake (ADI) derived by applying the common uncertainty factor was lower than the ADI derived using data from previous studies. It therefore has to be considered whether the conditions of use need to be modified for these food additives.
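    The ADI arithmetic behind these re-evaluations is simple. In the sketch below the NOAEL values are invented for illustration and do not refer to any specific additive; only the default uncertainty factor of 100 is standard practice:

```python
def acceptable_daily_intake(noael_mg_per_kg_bw, uncertainty_factor=100):
    """ADI = NOAEL / uncertainty factor. The common default factor of 100
    combines 10x for interspecies and 10x for intraspecies variability."""
    return noael_mg_per_kg_bw / uncertainty_factor

# Hypothetical re-evaluation: a newer study finds adverse effects at a
# lower dose, so the derived ADI drops proportionally.
adi_previous = acceptable_daily_intake(500.0)  # mg/kg bw/day
adi_recent = acceptable_daily_intake(200.0)    # lower no-effect dose
```

If the recalculated ADI falls near or below estimated actual intakes, the permitted conditions of use are the lever that has to move.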

  8. Thresholds of allergenic proteins in foods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hourihane, Jonathan O'B.; Knulst, Andre C.

    2005-09-01

    Threshold doses or Estimated Eliciting Doses (EEDs) represent an important new field of research in food allergy. Clinicians and regulators have embraced some toxicological concepts such as LOAEL and NOAEL and applied them to an area of significant clinical uncertainty and interest. The impact of intrinsic human factors (e.g., asthma and exercise) and extrinsic event factors (e.g., season, location and especially dose of allergen) on a future allergic reaction in the community needs to be considered carefully when interpreting results of clinical and research low-dose food challenges. The ongoing cooperation of food allergy research groups in medicine, food science and government will surely deliver results of the highest importance to the wider communities of allergology, food science and technology and the increasing number of allergic consumers.

  9. Seasonal dynamics and functioning of the Sylt-Rømø Bight, northern Wadden Sea

    NASA Astrophysics Data System (ADS)

    de la Vega, Camille; Horn, Sabine; Baird, Dan; Hines, David; Borrett, Stuart; Jensen, Lasse Fast; Schwemmer, Philipp; Asmus, Ragnhild; Siebert, Ursula; Asmus, Harald

    2018-04-01

    The Wadden Sea undergoes large seasonal changes in the species abundance and biomass comprising its complex food web. This study examined four carbon food web models of the Sylt-Rømø Bight, one for each season. Each flow model consisted of 66 compartments depicting the respective biomass and energy budget of each ecosystem component and the flows between them. Ecological network analysis (ENA), a set of algorithms to evaluate the functioning of ecological networks, was used to assess the seasonal variability in the system properties of the Sylt-Rømø Bight food webs. We used an uncertainty analysis to quantitatively evaluate the significance of inter-seasonal differences. Clear seasonal variation was observed in most of the whole-system indicators, such as the flow diversity, the effective link density and the relative redundancy, which varied by 12.8%, 17.3% and 10.3%, respectively, between the highest values in summer and the lowest during fall and winter, whereas the relative ascendency was highest in winter, during the least active months. Other indices, such as the average mutual information index, which fluctuated between 1.73 in fall and 1.79 in spring, showed no significant variation between seasons. Results from ENA have great potential for ecosystem management, as they provide a holistic assessment of the functioning of ecosystems.
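    Flow diversity, one of the whole-system ENA indices mentioned, is a Shannon-type measure over the network's flows. A minimal sketch with two invented 3-compartment seasonal networks (arbitrary carbon flow numbers, nothing like the study's 66-compartment models):

```python
import numpy as np

def flow_diversity(flows):
    """Shannon diversity of an ecosystem flow network: -sum(p * ln p)
    over all non-zero flows, with p = flow / total system throughput."""
    f = np.asarray(flows, dtype=float).ravel()
    f = f[f > 0]                       # zero flows carry no information
    p = f / f.sum()
    return float(-(p * np.log(p)).sum())

# Hypothetical seasonal flow matrices, entry [i][j] = carbon flow i -> j.
# Summer flows are spread more evenly; winter is dominated by one link.
summer = [[0, 40, 10], [5, 0, 30], [20, 5, 0]]
winter = [[0, 60, 1], [1, 0, 30], [5, 1, 0]]

h_summer = flow_diversity(summer)
h_winter = flow_diversity(winter)
```

The more evenly the total throughput is spread over links, the higher the index, which is why a summer web with many active pathways scores above a winter web dominated by a few flows.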

  10. Food and Drug Administration tobacco regulation and product judgments.

    PubMed

    Kaufman, Annette R; Finney Rutten, Lila J; Parascandola, Mark; Blake, Kelly D; Augustson, Erik M

    2015-04-01

    The Family Smoking Prevention and Tobacco Control Act granted the Food and Drug Administration (FDA) the authority to regulate tobacco products in the U.S. However, little is known about how regulation may be related to judgments about tobacco product-related risks. To understand how FDA tobacco regulation beliefs are associated with judgments about tobacco product-related risks. The Health Information National Trends Survey is a national survey of the U.S. adult population. Data used in this analysis were collected from October 2012 through January 2013 (N=3,630) by mailed questionnaire and analyzed in 2013. Weighted bivariate chi-square analyses were used to assess associations among FDA regulation belief, tobacco harm judgments, sociodemographics, and smoking status. A weighted multinomial logistic regression was conducted where FDA regulation belief was regressed on tobacco product judgments, controlling for sociodemographic variables and smoking status. About 41% believed that the FDA regulates tobacco products in the U.S., 23.6% reported the FDA does not, and 35.3% did not know. Chi-square analyses showed that smoking status was significantly related to harm judgments about electronic cigarettes (p<0.0001). The multinomial logistic regression revealed that uncertainty about FDA regulation was associated with tobacco product harm judgment uncertainty. Tobacco product harm perceptions are associated with beliefs about tobacco product regulation by the FDA. These findings suggest the need for increased public awareness and understanding of the role of tobacco product regulation in protecting public health. Copyright © 2015. Published by Elsevier Inc.

  11. A detailed description of the uncertainty analysis for high area ratio rocket nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  13. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations involved, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented, covering what the uncertainties are, how they impact various research tests in the facility, and methods of reducing them in the future.

  14. [A correlational study on uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers].

    PubMed

    Yoo, Kyung Hee

    2007-06-01

    This study was conducted to investigate the correlations among uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers. Self-report questionnaires were used to measure the variables: uncertainty, mastery and appraisal of uncertainty. In the data analysis, the SPSSWIN 12.0 program was utilized for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84-.94. Mastery correlated negatively with uncertainty (r = -.444, p < .001) and with danger appraisal of uncertainty (r = -.514, p < .001). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in hospitalized children's mothers. Therefore, nursing interventions which improve mastery must be developed for hospitalized children's mothers.

  15. Cost-effectiveness analysis of salt reduction policies to reduce coronary heart disease in Syria, 2010-2020.

    PubMed

    Wilcox, Meredith L; Mason, Helen; Fouad, Fouad M; Rastam, Samer; al Ali, Radwan; Page, Timothy F; Capewell, Simon; O'Flaherty, Martin; Maziak, Wasim

    2015-01-01

    This study presents a cost-effectiveness analysis of salt reduction policies to lower coronary heart disease in Syria. Costs and benefits of a health promotion campaign about salt reduction (HP); labeling of salt content on packaged foods (L); reformulation of salt content within packaged foods (R); and combinations of the three were estimated over a 10-year time frame. Policies were deemed cost-effective if their cost-effectiveness ratios were below the region's established threshold of $38,997 purchasing power parity (PPP). Sensitivity analysis was conducted to account for the uncertainty in the reduction of salt intake. HP, L, and R+HP+L were cost-saving using the best estimates. The remaining policies were cost-effective (CERs: R=$5,453 PPP/LYG; R+HP=$2,201 PPP/LYG; R+L=$2,125 PPP/LYG). R+HP+L provided the largest benefit with net savings using the best and maximum estimates, while R+L was cost-effective with the lowest marginal cost using the minimum estimates. This study demonstrated that all policies were cost-saving or cost effective, with the combination of reformulation plus labeling and a comprehensive policy involving all three approaches being the most promising salt reduction strategies to reduce CHD mortality in Syria.
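    The decision rule used in such analyses can be sketched as follows. Only the $38,997 PPP threshold comes from the abstract; the policy costs and life-years gained below are invented for illustration:

```python
THRESHOLD_PPP_PER_LYG = 38_997  # region's cost-effectiveness threshold

def classify(net_cost_ppp, life_years_gained):
    """Label a policy by its cost-effectiveness ratio (CER):
    net cost in PPP$ divided by life-years gained (LYG)."""
    if net_cost_ppp < 0:
        return "cost-saving"            # negative net cost: saves money outright
    cer = net_cost_ppp / life_years_gained
    return "cost-effective" if cer < THRESHOLD_PPP_PER_LYG else "not cost-effective"

# Hypothetical policies:
label_a = classify(-2_000_000, 1500)    # net savings, benefit regardless of CER
label_b = classify(5_453_000, 1000)     # CER = 5,453 PPP$/LYG, under threshold
label_c = classify(80_000_000, 1000)    # CER = 80,000 PPP$/LYG, over threshold
```

This is why a policy can be "cost-saving" without reference to the threshold at all: the threshold only arbitrates among policies with positive net cost.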

  16. Managing uncertainty: information and insurance under the risk of starvation.

    PubMed Central

    Dall, Sasha R X; Johnstone, Rufus A

    2002-01-01

    In an uncertain world, animals face both unexpected opportunities and danger. Such outcomes can select for two potential strategies: collecting information to reduce uncertainty, or insuring against it. We investigate the relative value of information and insurance (energy reserves) under starvation risk by offering model foragers a choice between constant and varying food sources over finite foraging bouts. We show that sampling the variable option (choosing it when it is not expected to be good) should decline both with lower reserves and late in foraging bouts; in order to be able to reap the reduction in uncertainty associated with exploiting a variable resource effectively, foragers must be able to afford and compensate for an initial increase in the risk of an energetic shortfall associated with choosing the option when it is bad. Consequently, expected exploitation of the varying option increases as it becomes less variable, and when the overall risk of energetic shortfall is reduced. In addition, little activity on the variable alternative is expected until reserves are built up early in a foraging bout. This indicates that gathering information is a luxury while insurance is a necessity, at least when foraging on stochastic and variable food under the risk of starvation. PMID:12495509

  17. Assessing the applicability of stable isotope analysis to determine the contribution of landfills to vultures' diet.

    PubMed

    Tauler-Ametller, Helena; Hernández-Matías, Antonio; Parés, Francesc; Pretus, Joan Ll; Real, Joan

    2018-01-01

    Human activities cause changes in the environment that affect resource availability for wildlife. The growth of urban human populations has led to a rise in the amount of waste deposited in landfills, installations that have become a new food resource both for pest species and for threatened species such as vultures. In this study we used stable isotope analysis (SIA) and conventional identification of food remains from Egyptian Vultures (Neophron percnopterus) to assess the applicability of SIA as a new tool for determining the composition of the diets of vultures, a group of avian scavengers that is threatened worldwide. We focused on an expanding Egyptian Vulture population in the NE Iberian Peninsula to determine the part played by landfills and livestock in the diet of this species, aiming to reduce the biases associated with conventional identification of food remains. We compared the proportions of diet composition obtained with isotope mixing models and conventional analysis for five main prey categories. The greatest agreement between the two methods was in the categories 'landfills' and 'birds', and the greatest differences were in the categories 'livestock', 'carnivores' and 'wild herbivores'. Despite the uncertainty associated with SIA, our results showed that stable isotope analysis can help to distinguish animals that rely on waste, and therefore show enriched δ13C levels, from those that feed in the countryside. Indeed, a high proportion of food derived from landfills (nearly 50%) was detected in some breeding pairs. Furthermore, GLMM analyses showed that high values of δ13C in Egyptian Vulture feathers (a proxy of feeding in landfills) are related to high levels of humanization of territories. This method has the potential to be applied to other threatened vulture species for which information on the resources they consume is lacking, which is especially important because the main causes of vulture declines worldwide are related to the consumption and availability of food resources.

  18. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
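    The random/systematic split in a Monte Carlo propagation can be sketched for a Mach number calculation. The isentropic relation below is the standard one for gamma = 1.4, but the pressure values and error magnitudes are invented, and the facility's actual analysis covers many more inputs and their correlations:

```python
import numpy as np

rng = np.random.default_rng(2)

def mach_from_pressures(p_total, p_static):
    """Isentropic-flow Mach number (gamma = 1.4) from total and static
    pressure: M = sqrt(5 * ((p0/p)^(2/7) - 1))."""
    return np.sqrt(5.0 * ((p_total / p_static) ** (2.0 / 7.0) - 1.0))

n = 50_000
p_total_true, p_static_true = 101.3, 50.0   # kPa, illustrative values

# Systematic (bias) error: one draw per Monte Carlo realization of the
# instrument calibration; random error: an independent per-sample draw.
p_total = p_total_true + rng.normal(0.0, 0.3, n) + rng.normal(0.0, 0.1, n)
p_static = p_static_true + rng.normal(0.0, 0.2, n) + rng.normal(0.0, 0.1, n)

mach = mach_from_pressures(p_total, p_static)
mach_mean = mach.mean()
mach_unc = 2.0 * mach.std()                 # ~95% coverage interval half-width
```

Because the derived quantity is nonlinear in its inputs, sampling like this captures distribution shape and input correlations that a first-order Taylor propagation would approximate away, which is the stated reason for choosing Monte Carlo here.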

  19. Understanding and improving mitigation strategies for reducing catchment scale nutrient loads using high resolution observations and uncertainty analysis approaches

    NASA Astrophysics Data System (ADS)

    Collins, A.; Lloyd, C.; Freer, J. E.; Johnes, P.; Stirling, M.

    2012-12-01

    One of the biggest challenges in catchment water quality management is tackling the problem of reducing water pollution from agriculture whilst ensuring food security nationally. Improvements to catchment management plans are needed if we are to enhance biodiversity and maintain good ecological status in freshwater ecosystems, while producing enough food to support a growing global population. In order to plan for a more sustainable and secure future, research needs to quantify the uncertainties and understand the complexities in the source-mobilisation-delivery-impact continuum of pollution and nutrients at all scales. In the UK the Demonstration Test Catchment (DTC) project has been set up to improve water quality specifically from diffuse pollution from agriculture by enhanced high resolution monitoring and targeted mitigation experiments. The DTC project aims to detect shifts in the baseline trend of the most ecologically-significant pollutants resulting from targeted on-farm measures at field to farm scales and assessing their effects on ecosystem function. The DTC programme involves three catchments across the UK that are indicative of three different typologies and land uses. This paper will focus on the Hampshire Avon DTC, where a total of 12 parameters are monitored by bank-side stations at two sampling sites, including flow, turbidity, phosphate and nitrate concentrations at 30 min resolution. This monitoring is supported by daily resolution sampling at 5 other sites and storm sampling at all locations. Part of the DTC project aims to understand how observations of water quality within river systems at different temporal resolutions and types of monitoring strategies enable us to understand and detect changes over and above the natural variability. Baseline monitoring is currently underway and early results show that high-resolution data is essential at this sub-catchment scale to understand important process dynamics. 
This is critical if we are to design cost-efficient and effective management strategies. The high-resolution dataset also creates new opportunities to explore the uncertainties involved in monitoring water quality and assessing ecological status, and how these relate to current monitoring networks. For example, concurrent grab samples at the high-resolution sampling stations allow assessment of the uncertainties that would be generated by coarser sampling strategies. This is just the beginning of the project; however, as it progresses, the high-resolution dataset will provide greater statistical power than previous data collection schemes and allow the employment of more complex methods such as signal decomposition (e.g., wavelet analysis), which can help us start to decipher the complex interactions occurring at the sub-catchment scale that may not be immediately detectable in bulk signals. In this paper we outline our methodological approach, present some of the initial findings of this research, and show how we can quantify changes to nutrient loads while taking into account the main uncertainties and the inherent natural variability.

  20. Are sweet snacks more sensitive to price increases than sugar-sweetened beverages: analysis of British food purchase data

    PubMed Central

    Smith, Richard D; Quirmbach, Diana; Jebb, Susan A

    2018-01-01

    Objectives: Taxing sugar-sweetened beverages (SSBs) is now advocated, and implemented, in many countries as a measure to reduce the purchase and consumption of sugar to tackle obesity. To date, there has been little consideration of the potential impact that such a measure could have if extended to other sweet foods, such as confectionery, cakes and biscuits, that contribute more sugar to the diet than SSBs. The objective of this study is to compare changes in the demand for sweet snacks and SSBs arising from potential price increases. Setting: Secondary data on household itemised purchases of all foods and beverages from 2012 to 2013. Participants: Representative sample of 32 249 households in Great Britain. Primary and secondary outcome measures: Change in food and beverage purchases due to changes in their own price and the price of other foods or beverages, measured as price elasticity of demand for the full sample and by income group. Results: Chocolate and confectionery, cakes and biscuits have price sensitivity similar to SSBs, across all income groups. Unlike the case of SSBs, price increases in these categories are also likely to prompt reductions in the purchase of other sweet snacks and SSBs, which magnify the overall impact. The effects of price increases are greatest in the low-income group. Conclusions: Policies that lead to increases in the price of chocolate and confectionery, cakes and biscuits may lead to additional and greater health gains than similar increases in the price of SSBs, through direct reductions in the purchases of these foods and possible positive multiplier effects that reduce demand for other products. Although some uncertainty remains, the associations found in this analysis are sufficiently robust to suggest that policies, and research, concerning the use of fiscal measures should consider a broader range of products than is currently the case. PMID:29700100
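    The elasticity measure underlying this analysis can be sketched directly. The percentage changes below are invented illustrations, not the study's estimates; only the definition of own- and cross-price elasticity is standard:

```python
def elasticity(pct_change_quantity, pct_change_price):
    """Price elasticity of demand: percentage change in quantity
    purchased per 1% change in price."""
    return pct_change_quantity / pct_change_price

# Hypothetical figures: a 10% price rise on chocolate confectionery cuts
# its own purchases by 8% (own-price elasticity -0.8) and also cuts SSB
# purchases by 2% (cross-price elasticity -0.2). A negative cross-price
# elasticity marks the goods as complements, which is the mechanism
# behind the positive "multiplier" effect described in the abstract.
own_price = elasticity(-8.0, 10.0)
cross_price = elasticity(-2.0, 10.0)
```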

  1. Are sweet snacks more sensitive to price increases than sugar-sweetened beverages: analysis of British food purchase data.

    PubMed

    Smith, Richard D; Cornelsen, Laura; Quirmbach, Diana; Jebb, Susan A; Marteau, Theresa M

    2018-04-26

Taxing sugar-sweetened beverages (SSBs) is now advocated, and implemented, in many countries as a measure to reduce the purchase and consumption of sugar to tackle obesity. To date, there has been little consideration of the potential impact that such a measure could have if extended to other sweet foods, such as confectionery, cakes and biscuits that contribute more sugar to the diet than SSBs. The objective of this study is to compare changes in the demand for sweet snacks and SSBs arising from potential price increases. Secondary data on household itemised purchases of all foods and beverages from 2012 to 2013. Representative sample of 32 249 households in Great Britain. Change in food and beverage purchases due to changes in their own price and the price of other foods or beverages measured as price elasticity of demand for the full sample and by income groups. Chocolate and confectionery, cakes and biscuits have similar price sensitivity as SSBs, across all income groups. Unlike the case of SSBs, price increases in these categories are also likely to prompt reductions in the purchase of other sweet snacks and SSBs, which magnify the overall impact. The effects of price increases are greatest in the low-income group. Policies that lead to increases in the price of chocolate and confectionery, cakes and biscuits may lead to additional and greater health gains than similar increases in the price of SSBs through direct reductions in the purchases of these foods and possible positive multiplier effects that reduce demand for other products. Although some uncertainty remains, the associations found in this analysis are sufficiently robust to suggest that policies, and research, concerning the use of fiscal measures should consider a broader range of products than is currently the case. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
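The price sensitivities reported in records like this one are own- and cross-price elasticities of demand, typically estimated from log-log regressions of quantity on price. A minimal sketch of an own-price elasticity estimate (not the authors' demand-system model, and using made-up weekly purchase data):

```python
import numpy as np

# Illustrative own-price elasticity: the slope of a log-log regression of
# quantity purchased on price. The observations below are hypothetical.
price = np.array([1.00, 1.10, 1.20, 1.35, 1.50])   # price per unit
quantity = np.array([980, 900, 840, 760, 700])     # units purchased

slope, intercept = np.polyfit(np.log(price), np.log(quantity), 1)
print(f"estimated own-price elasticity: {slope:.2f}")  # negative: demand falls as price rises
```

An elasticity near -1, as found here for sweet snacks and SSBs alike, means a 10% price rise cuts purchases by roughly 10%.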

  2. Do Cd, Cu, Ni, Pb, and Zn biomagnify in aquatic ecosystems?

    PubMed

    Cardwell, Rick D; Deforest, David K; Brix, Kevin V; Adams, William J

    2013-01-01

In this review, we sought to assess from a study of the literature whether five inorganic metals (viz., cadmium, copper, lead, nickel, and zinc) biomagnify in aquatic food webs. We also examined whether accumulated metals were toxic to consumers/predators and whether the essential metals (Cu and Zn, and possibly Ni) behaved differently from non-essential ones (Cd and Pb). Biomagnification potential was indexed by the magnitude of single and multiple trophic transfers in food chains. In this analysis, we used three lines of evidence (laboratory empirical, biokinetic modeling, and field studies) to make assessments. Trophic transfer factors (TTFs), calculated from lab studies, field studies, and biokinetic modeling, were generally congruent. Results indicated that Cd, Cu, Pb, and Zn generally do not biomagnify in food chains consisting of primary producers, macroinvertebrate consumers, and fish occupying TL3 and higher. However, biomagnification of Zn (TTFs of 1-2) is possible for circumstances in which dietary Zn concentrations are below those required for metabolism. Cd, Cu, Ni, and Zn may biomagnify in specific marine food chains consisting of bivalves, herbivorous gastropods, and barnacles at TL2 and carnivorous gastropods at TL3. There was an inverse relationship between TTF and exposure concentration for Cd, Cu, Pb, and Zn, a finding that is consistent with previous reviews of bioconcentration factors and bioaccumulation factors for metals. Our analysis also failed to demonstrate a relationship between the magnitude of TTFs and dietary toxicity to consumer organisms. Consequently, we conclude that TTFs for the metals examined are not an inherently useful predictor of potential hazard (i.e., toxic potential) to aquatic organisms. This review identified several uncertainties and data gaps, such as the relatively limited data available for nickel, reliance upon highly structured food chains in laboratory studies compared to the unstructured food webs found in nature, and variability in TTFs between organisms found in different habitats and years sampled.
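A trophic transfer factor is simply the ratio of the concentration in a consumer to that in its diet, with TTF > 1 across successive links indicating biomagnification. A minimal sketch using hypothetical tissue concentrations, not values from the review:

```python
# Hypothetical metal concentrations (µg/g dry weight) along a simple,
# highly structured laboratory food chain; values are illustrative only.
chain = {"alga": 4.0, "snail": 3.2, "fish": 1.8}

# Trophic transfer factor (TTF) for each link: consumer / diet concentration.
levels = list(chain.items())
ttfs = []
for (diet, c_diet), (consumer, c_cons) in zip(levels, levels[1:]):
    ttf = c_cons / c_diet
    ttfs.append(ttf)
    print(f"{diet} -> {consumer}: TTF = {ttf:.2f}")
```

With every TTF below 1 in this toy chain, the metal becomes more dilute at each trophic level, i.e. it biodilutes rather than biomagnifies.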

  3. Development of a special-purpose test surface guided by uncertainty analysis - Introduction of a new uncertainty analysis step

    NASA Technical Reports Server (NTRS)

    Wang, T.; Simon, T. W.

    1988-01-01

Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, one herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.

  4. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. 
Design of SuperMUSE, a 125 GHz Windows-based supercomputer for Model Uncertainty and Sensitivity Evaluation, is described, along with the conceptual layout of an accompanying Java-based parallelization software toolset. Preliminary work is also reported for a benzene disposal scenario that describes the relative importance of the vadose zone in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 national facilities representing 419 site-WMU combinations.

  5. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    NASA Astrophysics Data System (ADS)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., error in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must also overcome difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R²), the Nash-Sutcliffe efficiency (NSE) and percent bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R² > 0.91, NSE > 0.89, and 0.18
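Of the four algorithms, GLUE is the simplest to sketch: sample parameter sets from their prior ranges and retain the "behavioral" sets whose likelihood measure (here the NSE) exceeds a threshold; the retained sets define the parameter uncertainty band. The one-parameter toy model below is purely illustrative and stands in for SWAT:

```python
import random

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - sse / var

def toy_model(k, forcing):
    return [k * f for f in forcing]   # linear-reservoir-like response

forcing = [1.0, 2.0, 3.0, 2.5, 1.5]
observed = toy_model(0.7, forcing)    # pretend the "true" parameter is 0.7

random.seed(1)
behavioral = [
    k for k in (random.uniform(0.1, 1.5) for _ in range(1000))  # prior range
    if nse(observed, toy_model(k, forcing)) > 0.5               # behavioral threshold
]
lo, hi = min(behavioral), max(behavioral)
print(f"{len(behavioral)} behavioral sets; k in [{lo:.2f}, {hi:.2f}]")
```

The P-factor and R-factor reported in the abstract then measure how much of the observed series falls inside, and how wide, the prediction band spanned by the behavioral simulations is.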

  6. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Non-Small Cell Lung Cancer.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R; Kazandjian, D; Blumenthal, G; Pazdur, R; Woodcock, J

    2016-12-01

Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analyses. There is much interest in quantifying regulatory approaches to benefit and risk. In this work, a quantitative benefit-risk analysis was applied to regulatory decision-making about new drugs to treat advanced non-small cell lung cancer (NSCLC). Benefits and risks were analyzed for 20 US Food and Drug Administration (FDA) decisions on a set of candidate treatments submitted between 2003 and 2015. For the benefit analysis, the median overall survival (OS) was used where available. When not available, OS was estimated based on the overall response rate (ORR) or progression-free survival (PFS). Risks were analyzed based on the magnitude (or severity) of harm and the likelihood of occurrence. Additionally, a sensitivity analysis was explored to demonstrate analysis of systematic uncertainty. The FDA approval decision outcomes considered were found to be consistent with the benefit-risk logic. © 2016 American Society for Clinical Pharmacology and Therapeutics.

  7. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
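The Monte Carlo step of such an uncertainty analysis can be sketched as repeated perturbation of the criteria weights followed by re-aggregation of the susceptibility score for each map cell. The criteria names, base weights and perturbation range below are hypothetical, not taken from the paper:

```python
import random

# One map cell, scored 0-1 on each criterion; weights sum to 1.
base_weights = {"slope": 0.40, "lithology": 0.35, "land_use": 0.25}
cell_scores = {"slope": 0.8, "lithology": 0.5, "land_use": 0.3}

random.seed(42)
susceptibility = []
for _ in range(5000):
    # Perturb each weight by up to ±10%, then renormalize to sum to 1.
    w = {k: v * random.uniform(0.9, 1.1) for k, v in base_weights.items()}
    total = sum(w.values())
    w = {k: v / total for k, v in w.items()}
    # Weighted-linear-combination aggregation of the cell's score.
    susceptibility.append(sum(w[k] * cell_scores[k] for k in w))

mean = sum(susceptibility) / len(susceptibility)
spread = max(susceptibility) - min(susceptibility)
print(f"mean susceptibility {mean:.3f}, Monte Carlo spread {spread:.4f}")
```

Mapping the per-cell spread over the whole study area gives the kind of spatially explicit uncertainty surface the paper validates against the landslide inventory.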

  8. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  9. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

Uncertainty analysis is an important component of dietary exposure assessments in order to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which results in conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainty in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainty to better understand how to best use uncertainty analysis to generate a more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
    Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  11. Uncertainties in forecasting the response of polar bears to global climate change

    USGS Publications Warehouse

    Douglas, David C.; Atwood, Todd C.; Butterworth, Andy

    2017-01-01

    Several sources of uncertainty affect how precisely the future status of polar bears (Ursus maritimus) can be forecasted. Foremost are unknowns about the future levels of global greenhouse gas emissions, which could range from an unabated increase to an aggressively mitigated reduction. Uncertainties also arise because different climate models project different amounts and rates of future warming (and sea ice loss)—even for the same emission scenario. There are also uncertainties about how global warming could affect the Arctic Ocean’s food web, so even if climate models project the presence of sea ice in the future, the availability of polar bear prey is not guaranteed. Under a worst-case emission scenario in which rates of greenhouse gas emissions continue to rise unabated to century’s end, the uncertainties about polar bear status center on a potential for extinction. If the species were to persist, it would likely be restricted to a high-latitude refugium in northern Canada and Greenland—assuming a food web also existed with enough accessible prey to fuel weight gains for surviving onshore during the most extreme years of summer ice melt. On the other hand, if emissions were to be aggressively mitigated at the levels proposed in the Paris Climate Agreement, healthy polar bear populations would probably continue to occupy all but the most southern areas of their contemporary summer range. While polar bears have survived previous warming phases—which indicate some resiliency to the loss of sea ice habitat—what is certain is that the present pace of warming is unprecedented and will increasingly expose polar bears to historically novel stressors.

  12. Durability reliability analysis for corroding concrete structures under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
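The "purely probabilistic" treatment of epistemic uncertainty described above can be sketched as a nested (double-loop) Monte Carlo simulation: an outer loop draws the epistemically uncertain parameter, and an inner loop propagates the aleatory variability to a failure probability. All numbers below are illustrative, not from the paper's chloride model:

```python
import random

random.seed(0)
threshold = 1.0   # critical chloride content triggering corrosion (illustrative)

failure_probs = []
for _ in range(200):                 # outer loop: one epistemic realization
    mu = random.gauss(0.6, 0.1)      # uncertain mean surface concentration
    failures = 0
    for _ in range(2000):            # inner loop: aleatory variability
        c = random.gauss(mu, 0.2)    # concentration at the rebar depth
        if c > threshold:
            failures += 1
    failure_probs.append(failures / 2000)

print(f"failure probability ranges over "
      f"[{min(failure_probs):.3f}, {max(failure_probs):.3f}]")
```

The spread of the outer-loop results, rather than a single failure probability, is what conveys how much the epistemic uncertainty governs the durability reliability.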

  13. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. 
© The Author(s) 2016.
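The Morris method used in the first stage screens parameters by their elementary effects: one-at-a-time perturbations from randomly chosen base points, averaged over many repetitions. A simplified one-at-a-time sketch in that spirit, with a hypothetical three-parameter toy model standing in for the stroke model:

```python
import random

def model(params):
    a, b, c = params
    return 3 * a + b ** 2 + 0.1 * c   # toy response, purely illustrative

random.seed(7)
delta = 0.1
effects = {0: [], 1: [], 2: []}
for _ in range(50):                             # 50 random base points
    base = [random.random() for _ in range(3)]  # start somewhere in [0,1]^3
    y0 = model(base)
    for i in range(3):                          # perturb one parameter at a time
        step = list(base)
        step[i] += delta
        effects[i].append(abs(model(step) - y0) / delta)

mean_effects = {i: sum(v) / len(v) for i, v in effects.items()}
for i, mu in mean_effects.items():
    print(f"parameter {i}: mean |elementary effect| = {mu:.2f}")
```

Parameters with small mean elementary effects are the candidates to fix at best-guess values, as was done here for 24 of the 60 uncertain parameters.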

  14. Validation of a high-performance liquid chromatographic method with UV detection for the determination of ethopabate residues in poultry liver.

    PubMed

    Granja, Rodrigo H M M; Niño, Alfredo M Montes; Zucchetti, Roberto A M; Niño, Rosario E Montes; Salerno, Alessandro G

    2008-01-01

Ethopabate is frequently used in the prophylaxis and treatment of coccidiosis in poultry. Residues of this drug in food present a potential risk to consumers. A simple, rapid, and sensitive high-performance liquid chromatographic (HPLC) method with UV detection for the determination of ethopabate in poultry liver is presented. The drug is extracted with acetonitrile. After evaporation, the residue is dissolved in an acetone-hexane mixture and cleaned up by solid-phase extraction using Florisil columns. The analyte is then eluted with methanol. LC analysis is carried out on a C18 5 µm Gemini column, 15 cm × 4.6 mm. Ethopabate is quantified by means of UV detection at 270 nm. Parameters such as decision limit, detection capability, precision, recovery, ruggedness, and measurement uncertainty were calculated according to the method validation guidelines provided in 2002/657/EC and ISO/IEC 17025:2005. The decision limit and detection capability were determined to be 2 and 3 µg/kg, respectively. Average recoveries from poultry samples fortified at the 10, 15, and 20 µg/kg levels of ethopabate were 100-105%. A complete statistical analysis was performed on the results obtained, including an estimation of the method uncertainty. The method is to be implemented into Brazil's residue monitoring and control program for ethopabate.

  15. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
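Independent random uncertainty components such as those compiled in this analysis are conventionally combined by root-sum-of-squares, while systematic components (biases) are tracked separately rather than pooled. A minimal sketch with hypothetical percentages, not values from the review:

```python
import math

# Hypothetical random uncertainty contributions, expressed as a
# percentage of the measured E. coli concentration.
random_components = {"sample collection": 20.0, "laboratory analysis": 15.0}

# Root-sum-of-squares combination of independent random components.
total_random = math.sqrt(sum(u ** 2 for u in random_components.values()))

# A systematic component (e.g. die-off during preservation/storage) keeps
# its sign and is reported separately from the random part.
systematic_bias = -10.0

print(f"combined random uncertainty: ±{total_random:.1f}%")
print(f"systematic component (bias): {systematic_bias:+.1f}%")
```

Keeping the two kinds separate matters because random uncertainty shrinks with replication, whereas a storage bias affects every replicate the same way.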

  16. Readiness of adolescents to use genetically modified organisms according to their knowledge and emotional attitude towards GMOs.

    PubMed

    Lachowski, Stanisław; Jurkiewicz, Anna; Choina, Piotr; Florek-Łuszczki, Magdalena; Buczaj, Agnieszka; Goździewska, Małgorzata

    2017-06-07

Agriculture based on genetically modified organisms plays an increasingly important role in feeding the world population, which is evidenced by considerable growth in the area of land under genetically modified (GM) crops. Uncertainty and controversy around GM products are mainly due to the lack of accurate and reliable information, the lack of knowledge concerning the essence of genetic modifications and the effect of GM food on the human organism, and, consequently, a negative emotional attitude towards what is unknown. The objective of the presented study was to discover to what extent the knowledge and emotional attitude of adolescents towards genetically modified organisms are related to the acceptance of growing genetically modified plants or breeding GM animals on their own farm or allotment garden, the purchase and consumption of GM food, as well as the use of GMOs in medicine. The study was conducted by the method of a diagnostic survey using a questionnaire designed by the author, which covered a group of 500 adolescents completing secondary school at the maturity examination level. The collected material was subjected to statistical analysis. Research hypotheses were verified using the chi-square test (χ²), Student's t-test, and stepwise regression analysis. Stepwise regression analysis showed that the readiness of adolescents to use genetically modified organisms as food or for the production of pharmaceuticals, or to produce GM plants or animals on their own farm, depends on their emotional-evaluative attitude towards GMOs and their level of knowledge concerning the essence of genetic modifications.

  17. By how much would limiting TV food advertising reduce childhood obesity?

    PubMed

    Veerman, J Lennert; Van Beeck, Eduard F; Barendregt, Jan J; Mackenbach, Johan P

    2009-08-01

There is evidence suggesting that food advertising causes childhood obesity. The strength of this effect is unclear. To inform decisions on whether to restrict advertising opportunities, we estimate how much of the childhood obesity prevalence is attributable to food advertising on television (TV). We constructed a mathematical simulation model to estimate the potential effects of reducing the exposure of 6- to 12-year-old US children to TV advertising for food on the prevalence of overweight and obesity. Model input was based on body measurements from NHANES 2003-04, the CDC-2000 cut-offs for weight categories, and literature that relates advertising to consumption levels and consumption to body mass. In an additional analysis we used a Delphi study to obtain experts' estimates of the effect of advertising on consumption. Based on literature findings, the model predicts that reducing the exposure to zero would decrease the average BMI by 0.38 kg/m² and lower the prevalence of obesity from 17.8% to 15.2% (95% uncertainty interval 14.8-15.6) for boys and from 15.9% to 13.5% (13.1-13.8) for girls. When estimates are based on expert opinion, these values are 11.0% (7.7-14.0) and 9.9% (7.2-12.4), respectively. This study suggests that from one in seven up to one in three obese children in the USA might not have been obese in the absence of advertising for unhealthy food on TV. Limiting the exposure of children to marketing of energy-dense food could be part of a broader effort to make children's diets healthier.
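The "one in seven up to one in three" figure can be reproduced from the quoted prevalences as an attributable fraction, (baseline - counterfactual) / baseline:

```python
# Prevalence figures (%) quoted in the abstract above.
baseline = {"boys": 17.8, "girls": 15.9}       # observed obesity prevalence
no_ads_lit = {"boys": 15.2, "girls": 13.5}     # zero-exposure, literature-based
no_ads_expert = {"boys": 11.0, "girls": 9.9}   # zero-exposure, expert-opinion

for sex in baseline:
    frac_lit = (baseline[sex] - no_ads_lit[sex]) / baseline[sex]
    frac_exp = (baseline[sex] - no_ads_expert[sex]) / baseline[sex]
    print(f"{sex}: attributable fraction {frac_lit:.0%} (literature) "
          f"to {frac_exp:.0%} (experts)")
```

For both sexes this gives roughly 15% (about one in seven) under the literature-based estimates and roughly 38% (about one in three) under the expert estimates.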

  18. Ochratoxin a and mitotic disruption: mode of action analysis of renal tumor formation by ochratoxin A.

    PubMed

    Mally, Angela

    2012-06-01

    The mycotoxin and food contaminant ochratoxin A (OTA) is a potent renal carcinogen in rodents, but its mode of action (MoA) is still poorly defined. In 2006, the European Food Safety Authority concluded that there is a "lack of evidence for the existence of OTA-DNA adducts" and thus insufficient evidence to establish DNA reactivity as a MoA for tumor formation by OTA. In reviewing the available database on OTA toxicity, a MoA for renal carcinogenicity of OTA is developed that involves a combination of genetic instability and increased proliferative drive as consequences of OTA-mediated disruption of mitosis, whereby the organ- and site-specificity of tumor formation by OTA is determined by selective renal uptake of OTA into the proximal tubule epithelium. The proposed MoA is critically assessed with respect to concordance of dose-response of the suggested key events and tumor formation, their temporal association, consistency, and biological plausibility. Uncertainties, data gaps and needs for further research are highlighted.

  19. Discrimination of sweeteners based on the refractometric analysis

    NASA Astrophysics Data System (ADS)

    Bodurov, I.; Vlaeva, I.; Viraneva, A.; Yovcheva, T.

    2017-01-01

In the present work, the refractive characteristics of aqueous solutions of several sweeteners are investigated. These data, in combination with those from other sensors, should find application in the rapid determination of sweetener content in food and the dynamic monitoring of food quality. The refractive indices of pure (distilled) water and of aqueous solutions of several commonly used natural and artificial sweeteners (glucose, fructose, sucrose, lactose, sorbitol [E420], isomalt [E953], saccharin sodium [E950], cyclamate sodium and glycerol [E422]) at 10 wt.% concentration are accurately measured at the 405 nm, 532 nm and 632.8 nm wavelengths. The measurements are carried out using a three-wavelength laser microrefractometer based on the total internal reflection method. The critical angle is determined by the disappearance of the diffraction orders from a metal grating. The experimental uncertainty is less than ±0.0001. The dispersion dependences of the refractive indices are obtained using the one-term Sellmeier model. Based on the obtained experimental data, additional refractive and dispersion characteristics are calculated.
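The one-term Sellmeier model, n²(λ) = 1 + Bλ²/(λ² - C), can be fitted from measurements at the three wavelengths by linearizing it: 1/(n² - 1) = 1/B - (C/B)/λ², which is linear in 1/λ². The refractive indices below are illustrative water-like values, not the study's data:

```python
import numpy as np

# Illustrative refractive indices at the three laser wavelengths (µm).
wavelengths_um = np.array([0.405, 0.532, 0.6328])
n = np.array([1.3428, 1.3337, 1.3317])

# Linearized one-term Sellmeier fit: y = 1/(n²-1) against x = 1/λ².
x = 1.0 / wavelengths_um ** 2
y = 1.0 / (n ** 2 - 1.0)
slope, intercept = np.polyfit(x, y, 1)   # slope = -C/B, intercept = 1/B
B = 1.0 / intercept
C = -slope * B
print(f"B = {B:.4f}, C = {C:.5f} µm²")
```

With B and C in hand, the fitted curve interpolates n at any visible wavelength, which is how the additional dispersion characteristics mentioned in the abstract can be derived.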

  20. Comparison of bactericidal efficiency of 7.5 MeV X-rays, gamma-rays, and 10 MeV e-beams

    NASA Astrophysics Data System (ADS)

    Song, Beom-Seok; Lee, Yunjong; Moon, Byeong-Geum; Go, Seon-Min; Park, Jong-Heum; Kim, Jae-Kyung; Jung, Koo; Kim, Dong-Ho; Ryu, Sang-Ryeol

    2016-08-01

    This study was performed to verify the feasibility of 7.5 MeV X-rays for food pasteurization by comparing their bactericidal efficiency with those of gamma-rays and 10 MeV electron beams against selected bacterial pathogens. No significant differences were observed in the overall bactericidal efficiency against beef-inoculated pathogens, given the uncertainty of the absorbed dose and the variation in bacterial counts. This result supports the conclusion that all three irradiation sources are effective for the inactivation of food-borne bacteria and that 7.5 MeV X-rays may be used for food pasteurization.

  1. The Water, Energy and Food Nexus: Finding the Balance in Infrastructure Investment

    NASA Astrophysics Data System (ADS)

    Huber-lee, A. T.; Wickel, B.; Kemp-Benedict, E.; Purkey, D. R.; Hoff, H.; Heaps, C.

    2013-12-01

    There is increasing evidence that single-sector infrastructure planning is leading to severely stressed human and ecological systems. There are a number of cross-sectoral impacts in these highly inter-linked systems. Examples include: promotion of biofuels that drives conversion away from food crops, reducing both food and water security; dams built solely for hydropower rather than multi-purpose use, which deplete fisheries and alter saltwater-intrusion dynamics in downstream deltas; and the historical use of water for cooling thermal power plants, now under increasing pressure from other water uses as well as from rising water temperatures that reduce cooling efficiency. This list can easily be expanded, as these inter-linkages are increasing over time. As developing countries see a need to invest in new infrastructure to improve the livelihoods of the poor, developed countries face deteriorating infrastructure and an opportunity for new investment. It is crucial, especially in the face of the uncertainty of climate change and socio-political realities, that infrastructure planning factor in the influence of multiple sectors and the potential impacts from the perspectives of different stakeholders. There is also a need for stronger linkages between science and policy. The Stockholm Environment Institute is developing and implementing practical and innovative nexus planning approaches in Latin America, Africa and Asia that bring together stakeholders and integrate uncertainty in a cross-sectoral quantitative framework using the tools WEAP (Water Evaluation and Planning) and LEAP (Long-range Energy Alternatives Planning). The steps used are to: (1) identify key actors and stakeholders via social network analysis; (2) work with these actors to scope out priority issues and decision criteria in both the short and long term; (3) develop quantitative models to clarify options and balances between the needs and priorities of different stakeholders; (4) present and visualize results in ways easily comprehended by the general public; and (5) identify current and potential future governance options to implement various infrastructure investments and institutional innovations. While this work is under active development, early results show the value of cross-sector integration. Perhaps the most crucial realization emerging from this body of work is that the current mode of single-sector infrastructure investment entails tremendous risk, given the interdependence of water, energy, food, and the environment and the uncertainties associated with climate change. By looking at a wider scope of water, energy and food trajectories, and seeing how these affect each other over time, stakeholders and decision makers can take advantage of potential synergies between sectors rather than looking solely at trade-offs. While climate change poses a tremendous challenge for infrastructure development, it is also emerging as a common concern among investors, developers, conservationists and others, presenting a unique opportunity for rethinking infrastructure development and balancing needs across sectors, including environmental needs. This paper provides practical approaches to illustrate the value of balancing across sectors.

  2. Information-based cues at point of choice to change selection and consumption of food, alcohol and tobacco products: a systematic review.

    PubMed

    Carter, Patrice; Bignardi, Giacomo; Hollands, Gareth J; Marteau, Theresa M

    2018-03-27

    Reducing harmful consumption of food, alcohol, and tobacco products would prevent many cancers, diabetes and cardiovascular disease. Placing information-based cues in the environments in which we select and consume these products has the potential to contribute to changing these behaviours. In this review, information-based cues are defined as those which comprise any combination of words, symbols, numbers or pictures that convey information about a product or its use. We specifically exclude cues which are located on the products themselves. We conducted a systematic review of randomised, cluster-randomised, and non-randomised controlled trials to assess the impact of such cues on selection and consumption. Thirteen studies met the inclusion criteria, of which 12 targeted food (most commonly fruit and vegetables), one targeted alcohol sales, and none targeted tobacco products. Ten studies reported statistically significant effects on some or all of the targeted products, although studies were insufficiently homogeneous to justify meta-analysis. Existing evidence suggests information-based cues can influence selection and consumption of food and alcohol products, although significant uncertainty remains. The current evidence base is limited in both quality and quantity, with relatively few, heterogeneous studies at unclear or high risk of bias. Additional, more rigorously conducted studies are warranted to better estimate the potential for these interventions to change selection and consumption of food, alcohol and tobacco products. PROSPERO 2016; CRD42016051884.

  3. Compilation and network analyses of cambrian food webs.

    PubMed

    Dunne, Jennifer A; Williams, Richard J; Martinez, Neo D; Wood, Rachel A; Erwin, Douglas H

    2008-04-29

    A rich body of empirically grounded theory has developed about food webs--the networks of feeding relationships among species within habitats. However, detailed food-web data and analyses are lacking for ancient ecosystems, largely because of the low resolution of taxa coupled with uncertain and incomplete information about feeding interactions. These impediments appear insurmountable for most fossil assemblages; however, a few assemblages with excellent soft-body preservation across trophic levels are candidates for food-web data compilation and topological analysis. Here we present plausible, detailed food webs for the Chengjiang and Burgess Shale assemblages from the Cambrian Period. Analyses of degree distributions and other structural network properties, including sensitivity analyses of the effects of uncertainty associated with Cambrian diet designations, suggest that these early Paleozoic communities share remarkably similar topology with modern food webs. Observed regularities reflect a systematic dependence of structure on the numbers of taxa and links in a web. Most aspects of Cambrian food-web structure are well-characterized by a simple "niche model," which was developed for modern food webs and takes into account this scale dependence. However, a few aspects of topology differ between the ancient and recent webs: longer path lengths between species and more species in feeding loops in the earlier Chengjiang web, and higher variability in the number of links per species for both Cambrian webs. Our results are relatively insensitive to the exclusion of low-certainty or random links. The many similarities between Cambrian and recent food webs point toward surprisingly strong and enduring constraints on the organization of complex feeding interactions among metazoan species. 
The few differences could reflect a transition to more strongly integrated and constrained trophic organization within ecosystems following the rapid diversification of species, body plans, and trophic roles during the Cambrian radiation. More research is needed to explore the generality of food-web structure through deep time and across habitats, especially to investigate potential mechanisms that could give rise to similar structure, as well as any differences.
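The "niche model" the authors apply has a compact, commonly cited construction (Williams & Martinez): each species receives a niche value on [0, 1], a feeding-range width drawn so that the expected connectance matches a target C, and a range centre; a species eats every species whose niche value falls within its range. A minimal sketch under those standard assumptions (illustrative parameters, not the authors' exact implementation):

```python
import random

def niche_model(S, C, seed=0):
    """Generate a food web with S species and target connectance C
    (links / S^2) using the niche model construction."""
    rng = random.Random(seed)
    beta = 1.0 / (2.0 * C) - 1.0        # so that E[range width] = 2*C*n_i
    n = sorted(rng.random() for _ in range(S))           # niche values
    links = set()
    for i in range(S):
        # inverse-CDF sample from Beta(1, beta)
        x = 1.0 - (1.0 - rng.random()) ** (1.0 / beta)
        r = x * n[i]                                     # feeding range width
        c = rng.uniform(r / 2.0, n[i])                   # range centre
        for j in range(S):
            if c - r / 2.0 <= n[j] <= c + r / 2.0:
                links.add((i, j))                        # i eats j
    return n, links
```

Degree distributions, path lengths, and loop counts of the kind analyzed in the paper can then be computed from the returned link set.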

  4. An uncertainty analysis of wildfire modeling [Chapter 13]

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  5. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated parameters often arise in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. The method is also applied to the uncertainty and sensitivity analysis of a deterministic HIV model.
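The effect the paper quantifies can be illustrated on the simplest case: for a linear model y = a·x1 + b·x2 with normal inputs, Var(y) = a²σ1² + b²σ2² + 2abρσ1σ2, so ignoring the correlation term ρ mis-estimates the output uncertainty. The sketch below is a Monte Carlo check of that identity using a 2×2 Cholesky factor, not a reproduction of the paper's analytic method.

```python
import math
import random

def propagate_linear(a, b, s1, s2, rho, n=200_000, seed=0):
    """Monte Carlo variance of y = a*x1 + b*x2 with correlated
    normal inputs, sampled via the 2x2 Cholesky factor of the
    covariance matrix [[s1^2, rho*s1*s2], [rho*s1*s2, s2^2]]."""
    rng = random.Random(seed)
    L21 = rho * s2
    L22 = s2 * math.sqrt(1.0 - rho * rho)
    ys = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        x1 = s1 * z1                 # Cholesky: x = L z
        x2 = L21 * z1 + L22 * z2
        ys.append(a * x1 + b * x2)
    m = sum(ys) / n
    return sum((y - m) ** 2 for y in ys) / (n - 1)

# Analytic check: Var(y) = a^2 s1^2 + b^2 s2^2 + 2 a b rho s1 s2
```

Setting rho = 0 in the same call shows how much of the output variance the independence assumption would miss.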

  6. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorous (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. 
Such insight can then be used to guide future data collection and model development and evaluation efforts.
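APLE's actual regression equations are not given in the abstract, but the comparison it performs, parameter uncertainty versus input uncertainty, can be illustrated on a hypothetical one-parameter loss equation y = β·x, where β carries the regression (parameter) uncertainty and x the input uncertainty. Freezing one source at a time separates their contributions:

```python
import random

def uncertainty_contributions(beta_hat, se_beta, x_obs, s_x,
                              n=100_000, seed=0):
    """Toy loss regression y = beta * x: compare prediction variance
    from parameter uncertainty vs input uncertainty by Monte Carlo,
    freezing each source in turn."""
    rng = random.Random(seed)

    def var(sample):
        vals = [sample() for _ in range(n)]
        m = sum(vals) / n
        return sum((v - m) ** 2 for v in vals) / (n - 1)

    both = var(lambda: rng.gauss(beta_hat, se_beta) * rng.gauss(x_obs, s_x))
    param_only = var(lambda: rng.gauss(beta_hat, se_beta) * x_obs)
    input_only = var(lambda: beta_hat * rng.gauss(x_obs, s_x))
    return both, param_only, input_only
```

For independent normal factors the analytic total is σβ²σx² + σβ²x² + σx²β², so the two frozen runs bracket the joint run, mirroring the relative-contribution comparison described in the abstract.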

  7. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
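One of the techniques listed, block bootstrap time-series sampling, resamples contiguous blocks rather than individual points so that short-range autocorrelation in the series is preserved. A minimal moving-block sketch (generic, not the study's code):

```python
import random

def block_bootstrap(series, block_len, n_boot, seed=0):
    """Moving-block bootstrap: build each replicate by concatenating
    randomly chosen contiguous blocks of the original series, then
    truncating to the original length."""
    rng = random.Random(seed)
    n = len(series)
    starts = range(n - block_len + 1)
    reps = []
    for _ in range(n_boot):
        rep = []
        while len(rep) < n:
            s = rng.choice(starts)
            rep.extend(series[s:s + block_len])
        reps.append(rep[:n])
    return reps
```

Each bootstrap replicate can then be fed through the coupled model to produce a distribution of outcomes, which is how bootstrap resampling typically enters an uncertainty analysis like this one.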

  8. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III.2. An analysis of the uncertainty incorporated into the assessment was performed, addressing the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  9. Determination and Uncertainty Analysis of Inorganic Arsenic in Husked Rice by Solid Phase Extraction and Atomic Absorption Spectrometry with Hydride Generation.

    PubMed

    Saxena, Sushil Kumar; Karipalli, Agnes Raju; Krishnan, Anoop A; Rangasamy, Rajesh; Malekadi, Praveen; Singh, Dhirendra P; Vasu, Vimesh; Singh, Vijay K

    2017-05-01

    This study enables the selective determination of inorganic arsenic (iAs) with a low detection limit using an economical instrument [atomic absorption spectrometer with hydride generation (HG)] to meet the regulatory requirements as per European Commission (EC) and Codex guidelines. Dry rice samples (0.5 g) were extracted using 0.1 M HNO3-3% H2O2 and heated in a water bath (90 ± 2°C) for 60 min. Through this process, all the iAs is solubilized and oxidized to arsenate [As(V)]. The centrifuged extract was loaded onto a preconditioned and equilibrated strong anion-exchange SPE column (silica-based Strata SAX 500 mg/6 mL), followed by selective and sequential elution of As(V), enabling the selective quantification of iAs using atomic absorption spectrometry with HG. In-house validation showed a mean recovery of 94% and an LOQ of 0.025 mg/kg. The repeatability (HorRatr) and reproducibility (HorRatR) values were <2, meeting the performance criteria mandated by the EC. The combined standard measurement uncertainty by this method was less than the maximum standard measurement uncertainty; thus, the method can be considered for official control purposes. The method was applied for the determination of iAs in husked rice samples and has potential applications in other food commodities.
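The HorRat acceptance check cited above compares the observed relative standard deviation with the classical Horwitz prediction PRSD = 2·C^(−0.1505), where C is the analyte concentration as a dimensionless mass fraction. A small sketch (the 20% RSD value is illustrative; note that at very low concentrations a modified predicted RSD is often substituted for the classical Horwitz curve):

```python
def horrat(rsd_percent, conc_mg_per_kg):
    """Horwitz ratio: observed relative SD (in %) divided by the
    Horwitz-predicted PRSD = 2 * C**(-0.1505), with C expressed as a
    dimensionless mass fraction.  Values <= 2 are conventionally
    taken as acceptable method performance."""
    c = conc_mg_per_kg * 1e-6      # mg/kg -> mass fraction
    prsd = 2.0 * c ** (-0.1505)
    return rsd_percent / prsd
```

At the method's LOQ of 0.025 mg/kg, an observed RSD of 20% would give a HorRat well below the acceptance limit of 2.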

  10. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
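The second kind of convergence test described, refining the thickness grid until interpolation error falls below a tolerance, can be sketched generically. The exponential curve in the test stands in for a dose-vs-depth curve; it is illustrative only, not the paper's transport data.

```python
import math

def converged_grid(f, lo, hi, tol, max_pts=4097):
    """Double the density of interpolation nodes on [lo, hi] until
    linear interpolation of f, checked at the midpoints between
    nodes, is within tol.  Returns (n_nodes, max_error)."""
    n = 3
    while n <= max_pts:
        xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
        ys = [f(x) for x in xs]
        err = 0.0
        for i in range(n - 1):
            xm = 0.5 * (xs[i] + xs[i + 1])
            ym = 0.5 * (ys[i] + ys[i + 1])   # linear interpolant at midpoint
            err = max(err, abs(ym - f(xm)))
        if err <= tol:
            return n, err
        n = 2 * n - 1                        # halve the node spacing
    raise RuntimeError("no convergence within max_pts nodes")
```

Because the midpoint error of linear interpolation scales as h², each doubling of the grid density cuts the error by roughly a factor of four, which is the convergence behaviour such a test looks for.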

  11. Agriculture waste and rising CO2

    USDA-ARS?s Scientific Manuscript database

    Currently, there are many uncertainties concerning agriculture’s role in global environmental change including the effects of rising atmospheric CO2 concentration. A viable and stable world food supply depends on productive agricultural systems, but environmental concerns within agriculture have to...

  12. Cost-effectiveness of a complex workplace dietary intervention: an economic evaluation of the Food Choice at Work study.

    PubMed

    Fitzgerald, Sarah; Murphy, Aileen; Kirby, Ann; Geaney, Fiona; Perry, Ivan J

    2018-03-03

    To evaluate the costs, benefits and cost-effectiveness of complex workplace dietary interventions, involving nutrition education and system-level dietary modification, from the perspective of healthcare providers and employers. Single-study economic evaluation of a cluster-controlled trial (Food Choice at Work (FCW) study) with 1-year follow-up. Four multinational manufacturing workplaces in Cork, Ireland. 517 randomly selected employees (18-65 years) from four workplaces. Cost data were obtained from the FCW study. Nutrition education included individual nutrition consultations, nutrition information (traffic light menu labelling, posters, leaflets and emails) and presentations. System-level dietary modification included menu modification (restriction of fat, sugar and salt), increase in fibre, fruit discounts, strategic positioning of healthier alternatives and portion size control. The combined intervention included nutrition education and system-level dietary modification. No intervention was implemented in the control. The primary outcome was an improvement in health-related quality of life, measured using the EuroQoL 5 Dimensions 5 Levels questionnaire. The secondary outcome measure was reduction in absenteeism, which was measured in monetary terms. Probabilistic sensitivity analysis (Monte Carlo simulation) assessed parameter uncertainty. The system-level intervention dominated the education and combined interventions. When compared with the control, the incremental cost-effectiveness ratio (€101.37/quality-adjusted life-year) is less than the nationally accepted ceiling ratio, so the system-level intervention can be considered cost-effective. The cost-effectiveness acceptability curve indicates there is some decision uncertainty surrounding this result, arising from uncertainty in the differences in effectiveness. These results are reiterated when the secondary outcome measure is considered in a cost-benefit analysis, whereby the system-level intervention yields the highest net benefit (€56.56 per employee). System-level dietary modification alone offers the most value, improving employee health-related quality of life and generating net benefit for employers by reducing absenteeism. While system-level dietary modification strategies are potentially sustainable obesity prevention interventions, future research should include long-term outcomes to determine if improvements in outcomes persist. ISRCTN35108237; Post-results. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
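The two quantities the evaluation turns on, the incremental cost-effectiveness ratio and the probability of cost-effectiveness from a Monte Carlo probabilistic sensitivity analysis (one point on a cost-effectiveness acceptability curve), can be sketched as follows. All numbers below are hypothetical inputs, not the FCW study's data.

```python
import random

def icer(d_cost, d_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return d_cost / d_qaly

def prob_cost_effective(mean_dc, sd_dc, mean_dq, sd_dq, ceiling,
                        n=50_000, seed=0):
    """Probabilistic sensitivity analysis: fraction of Monte Carlo draws
    with positive net monetary benefit, ceiling*dQALY - dCost > 0.
    Evaluated over a range of ceiling ratios this traces out a CEAC."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n)
        if ceiling * rng.gauss(mean_dq, sd_dq) - rng.gauss(mean_dc, sd_dc) > 0
    )
    return hits / n
```

An intervention is judged cost-effective when its ICER falls below the ceiling ratio, and the CEAC communicates the decision uncertainty around that judgement.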

  13. The Impacts of Various Environments Factors and Adaptive Management Strategies on Food Crops in the 21st Century Based on a Land Surface Model

    NASA Astrophysics Data System (ADS)

    Jain, A. K.; Lin, T. S.; Lawrence, P.; Kheshgi, H. S.

    2017-12-01

    Environmental factors - characterized by increasing levels of CO2, and changes in temperature and precipitation patterns - present potential risks to global food supply. To date, understanding of environmental factors' effects on crop production remains uncertain due to (1) uncertainties in projected trends of these factors and their spatial and temporal variability; (2) uncertainties in the physiological, genetic and molecular basis of crop adaptation to adaptive management practices (e.g. change in planting time, irrigation and N fertilization etc.) and (3) uncertainties in current land surface models to estimate the response of crop production to changes in environmental factors and management strategies. In this study we apply a process-based land surface model, the Integrated Science Assessment model (ISAM), to assess the impact of various environmental factors and management strategies on the production of row crops (corn, soybean and wheat) at regional and global scales. Results are compared to corresponding simulations performed with the crop model in the Community Land Model (CLM4.5). Each model is driven with historical atmospheric forcing data (1901-2005), and projected atmospheric forcing data under RCP 4.5 or RCP 8.5 (2006-2100) from CESM CMIP5 simulations to estimate the effects of different climate change projections on potential productivity of food crops at a global scale. For each set of atmospheric forcing data, production of each crop is simulated with and without inclusion of adaptive management practices (e.g. application of irrigation, N fertilization, change in planting time and crop cultivars etc.) to assess the effect of adaptation on projected crop production over the 21st century. 
In detail, three questions are addressed: (1) what is the impact of different climate change projections on global crop production; (2) what is the effect of adaptive management practices on projected crop production; and (3) how do differences in model mechanisms in ISAM and CLM4.5 impact projected global crop production and adaptive management practices (irrigation and N fertilizer) over the 21st century. The major outcomes of this study will help to understand the uncertainties in potential productivity of food crops under different environmental conditions and management practices.

  14. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
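Constraint (a) follows from the convexity of damages in warming: for any convex damage function, a mean-preserving spread of the temperature distribution raises expected damage. With the toy choice D(T) = T², E[D] = μ² + σ², so widening the uncertainty by σ raises expected damage by exactly σ². A numerical illustration (the quadratic damage function and the normal sensitivity distribution are illustrative assumptions, not the authors' model):

```python
import random

def expected_damage(mean_s, sd_s, n=200_000, seed=0):
    """Expected damage under the convex toy damage function D(T) = T^2,
    with warming T ~ Normal(mean_s, sd_s); E[T^2] = mean^2 + sd^2."""
    rng = random.Random(seed)
    return sum(rng.gauss(mean_s, sd_s) ** 2 for _ in range(n)) / n
```

Holding the best estimate fixed and only widening the spread increases the expected damage, which is the ordinal point the abstract makes: appeals to uncertainty cut toward more mitigation, not less.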

  15. Empirically Estimating the Potential for Farm-Level Adaptation to Climate Change in Western European Agriculture

    NASA Astrophysics Data System (ADS)

    Moore, F. C.; Lobell, D. B.

    2013-12-01

    Agriculture is one of the economic sectors most exposed to climate change and estimating the sensitivity of food production to these changes is critical for determining the severity of climate change impacts and for informing both adaptation and mitigation policy. While climate change might have adverse effects in many areas, it has long been recognized that farmers have a suite of adaptation options at their disposal including, inter alia, changing planting date, varieties, crops, or the mix and quantity of inputs applied. These adaptations may significantly reduce the adverse impacts of climate change but the potential effectiveness of these options and the speed with which farmers will adopt them remain uncertain. We estimate the sensitivity of crop yields and farm profits in western Europe to climate change with and without the adoption of on-farm adaptations. We use cross-sectional variation across farms to define the long-run response function that includes adaptation and inter-annual variation within farms to define the short-run response function without adaptation. The difference between these can be interpreted as the potential for adaptation. We find that future warming will have a large adverse impact on wheat and barley yields and that adaptation will only be able to mitigate a small fraction of this. Maize, oilseed and sugarbeet yields are more modestly affected and adaptation is more effective for these crops. Farm profits could increase slightly under moderate amounts of warming if adaptations are adopted but will decline in the absence of adaptation. A decomposition of variance gives the relative importance of different sources of uncertainty in projections of climate change impacts. We find that in most cases uncertainty over future adaptation pathways (whether farmers will or will not adopt beneficial adaptations) is the most important source of uncertainty in projecting the effect of temperature changes on crop yields and farm profits. 
This source of uncertainty dominates both uncertainty over temperature projections (climate uncertainty) and uncertainty over how sensitive crops or profits are to changes in temperature (response uncertainty). Therefore, constraining how quickly farmers are likely to adapt will be essential for improving our understanding of how climate change will affect food production over the next few decades.

  16. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
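The structure of a formal likelihood with lag-1 autocorrelated, heteroscedastic residuals can be sketched as below. For brevity the sketch uses a Gaussian density in place of the Skew Exponential Power distribution the paper actually derives, and a simple linear variance model; both are stand-in assumptions.

```python
import math

def log_likelihood(obs, sim, phi, sigma0, sigma1):
    """Gaussian simplification of a lag-1 autocorrelated,
    heteroscedastic error model: residuals e_t = obs_t - sim_t,
    sd_t = sigma0 + sigma1 * sim_t, and decorrelated innovations
    a_t = e_t - phi * e_{t-1} (a_1 = e_1).  The paper uses a Skew
    Exponential Power density; Gaussian is shown for brevity."""
    e = [o - s for o, s in zip(obs, sim)]
    sig = [sigma0 + sigma1 * s for s in sim]
    ll, prev = 0.0, 0.0
    for t, (et, st) in enumerate(zip(e, sig)):
        a = et - (phi * prev if t > 0 else 0.0)
        ll += -0.5 * math.log(2 * math.pi * st * st) - 0.5 * (a / st) ** 2
        prev = et
    return ll
```

In an MCMC setting this function would be evaluated at each proposed parameter (and input) vector, which is the coupling the BAIPU approach formalizes.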

  17. Managing Food Allergens in the U.K. Retail Supply Chain.

    PubMed

    Walker, Michael J; Gowland, M Hazel; Points, John

    2018-01-01

    The U.K. food and grocery market is highly significant financially and dominated by 10 retailers within a regulated and extremely economically competitive environment. We summarize the approach of U.K. retailers to allergen risk assessment (RA) and risk management (RM) within the U.K. legal framework and explore public visibility of retailers' allergen policies. RA and RM of allergens appear effective in curtailing retail-triggered severe food allergy reactions. However, allergen recalls remain high, precautionary allergen labeling (PAL) remains an area of confusion, and there is no consistent Web-based provision of information for consumers who have allergies. Resolution of PAL awaits an agreed-on threshold framework, but a key challenge is to engage with patients and gain their trust rather than thrust education at them. It would be helpful for retailers to publish their allergen RA and RM policies. A target should be agreed on between government and retailers for a reduction in the proliferation of PAL wording variants by a given date within the next 3 years. A further hurdle is potentially flawed allergen analysis; the development of reference methods and reference materials is an acknowledged need. Laboratories should report allergen results in an informative manner, communicating uncertainty and caveats. Ideally a laboratory representative would be included on any incident control team. Efforts must continue to standardize preparedness for protecting and defending food and drink from deliberate attack.

  18. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    PubMed

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
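
    A minimal string-search scorer of the kind mentioned above might look as follows; the hedge terms and weights are invented for illustration and are not a validated lexicon.

```python
import re

# Toy string-search scorer for hedge language in a report. The lexicon and
# weights are invented for illustration; they are not a validated vocabulary.
HEDGES = {"possible": 1, "probable": 2, "likely": 2, "cannot exclude": 3}

def uncertainty_score(report: str) -> int:
    text = report.lower()
    return sum(weight * len(re.findall(r"\b" + re.escape(term) + r"\b", text))
               for term, weight in HEDGES.items())
```

    Natural language processing, topic modeling, or machine learning, as listed in the abstract, would refine this crude count into context-aware confidence estimates.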

  19. What is a food and what is a medicinal product in the European Union? Use of the benchmark dose (BMD) methodology to define a threshold for "pharmacological action".

    PubMed

    Lachenmeier, Dirk W; Steffen, Christian; el-Atma, Oliver; Maixner, Sibylle; Löbell-Behrends, Sigrid; Kohl-Himmelseher, Matthias

    2012-11-01

    The decision criterion for the demarcation between foods and medicinal products in the EU is significant "pharmacological action". Based on six examples of substances with ambivalent status, the benchmark dose (BMD) method is evaluated to provide a threshold for pharmacological action. Using significant dose-response models from literature clinical trial data or epidemiology, the BMD values were 63 mg/day for caffeine, 5 g/day for alcohol, 6 mg/day for lovastatin, 769 mg/day for glucosamine sulfate, 151 mg/day for Ginkgo biloba extract, and 0.4 mg/day for melatonin. The examples for caffeine and alcohol validate the approach because intake above BMD clearly exhibits pharmacological action. Nevertheless, due to uncertainties in dose-response modelling as well as the need for additional uncertainty factors to consider differences in sensitivity within the human population, a "borderline range" on the dose-response curve remains. "Pharmacological action" has proven to be not very well suited as a binary decision criterion between foods and medicinal products. The European legislator should rethink the definition of medicinal products, as the current situation based on complicated case-by-case decisions on pharmacological action leads to an unregulated market flooded with potentially illegal food supplements. Copyright © 2012 Elsevier Inc. All rights reserved.
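
    As a hedged illustration of the BMD idea, the sketch below inverts a hypothetical logistic dose-response model to find the dose at which the response reaches a chosen benchmark response (BMR); the coefficients are made up and do not reproduce any of the paper's values.

```python
import math

# Hypothetical logistic dose-response model P(d) = 1 / (1 + exp(-(b0 + b1*ln d))).
# The coefficients are invented and do not reproduce any value from the paper.
def response(dose, b0=-4.0, b1=1.2):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(dose))))

def benchmark_dose(bmr=0.10, b0=-4.0, b1=1.2):
    """Dose at which the modelled response equals the benchmark response (BMR)."""
    logit = math.log(bmr / (1.0 - bmr))          # invert the logistic analytically
    return math.exp((logit - b0) / b1)
```

    In practice BMD software fits several candidate dose-response models to data and reports confidence limits (e.g. the BMDL), which is where the "borderline range" discussed above arises.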

  20. How many food additives are rodent carcinogens?

    PubMed

    Johnson, F M

    2002-01-01

    One generally assumes that chemical agents added to foods are reasonably free of risks to human health, and practically everyone consumes some additives in his or her food daily throughout life. In the United States, the 1958 Food Additives Amendment to the Federal Food, Drug and Cosmetic Act of 1938 requires food manufacturers to demonstrate the safety of food additives to the Food and Drug Administration (FDA). The Amendment contains a provision that prohibits approval of an additive if it is found to cause cancer in humans or animals. In the present study, data from the National Toxicology Program rodent bioassay (NTPRB) were used to identify a sample of approximately 50 rodent-tested additives and other chemicals added to food that had been evaluated independently of the FDA/food industry. Surprisingly, the sample shows more than 40% of these food chemicals to be carcinogenic in one or more rodent groups. If this percentage is extrapolated to all substances added to food in the United States, it would imply that more than 1000 of such substances are potential rodent carcinogens. The NTP and FDA test guidelines use similar, though not necessarily identical, rodent test procedures, including near lifetime exposures to the maximum tolerated dose. The FDA specifies that test chemicals should be administered by the oral route. However, the oral route includes three methods of delivering chemicals, that is, mixed in the food or water or delivered by stomach tube (gavage). The NTP data show that only 1 of 18 food chemicals mixed in the food is a rodent carcinogen, whereas 16 of 23 gavage-administered food chemicals are carcinogenic to rodents. The distribution suggests that among orally delivered chemicals, those administered in the feed will more likely prove to be noncarcinogens than chemicals given by gavage. The rodent data also reveal that effects may vary according to dose and genotype, as well as by route of administration, to further complicate extrapolation to humans. Human experience with known carcinogens such as tobacco, asbestos, and benzidine convinces us that environmental carcinogens constitute a real threat to human health, although predicting human carcinogens from rodent tests involves a number of uncertainties. These uncertainties do not mean that we should simply ignore the presence of carcinogens. Rather, in the interests of public safety, a serious effort should be made to resolve the questions surrounding the presence of chemicals identified as rodent carcinogens in our food. Environ. Mol. Mutagen. 39:69-80, 2002. Published 2002 Wiley-Liss, Inc.

  1. Development of new reference material neohesperidin for quality control of dietary supplements.

    PubMed

    Gong, Ningbo; Zhang, Baoxi; Yang, Dezhi; Gao, Zhaolin; Du, Guanhua; Lu, Yang

    2015-07-01

    Neohesperidin is an important natural flavanone glycoside distributed in several citrus species. This compound is widely used as a raw material for food additives in the food industry. The request for certified reference materials (CRMs) in dietary supplements was stipulated by the National Administrative Committee for CRMs and was underpinned by the need to improve the accuracy and comparability of measurement data and to establish metrological traceability of analytical results. This paper reports the sample preparation methodology, homogeneity and stability studies, value assignment and uncertainty estimation of a new certified reference material of neohesperidin (GBW09522). Differential scanning calorimetry, coulometric titration and mass balance methods proved to be sufficiently reliable and accurate for certification purposes. The certified value of neohesperidin CRM is 994 g kg(-1) with an expanded uncertainty of 4 g kg(-1) (k = 2). The reference material described above was homogeneous and stable for 12 months at a storage temperature of 25 °C. The new CRM of neohesperidin can be used to validate analytical methods and improve the accuracy of measurement data as well as quality control of neohesperidin-related dietary supplements, foods, traditional herbs and pharmaceutical formulations. © 2014 Society of Chemical Industry.
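
    The expanded uncertainty reported for the CRM follows the usual GUM recipe: combine independent standard-uncertainty components in quadrature, then multiply by a coverage factor k = 2. The component values below are hypothetical and do not represent the actual budget for GBW09522.

```python
import math

# GUM-style uncertainty budget. The component values are hypothetical and do
# not represent the actual budget for the neohesperidin CRM (GBW09522).
components_g_per_kg = {
    "characterization": 1.2,  # purity assessment (standard uncertainty)
    "homogeneity": 0.9,
    "stability": 1.1,
}

u_combined = math.sqrt(sum(u ** 2 for u in components_g_per_kg.values()))
U_expanded = 2.0 * u_combined   # coverage factor k = 2, roughly 95 % coverage
```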

  2. Food nanotechnology – an overview

    PubMed Central

    Sekhon, Bhupinder S

    2010-01-01

    Food nanotechnology is an area of emerging interest and opens up a whole universe of new possibilities for the food industry. The basic categories of nanotechnology applications and functionalities currently in the development of food packaging include: the improvement of plastic materials barriers, the incorporation of active components that can deliver functional attributes beyond those of conventional active packaging, and the sensing and signaling of relevant information. Nano food packaging materials may extend food life, improve food safety, alert consumers that food is contaminated or spoiled, repair tears in packaging, and even release preservatives to extend the life of the food in the package. Nanotechnology applications in the food industry can be utilized to detect bacteria in packaging, produce stronger flavors and colors, and improve safety by increasing barrier properties. Nanotechnology holds great promise to provide benefits not just within food products but also around food products. In fact, nanotechnology introduces new opportunities for innovation in the food industry at immense speed, but uncertainty and health concerns are also emerging. EU/WE/global legislation for the regulation of nanotechnology in food is meager. Moreover, current legislation appears unsuited to the specificity of nanotechnology. PMID:24198465

  3. Food waste in the Swiss food service industry - Magnitude and potential for reduction.

    PubMed

    Betz, Alexandra; Buchli, Jürg; Göbel, Christine; Müller, Claudia

    2015-01-01

    Food losses occur across the whole food supply chain. They have negative effects on the economy and the environment, and they are not justifiable from an ethical point of view. The food service industry was identified by Beretta et al. (2013) as the third largest source of food waste based on food input at each stage of the value added chain. The total losses are estimated at 18% of the food input, the avoidable losses at 13.5%. However, these estimates carry considerable uncertainty. To get more reliable and detailed data of food losses in this sector, the waste from two companies (in the education and business sectors) was classified into four categories (storage losses, preparation losses, serving losses, and plate waste) and seven food classes and measured for a period of five days. A questionnaire evaluated customer reaction, and a material flow analysis was used to describe the mass and monetary losses within the process chain. The study found that in company A (education sector) 10.73% and in company B (business sector) 7.69% of the mass of all food delivered was wasted during the process chain. From this, 91.98% of the waste in company A and 78.14% in company B were classified as avoidable. The highest proportion of waste occurred from serving losses, with starch accompaniments and vegetables being the most frequently wasted items. The quantities of waste per meal were 91.23 g (value CHF 0.74) and 85.86 g (value CHF 0.44) for company A and company B, respectively. The annual loss averaged 10.47 tonnes (value CHF 85,047) in company A and 16.55 tonnes (value CHF 85,169) in company B. The customer survey showed that 15.79% (n=356) of the respondents in company A and 18.32% (n=382) in company B produced plate waste. The main causes of plate waste cited were 'portion served by staff too large' and 'lack of hunger'. Sustainable measures need to be implemented in the food service industry to reduce food waste and to improve efficiency.
Copyright © 2014 Elsevier Ltd. All rights reserved.
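
    The study's per-meal and annual figures can be linked by simple mass-flow arithmetic; the annual meal count below is a hypothetical value chosen only to show the calculation (the per-meal waste and avoidable share are taken from the abstract).

```python
# Back-of-envelope mass-flow accounting in the spirit of the study. The
# per-meal waste and avoidable share are from the abstract (company A); the
# annual meal count is a hypothetical figure chosen only to show the arithmetic.
waste_per_meal_g = 91.23
meals_per_year = 114_000          # hypothetical
avoidable_share = 0.9198          # 91.98 % of waste classified as avoidable

annual_waste_t = waste_per_meal_g * meals_per_year / 1e6   # grams -> tonnes
avoidable_t = annual_waste_t * avoidable_share
```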

  4. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  5. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task of stability analysis of any geotechnical systems. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is considered possible. The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab tests or in situ surveys), improving the measurement methods or evaluating calculation procedure with model tests, confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty.
On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e.g., Baudrit et al., 2007) for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied on two case studies (a mine pillar and a steep slope stability analysis, Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively the extraction ratio and the cliff geometry). References Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
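
    In the possibilistic setting referenced above, imprecise (epistemic) quantities are often encoded as triangular possibility distributions and analysed through their alpha-cuts. A minimal sketch, with an invented triangular distribution:

```python
# Alpha-cut of a triangular possibility distribution with support [a, b]
# and core m (the numbers below are invented, e.g. an imprecise extraction ratio).
def alpha_cut(a, m, b, alpha):
    """Return the interval [lo, hi] cut at possibility level alpha in [0, 1]."""
    lo = a + alpha * (m - a)
    hi = b - alpha * (b - m)
    return lo, hi

# Nested intervals: the cut shrinks from the full support (alpha=0) to the core (alpha=1)
cuts = [alpha_cut(0.2, 0.5, 0.9, k / 10) for k in range(11)]
```

    Propagating each cut through the stability model yields lower and upper bounds on the failure probability P at each confidence level, which is the basis of the graphical tool described above.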

  6. Food Taxes: A New Holy Grail?

    PubMed Central

    Devisch, Ignaas

    2013-01-01

    In an effort to reduce the growing prevalence of overweight and obesity, food taxes have been introduced in several European countries, the so-called ‘obesitax’. Although little evidence is at hand as yet, policy measures are being taken to counterweight the consumption of unhealthy food or the increasing diet-related diseases. Several questions need to be discussed, starting from a general perspective: can food taxes become an appropriate and just policy measure to reduce overweight and obesity and therefore increase consumer’s health? The implementation of an effective and fair food tax is an exercise riddled with uncertainty. Not only is there a need for evidence on the health and economic impact of food taxes, we also have to think about a conceptual and ethical discussion concerning the balance between health imperatives and public health on the one hand, and social and ethical standards on the other hand. PMID:24596843

  7. Food taxes: a new holy grail?

    PubMed

    Devisch, Ignaas

    2013-08-01

    In an effort to reduce the growing prevalence of overweight and obesity, food taxes have been introduced in several European countries, the so-called 'obesitax'. Although little evidence is at hand as yet, policy measures are being taken to counterweight the consumption of unhealthy food or the increasing diet-related diseases. Several questions need to be discussed, starting from a general perspective: can food taxes become an appropriate and just policy measure to reduce overweight and obesity and therefore increase consumer's health? The implementation of an effective and fair food tax is an exercise riddled with uncertainty. Not only is there a need for evidence on the health and economic impact of food taxes, we also have to think about a conceptual and ethical discussion concerning the balance between health imperatives and public health on the one hand, and social and ethical standards on the other hand.

  8. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis, applying different turbulence models, and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow of a backward facing step.

  9. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. This presentation includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).

  10. US Food Security and Climate Change: Mid-Century Projections of Commodity Crop Production by the IMPACT Model

    NASA Astrophysics Data System (ADS)

    Takle, E. S.; Gustafson, D. I.; Beachy, R.; Nelson, G. C.; Mason-D'Croz, D.; Palazzo, A.

    2013-12-01

    Agreement is developing among agricultural scientists on the emerging inability of agriculture to meet growing global food demands. The lack of additional arable land and availability of freshwater have long been constraints on agriculture. Changes in trends of weather conditions that challenge physiological limits of crops, as projected by global climate models, are expected to exacerbate the global food challenge toward the middle of the 21st century. These climate- and constraint-driven crop production challenges are interconnected within a complex global economy, where diverse factors add to price volatility and food scarcity. We use the DSSAT crop modeling suite, together with mid-century projections of four AR4 global models, as input to the International Food Policy Research Institute IMPACT model to project the impact of climate change on food security through the year 2050 for internationally traded crops. IMPACT is an iterative model that responds to endogenous and exogenous drivers to dynamically solve for the world prices that ensure global supply equals global demand. The modeling methodology reconciles the limited spatial resolution of macro-level economic models that operate through equilibrium-driven relationships at a national level with detailed models of biophysical processes at high spatial resolution. The analysis presented here suggests that climate change in the first half of the 21st century does not represent a near-term threat to food security in the US due to the availability of adaptation strategies (e.g., loss of current growing regions is balanced by gain of new growing regions). However, as climate continues to trend away from 20th century norms current adaptation measures will not be sufficient to enable agriculture to meet growing food demand. 
Climate scenarios from higher-level carbon emissions exacerbate the food shortfall, although uncertainty in climate model projections (particularly precipitation) is a limitation to impact studies.
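
    The core of the equilibrium step described above (solving for world prices at which global supply equals global demand) can be sketched with a simple bisection; the linear supply and demand functions and their coefficients are toy values, not IMPACT's.

```python
# Bisection sketch of the equilibrium idea behind IMPACT: find the world price
# at which global supply equals global demand. The linear functional forms and
# coefficients are toy values, not IMPACT's.
def supply(p):
    return 50.0 + 10.0 * p      # supply rises with price

def demand(p):
    return 200.0 - 20.0 * p     # demand falls with price

def equilibrium_price(lo=0.0, hi=100.0, tol=1e-10):
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if supply(mid) < demand(mid):   # excess demand -> raise price
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```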

  11. Science, law, and politics in the Food and Drug Administration's genetically engineered foods policy: FDA's 1992 policy statement.

    PubMed

    Pelletier, David L

    2005-05-01

    The US Food and Drug Administration's (FDA's) 1992 policy statement was developed in the context of critical gaps in scientific knowledge concerning the compositional effects of genetic transformation and severe limitations in methods for safety testing. FDA acknowledged that pleiotropy and insertional mutagenesis may cause unintended changes, but it was unknown whether this happens to a greater extent in genetic engineering compared with traditional breeding. Moreover, the agency was not able to identify methods by which producers could screen for unintended allergens and toxicants. Despite these uncertainties, FDA granted genetically engineered foods the presumption of GRAS (Generally Recognized As Safe) and recommended that producers use voluntary consultations before marketing them.

  12. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion and, with examples, will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  13. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion and, with examples, will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
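
    In PRA practice, failure-rate uncertainty is commonly encoded as a lognormal with a median and a 90 % error factor EF (the ratio of the 95th percentile to the median). A minimal sketch of the standard conversions, using illustrative numbers:

```python
import math

# Standard PRA conversions for a lognormal failure-rate uncertainty expressed
# as a median and a 90 % error factor EF (95th percentile / median). The
# numbers below are illustrative.
def lognormal_from_median_ef(median, ef):
    mu = math.log(median)
    sigma = math.log(ef) / 1.645            # EF = exp(z95 * sigma), z95 ~= 1.645
    return {
        "mu": mu,
        "sigma": sigma,
        "p5": median / ef,
        "p95": median * ef,
        "mean": math.exp(mu + sigma ** 2 / 2.0),   # mean exceeds the median
    }

summary = lognormal_from_median_ef(1e-5, 3.0)
```

    The mean/median gap illustrates why centrality measures must be chosen carefully when failure-rate dispersion is large.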

  14. DIETARY INTAKE OF YOUNG CHILDREN

    EPA Science Inventory

    Dietary exposure research supports the requirements of the Food Quality Protection Act (FQPA) of 1996 by improving methods of aggregate and cumulative exposure assessments for children. The goal of this research is to reduce the level of uncertainty in assessing the dietary path...

  15. Rising temperatures reduce global wheat production

    USDA-ARS?s Scientific Manuscript database

    Crop models are essential to assess the threat of climate change for food production but have not been systematically tested against temperature experiments, despite demonstrated uncertainty in temperature response. Herein, we compare 30 different wheat crop models against field experiments in which...

  16. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariograms. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
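
    The structural parameters at issue (range, sill, nugget) enter through the semivariogram model itself. A sketch of the common exponential form (the factor 3 reflects the "practical range" convention, one of several in use):

```python
import math

# Exponential semivariogram written in terms of the structural parameters
# (nugget, sill, range). The factor 3 uses the common "practical range"
# convention; other conventions exist.
def exp_semivariogram(h, nugget, sill, practical_range):
    if h == 0:
        return 0.0
    return nugget + (sill - nugget) * (1.0 - math.exp(-3.0 * h / practical_range))
```

    Placing beta priors on (nugget, sill, range) and updating them, as the study does, then induces a distribution over semivariograms rather than a single fixed curve.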

  17. Quantifying model-structure- and parameter-driven uncertainties in spring wheat phenology prediction with Bayesian analysis

    DOE PAGES

    Alderman, Phillip D.; Stanfill, Bryan

    2016-10-06

    Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relative contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.
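
    The random walk Metropolis algorithm used for parameter estimation can be sketched in a few lines; the standard-normal target below is a stand-in for a real phenology-model posterior.

```python
import math
import random

random.seed(0)

# Random walk Metropolis sketch. The standard-normal log-posterior below is a
# stand-in for a real phenology-model posterior.
def log_post(theta):
    return -0.5 * theta ** 2

def metropolis(n_iter=20000, step=1.0, theta0=0.0):
    theta, lp = theta0, log_post(theta0)
    samples = []
    for _ in range(n_iter):
        proposal = theta + random.gauss(0.0, step)        # symmetric random walk
        lp_prop = log_post(proposal)
        if math.log(random.random()) < lp_prop - lp:      # Metropolis acceptance
            theta, lp = proposal, lp_prop
        samples.append(theta)
    return samples

samples = metropolis()
```

    The retained chain approximates the full posterior distribution, which is what the study argues deserves more attention than point estimates alone.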

  18. A unified approach for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-09-01

    Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.

  19. Qalibra: a general model for food risk-benefit assessment that quantifies variability and uncertainty.

    PubMed

    Hart, Andy; Hoekstra, Jeljer; Owen, Helen; Kennedy, Marc; Zeilmaker, Marco J; de Jong, Nynke; Gunnlaugsdottir, Helga

    2013-04-01

    The EU project BRAFO proposed a framework for risk-benefit assessment of foods, or changes in diet, that present both potential risks and potential benefits to consumers (Hoekstra et al., 2012a). In higher tiers of the BRAFO framework, risks and benefits are integrated quantitatively to estimate net health impact measured in DALYs or QALYs (disability- or quality-adjusted life years). This paper describes a general model that was developed by a second EU project, Qalibra, to assist users in conducting these assessments. Its flexible design makes it applicable to a wide range of dietary questions involving different nutrients, contaminants and health effects. Account can be taken of variation between consumers in their diets and also other characteristics relevant to the estimation of risk and benefit, such as body weight, gender and age. Uncertainty in any input parameter may be quantified probabilistically, using probability distributions, or deterministically by repeating the assessment with alternative assumptions. Uncertainties that are not quantified should be evaluated qualitatively. Outputs produced by the model are illustrated using results from a simple assessment of fish consumption. More detailed case studies on oily fish and phytosterols are presented in companion papers. The model can be accessed as web-based software at www.qalibra.eu. Copyright © 2012. Published by Elsevier Ltd.
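
    The probabilistic integration of risks and benefits into a net health impact can be illustrated with a toy Monte Carlo; the distributions and effect sizes below are invented and are not Qalibra's models.

```python
import random

random.seed(42)

# Toy probabilistic integration of risk and benefit into a net health impact
# (DALYs averted minus DALYs incurred). The normal distributions and their
# parameters are invented, not Qalibra's models.
N = 50_000
net = []
for _ in range(N):
    benefit_dalys = random.gauss(120.0, 30.0)   # uncertain benefit
    risk_dalys = random.gauss(40.0, 20.0)       # uncertain risk
    net.append(benefit_dalys - risk_dalys)

net.sort()
median = net[N // 2]
p5, p95 = net[int(0.05 * N)], net[int(0.95 * N)]
```

    Reporting the percentile band alongside the median, rather than a single number, is the kind of output the model produces for dietary questions such as fish consumption.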

  20. Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reinath, Michael S.

    1997-01-01

    Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
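
    The reduction from +/-5 m/s (single image) to +/-0.5 m/s (100-image average) quoted above follows the standard 1/sqrt(N) scaling of random uncertainty under averaging of independent images, which a one-line check reproduces:

```python
import math

def averaged_precision(single_image_sigma, n_images):
    # Random (precision) uncertainty of an N-image average falls as 1/sqrt(N),
    # assuming independent, identically distributed image-to-image noise.
    return single_image_sigma / math.sqrt(n_images)

print(averaged_precision(5.0, 1))    # 5.0 -> +/-5 m/s for a single image
print(averaged_precision(5.0, 100))  # 0.5 -> +/-0.5 m/s for a 100-image average
```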

  1. Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty

    DOE PAGES

    Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.

    2016-09-12

    Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as a function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.
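
    The classical variance-based analysis that DSA generalizes can be sketched with a standard Monte Carlo (pick-freeze) estimate of first-order Sobol indices on a toy model with a known answer; this illustrates the baseline ranking step, not the DSA index function itself.

```python
import random

random.seed(0)

# Toy model with known answer: variance from X1 : X2 is 1 : 4,
# so the first-order indices are S1 = 0.2 and S2 = 0.8.
def model(x1, x2):
    return x1 + 2.0 * x2

N = 20000
A = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]
B = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(N)]

def cov(u, v):
    mu, mv = sum(u) / N, sum(v) / N
    return sum((ui - mu) * (vi - mv) for ui, vi in zip(u, v)) / N

# Pick-freeze estimator: S_i = Cov(f(A), f(A_i, B_~i)) / Var(f(A)).
yA = [model(x1, x2) for x1, x2 in A]
y1 = [model(A[k][0], B[k][1]) for k in range(N)]  # keep X1, resample X2
y2 = [model(B[k][0], A[k][1]) for k in range(N)]  # keep X2, resample X1
var = cov(yA, yA)
S1, S2 = cov(yA, y1) / var, cov(yA, y2) / var
print(round(S1, 2), round(S2, 2))  # approximately 0.2 and 0.8
```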

  3. Predicting Consumer Biomass, Size-Structure, Production, Catch Potential, Responses to Fishing and Associated Uncertainties in the World's Marine Ecosystems.

    PubMed

    Jennings, Simon; Collingridge, Kate

    2015-01-01

    Existing estimates of fish and consumer biomass in the world's oceans are disparate. This creates uncertainty about the roles of fish and other consumers in biogeochemical cycles and ecosystem processes, the extent of human and environmental impacts and fishery potential. We develop and use a size-based macroecological model to assess the effects of parameter uncertainty on predicted consumer biomass, production and distribution. Resulting uncertainty is large (e.g. median global biomass 4.9 billion tonnes for consumers weighing 1 g to 1000 kg; 50% uncertainty intervals of 2 to 10.4 billion tonnes; 90% uncertainty intervals of 0.3 to 26.1 billion tonnes) and driven primarily by uncertainty in trophic transfer efficiency and its relationship with predator-prey body mass ratios. Even the upper uncertainty intervals for global predictions of consumer biomass demonstrate the remarkable scarcity of marine consumers, with less than one part in 30 million by volume of the global oceans comprising tissue of macroscopic animals. Thus the apparently high densities of marine life seen in surface and coastal waters and frequently visited abundance hotspots will likely give many in society a false impression of the abundance of marine animals. Unexploited baseline biomass predictions from the simple macroecological model were used to calibrate a more complex size- and trait-based model to estimate fisheries yield and impacts. Yields are highly dependent on baseline biomass and fisheries selectivity. Predicted global sustainable fisheries yield increases ≈4-fold when smaller individuals (< 20 cm from species of maximum mass < 1 kg) are targeted in all oceans, but the predicted yields would rarely be accessible in practice and this fishing strategy leads to the collapse of larger species if fishing mortality rates on different size classes cannot be decoupled. Our analyses show that models with minimal parameter demands that are based on a few established ecological principles can support equitable analysis and comparison of diverse ecosystems. The analyses provide insights into the effects of parameter uncertainty on global biomass and production estimates, which have yet to be achieved with complex models, and will therefore help to highlight priorities for future research and data collection. However, the focus on simple model structures and global processes means that non-phytoplankton primary production and several groups, structures and processes of ecological and conservation interest are not represented. Consequently, our simple models become increasingly less useful than more complex alternatives when addressing questions about food web structure and function, biodiversity, resilience and human impacts at smaller scales and for areas closer to coasts.
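
    Why trophic transfer efficiency (TE) dominates the uncertainty is easy to see: energy reaching consumers n trophic steps above primary producers scales roughly as TE^n, so a modest TE range compounds multiplicatively. The TE values and step count below are illustrative, not the model's fitted parameters.

```python
# Energy fraction reaching consumers after n trophic steps scales as TE**n,
# so modest uncertainty in TE compounds into a large biomass uncertainty.
def relative_energy(te, steps):
    return te ** steps

for te in (0.08, 0.10, 0.125):
    print(f"TE={te}: relative energy after 4 steps = {relative_energy(te, 4):.2e}")
```

    A TE range of 0.08 to 0.125 already spans roughly a six-fold difference in energy (and hence biomass) after four trophic steps, consistent with the wide uncertainty intervals reported above.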

  4. Balancing consumer protection and scientific integrity in the face of uncertainty: the example of gluten-free foods.

    PubMed

    McCabe, Margaret Sova

    2010-01-01

    In 2009, gluten-free foods were not only "hot" in the marketplace, several countries, including the United States, continued efforts to define gluten-free and appropriate labeling parameters. The regulatory process illuminates how difficult regulations based on safe scientific thresholds can be for regulators, manufacturers and consumers. This article analyzes the gluten-free regulatory landscape, challenges to defining a safe gluten threshold, and how consumers might need more label information beyond the term "gluten-free." The article includes an overview of international gluten-free regulations, the Food and Drug Administration (FDA) rulemaking process, and issues for consumers.

  5. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
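
    The stochastic sampling idea behind XSUSA-type approaches can be illustrated generically: draw the uncertain nuclear data from their assumed distributions, run the simulator for each draw, and take output statistics. The "simulator" and the cross-section uncertainties below are purely illustrative stand-ins, not SCALE/SUSA data.

```python
import random
import statistics

random.seed(1)

def toy_simulator(sigma_a, sigma_f):
    # Stand-in for a core simulator: a k-eff-like ratio of neutron
    # production to absorption (illustrative, not reactor physics).
    return 2.4 * sigma_f / (sigma_a + sigma_f)

samples = []
for _ in range(5000):
    sa = random.gauss(1.0, 0.02)   # "absorption" cross section, 2% std (toy)
    sf = random.gauss(0.7, 0.03)   # "fission" cross section, ~4% std (toy)
    samples.append(toy_simulator(sa, sf))

print(f"k mean = {statistics.mean(samples):.4f}, "
      f"std = {statistics.stdev(samples):.4f}")
```

    Generalized perturbation theory, by contrast, would propagate the same input covariances through first-order sensitivities instead of repeated simulator runs.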

  6. Parameter uncertainty and nonstationarity in regional extreme rainfall frequency analysis in Qu River Basin, East China

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Xu, Y. P.; Gu, H.

    2014-12-01

    Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Moreover, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with nonstationarity, on design rainfall depth in Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments and estimated on the at-site scale. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity present in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with 100-year return period under stationary conditions estimated by the regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3 respectively. On the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3 respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to a 100-year return period) are 23.09% and 13.83% with GEV and PE3 respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales. The non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makers in water resources management and risk management.
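
    The bootstrap quantification of design-depth uncertainty can be sketched at at-site scale. For brevity this sketch fits a Gumbel distribution by the method of moments rather than the paper's GEV/PE3 L-moment fits, and the synthetic 40-year record is illustrative.

```python
import math
import random
import statistics

random.seed(2)

def gumbel_quantile_100yr(sample):
    # Method-of-moments Gumbel fit, then the 100-year return level.
    beta = statistics.stdev(sample) * math.sqrt(6) / math.pi
    mu = statistics.mean(sample) - 0.5772 * beta
    return mu - beta * math.log(-math.log(1 - 1 / 100))

# Synthetic 40-year annual-maximum rainfall record (mm), Gumbel(80, 15).
record = [80 - 15 * math.log(-math.log(random.random())) for _ in range(40)]

# Nonparametric bootstrap of the design depth: resample with replacement,
# refit, and take percentiles of the resulting estimates.
estimates = sorted(
    gumbel_quantile_100yr([random.choice(record) for _ in record])
    for _ in range(2000)
)
lo, hi = estimates[int(0.05 * 2000)], estimates[int(0.95 * 2000)]
print(f"100-yr depth: {gumbel_quantile_100yr(record):.0f} mm, "
      f"90% CI [{lo:.0f}, {hi:.0f}] mm")
```

    The spatial bootstrap in the paper resamples sites rather than years, but the uncertainty-interval construction is analogous.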

  7. CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrer, R.; Rhodes, J.; Smith, K.

    2012-07-01

    The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)

  8. Perceptions of risk from nanotechnologies and trust in stakeholders: a cross sectional study of public, academic, government and business attitudes.

    PubMed

    Capon, Adam; Gillespie, James; Rolfe, Margaret; Smith, Wayne

    2015-04-26

    Policy makers and regulators are constantly required to make decisions despite the existence of substantial uncertainty regarding the outcomes of their proposed decisions. Understanding stakeholder views is an essential part of addressing this uncertainty, which provides insight into the possible social reactions and tolerance of unpredictable risks. In the field of nanotechnology, large uncertainties exist regarding the real and perceived risks this technology may have on society. Better evidence is needed to confront this issue. We undertook a computer assisted telephone interviewing (CATI) survey of the Australian public and a parallel survey of those involved in nanotechnology from the academic, business and government sectors. Analysis included comparisons of proportions and logistic regression techniques. We explored perceptions of nanotechnology risks both to health and in a range of products. We examined views on four trust actors. The general public's perception of risk was significantly higher than that expressed by other stakeholders. The public bestows less trust in certain trust actors than do academics or government officers, giving its greatest trust to scientists. Higher levels of public trust were generally associated with lower perceptions of risk. Nanotechnology in food and cosmetics/sunscreens were considered riskier applications irrespective of stakeholder, while familiarity with nanotechnology was associated with a reduced risk perception. Policy makers should consider the disparities in risk and trust perceptions between the public and influential stakeholders, placing greater emphasis on risk communication and the uncertainties of risk assessment in these areas of higher concern. Scientists being the highest trusted group are well placed to communicate the risks of nanotechnologies to the public.

  9. Population-level differences in disease transmission: A Bayesian analysis of multiple smallpox epidemics

    PubMed Central

    Elderd, Bret D.; Dwyer, Greg; Dukic, Vanja

    2013-01-01

    Estimates of a disease’s basic reproductive rate R0 play a central role in understanding outbreaks and planning intervention strategies. In many calculations of R0, a simplifying assumption is that different host populations have effectively identical transmission rates. This assumption can lead to an underestimate of the overall uncertainty associated with R0, which, due to the non-linearity of epidemic processes, may result in a mis-estimate of epidemic intensity and miscalculated expenditures associated with public-health interventions. In this paper, we utilize a Bayesian method for quantifying the overall uncertainty arising from differences in population-specific basic reproductive rates. Using this method, we fit spatial and non-spatial susceptible-exposed-infected-recovered (SEIR) models to a series of 13 smallpox outbreaks. Five outbreaks occurred in populations that had been previously exposed to smallpox, while the remaining eight occurred in Native-American populations that were naïve to the disease at the time. The Native-American outbreaks were close in a spatial and temporal sense. Using Bayesian Information Criterion (BIC), we show that the best model includes population-specific R0 values. These differences in R0 values may, in part, be due to differences in genetic background, social structure, or food and water availability. As a result of these inter-population differences, the overall uncertainty associated with the “population average” value of smallpox R0 is larger, a finding that can have important consequences for controlling epidemics. In general, Bayesian hierarchical models are able to properly account for the uncertainty associated with multiple epidemics, provide a clearer understanding of variability in epidemic dynamics, and yield a better assessment of the range of potential risks and consequences that decision makers face. PMID:24021521
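
    The sensitivity of epidemic outcomes to R0, which makes the population-average uncertainty matter, can be illustrated with a minimal deterministic SEIR integration. The timescales are roughly smallpox-like but illustrative; they are not the paper's fitted values, and the paper's models are stochastic and spatial.

```python
# Minimal deterministic SEIR integration (forward Euler) showing how the
# basic reproductive rate R0 controls final epidemic size.
def seir_final_size(r0, latent_days=12.0, infectious_days=9.0,
                    days=1000, dt=0.1):
    s, e, i, r = 0.999, 0.0, 0.001, 0.0
    beta = r0 / infectious_days
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        de = beta * s * i - e / latent_days
        di = e / latent_days - i / infectious_days
        dr = i / infectious_days
        s, e, i, r = s + ds * dt, e + de * dt, i + di * dt, r + dr * dt
    return r  # fraction of the population ultimately infected

for r0 in (1.5, 3.0, 6.0):
    print(f"R0={r0}: final size = {seir_final_size(r0):.3f}")
```

    Because final size responds nonlinearly to R0, averaging over populations with different R0 values gives a different (and more uncertain) forecast than using a single pooled R0, which is the point of the hierarchical analysis.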

  10. Food Security Under Shifting Economic, Demographic, and Climatic Conditions (Invited)

    NASA Astrophysics Data System (ADS)

    Naylor, R. L.

    2013-12-01

    Global demand for food, feed, and fuel will continue to rise in a more populous and affluent world. Meeting this demand in the future will become increasingly challenging with global climate change; when production shocks stemming from climate variability are added to the new mean climate state, food markets could become more volatile. This talk will focus on the interacting market effects of demand and supply for major food commodities, with an eye on climate-related supply trends and shocks. Lessons from historical patterns of climate variability (e.g., ENSO and its global teleconnections) will be used to infer potential food security outcomes in the event of abrupt changes in the mean climate state. Domestic food and trade policy responses to crop output and price volatility in key producing and consuming nations, such as export bans and import tariffs, will be discussed as a potentially major destabilizing force, underscoring the important influence of uncertainty in achieving--or failing to achieve--food security.

  11. Nanotechnologies in agriculture and food - an overview of different fields of application, risk assessment and public perception.

    PubMed

    Grobe, Antje; Rissanen, Mikko E

    2012-12-01

    Nanomaterials in agriculture and food are key issues of public and regulatory interest. Over the past ten years, patents for nanotechnological applications in the field of food and agriculture have become abundant. Uncertainty prevails however regarding their current development status and presence in the consumer market. Thus, the discussion on nanotechnologies in the food sector with its specific public perception of benefits and risks and the patterns of communication are becoming similar to the debate on genetically modified organisms. The food industry's silence in communication increased mistrust of consumer organisations and policy makers. The article discusses the background of the current regulatory debates, starting with the EU recommendation for defining nanomaterials, provides an overview of possible fields of application in agriculture and food industries and discusses risk assessment and the public debate on benefits and risks. Communicative recommendations are directed at researchers, the food industry and regulators in order to increase trust both in stakeholders, risk management and regulatory processes.

  12. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
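
    The Monte Carlo analysis of susceptibility "as a function of weights" can be sketched for a plain weighted-sum overlay: perturb the criteria weights, renormalize, and count how often the ranking of locations reverses. The two sites, scores, and weights below are hypothetical toy data, not the study's criteria.

```python
import random

random.seed(3)

# Two locations scored on three criteria (0-1, higher = more susceptible).
scores = {"site_A": [0.8, 0.3, 0.6], "site_B": [0.5, 0.7, 0.4]}
base_weights = [0.5, 0.3, 0.2]

def weighted_sum(vals, weights):
    return sum(v * w for v, w in zip(vals, weights))

# Monte Carlo on the criteria weights: perturb, renormalize, re-rank.
runs, flips = 5000, 0
for _ in range(runs):
    w = [max(1e-9, wi + random.gauss(0, 0.1)) for wi in base_weights]
    total = sum(w)
    w = [wi / total for wi in w]
    if weighted_sum(scores["site_B"], w) > weighted_sum(scores["site_A"], w):
        flips += 1

print(f"rank reversal in {100 * flips / runs:.1f}% of weight samples")
```

    A high reversal rate for a cell flags weight-driven uncertainty in its susceptibility class, which is the quantity the spatially-explicit analysis maps.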

  13. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Keith

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  14. 77 FR 42654 - Trifloxystrobin; Pesticide Tolerance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-20

    ... code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS code 32532). This... filing. III. Aggregate Risk Assessment and Determination of Safety Section 408(b)(2)(A)(i) of FFDCA... dose at which adverse effects of concern are identified (the LOAEL). Uncertainty/safety factors are...

  15. By how much would limiting TV food advertising reduce childhood obesity?

    PubMed Central

    Van Beeck, Eduard F.; Barendregt, Jan J.; Mackenbach, Johan P.

    2009-01-01

    Background: There is evidence suggesting that food advertising causes childhood obesity. The strength of this effect is unclear. To inform decisions on whether to restrict advertising opportunities, we estimate how much of the childhood obesity prevalence is attributable to food advertising on television (TV). Methods: We constructed a mathematical simulation model to estimate the potential effects of reducing the exposure of 6- to 12-year-old US children to TV advertising for food on the prevalence of overweight and obesity. Model input was based on body measurements from NHANES 2003–04, the CDC-2000 cut-offs for weight categories, and literature that relates advertising to consumption levels and consumption to body mass. In an additional analysis we use a Delphi study to obtain experts’ estimates of the effect of advertising on consumption. Results: Based on literature findings, the model predicts that reducing the exposure to zero would decrease the average BMI by 0.38 kg/m² and lower the prevalence of obesity from 17.8 to 15.2% (95% uncertainty interval 14.8–15.6) for boys and from 15.9% to 13.5% (13.1–13.8) for girls. When estimates are based on expert opinion, these values are 11.0% (7.7–14.0) and 9.9% (7.2–12.4), respectively. Conclusion: This study suggests that from one in seven up to one in three obese children in the USA might not have been obese in the absence of advertising for unhealthy food on TV. Limiting the exposure of children to marketing of energy-dense food could be part of a broader effort to make children's diets healthier. PMID:19324935
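
    The core mechanism of the simulation, shifting the BMI distribution and recounting children above a fixed cutoff, can be sketched with synthetic data. The normal BMI distribution and the single cutoff below are illustrative assumptions; the actual model uses NHANES measurements and age- and sex-specific CDC-2000 cutoffs.

```python
import random

random.seed(4)

# Illustrative: shift each child's BMI down by 0.38 kg/m^2 and recount
# obesity prevalence against a fixed cutoff.
cutoff = 21.0  # hypothetical age-specific obesity cutoff (kg/m^2)
bmis = [random.gauss(17.5, 2.8) for _ in range(100_000)]

before = sum(b >= cutoff for b in bmis) / len(bmis)
after = sum(b - 0.38 >= cutoff for b in bmis) / len(bmis)
print(f"obesity prevalence: {100 * before:.1f}% -> {100 * after:.1f}%")
```

    Because the cutoff sits in the tail of the distribution, even a small mean shift removes a disproportionate share of children from the obese category, which is why a 0.38 kg/m² change moves prevalence by several percentage points.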

  16. Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak

    2011-01-01

    This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes the C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s-era shock-tube and constricted-arc experimental cases. It is shown that the experiments contain shock-layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength-dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength-dependent and wavelength-integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations. Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that the existing data are not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.
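
    The quoted total uncertainty is consistent with direct (worst-case) addition of the structural and parametric interval bounds rather than a root-sum-square combination, which a two-line check confirms:

```python
# Structural and parametric interval bounds quoted in the abstract (percent).
structural = (+34.0, -24.0)
parametric = (+47.3, -28.3)

# Direct addition of the upper and lower bounds reproduces the quoted totals.
total = (structural[0] + parametric[0], structural[1] + parametric[1])
print(round(total[0], 1), round(total[1], 1))
```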

  17. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-03-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target-measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models as well as greenhouse gas scenarios are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure Adequate Human livelihood conditions for wEll-being And Development (AHEAD). Based on a transdisciplinary sample of influential concepts addressing human well-being, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows identifying and differentiating uncertainty of climate and impact model projections. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that in many countries today, livelihood conditions are compromised by water scarcity. However, more often, AHEAD fulfilment is limited through other elements. 
Moreover, the analysis shows that for 44 out of 111 countries, the water-specific uncertainty ranges are outside relevant thresholds for AHEAD, and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. The AHEAD method presented here, together with first results, forms an important step towards making scientific results more applicable for policy-decisions.

  18. Spatial and Temporal Uncertainty of Crop Yield Aggregations

    NASA Technical Reports Server (NTRS)

    Porwollik, Vera; Mueller, Christoph; Elliott, Joshua; Chryssanthacopoulos, James; Iizumi, Toshichika; Ray, Deepak K.; Ruane, Alex C.; Arneth, Almut; Balkovic, Juraj; Ciais, Philippe; et al.

    2016-01-01

    The aggregation of simulated gridded crop yields to national or regional scale requires information on temporal and spatial patterns of crop-specific harvested areas. This analysis estimates the uncertainty of simulated gridded yield time series related to the aggregation with four different harvested area data sets. We compare aggregated yield time series from the Global Gridded Crop Model Inter-comparison project for four crop types from 14 models at global, national, and regional scale to determine aggregation-driven differences in mean yields and temporal patterns as measures of uncertainty. The quantity and spatial patterns of harvested areas differ for individual crops among the four datasets applied for the aggregation. Also simulated spatial yield patterns differ among the 14 models. These differences in harvested areas and simulated yield patterns lead to differences in aggregated productivity estimates, both in mean yield and in the temporal dynamics. Among the four investigated crops, wheat yield (17% relative difference) is most affected by the uncertainty introduced by the aggregation at the global scale. The correlation of temporal patterns of global aggregated yield time series can be as low as for soybean (r = 0.28).For the majority of countries, mean relative differences of nationally aggregated yields account for10% or less. The spatial and temporal difference can be substantial higher for individual countries. Of the top-10 crop producers, aggregated national multi-annual mean relative difference of yields can be up to 67% (maize, South Africa), 43% (wheat, Pakistan), 51% (rice, Japan), and 427% (soybean, Bolivia).Correlations of differently aggregated yield time series can be as low as r = 0.56 (maize, India), r = 0.05*Corresponding (wheat, Russia), r = 0.13 (rice, Vietnam), and r = -0.01 (soybean, Uruguay). 
The aggregation to sub-national scale in comparison to country scale shows that spatial uncertainties can cancel out in countries with large harvested areas per crop type. We conclude that the aggregation uncertainty can be substantial for crop productivity and production estimations in the context of food security, impact assessment, and model evaluation exercises.
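
    The aggregation described above is an area-weighted average of gridded yields, and the two uncertainty measures are the relative difference of multi-annual mean yields and the correlation of the aggregated time series. A minimal sketch with made-up yields and harvested-area weights (not the GGCMI data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 20-year yield series (t/ha) on a 4-cell "grid",
# aggregated with two different harvested-area data sets as weights.
years = 20
yields = rng.uniform(2.0, 6.0, size=(years, 4))   # simulated gridded yields
areas_a = np.array([1.0, 0.5, 2.0, 0.2])          # harvested areas, data set A (Mha)
areas_b = np.array([0.8, 0.9, 1.5, 0.6])          # harvested areas, data set B (Mha)

def aggregate(y, w):
    """Area-weighted national mean yield for each year."""
    return y @ w / w.sum()

agg_a = aggregate(yields, areas_a)
agg_b = aggregate(yields, areas_b)

# Aggregation-driven uncertainty: relative difference of multi-annual
# means, and correlation of the temporal patterns of the two series.
rel_diff = abs(agg_a.mean() - agg_b.mean()) / agg_b.mean()
r = np.corrcoef(agg_a, agg_b)[0, 1]
```

    Swapping the weight vector while holding the simulated yields fixed isolates exactly the aggregation uncertainty quantified in the study.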

  19. Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares

    NASA Technical Reports Server (NTRS)

    Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.

    2012-01-01

    A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, defined as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
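
    The robustness metric can be illustrated with a toy stand-in: a 2x2 uncertain linear system assessed over a sequence of increasingly larger uncertainty ranges. The matrices, the damping requirement (all eigenvalue real parts below -0.1), and the dense gridding used in place of an SOS certificate are all illustrative assumptions, not the paper's aircraft model:

```python
import numpy as np

# Nominal closed-loop dynamics and the effect of one uncertain
# parameter delta; both matrices are invented for illustration.
A0 = np.array([[-1.0, 1.0], [-4.0, -2.0]])
A1 = np.array([[0.0, 0.0], [1.0, 0.0]])

def requirement_met(d, n_grid=201):
    """True if every eigenvalue keeps real part < -0.1 for all delta in [-d, d]."""
    for delta in np.linspace(-d, d, n_grid):
        if np.linalg.eigvals(A0 + delta * A1).real.max() >= -0.1:
            return False
    return True

# Assess a sequence of increasingly larger ranges; the largest range
# that still satisfies the requirement serves as the robustness metric.
d_max = 0.0
for d in np.arange(0.1, 10.0, 0.1):
    if not requirement_met(d):
        break
    d_max = d
```

    For this toy system the requirement first fails near delta = 5.7, so d_max lands there; in the paper the grid check is replaced by an SOS feasibility proof, which certifies the whole range at once.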

  20. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    PubMed

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. Uncertainty in Operational Atmospheric Analyses and Re-Analyses

    NASA Astrophysics Data System (ADS)

    Langland, R.; Maue, R. N.

    2016-12-01

    This talk will describe uncertainty in atmospheric analyses of wind and temperature produced by operational forecast models and in re-analysis products. Because the "true" atmospheric state cannot be precisely quantified, there is necessarily error in every atmospheric analysis, and this error can be estimated by computing differences (variance and bias) between analysis products produced at various centers (e.g., ECMWF, NCEP, U.S. Navy, etc.) that use independent data assimilation procedures, somewhat different sets of atmospheric observations and forecast models with different resolutions, dynamical equations, and physical parameterizations. These estimates of analysis uncertainty provide a useful proxy for actual analysis error. For this study, we use a unique multi-year and multi-model data archive developed at NRL-Monterey. It will be shown that current uncertainty in atmospheric analyses is closely correlated with the geographic distribution of assimilated in-situ atmospheric observations, especially those provided by high-accuracy radiosonde and commercial aircraft observations. The lowest atmospheric analysis uncertainty is found over North America, Europe and Eastern Asia, which have the largest numbers of radiosonde and commercial aircraft observations. Analysis uncertainty is substantially larger (by factors of two to three times) in most of the Southern hemisphere, the North Pacific ocean, and under-developed nations of Africa and South America where there are few radiosonde or commercial aircraft data. It appears that in regions where atmospheric analyses depend primarily on satellite radiance observations, analysis uncertainty of both temperature and wind remains relatively high compared to values found over North America and Europe.
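
    The uncertainty proxy described here, bias and variance of differences between independently produced analyses, can be sketched with synthetic gridded fields (the offsets and noise levels below are arbitrary placeholders, not real center statistics):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 500-hPa temperature analyses (K) from two centers on the
# same 2-degree grid; each adds its own bias and random analysis error
# to an unknown "true" state.
truth = 250.0 + rng.standard_normal((90, 180))
analysis_1 = truth + 0.3 + 0.5 * rng.standard_normal((90, 180))
analysis_2 = truth - 0.1 + 0.7 * rng.standard_normal((90, 180))

# Differences between the two analyses serve as a proxy for analysis
# error, summarized by their bias and variance.
diff = analysis_1 - analysis_2
bias = diff.mean()              # systematic offset between centers
variance = diff.var(ddof=1)     # spread of the disagreement
```

    Because the two error terms are independent, the variance of the difference is the sum of the two analysis-error variances, which is why inter-center differences bound the individual analysis errors from above.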

  2. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. 
We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.

  3. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) by analyzing sampling efficiency, multiple metrics performance, parameter uncertainty and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for the Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach over LHS: (1) It is more effective and efficient; for example, the simulation time required to generate 1000 behavioral parameter sets is roughly nine times shorter. (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, indicating better forecasting accuracy of the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB) and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
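
    Independent of the sampling engine (ɛ-NSGAII or LHS), the GLUE step itself keeps behavioral parameter sets and derives likelihood-weighted prediction limits. A minimal one-parameter sketch on a toy model (not XAJ); the likelihood function and threshold are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setting: one observed flow, one model parameter theta, and a
# linear "model" standing in for the rainfall-runoff simulation.
obs = 3.0                                # observed flow (arbitrary units)
theta = rng.uniform(0.0, 10.0, 5000)     # sampled parameter sets
sim = 0.9 * theta                        # toy model prediction

# Informal Gaussian-shaped likelihood; sets above the behavioral
# threshold are retained, the rest are discarded.
likelihood = np.exp(-0.5 * ((sim - obs) / 0.5) ** 2)
behavioral = likelihood > 0.1

# Likelihood-weighted quantiles of the behavioral predictions give the
# GLUE uncertainty bounds for the forecast.
weights = likelihood[behavioral] / likelihood[behavioral].sum()
pred = sim[behavioral]
order = np.argsort(pred)
cdf = np.cumsum(weights[order])
lower = pred[order][np.searchsorted(cdf, 0.05)]   # 5% bound
upper = pred[order][np.searchsorted(cdf, 0.95)]   # 95% bound
```

    A smarter sampler concentrates draws where the likelihood is high, which is exactly the efficiency gain the ɛ-NSGAII approach claims over LHS.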

  4. BMI and Healthcare Cost Impact of Eliminating Tax Subsidy for Advertising Unhealthy Food to Youth.

    PubMed

    Sonneville, Kendrin R; Long, Michael W; Ward, Zachary J; Resch, Stephen C; Wang, Y Claire; Pomeranz, Jennifer L; Moodie, Marj L; Carter, Rob; Sacks, Gary; Swinburn, Boyd A; Gortmaker, Steven L

    2015-07-01

    Food and beverage TV advertising contributes to childhood obesity. The current tax treatment of advertising as an ordinary business expense in the U.S. subsidizes marketing of nutritionally poor foods and beverages to children. This study models the effect of a national intervention that eliminates the tax subsidy of advertising nutritionally poor foods and beverages on TV to children aged 2-19 years. We adapted and modified the Assessing Cost Effectiveness framework and methods to create the Childhood Obesity Intervention Cost Effectiveness Study model to simulate the impact of the intervention over the 2015-2025 period for the U.S. population, including short-term effects on BMI and 10-year healthcare expenditures. We simulated uncertainty intervals (UIs) using probabilistic sensitivity analysis and discounted outcomes at 3% annually. Data were analyzed in 2014. We estimated the intervention would reduce an aggregate 2.13 million (95% UI=0.83 million, 3.52 million) BMI units in the population and would cost $1.16 per BMI unit reduced (95% UI=$0.51, $2.63). From 2015 to 2025, the intervention would result in $352 million (95% UI=$138 million, $581 million) in healthcare cost savings and gain 4,538 (95% UI=1,752, 7,489) quality-adjusted life-years. Eliminating the tax subsidy of TV advertising costs for nutritionally poor foods and beverages advertised to children and adolescents would likely be a cost-saving strategy to reduce childhood obesity and related healthcare expenditures. Copyright © 2015 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.
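
    The probabilistic sensitivity analysis, 3% annual discounting, and 95% uncertainty intervals described above can be sketched with invented inputs (the distributions below are placeholders, not the study's model parameters):

```python
import numpy as np

rng = np.random.default_rng(4)

# Draw uncertain intervention effects and annual savings; clipping at
# zero reflects that neither quantity can be negative.
n = 10000
bmi_units_reduced = rng.normal(2.1e6, 0.7e6, n).clip(min=0.0)
annual_savings = rng.normal(40e6, 12e6, n).clip(min=0.0)   # $/year

# Discount ten years of savings at 3% annually.
years = np.arange(10)
discount = 1.03 ** -years
total_savings = annual_savings * discount.sum()

# Report a 95% uncertainty interval (UI) as the 2.5th and 97.5th
# percentiles of the Monte Carlo distribution.
ui_bmi = np.percentile(bmi_units_reduced, [2.5, 97.5])
ui_savings = np.percentile(total_savings, [2.5, 97.5])
```

    Running the full model many times with inputs drawn from their uncertainty distributions, rather than once at the point estimates, is what turns single results like "$352 million" into the interval estimates quoted in the abstract.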

  5. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  6. AN IMPROVEMENT TO THE MOUSE COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM

    EPA Science Inventory

    The original MOUSE (Modular Oriented Uncertainty System) system was designed to deal with the problem of uncertainties in environmental engineering calculations, such as a set of engineering cost or risk analysis equations. It was especially intended for use by individuals with l...

  7. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
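
    Combining systematic (bias) sources with precision error of a statistical nature typically follows a root-sum-square uncertainty budget with a coverage factor. A sketch with placeholder error magnitudes (the values are invented, not ZEM-3 specifications):

```python
import numpy as np

seebeck_nominal = 200.0   # measured Seebeck coefficient, uV/K

# Systematic (bias) sources, each expressed as a standard uncertainty
# in uV/K; the magnitudes are illustrative placeholders.
bias_sources = {
    "sample_preparation": 1.5,
    "probe_placement": 2.0,
    "cold_finger_effect": 3.0,
}
b_total = np.sqrt(sum(v ** 2 for v in bias_sources.values()))

# Precision (random) uncertainty: standard deviation of the mean of
# repeated measurements.
repeats = np.array([199.2, 200.5, 201.1, 198.8, 200.4])
s_mean = repeats.std(ddof=1) / np.sqrt(len(repeats))

# Combined expanded uncertainty with coverage factor k = 2, so the
# result is reported as seebeck_nominal +/- u_combined.
u_combined = 2.0 * np.sqrt(b_total ** 2 + s_mean ** 2)
```

    Note how the bias terms dominate here; the abstract's point is that an honest interval must carry both kinds of error, not just the repeatability scatter.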

  8. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in higher levels of complexity being built into hydrologic models, which makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. 
The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.

  9. Uncertainty analysis of diffuse-gray radiation enclosure problems: A hypersensitive case study

    NASA Technical Reports Server (NTRS)

    Taylor, Robert P.; Luck, Rogelio; Hodge, B. K.; Steele, W. Glenn

    1993-01-01

    An uncertainty analysis of diffuse-gray enclosure problems is presented. The genesis was a diffuse-gray enclosure problem which proved to be hypersensitive to the specification of view factors. This genesis is discussed in some detail. The uncertainty analysis is presented for the general diffuse-gray enclosure problem and applied to the hypersensitive case study. It was found that the hypersensitivity could be greatly reduced by enforcing both closure and reciprocity for the view factors. The effects of uncertainties in the surface emissivities and temperatures are also investigated.
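
    Enforcing both closure (each row of view factors sums to 1) and reciprocity (A_i F_ij = A_j F_ji) can be done with a simple alternating scheme; the matrix below and the iteration itself are an illustrative sketch, not the paper's specific algorithm:

```python
import numpy as np

# Surface areas and a noisy estimate of the view-factor matrix for a
# 3-surface enclosure (made-up numbers that violate both constraints).
A = np.array([1.0, 2.0, 1.5])
F = np.array([[0.05, 0.55, 0.45],
              [0.30, 0.10, 0.62],
              [0.28, 0.80, 0.02]])

# Alternate between symmetrizing G_ij = A_i F_ij (reciprocity) and
# normalizing rows of F (closure) until both hold to high precision.
for _ in range(200):
    G = A[:, None] * F
    G = 0.5 * (G + G.T)                   # enforce reciprocity
    F = G / A[:, None]
    F = F / F.sum(axis=1, keepdims=True)  # enforce closure

G = A[:, None] * F
recip_err = np.abs(G - G.T).max()
closure_err = np.abs(F.sum(axis=1) - 1.0).max()
```

    Starting from a reciprocity violation of about 0.05 in G, the smoothed matrix satisfies both constraints, which is the regularization the paper found necessary to tame the hypersensitivity.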

  10. 78 FR 23497 - Propiconazole; Pesticide Tolerances

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-19

    ...). Animal production (NAICS code 112). Food manufacturing (NAICS code 311). Pesticide manufacturing (NAICS.... Aggregate Risk Assessment and Determination of Safety Section 408(b)(2)(A)(i) of FFDCA allows EPA to... dose at which adverse effects of concern are identified (the LOAEL). Uncertainty/safety factors are...

  11. K-State Problem Identification Rating Scales for College Students

    ERIC Educational Resources Information Center

    Robertson, John M.; Benton, Stephen L.; Newton, Fred B.; Downey, Ronald G.; Marsh, Patricia A.; Benton, Sheryl A.; Tseng, Wen-Chih; Shin, Kang-Hyun

    2006-01-01

    The K-State Problem Identification Rating Scales, a new screening instrument for college counseling centers, gathers information about clients' presenting symptoms, functioning levels, and readiness to change. Three studies revealed 7 scales: Mood Difficulties, Learning Problems, Food Concerns, Interpersonal Conflicts, Career Uncertainties,…

  12. An Adaptation Dilemma Caused by Impacts-Modeling Uncertainty

    NASA Astrophysics Data System (ADS)

    Frieler, K.; Müller, C.; Elliott, J. W.; Heinke, J.; Arneth, A.; Bierkens, M. F.; Ciais, P.; Clark, D. H.; Deryng, D.; Doll, P. M.; Falloon, P.; Fekete, B. M.; Folberth, C.; Friend, A. D.; Gosling, S. N.; Haddeland, I.; Khabarov, N.; Lomas, M. R.; Masaki, Y.; Nishina, K.; Neumann, K.; Oki, T.; Pavlick, R.; Ruane, A. C.; Schmid, E.; Schmitz, C.; Stacke, T.; Stehfest, E.; Tang, Q.; Wisser, D.

    2013-12-01

    Ensuring future well-being for a growing population under either strong climate change or an aggressive mitigation strategy requires a subtle balance of potentially conflicting response measures. In the case of competing goals, uncertainty in impact estimates plays a central role when high confidence in achieving a primary objective (such as food security) directly implies an increased probability of uncertainty induced failure with regard to a competing target (such as climate protection). We use cross sectoral consistent multi-impact model simulations from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP, www.isi-mip.org) to illustrate this uncertainty dilemma: RCP projections from 7 global crop, 11 hydrological, and 7 biome models are combined to analyze irrigation and land use changes as possible responses to climate change and increasing crop demand due to population growth and economic development. We show that - while a no-regrets option with regard to climate protection - additional irrigation alone is not expected to balance the demand increase by 2050. In contrast, a strong expansion of cultivated land closes the projected production-demand gap in some crop models. However, it comes at the expense of a loss of natural carbon sinks of order 50%. Given the large uncertainty of state of the art crop model projections, even these strong land use changes would not put us 'on the safe side' with respect to food supply. In a world where increasing carbon emissions continue to shrink the overall solution space, we demonstrate that current impacts-modeling uncertainty is a luxury we cannot afford. ISI-MIP is intended to provide cross sectoral consistent impact projections for model intercomparison and improvement as well as cross-sectoral integration. The results presented here were generated within the first Fast-Track phase of the project covering global impact projections. The second phase will also include regional projections. 
    It is the aim of the project to build up a CMIP-like open archive of climate impact projections, allowing for the necessary sharpening of our picture of a 1, 2, 3, or 4 degrees warmer world.

  13. Prospects for the use of plant cell cultures in food biotechnology.

    PubMed

    Davies, Kevin M; Deroles, Simon C

    2014-04-01

    Plant cell cultures can offer continuous production systems for high-value food and health ingredients, independent of geographical or environmental variations and constraints. Yet despite many improvements in culture technologies, cell line selection, and bioreactor design, there are few commercial successes. This is principally due to the culture yield and market price of food products not being sufficient to cover the plant cell culture production costs. A better understanding of the underpinning biological mechanisms that control the target metabolite biosynthetic pathways may allow the metabolic engineering of cell lines to provide for economically competitive product yields. However, uncertainty around the regulatory and public acceptance of products derived from engineered cell cultures presents a barrier to the uptake of the technology by food product companies. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  15. Methods for Estimating the Uncertainty in Emergy Table-Form Models

    EPA Science Inventory

    Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...

  16. A Framework for the Cross-Sectoral Integration of Multi-Model Impact Projections: Land Use Decisions Under Climate Impacts Uncertainties

    NASA Technical Reports Server (NTRS)

    Frieler, K.; Elliott, Joshua; Levermann, A.; Heinke, J.; Arneth, A.; Bierkens, M. F. P.; Ciais, P.; Clark, D. B.; Deryng, D.; Doll, P.

    2015-01-01

    Climate change and its impacts already pose considerable challenges for societies that will further increase with global warming (IPCC, 2014a, b). Uncertainties of the climatic response to greenhouse gas emissions include the potential passing of large-scale tipping points (e.g. Lenton et al., 2008; Levermann et al., 2012; Schellnhuber, 2010) and changes in extreme meteorological events (Field et al., 2012) with complex impacts on societies (Hallegatte et al., 2013). Thus climate change mitigation is considered a necessary societal response for avoiding uncontrollable impacts (Conference of the Parties, 2010). On the other hand, large-scale climate change mitigation itself implies fundamental changes in, for example, the global energy system. The associated challenges come on top of others that derive from equally important ethical imperatives like the fulfilment of increasing food demand that may draw on the same resources. For example, ensuring food security for a growing population may require an expansion of cropland, thereby reducing natural carbon sinks or the area available for bio-energy production. So far, available studies addressing this problem have relied on individual impact models, ignoring uncertainty in crop model and biome model projections. Here, we propose a probabilistic decision framework that allows for an evaluation of agricultural management and mitigation options in a multi-impact-model setting. Based on simulations generated within the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), we outline how cross-sectorally consistent multi-model impact simulations could be used to generate the information required for robust decision making. Using an illustrative future land use pattern, we discuss the trade-off between potential gains in crop production and associated losses in natural carbon sinks in the new multiple crop- and biome-model setting. 
    In addition, crop and water model simulations are combined to explore irrigation increases as one possible measure of agricultural intensification that could limit the expansion of cropland required in response to climate change and growing food demand. This example shows that current impact model uncertainties pose an important challenge to long-term mitigation planning and must not be ignored in long-term strategic decision making.

  17. A framework for the cross-sectoral integration of multi-model impact projections: land use decisions under climate impacts uncertainties

    NASA Astrophysics Data System (ADS)

    Frieler, K.; Levermann, A.; Elliott, J.; Heinke, J.; Arneth, A.; Bierkens, M. F. P.; Ciais, P.; Clark, D. B.; Deryng, D.; Döll, P.; Falloon, P.; Fekete, B.; Folberth, C.; Friend, A. D.; Gellhorn, C.; Gosling, S. N.; Haddeland, I.; Khabarov, N.; Lomas, M.; Masaki, Y.; Nishina, K.; Neumann, K.; Oki, T.; Pavlick, R.; Ruane, A. C.; Schmid, E.; Schmitz, C.; Stacke, T.; Stehfest, E.; Tang, Q.; Wisser, D.; Huber, V.; Piontek, F.; Warszawski, L.; Schewe, J.; Lotze-Campen, H.; Schellnhuber, H. J.

    2015-07-01

    Climate change and its impacts already pose considerable challenges for societies that will further increase with global warming (IPCC, 2014a, b). Uncertainties of the climatic response to greenhouse gas emissions include the potential passing of large-scale tipping points (e.g. Lenton et al., 2008; Levermann et al., 2012; Schellnhuber, 2010) and changes in extreme meteorological events (Field et al., 2012) with complex impacts on societies (Hallegatte et al., 2013). Thus climate change mitigation is considered a necessary societal response for avoiding uncontrollable impacts (Conference of the Parties, 2010). On the other hand, large-scale climate change mitigation itself implies fundamental changes in, for example, the global energy system. The associated challenges come on top of others that derive from equally important ethical imperatives like the fulfilment of increasing food demand that may draw on the same resources. For example, ensuring food security for a growing population may require an expansion of cropland, thereby reducing natural carbon sinks or the area available for bio-energy production. So far, available studies addressing this problem have relied on individual impact models, ignoring uncertainty in crop model and biome model projections. Here, we propose a probabilistic decision framework that allows for an evaluation of agricultural management and mitigation options in a multi-impact-model setting. Based on simulations generated within the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP), we outline how cross-sectorally consistent multi-model impact simulations could be used to generate the information required for robust decision making. Using an illustrative future land use pattern, we discuss the trade-off between potential gains in crop production and associated losses in natural carbon sinks in the new multiple crop- and biome-model setting. 
In addition, crop and water model simulations are combined to explore irrigation increases as one possible measure of agricultural intensification that could limit the expansion of cropland required in response to climate change and growing food demand. This example shows that current impact model uncertainties pose an important challenge to long-term mitigation planning and must not be ignored in long-term strategic decision making.

  18. Irreducible Uncertainty in Terrestrial Carbon Projections

    NASA Astrophysics Data System (ADS)

    Lovenduski, N. S.; Bonan, G. B.

    2016-12-01

    We quantify and isolate the sources of uncertainty in projections of carbon accumulation by the ocean and terrestrial biosphere over 2006-2100 using output from Earth System Models participating in the 5th Coupled Model Intercomparison Project. We consider three independent sources of uncertainty in our analysis of variance: (1) internal variability, driven by random, internal variations in the climate system, (2) emission scenario, driven by uncertainty in future radiative forcing, and (3) model structure, wherein different models produce different projections given the same emission scenario. Whereas uncertainty in projections of ocean carbon accumulation by 2100 is 100 Pg C and driven primarily by emission scenario, uncertainty in projections of terrestrial carbon accumulation by 2100 is 50% larger than that of the ocean, and driven primarily by model structure. This structural uncertainty is correlated with emission scenario: the variance associated with model structure is an order of magnitude larger under a business-as-usual scenario (RCP8.5) than a mitigation scenario (RCP2.6). In an effort to reduce this structural uncertainty, we apply various model weighting schemes to our analysis of variance in terrestrial carbon accumulation projections. The largest reductions in uncertainty are achieved when giving all the weight to a single model; here the uncertainty is of a similar magnitude to the ocean projections. Such an analysis suggests that this structural uncertainty is irreducible given current terrestrial model development efforts.
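
    The analysis-of-variance partition described above can be sketched with a synthetic model-by-scenario ensemble (internal variability is omitted for brevity, and all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic carbon-accumulation projections (Pg C) for each of
# 7 models under 4 emission scenarios.
models, scenarios = 7, 4
model_effect = rng.normal(0.0, 40.0, size=(models, 1))     # structural spread
scenario_effect = np.array([[100.0, 150.0, 200.0, 250.0]]) # forcing spread
proj = 300.0 + model_effect + scenario_effect              # shape (7, 4)

# Partition the ensemble spread: variance of scenario means (averaged
# over models) vs. variance of model means (averaged over scenarios).
scenario_var = proj.mean(axis=0).var()
model_var = proj.mean(axis=1).var()
frac_model = model_var / (model_var + scenario_var)
```

    Whether frac_model is large (terrestrial case) or small (ocean case) determines which source of uncertainty dominates, which is exactly the contrast the abstract draws between the two carbon sinks.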

  19. Consumer acceptance of technology-based food innovations: lessons for the future of nutrigenomics.

    PubMed

    Ronteltap, A; van Trijp, J C M; Renes, R J; Frewer, L J

    2007-07-01

    Determinants of consumer adoption of innovations have been studied from different angles and from the perspectives of various disciplines. In the food area, the literature is dominated by a focus on consumer concern. This paper reviews previous research into acceptance of technology-based innovation from both inside and outside the food domain, extracts key learnings from this literature and integrates them into a new conceptual framework for consumer acceptance of technology-based food innovations. The framework distinguishes 'distal' and 'proximal' determinants of acceptance. Distal factors (characteristics of the innovation, the consumer and the social system) influence consumers' intention to accept an innovation through proximal factors (perceived cost/benefit considerations, perceptions of risk and uncertainty, social norm and perceived behavioural control). The framework's application as a tool to anticipate consumer reaction to future innovations is illustrated for an actual technology-based innovation in food science, nutrigenomics (the interaction between nutrition and human genetics).

  20. Fungal treatment of humic-rich industrial wastewater: application of white rot fungi in remediation of food-processing wastewater.

    PubMed

    Zahmatkesh, Mostafa; Spanjers, Henri; van Lier, Jules B

    2017-11-01

    This paper presents the results of fungal treatment of a real industrial wastewater (WW), providing insight into the main mechanisms involved and clarifying some ambiguities and uncertainties in previous reports. The mycoremediation potential of four strains of white rot fungi (WRF), Phanerochaete chrysosporium, Trametes versicolor, Pleurotus ostreatus and Pleurotus pulmonarius, was tested for the removal of humic acids (HA) from real treated humic-rich WW from a food-processing plant. HA removal was assessed by color measurement and size-exclusion chromatography (SEC). T. versicolor showed the best decolorization efficiency, 90%, and yielded more than 45% degradation of HA, the highest among the tested fungal strains. Nitrogen limitation was studied, and the results showed that it affected the fungal extracellular laccase and manganese peroxidase (MnP) activities. The SEC analysis revealed that the mechanism of HA removal by WRF involves degradation of large HA molecules into smaller ones, conversion of HA to fulvic acid-like molecules, and biosorption of HA by fungal mycelia. The effect of humic substances (HS) on the growth of WRF was also investigated; whether growth was inhibited or stimulated differed among the fungal strains.

  1. Cancer risk of polycyclic aromatic hydrocarbons (PAHs) in the soils from Jiaozhou Bay wetland.

    PubMed

    Yang, Wei; Lang, Yinhai; Li, Guoliang

    2014-10-01

    To estimate the cancer risk from exposure to PAHs in Jiaozhou Bay wetland soils, a probabilistic health risk assessment was conducted based on Monte Carlo simulations. A sensitivity analysis was performed to determine the input variables that contribute most to the cancer risk assessment. Three age groups were selected to estimate the cancer risk via four exposure pathways (soil ingestion, food ingestion, dermal contact and inhalation). The results revealed that the 95th-percentile cancer risks for children, teens and adults were 9.11×10⁻⁶, 1.04×10⁻⁵ and 7.08×10⁻⁵, respectively. The cancer risks for the three age groups were within the acceptable range (10⁻⁶-10⁻⁴), indicating no significant cancer risk. Among the exposure pathways, food ingestion was the major contributor. Of the 7 carcinogenic PAHs, BaP caused the highest cancer risk. Sensitivity analysis demonstrated that exposure duration (ED) and the sum of the 7 carcinogenic PAH concentrations in soil converted to BaP equivalents (CSsoil) contribute most to the total uncertainty. This study provides a comprehensive risk assessment of carcinogenic PAHs in Jiaozhou Bay wetland soils and may be useful in developing strategies for cancer risk prevention and control. Copyright © 2014 Elsevier Ltd. All rights reserved.
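
    A probabilistic assessment of this kind reduces to drawing the inputs of the exposure equation from distributions and reading off a percentile. The sketch below uses a standard EPA-style ingestion formula with purely hypothetical distributions (none of the study's fitted values):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# risk = C * IngR * EF * ED * SF / (BW * AT); all distributions hypothetical
C    = rng.lognormal(np.log(0.5), 0.4, n)      # BaPeq concentration, mg/kg
IngR = rng.normal(100, 20, n) * 1e-6           # soil ingestion rate, kg/day
EF   = 350                                     # exposure frequency, days/year
ED   = rng.triangular(6, 24, 30, n)            # exposure duration, years
BW   = rng.normal(60, 10, n)                   # body weight, kg
AT   = 70 * 365                                # averaging time, days
SF   = 7.3                                     # BaP slope factor, (mg/kg/day)^-1

risk = C * IngR * EF * ED * SF / (BW * AT)
p95 = np.percentile(risk, 95)                  # 95th-percentile cancer risk
acceptable = 1e-6 <= p95 <= 1e-4               # conventional acceptable range
```

    Sensitivity can then be ranked by correlating each sampled input against `risk`, which is how ED and the soil concentration emerge as the dominant contributors in analyses of this type.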

  2. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision-making under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  3. Uncertainty Analysis of Consequence Management (CM) Data Products.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  4. Influences of system uncertainties on the numerical transfer path analysis of engine systems

    NASA Astrophysics Data System (ADS)

    Acri, A.; Nijman, E.; Acri, A.; Offner, G.

    2017-10-01

    Practical mechanical systems operate with some degree of uncertainty. In numerical models uncertainties can result from poorly known or variable parameters, from geometrical approximation, from discretization or numerical errors, from uncertain inputs or from rapidly changing forcing that can be best described in a stochastic framework. Recently, random matrix theory was introduced to take parameter uncertainties into account in numerical modeling problems. In particular in this paper, Wishart random matrix theory is applied on a multi-body dynamic system to generate random variations of the properties of system components. Multi-body dynamics is a powerful numerical tool largely implemented during the design of new engines. In this paper the influence of model parameter variability on the results obtained from the multi-body simulation of engine dynamics is investigated. The aim is to define a methodology to properly assess and rank system sources when dealing with uncertainties. Particular attention is paid to the influence of these uncertainties on the analysis and the assessment of the different engine vibration sources. Examples of the effects of different levels of uncertainties are illustrated by means of examples using a representative numerical powertrain model. A numerical transfer path analysis, based on system dynamic substructuring, is used to derive and assess the internal engine vibration sources. The results obtained from this analysis are used to derive correlations between parameter uncertainties and statistical distribution of results. The derived statistical information can be used to advance the knowledge of the multi-body analysis and the assessment of system sources when uncertainties in model parameters are considered.

  5. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform the management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed at the 95th percentile, the risk of exposure increased up to 160-fold. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical specificity.
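
    The multinomial treatment of strain-level pathogenicity can be sketched as follows; the four strain types, their prevalences, and which of them are enterotoxigenic are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def strain_partition(total_cfu, strain_prevalence, toxigenic):
    """Allocate a bacterial load across strain types with a multinomial
    draw, then sum the portion carried by enterotoxin-producing strains."""
    counts = rng.multinomial(total_cfu, strain_prevalence)
    return counts, counts[toxigenic].sum()

# hypothetical prevalences of four S. aureus strain types in bulk milk
prev = np.array([0.40, 0.30, 0.20, 0.10])
toxigenic = np.array([False, True, False, True])  # types assumed to carry sea
counts, tox_load = strain_partition(10_000, prev, toxigenic)
```

    Repeating the draw inside a Monte Carlo loop, with the prevalence vector itself sampled from an uncertainty distribution, propagates both the variability and the uncertainty in pathogenicity to the exposure estimate.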

  6. Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.

    PubMed

    Meyer, Veronika R

    2003-09-01

    Ishikawa, or cause-and-effect, diagrams help to visualize the parameters that influence a chromatographic analysis. They therefore facilitate the setup of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantify them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This simplicity, however, comes at the cost of losing information about the parameters that influence the measurement uncertainty.
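
    The "Gaussian sum" of uncertainty components is a root-sum-of-squares combination. A minimal sketch of the simplified budget described here, with a hypothetical two-term budget (intermediate precision plus reference-material purity, relative values invented):

```python
import math

def combined_uncertainty(components, k=2):
    """Combine independent relative standard uncertainties as a Gaussian
    (root-sum-of-squares) sum, then expand with coverage factor k."""
    u_c = math.sqrt(sum(u * u for u in components))
    return u_c, k * u_c

# hypothetical budget: 1.2 % intermediate precision, 0.5 % reference purity
u_c, U = combined_uncertainty([0.012, 0.005])
```

    The same function handles a full Ishikawa-derived budget; the two-term case simply illustrates how much shorter the calculation becomes when intermediate precision absorbs most branches of the diagram.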

  7. Analysis of uncertainties in the estimates of nitrous oxide and methane emissions in the UK's greenhouse gas inventory for agriculture

    NASA Astrophysics Data System (ADS)

    Milne, Alice E.; Glendining, Margaret J.; Bellamy, Pat; Misselbrook, Tom; Gilhespy, Sarah; Rivas Casado, Monica; Hulin, Adele; van Oijen, Marcel; Whitmore, Andrew P.

    2014-01-01

    The UK's greenhouse gas inventory for agriculture uses a model based on the IPCC Tier 1 and Tier 2 methods to estimate the emissions of methane and nitrous oxide from agriculture. The inventory calculations are disaggregated at country level (England, Wales, Scotland and Northern Ireland). Until now, no detailed assessment of the uncertainties in the estimates of emissions had been done. We used Monte Carlo simulation to do such an analysis. We collated information on the uncertainties of each of the model inputs. The uncertainties propagate through the model and result in uncertainties in the estimated emissions. Using a sensitivity analysis, we found that in England and Scotland the uncertainty in the emission factor for emissions from N inputs (EF1) affected uncertainty the most, but that in Wales and Northern Ireland the emission factor for N leaching and runoff (EF5) had greater influence. We showed that if the uncertainty in either of these emission factors is reduced by 50%, the uncertainty in emissions of nitrous oxide reduces by 10%. The uncertainty in the emission factors for enteric fermentation in cows and sheep most affected the uncertainty in methane emissions. When inventories are disaggregated (as the UK's is), correlation between separate instances of each emission factor will affect the uncertainty in emissions. As more countries move towards inventory models with disaggregation, it is important that the IPCC give firm guidance on this topic.
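
    Monte Carlo propagation of emission-factor uncertainty, and the effect of tightening one factor, can be sketched as follows; the activity data and distribution parameters are invented and not the inventory's values:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Hypothetical IPCC-style N2O-N calculation: direct term plus leaching term
n_applied = 1.0e6                             # kg N applied (fixed activity data)
frac_leach = 0.3                              # fraction of applied N leached
ef1 = rng.lognormal(np.log(0.01), 0.3, n)     # uncertain direct EF (EF1)
ef5 = rng.lognormal(np.log(0.0075), 0.5, n)   # uncertain leaching EF (EF5)

emissions = n_applied * ef1 + n_applied * frac_leach * ef5

# halving the spread of EF1 shrinks the overall uncertainty
ef1_tight = rng.lognormal(np.log(0.01), 0.15, n)
emissions_tight = n_applied * ef1_tight + n_applied * frac_leach * ef5
reduction = 1 - emissions_tight.std() / emissions.std()
```

    The size of `reduction` depends on how dominant the tightened factor is in the toy budget; in the real disaggregated inventory, correlation between country-level instances of each EF changes this arithmetic, which is the point the authors raise.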

  8. Uncertainties in internal gas counting

    NASA Astrophysics Data System (ADS)

    Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.

    2015-06-01

    The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.

  9. Determination of total arsenic in fish by hydride-generation atomic absorption spectrometry: method validation, traceability and uncertainty evaluation

    NASA Astrophysics Data System (ADS)

    Nugraha, W. C.; Elishian, C.; Ketrin, R.

    2017-03-01

    Arsenic compounds in fish are an important indicator of arsenic contamination in water monitoring. High arsenic levels in fish result from absorption through the food chain and accumulation in their habitat. Hydride generation (HG) coupled with atomic absorption spectrometric (AAS) detection is one of the most popular techniques employed for arsenic determination in a variety of matrices, including fish. This study aimed to develop a method for the determination of total arsenic in fish by HG-AAS. The sample preparation methods of the Association of Official Analytical Chemists (AOAC) were adopted: Method 999.10-2005 for acid digestion using a microwave digestion system and Method 986.15-2005 for dry ashing. The method was developed and validated using the Certified Reference Material DORM-3 Fish Protein for trace metals to ensure the accuracy and traceability of the results. The sources of uncertainty of the method were also evaluated. Using this method, the total arsenic concentration in the fish was found to be 45.6 ± 1.22 mg kg⁻¹ with a coverage factor of 2 at the 95% confidence level. The uncertainty evaluation was dominated by the calibration curve. The result was also traceable to the international measurement system through analysis of the Certified Reference Material DORM-3, with a recovery of 97.5%. In summary, the preparation methods and the HG-AAS technique for total arsenic determination in fish were shown to be valid and reliable.
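
    The dominant calibration-curve contribution can be estimated with the classic linear-calibration uncertainty formula; the standard concentrations and instrument responses below are hypothetical:

```python
import numpy as np

def calibration_uncertainty(conc, signal, y0, m_reps):
    """Concentration read off a linear calibration line and its standard
    uncertainty s_x0 (textbook linear-calibration formula)."""
    n = len(conc)
    b1, b0 = np.polyfit(conc, signal, 1)            # slope, intercept
    resid = signal - (b0 + b1 * conc)
    s_yx = np.sqrt((resid ** 2).sum() / (n - 2))    # residual std. deviation
    x0 = (y0 - b0) / b1                              # predicted concentration
    s_x0 = (s_yx / b1) * np.sqrt(
        1 / m_reps + 1 / n
        + (y0 - signal.mean()) ** 2 / (b1 ** 2 * ((conc - conc.mean()) ** 2).sum()))
    return x0, s_x0

# hypothetical As calibration standards (µg/L) and absorbance responses
conc = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
absb = np.array([0.002, 0.101, 0.198, 0.305, 0.398, 0.502])
x0, s_x0 = calibration_uncertainty(conc, absb, y0=0.250, m_reps=3)
U = 2 * s_x0   # expanded uncertainty, coverage factor k = 2
```

    Combining `s_x0` in quadrature with the other budget terms (digestion recovery, reference-material purity, precision) gives the combined uncertainty, which is then expanded with k = 2 as in the reported result.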

  10. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  11. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  12. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  13. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle cost...

  14. MOUSE (MODULAR ORIENTED UNCERTAINTY SYSTEM): A COMPUTERIZED UNCERTAINTY ANALYSIS SYSTEM (FOR MICRO- COMPUTERS)

    EPA Science Inventory

    Environmental engineering calculations involving uncertainties; either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...

  15. Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty

    EPA Science Inventory

    In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor analysis' uncertainty in TMDL development and the MOS is largel...

  16. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems we are working on a three part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data and assessment of sites using EPA and state protocols.

  17. Still serving hot soup? Two hundred years of a charitable food sector in Australia: a narrative review.

    PubMed

    Lindberg, Rebecca; Whelan, Jillian; Lawrence, Mark; Gold, Lisa; Friel, Sharon

    2015-08-01

    Despite the importance of the charitable food sector for a proportion of the Australian population, there is uncertainty about its present and future contributions to wellbeing. This paper describes its nature and examines its scope for improving health and food security. The review, using systematic methods for public health research, identified peer-reviewed and grey literature relevant to Australian charitable food programs (2002 to 2012). Seventy publications met the criteria and informed this paper. The sector includes food banks, more than 3,000 community agencies and 800 school breakfast programs. It provides food for up to two million people annually. The scope extends beyond emergency food relief and includes case management, advocacy and other support. Weaknesses include a food supply that is sub-optimal, resource limitations and lack of evidence to evaluate or support their work towards food security. The sector supports people experiencing disadvantage and involves multiple organisations, working in a variety of settings, to provide food for up to 8% of the population. The limits on the sector's capacity to address food insecurity by itself must be acknowledged so that civil society, government and the food industry can support sufficient, nutritious and affordable food for all. © 2015 Public Health Association of Australia.

  18. `spup' - An R Package for Analysis of Spatial Uncertainty Propagation and Application to Trace Gas Emission Simulations

    NASA Astrophysics Data System (ADS)

    Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.

    2016-12-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, in particular for case studies with spatial models and spatial model inputs. Given the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we demonstrate that the 'spup' package is an effective and easy-to-use tool even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with spatial variability of the model driving forces, such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of N2O and CO2 emissions for a low-mountainous, agriculturally developed catchment in Germany. The study tests the effect of spatial correlations on spatially aggregated model outputs, and can inform the development of best management practices and model improvement strategies.
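
    spup itself is an R package; the underlying idea, Monte Carlo propagation of an input error with and without spatial correlation, can be sketched in Python with a toy per-cell emission model (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(3)

def mc_propagate(n_sim, n_cells, sd, rho):
    """Monte Carlo propagation of a spatially correlated rainfall error
    through a toy emission model; returns the spread of the areal total.
    rho = 1 means fully correlated error, rho = 0 fully independent."""
    totals = np.empty(n_sim)
    for i in range(n_sim):
        common = rng.normal(0, sd)                    # shared (correlated) error
        local = rng.normal(0, sd, n_cells)            # cell-wise error
        rain = 800 + np.sqrt(rho) * common + np.sqrt(1 - rho) * local
        emission = 0.002 * rain                       # toy per-cell model
        totals[i] = emission.sum()                    # spatially aggregated output
    return totals.std()

sd_corr = mc_propagate(2000, 100, 50.0, rho=1.0)
sd_indep = mc_propagate(2000, 100, 50.0, rho=0.0)
```

    Fully correlated input error does not average out over the catchment, so the aggregated-output uncertainty is far larger than in the independent case, which is the spatial-correlation effect the study tests.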

  19. Earth Science Data and Models for Improved Targeting of Humanitarian Aid

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.

    2011-01-01

    Humanitarian assistance to developing countries has long focused on countries that have political, economic and strategic interest to the United States. Recent changes in global security concerns have heightened the perception that humanitarian action is becoming increasingly politicized. This is seen to be largely driven by the 'global war on terror' along with a push by donors and the United Nations for closer integration between humanitarian action and diplomatic, military and other spheres of engagement in conflict and crisis-affected states (HPG 2010). As we enter an era of rising commodity prices and increasing uncertainty in global food production due to a changing climate, scientific data and analysis will be increasingly important to improve the targeting of humanitarian assistance. Earth science data enables appropriate humanitarian response to complex food emergencies that arise in regions outside the areas of current strategic and security focus. As the climate changes, new places will become vulnerable to food insecurity and will need emergency assistance. Earth science data and multidisciplinary models will enable an information-based comparison of need that goes beyond strategic and political considerations to identify new hotspots of food insecurity as they emerge. These analyses will improve aid targeting and timeliness while reducing strategic risk by highlighting new regions at risk of crisis in a rapidly changing world. Improved targeting with respect to timing and location could reduce cost while increasing the likelihood that those who need aid get it.

  20. Uncertainty analysis and robust trajectory linearization control of a flexible air-breathing hypersonic vehicle

    NASA Astrophysics Data System (ADS)

    Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang

    2014-08-01

    Flexible air-breathing hypersonic vehicles feature significant uncertainties which pose huge challenges to robust controller design. In this paper, four major categories of uncertainties are analyzed: uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three, which lumps all uncertainties together and is consequently beneficial for controller synthesis. The fourth uncertainty is additionally considered in the stability analysis. Based on these analyses, the control design starts by decomposing the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. In particular, the stability of the nonlinear ESO is also discussed from a Liénard system perspective. Finally, simulations demonstrate the control performance and the uncertainty rejection ability of the robust scheme.

  1. Uncertainty Analysis of Instrument Calibration and Application

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Experimental aerodynamic researchers require estimated precision and bias uncertainties of measured physical quantities, typically at 95 percent confidence levels. Uncertainties of final computed aerodynamic parameters are obtained by propagation of individual measurement uncertainties through the defining functional expressions. In this paper, rigorous mathematical techniques are extended to determine precision and bias uncertainties of any instrument-sensor system. Through this analysis, instrument uncertainties determined through calibration are expressed as functions of the corresponding measurement for linear and nonlinear univariate and multivariate processes. Treatment of correlated measurement precision error is developed. During laboratory calibration, calibration standard uncertainties are assumed to be an order of magnitude less than those of the instrument being calibrated; often calibration standards do not satisfy this assumption. This paper applies rigorous statistical methods for inclusion of calibration standard uncertainty and covariance due to the order of their application. The effects of mathematical modeling error on calibration bias uncertainty are quantified, and the effects of experimental design on uncertainty are analyzed. The importance of replication is emphasized, and techniques for estimating both bias and precision uncertainties using replication are developed. Statistical tests for stationarity of calibration parameters over time are also obtained.
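
    Propagation of individual measurement uncertainties through a defining functional expression is first-order (GUM-style) propagation. A minimal sketch for dynamic pressure, with illustrative values only and inputs assumed uncorrelated:

```python
import math

def propagate(grads, us):
    """First-order propagation for independent inputs:
    u_f = sqrt( sum_i (df/dx_i * u_i)^2 )."""
    return math.sqrt(sum((g * u) ** 2 for g, u in zip(grads, us)))

# toy example: dynamic pressure q = 0.5 * rho * V**2
rho, V = 1.225, 50.0            # air density (kg/m^3), velocity (m/s)
u_rho, u_V = 0.01, 0.5          # standard uncertainties of each input
dq_drho = 0.5 * V ** 2          # partial derivative of q wrt rho
dq_dV = rho * V                 # partial derivative of q wrt V
u_q = propagate([dq_drho, dq_dV], [u_rho, u_V])
```

    Correlated inputs, as treated in the paper, add cross-terms of the form 2·(∂f/∂x_i)·(∂f/∂x_j)·cov(x_i, x_j) inside the square root.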

  2. Risk Assessment Terminology: Risk Communication Part 1

    PubMed Central

    Bentley, Stefano; Giacometti, Federica; Piva, Silvia; Serraino, Andrea

    2016-01-01

    The paper describes the terminology of risk communication from a food safety perspective: stakeholder theory, citizen involvement, and community interest and consultation are reported. Different aspects of risk communication (public communication, scientific uncertainty, trust, care, consensus and crisis communication) are discussed. PMID:27800435

  3. Estimates of Global Rangeland Net Primary Productivity and its Consumption Based on Climate and Livestock Distribution Data

    NASA Astrophysics Data System (ADS)

    Asrar, G.; Wolf, J.; Rafique, R.; West, T. O.; Ogle, S. M.

    2016-12-01

    Rangelands play an important role in providing ecosystem services such as food, forage, and fuels in many parts of the world. Net primary productivity (NPP), the difference between CO2 fixed by plants and CO2 lost to autotrophic respiration, is a good indicator of the productivity of rangeland ecosystems and of their contribution to the cycling of carbon in the Earth system. In this study, we estimated the NPP of global rangelands, its consumption by grazing livestock, and the associated uncertainties, to better understand and quantify the contribution of rangelands to land-based carbon storage. We estimated rangeland NPP using mean annual precipitation data from the Climatic Research Unit (CRU) and a regression model based on global observations (Del Grosso et al., 2008). Spatial distributions of annual livestock consumption of rangeland NPP (Wolf et al., 2015) were combined with gridded annual rangeland NPP for the years 2000-2011. The uncertainty analysis of these estimates was conducted using a Monte Carlo approach. The rangeland NPP estimates with associated uncertainties were also compared with total modeled GPP estimates obtained from dynamic vegetation model simulations. Our results showed that mean above-ground NPP of rangelands is 1017.5 MgC/km2, while mean below-ground NPP is 847.6 MgC/km2. Total rangeland NPP represents a significant portion of the total NPP of the terrestrial ecosystem. The per-animal area requirements used to distribute livestock spatially are based on optimal pasturage and are low relative to requirements on less productive land. Even so, ca. 90% of annual livestock consumption of rangeland NPP was met with no adjustment of livestock distributions. Moreover, the results of this study allowed us to explicitly quantify the temporal and spatial variations of rangeland NPP under different climatic conditions. The uncertainty analysis was helpful in identifying the strengths and weaknesses of the methods used to estimate rangeland NPP. Overall, the results from this study are useful for quantifying the contribution of rangelands to the carbon cycle and for providing geospatially distributed carbon fluxes associated with the production and consumption of rangeland biomass.
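
    A precipitation-based NPP estimate with Monte Carlo uncertainty can be sketched as follows; the linear form and coefficients are placeholders for illustration, not the Del Grosso et al. (2008) model:

```python
import numpy as np

rng = np.random.default_rng(11)

def npp_from_map(map_mm, slope, intercept):
    """Toy linear NPP ~ precipitation regression (the published model has
    a different functional form; coefficients here are purely illustrative)."""
    return np.maximum(slope * map_mm + intercept, 0.0)

n = 10_000
map_mm = 450.0                               # mean annual precipitation, mm
slope = rng.normal(2.0, 0.2, n)              # uncertain regression slope
intercept = rng.normal(100.0, 30.0, n)       # uncertain regression intercept
npp = npp_from_map(map_mm, slope, intercept)  # MgC/km^2, one value per draw
lo, hi = np.percentile(npp, [2.5, 97.5])     # 95% uncertainty interval
```

    Applying the same draw per grid cell and summing gives the uncertainty of the global total; sampling the regression coefficients once per realization (rather than per cell) preserves their spatial correlation.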

  4. Uncertainties in Past and Future Global Water Availability

    NASA Astrophysics Data System (ADS)

    Sheffield, J.; Kam, J.

    2014-12-01

    Understanding how water availability changes on inter-annual to decadal time scales, and how it may change in the future under climate change, is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in its physical representation and from lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land-surface-model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and incorporate water management in the form of reservoirs and irrigation. The results indicate that model uncertainty is important for short-term drought events, and that forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.

  5. Social norms and their influence on eating behaviours.

    PubMed

    Higgs, Suzanne

    2015-03-01

    Social norms are implicit codes of conduct that provide a guide to appropriate action. There is ample evidence that social norms about eating have a powerful effect on both food choice and amounts consumed. This review explores the reasons why people follow social eating norms and the factors that moderate norm following. It is proposed that eating norms are followed because they provide information about safe foods and facilitate food sharing. Norms are a powerful influence on behaviour because following (or not following) norms is associated with social judgements. Norm following is more likely when there is uncertainty about what constitutes correct behaviour and when there is greater shared identity with the norm referent group. Social norms may affect food choice and intake by altering self-perceptions and/or by altering the sensory/hedonic evaluation of foods. The same neural systems that mediate the rewarding effects of food itself are likely to reinforce the following of eating norms. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Importance analysis for Hudson River PCB transport and fate model parameters using robust sensitivity studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, S.; Toll, J.; Cothern, K.

    1995-12-31

The authors have performed robust sensitivity studies of the physico-chemical Hudson River PCB model PCHEPM to identify the parameters and process uncertainties contributing the most to uncertainty in predictions of water column and sediment PCB concentrations, over the time period 1977--1991 in one segment of the lower Hudson River. The term "robust sensitivity studies" refers to the use of several sensitivity analysis techniques to obtain a more accurate depiction of the relative importance of different sources of uncertainty. Local sensitivity analysis provided data on the sensitivity of PCB concentration estimates to small perturbations in nominal parameter values. Range sensitivity analysis provided information about the magnitude of prediction uncertainty associated with each input uncertainty. Rank correlation analysis indicated which parameters had the most dominant influence on model predictions. Factorial analysis identified important interactions among model parameters. Finally, term analysis looked at the aggregate influence of combinations of parameters representing physico-chemical processes. The authors scored the results of the local and range sensitivity and rank correlation analyses. The authors considered parameters that scored high on two of the three analyses to be important contributors to PCB concentration prediction uncertainty, and treated them probabilistically in simulations. They also treated probabilistically parameters identified in the factorial analysis as interacting with important parameters. The authors used the term analysis to better understand how uncertain parameters were influencing the PCB concentration predictions. The importance analysis allowed the authors to reduce the number of parameters to be modeled probabilistically from 16 to 5. This reduced the computational complexity of Monte Carlo simulations and, more importantly, provided a more lucid depiction of prediction uncertainty and its causes.
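Of the techniques listed, rank correlation analysis is the most straightforward to sketch. The snippet below uses an invented three-parameter stand-in model (the parameter names and functional form are illustrative, not part of PCHEPM) and ranks parameters by the absolute Spearman correlation between sampled inputs and the model output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model standing in for PCHEPM: "PCB concentration" as a
# nonlinear function of three uncertain rate parameters (names invented).
def model(k_settling, k_volatilization, k_partition):
    return k_partition * np.exp(-k_volatilization) / (1.0 + k_settling)

n = 2000
params = {
    "k_settling": rng.uniform(0.1, 1.0, n),
    "k_volatilization": rng.uniform(0.01, 0.5, n),
    "k_partition": rng.uniform(1.0, 10.0, n),
}
y = model(params["k_settling"], params["k_volatilization"], params["k_partition"])

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

# Rank parameters by |rho|; the largest value dominates the output uncertainty
ranking = sorted(((name, abs(spearman(x, y))) for name, x in params.items()),
                 key=lambda t: -t[1])
for name, rho in ranking:
    print(f"{name}: |rho| = {rho:.2f}")
```

In this toy setup `k_partition` spans a factor of ten while the other parameters only modulate the output mildly, so it comes out on top of the ranking.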

  7. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  8. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations.
The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainties in source releases and in transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.

  9. Uncertainty Measurement for Trace Element Analysis of Uranium and Plutonium Samples by Inductively Coupled Plasma-Atomic Emission Spectrometry (ICP-AES) and Inductively Coupled Plasma-Mass Spectrometry (ICP-MS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallimore, David L.

    2012-06-13

The measurement uncertainty estimation associated with trace element analysis of impurities in U and Pu was evaluated using the Guide to the Expression of Uncertainty in Measurement (GUM). In this evaluation the uncertainty sources were identified and the standard uncertainties for the components were categorized as either Type A or Type B. The combined standard uncertainty was calculated and a coverage factor k = 2 was applied to obtain the expanded uncertainty, U. The ICP-AES and ICP-MS methods used were developed for the multi-element analysis of U and Pu samples. A typical analytical run consists of standards, process blanks, samples, matrix spiked samples, post digestion spiked samples, and independent calibration verification standards. The uncertainty estimation was performed on U and Pu samples that had been analyzed previously as part of the U and Pu Sample Exchange Programs. Control chart results and data from the U and Pu metal exchange programs were combined with the GUM into a concentration-dependent estimate of the expanded uncertainty. Trace element uncertainties obtained using this model were compared to those obtained for trace element results reported as part of the Exchange programs. This process was completed for all trace elements determined to be above the detection limit for the U and Pu samples.
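The GUM root-sum-of-squares combination and the k = 2 expanded uncertainty described above can be sketched in a few lines; the component values below are invented for illustration:

```python
import math

# Type A (statistical) and Type B (other) standard uncertainties for a
# hypothetical trace-element result, in ug/g; all values are illustrative.
u_repeatability = 0.12  # Type A: std. deviation of the mean from replicates
u_calibration = 0.08    # Type B: calibration standard certificate
u_recovery = 0.05       # Type B: matrix spike recovery

# GUM root-sum-of-squares combination (uncorrelated components)
u_combined = math.sqrt(u_repeatability**2 + u_calibration**2 + u_recovery**2)

# Expanded uncertainty with coverage factor k = 2 (~95 % coverage)
k = 2
U = k * u_combined
print(f"u_c = {u_combined:.3f} ug/g, U = {U:.3f} ug/g")
```

The reported result would then be quoted as value ± U at the stated coverage factor.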

  10. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources, including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of the Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators obtained using the multi-model framework with those provided by each individual model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
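The block bootstrap of hydrograph residuals can be sketched as below; the synthetic "observed" and "simulated" series and the 30-day block length are assumptions for illustration, not the basin data of the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative daily "observed" and "simulated" streamflow (m3/s); real use
# would take these from a calibrated hydrological model and gauge records.
t = np.arange(365)
observed = 10 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)
simulated = 10 + 5 * np.sin(2 * np.pi * t / 365)

residuals = observed - simulated

def block_bootstrap(res, block_len, rng):
    """Resample residuals in contiguous blocks to preserve autocorrelation."""
    n = res.size
    starts = rng.integers(0, n - block_len, size=n // block_len + 1)
    blocks = [res[s:s + block_len] for s in starts]
    return np.concatenate(blocks)[:n]

# Build an ensemble of plausible streamflow series and a 95 % interval
ensemble = np.array([simulated + block_bootstrap(residuals, 30, rng)
                     for _ in range(1000)])
lower, upper = np.percentile(ensemble, [2.5, 97.5], axis=0)
coverage = np.mean((observed >= lower) & (observed <= upper))
print(f"fraction of observations inside the 95% band: {coverage:.2f}")
```

Resampling whole blocks rather than individual residuals is what preserves the day-to-day persistence of model errors, which is why it suits hydrograph residuals.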

  11. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.

  12. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effect on consequences of each selected uncertain parameter, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.

  13. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic, and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, incorporating prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  14. Projected shifts in copepod surface communities in the Mediterranean Sea under several climate change scenarios

    NASA Astrophysics Data System (ADS)

    Benedetti, F.; Guilhaumon, F.; Adloff, F.; Irisson, J. O.; Ayata, S. D.

    2016-02-01

Although future increases in water temperature and future changes in regional circulation are expected to have great impacts on the pelagic food web, estimates focusing on community-level shifts are still lacking for the planktonic compartment. By combining statistical niche models (or species distribution models) with projections from a regional circulation model, the impact of climate change on copepod epipelagic communities is assessed for the Mediterranean Sea. Habitat suitability maps are generated for 106 of the most abundant copepod species to analyze emerging patterns of diversity at the community level. Using variance analysis, we also quantified the uncertainties associated with our modeling strategy (niche model choice, CO2 emission scenario, boundary forcings of the circulation model). Comparing present and future projections, changes in species richness (alpha diversity) and in community composition (beta diversity, decomposed into turnover and nestedness components) are calculated. Average projections show that copepod communities will mainly experience turnover processes, with little change in species richness. Species gains are mainly located in the Gulf of Lions, the Northern Adriatic, and the Northern Aegean seas. However, projections are highly variable, especially in the Eastern Mediterranean basin. We show that such variability is mainly driven by the choice of the niche model, although interactions with the CO2 emission scenario or the boundary forcing of the circulation model can be locally important. Finally, the possible impact of the estimated community changes on zooplanktonic functional and phylogenetic diversity is also assessed. We encourage the extension of this type of study to other components of the pelagic food web, and argue that niche models' outputs should always be given along with a measure of uncertainty, and explained in light of a strong theoretical background.

  15. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
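One common possibility-to-probability transformation samples uniformly within randomly drawn alpha-cuts of the possibility distribution. Below is a minimal sketch for a triangular possibility distribution on a basic-event probability, propagated through a trivial two-event OR gate; all numbers are illustrative and this is not the hybrid framework of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Triangular possibility distribution for a basic-event probability:
# core c (possibility 1) and support [a, b]; the numbers are illustrative.
a, c, b = 1e-4, 5e-4, 1e-3

def sample(n):
    """Possibility-to-probability transformation by random alpha-cuts:
    draw alpha ~ U(0, 1), then draw uniformly within the alpha-cut
    [a + alpha (c - a), b - alpha (b - c)]."""
    alphas = rng.uniform(0.0, 1.0, n)
    lo = a + alphas * (c - a)
    hi = b - alphas * (b - c)
    return rng.uniform(lo, hi)

p1 = sample(100_000)
p2 = sample(100_000)

# Propagate through a two-basic-event OR gate (independent events):
# the top event occurs if either basic event occurs.
p_top = 1.0 - (1.0 - p1) * (1.0 - p2)
print(f"top-event probability: mean {p_top.mean():.2e}")
```

In a hybrid analysis, some inputs would stay possibilistic (propagated by interval arithmetic on alpha-cuts) while others stay probabilistic; the sketch shows only the transformed, fully probabilistic route.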

  16. Robustness analysis of non-ordinary Petri nets for flexible assembly systems

    NASA Astrophysics Data System (ADS)

    Hsieh, Fu-Shiung

    2010-05-01

Non-ordinary controlled Petri nets (NCPNs) have the advantage of modeling flexible assembly systems in which multiple identical resources may be required to perform an operation. However, existing studies on NCPNs are still limited. For example, the robustness properties of NCPNs have not been studied. This motivates us to develop an analysis method for NCPNs. Robustness analysis concerns the ability of a system to maintain operation in the presence of uncertainties. It provides an alternative way to analyse a perturbed system without reanalysis. In our previous research, we analysed the robustness properties of several subclasses of ordinary controlled Petri nets. To study the robustness properties of NCPNs, we augment NCPNs with an uncertainty model, which specifies an upper bound on the uncertainties for each reachable marking. The resulting PN models are called non-ordinary controlled Petri nets with uncertainties (NCPNU). Based on NCPNU, the problem is to characterise the maximal tolerable uncertainties for each reachable marking. The computational complexity of this characterisation grows exponentially with the size of the nets. Instead of considering general NCPNU, we therefore limit our scope to a subclass of PN models for assembly systems, called non-ordinary controlled flexible assembly Petri nets with uncertainties (NCFAPNU), and extend the robustness analysis to this subclass. We identify two types of uncertainties under which the liveness of NCFAPNU can be maintained.

  17. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling: GEOSTATISTICAL SENSITIVITY ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Chen, Xingyuan; Ye, Ming

Sensitivity analysis is an important tool for quantifying uncertainty in the outputs of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study we developed a hierarchical sensitivity analysis method that (1) constructs an uncertainty hierarchy by analyzing the input uncertainty sources, and (2) accounts for the spatial correlation among parameters at each level of the hierarchy using geostatistical tools. The contribution of the uncertainty source at each hierarchy level is measured by sensitivity indices calculated using the variance decomposition method. Using this methodology, we identified the most important uncertainty sources for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and the permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally, driven by the dynamic interaction between groundwater and river water at the site. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional, spatially distributed parameters.

  18. Uncertainties propagation and global sensitivity analysis of the frequency response function of piezoelectric energy harvesters

    NASA Astrophysics Data System (ADS)

    Ruiz, Rafael O.; Meruane, Viviana

    2017-06-01

The goal of this work is to describe a framework to propagate uncertainties in piezoelectric energy harvesters (PEHs). These uncertainties are related to the incomplete knowledge of the model parameters. The framework presented could be employed to conduct prior robust stochastic predictions. The prior analysis assumes a known probability density function for the uncertain variables and propagates the uncertainties to the output voltage. The framework is particularized to evaluate the behavior of the frequency response functions (FRFs) in PEHs, while its implementation is illustrated by the use of different unimorph and bimorph PEHs subjected to different scenarios: free of uncertainties, common uncertainties, and uncertainties as a product of imperfect clamping. The common variability associated with the PEH parameters is tabulated and reported. A global sensitivity analysis is conducted to identify the Sobol indices. Results indicate that the elastic modulus, density, and thickness of the piezoelectric layer are the parameters most relevant to the output variability. The importance of including the model parameter uncertainties in the estimation of the FRFs is revealed. In this sense, the present framework constitutes a powerful tool in the robust design and prediction of PEH performance.
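First-order Sobol indices of the kind reported above can be estimated with the Saltelli pick-freeze scheme. The toy response function and parameter ranges below are invented stand-ins for an actual PEH frequency-response model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for an FRF peak amplitude as a function of three uncertain
# parameters (elastic modulus E, density rho, layer thickness h); the model
# form and ranges are invented for illustration.
def model(x):
    E, rho, h = x[:, 0], x[:, 1], x[:, 2]
    return E * h**2 / rho

def draw(n):
    return np.column_stack([
        rng.uniform(60e9, 70e9, n),      # E   (Pa)
        rng.uniform(7600, 7800, n),      # rho (kg/m3)
        rng.uniform(0.8e-3, 1.2e-3, n),  # h   (m)
    ])

n = 20_000
A, B = draw(n), draw(n)
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

# Saltelli pick-freeze estimator of first-order Sobol indices:
# S_i ~ mean(yB * (y(AB_i) - yA)) / Var(Y), where AB_i is A with column i
# taken from B.
S = []
for i in range(3):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    S.append(np.mean(yB * (model(ABi) - yA)) / var_y)

for name, s in zip(["E", "rho", "h"], S):
    print(f"S_{name} = {s:.2f}")
```

Because the thickness enters squared and has the widest relative range here, S_h dominates; in the paper's richer model the same estimator apportions output variance among many more parameters.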

  19. The uncertainty of crop yield projections is reduced by improved temperature response functions

    USDA-ARS?s Scientific Manuscript database

Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on cr...

  20. Data-derived uncertainty factor approach in the revised N-methyl carbamate risk assessment.

    EPA Science Inventory

EPA completed its Revised Cumulative Risk Assessment (CRA) of the N-methyl Carbamate (NMC) Pesticides in 2007. This assessment evaluated the joint risk from 10 pesticides via food, water, and residential exposure. NMCs share the ability to inhibit acetylcholinesterase (AChE) via c...

  1. Interspecific variation in physiological and foliar metabolic responses to reduced soil water availability

    USDA-ARS?s Scientific Manuscript database

    Climatic uncertainty, particularly in regard to water resources, may alter irrigation management of rice, an essential cereal grain acknowledged as the primary food source for more than half the world’s population. To reduce water use, an alternate wetting and drying (AWD) system has been developed...

  2. UNCERTAINTY IN SOURCE PARTITIONING USING STABLE ISOTOPES

    EPA Science Inventory

    Stable isotope analyses are often used to quantify the contribution of multiple sources to a mixture, such as proportions of food sources in an animal's diet, C3 vs. C4 plant inputs to soil organic carbon, etc. Linear mixing models can be used to partition two sources with a sin...
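The two-source, single-isotope linear mixing model mentioned above has a closed form, and measurement uncertainty propagates through it linearly; the delta values and measurement standard deviation below are illustrative:

```python
# Two-source, one-isotope linear mixing model:
#   delta_mix = f1 * delta_1 + (1 - f1) * delta_2
# solved for the fraction f1 of source 1 in the mixture.
delta_1 = -12.0    # e.g. d13C of a C4 plant source (permil); illustrative
delta_2 = -27.0    # e.g. d13C of a C3 plant source (permil); illustrative
delta_mix = -21.0  # measured mixture value

f1 = (delta_mix - delta_2) / (delta_1 - delta_2)

# First-order uncertainty propagation: with measurement standard deviation
# sigma_mix on delta_mix, the uncertainty in f1 scales inversely with the
# spacing between the source signatures.
sigma_mix = 0.5
sigma_f1 = sigma_mix / abs(delta_1 - delta_2)
print(f"source 1 fraction: {f1:.2f} +/- {sigma_f1:.3f}")
```

The sigma_f1 expression makes the abstract's point concrete: the closer the two source signatures, the larger the uncertainty in the partitioned proportions.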

  3. What do we know about the effects of exposure to 'Low alcohol' and equivalent product labelling on the amounts of alcohol, food and tobacco people select and consume? A systematic review.

    PubMed

    Shemilt, Ian; Hendry, Vivien; Marteau, Theresa M

    2017-01-12

Explicit labelling of lower strength alcohol products could reduce alcohol consumption by attracting more people to buy and drink such products instead of higher strength ones. Alternatively, it may lead to more consumption due to a 'self-licensing' mechanism. Equivalent labelling of food or tobacco (for example "Low fat" or "Low tar") could influence consumption of those products by similar mechanisms. This systematic review examined the effects of 'Low alcohol' and equivalent labelling of alcohol, food and tobacco products on selection, consumption, and perceptions of products among adults. A systematic review was conducted based on Cochrane methods. Electronic and snowball searches identified 26 eligible studies. Evidence from 12 randomised controlled trials (all on food) was assessed for risk of bias, synthesised using random effects meta-analysis, and interpreted in conjunction with evidence from 14 non-randomised studies (one on alcohol, seven on food and six on tobacco). Outcomes assessed were: quantities of the product (i) selected or (ii) consumed (primary outcomes - behaviours), (iii) intentions to select or consume the product, (iv) beliefs associated with its consumption, (v) product appeal, and (vi) understanding of the label (secondary outcomes - cognitions). Evidence for impacts on the primary outcomes (i.e. amounts selected or consumed) was overall of very low quality, showing mixed effects, likely to vary by specific label descriptors, products and population characteristics. Overall very low quality evidence suggested that exposure to 'Low alcohol' and equivalent labelling on alcohol, food and tobacco products can shift consumer perceptions of products, with the potential to 'self-licence' excess consumption. Considerable uncertainty remains about the effects of labels denoting low alcohol, and equivalent labels, on alcohol, food and tobacco selection and consumption.
Independent, high-quality studies are urgently needed to inform policies on labelling regulations.

  4. New analysis strategies for micro aspheric lens metrology

    NASA Astrophysics Data System (ADS)

    Gugsa, Solomon Abebe

Effective characterization of an aspheric micro lens is critical for understanding and improving processing in micro-optic manufacturing. Since most microlenses are plano-convex, where the convex geometry is a conic surface, current practice is often limited to obtaining an estimate of the lens conic constant, which averages out the surface geometry that departs from an exact conic surface and any additional surface irregularities. We have developed a comprehensive approach to estimating the best-fit conic and its uncertainty, and in addition propose an alternative analysis that focuses on surface errors rather than the best-fit conic constant. We describe our new analysis strategy based on the two most dominant micro lens metrology methods in use today, namely, scanning white light interferometry (SWLI) and phase shifting interferometry (PSI). We estimate several parameters from the measurement. The major uncertainty contributors for SWLI are the estimates of the base radius of curvature, the aperture of the lens, the sag of the lens, noise in the measurement, and the center of the lens. In the case of PSI the dominant uncertainty contributors are noise in the measurement, the radius of curvature, and the aperture. Our best-fit conic procedure uses least squares minimization to extract a best-fit conic value, which is then subjected to a Monte Carlo analysis to capture the combined uncertainty. In our surface errors analysis procedure, we consider the surface errors as the difference between the measured geometry and the best-fit conic surface or as the difference between the measured geometry and the design specification for the lens. We focus on a Zernike polynomial description of the surface error, and again a Monte Carlo analysis is used to estimate a combined uncertainty, which in this case is an uncertainty for each Zernike coefficient.
Our approach also allows us to investigate the effect of individual uncertainty parameters and measurement noise on both the best-fit conic constant analysis and the surface errors analysis, and compare the individual contributions to the overall uncertainty.
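    The Monte Carlo step described above can be illustrated with a minimal sketch (hypothetical sag function and uncertainty values, not the authors' actual code): perturb the measured inputs, re-evaluate the conic sag, and take the spread of the results as a combined uncertainty.

```python
import math
import random
import statistics

def conic_sag(r, R, k):
    """Sag of a conic surface at radial coordinate r (base radius R, conic constant k)."""
    return r * r / (R * (1.0 + math.sqrt(1.0 - (1.0 + k) * r * r / (R * R))))

def monte_carlo_sag_uncertainty(r, R, k, sigma_R, sigma_r, n=20000, seed=1):
    """Propagate radius-of-curvature and aperture uncertainties to sag by Monte Carlo."""
    rng = random.Random(seed)
    samples = [conic_sag(rng.gauss(r, sigma_r), rng.gauss(R, sigma_R), k)
               for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

# Hypothetical lens: 1 mm base radius, conic constant -0.5, evaluated at r = 0.2 mm.
mean_sag, u_sag = monte_carlo_sag_uncertainty(r=0.2, R=1.0, k=-0.5,
                                              sigma_R=0.002, sigma_r=0.0005)
```

    The same loop, repeated with one input frozen at a time, gives the individual contributions the abstract mentions.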

  5. Estimating Uncertainty in N2O Emissions from US Cropland Soils

    USDA-ARS?s Scientific Manuscript database

    A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...

  6. Uncertainty Analysis of Inertial Model Attitude Sensor Calibration and Application with a Recommended New Calibration Method

    NASA Technical Reports Server (NTRS)

    Tripp, John S.; Tcheng, Ping

    1999-01-01

    Statistical tools, previously developed for nonlinear least-squares estimation of multivariate sensor calibration parameters and the associated calibration uncertainty analysis, have been applied to single- and multiple-axis inertial model attitude sensors used in wind tunnel testing to measure angle of attack and roll angle. The analysis provides confidence and prediction intervals of calibrated sensor measurement uncertainty as functions of applied input pitch and roll angles. A comparative performance study of various experimental designs for inertial sensor calibration is presented along with corroborating experimental data. The importance of replicated calibrations over extended time periods has been emphasized; replication provides independent estimates of calibration precision and bias uncertainties, statistical tests for calibration or modeling bias uncertainty, and statistical tests for sensor parameter drift over time. A set of recommendations for a new standardized model attitude sensor calibration method and usage procedures is included. The statistical information provided by these procedures is necessary for the uncertainty analysis of aerospace test results now required by users of industrial wind tunnel test facilities.
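    A minimal sketch of a least-squares calibration with a prediction interval, in the spirit of the analysis described (illustrative data and a hardcoded Student-t quantile; the paper's multivariate nonlinear method is more general):

```python
import math
import statistics

def linear_calibration(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b, s), where s is the
    residual standard deviation (an estimate of calibration precision)."""
    n = len(x)
    xbar, ybar = statistics.mean(x), statistics.mean(y)
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    a = ybar - b * xbar
    s = math.sqrt(sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2))
    return a, b, s

def prediction_interval(x0, x, a, b, s, t=2.306):
    """Prediction interval for a new measurement at x0; t is the Student-t
    quantile for n-2 dof (2.306 = 95% for 8 dof, hardcoded for this example)."""
    n = len(x)
    xbar = statistics.mean(x)
    sxx = sum((xi - xbar) ** 2 for xi in x)
    half = t * s * math.sqrt(1.0 + 1.0 / n + (x0 - xbar) ** 2 / sxx)
    yhat = a + b * x0
    return yhat - half, yhat + half

# Hypothetical replicated calibration: applied pitch angle (deg) vs. sensor output.
x = [0, 2, 4, 6, 8, 10, 0, 2, 4, 6]
y = [0.02, 1.98, 4.05, 5.97, 8.01, 10.02, -0.01, 2.03, 3.96, 6.04]
a, b, s = linear_calibration(x, y)
low, high = prediction_interval(5.0, x, a, b, s)
```

    Replicated points (here the repeated 0-6 deg settings) are what make the precision estimate s independent of the model fit, which is the point the abstract emphasizes.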

  7. Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)

    NASA Astrophysics Data System (ADS)

    Rahmani, E.; Hense, A.

    2017-12-01

    Are there deterministic relations between phenological and climate parameters? The answer is surely `No'. This answer motivated us to solve the problem through probabilistic theories. Thus, we developed a probabilistic phenological model which has the advantage of giving additional information in terms of uncertainty. To that aim, we turned to a statistical method named survival analysis. Survival analysis deals with death in biological organisms and failure in mechanical systems. In the survival analysis literature, death or failure is considered an event. By event, in this research we mean the ripening date of wheat, and we assume only one event in this special case. By time, we mean the growing duration from sowing to ripening, the lifetime of wheat, which is a function of GDD. To be more precise, we perform a probabilistic forecast of wheat ripening, with the probability value ranging between 0 and 1. Here, the survivor function gives the probability that the not-yet-ripened wheat survives longer than a specific time, or will survive to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to the GDD as a function of growth duration. Verification of the obtained models is done using the CRPS skill score (CRPSS). The positive values of CRPSS indicate the large superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modeling is beneficial, meaningful, and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
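    The survivor function described above can be sketched as follows, assuming hypothetical station data: fit a normal distribution to observed GDD at ripening, then take S(g) = 1 - CDF(g) as the probability the crop is not yet ripe at cumulative GDD g.

```python
import statistics

def ripening_survival_model(gdd_at_ripening):
    """Fit a normal distribution to observed GDD at ripening; return the
    survivor function S(g) = P(crop not yet ripe at cumulative GDD g)."""
    dist = statistics.NormalDist(statistics.mean(gdd_at_ripening),
                                 statistics.stdev(gdd_at_ripening))
    return lambda g: 1.0 - dist.cdf(g)

# Hypothetical station record of cumulative GDD at observed ripening dates.
obs = [1450, 1480, 1500, 1520, 1465, 1510, 1495, 1530]
survive = ripening_survival_model(obs)
```

    By construction S is 0.5 at the mean ripening GDD and decreases monotonically, which is the probabilistic forecast the abstract contrasts with a single deterministic ripening date.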

  8. Impact of climate change on crop yield and role of model for achieving food security.

    PubMed

    Kumar, Manoj

    2016-08-01

    In recent times, several studies around the globe indicate that climatic changes are likely to impact food production and pose a serious challenge to food security. In the face of climate change, agricultural systems need to adopt measures not only to increase the food supply for a growing global population with changing dietary patterns, but also to negate the negative environmental impacts on the earth. Crop simulation models are the primary tools available to assess the potential consequences of climate change on crop production and to inform adaptive strategies in agricultural risk management. In view of this important issue, this paper provides a review of the relationship between climate change impacts and crop production. It also emphasizes the role of crop simulation models in achieving food security. Significant progress has been made in understanding the potential consequences of temperature and precipitation effects on agricultural production during the last half century. Increased CO2 fertilization may offset some of the potential impacts of climate change, but its effectiveness is still in doubt and debated among researchers. To assess the potential consequences of climate change on agriculture, different crop simulation models have been developed to provide informative strategies to avoid risks and to understand the underlying physical and biological processes. Furthermore, they can help in crop improvement programmes by identifying appropriate future crop management practices and recognizing the traits having the greatest impact on yield. Nonetheless, climate change assessment through models is subject to a range of uncertainties. The prediction uncertainty can be reduced by using multimodel ensembles and by integrating crop modelling with plant physiology, biochemistry, and gene-based modelling. For developing new models, there is a need to generate and compile high-quality field data for model testing. Therefore, assessment of agricultural productivity to sustain food security for generations is essential to maintain the collective knowledge and resources for preventing negative impacts as well as managing crop practices.

  9. Predicting Consumer Biomass, Size-Structure, Production, Catch Potential, Responses to Fishing and Associated Uncertainties in the World’s Marine Ecosystems

    PubMed Central

    Jennings, Simon; Collingridge, Kate

    2015-01-01

    Existing estimates of fish and consumer biomass in the world’s oceans are disparate. This creates uncertainty about the roles of fish and other consumers in biogeochemical cycles and ecosystem processes, the extent of human and environmental impacts and fishery potential. We develop and use a size-based macroecological model to assess the effects of parameter uncertainty on predicted consumer biomass, production and distribution. Resulting uncertainty is large (e.g. median global biomass 4.9 billion tonnes for consumers weighing 1 g to 1000 kg; 50% uncertainty intervals of 2 to 10.4 billion tonnes; 90% uncertainty intervals of 0.3 to 26.1 billion tonnes) and driven primarily by uncertainty in trophic transfer efficiency and its relationship with predator-prey body mass ratios. Even the upper uncertainty intervals for global predictions of consumer biomass demonstrate the remarkable scarcity of marine consumers, with less than one part in 30 million by volume of the global oceans comprising tissue of macroscopic animals. Thus the apparently high densities of marine life seen in surface and coastal waters and frequently visited abundance hotspots will likely give many in society a false impression of the abundance of marine animals. Unexploited baseline biomass predictions from the simple macroecological model were used to calibrate a more complex size- and trait-based model to estimate fisheries yield and impacts. Yields are highly dependent on baseline biomass and fisheries selectivity. Predicted global sustainable fisheries yield increases ≈4-fold when smaller individuals (< 20 cm from species of maximum mass < 1 kg) are targeted in all oceans, but the predicted yields would rarely be accessible in practice and this fishing strategy leads to the collapse of larger species if fishing mortality rates on different size classes cannot be decoupled.
Our analyses show that models with minimal parameter demands that are based on a few established ecological principles can support equitable analysis and comparison of diverse ecosystems. The analyses provide insights into the effects of parameter uncertainty on global biomass and production estimates, which have yet to be achieved with complex models, and will therefore help to highlight priorities for future research and data collection. However, the focus on simple model structures and global processes means that non-phytoplankton primary production and several groups, structures and processes of ecological and conservation interest are not represented. Consequently, our simple models become increasingly less useful than more complex alternatives when addressing questions about food web structure and function, biodiversity, resilience and human impacts at smaller scales and for areas closer to coasts. PMID:26226590
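    The dominant role of transfer-efficiency uncertainty can be caricatured with a one-parameter Monte Carlo sketch (all numbers hypothetical, not the paper's size-based model): biomass taken as proportional to primary production times transfer efficiency raised to the number of trophic steps, with the efficiency sampled from a range.

```python
import random

def biomass_uncertainty(primary_prod, trophic_steps, te_low, te_high,
                        n=20000, seed=7):
    """Crude sketch: consumer biomass ~ primary production * TE**steps, with
    trophic transfer efficiency TE sampled uniformly; returns the median and
    50%/90% uncertainty intervals from the sampled distribution."""
    rng = random.Random(seed)
    draws = sorted(primary_prod * rng.uniform(te_low, te_high) ** trophic_steps
                   for _ in range(n))
    q = lambda p: draws[int(p * (n - 1))]  # empirical quantile
    return {"median": q(0.5), "50%": (q(0.25), q(0.75)), "90%": (q(0.05), q(0.95))}

# Hypothetical inputs: 50 Gt primary production, 4 trophic steps, TE in 5-15%.
est = biomass_uncertainty(primary_prod=50e9, trophic_steps=4,
                          te_low=0.05, te_high=0.15)
```

    Because the efficiency enters raised to a power, even a modest input range produces the order-of-magnitude output intervals the abstract reports.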

  10. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  11. Migration studies of nickel and chromium from ceramic and glass tableware into food simulants.

    PubMed

    Szynal, Tomasz; Rebeniak, Małgorzata; Mania, Monika

    In addition to the release of lead and cadmium from ceramic and glass vessels (acceptable limits being set by the EU 84/500/EC Directive), other harmful metals can migrate, such as nickel and chromium. Permissible migration limits for these latter metals, however, have not yet been set in EU legislation. Both the toxic properties of nickel and chromium and the measures taken by the European Commission Working Group on Food Contact Materials for verifying permissible migration limits for lead, cadmium and other metals from ceramics have acted as drivers for studies on nickel and chromium release from ceramic and glass tableware. The aim was to investigate the migration of nickel and chromium into food simulants from ceramic and glassware available on the Polish market which are intended to come into contact with food; potential consumer exposure can thereby be estimated from the release of these elements into food. Tableware consisted of ceramic and glass vessels generally available on the domestic market, with inner surfaces being mainly coloured and with rim decorations. Migration testing of nickel and chromium from the ceramics was carried out in 4% acetic acid (24 ± 0.5 hrs at 22 ± 2°C), whilst that from glassware was carried out in 4% acetic acid (24 ± 0.5 hrs at 22 ± 2°C) and 0.5% citric acid (2 ± 0.1 hrs at 70 ± 2°C). The concentrations of metals which had migrated into the test solutions were measured by flame atomic absorption spectrometry (FAAS). This analytical procedure had been previously validated by measuring nickel and chromium released into food simulants from ceramic and glass tableware, where working ranges, detection limits, quantification limits, repeatability, accuracy, mean recovery and uncertainty were established.
    Migration of nickel and chromium was measured from 172 ceramic and 52 glass vessel samples, with all results being below the limits of quantification (LOQ = 0.02 mg/L), excepting one instance where a 0.04 mg/L concentration of nickel was found. The validated method for measuring chromium achieved the following parameters: 0.02 to 0.80 mg/L working range, 0.01 mg/L detection limit, 0.02 mg/L limit of quantification, 6% repeatability, 2.8% accuracy, 102% average recovery and 11% uncertainty. For the nickel method the corresponding parameters were 0.02 to 0.80 mg/L working range, 0.02 mg/L limit of quantification, 0.01 mg/L detection limit, 5% repeatability, 6.5% accuracy, 101% average recovery and 12% uncertainty. The tested ceramics and glassware did not pose a threat to human health regarding migration of nickel and chromium, and thus any potential exposure to these metals released from these products into food will be small. However, due to the toxicity of these metals, testing for the migration of nickel and chromium is still required for articles coming into contact with food, which includes metalware. Keywords: ceramic tableware, ceramics, glassware, food contact articles, nickel, chromium leaching, migration.

  12. Determination of pesticide residues in animal origin baby foods by gas chromatography coupled with triple quadrupole mass spectrometry.

    PubMed

    Amendola, Graziella; Pelosi, Patrizia; Attard Barbini, Danilo

    2015-01-01

    A simple, fast and multiresidue method for the determination of pesticide residues in baby foods of animal origin has been developed in order to check compliance with the Maximum Residue Levels (MRLs), set at a general value of 0.01 mg/kg by Commission Directive 2006/125/EC for infant foods. The main classes of organochlorine, organophosphorus and pyrethroid compounds have been considered, which are mainly fat-soluble pesticides. The analytical procedure consists of the extraction of baby food samples with acetonitrile (ACN) followed by a clean-up using a C18 solid-phase extraction column eluted with ACN. The compounds were determined by gas chromatography-triple quadrupole mass spectrometry equipped with a Programmed Temperature Vaporizer (PTV) injector and a backflush system. In order to compensate for matrix effects, PTV injection and matrix-matched standard calibrations have been used. The method has been fully validated for 57 pesticides according to Document SANCO/12571/2013. Accuracy and precision (repeatability) have been studied through recoveries at two spiking levels, the Limit of Quantitation (LOQ) (0.003-0.008 mg/kg) and 10 times greater (0.03-0.08 mg/kg), and the results were in the acceptable range of 70-120% with Relative Standard Deviations (RSD) ≤20%. Selectivity, linearity, LOQ and uncertainty of measurement were also determined for all the compounds. The method has also been applied to the analysis of 18 animal origin baby food samples, bought from the local market in Rome (Italy), and no pesticide in the scope of the method was found above the MRL or the LOQ.

  13. A Review On Accuracy and Uncertainty of Spatial Data and Analyses with special reference to Urban and Hydrological Modelling

    NASA Astrophysics Data System (ADS)

    Devendran, A. A.; Lakshmanan, G.

    2014-11-01

    Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology in problem-solving and decision-making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identifying the sources of these uncertainties and the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and in geospatial analysis applied to any field. This paper reviews articles on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The Cellular Automata (CA) model is one such simulation model; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model to evaluate various impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties using the SWAT model are dealt with primarily by the Generalized Likelihood Uncertainty Estimation (GLUE) method.

  14. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

    There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from the projection of future societies, local climate change impacts, and suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominant when deciding on action or inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision to act has been taken.
    In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
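    The attribution of overall uncertainty to individual processes can be sketched with a one-at-a-time Monte Carlo variance decomposition (a toy two-factor model, not the study's six-process cascade; factor names and values are hypothetical):

```python
import random
import statistics

def variance_contribution(factors, model, n=20000, seed=3):
    """One-at-a-time attribution: fix each uncertain factor at its mean and
    measure how much the output variance drops (its share of total variance).
    factors maps name -> (mean, std); model maps a draw dict to an output."""
    rng = random.Random(seed)

    def run(frozen=None):
        out = []
        for _ in range(n):
            draw = {k: (m if k == frozen else rng.gauss(m, s))
                    for k, (m, s) in factors.items()}
            out.append(model(draw))
        return statistics.variance(out)

    total = run()
    return {k: max(0.0, 1.0 - run(frozen=k) / total) for k in factors}

# Hypothetical flood-risk NPV model with two of the six bulk processes shown:
# a climate change impact multiplier and a unit cost of repair.
factors = {"climate_impact": (1.2, 0.1), "unit_cost": (1000.0, 300.0)}
shares = variance_contribution(factors,
                               lambda d: d["climate_impact"] * d["unit_cost"])
```

    Here the unit cost has a much larger relative spread, so freezing it removes most of the output variance, mirroring the paper's finding that climate change impact was not the dominant contributor.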

  15. 134Cs emission probabilities determination by gamma spectrometry

    NASA Astrophysics Data System (ADS)

    de Almeida, M. C. M.; Poledna, R.; Delgado, J. U.; Silva, R. L.; Araujo, M. T. F.; da Silva, C. J.

    2018-03-01

    The National Laboratory for Ionizing Radiation Metrology (LNMRI/IRD/CNEN) of Rio de Janeiro performed primary and secondary standardization of different radionuclides, reaching satisfactory uncertainties. A solution of the radionuclide 134Cs was purchased from a commercial supplier for the determination of the emission probabilities of some of its energies. 134Cs is a beta-gamma emitter with a half-life of 754 days. This radionuclide is used as a standard in environmental, water and food control, and is also important for germanium detector calibration. The gamma emission probabilities (Pγ) were determined for several energies of 134Cs by the efficiency curve method, and the absolute Pγ uncertainties obtained were below 1% (k=1).
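    A minimal sketch of the efficiency curve method and the quadrature combination of relative uncertainties (illustrative numbers only, not the laboratory's values): the emission probability is the net peak count rate divided by detection efficiency and source activity.

```python
import math

def gamma_emission_probability(net_counts, efficiency, activity_bq, live_time_s):
    """Efficiency curve method: P_gamma = N / (eps * A * t), with N the net peak
    counts, eps the detector efficiency at that energy (from the efficiency
    curve), A the source activity, and t the live time."""
    return net_counts / (efficiency * activity_bq * live_time_s)

def combined_relative_uncertainty(*rel_components):
    """Combined standard uncertainty (k=1) by quadrature of relative terms,
    e.g. counting statistics, efficiency, and activity uncertainties."""
    return math.sqrt(sum(c * c for c in rel_components))

# Hypothetical peak: 485 net counts, 5% efficiency, 1000 Bq source, 10 s live time.
p_gamma = gamma_emission_probability(485, 0.05, 1000.0, 10.0)
u_rel = combined_relative_uncertainty(0.003, 0.004)  # e.g. counting + efficiency
```

    The sub-1% uncertainties the abstract reports correspond to keeping every term in the quadrature sum at the few-per-mille level.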

  16. Traceable Coulomb blockade thermometry

    NASA Astrophysics Data System (ADS)

    Hahtela, O.; Mykkänen, E.; Kemppinen, A.; Meschke, M.; Prunnila, M.; Gunnarsson, D.; Roschier, L.; Penttilä, J.; Pekola, J.

    2017-02-01

    We present a measurement and analysis scheme for determining traceable thermodynamic temperature at cryogenic temperatures using Coulomb blockade thermometry. The uncertainty of the electrical measurement is improved by utilizing two sampling digital voltmeters instead of the traditional lock-in technique. The remaining uncertainty is dominated by that of the numerical analysis of the measurement data. Two analysis methods are demonstrated: numerical fitting of the full conductance curve and measuring the height of the conductance dip. The complete uncertainty analysis shows that using either analysis method the relative combined standard uncertainty (k  =  1) in determining the thermodynamic temperature in the temperature range from 20 mK to 200 mK is below 0.5%. In this temperature range, both analysis methods produced temperature estimates that deviated from 0.39% to 0.67% from the reference temperatures provided by a superconducting reference point device calibrated against the Provisional Low Temperature Scale of 2000.

  17. Development of a Prototype Model-Form Uncertainty Knowledge Base

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  18. Investigating the Sustainability of Perennial Agriculture

    NASA Astrophysics Data System (ADS)

    Sutherlin, C. E.; Brunsell, N. A.; De Oliveira, G.; Crews, T.; Vico, G.

    2017-12-01

    The changing climate leads to uncertainties concerning the sustainability of certain agricultural resources, and with the additional stresses of an increasing global population, uncertainty in food security will greatly increase. To meet future food demands in the face of this changing climate, perennial agriculture has been proposed as a solution. However, it is equally important to ensure that perennial agriculture does not negatively affect the climate in exchange for this proposed more robust food source. We chose to examine the interactions between perennial and annual agricultural crops by focusing on the efficiency of exchanges with the atmosphere. This is done using the omega decoupling factor at four different sites as a way of quantifying the contributions of radiation and stomatal conductance to the resulting water and carbon cycles. This gives us an indication of how the plant canopy is interacting with, and influencing, the local microclimate. Ultimately, this should give us an indication of the ability of perennial crops to aid in the climate mitigation process. We hypothesized that the perennial site chosen would have omega values more similar to those of a natural grassland than to those of an annual crop site. Using AmeriFlux towers to determine the canopy values needed to calculate the omega decoupling factor, we focused on the Kernza perennial crops being grown at the Land Institute in Salina, Kansas (KLS), in comparison to a natural grassland in Manhattan, Kansas (KON), a typical land cover site in Lawrence, Kansas (KFS), and an annual crop site in Lamont, Oklahoma (ARM). These results will allow us to move forward in the investigation of perennial crops as a sustainable food source.
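    Assuming the standard Jarvis-McNaughton form of the decoupling factor referred to above, a minimal sketch (symbols and values are illustrative, not the study's tower data):

```python
def omega_decoupling(epsilon, g_a, g_s):
    """Jarvis-McNaughton decoupling factor: omega near 1 means canopy
    transpiration is radiation-coupled; near 0 means it is controlled by
    stomatal conductance and vapour pressure deficit.
    epsilon = s/gamma (slope of the saturation vapour-pressure curve over the
    psychrometric constant); g_a, g_s are aerodynamic and surface conductances
    (same units, e.g. m/s)."""
    return (1.0 + epsilon) / (1.0 + epsilon + g_a / g_s)

# Hypothetical midday values: epsilon = 2, g_a = 0.1 m/s, two canopies with
# different surface conductance.
omega_closed_stomata = omega_decoupling(2.0, 0.1, 0.01)
omega_open_stomata = omega_decoupling(2.0, 0.1, 0.05)
```

    Comparing omega across the perennial, grassland, and annual sites is then a matter of evaluating this ratio from each tower's conductance estimates.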

  19. UNCERTAINTY ANALYSIS IN WATER QUALITY MODELING USING QUAL2E

    EPA Science Inventory

    A strategy for incorporating uncertainty analysis techniques (sensitivity analysis, first order error analysis, and Monte Carlo simulation) into the mathematical water quality model QUAL2E is described. The model, named QUAL2E-UNCAS, automatically selects the input variables or p...

  20. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With pressed budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis from computational data alone. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied within a specified tolerance or bias error, and the difference in the results is used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations, and conclusions are drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
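    The Student-t step can be sketched as follows (hypothetical run values; the paper's input ranking and bias-error procedure is more involved): treat the results of a small number of perturbed-input runs as a sample and form a t-based interval on the mean.

```python
import math
import statistics

def cfd_uncertainty(results, t=3.182):
    """Estimate the uncertainty of a CFD-predicted quantity from a small set of
    runs with perturbed inputs; t is the Student-t quantile for n-1 dof
    (3.182 = 95% for 3 dof, hardcoded for this 4-run illustration)."""
    n = len(results)
    mean = statistics.mean(results)
    half = t * statistics.stdev(results) / math.sqrt(n)
    return mean, (mean - half, mean + half)

# Hypothetical heat transfer coefficients (W/m^2/K) from four runs in which
# inlet velocity and fluid properties were perturbed within their tolerances.
h_runs = [102.0, 98.5, 101.2, 99.8]
h_mean, ci = cfd_uncertainty(h_runs)
```

    The width of the interval, rather than a comparison with experiment, then serves as the reported uncertainty when no measured data exist.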

  1. Intrusive Method for Uncertainty Quantification in a Multiphase Flow Solver

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2016-11-01

    Uncertainty quantification (UQ) is a necessary, interesting, and often neglected aspect of fluid flow simulations. To determine the significance of uncertain initial and boundary conditions, a multiphase flow solver is being created which extends a single-phase, intrusive, polynomial chaos scheme to multiphase flows. Reliably estimating the impact of input uncertainty on design criteria can help identify and minimize unwanted variability in critical areas, and has the potential to help advance knowledge of atomizing jets, jet engines, pharmaceuticals, and food processing. Use of an intrusive polynomial chaos method has been shown to significantly reduce computational cost over non-intrusive collocation methods such as Monte Carlo. This method requires transforming the model equations into a weak form through substitution of stochastic (random) variables. Ultimately, the model deploys a stochastic Navier-Stokes equation, a stochastic conservative level set approach including reinitialization, as well as stochastic normals and curvature. By implementing these approaches together in one framework, basic problems may be investigated which shed light on model expansion, uncertainty theory, and fluid flow in general. NSF Grant Number 1511325.
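    A minimal illustration of the intrusive (Galerkin) polynomial chaos idea on a single nonlinear term, u², for u = u0 + u1·ξ with ξ standard normal and a Hermite basis (a toy, far from a stochastic Navier-Stokes solver): the product is projected back onto the basis, and statistics are read off the coefficients without any sampling.

```python
def pc_square(u0, u1):
    """Galerkin projection of u^2 for u = u0 + u1*xi (xi standard normal) onto
    the probabilists' Hermite basis {1, xi, xi^2 - 1}: since
    u^2 = u0^2 + 2*u0*u1*xi + u1^2*xi^2 and xi^2 = (xi^2 - 1) + 1,
    the three PC coefficients are as returned below."""
    return (u0 * u0 + u1 * u1, 2.0 * u0 * u1, u1 * u1)

def pc_mean_var(coeffs):
    """Mean and variance from Hermite PC coefficients; the basis norms
    E[He_k^2] are 1, 1, 2 for the three modes kept here."""
    c0, c1, c2 = coeffs
    return c0, c1 * c1 + 2.0 * c2 * c2

# Uncertain input u = 1.0 + 0.1*xi propagated through the quadratic term.
mean, var = pc_mean_var(pc_square(1.0, 0.1))
```

    This is the cost advantage the abstract cites: the nonlinear term's statistics come from a small deterministic coefficient system rather than thousands of Monte Carlo realizations.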

  2. A methodology to estimate uncertainty for emission projections through sensitivity analysis.

    PubMed

    Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación

    2015-04-01

    Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As uncertainties in emission projections are inevitable and traditional statistical treatments of uncertainty are highly time- and resource-consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, specifically to the 12 highest-emitting sectors for greenhouse gas and air pollutant emissions. Examples of the methodology's application for two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors, and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and against official figures for 2010, showing very good performance. A solid understanding and quantification of the uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is interest in how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information for deriving nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.

  3. Parameter optimization, sensitivity, and uncertainty analysis of an ecosystem model at a forest flux tower site in the United States

    USGS Publications Warehouse

    Wu, Yiping; Liu, Shuguang; Huang, Zhihong; Yan, Wende

    2014-01-01

    Ecosystem models are useful tools for understanding ecological processes and for sustainable management of resources. In the biogeochemical field, numerical models have been widely used for investigating carbon dynamics under global change, from site to regional and global scales. However, it is still challenging to optimize parameters and estimate parameterization uncertainty for complex process-based models such as the Erosion Deposition Carbon Model (EDCM), a modified version of CENTURY that considers the carbon, water, and nutrient cycles of ecosystems. This study was designed to conduct parameter identifiability, optimization, sensitivity, and uncertainty analysis of EDCM using our EDCM-Auto tool, which incorporates a comprehensive R package, the Flexible Modeling Framework (FME), and the Shuffled Complex Evolution (SCE) algorithm. Using a forest flux tower site as a case study, we implemented a comprehensive modeling analysis involving nine parameters and four target variables (carbon and water fluxes) with their corresponding measurements based on the eddy covariance technique. The local sensitivity analysis shows that the plant-production-related parameters (e.g., PPDF1 and PRDX) have the strongest influence on the model cost function. Both SCE and FME are comparable and performed well in deriving the optimal parameter set with satisfactory simulations of the target variables. Global sensitivity and uncertainty analysis indicates that the parameter uncertainty and the resulting output uncertainty can be quantified, and that the magnitude of parameter-uncertainty effects depends on the variable and season considered. This study also demonstrates that using cutting-edge R packages such as FME can be feasible and attractive for conducting comprehensive parameter analysis in ecosystem modeling.

  4. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  5. Relating Data and Models to Characterize Parameter and Prediction Uncertainty

    EPA Science Inventory

    Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...

  6. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, Dennis; de Bruijn, Karin; Bouwer, Laurens; de Moel, Hans

    2015-04-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates, and uses this explanation to quantify the uncertainty with a Monte Carlo analysis. The Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. The uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investment.
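    The sampling step can be sketched as follows. The two linear depth-damage functions are purely illustrative stand-ins for the 272 functions of the paper's library, and the percentile choices are an assumption.

```python
import random

def mc_damage_percentiles(depths, damage_functions, n=10000, seed=1):
    """Monte Carlo over a depth-damage function library: each draw picks one
    function uniformly at random and totals damage over the flooded cells;
    the spread of the totals expresses the model-choice uncertainty."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        f = rng.choice(damage_functions)        # one function per realization
        totals.append(sum(f(d) for d in depths))
    totals.sort()
    return totals[int(0.05 * n)], totals[n // 2], totals[int(0.95 * n)]

# Two hypothetical depth-damage curves (damage per cell, arbitrary currency)
shallow = lambda depth_m: 1000.0 * depth_m
steep   = lambda depth_m: 3000.0 * depth_m
p05, p50, p95 = mc_damage_percentiles([1.0, 2.0], [shallow, steep])
```

    Even this two-function toy library produces a factor-3 spread between the 5th and 95th percentiles, mirroring the factor 2 to 5 reported in the abstract.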

  7. Uncertainty in flood damage estimates and its potential effect on investment decisions

    NASA Astrophysics Data System (ADS)

    Wagenaar, D. J.; de Bruijn, K. M.; Bouwer, L. M.; De Moel, H.

    2015-01-01

    This paper addresses the large differences that are found between damage estimates of different flood damage models. It explains how implicit assumptions in flood damage models can lead to large uncertainties in flood damage estimates. This explanation is used to quantify the uncertainty with a Monte Carlo analysis. As input, the Monte Carlo analysis uses a damage function library with 272 functions from 7 different flood damage models. This results in uncertainties on the order of a factor of 2 to 5. The resulting uncertainty is typically larger for small water depths and for smaller flood events. The implications of the uncertainty in damage estimates for flood risk management are illustrated by a case study in which the economically optimal investment strategy for a dike segment in the Netherlands is determined. The case study shows that the uncertainty in flood damage estimates can lead to significant over- or under-investment.

  8. Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract

    NASA Astrophysics Data System (ADS)

    Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang

    2017-01-01

    This study presented a new strategy for overall uncertainty measurement in the near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of the NIR analysis was fully investigated and discussed using validation data from precision, trueness, and robustness studies. Quality by design (QbD) elements, such as risk assessment and design of experiments (DOE), were utilized to organize the validation data. An "I × J × K" (series I, number of repetitions J, and level of concentrations K) full factorial design was used to calculate uncertainty from the precision and trueness data, and a 2^(7-4) Plackett-Burman matrix with four different influence factors identified by failure mode and effects analysis (FMEA) was adopted for the robustness study. The overall uncertainty profile was introduced as a graphical decision-making tool to evaluate the validity of the NIR method over the predefined concentration range. In comparison with Saffaj's method (Analyst, 2013, 138, 4677) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that it is reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedure in routine use.

  9. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  10. Incorporating uncertainty regarding applicability of evidence from meta-analyses into clinical decision making.

    PubMed

    Kriston, Levente; Meister, Ramona

    2014-03-01

    Judging applicability (relevance) of meta-analytical findings to particular clinical decision-making situations remains challenging. We aimed to describe an evidence synthesis method that accounts for possible uncertainty regarding applicability of the evidence. We conceptualized uncertainty regarding applicability of the meta-analytical estimates to a decision-making situation as the result of uncertainty regarding applicability of the findings of the trials that were included in the meta-analysis. This trial-level applicability uncertainty can be directly assessed by the decision maker and allows for the definition of trial inclusion probabilities, which can be used to perform a probabilistic meta-analysis with unequal probability resampling of trials (adaptive meta-analysis). A case study with several fictitious decision-making scenarios was performed to demonstrate the method in practice. We present options to elicit trial inclusion probabilities and perform the calculations. The result of an adaptive meta-analysis is a frequency distribution of the estimated parameters from traditional meta-analysis that provides individually tailored information according to the specific needs and uncertainty of the decision maker. The proposed method offers a direct and formalized combination of research evidence with individual clinical expertise and may aid clinicians in specific decision-making situations. Copyright © 2014 Elsevier Inc. All rights reserved.
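    The resampling scheme described above can be sketched in a few lines. The fixed-effect inverse-variance pooling, the inclusion probabilities, and the trial data below are all hypothetical placeholders for whatever traditional meta-analysis and elicited probabilities a decision maker would actually use.

```python
import random

def adaptive_meta(effects, variances, incl_probs, n_iter=5000, seed=42):
    """Adaptive meta-analysis sketch: in each iteration every trial is
    included with its applicability (inclusion) probability, then a
    fixed-effect inverse-variance pooled estimate is computed. The return
    value is a frequency distribution of pooled effect estimates."""
    rng = random.Random(seed)
    pooled = []
    for _ in range(n_iter):
        sel = [i for i, p in enumerate(incl_probs) if rng.random() < p]
        if not sel:
            continue                    # skip iterations with no trial included
        ws = [1.0 / variances[i] for i in sel]
        est = sum(w * effects[i] for w, i in zip(ws, sel)) / sum(ws)
        pooled.append(est)
    return pooled
```

    The distribution collapses to the ordinary pooled estimate when all inclusion probabilities are 1, and widens as applicability uncertainty grows.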

  11. Uncertainty in BRCA1 cancer susceptibility testing.

    PubMed

    Baty, Bonnie J; Dudley, William N; Musters, Adrian; Kinney, Anita Y

    2006-11-15

    This study investigated uncertainty in individuals undergoing genetic counseling/testing for breast/ovarian cancer susceptibility. Sixty-three individuals from a single kindred with a known BRCA1 mutation rated uncertainty about 12 items on a five-point Likert scale before and 1 month after genetic counseling/testing. Factor analysis identified a five-item total uncertainty scale that was sensitive to changes before and after testing. The items in the scale were related to uncertainty about obtaining health care, positive changes after testing, and coping well with results. The majority of participants (76%) rated reducing uncertainty as an important reason for genetic testing. The importance of reducing uncertainty was stable across time and unrelated to anxiety or demographics. Yet, at baseline, total uncertainty was low and decreased after genetic counseling/testing (P = 0.004). Analysis of individual items showed that after genetic counseling/testing, there was less uncertainty about the participant detecting cancer early (P = 0.005) and coping well with their result (P < 0.001). Our findings support the importance to clients of genetic counseling/testing as a means of reducing uncertainty. Testing may help clients to reduce the uncertainty about items they can control, and it may be important to differentiate the sources of uncertainty that are more or less controllable. Genetic counselors can help clients by providing anticipatory guidance about the role of uncertainty in genetic testing. (c) 2006 Wiley-Liss, Inc.

  12. Explaining differences between bioaccumulation measurements in laboratory and field data through use of a probabilistic modeling approach

    USGS Publications Warehouse

    Selck, Henriette; Drouillard, Ken; Eisenreich, Karen; Koelmans, Albert A.; Palmqvist, Annemette; Ruus, Anders; Salvito, Daniel; Schultz, Irv; Stewart, Robin; Weisbrod, Annie; van den Brink, Nico W.; van den Heuvel-Greve, Martine

    2012-01-01

    In the regulatory context, bioaccumulation assessment is often hampered by substantial data uncertainty as well as by the poorly understood differences often observed between results from laboratory and field bioaccumulation studies. Bioaccumulation is a complex, multifaceted process, which calls for accurate error analysis. Yet, attempts to quantify and compare the propagation of error in bioaccumulation metrics across species and chemicals are rare. Here, we quantitatively assessed the combined influence of physicochemical, physiological, ecological, and environmental parameters known to affect bioaccumulation for 4 species and 2 chemicals, to assess whether uncertainty in these factors can explain the observed differences between laboratory and field studies. The organisms evaluated in the simulations (mayfly larvae, deposit-feeding polychaetes, yellow perch, and little owl) represented a range of ecological conditions and biotransformation capacities. The chemicals, pyrene and the polychlorinated biphenyl congener PCB-153, represented medium and highly hydrophobic chemicals with different susceptibilities to biotransformation. An existing state-of-the-art probabilistic bioaccumulation model was improved by accounting for bioavailability and absorption efficiency limitations due to the presence of black carbon in sediment, and was used for probabilistic modeling of variability and propagation of error. Results showed that at lower trophic levels (mayfly and polychaete), variability in bioaccumulation was mainly driven by sediment exposure, sediment composition, and chemical partitioning to sediment components, which was in turn dominated by the influence of black carbon. At higher trophic levels (yellow perch and the little owl), food web structure (i.e., diet composition and abundance) and chemical concentration in the diet became more important, particularly for the most persistent compound, PCB-153. These results suggest that variation in bioaccumulation assessment is reduced most by improved identification of food sources and by accounting for chemical bioavailability in food components. Improvements in the accuracy of aqueous exposure appear to be less relevant for moderately to highly hydrophobic compounds, because this route contributes only marginally to total uptake. Determining chemical bioavailability and improving the understanding and quantification of the role of sediment components (black carbon, labile organic matter, and the like) in chemical absorption efficiency have been identified as key next steps.

  13. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential for river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast; they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and to identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated through these processes.
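    The periodicity-identification step can be illustrated with a plain periodogram; this is a generic sketch of the spectral-analysis idea, not the entropy-based spectral density used in the study.

```python
import numpy as np

def dominant_period(series):
    """Identify the dominant periodicity of a series from its periodogram:
    remove the mean, take the squared magnitude of the real FFT, and
    return the period of the strongest non-zero-frequency bin."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()                       # detrend by removing the mean
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x))        # cycles per sample (d = 1)
    k = np.argmax(power[1:]) + 1           # skip the zero-frequency bin
    return 1.0 / freqs[k]
```

    For a monthly series, a dominant period near 12 samples confirms the annual cycle that the forecasting model must capture.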

  14. How does uncertainty shape patient experience in advanced illness? A secondary analysis of qualitative data.

    PubMed

    Etkind, Simon Noah; Bristowe, Katherine; Bailey, Katharine; Selman, Lucy Ellen; Murtagh, Fliss Em

    2017-02-01

    Uncertainty is common in advanced illness but is infrequently studied in this context. If poorly addressed, uncertainty can lead to adverse patient outcomes. We aimed to understand patient experiences of uncertainty in advanced illness and to develop a typology of patients' responses and preferences to inform practice. Secondary analysis of qualitative interview transcripts. Studies were assessed for inclusion and interviews were sampled using maximum-variation sampling. Analysis used a thematic approach, with 10% of coding cross-checked to enhance reliability. Qualitative interviews came from six studies including patients with heart failure, chronic obstructive pulmonary disease, renal disease, cancer, and liver failure. A total of 30 transcripts were analysed. Median age was 75 years (range 43-95); 12 patients were women. The impact of uncertainty was frequently discussed: the main related themes were engagement with illness, information needs, patient priorities, and the period of time on which patients mainly focused their attention (temporal focus). A typology of patient responses to uncertainty was developed from these themes. Uncertainty influences patient experience in advanced illness by affecting patients' information needs, preferences, and future priorities for care. Our typology aids understanding of how patients with advanced illness respond to uncertainty. Assessment of these three factors may be a useful starting point to guide clinical assessment and shared decision making.

  15. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of research on the uncertainties of spatial data and spatial analysis focuses on a specific data feature or analysis tool; few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. This paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning on the basis of a literature review. The uncertainties considered range from establishment of the index system to classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method that is broadly used by planners. It also explores other factors that impact the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In conclusion, the paper indicates that the adoption of machine learning methods should be modified to accommodate the complexity of land suitability assessment. The work contributes to the application of spatial data and spatial analysis uncertainty research to land suitability assessment, and promotes the scientific level of subsequent planning and decision-making.

  16. Variability And Uncertainty Analysis Of Contaminant Transport Model Using Fuzzy Latin Hypercube Sampling Technique

    NASA Astrophysics Data System (ADS)

    Kumar, V.; Nayagum, D.; Thornton, S.; Banwart, S.; Schuhmacher, M.; Lerner, D.

    2006-12-01

    Characterization of the uncertainty associated with groundwater quality models is often of critical importance, for example where environmental models are employed in risk assessment. Insufficient data, inherent variability, and estimation errors of environmental model parameters introduce uncertainty into model predictions. However, uncertainty analysis using conventional methods such as standard Monte Carlo sampling (MCS) may not be efficient, or even suitable, for complex, computationally demanding models involving parametric variability and uncertainty of different natures. General MCS, and variants of MCS such as Latin Hypercube Sampling (LHS), treat variability and uncertainty as a single random entity, and the generated samples are treated as crisp, with vagueness assumed to be randomness. Also, when the models are used as purely predictive tools, uncertainty and variability lead to the need to assess the plausible range of model outputs. An improved, systematic variability and uncertainty analysis can provide insight into the level of confidence in model estimates, and can aid in assessing how various possible model estimates should be weighed. The present study introduces Fuzzy Latin Hypercube Sampling (FLHS), a hybrid approach for incorporating cognitive and noncognitive uncertainties. Noncognitive uncertainty, such as physical randomness or statistical uncertainty due to limited information, can be described by its own probability density function (PDF), whereas cognitive uncertainty, such as estimation error, can be described by a membership function for its fuzziness and by confidence intervals via α-cuts. An important property of this theory is its ability to merge the inexact generated data of the LHS approach to increase the quality of information. The FLHS technique ensures that the entire range of each variable is sampled, with proper incorporation of uncertainty and variability. A fuzzified statistical summary of the model results produces indices of sensitivity and uncertainty that relate the effects of heterogeneity and uncertainty of input variables to model predictions. The feasibility of the method is validated by assessing the uncertainty propagation of parameter values in estimating the contamination level of a drinking water supply well due to transport of dissolved phenolics from a contaminated site in the UK.
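    The stratified-sampling backbone that FLHS builds on is plain Latin Hypercube Sampling, which can be sketched as follows; the fuzzy extension in the record attaches membership functions and α-cuts on top of this and is not shown.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Plain Latin Hypercube Sampling on the unit hypercube: each
    dimension's [0, 1) range is split into n_samples equal strata, each
    stratum is sampled exactly once, and the stratum assignment is
    independently permuted per dimension."""
    rng = random.Random(seed)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        perm = list(range(n_samples))
        rng.shuffle(perm)                  # random stratum order for this dim
        for i in range(n_samples):
            # uniform draw inside the assigned stratum
            samples[i][d] = (perm[i] + rng.random()) / n_samples
    return samples
```

    The guarantee tested below, one sample per stratum in every dimension, is exactly the "entire range of each variable is sampled" property the abstract refers to.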

  17. What Are the Main Drivers of Young Consumers Purchasing Traditional Food Products? European Field Research.

    PubMed

    Vlontzos, George; Kyrgiakos, Leonidas; Duquenne, Marie Noelle

    2018-02-12

    In this research, the attitudes of European young adults (aged 18 to 30 years) regarding their consumption of local and traditional products were examined. The survey was conducted on a sample of 836 consumers from seven European countries (Greece, Bulgaria, Romania, Slovenia, Croatia, Denmark and France). Data were collected by distributing a purpose-developed questionnaire through social media and university mail services. Principal Component Analysis (PCA) was used to identify consumer perceptions, comparing the overall sample with two subsets (consumers from Eastern and from Western European countries). Six major factors were revealed, including consumer behavior, uncertainty about health issues, cost, influence of media and friends, and availability in store. Young adults had a positive attitude toward local and traditional food products, but they expressed insecurity about health issues. The cost factor had less of an influence on interviewees from Eastern European countries than on the overall sample (ranked third and fifth, respectively). The influence of one's close environment emerged as a distinct factor in Eastern countries, whereas in Western countries influence from the media was common. Females and older respondents (25-30 years old) had fewer doubts about traditional food products, while the media had a strong influence on consumers' decisions. The aim of this survey was to identify the consumer profiles of young adults and to create different promotion strategies for local and traditional products across the two groups of countries.

  18. DRAINMOD-GIS: a lumped parameter watershed scale drainage and water quality model

    Treesearch

    G.P. Fernandez; G.M. Chescheir; R.W. Skaggs; D.M. Amatya

    2006-01-01

    A watershed scale lumped parameter hydrology and water quality model that includes an uncertainty analysis component was developed and tested on a lower coastal plain watershed in North Carolina. Uncertainty analysis was used to determine the impacts of uncertainty in field and network parameters of the model on the predicted outflows and nitrate-nitrogen loads at the...

  19. A Multi-Institution Look at College Students Seeking Counseling: Nature and Severity of Concerns

    ERIC Educational Resources Information Center

    Krumrei, Elizabeth J.; Newton, Fred B.; Kim, Eunhee

    2010-01-01

    This study provides information about students seeking counseling (N = 3,844) at 9 institutions of higher education. The K-PIRS, an empirically validated measure, was used to assess 7 problem areas (mood difficulties, learning problems, food concerns, interpersonal conflicts, career uncertainties, self-harm indicators, and addiction issues).…

  20. Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Ely, Jeffry W.

    2012-01-01

    A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
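    The final combination step of an uncertainty budget follows the standard GUM root-sum-square rule over sensitivity-weighted components. The component names and values below are illustrative placeholders, not the facility's actual budget.

```python
import math

def combined_standard_uncertainty(budget):
    """Combine an uncertainty budget per the GUM: the combined standard
    uncertainty is the root sum of squares of each component's standard
    uncertainty multiplied by its sensitivity coefficient."""
    return math.sqrt(sum((c * u) ** 2 for _, u, c in budget))

# (component name, standard uncertainty in dB, sensitivity coefficient)
# hypothetical values for illustration only
budget = [
    ("microphone calibration",       0.3, 1.0),
    ("listener position variation",  0.4, 1.0),
    ("door-induced pressure change", 0.2, 0.5),
]
u_c = combined_standard_uncertainty(budget)
```

    Reporting u_c alongside a measured level is what allows the comparisons across seat positions and facilities that the abstract describes.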

  1. Including non-dietary sources into an exposure assessment of the European Food Safety Authority: The challenge of multi-sector chemicals such as Bisphenol A.

    PubMed

    von Goetz, N; Pirow, R; Hart, A; Bradley, E; Poças, F; Arcella, D; Lillegard, I T L; Simoneau, C; van Engelen, J; Husoy, T; Theobald, A; Leclercq, C

    2017-04-01

    In its most recent risk assessment of Bisphenol A, the European Food Safety Authority for the first time conducted a multi-route aggregate exposure assessment. This assessment includes exposure via dietary sources as well as the contributions of the most important non-dietary sources. Both average and high aggregate exposure were calculated by source-to-dose modeling (forward calculation) for different age groups and compared with estimates based on urinary biomonitoring data (backward calculation). The aggregate exposure estimates obtained by forward and backward modeling are of the same order of magnitude, with forward modeling yielding higher estimates associated with larger uncertainty. Yet only forward modeling can indicate the relative contributions of different sources. Dietary exposure, especially via canned food, appears to be the most important exposure source and, based on the central aggregate exposure estimates, contributes around 90% of internal exposure to total (conjugated plus unconjugated) BPA. Dermal exposure via thermal paper and, to a lesser extent, via cosmetic products may contribute around 10% for some age groups. The uncertainty around these estimates is considerable, but since dermal absorption bypasses the first-pass metabolism of BPA by conjugation, dermal sources may be of equal or even higher toxicological relevance than dietary sources. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Enterobacter sakazakii in food and beverages (other than infant formula and milk powder).

    PubMed

    Friedemann, Miriam

    2007-05-01

    The ubiquitous microorganism Enterobacter sakazakii is a rare contaminant of infant formula and may cause severe systemic infection in neonates. So far, other foods are not known to cause E. sakazakii infections. The scarce information about the ecology of E. sakazakii and the uncertainty concerning the source of infection in children and adults warrant a summary of current knowledge about the presence of this opportunistic microorganism in foods other than infant formula. This review systematizes publications on the presence of E. sakazakii in food and beverages up to June 2006. Foods other than infant formula have rarely been investigated for the presence of E. sakazakii. Nevertheless, this microorganism has been isolated from a wide spectrum of foods and food ingredients. E. sakazakii was isolated from plant foods and food ingredients such as cereals, fruit and vegetables, legume products, and herbs and spices, as well as from animal food sources such as milk, meat, and fish and products made from these foods. The spectrum of E. sakazakii-contaminated food covers both raw and processed food, and the processing of contaminated food was not restricted to dry products. Fresh, frozen, ready-to-eat, fermented, and cooked food products, as well as beverages and water suitable for the preparation of food, were found to be contaminated by E. sakazakii. Although E. sakazakii-contaminated food does not have general public health significance, preventive measures should consider the presence of E. sakazakii in food and food ingredients, and their processing and preparation, as possible sources of contamination, colonization, or infection.

  3. Examining the Gap between Science and Public Opinion about Genetically Modified Food and Global Warming.

    PubMed

    McFadden, Brandon R

    2016-01-01

    There is great uncertainty due to challenges of escalating population growth and climate change. Public perception that diverges from the scientific community may decrease the effectiveness of scientific inquiry and innovation as tools to solve these challenges. The objective of this study was to identify the factors associated with the divergence of public opinion from scientific consensus regarding the safety of genetically modified (GM) foods and human involvement in global warming (GW). Results indicate that the effects of knowledge on public opinion are complex and non-uniform across types of knowledge (i.e., perceived and actual) or issues. Political affiliation affects agreement with science; Democrats were more likely to agree that GM food is safe and human actions cause GW. Respondents who had relatively higher cognitive function or held illusionary correlations about GM food or GW were more likely to have an opinion that differed from the scientific community.

  4. Examining the Gap between Science and Public Opinion about Genetically Modified Food and Global Warming

    PubMed Central

    McFadden, Brandon R.

    2016-01-01

    There is great uncertainty due to challenges of escalating population growth and climate change. Public perception that diverges from the scientific community may decrease the effectiveness of scientific inquiry and innovation as tools to solve these challenges. The objective of this study was to identify the factors associated with the divergence of public opinion from scientific consensus regarding the safety of genetically modified (GM) foods and human involvement in global warming (GW). Results indicate that the effects of knowledge on public opinion are complex and non-uniform across types of knowledge (i.e., perceived and actual) or issues. Political affiliation affects agreement with science; Democrats were more likely to agree that GM food is safe and human actions cause GW. Respondents who had relatively higher cognitive function or held illusory correlations about GM food or GW were more likely to have an opinion that differed from the scientific community. PMID:27829008

  5. Nutrient non-equivalence: Does restricting high-potassium plant foods help to prevent hyperkalemia in hemodialysis patients?

    PubMed Central

    St-Jules, DE; Goldfarb, DS; Sevick, MA

    2016-01-01

    Hemodialysis patients are often advised to limit their intake of high-potassium foods to help manage hyperkalemia. However, the benefits of this practice are entirely theoretical and not supported by rigorous randomized controlled trials. The hypothesis that potassium restriction is useful is based on the assumption that different sources of dietary potassium are therapeutically equivalent. In fact, animal and plant sources of potassium may differ in their potential to contribute to hyperkalemia. In this commentary, we summarize the historical research basis for limiting high-potassium foods. Ultimately, we conclude that this approach is not evidence-based and may actually present harm to patients. However, given the uncertainty arising from the paucity of conclusive data, we agree that until the appropriate intervention studies are conducted, practitioners should continue to advise restriction of high-potassium foods. PMID:26975777

  6. Nutrient Non-equivalence: Does Restricting High-Potassium Plant Foods Help to Prevent Hyperkalemia in Hemodialysis Patients?

    PubMed

    St-Jules, David E; Goldfarb, David S; Sevick, Mary Ann

    2016-09-01

    Hemodialysis patients are often advised to limit their intake of high-potassium foods to help manage hyperkalemia. However, the benefits of this practice are entirely theoretical and not supported by rigorous randomized controlled trials. The hypothesis that potassium restriction is useful is based on the assumption that different sources of dietary potassium are therapeutically equivalent. In fact, animal and plant sources of potassium may differ in their potential to contribute to hyperkalemia. In this commentary, we summarize the historical research basis for limiting high-potassium foods. Ultimately, we conclude that this approach is not evidence-based and may actually present harm to patients. However, given the uncertainty arising from the paucity of conclusive data, we agree that until the appropriate intervention studies are conducted, practitioners should continue to advise restriction of high-potassium foods. Copyright © 2016 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  7. Cancer Risk Assessment of Polycyclic Aromatic Hydrocarbons in the Soils and Sediments of India: A Meta-Analysis.

    PubMed

    Tarafdar, Abhrajyoti; Sinha, Alok

    2017-10-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons in soils and sediments was conducted using a probabilistic approach from a national perspective. Published monitoring data of polycyclic aromatic hydrocarbons present in soils and sediments at different study points across India were collected and converted to their corresponding BaP equivalent concentrations. These BaP equivalent concentrations were used to evaluate comprehensive cancer risk for two different age groups. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties in the risk estimates. The analysis indicates a 90th-percentile cancer risk of 1.770E-5 for children and 3.156E-5 for adults in heavily polluted site soils. Overall carcinogenic risks of polycyclic aromatic hydrocarbons in soils of India were mostly within acceptable limits. However, the food ingestion exposure route places sediments in a high-risk zone. The 90th-percentile risk values from sediments are 7.863E-05 for children and 3.999E-04 for adults. Sensitivity analysis reveals exposure duration and the relative skin adherence factor for soil as the most influential parameters of the assessment, followed by the BaP equivalent concentration of polycyclic aromatic hydrocarbons. For sediments, the biota-to-sediment accumulation factor of fish in terms of BaP has the greatest influence on the total outcome, followed by BaP equivalent concentration and exposure duration. Analysis of individual exposure routes showed dermal contact for soils and food ingestion for sediments to be the main exposure pathways. Some specific locations, such as the areas surrounding Bhavnagar, Raniganj, Sunderban, Raipur, and Delhi, demand potential strategies for carcinogenic risk management and reduction. The current study is probably the first attempt to provide information on the carcinogenic risk of polycyclic aromatic hydrocarbons in soils and sediments across India.

  8. Cancer Risk Assessment of Polycyclic Aromatic Hydrocarbons in the Soils and Sediments of India: A Meta-Analysis

    NASA Astrophysics Data System (ADS)

    Tarafdar, Abhrajyoti; Sinha, Alok

    2017-10-01

    A carcinogenic risk assessment of polycyclic aromatic hydrocarbons in soils and sediments was conducted using a probabilistic approach from a national perspective. Published monitoring data of polycyclic aromatic hydrocarbons present in soils and sediments at different study points across India were collected and converted to their corresponding BaP equivalent concentrations. These BaP equivalent concentrations were used to evaluate comprehensive cancer risk for two different age groups. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties in the risk estimates. The analysis indicates a 90th-percentile cancer risk of 1.770E-5 for children and 3.156E-5 for adults in heavily polluted site soils. Overall carcinogenic risks of polycyclic aromatic hydrocarbons in soils of India were mostly within acceptable limits. However, the food ingestion exposure route places sediments in a high-risk zone. The 90th-percentile risk values from sediments are 7.863E-05 for children and 3.999E-04 for adults. Sensitivity analysis reveals exposure duration and the relative skin adherence factor for soil as the most influential parameters of the assessment, followed by the BaP equivalent concentration of polycyclic aromatic hydrocarbons. For sediments, the biota-to-sediment accumulation factor of fish in terms of BaP has the greatest influence on the total outcome, followed by BaP equivalent concentration and exposure duration. Analysis of individual exposure routes showed dermal contact for soils and food ingestion for sediments to be the main exposure pathways. Some specific locations, such as the areas surrounding Bhavnagar, Raniganj, Sunderban, Raipur, and Delhi, demand potential strategies for carcinogenic risk management and reduction. The current study is probably the first attempt to provide information on the carcinogenic risk of polycyclic aromatic hydrocarbons in soils and sediments across India.
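
    The probabilistic assessment above rests on Monte Carlo propagation of an incremental lifetime cancer risk (ILCR) equation for the ingestion route. A minimal sketch follows; every distribution and exposure-factor value below is chosen purely for illustration and is not one of the study's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# All distributions and values below are illustrative assumptions, not the study's inputs.
c_bap = rng.lognormal(np.log(0.5), 0.8, N)   # BaP-equivalent concentration (mg/kg)
ir    = rng.normal(100e-6, 20e-6, N)         # incidental soil ingestion rate (kg/day)
ed    = rng.triangular(6, 24, 30, N)         # exposure duration (years)
bw    = rng.normal(70, 10, N)                # body weight (kg)
ef, at, csf = 350, 70 * 365, 7.3             # days/yr; averaging time (days); slope factor (mg/kg-day)^-1

# Incremental lifetime cancer risk, propagated sample-by-sample
ilcr = c_bap * ir * ef * ed * csf / (bw * at)
q90 = np.quantile(ilcr, 0.90)                # the "90% risk value" style summary

# Rank (Spearman-style) correlation with the output as a crude sensitivity measure
def ranks(a):
    return np.argsort(np.argsort(a))
sens = {name: np.corrcoef(ranks(x), ranks(ilcr))[0, 1]
        for name, x in [("C_BaP", c_bap), ("IR", ir), ("ED", ed), ("BW", bw)]}
```

    Ranking inputs by rank correlation with the output is only a stand-in for the sensitivity analysis used in the paper, but it illustrates how a single sampled ensemble yields both a percentile risk value and an influence ordering.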

  9. Monte Carlo uncertainty analysis of dose estimates in radiochromic film dosimetry with single-channel and multichannel algorithms.

    PubMed

    Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen; González-López, Antonio

    2018-03-01

    To provide a multi-stage model to calculate uncertainty in radiochromic film dosimetry with Monte Carlo techniques. This new approach is applied to single-channel and multichannel algorithms. Two lots of Gafchromic EBT3 film are exposed in two different Varian linacs and read with an EPSON V800 flatbed scanner. The Monte Carlo techniques in uncertainty analysis provide a numerical representation of the probability density functions of the output magnitudes. From this numerical representation, traditional parameters of uncertainty analysis, such as standard deviation and bias, are calculated. Moreover, these numerical representations are used to investigate the shape of the probability density functions of the output magnitudes. In addition, another calibration film is read in four EPSON scanners (two V800 and two 10000XL) and the uncertainty analysis is carried out with the four images. The dose estimates of single-channel and multichannel algorithms show Gaussian behavior and low bias. The multichannel algorithms lead to less uncertainty in the final dose estimates when the EPSON V800 is employed as the reading device. In the case of the EPSON 10000XL, the single-channel algorithms provide less uncertainty in the dose estimates for doses higher than 4 Gy. A multi-stage model has been presented. With the aid of this model and the use of Monte Carlo techniques, the uncertainty of dose estimates for single-channel and multichannel algorithms is estimated. The application of the model together with Monte Carlo techniques leads to a complete characterization of the uncertainties in radiochromic film dosimetry. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
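
    The core idea, sampling inputs to obtain a numerical representation of the output dose PDF, can be sketched in a few lines. The calibration-curve form, its coefficients, and the scanner noise level below are hypothetical, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Hypothetical single-channel calibration curve (coefficients are assumptions):
# dose (Gy) as a function of the net optical density x
a, b, n = 8.0, 25.0, 2.5
def dose(x):
    return a * x + b * x**n

true_od = 0.35                                # nominal net optical density of one film piece
readings = rng.normal(true_od, 0.01, size=N)  # simulated scanner reading noise (assumed sigma)

d = dose(readings)                            # numerical representation of the dose PDF
bias = d.mean() - dose(true_od)               # the nonlinear calibration skews the output PDF
```

    From the sampled PDF one can then read off the standard deviation, bias, and distribution shape, exactly the quantities the paper reports for the measured films.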

  10. How much swamp are we talking here?: Propagating uncertainty about the area of coastal wetlands into the U.S. greenhouse gas inventory

    NASA Astrophysics Data System (ADS)

    Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.

    2017-12-01

    Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net storage and emissions within the Agriculture, Forestry, and Other Land Use category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as of stable or changing wetlands, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account the mapped area, the proportional areas of commission and omission errors, and the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice-monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light detection and ranging (lidar) elevation maps with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate the total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.
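
    Unbiased area estimation from a map plus an accuracy-assessment sample is a standard stratified calculation. A minimal sketch in that spirit follows; the error-matrix counts and mapped areas are hypothetical, and this is a generic stratified estimator, not the exact procedure of the study.

```python
import numpy as np

# Hypothetical accuracy-assessment error matrix: rows = mapped class, cols = reference class
# classes: [coastal wetland, other]; all counts and areas below are assumptions for illustration
cm = np.array([[180, 20],
               [15, 285]])
mapped_area = np.array([5000.0, 45000.0])   # km^2 per mapped class

n_i = cm.sum(axis=1)                        # sample size per mapped stratum
W = mapped_area / mapped_area.sum()         # mapped-area proportions (stratum weights)
phat = cm / n_i[:, None]                    # within-stratum reference-class proportions

# Error-adjusted ("unbiased") class proportions, areas, and 95% confidence intervals
p_k = (W[:, None] * phat).sum(axis=0)
area = p_k * mapped_area.sum()
var = (W[:, None]**2 * phat * (1 - phat) / (n_i[:, None] - 1)).sum(axis=0)
ci95 = 1.96 * np.sqrt(var) * mapped_area.sum()
```

    With these illustrative counts the error-adjusted wetland area comes out larger than the mapped area, because omission errors in the large non-wetland stratum outweigh the commission errors.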

  11. Uncertainty aggregation and reduction in structure-material performance prediction

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Mahadevan, Sankaran; Ao, Dan

    2018-02-01

    An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
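
    The uncertainty-reduction step can be illustrated with the simplest conjugate case, a normal prior updated with normally distributed observations. The fatigue-life numbers below are illustrative assumptions, and the paper's actual machinery (a Bayesian network with adaptive, segment-wise model validation) is far richer than this sketch.

```python
import numpy as np

# Normal-normal conjugate Bayesian update as a minimal stand-in for Bayesian calibration
mu0, s0 = 1.0e6, 3.0e5          # assumed prior on fatigue life (cycles): mean, std
obs = np.array([8.2e5, 9.1e5, 8.7e5])   # hypothetical experimental observations
s_obs = 1.0e5                   # assumed observation noise std

n = len(obs)
post_var = 1.0 / (1.0 / s0**2 + n / s_obs**2)
post_mu = post_var * (mu0 / s0**2 + obs.sum() / s_obs**2)
post_sd = np.sqrt(post_var)     # posterior uncertainty is reduced relative to the prior
```

    The posterior mean falls between the prior mean and the data mean, and the posterior standard deviation shrinks; the paper's point is that this shrinkage is only trustworthy on observation segments where the model itself has been validated.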

  12. Reduction of predictive uncertainty in estimating irrigation water requirement through multi-model ensembles and ensemble averaging

    NASA Astrophysics Data System (ADS)

    Multsch, S.; Exbrayat, J.-F.; Kirby, M.; Viney, N. R.; Frede, H.-G.; Breuer, L.

    2014-11-01

    Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation. They consider different stages of crop growth through empirical crop coefficients to adapt evapotranspiration throughout the vegetation period. We investigate the importance of model structural vs. model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that structural model uncertainty is far more important than model parametric uncertainty in estimating irrigation water requirement. Using the Reliability Ensemble Averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a certain threshold, e.g. an irrigation water limit of 400 mm imposed by water rights, would be exceeded less frequently for the REA ensemble average (45%) than for the equally weighted ensemble average (66%). We conclude that multi-model ensemble predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.
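
    Reliability-weighted averaging of the REA kind down-weights models that perform poorly against a reference and that diverge from the ensemble consensus. A minimal sketch, with hypothetical irrigation-requirement predictions, an assumed reference value, and an assumed variability scale — not the study's numbers or exact weighting scheme:

```python
import numpy as np

# Hypothetical irrigation-requirement predictions (mm) from six model structures
preds = np.array([380.0, 420.0, 455.0, 400.0, 520.0, 470.0])
perf_err = np.abs(preds - 430.0)      # assumed distance to a reference estimate ("performance")
eps = 10.0                            # assumed natural-variability scale

# Iterate REA-style weights: penalize poor performance and poor convergence
w = np.ones_like(preds)
for _ in range(50):
    mean = np.average(preds, weights=w)
    conv_err = np.abs(preds - mean)   # distance to the weighted ensemble mean ("convergence")
    w = np.minimum(1.0, eps / np.maximum(perf_err, eps)) * \
        np.minimum(1.0, eps / np.maximum(conv_err, eps))
rea_mean = np.average(preds, weights=w)
```

    The fixed-point iteration is needed because the convergence criterion depends on the weighted mean itself; the result typically differs noticeably from the equally weighted average.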

  13. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NASA Astrophysics Data System (ADS)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  14. The potential for meta-analysis to support decision analysis in ecology.

    PubMed

    Mengersen, Kerrie; MacNeil, M Aaron; Caley, M Julian

    2015-06-01

    Meta-analysis and decision analysis are underpinned by well-developed methods that are commonly applied to a variety of problems and disciplines. While these two fields have been closely linked in some disciplines such as medicine, comparatively little attention has been paid to the potential benefits of linking them in ecology, despite reasonable expectations that benefits would be derived from doing so. Meta-analysis combines information from multiple studies to provide more accurate parameter estimates and to reduce the uncertainty surrounding them. Decision analysis involves selecting among alternative choices using statistical information that helps to shed light on the uncertainties involved. By linking meta-analysis to decision analysis, improved decisions can be made, with quantification of the costs and benefits of alternate decisions supported by a greater density of information. Here, we briefly review concepts of both meta-analysis and decision analysis, illustrating the natural linkage between them and the benefits from explicitly linking one to the other. We discuss some examples in which this linkage has been exploited in the medical arena and how improvements in precision and reduction of structural uncertainty inherent in a meta-analysis can provide substantive improvements to decision analysis outcomes by reducing uncertainty in expected loss and maximising information from across studies. We then argue that these significant benefits could be translated to ecology, in particular to the problem of making optimal ecological decisions in the face of uncertainty. Copyright © 2013 John Wiley & Sons, Ltd.
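
    The mechanism behind the linkage is that pooling shrinks parameter uncertainty before it enters the decision analysis. A minimal fixed-effect (inverse-variance) pooling sketch, with hypothetical study results:

```python
import numpy as np

# Hypothetical effect sizes and standard errors from five studies of the same parameter
theta = np.array([0.30, 0.45, 0.28, 0.51, 0.38])
se    = np.array([0.12, 0.20, 0.10, 0.25, 0.15])

w = 1.0 / se**2                        # inverse-variance weights (fixed-effect model)
pooled = (w * theta).sum() / w.sum()   # pooled estimate feeds the decision analysis
se_pooled = 1.0 / np.sqrt(w.sum())     # pooled SE is smaller than any single study's
```

    Feeding the pooled estimate and its reduced standard error into an expected-loss calculation, rather than any single study's estimate, is exactly the benefit the authors argue for.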

  15. Regional scenario building as a tool to support vulnerability assessment of food & water security and livelihood conditions under varying natural resources managements

    NASA Astrophysics Data System (ADS)

    Reinhardt, Julia; Liersch, Stefan; Dickens, Chris; Kabaseke, Clovis; Mulugeta Lemenih, Kassaye; Sghaier, Mongi; Hattermann, Fred

    2013-04-01

    Participatory regional scenario building was carried out with stakeholders and local researchers in four meso-scale case studies (CS) in Africa. In all CS, the improvement of food and/or water security and livelihood conditions was identified as the focal issue. A major concern was to analyze the impacts of different plausible future developments on these issues. The process of scenario development is of special importance as it helps to identify main drivers, critical uncertainties and patterns of change. Opportunities and constraints of actors and actions become clearer and reveal adaptation capacities. Effective strategies must furthermore be reasonable and accepted by local stakeholders to be implemented. Hence, developing scenarios and generating strategies require the integration of local knowledge. Testing strategies shows how they play out in different scenarios and how robust they are. Reasons for and patterns of social and natural vulnerability can thus be revealed. The scenario building exercise applied in this study is inspired by the approach of Peter Schwartz. It aims at determining critical uncertainties and identifying the most important driving forces for a specific focal issue that are likely to shape future developments of a region. The most important and uncertain drivers were analyzed and systematized with ranking exercises during meetings with local researchers and stakeholders. Cause-effect relationships were drawn in the form of concept maps, either during the meetings or by researchers based on available information. Past observations and the scenario building outcomes were used to conduct a trend analysis. Cross-comparisons were made to find similarities and differences between CS in terms of main driving forces, patterns of change, opportunities and constraints. Driving forces and trends which arose consistently across scenarios and CS were identified. 
First results indicate that livelihood conditions often depend directly on the state and availability of natural resources. Major concerns in all CS are fast-growing populations and natural resource degradation caused by unsustainable natural resource management. Land use and resource competition are a consequence of unclear land tenure systems and limited resource availability. Scarce rainfall with high annual variability causes food insecurity if yield failures cannot be compensated, e.g. owing to a lack of financial resources. In all case studies, critical uncertainties were identified to be more or less related to "poor governance". A lack of governmental and political stability and effectiveness, as well as corruption, hampers the implementation of laws and policies related to natural resource management. Other critical uncertainties lie in the social domain. They are either related to demographic patterns, such as emigration or immigration varying the pressure on natural resource use, or to society in general, such as the evolution of people's environmental awareness or of voice and accountability. A methodological outcome of the scenario building was that the complexity of the process requires the use of reliable and powerful tools to support the communication process. Concept maps were found to be a useful tool in this regard.

  16. A multi-model assessment of terrestrial biosphere model data needs

    NASA Astrophysics Data System (ADS)

    Gardella, A.; Cowdery, E.; De Kauwe, M. G.; Desai, A. R.; Duveneck, M.; Fer, I.; Fisher, R.; Knox, R. G.; Kooper, R.; LeBauer, D.; McCabe, T.; Minunno, F.; Raiho, A.; Serbin, S.; Shiklomanov, A. N.; Thomas, A.; Walker, A.; Dietze, M.

    2017-12-01

    Terrestrial biosphere models provide us with the means to simulate the impacts of climate change and their uncertainties. Going beyond direct observation and experimentation, models synthesize our current understanding of ecosystem processes and can give us insight into the data needed to constrain model parameters. In previous work, we leveraged the Predictive Ecosystem Analyzer (PEcAn) to assess the contribution of different parameters to the uncertainty of outputs of the Ecosystem Demography model v2 (ED) across various North American biomes (Dietze et al., JGR-G, 2014). While this analysis identified key research priorities, the extent to which these priorities were model- and/or biome-specific was unclear. Furthermore, because the analysis only studied one model, we were unable to comment on the effect of variability in model structure on overall predictive uncertainty. Here, we expand this analysis to all biomes globally and a wide sample of models that vary in complexity: BioCro, CABLE, CLM, DALEC, ED2, FATES, G'DAY, JULES, LANDIS, LINKAGES, LPJ-GUESS, MAESPA, PRELES, SDGVM, SIPNET, and TEM. Prior to performing uncertainty analyses, model parameter uncertainties were assessed by assimilating all available trait data from the combination of the BETYdb and TRY trait databases, using an updated multivariate version of PEcAn's hierarchical Bayesian meta-analysis. Next, sensitivity analyses were performed for all models across a range of sites globally to assess sensitivities for a range of different outputs (GPP, ET, SH, Ra, NPP, Rh, NEE, LAI) at multiple time scales from the sub-annual to the decadal. Finally, parameter uncertainties and model sensitivities were combined to evaluate the fractional contribution of each parameter to the predictive uncertainty for a specific variable at a specific site and timescale. 
Facilitated by PEcAn's automated workflows, this analysis represents the broadest assessment of the sensitivities and uncertainties in terrestrial models to date, and provides a comprehensive roadmap for constraining model uncertainties through model development and data collection.
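
    The final combination step, turning parameter uncertainties and model sensitivities into fractional contributions, can be sketched as a first-order variance decomposition. The parameter names, standard deviations, and sensitivity values below are hypothetical, not outputs of the PEcAn analysis.

```python
import numpy as np

# Hypothetical parameter uncertainties (posterior SDs from a trait meta-analysis)
# and local model sensitivities (output change per unit parameter change)
sigma = np.array([0.15, 0.40, 0.05, 0.25])   # e.g. SLA, Vcmax, leaf C:N, stomatal slope
dY    = np.array([2.0, 1.2, 6.0, 0.8])

partial_var = (dY * sigma) ** 2              # first-order variance contributed by each parameter
fraction = partial_var / partial_var.sum()   # fractional contribution to predictive uncertainty
```

    A parameter can dominate either because it is poorly constrained (large sigma) or because the model is highly sensitive to it (large dY); the product is what sets research priorities.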

  17. APMP.QM-S8: determination of mass fraction of benzoic acid, methyl paraben and n-butyl paraben in soy sauce

    NASA Astrophysics Data System (ADS)

    Teo, Tang Lin; Gui, Ee Mei; Lu, Ting; Sze Cheow, Pui; Giannikopoulou, Panagiota; Kakoulides, Elias; Lampi, Evgenia; Choi, Sik-man; Yip, Yiu-chung; Chan, Pui-kwan; Hui, Sin-kam; Wollinger, Wagner; Carvalho, Lucas J.; Garrido, Bruno C.; Rego, Eliane C. P.; Ahn, Seonghee; Kim, Byungjoo; Li, Xiuqin; Guo, Zhen; Styarini, Dyah; Aristiawan, Yosi; Putri Ramadhaningtyas, Dillani; Aryana, Nurhani; Ebarvia, Benilda S.; Dacuaya, Aaron; Tongson, Alleni; Aganda, Kim Christopher; Junvee Fortune, Thippaya; Tangtrirat, Pradthana; Mungmeechai, Thanarak; Ceyhan Gören, Ahmet; Gündüz, Simay; Yilmaz, Hasibe

    2017-01-01

    The supplementary comparison APMP.QM-S8: determination of mass fraction of benzoic acid, methyl paraben and n-butyl paraben in soy sauce was coordinated by the Health Sciences Authority, Singapore under the auspices of the Organic Analysis Working Group (OAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM). Ten national metrology institutes (NMIs) or designated institutes (DIs) participated in the comparison. All the institutes participated in the comparison for benzoic acid, while six NMIs/DIs participated in the comparison for methyl paraben and n-butyl paraben. The comparison was designed to enable participating institutes to demonstrate their measurement capabilities in the determination of common preservatives in soy sauce, using procedure(s) that required simple sample preparation and selective detection in the mass fraction range of 50 to 1000 mg/kg. The demonstrated capabilities can be extended to include other polar food preservatives (e.g. sorbic acid, propionic acid and other alkyl benzoates) in water, aqueous-based beverages (e.g. fruit juices, tea extracts, sodas, sports drinks, etc.) and aqueous-based condiments (e.g. vinegar, fish sauce, etc.). Liquid-liquid extraction and/or dilution were applied, followed by instrumental analyses using LC-MS/MS, LC-MS, GC-MS (with or without derivatisation) or HPLC-DAD. Isotope dilution mass spectrometry was used for quantification, except for one participating institute, where an external calibration method was used for quantification of all three measurands. The assigned Supplementary Comparison Reference Values (SCRVs) were the medians of ten results for benzoic acid, six results for methyl paraben and six results for n-butyl paraben. 
Benzoic acid was assigned a SCRV of 154.55 mg/kg with a combined standard uncertainty of 0.94 mg/kg, methyl paraben was assigned a SCRV of 100.95 mg/kg with a combined standard uncertainty of 0.40 mg/kg, and n-butyl paraben was assigned a SCRV of 99.05 mg/kg with a combined standard uncertainty of 1.36 mg/kg. The k-factors for the estimation of the expanded uncertainties of the SCRVs were 2.26, 2.57 and 2.57, respectively. The degree of equivalence (with the SCRV) and its uncertainty were calculated for each result. All the participating institutes (except in one case for benzoic acid) were able to demonstrate or confirm their capabilities in the determination of polar food preservatives in water or aqueous-based beverages/condiments. The main text of this report appears in Appendix B of the BIPM key comparison database (kcdb.bipm.org). The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
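
    The comparison statistics can be reproduced in miniature: a median SCRV, an expanded uncertainty using the reported coverage factor k = 2.26 for ten results, and degrees of equivalence. The participant results and uncertainties below are hypothetical, and the uncertainty of the median is approximated with the common 1.2533·s/√n rule, a simplification of the procedure actually used.

```python
import numpy as np

# Hypothetical participant results (mg/kg) and standard uncertainties for one measurand
x = np.array([153.9, 154.8, 155.2, 154.1, 154.6, 153.7, 155.6, 154.5, 154.3, 155.0])
u = np.array([0.9, 1.1, 0.8, 1.2, 1.0, 0.9, 1.4, 1.0, 1.1, 0.8])

scrv = np.median(x)                                 # SCRV taken as the median of the results
u_scrv = 1.2533 * x.std(ddof=1) / np.sqrt(len(x))   # approximate standard uncertainty of a median
k = 2.26                                            # coverage factor for ten results (t-distribution, 9 dof)
U_scrv = k * u_scrv                                 # expanded uncertainty of the SCRV

# Degree of equivalence: d_i = x_i - SCRV with expanded uncertainty U(d_i);
# correlation between each result and the SCRV is ignored here (a simplification)
d = x - scrv
U_d = k * np.sqrt(u**2 + u_scrv**2)
consistent = np.abs(d) <= U_d                       # |d_i| <= U(d_i) demonstrates capability
```

    A participant whose |d_i| exceeds U(d_i) would be the analogue of the one benzoic acid result flagged in the report.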

  18. Numerical Uncertainty Analysis for Computational Fluid Dynamics using Student T Distribution -- Application of CFD Uncertainty Analysis Compared to Exact Analytical Solution

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2014-01-01

    Computational Fluid Dynamics (CFD) is the standard numerical tool used by fluid dynamicists to estimate solutions to many problems in academia, government, and industry. CFD is known to have errors and uncertainties, and there is no universally adopted method to estimate such quantities. This paper describes an approach to estimate CFD uncertainties strictly numerically using inputs and the Student-t distribution. The approach is compared to an exact analytical solution of fully developed, laminar flow between infinite, stationary plates. It is shown that treating all CFD input parameters as oscillatory uncertainty terms, coupled with the Student-t distribution, can encompass the exact solution.
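
    The interval construction amounts to a standard Student-t confidence band on the mean of repeated CFD runs with perturbed inputs. A minimal sketch with hypothetical run values:

```python
import math

# Hypothetical pressure-drop results (Pa) from five CFD runs with perturbed inputs
runs = [112.4, 110.9, 113.1, 111.7, 112.0]
n = len(runs)
mean = sum(runs) / n
s = math.sqrt(sum((r - mean) ** 2 for r in runs) / (n - 1))   # sample standard deviation

t95 = 2.776                  # two-sided 95% Student-t value for n - 1 = 4 degrees of freedom
u = t95 * s / math.sqrt(n)   # numerical uncertainty band about the mean
```

    The claim tested against the analytical solution is then that the exact value lies within mean ± u.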

  19. Removal of Asperger's syndrome from the DSM V: community response to uncertainty.

    PubMed

    Parsloe, Sarah M; Babrow, Austin S

    2016-01-01

    The May 2013 release of the new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM V) subsumed Asperger's syndrome under the wider diagnostic label of autism spectrum disorder (ASD). The revision has created much uncertainty in the community affected by this condition. This study uses problematic integration theory and thematic analysis to investigate how participants in Wrong Planet, a large online community associated with autism and Asperger's syndrome, have constructed these uncertainties. The analysis illuminates uncertainties concerning both the likelihood of diagnosis and value of diagnosis, and it details specific issues within these two general areas of uncertainty. The article concludes with both conceptual and practical implications.

  20. Analyzing the greenhouse gas impact potential of smallholder development actions across a global food security program

    NASA Astrophysics Data System (ADS)

    Grewer, Uwe; Nash, Julie; Gurwick, Noel; Bockel, Louis; Galford, Gillian; Richards, Meryl; Costa Junior, Ciniro; White, Julianna; Pirolli, Gillian; Wollenberg, Eva

    2018-04-01

    This article analyses the greenhouse gas (GHG) impact potential of improved management practices and technologies for smallholder agriculture promoted under a global food security development program. Under ‘business-as-usual’ development, global studies on the future of agriculture to 2050 project considerable increases in total food production and cultivated area. Conventional cropland intensification and conversion of natural vegetation typically result in increased GHG emissions and loss of carbon stocks. There is a strong need to understand the potential greenhouse gas impacts of agricultural development programs intended to achieve large-scale change, and to identify pathways of smallholder agricultural development that can achieve food security and agricultural production growth without drastic increases in GHG emissions. In an analysis of 134 crop and livestock production systems in 15 countries, with reported impacts on 4.8 million ha, improved management practices and technologies adopted by smallholder farmers significantly reduced the GHG emission intensity of agricultural production, increased yields and reduced post-harvest losses, while either decreasing or only moderately increasing net GHG emissions per area. Investments in both production and post-harvest stages meaningfully reduced GHG emission intensity, contributing to low emission development. We present average impacts on net GHG emissions per hectare and on GHG emission intensity, but do not provide detailed statistics of GHG impacts at scale, which are associated with additional uncertainties. While the reported improvements in smallholder systems effectively reduce future GHG emissions compared to business-as-usual development, these contributions are insufficient to significantly reduce net GHG emissions in agriculture below current levels, particularly if future agricultural production grows at projected rates.

  1. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless we still find high concentrations in measurements mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement EU-wide existing policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support to regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model, and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in-model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy steered SHERPA module - predicting air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) by sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
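The abstract's pairing of UA (quantify the output uncertainty) with SA (apportion it to inputs) can be illustrated with a generic Monte Carlo sketch. The three-input response function, its coefficients, and the uniform input ranges below are hypothetical stand-ins, not the SHERPA source-receptor model; the sensitivity measure used is the standardized regression coefficient (SRC), a common screening choice for near-linear models.

```python
import numpy as np

# Hypothetical response: a toy stand-in for a source-receptor module mapping
# three uncertain emission-reduction inputs to a concentration change.
def model(x1, x2, x3):
    return 3.0 * x1 + 0.5 * x2 + 0.1 * x3 + 0.2 * x1 * x2

rng = np.random.default_rng(0)
n = 10_000
X = rng.uniform(0.0, 1.0, size=(n, 3))      # sampled input uncertainty (assumed ranges)
y = model(X[:, 0], X[:, 1], X[:, 2])

# Uncertainty analysis: spread of the output induced by the inputs.
print(f"output mean = {y.mean():.3f}, std = {y.std():.3f}")

# Sensitivity analysis via standardized regression coefficients: regress the
# standardized output on the standardized inputs; squared coefficients
# approximate each input's share of the output variance for near-linear models.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, s in zip(["x1", "x2", "x3"], src):
    print(f"SRC({name}) = {s:.3f}, variance share ~ {s**2:.3f}")
```

A variance-based method (e.g. Sobol indices) would replace the regression step for strongly nonlinear models, at higher sampling cost.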

  2. Proton and neutron electromagnetic form factors and uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Zhihong; Arrington, John; Hill, Richard J.

    We determine the nucleon electromagnetic form factors and their uncertainties from world electron scattering data. The analysis incorporates two-photon exchange corrections, constraints on the low-Q² and high-Q² behavior, and additional uncertainties to account for tensions between different data sets and uncertainties in radiative corrections.

  3. Proton and neutron electromagnetic form factors and uncertainties

    DOE PAGES

    Ye, Zhihong; Arrington, John; Hill, Richard J.; ...

    2017-12-06

    We determine the nucleon electromagnetic form factors and their uncertainties from world electron scattering data. The analysis incorporates two-photon exchange corrections, constraints on the low-Q² and high-Q² behavior, and additional uncertainties to account for tensions between different data sets and uncertainties in radiative corrections.

  4. Quantifying uncertainty in forest nutrient budgets

    Treesearch

    Ruth D. Yanai; Carrie R. Levine; Mark B. Green; John L. Campbell

    2012-01-01

    Nutrient budgets for forested ecosystems have rarely included error analysis, in spite of the importance of uncertainty to interpretation and extrapolation of the results. Uncertainty derives from natural spatial and temporal variation and also from knowledge uncertainty in measurement and models. For example, when estimating forest biomass, researchers commonly report...

  5. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-used market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decision in face of the uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
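The mixed aleatory/epistemic setup the abstract describes is commonly handled with a two-loop Monte Carlo: an outer loop samples the epistemic quantities (here reduced to an interval-valued price), an inner loop averages over the aleatory randomness. The lifecycle value function and every number below are invented for illustration; the paper's actual MCS-UUA approach combines probability and evidence theory.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy lifecycle value of a servicing campaign: successful operations earn the
# (uncertain) market price, against a fixed per-operation cost. All numbers
# here are illustrative assumptions, not the paper's models.
def lifecycle_value(price, p_fail, n_ops, rng):
    ok = rng.random(n_ops) > p_fail          # aleatory: random operation failures
    return price * ok.sum() - 2.0 * n_ops    # revenue minus a fixed unit cost

price_interval = (2.5, 4.0)   # epistemic: price trend known only as an interval
p_fail = 0.1                  # aleatory failure probability (assumed)
n_ops = 10

outer, inner = 50, 2000
means = []
for _ in range(outer):                        # outer loop: epistemic samples
    price = rng.uniform(*price_interval)
    vals = [lifecycle_value(price, p_fail, n_ops, rng) for _ in range(inner)]
    means.append(float(np.mean(vals)))

# The spread of the inner-loop means reflects the epistemic (reducible)
# uncertainty; each inner loop already averages over the aleatory randomness.
print(f"expected lifecycle value spans [{min(means):.1f}, {max(means):.1f}]")
```

Reporting the expected value as a range rather than a single number is the point: the epistemic interval cannot honestly be collapsed into one probability distribution.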

  6. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2015-04-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterized and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterized uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. 
Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).
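The regression-plus-bootstrap recipe described above can be sketched on a synthetic series with a known trend, in the spirit of the paper's synthetic demonstrations. The series length, noise level, and trend value below are illustrative; a full treatment would also handle autocorrelation and per-datum measurement uncertainties.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly "temperature anomaly" series with a known linear trend
# (all values illustrative, not GRUAN data).
n_months = 240
t = np.arange(n_months) / 12.0               # time in years
true_trend = 0.02                            # degrees per year (assumed)
y = true_trend * t + rng.normal(0.0, 0.3, n_months)

# Ordinary least squares trend estimate.
A = np.vstack([t, np.ones_like(t)]).T
slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

# Residual bootstrap: refit on resampled residuals to get a trend uncertainty.
resid = y - (slope * t + intercept)
boot = []
for _ in range(1000):
    y_b = slope * t + intercept + rng.choice(resid, size=n_months, replace=True)
    boot.append(np.linalg.lstsq(A, y_b, rcond=None)[0][0])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"trend = {slope:.4f} deg/yr, 95% CI [{lo:.4f}, {hi:.4f}]")
```

For autocorrelated residuals the simple resampling above would understate the trend uncertainty; a block bootstrap (resampling contiguous chunks) is the usual remedy.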

  7. Techniques for analyses of trends in GRUAN data

    NASA Astrophysics Data System (ADS)

    Bodeker, G. E.; Kremser, S.

    2014-12-01

    The Global Climate Observing System (GCOS) Reference Upper Air Network (GRUAN) provides reference quality RS92 radiosonde measurements of temperature, pressure and humidity. A key attribute of reference quality measurements, and hence GRUAN data, is that each datum has a well characterised and traceable estimate of the measurement uncertainty. The long-term homogeneity of the measurement records, and their well characterised uncertainties, make these data suitable for reliably detecting changes in global and regional climate on decadal time scales. Considerable effort is invested in GRUAN operations to (i) describe and analyse all sources of measurement uncertainty to the extent possible, (ii) quantify and synthesize the contribution of each source of uncertainty to the total measurement uncertainty, and (iii) verify that the evaluated net uncertainty is within the required target uncertainty. However, if the climate science community is not sufficiently well informed on how to capitalize on this added value, the significant investment in estimating meaningful measurement uncertainties is largely wasted. This paper presents and discusses the techniques that will need to be employed to reliably quantify long-term trends in GRUAN data records. A pedagogical approach is taken whereby numerical recipes for key parts of the trend analysis process are explored. The paper discusses the construction of linear least squares regression models for trend analysis, boot-strapping approaches to determine uncertainties in trends, dealing with the combined effects of autocorrelation in the data and measurement uncertainties in calculating the uncertainty on trends, best practice for determining seasonality in trends, how to deal with co-linear basis functions, and interpreting derived trends. 
Synthetic data sets are used to demonstrate these concepts which are then applied to a first analysis of temperature trends in RS92 radiosonde upper air soundings at the GRUAN site at Lindenberg, Germany (52.21° N, 14.12° E).

  8. Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.

    NASA Astrophysics Data System (ADS)

    Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.

    2017-12-01

    A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but it does not preclude large uncertainties in the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or the physical processes involved. The present study is based on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space for return periods up to 120 years.

  9. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit 1 (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  10. Evaluation of landfill gas emissions from municipal solid waste landfills for the life-cycle analysis of waste-to-energy pathways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Uisung; Han, Jeongwoo; Wang, Michael

    Various waste-to-energy (WTE) conversion technologies can generate energy products from municipal solid waste (MSW). Accurately evaluating landfill gas (LFG, mainly methane) emissions from base case landfills is critical to conducting a WTE life-cycle analysis (LCA) of their greenhouse gas (GHG) emissions. To reduce uncertainties in estimating LFG, this study investigated key parameters for its generation, based on updated experimental results. These results showed that the updated parameters changed the calculated GHG emissions from landfills significantly depending on waste stream; they resulted in a 65% reduction for wood (from 2412 to 848 t CO₂e/dry t) to a 4% increase for food waste (from 2603 to 2708 t CO₂e/dry t). Landfill GHG emissions also vary significantly based on LFG management practices and climate. In LCAs of WTE conversion, generating electricity from LFG helps reduce GHG emissions indirectly by displacing regional electricity. When both active LFG collection and power generation are considered, GHG emissions are 44% less for food waste (from 2708 to 1524 t CO₂e/dry t), relative to conventional MSW landfilling. The method developed and data collected in this study can help improve the assessment of GHG impacts from landfills, which supports transparent decision-making regarding the sustainable treatment, management, and utilization of MSW.

  11. Evaluation of landfill gas emissions from municipal solid waste landfills for the life-cycle analysis of waste-to-energy pathways

    DOE PAGES

    Lee, Uisung; Han, Jeongwoo; Wang, Michael

    2017-08-05

    Various waste-to-energy (WTE) conversion technologies can generate energy products from municipal solid waste (MSW). Accurately evaluating landfill gas (LFG, mainly methane) emissions from base case landfills is critical to conducting a WTE life-cycle analysis (LCA) of their greenhouse gas (GHG) emissions. To reduce uncertainties in estimating LFG, this study investigated key parameters for its generation, based on updated experimental results. These results showed that the updated parameters changed the calculated GHG emissions from landfills significantly depending on waste stream; they resulted in a 65% reduction for wood (from 2412 to 848 t CO₂e/dry t) to a 4% increase for food waste (from 2603 to 2708 t CO₂e/dry t). Landfill GHG emissions also vary significantly based on LFG management practices and climate. In LCAs of WTE conversion, generating electricity from LFG helps reduce GHG emissions indirectly by displacing regional electricity. When both active LFG collection and power generation are considered, GHG emissions are 44% less for food waste (from 2708 to 1524 t CO₂e/dry t), relative to conventional MSW landfilling. The method developed and data collected in this study can help improve the assessment of GHG impacts from landfills, which supports transparent decision-making regarding the sustainable treatment, management, and utilization of MSW.

  12. RECONSTRUCTING EXPOSURE SCENARIOS USING DOSE BIOMARKERS - AN APPLICATION OF BAYESIAN UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We use Bayesian uncertainty analysis to explore how to estimate pollutant exposures from biomarker concentrations. The growing number of national databases with exposure data makes such an analysis possible. They contain datasets of pharmacokinetic biomarkers for many polluta...

  13. APPLICATION OF BAYESIAN MONTE CARLO ANALYSIS TO A LAGRANGIAN PHOTOCHEMICAL AIR QUALITY MODEL. (R824792)

    EPA Science Inventory

    Uncertainties in ozone concentrations predicted with a Lagrangian photochemical air quality model have been estimated using Bayesian Monte Carlo (BMC) analysis. Bayesian Monte Carlo analysis provides a means of combining subjective "prior" uncertainty estimates developed ...
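Bayesian Monte Carlo in its generic form draws samples from the subjective prior, runs the model for each, and weights each sample by the likelihood of the observations. The toy linear "ozone" response, the lognormal prior, and the single observation below are assumptions made purely for illustration, not the Lagrangian model of the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical stand-in for a photochemical model: predicted ozone as a
# function of one uncertain rate parameter k (values invented).
def predicted_ozone(k):
    return 40.0 + 25.0 * k

# Subjective prior on k and one "observation" with known error (assumed).
k_prior = rng.lognormal(mean=0.0, sigma=0.5, size=50_000)
obs, obs_sigma = 70.0, 3.0

# Bayesian Monte Carlo: weight each prior sample by the Gaussian likelihood
# of the observation given that sample's prediction.
pred = predicted_ozone(k_prior)
w = np.exp(-0.5 * ((pred - obs) / obs_sigma) ** 2)
w /= w.sum()

post_mean = np.sum(w * k_prior)
post_sd = np.sqrt(np.sum(w * (k_prior - post_mean) ** 2))
print(f"prior mean k = {k_prior.mean():.3f}, posterior k = {post_mean:.3f} +/- {post_sd:.3f}")
```

The observation pulls the posterior toward the k value the model needs to reproduce it, and the posterior spread is narrower than the prior's: the "impact of new knowledge" the surrounding abstracts refer to.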

  14. Modelling Southern Ocean ecosystems: krill, the food-web, and the impacts of harvesting.

    PubMed

    Hill, S L; Murphy, E J; Reid, K; Trathan, P N; Constable, A J

    2006-11-01

    The ecosystem approach to fisheries recognises the interdependence between harvested species and other ecosystem components. It aims to account for the propagation of the effects of harvesting through the food-web. The formulation and evaluation of ecosystem-based management strategies requires reliable models of ecosystem dynamics to predict these effects. The krill-based system in the Southern Ocean was the focus of some of the earliest models exploring such effects. It is also a suitable example for the development of models to support the ecosystem approach to fisheries because it has a relatively simple food-web structure and progress has been made in developing models of the key species and interactions, some of which has been motivated by the need to develop ecosystem-based management. Antarctic krill, Euphausia superba, is the main target species for the fishery and the main prey of many top predators. It is therefore critical to capture the processes affecting the dynamics and distribution of krill in ecosystem dynamics models. These processes include environmental influences on recruitment and the spatially variable influence of advection. Models must also capture the interactions between krill and its consumers, which are mediated by the spatial structure of the environment. Various models have explored predator-prey population dynamics with simplistic representations of these interactions, while others have focused on specific details of the interactions. There is now a pressing need to develop plausible and practical models of ecosystem dynamics that link processes occurring at these different scales. Many studies have highlighted uncertainties in our understanding of the system, which indicates future priorities in terms of both data collection and developing methods to evaluate the effects of these uncertainties on model predictions. 
We propose a modelling approach that focuses on harvested species and their monitored consumers and that evaluates model uncertainty by using alternative structures and functional forms in a Monte Carlo framework.

  15. An updated dose assessment for resettlement options at Bikini Atoll--a U.S. nuclear test site.

    PubMed

    Robison, W L; Bogen, K T; Conrado, C L

    1997-07-01

    On 1 March 1954, a nuclear weapon test, code-named BRAVO, conducted at Bikini Atoll in the northern Marshall Islands, contaminated the major residence island. There has been a continuing effort since 1977 to refine dose assessments for resettlement options at Bikini Atoll. Here we provide a radiological dose assessment for the main residence island, Bikini, using extensive radionuclide concentration data derived from analysis of food crops, ground water, cistern water, fish and other marine species, animals, air, and soil collected at Bikini Island as part of our continuing research and monitoring program that began in 1978. The unique composition of coral soil greatly alters the relative contribution of 137Cs and 90Sr to the total estimated dose relative to expectations based on North American and European soils. Without countermeasures, 137Cs produces 96% of the estimated dose for returning residents, mostly through uptake from the soil to terrestrial food crops but also from external gamma exposure. The doses are calculated assuming a resettlement date of 1999. The estimated maximum annual effective dose for current island conditions is 4.0 mSv when imported foods, which are now an established part of the diet, are available. The 30-, 50-, and 70-y integral effective doses are 91 mSv, 130 mSv, and 150 mSv, respectively. A detailed uncertainty analysis for these dose estimates is presented in a companion paper in this issue. We have evaluated various countermeasures to reduce 137Cs in food crops. Treatment with potassium reduces the uptake of 137Cs into food crops, and therefore the ingestion dose, to about 5% of pretreatment levels and has essentially no negative environmental consequences. 
We have calculated the dose for the rehabilitation scenario where the top 40 cm of soil is removed in the housing and village area, and the rest of the island is treated with potassium fertilizer; the maximum annual effective dose is 0.41 mSv and the 30-, 50-, and 70-y integral effective doses are 9.8 mSv, 14 mSv, and 16 mSv, respectively.

  16. Uncertainty Estimation using Bootstrapped Kriging Predictions for Precipitation Isoscapes

    NASA Astrophysics Data System (ADS)

    Ma, C.; Bowen, G. J.; Vander Zanden, H.; Wunder, M.

    2017-12-01

    Isoscapes are spatial models representing the distribution of stable isotope values across landscapes. Isoscapes of hydrogen and oxygen in precipitation are now widely used in a diversity of fields, including geology, biology, hydrology, and atmospheric science. To generate isoscapes, geostatistical methods are typically applied to extend predictions from limited data measurements. Kriging is a popular method in isoscape modeling, but quantifying the uncertainty associated with the resulting isoscapes is challenging. Applications that use precipitation isoscapes to determine sample origin require estimation of uncertainty. Here we present a simple bootstrap method (SBM) to estimate the mean and uncertainty of the kriged isoscape and compare these results with a generalized bootstrap method (GBM) applied in previous studies. We used hydrogen isotopic data from IsoMAP to explore these two approaches for estimating uncertainty. We conducted 10 simulations for each bootstrap method and found that SBM yielded successful kriging predictions in more simulations (9/10) than GBM (4/10). Predictions from SBM were closer to the original prediction generated without bootstrapping and had less variance than those from GBM. SBM was tested on different datasets from IsoMAP with different numbers of observation sites. We determined that predictions from the datasets with fewer than 40 observation sites using SBM were more variable than the original prediction. The approaches we used for estimating uncertainty will be compiled in an R package that is under development. We expect that these robust estimates of precipitation isoscape uncertainty can be applied in diagnosing the origin of samples ranging from various types of water to migratory animals, food products, and humans.
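A minimal version of the simple-bootstrap idea resamples observation sites with replacement and re-interpolates at a target location, taking the spread of the re-interpolated values as the prediction uncertainty. To keep the sketch dependency-free, inverse-distance weighting stands in for kriging, and all site data are synthetic; none of this is the authors' R code.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic "observation sites" with isotope-like values: a spatial gradient
# plus noise (illustrative, not IsoMAP data).
sites = rng.uniform(0, 100, size=(60, 2))
values = -0.5 * sites[:, 0] + rng.normal(0, 5, 60)

def idw(sites, values, target, power=2.0):
    """Inverse-distance-weighted interpolation (stand-in for kriging)."""
    d = np.linalg.norm(sites - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return np.sum(w * values) / np.sum(w)

target = np.array([50.0, 50.0])
point = idw(sites, values, target)

# Simple bootstrap: resample sites with replacement, re-interpolate, and use
# the spread of the predictions as the uncertainty at the target.
boot = []
for _ in range(500):
    idx = rng.integers(0, len(sites), len(sites))
    boot.append(idw(sites[idx], values[idx], target))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"prediction = {point:.1f}, bootstrap 95% interval [{lo:.1f}, {hi:.1f}]")
```

With kriging, each bootstrap replicate would also refit the variogram, which is where the method's sensitivity to the number of observation sites enters.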

  17. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    NASA Astrophysics Data System (ADS)

    Freni, Gabriele; Mannina, Giorgio

    In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be tested and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding the model residuals. Statistical transformations, such as the Box-Cox equation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to a wrong uncertainty estimation. The present paper aims to explore the influence of the Box-Cox equation for environmental water quality models. To this end, five cases were considered, one of which used the “real” residuals distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. 
The less formal methods always provide an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the residuals distribution. If residuals are not normally distributed, the uncertainty is over-estimated if the Box-Cox transformation is not applied or a non-calibrated transformation parameter is used.
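The role of the Box-Cox transformation can be demonstrated generically with SciPy on synthetic, positively skewed "concentration" data (illustrative values, not the Nocella observations): `scipy.stats.boxcox` estimates the transformation parameter lambda by maximum likelihood, and a Gaussian likelihood built on the transformed data is then closer to satisfying its normality and homoscedasticity hypotheses.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Skewed, strictly positive "concentration" data, as is typical of water
# quality observations (synthetic, for illustration only).
obs = rng.lognormal(mean=1.0, sigma=0.8, size=2000)

# Box-Cox transformation: with lmbda=None, SciPy returns the transformed data
# and the lambda that makes them most nearly Gaussian.
transformed, lam = stats.boxcox(obs)

skew_before = stats.skew(obs)
skew_after = stats.skew(transformed)
print(f"estimated lambda = {lam:.2f}")
print(f"skewness before = {skew_before:.2f}, after = {skew_after:.2f}")
```

For lognormal-like data the fitted lambda sits near zero (a log transform); the paper's caution is that a poorly chosen or uncalibrated lambda distorts the likelihood and hence the uncertainty bands.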

  18. Final Technical Report: Advanced Measurement and Analysis of PV Derate Factors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, Bruce Hardison; Burton, Patrick D.; Hansen, Clifford

    2015-12-01

    The Advanced Measurement and Analysis of PV Derate Factors project focuses on improving the accuracy and reducing the uncertainty of PV performance model predictions by addressing a common element of all PV performance models referred to as “derates”. Widespread use of “rules of thumb”, combined with significant uncertainty regarding appropriate values for these factors, contributes to uncertainty in projected energy production.

  19. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity

    Treesearch

    Harbin Li; Steven G. McNulty

    2007-01-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL...

  20. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 2: Uncertainties due to reaction rates

    NASA Technical Reports Server (NTRS)

    Stolarski, R. S.; Butler, D. M.; Rundel, R. D.

    1977-01-01

    A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1 sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2 sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
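The propagation scheme described above (log-normally distributed rate uncertainties pushed through a model, with the result quoted as multiplicative 1-sigma factors) can be sketched generically. The two-rate "depletion" function and the uncertainty factors below are invented for illustration; the actual study varied 55 rates through a full stratospheric model.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy two-rate "ozone depletion" response, a stand-in for the full
# stratospheric model (function form and both rates are invented).
def ozone_depletion(k1, k2):
    return 0.05 * k1 / k2          # fractional ozone loss

# Rate-constant uncertainties are conventionally log-normal: a 1-sigma
# uncertainty factor f means the rate lies between k/f and k*f at 1 sigma.
n = 2000                           # echoing the paper's two thousand cases
f1, f2 = 1.3, 1.5                  # assumed uncertainty factors
k1 = np.exp(rng.normal(0.0, np.log(f1), n))   # nominal rates of 1.0
k2 = np.exp(rng.normal(0.0, np.log(f2), n))

dep = ozone_depletion(k1, k2)
med = np.median(dep)
hi = np.percentile(dep, 84.1) / med   # 1-sigma factor on the high side
lo = med / np.percentile(dep, 15.9)   # 1-sigma factor on the low side
print(f"median depletion = {med:.4f}, 1-sigma factors: {hi:.2f} high, {lo:.2f} low")
```

Because the output here is a ratio of log-normal rates, the combined 1-sigma factor is exp of the root-sum-square of the log factors, which the sampled percentiles recover.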

  1. Factors Affecting Healthful Eating Among Touring Popular Musicians and Singers.

    PubMed

    Cizek, Erin; Kelly, Patrick; Kress, Kathleen; Mattfeldt-Beman, Mildred

    2016-06-01

    Maintaining good health is essential for touring musicians and singers. The stressful demands of touring may impact food choices, leading to detrimental effects on health and performance. This exploratory pilot study aimed to assess factors affecting healthful eating of touring musicians and singers. A 46-item survey was used to assess food- and nutrition-related attitudes, knowledge and behaviors, and environmental factors, as well as lifestyle, musical background, and demographic data. Participants (n=35) were recruited from a musicians' assistance foundation as well as touring musical theater productions and a music festival. Results indicate that touring musicians and singers had positive attitudes regarding healthful foods. Of 35 respondents, 80.0% indicated eating healthful food was important to them. Respondents reported feeling confident selecting (76.5%) and preparing (82.4%) healthful foods; however, they showed uncertainty when determining if carbohydrate-containing foods should be consumed or avoided. Respondents indicated environmental factors including availability and cost of healthy food options and tour schedules limited access to healthful foods. Venues (73.5%), fast food restaurants (67.6%), and airports (64.7%) were the most frequently identified locations in need of offering more healthful food choices. Respondents (52.9%) indicated more support from others while touring would help them make healthier food choices. More research is needed to develop mobile wellness programs as well as performance-based nutrition guidelines for musicians and singers that address the unique demands associated with touring.

  2. Diabetes prevention information in Japanese magazines with the largest print runs. Content analysis using clinical guidelines as a standard.

    PubMed

    Noda, Emi; Mifune, Taka; Nakayama, Takeo

    2013-01-01

    To characterize information on diabetes prevention appearing in Japanese general health magazines and to examine the agreement of the content with that in clinical practice guidelines for the treatment of diabetes in Japan. We used the Japanese magazines' databases provided by the Media Research Center and selected magazines with large print runs published in 2006. Two medical professionals independently conducted content analysis based on items in the diabetes prevention guidelines. The number of pages for each item and agreement with the information in the guidelines were determined. We found 63 issues of magazines amounting to 8,982 pages; 484 pages included diabetes prevention related content. For 23 items included in the diabetes prevention guidelines, overall agreement of information printed in the magazines with that in the guidelines was 64.5% (471 out of 730). The number of times these items were referred to in the magazines varied widely, from 247 times for food items to 0 times for items on screening for pregnancy-induced diabetes, dyslipidemia, and hypertension. Among the 20 items that were referred to at least once, 18 items showed more than 90% agreement with the guidelines. However, there was poor agreement for information on vegetable oil (2/14, 14%) and for specific foods (5/247, 2%). For the fatty acids category, "fat" was not mentioned in the guidelines; however, the term frequently appeared in magazines. "Uncertainty" was never mentioned in magazines for specific food items. The diabetes prevention related content in the health magazines differed from that defined in clinical practice guidelines. Most information in the magazines agreed with the guidelines, however some items were referred to inappropriately. To disseminate correct information to the public on diabetes prevention, health professionals and the media must collaborate.

  3. Uncertainty Analysis and Parameter Estimation For Nearshore Hydrodynamic Models

    NASA Astrophysics Data System (ADS)

    Ardani, S.; Kaihatu, J. M.

    2012-12-01

    Numerical models represent deterministic approaches to the relevant physical processes in the nearshore. The complexity of the model physics and the uncertainty in the model inputs compel us to apply a stochastic approach to analyze the robustness of the model. The Bayesian inverse problem is one powerful way to estimate the important input model parameters (determined by a priori sensitivity analysis) and can be used for uncertainty analysis of the outputs. Bayesian techniques can be used to find the range of most probable parameters based on the probability of the observed data and the residual errors. In this study, the effect of input data involving lateral (Neumann) boundary conditions, bathymetry, and offshore wave conditions on nearshore numerical models is considered. Monte Carlo simulation is applied to a deterministic numerical model (the Delft3D modeling suite for coupled waves and flow) for uncertainty analysis of the outputs (wave height, flow velocity, mean sea level, etc.). Uncertainty analysis of the outputs is performed by randomly sampling from the input probability distribution functions and running the model repeatedly until convergence to consistent results is achieved. The case study used in this analysis is the Duck94 experiment, which was conducted at the U.S. Army Field Research Facility at Duck, North Carolina, USA in the fall of 1994. The joint probability of the model parameters relevant to the Duck94 experiments will be found using the Bayesian approach. We will further show that, by using Bayesian techniques to estimate optimized model parameters as inputs and applying them in the uncertainty analysis, we can obtain more consistent results than when using prior information for the input data: the variation of the uncertain parameters decreases and the probability of the observed data improves as well. 
Keywords: Monte Carlo Simulation, Delft3D, uncertainty analysis, Bayesian techniques, MCMC
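The Monte Carlo propagation loop described above can be sketched in a few lines. This is a minimal illustration with a toy surrogate standing in for the Delft3D runs, and assumed input distributions for offshore wave height and depth; it is not the study's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_model(offshore_height, depth):
    # Hypothetical stand-in for one nearshore model run: a simple
    # shoaling-like transformation of the offshore wave height.
    return offshore_height * (1.0 + 0.1 / np.sqrt(depth))

# Assumed input probability distributions (stand-ins for offshore wave
# conditions and bathymetry); in practice these come from data and priors.
n = 10_000
height = rng.normal(2.0, 0.3, n)     # offshore wave height (m)
depth = rng.uniform(5.0, 10.0, n)    # local depth (m)

# Propagate: sample the inputs, run the model, summarize the output spread.
nearshore = surrogate_model(height, depth)
mean, std = nearshore.mean(), nearshore.std()
```

Convergence would be checked by monitoring how `mean` and `std` stabilize as `n` grows.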

  4. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, F.

The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained). SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainty on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. 
In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.

  5. Sensitivity of wildlife habitat models to uncertainties in GIS data

    NASA Technical Reports Server (NTRS)

    Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.

    1992-01-01

Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. Uncertainties and methods described in the paper have general relevance for many GIS applications.

  6. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, Greg; Wohlwend, Jen

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  7. Parameter sensitivity analysis of a 1-D cold region lake model for land-surface schemes

    NASA Astrophysics Data System (ADS)

    Guerrero, José-Luis; Pernica, Patricia; Wheater, Howard; Mackay, Murray; Spence, Chris

    2017-12-01

    Lakes might be sentinels of climate change, but the uncertainty in their main feedback to the atmosphere - heat-exchange fluxes - is often not considered within climate models. Additionally, these fluxes are seldom measured, hindering critical evaluation of model output. Analysis of the Canadian Small Lake Model (CSLM), a one-dimensional integral lake model, was performed to assess its ability to reproduce diurnal and seasonal variations in heat fluxes and the sensitivity of simulated fluxes to changes in model parameters, i.e., turbulent transport parameters and the light extinction coefficient (Kd). A C++ open-source software package, Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), was used to perform sensitivity analysis (SA) and identify the parameters that dominate model behavior. The generalized likelihood uncertainty estimation (GLUE) was applied to quantify the fluxes' uncertainty, comparing daily-averaged eddy-covariance observations to the output of CSLM. Seven qualitative and two quantitative SA methods were tested, and the posterior likelihoods of the modeled parameters, obtained from the GLUE analysis, were used to determine the dominant parameters and the uncertainty in the modeled fluxes. Despite the ubiquity of the equifinality issue - different parameter-value combinations yielding equivalent results - the answer to the question was unequivocal: Kd, a measure of how much light penetrates the lake, dominates sensible and latent heat fluxes, and the uncertainty in their estimates is strongly related to the accuracy with which Kd is determined. This is important since accurate and continuous measurements of Kd could reduce modeling uncertainty.
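The GLUE procedure used above can be sketched compactly: sample parameter sets from a prior range, score each run with a likelihood measure, keep the "behavioral" sets, and weight predictions by likelihood. This is a minimal illustration with a toy exponential flux model standing in for CSLM; the likelihood measure, threshold, and all numbers are assumptions, not the study's actual choices.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_flux(kd, t):
    # Toy stand-in for a flux model governed by an extinction-like parameter.
    return np.exp(-kd * t)

# Synthetic "observations" generated with a true Kd of 0.5 plus noise.
t = np.linspace(0.0, 5.0, 20)
obs = toy_flux(0.5, t) + rng.normal(0.0, 0.02, t.size)

# 1. Sample candidate parameter values from a uniform prior range.
kd = rng.uniform(0.1, 1.0, 5000)

# 2. Likelihood measure: Nash-Sutcliffe-style efficiency of each run.
sse = ((np.exp(-kd[:, None] * t) - obs) ** 2).sum(axis=1)
eff = 1.0 - sse / np.sum((obs - obs.mean()) ** 2)

# 3. Retain "behavioral" sets above a (assumed) threshold, weight by likelihood.
keep = eff > 0.9
w = eff[keep] / eff[keep].sum()
kd_hat = np.sum(w * kd[keep])                      # likelihood-weighted estimate
band = np.percentile(toy_flux(kd[keep][:, None], t), [5, 95], axis=0)
```

The spread of `band` is the GLUE uncertainty estimate on the predicted fluxes; in the study this is compared against eddy-covariance observations.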

  8. Multivariate Probabilistic Analysis of an Hydrological Model

    NASA Astrophysics Data System (ADS)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

Model predictions based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially-distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response to probabilistic methods. In particular we compare the results of Monte Carlo Simulations (MCS) to the results obtained, in the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be reproduced satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. 
LiM is less computationally demanding than MCS, but has limited applicability, especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimations, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case the spatial variability of rainfall and the uncertainty in the model parameters are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
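Li's specific scheme is not reproduced in the abstract; as an illustration of the point-estimate idea, the classic Rosenblueth-style two-point variant evaluates the model at the mean plus/minus one standard deviation of each input and combines the results with equal weights. The toy discharge model and all numbers below are hypothetical, chosen only to contrast a handful of evaluations against brute-force Monte Carlo.

```python
import numpy as np
from itertools import product

def model(rain, k):
    # Toy nonlinear discharge response (hypothetical, not the Brenta model).
    return k * rain ** 1.5

means = {"rain": 10.0, "k": 0.8}
stds = {"rain": 2.0, "k": 0.1}
names = list(means)

# Two-point estimate: evaluate the model at mu +/- sigma for every
# combination of the m inputs (2^m points, equal weights, independent inputs).
points = np.array([
    model(**{n: means[n] + s * stds[n] for n, s in zip(names, signs)})
    for signs in product((-1, 1), repeat=len(names))
])
pem_mean = points.mean()
pem_std = np.sqrt((points ** 2).mean() - pem_mean ** 2)

# Reference: brute-force Monte Carlo with many model evaluations.
rng = np.random.default_rng(2)
mc = model(rng.normal(10.0, 2.0, 200_000), rng.normal(0.8, 0.1, 200_000))
```

Here four model runs approximate the mean and standard deviation that Monte Carlo needs hundreds of thousands of runs to estimate; as the abstract notes, the match degrades when the response is strongly nonlinear.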

  9. Informative Bayesian Type A uncertainty evaluation, especially applicable to a small number of observations

    NASA Astrophysics Data System (ADS)

    Cox, M.; Shirono, K.

    2017-10-01

A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM’s Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
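The effect of a bounded prior on the sampling standard deviation can be illustrated numerically: place a flat prior on the mean and a flat prior on sigma between assumed bounds, integrate sigma out on a grid, and compare the resulting standard uncertainty to s/sqrt(n). This sketch is a numerical illustration of the concept only; it does not reproduce the paper's closed-form expression, and the observations and bounds are hypothetical.

```python
import numpy as np

obs = np.array([10.02, 10.05, 9.98])   # n = 3 repeated observations (hypothetical)
n, xbar = obs.size, obs.mean()
s = obs.std(ddof=1)

# Informative prior: the standard deviation of the sampling distribution
# is assumed known to lie between sig_lo and sig_hi.
sig_lo, sig_hi = 0.01, 0.10

# Grid over (mu, sigma); flat prior on mu, flat prior on sigma within bounds.
mu = np.linspace(xbar - 0.3, xbar + 0.3, 1200)
sig = np.linspace(sig_lo, sig_hi, 400)
M, S = np.meshgrid(mu, sig, indexing="ij")
post = S ** (-n) * np.exp(-(n * (M - xbar) ** 2 + (n - 1) * s ** 2) / (2 * S ** 2))

# Marginal posterior of mu: integrate sigma out, then normalize.
pmu = post.sum(axis=1)
dmu = mu[1] - mu[0]
pmu /= pmu.sum() * dmu
mean_mu = np.sum(pmu * mu) * dmu
u_mu = np.sqrt(np.sum(pmu * (mu - mean_mu) ** 2) * dmu)

# Factor by which s / sqrt(n) is scaled to obtain the standard uncertainty.
factor = u_mu / (s / np.sqrt(n))
```

Because sigma remains uncertain within its bounds, the factor exceeds 1, yet the posterior variance stays finite even for small n, which is the paper's central point.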

  10. Active subspace uncertainty quantification for a polydomain ferroelectric phase-field model

    NASA Astrophysics Data System (ADS)

    Leon, Lider S.; Smith, Ralph C.; Miles, Paul; Oates, William S.

    2018-03-01

Quantum-informed ferroelectric phase field models capable of predicting material behavior are necessary for facilitating the development and production of many adaptive structures and intelligent systems. Uncertainty is present in these models, given the quantum scale at which calculations take place. A necessary analysis is to determine how the uncertainty in the response can be attributed to the uncertainty in the model inputs or parameters. A second analysis is to identify active subspaces within the original parameter space, which quantify directions in which the model response varies most dominantly, thus reducing sampling effort and computational cost. In this investigation, we identify an active subspace for a poly-domain ferroelectric phase-field model. Using the active variables as our independent variables, we then construct a surrogate model and perform Bayesian inference. Once we quantify the uncertainties in the active variables, we obtain uncertainties for the original parameters via an inverse mapping. The analysis provides insight into how active subspace methodologies can be used to reduce the computational power needed to perform Bayesian inference on model parameters informed by experimental or simulated data.
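The standard gradient-based active subspace construction can be sketched on a toy problem: estimate the matrix C = E[grad f grad f^T] from parameter samples and take its dominant eigenvectors as the active directions. The three-parameter model below is purely illustrative, not the phase-field model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model whose response varies only along one direction w, so the
# active subspace is one-dimensional by construction.
w = np.array([1.0, 2.0, 0.0])
w /= np.linalg.norm(w)

def grad_f(x):
    # Gradient of f(x) = sin(x . w) at each sample (rows of x).
    return np.cos(x @ w)[:, None] * w

# Estimate C = E[grad f grad f^T] by sampling the parameter space.
X = rng.uniform(-1.0, 1.0, (2000, 3))
G = grad_f(X)
C = G.T @ G / len(X)

# Eigendecomposition: dominant eigenvectors span the active subspace.
eigval, eigvec = np.linalg.eigh(C)   # eigenvalues in ascending order
active_dir = eigvec[:, -1]
```

A large gap between the leading eigenvalue and the rest signals that a low-dimensional surrogate in the active variables will capture most of the response variation, which is what makes the subsequent Bayesian inference cheaper.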

  11. Error Analysis of CM Data Products Sources of Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  12. Symposium on "The challenge of translating nutrition research into public health nutrition". Session 4: Challenges facing the food industry in innovating for health. Regulatory challenges and opportunities for food innovation.

    PubMed

    Binns, Nino

    2009-02-01

    The primary role of the extensive and complex modern food legislation is to protect the consumer. Providing a framework for industry and enabling free trade are secondary aims. In the EU the 2006 Regulation on nutrition and health claims made on foods was adopted in December 2006. This Regulation defines detailed lists of permitted claims with precise conditions, requires foods making claims to meet specific nutrient profiles and requires the submission of a dossier for approval of new health claims. Nutrient profiles and an initial list of existing health claims will not be agreed until January 2009 and January 2010 respectively. The uncertainty about profiles and the initial list of claims as well as the prescriptive nature of the Regulation will have a major impact, some negative but some positive, on food innovation. Worldwide legislation on nutrition and health claims continues to develop. The current paper also provides an outline of some other key pieces of European legislation that affect food innovation. However, currently, all this legislation remains in development and up-to-date information can be sought from the reference material provided.

  13. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Treesearch

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  14. The Impact of Uncertainty and Irreversibility on Investments in Online Learning

    ERIC Educational Resources Information Center

    Oslington, Paul

    2004-01-01

    Uncertainty and irreversibility are central to online learning projects, but have been neglected in the existing educational cost-benefit analysis literature. This paper builds some simple illustrative models of the impact of irreversibility and uncertainty, and shows how different types of cost and demand uncertainty can have substantial impacts…

  15. Robust Flutter Margin Analysis that Incorporates Flight Data

    NASA Technical Reports Server (NTRS)

    Lind, Rick; Brenner, Martin J.

    1998-01-01

An approach for computing worst-case flutter margins has been formulated in a robust stability framework. Uncertainty operators are included with a linear model to describe modeling errors and flight variations. The structured singular value, mu, computes a stability margin that directly accounts for these uncertainties. This approach introduces a new method of computing flutter margins and an associated new parameter for describing these margins. The mu margins are robust margins that indicate worst-case stability estimates with respect to the defined uncertainty. Worst-case flutter margins are computed for the F/A-18 Systems Research Aircraft using uncertainty sets generated by flight data analysis. The robust margins demonstrate that flight conditions for flutter may lie closer to the flight envelope than previously estimated by p-k analysis.

  16. Risk-Based Sampling: I Don't Want to Weight in Vain.

    PubMed

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. © 2015 Society for Risk Analysis.
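The estimation-error point can be made concrete with a small sketch (all numbers hypothetical): when producer risks share identical variances and pairwise correlations, the true minimum-variance allocation is exactly the equal-weight heuristic, yet weights "optimized" from a covariance matrix estimated over a short window drift away from it.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical risk model for four producers: identical variances and
# pairwise correlations (an exchangeable covariance matrix).
k = 4
cov = 0.0004 * (0.5 * np.eye(k) + 0.5 * np.ones((k, k)))

def min_var_weights(S):
    # Minimum-variance allocation: w proportional to S^{-1} 1, summing to 1.
    w = np.linalg.solve(S, np.ones(len(S)))
    return w / w.sum()

true_w = min_var_weights(cov)        # exactly 1/4 each by symmetry

# "Optimized" weights from a short estimation window of 24 observations
# deviate from the equal allocation that is actually optimal here.
sample = rng.multivariate_normal(np.zeros(k), cov, size=24)
est_w = min_var_weights(np.cov(sample.T))
```

With a short, noisy estimation window the fitted covariance produces unequal weights even though equal allocation is the true optimum, mirroring the abstract's warning about false "optimal" portfolios.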

  17. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-08-07

The sensitivity and uncertainty analysis course will introduce students to k-eff sensitivity data, cross-section uncertainty data, how k-eff sensitivity data and k-eff uncertainty data are generated, and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in the development of upper subcritical limits.

  18. Bayesian characterization of uncertainty in species interaction strengths.

    PubMed

    Wolf, Christopher; Novak, Mark; Gitelman, Alix I

    2017-06-01

    Considerable effort has been devoted to the estimation of species interaction strengths. This effort has focused primarily on statistical significance testing and obtaining point estimates of parameters that contribute to interaction strength magnitudes, leaving the characterization of uncertainty associated with those estimates unconsidered. We consider a means of characterizing the uncertainty of a generalist predator's interaction strengths by formulating an observational method for estimating a predator's prey-specific per capita attack rates as a Bayesian statistical model. This formulation permits the explicit incorporation of multiple sources of uncertainty. A key insight is the informative nature of several so-called non-informative priors that have been used in modeling the sparse data typical of predator feeding surveys. We introduce to ecology a new neutral prior and provide evidence for its superior performance. We use a case study to consider the attack rates in a New Zealand intertidal whelk predator, and we illustrate not only that Bayesian point estimates can be made to correspond with those obtained by frequentist approaches, but also that estimation uncertainty as described by 95% intervals is more useful and biologically realistic using the Bayesian method. In particular, unlike in bootstrap confidence intervals, the lower bounds of the Bayesian posterior intervals for attack rates do not include zero when a predator-prey interaction is in fact observed. We conclude that the Bayesian framework provides a straightforward, probabilistic characterization of interaction strength uncertainty, enabling future considerations of both the deterministic and stochastic drivers of interaction strength and their impact on food webs.

  19. Uncertainties in the governance of animal disease: an interdisciplinary framework for analysis

    PubMed Central

    Fish, Robert; Austin, Zoe; Christley, Robert; Haygarth, Philip M.; Heathwaite, Louise A.; Latham, Sophia; Medd, William; Mort, Maggie; Oliver, David M.; Pickup, Roger; Wastling, Jonathan M.; Wynne, Brian

    2011-01-01

    Uncertainty is an inherent feature of strategies to contain animal disease. In this paper, an interdisciplinary framework for representing strategies of containment, and analysing how uncertainties are embedded and propagated through them, is developed and illustrated. Analysis centres on persistent, periodic and emerging disease threats, with a particular focus on cryptosporidiosis, foot and mouth disease and avian influenza. Uncertainty is shown to be produced at strategic, tactical and operational levels of containment, and across the different arenas of disease prevention, anticipation and alleviation. The paper argues for more critically reflexive assessments of uncertainty in containment policy and practice. An interdisciplinary approach has an important contribution to make, but is absent from current real-world containment policy. PMID:21624922

  20. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    NASA Astrophysics Data System (ADS)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or for its ability to handle case studies involving spatial models and spatial model inputs. Given the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. 
Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy-to-apply tool that can be used in multi-disciplinary research and model-based decision support.
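The Latin hypercube sampling idea behind the package's efficient MC propagation is language-agnostic and can be sketched outside R. The sampler below is a minimal Python illustration with a toy model and assumed input distributions; it does not use or reproduce the spup API.

```python
import numpy as np
from statistics import NormalDist

def latin_hypercube(n, dims, rng):
    # One uniform draw per equal-probability stratum in each dimension,
    # with the strata shuffled independently across dimensions.
    u = (rng.random((n, dims)) + np.arange(n)[:, None]) / n
    for j in range(dims):
        rng.shuffle(u[:, j])
    return u

rng = np.random.default_rng(5)
u = latin_hypercube(100, 2, rng)

# Map the stratified uniforms onto the (assumed) input distributions:
# a normally distributed input and a uniformly distributed one.
x1 = np.array([NormalDist(10.0, 2.0).inv_cdf(v) for v in u[:, 0]])
x2 = 5.0 + 10.0 * u[:, 1]

# Propagate through a toy model; the spread of y is the propagated uncertainty.
y = x1 * np.sqrt(x2)
```

Because every marginal stratum is hit exactly once, far fewer realizations are needed than with plain random sampling to stabilize the output statistics.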

  1. Novel Proximal Sensing for Monitoring Soil Organic C Stocks and Condition.

    PubMed

    Viscarra Rossel, Raphael A; Lobsey, Craig R; Sharman, Chris; Flick, Paul; McLachlan, Gordon

    2017-05-16

    Soil information is needed for environmental monitoring to address current concerns over food, water and energy securities, land degradation, and climate change. We developed the Soil Condition ANalysis System (SCANS) to help address these needs. It integrates an automated soil core sensing system (CSS) with statistical analytics and modeling to characterize soil at fine depth resolutions and across landscapes. The CSS's sensors include a γ-ray attenuation densitometer to measure bulk density, digital cameras to image the measured soil, and a visible-near-infrared (vis-NIR) spectrometer to measure iron oxides and clay mineralogy. The spectra are also modeled to estimate total soil organic carbon (C), particulate, humus, and resistant organic C (POC, HOC, and ROC, respectively), clay content, cation exchange capacity (CEC), pH, volumetric water content, available water capacity (AWC), and their uncertainties. Measurements of bulk density and organic C are combined to estimate C stocks. Kalman smoothing is used to derive complete soil property profiles with propagated uncertainties. The SCANS provides rapid, precise, quantitative, and spatially explicit information about the properties of soil profiles with a level of detail that is difficult to obtain with other approaches. The information gained effectively deepens our understanding of soil and calls attention to the central role soil plays in our environment.

  2. A Pilot Investigation of the Relationship between Climate Variability and Milk Compounds under the Bootstrap Technique

    PubMed Central

    Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika

    2015-01-01

This study analyzes the linear relationship between climate variables and milk components in Iran, applying bootstrapping to include and assess the uncertainty. The climate parameters, Temperature Humidity Index (THI) and Equivalent Temperature Index (ETI), are computed from the NASA-Modern Era Retrospective-Analysis for Research and Applications (NASA-MERRA) reanalysis (2002–2010). Milk data for fat, protein (measured on a fresh-matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique. This method is applied to the original time series, generating statistically equivalent surrogate samples. As a result, despite the short time series and the related uncertainties, an interesting behavior of the relationships between milk compounds and the climate parameters is visible. During spring, only a weak dependency of milk yield on climate variations is obvious, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature and THI. We suggest this methodology for studies of the impacts of climate change on agriculture, as well as environment and food, when only short-term data are available. PMID:28231215
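A pair-resampling bootstrap for a regression slope, of the kind described above, fits in a few lines. The data below are synthetic stand-ins with an assumed slope of -0.02, not the study's milk records.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic stand-in data: a climate index (THI-like) vs. milk fat percentage,
# generated with a known slope of -0.02 plus noise.
thi = rng.uniform(55.0, 75.0, 120)
fat = 5.0 - 0.02 * thi + rng.normal(0.0, 0.1, 120)

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

# Bootstrap: resample (x, y) pairs with replacement and refit each time,
# producing statistically equivalent surrogate samples of the slope.
boot = np.empty(2000)
for b in range(boot.size):
    idx = rng.integers(0, thi.size, thi.size)
    boot[b] = slope(thi[idx], fat[idx])

lo, hi = np.percentile(boot, [2.5, 97.5])   # 95% confidence interval
```

The width of the interval conveys how much the short record limits what can be said about the climate-milk relationship, which is the role the bootstrap plays in the study.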

  3. Certification of caffeine reference material purity by ultraviolet/visible spectrophotometry and high-performance liquid chromatography with diode-array detection as two independent analytical methods.

    PubMed

    Shehata, A B; Rizk, M S; Rend, E A

    2016-10-01

Caffeine reference material certified for purity is produced worldwide, but no research on the details of the certification process has been published in the literature. In this paper, we report the scientific details of the preparation and certification of pure caffeine reference materials. Caffeine was prepared by extraction from roasted and ground coffee with dichloromethane after heating in deionized water mixed with magnesium oxide. The extract was purified, dried, and bottled in dark glass vials. Stratified random selection was applied to select a number of vials for homogeneity and stability studies, which revealed that the prepared reference material is homogeneous and sufficiently stable. Quantification of caffeine purity (%) was carried out using a calibrated UV/visible spectrophotometer and a calibrated high-performance liquid chromatography method with diode-array detection. The results obtained from both methods were combined to derive the certified value and its associated uncertainty. The certified value of the reference material purity was found to be 99.86% with an associated uncertainty of ±0.65%, which makes the candidate reference material a very useful calibrant in food and drug chemical analysis. Copyright © 2016. Published by Elsevier B.V.

  4. Sensitivity of future U.S. water shortages to socioeconomic and climate drivers: A case study in Georgia using an integrated human-earth system modeling framework

    DOE PAGES

    Scott, Michael J.; Daly, Don S.; Hejazi, Mohamad I.; ...

    2016-02-06

Here, one of the most important interactions between humans and climate is in the demand and supply of water. Humans withdraw, use, and consume water and return waste water to the environment for a variety of socioeconomic purposes, including domestic, commercial, and industrial use, production of energy resources and cooling thermal-electric power plants, and growing food, fiber, and chemical feed stocks for human consumption. Uncertainties in the future human demand for water interact with future impacts of climatic change on water supplies to impinge on water management decisions at the international, national, regional, and local level, but until recently tools were not available to assess the uncertainties surrounding these decisions. This paper demonstrates the use of a multi-model framework in a structured sensitivity analysis to project and quantify the sensitivity of future deficits in surface water in the context of climate and socioeconomic change for all U.S. states and sub-basins. The framework treats all sources of water demand and supply consistently from the world to local level. The paper illustrates the capabilities of the framework with sample results for a river sub-basin in the U.S. state of Georgia.

  5. Comparative life cycle assessment (LCA) of construction and demolition (C&D) derived biomass and U.S. northeast forest residuals gasification for electricity production.

    PubMed

    Nuss, Philip; Gardner, Kevin H; Jambeck, Jenna R

    2013-04-02

    With the goal to move society toward less reliance on fossil fuels and the mitigation of climate change, there is increasing interest and investment in the bioenergy sector. However, current bioenergy growth patterns may, in the long term, only be met through an expansion of global arable land at the expense of natural ecosystems and in competition with the food sector. Increasing thermal energy recovery from solid waste reduces dependence on fossil- and biobased energy production while enhancing landfill diversion. Using inventory data from pilot processes, this work assesses the cradle-to-gate environmental burdens of plasma gasification as a route capable of transforming construction and demolition (C&D) derived biomass (CDDB) and forest residues into electricity. Results indicate that the environmental burdens associated with CDDB and forest residue gasification may be similar to conventional electricity generation. Land occupation is lowest when CDDB is used. Environmental impacts are to a large extent due to coal cogasified, coke used as gasifier bed material, and fuel oil cocombusted in the steam boiler. However, uncertainties associated with preliminary system designs may be large, particularly the heat loss associated with pilot scale data resulting in overall low efficiencies of energy conversion to electricity; a sensitivity analysis assesses these uncertainties in further detail.

  6. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    NASA Astrophysics Data System (ADS)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km2) in Belgium. We use a surface-subsurface integrated model implemented using the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observations. Sensitivity and uncertainty analyses were performed on predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.
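The linear uncertainty analysis mentioned in this abstract propagates the parameter covariance obtained during calibration through the sensitivity of a prediction. A minimal sketch of the idea, with purely illustrative parameter values and a stand-in prediction function (the actual study uses UCODE_2005 with the full HydroGeoSphere model):

```python
import numpy as np

# illustrative calibrated parameters and covariance (NOT values from the study)
theta = np.array([1.0, 0.5])
C = np.array([[0.04, 0.01],
              [0.01, 0.09]])  # parameter covariance from calibration

def predict(p):
    # stand-in for a groundwater-level prediction from the full model
    return 30.0 - 5.0 * p[0] + 2.0 * p[1]

# finite-difference sensitivities of the prediction to each parameter
eps = 1e-6
g = np.array([(predict(theta + eps * e) - predict(theta - eps * e)) / (2 * eps)
              for e in np.eye(len(theta))])

# first-order (linear) prediction variance: g^T C g
var_pred = g @ C @ g
print(f"prediction standard deviation ~= {np.sqrt(var_pred):.2f} m")
```

The approximation is exact only for models that are linear in their parameters, which is why the abstract notes it is approximate for this nonlinear system.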

  7. An alternative method for analysis of food taints using stir bar sorptive extraction.

    PubMed

    Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M

    2010-09-10

    The determination of taints in food products currently can involve the use of several sample extraction techniques, including direct headspace (DHS), steam distillation extraction (SDE) and more recently solid phase microextraction (SPME). Each of these techniques has disadvantages, such as the use of large volumes of solvents (SDE), or limitations in sensitivity (DHS), or have only been applied to date for determination of individual or specific groups of tainting compounds (SPME). The use of stir bar sorptive extraction (SBSE) has been evaluated as a quantitative screening method for unknown tainting compounds in foods. A range of commonly investigated problem compounds, with a range of physical and chemical properties, were examined. The method was optimised to give the best response for the majority of compounds and the performance was evaluated by examining the accuracy, precision, linearity, limits of detection and quantitation and uncertainties for each analyte. For most compounds SBSE gave the lowest limits of detection compared to steam distillation extraction or direct headspace analysis and in general was better than these established techniques. However, for methyl methacrylate and hexanal no response was observed following stir bar extraction under the optimised conditions. The assays were carried out using a single quadrupole GC-MS in scan mode. A comparison of acquisition modes and instrumentation was performed using standards to illustrate the increase in sensitivity possible using more targeted ion monitoring or a more sensitive high resolution mass spectrometer. This comparison illustrated the usefulness of this approach as an alternative to specialised glassware or expensive instrumentation. SBSE in particular offers a 'greener' extraction method by a large reduction in the use of organic solvents and also minimises the potential for contamination from external laboratory sources, which is of particular concern for taint analysis. 
Copyright © 2010 Elsevier B.V. All rights reserved.

  8. Uncertainty Modeling for Structural Control Analysis and Synthesis

    NASA Technical Reports Server (NTRS)

    Campbell, Mark E.; Crawley, Edward F.

    1996-01-01

    The development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment is studied. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. A ground based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix which aids in the development of the uncertainty model. Once ground based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. In addition, closed loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.

  9. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    NASA Astrophysics Data System (ADS)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by an adopted algorithm for a VC model, 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity, and 3) propagation error (PE), which is caused by errors in the input variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a method is proposed to quantify the uncertainty of VCs using a confidence interval based on the truncation error (TE). In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte-Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis-related theories of geographic information science.
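The two quadrature schemes named in this abstract can be written compactly as tensor products of one-dimensional weights over a regular grid. A minimal sketch; the Gaussian surface below is a simple stand-in with a known true volume, not the synthetic surface used in the paper:

```python
import numpy as np

def simpson_weights(n, h):
    # composite Simpson weights for n nodes (n must be odd)
    w = np.ones(n)
    w[1:-1:2], w[2:-1:2] = 4.0, 2.0
    return w * h / 3.0

def grid_volume(z, dx, dy, rule="trapezoid"):
    """Volume under a regular-grid DEM via the trapezoidal double rule
    or Simpson's double rule (tensor product of 1-D quadrature weights)."""
    ny, nx = z.shape
    if rule == "trapezoid":
        wx = np.full(nx, dx); wx[[0, -1]] = dx / 2
        wy = np.full(ny, dy); wy[[0, -1]] = dy / 2
    else:
        wx, wy = simpson_weights(nx, dx), simpson_weights(ny, dy)
    return wy @ z @ wx

# synthetic Gaussian surface whose true volume is pi
x = y = np.linspace(-5, 5, 101)
X, Y = np.meshgrid(x, y)
Z = np.exp(-(X**2 + Y**2))
v_t = grid_volume(Z, 0.1, 0.1, "trapezoid")
v_s = grid_volume(Z, 0.1, 0.1, "simpson")
```

Comparing `v_t` and `v_s` against the analytic value on surfaces of varying roughness and grid spacing is one way to separate the model error (choice of rule) from the discrete error (resolution), in the spirit of the paper's experiments.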

  10. Evaluation of a Mysis bioenergetics model

    USGS Publications Warehouse

    Chipps, S.R.; Bennett, D.H.

    2002-01-01

    Direct approaches for estimating the feeding rate of the opossum shrimp Mysis relicta can be hampered by variable gut residence time (evacuation rate models) and non-linear functional responses (clearance rate models). Bioenergetics modeling provides an alternative method, but the reliability of this approach needs to be evaluated using independent measures of growth and food consumption. In this study, we measured growth and food consumption for M. relicta and compared experimental results with those predicted from a Mysis bioenergetics model. For Mysis reared at 10 °C, model predictions were not significantly different from observed values. Moreover, decomposition of mean square error indicated that 70% of the variation between model predictions and observed values was attributable to random error. On average, model predictions were within 12% of observed values. A sensitivity analysis revealed that Mysis respiration and prey energy density were the most sensitive parameters affecting model output. By accounting for uncertainty (95% CLs) in Mysis respiration, we observed a significant improvement in the accuracy of model output (within 5% of observed values), illustrating the importance of sensitive input parameters for model performance. These findings help corroborate the Mysis bioenergetics model and demonstrate the usefulness of this approach for estimating Mysis feeding rate.
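The mean-square-error decomposition referenced here can be done in several ways; a common choice (Theil's) splits MSE into bias, variance, and random (covariance) components. A minimal sketch, assuming this style of decomposition is intended:

```python
import numpy as np

def mse_decomposition(pred, obs):
    """Theil-style decomposition:
    MSE = (mean bias)^2 + (s_p - s_o)^2 + 2(1 - r) s_p s_o.
    Returns each component as a fraction of total MSE."""
    pred, obs = np.asarray(pred, float), np.asarray(obs, float)
    mse = np.mean((pred - obs) ** 2)
    sp, so = pred.std(), obs.std()        # population standard deviations
    r = np.corrcoef(pred, obs)[0, 1]
    parts = {"bias": (pred.mean() - obs.mean()) ** 2,
             "variance": (sp - so) ** 2,
             "random": 2.0 * (1.0 - r) * sp * so}
    return {k: v / mse for k, v in parts.items()}
```

The three fractions sum to one; a large "random" share, like the 70% reported above, indicates that model-observation disagreement is mostly unsystematic rather than a calibration bias.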

  11. A comparative study of multivariable robustness analysis methods as applied to integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Lovell, T. A.; Schmidt, David K.

    1993-01-01

    Three multivariable robustness analysis methods are compared and contrasted. The focus of the analysis is on system stability and performance robustness to uncertainty in the coupling dynamics between two interacting subsystems. Of particular interest is interacting airframe and engine subsystems, and an example airframe/engine vehicle configuration is utilized in the demonstration of these approaches. The singular value (SV) and structured singular value (SSV) analysis methods are compared to a method especially well suited for analysis of robustness to uncertainties in subsystem interactions. This approach is referred to here as the interacting subsystem (IS) analysis method. This method has been used previously to analyze airframe/engine systems, emphasizing the study of stability robustness. However, performance robustness is also investigated here, and a new measure of allowable uncertainty for acceptable performance robustness is introduced. The IS methodology does not require plant uncertainty models to measure the robustness of the system, and is shown to yield valuable information regarding the effects of subsystem interactions. In contrast, the SV and SSV methods allow for the evaluation of the robustness of the system to particular models of uncertainty, and do not directly indicate how the airframe (engine) subsystem interacts with the engine (airframe) subsystem.

  12. A Study of the Role of Small Ethnic Retail Grocery Stores in Urban Renewal in a Social Housing Project, Toronto, Canada.

    PubMed

    Komakech, Morris D C; Jackson, Suzanne F

    2016-06-01

    Urban renewal often drives away the original residents, replacing them with higher income residents who can afford the new spaces, leading to gentrification. Urban renewal that takes place over many years can create uncertainties for retailers and residents, exacerbating the gentrification process. This qualitative study explored how the urban renewal process in a multi-cultural social housing neighborhood in Toronto (Regent Park) affected the small ethnic retail grocery stores (SERGS) that supplied ethnic foods and items to the ethnic populations living there. Interviews were conducted with ten SERGS store owners/managers and 16 ethnic residents who lived in Regent Park before renewal and were displaced, or who were displaced and returned. The SERGS stated that they provided culturally familiar items and offered a social credit scheme that recognized existing social relationships and allowed low-income residents to afford food and other amenities in a dignified manner and pay later, without penalty or interest. At the same time, the SERGS were unsupported during the renewal, were excluded from the civic planning processes, could not compete for space in the new buildings, and experienced declining sales and loss of business. The residents stated that the SERGS were trusted, provided valued cultural and social spaces for ethnic identity formation and ethnic food security, but they faced many uncertainties about the role of SERGS in a renewed neighborhood. Based on this study, it is recommended that ethnic retailers be recognized for the role they play in formulating ethnic identities and food security in mixed-use mixed-income communities and that they be included in planning processes during urban renewal. Such recognition may enable more former residents to return and lessen gentrification.

  13. Managing uncertainty in collaborative robotics engineering projects: The influence of task structure and peer interaction

    NASA Astrophysics Data System (ADS)

    Jordan, Michelle

    Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and proceeded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. 
Students experienced uncertainty from more different sources and used more and different types of uncertainty management strategies in the less structured task setting than in the more structured task setting. Peer interaction was influential because students relied on supportive social response to enact most of their uncertainty management strategies. When students could not garner socially supportive response from their peers, their options for managing uncertainty were greatly reduced.

  14. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets whose probability is readily computable, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets while a sum of squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the uncertainty model assumed (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  15. Uncertainty Analysis via Failure Domain Characterization: Unrestricted Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets whose probability is readily computable, they enable the calculation of arbitrarily tight upper and lower bounds to the failure probability. The methods developed herein, which are based on nonlinear constrained optimization, are applicable to requirement functions whose functional dependency on the uncertainty is arbitrary and whose explicit form may even be unknown. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation for changes in such a model with a practically insignificant amount of computational effort.

  16. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity.

    PubMed

    Li, Harbin; McNulty, Steven G

    2007-10-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BC(w); 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were BC(w) base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL.
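Relative parameter contributions like those reported here (62% weathering base rate, 20% soil depth, 11% soil temperature) can be estimated by Monte Carlo sampling: run the model with all parameters varying, then re-run with one parameter frozen at its mean; the drop in output variance approximates that parameter's contribution. A toy sketch with a hypothetical stand-in model and invented distributions (NOT the SMBE or values from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20_000

def cal(bcw_rate, depth, temp):
    # hypothetical stand-in for a simple mass balance, for illustration only
    return bcw_rate * depth * np.exp(0.05 * (temp - 8.0))

draws = {"bcw_rate": rng.lognormal(0.0, 0.4, N),   # assumed distributions,
         "depth":    rng.uniform(0.3, 1.2, N),     # not from the paper
         "temp":     rng.normal(8.0, 2.0, N)}

total_var = np.var(cal(**draws))
for name in draws:
    frozen = {k: (v.mean() if k == name else v) for k, v in draws.items()}
    share = 1.0 - np.var(cal(**frozen)) / total_var
    print(f"{name}: ~{100 * share:.0f}% of output variance")
```

For strongly interacting parameters this freeze-one-at-a-time estimate differs from formal variance-based indices, but it conveys the kind of ranking the abstract reports.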

  17. On different types of uncertainties in the context of the precautionary principle.

    PubMed

    Aven, Terje

    2011-10-01

    Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.

  18. The potential application of European market research data in dietary exposure modelling of food additives.

    PubMed

    Tennant, David Robin; Bruyninckx, Chris

    2018-03-01

    Consumer exposure assessments for food additives are incomplete without information about the proportions of foods in each authorised category that contain the additive. Such information has been difficult to obtain but the Mintel Global New Products Database (GNPD) provides information about product launches across Europe over the past 20 years. These data can be searched to identify products with specific additives listed on product labels and the numbers compared with total product launches for food and drink categories in the same database to determine the frequency of occurrence. There are uncertainties associated with the data but these can be managed by adopting a cautious and conservative approach. GNPD data can be mapped with authorised food categories and with food descriptions used in the EFSA Comprehensive European Food Consumption Surveys Database for exposure modelling. The data, when presented as percent occurrence, could be incorporated into the EFSA ANS Panel's 'brand-loyal/non-brand-loyal' exposure model in a quantitative way. Case studies of preservative, antioxidant, colour and sweetener additives showed that the impact of including occurrence data is greatest in the non-brand loyal scenario. Recommendations for future research include identifying occurrence data for alcoholic beverages, linking regulatory food codes, FoodEx and GNPD product descriptions, developing the use of occurrence data for carry-over foods and improving understanding of brand loyalty in consumer exposure models.

  19. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis under the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all-at-once. Instead, only one variable is first identified, and then, Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all 4 variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and then applied to the NASA Langley Uncertainty Quantification Challenge problem.
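First-order variance-based sensitivity indices of the kind used to rank the epistemic variables can be estimated with a pick-and-freeze (Saltelli-style) Monte Carlo scheme. A sketch on a toy additive function standing in for the system-level model (not the GTM):

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    # toy stand-in for the system-level performance model:
    # variance is dominated by the first input
    return x[:, 0] + 0.5 * x[:, 1] + 0.1 * x[:, 2]

N, d = 50_000, 3
A, B = rng.uniform(size=(N, d)), rng.uniform(size=(N, d))
fA, fB = f(A), f(B)
var_y = np.var(np.concatenate([fA, fB]))

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                            # resample only input i
    S.append(np.mean(fB * (f(ABi) - fA)) / var_y)  # first-order Sobol' estimator
print([f"S_{i+1} ~= {s:.2f}" for i, s in enumerate(S)])
```

Ranking the inputs by these indices and refining only the largest one, then repeating, mirrors the sequential refinement loop the paper proposes.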

  20. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    NASA Astrophysics Data System (ADS)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-precision hydrological simulation has refined spatial descriptions of hydrological behavior. Meanwhile, this trend is accompanied by increases in model complexity and in the number of parameters, which bring new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), which couples Monte Carlo sampling with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution show better convergence speed and optimality-searching performance. In light of these features, this study adopted the genetic algorithm, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets of high likelihood. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the advantages of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
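The GLUE procedure itself is straightforward to sketch; the sampling step (1) below is the part the abstract proposes replacing with heuristic optimizers such as GA, DE, and SCE. A minimal version with a toy recession curve standing in for a real hydrological model:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(params, t):
    a, b = params
    return a * np.exp(-b * t)   # toy recession curve, not a real hydrological model

t = np.linspace(0, 10, 50)
obs = model((2.0, 0.3), t) + rng.normal(0, 0.05, t.size)

# 1) prior sampling (plain uniform Monte Carlo here; GA/DE/SCE would go here)
samples = rng.uniform([0.5, 0.05], [4.0, 1.0], size=(5000, 2))

# 2) informal likelihood: Nash-Sutcliffe efficiency
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

L = np.array([nse(model(p, t), obs) for p in samples])

# 3) keep "behavioral" parameter sets above a likelihood threshold
behavioral = samples[L > 0.8]

# 4) prediction bounds from the behavioral ensemble (GLUE normally weights
#    these quantiles by likelihood; plain percentiles shown for brevity)
sims = np.array([model(p, t) for p in behavioral])
lo, hi = np.percentile(sims, [5, 95], axis=0)
```

The inefficiency the paper targets is visible here: most uniform samples fall below the behavioral threshold and are wasted, which is what likelihood-seeking optimizers are meant to avoid.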

  1. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  2. Deregulation, Distrust, and Democracy: State and Local Action to Ensure Equitable Access to Healthy, Sustainably Produced Food.

    PubMed

    Wiley, Lindsay F

    2015-01-01

    Environmental, public health, alternative food, and food justice advocates are working together to achieve incremental agricultural subsidy and nutrition assistance reforms that increase access to fresh fruits and vegetables. When it comes to targeting food and beverage products for increased regulation and decreased consumption, however, the priorities of various food reform movements diverge. This article argues that foundational legal issues, including preemption of state and local authority to protect the public's health and welfare, increasing First Amendment protection for commercial speech, and eroding judicial deference to legislative policy judgments, present a more promising avenue for collaboration across movements than discrete food reform priorities around issues like sugary drinks, genetic modification, or organics. Using the Vermont Genetically Modified Organism (GMO) Labeling Act litigation, the Kauai GMO Cultivation Ordinance litigation, the New York City Sugary Drinks Portion Rule litigation, and the Cleveland Trans Fat Ban litigation as case studies, I discuss the foundational legal challenges faced by diverse food reformers, even when their discrete reform priorities diverge. I also explore the broader implications of cooperation among groups that respond differently to the "irrationalities" (from the public health perspective) or "values" (from the environmental and alternative food perspective) that permeate public risk perception for democratic governance in the face of scientific uncertainty.

  3. Uncertainty of Polarized Parton Distributions

    NASA Astrophysics Data System (ADS)

    Hirai, M.; Goto, Y.; Horaguchi, T.; Kobayashi, H.; Kumano, S.; Miyama, M.; Saito, N.; Shibata, T.-A.

    Polarized parton distribution functions are determined by a χ2 analysis of polarized deep inelastic experimental data. In this paper, the uncertainty of the obtained distribution functions is investigated by a Hessian method. We find that the uncertainty of the polarized gluon distribution is fairly large. We then estimate the gluon uncertainty by including fake data generated from the prompt-photon process at RHIC, and observe that the uncertainty could be reduced with these data.

  4. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
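Combining independent uncertainty components "mathematically" in the sense used here typically means a root-sum-of-squares of the relative components. A sketch with hypothetical component values (not the figures from the study):

```python
import math

# hypothetical relative uncertainty components (%) for a plate-count method
components = {"microorganism type": 20.0,
              "product matrix": 18.0,
              "reading/interpretation": 15.0,
              "repeatability": 10.0}

# combined relative uncertainty of independent components: quadrature sum
u_c = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative uncertainty ~= {u_c:.1f}%")
```

With these made-up inputs the combined value stays below the 35% ceiling the abstract cites for traditional plate-count methods; correlated components would require covariance terms as well.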

  5. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis version 6.0 theory manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the Dakota software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of Dakota-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of Dakota's iterative analysis capabilities.

  6. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue over the last few decades. New technologies and newly available data have helped many scientists understand where and why earthquakes happen, the physics of earthquakes, etc., and the role of uncertainty in seismic hazard analysis has begun to be appreciated. However, how to handle existing uncertainty remains a significant problem; the same lack of information makes it difficult to quantify uncertainty accurately. Attenuation curves are usually obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlapping occurs not only at the border between two neighboring classes, but also among more than three classes. Although the analysis starts by classifying sites in geological terms, the resulting site coefficients are not clearly separated by class. In the present study, this problem is addressed using fuzzy set theory: with membership functions, the ambiguities at the border between neighboring classes can be avoided. The approach is applied to southern California and compared with the conventional method. Standard deviations that show the variation within each site class, obtained by fuzzy set theory and by the classical approach, are compared. The results show that when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
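
    The membership-function idea above can be sketched briefly. This is a minimal illustration only; the triangular site classes and shear-wave-velocity bounds below are hypothetical, not the study's actual classification:

    ```python
    import numpy as np

    def triangular_membership(x, a, b, c):
        """Degree of membership of x in a triangular fuzzy set with support [a, c] and peak b."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    # Hypothetical site classes defined on average shear-wave velocity (m/s):
    # overlapping triangular sets avoid the hard boundaries of crisp classes.
    site_classes = {
        "soft soil":  (100.0, 200.0, 360.0),
        "stiff soil": (200.0, 360.0, 760.0),
        "rock":       (360.0, 760.0, 1500.0),
    }

    vs30 = 300.0  # a site near the soft/stiff boundary
    memberships = {name: float(triangular_membership(vs30, *abc))
                   for name, abc in site_classes.items()}
    ```

    A site near a class boundary receives partial membership in both neighboring classes instead of an all-or-nothing label, which is exactly what removes the artificial discontinuity of crisp classification.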

  7. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. 
(Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)
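
    The baseline time-series algorithm and damage indicator described above can be illustrated with a minimal sketch. The AR(1) surrogate model, the residual-RMS indicator, and the damage scenario are assumptions for illustration only, not the paper's actual algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def ar1(n, phi_true, scale, rng):
        """Generate a simple AR(1) vibration-response surrogate."""
        x = np.zeros(n)
        for t in range(1, n):
            x[t] = phi_true * x[t - 1] + scale * rng.standard_normal()
        return x

    # Train on "baseline" (pristine-condition) data.
    baseline = ar1(600, 0.8, 1.0, rng)
    phi = np.polyfit(baseline[:-1], baseline[1:], 1)[0]  # fitted AR(1) coefficient

    def damage_indicator(signal):
        """RMS of one-step-ahead prediction residuals under the baseline model."""
        residuals = signal[1:] - phi * signal[:-1]
        return np.sqrt(np.mean(residuals**2))

    healthy = ar1(600, 0.8, 1.0, rng)
    damaged = ar1(600, 0.5, 1.6, rng)   # hypothetical damage alters the dynamics
    di_healthy = damage_indicator(healthy)
    di_damaged = damage_indicator(damaged)
    ```

    Monitoring data that still follow the baseline dynamics yield a small indicator; changed dynamics inflate the prediction residuals. The robustness analysis in the paper then asks how much uncertainty this indicator can tolerate before the diagnosis flips.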

  8. The IAEA coordinated research programme on HTGR uncertainty analysis: Phase I status and Ex. I-1 prismatic reference results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik

    The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted, but has in many cases become the preferred way to replace traditional conservative analysis for safety and licensing purposes. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach where sensitivity analyses were performed and uncertainties then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double-heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of the SCALE results, the KENO-VI 238-group Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values. The use of the latest ENDF/B-VII.1 cross-section library in Serpent led to ~180 pcm lower k∞ values compared to the older ENDF/B-VII.0 dataset, caused by the modified graphite neutron capture cross section. Furthermore, the fourth beta release of SCALE 6.2 likewise produced lower CE k∞ values when compared to SCALE 6.1, and the improved performance of the new 252-group library available in SCALE 6.2 is especially noteworthy. A SCALE/TSUNAMI uncertainty analysis of the Hot Full Power variant of Ex. I-1a furthermore concluded that the 238U(n,γ) (capture) and 235U cross-section covariance matrices contributed the most to the total k∞ uncertainty of 0.58%.

  9. The IAEA coordinated research programme on HTGR uncertainty analysis: Phase I status and Ex. I-1 prismatic reference results

    DOE PAGES

    Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik; ...

    2016-01-11

    The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted, but has in many cases become the preferred way to replace traditional conservative analysis for safety and licensing purposes. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach where sensitivity analyses were performed and uncertainties then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double-heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA has therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of the SCALE results, the KENO-VI 238-group Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values. The use of the latest ENDF/B-VII.1 cross-section library in Serpent led to ~180 pcm lower k∞ values compared to the older ENDF/B-VII.0 dataset, caused by the modified graphite neutron capture cross section. Furthermore, the fourth beta release of SCALE 6.2 likewise produced lower CE k∞ values when compared to SCALE 6.1, and the improved performance of the new 252-group library available in SCALE 6.2 is especially noteworthy. A SCALE/TSUNAMI uncertainty analysis of the Hot Full Power variant of Ex. I-1a furthermore concluded that the 238U(n,γ) (capture) and 235U cross-section covariance matrices contributed the most to the total k∞ uncertainty of 0.58%.

  10. COMPUTATIONAL METHODS FOR SENSITIVITY AND UNCERTAINTY ANALYSIS FOR ENVIRONMENTAL AND BIOLOGICAL MODELS

    EPA Science Inventory

    This work introduces a computationally efficient alternative method for uncertainty propagation, the Stochastic Response Surface Method (SRSM). The SRSM approximates uncertainties in model outputs through a series expansion in normal random variables (polynomial chaos expansion)...
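
    A polynomial chaos expansion of the kind SRSM uses can be sketched in a single standard normal variable. The model function, truncation order, and sample size below are illustrative assumptions, not the EPA method's actual configuration:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical model: output depends nonlinearly on one uncertain input.
    def model(x):
        return np.exp(0.3 * x)

    # Probabilists' Hermite polynomials He_0..He_3 of a standard normal xi.
    def hermite_design(xi):
        return np.column_stack([np.ones_like(xi), xi, xi**2 - 1, xi**3 - 3 * xi])

    xi = rng.standard_normal(2000)       # samples of the standardized input
    y = model(xi)                        # model evaluations
    coeffs, *_ = np.linalg.lstsq(hermite_design(xi), y, rcond=None)

    # Output statistics follow directly from the PCE coefficients:
    # E[y] = c0 and Var[y] = sum_k c_k^2 * k!  (orthogonality of He_k).
    pce_mean = coeffs[0]
    pce_var = coeffs[1]**2 * 1 + coeffs[2]**2 * 2 + coeffs[3]**2 * 6
    ```

    Once the coefficients are known, moments and full output distributions come essentially for free, which is the source of the method's efficiency relative to brute-force sampling of the original model.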

  11. Ensemble-based uncertainty quantification for coordination and control of thermostatically controlled loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lian, Jianming; Engel, Dave

    2017-07-27

    This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.

  12. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k_eff, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. However, with TSUNAMI, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for the gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
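
    The propagation of cross-section covariances through sensitivity coefficients described above is, at its core, the "sandwich rule" var(R) = S C Sᵀ. A minimal numerical sketch with hypothetical sensitivities and uncertainties (not actual SCALE/TSUNAMI data):

    ```python
    import numpy as np

    # Relative sensitivity coefficients of a response (e.g. k-eff) to two
    # nuclear-data parameters; values are hypothetical.
    S = np.array([[-0.35, 0.90]])

    # Relative covariance matrix of the nuclear data (here uncorrelated):
    C = np.array([[0.010**2, 0.0],      # 1.0% (1-sigma) on parameter 1
                  [0.0, 0.005**2]])     # 0.5% (1-sigma) on parameter 2

    # Sandwich rule: propagate data covariance to response variance.
    var_R = S @ C @ S.T
    rel_unc = float(np.sqrt(var_R[0, 0]))   # relative standard deviation of R
    ```

    With many nuclide-reaction pairs, S becomes a long row vector and C a large (correlated) covariance matrix, but the propagation step is unchanged.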

  13. Wave-optics uncertainty propagation and regression-based bias model in GNSS radio occultation bending angle retrievals

    NASA Astrophysics Data System (ADS)

    Gorbunov, Michael E.; Kirchengast, Gottfried

    2018-01-01

    A new reference occultation processing system (rOPS) will include a Global Navigation Satellite System (GNSS) radio occultation (RO) retrieval chain with integrated uncertainty propagation. In this paper, we focus on wave-optics bending angle (BA) retrieval in the lower troposphere and introduce (1) an empirically estimated boundary layer bias (BLB) model then employed to reduce the systematic uncertainty of excess phases and bending angles in about the lowest 2 km of the troposphere and (2) the estimation of (residual) systematic uncertainties and their propagation together with random uncertainties from excess phase to bending angle profiles. Our BLB model describes the estimated bias of the excess phase transferred from the estimated bias of the bending angle, for which the model is built, informed by analyzing refractivity fluctuation statistics shown to induce such biases. The model is derived from regression analysis using a large ensemble of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) RO observations and concurrent European Centre for Medium-Range Weather Forecasts (ECMWF) analysis fields. It is formulated in terms of predictors and adaptive functions (powers and cross products of predictors), where we use six main predictors derived from observations: impact altitude, latitude, bending angle and its standard deviation, canonical transform (CT) amplitude, and its fluctuation index. Based on an ensemble of test days, independent of the days of data used for the regression analysis to establish the BLB model, we find the model very effective for bias reduction and capable of reducing bending angle and corresponding refractivity biases by about a factor of 5. The estimated residual systematic uncertainty, after the BLB profile subtraction, is lower bounded by the uncertainty from the (indirect) use of ECMWF analysis fields but is significantly lower than the systematic uncertainty without BLB correction. 
The systematic and random uncertainties are propagated from excess phase to bending angle profiles, using a perturbation approach and the wave-optical method recently introduced by Gorbunov and Kirchengast (2015), starting with estimated excess phase uncertainties. The results are encouraging and this uncertainty propagation approach combined with BLB correction enables a robust reduction and quantification of the uncertainties of excess phases and bending angles in the lower troposphere.
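
    The regression-based bias-model idea can be sketched generically: learn the systematic bias as a function of observable predictors, then subtract the predicted bias from new profiles. The predictors, weights, and noise level below are hypothetical stand-ins, not the rOPS BLB model:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Hypothetical predictors (e.g. impact altitude, latitude, CT amplitude),
    # and a synthetic bias that is a noisy linear function of them.
    n = 1000
    predictors = rng.random((n, 3))
    true_w = np.array([0.5, -0.2, 0.1])
    bias = predictors @ true_w + 0.02 * rng.standard_normal(n)

    # Fit the bias model by least squares (intercept + predictors).
    X = np.column_stack([np.ones(n), predictors])
    w_hat, *_ = np.linalg.lstsq(X, bias, rcond=None)

    # Apply to independent data: subtract the predicted bias.
    new_obs = rng.random((200, 3))
    predicted_bias = np.column_stack([np.ones(200), new_obs]) @ w_hat
    residual_bias = (new_obs @ true_w) - predicted_bias

    # Factor by which the mean absolute bias is reduced after correction.
    reduction = np.abs(new_obs @ true_w).mean() / (np.abs(residual_bias).mean() + 1e-12)
    ```

    The paper's model additionally uses adaptive functions (powers and cross products of the predictors) rather than a purely linear form, but the fit-then-subtract structure is the same.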

  14. Responses to clinical uncertainty in Australian general practice trainees: a cross-sectional analysis.

    PubMed

    Cooke, Georga; Tapley, Amanda; Holliday, Elizabeth; Morgan, Simon; Henderson, Kim; Ball, Jean; van Driel, Mieke; Spike, Neil; Kerr, Rohan; Magin, Parker

    2017-12-01

    Tolerance for ambiguity is essential for optimal learning and professional competence. General practice trainees must be, or must learn to be, adept at managing clinical uncertainty. However, few studies have examined associations of intolerance of uncertainty in this group. The aim of this study was to establish levels of tolerance of uncertainty in Australian general practice trainees and associations of uncertainty with demographic, educational and training practice factors. A cross-sectional analysis was performed on the Registrar Clinical Encounters in Training (ReCEnT) project, an ongoing multi-site cohort study. Scores on three of the four independent subscales of the Physicians' Reaction to Uncertainty (PRU) instrument were analysed as outcome variables in linear regression models with trainee and practice factors as independent variables. A total of 594 trainees contributed data on a total of 1209 occasions. Trainees in earlier training terms had higher scores for 'Anxiety due to uncertainty', 'Concern about bad outcomes' and 'Reluctance to disclose diagnosis/treatment uncertainty to patients'. Beyond this, findings suggest two distinct sets of associations regarding reaction to uncertainty. Firstly, affective aspects of uncertainty (the 'Anxiety' and 'Concern' subscales) were associated with female gender, less experience in hospital prior to commencing general practice training, and graduation overseas. Secondly, a maladaptive response to uncertainty (the 'Reluctance to disclose' subscale) was associated with urban practice, health qualifications prior to studying medicine, practice in an area of higher socio-economic status, and being Australian-trained. This study has established levels of three measures of trainees' responses to uncertainty and associations with these responses. The current findings suggest differing 'phenotypes' of trainees with high 'affective' responses to uncertainty and those reluctant to disclose uncertainty to patients. 
More research is needed to examine the relationship between clinical uncertainty and clinical outcomes, temporal changes in tolerance for uncertainty, and strategies that might assist physicians in developing adaptive responses to clinical uncertainty. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  15. Accounting for parameter uncertainty in the definition of parametric distributions used to describe individual patient variation in health economic models.

    PubMed

    Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik

    2017-12-15

    Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e. n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e. n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
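
    The non-parametric bootstrap approach (the paper's first, recommended approach) can be sketched on a simple case. The exponential time-to-event model, sample sizes, and rate are illustrative assumptions, not the study's case data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical patient-level time-to-event data (exponential for simplicity).
    true_rate = 0.1
    times = rng.exponential(1.0 / true_rate, size=200)

    # Non-parametric bootstrap: refit the distribution on resampled data, so the
    # fitted parameter itself carries sample-size-dependent uncertainty.
    boot_rates = []
    for _ in range(500):
        resample = rng.choice(times, size=times.size, replace=True)
        boot_rates.append(1.0 / resample.mean())   # MLE of the exponential rate
    boot_rates = np.array(boot_rates)

    rate_point = 1.0 / times.mean()                     # point estimate
    rate_ci = np.percentile(boot_rates, [2.5, 97.5])    # parameter uncertainty
    ```

    In a probabilistic sensitivity analysis, each model run would draw one bootstrap replicate of the fitted parameters (preserving their correlation, for multi-parameter distributions) before sampling individual patient outcomes from the resulting distribution.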

  16. Modelling uncertainties and possible future trends of precipitation and temperature for 10 sub-basins in Columbia River Basin (CRB)

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, A.; Rana, A.; Qin, Y.; Moradkhani, H.

    2014-12-01

    Trends and changes in future climatic parameters, such as precipitation and temperature, have been a central part of climate change studies. In the present work, we have analyzed the seasonal and yearly trends, and the uncertainties of prediction, in all 10 sub-basins of the Columbia River Basin (CRB) for the future period 2010-2099. The work is carried out using two different sets of statistically downscaled Global Climate Model (GCM) projection datasets: Bias Correction and Statistical Downscaling (BCSD), generated at Portland State University, and the Multivariate Adaptive Constructed Analogs (MACA), generated at the University of Idaho. The analysis is done with 10 GCM downscaled products each from the CMIP5 daily dataset, totaling 40 different downscaled products for robust analysis. Summer, winter, and yearly trend analyses are performed for all 10 sub-basins using linear regression (significance tested by Student's t test) and the Mann-Kendall test (0.05 significance level) for precipitation (P), maximum temperature (Tmax), and minimum temperature (Tmin). Thereafter, the uncertainty of all parameters is modelled, across all models, in all 10 sub-basins and across the CRB for the future scenario periods. The results indicate trends of varying degree across the sub-basins, mostly pointing towards a significant increase in all three climatic parameters for all seasonal and yearly considerations. The uncertainty analysis reveals very high variation in all parameters across models and sub-basins. A basin-wide uncertainty analysis is performed to corroborate the results from the smaller, sub-basin scale; similar trends and uncertainties are reported on the larger scale as well. Interestingly, both trends and uncertainties are higher during winter than during summer, contributing a large part of the yearly change.
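
    The Mann-Kendall trend test used above follows a standard recipe: count concordant minus discordant pairs, then apply a normal approximation. A minimal sketch (no tie correction; the synthetic temperature series is hypothetical):

    ```python
    import numpy as np

    def mann_kendall(x):
        """Mann-Kendall trend test: returns the S statistic and Z score
        (normal approximation, no tie correction)."""
        x = np.asarray(x, dtype=float)
        n = x.size
        s = 0.0
        for i in range(n - 1):
            s += np.sign(x[i + 1:] - x[i]).sum()   # concordant minus discordant pairs
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        return s, z

    # Hypothetical yearly Tmax anomalies with an upward trend plus noise.
    rng = np.random.default_rng(2)
    years = np.arange(30)
    tmax = 0.05 * years + rng.normal(0, 0.2, size=30)
    s_stat, z_score = mann_kendall(tmax)
    trend_significant = abs(z_score) > 1.96   # two-sided test at the 5% level
    ```

    Because the test is rank-based, it is robust to non-normal data and outliers, which is why it is a common companion to linear regression in hydroclimatic trend studies.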

  17. puma: a Bioconductor package for propagating uncertainty in microarray analysis.

    PubMed

    Pearson, Richard D; Liu, Xuejun; Sanguinetti, Guido; Milo, Marta; Lawrence, Neil D; Rattray, Magnus

    2009-07-09

    Most analyses of microarray data are based on point estimates of expression levels and ignore the uncertainty of such estimates. By determining uncertainties from Affymetrix GeneChip data and propagating these uncertainties to downstream analyses it has been shown that we can improve results of differential expression detection, principal component analysis and clustering. Previously, implementations of these uncertainty propagation methods have only been available as separate packages, written in different languages. Previous implementations have also suffered from being very costly to compute, and in the case of differential expression detection, have been limited in the experimental designs to which they can be applied. puma is a Bioconductor package incorporating a suite of analysis methods for use on Affymetrix GeneChip data. puma extends the differential expression detection methods of previous work from the 2-class case to the multi-factorial case. puma can be used to automatically create design and contrast matrices for typical experimental designs, which can be used both within the package itself but also in other Bioconductor packages. The implementation of differential expression detection methods has been parallelised leading to significant decreases in processing time on a range of computer architectures. puma incorporates the first R implementation of an uncertainty propagation version of principal component analysis, and an implementation of a clustering method based on uncertainty propagation. All of these techniques are brought together in a single, easy-to-use package with clear, task-based documentation. For the first time, the puma package makes a suite of uncertainty propagation methods available to a general audience. These methods can be used to improve results from more traditional analyses of microarray data. puma also offers improvements in terms of scope and speed of execution over previously available methods. 
puma is recommended for anyone working with the Affymetrix GeneChip platform for gene expression analysis and can also be applied more generally.

  18. Efficient Data-Worth Analysis Using a Multilevel Monte Carlo Method Applied in Oil Reservoir Simulations

    NASA Astrophysics Data System (ADS)

    Lu, D.; Ricciuto, D. M.; Evans, K. J.

    2017-12-01

    Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, and sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As the data-worth analysis involves a great many expectation estimations, the cost savings from MLMC can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data, and are consistent with the estimation obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty reduction estimation, with savings of up to 600 days of computing time when one processor is used.
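
    The multilevel Monte Carlo idea — many cheap low-fidelity samples plus sparsely sampled level-to-level corrections — can be sketched on a toy expectation. The model hierarchy below (truncated Taylor approximations of exp) is a hypothetical stand-in for a reservoir simulator at increasing resolution:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    factorials = [1, 1, 2, 6, 24, 120]

    def f_level(x, level):
        """Hypothetical hierarchy: order-(level+1) Taylor approximation of exp(x).
        Higher levels are more accurate (and, in a real solver, more expensive)."""
        return sum(x**k / factorials[k] for k in range(level + 2))

    # Estimate E[exp(X)], X ~ Uniform(0, 1), via the telescoping MLMC sum:
    # E[f_L] = E[f_0] + sum_l E[f_l - f_{l-1}], with fewer samples per level
    # because the corrections have rapidly decaying variance.
    levels, n_samples = [0, 1, 2, 3], [40000, 4000, 400, 100]
    estimate = 0.0
    for lvl, n in zip(levels, n_samples):
        x = rng.random(n)
        if lvl == 0:
            estimate += f_level(x, 0).mean()
        else:
            # Coupled correction: the SAME samples evaluated on both levels.
            estimate += (f_level(x, lvl) - f_level(x, lvl - 1)).mean()

    exact = np.e - 1.0   # E[exp(X)] for X ~ Uniform(0, 1)
    ```

    The coupling of samples across adjacent levels is what makes the correction terms low-variance; in the data-worth setting, each expectation in the Bayesian analysis is estimated this way.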

  19. Optimization and uncertainty assessment of strongly nonlinear groundwater models with high parameter dimensionality

    NASA Astrophysics Data System (ADS)

    Keating, Elizabeth H.; Doherty, John; Vrugt, Jasper A.; Kang, Qinjun

    2010-10-01

    Highly parameterized and CPU-intensive groundwater models are increasingly being used to understand and predict flow and transport through aquifers. Despite their frequent use, these models pose significant challenges for parameter estimation and predictive uncertainty analysis algorithms, particularly global methods which usually require very large numbers of forward runs. Here we present a general methodology for parameter estimation and uncertainty analysis that can be utilized in these situations. Our proposed method includes extraction of a surrogate model that mimics key characteristics of a full process model, followed by testing and implementation of a pragmatic uncertainty analysis technique, called null-space Monte Carlo (NSMC), that merges the strengths of gradient-based search and parameter dimensionality reduction. As part of the surrogate model analysis, the results of NSMC are compared with a formal Bayesian approach using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm. Such a comparison has never been accomplished before, especially in the context of high parameter dimensionality. Despite the highly nonlinear nature of the inverse problem, the existence of multiple local minima, and the relatively large parameter dimensionality, both methods performed well and results compare favorably with each other. Experiences gained from the surrogate model analysis are then transferred to calibrate the full highly parameterized and CPU intensive groundwater model and to explore predictive uncertainty of predictions made by that model. The methodology presented here is generally applicable to any highly parameterized and CPU-intensive environmental model, where efficient methods such as NSMC provide the only practical means for conducting predictive uncertainty analysis.
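
    The null-space Monte Carlo idea — perturbing calibrated parameters only along directions the observations cannot see — can be sketched on a linear toy problem. The Jacobian, calibrated parameters, and prediction vector below are hypothetical, not a groundwater model:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Under-determined linear inverse problem: 2 observations, 4 parameters.
    J = np.array([[1.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 1.0, 1.0]])
    p_cal = np.array([0.5, 0.5, 1.0, 1.0])   # a calibrated parameter set

    # Orthonormal basis for the null space of J via SVD.
    _, s, vt = np.linalg.svd(J)
    null_basis = vt[len(s):]                  # rows spanning the null space

    # Sample parameter sets that fit the data (almost) equally well.
    samples = []
    for _ in range(200):
        xi = rng.normal(0, 0.2, size=null_basis.shape[0])
        samples.append(p_cal + null_basis.T @ xi)
    samples = np.array(samples)

    # Simulated observations are preserved for every sample ...
    max_obs_change = np.abs(samples @ J.T - p_cal @ J.T).max()
    # ... while a prediction with a different sensitivity pattern still varies.
    prediction = samples @ np.array([1.0, -1.0, 0.0, 0.0])
    ```

    In the full NSMC workflow each perturbed set also gets a short gradient-based re-calibration polish, since for a nonlinear model the null space is only a local approximation.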

  20. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in multi-criteria decision making are often imprecise and changeable. Therefore, it is important to carry out a sensitivity analysis for the multi-criteria decision making problem. The paper presents a sensitivity analysis for several ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainty are considered for the sensitivity analysis. The first is related to the input data, while the second concerns the Decision Maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance, and the trade-off ranking methods. TOPSIS and the relative distance method measure the distance from an alternative to the ideal and anti-ideal solutions. In turn, the trade-off ranking calculates the distance of an alternative to the extreme solutions and to the other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainty.
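
    The TOPSIS procedure referred to above follows a standard recipe: normalize, weight, locate the ideal and anti-ideal points, and rank by relative closeness. A minimal sketch with a hypothetical decision matrix and weights:

    ```python
    import numpy as np

    # Hypothetical decision matrix: rows = alternatives, columns = criteria
    # (all benefit-type here, so larger is better).
    X = np.array([[7.0, 9.0, 9.0],
                  [8.0, 7.0, 8.0],
                  [9.0, 6.0, 8.0],
                  [6.0, 7.0, 8.0]])
    w = np.array([0.5, 0.3, 0.2])             # decision-maker weights

    V = w * X / np.linalg.norm(X, axis=0)     # weighted, vector-normalized matrix
    ideal, anti_ideal = V.max(axis=0), V.min(axis=0)
    d_plus = np.linalg.norm(V - ideal, axis=1)        # distance to ideal
    d_minus = np.linalg.norm(V - anti_ideal, axis=1)  # distance to anti-ideal
    closeness = d_minus / (d_plus + d_minus)          # relative closeness in [0, 1]
    ranking = np.argsort(-closeness)                  # best alternative first
    ```

    A sensitivity test of the kind the paper performs would perturb `X` or `w` slightly and check whether `ranking` changes order.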

  1. Uncertainty Analysis of the Grazing Flow Impedance Tube

    NASA Technical Reports Server (NTRS)

    Brown, Martha C.; Jones, Michael G.; Watson, Willie R.

    2012-01-01

    This paper outlines a methodology to identify the measurement uncertainty of NASA Langley's Grazing Flow Impedance Tube (GFIT) over its operating range, and to identify the parameters that most significantly contribute to the acoustic impedance prediction. Two acoustic liners are used for this study. The first is a single-layer, perforate-over-honeycomb liner that is nonlinear with respect to sound pressure level. The second consists of a wire-mesh facesheet and a honeycomb core, and is linear with respect to sound pressure level. These liners allow for evaluation of the effects of measurement uncertainty on impedances educed with linear and nonlinear liners. In general, the measurement uncertainty is observed to be larger for the nonlinear liner, with the largest uncertainty occurring near anti-resonance. A sensitivity analysis of the aerodynamic parameters (Mach number, static temperature, and static pressure) used in the impedance eduction process is also conducted using a Monte-Carlo approach. This sensitivity analysis demonstrates that the impedance eduction process is virtually insensitive to each of these parameters.
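
    A Monte-Carlo sensitivity study of the kind described — perturbing Mach number, static temperature, and static pressure one at a time within their measurement uncertainties — can be sketched generically. The response function and uncertainty levels below are stand-ins, not the GFIT eduction model:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def response(mach, temp_k, press_pa):
        # Hypothetical stand-in for an impedance-eduction computation:
        # air density with a weak Mach-dependent correction.
        return press_pa / (temp_k * 287.0) * (1.0 + 0.2 * mach**2)

    nominal = dict(mach=0.3, temp_k=293.0, press_pa=101325.0)
    rel_unc = dict(mach=0.01, temp_k=0.003, press_pa=0.001)  # assumed 1-sigma levels

    # One-at-a-time Monte Carlo: perturb a single input, hold the rest nominal,
    # and record the relative spread of the output.
    spreads = {}
    for name in nominal:
        inputs = dict(nominal)
        inputs[name] = nominal[name] * (1 + rel_unc[name] * rng.standard_normal(5000))
        out = response(**inputs)
        spreads[name] = out.std() / out.mean()
    ```

    Comparing the entries of `spreads` against the output's total uncertainty shows which inputs matter; a process is "virtually insensitive" to a parameter when its spread is negligible at realistic measurement accuracy.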

  2. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    NASA Astrophysics Data System (ADS)

    Hartini, Entin; Andiwijayakusuma, Dinan

    2014-09-01

    This research concerns the development of a code for uncertainty analysis based on a statistical approach to assessing uncertain input parameters. In the fuel burn-up calculation, uncertainty analysis was performed for the input parameters fuel density, coolant density, and fuel temperature. The calculation is performed over the irradiation period using the Monte Carlo N-Particle transport code. The uncertainty method is based on probability density functions. The code was developed as a Python script coupled with MCNPX for criticality and burn-up calculations. The simulation models the geometry of a PWR core in MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section library ENDF/B-VI. MCNPX requires nuclear data in ACE format, so interfaces were developed to obtain ACE-format nuclear data from ENDF through dedicated NJOY calculations covering temperature changes over a certain range.
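The statistical input-sampling step described above can be sketched as follows. Parameter names, nominal values, and uncertainty magnitudes are illustrative placeholders, not the study's values; in the actual workflow each sampled set would be written into an MCNPX input deck before a criticality/burn-up run:

```python
import random
import statistics as st

def sample_inputs(nominal, rel_sd, n, seed=0):
    """Draw n perturbed input-parameter sets; each parameter is sampled from a
    normal probability density centred on its nominal value."""
    rng = random.Random(seed)
    return [{k: rng.gauss(v, rel_sd[k] * v) for k, v in nominal.items()}
            for _ in range(n)]

# illustrative nominal values and relative standard deviations (not the study's)
nominal = {"fuel_density": 10.4, "coolant_density": 0.72, "fuel_temp": 900.0}
rel_sd = {"fuel_density": 0.01, "coolant_density": 0.02, "fuel_temp": 0.03}
decks = sample_inputs(nominal, rel_sd, n=100)
```

Each dict in `decks` stands in for one perturbed input deck; running the transport code once per deck yields the output distribution from which uncertainty statistics are taken.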

  3. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves the description of the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
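MvCAT infers copula parameters with Bayesian MCMC; as a much simpler illustration of copula-based dependence estimation, the sketch below draws pairs from a Clayton copula by conditional inversion and recovers its parameter by inverting Kendall's tau (for Clayton, tau = theta/(theta + 2)). This is a generic textbook technique, not MvCAT code:

```python
import math
import random

def clayton_sample(theta, n, seed=1):
    """Draw n pairs from a Clayton copula by conditional inversion."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        u, t = rng.random(), rng.random()
        v = ((t ** (-theta / (1.0 + theta)) - 1.0) * u ** (-theta) + 1.0) ** (-1.0 / theta)
        pairs.append((u, v))
    return pairs

def kendall_tau(pairs):
    """O(n^2) sample estimate of Kendall's rank correlation."""
    n, s = len(pairs), 0
    for i in range(n):
        for j in range(i + 1, n):
            prod = (pairs[i][0] - pairs[j][0]) * (pairs[i][1] - pairs[j][1])
            s += 1 if prod > 0 else -1
    return 2.0 * s / (n * (n - 1))

pairs = clayton_sample(theta=2.0, n=500)
tau = kendall_tau(pairs)
theta_hat = 2.0 * tau / (1.0 - tau)  # invert tau = theta / (theta + 2)
```

A Bayesian treatment such as MvCAT's replaces this moment-matching point estimate with a full posterior over theta, which is what delivers the predictive uncertainties discussed above.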

  4. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartini, Entin, E-mail: entin@batan.go.id; Andiwijayakusuma, Dinan, E-mail: entin@batan.go.id

    2014-09-30

    This research concerns the development of a code for uncertainty analysis based on a statistical approach to assessing uncertain input parameters. In the fuel burn-up calculation, uncertainty analysis was performed for the input parameters fuel density, coolant density, and fuel temperature. The calculation is performed over the irradiation period using the Monte Carlo N-Particle transport code. The uncertainty method is based on probability density functions. The code was developed as a Python script coupled with MCNPX for criticality and burn-up calculations. The simulation models the geometry of a PWR core in MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section library ENDF/B-VI. MCNPX requires nuclear data in ACE format, so interfaces were developed to obtain ACE-format nuclear data from ENDF through dedicated NJOY calculations covering temperature changes over a certain range.

  5. Approach for validating actinide and fission product compositions for burnup credit criticality safety analyses

    DOE PAGES

    Radulescu, Georgeta; Gauld, Ian C.; Ilas, Germina; ...

    2014-11-01

    This paper describes a depletion code validation approach for criticality safety analysis using burnup credit for actinide and fission product nuclides in spent nuclear fuel (SNF) compositions. The technical basis for determining the uncertainties in the calculated nuclide concentrations is comparison of calculations to available measurements obtained from destructive radiochemical assay of SNF samples. Probability distributions developed for the uncertainties in the calculated nuclide concentrations were applied to the SNF compositions of a criticality safety analysis model by the use of a Monte Carlo uncertainty sampling method to determine bias and bias uncertainty in effective neutron multiplication factor. Application of the Monte Carlo uncertainty sampling approach is demonstrated for representative criticality safety analysis models of pressurized water reactor spent fuel pool storage racks and transportation packages using burnup-dependent nuclide concentrations calculated with SCALE 6.1 and the ENDF/B-VII nuclear data. Furthermore, the validation approach and results support a recent revision of the U.S. Nuclear Regulatory Commission Interim Staff Guidance 8.
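The Monte Carlo uncertainty sampling idea, reduced to a toy first-order surrogate: draw a correction factor per nuclide from its assay-based distribution, propagate through a linear k-eff model, and read off bias and bias uncertainty. The sensitivities, uncertainties, and nominal multiplication factor below are invented for illustration; the real analysis re-evaluates a criticality code such as SCALE for each sample rather than a linear surrogate:

```python
import random
import statistics as st

# hypothetical first-order sensitivities of k-eff to relative changes in each
# nuclide concentration, and relative 1-sigma assay-based uncertainties
sens = {"U235": 0.05, "Pu239": 0.03, "Sm149": -0.01}
rel_sd = {"U235": 0.02, "Pu239": 0.02, "Sm149": 0.02}
K_NOMINAL = 0.94  # invented nominal multiplication factor

def sample_keff(rng):
    # draw a correction factor per nuclide and apply the linear surrogate
    return K_NOMINAL + sum(s * (rng.gauss(1.0, rel_sd[n]) - 1.0)
                           for n, s in sens.items())

rng = random.Random(11)
ks = [sample_keff(rng) for _ in range(1000)]
bias = st.mean(ks) - K_NOMINAL  # systematic shift from nuclide uncertainties
bias_unc = st.stdev(ks)         # spread of the sampled k-eff: bias uncertainty
```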

  6. New methodology for estimating biofuel consumption for cooking: Atmospheric emissions of black carbon and sulfur dioxide from India

    NASA Astrophysics Data System (ADS)

    Habib, Gazala; Venkataraman, Chandra; Shrivastava, Manish; Banerjee, Rangan; Stehr, J. W.; Dickerson, Russell R.

    2004-09-01

    The dominance of biofuel combustion emissions in the Indian region, and the inherently large uncertainty in biofuel use estimates based on cooking energy surveys, prompted the current work, which develops a new methodology for estimating biofuel consumption for cooking. This is based on food consumption statistics and the specific energy for food cooking. Estimated biofuel consumption in India was 379 (247-584) Tg yr-1. New information on the user population of different biofuels was compiled at a state level to derive the biofuel mix, which varied regionally and was 74:16:10%, respectively, of fuelwood, dung cake and crop waste at a national level. Importantly, the uncertainty in biofuel use from quantitative error assessment using the new methodology is around 50%, giving a narrower bound than in previous works. From this new activity data and currently used black carbon emission factors, the black carbon (BC) emissions from biofuel combustion were estimated as 220 (65-760) Gg yr-1. The largest BC emissions were from fuelwood (75%), with lower contributions from dung cake (16%) and crop waste (9%). The uncertainty of 245% in the BC emissions estimate is now governed by the large spread in BC emission factors from biofuel combustion (122%), implying the need for reducing this uncertainty through measurements. Emission factors of SO2 from combustion of biofuels widely used in India were measured, and ranged from 0.03 to 0.08 g kg-1 for two wood species, from 0.05 to 0.20 g kg-1 for 10 crop waste types, and 0.88 g kg-1 for dung cake, significantly lower than currently used emission factors for wood and crop waste. Estimated SO2 emissions from biofuels of 75 (36-160) Gg yr-1 were about a factor of 3 lower than in recent studies, with a large contribution from dung cake (73%), followed by fuelwood (21%) and crop waste (6%).
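The core of the methodology, estimating fuel use from food consumption statistics and the specific energy of cooking, reduces to a simple energy balance per unit of food. The numbers below are illustrative placeholders, not the paper's survey values:

```python
def fuel_for_cooking(food_kg, specific_energy, fuel_lhv, stove_eff):
    """kg of fuel burned to deliver the cooking energy of a given food mass.
    specific_energy: MJ of useful heat per kg of cooked food;
    fuel_lhv: lower heating value of the fuel in MJ/kg;
    stove_eff: fraction of the fuel's energy actually delivered to the food."""
    return food_kg * specific_energy / (fuel_lhv * stove_eff)

# illustrative numbers only (not the paper's data): 1 kg of food needing
# 3 MJ of useful heat, cooked over fuelwood (15 MJ/kg) at 15 % stove efficiency
fuel_kg = fuel_for_cooking(1.0, 3.0, 15.0, 0.15)
```

Summing this quantity over food types, fuels, and user populations, with distributions on each factor, is what yields the national estimate and its uncertainty range.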

  7. Uncertainty Analysis for Angle Calibrations Using Circle Closure

    PubMed Central

    Estler, W. Tyler

    1998-01-01

    We analyze two types of full-circle angle calibrations: a simple closure in which a single set of unknown angular segments is sequentially compared with an unknown reference angle, and a dual closure in which two divided circles are simultaneously calibrated by intercomparison. In each case, the constraint of circle closure provides auxiliary information that (1) enables a complete calibration process without reference to separately calibrated reference artifacts, and (2) serves to reduce measurement uncertainty. We derive closed-form expressions for the combined standard uncertainties of angle calibrations, following guidelines published by the International Organization for Standardization (ISO) and NIST. The analysis includes methods for the quantitative evaluation of the standard uncertainty of small angle measurement using electronic autocollimators, including the effects of calibration uncertainty and air turbulence. PMID:28009359
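For the simple closure, the constraint that the angular segments of a full circle sum to 360° lets the closure defect be distributed over the adjusted segments; for uncorrelated, equal-variance measurements, the least-squares adjustment spreads the defect equally. A minimal sketch with made-up measurements (not the paper's calibration data):

```python
def adjust_closure(measured, full_circle=360.0):
    """Distribute the closure defect equally over all segments; this is the
    least-squares adjustment for uncorrelated, equal-variance measurements."""
    defect = sum(measured) - full_circle
    n = len(measured)
    return [m - defect / n for m in measured], defect

# hypothetical measurements of eight nominal 45-degree segments of a divided circle
segments = [45.003, 44.998, 45.001, 45.004, 44.997, 45.002, 44.999, 45.004]
adjusted, defect = adjust_closure(segments)
```

The adjusted segments satisfy the closure constraint exactly, which is the auxiliary information that reduces the combined standard uncertainty relative to the raw comparisons.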

  8. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    PubMed

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty increasing both their economical cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification related parameters, e.g. η(g) (anoxic growth rate correction factor) and η(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
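The SRC step can be sketched as follows: Monte Carlo samples of the uncertain inputs are pushed through the model, and each input's standardized regression coefficient is its regression slope rescaled by the ratio of standard deviations. The toy linear model below, with independent inputs (for which the SRC equals the correlation coefficient), merely stands in for BSM1:

```python
import random
import statistics as st

def src(input_cols, ys):
    """Standardized regression coefficients, assuming independent inputs:
    SRC_i = b_i * sd(x_i) / sd(y), with b_i from simple regression of y on x_i."""
    my, sy = st.mean(ys), st.stdev(ys)
    coeffs = []
    for xs in input_cols:
        mx, sx = st.mean(xs), st.stdev(xs)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(ys) - 1)
        coeffs.append((cov / sx ** 2) * sx / sy)
    return coeffs

# toy stand-in for the model: output depends strongly on x1, weakly on x2
rng = random.Random(42)
x1 = [rng.gauss(0, 1) for _ in range(2000)]
x2 = [rng.gauss(0, 1) for _ in range(2000)]
y = [3 * a + b + rng.gauss(0, 0.5) for a, b in zip(x1, x2)]
s1, s2 = src([x1, x2], y)
```

A large |SRC| flags an input whose uncertainty dominates the output variance, which is how parameters like the autotrophic growth rate are singled out above.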

  9. Quantifying uncertainty in the measurement of arsenic in suspended particulate matter by Atomic Absorption Spectrometry with hydride generator

    PubMed Central

    2011-01-01

    Arsenic is a toxic element that creates several problems in human beings, especially when inhaled through air. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it indicates the level of toxicity in the environment so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before making any decision. The quality and reliability of data on such elements depend on the measurement uncertainty of each step involved, from sampling to analysis. Analytical results that quantify uncertainty give a measure of the confidence level of the concerned laboratory. The main objective of this study was therefore to determine the arsenic content of SPM samples with an uncertainty budget and to identify the potential sources of uncertainty that affect the results. With this in mind, we selected seven diverse sites in Delhi (the national capital of India) for quantification of arsenic in SPM samples with an uncertainty budget, from sampling by high-volume sampler (HVS) through analysis by Atomic Absorption Spectrometry with Hydride Generation (AAS-HG). Many steps are involved from sampling to the final result, and we have considered the various potential sources of uncertainty. The calculation of uncertainty is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. The final results were found to depend mainly on the uncertainties due to repeatability, the final volume prepared for analysis, the weighing balance, and sampling by HVS. From the analysis of data from the seven diverse sites of Delhi, it was concluded that during the period 31 January to 7 February 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m3 at the 95% confidence level (k = 2). PMID:21466671
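Combining such an uncertainty budget follows the GUM/EURACHEM recipe: root-sum-of-squares of the uncorrelated standard-uncertainty components, then multiplication by a coverage factor k = 2 for roughly 95 % confidence. The component values below are illustrative, not the study's actual budget:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares of uncorrelated standard-uncertainty components."""
    return math.sqrt(sum(u * u for u in components))

# illustrative budget for one SPM arsenic result (values in ng/m3, NOT the
# study's figures): repeatability, final volume, weighing balance, HVS sampling
u_c = combined_standard_uncertainty([0.20, 0.15, 0.08, 0.10])
U = 2.0 * u_c  # expanded uncertainty at ~95 % confidence (coverage factor k = 2)
```

A result would then be reported as value ± U, exactly the "± at 95 % confidence (k = 2)" form used in the abstract.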

  10. Climate model uncertainty in impact assessments for agriculture: A multi-ensemble case study on maize in sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Dale, Amy; Fant, Charles; Strzepek, Kenneth; Lickley, Megan; Solomon, Susan

    2017-03-01

    We present maize production in sub-Saharan Africa as a case study in the exploration of how uncertainties in global climate change, as reflected in projections from a range of climate model ensembles, influence climate impact assessments for agriculture. The crop model AquaCrop-OS (Food and Agriculture Organization of the United Nations) was modified to run on a 2° × 2° grid and coupled to 122 climate model projections from multi-model ensembles for three emission scenarios (Coupled Model Intercomparison Project Phase 3 [CMIP3] SRES A1B and CMIP5 Representative Concentration Pathway [RCP] scenarios 4.5 and 8.5) as well as two "within-model" ensembles (NCAR CCSM3 and ECHAM5/MPI-OM) designed to capture internal variability (i.e., uncertainty due to chaos in the climate system). In spite of high uncertainty, most notably in the high-producing semi-arid zones, we observed robust regional and sub-regional trends across all ensembles. In agreement with previous work, we project widespread yield losses in the Sahel region and Southern Africa, resilience in Central Africa, and sub-regional increases in East Africa and at the southern tip of the continent. Spatial patterns of yield losses corresponded with spatial patterns of aridity increases, which were explicitly evaluated. Internal variability was a major source of uncertainty in both within-model and between-model ensembles and explained the majority of the spatial distribution of uncertainty in yield projections. Projected climate change impacts on maize production in different regions and nations ranged from near-zero or positive (upper quartile estimates) to substantially negative (lower quartile estimates), highlighting a need for risk management strategies that are adaptive and robust to uncertainty.

  11. Uncertainty Propagation for Terrestrial Mobile Laser Scanner

    NASA Astrophysics Data System (ADS)

    Mezian, c.; Vallet, Bruno; Soheilian, Bahman; Paparoditis, Nicolas

    2016-06-01

    Laser scanners are used more and more in mobile mapping systems. They provide 3D point clouds that are used for object reconstruction and for registration of the system. For both of these applications, uncertainty analysis of the 3D points is of great interest but is rarely investigated in the literature. In this paper we present a complete pipeline that takes into account all sources of uncertainty and computes a covariance matrix for each 3D point. The sources of uncertainty are the laser scanner itself, the calibration of the scanner with respect to the vehicle, and the direct georeferencing system. We assume that all uncertainties follow Gaussian distributions. The variances of the laser scanner measurements (two angles and one distance) are usually provided by the manufacturer, as is the case for integrated direct georeferencing devices. Residuals of the calibration process were used to estimate the covariance matrix of the 6D transformation between the laser scanner and the vehicle frame. Knowing the variances of all sources of uncertainty, we applied uncertainty propagation to compute the variance-covariance matrix of each 3D point. Such an uncertainty analysis makes it possible to estimate the impact of different laser scanners and georeferencing devices on the quality of the resulting 3D points. The obtained uncertainty values are illustrated using error ellipsoids on different datasets.
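The per-point propagation step can be sketched for the scanner's polar measurements alone (ignoring the calibration and georeferencing terms of the full pipeline): with x = r cos φ cos θ, y = r cos φ sin θ, z = r sin φ, the point covariance is J diag(σr², σθ², σφ²) Jᵀ. The standard deviations below are assumed, not taken from any particular scanner's datasheet:

```python
import math

def point_covariance(r, theta, phi, s_r, s_theta, s_phi):
    """Propagate (range, horizontal angle, vertical angle) variances to a 3D
    point: Sigma_p = J diag(s_r^2, s_theta^2, s_phi^2) J^T."""
    ct, st_, cp, sp = math.cos(theta), math.sin(theta), math.cos(phi), math.sin(phi)
    J = [[cp * ct, -r * cp * st_, -r * sp * ct],   # d(x)/d(r, theta, phi)
         [cp * st_,  r * cp * ct, -r * sp * st_],  # d(y)/d(r, theta, phi)
         [sp,        0.0,          r * cp]]        # d(z)/d(r, theta, phi)
    var = [s_r ** 2, s_theta ** 2, s_phi ** 2]
    return [[sum(J[i][k] * var[k] * J[j][k] for k in range(3))
             for j in range(3)] for i in range(3)]

# assumed accuracies: 10 mm in range, 1 mrad in both angles, point at r = 10 m
cov = point_covariance(10.0, 0.0, 0.0, 0.01, 0.001, 0.001)
```

Note how the angular variances scale with r²: distant points inherit much larger lateral uncertainty, which is what the error ellipsoids in the paper visualize.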

  12. Performance and Reliability Optimization for Aerospace Systems subject to Uncertainty and Degradation

    NASA Technical Reports Server (NTRS)

    Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl

    2004-01-01

    This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools for the aerospace vehicle design process which take into account lifecycle uncertainties. It recognizes that uncertainty between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty, and to combine these with the analytical tools used with integrated modeling. In this manner, system uncertainty analysis becomes part of the design process, and can motivate redesign. The specific program objectives were: 1. To incorporate uncertainty modeling, propagation and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions. 2. To apply modern optimization tools to guide in the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing the subsystem reliability and redundancy. The results for the second program objective are described separately. This report describes the work and results for the first objective: uncertainty modeling, propagation, and synthesis with integrated modeling.

  13. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    In this study, a framework for estimating experimental measurement uncertainties for a Homogenous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  14. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE PAGES

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    2017-03-28

    In this study, a framework for estimating experimental measurement uncertainties for a Homogenous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  15. Insight from uncertainty: bootstrap-derived diffusion metrics differentially predict memory function among older adults.

    PubMed

    Vorburger, Robert S; Habeck, Christian G; Narkhede, Atul; Guzman, Vanessa A; Manly, Jennifer J; Brickman, Adam M

    2016-01-01

    Diffusion tensor imaging suffers from an intrinsically low signal-to-noise ratio. Bootstrap algorithms have been introduced to provide a non-parametric method for estimating the uncertainty of the measured diffusion parameters. To quantify the variability of the principal diffusion direction, bootstrap-derived metrics such as the cone of uncertainty have been proposed. However, bootstrap-derived metrics are not independent of the underlying diffusion profile. A higher mean diffusivity causes a smaller signal-to-noise ratio and, thus, increases the measurement uncertainty. Moreover, the goodness of the tensor model, which relies strongly on the complexity of the underlying diffusion profile, influences bootstrap-derived metrics as well. The presented simulations clearly depict the cone of uncertainty as a function of the underlying diffusion profile. Since the relationship between the cone of uncertainty and common diffusion parameters such as mean diffusivity and fractional anisotropy is nonlinear, the cone of uncertainty offers a distinct sensitivity. In vivo analysis of the fornix reveals the cone of uncertainty to be a predictor of memory function among older adults. No significant correlation occurs with the common diffusion parameters. The present work not only demonstrates the cone of uncertainty as a function of the actual diffusion profile, but also discloses the cone of uncertainty as a sensitive predictor of memory function. Future studies should incorporate bootstrap-derived metrics to provide a more comprehensive analysis.
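The cone of uncertainty can be illustrated with a generic bootstrap over a set of estimated principal directions: resample with replacement, recompute the mean direction, and take a quantile of the angular deviations from the full-sample mean. The synthetic data below merely contrast a low-noise and a high-noise profile; this is not the paper's DTI pipeline:

```python
import math
import random

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def mean_direction(vecs):
    return normalize([sum(c) for c in zip(*vecs)])

def cone_of_uncertainty(vecs, n_boot=200, q=0.95, seed=3):
    """Bootstrap the mean direction and return the q-quantile of the angular
    deviation (degrees) from the full-sample mean direction."""
    rng = random.Random(seed)
    center = mean_direction(vecs)
    angles = []
    for _ in range(n_boot):
        boot = [rng.choice(vecs) for _ in vecs]  # resample with replacement
        dot = sum(a * b for a, b in zip(mean_direction(boot), center))
        angles.append(math.degrees(math.acos(max(-1.0, min(1.0, dot)))))
    angles.sort()
    return angles[int(q * (n_boot - 1))]

# synthetic principal directions near the z-axis: low-noise vs high-noise profile
rng = random.Random(7)
tight = [normalize((rng.gauss(0, 0.01), rng.gauss(0, 0.01), 1.0)) for _ in range(50)]
noisy = [normalize((rng.gauss(0, 0.3), rng.gauss(0, 0.3), 1.0)) for _ in range(50)]
```

The wider cone of the noisy profile reflects exactly the dependence on the underlying diffusion profile that the abstract emphasizes.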

  16. Reduction of predictive uncertainty in estimating irrigation water requirement through multi-model ensembles and ensemble averaging

    NASA Astrophysics Data System (ADS)

    Multsch, S.; Exbrayat, J.-F.; Kirby, M.; Viney, N. R.; Frede, H.-G.; Breuer, L.

    2015-04-01

    Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation. They consider different stages of crop growth through empirical crop coefficients that adapt evapotranspiration throughout the vegetation period. We investigate the importance of model structural versus model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that the structural uncertainty among reference evapotranspiration models is far more important than the parametric uncertainty introduced by the crop coefficients, which are used to estimate irrigation water requirement following the single crop coefficient approach. Using the reliability ensemble averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a given threshold, e.g. an irrigation water limit of 400 mm set by water rights, would be exceeded less frequently under the REA ensemble average (45%) than under the equally weighted ensemble average (66%). We conclude that multi-model ensemble predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.
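A deliberately simplified REA-style average weights each ensemble member by the inverse of its absolute bias over a calibration period (the full REA criterion also includes a model-convergence term, omitted here); all numbers below are invented:

```python
def rea_average(preds, biases, eps=1e-6):
    """Reliability-weighted ensemble average: members with smaller calibration
    bias get larger weights (simplified REA; convergence criterion omitted)."""
    weights = [1.0 / max(abs(b), eps) for b in biases]
    return sum(w * p for w, p in zip(weights, preds)) / sum(weights)

cal_biases = [-10.0, -2.0, 30.0]   # model minus observed, calibration period
future = [190.0, 198.0, 230.0]     # projections; assume the truth is near 200
rea = rea_average(future, cal_biases)
simple = sum(future) / len(future)
```

Because the historically reliable member dominates, the REA average lands closer to the assumed truth than the equally weighted mean, which is the mechanism behind the reduced predictive uncertainty reported above.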

  17. Land Resources Allocation Strategies in an Urban Area Involving Uncertainty: A Case Study of Suzhou, in the Yangtze River Delta of China

    NASA Astrophysics Data System (ADS)

    Lu, Shasha; Guan, Xingliang; Zhou, Min; Wang, Yang

    2014-05-01

    A large number of mathematical models have been developed to support land resource allocation decisions and land management needs; however, few of them can address various uncertainties that exist in relation to many factors presented in such decisions (e.g., land resource availabilities, land demands, land-use patterns, and social demands, as well as ecological requirements). In this study, a multi-objective interval-stochastic land resource allocation model (MOISLAM) was developed for tackling uncertainty that presents as discrete intervals and/or probability distributions. The developed model improves upon the existing multi-objective programming and inexact optimization approaches. The MOISLAM not only considers economic factors, but also involves food security and eco-environmental constraints; it can, therefore, effectively reflect various interrelations among different aspects in a land resource management system. Moreover, the model can also help examine the reliability of satisfying (or the risk of violating) system constraints under uncertainty. In this study, the MOISLAM was applied to a real case of long-term urban land resource allocation planning in Suzhou, in the Yangtze River Delta of China. Interval solutions associated with different risk levels of constraint violation were obtained. The results are considered useful for generating a range of decision alternatives under various system conditions, and thus helping decision makers to identify a desirable land resource allocation strategy under uncertainty.

  18. Risk assessment principle for engineered nanotechnology in food and drug.

    PubMed

    Hwang, Myungsil; Lee, Eun Ji; Kweon, Se Young; Park, Mi Sun; Jeong, Ji Yoon; Um, Jun Ho; Kim, Sun Ah; Han, Bum Suk; Lee, Kwang Ho; Yoon, Hae Jung

    2012-06-01

    While the ability to develop nanomaterials and incorporate them into products is advancing rapidly worldwide, understanding of their potential health and safety effects has proceeded at a much slower pace. In 2008, the Korea Food and Drug Administration (KFDA) began an investigation to prepare a "Strategic Action Plan" for evaluating safety and managing nano-related risk associated with foods, drugs, medical devices and cosmetics using nano-scale materials. Although there are some studies on the potential risks of nanomaterials, the physico-chemical characterization of nanomaterials is not yet clear, and the existing studies do not offer enough information because of their limitations. These uncertainties make it impossible to determine whether nanomaterials are actually hazardous to humans, and they currently hinder human exposure risk assessment. On the other hand, uncertainty about safety may lead to polarized public debate and to business unwillingness to invest further in nanotechnology. Therefore, criteria and methods to assess the possible adverse effects of nanomaterials have been vigorously taken into consideration by many international organizations: the World Health Organization, the Organisation for Economic Co-operation and Development, and the European Commission. The objective of this study was to develop risk assessment principles for the safety management of future nanoproducts and to identify areas of research to strengthen risk assessment for nanomaterials. The research roadmaps proposed in this study will be helpful in filling the current gaps in knowledge relevant to nano risk assessment.

  19. Joining Forces for Food Security - Linking Earth Observation and Crowd-sourcing for improved Decision-support

    NASA Astrophysics Data System (ADS)

    Enenkel, M.; Dorigo, W.; See, L. M.; Vinck, P.; Papp, A.

    2014-12-01

    Droughts statistically exceed all other natural disasters in complexity, spatio-temporal extent and number of people affected. Triggered by crop failure, food insecurity is a major manifestation of agricultural drought and water scarcity. However, other socio-economic precursors, such as chronically low levels of disaster preparedness, hampered access to food, or a lack of social safety nets are equally important factors. We will present the first results of the SATIDA (Satellite Technologies for Improved Drought-Risk Assessment) project, which advances three complementary developments. First, an existing drought indicator is enhanced by replacing in-situ measurements of rainfall and surface air temperature with satellite-derived datasets. We identify the vegetation status via a new noise-corrected and gap-filled vegetation index. In addition, we introduce a soil moisture component to close the gap between rainfall deficiencies, extreme temperature and the first visible impacts of atmospheric anomalies on vegetation. Second, once calibrated, the index is forced with seasonal forecasts to quantify their uncertainty and added value in the regions of interest. Third, a mobile application is developed to disseminate relevant visualizations to decision-makers in affected areas, to collect additional information about socio-economic conditions and to validate the output of the drought index under real conditions. Involving Doctors without Borders (MSF) as a key user, SATIDA aims at decreasing uncertainties in decision-making via a more holistic risk framework, resulting in longer lead times for disaster logistics in the preparedness phase.

  20. The uncertainty of crop yield projections is reduced by improved temperature response functions.

    PubMed

    Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rötter, Reimund P; Kimball, Bruce A; Ottman, Michael J; Wall, Gerard W; White, Jeffrey W; Reynolds, Matthew P; Alderman, Phillip D; Aggarwal, Pramod K; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andrew J; De Sanctis, Giacomo; Doltra, Jordi; Fereres, Elias; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A; Izaurralde, Roberto C; Jabloun, Mohamed; Jones, Curtis D; Kersebaum, Kurt C; Koehler, Ann-Kristin; Liu, Leilei; Müller, Christoph; Naresh Kumar, Soora; Nendel, Claas; O'Leary, Garry; Olesen, Jørgen E; Palosuo, Taru; Priesack, Eckart; Eyshi Rezaei, Ehsan; Ripoche, Dominique; Ruane, Alex C; Semenov, Mikhail A; Shcherbak, Iurii; Stöckle, Claudio; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wallach, Daniel; Wang, Zhimin; Wolf, Joost; Zhu, Yan; Asseng, Senthold

    2017-07-17

    Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.
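A widely used beta-type temperature response of the kind discussed here is the Wang & Engel (1998) function, which rises from zero at a minimum temperature to 1 at the optimum and falls back to zero at a maximum. The cardinal temperatures below are illustrative wheat-like values, not the fitted parameters derived in the paper:

```python
import math

def wang_engel(T, Tmin=0.0, Topt=27.5, Tmax=40.0):
    """Beta-type temperature response scaled to 1 at Topt (Wang & Engel, 1998).
    Cardinal temperatures are illustrative, not the paper's fitted values."""
    if T <= Tmin or T >= Tmax:
        return 0.0
    # alpha is chosen so the response is exactly 1 at Topt and 0 at Tmax
    alpha = math.log(2.0) / math.log((Tmax - Tmin) / (Topt - Tmin))
    a = (T - Tmin) ** alpha
    b = (Topt - Tmin) ** alpha
    return (2.0 * a * b - a * a) / (b * b)
```

Multiplying a process rate (e.g. photosynthesis or development rate) by this factor is how crop models encode temperature sensitivity, so the choice of function and cardinal temperatures propagates directly into simulated yield.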

  1. The Uncertainty of Crop Yield Projections Is Reduced by Improved Temperature Response Functions

    NASA Technical Reports Server (NTRS)

    Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rotter, Reimund P.; Kimball, Bruce A.; Ottman, Michael J.; White, Jeffrey W.; Reynolds, Matthew P.

    2017-01-01

    Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for more than 50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.

  2. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  3. Aeroservoelastic Model Validation and Test Data Analysis of the F/A-18 Active Aeroelastic Wing

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.; Prazenica, Richard J.

    2003-01-01

    Model validation and flight test data analysis require careful consideration of the effects of uncertainty, noise, and nonlinearity. Uncertainty prevails in the data analysis techniques and results in a composite model uncertainty from unmodeled dynamics, assumptions and mechanics of the estimation procedures, noise, and nonlinearity. A fundamental requirement for reliable and robust model development is an attempt to account for each of these sources of error, in particular, for model validation, robust stability prediction, and flight control system development. This paper is concerned with data processing procedures for uncertainty reduction in model validation for stability estimation and nonlinear identification. F/A-18 Active Aeroelastic Wing (AAW) aircraft data is used to demonstrate signal representation effects on uncertain model development, stability estimation, and nonlinear identification. Data is decomposed using adaptive orthonormal best-basis and wavelet-basis signal decompositions for signal denoising into linear and nonlinear identification algorithms. Nonlinear identification from a wavelet-based Volterra kernel procedure is used to extract nonlinear dynamics from aeroelastic responses, and to assist model development and uncertainty reduction for model validation and stability prediction by removing a class of nonlinearity from the uncertainty.

  4. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    NASA Astrophysics Data System (ADS)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
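
    The variance decomposition described above can be sketched on a synthetic grid of simulations indexed by forcing ensemble and parameter set; all factor magnitudes below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_forcing, n_param = 20, 15        # rainfall ensembles x behavioural parameter sets

# synthetic streamflow simulations: additive forcing and parameter effects
# plus an interaction/noise term (magnitudes are illustrative)
y = (rng.normal(0.0, 1.0, (n_forcing, 1))
     + rng.normal(0.0, 0.5, (1, n_param))
     + rng.normal(0.0, 0.2, (n_forcing, n_param)))

# two-way ANOVA sums of squares with one simulation per cell
grand = y.mean()
ss_forcing = n_param * ((y.mean(axis=1) - grand) ** 2).sum()
ss_param = n_forcing * ((y.mean(axis=0) - grand) ** 2).sum()
ss_total = ((y - grand) ** 2).sum()
ss_inter = ss_total - ss_forcing - ss_param   # residual = interaction term

shares = {"forcing": ss_forcing / ss_total,
          "parameters": ss_param / ss_total,
          "interaction": ss_inter / ss_total}
```

    The three shares sum to one, giving the fractional contribution of each uncertainty source to the total simulation variance.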

  5. Public Perception of Uncertainties Within Climate Change Science.

    PubMed

    Visschers, Vivianne H M

    2018-01-01

    Climate change is a complex, multifaceted problem involving various interacting systems and actors. Therefore, the intensities, locations, and timeframes of the consequences of climate change are hard to predict and cause uncertainties. Relatively little is known about how the public perceives this scientific uncertainty and how this relates to their concern about climate change. In this article, an online survey among 306 Swiss people is reported that investigated whether people differentiate between different types of uncertainty in climate change research. Also examined was the way in which the perception of uncertainty is related to people's concern about climate change, their trust in science, their knowledge about climate change, and their political attitude. The results of a principal component analysis showed that respondents differentiated between perceived ambiguity in climate research, measurement uncertainty, and uncertainty about the future impact of climate change. Using structural equation modeling, it was found that only perceived ambiguity was directly related to concern about climate change, whereas measurement uncertainty and future uncertainty were not. Trust in climate science was strongly associated with each type of uncertainty perception and was indirectly associated with concern about climate change. Also, more knowledge about climate change was related to less strong perceptions of each type of climate science uncertainty. Hence, it is suggested that to increase public concern about climate change, it may be especially important to consider the perceived ambiguity about climate research. Efforts that foster trust in climate science also appear highly worthwhile. © 2017 Society for Risk Analysis.

  6. Uncertainty Analysis of Air Radiation for Lunar Return Shock Layers

    NASA Technical Reports Server (NTRS)

    Kleb, Bil; Johnston, Christopher O.

    2008-01-01

    By leveraging a new uncertainty markup technique, two risk analysis methods are used to compute the uncertainty of lunar-return shock layer radiation predicted by the High temperature Aerothermodynamic Radiation Algorithm (HARA). The effects of epistemic uncertainty, or uncertainty due to a lack of knowledge, are considered for the following modeling parameters: atomic line oscillator strengths, atomic line Stark broadening widths, atomic photoionization cross sections, negative ion photodetachment cross sections, molecular band oscillator strengths, and electron impact excitation rates. First, a simplified shock layer problem consisting of two constant-property equilibrium layers is considered. The results of this simplified problem show that the atomic nitrogen oscillator strengths and Stark broadening widths in both the vacuum ultraviolet and infrared spectral regions, along with the negative ion continuum, are the dominant uncertainty contributors. Next, three variable property stagnation-line shock layer cases are analyzed: a typical lunar return case and two Fire II cases. For the near-equilibrium lunar return and Fire 1643-second cases, the resulting uncertainties are very similar to the simplified case. Conversely, the relatively nonequilibrium 1636-second case shows significantly larger influence from electron impact excitation rates of both atoms and molecules. For all cases, the total uncertainty in radiative heat flux to the wall due to epistemic uncertainty in modeling parameters is 30% as opposed to the erroneously-small uncertainty levels (plus or minus 6%) found when treating model parameter uncertainties as aleatory (due to chance) instead of epistemic (due to lack of knowledge).
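
    The aleatory-versus-epistemic distinction drawn above can be illustrated with a toy monotone model (every number invented): sampling uncertain parameters as if they had known distributions yields a plus-or-minus two-sigma band that is narrower than the interval envelope an epistemic treatment would report.

```python
import random
import statistics

random.seed(4)

def model(a, b):
    # stand-in monotone response; the real radiation code is far more complex
    return 2.0 * a + b

bounds = {"a": (0.8, 1.2), "b": (-0.5, 0.5)}

# aleatory treatment: sample parameters as if uniformly distributed and
# summarise the spread with +/- 2 standard deviations
aleatory = [model(random.uniform(*bounds["a"]), random.uniform(*bounds["b"]))
            for _ in range(10000)]
two_sigma = 2.0 * statistics.stdev(aleatory)

# epistemic treatment: no distribution is claimed, so report the full
# envelope of outputs over the parameter bounds (model is monotone here)
lo = model(bounds["a"][0], bounds["b"][0])
hi = model(bounds["a"][1], bounds["b"][1])
envelope_half_width = (hi - lo) / 2.0
```

    The epistemic half-width exceeds the aleatory two-sigma band, mirroring the abstract's point that an aleatory treatment of lack-of-knowledge uncertainty understates it.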

  7. "I Don't Want to Be an Ostrich": Managing Mothers' Uncertainty during BRCA1/2 Genetic Counseling.

    PubMed

    Fisher, Carla L; Roccotagliata, Thomas; Rising, Camella J; Kissane, David W; Glogowski, Emily A; Bylund, Carma L

    2017-06-01

    Families who face genetic disease risk must learn how to grapple with complicated uncertainties about their health and future on a long-term basis. Women who undergo BRCA 1/2 genetic testing describe uncertainty related to personal risk as well as their loved ones', particularly daughters', risk. The genetic counseling setting is a prime opportunity for practitioners to help mothers manage uncertainty in the moment but also once they leave a session. Uncertainty Management Theory (UMT) helps to illuminate the various types of uncertainty women encounter and the important role of communication in uncertainty management. Informed by UMT, we conducted a thematic analysis of 16 genetic counseling sessions between practitioners and mothers at risk for, or carriers of, a BRCA1/2 mutation. Five themes emerged that represent communication strategies used to manage uncertainty: 1) addresses myths, misunderstandings, or misconceptions; 2) introduces uncertainty related to science; 3) encourages information seeking or sharing about family medical history; 4) reaffirms or validates previous behavior or decisions; and 5) minimizes the probability of personal risk or family members' risk. Findings illustrate the critical role of genetic counseling for families in managing emotionally challenging risk-related uncertainty. The analysis may prove beneficial to not only genetic counseling practice but generations of families at high risk for cancer who must learn strategic approaches to managing a complex web of uncertainty that can challenge them for a lifetime.

  8. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    DOE PAGES

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    2016-12-05

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4–190 %, with an average of 120 % (2 σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.
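
    If the four per-cell components named above are treated as independent, a gridded total uncertainty can be formed in quadrature. A sketch under that independence assumption (the grid shape and percentage ranges below are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
shape = (4, 5)                       # toy latitude x longitude grid

# illustrative per-cell relative uncertainties (2-sigma, percent) for the
# four components named in the abstract
u_spatial = rng.uniform(5.0, 40.0, shape)
u_temporal = rng.uniform(2.0, 20.0, shape)
u_proxy = rng.uniform(10.0, 80.0, shape)
u_magnitude = rng.uniform(3.0, 15.0, shape)

# independent components combine in quadrature, cell by cell
u_total = np.sqrt(u_spatial**2 + u_temporal**2 + u_proxy**2 + u_magnitude**2)
```

    Each cell's total is at least as large as its largest component but never larger than the plain sum, which is why the dominant proxy term tends to set the map's character.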

  9. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughoutmore » this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4–190 %, with an average of 120 % (2 σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.« less

  10. Aeroservoelastic Uncertainty Model Identification from Flight Data

    NASA Technical Reports Server (NTRS)

    Brenner, Martin J.

    2001-01-01

    Uncertainty modeling is a critical element in the estimation of robust stability margins for stability boundary prediction and robust flight control system development. There has been a serious deficiency to date in aeroservoelastic data analysis with attention to uncertainty modeling. Uncertainty can be estimated from flight data using both parametric and nonparametric identification techniques. The model validation problem addressed in this paper is to identify aeroservoelastic models with associated uncertainty structures from a limited amount of controlled excitation inputs over an extensive flight envelope. The challenge to this problem is to update analytical models from flight data estimates while also deriving non-conservative uncertainty descriptions consistent with the flight data. Multisine control surface command inputs and control system feedbacks are used as signals in a wavelet-based modal parameter estimation procedure for model updates. Transfer function estimates are incorporated in a robust minimax estimation scheme to get input-output parameters and error bounds consistent with the data and model structure. Uncertainty estimates derived from the data in this manner provide an appropriate and relevant representation for model development and robust stability analysis. This model-plus-uncertainty identification procedure is applied to aeroservoelastic flight data from the NASA Dryden Flight Research Center F-18 Systems Research Aircraft.

  11. Gridded uncertainty in fossil fuel carbon dioxide emission maps, a CDIAC example

    NASA Astrophysics Data System (ADS)

    Andres, Robert J.; Boden, Thomas A.; Higdon, David M.

    2016-12-01

    Due to a current lack of physical measurements at appropriate spatial and temporal scales, all current global maps and distributions of fossil fuel carbon dioxide (FFCO2) emissions use one or more proxies to distribute those emissions. These proxies and distribution schemes introduce additional uncertainty into these maps. This paper examines the uncertainty associated with the magnitude of gridded FFCO2 emissions. This uncertainty is gridded at the same spatial and temporal scales as the mass magnitude maps. This gridded uncertainty includes uncertainty contributions from the spatial, temporal, proxy, and magnitude components used to create the magnitude map of FFCO2 emissions. Throughout this process, when assumptions had to be made or expert judgment employed, the general tendency in most cases was toward overestimating or increasing the magnitude of uncertainty. The results of the uncertainty analysis reveal a range of 4-190 %, with an average of 120 % (2σ) for populated and FFCO2-emitting grid spaces over annual timescales. This paper also describes a methodological change specific to the creation of the Carbon Dioxide Information Analysis Center (CDIAC) FFCO2 emission maps: the change from a temporally fixed population proxy to a temporally varying population proxy.

  12. INSPECTION SHOP: PLAN TO PROVIDE UNCERTAINTY ANALYSIS WITH MEASUREMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nederbragt, W W

    The LLNL inspection shop is chartered to make dimensional measurements of components for critical programmatic experiments. These measurements ensure that components are within tolerance and provide geometric details that can be used to further refine simulations. For these measurements to be useful, they must be significantly more accurate than the tolerances that are being checked. For example, if a part has a specified dimension of 100 millimeters and a tolerance of 1 millimeter, then the precision and/or accuracy of the measurement should be less than 1 millimeter. Using the "10-to-1 gaugemaker's rule of thumb", the desired precision of the measurement should be less than 100 micrometers. Currently, the process for associating measurement uncertainty with data is not standardized, nor is the uncertainty based on a thorough uncertainty analysis. The goal of this project is to begin providing measurement uncertainty statements with critical measurements performed in the inspection shop. To accomplish this task, comprehensive knowledge about the underlying sources of uncertainty for measurement instruments needs to be developed and quantified. Moreover, measurements of elemental uncertainties for each physical source need to be combined in a meaningful way to obtain an overall measurement uncertainty.
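
    Combining elemental uncertainties "in a meaningful way" conventionally means root-sum-of-squares of independent standard uncertainties, per the GUM; the elemental values below are hypothetical:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares of independent standard uncertainties (GUM)."""
    return math.sqrt(sum(u * u for u in components))

# hypothetical elemental uncertainties for one length measurement, in micrometres
elemental = [5.0, 3.0, 2.0]          # e.g. instrument, thermal, fixturing
u_c = combined_standard_uncertainty(elemental)
U = 2.0 * u_c                        # expanded uncertainty, coverage factor k = 2
```

    The expanded uncertainty U is what a measurement statement would quote alongside the measured value.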

  13. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
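
    The three correlation-based sensitivity measures named above can be sketched on synthetic data; the input/output relationship below is invented, not the VERA-CS model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
X = rng.normal(size=(n, 3))            # stand-ins for three uncertain inputs
y = 2.0 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.5, size=n)

def pearson(a, b):
    return np.corrcoef(a, b)[0, 1]

def spearman(a, b):
    # Spearman = Pearson correlation of the ranks
    rank = lambda v: np.argsort(np.argsort(v))
    return pearson(rank(a), rank(b))

def partial_corr(X, y, j):
    """Correlate input j and y after regressing both on the other inputs."""
    others = np.delete(X, j, axis=1)
    A = np.column_stack([others, np.ones(len(y))])
    resid = lambda v: v - A @ np.linalg.lstsq(A, v, rcond=None)[0]
    return pearson(resid(X[:, j]), resid(y))

coeffs = {j: (pearson(X[:, j], y), spearman(X[:, j], y), partial_corr(X, y, j))
          for j in range(3)}
```

    With this construction the first input dominates under all three measures and the third, which has no true effect, sits near zero.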

  14. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been done with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. 
Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data limitations to date. Conclusions include a brief discussion of subsequent uncertainty and risk analysis methodologies, and a commentary on present and future applications of QRA in the management of the public health risks associated with the presence of pathogenic microorganisms in the food supply.
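
    One concrete instance of the dose-response and risk characterization phases is the exponential dose-response model, P(ill) = 1 - exp(-r * dose), driven by Monte Carlo sampling of an uncertain dose. Every number below is illustrative, not a real pathogen's parameters:

```python
import math
import random

random.seed(3)

# toy dose-response stage of a QRA; r and the dose distribution are invented
def risk_per_serving(r=0.002):
    log10_dose = random.gauss(1.0, 0.6)      # uncertain dose, log10 CFU/serving
    dose = 10.0 ** log10_dose
    return 1.0 - math.exp(-r * dose)         # exponential dose-response model

# risk characterization: propagate the dose uncertainty through the model
samples = [risk_per_serving() for _ in range(10_000)]
mean_risk = sum(samples) / len(samples)
```

    In a full QRA the dose distribution itself would come out of the exposure assessment, chaining farm-to-table stages rather than a single assumed lognormal.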

  15. Greenhouse gas emissions and reactive nitrogen releases during the life-cycles of staple food production in China and their mitigation potential.

    PubMed

    Xia, Longlong; Ti, Chaopu; Li, Bolun; Xia, Yongqiu; Yan, Xiaoyuan

    2016-06-15

    Life-cycle analysis of staple food (rice, flour and corn-based fodder) production and assessments of the associated greenhouse gas (GHG) and reactive nitrogen (Nr) releases, from environmental and economic perspectives, help to develop effective mitigation options. However, such evaluations have rarely been executed in China. We evaluated the GHG and Nr releases per kilogram of staple food production (carbon and Nr footprints) and per unit of net economic benefit (CO2-NEB and Nr-NEB), and explored their mitigation potential. Carbon footprints of food production in China were markedly higher than those in some developed countries. There was a high spatial variation in the footprints, primarily attributable to differences in synthetic N use (or CH4 emissions) per unit of food production. Provincial carbon footprints had a significant linear relationship with Nr footprints, attributed to the large contribution of N fertilizer use to both GHG and Nr releases. Synthetic N fertilizer applications and CH4 emissions dominated the carbon footprints, while NH3 volatilization and N leaching were the main contributors to the Nr footprints. About 564 (95% uncertainty range: 404-701) Tg CO2-eq of GHG and 10 (7.4-12.4) Tg Nr-N were released every year during 2001-2010 from staple food production. This caused total damage costs of 325 (70-555) billion ¥, equivalent to nearly 1.44% of the Gross Domestic Product of China. Moreover, the combined damage and economic input costs accounted for 66%-80% of the gross economic benefit generated from food production. A reduction of 92.7 Tg CO2-eq yr(-1) and 2.2 Tg Nr-N yr(-1) could be achieved by reducing synthetic N inputs by 20%, increasing grain yields by 5% and implementing off-season application of straw and mid-season drainage practices for rice cultivation. In order to realize these scenarios, an ecological compensation scheme should be established to incentivize farmers to gradually adopt knowledge-based management practices. 
Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Model selection and averaging in the assessment of the drivers of household food waste to reduce the probability of false positives.

    PubMed

    Grainger, Matthew James; Aramyan, Lusine; Piras, Simone; Quested, Thomas Edward; Righi, Simone; Setti, Marco; Vittuari, Matteo; Stewart, Gavin Bruce

    2018-01-01

    Food waste from households contributes the greatest proportion to total food waste in developed countries. Therefore, food waste reduction requires an understanding of the socio-economic (contextual and behavioural) factors that lead to its generation within the household. Addressing such a complex subject calls for sound methodological approaches that until now have been conditioned by the large number of factors involved in waste generation, by the lack of a recognised definition, and by limited available data. This work contributes to the food waste generation literature by using one of the largest available datasets that includes data on the objective amount of avoidable household food waste, along with information on a series of socio-economic factors. In order to address one aspect of the complexity of the problem, machine learning algorithms (random forests and boruta) for variable selection integrated with linear modelling, model selection and averaging are implemented. Model selection addresses model structural uncertainty, which is not routinely considered in assessments of food waste in the literature. The main drivers of food waste in the home selected in the most parsimonious models include household size, the presence of fussy eaters, employment status, home ownership status, and the local authority. Results, regardless of which variable set the models are run on, point toward large households as being a key target element for food waste reduction interventions.
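
    The model selection and averaging step can be sketched with AIC weights over candidate subsets of drivers. The data, driver names, and effect sizes below are synthetic, and plain least squares stands in for the paper's full pipeline:

```python
import numpy as np
from itertools import chain, combinations

rng = np.random.default_rng(2)
n = 200
drivers = {"household_size": rng.integers(1, 6, n).astype(float),
           "fussy_eaters": rng.integers(0, 2, n).astype(float),
           "home_owner": rng.integers(0, 2, n).astype(float)}  # no true effect
waste = (0.8 * drivers["household_size"] + 0.5 * drivers["fussy_eaters"]
         + rng.normal(scale=0.7, size=n))

def fit_aic(subset):
    # least-squares fit of waste on the chosen drivers plus an intercept
    A = np.column_stack([drivers[k] for k in subset] + [np.ones(n)])
    beta, *_ = np.linalg.lstsq(A, waste, rcond=None)
    rss = ((waste - A @ beta) ** 2).sum()
    k = A.shape[1] + 1                     # coefficients plus error variance
    return n * np.log(rss / n) + 2 * k

subsets = list(chain.from_iterable(combinations(drivers, r)
                                   for r in range(len(drivers) + 1)))
aics = {s: fit_aic(s) for s in subsets}
best = min(aics, key=aics.get)
raw = {s: np.exp(-(aics[s] - aics[best]) / 2.0) for s in subsets}
weights = {s: w / sum(raw.values()) for s, w in raw.items()}
```

    The Akaike weights sum to one and concentrate on models containing the true drivers, so an averaged prediction down-weights spurious variables rather than including or excluding them outright.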

  17. Developing a broader scientific foundation for river restoration: Columbia River food webs

    USGS Publications Warehouse

    Naiman, Robert J.; Alldredge, Richard; Beauchamp, David A.; Bisson, Peter A.; Congleton, James; Henny, Charles J.; Huntly, Nancy; Lamberson, Roland; Levings, Colin; Merrill, Erik N.; Pearcy, William G.; Rieman, Bruce E.; Ruggerone, Gregory T.; Scarnecchia, Dennis; Smouse, Peter E.; Wood, Chris C.

    2012-01-01

    Well-functioning food webs are fundamental for sustaining rivers as ecosystems and maintaining associated aquatic and terrestrial communities. The current emphasis on restoring habitat structure—without explicitly considering food webs—has been less successful than hoped in terms of enhancing the status of targeted species and often overlooks important constraints on ecologically effective restoration. We identify three priority food web-related issues that potentially impede successful river restoration: uncertainty about habitat carrying capacity, proliferation of chemicals and contaminants, and emergence of hybrid food webs containing a mixture of native and invasive species. Additionally, there is the need to place these food web considerations in a broad temporal and spatial framework by understanding the consequences of altered nutrient, organic matter (energy), water, and thermal sources and flows, reconnecting critical habitats and their food webs, and restoring for changing environments. As an illustration, we discuss how the Columbia River Basin, site of one of the largest aquatic/riparian restoration programs in the United States, would benefit from implementing a food web perspective. A food web perspective for the Columbia River would complement ongoing approaches and enhance the ability to meet the vision and legal obligations of the US Endangered Species Act, the Northwest Power Act (Fish and Wildlife Program), and federal treaties with Northwest Indian Tribes while meeting fundamental needs for improved river management.

  18. Developing a broader scientific foundation for river restoration: Columbia River food webs

    PubMed Central

    Naiman, Robert J.; Alldredge, J. Richard; Beauchamp, David A.; Bisson, Peter A.; Congleton, James; Henny, Charles J.; Huntly, Nancy; Lamberson, Roland; Levings, Colin; Merrill, Erik N.; Pearcy, William G.; Rieman, Bruce E.; Ruggerone, Gregory T.; Scarnecchia, Dennis; Smouse, Peter E.; Wood, Chris C.

    2012-01-01

    Well-functioning food webs are fundamental for sustaining rivers as ecosystems and maintaining associated aquatic and terrestrial communities. The current emphasis on restoring habitat structure—without explicitly considering food webs—has been less successful than hoped in terms of enhancing the status of targeted species and often overlooks important constraints on ecologically effective restoration. We identify three priority food web-related issues that potentially impede successful river restoration: uncertainty about habitat carrying capacity, proliferation of chemicals and contaminants, and emergence of hybrid food webs containing a mixture of native and invasive species. Additionally, there is the need to place these food web considerations in a broad temporal and spatial framework by understanding the consequences of altered nutrient, organic matter (energy), water, and thermal sources and flows, reconnecting critical habitats and their food webs, and restoring for changing environments. As an illustration, we discuss how the Columbia River Basin, site of one of the largest aquatic/riparian restoration programs in the United States, would benefit from implementing a food web perspective. A food web perspective for the Columbia River would complement ongoing approaches and enhance the ability to meet the vision and legal obligations of the US Endangered Species Act, the Northwest Power Act (Fish and Wildlife Program), and federal treaties with Northwest Indian Tribes while meeting fundamental needs for improved river management. PMID:23197837

  19. Model selection and averaging in the assessment of the drivers of household food waste to reduce the probability of false positives

    PubMed Central

    Aramyan, Lusine; Piras, Simone; Quested, Thomas Edward; Righi, Simone; Setti, Marco; Vittuari, Matteo; Stewart, Gavin Bruce

    2018-01-01

Food waste from households contributes the greatest proportion to total food waste in developed countries. Therefore, food waste reduction requires an understanding of the socio-economic (contextual and behavioural) factors that lead to its generation within the household. Addressing such a complex subject calls for sound methodological approaches that until now have been constrained by the large number of factors involved in waste generation, by the lack of a recognised definition, and by limited available data. This work contributes to the food waste generation literature by using one of the largest available datasets that includes data on the objective amount of avoidable household food waste, along with information on a series of socio-economic factors. In order to address one aspect of the complexity of the problem, machine learning algorithms (random forests and Boruta) for variable selection are integrated with linear modelling, model selection and averaging. Model selection addresses model structural uncertainty, which is not routinely considered in assessments of food waste in the literature. The main drivers of food waste in the home selected in the most parsimonious models include household size, the presence of fussy eaters, employment status, home ownership status, and the local authority. Results, regardless of which variable set the models are run on, point toward large households as being a key target element for food waste reduction interventions. PMID:29389949
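The model selection and averaging step described above can be illustrated with a small, self-contained sketch. The dataset, candidate drivers, and coefficients below are invented for illustration and do not reproduce the paper's data; candidate linear models are ranked by AIC and combined into Akaike weights:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n = 400

# Hypothetical candidate drivers of household food waste (kg/week):
X_all = {
    "size": rng.integers(1, 6, n).astype(float),      # household size
    "fussy": rng.integers(0, 2, n).astype(float),     # fussy eaters present
    "employed": rng.integers(0, 2, n).astype(float),  # employment status
    "noise": rng.standard_normal(n),                  # an irrelevant candidate
}
# Simulated "true" process: waste depends on size and fussy eaters only.
y = 0.8 * X_all["size"] + 0.5 * X_all["fussy"] + 0.5 * rng.standard_normal(n)

def fit_aic(names):
    """Ordinary least squares fit; return the model's AIC."""
    X = np.column_stack([np.ones(n)] + [X_all[v] for v in names])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    k = X.shape[1] + 1  # coefficients plus error variance
    return n * np.log(rss / n) + 2 * k

# All non-empty subsets of candidate drivers, ranked by AIC.
models = [c for r in range(1, 5) for c in combinations(X_all, r)]
aics = np.array([fit_aic(m) for m in models])
weights = np.exp(-0.5 * (aics - aics.min()))
weights /= weights.sum()  # Akaike weights for model averaging
best = models[int(np.argmin(aics))]
print(best, round(weights.max(), 2))
```

The Akaike weights quantify the structural uncertainty the abstract refers to: when no single model dominates, averaged predictions hedge across plausible model structures.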

  20. Application of the JENDL-4.0 nuclear data set for uncertainty analysis of the prototype FBR Monju

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

This paper deals with uncertainty analysis of the Monju reactor using JENDL-4.0 and the ERANOS code. In 2010, the Japan Atomic Energy Agency (JAEA) released the JENDL-4.0 nuclear data set. This new evaluation contains improved cross-section values and emphasizes accurate covariance matrices. Also in 2010, JAEA restarted the sodium-cooled fast reactor prototype Monju after about 15 years of shutdown. The long shutdown resulted in a build-up of ²⁴¹Am by natural decay from the initially loaded Pu. As well as improved covariance matrices, JENDL-4.0 is announced to contain improved data for minor actinides, which makes the Monju reactor an especially relevant application of the new evaluation. The uncertainty analysis requires the determination of sensitivity coefficients. The well-established ERANOS code was chosen because of its integrated modules that allow users to perform sensitivity and uncertainty analysis. A JENDL-4.0 cross-section library is not available for ERANOS; therefore, a cross-section library had to be made from the original ENDF files for the ECCO cell code (part of ERANOS). For confirmation of the newly made library, calculations of a benchmark core were performed using the MZA and MZB benchmarks, and these showed results consistent with other libraries. Calculations for the Monju reactor were performed using hexagonal 3D geometry and PN transport theory. However, the ERANOS sensitivity modules cannot use the resulting fluxes, as these modules require finite-difference-based fluxes obtained from RZ SN-transport or 3D diffusion calculations. The corresponding geometrical models were made and the results verified against Monju restart experimental data. Uncertainty analysis was performed using the RZ model. The JENDL-4.0 uncertainty analysis showed a significant reduction of the uncertainty related to the fission cross-section of Pu, along with an increase of the uncertainty related to the capture cross-section of ²³⁸U, compared with the previous JENDL-3.3 version. Covariance data recently added in JENDL-4.0 for ²⁴¹Am appear to make a non-negligible contribution. (authors)

  1. [Transformer winding's temperature rising and an analysis of its uncertainty].

    PubMed

    Wang, Pei-Lian; Chen, Yu-En; Zhong, Sheng-Kui

    2007-09-01

This paper describes the temperature-rise test procedure for a transformer under normal load, together with practical precautions to be observed during the test. An uncertainty analysis of the transformer's temperature rise is also presented, based on data from practical examples.

  2. Sustaining food self-sufficiency of a nation: The case of Sri Lankan rice production and related water and fertilizer demands.

    PubMed

    Davis, Kyle Frankel; Gephart, Jessica A; Gunda, Thushara

    2016-04-01

Rising human demand and climatic variability have created greater uncertainty regarding global food trade and its effects on the food security of nations. To reduce reliance on imported food, many countries have focused on increasing their domestic food production in recent years. With clear goals for the complete self-sufficiency of rice production, Sri Lanka provides an ideal case study for examining the projected growth in domestic rice supply, how this compares to future national demand, and what the associated impacts from water and fertilizer demands may be. Using national rice statistics and estimates of intensification, this study finds that improvements in rice production can feed 25.3 million Sri Lankans (compared to a projected population of 23.8 million people) by 2050. However, to achieve this growth, consumptive water use and nitrogen fertilizer application may need to increase by as much as 69% and 23%, respectively. This assessment demonstrates that targets for maintaining self-sufficiency should better incorporate avenues for improving resource use efficiency.

  3. Irradiated ready-to-eat spinach leaves: How information influences awareness towards irradiation treatment and consumer's purchase intention

    NASA Astrophysics Data System (ADS)

    Finten, G.; Garrido, J. I.; Agüero, M. V.; Jagus, R. J.

    2017-01-01

This article aims to clarify and supply further information on food irradiation acceptance, with particular focus on Argentina and irradiated ready-to-eat (RTE) spinach leaves, through an open web-based survey. Results showed that half of the respondents were unfamiliar with food irradiation, while the other half, despite declaring knowledge of it, demonstrated uncertainty, confirming low awareness of this technology. Respondents who believed misleading myths about food irradiation represented 39%, while roughly as many were doubtful. After informative material was supplied, however, respondents were positively influenced and acceptance increased by 90%. Finally, 42% of respondents were willing to consume or purchase irradiated RTE spinach leaves, and 35% remained doubtful. Respondents who did not rule out accepting irradiated spinach could be considered potential consumers if intensive campaigns about the benefits of food irradiation were carried out by reliable actors. If the Argentinean RTE market grew, following the world consumption trend towards these products, irradiated spinach leaves could be successfully introduced by making better efforts to inform consumers about food irradiation.

  4. Rainfall or parameter uncertainty? The power of sensitivity analysis on grouped factors

    NASA Astrophysics Data System (ADS)

    Nossent, Jiri; Pereira, Fernando; Bauwens, Willy

    2017-04-01

Hydrological models are typically used to study and represent (a part of) the hydrological cycle. In general, the output of these models depends mostly on their input rainfall and parameter values. Both the model parameters and the input precipitation, however, are characterized by uncertainties and therefore lead to uncertainty in the model output. Sensitivity analysis (SA) makes it possible to assess and compare the importance of the different factors for this output uncertainty. To this end, the rainfall uncertainty can be incorporated in the SA by representing it as a probabilistic multiplier. Such a multiplier can be defined for the entire time series, or separate factors can be determined for every recorded rainfall pulse or for hydrologically independent storm events. As a consequence, the number of factors included in the SA to represent the rainfall uncertainty can be (much) lower or (much) higher than the number of model parameters. Although such analyses can yield interesting results, it remains challenging to determine which type of uncertainty affects the model output most, owing to the different weights the two types will have within the SA. In this study, we apply the variance-based Sobol' sensitivity analysis method to two different hydrological simulators (NAM and HyMod) for four diverse watersheds. Besides the different numbers of model parameters (NAM: 11 parameters; HyMod: 5 parameters), the setup of our combined sensitivity and uncertainty analysis is also varied by defining a variety of scenarios with diverse numbers of rainfall multipliers. To overcome the issue of the different numbers of factors and, thus, the different weights of the two types of uncertainty, we build on one of the advantageous properties of the Sobol' SA, i.e. treating grouped parameters as a single parameter.
In general, the results show a clear influence of the weights in the different SA scenarios. However, working with grouped factors resolves this issue and leads to clear importance results.
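Treating a whole group of factors as a single factor, as described above, can be sketched with a pick-freeze estimator of first-order Sobol' indices on a toy linear model. The model, coefficients, and group sizes below are illustrative assumptions, not the NAM or HyMod setup:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000

# Toy "hydrological" model: 3 model parameters plus 3 rainfall multipliers.
# The rainfall group is given larger coefficients, so it should dominate.
coef = np.array([1.0, 0.5, 0.5, 2.0, 1.0, 1.0])
model = lambda X: X @ coef

groups = {"parameters": [0, 1, 2], "rainfall": [3, 4, 5]}

# Two independent sample matrices (the standard Sobol' A/B design).
A = rng.standard_normal((N, 6))
B = rng.standard_normal((N, 6))
yA = model(A)
var = yA.var()

def first_order_group_index(idx):
    # Pick-freeze: keep the group's columns from A, resample the rest from B.
    AB = B.copy()
    AB[:, idx] = A[:, idx]
    yAB = model(AB)
    return (np.mean(yA * yAB) - yA.mean() * yAB.mean()) / var

for name, idx in groups.items():
    print(name, round(first_order_group_index(idx), 2))
```

For this linear model the analytic group indices are the group's summed squared coefficients over the total (0.2 for the parameters, 0.8 for the rainfall multipliers), so the two estimates are directly comparable despite the groups containing several factors each.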

  5. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence–absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence–absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there is no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. Search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.

  6. Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.

    PubMed

    Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James

    2009-04-01

    The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
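As a concrete illustration of the sampling-based approach mentioned above, Monte Carlo propagation through a generic ingestion-risk formula can be sketched as follows. All distributions and the slope factor are illustrative placeholders, not values from the reviewed studies:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

# Hypothetical lifetime cancer risk: R = C * IR * EF * ED / (BW * AT) * SF
C  = rng.lognormal(np.log(0.05), 0.5, N)  # DBP concentration, mg/L (illustrative)
IR = rng.normal(2.0, 0.3, N).clip(0.5)    # water intake, L/day
BW = rng.normal(70.0, 10.0, N).clip(30)   # body weight, kg
EF, ED, AT = 350, 70, 70 * 365            # days/yr, years, days
SF = 0.0062                               # slope factor, (mg/kg-day)^-1 (assumed)

dose = C * IR * EF * ED / (BW * AT)       # chronic daily intake, mg/kg-day
risk = dose * SF
print(f"median {np.median(risk):.2e}, 95th {np.quantile(risk, 0.95):.2e}")
```

The spread between the median and the upper percentile is the kind of output that distinguishes the probabilistic methods surveyed here from single-point risk estimates; interval or fuzzy approaches would instead carry bounds or membership functions through the same formula.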

  7. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry

Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
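The scenario-based flavor of optimization under uncertainty described above can be sketched with an invented one-decision gas-field toy problem (none of the numbers come from the paper): expected NPV is evaluated over sampled price scenarios, and the capacity maximizing it is selected.

```python
import numpy as np

rng = np.random.default_rng(1)

# Decision variable: plateau production capacity (units/day), on a coarse grid.
capacities = np.linspace(50, 200, 16)
# Uncertain parameter: gas price scenarios, $/unit (illustrative distribution).
prices = rng.lognormal(mean=1.0, sigma=0.4, size=10_000)

def npv(cap, price):
    capex = 2.0 * cap                        # investment scales with capacity
    revenue = price * np.minimum(cap, 120)   # reservoir deliverability caps output
    return revenue * 8 - capex               # 8 "discounted years" of production

# Evaluate expected NPV per candidate design over all scenarios, pick the best.
expected = [np.mean(npv(c, prices)) for c in capacities]
best_cap = capacities[int(np.argmax(expected))]
print(best_cap)
```

Building capacity beyond what the reservoir can deliver only adds capital cost, so the expected-value optimum sits at the deliverability limit; a full stochastic program would add recourse decisions per scenario, while an MC workflow would instead report the whole NPV distribution at each design.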

  8. Climate Change and Food Security: Health Impacts in Developed Countries

    PubMed Central

    Hooper, Lee; Abdelhamid, Asmaa; Bentham, Graham; Boxall, Alistair B.A.; Draper, Alizon; Fairweather-Tait, Susan; Hulme, Mike; Hunter, Paul R.; Nichols, Gordon; Waldron, Keith W.

    2012-01-01

    Background: Anthropogenic climate change will affect global food production, with uncertain consequences for human health in developed countries. Objectives: We investigated the potential impact of climate change on food security (nutrition and food safety) and the implications for human health in developed countries. Methods: Expert input and structured literature searches were conducted and synthesized to produce overall assessments of the likely impacts of climate change on global food production and recommendations for future research and policy changes. Results: Increasing food prices may lower the nutritional quality of dietary intakes, exacerbate obesity, and amplify health inequalities. Altered conditions for food production may result in emerging pathogens, new crop and livestock species, and altered use of pesticides and veterinary medicines, and affect the main transfer mechanisms through which contaminants move from the environment into food. All these have implications for food safety and the nutritional content of food. Climate change mitigation may increase consumption of foods whose production reduces greenhouse gas emissions. Impacts may include reduced red meat consumption (with positive effects on saturated fat, but negative impacts on zinc and iron intake) and reduced winter fruit and vegetable consumption. Developed countries have complex structures in place that may be used to adapt to the food safety consequences of climate change, although their effectiveness will vary between countries, and the ability to respond to nutritional challenges is less certain. Conclusions: Climate change will have notable impacts upon nutrition and food safety in developed countries, but further research is necessary to accurately quantify these impacts. 
Uncertainty about future impacts, coupled with evidence that climate change may lead to more variable food quality, emphasizes the need to maintain and strengthen existing structures and policies to regulate food production, monitor food quality and safety, and respond to nutritional and safety issues that arise. PMID:23124134

  9. Climate change and food security: health impacts in developed countries.

    PubMed

    Lake, Iain R; Hooper, Lee; Abdelhamid, Asmaa; Bentham, Graham; Boxall, Alistair B A; Draper, Alizon; Fairweather-Tait, Susan; Hulme, Mike; Hunter, Paul R; Nichols, Gordon; Waldron, Keith W

    2012-11-01

    Anthropogenic climate change will affect global food production, with uncertain consequences for human health in developed countries. We investigated the potential impact of climate change on food security (nutrition and food safety) and the implications for human health in developed countries. Expert input and structured literature searches were conducted and synthesized to produce overall assessments of the likely impacts of climate change on global food production and recommendations for future research and policy changes. Increasing food prices may lower the nutritional quality of dietary intakes, exacerbate obesity, and amplify health inequalities. Altered conditions for food production may result in emerging pathogens, new crop and livestock species, and altered use of pesticides and veterinary medicines, and affect the main transfer mechanisms through which contaminants move from the environment into food. All these have implications for food safety and the nutritional content of food. Climate change mitigation may increase consumption of foods whose production reduces greenhouse gas emissions. Impacts may include reduced red meat consumption (with positive effects on saturated fat, but negative impacts on zinc and iron intake) and reduced winter fruit and vegetable consumption. Developed countries have complex structures in place that may be used to adapt to the food safety consequences of climate change, although their effectiveness will vary between countries, and the ability to respond to nutritional challenges is less certain. Climate change will have notable impacts upon nutrition and food safety in developed countries, but further research is necessary to accurately quantify these impacts. 
Uncertainty about future impacts, coupled with evidence that climate change may lead to more variable food quality, emphasizes the need to maintain and strengthen existing structures and policies to regulate food production, monitor food quality and safety, and respond to nutritional and safety issues that arise.

  10. Predictive Uncertainty And Parameter Sensitivity Of A Sediment-Flux Model: Nitrogen Flux and Sediment Oxygen Demand

    EPA Science Inventory

    Estimating model predictive uncertainty is imperative to informed environmental decision making and management of water resources. This paper applies the Generalized Sensitivity Analysis (GSA) to examine parameter sensitivity and the Generalized Likelihood Uncertainty Estimation...

  11. Rights-based approaches to addressing food poverty and food insecurity in Ireland and UK.

    PubMed

    Dowler, Elizabeth A; O'Connor, Deirdre

    2012-01-01

    Food poverty is an important contributing factor to health inequalities in industrialised countries; it refers to the inability to acquire or eat an adequate quality or sufficient quantity of food in socially acceptable ways (or the uncertainty of being able to do so). Synonymous with household food insecurity, the issue needs to be located within a social justice framework. Recognising the clear interdependence between the right to food and the right to health, this paper explores how international human rights obligations could inform approaches to addressing food poverty and insecurity with specific reference to Ireland and the UK. Little attention has been paid to how countries should meet their obligations to respect, protect and fulfil the right to food in developed countries. The paper contributes by examining the social and policy circumstances which inhibit poor households from obtaining sufficient food to eat healthily, along with strategies and interventions from State and civil society actors in the two countries. In practice, problems and potential solutions have largely been directed towards the individual rather than at social determinants, particularly as research on environmental factors such as distance to shops has produced equivocal results. Other key structural aspects such as income sufficiency for food are broadly ignored by the State, and anti-poverty strategies are often implemented without monitoring for effects on food outcomes. Thus scant evidence exists for either Ireland or the UK meeting its rights to food obligations to date, in terms of roles and responsibilities in ensuring access to affordable, available and appropriate food for all. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. 
In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on USGS 2002 regional seismicity model.
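The challenge of integrating loss distributions with different degrees of correlation, noted above, can be sketched with a Gaussian-copula simulation over a tiny hypothetical three-site portfolio. The medians, dispersion, and correlation matrix are invented for illustration, not taken from the USGS model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical portfolio of 3 sites; ground-motion correlation induces
# correlation between the sites' lognormal loss distributions.
mu = np.log([1.0, 2.0, 0.5])   # log-median site losses, $M (assumed)
sigma = 0.8                    # common lognormal dispersion (assumed)
corr = np.array([[1.0, 0.6, 0.3],
                 [0.6, 1.0, 0.4],
                 [0.3, 0.4, 1.0]])

# Sample correlated standard normals via the Cholesky factor, then
# transform to lognormal site losses and aggregate across the portfolio.
L = np.linalg.cholesky(corr)
z = rng.standard_normal((100_000, 3)) @ L.T
losses = np.exp(mu + sigma * z).sum(axis=1)
print(round(float(np.quantile(losses, 0.99)), 1))
```

Positive correlation fattens the upper tail of the portfolio loss distribution relative to the independent case, which is exactly why spatial correlation matters for the exceedance-probability metrics used in ratemaking.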

  13. State of the World 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCarthy, D.

    1993-01-01

    State of the World 1993 warns particularly about global decline in food production and rise in poverty. However, other aspects are more positive: governments responding quickly to global environmental concerns such as the ozone hole and CFCs; the Earth Summit at Rio; the possibility we are on the road to a sustainable society. The uncertainty surrounding the issue of global warming is also presented.

  14. Climate and fishing steer ecosystem regeneration to uncertain economic futures

    PubMed Central

    Blenckner, Thorsten; Llope, Marcos; Möllmann, Christian; Voss, Rudi; Quaas, Martin F.; Casini, Michele; Lindegren, Martin; Folke, Carl; Chr. Stenseth, Nils

    2015-01-01

    Overfishing of large predatory fish populations has resulted in lasting restructurings of entire marine food webs worldwide, with serious socio-economic consequences. Fortunately, some degraded ecosystems show signs of recovery. A key challenge for ecosystem management is to anticipate the degree to which recovery is possible. By applying a statistical food-web model, using the Baltic Sea as a case study, we show that under current temperature and salinity conditions, complete recovery of this heavily altered ecosystem will be impossible. Instead, the ecosystem regenerates towards a new ecological baseline. This new baseline is characterized by lower and more variable biomass of cod, the commercially most important fish stock in the Baltic Sea, even under very low exploitation pressure. Furthermore, a socio-economic assessment shows that this signal is amplified at the level of societal costs, owing to increased uncertainty in biomass and reduced consumer surplus. Specifically, the combined economic losses amount to approximately 120 million € per year, which equals half of today's maximum economic yield for the Baltic cod fishery. Our analyses suggest that shifts in ecological and economic baselines can lead to higher economic uncertainty and costs for exploited ecosystems, in particular, under climate change. PMID:25694626

  15. Probabilistic Decision Tools for Determining Impacts of Agricultural Development Policy on Household Nutrition

    NASA Astrophysics Data System (ADS)

    Whitney, Cory W.; Lanzanova, Denis; Muchiri, Caroline; Shepherd, Keith D.; Rosenstock, Todd S.; Krawinkel, Michael; Tabuti, John R. S.; Luedeling, Eike

    2018-03-01

Governments around the world have agreed to end hunger and food insecurity and to improve global nutrition, largely through changes to agriculture and food systems. However, they face considerable uncertainty when making policy decisions, since any agricultural change will influence social and biophysical systems and could yield either positive or negative nutrition outcomes. We outline a holistic probability modeling approach with Bayesian Network (BN) models for nutritional impacts resulting from agricultural development policy. The approach includes the elicitation of expert knowledge for impact model development, including sensitivity analysis and value-of-information calculations. It aims at a generalizable methodology that can be applied in a wide range of contexts. To showcase this approach, we develop an impact model of Vision 2040, Uganda's development strategy, which, among other objectives, seeks to transform the country's agricultural landscape from traditional systems to large-scale commercial agriculture. Model results suggest that Vision 2040 is likely to have negative outcomes for the rural livelihoods it intends to support; it may have no appreciable influence on household hunger but, by influencing preferences for and access to quality nutritional foods, may increase the prevalence of micronutrient deficiency. The results highlight the trade-offs that must be negotiated when making decisions regarding agriculture for nutrition, and the capacity of BNs to make these trade-offs explicit. The work illustrates the value of BNs for supporting evidence-based agricultural development decisions.
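The kind of Bayesian Network reasoning described above can be shown at toy scale by exact enumeration over a hypothetical three-node chain (policy adoption → household income improvement → micronutrient deficiency). All probabilities here are invented; the actual Vision 2040 model is far larger and elicited from experts:

```python
# Hypothetical conditional probability tables (CPTs):
p_policy = 0.7                                # P(policy adopted)
p_income_given = {True: 0.4, False: 0.6}      # P(income improves | policy)
p_defic_given = {True: 0.2, False: 0.5}       # P(deficiency | income improves)

# Marginal P(deficiency) by summing the joint over all parent states.
p_defic = 0.0
for policy in (True, False):
    pp = p_policy if policy else 1 - p_policy
    for income in (True, False):
        pi = p_income_given[policy] if income else 1 - p_income_given[policy]
        p_defic += pp * pi * p_defic_given[income]
print(round(p_defic, 3))
```

In this invented CPT the policy actually lowers the chance of income improvement, so its marginal effect on deficiency is adverse; this is the sort of counterintuitive trade-off that BN enumeration makes explicit, and that value-of-information analysis then ranks by how much each uncertain node matters to the decision.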

  16. Shelf life extension as solution for environmental impact mitigation: A case study for bakery products.

    PubMed

    Bacenetti, Jacopo; Cavaliere, Alessia; Falcone, Giacomo; Giovenzana, Valentina; Banterle, Alessandro; Guidetti, Riccardo

    2018-06-15

Over the last years, increasing attention has been paid to environmental concerns related to food production and to potential solutions. Among the different strategies being considered to reduce the environmental impact of food production, only moderate attention has been paid to the extension of shelf life; a longer shelf life can reduce food losses as well as the economic and environmental impacts of distribution logistics. The aim of this study is to assess the environmental performance of whole-wheat breadsticks with extended shelf lives and to evaluate whether shelf-life extension is an effective mitigation solution from an environmental point of view. To this purpose, the life cycle assessment (LCA) approach was applied from a "cradle-to-grave" perspective. Rosmarinic acid was used as an antioxidant to extend the shelf life. To test the robustness of the results and to investigate the influence of the choices made in the modelling phase, sensitivity and uncertainty analyses were carried out. The results highlight that, for 10 of the 12 evaluated impact categories, shelf-life extension is an effective mitigation solution, with an effectiveness that depends on the magnitude of the product loss reduction achieved. Shelf-life extension does not reduce the environmental impact in the categories of human toxicity, cancer effects and freshwater eutrophication. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Dissipation of difenoconazole in apples used for production of baby food.

    PubMed

    Szpyrka, Ewa; Walorczyk, Stanisław

    2017-02-01

Dissipation of the fungicide difenoconazole (3-chloro-4-[(2RS,4RS;2RS,4SR)-4-methyl-2-(1H-1,2,4-triazol-1-ylmethyl)-1,3-dioxolan-2-yl]phenyl 4-chlorophenyl ether) was studied following its application on apples intended for the production of baby food. The apples (varieties: Jonagold Decosta, Gala and Idared) were sprayed with the formulation to control pathogens causing fungal diseases: powdery mildew (Podosphaera leucotricha Ell. et Ev./Salm.) and apple scab (Venturia inaequalis Cooke/Aderh.). A validated gas chromatography-based method with simultaneous electron capture and nitrogen-phosphorus detection (GC-ECD/NPD) was used for the residue analysis. The analytical performance of the method was highly satisfactory, with expanded uncertainties ≤ 19% (coverage factor k = 2, confidence level of 95%). The dissipation of difenoconazole followed pseudo-first-order kinetics (coefficients of determination, R², between 0.880 and 0.977). The half-life of difenoconazole was 12-21 days in experiments conducted on the three apple varieties. In these experiments, the initial residue levels declined gradually and reached 0.01 mg kg⁻¹ in 50-79 days. For the residue levels to remain below 0.01 mg kg⁻¹ (the maximum acceptable concentration for baby foods), difenoconazole must be applied approximately 3 months before harvest, at a dose of 0.2 L ha⁻¹ (50 g of active ingredient per ha).
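The pseudo-first-order kinetics reported above imply a simple exponential decline, C(t) = C₀·exp(−kt) with k = ln(2)/t½. With a hypothetical initial residue (the abstract does not report C₀), the waiting time to the 0.01 mg kg⁻¹ baby-food limit can be sketched as:

```python
import math

# Pseudo-first-order dissipation: C(t) = C0 * exp(-k * t), k = ln(2) / t_half.
def days_to_reach(c0, c_target, t_half):
    """Days for the residue to decay from c0 to c_target (same units)."""
    k = math.log(2) / t_half
    return math.log(c0 / c_target) / k

# Assumed initial residue of 0.15 mg/kg (illustrative, not from the study),
# evaluated at the two ends of the reported 12-21 day half-life range:
for t_half in (12, 21):
    print(t_half, round(days_to_reach(0.15, 0.01, t_half), 1))
```

Under this assumed C₀ the computed waiting times fall in the tens of days, broadly consistent with the 50-79 days observed in the trials and with the roughly three-month pre-harvest interval the authors recommend.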

  18. What Are the Main Drivers of Young Consumers Purchasing Traditional Food Products? European Field Research

    PubMed Central

    Kyrgiakos, Leonidas

    2018-01-01

    In this research, the attitudes of European young adults (aged 18 to 30 years) regarding their consumption of local and traditional products were examined. The survey was conducted on a sample of 836 consumers from seven European countries (Greece, Bulgaria, Romania, Slovenia, Croatia, Denmark and France). Data were collected by distributing a purpose-developed questionnaire through social media and university mail services. Principal Component Analysis (PCA) was used to identify consumer perceptions, comparing the overall sample with two subsets (consumers from Eastern and Western European countries). Six major factors were revealed: consumer behavior, uncertainty about health issues, cost, influence of media, influence of friends, and availability in store. Young adults had a positive attitude toward local and traditional food products, but they expressed insecurity about health issues. The cost factor had less influence on interviewees from Eastern European countries than on the overall sample (3rd and 5th factor, respectively). Influence of the close environment emerged as a distinct factor in Eastern countries, whereas in Western countries influence from media was more common. Females and older respondents (25–30 years old) had fewer doubts about traditional food products, while media had a strong influence on consumers’ decisions. The aim of this survey was to identify the consumer profiles of young adults and to inform different promotion strategies for local and traditional products in the two groups of countries. PMID:29439536

  19. The diagnostic value of component-resolved diagnostics in peanut allergy in children attending a Regional Paediatric Allergology Clinic.

    PubMed

    van Veen, Leonieke N; Heron, Michiel; Batstra, Manou; van Haard, Paul M M; de Groot, Hans

    2016-06-02

    To date, diagnosing food allergies in children still presents a diagnostic dilemma, leading to uncertainty concerning the definite diagnosis of peanut allergy, as well as to the need for strict diets and the potential need for adrenalin auto-injectors. This uncertainty in particular is thought to contribute to a lower quality of life. In the diagnostic process, double-blind food challenges are considered the gold standard, but they are time-consuming as well as potentially hazardous. Other diagnostic tests have been extensively studied, and among these component-resolved diagnostics appeared to present a promising alternative: Ara h 2, a peanut storage protein, was shown in previous studies to have significant predictive value. Sixty-two out of 72 children with suspected peanut allergy were analyzed using serum specific IgE and/or skin prick tests and specific IgE to several components of peanut (Ara h 1, 2, 3, 6, 8, 9). Subsequently, double-blind food challenges were performed. The correlation between the various diagnostic tests and the overall outcome of the double-blind food challenges was studied, in particular the severity of the reaction and the eliciting dose. The double-blind provocation with peanut was positive in 33 children (53 %). There was no relationship between the eliciting dose and the severity of the reaction. A statistically significant relationship was found between the skin prick test, specific IgE directed to peanut, Ara h 1, Ara h 2 or Ara h 6, and the outcome of the food challenge test, in terms of positive or negative (P < .001). However, we did not find any relationship between sensitisation to peanut extract or the different allergen components and the severity of the reaction or the eliciting dose. There was no correlation between IgE directed to Ara h 3, Ara h 8, Ara h 9 and the clinical outcome of the food challenge. This study shows that component-resolved diagnostics is not superior to specific IgE to peanut extract or to skin prick testing. At present, it cannot replace double-blind placebo-controlled food challenges for determination of the eliciting dose or the severity of the peanut allergy in our patient group.

  20. Assessments of Maize Yield Potential in the Korean Peninsula Using Multiple Crop Models

    NASA Astrophysics Data System (ADS)

    Kim, S. H.; Myoung, B.; Lim, C. H.; Lee, S. G.; Lee, W. K.; Kafatos, M.

    2015-12-01

    The Korean Peninsula has unique agricultural environments due to the differences in the political and socio-economic systems between the Republic of Korea (SK, hereafter) and the Democratic People's Republic of Korea (NK, hereafter). NK has been suffering from a lack of food supplies caused by natural disasters, land degradation and a failed political system. The neighboring developed country SK has a better agricultural system but a very low food self-sufficiency rate (around 1% for maize). Maize is an important crop in both countries, since it is a staple food for NK and SK is the world's second-largest maize importer after Japan. Therefore, evaluating maize yield potential (Yp) in the two distinct regions is essential to assess food security under climate change and variability. In this study, we have utilized multiple process-based crop models capable of regional-scale assessments to evaluate maize Yp over the Korean Peninsula: the GIS version of the EPIC model (GEPIC) and the APSIM model expanded to regional scales (APSIM regions). First we evaluated model performance and skill for 20 years, from 1991 to 2010, using reanalysis data (Local Data Assimilation and Prediction System (LDAPS); 1.5 km resolution) and observed data. Each model's performance was compared across regions of the Korean Peninsula with different regional climate characteristics. To quantify the influence of individual climate variables, we also conducted a sensitivity test using 20 years of climatology. Lastly, a multi-model ensemble analysis was performed to reduce crop model uncertainties. The results will provide valuable information for estimating the impacts of climate change and variability on Yp over the Korean Peninsula.
