Special Issue on Uncertainty Quantification in Multiscale System Design and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yan; Swiler, Laura
2017-09-07
The importance of uncertainty has been recognized in various modeling, simulation, and analysis applications, where inherent assumptions and simplifications affect the accuracy of model predictions for physical phenomena. As model predictions are now heavily relied upon for simulation-based system design, which includes new materials, vehicles, mechanical and civil structures, and even new drugs, wrong model predictions could potentially cause catastrophic consequences. Therefore, uncertainty and associated risks due to model errors should be quantified to support robust systems engineering.
Population viability analysis for endangered Roanoke logperch
Roberts, James H.; Angermeier, Paul; Anderson, Gregory B.
2016-01-01
A common strategy for recovering endangered species is ensuring that populations exceed the minimum viable population size (MVP), a demographic benchmark that theoretically ensures low long-term extinction risk. One method of establishing MVP is population viability analysis, a modeling technique that simulates population trajectories and forecasts extinction risk based on a series of biological, environmental, and management assumptions. Such models also help identify key uncertainties that have a large influence on extinction risk. We used stochastic count-based simulation models to explore extinction risk, MVP, and the possible benefits of alternative management strategies in populations of Roanoke logperch Percina rex, an endangered stream fish. Estimates of extinction risk were sensitive to the assumed population growth rate and model type, carrying capacity, and catastrophe regime (frequency and severity of anthropogenic fish kills), whereas demographic augmentation did little to reduce extinction risk. Under density-dependent growth, the estimated MVP for Roanoke logperch ranged from 200 to 4200 individuals, depending on the assumed severity of catastrophes. Thus, depending on the MVP threshold, anywhere from two to all five of the logperch populations we assessed were projected to be viable. Despite this uncertainty, these results help identify populations with the greatest relative extinction risk, as well as management strategies that might reduce this risk the most, such as increasing carrying capacity and reducing fish kills. Better estimates of population growth parameters and catastrophe regimes would facilitate the refinement of MVP and extinction-risk estimates, and they should be a high priority for future research on Roanoke logperch and other imperiled stream-fish species.
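As a hedged illustration of the kind of count-based stochastic simulation described above, the sketch below estimates quasi-extinction risk under Ricker-type density-dependent growth, environmental stochasticity, and random catastrophes; all parameter values are illustrative assumptions, not estimates for Roanoke logperch.

```python
import numpy as np

rng = np.random.default_rng(1)

def pva_extinction_risk(n0=500, k=2000, r_mean=0.05, r_sd=0.3,
                        cat_prob=0.02, cat_severity=0.75,
                        quasi_ext=50, years=100, n_runs=5000):
    """Count-based PVA: Ricker density dependence plus random catastrophes.

    Returns the fraction of simulated trajectories that fall below the
    quasi-extinction threshold within the time horizon.
    """
    extinct = 0
    for _ in range(n_runs):
        n = float(n0)
        for _ in range(years):
            r = rng.normal(r_mean, r_sd)          # environmental stochasticity
            n *= np.exp(r * (1.0 - n / k))        # Ricker density-dependent growth
            if rng.random() < cat_prob:           # e.g. an anthropogenic fish kill
                n *= 1.0 - cat_severity
            if n < quasi_ext:
                extinct += 1
                break
    return extinct / n_runs

print(f"quasi-extinction risk over 100 yr: {pva_extinction_risk():.3f}")
```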
Tools used by the insurance industry to assess risk from hydroclimatic extremes
NASA Astrophysics Data System (ADS)
Higgs, Stephanie; McMullan, Caroline
2016-04-01
Probabilistic catastrophe models are widely used within the insurance industry to assess and price the risk of natural hazards, from individual residences through to portfolios of millions of properties. Over the relatively short period that catastrophe models have been available (almost 30 years), the insurance industry has built up a financial resilience to key natural hazards in certain areas (e.g. US tropical cyclone, European extra-tropical cyclone and flood). However, due to the rapidly expanding global population and increase in wealth, together with uncertainties in the behaviour of meteorological phenomena introduced by climate change, the domain in which natural hazards impact society is growing. As a result, the insurance industry faces new challenges in assessing the risk and uncertainty from natural hazards. As a catastrophe modelling company, AIR Worldwide has a toolbox of options available to help the insurance industry assess extreme climatic events and their associated uncertainty. Here we discuss several of these tools: from helping analysts understand how uncertainty is inherently built into probabilistic catastrophe models, to alternative stochastic catalogs for tropical cyclones based on climate conditioning, to the use of stochastic extreme disaster events (such as those provided through AIR's catalogs or through the Lloyd's of London marketplace RDSs) as benchmarks for the loss exceedance probability and tail-risk metrics output by catastrophe models, to the visualisation of 1000+ year event footprints and hazard intensity maps. Ultimately the increased transparency of catastrophe models and the flexibility of a software platform that allows for customisation of modelled and non-modelled risks will drive a greater understanding of extreme hydroclimatic events within the insurance industry.
Quantifying the risk of extreme aviation accidents
NASA Astrophysics Data System (ADS)
Das, Kumer Pial; Dey, Asim Kumer
2016-12-01
Air travel is considered a safe means of transportation, but when aviation accidents do occur they often result in fatalities. Fortunately, the most extreme accidents occur rarely. However, 2014 was the deadliest year in the past decade, with 111 plane crashes; the worst four crashes caused 298, 239, 162, and 116 deaths. In this study, we assess the risk of catastrophic aviation accidents by studying historical aviation accidents. Applying a generalized Pareto model, we predict the maximum fatalities from a future aviation accident. The fitted model is compared with some of its competitor models. The uncertainty in the inferences is quantified using simulated aviation accident series generated by bootstrap resampling and Monte Carlo simulations.
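As a hedged illustration of the approach described above, the sketch below fits a generalized Pareto distribution to threshold excesses of a synthetic fatality series and bootstraps a high quantile; the data, threshold choice, and quantile are illustrative assumptions, not the study's values.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Hypothetical per-accident fatality counts (illustrative only, not real data).
fatalities = rng.negative_binomial(1, 0.02, size=500).astype(float)

u = np.quantile(fatalities, 0.95)            # peaks-over-threshold cutoff
excesses = fatalities[fatalities > u] - u

# Fit a generalized Pareto distribution to the threshold excesses.
shape, _, scale = genpareto.fit(excesses, floc=0.0)

# Bootstrap a high quantile of the fitted model to quantify sampling uncertainty.
boot = []
for _ in range(1000):
    resample = rng.choice(excesses, size=excesses.size, replace=True)
    c_b, _, s_b = genpareto.fit(resample, floc=0.0)
    boot.append(u + genpareto.ppf(0.99, c_b, loc=0.0, scale=s_b))

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"GPD shape={shape:.2f}, scale={scale:.1f}; 99th-pct fatalities CI: [{lo:.0f}, {hi:.0f}]")
```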
NASA Astrophysics Data System (ADS)
Terando, A. J.; Reich, B. J.; Pacifici, K.
2013-12-01
Fire is an important disturbance process in many coupled natural-human systems. Changes in the frequency and severity of fires due to anthropogenic climate change could have significant costs to society and the plant and animal communities that are adapted to a particular fire regime. Planning for these changes requires a robust model of the relationship between climate and fire that accounts for multiple sources of uncertainty that are present when simulating ecological and climatological processes. Here we model how anthropogenic climate change could affect the wildfire regime for a region in the Southeast US whose natural ecosystems are dependent on frequent, low-intensity fires while humans are at risk from large catastrophic fires. We develop a modeling framework that incorporates three major sources of uncertainty: (1) uncertainty in the ecological drivers of expected monthly area burned, (2) uncertainty in the environmental drivers influencing the probability of an extreme fire event, and (3) structural uncertainty in different downscaled climate models. In addition we use two policy-relevant emission scenarios (climate stabilization and 'business-as-usual') to characterize the uncertainty in future greenhouse gas forcings. We use a Bayesian framework to incorporate different sources of uncertainty, including simulation of predictive errors and Stochastic Search Variable Selection. Our results suggest that although the mean process remains stationary, the probability of extreme fires declines through time, owing to the persistence of high atmospheric moisture content during the peak fire season that dampens the effect of increasing temperatures. Including multiple sources of uncertainty leads to wide prediction intervals, but is potentially more useful for decision-makers who will require adaptation strategies that are robust to rapid but uncertain climate and ecological change.
NASA Astrophysics Data System (ADS)
Chisolm, Rachel E.; McKinney, Daene C.
2018-05-01
This paper studies the lake dynamics for avalanche-triggered glacial lake outburst floods (GLOFs) in the Cordillera Blanca mountain range in Ancash, Peru. As new glacial lakes emerge and existing lakes continue to grow, they pose an increasing threat of GLOFs that can be catastrophic to the communities living downstream. In this work, the dynamics of displacement waves produced from avalanches are studied through three-dimensional hydrodynamic simulations of Lake Palcacocha, Peru, with an emphasis on the sensitivity of the lake model to input parameters and boundary conditions. This type of avalanche-generated wave is an important link in the GLOF process chain because there is a high potential for overtopping and erosion of the lake-damming moraine. The lake model was evaluated for sensitivity to turbulence model and grid resolution, and the uncertainty due to these model parameters is significantly less than that due to avalanche boundary condition characteristics. Wave generation from avalanche impact was simulated using two different boundary condition methods. Representation of an avalanche as water flowing into the lake generally resulted in higher peak flows and overtopping volumes than simulating the avalanche impact as mass-momentum inflow at the lake boundary. Three different scenarios of avalanche size were simulated for the current lake conditions, and all resulted in significant overtopping of the lake-damming moraine. Although the lake model introduces significant uncertainty, the avalanche portion of the GLOF process chain is likely to be the greatest source of uncertainty. To aid in evaluation of hazard mitigation alternatives, two scenarios of lake lowering were investigated. While large avalanches produced significant overtopping waves for all lake-lowering scenarios, simulations suggest that it may be possible to contain waves generated from smaller avalanches if the surface of the lake is lowered.
Repeatability and uncertainty analyses of light gas gun test data
NASA Technical Reports Server (NTRS)
Schonberg, William P.; Cooper, David
1994-01-01
All large spacecraft are susceptible to high-speed impacts by meteoroids and pieces of orbiting space debris which can damage flight-critical systems and in turn lead to catastrophic failure. One way to obtain information on the response of a structure to a meteoroid impact or an orbital debris impact is to simulate the impact conditions of interest in the laboratory and analyze the resulting damage to a target structure. As part of the Phase B and C/D development activities for the Space Station Freedom, 950 impact tests were performed using the NASA/Marshall Space Flight Center (MSFC) light gas gun from 1985-1991. This paper presents the results of impact phenomena repeatability and data uncertainty studies performed using the information obtained from those tests. The results of these studies can be used to assess the utility of individual current and future NASA/MSFC impact test results in the design of long-duration spacecraft.
NASA Technical Reports Server (NTRS)
Davis, D. R. (Editor); Farinella, P. (Editor); Paolicchi, P. (Editor); Zappala, V. (Editor)
1986-01-01
Theoretical, numerical, and experimental investigations of the violent disruption of asteroids or planetary satellites are discussed in reviews and reports. Topics examined include acceleration techniques and results of experiments simulating catastrophic fragmentation events; laboratory simulations of catastrophic impact; scaling laws for the catastrophic collisions of asteroids; asteroid collisional history, the origin of the Hirayama families, and disruption of small satellites; and the implications of the inferred compositions of asteroids for their collisional evolution. Diagrams, graphs, tables, and a summary of the discussion at the workshop are provided.
Avoiding an uncertain catastrophe: Climate change mitigation under risk and wealth heterogeneity
Thomas C. Brown; Stephan Kroll
2017-01-01
For environmental problems such as climate change, uncertainty about future conditions makes it difficult to know what the goal of mitigation efforts should be, and inequality among the affected parties makes it hard for them to know how much they each should do toward reaching the goal. We examine the effects of scientific uncertainty and wealth inequality in...
VizieR Online Data Catalog: SDSS-DR9 photometric redshifts (Brescia+, 2014)
NASA Astrophysics Data System (ADS)
Brescia, M.; Cavuoti, S.; Longo, G.; de Stefano, V.
2014-07-01
We present an application of a machine learning method to the estimation of photometric redshifts for the galaxies in the SDSS Data Release 9 (SDSS-DR9). Photometric redshifts for more than 143 million galaxies were produced. The MLPQNA (Multi Layer Perceptron with Quasi Newton Algorithm) model provided within the framework of the DAMEWARE (DAta Mining and Exploration Web Application REsource) is an interpolative method derived from machine learning models. The obtained redshifts have an overall uncertainty of σ=0.023 with a very small average bias of about 3×10⁻⁵ and a fraction of catastrophic outliers of about 5%. After removal of the catastrophic outliers, the uncertainty is about σ=0.017. The catalogue files report in their name the range of DEC degrees related to the included objects. (60 data files).
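As a hedged illustration, the sketch below computes the usual photometric-redshift quality metrics (normalized bias, scatter, and catastrophic-outlier fraction); the |Δz|/(1+z) > 0.15 outlier cut and the synthetic redshifts are assumptions for illustration, not the exact definitions used by the authors.

```python
import numpy as np

def photoz_metrics(z_spec, z_phot, outlier_cut=0.15):
    """Bias, scatter, and catastrophic-outlier fraction for photo-z estimates."""
    dz = (z_phot - z_spec) / (1.0 + z_spec)      # normalized residuals
    outliers = np.abs(dz) > outlier_cut          # "catastrophic" outliers
    clean = dz[~outliers]
    return {
        "bias": dz.mean(),
        "sigma_all": dz.std(),
        "sigma_clean": clean.std(),
        "outlier_fraction": outliers.mean(),
    }

# Illustrative usage with synthetic redshifts.
rng = np.random.default_rng(3)
z_spec = rng.uniform(0.0, 1.0, 100_000)
z_phot = z_spec + rng.normal(0.0, 0.02, z_spec.size) * (1.0 + z_spec)
print(photoz_metrics(z_spec, z_phot))
```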
Penetration of n-hexadecane and water into wood under conditions simulating catastrophic floods
Ganna Baglayeva; Wayne S. Seames; Charles R. Frihart; Jane O' Dell; Evguenii I. Kozliak
2017-01-01
To simulate fuel oil spills occurring during catastrophic floods, short-term absorption of two chemicals, n-hexadecane (representative of semivolatile organic compounds in fuel oil) and water, into southern yellow pine was gravimetrically monitored as a function of time at ambient conditions. Different scenarios were run on the basis of (1) the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Quanhao; Wang, Yuming; Hu, Youqiu
Since only the magnetic conditions at the photosphere can be routinely observed in current observations, it is of great significance to determine the influences of photospheric magnetic conditions on solar eruptive activities. Previous studies about catastrophe indicated that the magnetic system consisting of a flux rope in a partially open bipolar field is subject to catastrophe, but not if the bipolar field is completely closed under the same specified photospheric conditions. In order to investigate the influence of the photospheric magnetic conditions on the catastrophic behavior of this system, we expand upon the 2.5-dimensional ideal magnetohydrodynamic model in Cartesian coordinates to simulate the evolution of the equilibrium states of the system under different photospheric flux distributions. Our simulation results reveal that a catastrophe occurs only when the photospheric flux is not concentrated too much toward the polarity inversion line and the source regions of the bipolar field are not too weak; otherwise no catastrophe occurs. As a result, under certain photospheric conditions, a catastrophe could take place in a completely closed configuration, whereas it ceases to exist in a partially open configuration. This indicates that whether the background field is completely closed or partially open is not the only necessary condition for the existence of catastrophe, and that the photospheric conditions also play a crucial role in the catastrophic behavior of the flux rope system.
ERIC Educational Resources Information Center
Cryer, Patricia
1988-01-01
Develops models for participants' behaviors in games, simulations, and workshops based on Catastrophe Theory and Herzberg's two-factor theory of motivation. Examples are given of how these models can be used, both for describing and understanding the behaviors of individuals, and for eliciting insights into why participants behave as they do. (11…
Modeling extreme hurricane damage in the United States using generalized Pareto distribution
NASA Astrophysics Data System (ADS)
Dey, Asim Kumer
Extreme value distributions are used to understand and model natural calamities, man-made catastrophes, and financial collapses. Extreme value theory has been developed to study the frequency of such events and to construct a predictive model so that one can attempt to forecast the frequency of a disaster and the amount of damage from such a disaster. In this study, hurricane damages in the United States from 1900-2012 have been studied. The aim of the paper is three-fold. First, normalizing hurricane damage and fitting an appropriate model to the normalized damage data. Secondly, predicting the maximum economic damage from a hurricane in the future by using the concept of return period. Finally, quantifying the uncertainty in the inference of extreme return levels of hurricane losses by using a simulated hurricane series, generated by bootstrap sampling. Normalized hurricane damage data are found to follow a generalized Pareto distribution. It is demonstrated that the standard deviation and coefficient of variation increase with the return period, which indicates an increase in uncertainty with model extrapolation.
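For reference, a standard N-year return-level expression consistent with the peaks-over-threshold/GPD approach described above is shown below; the symbols (threshold u, GPD scale σ and shape ξ, threshold exceedance probability ζ_u, and observations per year n_y) are generic and not values estimated in the study.

```latex
% N-year return level for threshold excesses modeled by a GPD (case xi != 0):
\[
  z_N \;=\; u \;+\; \frac{\sigma}{\xi}\Bigl[\bigl(N\, n_y\, \zeta_u\bigr)^{\xi} - 1\Bigr],
  \qquad \zeta_u = \Pr(X > u).
\]
% For xi = 0 the bracketed term is replaced by the logarithmic form
% sigma * log(N * n_y * zeta_u).
```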
Medium term hurricane catastrophe models: a validation experiment
NASA Astrophysics Data System (ADS)
Bonazzi, Alessandro; Turner, Jessica; Dobbin, Alison; Wilson, Paul; Mitas, Christos; Bellone, Enrica
2013-04-01
Climate variability is a major source of uncertainty for the insurance industry underwriting hurricane risk. Catastrophe models provide their users with a stochastic set of events that expands the scope of the historical catalogue by including synthetic events that are likely to happen in a defined time-frame. The use of these catastrophe models is widespread in the insurance industry but it is only in recent years that climate variability has been explicitly accounted for. In the insurance parlance "medium term catastrophe model" refers to products that provide an adjusted view of risk that is meant to represent hurricane activity on a 1 to 5 year horizon, as opposed to long term models that integrate across the climate variability of the longest available time series of observations. In this presentation we discuss how a simple reinsurance program can be used to assess the value of medium term catastrophe models. We elaborate on similar concepts as discussed in "Potential Economic Value of Seasonal Hurricane Forecasts" by Emanuel et al. (2012, WCAS) and provide an example based on 24 years of historical data of the Chicago Mercantile Hurricane Index (CHI), an insured loss proxy. Profit and loss volatility of a hypothetical primary insurer are used to score medium term models versus their long term counterpart. Results show that medium term catastrophe models could help a hypothetical primary insurer to improve their financial resiliency to varying climate conditions.
Catastrophic event recorded among Holocene eolianites (Sidi Salem Formation, SE Tunisia)
NASA Astrophysics Data System (ADS)
Frébourg, Gregory; Hasler, Claude-Alain; Davaud, Eric
2010-03-01
A high-energy deposit cuts through the early Holocene eolianites of the Sidi Salem Formation, which forms a ridge along the southeastern coast of Tunisia. The sedimentary structures, as well as the paleo-altitude and paleo-location of the outcrop, point to subaqueous deposition by an unusually large catastrophic event. Given its age and the related uncertainties, it could be either an exceptional storm or a landslide- or impact-triggered tsunami. The mega-tsunami of the 8000 BP collapse of the Valle del Bove (Etna Volcano) could be this event, given its matching age and calculated run-up height.
The Cusp Catastrophe Model as Cross-Sectional and Longitudinal Mixture Structural Equation Models
Chow, Sy-Miin; Witkiewitz, Katie; Grasman, Raoul P. P. P.; Maisto, Stephen A.
2015-01-01
Catastrophe theory (Thom, 1972, 1993) is the study of the many ways in which continuous changes in a system’s parameters can result in discontinuous changes in one or several outcome variables of interest. Catastrophe theory–inspired models have been used to represent a variety of change phenomena in the realm of social and behavioral sciences. Despite their promise, widespread applications of catastrophe models have been impeded, in part, by difficulties in performing model fitting and model comparison procedures. We propose a new modeling framework for testing one kind of catastrophe model — the cusp catastrophe model — as a mixture structural equation model (MSEM) when cross-sectional data are available; or alternatively, as an MSEM with regime-switching (MSEM-RS) when longitudinal panel data are available. The proposed models and the advantages offered by this alternative modeling framework are illustrated using two empirical examples and a simulation study. PMID:25822209
NASA Astrophysics Data System (ADS)
Bachelet, D. M.; Ferschweiler, K.; Baker, B.; Sleeter, B. M.
2016-12-01
Climate variability and a warming trend during the 21st century ensure fuel build-up and episodic catastrophic wildfires. We used downscaled (2.5 arcmin) CMIP5 climate futures from 20 models under RCP 8.5 to run the dynamic global vegetation model MC2 over the conterminous US and identify key drivers of land cover change. We show regional and temporal differences in the magnitude of projected C losses due to fire over the 21st century. We also look at the vigor (NPP/LAI) of forest lands and estimate the loss in C capture due to declines in production as well as the increase in heterotrophic respiration due to increased mortality. We compare the simulated carbon sequestration potential of terrestrial biomes and the risk of carbon losses through disturbance. We quantify uncertainty in model results by showing the distribution of possible future impacts under 20 futures. We explore the effects of land use and highlight the challenges we met in simulating credible transient management practices throughout the 20th century and into the future.
Finkelstein, M.E.; Wolf, S.; Goldman, M.; Doak, D.F.; Sievert, P.R.; Balogh, G.; Hasegawa, H.
2010-01-01
Catastrophic events, either from natural (e.g., hurricane) or human-induced (e.g., forest clear-cut) processes, are a well-known threat to wild populations. However, our lack of knowledge about population-level effects of catastrophic events has inhibited the careful examination of how catastrophes affect population growth and persistence. For the critically endangered short-tailed albatross (Phoebastria albatrus), episodic volcanic eruptions are considered a serious catastrophic threat since approximately 80% of the global population of ~2500 birds (in 2006) currently breeds on an active volcano, Torishima Island. We evaluated how short-tailed albatross population persistence is affected by the catastrophic threat of a volcanic eruption relative to chronic threats. We also provide an example for overcoming the seemingly overwhelming problems created by modelling the population dynamics of a species with limited demographic data by incorporating uncertainty in our analysis. As such, we constructed a stochastic age-based matrix model that incorporated both catastrophic mortality due to volcanic eruptions and chronic mortality from several potential sources (e.g., contaminant exposure, fisheries bycatch) to determine the relative effects of these two types of threats on short-tailed albatross population growth and persistence. Modest increases (1%) in chronic (annual) mortality had a 2.5-fold greater effect on predicted short-tailed albatross stochastic population growth rate (lambda) than did the occurrence of periodic volcanic eruptions that follow historic eruption frequencies (annual probability of eruption 2.2%). Our work demonstrates that periodic catastrophic volcanic eruptions, despite their dramatic nature, are less likely to affect the population viability and recovery of short-tailed albatross than low-level chronic mortality.
Advanced Booster Liquid Engine Combustion Stability
NASA Technical Reports Server (NTRS)
Tucker, Kevin; Gentz, Steve; Nettles, Mindy
2015-01-01
Combustion instability is a phenomenon in liquid rocket engines caused by complex coupling between the time-varying combustion processes and the fluid dynamics in the combustor. Consequences of the large pressure oscillations associated with combustion instability often cause significant hardware damage and can be catastrophic. The current combustion stability assessment tools are limited by the level of empiricism in many inputs and embedded models. This limited predictive capability creates significant uncertainty in stability assessments. This large uncertainty then increases hardware development costs due to heavy reliance on expensive and time-consuming testing.
Modeling of Flood Risk for the Continental United States
NASA Astrophysics Data System (ADS)
Lohmann, D.; Li, S.; Katz, B.; Goteti, G.; Kaheil, Y. H.; Vojjala, R.
2011-12-01
The science of catastrophic risk modeling helps people to understand the physical and financial implications of natural catastrophes (hurricanes, floods, earthquakes, etc.), terrorism, and the risks associated with changes in life expectancy. As such it depends on simulation techniques that integrate multiple disciplines such as meteorology, hydrology, structural engineering, statistics, computer science, financial engineering, actuarial science, and more in virtually every field of technology. In this talk we will explain the techniques and underlying assumptions of building the RMS US flood risk model. We especially pay attention to correlation (spatial and temporal), simulation, and uncertainty in each of the various components of the development process. Recent extreme floods (e.g. the US Midwest flood of 2008 and the US Northeast flood of 2010) have increased concern about flood risk. Consequently, there is a growing need to adequately assess flood risk. The RMS flood hazard model comprises three major components. (1) A stochastic precipitation simulation module based on a Monte-Carlo analogue technique, which is capable of producing correlated rainfall events for the continental US. (2) A rainfall-runoff and routing module. A semi-distributed rainfall-runoff model was developed to properly assess antecedent conditions and determine the saturation area and runoff. The runoff is further routed downstream along the rivers by a routing model. Combined with the precipitation model, this allows us to correlate streamflow, and hence flooding, from different rivers, as well as low and high return periods, across the continental US. (3) A flood inundation module. It transforms the discharge (output from the flow routing) into water level, which is further combined with a two-dimensional off-floodplain inundation model to produce a comprehensive flood hazard map. The performance of the model is demonstrated by comparison with observations and published data. Output from the flood hazard model is used to drive a flood loss model that is coupled to a financial model.
Laboratory tests of catastrophic disruption of rotating bodies
NASA Astrophysics Data System (ADS)
Morris, A. J. W.; Burchell, M. J.
2017-11-01
The results of catastrophic disruption experiments on static and rotating targets are reported. The experiments used cement spheres of diameter 10 cm as the targets. Impacts were by mm-sized stainless steel spheres at speeds of between 1 and 7.75 km s⁻¹. Energy densities (Q) in the targets ranged from 7 to 2613 J kg⁻¹. The experiments covered both the cratering and catastrophic disruption regimes. For static (i.e. non-rotating) targets, the critical energy density for disruption (Q*, the value of Q when the largest surviving target fragment has a mass equal to one half of the pre-impact target mass) was Q* = 1447 ± 90 J kg⁻¹. For rotating targets (median rotation frequency of 3.44 Hz) we found Q* = 987 ± 349 J kg⁻¹, a reduction of 32% in the mean value. This lower value of Q* for rotating targets was also accompanied by a larger scatter on the data, hence the greater uncertainty. We suggest that in some cases the rotating targets behaved as static targets, i.e. broke up with the same catastrophic disruption threshold, but in other cases the rotation helped the break up, causing a lower catastrophic disruption threshold, hence both the lower value of Q* and the larger scatter on the data. The fragment mass distributions after impact were similar in both the static and rotating target experiments, with similar slopes.
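For orientation, the specific impact energy Q quoted above is the projectile kinetic energy per unit target mass; the worked numbers below (a 1 mm steel sphere at 5 km/s into a roughly 1 kg cement sphere) are illustrative assumptions, not values reported for any particular shot.

```latex
% Specific impact energy: projectile kinetic energy per unit target mass
\[
  Q \;=\; \frac{\tfrac{1}{2}\, m_p\, v^2}{M_t}
\]
% Illustrative (assumed) values: m_p ~ 4e-6 kg (1 mm steel sphere), v = 5 km/s,
% M_t ~ 1 kg (10 cm cement sphere)
% => Q ~ 0.5 * 4e-6 * (5000)^2 / 1 ~ 50 J/kg,
% well below the static-target Q* ~ 1447 J/kg reported above.
```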
Effects of aging in catastrophe on the steady state and dynamics of a microtubule population
NASA Astrophysics Data System (ADS)
Jemseena, V.; Gopalakrishnan, Manoj
2015-05-01
Several independent observations have suggested that the catastrophe transition in microtubules is not a first-order process, as is usually assumed. Recent in vitro observations by Gardner et al. [M. K. Gardner et al., Cell 147, 1092 (2011), 10.1016/j.cell.2011.10.037] showed that microtubule catastrophe takes place via multiple steps and the frequency increases with the age of the filament. Here we investigate, via numerical simulations and mathematical calculations, some of the consequences of the age dependence of catastrophe on the dynamics of microtubules as a function of the aging rate, for two different models of aging: exponential growth, but saturating asymptotically, and purely linear growth. The boundary demarcating the steady-state and non-steady-state regimes in the dynamics is derived analytically in both cases. Numerical simulations, supported by analytical calculations in the linear model, show that aging leads to nonexponential length distributions in steady state. More importantly, oscillations ensue in microtubule length and velocity. The regularity of oscillations, as characterized by the negative dip in the autocorrelation function, is reduced by increasing the frequency of rescue events. Our study shows that the age dependence of catastrophe could function as an intrinsic mechanism to generate oscillatory dynamics in a microtubule population, distinct from hitherto identified ones.
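A minimal Monte Carlo sketch of the aging effect discussed above is given below: a single two-state filament whose catastrophe rate increases linearly with time since the last rescue. The growth/shrinkage speeds, rates, and linear aging law are illustrative assumptions, not the fitted parameters of the study.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_filament(t_max=2000.0, dt=0.01, v_grow=1.0, v_shrink=3.0,
                      k_cat0=0.002, aging_rate=0.002, k_res=0.05):
    """Two-state microtubule with an age-dependent catastrophe rate.

    The catastrophe rate grows linearly with the filament's 'age' (time spent
    growing since the last rescue), mimicking multistep catastrophe; a rescue
    resets the age to zero.
    """
    length, age, growing = 0.0, 0.0, True
    lengths = []
    t = 0.0
    while t < t_max:
        if growing:
            length += v_grow * dt
            age += dt
            if rng.random() < (k_cat0 + aging_rate * age) * dt:
                growing = False
        else:
            length = max(0.0, length - v_shrink * dt)
            if rng.random() < k_res * dt or length == 0.0:
                growing, age = True, 0.0
        lengths.append(length)
        t += dt
    return np.array(lengths)

traj = simulate_filament()
print(f"mean length = {traj.mean():.1f}, std = {traj.std():.1f}")
```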
GDP-to-GTP exchange on the microtubule end can contribute to the frequency of catastrophe
Piedra, Felipe-Andrés; Kim, Tae; Garza, Emily S.; Geyer, Elisabeth A.; Burns, Alexander; Ye, Xuecheng; Rice, Luke M.
2016-01-01
Microtubules are dynamic polymers of αβ-tubulin that have essential roles in chromosome segregation and organization of the cytoplasm. Catastrophe—the switch from growing to shrinking—occurs when a microtubule loses its stabilizing GTP cap. Recent evidence indicates that the nucleotide on the microtubule end controls how tightly an incoming subunit will be bound (trans-acting GTP), but most current models do not incorporate this information. We implemented trans-acting GTP into a computational model for microtubule dynamics. In simulations, growing microtubules often exposed terminal GDP-bound subunits without undergoing catastrophe. Transient GDP exposure on the growing plus end slowed elongation by reducing the number of favorable binding sites on the microtubule end. Slower elongation led to erosion of the GTP cap and an increase in the frequency of catastrophe. Allowing GDP-to-GTP exchange on terminal subunits in simulations mitigated these effects. Using mutant αβ-tubulin or modified GTP, we showed experimentally that a more readily exchangeable nucleotide led to less frequent catastrophe. Current models for microtubule dynamics do not account for GDP-to-GTP exchange on the growing microtubule end, so our findings provide a new way of thinking about the molecular events that initiate catastrophe. PMID:27146111
An application of Mean Escape Time and metapopulation on forestry catastrophe insurance
NASA Astrophysics Data System (ADS)
Li, Jiangcheng; Zhang, Chunmin; Liu, Jifa; Li, Zhen; Yang, Xuan
2018-04-01
A forestry catastrophe insurance model for losses due to forestry pest infestations and disease epidemics is developed by employing metapopulation dynamics and the statistical properties of Mean Escape Time (MET). The probability of an outbreak of forestry catastrophe loss and the catastrophe loss payment time are investigated using MET. Forestry loss data from China are used for model simulation. The experimental results are as follows: (1) the model with analytical results is shown to be a better fit; (2) given a large patch area and patch structure, a high system factor, a low extinction rate, strong multiplicative noise, and additive noise with a high cross-correlation strength, an outbreak of forestry catastrophe loss or a catastrophe loss payment due to forestry pest infestations and disease epidemics can occur; (3) an optimal catastrophe loss payment time (MET) due to forestry pest infestations and disease epidemics can be identified by choosing an appropriate level of multiplicative noise while keeping the additive noise in a low range of values and the cross-correlation strength in a high range of values.
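As a rough illustration of the Mean Escape Time idea used above, the sketch below estimates an MET by Euler-Maruyama simulation of a generic bistable stochastic model driven by multiplicative and additive noise; the drift term, noise intensities, and escape threshold are illustrative assumptions, not the paper's metapopulation model.

```python
import numpy as np

rng = np.random.default_rng(11)

def mean_escape_time(x0=1.0, a=0.3, d_mult=0.05, d_add=0.01,
                     dt=0.01, t_max=200.0, threshold=0.1, n_runs=200):
    """Mean escape time (MET) of a stochastic population-like variable.

    The drift x(1-x)(x-a) has stable states near x=1 (healthy stock) and x=0
    (collapse); an escape is counted when x first drops below `threshold`.
    Multiplicative and additive Gaussian noises are applied independently.
    """
    times = []
    for _ in range(n_runs):
        x, t = x0, 0.0
        while t < t_max:
            drift = x * (1.0 - x) * (x - a)
            noise = (x * np.sqrt(2.0 * d_mult) * rng.normal()
                     + np.sqrt(2.0 * d_add) * rng.normal())
            x += drift * dt + noise * np.sqrt(dt)   # Euler-Maruyama step
            t += dt
            if x < threshold:
                break
        times.append(t)
    return np.mean(times)

print(f"estimated MET ~ {mean_escape_time():.0f} time units")
```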
"But it might be a heart attack": intolerance of uncertainty and panic disorder symptoms.
Carleton, R Nicholas; Duranceau, Sophie; Freeston, Mark H; Boelen, Paul A; McCabe, Randi E; Antony, Martin M
2014-06-01
Panic disorder models describe interactions between feared anxiety-related physical sensations (i.e., anxiety sensitivity; AS) and catastrophic interpretations therein. Intolerance of uncertainty (IU) has been implicated as necessary for catastrophic interpretations in community samples. The current study examined relationships between IU, AS, and panic disorder symptoms in a clinical sample. Participants had a principal diagnosis of panic disorder, with or without agoraphobia (n=132; 66% women). IU was expected to account for significant variance in panic symptoms controlling for AS. AS was expected to mediate the relationship between IU and panic symptoms, whereas IU was expected to moderate the relationship between AS and panic symptoms. Hierarchical linear regressions indicated that IU accounted for significant unique variance in panic symptoms relative to AS, with comparable part correlations. Mediation and moderation models were also tested and suggested direct and indirect effects of IU on panic symptoms through AS; however, an interaction effect was not supported. The current cross-sectional evidence supports a role for IU in panic symptoms, independent of AS. Copyright © 2014 Elsevier Ltd. All rights reserved.
GDP-to-GTP exchange on the microtubule end can contribute to the frequency of catastrophe.
Piedra, Felipe-Andrés; Kim, Tae; Garza, Emily S; Geyer, Elisabeth A; Burns, Alexander; Ye, Xuecheng; Rice, Luke M
2016-11-07
Microtubules are dynamic polymers of αβ-tubulin that have essential roles in chromosome segregation and organization of the cytoplasm. Catastrophe-the switch from growing to shrinking-occurs when a microtubule loses its stabilizing GTP cap. Recent evidence indicates that the nucleotide on the microtubule end controls how tightly an incoming subunit will be bound (trans-acting GTP), but most current models do not incorporate this information. We implemented trans-acting GTP into a computational model for microtubule dynamics. In simulations, growing microtubules often exposed terminal GDP-bound subunits without undergoing catastrophe. Transient GDP exposure on the growing plus end slowed elongation by reducing the number of favorable binding sites on the microtubule end. Slower elongation led to erosion of the GTP cap and an increase in the frequency of catastrophe. Allowing GDP-to-GTP exchange on terminal subunits in simulations mitigated these effects. Using mutant αβ-tubulin or modified GTP, we showed experimentally that a more readily exchangeable nucleotide led to less frequent catastrophe. Current models for microtubule dynamics do not account for GDP-to-GTP exchange on the growing microtubule end, so our findings provide a new way of thinking about the molecular events that initiate catastrophe.
NASA Astrophysics Data System (ADS)
Foulser-Piggott, R.; Saito, K.; Spence, R.
2012-04-01
Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.
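As a hedged illustration of the disaggregation idea described above, the sketch below distributes an aggregated building count over grid cells in proportion to a covariate such as built-up area; the weighting covariate and numbers are assumptions for illustration, not the methods developed in the study.

```python
import numpy as np

def disaggregate_buildings(total_buildings, cell_weights):
    """Distribute an aggregated building count over grid cells proportionally
    to a covariate (e.g. built-up area or population per cell)."""
    w = np.asarray(cell_weights, dtype=float)
    shares = w / w.sum()
    return total_buildings * shares

# Illustrative usage: 12,000 buildings reported for an admin unit with 5 cells.
weights = [0.5, 3.2, 1.1, 0.2, 2.0]      # e.g. km² of built-up land per cell
print(disaggregate_buildings(12000, weights))
```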
NASA Astrophysics Data System (ADS)
Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.
2012-04-01
Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry in order to assure the model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those that could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) calibration of the seismic intensity attenuation model against macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of the country-specific vulnerability modifiers from past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by means of "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for the client market portfolio align with the repeatability of such catastrophe losses in the country. The validation process also included collaboration between Aon Benfield and its client in order to account for the insurance market penetration in Algeria, estimated at approximately 5%. Thus, we believe that the applied approach led to an earthquake model for Algeria that is scientifically sound and reliable on the one hand and market and client oriented on the other.
Sensitivity of collective action to uncertainty about climate tipping points
NASA Astrophysics Data System (ADS)
Barrett, Scott; Dannenberg, Astrid
2014-01-01
Despite more than two decades of diplomatic effort, concentrations of greenhouse gases continue to trend upwards, creating the risk that we may someday cross a threshold for `dangerous' climate change. Although climate thresholds are very uncertain, new research is trying to devise `early warning signals' of an approaching tipping point. This research offers a tantalizing promise: whereas collective action fails when threshold uncertainty is large, reductions in this uncertainty may bring about the behavioural change needed to avert a climate `catastrophe'. Here we present the results of an experiment, rooted in a game-theoretic model, showing that behaviour differs markedly either side of a dividing line for threshold uncertainty. On one side of the dividing line, where threshold uncertainty is relatively large, free riding proves irresistible and trust illusive, making it virtually inevitable that the tipping point will be crossed. On the other side, where threshold uncertainty is small, the incentive to coordinate is strong and trust more robust, often leading the players to avoid crossing the tipping point. Our results show that uncertainty must be reduced to this `good' side of the dividing line to stimulate the behavioural shift needed to avoid `dangerous' climate change.
1987-09-15
[Fragmentary record: the recoverable content lists report topics including acoustical and optical diffraction catastrophes (theory and optical simulation of transverse cusps) and the optical levitation of bubbles in water by the radiation pressure of a laser beam ("an acoustically quiet levitator," J. Acoust. Soc. Am., submitted July 1987).]
NASA Astrophysics Data System (ADS)
Li, Sichen; Liao, Zhixian; Luo, Xiaoshu; Wei, Duqu; Jiang, Pinqun; Jiang, Qinghong
2018-02-01
The value of the output capacitance (C) should be carefully considered when designing a photovoltaic (PV) inverter, since it can distort the working state of the circuit and cause the circuit to produce nonlinear dynamic behavior. Based on Kirchhoff's laws and the characteristics of an ideal operational amplifier, and using a strict piecewise-linear state equation, a circuit simulation model is constructed to study the system parameters (time, C) for the current through the inductor (inductance L) and the voltage across the capacitor (capacitance C). The developed simulation model uses Runge-Kutta methods to solve the state equations. This study focuses on predicting circuit faults from two aspects: harmonic distortion and simulation results. Moreover, the presented model is also used to study the working state of the system in the case of a load-capacitance catastrophe. The nonlinear dynamic behaviors in the inverter are simulated and verified.
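As a hedged illustration of the solution approach described above, the sketch below integrates a generic piecewise-linear LC state equation with a classical fourth-order Runge-Kutta (RK4) scheme; the component values, square-wave drive, and state variables are illustrative placeholders, not the parameters of the inverter circuit studied in the paper.

```python
import numpy as np

def rk4_step(f, x, t, dt):
    """One classical Runge-Kutta (RK4) step for dx/dt = f(t, x)."""
    k1 = f(t, x)
    k2 = f(t + dt / 2, x + dt / 2 * k1)
    k3 = f(t + dt / 2, x + dt / 2 * k2)
    k4 = f(t + dt, x + dt * k3)
    return x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Illustrative piecewise-linear LC state: x = [i_L, v_C]; the source term
# switches sign with a square-wave control signal (a placeholder for the
# inverter's actual switching law).
L, C, R = 1e-3, 47e-6, 10.0          # assumed component values

def f(t, x):
    i_l, v_c = x
    u = 24.0 if (t * 1e3) % 1.0 < 0.5 else -24.0   # 1 kHz square-wave drive
    di = (u - v_c) / L                              # inductor current dynamics
    dv = (i_l - v_c / R) / C                        # capacitor voltage dynamics
    return np.array([di, dv])

x, dt = np.array([0.0, 0.0]), 1e-6
for n in range(20_000):               # simulate 20 ms
    x = rk4_step(f, x, n * dt, dt)
print(f"i_L = {x[0]:.3f} A, v_C = {x[1]:.3f} V")
```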
The economics of mitigation and remediation measures - preliminary results
NASA Astrophysics Data System (ADS)
Wiedemann, Carsten; Flegel, Sven Kevin; Vörsmann, Peter; Gelhaus, Johannes; Moeckel, Marek; Braun, Vitali; Kebschull, Christopher; Metz, Manuel
2012-07-01
Today there is a high spatial density of orbital debris objects at about 800 km altitude. Controlling the debris population in this region is important for the long-term evolution of the debris environment. The future debris population is investigated through simulations using the software tool LUCA (Long-Term Orbit Utilization Collision Analysis). It is likely that more catastrophic collisions will occur in the future. Debris objects generated during such events may in turn trigger further catastrophic collisions. Current simulations have revealed that the number of debris objects will increase in the future. In a long-term perspective, catastrophic collisions may become the dominating mechanism for generating orbital debris. In this study it is investigated when the situation will become unstable. To prevent this instability it is necessary to implement mitigation and perhaps even remediation measures. It is investigated how these measures affect the future debris environment, whether the growth in the number of debris objects can be interrupted, and how much this may cost. Different mitigation scenarios are considered. Furthermore, one remediation measure, the active removal of high-risk objects, is also simulated. Cost drivers for the different measures are identified, and it is investigated how selected measures are associated with costs. The goal is to find out which economic benefits may result from mitigation or remediation. First results of a cost-benefit analysis are presented.
Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks
Clune, Jeff
2017-01-01
A long-term goal of AI is to produce agents that can learn a diversity of skills throughout their lifetimes and continuously improve those skills via experience. A longstanding obstacle towards that goal is catastrophic forgetting, which is when learning new information erases previously learned information. Catastrophic forgetting occurs in artificial neural networks (ANNs), which have fueled most recent advances in AI. A recent paper proposed that catastrophic forgetting in ANNs can be reduced by promoting modularity, which can limit forgetting by isolating task information to specific clusters of nodes and connections (functional modules). While the prior work did show that modular ANNs suffered less from catastrophic forgetting, it was not able to produce ANNs that possessed task-specific functional modules, thereby leaving the main theory regarding modularity and forgetting untested. We introduce diffusion-based neuromodulation, which simulates the release of diffusing, neuromodulatory chemicals within an ANN that can modulate (i.e. up or down regulate) learning in a spatial region. On the simple diagnostic problem from the prior work, diffusion-based neuromodulation 1) induces task-specific learning in groups of nodes and connections (task-specific localized learning), which 2) produces functional modules for each subtask, and 3) yields higher performance by eliminating catastrophic forgetting. Overall, our results suggest that diffusion-based neuromodulation promotes task-specific localized learning and functional modularity, which can help solve the challenging, but important problem of catastrophic forgetting. PMID:29145413
NASA Astrophysics Data System (ADS)
Joetzjer, E.; Poulter, B.; Ciais, P.; Sala, A.; Sack, L.; Bartlett, M.
2015-12-01
In the past decade, two extreme droughts experienced by the Amazon rainforest led to a perturbation of carbon cycle dynamics and forest structure, partly through an increase in tree mortality. While there is a relatively strong consensus in CMIP5 projections for an increase in both frequency and intensity of droughts across the Amazon, the potential for forest die-off constitutes a large uncertainty in projections of climate impacts on terrestrial ecosystems and carbon cycle feedbacks. Two long-term throughfall exclusion experiments (TFE) provided novel observations of Amazonian ecosystem responses under drought. These experiments also provided a great opportunity to evaluate and improve model behavior under drought by comparing simulations and observations. While current DGVMs use a wide array of algorithms to represent mortality, most are associated with large uncertainty in representing drought-induced mortality and require updating to include current information on physiological processes. During very strong droughts, the leaves desiccate and stems may undergo catastrophic embolism. However, even before that point, stomata close to minimize excessive water loss and the risk of hydraulic failure, which reduces carbon assimilation. To maintain respiration and other functions, plants may eventually deplete stored non-structural carbon compounds (NSC), which may have negative impacts on the plant and eventually increase the probability of mortality. Here, we describe a new parameterization of the mortality process induced by drought using the ORCHIDEE-CAN dynamic vegetation model and test it using the two TFE results. We first updated and evaluated both the representation of hydraulic architecture and the NSC pool dynamics using in situ data. We implemented a direct climate effect on mortality through catastrophic stem embolism, based on hydraulic vulnerability curves. In addition, we explored the role of NSC in hydraulic failure and mortality by coupling NSC content and vulnerability curves in the model, following the idea that stored NSC serves a critical osmotic function. Our results suggest that models have the capacity to represent individual mortality from a mechanistic perspective, providing a framework for informing future experiments and data collection for model development.
Analytical Assessment of Simultaneous Parallel Approach Feasibility from Total System Error
NASA Technical Reports Server (NTRS)
Madden, Michael M.
2014-01-01
In a simultaneous paired approach to closely-spaced parallel runways, a pair of aircraft flies in close proximity on parallel approach paths. The aircraft pair must maintain a longitudinal separation within a range that avoids wake encounters and, if one of the aircraft blunders, avoids collision. Wake avoidance defines the rear gate of the longitudinal separation. The lead aircraft generates a wake vortex that, with the aid of crosswinds, can travel laterally onto the path of the trail aircraft. As runway separation decreases, the wake has less distance to traverse to reach the path of the trail aircraft. The total system error of each aircraft further reduces this distance. The total system error is often modeled as a probability distribution function. Therefore, Monte-Carlo simulations are a favored tool for assessing a "safe" rear-gate. However, safety for paired approaches typically requires that a catastrophic wake encounter be a rare one-in-a-billion event during normal operation. Using a Monte-Carlo simulation to assert this event rarity with confidence requires a massive number of runs. Such large runs do not lend themselves to rapid turn-around during the early stages of investigation when the goal is to eliminate the infeasible regions of the solution space and to perform trades among the independent variables in the operational concept. One can employ statistical analysis using simplified models more efficiently to narrow the solution space and identify promising trades for more in-depth investigation using Monte-Carlo simulations. These simple, analytical models not only have to address the uncertainty of the total system error but also the uncertainty in navigation sources used to alert an abort of the procedure. This paper presents a method for integrating total system error, procedure abort rates, avionics failures, and surveillance errors into a statistical analysis that identifies the likely feasible runway separations for simultaneous paired approaches.
Simulation modeling of population viability for the leopard darter (Percidae: Percina pantherina)
Williams, L.R.; Echelle, A.A.; Toepfer, C.S.; Williams, M.G.; Fisher, W.L.
1999-01-01
We used the computer program RAMAS to perform a population viability analysis for the leopard darter, Percina pantherina. This percid fish is a threatened species confined to five isolated rivers in the Ouachita Mountains of Oklahoma and Arkansas. A base model created from life history data indicated a 6% probability that the leopard darter would go extinct in 50 years. We performed sensitivity analyses to determine the effects of initial population size, variation in age structure, variation in severity and probability of catastrophe, and migration rate. Catastrophe (modeled as the probability and severity of drought) and migration had the greatest effects on persistence. Results of these simulations have implications for management of this species.
Pain uncertainty in patients with fibromyalgia, yoga practitioners, and healthy volunteers.
Bradshaw, David H; Donaldson, Gary W; Okifuji, Akiko
2012-01-01
Uncertainty about potentially painful events affects how pain is experienced. Individuals with fibromyalgia (FM) often exhibit anxiety and catastrophic thoughts regarding pain and difficulties dealing with pain uncertainty. The effects of pain uncertainty in predictably high odds (HO), predictably low odds (LO), and even odds (EO) conditions on subjective ratings of pain (PR) and skin conductance responses (SCR) following the administration of a painful stimulus were examined for individuals with fibromyalgia (IWFM), healthy volunteers (HVs), and yoga practitioners (YPs). We hypothesized IWFM would demonstrate the greatest physiological reactivity to pain uncertainty, followed by HVs and YPs, respectively. Nine IWFM, 7 YPs, and 10 HVs participated. Custom contrast estimates comparing responses for HO, LO, and EO pain conditions showed higher SCR for IWFM (CE = 1.27, p = 0.01) but not for HVs or for YPs. PR for the EO condition were significantly greater than for HO and LO conditions for IWFM (CE = 0.60, p = 0.012) but not for HVs or YPs. YPs had lower SCR and PR than did HVs. Results show that uncertainty regarding pain increases the experience of pain, whereas certainty regarding pain may reduce pain ratings for individuals with fibromyalgia.
40 CFR 86.004-30 - Certification.
Code of Federal Regulations, 2011 CFR
2011-07-01
... simulation of such, resulting in an increase of 1.5 times the NMHC+NOX standard or FEL above the NMHC+NOX... simulation of such, resulting in exhaust emissions exceeding 1.5 times the applicable standard or FEL for... catastrophically failed, or an electronic simulation of such. (2)(i) Otto-cycle. An engine misfire condition is...
40 CFR 86.004-30 - Certification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... simulation of such, resulting in an increase of 1.5 times the NMHC+NOX standard or FEL above the NMHC+NOX... simulation of such, resulting in exhaust emissions exceeding 1.5 times the applicable standard or FEL for... catastrophically failed, or an electronic simulation of such. (2)(i) Otto-cycle. An engine misfire condition is...
Well below 2 °C: Mitigation strategies for avoiding dangerous to catastrophic climate changes
NASA Astrophysics Data System (ADS)
Xu, Yangyang; Ramanathan, Veerabhadran
2017-09-01
The historic Paris Agreement calls for limiting global temperature rise to “well below 2 °C.” Because of uncertainties in emission scenarios, climate, and carbon cycle feedback, we interpret the Paris Agreement in terms of three climate risk categories and bring in considerations of low-probability (5%) high-impact (LPHI) warming in addition to the central (˜50% probability) value. The current risk category of dangerous warming is extended to more categories, which are defined by us here as follows: >1.5 °C as dangerous; >3 °C as catastrophic; and >5 °C as unknown, implying beyond catastrophic, including existential threats. With unchecked emissions, the central warming can reach the dangerous level within three decades, with the LPHI warming becoming catastrophic by 2050. We outline a three-lever strategy to limit the central warming below the dangerous level and the LPHI below the catastrophic level, both in the near term (<2050) and in the long term (2100): the carbon neutral (CN) lever to achieve zero net emissions of CO2, the super pollutant (SP) lever to mitigate short-lived climate pollutants, and the carbon extraction and sequestration (CES) lever to thin the atmospheric CO2 blanket. Pulling on both CN and SP levers and bending the emissions curve by 2020 can keep the central warming below dangerous levels. To limit the LPHI warming below dangerous levels, the CES lever must be pulled as well to extract as much as 1 trillion tons of CO2 before 2100 to both limit the preindustrial to 2100 cumulative net CO2 emissions to 2.2 trillion tons and bend the warming curve to a cooling trend.
Quantification of uncertainties for application in detonation simulation
NASA Astrophysics Data System (ADS)
Zheng, Miao; Ma, Zhibo
2016-06-01
Numerical simulation has become an important means of designing detonation systems, and quantifying its uncertainty is necessary for reliability certification. The most important steps in quantifying that uncertainty are analyzing how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and on verification and validation technology, a framework for quantification of uncertainty (QU) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.
Observational constraints indicate risk of drying in the Amazon basin.
Shiogama, Hideo; Emori, Seita; Hanasaki, Naota; Abe, Manabu; Masutomi, Yuji; Takahashi, Kiyoshi; Nozawa, Toru
2011-03-29
Climate warming due to human activities will be accompanied by hydrological cycle changes. Economies, societies and ecosystems in South America are vulnerable to such water resource changes. Hence, water resource impact assessments for South America, and corresponding adaptation and mitigation policies, have attracted increased attention. However, substantial uncertainties remain in the current water resource assessments that are based on multiple coupled Atmosphere Ocean General Circulation models. These assessments range from significant wetting to catastrophic drying. By applying a statistical method, we characterized the uncertainty and identified global-scale metrics for measuring the reliability of water resource assessments in South America. Here, we show that, although the ensemble mean assessment suggested wetting across most of South America, the observational constraints indicate a higher probability of drying in the Amazon basin. Thus, over-reliance on the consensus of models can lead to inappropriate decision making.
Anatomy of a bottleneck: diagnosing factors limiting population growth in the Puerto Rican parrot
Beissenger, S.R.; Wunderle, J.M.; Meyers, J.M.; Saether, B.-E.; Engen, S.
2008-01-01
The relative importance of genetic, demographic, environmental, and catastrophic processes that maintain population bottlenecks has received little consideration. We evaluate the role of these factors in maintaining the Puerto Rican Parrot (Amazona vittata) in a prolonged bottleneck from 1973 through 2000 despite intensive conservation efforts. We first conduct a risk analysis, then examine evidence for the importance of specific processes maintaining the bottleneck using the multiple competing hypotheses approach, and finally integrate these results through a sensitivity analysis of a demographic model using life-stage simulation analysis (LSA) to determine the relative importance of genetic, demographic, environmental, and catastrophic processes on population growth. Annual population growth has been slow and variable (1.0 ± 5.2 parrots per year, or an average λ = 1.05 ± 0.19) from 16 parrots (1973) to a high of 40-42 birds (1997-1998). A risk analysis based on population prediction intervals (PPI) indicates great risk and large uncertainty, with a range of 22-83 birds in the 90% PPI only five years into the future. Four primary factors (reduced hatching success due to inbreeding, failure of adults to nest, nest failure due to nongenetic causes, and reduced survival of adults and juveniles) were responsible for maintaining the bottleneck. Egg-hatchability rates were low (70.6% per egg and 76.8% per pair), and hatchability increased after mate changes, suggesting inbreeding effects. Only an average of 34% of the population nested annually, which was well below the percentage of adults that should have reached an age of first breeding (41-56%). This chronic failure to nest appears to have been caused primarily by environmental and/or behavioral factors, and not by nest-site scarcity or a skewed sex ratio. Nest failure rates from nongenetic causes (i.e., predation, parasitism, and wet cavities) were low (29%) due to active management (protecting nests and fostering captive young into wild nests), diminishing the importance of nest failure as a limiting factor. Annual survival has been periodically reduced by catastrophes (hurricanes), which have greatly constrained population growth, but survival rates were high under non-catastrophic conditions. Although the importance of factors maintaining the Puerto Rican Parrot bottleneck varied throughout the 30-year period of study, we determined their long-term influence using LSA simulations to correlate variation in demographic rates with variation in population growth (λ). The bottleneck appears to have been maintained primarily by periodic catastrophes (hurricanes) that reduced adult survival, and secondarily by environmental and/or behavioral factors that resulted in a failure of many adults to nest. The influence of inbreeding through reduced hatching success played a much less significant role, even when additional effects of inbreeding on the production and mortality of young were incorporated into the LSA. Management actions needed to speed recovery include (1) continued nest guarding to minimize the effects of nest failure due to nongenetic causes; (2) creating a second population at another location on the island - a process that was recently initiated - to reduce the chance that hurricane strikes will cause extinction; and (3) determining the causes of the low percentage of breeders in the population and ameliorating them, which would have a large impact on population growth.
Destruction and Re-Accretion of Mid-Size Moons During an Outer Solar System Late Heavy Bombardment
NASA Astrophysics Data System (ADS)
Movshovitz, N.; Nimmo, F.; Korycansky, D. G.; Asphaug, E. I.; Owen, M.
2014-12-01
To explain the lunar Late Heavy Bombardment, the Nice Model (Tsiganis, K., Gomes, R., Morbidelli, A., & Levison, H. 2005, Nature, 435, 459) invokes a period of dynamical instability, occurring long after planet formation, that destabilizes both the main asteroid belt and a remnant exterior planetesimal disk. As a side effect of explaining the lunar LHB, this model also predicts an LHB-like period in the outer Solar System. With higher collision probabilities and impact energies due to gravitational focusing by the giant planets, the inner satellites of Jupiter, Saturn, and Uranus would have experienced a bombardment much more severe than the one supposedly responsible for the lunar basins. The concern is that such bombardment should have resulted in significant, even catastrophic modification of the mid-size satellites. Here we look at the problem of satellite survival during a hypothetical outer Solar System LHB. Using a Monte-Carlo approach we calculate, for 10 satellites of Saturn and Uranus, the probability of having experienced at least one catastrophic collision during an LHB. We use a scaling law for the energy required to disrupt a target in a gravity-dominated collision derived from new SPH simulations. These simulations extend the scaling law previously obtained by Benz & Asphaug (1999, Icarus, 142, 5) to larger targets. We then simulate randomized LHB impacts by drawing from appropriate size and velocity distributions, with the total delivered mass as a controlled parameter. We find that Mimas, Enceladus, Tethys, Hyperion, and Miranda experience at least one catastrophic impact in every simulation. In most simulations, Mimas, Enceladus, and Tethys experience multiple catastrophic impacts, including impacts with energies several times that required to completely disrupt the target. The implication is that these close-in, mid-size satellites could not have survived a Late Heavy Bombardment unmodified, unless the mass delivered to the outer Solar System was at least 30 times less than the value predicted by the Nice Model, or 10 times less than the reduced value more recently suggested by Dones & Levison (2013, in 44th Lunar Planet. Sci. Conf.).
NASA Astrophysics Data System (ADS)
Gunardi, Setiawan, Ezra Putranda
2015-12-01
Indonesia is a country with a high risk of earthquakes because of its position on the border of tectonic plates. An earthquake can cause a very large amount of damage, loss, and other economic impacts. Indonesia therefore needs a mechanism for transferring earthquake risk away from the government or the (re)insurance company, so that enough money can be collected to implement rehabilitation and reconstruction programs. One such mechanism is issuing a catastrophe bond, 'act-of-God bond', or simply CAT bond. A catastrophe bond is issued by a special-purpose-vehicle (SPV) company and then sold to investors. The revenue from this transaction is pooled with the money (premium) from the sponsor company and then invested in other products. If a catastrophe happens before the time of maturity, the cash flow from the SPV to the investor is discounted or stopped, and the cash flow is instead paid to the sponsor company to compensate its loss from the catastrophe event. When we consider earthquakes only, the amount of the discounted cash flow can be determined from the earthquake's magnitude. A case study with Indonesian earthquake magnitude data shows that the maximum magnitude can be modeled by a generalized extreme value (GEV) distribution. In pricing this catastrophe bond, we assume a stochastic interest rate following the Cox-Ingersoll-Ross (CIR) interest rate model. We develop formulas for pricing three types of catastrophe bond, namely zero-coupon bonds, 'coupon only at risk' bonds, and 'principal and coupon at risk' bonds. The relationship between the price of the catastrophe bond and the CIR model parameters, the GEV parameters, the coupon percentage, and the discounted cash flow rule is then explored via Monte Carlo simulation.
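The abstract above does not reproduce its pricing formulas; the sketch below is a minimal Monte Carlo pricing of the simplest contract it describes (a zero-coupon CAT bond), assuming a full-truncation Euler discretization of the CIR short rate and annual maximum magnitudes drawn from a GEV distribution. All numerical parameter values, the trigger magnitude, and the recovery fraction are hypothetical placeholders rather than values from the paper.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)

def price_zero_coupon_cat_bond(face=100.0, recovery=0.5, trigger_mag=7.5,
                               T=3, n_paths=100_000,
                               r0=0.05, kappa=0.3, theta=0.05, sigma=0.04,
                               gev_shape=-0.1, gev_loc=6.0, gev_scale=0.6,
                               steps_per_year=12):
    """Monte Carlo price of a zero-coupon CAT bond that pays `face` at maturity
    T (years) if no annual maximum magnitude exceeds `trigger_mag`, and
    `recovery * face` otherwise. Discounting uses a simulated CIR short rate."""
    dt = 1.0 / steps_per_year
    r = np.full(n_paths, r0)
    integral_r = np.zeros(n_paths)
    for _ in range(T * steps_per_year):
        z = rng.standard_normal(n_paths)
        # full-truncation Euler scheme for the CIR process
        r = r + kappa * (theta - r) * dt + sigma * np.sqrt(np.maximum(r, 0.0) * dt) * z
        integral_r += np.maximum(r, 0.0) * dt
    discount = np.exp(-integral_r)

    # annual maximum magnitudes, GEV-distributed (scipy's shape sign convention)
    max_mag = genextreme.rvs(gev_shape, loc=gev_loc, scale=gev_scale,
                             size=(n_paths, T), random_state=rng)
    triggered = (max_mag > trigger_mag).any(axis=1)
    payoff = np.where(triggered, recovery * face, face)
    return float(np.mean(discount * payoff))

print(price_zero_coupon_cat_bond())
```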
Strategic reasoning and bargaining in catastrophic climate change games
NASA Astrophysics Data System (ADS)
Verendel, Vilhelm; Johansson, Daniel J. A.; Lindgren, Kristian
2016-03-01
Two decades of international negotiations show that agreeing on emission levels for climate change mitigation is a hard challenge. However, if early warning signals were to show an upcoming tipping point with catastrophic damage, theory and experiments suggest this could simplify collective action to reduce greenhouse gas emissions. At the actual threshold, no country would have a free-ride incentive to increase emissions over the tipping point, but it remains for countries to negotiate their emission levels to reach these agreements. We model agents bargaining for emission levels using strategic reasoning to predict emission bids by others and ask how this affects the possibility of reaching agreements that avoid catastrophic damage. It is known that policy elites often use a higher degree of strategic reasoning, and in our model this increases the risk for climate catastrophe. Moreover, some forms of higher strategic reasoning make agreements to reduce greenhouse gases unstable. We use empirically informed levels of strategic reasoning when simulating the model.
NASA Astrophysics Data System (ADS)
Unger, André J. A.
2010-02-01
This work is the second installment in a two-part series, and focuses on object-oriented programming methods to implement an augmented-state variable approach to aggregate the PCS index and introduce the Bermudan-style call feature into the proposed CAT bond model. The PCS index is aggregated quarterly using a discrete Asian running-sum formulation. The resulting aggregate PCS index augmented-state variable is used to specify the payoff (principal) on the CAT bond based on reinsurance layers. The purpose of the Bermudan-style call option is to allow the reinsurer to minimize their interest rate risk exposure on making fixed coupon payments under prevailing interest rates. A sensitivity analysis is performed to determine the impact of uncertainty in the frequency and magnitude of hurricanes on the price of the CAT bond. Results indicate that while the CAT bond is highly sensitive to the natural variability in the frequency of landfalling hurricanes between El Niño and non-El Niño years, it remains relatively insensitive to uncertainty in the magnitude of damages. In addition, results indicate that the maximum price of the CAT bond is insensitive to whether it is engineered to cover low frequency high magnitude events in a 'high' reinsurance layer relative to high frequency low magnitude events in a 'low' reinsurance layer. Also, while it is possible for the reinsurer to minimize their interest rate risk exposure on the fixed coupon payments, the impact of this risk on the price of the CAT bond appears small relative to the natural variability in the CAT bond price, and consequently catastrophic risk, due to uncertainty in the frequency and magnitude of landfalling hurricanes.
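The abstract does not spell out the contract mechanics in detail; the short sketch below illustrates the two ingredients it names, namely a discrete Asian (running-sum) aggregation of a quarterly PCS loss index and a principal reduction defined by a reinsurance layer, using hypothetical index values and layer attachment and exhaustion points.

```python
import numpy as np

def aggregate_index(quarterly_pcs):
    """Discrete Asian-style running sum of the quarterly PCS loss index."""
    return np.cumsum(np.asarray(quarterly_pcs, dtype=float))

def principal_fraction_lost(aggregate, attachment, exhaustion):
    """Fraction of principal lost once the aggregate index enters the
    reinsurance layer [attachment, exhaustion] (linear within the layer)."""
    loss_in_layer = np.clip(aggregate - attachment, 0.0, exhaustion - attachment)
    return loss_in_layer / (exhaustion - attachment)

# hypothetical quarterly index values (index points) over the bond's life
quarterly_pcs = [40.0, 5.0, 120.0, 10.0, 80.0, 0.0, 30.0, 60.0]
agg = aggregate_index(quarterly_pcs)
lost = principal_fraction_lost(agg, attachment=150.0, exhaustion=300.0)
print(agg[-1], f"{lost[-1]:.1%} of principal lost")
```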
SPICE Work Package 3: Modelling the Effects of Stratospheric Aerosol Geoengineering
NASA Astrophysics Data System (ADS)
Driscoll, Simon
2015-04-01
This talk presents the results of the SPICE Work Package 3. There is an obvious need for methods to verify the accuracy of geoengineering given no observations of a geoengineering programme. Accordingly, model ability in reproducing the observed dynamical response to volcanic eruptions is discussed using analysis of CMIP5 data and different configurations of the HadGEM2 model. With the HadGEM2-L60 model shown to be substantially better in reproducing the observed dynamical response to volcanic eruptions, simulations of GeoMIP's G4 scenario are performed. Simulated impacts of geoengineering are described, and asymmetries between the immediate onset and immediate cessation ('termination') of geoengineering are analysed. Whilst a rapid large increase in stratospheric sulphate aerosols (such as from volcanic eruptions) can cause substantial damage, most volcanic eruptions in general are not catastrophic. One may therefore suspect that an 'equal but opposite' change in radiative forcing from termination may not be catastrophic, if the climatic response is simulated to be symmetric. HadGEM2 simulations reveal a substantially more rapid change in variables such as near-surface temperature and precipitation following termination than the onset, indicating that termination may be substantially more damaging and even catastrophic. Some suggestions for hemispherically asymmetric geoengineering have been proposed as a way to reduce Northern Hemisphere sea ice, for example, with lesser impacts on the rest of the climate. However, HadGEM2 simulations are performed and observations analysed following volcanic eruptions. Both indicate substantial adverse consequences of hemispherically asymmetric stratospheric aerosol loading on precipitation in the Sahelian region - a vulnerable region where drought has caused hundreds of thousands of deaths and created millions of refugees in the past.
NASA Technical Reports Server (NTRS)
Breininger, David; Duncan, Brean; Eaton, Mitchell; Johnson, Fred; Nichols, James
2014-01-01
Land cover modeling is used to inform land management, but most often via a two-step process where science informs how management alternatives can influence resources and then decision makers can use this to make decisions. A more efficient process is to directly integrate science and decision making, where science allows us to learn to better accomplish management objectives and is developed to address specific decisions. Co-development of management and science is especially productive when decisions are complicated by multiple objectives and impeded by uncertainty. Multiple objectives can be met by specification of tradeoffs, and relevant uncertainty can be addressed through targeted science (i.e., models and monitoring). We describe how to integrate habitat and fuels monitoring with decision making focused on dual objectives of managing for endangered species and minimizing catastrophic fire risk. Under certain conditions, both objectives might be achieved by a similar management policy, but habitat trajectories suggest tradeoffs. Knowledge about system responses to actions can be informed by applying competing management actions to different land units in the same system state and by ideas about fire behavior. Monitoring and management integration is important to optimize state-specific management decisions and increase knowledge about system responses. We believe this approach has broad utility for land cover modeling programs intended to inform decision making.
Application of Catastrophe Risk Modelling to Evacuation Public Policy
NASA Astrophysics Data System (ADS)
Woo, G.
2009-04-01
The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally viewed as a hazard forecasting issue, civil authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane were appearing to head for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast, and chance of a false alarm. A systematic coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a catastrophe risk model, is required to explore the casualty implications of different possible hazard scenarios, to assess the proportion of an evacuated population who would owe their lives to an evacuation, and to estimate the economic loss associated with an unnecessary evacuation. This paper will review the developing methodology for applying catastrophe risk modelling to support public policy in evacuation decision-making, and provide illustrations from across the range of natural hazards. Evacuation during volcanic crises is a prime example, recognizing the improving forecasting skill of volcanologists, now able to account probabilistically for precursory seismological, geodetic, and geochemical monitoring data. This methodology will be shown to help civic authorities make sounder risk-informed decisions on the timing and population segmentation of evacuation from both volcanoes and calderas, such as Vesuvius and Campi Flegrei, which are in densely populated urban regions.
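As a minimal, explicit version of the cost-benefit logic described above (not the author's model), the sketch below sums the expected casualty losses avoided by evacuating over a stochastic event set and compares them with the cost of an evacuation that may prove unnecessary; the event probabilities, lives-saved figures, value-of-life figure, and evacuation cost are all placeholders.

```python
def expected_net_benefit(event_set, evacuation_cost, value_per_life=7_000_000.0):
    """Expected net benefit of calling an evacuation over a stochastic event set.

    Each event is (probability, lives_saved_if_evacuated). Probabilities need
    not sum to 1; the remainder is the 'no damaging event' (false alarm) case.
    The evacuation cost is incurred whether or not the hazard materialises.
    """
    expected_lives_value = sum(p * lives * value_per_life
                               for p, lives in event_set)
    return expected_lives_value - evacuation_cost

# hypothetical event set for an imminent volcanic crisis
event_set = [
    (0.02, 2000.0),   # large explosive eruption reaching the evacuated zone
    (0.10, 150.0),    # moderate eruption
    (0.20, 5.0),      # minor event
]
net = expected_net_benefit(event_set, evacuation_cost=250_000_000.0)
print("call evacuation" if net > 0 else "do not evacuate yet", f"(net = {net:,.0f})")
```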
NASA Astrophysics Data System (ADS)
Harré, Michael S.
2013-02-01
Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.
Computational simulation of progressive fracture in fiber composites
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1986-01-01
Computational methods for simulating and predicting progressive fracture in fiber composite structures are presented. These methods are integrated into a computer code of modular form. The modules include composite mechanics, finite element analysis, and fracture criteria. The code is used to computationally simulate progressive fracture in composite laminates with and without defects. The simulation tracks the fracture progression in terms of modes initiating fracture, damage growth, and imminent global (catastrophic) laminate fracture.
NASA Technical Reports Server (NTRS)
Santiago, D. L.; Colaprete, A.; Haberle, R. M.; Sloan, L. C.; Asphaug, E. I.
2005-01-01
The existence of surface water on Mars in past geologic epochs is inferred on the basis of geomorphologic interpretation of spaceflight images, and is supported by the recent Mars Odyssey identification of ice-rich soils [1]. The Mars Exploration Rovers have provided further chemical evidence for past surface hydrologic activity [2]. One issue is whether this water-rich climate ever existed in a steady state, or whether it was triggered by catastrophic events such as large impacts [3], and/ or catastrophic outburst floods, the topic of consideration here.
Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease
NASA Astrophysics Data System (ADS)
Marsden, Alison
2009-11-01
Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
NASA Astrophysics Data System (ADS)
Thornton, James; Desarthe, Jérémy; Naulin, Jean-Philippe; Garnier, Emmanuel; Liu, Ye; Moncoulon, David
2015-04-01
On the islands of the French Antilles, the period for which systematic meteorological measurements and historic event loss data are available is short relative to the recurrence intervals of very intense, damaging hurricanes. Additionally, the value of property at risk changes through time. As such, the recent past can only provide limited insight into potential losses from extreme storms in coming years. Here we present some research that seeks to overcome, as far as is possible, the limitations of record length in assessing the possible impacts of near-future hurricanes on insured properties. First, using the archives of the French overseas departments (which included administrative and weather reports, inventories of damage to houses, crops and trees, as well as some meteorological observations after 1950) we reconstructed the spatial patterns of hazard intensity associated with three historical events. They are: i) the 1928 Hurricane (Guadeloupe), ii) Hurricane Betsy (1956, Guadeloupe) and iii) Hurricane David (1979, Martinique). These events were selected because all were damaging, and the information available on each is rich. Then, using a recently developed catastrophe model for hurricanes affecting Guadeloupe, Martinique, Saint-Barthélemy and Saint-Martin, we simulated the hypothetical losses to insured properties that the reconstructed events might cause if they were to reoccur today. The model simulated damage due to wind, rainfall-induced flooding and storm surge flooding. These 'what if' scenarios provided an initial indication of the potential present-day exposure of the insurance industry to intense hurricanes. However, we acknowledge that historical events are unlikely to repeat exactly. We therefore extended the study by producing a stochastic event catalogue containing a large number of synthetic but plausible hurricane events. Instrumental data were used as a basis for event generation, but importantly the statistical methods we applied permit the extrapolation of simulated events beyond the observed intensity ranges. The event catalogue enabled the model to be run in a probabilistic mode; the losses for each synthetic event in a 10,000-year period were simulated. In this way, the aleatory uncertainty associated with future hazard outcomes was addressed. In conclusion, we consider how the reconstructed event hazard intensities and losses compare with the distribution of 32,320 events in the stochastic event set. Further comparisons are made with a longer chronology of tropical cyclones in the Antilles (going back to the 17th Century) prepared solely from documentary sources. Overall, the novelty of this work lies in the integration of data sources that are frequently overlooked in catastrophe model development and evaluation.
Knotts, Thomas A.
2017-01-01
Molecular simulation has the ability to predict various physical properties that are difficult to obtain experimentally. For example, we implement molecular simulation to predict the critical constants (i.e., critical temperature, critical density, critical pressure, and critical compressibility factor) for large n-alkanes that thermally decompose experimentally (as large as C48). Historically, molecular simulation has been viewed as a tool that is limited to providing qualitative insight. One key reason for this perceived weakness in molecular simulation is the difficulty to quantify the uncertainty in the results. This is because molecular simulations have many sources of uncertainty that propagate and are difficult to quantify. We investigate one of the most important sources of uncertainty, namely, the intermolecular force field parameters. Specifically, we quantify the uncertainty in the Lennard-Jones (LJ) 12-6 parameters for the CH4, CH3, and CH2 united-atom interaction sites. We then demonstrate how the uncertainties in the parameters lead to uncertainties in the saturated liquid density and critical constant values obtained from Gibbs Ensemble Monte Carlo simulation. Our results suggest that the uncertainties attributed to the LJ 12-6 parameters are small enough that quantitatively useful estimates of the saturated liquid density and the critical constants can be obtained from molecular simulation. PMID:28527455
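One simple way to see how parameter uncertainty propagates to a simulated property, in the spirit of the study above, is to sample Lennard-Jones (epsilon, sigma) pairs from an assumed Gaussian parameter distribution and push each sample through the property calculation. In the sketch below the expensive Gibbs Ensemble Monte Carlo step is replaced by a one-line stand-in (the reduced critical temperature of the pure LJ fluid, roughly 1.31 epsilon/kB), and the parameter means and uncertainties are illustrative rather than the paper's values.

```python
import numpy as np

rng = np.random.default_rng(1)

# assumed (illustrative) LJ parameters for a CH2 united-atom site
eps_mean, eps_std = 46.0, 1.0      # epsilon/k_B in K
sig_mean, sig_std = 3.95, 0.02     # sigma in Angstrom

def critical_temperature(eps_k, sigma):
    """Placeholder for the expensive GEMC estimate of Tc. For the pure LJ fluid
    Tc is roughly 1.31 * epsilon/k_B; a real united-atom model would be
    evaluated by simulation instead of this one-liner."""
    return 1.31 * eps_k

samples = rng.normal([eps_mean, sig_mean], [eps_std, sig_std], size=(2000, 2))
tc = np.array([critical_temperature(e, s) for e, s in samples])
print(f"Tc = {tc.mean():.1f} K +/- {tc.std(ddof=1):.1f} K (parameter uncertainty only)")
```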
1.5 °C ? - Solutions for avoiding catastrophic climate change in this century
NASA Astrophysics Data System (ADS)
Xu, Y.
2017-12-01
The historic Paris Agreement calls for limiting global temperature rise to "well below 2 °C." Because of uncertainties in emission scenarios, climate, and carbon cycle feedback, we interpret the Paris Agreement in terms of three climate risk categories and bring in considerations of low-probability (5%) high impact (LPHI) warming in addition to the central (˜50% probability) value. The current risk category of dangerous warming is extended to more categories, which are defined by us here as follows: >1.5 °C as dangerous; >3 °C as catastrophic; and >5 °C as unknown, implying beyond catastrophic, including existential threats. With unchecked emissions, the central warming can reach the dangerous level within three decades, with the LPHI warming becoming catastrophic by 2050. We outline a three-lever strategy to limit the central warming below the dangerous level and the LPHI below the catastrophic level, both in the near term (<2050) and in the long term (2100): the carbon neutral (CN) lever to achieve zero net emissions of CO2, the super pollutant (SP) lever to mitigate short-lived climate pollutants, and the carbon extraction and sequestration (CES) lever to thin the atmospheric CO2 blanket. Pulling on both CN and SP levers and bending the emissions curve by 2020 can keep the central warming below dangerous levels. To limit the LPHI warming below dangerous levels, the CES lever must be pulled as well to extract as much as 1 trillion tons of CO2 before 2100 to both limit the preindustrial to 2100 cumulative net CO2 emissions to 2.2 trillion tons and bend the warming curve to a cooling trend. In addition to presenting the analysis above, I will also share (1) perspectives on developed and developing world actions and interactions on climate solutions; and (2) Prof. V. Ramanathan's interactions with the Pontifical Academy of Sciences and other religious groups, which are highly valuable to the interdisciplinary audience.
Design Safety Used in NASA's Human-rated Primary Lithium Batteries
NASA Technical Reports Server (NTRS)
Jeevarajan, J.
2013-01-01
Single-cell tests were benign for external short, inadvertent charge, and overdischarge into reversal up to 4.5 A. At lower current loads, cells die benignly (possibly due to excessive dendrite formation). String-level external short circuits lead to an unbalanced overdischarge, with one cell going into reversal; the result is catastrophic, violent venting. Unbalanced string overdischarges at different currents also cause catastrophic, violent venting. Heat-to-vent is very dramatic, displaying violent venting, and a simulated internal short is also catastrophic and displays violent venting. The battery is not UL-rated and hence does not have dual-fault tolerance or tolerance to inherent cell failures. The battery design for NASA JSC's human-rated application for use on the ISS was changed to include two bypass diodes per cell to provide two-failure tolerance to overdischarge-into-reversal (and external short) hazards.
Knaul, Felicia; Arreola-Ornelas, Héctor; Méndez, Oscar; Martínez, Alejandra
2005-01-01
To assess the impact on fair health financing and household catastrophic health expenditures of the implementation of the Popular Health Insurance (Seguro Popular de Salud). Data analyzed in this study come from the National Income and Expenditure Household Survey (Encuesta Nacional de Ingresos y Gastos de los Hogares, ENIGH), 2000, and the National Health Insurance and Expenditure Survey (Encuesta Nacional de Aseguramiento y Gasto en Salud, ENAGS), 2001. Estimates are based on projections of the extension of the Popular Health Insurance under different conditions of coverage and out-of-pocket expenditure reductions in the uninsured population. The mathematical simulation model applies the new Popular Health Insurance financial structure to the 2000 expenditure values reported by ENIGH, given the probability of affiliation by households. The model of determinants of affiliation to the Popular Health Insurance yielded three significant variables: being in income quintiles I and II, being a female head of household, and that a household member had a medical visit in the past year. Simulation results show that important impacts on the performance of the Mexican Health System will occur in terms of fair financing and catastrophic expenditures, even before achieving the universal coverage goal in 2010. A reduction of 40% in out-of-pocket expenditures and a Popular Health Insurance coverage of 100% will decrease catastrophic health expenditures from 3.4% to 1.6%. Our results show that the reduction of out-of-pocket expenditures generated by the new financing and health provision Popular Health Insurance model will improve the financial fairness index and the financial contribution to the health system, and will decrease the percentage of households with catastrophic expenditures, even before reaching universal coverage. A greater impact may be expected due to coverage extension starting in the poorest communities, which have a very restricted and progressive financial contribution.
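The counterfactual logic of such a simulation can be illustrated with the sketch below: scale down out-of-pocket (OOP) health spending for insured households and recount those whose spending exceeds a catastrophic threshold. The synthetic household data, the 30% capacity-to-pay threshold, and the coverage rule are assumptions for illustration only and do not reproduce the survey-based model described above.

```python
import numpy as np

rng = np.random.default_rng(2)

def catastrophic_share(oop, capacity_to_pay, threshold=0.30):
    """Share of households whose out-of-pocket health spending exceeds
    `threshold` of their capacity to pay (a common catastrophic-expenditure
    definition; the exact threshold used by the authors is not stated here)."""
    return float(np.mean(oop > threshold * capacity_to_pay))

n = 50_000
capacity = rng.lognormal(mean=9.0, sigma=0.8, size=n)        # synthetic capacity to pay
oop = capacity * rng.beta(a=1.2, b=10.0, size=n) * 1.5       # synthetic OOP spending

covered = rng.random(n) < 1.0                    # 100% insurance coverage scenario
oop_after = np.where(covered, 0.60 * oop, oop)   # 40% reduction in OOP for the covered

print("before:", catastrophic_share(oop, capacity))
print("after :", catastrophic_share(oop_after, capacity))
```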
Physical Uncertainty Bounds (PUB)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, Diane Elizabeth; Preston, Dean L.
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Pan-European stochastic flood event set
NASA Astrophysics Data System (ADS)
Kadlec, Martin; Pinto, Joaquim G.; He, Yi; Punčochář, Petr; Kelemen, Fanni D.; Manful, Desmond; Palán, Ladislav
2017-04-01
Impact Forecasting (IF), the model development center of Aon Benfield, has been developing a large suite of catastrophe flood models on a probabilistic basis for individual countries in Europe. Such natural catastrophes do not follow national boundaries: for example, the major flood in 2016 was responsible for Europe's largest insured loss of USD 3.4bn and affected Germany, France, Belgium, Austria and parts of several other countries. Reflecting such needs, IF initiated a pan-European flood event set development which combines cross-country exposures with country-based loss distributions to provide more insightful data to re/insurers. Because the observed discharge data are not available across the whole of Europe in sufficient quantity and quality to permit detailed loss evaluation, a top-down approach was chosen. This approach is based on simulating precipitation from a GCM/RCM model chain followed by a calculation of discharges using rainfall-runoff modelling. IF set up this project in close collaboration with Karlsruhe Institute of Technology (KIT) regarding the precipitation estimates and with the University of East Anglia (UEA) in terms of the rainfall-runoff modelling. KIT's main objective is to provide high-resolution daily historical and stochastic time series of key meteorological variables. A purely dynamical downscaling approach with the regional climate model COSMO-CLM (CCLM) is used to generate the historical time series, using re-analysis data as boundary conditions. The resulting time series are validated against the gridded observational dataset E-OBS, and different bias-correction methods are employed. The generation of the stochastic time series requires transfer functions between large-scale atmospheric variables and regional temperature and precipitation fields. These transfer functions are developed for the historical time series using reanalysis data as predictors and bias-corrected CCLM simulated precipitation and temperature as predictands. Finally, the transfer functions are applied to a large ensemble of GCM simulations with forcing corresponding to present day climate conditions to generate highly resolved stochastic time series of precipitation and temperature for several thousand years. These time series form the input for the rainfall-runoff model developed by the UEA team. It is a spatially distributed model adapted from the HBV model and will be calibrated for individual basins using historical discharge data. The calibrated model will be driven by the precipitation time series generated by the KIT team to simulate discharges at a daily time step. The uncertainties in the simulated discharges will be analysed using multiple model parameter sets. A number of statistical methods will be used to assess return periods, changes in the magnitudes, changes in the characteristics of floods such as time base and time to peak, and spatial correlations of large flood events. The pan-European stochastic flood event set will permit a better view of flood risk for market applications.
On the Monte Carlo simulation of electron transport in the sub-1 keV energy range.
Thomson, Rowan M; Kawrakow, Iwan
2011-08-01
The validity of "classic" Monte Carlo (MC) simulations of electron and positron transport at sub-1 keV energies is investigated in the context of quantum theory. Quantum theory dictates that uncertainties on the position and energy-momentum four-vectors of radiation quanta obey Heisenberg's uncertainty relation; however, these uncertainties are neglected in "classical" MC simulations of radiation transport in which position and momentum are known precisely. Using the quantum uncertainty relation and electron mean free path, the magnitudes of uncertainties on electron position and momentum are calculated for different kinetic energies; a validity bound on the classical simulation of electron transport is derived. In order to satisfy the Heisenberg uncertainty principle, uncertainties of 5% must be assigned to position and momentum for 1 keV electrons in water; at 100 eV, these uncertainties are 17 to 20% and are even larger at lower energies. In gaseous media such as air, these uncertainties are much smaller (less than 1% for electrons with energy 20 eV or greater). The classical Monte Carlo transport treatment is questionable for sub-1 keV electrons in condensed water as uncertainties on position and momentum must be large (relative to electron momentum and mean free path) to satisfy the quantum uncertainty principle. Simulations which do not account for these uncertainties are not faithful representations of the physical processes, calling into question the results of MC track structure codes simulating sub-1 keV electron transport. Further, the large difference in the scale at which quantum effects are important in gaseous and condensed media suggests that track structure measurements in gases are not necessarily representative of track structure in condensed materials on a micrometer or a nanometer scale.
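A rough back-of-the-envelope version of the bound discussed above, under one reading of the argument, takes the position uncertainty to be a fraction f of the mean free path and the momentum uncertainty to be the same fraction f of the electron momentum, so that Heisenberg's relation gives f >= sqrt(hbar / (2 * lambda * p)). The mean free path values in the sketch below are illustrative order-of-magnitude numbers for liquid water, not the paper's inputs.

```python
import math

HBAR = 1.054_571_8e-34      # J s
M_E  = 9.109_383_7e-31      # kg
EV   = 1.602_176_6e-19      # J

def min_fractional_uncertainty(kinetic_energy_ev, mean_free_path_m):
    """Smallest fraction f such that (f * lambda) * (f * p) >= hbar / 2,
    i.e. f = sqrt(hbar / (2 * lambda * p)), with non-relativistic momentum."""
    p = math.sqrt(2.0 * M_E * kinetic_energy_ev * EV)
    return math.sqrt(HBAR / (2.0 * mean_free_path_m * p))

# illustrative electron mean free paths in liquid water (order of magnitude only)
for energy_ev, mfp_nm in [(1000.0, 1.0), (100.0, 0.5)]:
    f = min_fractional_uncertainty(energy_ev, mfp_nm * 1e-9)
    print(f"{energy_ev:6.0f} eV, mfp {mfp_nm} nm -> f >= {f:.0%}")
```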
NASA Astrophysics Data System (ADS)
Garagash, I. A.; Lobkovsky, L. I.; Mazova, R. Kh.
2012-04-01
We study the generation of the strongest earthquakes, with magnitudes near or above 9, and the catastrophic tsunamis they induce, on the basis of a new approach to the generation process occurring in subduction zones during an earthquake. The need for such studies is underscored by the recent catastrophic underwater earthquake of 11 March 2011 close to the north-east coastline of Japan and the catastrophic tsunami that followed, which led to vast numbers of victims and colossal damage in Japan. A key motivation is that the strength of this earthquake (magnitude M = 9), which induced a very strong tsunami with wave run-up heights on the beach of up to 10 meters, was unexpected by all specialists. Our model of the interaction of the oceanic lithosphere with island-arc blocks in subduction zones, which takes into account the incomplete discharge of stress during the seismic process and the further accumulation of elastic energy, can explain the occurrence of the strongest mega-earthquakes, such as the catastrophic earthquake with its source in the Japan deep-sea trench in March 2011. In our model, broad possibilities for numerical simulation of the dynamical behaviour of an underwater seismic source are provided by a kinematic model of the seismic source, as well as by a numerical program developed by the authors for calculating tsunami wave generation by dynamical and kinematic seismic sources. The method permits taking into account the contribution of residual tectonic stress in lithospheric plates, which increases the earthquake energy and is usually not accounted for to date.
USDA-ARS?s Scientific Manuscript database
Experimental and simulation uncertainties have not been included in many of the statistics used in assessing agricultural model performance. The objectives of this study were to develop an F-test that can be used to evaluate model performance considering experimental and simulation uncertainties, an...
Breininger, David; Duncan, Brean; Eaton, Mitchell J.; Johnson, Fred; Nichols, James
2014-01-01
Land cover modeling is used to inform land management, but most often via a two-step process, where science informs how management alternatives can influence resources, and then, decision makers can use this information to make decisions. A more efficient process is to directly integrate science and decision-making, where science allows us to learn in order to better accomplish management objectives and is developed to address specific decisions. Co-development of management and science is especially productive when decisions are complicated by multiple objectives and impeded by uncertainty. Multiple objectives can be met by the specification of tradeoffs, and relevant uncertainty can be addressed through targeted science (i.e., models and monitoring). We describe how to integrate habitat and fuel monitoring with decision-making focused on the dual objectives of managing for endangered species and minimizing catastrophic fire risk. Under certain conditions, both objectives might be achieved by a similar management policy; other conditions require tradeoffs between objectives. Knowledge about system responses to actions can be informed by developing hypotheses based on ideas about fire behavior and then applying competing management actions to different land units in the same system state. Monitoring and management integration is important to optimize state-specific management decisions and to increase knowledge about system responses. We believe this approach has broad utility and identifies a clear role for land cover modeling programs intended to inform decision-making.
NASA Astrophysics Data System (ADS)
Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.
2014-12-01
We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections for future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components - model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable to the assessment of uncertainties in any modelling framework. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5, RCP2.6, RCP4.5 and RCP8.5 are considered, representing low, medium and high emissions. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in the future, especially for temperature, due to the divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections. Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternate plans or decisions that may be formulated using GCM simulations.
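The abstract does not state the SREV formula explicitly; one plausible reading, in which the variance of the simulated quantity at each time step is decomposed into model, scenario, and initial-condition (ensemble) components whose square roots are reported, is sketched below for a single grid cell. The array layout and the simple averaged-variance decomposition are assumptions for illustration.

```python
import numpy as np

def srev_components(sims):
    """Square Root Error Variance components for one grid cell.

    sims: array of shape (n_models, n_scenarios, n_ensembles, n_times).
    Returns per-time-step SREVs for model, scenario and ensemble (initial
    condition) uncertainty, plus a total combining the three in quadrature.
    (One plausible interpretation of the SREV described in the abstract.)
    """
    model_var    = sims.var(axis=0, ddof=1).mean(axis=(0, 1))   # across models
    scenario_var = sims.var(axis=1, ddof=1).mean(axis=(0, 1))   # across scenarios
    ensemble_var = sims.var(axis=2, ddof=1).mean(axis=(0, 1))   # across members
    total = np.sqrt(model_var + scenario_var + ensemble_var)
    return np.sqrt(model_var), np.sqrt(scenario_var), np.sqrt(ensemble_var), total

rng = np.random.default_rng(3)
sims = rng.normal(size=(13, 3, 5, 120))     # e.g. 13 GCMs, 3 RCPs, 5 members, 120 months
m, s, e, tot = srev_components(sims)
print(m.mean(), s.mean(), e.mean(), tot.mean())
```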
NASA Astrophysics Data System (ADS)
Patel, Dhruvesh; Ramirez, Jorge; Srivastava, Prashant; Bray, Michaela; Han, Dawei
2017-04-01
Surat, known as the diamond city of Gujarat, is situated 100 km downstream of the Ukai dam, near the mouth of the river Tapi, and is affected by floods every alternate year. The city experienced catastrophic floods in 1933, 1959, 1968, 1970, 1994, 1998 and 2006. It is estimated that a single flood event during August 6-12, 2006 in the Surat and Hazira twin-city caused heavy damage, resulting in the death of 300 people and property damage worth €289 million. The peak discharge of 25768 m3 s-1 released from the Ukai dam was responsible for the disastrous flood in Surat city. To identify low-lying areas prone to inundation and reduce the uncertainty in flood mitigation measures, HEC-RAS-based 1D/2D coupled hydrodynamic modeling is carried out for Surat city. The release from the Ukai dam and the tidal level of the sea are considered as the upstream and downstream boundary conditions. 299 surveyed cross-sections have been considered for 1D modeling, whereas a topographic map at 0.5 m contour interval was used to produce a 5 m grid for Surat, and SRTM (30 and 90 m) grids have been considered for the Lower Tapi Basin (LTB). Flow is simulated under unsteady conditions, calibrated for the year 1998 and validated for the year 2006. The simulated result shows that 18:00 hr on 9th August was the worst time for Surat city, when a maximum of 75-77% of the area was under inundation. Most of the flooded area experienced a water velocity of 0.25 m/s with a flood duration of 90 hr. Due to the low velocity and long duration of the flood, low-lying areas within the west and south-west zones of the city were badly affected, whereas the south and south-east zones were least affected. Simulated results show good correlation when compared with an observed flood level map. The simulated results will be helpful to improve the flood resilience strategy at Surat city and reduce the uncertainty in flood inundation mapping for future dam releases. The present case study shows the applicability of 1D/2D coupled hydrodynamic modeling for flood inundation mapping and can be applied for flood assessment at locations with similar geographical conditions.
Crash testing difference-smoothing algorithm on a large sample of simulated light curves from TDC1
NASA Astrophysics Data System (ADS)
Rathna Kumar, S.
2017-09-01
In this work, we propose refinements to the difference-smoothing algorithm for the measurement of time delay from the light curves of the images of a gravitationally lensed quasar. The refinements mainly consist of a more pragmatic approach to choosing the smoothing time-scale free parameter, the generation of more realistic synthetic light curves for the estimation of time delay uncertainty, and the use of a plot of normalized χ2 computed over a wide range of trial time delay values to assess the reliability of a measured time delay and to identify instances of catastrophic failure. We rigorously tested the difference-smoothing algorithm on a large sample of more than a thousand pairs of simulated light curves having known true time delays between them from the two most difficult 'rungs' - rung3 and rung4 - of the first edition of the Strong Lens Time Delay Challenge (TDC1) and found an inherent tendency of the algorithm to measure the magnitude of time delay to be higher than the true value of time delay. However, we find that this systematic bias is eliminated by applying a correction to each measured time delay according to the magnitude and sign of the systematic error inferred by applying the time delay estimator on synthetic light curves simulating the measured time delay. Following these refinements, the TDC performance metrics for the difference-smoothing algorithm are found to be competitive with those of the best performing submissions of TDC1 for both the tested 'rungs'. The MATLAB codes used in this work and the detailed results are made publicly available.
Uncertainty in simulating wheat yields under climate change
NASA Astrophysics Data System (ADS)
Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J. W.; Hatfield, J. L.; Ruane, A. C.; Boote, K. J.; Thorburn, P. J.; Rötter, R. P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P. K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, A. J.; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, R.; Heng, L.; Hooker, J.; Hunt, L. A.; Ingwersen, J.; Izaurralde, R. C.; Kersebaum, K. C.; Müller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.; Olesen, J. E.; Osborne, T. M.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M. A.; Shcherbak, I.; Steduto, P.; Stöckle, C.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J. W.; Williams, J. R.; Wolf, J.
2013-09-01
Projections of climate change impacts on crop yields are inherently uncertain. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models are difficult. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
Boosting flood warning schemes with fast emulator of detailed hydrodynamic models
NASA Astrophysics Data System (ADS)
Bellos, V.; Carbajal, J. P.; Leitao, J. P.
2017-12-01
Floods are among the most destructive catastrophic events, and their frequency has increased over recent decades. To reduce flood impacts and risks, flood warning schemes are installed in flood-prone areas. Frequently, these schemes are based on numerical models which quickly provide predictions of water levels and other relevant observables. However, the high complexity of flood wave propagation in the real world and the need for accurate predictions in urban environments or floodplains hinder the use of detailed simulators: fast predictions are needed that still meet the accuracy requirements. Most detailed physics-based simulators, although accurate, cannot meet this speed demand even when High Performance Computing techniques are used, since the required simulation times are of the order of minutes to hours. As a consequence, most flood warning schemes are based on coarse ad hoc approximations that cannot take advantage of a detailed hydrodynamic simulation. In this work, we present a methodology for developing a flood warning scheme using a Gaussian process based emulator of a detailed hydrodynamic model. The methodology consists of two main stages: 1) an offline stage to build the emulator; 2) an online stage using the emulator to predict and generate warnings. The offline stage consists of the following steps: a) definition of the critical sites of the area under study and specification of the observables to predict at those sites, e.g. water depth, flow velocity, etc.; b) generation of a detailed simulation dataset to train the emulator; c) calibration of the required parameters (if measurements are available). The online stage is carried out using the emulator to predict the relevant observables quickly, while the detailed simulator runs in parallel to verify key predictions of the emulator. The speed gain of the emulator also allows uncertainty in the predictions to be quantified using ensemble methods. The methodology is applied to a real-world scenario.
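A minimal sketch of the emulator idea, assuming scikit-learn's Gaussian process regressor as the fitting tool; the inputs (rainfall intensity and duration), the training data, and the stand-in "simulator" response are hypothetical and are not the detailed hydrodynamic model of the study.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Offline stage: train on a (hypothetical) dataset of detailed-simulator runs,
# inputs = (rainfall intensity mm/h, event duration h), output = peak water depth (m)
# at one critical site.
rng = np.random.default_rng(42)
X_train = rng.uniform([5.0, 0.5], [80.0, 12.0], size=(60, 2))
y_train = 0.02 * X_train[:, 0] * np.sqrt(X_train[:, 1]) + rng.normal(0, 0.02, 60)  # stand-in for simulator output

kernel = RBF(length_scale=[20.0, 3.0]) + WhiteKernel(noise_level=1e-3)
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# Online stage: a forecast rainfall event is mapped to a depth prediction almost instantly,
# with a predictive standard deviation that can feed the warning threshold.
mean, std = emulator.predict(np.array([[45.0, 6.0]]), return_std=True)
print(f"predicted peak depth {mean[0]:.2f} m +/- {std[0]:.2f} m")
```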
COLLISIONS BETWEEN GRAVITY-DOMINATED BODIES. I. OUTCOME REGIMES AND SCALING LAWS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Leinhardt, Zoë M.; Stewart, Sarah T., E-mail: Zoe.Leinhardt@bristol.ac.uk, E-mail: sstewart@eps.harvard.edu
2012-01-20
Collisions are the core agent of planet formation. In this work, we derive an analytic description of the dynamical outcome for any collision between gravity-dominated bodies. We conduct high-resolution simulations of collisions between planetesimals; the results are used to isolate the effects of different impact parameters on collision outcome. During growth from planetesimals to planets, collision outcomes span multiple regimes: cratering, merging, disruption, super-catastrophic disruption, and hit-and-run events. We derive equations (scaling laws) to demarcate the transition between collision regimes and to describe the size and velocity distributions of the post-collision bodies. The scaling laws are used to calculate maps of collision outcomes as a function of mass ratio, impact angle, and impact velocity, and we discuss the implications of the probability of each collision regime during planet formation. Collision outcomes are described in terms of the impact conditions and the catastrophic disruption criterion, Q*_RD - the specific energy required to disperse half the total colliding mass. All planet formation and collisional evolution studies have assumed that catastrophic disruption follows pure energy scaling; however, we find that catastrophic disruption follows nearly pure momentum scaling. As a result, Q*_RD is strongly dependent on the impact velocity and projectile-to-target mass ratio in addition to the total mass and impact angle. To account for the impact angle, we derive the interacting mass fraction of the projectile; the outcome of a collision is dependent on the kinetic energy of the interacting mass rather than the kinetic energy of the total mass. We also introduce a new material parameter, c*, that defines the catastrophic disruption criterion between equal-mass bodies in units of the specific gravitational binding energy. For a diverse range of planetesimal compositions and internal structures, c* has a value of 5 ± 2, whereas for strengthless planets we find c* = 1.9 ± 0.3. We refer to the catastrophic disruption criterion for equal-mass bodies as the principal disruption curve, which is used as the reference value in the calculation of Q*_RD for any collision scenario. The analytic collision model presented in this work will significantly improve the physics of collisions in numerical simulations of planet formation and collisional evolution.
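As a compact restatement (with notation assumed, not quoted from the paper) of how the principal disruption curve relates to the specific gravitational binding energy mentioned in the abstract:

$$ Q^{*}_{\mathrm{RD}}\big|_{\gamma=1} \;=\; c^{*}\,\tfrac{4}{5}\,\pi\,G\,\rho_{1}\,R_{C1}^{2}, $$

where $R_{C1}$ is the radius of a sphere of density $\rho_{1}=1000\ \mathrm{kg\,m^{-3}}$ containing the total colliding mass. Since $\tfrac{4}{5}\pi G\rho_{1}R_{C1}^{2} = \tfrac{3}{5}\,G M_{\mathrm{tot}}/R_{C1}$ is the specific gravitational binding energy of that sphere, $c^{*}$ is dimensionless and measures the disruption criterion directly in units of the binding energy, as stated in the abstract.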
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salloum, Maher N.; Sargsyan, Khachik; Jones, Reese E.
2015-08-11
We present a methodology to assess the predictive fidelity of multiscale simulations by incorporating uncertainty in the information exchanged between the components of an atomistic-to-continuum simulation. We account for both the uncertainty due to finite sampling in molecular dynamics (MD) simulations and the uncertainty in the physical parameters of the model. Using Bayesian inference, we represent the expensive atomistic component by a surrogate model that relates the long-term output of the atomistic simulation to its uncertain inputs. We then present algorithms to solve for the variables exchanged across the atomistic-continuum interface in terms of polynomial chaos expansions (PCEs). We also consider a simple Couette flow where velocities are exchanged between the atomistic and continuum components, while accounting for uncertainty in the atomistic model parameters and the continuum boundary conditions. Results show convergence of the coupling algorithm at a reasonable number of iterations. As a result, the uncertainty in the obtained variables significantly depends on the amount of data sampled from the MD simulations and on the width of the time averaging window used in the MD simulations.
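As a one-dimensional illustration of the PCE idea only (the study uses a Bayesian, multi-dimensional construction coupled to MD data), the sketch below projects a toy response of a single standard-normal input onto probabilists' Hermite polynomials using Gauss-Hermite quadrature; the response function and expansion order are arbitrary.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Toy "atomistic" response: an expensive quantity of interest as a function of
# one standard-normal uncertain input xi (e.g. an uncertain model parameter).
def expensive_model(xi):
    return np.exp(0.3 * xi) + 0.1 * xi**2

order = 5
nodes, weights = hermegauss(order + 1)           # Gauss-Hermite (probabilists') rule
coeffs = np.zeros(order + 1)
for k in range(order + 1):
    basis_k = hermeval(nodes, [0] * k + [1])      # He_k evaluated at the quadrature nodes
    # Projection: c_k = E[f(xi) He_k(xi)] / k!, with the quadrature weights summing to sqrt(2*pi)
    coeffs[k] = np.sum(weights * expensive_model(nodes) * basis_k) / (sqrt(2 * pi) * factorial(k))

# The PCE surrogate can now stand in for the expensive model inside a coupling loop.
xi_test = 0.7
print(hermeval(xi_test, coeffs), expensive_model(xi_test))
```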
Catastrophic Disruption Threshold and Maximum Deflection from Kinetic Impact
NASA Astrophysics Data System (ADS)
Cheng, A. F.
2017-12-01
The use of a kinetic impactor to deflect an asteroid on a collision course with Earth was described in the NASA Near-Earth Object Survey and Deflection Analysis of Alternatives (2007) as the most mature approach for asteroid deflection and mitigation. The NASA DART mission will demonstrate asteroid deflection by kinetic impact at the Potentially Hazardous Asteroid 65803 Didymos in October 2022. The kinetic impactor approach is considered to be applicable with warning times of 10 years or more and with hazardous asteroid diameters of 400 m or less. In principle, a larger kinetic impactor bringing greater kinetic energy could cause a larger deflection, but input of excessive kinetic energy will cause catastrophic disruption of the target, leaving possibly large fragments still on a collision course with Earth. Thus the catastrophic disruption threshold limits the maximum deflection from a kinetic impactor. An often-cited rule of thumb states that the maximum deflection is 0.1 times the escape velocity before the target is disrupted. It turns out this rule of thumb does not work well. A comparison to numerical simulation results shows that a similar rule applies in the gravity limit, for large targets of more than 300 m, where the maximum deflection is roughly the escape velocity at a momentum enhancement factor β = 2. In the gravity limit, the rule of thumb corresponds to pure momentum coupling (μ = 1/3), but simulations find a slightly different scaling, μ = 0.43. In the smaller target size range to which kinetic impactors would apply, the catastrophic disruption limit is strength-controlled. Unless the target is unusually strong, a DART-like impactor will not disrupt any asteroid significantly smaller than the roughly 50 m size below which a hazardous object would not penetrate the atmosphere in any case.
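A back-of-the-envelope sketch of the momentum balance behind these statements, assuming Δv = βmU/M and v_esc = sqrt(2GM/R); the impactor mass, impact speed, target density, and β value are illustrative assumptions, not DART mission parameters.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def deflection_vs_escape(target_diameter_m, target_density=2000.0,
                         impactor_mass_kg=500.0, impact_speed_m_s=6000.0, beta=2.0):
    """Momentum-balance estimate of a kinetic-impact deflection (illustrative values only).

    delta_v = beta * m * U / M   and   v_esc = sqrt(2 G M / R),
    where beta is the momentum enhancement factor discussed in the abstract.
    """
    R = target_diameter_m / 2.0
    M = target_density * 4.0 / 3.0 * np.pi * R**3
    delta_v = beta * impactor_mass_kg * impact_speed_m_s / M
    v_esc = np.sqrt(2.0 * G * M / R)
    return delta_v, v_esc

for d in (100.0, 300.0, 800.0):
    dv, vesc = deflection_vs_escape(d)
    print(f"D = {d:5.0f} m: delta_v = {dv:.2e} m/s, v_esc = {vesc:.2e} m/s, ratio = {dv / vesc:.3f}")
```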
NASA Astrophysics Data System (ADS)
Michel, P.; Benz, W.; Richardson, D. C.
2005-08-01
Recent simulations of asteroid break-ups, including both the fragmentation of the parent body and the gravitational interactions of the fragments, have successfully reproduced the main properties of asteroid families formed in different regimes of impact energy. Here, using the same kind of simulations, we concentrate on a single regime of impact energy, the so-called catastrophic threshold, usually designated Qcrit, which results in the escape of half of the target's mass. Considering a wide range of diameter values and two kinds of internal structure of the parent body, monolithic and pre-shattered, we analyse their potential influence on the value of Qcrit and on the collisional outcome, limited here to the fragment size and ejection speed distributions, which are the main outcome properties used by collisional models to study the evolution of the different populations of small bodies. For all the considered diameters and both internal structures of the parent body, we confirm that the process of gravitational reaccumulation is at the origin of the largest remnant's mass. We then find that, for a given diameter of the parent body, the impact energy corresponding to the catastrophic disruption threshold is highly dependent on the internal structure of the parent body. In particular, a pre-shattered parent body containing only damaged zones but no macroscopic voids is easier to disrupt than a monolithic parent body. Other kinds of internal properties that can also characterize small bodies in real populations will be investigated in a future work.
The Modification of Biocellular Chemical Reactions by Environmental Physicochemicals
NASA Astrophysics Data System (ADS)
Ishido, M.
Environmental risk factors affect the human biological system to varying extents, ranging from the modification of biochemical reactions to cellular catastrophe. There is considerable public concern about electromagnetic fields and endocrine disruptors. Their risk assessments have not been fully achieved because of scientific uncertainty: electromagnetic fields only modify bioreactions in a restricted set of cells, and endocrine disruptors are unusual in that their effects depend on the exposure periods throughout a lifetime. Thus, we describe here their molecular characterization in order to establish new risk assessments for environmental physicochemicals.
Incorporating parametric uncertainty into population viability analysis models
McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.
2011-01-01
Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
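A minimal count-based sketch of the two-loop structure described above, with the parametric uncertainty drawn once per replicate in the outer (replication) loop and the temporal variance drawn each year in the inner (time-step) loop; all rates, sample sizes, and thresholds are illustrative, not the piping plover estimates.

```python
import numpy as np

rng = np.random.default_rng(7)

def extinction_probability(n_reps=2000, n_years=50, n0=60, quasi_ext=10,
                           mean_r=0.01, se_r=0.04, sd_temporal=0.15,
                           include_parametric=True):
    """Count-based PVA with parametric uncertainty in the replication (outer) loop
    and temporal variance in the time (inner) loop."""
    extinct = 0
    for _ in range(n_reps):
        # Outer loop: draw the growth rate once per replicate from its sampling distribution
        r_rep = rng.normal(mean_r, se_r) if include_parametric else mean_r
        n = float(n0)
        for _ in range(n_years):
            # Inner loop: temporal (environmental) variation around the replicate's mean rate
            n *= np.exp(rng.normal(r_rep, sd_temporal))
            if n < quasi_ext:
                extinct += 1
                break
    return extinct / n_reps

print("without parametric uncertainty:", extinction_probability(include_parametric=False))
print("with parametric uncertainty:   ", extinction_probability(include_parametric=True))
```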
Extreme sensitivity in Thermoacoustics
NASA Astrophysics Data System (ADS)
Juniper, Matthew
2017-11-01
In rocket engines and gas turbines, fluctuations in the heat release rate can lock into acoustic oscillations and grow catastrophically. Nine decades of engine development have shown that these oscillations are difficult to predict but can usually be eliminated with small ad hoc design changes. The difficulty in prediction arises because the oscillations' growth rate is exceedingly sensitive to parameters that cannot always be measured or simulated reliably, which introduces severe systematic error into thermoacoustic models of engines. Passive control strategies then have to be devised through full-scale engine tests, which can be ruinously expensive. For the Apollo F1 engine, for example, 2000 full-scale tests were required. Even today, thermoacoustic oscillations often re-appear unexpectedly at the full engine test stage. Although the physics is well known, a novel approach to design is required. In this presentation, the parameters of a thermoacoustic model are inferred from many thousands of automated experiments using inverse uncertainty quantification. The adjoint of this model is used to obtain cheaply the gradients of every unstable mode with respect to the model parameters. This gradient information is then used in an optimization algorithm to stabilize every thermoacoustic mode by subtly changing the geometry of the model.
Uncertainty in Simulating Wheat Yields Under Climate Change
NASA Technical Reports Server (NTRS)
Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.;
2013-01-01
Projections of climate change impacts on crop yields are inherently uncertain. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models are difficult. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
Uncertainty Analysis of Sonic Boom Levels Measured in a Simulator at NASA Langley
NASA Technical Reports Server (NTRS)
Rathsam, Jonathan; Ely, Jeffry W.
2012-01-01
A sonic boom simulator has been constructed at NASA Langley Research Center for testing the human response to sonic booms heard indoors. Like all measured quantities, sonic boom levels in the simulator are subject to systematic and random errors. To quantify these errors, and their net influence on the measurement result, a formal uncertainty analysis is conducted. Knowledge of the measurement uncertainty, or range of values attributable to the quantity being measured, enables reliable comparisons among measurements at different locations in the simulator as well as comparisons with field data or laboratory data from other simulators. The analysis reported here accounts for acoustic excitation from two sets of loudspeakers: one loudspeaker set at the facility exterior that reproduces the exterior sonic boom waveform and a second set of interior loudspeakers for reproducing indoor rattle sounds. The analysis also addresses the effect of pressure fluctuations generated when exterior doors of the building housing the simulator are opened. An uncertainty budget is assembled to document each uncertainty component, its sensitivity coefficient, and the combined standard uncertainty. The latter quantity will be reported alongside measurement results in future research reports to indicate data reliability.
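A minimal sketch of how such an uncertainty budget combines components in quadrature (root-sum-of-squares of sensitivity-weighted standard uncertainties); the component names and values below are hypothetical placeholders, not the Langley budget.

```python
import math

# (component name, standard uncertainty u_i in dB, sensitivity coefficient c_i)
# Entries and values are hypothetical placeholders for a measured-level budget.
budget = [
    ("exterior loudspeaker calibration",   0.30, 1.0),
    ("interior rattle loudspeaker level",  0.20, 1.0),
    ("microphone position",                0.15, 1.0),
    ("door-opening pressure fluctuation",  0.10, 1.0),
    ("repeatability (random)",             0.25, 1.0),
]

combined = math.sqrt(sum((c * u) ** 2 for _, u, c in budget))
expanded = 2.0 * combined  # coverage factor k = 2 (~95 % confidence)

for name, u, c in budget:
    print(f"{name:38s} u = {u:.2f} dB, c = {c:.1f}")
print(f"combined standard uncertainty u_c = {combined:.2f} dB")
print(f"expanded uncertainty (k=2)    U   = {expanded:.2f} dB")
```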
Valuing Precaution in Climate Change Policy Analysis (Invited)
NASA Astrophysics Data System (ADS)
Howarth, R. B.
2010-12-01
The U.N. Framework Convention on Climate Change calls for stabilizing greenhouse gas concentrations to prevent “dangerous anthropogenic interference” (DAI) with the global environment. This treaty language emphasizes a precautionary approach to climate change policy in a setting characterized by substantial uncertainty regarding the timing, magnitude, and impacts of climate change. In the economics of climate change, however, analysts often work with deterministic models that assign best-guess values to parameters that are highly uncertain. Such models support a “policy ramp” approach in which only limited steps should be taken to reduce the future growth of greenhouse gas emissions. This presentation will explore how uncertainties related to (a) climate sensitivity and (b) climate-change damages can be satisfactorily addressed in a coupled model of climate-economy dynamics. In this model, capping greenhouse gas concentrations at ~450 ppm of carbon dioxide equivalent provides substantial net benefits by reducing the risk of low-probability, catastrophic impacts. This result formalizes the intuition embodied in the DAI criterion in a manner consistent with rational decision-making under uncertainty.
Numerical modelling of glacial lake outburst floods using physically based dam-breach models
NASA Astrophysics Data System (ADS)
Westoby, M. J.; Brasington, J.; Glasser, N. F.; Hambrey, M. J.; Reynolds, J. M.; Hassan, M. A. A. M.; Lowe, A.
2015-03-01
The instability of moraine-dammed proglacial lakes creates the potential for catastrophic glacial lake outburst floods (GLOFs) in high-mountain regions. In this research, we use a unique combination of numerical dam-breach and two-dimensional hydrodynamic modelling, employed within a generalised likelihood uncertainty estimation (GLUE) framework, to quantify predictive uncertainty in model outputs associated with a reconstruction of the Dig Tsho failure in Nepal. Monte Carlo analysis was used to sample the model parameter space, and morphological descriptors of the moraine breach were used to evaluate model performance. Multiple breach scenarios were produced by differing parameter ensembles associated with a range of breach initiation mechanisms, including overtopping waves and mechanical failure of the dam face. The material roughness coefficient was found to exert a dominant influence over model performance. The downstream routing of scenario-specific breach hydrographs revealed significant differences in the timing and extent of inundation. A GLUE-based methodology for constructing probabilistic maps of inundation extent, flow depth, and hazard is presented and provides a useful tool for communicating uncertainty in GLOF hazard assessment.
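A schematic GLUE-style sketch of the workflow described above, with a stand-in breach model, an informal likelihood based on the misfit to one observed breach descriptor, and likelihood-weighted predictive bounds; the model, parameter ranges, and thresholds are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def breach_model(erodibility, roughness):
    """Stand-in for the dam-breach + hydrodynamic model: returns a simulated
    breach width (m) and peak discharge (m3/s). Purely illustrative."""
    width = 40.0 * erodibility / (roughness * 10.0) + rng.normal(0, 1.0)
    q_peak = 1500.0 * erodibility - 8000.0 * roughness + rng.normal(0, 50.0)
    return width, q_peak

observed_width = 55.0   # hypothetical morphological descriptor of the breach

n_samples = 5000
erod = rng.uniform(0.5, 2.0, n_samples)
rough = rng.uniform(0.02, 0.08, n_samples)
widths, q_peaks = np.vectorize(breach_model)(erod, rough)

# GLUE: informal likelihood from the misfit to the observed breach morphology,
# with a behavioural threshold; non-behavioural runs get zero weight.
misfit = np.abs(widths - observed_width)
likelihood = np.where(misfit < 10.0, 1.0 / (1.0 + misfit), 0.0)
weights = likelihood / likelihood.sum()

# Likelihood-weighted 5-95 % bounds on the predicted peak discharge
order = np.argsort(q_peaks)
cdf = np.cumsum(weights[order])
low, high = q_peaks[order][np.searchsorted(cdf, [0.05, 0.95])]
print(f"peak discharge 5-95 % predictive bounds: {low:.0f} - {high:.0f} m3/s")
```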
NASA Astrophysics Data System (ADS)
Zhang, G.; Chen, F.; Gan, Y.
2017-12-01
Uncertainties in the Noah with multiparameterization (Noah-MP) land surface model were assessed through physics-ensemble simulations for four sparsely vegetated sites located in the Tibetan Plateau region. The simulations were evaluated using observations at the four sites from the third Tibetan Plateau Experiment (TIPEX III). The impacts of uncertainties in the precipitation data used as forcing and in the parameterizations of sub-processes such as soil organic matter and the rhizosphere on the physics-ensemble simulations are identified using two different methods: natural selection and Tukey's test. This study attempts to answer the following questions: 1) what is the relative contribution of precipitation-forcing uncertainty to the overall uncertainty range of Noah-MP simulations at those sites, compared with a moister and more densely vegetated site; 2) which are the most sensitive physical parameterizations for those sites; and 3) can we identify the parameterizations that need to be improved? The investigation was conducted by evaluating the simulated seasonal evolution of soil temperature, soil moisture, and surface heat fluxes across a number of Noah-MP ensemble simulations.
Understanding Impact and Implications of Data Standards on Post Disaster Risk Analysis
NASA Astrophysics Data System (ADS)
Stevenson, Robert
2010-05-01
Although the physical and humanitarian effects of a natural catastrophe are often bound to the locality of the event, the financial impacts can have global effects. This is particularly prominent in the re/insurance community, where, through a number of market mechanisms and re/insurance structures, financial loss is mitigated amongst many companies across the globe. The level of risk a company wishes to retain, given an event, represents the level of risk decision makers deem acceptable. Catastrophe risk modelling tools aid the estimation of risk retention and transfer mechanisms, and increasingly the level of capital required to withstand a catastrophic event. These tools rely on appropriate representations of hazard, exposure, vulnerability and insurance conditions that reflect the reality of risk. In addition, accurate estimation of loss potential in the aftermath of a catastrophic event equally relies on the data available to assess the scale of damage experienced and to provide views on the likely scale of loss. A coherent and focussed data and modelling strategy is required to ensure that the risk assessment made is as accurate as possible. A fundamental factor in determining the accuracy of catastrophe model output is the quality of the data entered. It is of vital importance, therefore, to have an understanding of both the data used and the standard of this data, which will so powerfully impact upon the decision-making process. This is perhaps best illustrated through the study of historical events, such as Hurricanes Katrina and Ike. The extent of data variance in post-disaster analysis clearly demonstrates issues of data discrepancies, vintage, resolution and uncertainty propagation, and reflects on the standard of the original data utilized for modelling purposes and decision making. Using experience gained from recent events, this paper explores current data variabilities and their impacts on effective loss estimation, both in relation to reinsurance structuring and in terms of effective post-event analysis. It provides views on how data are currently applied in this context and makes suggestions as to the most important areas for future data improvements.
Quantifying radar-rainfall uncertainties in urban drainage flow modelling
NASA Astrophysics Data System (ADS)
Rico-Ramirez, M. A.; Liguori, S.; Schellart, A. N. A.
2015-09-01
This work presents the results of the implementation of a probabilistic system to model the uncertainty associated with radar rainfall (RR) estimates and the way this uncertainty propagates through the sewer system of an urban area located in the North of England. The spatial and temporal correlations of the RR errors, as well as the error covariance matrix, were computed to build an RR error model able to generate RR ensembles that reproduce the uncertainty associated with the measured rainfall. The results showed that the RR ensembles provide important information about the uncertainty in the rainfall measurement that can be propagated through the urban sewer system, and that the measured flow peaks and flow volumes are often bounded within the uncertainty band produced by the RR ensembles. In 55% of the simulated events, the uncertainties in RR measurements can explain the uncertainties observed in the simulated flow volumes. However, there are also some events where the RR uncertainty cannot explain the whole uncertainty observed in the simulated flow volumes, indicating that there are additional sources of uncertainty that must be considered, such as the uncertainty in the urban drainage model structure, the uncertainty in the calibrated parameters of the urban drainage model, and the uncertainty in the measured sewer flows.
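A minimal sketch of one way an RR error model can generate rainfall ensembles, here as a temporally correlated multiplicative error drawn from an assumed covariance matrix; the decorrelation time, error magnitude, and rainfall series are illustrative, not the fitted error model of the study.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical radar-rainfall (RR) error model: multiplicative errors in log space
# with exponential decay of correlation over time (values are illustrative).
n_steps = 48                      # 5-min steps over a 4-hour event
timestep_min = 5.0
corr_time_min = 30.0              # error decorrelation time
log_error_std = 0.35              # spread of the multiplicative error

lags = np.abs(np.subtract.outer(np.arange(n_steps), np.arange(n_steps))) * timestep_min
error_cov = (log_error_std ** 2) * np.exp(-lags / corr_time_min)

radar_rain = np.clip(5.0 * np.sin(np.linspace(0, np.pi, n_steps)), 0.1, None)  # mm/h, stand-in event

# Draw an ensemble of plausible "true" rainfall series consistent with the error model;
# each member can then be routed through the sewer-system model.
n_members = 100
log_errors = rng.multivariate_normal(np.zeros(n_steps), error_cov, size=n_members)
ensemble = radar_rain * np.exp(log_errors)

print("ensemble 5-95 % band at peak:",
      np.percentile(ensemble[:, n_steps // 2], [5, 95]).round(2), "mm/h")
```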
Simulation's Ensemble is Better Than Ensemble Simulation
NASA Astrophysics Data System (ADS)
Yan, X.
2017-12-01
A dynamical system is simulated from an initial state. However, initial-state data carry great uncertainty, which leads to uncertainty in the simulation. Therefore, simulation based on multiple possible initial states has been widely used in atmospheric science and has indeed been shown to lower this uncertainty; it is termed a simulation's ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble when compared with community-based simulation (most ecosystem models). In this talk, we address the advantages of individual-based simulation and of ensembles of such simulations.
Economics, ethics, and climate policy: framing the debate
NASA Astrophysics Data System (ADS)
Howarth, Richard B.; Monahan, Patricia A.
1996-04-01
This paper examines the economic and ethical dimensions of climate policy in light of existing knowledge of the impacts of global warming and the costs of greenhouse gas emissions abatement. We find that the criterion of economic efficiency, operationalized through cost-benefit analysis, is ill-equipped to cope with the pervasive uncertainties and issues of intergenerational fairness that characterize climate change. In contrast, the concept of sustainable development—that today's policies should ensure that future generations enjoy life opportunities undiminished relative to the present—is a normative criterion that explicitly addresses the uncertainties and distributional aspects of global environmental change. If one interprets the sustainability criterion to imply that it is morally wrong to impose catastrophic risks on unborn generations when reducing those risks would not noticeably diminish the quality of life of existing persons, a case can be made for significant steps to reduce greenhouse gas emissions.
NASA Astrophysics Data System (ADS)
Santabarbara, Ignacio; Haas, Edwin; Kraus, David; Herrera, Saul; Klatt, Steffen; Kiese, Ralf
2014-05-01
When using biogeochemical models to estimate greenhouse gas emissions at site to regional/national levels, the assessment and quantification of the uncertainties of the simulation results are of significant importance. The uncertainties in the simulation results of process-based ecosystem models may result from uncertainties in the process parameters that describe the model's processes, from model structural inadequacy, and from uncertainties in the observations. Data for the development and testing of the uncertainty analysis were crop yield observations and measurements of soil fluxes of nitrous oxide (N2O) and carbon dioxide (CO2) from 8 arable sites across Europe. Using the process-based biogeochemical model LandscapeDNDC to simulate crop yields and N2O and CO2 emissions, our aim is to assess the simulation uncertainty by setting up a Bayesian framework based on the Metropolis-Hastings algorithm. Using the Gelman convergence criterion and parallel computing techniques, multiple Markov chains run independently in parallel, creating a random walk that estimates the joint model parameter distribution. From this distribution we constrain the parameter space, obtain probabilities of parameter values, and identify the complex dependencies among them. With this parameter distribution, which determines soil-atmosphere C and N exchange, we are able to obtain the parameter-induced uncertainty of the simulation results and compare it with the measurement data.
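A minimal single-parameter Metropolis-Hastings sketch of the sampling idea (the study runs multiple parallel chains with the Gelman convergence criterion on the full LandscapeDNDC model); the toy flux model, prior, and likelihood are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy "model": predicted flux as a function of one process parameter theta.
def model(theta, x):
    return theta * x

x_obs = np.linspace(0.5, 5.0, 20)                   # hypothetical driver (e.g. N input)
y_obs = model(1.8, x_obs) + rng.normal(0, 0.5, 20)  # synthetic observations, sigma = 0.5

def log_posterior(theta):
    if not (0.0 < theta < 10.0):                    # uniform prior on (0, 10)
        return -np.inf
    resid = y_obs - model(theta, x_obs)
    return -0.5 * np.sum((resid / 0.5) ** 2)        # Gaussian likelihood, known sigma

# Metropolis-Hastings random walk
n_iter, step = 20000, 0.1
chain = np.empty(n_iter)
theta, lp = 1.0, log_posterior(1.0)
for i in range(n_iter):
    prop = theta + rng.normal(0, step)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:        # accept/reject
        theta, lp = prop, lp_prop
    chain[i] = theta

posterior = chain[5000:]                             # discard burn-in
print(f"posterior mean {posterior.mean():.2f}, 95 % CI "
      f"[{np.percentile(posterior, 2.5):.2f}, {np.percentile(posterior, 97.5):.2f}]")
```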
NASA Astrophysics Data System (ADS)
Noacco, V.; Wagener, T.; Pianosi, F.; Philp, T.
2017-12-01
Insurance companies provide insurance against a wide range of threats, such as natural catastrophes, nuclear incidents and terrorism. To quantify risk and support investment decisions, mathematical models are used, for example to set the premiums charged to clients to protect them from financial loss should deleterious events occur. While these models are essential tools for adequately assessing the risk attached to an insurer's portfolio, their development is costly and their value for decision-making may be limited by an incomplete understanding of uncertainty and sensitivity. Aside from the business need to understand risk and uncertainty, the insurance sector also faces regulation which requires firms to test their models in such a way that uncertainties are appropriately captured and that plans are in place to assess the risks and their mitigation. The building and testing of models constitutes a high cost for insurance companies, and it is a time-intensive activity. This study uses an established global sensitivity analysis toolbox (SAFE) to more efficiently capture the uncertainties and sensitivities embedded in models used by a leading re/insurance firm, with structured approaches to validate these models and test the impact of assumptions on the model predictions. It is hoped that this in turn will lead to better-informed and more robust business decisions.
Flow field topology of transient mixing driven by buoyancy
NASA Technical Reports Server (NTRS)
Duval, Walter M B.
2004-01-01
Transient mixing driven by buoyancy occurs through the birth of a symmetric Rayleigh-Taylor morphology (RTM) structure for large length scales. Beyond its critical bifurcation the RTM structure exhibits self-similarity and occurs on smaller and smaller length scales. The dynamics of the RTM structure, its nonlinear growth and internal collision, show that its genesis occurs from an explosive bifurcation which leads to the overlap of resonance regions in phase space. This event shows the coexistence of regular and chaotic regions in phase space, which is corroborated by the existence of horseshoe maps. A measure of local chaos given by the topological entropy indicates that as the system evolves there is growth of uncertainty. Breakdown of the dissipative RTM structure occurs during the transition from explosive to catastrophic bifurcation; this event gives rise to annihilation of the separatrices which drives overlap of resonance regions. The global bifurcation of explosive and catastrophic events in phase space for the large length scale of the RTM structure serves as a template for the mixing that occurs on smaller and smaller length scales. Copyright 2004 American Institute of Physics.
Assessment of an explosive LPG release accident: a case study.
Bubbico, Roberto; Marchini, Mauro
2008-07-15
In the present paper, an accident that occurred during a liquefied petroleum gas (LPG) tank filling operation is examined. During the transfer of LPG from the source road tank car to the receiving fixed storage vessel, an accidental release of LPG gave rise to different final consequences, ranging from a pool fire to a fireball and to the catastrophic rupture of the tank with the subsequent explosion of its contents. The sequence of events has been investigated using some of the consequence calculation models most commonly adopted in risk analysis and accident investigation. On the one hand, this allows the links between the various events of the accident to be better understood. On the other hand, a comparison between the results of the calculations and the damage actually observed after the accident allows the accuracy of the prediction models to be checked and their validity to be critically assessed. In particular, it was shown that the largest uncertainty is associated with the calculation of the energy involved in the physical expansion of the fluid (both liquid and vapor) after the catastrophic rupture of the tank.
Windstorms and Insured Loss in the UK: Modelling the Present and the Future
NASA Astrophysics Data System (ADS)
Hewston, R.; Dorling, S.; Viner, D.
2006-12-01
Worldwide, the costs of catastrophic weather events have increased dramatically in recent years, with average annual insured losses rising from a negligible level in 1950 to over $10bn in 2005 (Munich Re 2006). When losses from non-catastrophic weather-related events are included, this figure is doubled. A similar trend is exhibited in the UK, with claims totalling over £6bn for the period 1998-2003, more than twice the value for the previous five years (Dlugolecki 2004). More than 70% of this loss is associated with storms. Extratropical cyclones are the main source of wind damage in the UK. In this research, a windstorm model is constructed to simulate patterns of insured loss associated with wind damage in the UK. Observed daily maximum wind gust speeds and a variety of socioeconomic datasets are utilised in a GIS-generated model, which is verified against actual domestic property insurance claims data from two major insurance providers. The increased frequency and intensity of extreme events anticipated to accompany climate change in the UK will have a direct effect on general insurance, with the greatest impact expected to be on property insurance (Dlugolecki 2004). A range of experiments will be run using Regional Climate Model output data, in conjunction with the windstorm model, to simulate possible future losses resulting from climate change, assuming no alteration to the vulnerability of the building stock. Losses for the periods 2020-2050 and 2070-2100 will be simulated under the various IPCC emissions scenarios. Munich Re (2006). Annual Review: Natural Catastrophes 2005. Munich, Munich Re: 52. Dlugolecki, A. (2004). A Changing Climate for Insurance - A summary report for Chief Executives and Policymakers, Association of British Insurers
Adjoint-Based Uncertainty Quantification with MCNP
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seifried, Jeffrey E.
2011-09-01
This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
Can reduction of uncertainties in cervix cancer brachytherapy potentially improve clinical outcome?
Nesvacil, Nicole; Tanderup, Kari; Lindegaard, Jacob C; Pötter, Richard; Kirisits, Christian
2016-09-01
The aim of this study was to quantify the impact of different types and magnitudes of dosimetric uncertainties in cervix cancer brachytherapy (BT) on tumour control probability (TCP) and normal tissue complication probability (NTCP) curves. A dose-response simulation study was based on systematic and random dose uncertainties and TCP/NTCP models for the CTV and rectum. Large patient cohorts were simulated assuming different levels of dosimetric uncertainty. TCP and NTCP were computed based on the planned doses, the simulated dose uncertainty, and an underlying TCP/NTCP model. Systematic uncertainties of 3-20% and random uncertainties with a 5-30% standard deviation per BT fraction were analysed. Systematic dose uncertainties of 5% lead to a 1% decrease/increase of TCP/NTCP, while random uncertainties of 10% had negligible impact on the dose-response curve at clinically relevant dose levels for target and OAR. Random OAR dose uncertainties of 30% resulted in an NTCP increase of 3-4% for planned doses of 70-80 Gy EQD2. TCP is robust to dosimetric uncertainties when the dose prescription is in the flatter region of the dose-response curve at doses >75 Gy. For OARs, improved clinical outcome is expected from reduction of uncertainties via sophisticated dose delivery and treatment verification. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
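A minimal sketch of how systematic and random dose uncertainties can be pushed through a dose-response curve of this kind; the logistic TCP parameters, prescription dose, and fraction number are illustrative assumptions, not the published cervix-cancer model.

```python
import numpy as np

rng = np.random.default_rng(13)

def tcp(dose_gy, d50=65.0, gamma50=2.0):
    """Logistic tumour-control-probability curve (illustrative parameters only)."""
    return 1.0 / (1.0 + np.exp(4.0 * gamma50 * (1.0 - dose_gy / d50)))

def population_tcp(planned_dose, systematic_pct, random_pct, n_fractions=4, n_patients=20000):
    """Mean TCP over a simulated cohort with systematic (per-patient) and random
    (per-fraction) dose uncertainties, each given as a percentage of the planned dose."""
    per_patient_shift = rng.normal(1.0, systematic_pct / 100.0, n_patients)
    per_fraction = rng.normal(1.0, random_pct / 100.0, (n_patients, n_fractions)).mean(axis=1)
    delivered = planned_dose * per_patient_shift * per_fraction
    return tcp(delivered).mean()

planned = 85.0  # Gy EQD2, illustrative prescription
print("no uncertainty:          ", round(tcp(planned), 3))
print("5 % systematic:          ", round(population_tcp(planned, 5.0, 0.0), 3))
print("10 % random per fraction:", round(population_tcp(planned, 0.0, 10.0), 3))
```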
New perspectives on the accretion and internal evolution of Venus
NASA Astrophysics Data System (ADS)
O'Rourke, J. G.
2017-12-01
Dichotomous conditions on Earth and Venus present one of the most compelling mysteries in our Solar System. Ongoing debate centers on how the internal dynamics of Venus have shaped its atmospheric composition, surface features, and even habitability over geologic time. In particular, Venus may have resembled Earth for billions of years before suffering catastrophic transformation, or perhaps some accretionary process set these twin planets on divergent paths from the beginning. Unfortunately, the limited quality of decades-old data—particularly the low resolution of radar imagery and global topography from NASA's Magellan mission—harms our ability to draw definite conclusions. But some progress is possible given recent advances in modeling techniques and improved topography derived from stereo images that are available for roughly twenty percent of the surface. Here I present simulations of the interior evolution of Venus consistent with all available constraints and, more importantly, identify future measurements that would dramatically narrow the range of acceptable scenarios. Obtaining high-resolution imagery and topography, along with any information about the temporal history of a magnetic field, is extremely important. Deformation of geologic features constrains the surface heat flow and lithospheric rheology during their formation. Determining whether craters with radar-dark floors (which comprise 80% of the population) are actually embayed by lava flows would finally settle the controversy over catastrophic versus equilibrium resurfacing. If the core of Venus has completely solidified, then the lack of an internally generated magnetic field today is unsurprising. We might expect dynamo action in the past since relatively high mantle temperatures may increase the rate of core cooling—unless a lack of giant impacts during accretion permitted chemical stratification that resists convection. In any case, uncertainty about our celestial cousin reveals a general ignorance of fundamental processes governing planetary evolution and demands renewed effort to gather new observations.
NASA Technical Reports Server (NTRS)
Radespiel, Rolf; Hemsch, Michael J.
2007-01-01
The complexity of modern military systems, as well as the cost and difficulty associated with experimentally verifying system and subsystem designs, makes the use of high-fidelity simulation a future alternative for design and development. The predictive ability of such simulations, for example computational fluid dynamics (CFD) and computational structural mechanics (CSM), has matured significantly. However, for numerical simulations to be used with confidence in design and development, quantitative measures of uncertainty must be available. The AVT 147 Symposium has been established to compile state-of-the-art methods of assessing computational uncertainty, to identify future research and development needs associated with these methods, and to present examples of how these needs are being addressed and how the methods are being applied. Papers were solicited that address uncertainty estimation associated with high-fidelity, physics-based simulations. The solicitation included papers that identify sources of error and uncertainty in numerical simulation from either the industry perspective or from the disciplinary or cross-disciplinary research perspective. Examples of the industry perspective were to include how computational uncertainty methods are used to reduce system risk in various stages of design or development.
NASA Astrophysics Data System (ADS)
Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara
2015-09-01
Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proved effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed to account for model conceptual (structural) uncertainty in real-time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by the alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in a case study of the Bronx River watershed, New York City. Results of the uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% compared with the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also adequate time for incorporating the required mitigation measures when dealing with potentially extreme runoff events and flood hazard. The results of this study can be used to identify the main factors affecting flood hazard analysis.
Huang, Zhijiong; Hu, Yongtao; Zheng, Junyu; Yuan, Zibing; Russell, Armistead G; Ou, Jiamin; Zhong, Zhuangmin
2017-04-04
The traditional reduced-form model (RFM), based on the high-order decoupled direct method (HDDM), is an efficient uncertainty analysis approach for air quality models, but it has large biases in uncertainty propagation due to the limitation of the HDDM in predicting nonlinear responses to large perturbations of model inputs. To overcome this limitation, a new stepwise RFM that combines several sets of local sensitivity coefficients under different conditions is proposed. Evaluations reveal that the new RFM improves the prediction of nonlinear responses. The new method is applied to quantify uncertainties in simulated PM2.5 concentrations in the Pearl River Delta (PRD) region of China as a case study. Results show that the average uncertainty range of hourly PM2.5 concentrations is -28% to 57%, which covers approximately 70% of the observed PM2.5 concentrations, while the traditional RFM underestimates the upper bound of the uncertainty range by 1-6%. Using a variance-based method, the PM2.5 boundary conditions and primary PM2.5 emissions are found to be the two major uncertainty sources in the PM2.5 simulations. The new RFM better quantifies the uncertainty range of the model simulations and can be applied to improve applications that rely on uncertainty information.
NASA Astrophysics Data System (ADS)
Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining
2017-11-01
Assessing the impact of climate change on crop production while accounting for uncertainties is essential for properly identifying sustainable agricultural practices and making decisions about them. In this study, we employed 24 climate projections, consisting of the combinations of eight GCMs and three emission scenarios, to represent climate projection uncertainty, and two statistical crop models, each with 100 parameter sets, to represent parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with these uncertainties. The results of the ensemble simulations showed that maize yield reductions were less than 5% in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, to the ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed increasing uncertainty in the yield simulations in the future periods.
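A minimal sketch of the kind of variance decomposition used to attribute ensemble spread to climate projections versus crop-model parameters; the ensemble below is synthetic, with effect sizes chosen only to illustrate the bookkeeping.

```python
import numpy as np

rng = np.random.default_rng(21)

# Hypothetical ensemble of simulated yield changes (%), indexed by
# climate projection (rows) and crop-model parameter set (columns).
n_climate, n_params = 24, 100
climate_effect = rng.normal(0.0, 6.0, n_climate)   # larger spread across projections
param_effect = rng.normal(0.0, 2.0, n_params)      # smaller spread across parameter sets
yield_change = climate_effect[:, None] + param_effect[None, :] + rng.normal(0, 1.0, (n_climate, n_params))

# Simple two-way variance decomposition of the ensemble (no interaction term):
var_total = yield_change.var()
var_climate = yield_change.mean(axis=1).var()        # variance of climate-projection means
var_params = yield_change.mean(axis=0).var()         # variance of parameter-set means

print(f"share of variance from climate projections:   {var_climate / var_total:.0%}")
print(f"share of variance from crop-model parameters: {var_params / var_total:.0%}")
```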
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.
He, L; Huang, G H; Lu, H W
2010-04-15
Solving groundwater remediation optimization problems based on proxy simulators can yield optimal solutions that differ from the "true" ones of the problem. This study presents a new stochastic optimization model under modeling uncertainty and parameter certainty (SOMUM), and the associated solution method, for simultaneously addressing the modeling uncertainty associated with simulator residuals and optimizing groundwater remediation processes. This is a new attempt, different from previous modeling efforts: the previous ones focused on addressing uncertainty in physical parameters (e.g. soil porosity), while this one aims to deal with uncertainty in the mathematical simulator (arising from model residuals). Compared to the existing modeling approaches (in which only parameter uncertainty is considered), the model has the advantages of providing mean-variance analysis for contaminant concentrations, mitigating the effects of modeling uncertainties on optimal remediation strategies, offering confidence levels of optimal remediation strategies to system designers, and reducing computational cost in optimization processes. 2009 Elsevier B.V. All rights reserved.
Effects of Parameter Uncertainty on Long-Term Simulations of Lake Alkalinity
NASA Astrophysics Data System (ADS)
Lee, Sijin; Georgakakos, Konstantine P.; Schnoor, Jerald L.
1990-03-01
A first-order second-moment uncertainty analysis has been applied to two lakes in the Adirondack Park, New York, to assess the long-term response of lakes to acid deposition. Uncertainty due to parameter error and initial condition error was considered. Because the enhanced trickle-down (ETD) model is calibrated with only 3 years of field data and is used to simulate a 50-year period, the uncertainty in the lake alkalinity prediction is relatively large. When a best estimate of parameter uncertainty is used, the annual average alkalinity is predicted to be -11 ±28 μeq/L for Lake Woods and 142 ± 139 μeq/L for Lake Panther after 50 years. Hydrologic parameters and chemical weathering rate constants contributed most to the uncertainty of the simulations. Results indicate that the uncertainty in long-range predictions of lake alkalinity increased significantly over a 5- to 10-year period and then reached a steady state.
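A minimal sketch of first-order second-moment (FOSM) propagation, with sensitivity coefficients obtained by central finite differences and independence of parameters assumed; the toy alkalinity model and uncertainty values are hypothetical, not the ETD model.

```python
import numpy as np

# FOSM propagation: Var(y) ~= sum_i (dy/dp_i)^2 Var(p_i), assuming independent parameters.
def alkalinity(p):
    weathering, runoff, deposition = p
    return 50.0 * weathering - 30.0 * deposition + 10.0 * runoff  # ueq/L, toy linear model

p0 = np.array([1.2, 0.8, 1.0])          # best-estimate parameter values
p_std = np.array([0.3, 0.1, 0.15])      # parameter standard deviations

# Central finite differences for the sensitivity coefficients dy/dp_i
eps = 1e-4
sens = np.array([
    (alkalinity(p0 + eps * np.eye(3)[i]) - alkalinity(p0 - eps * np.eye(3)[i])) / (2 * eps)
    for i in range(3)
])

var_y = np.sum((sens * p_std) ** 2)
print(f"alkalinity = {alkalinity(p0):.0f} +/- {np.sqrt(var_y):.0f} ueq/L (FOSM, 1 sigma)")
```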
Guo, Changning; Doub, William H; Kauffman, John F
2010-08-01
Monte Carlo simulations were applied in the first part of this study to investigate the propagation of uncertainty in both input variables and response measurements into the model predictions of nasal spray product performance design-of-experiment (DOE) models, with the initial assumption that the models perfectly represent the relationship between the input variables and the measured responses. In this article, we discard that initial assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in the DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution for understanding the propagation of uncertainty in complex DOE models, so that the design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
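A minimal sketch of the Monte Carlo idea: perturb both the realised factor settings and the measured responses of a small design, refit the model each time, and read the coefficient uncertainties from the spread of the fitted coefficients; the design, true coefficients, and noise levels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(17)

# Hypothetical 2-factor DOE model: response = b0 + b1*x1 + b2*x2 (coded units)
design = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0]], dtype=float)
true_b = np.array([10.0, 2.0, -1.5])
x_noise_sd, y_noise_sd = 0.05, 0.20   # uncertainty in set factor levels and in response measurement

def fit(X, y):
    A = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(A, y, rcond=None)[0]

# Monte Carlo: perturb the realised factor settings and the measured responses,
# refit the model each time, and look at the spread of the fitted coefficients.
coefs = np.empty((5000, 3))
for i in range(5000):
    X_real = design + rng.normal(0, x_noise_sd, design.shape)
    y_meas = true_b[0] + X_real @ true_b[1:] + rng.normal(0, y_noise_sd, len(design))
    coefs[i] = fit(design, y_meas)    # the analyst fits against the nominal design, as in practice

print("Monte Carlo coefficient standard deviations:", coefs.std(axis=0).round(3))
```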
Theoretical Grounds for the Propagation of Uncertainties in Monte Carlo Particle Transport
NASA Astrophysics Data System (ADS)
Saracco, Paolo; Pia, Maria Grazia; Batic, Matej
2014-04-01
We introduce a theoretical framework for the calculation of uncertainties affecting observables produced by Monte Carlo particle transport, which derive from uncertainties in physical parameters input into simulation. The theoretical developments are complemented by a heuristic application, which illustrates the method of calculation in a streamlined simulation environment.
Estimating Uncertainty in N2O Emissions from US Cropland Soils
USDA-ARS?s Scientific Manuscript database
A Monte Carlo analysis was combined with an empirically-based approach to quantify uncertainties in soil N2O emissions from US croplands estimated with the DAYCENT simulation model. Only a subset of croplands was simulated in the Monte Carlo analysis which was used to infer uncertainties across the ...
Uncertainty Quantification in Multi-Scale Coronary Simulations Using Multi-resolution Expansion
NASA Astrophysics Data System (ADS)
Tran, Justin; Schiavazzi, Daniele; Ramachandra, Abhay; Kahn, Andrew; Marsden, Alison
2016-11-01
Computational simulations of coronary flow can provide non-invasive information on hemodynamics that can aid in surgical planning and research on disease propagation. In this study, patient-specific geometries of the aorta and coronary arteries are constructed from CT imaging data and finite element flow simulations are carried out using the open-source software SimVascular. Lumped parameter networks (LPN), consisting of circuit representations of vascular hemodynamics and coronary physiology, are used as coupled boundary conditions for the solver. The outputs of these simulations depend on a set of clinically derived input parameters that define the geometry and boundary conditions; however, their values are subject to uncertainty. We quantify the effects of uncertainty from two sources: uncertainty in the material properties of the vessel wall and uncertainty in the lumped parameter models, whose values are estimated by assimilating patient-specific clinical and literature data. We use a generalized multi-resolution chaos approach to propagate the uncertainty. The advantages of this approach lie in its ability to support inputs sampled from arbitrary distributions and its built-in adaptivity that efficiently approximates stochastic responses characterized by steep gradients.
DOT National Transportation Integrated Search
1995-12-01
Partial failures of aircraft primary flight-control systems and structural damages to aircraft during flight have led to catastrophic accidents with subsequent loss of life. These accidents can be prevented if sufficient alternate control autho...
Middleton, John; Vaks, Jeffrey E
2007-04-01
Errors in calibrator-assigned values lead to errors in the testing of patient samples. The ability to estimate the uncertainties of calibrator-assigned values and other variables minimizes errors in testing processes. International Organization for Standardization guidelines provide simple equations for the estimation of calibrator uncertainty with simple value-assignment processes, but other methods are needed to estimate uncertainty in complex processes. We estimated the assigned-value uncertainty with a Monte Carlo computer simulation of a complex value-assignment process, based on a formalized description of the process, with measurement parameters estimated experimentally. This method was applied to study the uncertainty of a multilevel calibrator value assignment for a prealbumin immunoassay. The simulation results showed that the component of the uncertainty added by the process of value transfer from the reference material CRM470 to the calibrator is smaller than that of the reference material itself (<0.8% vs 3.7%). Varying the process parameters in the simulation model allowed the process to be optimized while keeping the added uncertainty small. The patient result uncertainty caused by the calibrator uncertainty was also found to be small. This method of estimating uncertainty is a powerful tool that allows calibrator uncertainty to be estimated for the optimization of various value-assignment processes, with a reduced number of measurements and reduced reagent costs, while satisfying the uncertainty requirements. The new method expands and augments existing methods to allow estimation of uncertainty in complex processes.
NASA Astrophysics Data System (ADS)
Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison
2017-11-01
Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.
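The abstract above compares iterative linear solvers and preconditioners for efficiency in many-query UQ settings. The sketch below is a generic stand-in (a 2-D Poisson system rather than the actual Navier-Stokes systems) showing how an ILU-preconditioned GMRES solve can be benchmarked against an unpreconditioned one with SciPy; the matrix, sizes, and iteration cap are illustrative assumptions only.

```python
import time
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Stand-in sparse system (2-D Poisson on a 64x64 grid); in the setting described above
# this role is played by the linear systems from the discretized Navier-Stokes equations.
n = 64
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
A = (sp.kron(sp.eye(n), T) + sp.kron(T, sp.eye(n))).tocsc()
b = np.ones(A.shape[0])

def solve(M=None, label=""):
    t0 = time.perf_counter()
    x, info = spla.gmres(A, b, M=M, maxiter=5000)
    dt = time.perf_counter() - t0
    res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
    print(f"{label:>12s}: {dt:6.2f} s, relative residual {res:.1e}, info={info}")
    return x

solve(label="plain GMRES")

# Incomplete-LU factorization applied as a preconditioner through a LinearOperator
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)
solve(M=M, label="ILU-GMRES")
```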
Verifying and Validating Simulation Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hemez, Francois M.
2015-02-23
This presentation is a high-level discussion of the Verification and Validation (V&V) of computational models. Definitions of V&V are given to emphasize that “validation” is never performed in a vacuum; it accounts, instead, for the current state-of-knowledge in the discipline considered. In particular, comparisons between physical measurements and numerical predictions should account for their respective sources of uncertainty. The differences between error (bias), aleatoric uncertainty (randomness) and epistemic uncertainty (ignorance, lack-of-knowledge) are briefly discussed. Four types of uncertainty in physics and engineering are discussed: 1) experimental variability, 2) variability and randomness, 3) numerical uncertainty and 4) model-form uncertainty. Statistical sampling methods are available to propagate, and analyze, variability and randomness. Numerical uncertainty originates from the truncation error introduced by the discretization of partial differential equations in time and space. Model-form uncertainty is introduced by assumptions often formulated to render a complex problem more tractable and amenable to modeling and simulation. The discussion concludes with high-level guidance to assess the “credibility” of numerical simulations, which stems from the level of rigor with which these various sources of uncertainty are assessed and quantified.
NASA Astrophysics Data System (ADS)
Dethlefsen, Frank; Tilmann Pfeiffer, Wolf; Schäfer, Dirk
2016-04-01
Numerical simulations of hydraulic, thermal, geomechanical, or geochemical (THMC-) processes in the subsurface have been conducted for decades. Often, such simulations are commenced by applying a parameter set that is as realistic as possible. Then, a base scenario is calibrated on field observations. Finally, scenario simulations can be performed, for instance to forecast the system behavior after varying input data. In the context of subsurface energy and mass storage, however, these model calibrations based on field data are often not available, as these storage actions have not been carried out so far. Consequently, the numerical models merely rely on the parameter set initially selected, and uncertainties as a consequence of a lack of parameter values or process understanding may not be perceivable, let alone quantifiable. Therefore, conducting THMC simulations in the context of energy and mass storage deserves a particular review of the model parameterization with its input data, and such a review hardly exists so far to the required extent. Variability or aleatory uncertainty exists for geoscientific parameter values in general, and parameters for which numerous data points are available, such as aquifer permeabilities, may be described statistically, thereby exhibiting statistical uncertainty. In this case, sensitivity analyses for quantifying the uncertainty in the simulation resulting from varying this parameter can be conducted. There are other parameters where the lack of data quantity and quality implies a fundamental change in the ongoing processes when such a parameter value is varied in numerical scenario simulations. As an example of such a scenario uncertainty, varying the capillary entry pressure as one of the multiphase flow parameters can either allow or completely inhibit the penetration of an aquitard by gas. As a last example, the uncertainty of cap-rock fault permeabilities, and consequently of potential leakage rates of stored gases into shallow compartments, is regarded as recognized ignorance by the authors of this study, as no realistic approach exists to determine this parameter and values are best guesses only. In addition to these aleatory uncertainties, an equivalent classification is possible for rating epistemic uncertainties describing the degree of understanding of processes such as the geochemical and hydraulic effects following potential gas intrusions from deeper reservoirs into shallow aquifers. As an outcome of this grouping of uncertainties, prediction errors of scenario simulations can be calculated by sensitivity analyses if the uncertainties are identified as statistical. However, if scenario uncertainties exist or even recognized ignorance has to be attested to a parameter or a process in question, the outcomes of simulations mainly depend on the decisions of the modeler in choosing parameter values or in interpreting the occurrence of processes. In that case, the informative value of numerical simulations is limited by ambiguous simulation results, which cannot be refined without improving the geoscientific database through laboratory or field studies on a longer-term basis, so that the effects of subsurface use may be predicted realistically. This discussion, amended by a compilation of available geoscientific data to parameterize such simulations, will be presented in this study.
NASA Astrophysics Data System (ADS)
Michel, Patrick; Jutzi, M.; Richardson, D. C.; Benz, W.
2010-10-01
Asteroids of dark (e.g. C, D) taxonomic classes as well as Kuiper Belt objects and comets are believed to have high porosity, not only in the form of large voids but also in the form of micro-pores. The presence of such microscale porosity introduces additional physics in the impact process. We have enhanced our 3D SPH hydrocode, used to simulate catastrophic breakups, with a model of porosity [1] and validated it at small scale by comparison with impact experiments on pumice targets [2]. Our model is now ready to be applied to a large range of problems. In particular, accounting for the gravitational phase of an impact, we can study the formation of dark-type asteroid families, such as Veritas, and Kuiper-Belt families, such as Haumea. Recently we characterized for the first time the catastrophic impact energy threshold, usually called Q*D, as a function of the target's diameter, porosity, material strength and impact speed [3]. Regarding the mentioned families, our preliminary results show that accounting for porosity leads to different outcomes that may better represent their properties and constrain their definition. In particular, for Veritas, we find that its membership may need some revision [4]. The parameter space is still large, many interesting families need to be investigated and our model will be applied to a large range of cases. PM, MJ and DCR acknowledge financial support from the French Programme National de Planétologie, NASA PG&G "Small Bodies and Planetary Collisions" and NASA under Grant No. NNX08AM39G issued through the Office of Space Science, respectively. [1] Jutzi et al. 2008. Icarus 198, 242-255; [2] Jutzi et al. 2009. Icarus 201, 802-813; [3] Jutzi et al. 2010. Fragment properties at the catastrophic disruption threshold: The effect of the parent body's internal structure, Icarus 207, 54-65; [4] Michel et al. 2010. Icarus, submitted.
Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.
2002-01-01
Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
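For readers unfamiliar with the first two propagation approaches named above, here is a toy comparison of first-order method-of-moments and Monte Carlo propagation through a hypothetical weight function standing in for the aircraft analysis code; the function, nominal values, and input standard deviations are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def weight(x):
    """Hypothetical stand-in for the aircraft weight analysis, x = (wing area, aspect ratio)."""
    s, ar = x
    return 5000.0 + 12.0 * s + 300.0 * np.sqrt(ar) + 0.02 * s * ar

mu = np.array([180.0, 8.0])     # nominal design variables
sigma = np.array([2.0, 0.1])    # small input standard deviations

# Method of moments (first order): combine gradient with input variances
eps = 1e-4
grad = np.array([(weight(mu + eps * e) - weight(mu - eps * e)) / (2 * eps)
                 for e in np.eye(2)])
mom_mean = weight(mu)
mom_std = np.sqrt(np.sum((grad * sigma) ** 2))   # assumes independent inputs

# Monte Carlo propagation of the same input uncertainties
samples = rng.normal(mu, sigma, size=(100_000, 2))
w = np.array([weight(x) for x in samples])
print(f"method of moments: mean={mom_mean:.1f}, std={mom_std:.2f}")
print(f"Monte Carlo:       mean={w.mean():.1f}, std={w.std():.2f}")
```

For small input uncertainties and a smooth response, the two estimates agree closely, which is consistent with the propagation of small uncertainties described in the abstract.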
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, K.M.
1992-10-01
Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds.
Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.
2016-01-01
Decision makers and other users of simulations need to know quantified simulation credibility to make simulation-based critical decisions and effectively use simulations, respectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
NASA Astrophysics Data System (ADS)
Sun, Guodong; Mu, Mu
2016-04-01
An important source of uncertainty, which then causes further uncertainty in numerical simulations, is that residing in the parameters describing physical processes in numerical models. There are many physical parameters in numerical models in the atmospheric and oceanic sciences, and it would cost a great deal to reduce uncertainties in all physical parameters. Therefore, finding a subset of these parameters, which are relatively more sensitive and important parameters, and reducing the errors in the physical parameters in this subset would be a far more efficient way to reduce the uncertainties involved in simulations. In this context, we present a new approach based on the conditional nonlinear optimal perturbation related to parameter (CNOP-P) method. The approach provides a framework to ascertain the subset of those relatively more sensitive and important parameters among the physical parameters. The Lund-Potsdam-Jena (LPJ) dynamical global vegetation model was utilized to test the validity of the new approach. The results imply that nonlinear interactions among parameters play a key role in the uncertainty of numerical simulations in arid and semi-arid regions of China compared to those in northern, northeastern and southern China. The uncertainties in the numerical simulations were reduced considerably by reducing the errors of the subset of relatively more sensitive and important parameters. The results demonstrate that our approach not only offers a new route to identify relatively more sensitive and important physical parameters but also that it is viable to then apply "target observations" to reduce the uncertainties in model parameters.
Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2
NASA Technical Reports Server (NTRS)
Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.;
2016-01-01
Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency in using water to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, which accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to reduced growing seasons. The uncertainties in simulated crop WU, in particular due to uncertainties in simulating crop transpiration, were greater under conditions of increased temperatures and with high temperatures in combination with elevated atmospheric [CO2] concentrations. Hence the simulation of crop WU, and in particular crop transpiration under higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.
NASA Astrophysics Data System (ADS)
Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel
2013-06-01
To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
USDA-ARS?s Scientific Manuscript database
Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...
USDA-ARS?s Scientific Manuscript database
Multimodeling (MM) has been developed during the last decade to improve prediction capability of hydrological models. The MM combined with the pedotransfer functions (PTFs) was successfully applied to soil water flow simulations. This study examined the uncertainty in water content simulations assoc...
Wang, S; Huang, G H
2013-03-15
Flood disasters have been extremely severe in recent decades, and they account for about one third of all natural catastrophes throughout the world. In this study, a two-stage mixed-integer fuzzy programming with interval-valued membership functions (TMFP-IMF) approach is developed for flood-diversion planning under uncertainty. TMFP-IMF integrates the fuzzy flexible programming, two-stage stochastic programming, and integer programming within a general framework. A concept of interval-valued fuzzy membership function is introduced to address complexities of system uncertainties. TMFP-IMF can not only deal with uncertainties expressed as fuzzy sets and probability distributions, but also incorporate pre-regulated water-diversion policies directly into its optimization process. TMFP-IMF is applied to a hypothetical case study of flood-diversion planning for demonstrating its applicability. Results indicate that reasonable solutions can be generated for binary and continuous variables. A variety of flood-diversion and capacity-expansion schemes can be obtained under four scenarios, which enable decision makers (DMs) to identify the most desired one based on their perceptions and attitudes towards the objective-function value and constraints. Copyright © 2013 Elsevier Ltd. All rights reserved.
Phosphorus Adsorption and Desorption During and After Swine Manure Spill Simulations
USDA-ARS?s Scientific Manuscript database
Manure spills contribute phosphorus (P) to surface waters during catastrophic events and little is known about the effectiveness of the current manure spill remediation methods with regard to the water column and sediments within the fluvial system. Therefore, the objectives of this study were to (1...
Transport and Fate of Phosphorus During and After Manure Spill Simulations
USDA-ARS?s Scientific Manuscript database
Manure spills contribute phosphorus (P) to surface waters during catastrophic events and little is known about the effectiveness of the current manure spill remediation methods with regard to the water column and sediments within the fluvial system. Therefore, the objectives of this study were to (1...
Teaching Pediatric Residents to Provide Emotion-Ladened Information.
ERIC Educational Resources Information Center
Wolraich, Mark; And Others
1981-01-01
The ability of physicians to convey catastrophic information such as death or terminal illness is seen as an underdeveloped area of communication skills. A study to determine whether simulation with videotape feedback is an effective teaching technique to improve pediatric residents' skills in communication is discussed. (Author/MLW)
GETIT--Geoscience Education through Interactive Technology[TM]. [CD-ROM].
ERIC Educational Resources Information Center
2000
This CD-ROM uses catastrophic events to teach the fundamentals of the earth's dynamism. Topics discussed include earthquakes, volcanoes, hurricanes, plate tectonics, and many subjects that have to do with energy transfer. It contains 63 interactive, inquiry-based activities that closely simulate real life scientific practice. Students work with…
Business return in New Orleans: decision making amid post-Katrina uncertainty.
Lam, Nina S N; Pace, Kelley; Campanella, Richard; Lesage, James; Arenas, Helbert
2009-08-26
Empirical observations on how businesses respond after a major catastrophe are rare, especially for a catastrophe as great as Hurricane Katrina, which hit New Orleans, Louisiana on August 29, 2005. We analyzed repeated telephone surveys of New Orleans businesses conducted in December 2005, June 2006, and October 2007 to understand factors that influenced decisions to re-open amid post-disaster uncertainty. Businesses in the group of professional, scientific, and technical services reopened the fastest in the near term, but differences in the rate of reopening for businesses stratified by type became indistinguishable in the longer term (around two years later). A reopening rate of 65% was found for all businesses by October 2007. Discriminant analysis showed significant differences in responses reflecting their attitudes about important factors between businesses that reopened and those that did not. Businesses that remained closed at the time of our third survey (two years after Katrina) ranked levee protection as the top concern immediately after Katrina, but damage to their premises and financing became major concerns in subsequent months reflected in the later surveys. For businesses that had opened (at the time of our third survey), infrastructure protection including levee, utility, and communications were the main concerns mentioned in surveys up to the third survey, when the issue of crime became their top concern. These findings underscore the need to have public policy and emergency plans in place prior to the actual disaster, such as infrastructure protection, so that the policy can be applied in a timely manner before business decisions to return or close are made. Our survey results, which include responses from both open and closed businesses, overcome the "survivorship bias" problem and provide empirical observations that should be useful to improve micro-level spatial economic modeling of factors that influence business return decisions.
Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations
NASA Astrophysics Data System (ADS)
Niemeier, Wolfgang; Tengen, Dieter
2017-06-01
In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology, but rarely adopted within geodesy. The second step consists of Monte-Carlo-Simulations (MC-simulations) for the complete processing chain from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to this point cloud as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by using their covariance matrix. It allows a new way for uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland, is used, where classical geodetic observations are combined with GNSS data.
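A minimal sketch of the two-step idea, under invented numbers: step 1 combines GUM-style uncertainty components into a standard uncertainty for each distance observation, and step 2 runs a Monte Carlo loop over a small least-squares (Gauss-Newton) network adjustment to obtain a point cloud of coordinate estimates. The station geometry and uncertainty components are purely illustrative, not the Metsähovi network.

```python
import numpy as np

rng = np.random.default_rng(2)

# Known station coordinates and true position of the unknown point (hypothetical)
stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 80.0]])
p_true = np.array([40.0, 30.0])
d_obs = np.linalg.norm(stations - p_true, axis=1)   # "measured" distances [m]

# Step 1 (GUM-style): combine uncertainty components of each distance observation
u_instr, u_centering, u_atmo = 0.002, 0.001, 0.0015   # metres, illustrative values
u_dist = np.sqrt(u_instr**2 + u_centering**2 + u_atmo**2)

def adjust(d):
    """Gauss-Newton least-squares adjustment of the unknown point's coordinates."""
    p = np.array([30.0, 30.0])                        # initial guess
    for _ in range(10):
        diff = p - stations
        dist = np.linalg.norm(diff, axis=1)
        A = diff / dist[:, None]                      # design matrix (Jacobian)
        v = d - dist                                  # misclosures
        p = p + np.linalg.lstsq(A, v, rcond=None)[0]
    return p

# Step 2: Monte Carlo simulation of the complete processing chain
estimates = np.array([adjust(d_obs + rng.normal(0.0, u_dist, size=3))
                      for _ in range(5000)])

cov = np.cov(estimates.T)
print("mean adjusted point:", estimates.mean(axis=0))
print("empirical coordinate std devs [m]:", np.sqrt(np.diag(cov)))
```

The scatter of the 5000 adjusted positions plays the role of the point cloud described above, from which a confidence region could be derived directly.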
Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations
NASA Technical Reports Server (NTRS)
Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide
2017-01-01
Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify the human-impact-induced hydrological variations via parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the ratio of signal to noise (SNR) for average river flow during 1971-2000 simulated in two experiments, with representation of human impacts (VARSOC) and without (NOSOC). This is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 for global and varied magnitude for different basins) than those in the NOSOC, which are particularly significant in most areas of Asia and northern areas to the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in the basins. The large additional uncertainties in VARSOC simulations introduced by the inclusion of parameterizations of human impacts raise an urgent need for GHM development based on a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed as potential directions of improvement for future GHM development. We also discuss the advantages of statistical approaches to reduce the between-model uncertainties, and the importance of calibration of GHMs, not only for better performance in historical simulations but also for more robust and confident future projections of hydrological changes under a changing environment.
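The SNR diagnostic used above can be illustrated with synthetic data: for each experiment the multi-model mean of the climatological flow is divided by the between-model spread, and the difference between experiments indicates whether the human-impact parameterizations add between-model uncertainty. All numbers below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
years, n_models = 30, 4

# Synthetic 30-year annual flow series (km^3/yr) for four hypothetical GHMs
nosoc = 50.0 + 5.0 * rng.standard_normal((n_models, years))   # runs without human impacts
human_signal = rng.normal(-8.0, 4.0, size=(n_models, 1))      # each model's human-impact signal differs
varsoc = nosoc + human_signal                                 # runs with human impacts

def snr(ens):
    """Signal-to-noise ratio: multi-model mean climatology divided by between-model spread."""
    clim = ens.mean(axis=1)                # each model's long-term mean flow
    return np.abs(clim.mean()) / clim.std(ddof=1)

snr_nosoc, snr_varsoc = snr(nosoc), snr(varsoc)
print(f"SNR NOSOC:  {snr_nosoc:.1f}")
print(f"SNR VARSOC: {snr_varsoc:.1f}")
print(f"SNR difference (VARSOC - NOSOC): {snr_varsoc - snr_nosoc:.1f}  (negative => added uncertainty)")
```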
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.
Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of sensitivity (Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs?), uncertainty (What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? This is quantification of margins and uncertainty, QMU), optimization (What parameter values yield the best performing design or operating condition, given constraints?), and calibration (What models and/or parameters best match experimental data?). In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
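Dakota is driven by its own input files and a generic simulation interface, so no Dakota syntax is reproduced here. Purely as an illustration of the kind of ensemble parameter study such a tool automates, the Python sketch below runs a Latin hypercube study over a hypothetical three-parameter black-box model and performs a simple correlation-based screening of the inputs; all names and ranges are invented.

```python
import numpy as np
from scipy.stats import qmc

def black_box_simulation(x):
    """Generic stand-in for a physics-based computational model with three inputs."""
    return x[0] ** 2 + 10.0 * np.sin(x[1]) + 0.5 * x[2]

# Latin hypercube parameter variation study over assumed input ranges
lower, upper = np.array([0.0, 0.0, -1.0]), np.array([2.0, np.pi, 1.0])
sampler = qmc.LatinHypercube(d=3, seed=0)
X = qmc.scale(sampler.random(n=200), lower, upper)
y = np.array([black_box_simulation(x) for x in X])

# Simple screening: correlation of each input with the output over the ensemble
for i, name in enumerate(["x1", "x2", "x3"]):
    r = np.corrcoef(X[:, i], y)[0, 1]
    print(f"{name}: correlation with output = {r:+.2f}")
print(f"output mean = {y.mean():.2f}, std = {y.std():.2f}")
```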
Uncertainty and sensitivity assessment of flood risk assessments
NASA Astrophysics Data System (ADS)
de Moel, H.; Aerts, J. C.
2009-12-01
Floods are one of the most frequent and costly natural disasters. In order to protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic developments in flood prone areas. On top of that, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries to move from a protective flood management approach to a more risk based flood management approach. In a risk based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments assess flood risks in monetary terms (damage estimated for specific situations or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. This results from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. Whilst common in some other disciplines, as with integrated assessment models, full uncertainty and sensitivity analyses of flood risk assessments are not so common. Various studies have addressed uncertainties regarding flood risk assessments, but have mainly focussed on the hydrological conditions. However, uncertainties in other components of the risk assessment, like the relation between water depth and monetary damage, can be substantial as well. This research therefore tries to assess the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty will also be attributed to the different input parameters using a variance based sensitivity analysis. Assessing and visualizing the uncertainties of the final risk estimate will be helpful to decision makers to make better informed decisions, and attributing this uncertainty to the input parameters helps to identify which parameters are most important when it comes to uncertainty in the final estimate and should therefore deserve additional attention in further research.
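A compact, hypothetical version of such a Monte Carlo risk assessment with a variance-based sensitivity decomposition is sketched below; the input distributions, the depth-damage relation, and the binning estimator of the first-order indices are simplifications chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Uncertain inputs of a monetary flood-risk estimate (all distributions illustrative)
p_flood = rng.lognormal(mean=np.log(0.01), sigma=0.3, size=n)   # annual flood probability
depth = rng.gamma(shape=2.0, scale=1.0, size=n)                 # inundation depth [m]
exposure = rng.normal(500e6, 100e6, size=n)                     # value at risk [EUR]
dd_scale = rng.uniform(0.6, 1.4, size=n)                        # depth-damage curve uncertainty

damage_fraction = dd_scale * np.clip(depth / 5.0, 0.0, 1.0)     # simple depth-damage relation
ead = p_flood * damage_fraction * exposure                      # expected annual damage

def first_order_index(x, y, bins=50):
    """Crude first-order variance-based sensitivity: Var(E[Y|X]) / Var(Y) via binning."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    weights = np.array([(idx == b).mean() for b in range(bins)])
    return np.sum(weights * (cond_means - y.mean()) ** 2) / y.var()

print(f"EAD mean: {ead.mean()/1e6:.1f} M EUR, 5-95% range: "
      f"{np.percentile(ead, 5)/1e6:.1f}-{np.percentile(ead, 95)/1e6:.1f} M EUR")
for name, x in [("flood probability", p_flood), ("depth", depth),
                ("exposure", exposure), ("depth-damage scaling", dd_scale)]:
    print(f"S1 {name:22s}: {first_order_index(x, ead):.2f}")
```

The first-order indices attribute the output variance to individual inputs, which is the kind of attribution the abstract proposes for identifying the parameters that matter most.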
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of the probability density functions and the moments, optimization problems that pursue performance, robustness and reliability based designs are studied. By specifying the desired outputs in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of the design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem and a meaningful probabilistic description of its solution is provided in the examples. In the stability control problem the analysis shows that standard deterministic approaches lead to designs with high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.
Error and Uncertainty Quantification in the Numerical Simulation of Complex Fluid Flows
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2010-01-01
The failure of numerical simulation to predict physical reality is often a direct consequence of the compounding effects of numerical error arising from finite-dimensional approximation and physical model uncertainty resulting from inexact knowledge and/or statistical representation. In this topical lecture, we briefly review systematic theories for quantifying numerical errors and restricted forms of model uncertainty occurring in simulations of fluid flow. A goal of this lecture is to elucidate both positive and negative aspects of applying these theories to practical fluid flow problems. Finite-element and finite-volume calculations of subsonic and hypersonic fluid flow are presented to contrast the differing roles of numerical error and model uncertainty for these problems.
Pauly, M V; Herring, B J
2000-07-01
This paper outlines a feasible employee premium contribution policy, which would reduce the inefficiency associated with adverse selection when a limited coverage insurance policy is offered alongside a more generous policy. The "efficient premium contribution" is defined and is shown to lead to an efficient allocation across plans of persons who differ by risk, but it may also redistribute against higher risks. A simulation of the additional option of a catastrophic health plan (CHP) accompanied by a medical savings account (MSA) is presented. The efficiency gains from adding the MSA/catastrophic health insurance plan (CHP) option are positive but small, and the adverse consequences for high risks under an efficient employee premium are also small.
Vas, Pál József; Zseni, Annamária
2007-01-01
The authors think that the destructive factors that influence one's destiny in life could be the transmissions of collective, familial, and other factors coming from the clan system. This transmission is described by the concept of transgenerational trauma. A burdensome heritage can either directly, or indirectly, be passed on, even through several generations, as can be seen in the presented cases. Cases of intrauterine catastrophes are also presented. A catastrophe like this is the case of vanishing twins. Four psychotherapy cases are analyzed in which the patients' sufferings may be attributed to the intrauterine death of their twin. In two of the cases the loss of a twin sibling is a proven biological fact. In the other two cases there is a high probability that the same has happened. A novel element introduced by the authors in the interpretation of this phenomenon is the concept that the fetus and the embryo may be able to preserve the memories of the experienced catastrophes, which as state-dependent memories will be revived in stress situations in the form of physical symptoms and feelings connected to the trauma. However, at this point in time traditional medical thinking is unable to explain the process through which a burdensome heritage is taken over from previous generations. The authors present Bert Hellinger's family constellation and Rupert Sheldrake's theory of morphic resonance as well as the uncertainty principle of quantum psychology. All these consider the multi-dimensional, topological reality that is beyond time, and not the four-dimensional geometrical space, as the medium in which transgenerational pieces of information spread.
NASA Astrophysics Data System (ADS)
Karssenberg, Derek; Bierkens, Marc
2014-05-01
Complex systems may switch between contrasting stable states under gradual change of a driver. Such critical transitions often result in considerable long-term damage because strong hysteresis impedes reversion, and the transition becomes catastrophic. Critical transitions largely reduce our capability of forecasting future system states because it is hard to predict the timing of their occurrence [2]. Moreover, for many systems it is unknown how rapidly the critical transition unfolds when the tipping point has been reached. The rate of change during collapse, however, is important information because it determines the time available to take action to reverse a shift [1]. In this study we explore the rate of change during the degradation of a vegetation-soil system on a hillslope from a state with considerable vegetation cover and large soil depths, to a state with sparse vegetation and a bare rock or negligible soil depths. Using a distributed, stochastic model coupling hydrology, vegetation, weathering and water erosion, we derive two differential equations describing the vegetation and the soil system, and their interaction. Two stable states - vegetated and bare - are identified by means of analytical investigation, and it is shown that the change between these two states is a critical transition as indicated by hysteresis. Surprisingly, when the tipping point is reached under a very slow increase of grazing pressure, the transition between the vegetated and the bare state can either unfold rapidly, over a few years, or gradually, occurring over decennia up to millennia. These differences in the rate of change during the transient state are explained by differences in bedrock weathering rates. This finding emphasizes the considerable uncertainty associated with forecasting catastrophic shifts in ecosystems, which is due to both difficulties in forecasting the timing of the tipping point and the rate of change when the transition unfolds. References [1] Hughes, T. P., Linares, C., Dakos, V., van de Leemput, I. a, & van Nes, E. H. (2013). Living dangerously on borrowed time during slow, unrecognized regime shifts. Trends in ecology & evolution, 28(3), 149-55. [2] Karssenberg, D., & Bierkens, M. F. P. (2012). Early-warning signals (potentially) reduce uncertainty in forecasted timing of critical shifts. Ecosphere, 3(2).
Effects of long and short simulated flights on the saccadic eye movement velocity of aviators.
Di Stasi, Leandro L; McCamy, Michael B; Martinez-Conde, Susana; Gayles, Ellis; Hoare, Chad; Foster, Michael; Catena, Andrés; Macknik, Stephen L
2016-01-01
Aircrew fatigue is a major contributor to operational errors in civil and military aviation. Objective detection of pilot fatigue is thus critical to prevent aviation catastrophes. Previous work has linked fatigue to changes in oculomotor dynamics, but few studies have examined this relationship in critical safety environments. Here we measured the eye movements of US Marine Corps combat helicopter pilots before and after simulated flight missions of different durations. We found a decrease in saccadic velocities after long simulated flights compared to short simulated flights. These results suggest that saccadic velocity could serve as a biomarker of aviator fatigue.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...
2018-02-09
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-12
Because of the rapid economic growth in China, many regions are subject to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China, were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ketabchi, Hamed
2017-12-01
Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, which are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
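To make the emulation idea concrete, the sketch below trains a scikit-learn Gaussian process on a modest number of runs of a stand-in "simulator" (an analytic function replacing the expensive SWI model) and then uses the cheap emulator for Monte Carlo under recharge uncertainty, as one would inside the S/O loop. The simulator, parameter ranges, and distributions are invented for the example and do not represent the Kish Island model.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(5)

def swi_simulator(x):
    """Stand-in for the expensive seawater-intrusion model: inputs are
    (total pumping rate, recharge rate), output is a toe-penetration length."""
    q, r = x[..., 0], x[..., 1]
    return 800.0 + 35.0 * q - 60.0 * np.sqrt(r) + 4.0 * q**2 / (r + 1.0)

# A modest design of "expensive" training runs
X_train = rng.uniform([0.0, 1.0], [10.0, 20.0], size=(60, 2))
y_train = swi_simulator(X_train)

kernel = ConstantKernel(1.0) * RBF(length_scale=[1.0, 1.0])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# Monte Carlo over uncertain recharge for one candidate pumping scheme, run on the emulator
q_candidate = 6.0
recharge = rng.lognormal(np.log(8.0), 0.25, size=10_000)
X_mc = np.column_stack([np.full_like(recharge, q_candidate), recharge])
toe_mc = gp.predict(X_mc)

# Accuracy check against the "true" simulator on a small test set
X_test = rng.uniform([0.0, 1.0], [10.0, 20.0], size=(200, 2))
err = gp.predict(X_test) - swi_simulator(X_test)
print(f"emulator RMSE on test set: {np.sqrt(np.mean(err**2)):.2f}")
print(f"toe length under recharge uncertainty: mean={toe_mc.mean():.1f}, "
      f"95th percentile={np.percentile(toe_mc, 95):.1f}")
```

Because each emulator prediction is essentially free, the Monte Carlo loop (and, in turn, the optimization that wraps it) becomes tractable, which is the computational point made in the abstract.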
MODIS land cover uncertainty in regional climate simulations
NASA Astrophysics Data System (ADS)
Li, Xue; Messina, Joseph P.; Moore, Nathan J.; Fan, Peilei; Shortridge, Ashton M.
2017-12-01
MODIS land cover datasets are used extensively across the climate modeling community, but inherent uncertainties and associated propagating impacts are rarely discussed. This paper modeled uncertainties embedded within the annual MODIS Land Cover Type (MCD12Q1) products and propagated these uncertainties through the Regional Atmospheric Modeling System (RAMS). First, land cover uncertainties were modeled using pixel-based trajectory analyses from a time series of MCD12Q1 for Urumqi, China. Second, alternative land cover maps were produced based on these categorical uncertainties and passed into RAMS. Finally, simulations from RAMS were analyzed temporally and spatially to reveal impacts. Our study found that MCD12Q1 struggles to discriminate between grasslands and croplands or grasslands and barren in this study area. Such categorical uncertainties have significant impacts on regional climate model outputs. All climate variables examined demonstrated impact across the various regions, with latent heat flux affected most with a magnitude of 4.32 W/m2 in domain average. Impacted areas were spatially connected to locations of greater land cover uncertainty. Both biophysical characteristics and soil moisture settings in regard to land cover types contribute to the variations among simulations. These results indicate that formal land cover uncertainty analysis should be included in MCD12Q1-fed climate modeling as a routine procedure.
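One way to read the workflow above: per-pixel class probabilities are estimated from the multi-year label trajectory, and alternative categorical maps are then drawn from those probabilities and fed to the climate model. The toy sketch below does this for a tiny invented pixel window with three classes; it is not the authors' trajectory analysis, only the general sampling idea.

```python
import numpy as np

rng = np.random.default_rng(6)
classes = ["grassland", "cropland", "barren"]

# Hypothetical yearly MCD12Q1-style labels (13 years) for a 2x3 pixel window,
# encoded as indices into `classes`; a stable pixel repeats the same label every year.
trajectories = np.array([
    [[0] * 13,  [0, 1] * 6 + [0],  [1] * 13],
    [[2] * 13,  [0, 2] * 6 + [0],  [0] * 13],
])

# Per-pixel class probabilities estimated from how often each label occurs in the trajectory
counts = np.stack([(trajectories == k).sum(axis=-1) for k in range(len(classes))], axis=-1)
probs = counts / counts.sum(axis=-1, keepdims=True)

def draw_realization(p):
    """Sample one alternative land cover map from the per-pixel class probabilities."""
    flat = p.reshape(-1, p.shape[-1])
    labels = np.array([rng.choice(len(classes), p=pi) for pi in flat])
    return labels.reshape(p.shape[:-1])

# Each realization would be passed to the regional climate model (RAMS) as its land cover field
for i in range(3):
    print(f"realization {i}:\n{draw_realization(probs)}")
```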
NASA Astrophysics Data System (ADS)
Badawy, B.; Fletcher, C. G.
2017-12-01
The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, hence reducing their uncertainty ranges, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
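The emulate-then-rank workflow described above can be sketched with scikit-learn: an SVR emulator is fitted to parameter samples of a stand-in land model, and parameters are then ranked by permutation importance. The toy response function and the use of model-agnostic permutation importance on the emulator (rather than a separate random forest) are simplifying assumptions; the parameter names merely echo those mentioned in the abstract.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(7)
params = ["albedo_refresh_threshold", "limiting_snow_depth", "fresh_snow_density", "max_albedo"]

def toy_land_model(x):
    """Stand-in for a CLASS run: returns a scalar snow diagnostic (e.g. mean SWE)."""
    return (3.0 * x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.3 * x[:, 2]
            + 0.05 * x[:, 3] + 0.1 * rng.standard_normal(len(x)))

# "Training cases": parameter samples spanning their assumed uncertainty ranges (here [0, 1])
X = rng.uniform(size=(400, len(params)))
y = toy_land_model(X)

emulator = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.01)).fit(X, y)

# Rank parameters by how much shuffling each one degrades the emulator's predictions
X_test = rng.uniform(size=(200, len(params)))
result = permutation_importance(emulator, X_test, toy_land_model(X_test),
                                n_repeats=20, random_state=0)
for i in np.argsort(result.importances_mean)[::-1]:
    print(f"{params[i]:28s} importance = {result.importances_mean[i]:.3f}")
```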
Uncertainty quantification in volumetric Particle Image Velocimetry
NASA Astrophysics Data System (ADS)
Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos
2016-11-01
Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
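In its simplest independent-error form, the propagation step at the end of the abstract amounts to a root-sum-square combination of the position-based and correlation-based contributions; the numbers below are invented and the actual framework is considerably more detailed.

```python
import numpy as np

# Illustrative per-vector uncertainty components for one velocity component (units: voxels),
# standing in for the position- and correlation-based terms described above.
u_position = 0.08      # uncertainty from triangulated particle position / mapping function
u_correlation = 0.12   # uncertainty from the 3-D cross-correlation displacement estimate
dt = 1.0e-3            # laser pulse separation [s]
voxel_size = 40e-6     # physical size of a voxel [m]

# Root-sum-square propagation assuming the two contributions are independent
u_displacement = np.sqrt(u_position**2 + u_correlation**2)   # voxels
u_velocity = u_displacement * voxel_size / dt                # m/s

print(f"combined displacement uncertainty: {u_displacement:.3f} voxels")
print(f"velocity uncertainty: {u_velocity:.4f} m/s")
```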
Uncertainties in Past and Future Global Water Availability
NASA Astrophysics Data System (ADS)
Sheffield, J.; Kam, J.
2014-12-01
Understanding how water availability changes on inter-annual to decadal time scales and how it may change in the future under climate change is a key part of understanding future stresses on water and food security. Historic evaluations of water availability on regional to global scales are generally based on large-scale model simulations with their associated uncertainties, in particular for long-term changes. Uncertainties are due to model errors and missing processes, parameter uncertainty, and errors in meteorological forcing data. Recent multi-model inter-comparisons and impact studies have highlighted large differences for past reconstructions, due to different simplifying assumptions in the models or the inclusion of physical processes such as CO2 fertilization. Modeling of direct anthropogenic factors such as water and land management also carries large uncertainties in their physical representation and from a lack of socio-economic data. Furthermore, there is little understanding of the impact of uncertainties in the meteorological forcings that underpin these historic simulations. Similarly, future changes in water availability are highly uncertain due to climate model diversity, natural variability and scenario uncertainty, each of which dominates at different time scales. In particular, natural climate variability is expected to dominate any externally forced signal over the next several decades. We present results from multi-land surface model simulations of the historic global availability of water in the context of natural variability (droughts) and long-term changes (drying). The simulations take into account the impact of uncertainties in the meteorological forcings and the incorporation of water management in the form of reservoirs and irrigation. The results indicate that model uncertainty is important for short-term drought events, and forcing uncertainty is particularly important for long-term changes, especially uncertainty in precipitation due to reduced gauge density in recent years. We also discuss uncertainties in future projections from these models as driven by bias-corrected and downscaled CMIP5 climate projections, in the context of the balance between climate model robustness and climate model diversity.
Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu
2005-01-01
Geostatistical stochastic simulation is always combined with Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models as a result of their complexity, it is always infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V
2009-01-01
The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty, increasing both their economic cost and its variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μA) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification related parameters, e.g. ηg (anoxic growth rate correction factor) and ηh (anoxic hydrolysis rate correction factor), becomes less important when an S(NO) controller manipulating an external carbon source addition is implemented.
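The Monte Carlo plus SRC workflow can be illustrated generically: sample uncertain inputs, evaluate the model, and regress the standardized output on the standardized inputs so the coefficients rank parameter influence. The "model" below is a stand-in expression, not BSM1/ASM1, and the distributions are invented.

```python
# Monte Carlo sampling plus Standardized Regression Coefficients (SRC) sketch;
# the "plant model" is a stand-in expression, not BSM1/ASM1, and distributions are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Sample uncertain inputs (e.g., bio-kinetic parameters) from assumed distributions.
mu_A = rng.normal(0.8, 0.08, n)     # autotrophic max. growth rate (illustrative units)
eta_g = rng.normal(0.8, 0.10, n)    # anoxic growth correction factor
eta_h = rng.normal(0.4, 0.05, n)    # anoxic hydrolysis correction factor

# Placeholder output in lieu of a full plant simulation (e.g., an effluent quality index).
eqi = 100 - 40 * mu_A + 5 * eta_g + 2 * eta_h + rng.normal(0, 1, n)

# SRC: regress standardized output on standardized inputs; coefficients rank influence.
X = np.column_stack([mu_A, eta_g, eta_h])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (eqi - eqi.mean()) / eqi.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
for name, beta in zip(["mu_A", "eta_g", "eta_h"], src):
    print(f"SRC({name}) = {beta:+.2f}")
```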
The Million-Body Problem: Particle Simulations in Astrophysics
Rasio, Fred
2018-05-21
Computer simulations using particles play a key role in astrophysics. They are widely used to study problems across the entire range of astrophysical scales, from the dynamics of stars, gaseous nebulae, and galaxies, to the formation of the largest-scale structures in the universe. The 'particles' can be anything from elementary particles to macroscopic fluid elements, entire stars, or even entire galaxies. Using particle simulations as a common thread, this talk will present an overview of computational astrophysics research currently done in our theory group at Northwestern. Topics will include stellar collisions and the gravothermal catastrophe in dense star clusters.
Planning additional drilling campaign using two-space genetic algorithm: A game theoretical approach
NASA Astrophysics Data System (ADS)
Kumral, Mustafa; Ozer, Umit
2013-03-01
Grade and tonnage are the most important technical uncertainties in mining ventures because of the use of estimations/simulations, which are mostly generated from drill data. Open pit mines are planned and designed on the basis of the blocks representing the entire orebody. Each block has a different estimation/simulation variance reflecting uncertainty to some extent. The estimation/simulation realizations are submitted to the mine production scheduling process. However, the use of a block model with varying estimation/simulation variances will lead to serious risk in the scheduling. When multiple simulations are available, the dispersion variances of blocks can be thought of as capturing technical uncertainties. However, the dispersion variance cannot handle the uncertainty associated with varying estimation/simulation variances of blocks. This paper proposes an approach that generates the configuration of the best additional drilling campaign so as to produce more homogeneous estimation/simulation variances of blocks. In other words, the objective is to find the best drilling configuration in such a way as to minimize grade uncertainty under a budget constraint. The uncertainty measure of the optimization process in this paper is the interpolation variance, which considers data locations and grades. The problem is expressed as a minmax problem, which focuses on finding the best worst-case performance, i.e., minimizing the interpolation variance of the block generating the maximum interpolation variance. Since the optimization model requires computing the interpolation variances of blocks being simulated/estimated in each iteration, the problem cannot be solved by standard optimization tools. This motivates the use of a two-space genetic algorithm (GA) approach to solve the problem. The technique has two spaces: feasible drill-hole configurations, over which interpolation variance is minimized, and drill-hole simulations, over which interpolation variance is maximized. The two spaces interact to find a minmax solution iteratively. A case study was conducted to demonstrate the performance of the approach. The findings showed that the approach could be used to plan a new drilling campaign.
Technical note: false catastrophic age-at-death profiles in commingled bone deposits.
Sołtysiak, Arkadiusz
2013-12-01
Age-at-death profiles obtained using the minimum number of individuals (MNI) for mass deposits of commingled human remains may be biased by over-representation of subadult individuals. A computer simulation designed in the R environment has shown that this effect may lead to misinterpretation of such samples even in cases where the completeness rate is relatively high. The simulation demonstrates that the use of the Most Likely Number of Individuals (MLNI) substantially reduces this bias. Copyright © 2013 Wiley Periodicals, Inc.
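As a concrete illustration of the contrast between the two estimators, the sketch below computes MNI and MLNI for a single paired skeletal element. It assumes the Lincoln/Petersen-style pair-matching form of MLNI associated with Adams and Konigsberg, which is not spelled out in the abstract, and the counts are invented for illustration.

```python
# MNI vs. MLNI sketch for a single paired skeletal element, assuming the
# pair-matching (Lincoln/Petersen-style) form of MLNI; counts are invented.
def mni(left: int, right: int) -> int:
    # Conventional minimum number of individuals from the more frequent side.
    return max(left, right)

def mlni(left: int, right: int, pairs: int) -> float:
    # Most Likely Number of Individuals from left/right counts and matched pairs.
    return (left + 1) * (right + 1) / (pairs + 1) - 1

L_count, R_count, P_count = 18, 15, 9   # hypothetical femur counts and matched pairs
print("MNI :", mni(L_count, R_count))
print("MLNI:", round(mlni(L_count, R_count, P_count), 1))
```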
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented in the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatter predicted by PICAN.
NASA Astrophysics Data System (ADS)
Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.
2012-12-01
Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. Tests show that the decoupled approach is both efficient and able to provide accurate uncertainty estimates. The method is demonstrated on a Danish field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the co-simulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty are discussed.
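A greatly simplified numerical illustration of the quantity being estimated: mass discharge as the sum of cell-wise Darcy flux times concentration over a control plane, with Monte Carlo sampling of uncertain inputs. This sketch assumes independent lognormal values per cell and omits the spatial co-simulation and Bayesian conditioning that the method above provides; all numbers are invented.

```python
# Simplified Monte Carlo sketch of mass discharge across a control plane:
# MD = sum_i (K_i * gradient * c_i * A_i). Spatial correlation and Bayesian
# conditioning are deliberately ignored; all values are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_real, n_cells = 5000, 50
cell_area = 2.0                                                   # m^2 per control-plane cell

md = np.empty(n_real)
for k in range(n_real):
    K = rng.lognormal(mean=np.log(1e-4), sigma=0.8, size=n_cells)    # hydraulic conductivity (m/s)
    grad = rng.normal(0.005, 0.001)                                   # hydraulic gradient (-)
    conc = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=n_cells)  # concentration (g/m^3)
    md[k] = np.sum(K * grad * conc * cell_area) * 86400               # g/day

p50, p5, p95 = np.percentile(md, [50, 5, 95])
print(f"mass discharge: median {p50:.2f} g/day, 90% interval [{p5:.2f}, {p95:.2f}] g/day")
```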
This presentation, Particle-Resolved Simulations for Quantifying Black Carbon Climate Impact and Model Uncertainty, was given at the STAR Black Carbon 2016 Webinar Series: Changing Chemistry over Time held on Oct. 31, 2016.
Effect of monthly areal rainfall uncertainty on streamflow simulation
NASA Astrophysics Data System (ADS)
Ndiritu, J. G.; Mkhize, N.
2017-08-01
Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model calibrated using the SCE-UA optimizer was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged 20% of the mean monthly areal rainfall, and rainfall uncertainty was therefore highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation, and the respective average ranges using stochastic monthly rainfalls were 86 and 90% of the mean naturalised streamflow. In calibration, 33% of the naturalised flows fell within the streamflow ranges from historic-rainfall simulations, and using stochastic rainfalls increased this to 66%. In validation, the respective percentages of naturalised flows falling within the simulated streamflow ranges were 32 and 72%. The analysis reveals that monthly areal rainfall uncertainty is significant and that incorporating it into streamflow simulation would add validity to the results.
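The coefficient of efficiency used as the skill measure above is commonly the Nash-Sutcliffe efficiency; a minimal computation sketch follows, with invented monthly flows rather than the study catchment's data.

```python
# Nash-Sutcliffe coefficient of efficiency, the calibration/validation skill score
# quoted above; the flow values are invented, not from the study catchment.
import numpy as np

def nse(simulated: np.ndarray, observed: np.ndarray) -> float:
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

obs = np.array([12.0, 30.0, 55.0, 40.0, 18.0, 9.0])   # monthly flows (e.g., Mm^3)
sim = np.array([10.0, 33.0, 50.0, 44.0, 20.0, 8.0])
print(f"NSE = {nse(sim, obs):.2f}")
```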
Uncertainty in BMP evaluation and optimization for watershed management
NASA Astrophysics Data System (ADS)
Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.
2012-12-01
Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient, and pesticide losses; land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in the net cost and the water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.
Impact disruption of gravity-dominated bodies: New simulation data and scaling
NASA Astrophysics Data System (ADS)
Movshovitz, N.; Nimmo, F.; Korycansky, D. G.; Asphaug, E.; Owen, J. M.
2016-09-01
We present results from a suite of 169 hydrocode simulations of collisions between planetary bodies with radii from 100 to 1000 km. The simulation data are used to derive a simple scaling law for the threshold for catastrophic disruption, defined as a collision that leads to half the total colliding mass escaping the system post impact. For a target of radius 100 km ≤ R_T ≤ 1000 km and mass M_T, and a projectile of radius r_p ≤ R_T and mass m_p, we find that a head-on impact with velocity magnitude v is catastrophic if the kinetic energy of the system in the center-of-mass frame, K = M_T m_p v² / [2(M_T + m_p)], exceeds a threshold value K* that is a few times U = (3/5) G M_T² / R_T + (3/5) G m_p² / r_p + G M_T m_p / (R_T + r_p), the gravitational binding energy of the system at the moment of impact; G is the gravitational constant. In all head-on collision runs we find K* = (5.5 ± 2.9) U. Oblique impacts are catastrophic when the fraction of kinetic energy contained in the volume of the projectile intersecting the target during impact exceeds ∼2 K* for 30° impacts and ∼3.5 K* for 45° impacts. We compare predictions made with this scaling to those made with existing scaling laws in the literature extrapolated from numerical studies on smaller targets. We find significant divergence between predictions, where in general our results suggest a lower threshold for disruption except for highly oblique impacts with r_p ≪ R_T. This has implications for the efficiency of collisional grinding in the asteroid belt (Morbidelli et al., [2009] Icarus, 204, 558-573), Kuiper belt (Greenstreet et al., [2015] Icarus, 258, 267-288), and early Solar System accretion (Chambers [2013], Icarus, 224, 43-56).
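The head-on criterion can be evaluated directly from the expressions above. The sketch below assumes uniform-density spheres (a simplification not stated in the abstract) and uses invented radii, bulk density, and impact speed purely for illustration.

```python
# Head-on catastrophic-disruption check, K > K* ≈ 5.5 U, using the energies defined above.
# Bodies are assumed to be uniform spheres of the same (invented) density for illustration.
import math

G = 6.674e-11            # gravitational constant (m^3 kg^-1 s^-2)
rho = 2000.0             # assumed bulk density (kg/m^3)

def sphere_mass(radius_m: float) -> float:
    return rho * 4.0 / 3.0 * math.pi * radius_m ** 3

R_T, r_p = 500e3, 200e3                  # target and projectile radii (m)
M_T, m_p = sphere_mass(R_T), sphere_mass(r_p)
v = 3000.0                               # impact speed (m/s)

K = 0.5 * M_T * m_p * v ** 2 / (M_T + m_p)                     # centre-of-mass kinetic energy
U = (0.6 * G * M_T ** 2 / R_T + 0.6 * G * m_p ** 2 / r_p       # binding energy at contact
     + G * M_T * m_p / (R_T + r_p))
print(f"K/U = {K / U:.1f}  ->  catastrophic (head-on)? {K > 5.5 * U}")
```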
NASA Astrophysics Data System (ADS)
White, Jeremy; Stengel, Victoria; Rendon, Samuel; Banta, John
2017-08-01
Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash-Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.
White, Jeremy; Stengel, Victoria G.; Rendon, Samuel H.; Banta, John
2017-01-01
Computer models of hydrologic systems are frequently used to investigate the hydrologic response of land-cover change. If the modeling results are used to inform resource-management decisions, then providing robust estimates of uncertainty in the simulated response is an important consideration. Here we examine the importance of parameterization, a necessarily subjective process, on uncertainty estimates of the simulated hydrologic response of land-cover change. Specifically, we applied the soil water assessment tool (SWAT) model to a 1.4 km2 watershed in southern Texas to investigate the simulated hydrologic response of brush management (the mechanical removal of woody plants), a discrete land-cover change. The watershed was instrumented before and after brush-management activities were undertaken, and estimates of precipitation, streamflow, and evapotranspiration (ET) are available; these data were used to condition and verify the model. The role of parameterization in brush-management simulation was evaluated by constructing two models, one with 12 adjustable parameters (reduced parameterization) and one with 1305 adjustable parameters (full parameterization). Both models were subjected to global sensitivity analysis as well as Monte Carlo and generalized likelihood uncertainty estimation (GLUE) conditioning to identify important model inputs and to estimate uncertainty in several quantities of interest related to brush management. Many realizations from both parameterizations were identified as behavioral in that they reproduce daily mean streamflow acceptably well according to Nash–Sutcliffe model efficiency coefficient, percent bias, and coefficient of determination. However, the total volumetric ET difference resulting from simulated brush management remains highly uncertain after conditioning to daily mean streamflow, indicating that streamflow data alone are not sufficient to inform the model inputs that influence the simulated outcomes of brush management the most. Additionally, the reduced-parameterization model grossly underestimates uncertainty in the total volumetric ET difference compared to the full-parameterization model; total volumetric ET difference is a primary metric for evaluating the outcomes of brush management. The failure of the reduced-parameterization model to provide robust uncertainty estimates demonstrates the importance of parameterization when attempting to quantify uncertainty in land-cover change simulations.
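The GLUE conditioning step used in both parameterizations can be illustrated generically: sample uncertain inputs, score each realization against observed streamflow, retain those exceeding a behavioral threshold, and form likelihood-weighted bounds on a prediction of interest. This is a minimal sketch with a stand-in model and invented thresholds, not the SWAT setup of the study.

```python
# GLUE-style conditioning sketch: keep "behavioral" Monte Carlo realizations whose
# simulated streamflow meets a skill threshold, then summarize a prediction of interest.
# The toy model, inputs, and thresholds are illustrative, not the SWAT brush-management model.
import numpy as np

rng = np.random.default_rng(3)
obs = rng.gamma(2.0, 1.0, 365)                      # stand-in daily streamflow record

def nse(sim, obs):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

n_real = 2000
params = rng.uniform([0.5, 0.0], [1.5, 0.5], size=(n_real, 2))   # two uncertain inputs
behavioral_pred, weights = [], []
for scale, bias in params:
    sim = scale * obs + bias + rng.normal(0, 0.3, obs.size)      # stand-in model output
    score = nse(sim, obs)
    if score > 0.5:                                              # behavioral threshold
        behavioral_pred.append(100.0 * scale - 20.0 * bias)      # stand-in prediction (e.g., ET difference)
        weights.append(score)                                    # likelihood weight

behavioral_pred, weights = np.array(behavioral_pred), np.array(weights)
order = np.argsort(behavioral_pred)
cdf = np.cumsum(weights[order]) / weights.sum()
lo, hi = np.interp([0.05, 0.95], cdf, behavioral_pred[order])
print(f"{behavioral_pred.size} behavioral runs; 5-95% bounds: {lo:.1f} to {hi:.1f}")
```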
Characterizing bias correction uncertainty in wheat yield predictions
NASA Astrophysics Data System (ADS)
Ortiz, Andrea Monica; Jones, Julie; Freckleton, Robert; Scaife, Adam
2017-04-01
Farming systems are under increased pressure due to current and future climate change, variability and extremes. Research on the impacts of climate change on crop production typically relies on the output of complex Global and Regional Climate Models, which are used as input to crop impact models. Yield predictions from these top-down approaches can have high uncertainty for several reasons, including diverse model construction and parameterization, future emissions scenarios, and inherent or response uncertainty. These uncertainties propagate down each step of the 'cascade of uncertainty' that flows from climate input to impact predictions, leading to yield predictions that may be too complex for their intended use in practical adaptation options. In addition to uncertainty from impact models, uncertainty can also stem from the intermediate steps that are used in impact studies to adjust climate model simulations to become more realistic when compared to observations, or to correct the spatial or temporal resolution of climate simulations, which are often not directly applicable as input into impact models. These important steps of bias correction or calibration also add uncertainty to final yield predictions, given the various approaches that exist to correct climate model simulations. In order to address how much uncertainty the choice of bias correction method can add to yield predictions, we use several evaluation runs from Regional Climate Models from the Coordinated Regional Downscaling Experiment over Europe (EURO-CORDEX) at different resolutions together with different bias correction methods (linear and variance scaling, power transformation, quantile-quantile mapping) as input to a statistical crop model for wheat, a staple European food crop. The objective of our work is to compare the resulting simulation-driven hindcast wheat yields to climate observation-driven wheat yield hindcasts from the UK and Germany in order to determine ranges of yield uncertainty that result from different climate model simulation input and bias correction methods. We simulate wheat yields using a General Linear Model that includes the effects of seasonal maximum temperatures and precipitation, since wheat is sensitive to heat stress during important developmental stages. We use the same statistical model to predict future wheat yields using the recently available bias-corrected simulations of EURO-CORDEX-Adjust. While statistical models are often criticized for their lack of complexity, an advantage is that we are here able to consider only the effect of the choice of climate model, resolution or bias correction method on yield. Initial results using both past and future bias-corrected climate simulations with a process-based model will also be presented. Through these methods, we make recommendations on preparing climate model output for crop models.
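One of the bias-correction methods named above, empirical quantile-quantile mapping, can be sketched in a few lines: fit the mapping between modelled and observed quantiles over a calibration period, then apply it to new model output. The data below are synthetic and the quantile resolution is an arbitrary choice.

```python
# Empirical quantile-quantile mapping, one of the bias-correction methods named above.
# The mapping is fit on a calibration period and applied to new model output; data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
obs_cal = rng.gamma(2.0, 15.0, 3000)          # "observed" seasonal precipitation (mm)
mod_cal = rng.gamma(2.0, 12.0, 3000) + 10.0   # biased model output, calibration period
mod_new = rng.gamma(2.0, 12.0, 500) + 10.0    # model output to be corrected

quantiles = np.linspace(0.01, 0.99, 99)
mod_q = np.quantile(mod_cal, quantiles)
obs_q = np.quantile(obs_cal, quantiles)

# Map each new model value through the calibration-period quantile relationship.
corrected = np.interp(mod_new, mod_q, obs_q)
print("mean before/after correction:", mod_new.mean().round(1), corrected.mean().round(1),
      "| observed mean:", obs_cal.mean().round(1))
```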
NASA Astrophysics Data System (ADS)
Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai
2017-10-01
Uncertainties in structure properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed from experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs on a basis of orthogonal stochastic polynomials to account for influences of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single degree of freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied for RTHS without delay to determine the order of the PCE, the number of sample points, and the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially with respect to actuator delay, respectively. Sensitivity analysis through Sobol indices also indicates that the influence of a single random variable decreases while the coupling effect increases as actuator delay increases.
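Sobol indices of the kind reported above can also be estimated by plain Monte Carlo. The sketch below uses a Saltelli-style pick-freeze estimator for first-order indices on a stand-in response function; the inputs, their ranges, and the response are illustrative and do not represent the SDOF/RTHS model of the paper.

```python
# Monte Carlo estimate of first-order Sobol indices (Saltelli-style pick-freeze).
# The response function is a stand-in for a maximum-displacement output, not the paper's model.
import numpy as np

def response(x):
    # Illustrative response with "stiffness", "damping", and "actuator delay" inputs.
    k, c, delay = x[:, 0], x[:, 1], x[:, 2]
    return 1.0 / k + 0.3 * c + 2.0 * delay + 1.5 * delay * c

rng = np.random.default_rng(5)
n, d = 20000, 3
A = rng.uniform(0.5, 1.5, size=(n, d))
B = rng.uniform(0.5, 1.5, size=(n, d))
fA, fB = response(A), response(B)
var_y = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(["stiffness", "damping", "delay"]):
    AB = A.copy()
    AB[:, i] = B[:, i]                               # "freeze" all inputs except the i-th
    s_first = np.mean(fB * (response(AB) - fA)) / var_y
    print(f"S1({name}) = {s_first:.2f}")
```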
Effects of Boron and Graphite Uncertainty in Fuel for TREAT Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughn, Kyle; Mausolff, Zander; Gonzalez, Esteban
Advanced modeling techniques and current computational capacity make full core TREAT simulations possible; the goal of such simulations is to understand the pre-test core and minimize the number of required calibrations. However, in order to simulate TREAT with a high degree of precision, the reactor materials and geometry must also be modeled with a high degree of precision. This paper examines how uncertainty in the reported values of boron and graphite affects simulations of TREAT.
A geostatistical extreme-value framework for fast simulation of natural hazard events
Stephenson, David B.
2016-01-01
We develop a statistical framework for simulating natural hazard events that combines extreme value theory and geostatistics. Robust generalized additive model forms represent generalized Pareto marginal distribution parameters while a Student’s t-process captures spatial dependence and gives a continuous-space framework for natural hazard event simulations. Efficiency of the simulation method allows many years of data (typically over 10 000) to be obtained at relatively little computational cost. This makes the model viable for forming the hazard module of a catastrophe model. We illustrate the framework by simulating maximum wind gusts for European windstorms, which are found to have realistic marginal and spatial properties, and validate well against wind gust measurements. PMID:27279768
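One building block of such a framework, sampling threshold exceedances from a generalized Pareto marginal distribution, can be sketched as follows. Spatial dependence (the Student's t-process) is not modelled here, and the threshold, GPD parameters, event rate, and the crude equal split of events across years are invented for illustration.

```python
# Sketch of the marginal piece of a geostatistical extreme-value hazard module:
# sample wind-gust exceedances of a threshold from a generalized Pareto distribution.
# Spatial dependence is ignored; all parameter values are illustrative.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(8)
threshold = 25.0           # m/s, gust threshold
shape, scale = 0.1, 4.0    # GPD shape (xi) and scale (sigma)
events_per_year, n_years = 8, 10_000

n_events = rng.poisson(events_per_year * n_years)
gusts = threshold + genpareto.rvs(shape, scale=scale, size=n_events, random_state=rng)

# Crude equal split of events across simulated years, purely for illustration.
annual_max = np.array([year.max() for year in np.array_split(gusts, n_years)])
print("median / 1-in-200-year annual maximum gust (m/s):",
      np.percentile(annual_max, 50).round(1), np.percentile(annual_max, 99.5).round(1))
```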
NASA Astrophysics Data System (ADS)
Virtanen, I. O. I.; Virtanen, I. I.; Pevtsov, A. A.; Yeates, A.; Mursula, K.
2017-07-01
Aims: We aim to use the surface flux transport model to simulate the long-term evolution of the photospheric magnetic field from historical observations. In this work we study the accuracy of the model and its sensitivity to uncertainties in its main parameters and the input data. Methods: We tested the model by running simulations with different values of meridional circulation and supergranular diffusion parameters, and studied how the flux distribution inside active regions and the initial magnetic field affected the simulation. We compared the results to assess how sensitive the simulation is to uncertainties in meridional circulation speed, supergranular diffusion, and input data. We also compared the simulated magnetic field with observations. Results: We find that there is generally good agreement between simulations and observations. Although the model is not capable of replicating fine details of the magnetic field, the long-term evolution of the polar field is very similar in simulations and observations. Simulations typically yield a smoother evolution of polar fields than observations, which often include artificial variations due to observational limitations. We also find that the simulated field is fairly insensitive to uncertainties in model parameters or the input data. Due to the decay term included in the model the effects of the uncertainties are somewhat minor or temporary, lasting typically one solar cycle.
NASA Astrophysics Data System (ADS)
Garrigues, S.; Olioso, A.; Calvet, J.-C.; Lafont, S.; Martin, E.; Chanzy, A.; Marloie, O.; Bertrand, N.; Desfonds, V.; Renard, D.
2012-04-01
Vegetation productivity and water balance of Mediterranean regions will be particularly affected by climate and land-use changes. In order to analyze and predict these changes through land surface models, a critical step is to quantify the uncertainties associated with these models (processes, parameters) and their implementation over a long period of time. Besides, uncertainties attached to the data used to force these models (atmospheric forcing, vegetation and soil characteristics, crop management practices...), which are generally available at coarse spatial resolution (>1-10 km) and for a limited number of plant functional types, need to be evaluated. This paper aims at assessing the uncertainties in water (evapotranspiration) and energy fluxes estimated from a Soil Vegetation Atmosphere Transfer (SVAT) model over a Mediterranean agricultural site. While similar past studies focused on particular crop types and limited periods of time, the originality of this paper consists in implementing the SVAT model and assessing its uncertainties over a long period of time (10 years), encompassing several cycles of distinct crops (wheat, sorghum, sunflower, peas). The impacts on the SVAT simulations of the following sources of uncertainties are characterized: - Uncertainties in atmospheric forcing are assessed comparing simulations forced with local meteorological measurements and simulations forced with a re-analysis atmospheric dataset (SAFRAN database). - Uncertainties in key surface characteristics (soil, vegetation, crop management practices) are tested comparing simulations fed with standard values from a global database (e.g. ECOCLIMAP) and simulations based on in situ or site-calibrated values. - Uncertainties due to the implementation of the SVAT model over a long period of time are analyzed with regard to crop rotation. The SVAT model being analyzed in this paper is ISBA in its a-gs version, which simulates photosynthesis and its coupling with stomatal conductance, as well as the time course of the plant biomass and the Leaf Area Index (LAI). The experiment was conducted at the INRA-Avignon (France) crop site (ICOS associated site), for which 10 years of energy and water eddy fluxes, soil moisture profiles, vegetation measurements, and agricultural practices are available for distinct crop types. The uncertainties in evapotranspiration and energy flux estimates are quantified from both 10-year trend analysis and selected daily cycles spanning a range of atmospheric conditions and phenological stages. While the net radiation flux is correctly simulated, the cumulated latent heat flux is under-estimated. Daily plots indicate i) an overestimation of evapotranspiration over bare soil, probably due to an overestimation of the soil water reservoir available for evaporation, and ii) an under-estimation of transpiration for developed canopies. Uncertainties attached to the re-analysis atmospheric data show little influence on the cumulated values of evapotranspiration. Better performance is reached using in situ soil depths and site-calibrated photosynthesis parameters compared to the simulations based on the ECOCLIMAP standard values. Finally, this paper highlights the impact of the temporal succession of vegetation cover and bare soil on the simulation of soil moisture and evapotranspiration over a long period of time. Thus, solutions to account for crop rotation in the implementation of SVAT models are discussed.
Understanding Climate Uncertainty with an Ocean Focus
NASA Astrophysics Data System (ADS)
Tokmakian, R. T.
2009-12-01
Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions are the state of the weather or climate at the beginning of the simulation, and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described, and early results using the ocean/ice components of the CCSM climate model in a designed experiment framework will be shown. Cox, P. and D. Stephenson, Climate Change: A Changing Climate for Prediction, 2007, Science 317 (5835), 207, DOI: 10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations, Climatic Change, 81, 247-264. Smith L., 2002, What might we learn from climate forecasts? Proc. Nat’l Academy of Sciences, Vol. 99, suppl. 1, 2487-2492, doi:10.1073/pnas.012580599.
Calculating Measurement Uncertainty of the “Conventional Value of the Result of Weighing in Air”
Flicker, Celia J.; Tran, Hy D.
2016-04-02
The conventional value of the result of weighing in air is frequently used in commercial calibrations of balances. The guidance in OIML D-028 for reporting uncertainty of the conventional value is too terse. When calibrating mass standards at low measurement uncertainties, it is necessary to perform a buoyancy correction before reporting the result. When calculating the conventional result after calibrating true mass, the uncertainty due to calculating the conventional result is correlated with the buoyancy correction. We show through Monte Carlo simulations that the measurement uncertainty of the conventional result is less than the measurement uncertainty when reporting true mass. The Monte Carlo simulation tool is available in the online version of this article.
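A minimal Monte Carlo sketch of the kind of calculation described above, assuming the OIML conventional conditions (reference density 8000 kg/m³, conventional air density 1.2 kg/m³) and an illustrative 1 kg comparison calibration; all input values and uncertainties are invented. Because the actual air density is close to the conventional 1.2 kg/m³, the poorly known weight density largely cancels when the result is reported as conventional mass, which is the correlation effect the abstract describes.

```python
# Monte Carlo sketch: calibrate a 1 kg weight by comparison, then report either true mass
# or the conventional mass (OIML convention: reference density 8000 kg/m^3, conventional
# air density 1.2 kg/m^3). All input values and uncertainties are illustrative.
import numpy as np

rng = np.random.default_rng(6)
n = 200_000
NOMINAL = 1.0                                   # kg
RHO_REF, RHO_CONV_AIR = 8000.0, 1.2             # OIML conventional conditions

m_ref = rng.normal(1.0 + 0.10e-6, 0.05e-6, n)   # kg, reference standard (true mass)
rho_ref = rng.normal(8000.0, 5.0, n)            # kg/m^3, reference density
rho_test = rng.normal(7950.0, 60.0, n)          # kg/m^3, test-weight density (poorly known)
rho_air = rng.normal(1.19, 0.01, n)             # kg/m^3, air density during weighing
delta = rng.normal(0.25e-6, 0.03e-6, n)         # kg, observed balance difference

# Buoyancy-corrected true mass of the test weight (first-order, nominal volumes).
m_true = m_ref + delta + rho_air * NOMINAL * (1.0 / rho_test - 1.0 / rho_ref)

# Conventional mass of the test weight.
m_conv = m_true * (1.0 - RHO_CONV_AIR / rho_test) / (1.0 - RHO_CONV_AIR / RHO_REF)

print("u(true mass)         = %.3f mg" % (m_true.std() * 1e6))
print("u(conventional mass) = %.3f mg" % (m_conv.std() * 1e6))
```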
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, G.; Lackner, M.; Haid, L.
2013-07-01
With the push towards siting wind turbines farther offshore due to higher wind quality and less visibility, floating offshore wind turbines, which can be located in deep water, are becoming an economically attractive option. The International Electrotechnical Commission's (IEC) 61400-3 design standard covers fixed-bottom offshore wind turbines, but there are a number of new research questions that need to be answered to modify these standards so that they are applicable to floating wind turbines. One issue is the appropriate simulation length needed for floating turbines. This paper will discuss the results from a study assessing the impact of simulation length on the ultimate and fatigue loads of the structure, and will address uncertainties associated with changing the simulation length for the analyzed floating platform. Recommendations of required simulation length based on load uncertainty will be made and compared to current simulation length requirements.
Engineering applications of strong ground motion simulation
NASA Astrophysics Data System (ADS)
Somerville, Paul
1993-02-01
The formulation, validation and application of a procedure for simulating strong ground motions for use in engineering practice are described. The procedure uses empirical source functions (derived from near-source strong motion recordings of small earthquakes) to provide a realistic representation of effects such as source radiation that are difficult to model at high frequencies due to their partly stochastic behavior. Wave propagation effects are modeled using simplified Green's functions that are designed to transfer empirical source functions from their recording sites to those required for use in simulations at a specific site. The procedure has been validated against strong motion recordings of both crustal and subduction earthquakes. For the validation process we choose earthquakes whose source models (including a spatially heterogeneous distribution of the slip of the fault) are independently known and which have abundant strong motion recordings. A quantitative measurement of the fit between the simulated and recorded motion in this validation process is used to estimate the modeling and random uncertainty associated with the simulation procedure. This modeling and random uncertainty is one part of the overall uncertainty in estimates of ground motions of future earthquakes at a specific site derived using the simulation procedure. The other contribution to uncertainty is that due to uncertainty in the source parameters of future earthquakes that affect the site, which is estimated from a suite of simulations generated by varying the source parameters over their ranges of uncertainty. In this paper, we describe the validation of the simulation procedure for crustal earthquakes against strong motion recordings of the 1989 Loma Prieta, California, earthquake, and for subduction earthquakes against the 1985 Michoacán, Mexico, and Valparaiso, Chile, earthquakes. We then show examples of the application of the simulation procedure to the estimation of the design response spectra for crustal earthquakes at a power plant site in California and for subduction earthquakes in the Seattle-Portland region. We also demonstrate the use of simulation methods for modeling the attenuation of strong ground motion, and show evidence of the effect of critical reflections from the lower crust in causing the observed flattening of the attenuation of strong ground motion from the 1988 Saguenay, Quebec, and 1989 Loma Prieta earthquakes.
Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models
Debasish Saha; Armen R. Kemanian; Benjamin M. Rau; Paul R. Adler; Felipe Montes
2017-01-01
Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu
We use functional (Fréchet) derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions, as opposed to their parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard–Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.
Quantifying Uncertainty in Model Predictions for the Pliocene (Plio-QUMP): Initial results
Pope, J.O.; Collins, M.; Haywood, A.M.; Dowsett, H.J.; Hunter, S.J.; Lunt, D.J.; Pickering, S.J.; Pound, M.J.
2011-01-01
Examination of the mid-Pliocene Warm Period (mPWP; ~3.3 to 3.0 Ma BP) provides an excellent opportunity to test the ability of climate models to reproduce warm climate states, thereby assessing our confidence in model predictions. To do this it is necessary to relate the uncertainty in model simulations of mPWP climate to uncertainties in projections of future climate change. The uncertainties introduced by the model can be estimated through the use of a Perturbed Physics Ensemble (PPE). Developing on the UK Met Office Quantifying Uncertainty in Model Predictions (QUMP) Project, this paper presents the results from an initial investigation using the end members of a PPE in a fully coupled atmosphere-ocean model (HadCM3) running with appropriate mPWP boundary conditions. Prior work has shown that the unperturbed version of HadCM3 may underestimate mPWP sea surface temperatures at higher latitudes. Initial results indicate that neither the low sensitivity nor the high sensitivity simulations produce unequivocally improved mPWP climatology relative to the standard. Whilst the high sensitivity simulation was able to reconcile up to 6 °C of the data/model mismatch in sea surface temperatures in the high latitudes of the Northern Hemisphere (relative to the standard simulation), it did not produce a better prediction of global vegetation than the standard simulation. Overall the low sensitivity simulation was degraded compared to the standard and high sensitivity simulations in all aspects of the data/model comparison. The results have shown that a PPE has the potential to explore weaknesses in mPWP modelling simulations which have been identified by geological proxies, but that a 'best fit' simulation will more likely come from a full ensemble in which simulations that contain the strengths of the two end member simulations shown here are combined. © 2011 Elsevier B.V.
From cutting-edge pointwise cross-section to groupwise reaction rate: A primer
NASA Astrophysics Data System (ADS)
Sublet, Jean-Christophe; Fleming, Michael; Gilbert, Mark R.
2017-09-01
The nuclear research and development community has a history of using both integral and differential experiments to support accurate lattice-reactor, nuclear reactor criticality and shielding simulations, as well as verification and validation efforts of cross sections and emitted particle spectra. An important aspect to this type of analysis is the proper consideration of the contribution of the neutron spectrum in its entirety, with correct propagation of uncertainties and standard deviations derived from Monte Carlo simulations, to the local and total uncertainty in the simulated reaction rates (RRs), which usually only apply to one application at a time. This paper identifies deficiencies in the traditional treatment, and discusses correct handling of the RR uncertainty quantification and propagation, including details of the cross section components in the RR uncertainty estimates, which are verified for relevant applications. The methodology that rigorously captures the spectral shift and cross section contributions to the uncertainty in the RR is discussed, with quantified examples that demonstrate the importance of the proper treatment of the spectrum profile and cross section contributions to the uncertainty in the RR and subsequent response functions. The recently developed inventory code FISPACT-II, when connected to the processed nuclear data libraries TENDL-2015, ENDF/B-VII.1, JENDL-4.0u or JEFF-3.2, forms an enhanced multi-physics platform providing a wide variety of advanced simulation methods for modelling activation, transmutation, burnup protocols and simulating radiation damage source terms. The system has extended cutting-edge nuclear data forms, uncertainty quantification and propagation methods, which have been the subject of recent integral and differential, fission, fusion and accelerator validation efforts. The simulation system is used to accurately and predictively probe, understand and underpin a modern and sustainable understanding of the nuclear physics that is so important for many areas of science and technology; advanced fission and fuel systems, magnetic and inertial confinement fusion, high energy, accelerator physics, medical application, isotope production, earth exploration, astrophysics and homeland security.
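A groupwise reaction rate is the flux-weighted sum RR = Σ_g φ_g σ_g, and to first order its uncertainty combines the flux and cross-section covariances through the corresponding sensitivities. The sketch below uses three invented groups and assumed relative covariances, not data from TENDL or any other library.

```python
# Collapse of groupwise flux and cross section into a reaction rate, with first-order
# propagation of both flux and cross-section uncertainty. The three-group values and
# relative covariances are invented, not library data.
import numpy as np

flux = np.array([3.0e14, 8.0e13, 1.5e13])        # group fluxes (n cm^-2 s^-1)
xs = np.array([1.2e-24, 5.0e-24, 4.0e-23])       # group cross sections (cm^2)

# Assumed relative covariance structure, with some inter-group correlation.
corr = np.array([[1.0, 0.3, 0.1],
                 [0.3, 1.0, 0.3],
                 [0.1, 0.3, 1.0]])
rel_u_flux = np.array([0.05, 0.08, 0.15])
rel_u_xs = np.array([0.02, 0.05, 0.10])
cov_flux = np.outer(rel_u_flux * flux, rel_u_flux * flux) * corr
cov_xs = np.outer(rel_u_xs * xs, rel_u_xs * xs) * corr

rr = float(flux @ xs)                            # reactions per atom per second
var_rr = xs @ cov_flux @ xs + flux @ cov_xs @ flux   # sensitivities: d(RR)/d(flux_g)=xs_g, etc.
print(f"RR = {rr:.3e} s^-1 per atom, relative uncertainty = {np.sqrt(var_rr) / rr:.1%}")
```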
Bridging Scientific Model Outputs with Emergency Response Needs in Catastrophic Earthquake Responses
ERIC Educational Resources Information Center
Johannes, Tay W.
2010-01-01
In emergency management, scientific models are widely used for running hazard simulations and estimating losses, often in support of planning and mitigation efforts. This work expands the utility of the scientific model into the response phase of emergency management. The focus is on the common operating picture as it gives context to emergency…
The influence of experimental wind disturbance on forest fuels and fire characteristics
Jeffery B. Cannon; Joseph J. O' Brien; Louise Loudermilk; Matthew Dickinson; Chris J. Peterson
2014-01-01
Current theory in disturbance ecology predicts that extreme disturbances in rapid succession can lead to dramatic changes in species composition or ecosystem processes due to interactions among disturbances. However, the extent to which less catastrophic, yet chronic, disturbances such as wind damage and fire interact is not well studied. In this study, we simulated...
Constant-Elasticity-of-Substitution Simulation
NASA Technical Reports Server (NTRS)
Reiter, G.
1986-01-01
Program simulates constant elasticity-of-substitution (CES) production function. CES function used by economic analysts to examine production costs as well as uncertainties in production. User provides such input parameters as price of labor, price of capital, and dispersion levels. CES minimizes expected cost to produce capital-uncertainty pair. By varying capital-value input, one obtains series of capital-uncertainty pairs. Capital-uncertainty pairs then used to generate several cost curves. CES program menu driven and features specific print menu for examining selected output curves. Program written in BASIC for interactive execution and implemented on IBM PC-series computer.
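The CES production function at the heart of the program has the standard two-input form Q = A[δK^(−ρ) + (1−δ)L^(−ρ)]^(−1/ρ), with elasticity of substitution 1/(1+ρ). The sketch below (in Python rather than the program's BASIC) evaluates the function and a simple input cost for illustrative parameter values; it does not reproduce the program's cost-minimization under uncertainty or its menu-driven interface.

```python
# Constant-elasticity-of-substitution (CES) production function sketch, standard two-input
# form; parameter values are illustrative and the original program's logic is not reproduced.
def ces_output(capital: float, labor: float, A: float = 1.0,
               delta: float = 0.4, rho: float = 0.5) -> float:
    """CES output Q = A * (d*K**-r + (1-d)*L**-r)**(-1/r); substitution elasticity = 1/(1+r)."""
    return A * (delta * capital ** (-rho) + (1.0 - delta) * labor ** (-rho)) ** (-1.0 / rho)

def total_cost(capital: float, labor: float,
               price_capital: float, price_labor: float) -> float:
    # Simple linear input cost for a given capital-labor pair.
    return price_capital * capital + price_labor * labor

K, L = 50.0, 120.0
print("output:", round(ces_output(K, L), 2))
print("cost  :", total_cost(K, L, price_capital=1.2, price_labor=0.8))
print("elasticity of substitution:", round(1.0 / (1.0 + 0.5), 2))
```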
Overcoming challenges of catastrophe modelling in data poor regions
NASA Astrophysics Data System (ADS)
Grassby, L.; Millinship, I.; Breinl, K.
2012-04-01
There is an increasing demand for loss accumulation tools in expanding international insurance markets such as India, China and Thailand. This reflects the combination of an increase in exposures in these territories as industry intensifies and urban development expands, as well as several notable natural catastrophes affecting these areas over the past few years (e.g. extreme floods in Mumbai in 2006 and in Thailand in 2011). Large, global insurers and reinsurers are embracing the opportunity to underwrite these exposures but only where adequate tools are available to provide understanding of the hazards, exposures and potential losses. Unlike more developed countries, data availability in these regions is typically limited and of poor resolution, but model development is still required in order to analyse the risk. Some of the modelling challenges associated with data limitations include: (1) dealing with a lack of hydrological data, which results in greater uncertainty in the flow rate and event frequency; (2) lower DTM resolution than that available across much of Europe, which underlies the hazard component of the catastrophe model; (3) limited accessibility to data that characterises the built environment, including information on different building types and their susceptibility to damage; and (4) a lack of claims data from previous events or engineering research into the vulnerability of different building types, which is needed to generate country- and structure-specific vulnerability curves that explain the relationship between hazard intensity and damages. By presenting an industry-specific flood model for data-poor India in collaboration with Allianz Re, we illustrate how we have overcome many of these challenges to allow loss accumulations to be made. The resulting model was successfully validated against the floods in Mumbai and Surat in 2006 and is being developed further with the availability of new data.
NASA Astrophysics Data System (ADS)
Matt, Felix; Burkhart, John F.
2017-04-01
Light absorbing impurities in snow and ice (LAISI) originating from atmospheric deposition enhance snow melt by increasing the absorption of short wave radiation. The consequences are a shortening of the snow cover duration due to increased snow melt and, with respect to hydrologic processes, a temporal shift in the discharge generation. However, the magnitude of these effects as simulated in numerical models has large uncertainties, originating mainly from uncertainties in the wet and dry deposition of light absorbing aerosols, limitations in the model representation of the snowpack, and the lack of observable variables required to estimate model parameters and evaluate the simulated variables connected with the representation of LAISI. This leads to high uncertainties in the additional energy absorbed by the snow due to the presence of LAISI, a key variable in understanding snowpack energy-balance dynamics. In this study, we assess the effect of LAISI on snow melt and discharge generation and the involved uncertainties in a high mountain catchment located in the western Himalayas by using a distributed hydrological catchment model with focus on the representation of the seasonal snow pack. The snow albedo is hereby calculated from a radiative transfer model for snow, taking the increased absorption of short wave radiation by LAISI into account. Meteorological forcing data is generated from an assimilation of observations and high resolution WRF simulations, and LAISI mixing ratios from deposition rates of Black Carbon simulated with the FLEXPART model. To assess the quality of our simulations and the related uncertainties, we compare the simulated additional energy absorbed by the snow due to the presence of LAISI to the MODIS Dust Radiative Forcing in Snow (MODDRFS) algorithm satellite product.
OBIST methodology incorporating modified sensitivity of pulses for active analogue filter components
NASA Astrophysics Data System (ADS)
Khade, R. H.; Chaudhari, D. S.
2018-03-01
In this paper, an oscillation-based built-in self-test (OBIST) method is used to diagnose catastrophic and parametric faults in integrated circuits. Sallen-Key low pass filter and high pass filter circuits with different gains are used to investigate defects. Variations in seven parameters of the operational amplifier (OP-AMP), namely gain, input impedance, output impedance, slew rate, input bias current, input offset current and input offset voltage, as well as catastrophic and parametric defects in components outside the OP-AMP, are introduced into the circuit and the simulation results are analysed. The oscillator output signal is converted to pulses, which are used to generate a signature of the circuit. The signature and pulse count change with the type of fault present in the circuit under test (CUT). The change in oscillation frequency is observed for fault detection. The designer has the flexibility to predefine the tolerance band of the cut-off frequency and the range of pulse counts for which the circuit should be accepted. The fault coverage depends upon the required tolerance band of the CUT. We propose a modification of the sensitivity of the parameter (pulses) to avoid test escape and enhance yield. Results show that the method provides 100% fault coverage for catastrophic faults.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jennings, Elise; Wolf, Rachel; Sako, Masao
2016-11-09
Cosmological parameter estimation techniques that robustly account for systematic measurement uncertainties will be crucial for the next generation of cosmological surveys. We present a new analysis method, superABC, for obtaining cosmological constraints from Type Ia supernova (SN Ia) light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. The ABC method works by using a forward model simulation of the data where systematic uncertainties can be simulated and marginalized over. A key feature of the method presented here is the use of two distinct metrics, the `Tripp' and `Light Curve' metrics, which allow us to compare the simulated data to the observed data set. The Tripp metric takes as input the parameters of models fit to each light curve with the SALT-II method, whereas the Light Curve metric uses the measured fluxes directly without model fitting. We apply the superABC sampler to a simulated data set of $\sim$1000 SNe corresponding to the first season of the Dark Energy Survey Supernova Program. Varying $\Omega_m$, $w_0$, $\alpha$, $\beta$ and a magnitude offset parameter, with no systematics we obtain $\Delta(w_0) = w_0^{\rm true} - w_0^{\rm best\,fit} = -0.036\pm0.109$ (a $\sim$11% 1$\sigma$ uncertainty) using the Tripp metric and $\Delta(w_0) = -0.055\pm0.068$ (a $\sim$7% 1$\sigma$ uncertainty) using the Light Curve metric. Including 1% calibration uncertainties in four passbands, adding 4 more parameters, we obtain $\Delta(w_0) = -0.062\pm0.132$ (a $\sim$14% 1$\sigma$ uncertainty) using the Tripp metric. Overall we find a 17% increase in the uncertainty on $w_0$ with systematics compared to without. We contrast this with an MCMC approach where systematic effects are approximately included. We find that the MCMC method slightly underestimates the impact of calibration uncertainties for this simulated data set.
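For readers unfamiliar with ABC, the core loop of rejection-ABC, namely drawing parameters from the prior, forward-simulating data, and keeping draws whose distance to the observations falls below a tolerance, can be sketched in a few lines. The toy offset model and the simple mean-difference distance below merely stand in for the SN Ia forward simulation and the Tripp/Light Curve metrics.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy "observed" data generated with a true offset of 0.3.
    true_offset = 0.3
    obs = true_offset + rng.normal(0.0, 0.1, size=200)

    def forward_model(offset, n=200):
        """Forward simulation of the data for a proposed parameter value."""
        return offset + rng.normal(0.0, 0.1, size=n)

    def distance(sim, data):
        """Summary-statistic distance (stand-in for the Tripp/Light Curve metrics)."""
        return abs(sim.mean() - data.mean())

    # ABC rejection sampling: draw from the prior, simulate, keep draws close to the data.
    prior_draws = rng.uniform(-1.0, 1.0, size=20000)
    epsilon = 0.01
    posterior = [p for p in prior_draws if distance(forward_model(p), obs) < epsilon]

    print(f"accepted {len(posterior)} draws, "
          f"posterior mean offset = {np.mean(posterior):.3f}")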
Myers, Casey A.; Laz, Peter J.; Shelburne, Kevin B.; Davidson, Bradley S.
2015-01-01
Uncertainty that arises from measurement error and parameter estimation can significantly affect the interpretation of musculoskeletal simulations; however, these effects are rarely addressed. The objective of this study was to develop an open-source probabilistic musculoskeletal modeling framework to assess how measurement error and parameter uncertainty propagate through a gait simulation. A baseline gait simulation was performed for a male subject using OpenSim for three stages: inverse kinematics, inverse dynamics, and muscle force prediction. A series of Monte Carlo simulations were performed that considered intrarater variability in marker placement, movement artifacts in each phase of gait, variability in body segment parameters, and variability in muscle parameters calculated from cadaveric investigations. Propagation of uncertainty was performed by using the output distributions from one stage as input distributions to subsequent stages. Confidence bounds (5–95%) and sensitivity of outputs to model input parameters were calculated throughout the gait cycle. The combined impact of uncertainty resulted in mean bounds that ranged from 2.7° to 6.4° in joint kinematics, 2.7 to 8.1 N m in joint moments, and 35.8 to 130.8 N in muscle forces. The impact of movement artifact was 1.8 times larger than any other propagated source. Sensitivities to specific body segment parameters and muscle parameters were linked to where in the gait cycle they were calculated. We anticipate that through the increased use of probabilistic tools, researchers will better understand the strengths and limitations of their musculoskeletal simulations and more effectively use simulations to evaluate hypotheses and inform clinical decisions. PMID:25404535
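A minimal sketch of the chained Monte Carlo propagation described above, in which the sampled output of one analysis stage feeds the next, is given below. The three "stages" and all distributions are toy stand-ins for the OpenSim inverse kinematics, inverse dynamics, and muscle force prediction steps.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000  # Monte Carlo samples

    # Stage 1 (stand-in for inverse kinematics): joint angle perturbed by
    # marker-placement and movement-artifact uncertainty (degrees).
    angle = 20.0 + rng.normal(0.0, 1.5, n) + rng.normal(0.0, 2.0, n)

    # Stage 2 (stand-in for inverse dynamics): joint moment depends on the sampled
    # angle and on an uncertain body-segment parameter (segment mass, kg).
    seg_mass = rng.normal(3.5, 0.2, n)
    moment = 0.4 * angle * seg_mass          # toy relationship, N*m

    # Stage 3 (stand-in for muscle force prediction): force depends on the sampled
    # moment and an uncertain muscle parameter (moment arm, m).
    moment_arm = rng.normal(0.05, 0.005, n)
    force = moment / moment_arm              # N

    # 5-95% confidence bounds of the propagated outputs.
    for name, x in [("angle (deg)", angle), ("moment (N m)", moment), ("force (N)", force)]:
        lo, hi = np.percentile(x, [5, 95])
        print(f"{name}: 5-95% bounds [{lo:.1f}, {hi:.1f}]")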
Parameter Uncertainty on AGCM-simulated Tropical Cyclones
NASA Astrophysics Data System (ADS)
He, F.
2015-12-01
This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, implemented in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance on TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT and LHS ensemble runs shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible on simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for this nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is comparable to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
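The Latin-Hypercube design used to perturb the selected physics parameters can be illustrated with a short, self-contained sketch: each parameter range is split into equal-probability strata, one draw is taken per stratum, and the strata are permuted independently per parameter. The parameter names and bounds below are hypothetical, not the CAM values.

    import numpy as np

    def latin_hypercube(n_samples, bounds, rng):
        """Latin Hypercube Sample: one draw per equal-probability stratum per parameter."""
        d = len(bounds)
        u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
        for j in range(d):            # independently permute strata in each dimension
            u[:, j] = rng.permutation(u[:, j])
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        return lo + u * (hi - lo)

    rng = np.random.default_rng(42)
    # Hypothetical bounds for two convection parameters (e.g. an entrainment rate and a
    # CAPE relaxation time scale); the values are illustrative only.
    bounds = [(0.5e-3, 2.0e-3), (1800.0, 14400.0)]
    samples = latin_hypercube(8, bounds, rng)
    print(samples)   # 8-member x 2-parameter perturbed-physics ensemble design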
Titanium-Oxygen Reactivity Study
NASA Technical Reports Server (NTRS)
Chafey, J. E.; Scheck, W. G.; Witzell, W. E.
1962-01-01
A program has been conducted at Astronautics to investigate the likelihood of occurrence of the catastrophic oxidation of titanium alloy sheet under conditions which simulate certain cases of accidental failure of the metal while it is in contact with liquid or gaseous oxygen. Three methods of fracturing the metal were used; they consisted of mechanical puncture, tensile fracture of welded joints, and perforation by very high velocity particles. The results of the tests which have been conducted provide further evidence of the reactivity of titanium with liquid and gaseous oxygen. The evidence indicates that the rapid fracturing of titanium sheet while it is in contact with oxygen initiates the catastrophic oxidation reaction. Initiation occurred when the speed of the fracture was some few feet per second, as in both the drop-weight puncture tests and the static tensile fracture tests of welded joints, as well as when the speed was several thousand feet per second, as in the simulated micrometeoroid penetration tests. The slow propagation of a crack, however, did not initiate the reaction. It may logically be concluded that the localized frictional heat of rapid fracture and/or spontaneous oxidation (exothermic) of minute particles emanating from the fracture cause initiation of the reaction. Under conditions of slow fracture, however, the small heat generated may be adequately dissipated and the reaction is not initiated. A portion of the study conducted consisted of investigating various means by which the reaction might be retarded or prevented. Providing a "barrier" at the titanium-oxygen interface consisting of either aluminum metal or a coating of a petroleum base corrosion inhibitor appeared to be only partially effective in retarding the reaction. The accidental puncturing or similar rupturing of thin-walled pressurized oxygen tanks on missiles and space vehicle will usually constitute loss of function, and may sometimes cause their catastrophic destruction by explosive decompression regardless of the type of material used for their construction. In the case of tanks constructed of titanium alloys the added risk is incurred of catastrophic burning of the tanks. In view of this it is recommended that thin-walled tanks constructed of titanium alloys should not be used to contain liquid or gaseous oxygen.
Uncertainty in simulating wheat yields under climate change
USDA-ARS?s Scientific Manuscript database
Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change...
Assessment of input uncertainty by seasonally categorized latent variables using SWAT
USDA-ARS?s Scientific Manuscript database
Watershed processes have been explored with sophisticated simulation models for the past few decades. It has been stated that uncertainty attributed to alternative sources such as model parameters, forcing inputs, and measured data should be incorporated during the simulation process. Among varyin...
Estimating winter wheat phenological parameters: Implications for crop modeling
USDA-ARS?s Scientific Manuscript database
Crop parameters, such as the timing of developmental events, are critical for accurate simulation results in crop simulation models, yet uncertainty often exists in determining the parameters. Factors contributing to the uncertainty include: a) sources of variation within a plant (i.e., within diffe...
NASA Astrophysics Data System (ADS)
Dodov, B.
2017-12-01
Stochastic simulation of realistic and statistically robust patterns of Tropical Cyclone (TC) induced precipitation is a challenging task. It is even more challenging in a catastrophe modeling context, where tens of thousands of typhoon seasons need to be simulated in order to provide a complete view of flood risk. Ultimately, one could run a coupled global climate model and regional Numerical Weather Prediction (NWP) model, but this approach is not feasible in the catastrophe modeling context and, most importantly, may not provide TC track patterns consistent with observations. Rather, we propose to leverage NWP output for the observed TC precipitation patterns (in terms of downscaled reanalysis 1979-2015) collected on a Lagrangian frame along the historical TC tracks and reduced to the leading spatial principal components of the data. The reduced data from all TCs are then grouped according to timing, storm evolution stage (developing, mature, dissipating, ETC transitioning) and central pressure and used to build a dictionary of stationary (within a group) and non-stationary (for transitions between groups) covariance models. Provided that the stochastic storm tracks with all the parameters describing the TC evolution are already simulated, a sequence of conditional samples from the covariance models chosen according to the TC characteristics at a given moment in time are concatenated, producing a continuous non-stationary precipitation pattern in a Lagrangian framework. The simulated precipitation for each event is finally distributed along the stochastic TC track and blended with a non-TC background precipitation using a data assimilation technique. The proposed framework provides a means of efficient simulation (10000 seasons simulated in a couple of days) and robust typhoon precipitation patterns consistent with the observed regional climate and visually indistinguishable from high resolution NWP output. The framework is used to simulate a catalog of 10000 typhoon seasons implemented in a flood risk model for Japan.
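A highly simplified sketch of the reduction-and-resampling idea (project precipitation snapshots onto leading principal components, build a covariance model for the coefficients within a group, then sample new coefficients and reconstruct a field) is given below with synthetic data. It omits the non-stationary transitions, Lagrangian track geometry, and blending steps of the full framework.

    import numpy as np

    rng = np.random.default_rng(7)

    # Toy stand-in for Lagrangian precipitation snapshots: 500 fields of 20x20 cells.
    fields = rng.gamma(shape=2.0, scale=1.5, size=(500, 400))

    # Reduce to the leading principal components via SVD of the anomaly matrix.
    mean_field = fields.mean(axis=0)
    anom = fields - mean_field
    U, s, Vt = np.linalg.svd(anom, full_matrices=False)
    k = 10                                    # number of leading components retained
    coeffs = anom @ Vt[:k].T                  # PC coefficients of each snapshot

    # "Dictionary" entry for one group (e.g. mature stage at a given central pressure):
    # mean and covariance of the reduced coefficients within that group.
    group = coeffs[:250]                      # toy grouping
    mu, cov = group.mean(axis=0), np.cov(group, rowvar=False)

    # Simulate a new precipitation pattern by sampling coefficients and reconstructing,
    # clipping negative values that the truncated reconstruction can produce.
    new_coeffs = rng.multivariate_normal(mu, cov)
    simulated_field = np.maximum(mean_field + new_coeffs @ Vt[:k], 0.0)
    print(simulated_field.reshape(20, 20).mean(), simulated_field.reshape(20, 20).max())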
NASA Technical Reports Server (NTRS)
DeLannoy, Gabrielle J. M.; Reichle, Rolf H.; Vrugt, Jasper A.
2013-01-01
Uncertainties in L-band (1.4 GHz) radiative transfer modeling (RTM) affect the simulation of brightness temperatures (Tb) over land and the inversion of satellite-observed Tb into soil moisture retrievals. In particular, accurate estimates of the microwave soil roughness, vegetation opacity and scattering albedo for large-scale applications are difficult to obtain from field studies and often lack an uncertainty estimate. Here, a Markov Chain Monte Carlo (MCMC) simulation method is used to determine satellite-scale estimates of RTM parameters and their posterior uncertainty by minimizing the misfit between long-term averages and standard deviations of simulated and observed Tb at a range of incidence angles, at horizontal and vertical polarization, and for morning and evening overpasses. Tb simulations are generated with the Goddard Earth Observing System (GEOS-5) and confronted with Tb observations from the Soil Moisture Ocean Salinity (SMOS) mission. The MCMC algorithm suggests that the relative uncertainty of the RTM parameter estimates is typically less than 25% of the maximum a posteriori density (MAP) parameter value. Furthermore, the actual root-mean-square differences in long-term Tb averages and standard deviations are found consistent with the respective estimated total simulation and observation error standard deviations of 3.1 K and 2.4 K. It is also shown that the MAP parameter values estimated through MCMC simulation are in close agreement with those obtained with Particle Swarm Optimization (PSO).
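The essence of the MCMC step, random-walk proposals accepted or rejected according to how well the forward-modelled Tb statistics match the observed ones, can be illustrated with a one-parameter toy example. The linear "Tb model", prior bounds, and observation error below are invented for illustration and do not represent GEOS-5 or SMOS values.

    import numpy as np

    rng = np.random.default_rng(3)

    # Toy stand-in for the Tb forward operator: brightness temperature depends on a
    # single "roughness" parameter h.  Observed long-term mean Tb is 260 K.
    def tb_model(h):
        return 300.0 - 45.0 * h

    tb_obs, tb_err = 260.0, 2.0

    def log_post(h):
        if not 0.0 <= h <= 2.0:                    # uniform prior bounds
            return -np.inf
        return -0.5 * ((tb_model(h) - tb_obs) / tb_err) ** 2

    # Random-walk Metropolis sampling of the posterior.
    chain, h = [], 1.0
    lp = log_post(h)
    for _ in range(20000):
        h_prop = h + rng.normal(0.0, 0.05)
        lp_prop = log_post(h_prop)
        if np.log(rng.random()) < lp_prop - lp:
            h, lp = h_prop, lp_prop
        chain.append(h)

    samples = np.array(chain[5000:])               # discard burn-in
    print(f"posterior mean: {samples.mean():.3f}, "
          f"relative uncertainty: {samples.std() / samples.mean():.1%}")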
Impacts of climate change and internal climate variability on french rivers streamflows
NASA Astrophysics Data System (ADS)
Dayon, Gildas; Boé, Julien; Martin, Eric
2016-04-01
The assessment of the impacts of climate change often requires setting up long chains of modeling, from the model used to estimate the future concentration of greenhouse gases to the impact model. Throughout the modeling chain, sources of uncertainty accumulate, making the exploitation of results for the development of adaptation strategies difficult. It is proposed here to assess the impacts of climate change on the hydrological cycle over France and the associated uncertainties. The contributions of the uncertainties from the greenhouse gas emission scenario, climate models and internal variability are addressed in this work. To have a large ensemble of climate simulations, the study is based on Global Climate Model (GCM) simulations from the Coupled Model Intercomparison Project Phase 5 (CMIP5), including several simulations from the same GCM to properly assess uncertainties from internal climate variability. Simulations from the four Representative Concentration Pathways (RCPs) are downscaled with a statistical method developed in a previous study (Dayon et al. 2015). The hydrological system Isba-Modcou is then driven by the downscaling results on an 8 km grid over France. Isba is a land surface model that calculates the energy and water balance and Modcou a hydrogeological model that routes the surface runoff given by Isba. Based on that framework, uncertainties from the greenhouse gas emission scenario, climate models and climate internal variability are evaluated. Their relative importance is described for the next decades and the end of this century. In a last part, uncertainties due to internal climate variability on streamflows simulated with downscaled GCMs and Isba-Modcou are evaluated against observations and hydrological reconstructions over the whole 20th century. Hydrological reconstructions are based on the downscaling of recent atmospheric reanalyses of the 20th century and observations of temperature and precipitation. We show that the multi-decadal variability of streamflows observed in the 20th century is generally weaker in the hydrological simulations done with the historical simulations from climate models. References: Dayon et al. (2015), Transferability in the future climate of a statistical downscaling method for precipitation in France, J. Geophys. Res. Atmos., 120, 1023-1043, doi:10.1002/2014JD022236
Probabilistic simulation of uncertainties in composite uniaxial strengths
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Stock, T. A.
1990-01-01
Probabilistic composite micromechanics methods are developed that simulate uncertainties in unidirectional fiber composite strengths. These methods are in the form of computational procedures using composite mechanics with Monte Carlo simulation. The variables for which uncertainties are accounted include constituent strengths and their respective scatter. A graphite/epoxy unidirectional composite (ply) is studied to illustrate the procedure and its effectiveness in formally estimating the probable scatter in the composite uniaxial strengths. The results show that ply longitudinal tensile and compressive, transverse compressive and intralaminar shear strengths are not sensitive to single fiber anomalies (breaks, interfacial disbonds, matrix microcracks); however, the ply transverse tensile strength is.
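A stripped-down sketch of the Monte Carlo idea (sample constituent strengths from their assumed scatter, push them through a micromechanics relation, and summarize the resulting ply-strength distribution) follows. The rule-of-mixtures relation and all numbers are illustrative stand-ins, not the ICAN/CODSTRAN micromechanics or graphite/epoxy data.

    import numpy as np

    rng = np.random.default_rng(11)
    n = 100000

    # Illustrative constituent scatter (hypothetical values): fiber and matrix tensile
    # strengths in MPa, and a fixed fiber volume fraction.
    sf = rng.normal(3500.0, 300.0, n)       # fiber strength
    sm = rng.normal(80.0, 10.0, n)          # matrix strength
    vf = 0.6

    # Simple rule-of-mixtures stand-in for longitudinal ply tensile strength.
    s11t = vf * sf + (1.0 - vf) * sm

    mean, std = s11t.mean(), s11t.std()
    print(f"ply longitudinal tensile strength: mean {mean:.0f} MPa, "
          f"coefficient of variation {std / mean:.1%}")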
Orbital Debris Shape and Orientation Effects on Ballistic Limits
NASA Technical Reports Server (NTRS)
Evans, Steven W.; Williamsen, Joel E.
2005-01-01
The SPHC hydrodynamic code was used to evaluate the effects of orbital debris particle shape and orientation on penetration of a typical spacecraft dual-wall shield. Impacts were simulated at near-normal obliquity at 12 km/sec. Debris cloud characteristics and damage potential are compared with those from impacts by spherical projectiles. Results of these simulations indicate the uncertainties in the predicted ballistic limits due to modeling uncertainty and to uncertainty in the impactor orientation.
Post-processing of multi-hydrologic model simulations for improved streamflow projections
NASA Astrophysics Data System (ADS)
khajehei, sepideh; Ahmadalipour, Ali; Moradkhani, Hamid
2016-04-01
Hydrologic model outputs are prone to bias and uncertainty due to knowledge deficiency in model and data. Uncertainty in hydroclimatic projections arises due to uncertainty in the hydrologic model as well as the epistemic or aleatory uncertainties in GCM parameterization and development. This study is conducted to: 1) evaluate the recently developed multi-variate post-processing method for historical simulations and 2) assess the effect of post-processing on uncertainty and reliability of future streamflow projections in both high-flow and low-flow conditions. The first objective is performed for the historical period of 1970-1999. Future streamflow projections are generated for 10 statistically downscaled GCMs from two widely used downscaling methods: Bias Corrected Statistically Downscaled (BCSD) and Multivariate Adaptive Constructed Analogs (MACA), over the period of 2010-2099 for two representative concentration pathways, RCP4.5 and RCP8.5. Three semi-distributed hydrologic models were employed and calibrated at 1/16 degree latitude-longitude resolution for over 100 points across the Columbia River Basin (CRB) in the Pacific Northwest, USA. Streamflow outputs are post-processed through a Bayesian framework based on copula functions. The post-processing approach relies on a transfer function developed from the bivariate joint distribution between observations and simulations in the historical period. Results show that application of the post-processing technique leads to considerably higher accuracy in historical simulations and also reduces model uncertainty in future streamflow projections.
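As a rough, language-neutral illustration of a copula-type transfer function between simulated and observed flows, the sketch below rank-transforms a calibration sample of both variables to normal scores, estimates their Gaussian-copula correlation, and maps a new simulated value to a conditioned observation quantile. It is a simplification of, not a reproduction of, the Bayesian framework used in the study.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)

    # Calibration period: paired observed and simulated streamflow (toy, biased model).
    obs = rng.gamma(2.0, 50.0, 3000)
    sim = 0.7 * obs + rng.normal(0.0, 20.0, 3000) + 30.0

    # Empirical-CDF (rank) transform to uniform, then to standard normal scores.
    def to_normal_scores(x):
        ranks = stats.rankdata(x) / (len(x) + 1.0)
        return stats.norm.ppf(ranks)

    z_obs, z_sim = to_normal_scores(obs), to_normal_scores(sim)
    rho = np.corrcoef(z_sim, z_obs)[0, 1]          # Gaussian-copula dependence

    # Post-process a new simulated value: condition the observation's normal score on
    # the simulation's, then map back through the observed marginal (empirical quantile).
    def post_process(sim_value):
        u_sim = (np.searchsorted(np.sort(sim), sim_value) + 0.5) / (len(sim) + 1.0)
        z = rho * stats.norm.ppf(u_sim)            # conditional mean in normal space
        return np.quantile(obs, stats.norm.cdf(z))

    print(post_process(100.0))                      # bias-corrected streamflow estimate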
NASA Astrophysics Data System (ADS)
Pilz, Tobias; Francke, Till; Bronstert, Axel
2016-04-01
Until today, a large number of competing computer models have been developed to understand hydrological processes and to simulate and predict streamflow dynamics of rivers. This is primarily the result of a lack of a unified theory in catchment hydrology due to insufficient process understanding and uncertainties related to model development and application. Therefore, the goal of this study is to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received only little attention in previous investigations. First, it estimates the impact of model structural uncertainty by employing several alternative representations for each simulated process. Second, it explores the influence of landscape discretization and parameterization from multiple datasets and user decisions. Third, it employs several numerical solvers for the integration of the governing ordinary differential equations to study their effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
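The solver comparison mentioned above can be illustrated on a deliberately small example: integrating a linear-reservoir ODE with a fixed-step explicit Euler scheme versus an implicit multistep (BDF) solver, and checking both against the analytical solution. The reservoir model is a stand-in, not one of the study's process representations.

    import numpy as np
    from scipy.integrate import solve_ivp

    # Linear reservoir: dS/dt = P - k*S  (storage S, constant rain P, recession k).
    k, P, S0, t_end = 0.5, 2.0, 10.0, 20.0

    def rhs(t, S):
        return P - k * S

    # Explicit Euler with a fixed, fairly coarse step.
    dt = 0.5
    S, t = S0, 0.0
    while t < t_end:
        S += dt * rhs(t, S)
        t += dt

    # Implicit multistep solver (BDF) from SciPy.
    sol = solve_ivp(rhs, (0.0, t_end), [S0], method="BDF", rtol=1e-8)

    exact = P / k + (S0 - P / k) * np.exp(-k * t_end)
    print(f"explicit Euler: {S:.4f}, BDF: {sol.y[0, -1]:.4f}, exact: {exact:.4f}")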
NASA Astrophysics Data System (ADS)
Yatheendradas, S.; Vivoni, E.
2007-12-01
A common practice in distributed hydrological modeling is to assign soil hydraulic properties based on coarse textural datasets. For semiarid regions with poor soil information, the performance of a model can be severely constrained due to the high model sensitivity to near-surface soil characteristics. Neglecting the uncertainty in soil hydraulic properties, their spatial variation and their naturally-occurring horizonation can potentially affect the modeled hydrological response. In this study, we investigate such effects using the TIN-based Real-time Integrated Basin Simulator (tRIBS) applied to the mid-sized (100 km2) Sierra Los Locos watershed in northern Sonora, Mexico. The Sierra Los Locos basin is characterized by complex mountainous terrain leading to topographic organization of soil characteristics and ecosystem distributions. We focus on simulations during the 2004 North American Monsoon Experiment (NAME) when intensive soil moisture measurements and aircraft- based soil moisture retrievals are available in the basin. Our experiments focus on soil moisture comparisons at the point, topographic transect and basin scales using a range of different soil characterizations. We compare the distributed soil moisture estimates obtained using (1) a deterministic simulation based on soil texture from coarse soil maps, (2) a set of ensemble simulations that capture soil parameter uncertainty and their spatial distribution, and (3) a set of simulations that conditions the ensemble on recent soil profile measurements. Uncertainties considered in near-surface soil characterization provide insights into their influence on the modeled uncertainty, into the value of soil profile observations, and into effective use of on-going field observations for constraining the soil moisture response uncertainty.
NASA Astrophysics Data System (ADS)
Rautman, C. A.; Treadway, A. H.
1991-11-01
Regulatory geologists are concerned with predicting the performance of sites proposed for waste disposal or for remediation of existing pollution problems. Geologic modeling of these sites requires large-scale expansion of knowledge obtained from very limited sampling. This expansion induces considerable uncertainty into the geologic models of rock properties that are required for modeling the predicted performance of the site. One method for assessing this uncertainty is through nonparametric geostatistical simulation. Simulation can produce a series of equiprobable models of a rock property of interest. Each model honors measured values at sampled locations, and each can be constructed to emulate both the univariate histogram and the spatial covariance structure of the measured data. Computing a performance model for a number of geologic simulations allows evaluation of the effects of geologic uncertainty. A site may be judged acceptable if the number of failures to meet a particular performance criterion produced by these computations is sufficiently low. A site that produces too many failures may be either unacceptable or simply inadequately described. The simulation approach to addressing geologic uncertainty is being applied to the potential high-level nuclear waste repository site at Yucca Mountain, Nevada, U.S.A. Preliminary geologic models of unsaturated permeability have been created that reproduce observed statistical properties reasonably well. A spread of unsaturated groundwater travel times has been computed that reflects the variability of those geologic models. Regions within the simulated models exhibiting the greatest variability among multiple runs are candidates for obtaining the greatest reduction in uncertainty through additional site characterization.
Rocket nozzle thermal shock tests in an arc heater facility
NASA Technical Reports Server (NTRS)
Painter, James H.; Williamson, Ronald A.
1986-01-01
A rocket motor nozzle thermal structural test technique that utilizes arc heated nitrogen to simulate a motor burn was developed. The technique was used to test four heavily instrumented full-scale Star 48 rocket motor 2D carbon/carbon segments at conditions simulating the predicted thermal-structural environment. All four nozzles survived the tests without catastrophic or other structural failures. The test technique demonstrated promise as a low cost, controllable alternative to rocket motor firing. The technique includes the capability of rapid termination in the event of failure, allowing post-test analysis.
NASA Astrophysics Data System (ADS)
Sawicka, K.; Breuer, L.; Houska, T.; Santabarbara Ruiz, I.; Heuvelink, G. B. M.
2016-12-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation analysis and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo techniques, as well as several uncertainty visualization functions. Here we will demonstrate that the 'spup' package is an effective and easy-to-use tool to be applied even in a very complex case study, and that it can be used in multi-disciplinary research and model-based decision support. As an example, we use the ecological LandscapeDNDC model to analyse the propagation of uncertainties associated with the spatial variability of the model driving forces such as rainfall, nitrogen deposition and fertilizer inputs. The uncertainty propagation is analysed for the prediction of emissions of N2O and CO2 for a German low mountainous, agriculturally developed catchment. The study tests the effect of spatial correlations on spatially aggregated model outputs, and could serve as guidance for developing best management practices and model improvement strategies.
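The 'spup' package itself is written in R; purely to illustrate the underlying Monte Carlo propagation idea (simulate spatially correlated realizations of an uncertain input, run the model on each, and summarize the spread of an aggregated prediction), here is a hedged Python sketch with an invented covariance model and a toy emission model.

    import numpy as np

    rng = np.random.default_rng(9)

    # Small grid and an exponential spatial covariance for the uncertain input field.
    nx = ny = 10
    xs, ys = np.meshgrid(np.arange(nx), np.arange(ny))
    coords = np.column_stack([xs.ravel(), ys.ravel()])
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sigma, corr_range = 50.0, 3.0
    cov = sigma**2 * np.exp(-d / corr_range)
    L = np.linalg.cholesky(cov + 1e-8 * np.eye(nx * ny))

    mean_rain = 800.0                       # mm/yr, illustrative

    def toy_emission_model(rain):
        """Stand-in for the environmental model: emissions grow with rainfall."""
        return 0.002 * rain + 0.5

    # Monte Carlo propagation: simulate correlated inputs, run the model, aggregate.
    totals = []
    for _ in range(500):
        rain = mean_rain + L @ rng.standard_normal(nx * ny)
        totals.append(toy_emission_model(rain).sum())

    totals = np.array(totals)
    print(f"aggregated output: mean {totals.mean():.1f}, "
          f"5-95% bounds [{np.percentile(totals, 5):.1f}, {np.percentile(totals, 95):.1f}]")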
Rifai, Sami W; Urquiza Muñoz, José D; Negrón-Juárez, Robinson I; Ramírez Arévalo, Fredy R; Tello-Espinoza, Rodil; Vanderwel, Mark C; Lichstein, Jeremy W; Chambers, Jeffrey Q; Bohlman, Stephanie A
2016-10-01
Wind disturbance can create large forest blowdowns, which greatly reduces live biomass and adds uncertainty to the strength of the Amazon carbon sink. Observational studies from within the central Amazon have quantified blowdown size and estimated total mortality but have not determined which trees are most likely to die from a catastrophic wind disturbance. Also, the impact of spatial dependence upon tree mortality from wind disturbance has seldom been quantified, which is important because wind disturbance often kills clusters of trees due to large treefalls killing surrounding neighbors. We examine (1) the causes of differential mortality between adult trees from a 300-ha blowdown event in the Peruvian region of the northwestern Amazon, (2) how accounting for spatial dependence affects mortality predictions, and (3) how incorporating both differential mortality and spatial dependence affect the landscape level estimation of necromass produced from the blowdown. Standard regression and spatial regression models were used to estimate how stem diameter, wood density, elevation, and a satellite-derived disturbance metric influenced the probability of tree death from the blowdown event. The model parameters regarding tree characteristics, topography, and spatial autocorrelation of the field data were then used to determine the consequences of non-random mortality for landscape production of necromass through a simulation model. Tree mortality was highly non-random within the blowdown, where tree mortality rates were highest for trees that were large, had low wood density, and were located at high elevation. Of the differential mortality models, the non-spatial models overpredicted necromass, whereas the spatial model slightly underpredicted necromass. When parameterized from the same field data, the spatial regression model with differential mortality estimated only 7.5% more dead trees across the entire blowdown than the random mortality model, yet it estimated 51% greater necromass. We suggest that predictions of forest carbon loss from wind disturbance are sensitive to not only the underlying spatial dependence of observations, but also the biological differences between individuals that promote differential levels of mortality. © 2016 by the Ecological Society of America.
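A simplified, non-spatial sketch of the differential-mortality idea, a logistic model in which the probability of death increases with stem diameter and elevation and decreases with wood density, followed by a simulation of the necromass produced, is given below. All coefficients and the allometric relation are hypothetical, and the spatial-autocorrelation term of the fitted model is omitted.

    import numpy as np

    rng = np.random.default_rng(13)
    n_trees = 2000

    # Toy stand structure: diameter (cm), wood density (g cm-3), relative elevation (m).
    dbh = rng.lognormal(mean=3.0, sigma=0.4, size=n_trees)
    wd = rng.normal(0.6, 0.1, n_trees)
    elev = rng.normal(0.0, 10.0, n_trees)

    # Differential mortality: larger, lighter-wooded, higher-elevation trees die more
    # often.  Coefficients are hypothetical, not the fitted values from the study.
    logit = -3.0 + 0.03 * dbh - 2.0 * (wd - 0.6) + 0.05 * elev
    p_death = 1.0 / (1.0 + np.exp(-logit))
    dead = rng.random(n_trees) < p_death

    # Necromass from an allometric stand-in: biomass ~ wd * dbh^2.5 (arbitrary units).
    biomass = wd * dbh ** 2.5
    print(f"trees killed: {dead.sum()} of {n_trees}, "
          f"necromass share of live biomass: {biomass[dead].sum() / biomass.sum():.1%}")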
NASA Astrophysics Data System (ADS)
Zolfaghari, Mohammad R.
2009-07-01
Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications. Examples for such applications are risk mitigation, disaster management, post disaster recovery planning and catastrophe loss estimation and risk management. Due to the lack of proper knowledge with regard to factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, the majority of the data and assumptions used in seismic hazard studies remain with high uncertainties that contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides practical facility for better capture of spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used in order to model geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for implementing many geographically referenced seismotectonic factors into seismic hazard modelling. Examples for such components are seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology provides improvements in several aspects of the standard analytical tools currently being used for assessment and mapping of regional seismic hazard. The proposed methodology makes the best use of the recent advancements in computer technology in both software and hardware. The proposed approach is well structured to be implemented using conventional GIS tools.
Uncertainty in predictions of forest carbon dynamics: separating driver error from model error.
Spadavecchia, L; Williams, M; Law, B E
2011-07-01
We present an analysis of the relative magnitude and contribution of parameter and driver uncertainty to the confidence intervals on estimates of net carbon fluxes. Model parameters may be difficult or impractical to measure, while driver fields are rarely complete, with data gaps due to sensor failure and sparse observational networks. Parameters are generally derived through some optimization method, while driver fields may be interpolated from available data sources. For this study, we used data from a young ponderosa pine stand at Metolius, Central Oregon, and a simple daily model of coupled carbon and water fluxes (DALEC). An ensemble of acceptable parameterizations was generated using an ensemble Kalman filter and eddy covariance measurements of net C exchange. Geostatistical simulations generated an ensemble of meteorological driving variables for the site, consistent with the spatiotemporal autocorrelations inherent in the observational data from 13 local weather stations. Simulated meteorological data were propagated through the model to derive the uncertainty on the CO2 flux resultant from driver uncertainty typical of spatially extensive modeling studies. Furthermore, the model uncertainty was partitioned between temperature and precipitation. With at least one meteorological station within 25 km of the study site, driver uncertainty was relatively small ( 10% of the total net flux), while parameterization uncertainty was larger, 50% of the total net flux. The largest source of driver uncertainty was due to temperature (8% of the total flux). The combined effect of parameter and driver uncertainty was 57% of the total net flux. However, when the nearest meteorological station was > 100 km from the study site, uncertainty in net ecosystem exchange (NEE) predictions introduced by meteorological drivers increased by 88%. Precipitation estimates were a larger source of bias in NEE estimates than were temperature estimates, although the biases partly compensated for each other. The time scales on which precipitation errors occurred in the simulations were shorter than the temporal scales over which drought developed in the model, so drought events were reasonably simulated. The approach outlined here provides a means to assess the uncertainty and bias introduced by meteorological drivers in regional-scale ecological forecasting.
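The partitioning of uncertainty between parameters and drivers can be mimicked with two small ensembles: one in which only a model parameter varies (drivers held at their best estimates) and one in which only the meteorological drivers vary. The toy flux model and all numbers below are invented; they stand in for DALEC, the ensemble Kalman filter parameter sets, and the geostatistically simulated drivers.

    import numpy as np

    rng = np.random.default_rng(17)

    def toy_nee(light_use_eff, temperature, precip):
        """Very simple stand-in for a daily net-exchange model (arbitrary units)."""
        gpp = light_use_eff * np.maximum(precip, 0.0)
        resp = 0.5 * np.exp(0.07 * temperature)
        return resp - gpp

    n_days, n_ens = 365, 200
    t_mean = 10.0 + 8.0 * np.sin(2 * np.pi * np.arange(n_days) / 365.0)
    p_mean = np.full(n_days, 2.0)

    # Parameter ensemble (drivers fixed at their best estimates).
    lue_ens = rng.normal(0.4, 0.08, n_ens)
    nee_param = np.array([toy_nee(lue, t_mean, p_mean).sum() for lue in lue_ens])

    # Driver ensemble (parameter fixed), emulating interpolated meteorology error.
    nee_driver = np.array([
        toy_nee(0.4, t_mean + rng.normal(0, 1.5, n_days),
                p_mean + rng.normal(0, 0.8, n_days)).sum()
        for _ in range(n_ens)
    ])

    print(f"spread from parameters: {nee_param.std():.1f}, "
          f"spread from drivers: {nee_driver.std():.1f} (annual NEE units)")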
NASA Astrophysics Data System (ADS)
Doroszkiewicz, J. M.; Romanowicz, R. J.
2016-12-01
The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to a catchment scale, estimation of hydrological extreme indices using hydrological modelling tools and subsequent derivation of flood risk maps with the help of a hydraulic model. Among the many possible sources of uncertainty, the main ones are the uncertainties related to future climate scenarios, climate models, downscaling techniques and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability can be very computationally demanding. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for the chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computational requirements related to the derivation of flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to different stages of the process of derivation of flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps.
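One way to build the kind of cross-section simulator described above is to fit a simple nonlinear transfer function to paired discharge/water-level output from the full hydraulic model and then evaluate the cheap function instead of the model. The power-law form, the synthetic training data, and all parameter values below are assumptions made for illustration.

    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(21)

    # Paired training data that would come from runs of the full hydraulic model:
    # discharge Q (m3/s) at the cross-section and the simulated water level h (m).
    Q_train = np.linspace(20.0, 800.0, 60)
    h_train = 0.3 * Q_train ** 0.45 + rng.normal(0.0, 0.05, Q_train.size)  # toy "model output"

    # Nonlinear transfer function h = a * Q^b + c fitted to the hydraulic-model output.
    def transfer(Q, a, b, c):
        return a * Q ** b + c

    params, _ = curve_fit(transfer, Q_train, h_train, p0=[0.5, 0.5, 0.0])

    # The emulator can now be evaluated for thousands of climate-driven discharge
    # values at negligible cost compared with re-running the hydraulic model.
    Q_future = rng.gamma(3.0, 120.0, 10000)
    h_future = transfer(Q_future, *params)
    print(f"fitted (a, b, c) = {np.round(params, 3)}; "
          f"95th percentile water level: {np.percentile(h_future, 95):.2f} m")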
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Zhiyun; Krasnopolsky, Ruben; Shang, Hsien
2013-09-01
Stars form in dense cores of molecular clouds that are observed to be significantly magnetized. In the simplest case of a laminar (non-turbulent) core with the magnetic field aligned with the rotation axis, both analytic considerations and numerical simulations have shown that the formation of a large, 10^2 AU scale, rotationally supported protostellar disk is suppressed by magnetic braking in the ideal MHD limit for a realistic level of core magnetization. This theoretical difficulty in forming protostellar disks is termed the "magnetic braking catastrophe". A possible resolution to this problem, proposed by Hennebelle and Ciardi and Joos et al., is that misalignment between the magnetic field and rotation axis may weaken the magnetic braking enough to enable disk formation. We evaluate this possibility quantitatively through numerical simulations. We confirm the basic result of Joos et al. that the misalignment is indeed conducive to disk formation. In relatively weakly magnetized cores with dimensionless mass-to-flux ratio ≳4, it enabled the formation of rotationally supported disks that would otherwise be suppressed if the magnetic field and rotation axis are aligned. For more strongly magnetized cores, disk formation remains suppressed, however, even for the maximum tilt angle of 90°. If dense cores are as strongly magnetized as indicated by OH Zeeman observations (with a mean dimensionless mass-to-flux ratio ≈2), it would be difficult for the misalignment alone to enable disk formation in the majority of them. We conclude that, while beneficial to disk formation, especially for the relatively weak field case, misalignment does not completely solve the problem of catastrophic magnetic braking in general.
Pekmezci, Murat; Tang, Jessica A; Cheng, Liu; Modak, Ashin; McClellan, Robert T; Buckley, Jenni M; Ames, Christopher P
2016-11-01
In vitro cadaver biomechanics study. The goal of this study is to compare the in situ fatigue life of expandable versus fixed interbody cage designs. Expandable cages are becoming more popular, in large part, due to their versatility; however, subsidence and catastrophic failure remain a concern. This in vitro analysis investigates the fatigue life of expandable and fixed interbody cages in a single level human cadaver corpectomy model by evaluating modes of subsidence of expandable and fixed cages as well as change in stiffness of the constructs with cyclic loading. Nineteen specimens from 10 human thoracolumbar spines (T10-L2, L3-L5) were biomechanically evaluated after a single level corpectomy that was reconstructed with an expandable or fixed cage and anterior dual rod instrumentation. All specimens underwent 98 K cycles to simulate 3 months of postoperative weight bearing. In addition, a third group with hyperlordotic cages was used to simulate catastrophic failure that is observed in clinical practice. Three fixed and 2 expandable cages withstood the cyclic loading despite perfect sagittal and coronal plane fitting of the endcaps. The majority of the constructs settled in after initial subsidence. The catastrophic failures that were observed in clinical practice could not be reproduced with hyperlordotic cages. However, all cages in this group subsided, and 60% resulted in endplate fractures during deployment of the cage. Despite greater surface contact area, expandable cages have a trend for higher subsidence rates when compared with fixed cages. When there is edge loading as in the hyperlordotic cage scenario, there is a higher risk of subsidence and intraoperative fracture during deployment of expandable cages.
Yip, Winnie; Hsiao, William C
2009-01-01
In recent years, many lower to middle income countries have looked to insurance as a means to protect their populations from medical impoverishment. In 2003, the Chinese government initiated the New Cooperative Medical System (NCMS), a government-run voluntary insurance program for its rural population. The prevailing model of NCMS combines medical savings accounts with high-deductible catastrophic hospital insurance (MSA/Catastrophic). To assess the effectiveness of this approach in reducing medical impoverishment, we used household survey data from 2006 linked to claims records of health expenditures to simulate the effect of MSA/Catastrophic on reducing the share of individuals falling below the poverty line (headcount), and the amount by which household resources fall short of the poverty line (poverty gap) due to medical expenses. We compared the effects of MSA/Catastrophic to Rural Mutual Health Care (RMHC), an experimental model that provides first dollar coverage for primary care, hospital services and drugs with a similar premium but a lower ceiling. Our results show that RMHC is more effective at reducing medical impoverishment than NCMS. Under the internationally accepted poverty line of US$1.08 per person per day, the MSA/Catastrophic models would reduce the poverty headcount by 3.5-3.9% and the average poverty gap by 11.8-16.4%, compared with reductions of 6.1-6.8% and 15-18.5% under the RMHC model. The primary reason for this is that NCMS does not address a major cause of medical impoverishment: expensive outpatient services for chronic conditions. As such, health policymakers need first to examine the disease profile and health expenditure pattern of a population before they can direct resources to where they will be most effective. As chronic diseases impose a growing share of the burden on the population in developing countries, it is not necessarily true that insurance coverage focusing on expensive hospital care alone is the most effective at providing financial risk protection.
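The two impoverishment measures used above, the poverty headcount and the poverty gap computed from household resources net of out-of-pocket health spending, can be sketched directly. The income and spending distributions and the stylized reimbursement rule below are illustrative only and do not represent the NCMS or RMHC benefit designs.

    import numpy as np

    rng = np.random.default_rng(23)
    n = 10000
    poverty_line = 1.08 * 365.0                      # US$ per person per year

    income = rng.lognormal(mean=6.6, sigma=0.6, size=n)        # toy pre-payment income
    health_exp = rng.gamma(0.5, 400.0, size=n)                 # toy out-of-pocket spending

    def impoverishment(income, oop, line):
        net = income - oop
        headcount = np.mean(net < line)                        # share below the line
        gap = np.mean(np.maximum(line - net, 0.0))             # mean shortfall
        return headcount, gap

    def reimburse(oop, deductible=300.0, rate=0.6, ceiling=2000.0):
        """Stylized catastrophic insurance: covers a share of spending above a deductible."""
        covered = np.clip((oop - deductible) * rate, 0.0, ceiling)
        return oop - covered

    hc0, gap0 = impoverishment(income, health_exp, poverty_line)
    hc1, gap1 = impoverishment(income, reimburse(health_exp), poverty_line)
    print(f"headcount: {hc0:.1%} -> {hc1:.1%}, poverty gap: {gap0:.0f} -> {gap1:.0f} US$")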
Wesolowski, Edwin A.
1996-01-01
Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for ice-free conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions. To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen. The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwater-source specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwater-source and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14. During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specific-conductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are point-source total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient.
The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
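The two techniques named above can be contrasted on a toy example: first-order error analysis combines numerically estimated sensitivity coefficients with input standard deviations in quadrature, while Monte Carlo simulation samples the inputs and evaluates the model repeatedly. The dissolved-oxygen "model" below is a linear stand-in, not the Red River water-quality model.

    import numpy as np

    rng = np.random.default_rng(29)

    def do_model(do_headwater, k_reaeration, sod):
        """Toy downstream dissolved-oxygen response (mg/L), not the stream model itself."""
        return 0.7 * do_headwater + 2.5 * k_reaeration - 1.8 * sod

    # Best estimates and standard deviations of three uncertain inputs.
    x0 = np.array([9.0, 1.2, 0.8])
    sx = np.array([0.5, 0.2, 0.15])

    # First-order error analysis: numerical sensitivities, combined in quadrature.
    eps = 1e-4
    sens = np.array([
        (do_model(*(x0 + eps * np.eye(3)[i])) - do_model(*x0)) / eps for i in range(3)
    ])
    foea_std = np.sqrt(np.sum((sens * sx) ** 2))

    # Monte Carlo simulation of the same model.
    draws = x0 + rng.normal(0.0, 1.0, (20000, 3)) * sx
    mc_std = np.array([do_model(*d) for d in draws]).std()

    print(f"first-order std: {foea_std:.3f} mg/L, Monte Carlo std: {mc_std:.3f} mg/L")
    # For this (linear) toy model the two agree; differences appear with nonlinearity.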
Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections
NASA Technical Reports Server (NTRS)
Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan
2015-01-01
The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.
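A schematic version of this partitioning, splitting the spread of an ensemble indexed by scenario, model, and initial-condition realization into scenario uncertainty, model uncertainty, and internal variability, is sketched below with synthetic numbers. It is an ANOVA-style illustration, not the study's partitioning of the actual CMIP5 ensemble.

    import numpy as np

    rng = np.random.default_rng(31)

    # Toy projections of sea level rise (cm) at one lead time:
    # 4 scenarios x 16 models x 5 initial-condition realizations.
    n_scen, n_model, n_real = 4, 16, 5
    scen_effect = np.array([20.0, 28.0, 35.0, 45.0])[:, None, None]
    model_effect = rng.normal(0.0, 8.0, (1, n_model, 1))
    internal = rng.normal(0.0, 3.0, (n_scen, n_model, n_real))
    slr = scen_effect + model_effect + internal

    # Partition of variance (simple ANOVA-style decomposition).
    scenario_var = slr.mean(axis=(1, 2)).var()                  # spread of scenario means
    model_var = slr.mean(axis=2).var(axis=1).mean()             # model spread, per scenario
    internal_var = slr.var(axis=2).mean()                       # realization spread

    total = scenario_var + model_var + internal_var
    for name, v in [("scenario", scenario_var), ("model", model_var),
                    ("internal variability", internal_var)]:
        print(f"{name}: {v / total:.1%} of partitioned variance")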
Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models
USDA-ARS?s Scientific Manuscript database
Cumulative nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. This study used an agroecosystems simulation model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strydom, Gerhard; Bostelmann, F.
The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on derivative-based methods such as stochastic sampling methods or on generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.
NASA Astrophysics Data System (ADS)
Baish, A. S.; Vivoni, E. R.; Payan, J. G.; Robles-Morua, A.; Basile, G. M.
2011-12-01
A distributed hydrologic model can help bring consensus among diverse stakeholders in regional flood planning by producing quantifiable sets of alternative futures. This value is acute in areas with high uncertainties in hydrologic conditions and sparse observations. In this study, we conduct an application of the Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator (tRIBS) in the Santa Catarina basin of Nuevo Leon, Mexico, where Hurricane Alex in July 2010 led to catastrophic flooding of the capital city of Monterrey. Distributed model simulations utilize best-available information on the regional topography, land cover, and soils obtained from Mexican government agencies or analysis of remotely-sensed imagery from MODIS and ASTER. Furthermore, we developed meteorological forcing for the flood event based on multiple data sources, including three local gauge networks, satellite-based estimates from TRMM and PERSIANN, and the North American Land Data Assimilation System (NLDAS). Remotely-sensed data allowed us to quantify rainfall distributions in the upland, rural portions of the Santa Catarina that are sparsely populated and ungauged. Rural areas had significant contributions to the flood event and as a result were considered by stakeholders for flood control measures, including new reservoirs and upland vegetation management. Participatory modeling workshops with the stakeholders revealed a disconnect between urban and rural populations in regard to understanding the hydrologic conditions of the flood event and the effectiveness of existing and potential flood control measures. Despite these challenges, the use of the distributed flood forecasts developed within this participatory framework facilitated building consensus among diverse stakeholders and exploring alternative futures in the basin.
Giant Landslides, Mega-Tsunamis, and Paleo-Sea Level in the Hawaiian Islands
NASA Astrophysics Data System (ADS)
Watts, P.; McMurtry, G. M.; Fryer, G. J.; Smith, J. R.; Imamura, F.
2001-12-01
We show considerable agreement between the ages of the two giant Alika landslides and dating of debris found tens to hundreds of meters above sea level in Hawaii. Despite the size of the landslides, controversy persists as to the ability to generate landslide tsunamis big enough to deposit the debris. We affirm that tsunami deposits are a sufficient explanation of the observed pattern of debris height. We also show that our tsunami simulations can be used to reduce the considerable uncertainty in subsidence history of the different Hawaiian islands, a current obstacle to interpreting the supposed deposits. Finally, we show that the onset of interglacials provides a probable explanation for the timing of these giant landslides over the last five million years. We predict that the greatest tsunami hazard facing the Hawaiian islands are giant landslides and that the current interglacial promotes the generation of mega-tsunamis from catastrophic volcano collapse. Hawaiian giant submarine landslide events have been recognized from detached submarine landslide blocks and fields of smaller debris by offshore surveys. Mega-tsunamis produced by giant landslides were first proposed for Hawaii and have since been implicated globally at other oceanic islands and along the continental margins. While not discounting the possibility of locally-generated tsunamis, some researchers have cast doubt upon the original hypothesis of giant waves impacting Lanai and other Hawaiian islands from flank failures of the nearby Mauna Loa Volcano on Hawaii island. Landslide tsunami simulations have advanced to the point where the tsunamigenic potential of the giant submarine landslides can be affirmed, while the subsidence history of different Hawaiian islands is still subject to debate.
Measurement of the $B^-$ lifetime using a simulation free approach for trigger bias correction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aaltonen, T.; /Helsinki Inst. of Phys.; Adelman, J.
2010-04-01
The collection of a large number of B hadron decays to hadronic final states at the CDF II detector is possible due to the presence of a trigger that selects events based on track impact parameters. However, the nature of the selection requirements of the trigger introduces a large bias in the observed proper decay time distribution. A lifetime measurement must correct for this bias and the conventional approach has been to use a Monte Carlo simulation. The leading sources of systematic uncertainty in the conventional approach are due to differences between the data and the Monte Carlo simulation. In this paper they present an analytic method for bias correction without using simulation, thereby removing any uncertainty between data and simulation. This method is presented in the form of a measurement of the lifetime of the B^- using the mode B^- → D^0 π^-. The B^- lifetime is measured as τ(B^-) = 1.663 ± 0.023 ± 0.015 ps, where the first uncertainty is statistical and the second systematic. This new method results in a smaller systematic uncertainty in comparison to methods that use simulation to correct for the trigger bias.
NASA Astrophysics Data System (ADS)
Román, Roberto; Bilbao, Julia; de Miguel, Argimiro; Pérez-Burgos, Ana
2014-05-01
Radiative transfer models can be used to obtain solar radiative quantities at the Earth's surface, such as the erythemal ultraviolet (UVER) irradiance, which is the spectral irradiance weighted with the erythemal (sunburn) action spectrum, and the total shortwave irradiance (SW; 305-2,800 nm). Aerosol and atmospheric properties are necessary as model inputs in order to calculate the UVER and SW irradiances under cloudless conditions; however, the uncertainty in these inputs propagates into uncertainty in the simulations. The objective of this work is to quantify the uncertainty in UVER and SW simulations generated by the aerosol optical depth (AOD) uncertainty. Data from different satellite retrievals were downloaded for nine Spanish sites in the Iberian Peninsula: total ozone column from different databases, spectral surface albedo and water vapour column from the MODIS instrument, AOD at 443 nm and Angström exponent (between 443 nm and 670 nm) from the MISR instrument on board the Terra satellite, and single scattering albedo from the OMI instrument on board the Aura satellite. The MISR AOD at 443 nm was compared with AERONET measurements at six Spanish sites, yielding an uncertainty in the MISR AOD of 0.074. In this work the radiative transfer model UVSPEC/libRadtran (version 1.7) was used to obtain the SW and UVER irradiance under cloudless conditions for each month and for different solar zenith angles (SZA) at the nine locations. The inputs for these simulations were monthly climatology tables obtained from the available data at each location. Once the UVER and SW simulations were obtained, they were repeated twice, changing the monthly AOD values to the same AOD plus or minus its uncertainty. The maximum difference between the irradiance obtained with the nominal AOD and the irradiance obtained with AOD plus or minus its uncertainty was calculated for each month, SZA, and location. This difference was taken as the model uncertainty caused by the AOD uncertainty. The uncertainty in the simulated global SW and UVER varies with the location, but the behaviour is similar: high uncertainty in specific months. The averages of the uncertainty at the nine locations were calculated. Uncertainty in the global SW is lower than 5% for SZA values lower than 70º, and the uncertainty in global UVER is between 2 and 6%. The uncertainty in the direct and diffuse components is higher than in the global case for both SW and UVER irradiances, but a balance between the AOD-driven changes in the direct and diffuse components provides a lower uncertainty in the global SW and UVER irradiance. References Bilbao, J., Román, R., de Miguel, A., Mateos, D.: Long-term solar erythemal UV irradiance data reconstruction in Spain using a semiempirical method, J. Geophys. Res., 116, D22211, 2011. Kylling, A., Stamnes, K., Tsay, S. C.: A reliable and efficient two-stream algorithm for spherical radiative transfer: Documentation of accuracy in realistic layered media, J. Atmos. Chem., 21, 115-150, 1995. Ricchiazzi, P., Yang, S., Gautier, C., Sowle, D.: SBDART: A research and teaching software tool for plane-parallel radiative transfer in the Earth's atmosphere, Bulletin of the American Meteorological
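The perturbation procedure described above (re-running the model with the AOD shifted by plus or minus its uncertainty and taking the maximum difference) can be summarized in a short sketch. Here `run_libradtran` is a hypothetical wrapper around a UVSPEC/libRadtran run, and the 0.074 value is the MISR AOD uncertainty quoted in the abstract; everything else is an illustrative assumption, not the authors' code.

```python
# Minimal sketch of the AOD-perturbation procedure, assuming a hypothetical
# `run_libradtran(aod=..., sza=..., **inputs)` wrapper that returns an irradiance value.

AOD_UNCERTAINTY = 0.074  # MISR AOD(443 nm) uncertainty versus AERONET, from the text

def irradiance_uncertainty(run_libradtran, aod, sza, **inputs):
    """Irradiance uncertainty attributed to the AOD uncertainty for one month/SZA/site."""
    base = run_libradtran(aod=aod, sza=sza, **inputs)
    plus = run_libradtran(aod=aod + AOD_UNCERTAINTY, sza=sza, **inputs)
    minus = run_libradtran(aod=max(aod - AOD_UNCERTAINTY, 0.0), sza=sza, **inputs)
    # Maximum difference between the perturbed and unperturbed runs, as in the abstract
    return max(abs(plus - base), abs(minus - base))
```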
GENOA-PFA: Progressive Fracture in Composites Simulated Computationally
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.
2000-01-01
GENOA-PFA is a commercial version of the Composite Durability Structural Analysis (CODSTRAN) computer program that simulates the progression of damage ultimately leading to fracture in polymer-matrix-composite (PMC) material structures under various loading and environmental conditions. GENOA-PFA offers several capabilities not available in other programs developed for this purpose, making it preferable for use in analyzing the durability and damage tolerance of complex PMC structures in which the fiber reinforcements occur in two- and three-dimensional weaves and braids. GENOA-PFA implements a progressive-fracture methodology based on the idea that a structure fails when flaws that may initially be small (even microscopic) grow and/or coalesce to a critical dimension where the structure no longer has an adequate safety margin to avoid catastrophic global fracture. Damage is considered to progress through five stages: (1) initiation, (2) growth, (3) accumulation (coalescence of propagating flaws), (4) stable propagation (up to the critical dimension), and (5) unstable or very rapid propagation (beyond the critical dimension) to catastrophic failure. The computational simulation of progressive failure involves formal procedures for identifying the five different stages of damage and for relating the amount of damage at each stage to the overall behavior of the deteriorating structure. In GENOA-PFA, mathematical modeling of the composite physical behavior involves an integration of simulations at multiple, hierarchical scales ranging from the macroscopic (lamina, laminate, and structure) to the microscopic (fiber, matrix, and fiber/matrix interface), as shown in the figure. The code includes algorithms to simulate the progression of damage from various source defects, including (1) through-the-thickness cracks and (2) voids with edge, pocket, internal, or mixed-mode delaminations.
Non-catastrophic and catastrophic fractures in racing Thoroughbreds at the Hong Kong Jockey Club.
Sun, T C; Riggs, C M; Cogger, N; Wright, J; Al-Alawneh, J I
2018-04-19
Reports of fractures in racehorses have predominantly focused on catastrophic injuries, and there are limited data identifying the location and incidence of fractures that did not result in a fatal outcome. To describe the nature and the incidence of non-catastrophic and catastrophic fractures in Thoroughbreds racing at the Hong Kong Jockey Club (HKJC) over seven racing seasons. Retrospective cohort study. Data on fractures sustained by horses while racing and on race characteristics were extracted from the HKJC Veterinary Management Information System (VMIS) and Racing Information System (RIS), respectively. The fracture event was determined from the first clinical entry for each specific injury. The incidence rates of non-catastrophic and catastrophic fractures were calculated per 1000 racing starts for racetrack, age, racing season, sex and trainer. A total of 179 first fracture events occurred in 64,807 racing starts. The incidence rate of non-catastrophic fractures was 2.2 per 1000 racing starts and of catastrophic fractures was 0.6 per 1000 racing starts. Fractures of the proximal sesamoid bones represented 55% of all catastrophic fractures, while the most common non-catastrophic fractures involved the carpus and the first phalanx. Significant associations were detected between the incidence of non-catastrophic fractures and sex, trainer and racing season. The first fracture event was used to calculate the incidence rate in this study and may have resulted in underestimation of the true incidence rate of fractures in this population. However, given the low number of recorded fracture events compared to the size of the study population, this underestimation is likely to be small. There were 3.6 times as many non-catastrophic fractures as catastrophic fractures in Thoroughbreds racing in Hong Kong between 2004 and 2011. Non-catastrophic fractures interfere with race training schedules and may predispose to catastrophic fracture. Future analytical studies on non-catastrophic racing fractures should be a priority for the racing industry.
An individual-based model for population viability analysis of humpback chub in Grand Canyon
Pine, William; Healy, Brian; Smith, Emily Omana; Trammell, Melissa; Speas, Dave; Valdez, Rich; Yard, Mike; Walters, Carl; Ahrens, Rob; Vanhaverbeke, Randy; Stone, Dennis; Wilson, Wade
2013-01-01
We developed an individual-based population viability analysis model (females only) for evaluating risk to populations from catastrophic events or conservation and research actions. This model tracks attributes (size, weight, viability, etc.) for individual fish through time and then compiles this information to assess the extinction risk of the population across large numbers of simulation trials. Using a case history for the Little Colorado River population of Humpback Chub Gila cypha in Grand Canyon, Arizona, we assessed extinction risk and resiliency to a catastrophic event for this population and then assessed a series of conservation actions related to removing specific numbers of Humpback Chub at different sizes for conservation purposes, such as translocating individuals to establish other spawning populations or hatchery refuge development. Our results suggested that the Little Colorado River population is generally resilient to a single catastrophic event and also to removals of larvae and juveniles for conservation purposes, including translocations to establish new populations. Our results also suggested that translocation success is dependent on similar survival rates in receiving and donor streams and low emigration rates from recipient streams. In addition, translocating either large numbers of larvae or small numbers of large juveniles has generally an equal likelihood of successful population establishment at similar extinction risk levels to the Little Colorado River donor population. Our model created a transparent platform to consider extinction risk to populations from catastrophe or conservation actions and should prove useful to managers assessing these risks for endangered species such as Humpback Chub.
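The extinction-risk logic described above can be caricatured in a short count-based sketch (the study's actual model is individual-based and tracks attributes per fish); all parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def extinction_risk(n0=1000, years=100, trials=1000, survival=0.7, fecundity=0.5,
                    k=5000, catastrophe_p=0.01, catastrophe_kill=0.9, seed=0):
    """Fraction of simulation trials in which a (female-only) population goes extinct."""
    rng = np.random.default_rng(seed)
    extinct = 0
    for _ in range(trials):
        n = n0
        for _ in range(years):
            births = rng.binomial(n, fecundity * (1.0 - n / k))  # density-dependent recruitment
            survivors = rng.binomial(n, survival)
            n = min(survivors + births, k)                        # cap at carrying capacity
            if rng.random() < catastrophe_p:                      # rare catastrophic event
                n = int(n * (1.0 - catastrophe_kill))
            if n == 0:
                extinct += 1
                break
    return extinct / trials

print("extinction risk:", extinction_risk())
```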
Angela M. White; Elise F. Zipkin; Patricia N. Manley; Matthew D. Schlesinger
2013-01-01
Over a century of fire suppression activities have altered the structure and composition of mixed conifer forests throughout the western United States. In the absence of fire, fuels have accumulated in these forests causing concerns over the potential for catastrophic wildfires. Fuel reduction treatments are being used on federal and state lands to reduce the threat of...
Ensemble Simulation of the Atmospheric Radionuclides Discharged by the Fukushima Nuclear Accident
NASA Astrophysics Data System (ADS)
Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru
2013-04-01
Enormous amounts of radionuclides were discharged into the atmosphere by a nuclear accident at the Fukushima Daiichi nuclear power plant (FDNPP) after the earthquake and tsunami on 11 March 2011. The radionuclides were dispersed from the power plant and deposited mainly over eastern Japan and the North Pacific Ocean. Many numerical simulations of the radionuclide dispersion and deposition have been attempted since the accident. However, none of them were able to perfectly simulate the distribution of dose rates observed after the accident over eastern Japan. This was partly due to errors in the wind vectors and precipitation used in the numerical simulations; moreover, deterministic simulations cannot represent the probability distribution of the simulation results and errors. Therefore, an ensemble simulation of the atmospheric radionuclides was performed using the ensemble Kalman filter (EnKF) data assimilation system coupled with the Japan Meteorological Agency (JMA) non-hydrostatic mesoscale model (NHM); this mesoscale model has been used operationally for daily weather forecasts by JMA. Meteorological observations were provided to the EnKF data assimilation system from the JMA operational-weather-forecast dataset. Through this ensemble data assimilation, twenty members of the meteorological analysis over eastern Japan from 11 to 31 March 2011 were successfully obtained. Using these meteorological ensemble analysis members, the radionuclide behavior in the atmosphere, such as advection, convection, diffusion, dry deposition, and wet deposition, was simulated. This ensemble simulation provided multiple realizations of the radionuclide dispersion and distribution. Because a large ensemble deviation indicates low accuracy of the numerical simulation, probabilistic information is obtainable from the ensemble simulation results. For example, the uncertainty of precipitation triggered the uncertainty of wet deposition; the uncertainty of wet deposition triggered the uncertainty of atmospheric radionuclide amounts. The remaining radionuclides were then transported downwind; consequently, the uncertainty signal in the radionuclide amounts propagated downwind. This propagation was seen in the ensemble simulation by tracking areas of large deviation in radionuclide concentration and deposition. These statistics provide information useful for the probabilistic prediction of radionuclides.
Assessment of SFR Wire Wrap Simulation Uncertainties
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David
Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamic (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results for the 3-D pipe, the single pin THORS mesh, and the 7-pin bundle mesh, respectively.
NASA Astrophysics Data System (ADS)
Denissenkov, Pavel; Perdikakis, Georgios; Herwig, Falk; Schatz, Hendrik; Ritter, Christian; Pignatari, Marco; Jones, Samuel; Nikas, Stylianos; Spyrou, Artemis
2018-05-01
The first-peak s-process elements Rb, Sr, Y and Zr in the post-AGB star Sakurai's object (V4334 Sagittarii) have been proposed to be the result of i-process nucleosynthesis in a post-AGB very-late thermal pulse event. We estimate the nuclear physics uncertainties in the i-process model predictions to determine whether the remaining discrepancies with observations are significant and point to potential issues with the underlying astrophysical model. We find that the dominant source of nuclear physics uncertainty is the prediction of neutron capture rates on unstable neutron-rich nuclei, which can have uncertainties of more than a factor of 20 in the band of the i-process. We use a Monte Carlo variation of 52 neutron capture rates and a 1D multi-zone post-processing model for the i-process in Sakurai's object to determine the cumulative effect of these uncertainties on the final elemental abundance predictions. We find that the nuclear physics uncertainties are large and comparable to observational errors. Within these uncertainties the model predictions are consistent with observations. A correlation analysis of the results of our MC simulations reveals that the strongest impact on the predicted abundances of Rb, Sr, Y and Zr is made by the uncertainties in the (n, γ) reaction rates of 85Br, 86Br, 87Kr, 88Kr, 89Kr, 89Rb, 89Sr, and 92Sr. This conclusion is supported by a series of multi-zone simulations in which we increased and decreased to their maximum and minimum limits one or two reaction rates per run. We also show that simple and fast one-zone simulations should not be used instead of more realistic multi-zone stellar simulations for nuclear sensitivity and uncertainty studies of convective–reactive processes. Our findings apply more generally to any i-process site with similar neutron exposure, such as rapidly accreting white dwarfs with near-solar metallicities.
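A Monte Carlo variation of reaction rates of this kind amounts to sampling multiplicative factors for each rate within its uncertainty band and re-running the model for each sample. The sketch below assumes log-uniform factors within the factor-of-20 band mentioned above; `run_i_process` is a hypothetical wrapper for the multi-zone model, not part of any published code.

```python
import numpy as np

rng = np.random.default_rng(42)
N_RATES, N_SAMPLES = 52, 1000
UNC_FACTOR = 20.0   # illustrative factor-of-20 uncertainty band mentioned in the text

def sample_rate_factors(n_rates, unc_factor, size):
    """Log-uniform multiplication factors in [1/unc_factor, unc_factor] for each rate."""
    log_f = rng.uniform(-np.log(unc_factor), np.log(unc_factor), size=(size, n_rates))
    return np.exp(log_f)

factors = sample_rate_factors(N_RATES, UNC_FACTOR, N_SAMPLES)

# `run_i_process(factors_row)` would wrap one multi-zone model run and return predicted
# elemental abundances (hypothetical here):
# abundances = np.array([run_i_process(f) for f in factors])
# Correlating factors[:, j] with each predicted abundance then ranks which (n, gamma)
# rates drive the spread, as in the correlation analysis described in the abstract.
```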
Parametric uncertainties in global model simulations of black carbon column mass concentration
NASA Astrophysics Data System (ADS)
Pearce, Hana; Lee, Lindsay; Reddington, Carly; Carslaw, Ken; Mann, Graham
2016-04-01
Previous studies have deduced that the annual mean direct radiative forcing from black carbon (BC) aerosol may regionally be up to 5 W m⁻² larger than expected due to underestimation of global atmospheric BC absorption in models. We have identified the magnitude and important sources of parametric uncertainty in simulations of BC column mass concentration from a global aerosol microphysics model (GLOMAP-Mode). A variance-based uncertainty analysis of 28 parameters has been performed, based on statistical emulators trained on model output from GLOMAP-Mode. This is the largest number of uncertain model parameters to be considered in a BC uncertainty analysis to date and covers primary aerosol emissions, microphysical processes and structural parameters related to the aerosol size distribution. We will present several recommendations for further research to improve the fidelity of simulated BC. In brief, we find that the standard deviation around the simulated mean annual BC column mass concentration varies globally between 2.5 × 10⁻⁹ g cm⁻² in remote marine regions and 1.25 × 10⁻⁶ g cm⁻² near emission sources due to parameter uncertainty. Between 60 and 90% of the variance over source regions is due to uncertainty associated with primary BC emission fluxes, including biomass burning, fossil fuel and biofuel emissions. While the contributions to BC column uncertainty from microphysical processes, for example those related to dry and wet deposition, are increased over remote regions, we find that emissions still make an important contribution in these areas. It is likely, however, that the importance of structural model error, i.e. differences between models, is greater than parametric uncertainty. We have extended our analysis to emulate vertical BC profiles at several locations in the mid-Pacific Ocean and identify the parameters contributing to uncertainty in the vertical distribution of black carbon at these locations. We will present preliminary comparisons of emulated BC vertical profiles from the AeroCom multi-model ensemble and HIAPER Pole-to-Pole (HIPPO) observations.
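A variance-based analysis of this kind attributes the output variance to individual inputs, with a cheap emulator standing in for the full model. The sketch below shows a Saltelli-style first-order index estimate for an emulator whose inputs are scaled to [0, 1]; the toy emulator is only a stand-in for one trained on GLOMAP-Mode output, and is not the study's code.

```python
import numpy as np

def first_order_indices(emulator, n_params, n_mc=20000, seed=0):
    """Crude first-order variance fractions for an emulator f(X), inputs uniform on [0, 1]."""
    rng = np.random.default_rng(seed)
    a = rng.uniform(size=(n_mc, n_params))
    b = rng.uniform(size=(n_mc, n_params))
    fa, fb = emulator(a), emulator(b)
    total_var = np.var(np.concatenate([fa, fb]))
    indices = []
    for i in range(n_params):
        ab_i = b.copy()
        ab_i[:, i] = a[:, i]                       # shares only column i with sample `a`
        # Saltelli-style estimator of the first-order variance contribution of input i
        indices.append(np.mean(fa * (emulator(ab_i) - fb)) / total_var)
    return np.array(indices)

# Toy quadratic "emulator" standing in for one trained on BC column output
toy = lambda x: 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 2]
print(first_order_indices(toy, n_params=3))   # dominant share for the first input
```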
Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...
2014-12-31
Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.
Comparison between changes in flood hazard and risk in Spain using historical information
NASA Astrophysics Data System (ADS)
Llasat, Maria-Carmen; Mediero, Luis; Garrote, Luis; Gilabert, Joan
2015-04-01
Recently, the COST Action ES0901 "European procedures for flood frequency estimation (FloodFreq)" had as its objective "the comparison and evaluation of methods for flood frequency estimation under the various climatologic and geographic conditions found in Europe". It highlighted the improvement that regional analyses offer over at-site estimates in terms of the uncertainty of quantile estimates. In the case of Spain, a regional analysis was carried out at a national scale, which allows the flow threshold corresponding to a given return period to be identified from the observed flow series recorded at a gauging station. In addition, Mediero et al. (2014) studied the possible influence of non-stationarity on flood series for the period 1942-2009. In parallel, Barnolas and Llasat (2007), among others, collected documentary information on catastrophic flood events in Spain over the last centuries. Traditionally, the first approach ("top-down") identifies a flood as catastrophic when it exceeds the 500-year return period flood. However, the second one ("bottom-up") accounts for flood damages (Llasat et al., 2005). This study presents a comparison between both approaches, discussing the potential factors that can lead to discrepancies between them, as well as accounting for information about major changes experienced in the catchment that could lead to changes in flood hazard and risk.
Periodicity of extinction: A 1988 update
NASA Technical Reports Server (NTRS)
Sepkowski, J. John, Jr.
1988-01-01
The hypothesis that events of mass extinction recur periodically at approximately 26 my intervals is an empirical claim based on analysis of data from the fossil record. The hypothesis has become closely linked with catastrophism because several events in the periodic series are associated with evidence of extraterrestrial impacts, and terrestrial forcing mechanisms with long, periodic recurrences are not easily conceived. Astronomical mechanisms that have been hypothesized include undetected solar companions and solar oscillation about the galactic plane, which induce comet showers and result in impacts on Earth at regular intervals. Because these mechanisms are speculative, they have been the subject of considerable controversy, as has the hypothesis of periodicity of extinction. In response to criticisms and uncertainties, a database was developed on times of extinction of marine animal genera. A time series is given and analyzed with 49 sample points for the per-genus extinction rate from the Late Permian to the Recent. An unexpected pattern in the data is the uniformity of magnitude of many of the periodic extinction events. Observations suggest that the sequence of extinction events might be the result of two sets of mechanisms: a periodic forcing that normally induces only moderate amounts of extinction, and independent incidents or catastrophes that, when coincident with the periodic forcing, amplify its signal and produce major mass extinctions.
NASA Astrophysics Data System (ADS)
Croke, Jacky; Todd, Peter; Thompson, Chris; Watson, Fiona; Denham, Robert; Khanal, Giri
2013-02-01
Advances in remote sensing and digital terrain processing now allow for a sophisticated analysis of spatial and temporal changes in erosion and deposition. Digital elevation models (DEMs) can now be constructed and differenced to produce DEMs of Difference (DoD), which are used to assess net landscape change for morphological budgeting. To date this has been most effectively achieved in gravel-bed rivers over relatively small spatial scales. If the full potential of the technology is to be realised, additional studies are required at larger scales and across a wider range of geomorphic features. This study presents an assessment of the basin-scale spatial patterns of erosion, deposition, and net morphological change that resulted from a catastrophic flood event in the Lockyer Creek catchment of SE Queensland (SEQ) in January 2011. Multitemporal Light Detection and Ranging (LiDAR) DEMs were used to construct a DoD that was then combined with the one-dimensional hydraulic model HEC-RAS to delineate five major geomorphic landforms, including inner-channel area, within-channel benches, macrochannel banks, and floodplain. The LiDAR uncertainties were quantified and applied together with a probabilistic representation of uncertainty thresholded at a conservative 95% confidence interval. The elevation change distribution (ECD) for the 100-km² study area indicates a magnitude of elevation change spanning almost 10 m, but the mean elevation change of 0.04 m confirms that a large part of the landscape was characterised by relatively low magnitude changes over a large spatial area. Mean elevation changes varied by geomorphic feature and only two, the within-channel benches and macrochannel banks, were net erosional, with an estimated combined loss of 1,815,149 m³ of sediment. The floodplain was the zone of major net deposition but mean elevation changes approached the defined critical limit of uncertainty. Areal and volumetric ECDs for this extreme event provide a representative expression of the balance between erosion and deposition, and importantly sediment redistribution, which is extremely difficult to quantify using more traditional channel planform or cross-sectional surveys. The ability of LiDAR to make a rapid and accurate assessment of key geomorphic processes over large spatial scales contributes to our understanding of key processes and, as demonstrated here, to the assessment of major geomorphological hazards such as extreme flood events.
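The DoD thresholding step can be illustrated with a short sketch: difference the DEMs, propagate the per-cell LiDAR uncertainties, and retain only changes that exceed the 95% confidence level. The unit cell area and the assumption of independent errors in the two surveys are illustrative choices, not values taken from the study.

```python
import numpy as np

def dod_threshold(dem_new, dem_old, sigma_new, sigma_old, z=1.96, cell_area=1.0):
    """DEM of Difference with a minimum level of detection at ~95% confidence.
    dem_* are elevation grids; sigma_* are per-cell elevation uncertainties (same shape)."""
    dod = dem_new - dem_old
    sigma_dod = np.sqrt(sigma_new**2 + sigma_old**2)   # propagated, assuming independent errors
    significant = np.abs(dod) > z * sigma_dod
    dod_thr = np.where(significant, dod, np.nan)       # mask changes below detection
    erosion_volume = -np.sum(np.where(dod_thr < 0, dod_thr, 0.0)) * cell_area
    deposition_volume = np.sum(np.where(dod_thr > 0, dod_thr, 0.0)) * cell_area
    return dod_thr, erosion_volume, deposition_volume
```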
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advances have been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
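The PCA summary of calibration samples can be sketched as follows: given a set of plausible effective-area curves, keep the mean and a few leading principal components, then draw new plausible curves from that low-dimensional representation. The array shapes and the Gaussian draw are assumptions of this sketch, not the paper's exact prescription.

```python
import numpy as np

def effective_area_pca(arf_samples, n_components=3):
    """Summarize plausible effective-area curves (rows of `arf_samples`) by their mean
    and leading principal components, as a low-dimensional uncertainty model."""
    mean_arf = arf_samples.mean(axis=0)
    centered = arf_samples - mean_arf
    _, s, vt = np.linalg.svd(centered, full_matrices=False)   # SVD gives the PCs directly
    components = vt[:n_components]                            # shape (k, n_energy_bins)
    scales = s[:n_components] / np.sqrt(len(arf_samples) - 1) # std dev along each PC
    return mean_arf, components, scales

def draw_plausible_arf(mean_arf, components, scales, rng):
    """Draw one plausible effective-area curve from the PCA summary."""
    coeffs = rng.standard_normal(len(scales)) * scales
    return mean_arf + coeffs @ components
```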
NASA Astrophysics Data System (ADS)
Devendran, A. A.; Lakshmanan, G.
2014-11-01
Data quality for GIS processing and analysis is becoming an increasing concern due to the accelerated application of GIS technology for problem solving and decision making roles. Uncertainty in the geographic representation of the real world arises because these representations are incomplete. Identification of the sources of these uncertainties and the ways in which they operate in GIS-based representations becomes crucial in any spatial data representation and in geospatial analysis applied to any field. This paper reviews the literature on the various components of spatial data quality and the uncertainties inherent in them, with special focus on two fields of application: urban simulation and hydrological modelling. Urban growth is a complicated process involving the spatio-temporal changes of all socio-economic and physical components at different scales. The cellular automata (CA) model is one such simulation model; it randomly selects potential cells for urbanisation, and transition rules evaluate the properties of each cell and its neighbours. Uncertainty arising from CA modelling is assessed mainly using sensitivity analysis, including the Monte Carlo simulation method. Likewise, the importance of hydrological uncertainty analysis has been emphasized in recent years, and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures. The Soil and Water Assessment Tool (SWAT) is a continuous-time watershed model used to evaluate the impacts of land use management and climate on hydrology and water quality. Hydrological model uncertainties in SWAT are addressed primarily with the Generalized Likelihood Uncertainty Estimation (GLUE) method.
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Moradkhani, H.; Marshall, L. A.; Sharma, A.; Geenens, G.
2016-12-01
Effective combination of model simulations and observations through Data Assimilation (DA) depends heavily on uncertainty characterisation. Many traditional methods for quantifying model uncertainty in DA require some level of subjectivity (by way of tuning parameters or by assuming Gaussian statistics). Furthermore, the focus is typically on only estimating the first and second moments. We propose a data-driven methodology to estimate the full distributional form of model uncertainty, i.e. the transition density p(x_t | x_{t-1}). All sources of uncertainty associated with the model simulations are considered collectively, without needing to devise stochastic perturbations for individual components (such as model input, parameter and structural uncertainty). A training period is used to derive the distribution of errors in observed variables conditioned on hidden states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The theory behind the framework and case study applications are discussed in detail. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard perturbation approach.
Asymmetric Uncertainty Expression for High Gradient Aerodynamics
NASA Technical Reports Server (NTRS)
Pinier, Jeremy T
2012-01-01
When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase-uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
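One plausible way to disperse data within asymmetric bounds in a Monte Carlo analysis is a split-normal draw, sketched below. This is an illustrative model of the idea, not necessarily the paper's exact formulation, and the numbers in the example are invented.

```python
import numpy as np

def sample_asymmetric(nominal, sigma_minus, sigma_plus, size, rng=None):
    """Disperse a quantity within asymmetric uncertainty bounds using a split-normal:
    a half-normal with sigma_minus below the nominal value and sigma_plus above it."""
    if rng is None:
        rng = np.random.default_rng()
    # Choose the side in proportion to its width so the density is continuous at the nominal
    below = rng.random(size) < sigma_minus / (sigma_minus + sigma_plus)
    draws = np.abs(rng.standard_normal(size))
    return np.where(below, nominal - draws * sigma_minus, nominal + draws * sigma_plus)

# Example: a coefficient with a larger uncertainty toward higher values (invented numbers)
samples = sample_asymmetric(nominal=0.025, sigma_minus=0.001, sigma_plus=0.004, size=10000)
print(samples.mean(), samples.std())
```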
NASA Astrophysics Data System (ADS)
Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut
2016-11-01
We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.
Uncertainties in radiation effect predictions for the natural radiation environments of space.
McNulty, P J; Stassinopoulos, E G
1994-10-01
Future manned missions beyond low earth orbit require accurate predictions of the risk to astronauts and to critical systems from exposure to ionizing radiation. For low-level exposures, the hazards are dominated by rare single-event phenomena where individual cosmic-ray particles or spallation reactions result in potentially catastrophic changes in critical components. Examples might be a biological lesion leading to cancer in an astronaut or a memory upset leading to an undesired rocket firing. The risks of such events appear to depend on the amount of energy deposited within critical sensitive volumes of biological cells and microelectronic components. The critical environmental information needed to estimate the risks posed by the natural space environments, including solar flares, is the number of times more than a threshold amount of energy for an event will be deposited in the critical microvolumes. These predictions are complicated by uncertainties in the natural environments, particularly the composition of flares, and by the effects of shielding. Microdosimetric data for large numbers of orbits are needed to improve the environmental models and to test the transport codes used to predict event rates.
Source processes for the probabilistic assessment of tsunami hazards
Geist, Eric L.; Lynett, Patrick J.
2014-01-01
The importance of tsunami hazard assessment has increased in recent years as a result of catastrophic consequences from events such as the 2004 Indian Ocean and 2011 Japan tsunamis. In particular, probabilistic tsunami hazard assessment (PTHA) methods have been emphasized to include all possible ways a tsunami could be generated. Owing to the scarcity of tsunami observations, a computational approach is used to define the hazard. This approach includes all relevant sources that may cause a tsunami to impact a site and all quantifiable uncertainty. Although only earthquakes were initially considered for PTHA, recent efforts have also attempted to include landslide tsunami sources. Including these sources into PTHA is considerably more difficult because of a general lack of information on relating landslide area and volume to mean return period. The large variety of failure types and rheologies associated with submarine landslides translates to considerable uncertainty in determining the efficiency of tsunami generation. Resolution of these and several other outstanding problems are described that will further advance PTHA methodologies leading to a more accurate understanding of tsunami hazard.
Uncertainties in radiation effect predictions for the natural radiation environments of space
NASA Technical Reports Server (NTRS)
Mcnulty, P. J.; Stassinopoulos, E. G.
1994-01-01
Future manned missions beyond low earth orbit require accurate predictions of the risk to astronauts and to critical systems from exposure to ionizing radiation. For low-level exposures, the hazards are dominated by rare single-event phenomena where individual cosmic-ray particles or spallation reactions result in potentially catastrophic changes in critical components. Examples might be a biological lesion leading to cancer in an astronaut or a memory upset leading to an undesired rocket firing. The risks of such events appear to depend on the amount of energy deposited within critical sensitive volumes of biological cells and microelectronic components. The critical environmental information needed to estimate the risks posed by the natural space environments, including solar flares, is the number of times more than a threshold amount of energy for an event will be deposited in the critical microvolumes. These predictions are complicated by uncertainties in the natural environments, particularly the composition of flares, and by the effects of shielding. Microdosimetric data for large numbers of orbits are needed to improve the environmental models and to test the transport codes used to predict event rates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gisler, Galen R.; Weaver, R. P.; Mader, Charles L.
Kick-em Jenny, in the Eastern Caribbean, is a submerged volcanic cone that has erupted a dozen or more times since its discovery in 1939. The most likely hazard posed by this volcano is to shipping in the immediate vicinity (through volcanic missiles or loss-of-buoyancy), but it is of interest to estimate upper limits on tsunamis that might be produced by a catastrophic explosive eruption. To this end, we have performed two-dimensional simulations of such an event in a geometry resembling that of Kick-em Jenny with our SAGE adaptive mesh Eulerian multifluid compressible hydrocode. We use realistic equations of state for air, water, and basalt, and follow the event from the initial explosive eruption, through the generation of a transient water cavity and the propagation of waves away from the site. We find that even for extremely catastrophic explosive eruptions, tsunamis from Kick-em Jenny are unlikely to pose significant danger to nearby islands. For comparison, we have also performed simulations of explosive eruptions at the much larger shield volcano Vailuluu in the Samoan chain, where the greater energy available can produce a more impressive wave. In general, however, we conclude that explosive eruptions do not couple well to water waves. The waves that are produced from such events are turbulent and highly dissipative, and don't propagate well. This is consistent with what we have found previously in simulations of asteroid-impact generated tsunamis. Non-explosive events, however, such as landslides or gas hydrate releases, do couple well to waves, and our simulations of tsunamis generated by subaerial and sub-aqueous landslides demonstrate this.
Heritability of Pain Catastrophizing and Associations with Experimental Pain Outcomes: A Twin Study
Trost, Zina; Strachan, Eric; Sullivan, Michael; Vervoort, Tine; Avery, Ally R.; Afari, Niloofar
2014-01-01
The current study employed a twin paradigm to examine the genetic and environmental contributions to pain catastrophizing as well as the observed association between pain catastrophizing and cold pressor task (CPT) outcomes. Male and female monozygotic (n=206) and dizygotic twins (n=194) from the University of Washington Twin Registry completed a measure of pain catastrophizing and performed a CPT challenge. As expected, pain catastrophizing emerged as a significant predictor of several CPT outcomes, including cold pressor immersion tolerance, pain tolerance, and delayed pain rating. The heritability estimate for pain catastrophizing was found to be 37% with the remaining 63% of variance attributable to unique environmental influence. Additionally, the observed associations between pain catastrophizing and CPT outcomes were not found attributable to shared genetics or environmental exposure, suggesting a direct relationship between catastrophizing and experimental pain outcomes. This study is the first to examine the heritability of pain catastrophizing and potential processes by which pain catastrophizing is related to experimental pain response. PMID:25599234
NASA Astrophysics Data System (ADS)
Rana, Verinder S.
This thesis concerns simulations of inertial confinement fusion (ICF), which is carried out at a large-scale facility, the National Ignition Facility. The experiments have failed to reproduce design calculations, so uncertainty quantification of the calculations is an important asset. Uncertainties can be classified as aleatoric or epistemic; this thesis is concerned with aleatoric uncertainty quantification. Among the many uncertain aspects that affect the simulations, we have narrowed our study to a few sources of uncertainty. The first is the amount of pre-heating of the fuel by hot electrons. The second is the effect of algorithmic and physical transport diffusion on the hot spot thermodynamics. Physical transport mechanisms play an important role for the entire duration of the ICF capsule implosion, so modeling them correctly is vital. In addition, codes that simulate material mixing introduce numerically (algorithmically) generated transport across material interfaces, which adds another layer of uncertainty to the solution through the artificially added diffusion. The third source of uncertainty is physical model uncertainty. The fourth is a single localized surface perturbation (a divot), which creates a perturbation to the solution that can potentially enter the hot spot and diminish the thermonuclear environment. Jets of ablator material are hypothesized to enter the hot spot and cool the core, contributing to the observed reaction yields being lower than predicted. A plasma transport package, Transport for Inertial Confinement Fusion (TICF), has been implemented in the radiation hydrodynamics code FLASH from the University of Chicago. TICF has thermal, viscous, and mass diffusion models that span the entire ICF implosion regime. We introduced a Quantum Molecular Dynamics calibrated thermal conduction model due to Hu for thermal transport. Numerical approximation uncertainties are introduced by the choice of hydrodynamic solver for a particular flow. Solvers tend to be diffusive at material interfaces, and the Front Tracking (FT) algorithm, an already available software code in the form of an API, helps to ameliorate such effects. The FT algorithm has also been implemented in FLASH, and we use it to study the effect that divots can have on the hot spot properties.
NASA Astrophysics Data System (ADS)
Chen, X.; Huang, G.
2017-12-01
In recent years, distributed hydrological models have been widely used in storm water management, water resources protection, and related applications, so how to evaluate model uncertainty reasonably and efficiently has become an important topic. In this paper, the Soil and Water Assessment Tool (SWAT) model is constructed for China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth with the GLUE method. Focusing on the initial parameter range used in the GLUE method, the influence of different initial parameter ranges on model uncertainty is studied. Two sets of parameter ranges are chosen: the first (range 1) is recommended by SWAT-CUP, and the second (range 2) is calibrated by SUFI-2. The results show that, for the same number of simulations (10,000), the overall uncertainty obtained with range 2 is smaller than with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In calibration and validation, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results for range 2 are better, with NS and R2 slightly higher than for range 1. Therefore, it can be concluded that using the parameter range calibrated by SUFI-2 as the initial parameter range for GLUE is an effective way to capture and evaluate simulation uncertainty.
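A minimal GLUE sketch is shown below, assuming a hypothetical `run_model(parameters)` wrapper around SWAT that returns a simulated runoff series, with the Nash-Sutcliffe efficiency as the likelihood measure and a fixed behavioural threshold; the threshold value and sample size are illustrative, not the paper's settings.

```python
import numpy as np

def glue(run_model, observed, param_ranges, n_samples=10000, threshold=0.5, seed=0):
    """Minimal GLUE sketch: uniform sampling within the chosen initial parameter ranges,
    Nash-Sutcliffe efficiency (NS) as the likelihood measure, and retention of
    'behavioural' parameter sets with NS above the threshold."""
    rng = np.random.default_rng(seed)
    lows = np.array([r[0] for r in param_ranges])
    highs = np.array([r[1] for r in param_ranges])
    params = rng.uniform(lows, highs, size=(n_samples, len(param_ranges)))
    behavioural, likelihoods, sims = [], [], []
    for p in params:
        sim = run_model(p)
        ns = 1.0 - np.sum((observed - sim) ** 2) / np.sum((observed - observed.mean()) ** 2)
        if ns >= threshold:                    # keep only behavioural parameter sets
            behavioural.append(p)
            likelihoods.append(ns)
            sims.append(sim)
    sims = np.array(sims)
    # 2.5%/97.5% prediction bounds across the behavioural simulations
    # (P-factor and R-factor are then computed from these bounds and the observations)
    bounds = np.percentile(sims, [2.5, 97.5], axis=0)
    return np.array(behavioural), np.array(likelihoods), bounds
```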
Business Return in New Orleans: Decision Making Amid Post-Katrina Uncertainty
Lam, Nina S. N.; Pace, Kelley; Campanella, Richard; LeSage, James; Arenas, Helbert
2009-01-01
Background Empirical observations on how businesses respond after a major catastrophe are rare, especially for a catastrophe as great as Hurricane Katrina, which hit New Orleans, Louisiana on August 29, 2005. We analyzed repeated telephone surveys of New Orleans businesses conducted in December 2005, June 2006, and October 2007 to understand factors that influenced decisions to re-open amid post-disaster uncertainty. Methodology/Principal Findings Businesses in the group of professional, scientific, and technical services reopened the fastest in the near term, but differences in the rate of reopening for businesses stratified by type became indistinguishable in the longer term (around two years later). A reopening rate of 65% was found for all businesses by October 2007. Discriminant analysis showed significant differences in responses reflecting their attitudes about important factors between businesses that reopened and those that did not. Businesses that remained closed at the time of our third survey (two years after Katrina) ranked levee protection as the top concern immediately after Katrina, but damage to their premises and financing became major concerns in subsequent months reflected in the later surveys. For businesses that had opened (at the time of our third survey), infrastructure protection including levee, utility, and communications were the main concerns mentioned in surveys up to the third survey, when the issue of crime became their top concern. Conclusions/Significance These findings underscore the need to have public policy and emergency plans in place prior to the actual disaster, such as infrastructure protection, so that the policy can be applied in a timely manner before business decisions to return or close are made. Our survey results, which include responses from both open and closed businesses, overcome the “survivorship bias” problem and provide empirical observations that should be useful to improve micro-level spatial economic modeling of factors that influence business return decisions. PMID:19707547
Experimental and modeling uncertainties in the validation of lower hybrid current drive
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poli, F. M.; Bonoli, P. T.; Chilenski, M.
Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.
Experimental and modeling uncertainties in the validation of lower hybrid current drive
Poli, F. M.; Bonoli, P. T.; Chilenski, M.; ...
2016-07-28
Our work discusses sources of uncertainty in the validation of lower hybrid wave current drive simulations against experiments, by evolving self-consistently the magnetic equilibrium and the heating and current drive profiles, calculated with a combined toroidal ray tracing code and 3D Fokker–Planck solver. The simulations indicate a complex interplay of elements, where uncertainties in the input plasma parameters, in the models and in the transport solver combine and compensate each other, at times. It is concluded that ray-tracing calculations should include a realistic representation of the density and temperature in the region between the confined plasma and the wall, which is especially important in regimes where the LH waves are weakly damped and undergo multiple reflections from the plasma boundary. Uncertainties introduced in the processing of diagnostic data as well as uncertainties introduced by model approximations are assessed. We show that, by comparing the evolution of the plasma parameters in self-consistent simulations with available data, inconsistencies can be identified and limitations in the models or in the experimental data assessed.
Ensemble Bayesian forecasting system Part I: Theory and algorithms
NASA Astrophysics Data System (ADS)
Herr, Henry D.; Krzysztofowicz, Roman
2015-05-01
The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error, which makes it suitable for rational decision making under uncertainty.
USDA-ARS?s Scientific Manuscript database
Agricultural system models have become important tools in studying water and nitrogen (N) dynamics, as well as crop growth, under different management practices. Complexity in input parameters often leads to significant uncertainty when simulating dynamic processes such as nitrate leaching or crop y...
New NREL Method Reduces Uncertainty in Photovoltaic Module Calibrations
[Fragmentary web-page text; recoverable content: the method establishes calibration traceability to certified test laboratories, using a Spire flash simulator, the SOMS outdoor test bed, and the LACSS continuous simulator; the resulting Pmax uncertainty (k=2 coverage factor) is reported as the lowest of any accredited test laboratory.]
NASA Astrophysics Data System (ADS)
Constantine, P. G.; Emory, M.; Larsson, J.; Iaccarino, G.
2015-12-01
We present a computational analysis of the reactive flow in a hypersonic scramjet engine with focus on effects of uncertainties in the operating conditions. We employ a novel methodology based on active subspaces to characterize the effects of the input uncertainty on the scramjet performance. The active subspace identifies one-dimensional structure in the map from simulation inputs to quantity of interest that allows us to reparameterize the operating conditions; instead of seven physical parameters, we can use a single derived active variable. This dimension reduction enables otherwise infeasible uncertainty quantification, considering the simulation cost of roughly 9500 CPU-hours per run. For two values of the fuel injection rate, we use a total of 68 simulations to (i) identify the parameters that contribute the most to the variation in the output quantity of interest, (ii) estimate upper and lower bounds on the quantity of interest, (iii) classify sets of operating conditions as safe or unsafe corresponding to a threshold on the output quantity of interest, and (iv) estimate a cumulative distribution function for the quantity of interest.
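The active-subspace construction reduces to an eigendecomposition of the average outer product of gradients of the quantity of interest with respect to the normalized inputs. The sketch below assumes gradient samples are available (e.g., from adjoints or finite differences) and is a generic illustration of the technique, not the study's exact pipeline.

```python
import numpy as np

def active_subspace(grad_samples):
    """Estimate the active subspace from sampled gradients of the quantity of interest:
    eigendecomposition of the average outer product of the gradient vectors."""
    c = sum(np.outer(g, g) for g in grad_samples) / len(grad_samples)
    eigvals, eigvecs = np.linalg.eigh(c)
    order = np.argsort(eigvals)[::-1]          # largest eigenvalue first
    return eigvals[order], eigvecs[:, order]

# With a dominant first eigenvalue, a single active variable y = w1 . x replaces the
# seven normalized operating-condition parameters:
# eigvals, eigvecs = active_subspace(grads)    # `grads` is a list of gradient vectors
# y = inputs_normalized @ eigvecs[:, 0]
```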
Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS
Brown, C. S.; Zhang, Hongbin
2016-05-24
Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS) coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. Furthermore, the minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
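The correlation-based sensitivity measures mentioned above can be computed from the sampled inputs and a figure of merit as sketched below; the partial correlation is obtained by correlating residuals after regressing out the other inputs. This is a generic illustration, not the CASL toolkit itself.

```python
import numpy as np
from scipy import stats

def sensitivity_correlations(inputs, output):
    """Pearson, Spearman, and partial correlation of each input column with the output.
    `inputs` is an (n_runs, n_params) sample matrix; `output` is the figure of merit."""
    n, p = inputs.shape
    results = []
    for j in range(p):
        pearson = stats.pearsonr(inputs[:, j], output)[0]
        spearman = stats.spearmanr(inputs[:, j], output)[0]
        # Partial correlation: correlate residuals after regressing out the other inputs
        others = np.column_stack([np.ones(n), np.delete(inputs, j, axis=1)])
        rx = inputs[:, j] - others @ np.linalg.lstsq(others, inputs[:, j], rcond=None)[0]
        ry = output - others @ np.linalg.lstsq(others, output, rcond=None)[0]
        partial = stats.pearsonr(rx, ry)[0]
        results.append((pearson, spearman, partial))
    return results
```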
Farrance, Ian; Frenkel, Robert
2014-01-01
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional relationship and contribute to the combined standard uncertainty of the measurand. PMID:24659835
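The spreadsheet-style Monte Carlo propagation described in the abstract translates directly into a few lines of Python; the rows of a spreadsheet simply become array entries. The functional relationship, the input quantities, and the empirically derived 'constant' below are illustrative values chosen for the sketch, not from the article.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000  # number of Monte Carlo trials (a spreadsheet would use rows)

# Input quantities as mean +/- standard uncertainty (illustrative values).
a = rng.normal(loc=10.0, scale=0.2, size=n)    # measured quantity a
b = rng.normal(loc=2.0,  scale=0.05, size=n)   # measured quantity b
k = rng.normal(loc=1.32, scale=0.03, size=n)   # empirically derived 'constant' with its own uncertainty

y = k * a / b  # functional relationship for the measurand

print(f"y = {y.mean():.3f}, combined standard uncertainty u(y) = {y.std(ddof=1):.3f}")
print("95% coverage interval:", np.quantile(y, [0.025, 0.975]).round(3))
```

No partial derivatives are needed: the distribution of the output carries the combined uncertainty, including the contribution of the uncertain 'constant'.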
Towards quantifying uncertainty in predictions of Amazon 'dieback'.
Huntingford, Chris; Fisher, Rosie A; Mercado, Lina; Booth, Ben B B; Sitch, Stephen; Harris, Phil P; Cox, Peter M; Jones, Chris D; Betts, Richard A; Malhi, Yadvinder; Harris, Glen R; Collins, Mat; Moorcroft, Paul
2008-05-27
Simulations with the Hadley Centre general circulation model (HadCM3), including carbon cycle model and forced by a 'business-as-usual' emissions scenario, predict a rapid loss of Amazonian rainforest from the middle of this century onwards. The robustness of this projection to both uncertainty in physical climate drivers and the formulation of the land surface scheme is investigated. We analyse how the modelled vegetation cover in Amazonia responds to (i) uncertainty in the parameters specified in the atmosphere component of HadCM3 and their associated influence on predicted surface climate. We then enhance the land surface description and (ii) implement a multilayer canopy light interception model and compare with the simple 'big-leaf' approach used in the original simulations. Finally, (iii) we investigate the effect of changing the method of simulating vegetation dynamics from an area-based model (TRIFFID) to a more complex size- and age-structured approximation of an individual-based model (ecosystem demography). We find that the loss of Amazonian rainforest is robust across the climate uncertainty explored by perturbed physics simulations covering a wide range of global climate sensitivity. The introduction of the refined light interception model leads to an increase in simulated gross plant carbon uptake for the present day, but, with altered respiration, the net effect is a decrease in net primary productivity. However, this does not significantly affect the carbon loss from vegetation and soil as a consequence of future simulated depletion in soil moisture; the Amazon forest is still lost. The introduction of the more sophisticated dynamic vegetation model reduces but does not halt the rate of forest dieback. The potential for human-induced climate change to trigger the loss of Amazon rainforest appears robust within the context of the uncertainties explored in this paper. Some further uncertainties should be explored, particularly with respect to the representation of rooting depth.
A polynomial chaos approach to the analysis of vehicle dynamics under uncertainty
NASA Astrophysics Data System (ADS)
Kewlani, Gaurav; Crawford, Justin; Iagnemma, Karl
2012-05-01
The ability of ground vehicles to quickly and accurately analyse their dynamic response to a given input is critical to their safety and efficient autonomous operation. In field conditions, significant uncertainty is associated with terrain and/or vehicle parameter estimates, and this uncertainty must be considered in the analysis of vehicle motion dynamics. Here, polynomial chaos approaches that explicitly consider parametric uncertainty during modelling of vehicle dynamics are presented. They are shown to be computationally more efficient than the standard Monte Carlo scheme, and a comparison of experimental results with simulation results from ANVEL (a vehicle simulator) indicates that the method can be utilised for efficient and accurate prediction of vehicle motion in realistic scenarios.
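The efficiency argument for polynomial chaos can be illustrated with a one-parameter, non-intrusive example: fit a short probabilists' Hermite expansion to a handful of model evaluations and read the mean and variance off the coefficients, then compare against a much larger Monte Carlo run. The response function and sample sizes are illustrative, not taken from the paper.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(4)

def response(xi):
    """Toy vehicle response (e.g., peak lateral acceleration) as a function of
    one standardized uncertain parameter (e.g., a terrain property)."""
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Non-intrusive PCE: least-squares fit of probabilists' Hermite polynomials
# He_0..He_p over a small set of regression samples.
p = 5
xi_train = rng.standard_normal(200)
Phi = np.column_stack([He.hermeval(xi_train, [0] * i + [1]) for i in range(p + 1)])
coef, *_ = np.linalg.lstsq(Phi, response(xi_train), rcond=None)

# Statistics follow from orthogonality: E[He_n(Z)^2] = n! for standard normal Z.
pce_mean = coef[0]
pce_var = sum(coef[i] ** 2 * math.factorial(i) for i in range(1, p + 1))

# Reference: plain Monte Carlo with many more model evaluations.
xi_mc = rng.standard_normal(200_000)
y_mc = response(xi_mc)
print(f"PCE  mean={pce_mean:.4f} var={pce_var:.4f}  (200 model runs)")
print(f"MC   mean={y_mc.mean():.4f} var={y_mc.var():.4f}  (200000 model runs)")
```

The expansion reaches Monte Carlo-quality moments with orders of magnitude fewer model runs, which is the practical advantage claimed for vehicle dynamics under parametric uncertainty.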
Probabilistic simulation of the human factor in structural reliability
NASA Technical Reports Server (NTRS)
Shah, Ashwin R.; Chamis, Christos C.
1991-01-01
Many structural failures have been attributed to human factors in engineering design, analysis, maintenance, and fabrication processes. Every facet of the engineering process is heavily governed by human factors and the degree of uncertainty associated with them. Societal, physical, professional, psychological, and many other factors introduce uncertainties that significantly influence the reliability of human performance. Quantifying human factors and the associated uncertainties in structural reliability requires: (1) identification of the fundamental factors that influence human performance, and (2) models to describe the interaction of these factors. An approach is being developed to quantify the uncertainties associated with human performance. This approach consists of a multifactor model in conjunction with direct Monte Carlo simulation.
NASA Astrophysics Data System (ADS)
Swallow, B.; Rigby, M. L.; Rougier, J.; Manning, A.; Thomson, D.; Webster, H. N.; Lunt, M. F.; O'Doherty, S.
2016-12-01
In order to understand the underlying processes governing environmental and physical phenomena, a complex mathematical model is usually required. However, there is an inherent uncertainty related to the parameterisation of unresolved processes in these simulators. Here, we focus on the specific problem of accounting for uncertainty in parameter values in an atmospheric chemical transport model. Systematic errors introduced by failing to account for these uncertainties can have a large effect on resulting estimates of unknown quantities of interest. One approach that is increasingly used to address this issue is emulation, in which a large number of forward runs of the simulator are carried out in order to approximate the response of the output to changes in parameters. However, due to the complexity of some models, it is often infeasible to perform the large number of training runs usually required for full statistical emulators of the environmental processes. We therefore present a simplified model reduction method for approximating uncertainties in complex environmental simulators without the need for very large numbers of training runs. We illustrate the method through an application to the Met Office's atmospheric transport model NAME. We show how our parameter estimation framework can be incorporated into a hierarchical Bayesian inversion, and demonstrate the impact on estimates of UK methane emissions, using atmospheric mole fraction data. We conclude that accounting for uncertainties in the parameterisation of complex atmospheric models is vital if systematic errors are to be minimized and all relevant uncertainties accounted for. We also note that investigations of this nature can prove extremely useful in highlighting deficiencies in the simulator that might otherwise be missed.
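For orientation, the basic emulation step that the abstract contrasts with its reduced approach looks like the sketch below: a Gaussian-process surrogate is trained on a small design of expensive simulator runs and then queried cheaply thousands of times. The simulator, parameter ranges, and kernel settings are placeholders; this is not the NAME model or the authors' reduction method.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(5)

def expensive_simulator(theta):
    """Placeholder for a costly forward model with two uncertain parameters."""
    return np.sin(3 * theta[:, 0]) + 0.5 * theta[:, 1] ** 2

# A small design of training runs; the cost of such runs is exactly what
# motivates reduced alternatives to full statistical emulation.
theta_train = rng.uniform(0, 1, size=(25, 2))
y_train = expensive_simulator(theta_train)

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(theta_train, y_train)

# The emulator now stands in for the simulator inside an uncertainty analysis.
theta_new = rng.uniform(0, 1, size=(10_000, 2))
mean, std = gp.predict(theta_new, return_std=True)
print("emulator mean/std at first 3 test points:", mean[:3].round(3), std[:3].round(3))
```

The predictive standard deviation returned alongside the mean is what allows emulator error itself to be folded into a hierarchical Bayesian inversion.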
Effects of input uncertainty on cross-scale crop modeling
NASA Astrophysics Data System (ADS)
Waha, Katharina; Huth, Neil; Carberry, Peter
2014-05-01
The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy, as this, together with an adequate representation of plant physiology processes and the choice of model parameters, is a key factor for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input data, from very little to very detailed information, and compare the models' abilities to represent the spatial and temporal variability in crop yields. We display the uncertainty in crop yield simulations from different input data and crop models in Taylor diagrams, which are a graphical summary of the similarity between simulations and observations (Taylor, 2001). The observed spatial variability can be represented well by both models (R=0.6-0.8), but APSIM predicts higher spatial variability than LPJmL due to its sensitivity to soil parameters. Simulations with the same crop model, climate and sowing dates have similar statistics and therefore similar skill in reproducing the observed spatial variability. Soil data are less important for the skill of a crop model in reproducing the observed spatial variability. However, the uncertainty in simulated spatial variability from the two crop models is larger than that from input data settings, and APSIM is more sensitive to input data than LPJmL. Even with a detailed, point-scale crop model and detailed input data it is difficult to capture the complexity and diversity in maize cropping systems.
NASA Astrophysics Data System (ADS)
Panthou, Gérémy; Vrac, Mathieu; Drobinski, Philippe; Bastin, Sophie; Somot, Samuel; Li, Laurent
2015-04-01
As regularly stated by numerous authors, the Mediterranean climate is considered a major climate 'hot spot'. At least three reasons explain this statement. First, this region is known for being regularly affected by extreme hydro-meteorological events (heavy precipitation and flash floods during the autumn season; droughts and heat waves during spring and summer). Second, the vulnerability of populations to these extreme events is expected to increase during the 21st century (at least due to the projected population growth in this region). Finally, Global Circulation Models project that this regional climate will be highly sensitive to climate change. Moreover, global warming is expected to intensify the hydrological cycle and thus to increase the frequency of extreme hydro-meteorological events. In order to propose adaptation strategies, robust estimation of the future evolution of the Mediterranean climate and the associated extreme hydro-meteorological events (in terms of intensity/frequency) is of great relevance. However, these projections are characterized by large uncertainties. Many components of the simulation chain contribute to these large uncertainties: (i) uncertainties concerning the emission scenario; (ii) climate model simulations suffer from parametrization errors and uncertainties concerning the initial state of the climate; and (iii) additional uncertainties are introduced by the (dynamical or statistical) downscaling techniques and the impact model. Narrowing these uncertainties as far as possible is a major challenge of current climate research. One way to do so is to reduce the uncertainties associated with each component. In this study, we are interested in evaluating the potential improvement of: (i) coupled RCM simulations (with the Mediterranean Sea) in comparison with atmosphere-only (stand-alone) RCM simulations and (ii) RCM simulations at a finer resolution in comparison with a coarser resolution. For that, three different RCMs (WRF, ALADIN, LMDZ4) were run, forced by ERA-Interim reanalyses, within the MED-CORDEX experiment. For each RCM, different versions (coupled/stand-alone, high/low resolution) were produced. A large set of scores was developed and applied in order to evaluate the performance of these different RCM simulations. These scores were applied to three variables (daily precipitation amount, mean daily air temperature and dry spell lengths). Particular attention was given to the RCMs' capability to reproduce the seasonal and spatial patterns of extreme statistics. Results show that the differences between coupled and stand-alone RCMs are localized very near the Mediterranean Sea and that the model resolution has only a slight impact on the scores obtained. Overall, the main differences between the RCM simulations come from the RCM used. Keywords: Mediterranean climate, extreme hydro-meteorological events, RCM simulations, evaluation of climate simulations
NASA Astrophysics Data System (ADS)
Cronkite-Ratcliff, C.; Phelps, G. A.; Boucher, A.
2011-12-01
In many geologic settings, the pathways of groundwater flow are controlled by geologic heterogeneities which have complex geometries. Models of these geologic heterogeneities, and consequently, their effects on the simulated pathways of groundwater flow, are characterized by uncertainty. Multiple-point geostatistics, which uses a training image to represent complex geometric descriptions of geologic heterogeneity, provides a stochastic approach to the analysis of geologic uncertainty. Incorporating multiple-point geostatistics into numerical models provides a way to extend this analysis to the effects of geologic uncertainty on the results of flow simulations. We present two case studies to demonstrate the application of multiple-point geostatistics to numerical flow simulation in complex geologic settings with both static and dynamic conditioning data. Both cases involve the development of a training image from a complex geometric description of the geologic environment. Geologic heterogeneity is modeled stochastically by generating multiple equally-probable realizations, all consistent with the training image. Numerical flow simulation for each stochastic realization provides the basis for analyzing the effects of geologic uncertainty on simulated hydraulic response. The first case study is a hypothetical geologic scenario developed using data from the alluvial deposits in Yucca Flat, Nevada. The SNESIM algorithm is used to stochastically model geologic heterogeneity conditioned to the mapped surface geology as well as vertical drill-hole data. Numerical simulation of groundwater flow and contaminant transport through geologic models produces a distribution of hydraulic responses and contaminant concentration results. From this distribution of results, the probability of exceeding a given contaminant concentration threshold can be used as an indicator of uncertainty about the location of the contaminant plume boundary. The second case study considers a characteristic lava-flow aquifer system in Pahute Mesa, Nevada. A 3D training image is developed by using object-based simulation of parametric shapes to represent the key morphologic features of rhyolite lava flows embedded within ash-flow tuffs. In addition to vertical drill-hole data, transient pressure head data from aquifer tests can be used to constrain the stochastic model outcomes. The use of both static and dynamic conditioning data allows the identification of potential geologic structures that control hydraulic response. These case studies demonstrate the flexibility of the multiple-point geostatistics approach for considering multiple types of data and for developing sophisticated models of geologic heterogeneities that can be incorporated into numerical flow simulations.
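The post-processing step described above, turning an ensemble of equally probable flow/transport results into a map of exceedance probabilities, can be sketched in a few lines. The concentration fields below are synthetic stand-ins; a real workflow would generate SNESIM realizations and run the groundwater simulator on each before this step.

```python
import numpy as np

rng = np.random.default_rng(6)
n_realizations, nx, ny = 200, 50, 50

# Stand-ins for contaminant concentration fields from flow/transport runs on
# equally probable geologic realizations.
base = np.exp(-(((np.arange(nx)[:, None] - 20) ** 2 +
                 (np.arange(ny)[None, :] - 25) ** 2) / 150.0))
realizations = np.array([
    base * rng.lognormal(sigma=0.5) + rng.normal(scale=0.01, size=(nx, ny))
    for _ in range(n_realizations)
])

threshold = 0.2
exceedance_prob = (realizations > threshold).mean(axis=0)  # cell-wise probability

# Cells with probability near 0.5 mark the most uncertain part of the plume boundary.
uncertain_cells = np.argwhere(np.abs(exceedance_prob - 0.5) < 0.1)
print("cells with a highly uncertain plume boundary:", len(uncertain_cells))
```

Mapping where the exceedance probability is neither near 0 nor near 1 is exactly the indicator of uncertainty about the plume-boundary location mentioned in the first case study.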
NASA Astrophysics Data System (ADS)
Sanderson, B. M.
2017-12-01
The CMIP ensembles represent the most comprehensive source of information available to decision-makers for climate adaptation, yet it is clear that there are fundamental limitations in our ability to treat the ensemble as an unbiased sample of possible future climate trajectories. There is considerable evidence that models are not independent, and increasing complexity and resolution combined with computational constraints prevent a thorough exploration of parametric uncertainty or internal variability. Although more data than ever is available for calibration, the optimization of each model is influenced by institutional priorities, historical precedent and available resources. The resulting ensemble thus represents a miscellany of climate simulators which defy traditional statistical interpretation. Models are in some cases interdependent, but are sufficiently complex that the degree of interdependency is conditional on the application. Configurations have been updated using available observations to some degree, but not in a consistent or easily identifiable fashion. This means that the ensemble cannot be viewed as a true posterior distribution updated by available data, but nor can observational data alone be used to assess individual model likelihood. We assess recent literature for combining projections from an imperfect ensemble of climate simulators. Beginning with our published methodology for addressing model interdependency and skill in the weighting scheme for the 4th US National Climate Assessment, we consider strategies for incorporating process-based constraints on future response, perturbed parameter experiments and multi-model output into an integrated framework. We focus on a number of guiding questions: Is the traditional framework of confidence in projections inferred from model agreement leading to biased or misleading conclusions? Can the benefits of upweighting skillful models be reconciled with the increased risk of truth lying outside the weighted ensemble distribution? If CMIP is an ensemble of partially informed best-guesses, can we infer anything about the parent distribution of all possible models of the climate system (and if not, are we implicitly under-representing the risk of a climate catastrophe outside of the envelope of CMIP simulations)?
Jiang, Dan; Hao, Guozhu; Huang, Liwen; Zhang, Dan
2016-01-01
A water traffic system is a huge, nonlinear, complex system, and its stability is affected by various factors. Water traffic accidents can be considered to be a kind of mutation of a water traffic system caused by the coupling of multiple navigational environment factors. In this study, the catastrophe theory, principal component analysis (PCA), and multivariate statistics are integrated to establish a situation recognition model for a navigational environment with the aim of performing a quantitative analysis of the situation of this environment via the extraction and classification of its key influencing factors; in this model, the natural environment and traffic environment are considered to be two control variables. The Three Gorges Reservoir area of the Yangtze River is considered as an example, and six critical factors, i.e., the visibility, wind, current velocity, route intersection, channel dimension, and traffic flow, are classified into two principal components: the natural environment and traffic environment. These two components are assumed to have the greatest influence on the navigation risk. Then, the cusp catastrophe model is employed to identify the safety situation of the regional navigational environment in the Three Gorges Reservoir area. The simulation results indicate that the situation of the navigational environment of this area is gradually worsening from downstream to upstream.
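As background to the cusp catastrophe model invoked above, the standard cusp potential and its bifurcation set can be sketched directly; the two control variables here simply stand in for the natural-environment and traffic-environment components, with illustrative values rather than the paper's fitted ones.

```python
import numpy as np

def equilibria(a, b):
    """Real equilibrium states of the cusp potential V(x) = x**4/4 + a*x**2/2 + b*x,
    i.e. the real roots of dV/dx = x**3 + a*x + b = 0."""
    roots = np.roots([1.0, 0.0, a, b])
    return np.sort(roots[np.abs(roots.imag) < 1e-9].real)

def bifurcation_indicator(a, b):
    """Inside the cusp region (4*a**3 + 27*b**2 < 0) the system is bistable, so a
    small change in the controls can trigger a sudden jump (a 'catastrophe')."""
    return 4 * a**3 + 27 * b**2

# Two control variables standing in for the natural- and traffic-environment scores.
for a, b in [(1.0, 0.2), (-1.0, 0.1), (-1.0, 0.5)]:
    flag = "bistable" if bifurcation_indicator(a, b) < 0 else "single state"
    print(f"a={a:+.1f} b={b:+.1f} -> {flag}, equilibria {equilibria(a, b).round(3)}")
```

Classifying points of the control plane as inside or outside the cusp region is the mechanism by which such a model labels a navigational-environment situation as prone to sudden transitions.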
Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen
2017-08-23
To enhance the realism of Connected and Autonomous Vehicles (CAVs) kinematic simulation scenarios and to guarantee the accuracy and reliability of the verification, a four-layer CAV kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate vehicle behaviours such as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles while they are moving towards the target position. Priorities for individual vehicle control are authorized for the different layers. Operation mechanisms of CAV uncertainties, defined in this paper as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. A comparison of simulated and theoretical vehicle delay is analysed to prove the validity and credibility of the platform. A rear-end collision avoidance scenario is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified in the simulation platform to show the platform's support for CAV kinematic simulation and verification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Bhat, Kabekode Ghanasham
2017-07-18
We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.
A Reliability Estimation in Modeling Watershed Runoff With Uncertainties
NASA Astrophysics Data System (ADS)
Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.
1990-10-01
The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.
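The first-order second-moment (FOSM) and Monte Carlo techniques named in this abstract can be contrasted with a toy example: propagate two uncertain inputs through a surrogate peak-discharge function analytically to first order, then check against sampling. The discharge function, parameter values, and threshold below are illustrative, not HEC-1 or the Illinois watershed.

```python
import numpy as np

rng = np.random.default_rng(7)

def peak_discharge(cn, rain):
    """Toy surrogate for a watershed model's peak discharge as a function of a
    curve-number-like parameter and a rainfall depth (illustrative only)."""
    return 0.05 * cn**1.2 * rain**1.5

# Uncertain inputs: mean and standard deviation (assumed independent).
mu = np.array([75.0, 80.0])    # [curve number, rainfall in mm]
sd = np.array([8.0, 12.0])

# FOSM: first-order Taylor expansion around the means (finite-difference gradient).
eps = 1e-4
g0 = peak_discharge(*mu)
grad = np.array([
    (peak_discharge(mu[0] + eps, mu[1]) - g0) / eps,
    (peak_discharge(mu[0], mu[1] + eps) - g0) / eps,
])
fosm_mean, fosm_std = g0, np.sqrt(np.sum((grad * sd) ** 2))

# Monte Carlo reference.
samples = peak_discharge(rng.normal(mu[0], sd[0], 100_000),
                         rng.normal(mu[1], sd[1], 100_000))
print(f"FOSM : mean={fosm_mean:.1f} std={fosm_std:.1f}")
print(f"MC   : mean={samples.mean():.1f} std={samples.std():.1f}")
print("P(peak discharge > 8000) by MC:", (samples > 8000).mean())
```

The exceedance probability at the end is the kind of "peak discharge probability" statement the abstract refers to, expressed from the distribution of the estimated hydrologic variable.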
Coronal Flux Rope Catastrophe Associated With Internal Energy Release
NASA Astrophysics Data System (ADS)
Zhuang, Bin; Hu, Youqiu; Wang, Yuming; Zhang, Quanhao; Liu, Rui; Gou, Tingyu; Shen, Chenglong
2018-04-01
Magnetic energy has been the predominant focus of previous catastrophe studies, since it is believed to be the main energy supplier for solar eruptions. However, the contribution of other types of energy during the catastrophe cannot be neglected. This paper studies the catastrophe of a coronal flux rope system in the solar wind background, with emphasis on the transformation of different types of energy during the catastrophe. The coronal flux rope is characterized by its axial and poloidal magnetic fluxes and total mass. It is shown that a catastrophe can be triggered not only by an increase but also by a decrease of the axial magnetic flux. Moreover, the internal energy of the rope is found to be released during the catastrophe so as to provide energy for the upward eruption of the flux rope. As far as the magnetic energy is concerned, it provides only part of the energy release, or even increases during the catastrophe, so the internal energy may act as the dominant or even the sole energy supplier during the catastrophe.
Dakota Uncertainty Quantification Methods Applied to the CFD code Nek5000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delchini, Marc-Olivier; Popov, Emilian L.; Pointer, William David
This report presents the state of advancement of a Nuclear Energy Advanced Modeling and Simulation (NEAMS) project to characterize the uncertainty of the computational fluid dynamics (CFD) code Nek5000 using the Dakota package for flows encountered in the nuclear engineering industry. Nek5000 is a high-order spectral element CFD code developed at Argonne National Laboratory for high-resolution spectral-filtered large eddy simulations (LESs) and unsteady Reynolds-averaged Navier-Stokes (URANS) simulations.
Phipps, Eric T.; D'Elia, Marta; Edwards, Harold C.; ...
2017-04-18
In this study, quantifying simulation uncertainties is a critical component of rigorous predictive simulation. A key component of this is forward propagation of uncertainties in simulation input data to output quantities of interest. Typical approaches involve repeated sampling of the simulation over the uncertain input data, and can require numerous samples when accurately propagating uncertainties from large numbers of sources. Often simulation processes from sample to sample are similar and much of the data generated from each sample evaluation could be reused. We explore a new method for implementing sampling methods that simultaneously propagates groups of samples together in an embedded fashion, which we call embedded ensemble propagation. We show how this approach takes advantage of properties of modern computer architectures to improve performance by enabling reuse between samples, reducing memory bandwidth requirements, improving memory access patterns, improving opportunities for fine-grained parallelization, and reducing communication costs. We describe a software technique for implementing embedded ensemble propagation based on the use of C++ templates and describe its integration with various scientific computing libraries within Trilinos. We demonstrate improved performance, portability and scalability for the approach applied to the simulation of partial differential equations on a variety of CPU, GPU, and accelerator architectures, including up to 131,072 cores on a Cray XK7 (Titan).
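The actual implementation is built on C++ templates within Trilinos; the sketch below is only a NumPy analogue of the underlying idea, advancing a whole block of samples through a toy explicit heat-equation solver at once so that stencil work and memory traffic are shared, and comparing the result with a one-sample-at-a-time loop. All names and parameters are illustrative.

```python
import numpy as np

def heat_step(u, kappa, dt=2e-5, dx=1e-2):
    """One explicit finite-difference step of u_t = kappa * u_xx (periodic BCs).
    `u` may carry a trailing ensemble axis, so a block of samples shares the
    same stencil work and memory traffic (the embedded-ensemble idea)."""
    lap = (np.roll(u, -1, axis=0) - 2 * u + np.roll(u, 1, axis=0)) / dx**2
    return u + dt * kappa * lap

n_cells, n_samples, n_steps = 200, 32, 500
x = np.linspace(0, 1, n_cells)
u0 = np.sin(np.pi * x)

rng = np.random.default_rng(8)
kappa = rng.uniform(0.5, 1.5, size=n_samples)  # uncertain diffusivity per sample

# Baseline: one sample at a time.
u_loop = np.empty((n_cells, n_samples))
for j in range(n_samples):
    u = u0.copy()
    for _ in range(n_steps):
        u = heat_step(u, kappa[j])
    u_loop[:, j] = u

# Embedded ensemble: all samples advanced together as one array.
u_ens = np.repeat(u0[:, None], n_samples, axis=1)
for _ in range(n_steps):
    u_ens = heat_step(u_ens, kappa)  # kappa broadcasts across the ensemble axis

print("max difference between the two approaches:", np.abs(u_ens - u_loop).max())
```

The two loops produce the same answers; the performance benefit of the embedded form comes from vectorizing over the ensemble axis, which is the same reuse the C++ template approach exploits at a much lower level.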
The Hungtsaiping landslide:A kinematic model based on morphology
NASA Astrophysics Data System (ADS)
Huang, W.-K.; Chu, H.-K.; Lo, C.-M.; Lin, M.-L.
2012-04-01
A large and deep-seated landslide at Hungtsaiping was triggered by the magnitude 7.3 1999 Chi-Chi earthquake. Extensive site investigations of the landslide were conducted, including field reconnaissance, geophysical exploration, borehole logs, and laboratory experiments. Thick colluvium was found around the landslide area, indicating the occurrence of a large ancient landslide. This study presents the catastrophic landslide event which occurred during the Chi-Chi earthquake. The mechanism of the 1999 landslide, which cannot be revealed by the underground exploration data alone, is clarified. This research includes investigations of the landslide kinematic process and the deposition geometry. A 3D discrete element method program, PFC3D, was used to model the kinematic process that led to the landslide. The proposed procedure enables a rational and efficient way to simulate the landslide dynamic process. Keywords: Hungtsaiping catastrophic landslide, kinematic process, deposition geometry, discrete element method
Dynamic impacts of a catastrophic production event: the foot-and-mouth disease case.
Cordier, Alexandre; Gohin, Jean; Krebs, Stephane; Rault, Arnaud
2013-03-01
In foot-and-mouth disease (FMD) free countries, the occurrence of an FMD outbreak is a rare event with potentially large economic losses. We explore the dynamic effects of an FMD outbreak on market variables and economic surplus taking into account the largely neglected issue of farm bankruptcy. Simulations are performed on a stylized agricultural economy, which is a net exporter before the outbreak. We find complex dynamic market effects when the farm credit market suffers from information imperfections leading to farm closure. Welfare effects are also dramatically altered. Domestic consumers may lose in the long run from an FMD outbreak because domestic supply contracts. On the other hand, farmers able to resist this event may ultimately gain. Our analysis also shows that these effects are not monotone, making any efficient policy response to this catastrophic event quite challenging. © 2012 Society for Risk Analysis.
Catastrophic depolymerization of microtubules driven by subunit shape change
Bollinger, Jonathan A.; Stevens, Mark J.
2018-01-17
We report that microtubules exhibit a dynamic instability between growth and catastrophic depolymerization. GTP-tubulin (αβ-dimer bound to GTP) self-assembles, but dephosphorylation of GTP- to GDP-tubulin within the tubule results in destabilization. While the mechanical basis for destabilization is not fully understood, one hypothesis is that dephosphorylation causes tubulin to change shape, frustrating bonds and generating stress. To test this idea, we perform molecular dynamics simulations of microtubules built from coarse-grained models of tubulin, incorporating a small compression of α-subunits associated with dephosphorylation in experiments. We find that this shape change induces depolymerization of otherwise stable systems via the unpeeling "ram's horns" characteristic of microtubules. Depolymerization can be averted by caps with uncompressed α-subunits, i.e., GTP-rich end regions. Thus, the shape change is sufficient to yield microtubule behavior.
Simon, Steven L; Hoffman, F Owen; Hofer, Eduard
2015-01-01
Retrospective dose estimation, particularly dose reconstruction that supports epidemiological investigations of health risk, relies on various strategies that include models of physical processes and exposure conditions with detail ranging from simple to complex. Quantification of dose uncertainty is an essential component of assessments for health risk studies since, as is well understood, it is impossible to retrospectively determine the true dose for each person. To address uncertainty in dose estimation, numerical simulation tools have become commonplace, and there is now an increased understanding of what is required of models used to estimate cohort doses (in the absence of direct measurement) to evaluate dose response. It now appears that for dose-response algorithms to derive the best, unbiased estimate of health risk, we need to understand the type, magnitude and interrelationships of the uncertainties of model assumptions, parameters and input data used in the associated dose estimation models. Heretofore, uncertainty analysis of dose estimates did not always properly distinguish between categories of errors, e.g., uncertainty that is specific to each subject (i.e., unshared error) and uncertainty of doses arising from a lack of understanding and knowledge about parameter values that are shared, to varying degrees, by subsets of the cohort. While mathematical propagation of errors by Monte Carlo simulation methods has been used for years to estimate the uncertainty of an individual subject's dose, it was almost always conducted without consideration of dependencies between subjects. In retrospect, these types of simple analyses are not suitable for studies with complex dose models, particularly when important input data are missing or otherwise not available. The dose estimation strategy presented here is a simulation method that corrects the previous deficiencies of analytical or simple Monte Carlo error propagation methods and is termed, due to its capability to maintain separation between shared and unshared errors, the two-dimensional Monte Carlo (2DMC) procedure. Simply put, the 2DMC method simulates alternative, possibly true, sets (or vectors) of doses for an entire cohort rather than a single set that emerges when each individual's dose is estimated independently from other subjects. Moreover, estimated doses within each simulated vector maintain proper inter-relationships, such that the estimated doses for members of a cohort subgroup that share common lifestyle attributes and sources of uncertainty are properly correlated. The 2DMC procedure simulates inter-individual variability of possibly true doses within each dose vector and captures the influence of uncertainty in the values of dosimetric parameters across multiple realizations of possibly true vectors of cohort doses. The primary characteristic of the 2DMC approach, as well as its strength, is the proper separation between uncertainties shared by members of the entire cohort or members of defined cohort subsets, and uncertainties that are individual-specific and therefore unshared.
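The shared/unshared separation at the heart of the 2DMC procedure reduces to a nested sampling loop: the outer loop draws cohort-level (shared) parameters once per realization, the inner step draws subject-specific (unshared) errors, and each outer iteration yields one internally consistent, possibly true dose vector. The dose model and distributions below are toy placeholders, not the article's dosimetry.

```python
import numpy as np

rng = np.random.default_rng(9)

n_subjects, n_vectors = 1_000, 500
intake = rng.lognormal(mean=0.0, sigma=0.4, size=n_subjects)  # subject-specific intake proxy

dose_vectors = np.empty((n_vectors, n_subjects))
for v in range(n_vectors):
    # Shared (cohort-wide) uncertainty: one draw per realization, e.g. an
    # uncertain dose-conversion factor applied to every subject.
    shared_factor = rng.lognormal(mean=np.log(2.0), sigma=0.3)

    # Unshared (subject-specific) uncertainty: independent draw per subject.
    unshared = rng.lognormal(mean=0.0, sigma=0.2, size=n_subjects)

    dose_vectors[v] = shared_factor * intake * unshared  # one possibly-true cohort dose vector

# Each row is internally consistent: subjects sharing the factor move up or down together.
subject_medians = np.median(dose_vectors, axis=0)       # per-subject central dose
cohort_mean_spread = dose_vectors.mean(axis=1)          # spread across vectors reflects shared error
print("subject 0 median dose:", subject_medians[0].round(2))
print("90% interval of the cohort mean dose:",
      np.quantile(cohort_mean_spread, [0.05, 0.95]).round(2))
```

Collapsing the outer loop (i.e., redrawing the shared factor per subject) would reproduce the older, deficient analysis the abstract criticizes, in which between-subject dependencies are lost.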
AGR-1 Thermocouple Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeff Einerson
2012-05-01
This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, the AGR-1 test configuration and test procedure, an overview of AGR-1 measured data, and an overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in presentation of AGR-1 measured data (Chapter 2) and interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.
The western arctic linkage experiment (WALE): overview and synthesis
A.D. McGuire; J. Walsh; J.S. Kimball; J.S. Clein; S.E. Euskirchen; S. Drobot; U.C. Herzfeld; J. Maslanik; R.B. Lammers; M.A. Rawlins; C.J. Vorosmarty; T.S. Rupp; W. Wu; M. Calef
2008-01-01
The primary goal of the Western Arctic Linkage Experiment (WALE) was to better understand uncertainties of simulated hydrologic and ecosystem dynamics of the western Arctic in the context of 1) uncertainties in the data available to drive the models and 2) different approaches to simulating regional hydrology and ecosystem dynamics. Analyses of datasets on climate...
NASA Astrophysics Data System (ADS)
Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.
2012-01-01
Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of carbon and water cycling in the future. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of different complexity to predict leaf bud-burst. The evaluation of different phenological models indicated support for spring warming models with photoperiod limitations and, though to a lesser extent, for chilling models based on the alternating model structure. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high CO2 emissions vs. low CO2 emissions scenario). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century⁻¹ for scenario B1 and 4.5 day century⁻¹ for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century⁻¹ in the simulated trends). The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied somewhat among models (±7.7 day century⁻¹ for A1fi, ±3.6 day century⁻¹ for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per degree of warming) varied between 2.2 day °C⁻¹ and 5.2 day °C⁻¹ depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated carbon and water fluxes using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of the seasonality of processes, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and evapotranspiration (ET) of 9.6% and 2.9% respectively. A sensitivity analysis shows that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% for annual GPP and about ±2.0% for ET. For phenology models, differences among future climate scenarios represent the largest source of uncertainty, followed by uncertainties related to model structure, and finally, uncertainties related to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
NASA Astrophysics Data System (ADS)
Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C. J.
2016-11-01
Despite their well-known limitations, Reynolds-Averaged Navier-Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has potential implications in many fields in which the governing equations are well understood but the model uncertainty comes from unresolved physical processes.
Origins of the rings of Uranus and Neptune. I - Statistics of satellite disruptions
NASA Technical Reports Server (NTRS)
Colwell, Joshua E.; Esposito, Larry W.
1992-01-01
The origin of the rings of Uranus and Neptune is considered by performing two types of stochastic simulations of the collisional history of small moons: Monte Carlo simulations in which only the largest surviving fragment from each disruption is followed, and a Markov chain approach which makes it possible to follow the size distribution from each disruption to arbitrarily small sizes. Results indicate that the populations of small satellites around Uranus and Neptune have evolved through catastrophic fragmentation since the end of planet and satellite formation 3 to 4 billion years ago.
Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.
2017-12-01
Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data. Performance of the statistical model is illustrated through comparisons of generated realizations with the `true' numerical simulations. Finally, we demonstrate how these realizations can be used to determine statistically optimal locations for further interrogation of the subsurface.
Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas
NASA Astrophysics Data System (ADS)
Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.
In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainties are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent, taken from airborne pictures), the uncertainty of the essential input data set (digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum resolution of a DEM required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainties. Especially socio-economic information and monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
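The Monte Carlo DEM-perturbation step that yields flooding probabilities can be illustrated with a minimal sketch: perturb a synthetic DEM with random elevation errors and re-flag cells below a fixed water-level plane. The DEM, error magnitude, and the naive flood criterion (no connectivity to the river) are placeholders; FLOODMAP's actual logic and the aggregation comparison are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(10)
nx, ny = 100, 100

# Synthetic DEM (metres) sloping gently away from a river along the left edge.
x = np.arange(nx)
dem = 50.0 + 0.05 * x[None, :] + rng.normal(scale=0.2, size=(ny, nx))

water_level = 51.0   # water-level plain for the flood scenario
sigma_dem = 0.5      # assumed standard deviation of DEM elevation error (m)
n_mc = 300

flood_count = np.zeros((ny, nx))
for _ in range(n_mc):
    perturbed = dem + rng.normal(scale=sigma_dem, size=dem.shape)
    flood_count += perturbed < water_level   # naive flood criterion, ignoring connectivity

flood_probability = flood_count / n_mc
print("cells flooded with probability > 0.8:", (flood_probability > 0.8).sum())
print("cells with uncertain flooding (probability 0.2-0.8):",
      ((flood_probability > 0.2) & (flood_probability < 0.8)).sum())
```

Comparing such probability maps for DEMs produced by different aggregation procedures is the kind of sensitivity check the study performed across the three cities.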
Stenemo, Fredrik; Jarvis, Nicholas
2007-09-01
A simulation tool for site-specific vulnerability assessments of pesticide leaching to groundwater was developed, based on the pesticide fate and transport model MACRO, parameterized using pedotransfer functions and reasonable worst-case parameter values. The effects of uncertainty in the pedotransfer functions on simulation results were examined for 48 combinations of soils, pesticides and application timings, by sampling pedotransfer function regression errors and propagating them through the simulation model in a Monte Carlo analysis. An uncertainty factor, f(u), was derived, defined as the ratio between the concentration simulated with no errors, c(sim), and the 80th percentile concentration for the scenario. The pedotransfer function errors caused a large variation in simulation results, with f(u) ranging from 1.14 to 1440, with a median of 2.8. A non-linear relationship was found between f(u) and c(sim), which can be used to account for parameter uncertainty by correcting the simulated concentration, c(sim), to an estimated 80th percentile value. For fine-textured soils, the predictions were most sensitive to errors in the pedotransfer functions for two parameters regulating macropore flow (the saturated matrix hydraulic conductivity, K(b), and the effective diffusion pathlength, d) and two water retention function parameters (van Genuchten's N and alpha parameters). For coarse-textured soils, the model was also sensitive to errors in the exponent in the degradation water response function and the dispersivity, in addition to K(b), but showed little sensitivity to d. To reduce uncertainty in model predictions, improved pedotransfer functions for K(b), d, N and alpha would therefore be most useful. 2007 Society of Chemical Industry
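The uncertainty-factor idea can be illustrated with the following hedged sketch, in which the MACRO simulations are replaced by a placeholder lognormal sample and the ratio is taken as the 80th percentile divided by the error-free concentration so that f(u) exceeds 1; all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
c_sim = 0.08                              # concentration simulated with no errors (assumed, ug/L)
# Placeholder for MACRO runs with sampled pedotransfer-function regression errors
c_mc = c_sim * rng.lognormal(mean=0.0, sigma=1.0, size=1000)

c_80 = np.percentile(c_mc, 80)            # 80th percentile concentration for the scenario
f_u = c_80 / c_sim                        # uncertainty factor (interpreted so that f_u >= 1)
c_corrected = f_u * c_sim                 # corrected, 80th percentile estimate
print(f"f(u) = {f_u:.2f}, corrected concentration = {c_corrected:.3f} ug/L")
```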
NASA Astrophysics Data System (ADS)
Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2017-11-01
Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with the use of a statistical law with two parameters (here generalised extreme value Type I distribution) and clearly lower than those associated with the use of a three-parameter law (here generalised extreme value Type II distribution). For extreme flood quantiles, the uncertainties are mostly due to the rainfall generator because of the progressive saturation of the hydrological model.
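A hedged sketch of the bootstrap step for the single hydrological parameter follows; the calibration routine and the event sample are illustrative stand-ins rather than the SHYREG implementation.

```python
import numpy as np

rng = np.random.default_rng(7)
observed_peaks = rng.gamma(shape=2.0, scale=50.0, size=40)   # assumed calibration sample

def calibrate(sample):
    """Placeholder calibration: here the 'parameter' is simply the sample mean."""
    return sample.mean()

boot = np.array([
    calibrate(rng.choice(observed_peaks, size=observed_peaks.size, replace=True))
    for _ in range(2000)
])
ci_low, ci_high = np.percentile(boot, [5, 95])   # 90% bootstrap confidence interval
print(f"parameter 90% CI: [{ci_low:.1f}, {ci_high:.1f}]")
```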
NASA Astrophysics Data System (ADS)
Lee, K. David; Colony, Mike
2011-06-01
Modeling and simulation has been established as a cost-effective means of supporting the development of requirements, exploring doctrinal alternatives, assessing system performance, and performing design trade-off analysis. The Army's constructive simulation for the evaluation of equipment effectiveness in small combat unit operations is currently limited to representation of situation awareness without inclusion of the many uncertainties associated with real world combat environments. The goal of this research is to provide an ability to model situation awareness and decision process uncertainties in order to improve evaluation of the impact of battlefield equipment on ground soldier and small combat unit decision processes. Our Army Probabilistic Inference and Decision Engine (Army-PRIDE) system provides this required uncertainty modeling through the application of two critical techniques that allow Bayesian network technology to be applied to real-time applications. (Object-Oriented Bayesian Network methodology and Object-Oriented Inference technique). In this research, we implement decision process and situation awareness models for a reference scenario using Army-PRIDE and demonstrate its ability to model a variety of uncertainty elements, including: confidence of source, information completeness, and information loss. We also demonstrate that Army-PRIDE improves the realism of the current constructive simulation's decision processes through Monte Carlo simulation.
Xue, Lianqing; Yang, Fan; Yang, Changbing; Wei, Guanghui; Li, Wenqian; He, Xinlin
2018-01-11
Understanding the mechanisms of complicated hydrological processes is important for sustainable management of water resources in arid areas. This paper carried out simulations of water movement for the Manas River Basin (MRB) using an improved semi-distributed topographic hydrologic model (TOPMODEL) with a snowmelt model and a topographic index algorithm. A new algorithm is proposed to calculate the topographic index curve using an internal tangent circle on a conical surface. Building on the traditional model, an improved temperature index that accounts for solar radiation is used to calculate the amount of snowmelt. The parameter uncertainty of the TOPMODEL model was analyzed using the generalized likelihood uncertainty estimation (GLUE) method. The proposed model shows that the distribution of the topographic index is concentrated in high mountains, and the accuracy of the runoff simulation is somewhat enhanced by considering radiation. Our results revealed that the performance of the improved TOPMODEL is acceptable and suitable for runoff simulation in the MRB. The uncertainty of the simulations resulted from the model parameters and structure as well as from climatic and anthropogenic factors. This study is expected to serve as a valuable complement for wider application of TOPMODEL and for identifying the mechanisms of hydrological processes in arid areas.
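The GLUE procedure mentioned above can be sketched as follows, with a trivial one-parameter stand-in for TOPMODEL and assumed observations, likelihood measure and behavioral threshold.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 6.0, 100)
obs = np.sin(t) + 1.5                                # assumed observed runoff series

def model(theta, t):
    """Trivial one-parameter stand-in for the rainfall-runoff model."""
    return theta * np.sin(t) + 1.5

thetas = rng.uniform(0.2, 2.0, size=5000)            # prior parameter samples
sims = np.array([model(th, t) for th in thetas])
nse = 1.0 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

behavioral = nse > 0.5                               # assumed behavioral threshold
weights = nse[behavioral] / nse[behavioral].sum()    # likelihood weights (GLUE proper uses these)
# Unweighted 5-95% bounds are used here for brevity; weighted quantiles are the usual choice.
lower = np.percentile(sims[behavioral], 5, axis=0)
upper = np.percentile(sims[behavioral], 95, axis=0)
```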
Sturgeon, John A.; Johnson, Kevin A.
2017-01-01
Pain catastrophizing, a pattern of negative cognitive-emotional responses to actual or anticipated pain, maintains chronic pain and undermines response to treatments. Currently, precisely how pain catastrophizing influences pain processing is not well understood. In experimental settings, pain catastrophizing has been associated with amplified pain processing. This study sought to clarify pain processing mechanisms via experimental induction of pain catastrophizing. Forty women with chronic low back pain were assigned in blocks to an experimental condition, either a psychologist-led 10-minute pain catastrophizing induction or a control (10-minute rest period). All participants underwent a baseline round of several quantitative sensory testing (QST) tasks, followed by the pain catastrophizing induction or the rest period, and then a second round of the same QST tasks. The catastrophizing induction appeared to increase state pain catastrophizing levels. Changes in QST pain were detected for two of the QST tasks administered, weighted pin pain and mechanical allodynia. Although there is a need to replicate our preliminary results with a larger sample, study findings suggest a potential relationship between induced pain catastrophizing and central sensitization of pain. Clarification of the mechanisms through which catastrophizing affects pain modulatory systems may yield useful clinical insights into the treatment of chronic pain. PMID:28348505
Taub, Chloe J; Sturgeon, John A; Johnson, Kevin A; Mackey, Sean C; Darnall, Beth D
2017-01-01
Pain catastrophizing, a pattern of negative cognitive-emotional responses to actual or anticipated pain, maintains chronic pain and undermines response to treatments. Currently, precisely how pain catastrophizing influences pain processing is not well understood. In experimental settings, pain catastrophizing has been associated with amplified pain processing. This study sought to clarify pain processing mechanisms via experimental induction of pain catastrophizing. Forty women with chronic low back pain were assigned in blocks to an experimental condition, either a psychologist-led 10-minute pain catastrophizing induction or a control (10-minute rest period). All participants underwent a baseline round of several quantitative sensory testing (QST) tasks, followed by the pain catastrophizing induction or the rest period, and then a second round of the same QST tasks. The catastrophizing induction appeared to increase state pain catastrophizing levels. Changes in QST pain were detected for two of the QST tasks administered, weighted pin pain and mechanical allodynia. Although there is a need to replicate our preliminary results with a larger sample, study findings suggest a potential relationship between induced pain catastrophizing and central sensitization of pain. Clarification of the mechanisms through which catastrophizing affects pain modulatory systems may yield useful clinical insights into the treatment of chronic pain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Brennan T
2015-01-01
Turbine discharges at low-head short converging intakes are difficult to measure accurately. The proximity of the measurement section to the intake entrance admits large uncertainties related to asymmetry of the velocity profile, swirl, and turbulence. Existing turbine performance codes [10, 24] do not address this special case and published literature is largely silent on rigorous evaluation of uncertainties associated with this measurement context. The American Society of Mechanical Engineers (ASME) Committee investigated the use of Acoustic transit time (ATT), Acoustic scintillation (AS), and Current meter (CM) in a short converging intake at the Kootenay Canal Generating Station in 2009. Based on their findings, a standardized uncertainty analysis (UA) framework for the velocity-area method (specifically for CM measurements) is presented in this paper given the fact that CM is still the most fundamental and common type of measurement system. Typical sources of systematic and random errors associated with CM measurements are investigated, and the major sources of uncertainties associated with turbulence and velocity fluctuations, the numerical velocity integration technique (bi-cubic spline), and the number and placement of current meters are considered for evaluation. Since the velocity measurements in a short converging intake are associated with complex nonlinear and time-varying uncertainties (e.g., Reynolds stress in fluid dynamics), simply applying the law of propagation of uncertainty is known to overestimate the measurement variance while the Monte Carlo method does not. Therefore, a pseudo-Monte Carlo simulation method (random flow generation technique [8]), which was initially developed for the purpose of establishing upstream or initial conditions in Large-Eddy Simulation (LES) and Direct Numerical Simulation (DNS), is used to statistically determine uncertainties associated with turbulence and velocity fluctuations. This technique is then combined with a bi-cubic spline interpolation method which converts point velocities into a continuous velocity distribution over the measurement domain. Subsequently, the number and placement of current meters are simulated to investigate the accuracy of the estimated flow rates using the numerical velocity-area integration method outlined in ISO 3354 [12]. The authors herein consider that statistics on generated flow rates processed with bi-cubic interpolation and sensor simulations are the combined uncertainties which already account for the effects of all three uncertainty sources. A preliminary analysis based on the current meter data obtained through an upgrade acceptance test of a single unit located in a mainstem plant has been presented.
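A hedged sketch of the velocity-area calculation with bi-cubic spline interpolation is shown below; the current-meter grid and velocities are invented, not the Kootenay Canal data, and SciPy's RectBivariateSpline stands in for the integration scheme of ISO 3354.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Current-meter positions across the measurement section (m)
y = np.linspace(0.5, 9.5, 5)          # horizontal positions
z = np.linspace(0.5, 7.5, 4)          # vertical positions
rng = np.random.default_rng(11)
v = 2.0 + 0.3 * rng.standard_normal((y.size, z.size))   # measured point velocities (m/s)

spline = RectBivariateSpline(y, z, v, kx=3, ky=3)        # bi-cubic interpolation of velocities
Q = spline.integral(y[0], y[-1], z[0], z[-1])            # discharge over the sampled area (m^3/s)
print(f"Estimated discharge: {Q:.1f} m^3/s")
```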
Uncertainty Quantification in Alchemical Free Energy Methods.
Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V
2018-06-12
Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation-an ensemble of independent MD simulations-which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
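The ensemble-based error estimate advocated above reduces, in its simplest form, to the standard error over independent replicas; the replica free energies below are assumed values.

```python
import numpy as np

dG_replicas = np.array([-7.8, -8.3, -7.5, -8.1, -7.9])   # kcal/mol, assumed replica results

mean_dG = dG_replicas.mean()
stderr_dG = dG_replicas.std(ddof=1) / np.sqrt(dG_replicas.size)   # standard error of the mean
print(f"Binding free energy: {mean_dG:.2f} +/- {stderr_dG:.2f} kcal/mol")
```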
Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion
NASA Astrophysics Data System (ADS)
Li, Z.; Ghaith, M.
2017-12-01
Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Sequentially, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
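A minimal sketch of evaluating a one-dimensional Hermite polynomial chaos surrogate is given below; the coefficients are assumed rather than obtained by probabilistic collocation, and the example only illustrates how such a surrogate replaces the hydrological model once built.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

coeffs = np.array([12.0, 3.5, 0.8, -0.2])                 # assumed PCE coefficients for one output
xi = np.random.default_rng(5).standard_normal(100_000)    # standard normal germ

flow = He.hermeval(xi, coeffs)                             # cheap surrogate evaluations
# For probabilists' Hermite chaos, the mean is the zeroth coefficient and the
# variance is sum over k >= 1 of k! * c_k^2, which can be checked against the samples.
var_analytic = sum(math.factorial(k) * c ** 2 for k, c in enumerate(coeffs) if k > 0)
print(flow.mean(), flow.var(), var_analytic)
```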
The catastrophic collapse of morale among hospital physicians in Japan
Yasunaga, Hideo
2008-01-01
The past few decades have witnessed bleak pictures of unhappy physicians worldwide. Japanese physicians working in hospitals are particularly distressed. Today, Japan’s healthcare system is near collapse because physicians are utterly demoralized. Their loss of morale is due to budget constraints, excessive demands, physician shortages, poor distribution, long working hours, hostile media, increasing lawsuits, and violence by patients. Severe cost-saving policies, inadequate distribution of healthcare resources, and the failure to communicate risks have damaged physicians’ morale and created conflicts between physicians and society. Physicians should communicate the uncertainty, limitations, and risks of modern medicine to all members of society. No resolution can be achieved unless trust exists between physicians, patients, the public, the media, bureaucrats, politicians and jurists. PMID:22312197
[Catastrophic health expenditures in Mexico: magnitude, distribution and determinants].
Sesma-Vázquez, Sergio; Pérez-Rico, Raymundo; Sosa-Manzano, Carlos Lino; Gómez-Dantés, Octavio
2005-01-01
To describe the magnitude, distribution, and determinants of catastrophic health expenditures in Mexico. The information source was the National Performance Assessment Survey, and the methodology was the one developed by the World Health Organization for assessing fair financing. Households with catastrophic expenditures were defined as those with health expenditures over 30% of their ability to pay. Multivariate analyses using logistic and linear regression were performed to identify the determinants of catastrophic expenditures. A total of 3.8% of the households incurred catastrophic health expenditures. There were huge differences by state. The uninsured, poor, and rural households showed a higher impoverishment risk. Sixty percent of the catastrophic expenditures were attributable to outpatient care and medication. A 10% increase in insured households could result in a 9.6% decrease in catastrophic expenditures. Disability, adults 60 years of age and older, and pregnancy increased the probability of catastrophic expenditures. The insurance of older adults, pregnant women, and persons with disabilities could reduce catastrophic health expenditures in Mexico.
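The catastrophic-expenditure definition used above amounts to a simple threshold rule, sketched below with invented household figures.

```python
spending = [120.0, 40.0, 900.0, 10.0]             # assumed household health spending
ability_to_pay = [1000.0, 800.0, 1500.0, 600.0]   # assumed ability to pay
catastrophic = [s / a > 0.30 for s, a in zip(spending, ability_to_pay)]
share = sum(catastrophic) / len(catastrophic)     # share of households flagged as catastrophic
print(share)
```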
NASA Astrophysics Data System (ADS)
Haas, Edwin; Santabarbara, Ignacio; Kiese, Ralf; Butterbach-Bahl, Klaus
2017-04-01
Numerical simulation models are increasingly used to estimate greenhouse gas emissions at site to regional/national scales and are outlined as the most advanced methodology (Tier 3) in the framework of UNFCCC reporting. Process-based models incorporate the major processes of the carbon and nitrogen cycles of terrestrial ecosystems and are thus thought to be widely applicable under various conditions and spatial scales. Process-based modelling requires high spatial resolution input data on soil properties, climate drivers and management information. The acceptance of model-based inventory calculations depends on the assessment of the inventory's uncertainty (model-, input data- and parameter-induced uncertainties). In this study we fully quantify the uncertainty in modelling soil N2O and NO emissions from arable, grassland and forest soils using the biogeochemical model LandscapeDNDC. We address model-induced uncertainty (MU) by contrasting two different soil biogeochemistry modules within LandscapeDNDC. The parameter-induced uncertainty (PU) was assessed by using joint parameter distributions for key parameters describing microbial C and N turnover processes, as obtained by different Bayesian calibration studies for each model configuration. Input-data-induced uncertainty (DU) was addressed by Bayesian calibration of soil properties, climate drivers and agricultural management practice data. For the MU, DU and PU we performed several hundred simulations each to contribute to the individual uncertainty assessments. For the overall uncertainty quantification, we assessed the model prediction probability over the sampled sets of input data and parameter distributions. Statistical analysis of the simulation results has been used to quantify the overall uncertainty of the modelling approach. With this study we can contrast the variation in model results with the different sources of uncertainty for each ecosystem. Further, we have been able to perform a full uncertainty analysis for modelling N2O and NO emissions from arable, grassland and forest soils, which is necessary for the comprehensibility of modelling results. We have applied the methodology to a regional inventory to assess the overall modelling uncertainty of a regional N2O and NO emissions inventory for the state of Saxony, Germany.
Climate change impacts on extreme events in the United States: an uncertainty analysis
Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...
NASA Astrophysics Data System (ADS)
Noh, Seong Jin; Lee, Seungsoo; An, Hyunuk; Kawaike, Kenji; Nakagawa, Hajime
2016-11-01
An urban flood is an integrated phenomenon that is affected by various uncertainty sources such as input forcing, model parameters, complex geometry, and exchanges of flow among different domains in surfaces and subsurfaces. Despite considerable advances in urban flood modeling techniques, limited knowledge is currently available with regard to the impact of dynamic interaction among different flow domains on urban floods. In this paper, an ensemble method for urban flood modeling is presented to consider the parameter uncertainty of interaction models among a manhole, a sewer pipe, and surface flow. Laboratory-scale experiments on urban flood and inundation are performed under various flow conditions to investigate the parameter uncertainty of interaction models. The results show that ensemble simulation using interaction models based on weir and orifice formulas reproduces experimental data with high accuracy and detects the identifiability of model parameters. Among interaction-related parameters, the parameters of the sewer-manhole interaction show lower uncertainty than those of the sewer-surface interaction. Experimental data obtained under unsteady-state conditions are more informative than those obtained under steady-state conditions to assess the parameter uncertainty of interaction models. Although the optimal parameters vary according to the flow conditions, the difference is marginal. Simulation results also confirm the capability of the interaction models and the potential of the ensemble-based approaches to facilitate urban flood simulation.
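The weir- and orifice-type interaction formulas referred to above take the following generic forms; the discharge coefficients and geometry in this sketch are assumptions, not the calibrated values from the experiments.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def weir_discharge(c_w, width, head):
    """Weir-type exchange: Q = c_w * b * sqrt(2g) * h^(3/2)."""
    return c_w * width * math.sqrt(2.0 * G) * head ** 1.5

def orifice_discharge(c_o, area, head_diff):
    """Orifice-type exchange: Q = c_o * A * sqrt(2g * dh)."""
    return c_o * area * math.sqrt(2.0 * G * abs(head_diff))

# Example: exchange through a 0.6 m diameter manhole opening under 5 cm of surface water
print(weir_discharge(0.6, width=0.6 * math.pi, head=0.05))               # weir regime
print(orifice_discharge(0.6, area=math.pi * 0.3 ** 2, head_diff=0.05))   # submerged regime
```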
Ronald E. McRoberts
2005-01-01
Uncertainty in model-based predictions of individual tree diameter growth is attributed to three sources: measurement error for predictor variables, residual variability around model predictions, and uncertainty in model parameter estimates. Monte Carlo simulations are used to propagate the uncertainty from the three sources through a set of diameter growth models to...
NASA Astrophysics Data System (ADS)
Raj, Rahul; Hamm, Nicholas Alexander Samuel; van der Tol, Christiaan; Stein, Alfred
2016-03-01
Gross primary production (GPP) can be separated from flux tower measurements of net ecosystem exchange (NEE) of CO2. This is used increasingly to validate process-based simulators and remote-sensing-derived estimates of simulated GPP at various time steps. Proper validation includes the uncertainty associated with this separation. In this study, uncertainty assessment was done in a Bayesian framework. It was applied to data from the Speulderbos forest site, The Netherlands. We estimated the uncertainty in GPP at half-hourly time steps, using a non-rectangular hyperbola (NRH) model for its separation from the flux tower measurements. The NRH model provides a robust empirical relationship between radiation and GPP. It includes the degree of curvature of the light response curve, radiation and temperature. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. We defined the prior distribution of each NRH parameter and used Markov chain Monte Carlo (MCMC) simulation to estimate the uncertainty in the separated GPP from the posterior distribution at half-hourly time steps. This time series also allowed us to estimate the uncertainty at daily time steps. We compared the informative with the non-informative prior distributions of the NRH parameters and found that both choices produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
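For reference, a hedged sketch of the non-rectangular hyperbola light-response form is given below; parameter values are illustrative, not the Speulderbos posterior estimates, and the temperature term used in the study is omitted.

```python
import numpy as np

def nrh_gpp(par, alpha, p_max, theta):
    """GPP from radiation `par` via the non-rectangular hyperbola:
    GPP = (alpha*I + Pmax - sqrt((alpha*I + Pmax)^2 - 4*theta*alpha*I*Pmax)) / (2*theta)."""
    s = alpha * par + p_max
    return (s - np.sqrt(s ** 2 - 4.0 * theta * alpha * par * p_max)) / (2.0 * theta)

par = np.linspace(0.0, 2000.0, 50)                 # photosynthetically active radiation (assumed units)
gpp = nrh_gpp(par, alpha=0.03, p_max=25.0, theta=0.9)   # assumed parameter values
```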
NASA Astrophysics Data System (ADS)
Zou, Zongxing; Tang, Huiming; Xiong, Chengren; Su, Aijun; Criss, Robert E.
2017-10-01
The Jiweishan rockslide of June 5, 2009 in China provides an important opportunity to elucidate the kinetic characteristics of high-speed, long-runout debris flows. A 2D discrete element model whose mechanical parameters were calibrated using basic field data was used to simulate the kinetic behavior of this catastrophic landslide. The model output shows that the Jiweishan debris flow lasted about 3 min, released a gravitational potential energy of about 6 × 10^13 J with collisions and friction dissipating approximately equal amounts of energy, and had a maximum fragment velocity of 60-70 m/s, almost twice the highest velocity of the overall slide mass (35 m/s). Notable simulated characteristics include the high velocity and energy of the slide material, the preservation of the original positional order of the slide blocks, the inverse vertical grading of blocks, and the downslope sorting of the slide deposits. Field observations that verify these features include uprooted trees in the frontal collision area of the air-blast wave, downslope reduction of average clast size, and undamaged plants atop huge blocks that prove their lack of downslope tumbling. The secondary acceleration effect and force chains derived from the numerical model help explain these deposit features and the long-distance transport. Our back-analyzed frictions of the motion path in the PFC model provide a reference for analyzing and predicting the motion of similar geological hazards.
NASA Astrophysics Data System (ADS)
Migliavacca, M.; Sonnentag, O.; Keenan, T. F.; Cescatti, A.; O'Keefe, J.; Richardson, A. D.
2012-06-01
Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Terrestrial biosphere models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we used the Harvard Forest phenology record to investigate and characterize sources of uncertainty in predicting phenology, and the subsequent impacts on model forecasts of carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 leaf bud-burst models that varied in complexity. Akaike's Information Criterion indicated support for spring warming models with photoperiod limitations and, to a lesser extent, models that included chilling requirements. We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high vs. low CO2 emissions). Parameter uncertainty was the smallest (average 95% confidence interval, CI: 2.4 days century-1 for scenario B1 and 4.5 days century-1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 days century-1 in the simulated trends). The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied among models (±7.7 days century-1 for A1fi, ±3.6 days century-1 for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per degree of warming) varied between 2.2 days °C-1 and 5.2 days °C-1 depending on model structure. We quantified the impact of uncertainties in bud-burst forecasts on simulated photosynthetic CO2 uptake and evapotranspiration (ET) using a process-based terrestrial biosphere model. Uncertainty in phenology model structure led to uncertainty in the description of forest seasonality, which accumulated to uncertainty in annual model estimates of gross primary productivity (GPP) and ET of 9.6% and 2.9%, respectively. A sensitivity analysis shows that a variation of ±10 days in bud-burst dates led to a variation of ±5.0% for annual GPP and about ±2.0% for ET. For phenology models, differences among future climate scenarios (i.e. drivers) represent the largest source of uncertainty, followed by uncertainties related to model structure and, finally, to model parameterization. The uncertainties we have quantified will affect the description of the seasonality of ecosystem processes and in particular the simulation of carbon uptake by forest ecosystems, with a larger impact of uncertainties related to phenology model structure, followed by uncertainties related to phenological model parameterization.
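A hedged sketch of a spring-warming bud-burst model with a simple photoperiod limitation, in the spirit of the model class discussed above, is shown below; the threshold values and forcing series are assumptions.

```python
import numpy as np

def budburst_doy(tmean, photoperiod, t_base=5.0, f_crit=150.0, p_min=11.0):
    """Return the day of year at which accumulated thermal forcing exceeds f_crit.
    Forcing accumulates only on days warmer than t_base with daylength >= p_min hours."""
    forcing = np.where((tmean > t_base) & (photoperiod >= p_min), tmean - t_base, 0.0)
    cumulative = np.cumsum(forcing)
    days = np.nonzero(cumulative >= f_crit)[0]
    return int(days[0]) + 1 if days.size else None

doy = np.arange(1, 182)                                             # Jan 1 to Jun 30
tmean = 10.0 * np.sin((doy - 30) / 365.0 * 2 * np.pi) + 5.0         # synthetic daily temperatures (deg C)
photoperiod = 12.0 + 4.0 * np.sin((doy - 80) / 365.0 * 2 * np.pi)   # synthetic daylength (h)
print(budburst_doy(tmean, photoperiod))
```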
Granato, Gregory E.; Jones, Susan C.
2015-01-01
Results of this study indicate the potential benefits of the multi-decade simulations that SELDM provides because these simulations quantify risks and uncertainties that affect decisions made with available data and statistics. Results of the SELDM simulations indicate that the WQABI criteria concentrations may be too stringent for evaluating the stormwater quality in receiving streams, highway runoff, and BMP discharges; especially with the substantial uncertainties inherent in selecting representative data.
Ciecior, Willy; Röhlig, Klaus-Jürgen; Kirchner, Gerald
2018-10-01
In the present paper, deterministic as well as first- and second-order probabilistic biosphere modeling approaches are compared. Furthermore, the sensitivity of the influence of the probability distribution function shape (empirical distribution functions and fitted lognormal probability functions) representing the aleatory uncertainty (also called variability) of a radioecological model parameter as well as the role of interacting parameters are studied. Differences in the shape of the output distributions for the biosphere dose conversion factor from first-order Monte Carlo uncertainty analysis using empirical and fitted lognormal distribution functions for input parameters suggest that a lognormal approximation is possibly not always an adequate representation of the aleatory uncertainty of a radioecological parameter. Concerning the comparison of the impact of aleatory and epistemic parameter uncertainty on the biosphere dose conversion factor, the latter here is described using uncertain moments (mean, variance) while the distribution itself represents the aleatory uncertainty of the parameter. From the results obtained, the solution space of second-order Monte Carlo simulation is much larger than that from first-order Monte Carlo simulation. Therefore, the influence of epistemic uncertainty of a radioecological parameter on the output result is much larger than that one caused by its aleatory uncertainty. Parameter interactions are only of significant influence in the upper percentiles of the distribution of results as well as only in the region of the upper percentiles of the model parameters. Copyright © 2018 Elsevier Ltd. All rights reserved.
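The distinction between first- and second-order Monte Carlo can be sketched as a nested loop, as below; the dose model, moment ranges and distributions are assumed placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)
n_outer, n_inner = 200, 1000

def dose_factor(transfer):
    """Placeholder biosphere dose conversion model."""
    return 1.0e-7 * transfer

p95 = np.empty(n_outer)
for i in range(n_outer):
    mu = rng.normal(0.0, 0.3)                            # epistemic: uncertain mean of log(parameter)
    sigma = rng.uniform(0.5, 1.0)                        # epistemic: uncertain spread
    transfer = rng.lognormal(mu, sigma, size=n_inner)    # aleatory variability given those moments
    p95[i] = np.percentile(dose_factor(transfer), 95)    # one aleatory distribution per outer draw

# The spread of the 95th percentile across outer draws reflects epistemic uncertainty.
print(p95.min(), np.median(p95), p95.max())
```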
Robust control of seismically excited cable stayed bridges with MR dampers
NASA Astrophysics Data System (ADS)
YeganehFallah, Arash; Khajeh Ahamd Attari, Nader
2017-03-01
In recent decades, active and semi-active structural control have become attractive alternatives for enhancing the performance of civil infrastructure subjected to seismic and wind loads. However, in order to have reliable active and semi-active control, there is a need to include information about uncertainties in the design of the controller. In the real world, civil structure parameters such as loading locations, stiffness, mass and damping are time-variant and uncertain. These uncertainties are in many cases modeled as parametric uncertainties. The motivation of this research is to design a robust controller for attenuating the vibrational responses of civil infrastructure while accounting for its dynamical uncertainties. Uncertainties in structural dynamics parameters are modeled as affine uncertainties in the state-space model. These uncertainties are decoupled from the system through a Linear Fractional Transformation (LFT) and are assumed to be unknown but norm-bounded inputs to the system. The robust H∞ controller is designed for the decoupled system to regulate the evaluation outputs, and it is robust to the effects of uncertainties, disturbances and sensor noise. The cable-stayed bridge benchmark, which is equipped with MR dampers, is considered for the numerical simulation. The simulation results show that the proposed robust controller can effectively mitigate the undesired effects of uncertainties on the system's response under seismic loading.
NASA Technical Reports Server (NTRS)
Pace, Dale K.
2000-01-01
A simulation conceptual model is a simulation developer's way of translating modeling requirements (i.e., what is to be represented by the simulation or its modification) into a detailed design framework (i.e., how it is to be done), from which the software, hardware, networks (in the case of distributed simulation), and systems/equipment that will make up the simulation can be built or modified. A conceptual model is the collection of information which describes a simulation developer's concept about the simulation and its pieces. That information consists of assumptions, algorithms, characteristics, relationships, and data. Taken together, these describe how the simulation developer understands what is to be represented by the simulation (entities, actions, tasks, processes, interactions, etc.) and how that representation will satisfy the requirements to which the simulation responds. Thus the conceptual model is the basis for judgment about simulation fidelity and validity for any condition that is not specifically tested. The more perspicuous and precise the conceptual model, the more likely it is that the simulation development will both fully satisfy requirements and allow demonstration that the requirements are satisfied (i.e., validation). Methods used in simulation conceptual model development have significant implications for simulation management and for assessment of simulation uncertainty. This paper suggests how to develop and document a simulation conceptual model so that the simulation fidelity and validity can be most effectively determined. These ideas for conceptual model development apply to all simulation varieties. The paper relates these ideas to uncertainty assessments as they relate to simulation fidelity and validity. The paper also explores implications for simulation management from conceptual model development methods, especially relative to reuse of simulation components.
The Multi-Step CADIS method for shutdown dose rate calculations and uncertainty propagation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Ahmad M.; Peplow, Douglas E.; Grove, Robert E.
2015-12-01
Shutdown dose rate (SDDR) analysis requires (a) a neutron transport calculation to estimate neutron flux fields, (b) an activation calculation to compute radionuclide inventories and associated photon sources, and (c) a photon transport calculation to estimate final SDDR. In some applications, accurate full-scale Monte Carlo (MC) SDDR simulations are needed for very large systems with massive amounts of shielding materials. However, these simulations are impractical because calculation of space- and energy-dependent neutron fluxes throughout the structural materials is needed to estimate distribution of radioisotopes causing the SDDR. Biasing the neutron MC calculation using an importance function is not simple because it is difficult to explicitly express the response function, which depends on subsequent computational steps. Furthermore, the typical SDDR calculations do not consider how uncertainties in MC neutron calculation impact SDDR uncertainty, even though MC neutron calculation uncertainties usually dominate SDDR uncertainty.
Forensic Uncertainty Quantification of Explosive Dispersal of Particles
NASA Astrophysics Data System (ADS)
Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho
2017-06-01
In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The authors' experience to date has been that, by making an analogy to crime scene investigation when looking at validation experiments, valuable insights may be gained. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.
Impact of uncertainties in free stream conditions on the aerodynamics of a rectangular cylinder
NASA Astrophysics Data System (ADS)
Mariotti, Alessandro; Shoeibi Omrani, Pejman; Witteveen, Jeroen; Salvetti, Maria Vittoria
2015-11-01
The BARC benchmark deals with the flow around a rectangular cylinder with chord-to-depth ratio equal to 5. This flow configuration is of practical interest for civil and industrial structures and it is characterized by massively separated flow and unsteadiness. In a recent review of BARC results, significant dispersion was observed both in experimental and numerical predictions of some flow quantities, which are extremely sensitive to various uncertainties, which may be present in experiments and simulations. Besides modeling and numerical errors, in simulations it is difficult to exactly reproduce the experimental conditions due to uncertainties in the set-up parameters, which sometimes cannot be exactly controlled or characterized. Probabilistic methods and URANS simulations are used to investigate the impact of the uncertainties in the following set-up parameters: the angle of incidence, the free stream longitudinal turbulence intensity and length scale. Stochastic collocation is employed to perform the probabilistic propagation of the uncertainty. The discretization and modeling errors are estimated by repeating the same analysis for different grids and turbulence models. The results obtained for different assumed PDF of the set-up parameters are also compared.
Environmental engineering calculations involving uncertainties, either in the model itself or in the data, are far beyond the capabilities of conventional analysis for any but the simplest of models. There exist a number of general-purpose computer simulation languages, using Mon...
Birnie, Kathryn A; Chorney, Jill; El-Hawary, Ron
2017-10-01
Child and parent pain catastrophizing are reported preoperative risk factors for children's acute and persistent postsurgical pain. This study examined dyadic relations between child and parent pain catastrophizing and child and parent ratings of child pain prior to (M = 4.01 days; "baseline") and following surgery (M = 6.5 weeks; "acute follow-up"), as well changes in pain catastrophizing during this time in 167 youth (86% female; Mage = 14.55 years) undergoing spinal fusion surgery and 1 parent (89% mothers). Actor-partner interdependence models assessed cross-sectional and longitudinal intra- and interpersonal effects. Cross-sectionally, child pain catastrophizing was positively associated with child pain at baseline and acute follow-up (actor effects: βbaseline = 0.288 and βfollow-up = 0.262; P < 0.01), and parents' ratings of child pain at baseline (partner effect: βbaseline = 0.212; P < 0.01). Parent pain catastrophizing was not cross-sectionally associated with ratings of child pain. Longitudinally, higher pain catastrophizing at baseline predicted higher pain catastrophizing at acute follow-up for children (actor effect: β = 0.337; P < 0.01) and parents (actor effect: β = 0.579; P < 0.01) with a significantly smaller effect for children (respondent × actor interaction: β = 0.121; P < 0.05). No longitudinal partner effects for catastrophizing were observed. Baseline child and parent pain catastrophizing did not predict child pain at acute follow-up. In conclusion, child, not parent, pain catastrophizing was associated with children's pre- and postsurgical pain, and showed significantly less stability over time. There is a need to better understand contributors to the stability or changeability of pain catastrophizing, the prospective relation of catastrophizing to pain, and contexts in which child vs parent pain catastrophizing is most influential for pediatric postsurgical pain.
NASA Astrophysics Data System (ADS)
Huang, Danqing; Yan, Peiwen; Zhu, Jian; Zhang, Yaocun; Kuang, Xueyuan; Cheng, Jing
2018-04-01
The uncertainty of global summer precipitation simulated by 23 CMIP5 CGCMs and the possible impacts of model resolution are investigated in this study. Large uncertainties exist over the tropical and subtropical regions, which can be mainly attributed to the simulation of convective precipitation. High-resolution models (HRMs) and low-resolution models (LRMs) are further investigated to demonstrate their different contributions to the uncertainties of the ensemble mean. It shows that the high-resolution model ensemble mean (HMME) and the low-resolution model ensemble mean (LMME) mitigate the biases between the MME and observations over most continents and oceans, respectively. The HMME simulates more precipitation than the LMME over most oceans, but less precipitation over some continents. The dominant precipitation category in the HRMs (LRMs) is heavy precipitation (moderate precipitation) over the tropical regions. The combinations of convective and stratiform precipitation are also quite different: the HMME has a much higher ratio of stratiform precipitation while the LMME has more convective precipitation. Finally, differences in precipitation between the HMME and LMME can be traced to their differences in the SST simulations via local and remote air-sea interactions.
NASA Astrophysics Data System (ADS)
Riva, Fabio; Milanese, Lucio; Ricci, Paolo
2017-10-01
To reduce the computational cost of the uncertainty propagation analysis, which is used to study the impact of input parameter variations on the results of a simulation, a general and simple to apply methodology based on decomposing the solution to the model equations in terms of Chebyshev polynomials is discussed. This methodology, based on the work by Scheffel [Am. J. Comput. Math. 2, 173-193 (2012)], approximates the model equation solution with a semi-analytic expression that depends explicitly on time, spatial coordinates, and input parameters. By employing a weighted residual method, a set of nonlinear algebraic equations for the coefficients appearing in the Chebyshev decomposition is then obtained. The methodology is applied to a two-dimensional Braginskii model used to simulate plasma turbulence in basic plasma physics experiments and in the scrape-off layer of tokamaks, in order to study the impact on the simulation results of the input parameter that describes the parallel losses. The uncertainty that characterizes the time-averaged density gradient lengths, time-averaged densities, and fluctuation density level are evaluated. A reasonable estimate of the uncertainty of these distributions can be obtained with a single reduced-cost simulation.
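A minimal sketch of the underlying idea of representing the dependence of a solution on an uncertain input with Chebyshev polynomials, and then propagating the input distribution through the cheap surrogate, follows; the placeholder model, the fit from sampled runs (rather than a weighted residual method) and all parameter values are assumptions.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def model(loss_param):
    """Placeholder for an expensive simulation returning a scalar output."""
    return 1.0 / (0.2 + loss_param) + 0.1 * np.sin(5.0 * loss_param)

nodes = np.cos(np.pi * (np.arange(9) + 0.5) / 9)        # Chebyshev nodes on [-1, 1]
runs = np.array([model(x) for x in nodes])               # a handful of expensive model runs
coeffs = C.chebfit(nodes, runs, deg=8)                    # Chebyshev representation of the dependence

# Propagate an assumed input distribution (clipped to the fitted domain) through the surrogate.
xi = np.clip(np.random.default_rng(4).normal(0.0, 0.3, 50000), -1.0, 1.0)
surrogate_outputs = C.chebval(xi, coeffs)                 # near-free evaluations
print(surrogate_outputs.mean(), surrogate_outputs.std())
```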
NASA Technical Reports Server (NTRS)
Smart, Christian
1998-01-01
During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and the risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The groundrules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the failure mode occurrence to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).
Birnie, Kathryn A; Chambers, Christine T; Chorney, Jill; Fernandez, Conrad V; McGrath, Patrick J
2016-04-01
When explored separately, child and parent catastrophic thoughts about child pain show robust negative relations with child pain. The objective of this study was to conduct a dyadic analysis to elucidate intrapersonal and interpersonal influences of child and parent pain catastrophizing on aspects of pain communication, including observed behaviours and perceptions of child pain. A community sample of 171 dyads including children aged 8 to 12 years (89 girls) and parents (135 mothers) rated pain catastrophizing (trait and state versions) and child pain intensity and unpleasantness following a cold pressor task. Child pain tolerance was also assessed. Parent-child interactions during the cold pressor task were coded for parent attending, nonattending, and other talk, and child symptom complaints and other talk. Data were analyzed using the actor-partner interdependence model and hierarchical multiple regressions. Children reporting higher state pain catastrophizing had greater symptom complaints regardless of level of parent state pain catastrophizing. Children reporting low state pain catastrophizing had similar high levels of symptom complaints, but only when parents reported high state pain catastrophizing. Higher child and parent state and/or trait pain catastrophizing predicted their own ratings of higher child pain intensity and unpleasantness, with child state pain catastrophizing additionally predicting parent ratings. Higher pain tolerance was predicted by older child age and lower child state pain catastrophizing. These newly identified interpersonal effects highlight the relevance of the social context to children's pain expressions and parent perceptions of child pain. Both child and parent pain catastrophizing warrant consideration when managing child pain.
Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Templeton, Jeremy Alan; Blaylock, Myra L.; Domino, Stefan P.
2015-09-01
The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost versus accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.
Investigation of Shapes and Spins of Reaccumulated Remnants from Asteroid Disruption Simulations
NASA Astrophysics Data System (ADS)
Michel, Patrick; Ballouz, R.; Richardson, D. C.; Schwartz, S. R.
2012-10-01
Evidence that asteroids larger than a few hundred meters diameter can be gravitational aggregates of smaller, cohesive pieces comes, for instance, from images returned by the Hayabusa spacecraft of asteroid 25143 Itokawa (Fujiwara et al., 2006, Science 312, 1330). These images show an irregular 500-meter-long body with a boulder-strewn surface, as might be expected from reaccumulation following catastrophic disruption of a larger parent asteroid (Michel et al., 2001, Science 294, 1696). However, numerical simulations of this process to date essentially focus on the size/mass and velocity distributions of reaccumulated fragments, matching asteroid families. Reaccumulation was simplified by merging the objects into growing spheres. However, understanding shapes, spins and surface properties of gravitational aggregates formed by reaccumulation is required to interpret information from ground-based observations and space missions. E.g., do boulders on Itokawa originate from reaccumulation of material ejected from a catastrophic impact or from other processes (such as the Brazil-nut effect)? How does reaccumulation affect the observed shapes? A model was developed (Richardson et al., 2009, Planet. Space Sci. 57, 183) to preserve shape and spin information of reaccumulated bodies in simulations of asteroid disruption, by allowing fragments to stick on contact (and optionally bounce or fragment further, depending on user-selectable parameters). Such treatments are computationally expensive, and we could only recently start to explore the parameter space. Preliminary results will be presented, showing that some observed surface and shape features may be explained by how fragments produced by a disruption reaccumulate. Simulations of rubble pile collisions without particle cohesion, and an investigation of the influence of initial target rotation on the outcome will also be shown. We acknowledge the National Science Foundation (AST1009579) and NASA (NNX08AM39G).
7 CFR 402.4 - Catastrophic Risk Protection Endorsement Provisions.
Code of Federal Regulations, 2014 CFR
2014-01-01
Title 7, Agriculture (2014-01-01): Federal Crop Insurance Corporation, Department of Agriculture, Catastrophic Risk Protection Endorsement, § 402.4 Catastrophic Risk Protection Endorsement Provisions.
7 CFR 402.4 - Catastrophic Risk Protection Endorsement Provisions.
Code of Federal Regulations, 2012 CFR
2012-01-01
Title 7, Agriculture (2012-01-01): Federal Crop Insurance Corporation, Department of Agriculture, Catastrophic Risk Protection Endorsement, § 402.4 Catastrophic Risk Protection Endorsement Provisions.
A 3D Hydrodynamic Model for Cytokinesis of Eukaryotic Cells
2014-08-01
When cytokinesis goes wrong, it may lead to a catastrophe or failure, which in turn may lead to an unwelcome outcome, for instance cancer. Thus, a detailed understanding of the process is needed.
Dettinger, Michael D.; Ingram, B. Lynn
2013-01-01
Scientists who created a simulated megastorm, called ARkStorm, that was patterned after the 1861 flood but was less severe, found that such a torrent could force more than a million people to evacuate and cause $400 billion in losses if it happened in California today. Forecasters are getting better at predicting the arrival of atmospheric rivers, which will improve warnings about flooding from the common storms and about the potential for catastrophe from a megastorm.
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.
2012-01-01
There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method will use state-of-the-art uncertainty analysis, applying different turbulence models, and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.
Blast Load Simulator Experiments for Computational Model Validation: Report 2
2017-02-01
Comparisons were made among the replicated experiments to evaluate repeatability. The uncertainty in the experimental pressures and impulses was evaluated by computing 95% confidence intervals on the results of the five replicate experiments.
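The confidence-interval calculation on replicate experiments is a standard t-interval, sketched below with assumed peak pressures.

```python
import numpy as np
from scipy import stats

peak_pressure = np.array([212.0, 205.0, 219.0, 208.0, 215.0])   # kPa, assumed replicate values

n = peak_pressure.size
mean = peak_pressure.mean()
sem = peak_pressure.std(ddof=1) / np.sqrt(n)       # standard error of the mean
t_crit = stats.t.ppf(0.975, df=n - 1)              # two-sided 95% critical value
ci = (mean - t_crit * sem, mean + t_crit * sem)
print(f"{mean:.1f} kPa, 95% CI [{ci[0]:.1f}, {ci[1]:.1f}]")
```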
Numerical simulations of catastrophic disruption: Recent results
NASA Technical Reports Server (NTRS)
Benz, W.; Asphaug, E.; Ryan, E. V.
1994-01-01
Numerical simulations have been used to study high velocity two-body impacts. In this paper, a two-dimensional Lagrangian finite difference hydro-code and a three-dimensional smoothed particle hydrodynamics (SPH) code are described and initial results reported. These codes can be, and have been, used to make specific predictions about particular objects in our solar system. But more significantly, they allow us to explore a broad range of collisional events. Certain parameters (size, time) can be studied only over a very restricted range within the laboratory; other parameters (initial spin, low gravity, exotic structure or composition) are difficult to study at all experimentally. The outcomes of numerical simulations lead to a more general and accurate understanding of impacts in their many forms.
NASA Technical Reports Server (NTRS)
Karakoylu, E.; Franz, B.
2016-01-01
This work is a first attempt at quantifying uncertainties in ocean remote sensing reflectance satellite measurements. It is based on 1000 Monte Carlo iterations, with a SeaWiFS 4-day composite from 2003 as the data source. The uncertainty is estimated for remote sensing reflectance (Rrs) at 443 nm.
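As a rough illustration of the Monte Carlo approach described above (not the authors' code), the sketch below perturbs an assumed radiance with hypothetical noise over 1000 iterations and summarizes the spread of the resulting Rrs(443); the retrieval formula, radiance value, and noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical retrieval of remote sensing reflectance at 443 nm; the simple
# pi*Lw/F0 form, the radiance value, and the 2% noise level are illustrative
# assumptions, not the SeaWiFS processing chain.
def retrieve_rrs443(lw, f0, noise_frac=0.02):
    lw_perturbed = lw * (1.0 + noise_frac * rng.standard_normal())
    return np.pi * lw_perturbed / f0

n_iter = 1000          # 1000 Monte Carlo iterations, as in the abstract
lw = 1.2               # assumed water-leaving radiance (mW cm^-2 um^-1 sr^-1)
f0 = 190.0             # assumed extraterrestrial solar irradiance at 443 nm

samples = np.array([retrieve_rrs443(lw, f0) for _ in range(n_iter)])
print(f"Rrs(443) = {samples.mean():.5f} +/- {samples.std(ddof=1):.5f} sr^-1")
```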
Advanced Simulation of Coupled Earthquake and Tsunami Events
NASA Astrophysics Data System (ADS)
Behrens, Joern
2013-04-01
Tsunami-Earthquakes represent natural catastrophes threatening lives and well-being of societies in a solitary and unexpected extreme event as tragically demonstrated in Sumatra (2004), Samoa (2009), Chile (2010), or Japan (2011). Both phenomena are consequences of the complex system of interactions of tectonic stress, fracture mechanics, rock friction, rupture dynamics, fault geometry, ocean bathymetry, and coastline geometry. The ASCETE project forms an interdisciplinary research consortium that couples the most advanced simulation technologies for earthquake rupture dynamics and tsunami propagation to understand the fundamental conditions of tsunami generation. We report on the latest research results in physics-based dynamic rupture and tsunami wave propagation simulation, using unstructured and adaptive meshes with continuous and discontinuous Galerkin discretization approaches. Coupling both simulation tools - the physics-based dynamic rupture simulation and the hydrodynamic tsunami wave propagation - will give us the possibility to conduct highly realistic studies of the interaction of rupture dynamics and tsunami impact characteristics.
Factoring uncertainty into restoration modeling of in-situ leach uranium mines
Johnson, Raymond H.; Friedel, Michael J.
2009-01-01
Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision makers can use these results to better evaluate environmental risk, since future metal concentrations are then bounded by a limited range of possibilities based on a scientific evaluation of uncertainty.
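To make the forward Monte Carlo idea concrete, the hedged sketch below samples two uncertain parameters of a toy attenuation model (standing in for a full reactive transport simulation) and reports percentile bounds on a predicted uranium concentration; the model form, parameter ranges, and units are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for a reactive transport model: predicted uranium concentration
# at a monitoring well as a function of an uncertain sorption coefficient (kd)
# and groundwater velocity (v).  The exponential form and all parameter ranges
# are illustrative assumptions, not a calibrated site model.
def predicted_uranium(kd, v, c0=2.0, distance=100.0):
    retardation = 1.0 + 4.0 * kd               # simple linear-isotherm retardation
    travel_time = distance * retardation / v
    return c0 * np.exp(-0.01 * travel_time)    # first-order attenuation

n = 5000
kd = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # uncertain sorption (L/kg)
v = rng.uniform(0.05, 0.5, size=n)                # uncertain velocity (m/day)

conc = predicted_uranium(kd, v)
lo, med, hi = np.percentile(conc, [5, 50, 95])
print(f"Predicted U concentration: median {med:.3f}, 90% interval [{lo:.3f}, {hi:.3f}] mg/L")
```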
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Heng; Ye, Ming; Walker, Anthony P.
Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty but ignore the model uncertainty for process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating the model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
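A minimal sketch of the idea behind such a process sensitivity index, under assumed toy models: two alternative recharge models and two alternative geology models, each with its own random parameters, are sampled, and the first-order variance contribution of the recharge process (over both its models and parameters) is estimated by nested Monte Carlo. All model forms, priors, and equal model weights are illustrative assumptions, not the study's actual test case.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two alternative recharge process models (convert precipitation P to recharge)
# and two alternative geology models (hydraulic conductivity K).  Forms, priors,
# and model weights are illustrative assumptions only.
def recharge(model, p, theta):
    return theta * p if model == 0 else theta * np.sqrt(p)

def conductivity(model, theta):
    return 10.0 ** theta if model == 0 else 5.0 + theta

def head_output(r, k):
    return r / k                        # toy groundwater response

def sample_recharge():
    m = rng.integers(2)                 # equal prior model weights (assumption)
    theta = rng.uniform(0.1, 0.3) if m == 0 else rng.uniform(0.05, 0.15)
    return m, theta

def sample_geology():
    m = rng.integers(2)
    theta = rng.normal(0.0, 0.3) if m == 0 else rng.uniform(0.0, 2.0)
    return m, theta

p = 800.0                               # annual precipitation (mm), assumed fixed
n_outer, n_inner = 500, 200
cond_means, all_outputs = [], []
for _ in range(n_outer):                # fix the recharge process, average over geology
    rm, rt = sample_recharge()
    r = recharge(rm, p, rt)
    inner = [head_output(r, conductivity(*sample_geology())) for _ in range(n_inner)]
    cond_means.append(np.mean(inner))
    all_outputs.extend(inner)

ps_recharge = np.var(cond_means) / np.var(all_outputs)
print(f"First-order process sensitivity index for recharge: {ps_recharge:.2f}")
```

The outer loop holds the recharge process (model choice plus parameter draw) fixed while the inner loop averages over geology, so the variance of the conditional means approximates the variance attributable to the recharge process as a whole.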
Iranian Household Financial Protection against Catastrophic Health Care Expenditures
Moghadam, M Nekoei; Banshi, M; Javar, M Akbari; Amiresmaili, M; Ganjavi, S
2012-01-01
Background: Protecting households against financial risks is one of the objectives of any health system. In this regard, Iran's fourth five-year development plan, in its 90th article, articulated the goal of decreasing households' exposure to catastrophic health expenditure to one percent. Hence, this study aimed to measure the percentage of Iranian households exposed to catastrophic health expenditures and to explore its determinants. Methods: The present descriptive-analytical study was carried out retrospectively. Households whose financial contributions to the health system exceeded 40% of disposable income were considered exposed to catastrophic healthcare expenditures. Factors influencing catastrophic healthcare expenditures were examined by logistic regression and the chi-square test. Results: Of 39,088 households, 80 were excluded due to absence of food expenditures. In total, 2.8% of households were exposed to catastrophic health expenditures. Factors influencing catastrophic healthcare expenditure included use of ambulatory, hospital, and drug addiction cessation services, as well as consumption of pharmaceuticals. Socioeconomic characteristics such as health insurance coverage, household size, and economic status were other determinants of exposure to catastrophic healthcare expenditures. Conclusion: The Iranian health system has not achieved the objective of reducing catastrophic healthcare expenditure to one percent. Inefficient health insurance coverage, different fee schedules practiced by private and public providers, and failure of the referral system are probable barriers to decreasing households' exposure to catastrophic healthcare expenditures. PMID:23193508
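A hedged sketch of the catastrophic-expenditure calculation on synthetic households: out-of-pocket health spending is compared with capacity to pay (expenditure net of subsistence spending) against a 40% threshold, and crude exposure rates are tabulated by insurance status. All distributions and effect sizes are illustrative assumptions, not the survey data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic households: the 40% threshold on capacity to pay follows the
# definition paraphrased above; every distribution below is an assumption.
n = 5000
total_exp = rng.lognormal(mean=6.0, sigma=0.5, size=n)        # monthly expenditure
food_exp = 0.4 * total_exp * rng.uniform(0.8, 1.2, size=n)    # subsistence spending
insured = rng.random(n) < 0.5
health_exp = rng.lognormal(mean=2.0, sigma=1.2, size=n) * np.where(insured, 0.7, 1.0)

capacity_to_pay = np.maximum(total_exp - food_exp, 1e-9)      # non-subsistence budget
catastrophic = health_exp / capacity_to_pay > 0.40            # 40% rule

print(f"Share of households with catastrophic expenditure: {catastrophic.mean():.1%}")
print(f"  insured:   {catastrophic[insured].mean():.1%}")
print(f"  uninsured: {catastrophic[~insured].mean():.1%}")
```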
Pain Catastrophizing and Its Relationship with Health Outcomes: Does Pain Intensity Matter?
García-Palacios, Azucena; Botella, Cristina; Ribera-Canudas, Maria Victoria
2017-01-01
Pain catastrophizing is known to contribute to physical and mental functioning, even when controlling for the effect of pain intensity. However, research has yet to explore whether the strength of the relationship between pain catastrophizing and pain-related outcomes varies across pain intensity levels (i.e., moderation). If this was the case, it would have important implications for existing models of pain and current interventions. The present investigation explored whether pain intensity moderates the relationship between pain catastrophizing and pain-related outcomes. Participants were 254 patients (62% women) with heterogeneous chronic pain. Patients completed a measure of pain intensity, pain interference, pain catastrophizing, and physical and mental health. Pain intensity moderated the relationship between pain catastrophizing and pain interference and between pain catastrophizing and physical health status. Specifically, the strength of the correlation between pain catastrophizing and these outcomes decreased considerably as pain intensity increased. In contrast, pain intensity did not moderate the relationship between pain catastrophizing and mental health. Study findings provide a new insight into the role of pain intensity (i.e., moderator) in the relationship between pain catastrophizing and various pain-related outcomes, which might help develop existent models of pain. Clinical implications are discussed in the context of personalized therapy. PMID:28348506
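The moderation question can be illustrated with a hedged sketch on synthetic data: an ordinary least squares model with a catastrophizing-by-intensity interaction term, where a significant negative interaction coefficient would indicate that the catastrophizing-interference association weakens as pain intensity rises. Variable scales and effect sizes below are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

# Synthetic chronic-pain data with a built-in negative moderation effect;
# all magnitudes are illustrative assumptions, not the study's data.
n = 254
intensity = rng.uniform(0, 10, n)                 # 0-10 numeric rating scale
catastrophizing = rng.uniform(0, 52, n)           # PCS-like score range
slope = 0.8 - 0.06 * intensity                    # association weakens with intensity
interference = 2 + slope * catastrophizing + 0.5 * intensity + rng.normal(0, 5, n)

X = np.column_stack([catastrophizing, intensity, catastrophizing * intensity])
X = sm.add_constant(X)
fit = sm.OLS(interference, X).fit()

print("coefficients [const, PCS, intensity, PCS x intensity]:")
print(np.round(fit.params, 3))
print("p-value of interaction term:", round(fit.pvalues[-1], 4))
```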
NASA Astrophysics Data System (ADS)
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
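A minimal sketch of the variance decomposition used in such studies, assuming a toy factorial ensemble: each combination of a rainfall realization and a behavioural parameter set yields one simulated flow statistic, and classical two-way sums of squares attribute the ensemble variance to forcing, parameters, and their interaction. The toy runoff model and magnitudes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Factorial ensemble: n_f rainfall realizations x n_p behavioural parameter
# sets, each giving one simulated flow statistic.
n_f, n_p = 20, 15
rain = rng.normal(1000, 150, n_f)                    # annual rainfall (mm)
runoff_coef = rng.uniform(0.3, 0.6, n_p)             # behavioural parameter sets

flow = rain[:, None] * runoff_coef[None, :]          # simulated flow (mm)
flow += rng.normal(0, 10, flow.shape)                # interaction / residual noise

grand = flow.mean()
ss_forcing = n_p * ((flow.mean(axis=1) - grand) ** 2).sum()
ss_params = n_f * ((flow.mean(axis=0) - grand) ** 2).sum()
ss_total = ((flow - grand) ** 2).sum()
ss_interaction = ss_total - ss_forcing - ss_params

for name, ss in [("forcing", ss_forcing), ("parameters", ss_params),
                 ("interaction", ss_interaction)]:
    print(f"{name:12s}: {ss / ss_total:5.1%} of total variance")
```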
NASA Astrophysics Data System (ADS)
Smith, L. A.
2001-05-01
Many sources of uncertainty come into play when modelling geophysical systems by simulation. These include uncertainty in the initial condition, uncertainty in model parameter values (and the parameterisations themselves) and error in the model class from which the model(s) was selected. In recent decades, climate simulations have focused resources on reducing the last of these by including more and more details into the model. One can question when this "kitchen sink" approach should be complemented with realistic estimates of the impact from the other uncertainties noted above. Indeed, while the impact of model error can never be fully quantified, as all simulation experiments are interpreted under the rosy scenario which assumes a priori that nothing crucial is missing, the impact of other uncertainties can be quantified at the cost only of computational power, as illustrated, for example, in ensemble climate modelling experiments like Casino-21. This talk illustrates the interplay of uncertainties in the context of a trivial nonlinear system and an ensemble of models. The simple systems considered in this small-scale experiment, Keno-21, are meant to illustrate issues of experimental design; they are not intended to provide true climate simulations. The use of simulation models with huge numbers of parameters given limited data is usually justified by an appeal to the Laws of Physics: the number of free degrees of freedom is many fewer than the number of variables; the variables, parameterisations, and parameter values are constrained by "the physics", and the resulting simulation yields a realistic reproduction of the entire planet's climate system to within reasonable bounds. But what bounds, exactly? In a single model run under a transient forcing scenario, there are good statistical grounds for considering only large space and time averages; most of these reasons vanish if an ensemble of runs is made. Ensemble runs can quantify the (in)ability of a model to provide insight on regional changes: if a model cannot capture regional variations in the data on which the model was constructed (that is, in-sample), claims that out-of-sample predictions of those same regional averages should be used in policy making are vacuous. While motivated by climate modelling and illustrated on a trivial nonlinear system, these issues have implications across the range of geophysical modelling. These include implications for appropriate resource allocation, for the making of science policy, and for the public understanding of science and the role of uncertainty in decision making.
Climate data induced uncertainty in model-based estimations of terrestrial primary productivity
NASA Astrophysics Data System (ADS)
Wu, Zhendong; Ahlström, Anders; Smith, Benjamin; Ardö, Jonas; Eklundh, Lars; Fensholt, Rasmus; Lehsten, Veiko
2017-06-01
Model-based estimations of historical fluxes and pools of the terrestrial biosphere differ substantially. These differences arise not only from differences between models but also from differences in the environmental and climatic data used as input to the models. Here we investigate the role of uncertainties in historical climate data by performing simulations of terrestrial gross primary productivity (GPP) using a process-based dynamic vegetation model (LPJ-GUESS) forced by six different climate datasets. We find that the climate-induced uncertainty, defined as the range among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 11 Pg C yr-1 globally (9% of mean GPP). We also assessed a hypothetical maximum climate data induced uncertainty by combining climate variables from different datasets, which resulted in significantly larger uncertainties of 41 Pg C yr-1 globally or 32% of mean GPP. The uncertainty is partitioned into components associated with the three main climatic drivers, temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (climate data range) and the apparent sensitivity of the modeled GPP to the driver (apparent model sensitivity). We find that LPJ-GUESS overestimates GPP compared to empirically based GPP data products in all land cover classes except tropical forests. Tropical forests emerge as a disproportionate source of uncertainty in GPP estimation both in the simulations and in the empirical data products. The tropical forest uncertainty is most strongly associated with shortwave radiation and precipitation forcing, for which the climate data range contributes more to the overall uncertainty than the apparent model sensitivity to forcing. Globally, precipitation dominates the climate-induced uncertainty over nearly half of the vegetated land area, mainly due to the climate data range and less so due to the apparent model sensitivity. Overall, climate data ranges are found to contribute more to the climate-induced uncertainty than the apparent model sensitivity to forcing. Our study highlights the need to better constrain tropical climate, and demonstrates that uncertainty caused by climatic forcing data must be considered when comparing and evaluating carbon cycle model results and empirical datasets.
NASA Astrophysics Data System (ADS)
Islam, Siraj Ul; Déry, Stephen J.
2017-03-01
This study evaluates predictive uncertainties in the snow hydrology of the Fraser River Basin (FRB) of British Columbia (BC), Canada, using the Variable Infiltration Capacity (VIC) model forced with several high-resolution gridded climate datasets. These datasets include the Canadian Precipitation Analysis and the thin-plate smoothing splines (ANUSPLIN), North American Regional Reanalysis (NARR), University of Washington (UW) and Pacific Climate Impacts Consortium (PCIC) gridded products. Uncertainties are evaluated at different stages of the VIC implementation, starting with the driving datasets, optimization of model parameters, and model calibration during cool and warm phases of the Pacific Decadal Oscillation (PDO). The inter-comparison of the forcing datasets (precipitation and air temperature) and their VIC simulations (snow water equivalent - SWE - and runoff) reveals widespread differences over the FRB, especially in mountainous regions. The ANUSPLIN precipitation shows a considerable dry bias in the Rocky Mountains, whereas the NARR winter air temperature is 2 °C warmer than the other datasets over most of the FRB. In the VIC simulations, the elevation-dependent changes in the maximum SWE (maxSWE) are more prominent at higher elevations of the Rocky Mountains, where the PCIC-VIC simulation accumulates too much SWE and ANUSPLIN-VIC yields an underestimation. Additionally, at each elevation range, the day of maxSWE varies from 10 to 20 days between the VIC simulations. The snow melting season begins early in the NARR-VIC simulation, whereas the PCIC-VIC simulation delays the melting, indicating seasonal uncertainty in SWE simulations. When compared with the observed runoff for the Fraser River main stem at Hope, BC, the ANUSPLIN-VIC simulation shows considerable underestimation of runoff throughout the water year owing to reduced precipitation in the ANUSPLIN forcing dataset. The NARR-VIC simulation yields more winter and spring runoff and earlier decline of flows in summer due to a nearly 15-day earlier onset of the FRB springtime snowmelt. Analysis of the parametric uncertainty in the VIC calibration process shows that the choice of the initial parameter range plays a crucial role in defining the model hydrological response for the FRB. Furthermore, the VIC calibration process is biased toward cool and warm phases of the PDO and the choice of proper calibration and validation time periods is important for the experimental setup. Overall the VIC hydrological response is prominently influenced by the uncertainties involved in the forcing datasets rather than those in its parameter optimization and experimental setups.
NASA Astrophysics Data System (ADS)
Henneberg, Olga; Ament, Felix; Grützun, Verena
2018-05-01
Soil moisture amount and distribution control evapotranspiration and thus impact the occurrence of convective precipitation. Many recent model studies demonstrate that changes in initial soil moisture content result in modified convective precipitation. However, to quantify the resulting precipitation changes, the chaotic behavior of the atmospheric system needs to be considered. Slight changes in the simulation setup, such as the chosen model domain, also result in modifications to the simulated precipitation field. This causes an uncertainty due to stochastic variability, which can be large compared to effects caused by soil moisture variations. By shifting the model domain, we estimate the uncertainty of the model results. Our novel uncertainty estimate includes 10 simulations with shifted model boundaries and is compared to the effects on precipitation caused by variations in soil moisture amount and local distribution. With this approach, the influence of soil moisture amount and distribution on convective precipitation is quantified. Deviations in simulated precipitation can only be attributed to soil moisture impacts if the systematic effects of soil moisture modifications are larger than the inherent simulation uncertainty at the convection-resolving scale. We performed seven experiments with modified soil moisture amount or distribution to address the effect of soil moisture on precipitation. Each of the experiments consists of 10 ensemble members using the deep convection-resolving COSMO model with a grid spacing of 2.8 km. Only in experiments with very strong modification in soil moisture do precipitation changes exceed the model spread in amplitude, location or structure. These changes are caused by a 50 % soil moisture increase in either the whole or part of the model domain or by drying the whole model domain. Increasing or decreasing soil moisture both predominantly results in reduced precipitation rates. Replacing the soil moisture with realistic fields from different days has an insignificant influence on precipitation. The findings of this study underline the need for uncertainty estimates in soil moisture studies based on convection-resolving models.
Chasing Perfection: Should We Reduce Model Uncertainty in Carbon Cycle-Climate Feedbacks
NASA Astrophysics Data System (ADS)
Bonan, G. B.; Lombardozzi, D.; Wieder, W. R.; Lindsay, K. T.; Thomas, R. Q.
2015-12-01
Earth system model simulations of the terrestrial carbon (C) cycle show large multi-model spread in the carbon-concentration and carbon-climate feedback parameters. Large differences among models are also seen in their simulation of global vegetation and soil C stocks and other aspects of the C cycle, prompting concern about model uncertainty and our ability to faithfully represent fundamental aspects of the terrestrial C cycle in Earth system models. Benchmarking analyses that compare model simulations with common datasets have been proposed as a means to assess model fidelity with observations, and various model-data fusion techniques have been used to reduce model biases. While such efforts will reduce multi-model spread, they may not help reduce uncertainty (and increase confidence) in projections of the C cycle over the twenty-first century. Many ecological and biogeochemical processes represented in Earth system models are poorly understood at both the site scale and across large regions, where biotic and edaphic heterogeneity are important. Our experience with the Community Land Model (CLM) suggests that large uncertainty in the terrestrial C cycle and its feedback with climate change is an inherent property of biological systems. The challenge of representing life in Earth system models, with the rich diversity of lifeforms and complexity of biological systems, may necessitate a multitude of modeling approaches to capture the range of possible outcomes. Such models should encompass a range of plausible model structures. We distinguish between model parameter uncertainty and model structural uncertainty. Focusing on improved parameter estimates may, in fact, limit progress in assessing model structural uncertainty associated with realistically representing biological processes. Moreover, higher confidence may be achieved through better process representation, but this does not necessarily reduce uncertainty.
Degeling, Koen; IJzerman, Maarten J; Koopman, Miriam; Koffijberg, Hendrik
2017-12-15
Parametric distributions based on individual patient data can be used to represent both stochastic and parameter uncertainty. Although general guidance is available on how parameter uncertainty should be accounted for in probabilistic sensitivity analysis, there is no comprehensive guidance on reflecting parameter uncertainty in the (correlated) parameters of distributions used to represent stochastic uncertainty in patient-level models. This study aims to provide this guidance by proposing appropriate methods and illustrating the impact of this uncertainty on modeling outcomes. Two approaches, 1) using non-parametric bootstrapping and 2) using multivariate Normal distributions, were applied in a simulation and case study. The approaches were compared based on point estimates and distributions of time-to-event and health economic outcomes. To assess the impact of sample size on the uncertainty in these outcomes, sample size was varied in the simulation study and subgroup analyses were performed for the case study. Accounting for parameter uncertainty in distributions that reflect stochastic uncertainty substantially increased the uncertainty surrounding health economic outcomes, illustrated by larger confidence ellipses surrounding the cost-effectiveness point estimates and different cost-effectiveness acceptability curves. Although both approaches performed similarly for larger sample sizes (i.e., n = 500), the second approach was more sensitive to extreme values for small sample sizes (i.e., n = 25), yielding infeasible modeling outcomes. Modelers should be aware that parameter uncertainty in distributions used to describe stochastic uncertainty needs to be reflected in probabilistic sensitivity analysis, as it could substantially impact the total amount of uncertainty surrounding health economic outcomes. If feasible, the bootstrap approach is recommended to account for this uncertainty.
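A hedged sketch of the bootstrap approach on hypothetical time-to-event data: each non-parametric resample is refitted with a Weibull distribution, so that parameter uncertainty (and the correlation between shape and scale) propagates into downstream outcomes such as the mean time-to-event. The Weibull choice and all values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.special import gamma

rng = np.random.default_rng(6)

# Hypothetical individual-patient time-to-event data (months).
n_patients = 200
times = weibull_min.rvs(c=1.3, scale=14.0, size=n_patients, random_state=7)

# Non-parametric bootstrap: refit the Weibull to each resample so that
# parameter uncertainty (and its correlation) is carried into the analysis.
n_boot = 500
params = []
for _ in range(n_boot):
    resample = rng.choice(times, size=n_patients, replace=True)
    shape, _, scale = weibull_min.fit(resample, floc=0)
    params.append((shape, scale))
params = np.array(params)

# Mean of a Weibull: scale * Gamma(1 + 1/shape), evaluated per bootstrap draw.
mean_time = params[:, 1] * gamma(1.0 + 1.0 / params[:, 0])
lo, hi = np.percentile(mean_time, [2.5, 97.5])
print(f"Mean time-to-event: 95% interval [{lo:.1f}, {hi:.1f}] months")
print("Correlation of shape and scale across bootstraps:",
      round(np.corrcoef(params[:, 0], params[:, 1])[0, 1], 2))
```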
Chatrchyan, S.
2015-07-10
In our Letter, there was a component of the statistical uncertainty from the simulated PbPb Monte Carlo samples. This uncertainty was not propagated to all of the results. Figures 3 and 4 have been updated to reflect this source of uncertainty. In this case, the statistical uncertainties remain smaller than the systematic uncertainties in all cases such that the conclusions of the Letter are unaltered.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Donald D.; Gowardhan, Akshay; Cameron-Smith, Philip
2015-08-08
Here, a computational Bayesian inverse technique is used to quantify the effects of meteorological inflow uncertainty on tracer transport and source estimation in a complex urban environment. We estimate a probability distribution of meteorological inflow by comparing wind observations to Monte Carlo simulations from the Aeolus model. Aeolus is a computational fluid dynamics model that simulates atmospheric and tracer flow around buildings and structures at meter-scale resolution. Uncertainty in the inflow is propagated through forward and backward Lagrangian dispersion calculations to determine the impact on tracer transport and the ability to estimate the release location of an unknown source. Our uncertainty methods are compared against measurements from an intensive observation period during the Joint Urban 2003 tracer release experiment conducted in Oklahoma City.
Health-Related Financial Catastrophe, Inequality and Chronic Illness in Bangladesh
Rahman, Md. Mizanur; Gilmour, Stuart; Saito, Eiko; Sultana, Papia; Shibuya, Kenji
2013-01-01
Background Bangladesh has a high proportion of households incurring catastrophic health expenditure, and very limited risk sharing mechanisms. Identifying determinants of out-of-pocket (OOP) payments and catastrophic health expenditure may reveal opportunities to reduce costs and protect households from financial risk. Objective This study investigates the determinants of high healthcare expenditure and healthcare-related financial catastrophe. Methods A cross-sectional household survey was conducted in Rajshahi city, Bangladesh, in 2011. Catastrophic health expenditure was estimated separately based on capacity to pay and proportion of non-food expenditure. Determinants of OOP payments and financial catastrophe were estimated using double-hurdle and Poisson regression models, respectively. Results On average households spent 11% of their total budgets on health, half the residents spent 7% of the monthly per capita consumption expenditure for one illness, and nearly 9% of households faced financial catastrophe. The poorest households spent less on health but had a four times higher risk of catastrophe than the richest households. The risk of financial catastrophe and the level of OOP payments were higher for users of inpatient, public outpatient, and private facilities, respectively, compared to those using self-medication or traditional healers. Other determinants of OOP payments and catastrophic expenses were economic status, presence of chronic illness in the household, and illness among children and adults. Conclusion Households that received inpatient or outpatient private care experienced the highest burden of health expenditure. The poorest members of the community also face large, often catastrophic expenses. Chronic illness management is crucial to reducing the total burden of disease in a household and its associated increased risk of high OOP payments and catastrophic expenses. Households can only be protected from these situations by reducing the health system's dependency on OOP payments and providing more financial risk protection. PMID:23451102
Geomorphic legacy of medieval Himalayan earthquakes in the Pokhara Valley
NASA Astrophysics Data System (ADS)
Schwanghart, Wolfgang; Bernhardt, Anne; Stolle, Amelie; Hoelzmann, Philipp; Adhikari, Basanta R.; Andermann, Christoff; Tofelde, Stefanie; Merchel, Silke; Rugel, Georg; Fort, Monique; Korup, Oliver
2016-04-01
The Himalayas and their foreland belong to the world's most earthquake-prone regions. With millions of people at risk from severe ground shaking and associated damages, reliable data on the spatial and temporal occurrence of past major earthquakes is urgently needed to inform seismic risk analysis. Beyond the instrumental record such information has been largely based on historical accounts and trench studies. Written records provide evidence for damages and fatalities, yet are difficult to interpret when derived from the far-field. Trench studies, in turn, offer information on rupture histories, lengths and displacements along faults but involve high chronological uncertainties and fail to record earthquakes that do not rupture the surface. Thus, additional and independent information is required for developing reliable earthquake histories. Here, we present exceptionally well-dated evidence of catastrophic valley infill in the Pokhara Valley, Nepal. Bayesian calibration of radiocarbon dates from peat beds, plant macrofossils, and humic silts in fine-grained tributary sediments yields a robust age distribution that matches the timing of nearby M>8 earthquakes in ~1100, 1255, and 1344 AD. The upstream dip of tributary valley fills and X-ray fluorescence spectrometry of their provenance rule out local sediment sources. Instead, geomorphic and sedimentary evidence is consistent with catastrophic fluvial aggradation and debris flows that had plugged several tributaries with tens of meters of calcareous sediment from the Annapurna Massif >60 km away. The landscape-changing consequences of past large Himalayan earthquakes have so far been elusive. Catastrophic aggradation in the wake of two historically documented medieval earthquakes and one inferred from trench studies underscores that Himalayan valley fills should be considered as potential archives of past earthquakes. Such valley fills are pervasive in the Lesser Himalaya though high erosion rates reduce preservation potential. Further studies may wish to seek such remnants of prehistoric earthquakes using extensive sedimentological work as well as numerical age control.
NASA Astrophysics Data System (ADS)
Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.
2017-12-01
The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcing has the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
NASA Astrophysics Data System (ADS)
Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.
2016-12-01
This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign together with satellite and reanalysis data are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multiple-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and first-order effects dominate compared to the interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for improving the model physics parameterizations.
Evaluation of uncertainties in the CRCM-simulated North American climate
NASA Astrophysics Data System (ADS)
de Elía, Ramón; Caya, Daniel; Côté, Hélène; Frigon, Anne; Biner, Sébastien; Giguère, Michel; Paquin, Dominique; Harvey, Richard; Plummer, David
2008-02-01
This work is a first step in the analysis of uncertainty sources in the RCM-simulated climate over North America. Three main sets of sensitivity studies were carried out: the first estimates the magnitude of internal variability, which is needed to evaluate the significance of changes in the simulated climate induced by any model modification. The second is devoted to the role of CRCM configuration as a source of uncertainty, in particular the sensitivity to nesting technique, domain size, and driving reanalysis. The third study aims to assess the relative importance of the previously estimated sensitivities by performing two additional sensitivity experiments: one, in which the reanalysis driving data is replaced by data generated by the second generation Coupled Global Climate Model (CGCM2), and another, in which a different CRCM version is used. Results show that the internal variability, triggered by differences in initial conditions, is much smaller than the sensitivity to any other source. Results also show that levels of uncertainty originating from liberty of choices in the definition of configuration parameters are comparable among themselves and are smaller than those due to the choice of CGCM or CRCM version used. These results suggest that uncertainty originated by the CRCM configuration latitude (freedom of choice among domain sizes, nesting techniques and reanalysis dataset), although important, does not seem to be a major obstacle to climate downscaling. Finally, with the aim of evaluating the combined effect of the different uncertainties, the ensemble spread is estimated for a subset of the analysed simulations. Results show that downscaled surface temperature is in general more uncertain in the northern regions, while precipitation is more uncertain in the central and eastern US.
Predicting catastrophes of non-autonomous networks with visibility graphs and horizontal visibility
NASA Astrophysics Data System (ADS)
Zhang, Haicheng; Xu, Daolin; Wu, Yousheng
2018-05-01
Prediction of potential catastrophes in engineering systems is a challenging problem. We make a first attempt to construct a complex network to predict catastrophes of a multi-modular floating system in advance of their occurrence. Response time series of the system can be mapped into a virtual network by using the visibility graph or horizontal visibility algorithm. The topology characteristics of the networks can then be used to forecast catastrophes of the system. Numerical results show that there is an obvious correspondence between the variation of topology characteristics and the onset of catastrophes. A Catastrophe Index (CI) is proposed as a numerical indicator to measure a qualitative change from a stable state to a catastrophic state. The two approaches, the visibility graph and horizontal visibility algorithms, are compared using the index in a reliability analysis with different data lengths and sampling frequencies. The virtual network technique is potentially extendable to catastrophe prediction in other engineering systems.
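A minimal sketch of the natural visibility graph construction (using the standard line-of-sight criterion) applied to a toy response signal whose oscillations grow towards a large excursion; the windowed mean degree is printed as a crude topology indicator. The signal, window choice, and degree-based indicator are illustrative assumptions, not the paper's Catastrophe Index.

```python
import numpy as np

rng = np.random.default_rng(8)

def visibility_graph(y):
    """Natural visibility graph of a time series.
    Returns an adjacency matrix; O(n^3) brute force, fine for short windows."""
    n = len(y)
    adj = np.zeros((n, n), dtype=bool)
    for a in range(n):
        for b in range(a + 1, n):
            # visible if every intermediate sample lies below the line a-b
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            adj[a, b] = adj[b, a] = visible
    return adj

# Toy response: small-amplitude oscillation growing towards a large excursion.
t = np.linspace(0, 20, 400)
response = np.sin(2 * np.pi * t) * (1 + 0.2 * t) + 0.1 * rng.standard_normal(t.size)

window = 100
for start in (0, 150, 300):
    deg = visibility_graph(response[start:start + window]).sum(axis=0)
    print(f"window starting at sample {start:3d}: mean degree = {deg.mean():.2f}")
```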
Quantifying uncertainty and computational complexity for pore-scale simulations
NASA Astrophysics Data System (ADS)
Chen, C.; Yuan, Z.; Wang, P.; Yang, X.; Zhenyan, L.
2016-12-01
Pore-scale simulation is an essential tool to understand the complex physical processes in many environmental problems, from multi-phase flow in the subsurface to fuel cells. However, in practice, factors such as sample heterogeneity, data sparsity and, in general, our insufficient knowledge of the underlying process render many simulation parameters and hence the prediction results uncertain. Meanwhile, most pore-scale simulations (in particular, direct numerical simulation) incur high computational cost due to finely resolved spatio-temporal scales, which further limits our data/sample collection. To address these challenges, we propose a novel framework based on the generalized polynomial chaos (gPC) and build a surrogate model representing the essential features of the underlying system. To be specific, we apply the novel framework to analyze the uncertainties of the system behavior based on a series of pore-scale numerical experiments, such as flow and reactive transport in 2D heterogeneous porous media and 3D packed beds. Compared with recent pore-scale uncertainty quantification studies using Monte Carlo techniques, our new framework requires fewer realizations and hence considerably reduces the overall computational cost, while maintaining the desired accuracy.
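A hedged one-dimensional sketch of the gPC surrogate idea: for a single standard normal input, a probabilists' Hermite expansion is fitted by Gauss-Hermite projection to a cheap stand-in function, and its mean and variance are compared with brute-force Monte Carlo. The stand-in function and expansion order are illustrative assumptions; a real pore-scale application would replace it with the flow or transport solver and use multi-dimensional bases.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Cheap stand-in for an expensive model: a smooth nonlinear response to one
# uncertain input X ~ N(0, 1).  The function is an illustrative assumption.
def model(x):
    return np.exp(0.4 * x) + 0.3 * x ** 2

order = 6
nodes, weights = hermegauss(order + 1)              # Gauss-Hermite (probabilists')
fvals = model(nodes)

# Projection: c_k = E[f(X) He_k(X)] / k!, using He_k norms E[He_k^2] = k!.
coeffs = np.zeros(order + 1)
for k in range(order + 1):
    basis = hermeval(nodes, np.eye(order + 1)[k])
    coeffs[k] = (weights * fvals * basis).sum() / (sqrt(2 * pi) * factorial(k))

# Surrogate statistics versus brute-force Monte Carlo on the original model.
var_pce = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, order + 1))
x_mc = np.random.default_rng(9).standard_normal(200_000)
print(f"mean: PCE {coeffs[0]:.4f} vs MC {model(x_mc).mean():.4f}")
print(f"variance: PCE {var_pce:.4f} vs MC {model(x_mc).var():.4f}")
```

The appeal in the pore-scale setting is that the expensive solver is evaluated only at the handful of quadrature nodes, after which moments and sensitivities come from the surrogate coefficients rather than from thousands of Monte Carlo realizations.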
NASA Astrophysics Data System (ADS)
Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish
2018-06-01
Every model to characterise a real world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation or input errors make the prediction of modelled responses more uncertain. By way of a recently developed attribution metric, this study is aimed at developing a method for analysing variability in model inputs together with model structure variability to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments is used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.
Exploring the implication of climate process uncertainties within the Earth System Framework
NASA Astrophysics Data System (ADS)
Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.
2011-12-01
Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with General Circulation Models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. The suggestion from this work is that carbon cycle processes represent a contribution to uncertainty in future climate projections comparable to the contributions from the more conventionally explored atmospheric feedbacks. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).
The visualization of spatial uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srivastava, R.M.
1994-12-31
Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools. 3-D visualization tools allow them to create very realistic images of this single outcome as reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge to such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different, so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir is used to demonstrate that such animation is possible and that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.
NASA Astrophysics Data System (ADS)
Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.
2013-12-01
This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) incorporates a human activity model, it is possible to analyze uncertainties in emissions resulting from both uncertainties in the underlying socio-economic characteristics of the economic model and in the choice of climate-related policies. Another major feature is the flexibility to vary key climate parameters controlling the climate system response to changes in greenhouse gases and aerosols concentrations, e.g., climate sensitivity, ocean heat uptake rate, and strength of the aerosol forcing. The IGSM-CAM is not only able to realistically simulate the present-day mean climate and the observed trends at the global and continental scale, but it also simulates ENSO variability with realistic time scales, seasonality and patterns of SST anomalies, albeit with stronger magnitudes than observed. The IGSM-CAM shares the same general strengths and limitations as the Coupled Model Intercomparison Project Phase 3 (CMIP3) models in simulating present-day annual mean surface temperature and precipitation. Over land, the IGSM-CAM shows similar biases to the NCAR Community Climate System Model (CCSM) version 3, which shares the same atmospheric model. This study also presents 21st century simulations based on two emissions scenarios (unconstrained scenario and stabilization scenario at 660 ppm CO2-equivalent) similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios, and three sets of climate parameters. Results of the simulations with the chosen climate parameters provide a good approximation for the median, and the 5th and 95th percentiles of the probability distribution of 21st century changes in global mean surface air temperature from previous work with the IGSM. Because the IGSM-CAM framework only considers one particular climate model, it cannot be used to assess the structural modeling uncertainty arising from differences in the parameterization suites of climate models. However, comparison of the IGSM-CAM projections with simulations of 31 CMIP5 models under the RCP4.5 and RCP8.5 scenarios show that the range of warming at the continental scale shows very good agreement between the two ensemble simulations, except over Antarctica, where the IGSM-CAM overestimates the warming. This demonstrates that by sampling the climate system response, the IGSM-CAM, even though it relies on one single climate model, can essentially reproduce the range of future continental warming simulated by more than 30 different models. Precipitation changes projected in the IGSM-CAM simulations and the CMIP5 multi-model ensemble both display a large uncertainty at the continental scale. The two ensemble simulations show good agreement over Asia and Europe. However, the ranges of precipitation changes do not overlap - but display similar size - over Africa and South America, two continents where models generally show little agreement in the sign of precipitation changes and where CCSM3 tends to be an outlier. 
Overall, the IGSM-CAM provides an efficient and consistent framework to explore the large uncertainty in future projections of global and regional climate change associated with uncertainty in the climate response and projected emissions.
Ensembles modeling approach to study Climate Change impacts on Wheat
NASA Astrophysics Data System (ADS)
Ahmed, Mukhtar; Claudio, Stöckle O.; Nelson, Roger; Higgins, Stewart
2017-04-01
Simulations of crop yield under climate variability are subject to uncertainties, and quantification of such uncertainties is essential for effective use of projected results in adaptation and mitigation strategies. In this study we evaluated the uncertainties related to crop-climate models using five crop growth simulation models (CropSyst, APSIM, DSSAT, STICS and EPIC) and 14 general circulation models (GCMs) for 2 representative concentration pathways (RCP) of atmospheric CO2 (4.5 and 8.5 W m-2) in the Pacific Northwest (PNW), USA. The aim was to assess how different process-based crop models could be used accurately for estimation of winter wheat growth, development and yield. Firstly, all models were calibrated for high rainfall, medium rainfall, low rainfall and irrigated sites in the PNW using 1979-2010 as the baseline period. Response variables were related to farm management and soil properties, and included crop phenology, leaf area index (LAI), biomass and grain yield of winter wheat. All five models were run from 2000 to 2100 using the 14 GCMs and 2 RCPs to evaluate the effect of future climate (rainfall, temperature and CO2) on winter wheat phenology, LAI, biomass, grain yield and harvest index. Simulated time to flowering and maturity was reduced in all models except EPIC with some level of uncertainty. All models generally predicted an increase in biomass and grain yield under elevated CO2 but this effect was more prominent under rainfed conditions than irrigation. However, there was uncertainty in the simulation of crop phenology, biomass and grain yield under 14 GCMs during three prediction periods (2030, 2050 and 2070). We concluded that to improve accuracy and consistency in simulating wheat growth dynamics and yield under a changing climate, a multimodel ensemble approach should be used.
Complexities, Catastrophes and Cities: Emergency Dynamics in Varying Scenarios and Urban Topologies
NASA Astrophysics Data System (ADS)
Narzisi, Giuseppe; Mysore, Venkatesh; Byeon, Jeewoong; Mishra, Bud
Complex Systems are often characterized by agents capable of interacting with each other dynamically, often in non-linear and non-intuitive ways. Trying to characterize their dynamics often results in partial differential equations that are difficult, if not impossible, to solve. A large city or a city-state is an example of such an evolving and self-organizing complex environment that efficiently adapts to different and numerous incremental changes to its social, cultural and technological infrastructure [1]. One powerful technique for analyzing such complex systems is Agent-Based Modeling (ABM) [9], which has seen an increasing number of applications in social science, economics and also biology. The agent-based paradigm facilitates easier transfer of domain specific knowledge into a model. ABM provides a natural way to describe systems in which the overall dynamics can be described as the result of the behavior of populations of autonomous components: agents, with a fixed set of rules based on local information and possible central control. As part of the NYU Center for Catastrophe Preparedness and Response (CCPR), we have been exploring how ABM can serve as a powerful simulation technique for analyzing large-scale urban disasters. The central problem in Disaster Management is that it is not immediately apparent whether the current emergency plans are robust against such sudden, rare and punctuated catastrophic events.
Ellinas, Christos; Allan, Neil; Durugbo, Christopher; Johansson, Anders
2015-01-01
Current societal requirements necessitate the effective delivery of complex projects that can do more while using less. Yet, recent large-scale project failures suggest that our ability to successfully deliver them is still at its infancy. Such failures can be seen to arise through various failure mechanisms; this work focuses on one such mechanism. Specifically, it examines the likelihood of a project sustaining a large-scale catastrophe, as triggered by single task failure and delivered via a cascading process. To do so, an analytical model was developed and tested on an empirical dataset by the means of numerical simulation. This paper makes three main contributions. First, it provides a methodology to identify the tasks most capable of impacting a project. In doing so, it is noted that a significant number of tasks induce no cascades, while a handful are capable of triggering surprisingly large ones. Secondly, it illustrates that crude task characteristics cannot aid in identifying them, highlighting the complexity of the underlying process and the utility of this approach. Thirdly, it draws parallels with systems encountered within the natural sciences by noting the emergence of self-organised criticality, commonly found within natural systems. These findings strengthen the need to account for structural intricacies of a project's underlying task precedence structure as they can provide the conditions upon which large-scale catastrophes materialise.
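A hedged sketch of a single-task-failure cascade on a random task precedence network: a failed task propagates to a dependent task once more than a threshold fraction of that task's predecessors have failed, and cascade sizes are recorded for every possible seed task. The random DAG, the threshold rule, and all parameter values are illustrative assumptions, not the paper's analytical model or empirical dataset.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(10)

# Random task precedence structure (DAG): edge i -> j (i < j) means task j
# depends on task i.  The threshold-cascade rule below is an assumption.
n_tasks, p_edge, threshold = 200, 0.03, 0.5
adj = np.triu(rng.random((n_tasks, n_tasks)) < p_edge, k=1)

def cascade_size(seed_task):
    """Fail one task; a dependent task fails once more than `threshold`
    of its predecessors have failed.  Returns the number of failed tasks."""
    failed = np.zeros(n_tasks, dtype=bool)
    failed[seed_task] = True
    queue = deque([seed_task])
    while queue:
        t = queue.popleft()
        for succ in np.flatnonzero(adj[t]):
            if failed[succ]:
                continue
            preds = np.flatnonzero(adj[:, succ])
            if preds.size and failed[preds].mean() > threshold:
                failed[succ] = True
                queue.append(succ)
    return failed.sum()

sizes = np.array([cascade_size(t) for t in range(n_tasks)])
print(f"tasks inducing no cascade: {(sizes == 1).mean():.0%}")
print(f"largest cascade triggered by a single task failure: {sizes.max()} tasks")
```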
Estimating Uncertainty in Annual Forest Inventory Estimates
Ronald E. McRoberts; Veronica C. Lessard
1999-01-01
The precision of annual forest inventory estimates may be negatively affected by uncertainty from a variety of sources including: (1) sampling error; (2) procedures for updating plots not measured in the current year; and (3) measurement errors. The impact of these sources of uncertainty on final inventory estimates is investigated using Monte Carlo simulation...
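A minimal Monte Carlo sketch of the kind of uncertainty propagation described: in each realization a fraction of plots receives a noisy field re-measurement while the remainder is carried forward with a noisy model update, and the spread of the resulting mean-volume estimates approximates a simulation-based standard error. All error magnitudes and the plot population are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Illustrative annual inventory: one fifth of plots re-measured each year,
# the rest carried forward with a growth-model update.
n_plots, measured_frac = 1000, 0.2
true_vol = rng.gamma(shape=4.0, scale=25.0, size=n_plots)    # m^3/ha per plot

def one_realization():
    measured = rng.random(n_plots) < measured_frac
    est = np.where(
        measured,
        true_vol + rng.normal(0, 5.0, n_plots),              # measurement error
        true_vol * rng.normal(1.0, 0.08, n_plots),           # model-update error
    )
    return est.mean()

totals = np.array([one_realization() for _ in range(2000)])
print(f"mean plot volume estimate: {totals.mean():.1f} m^3/ha")
print(f"simulation-based standard error: {totals.std(ddof=1):.2f} m^3/ha")
```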
NASA Astrophysics Data System (ADS)
Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.
2017-12-01
Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties of the surface fluxes will have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for the shallow-cumulus clouds which are mainly driven by surface forcing. This study aims to quantify the uncertainties of the large-scale forcing caused by surface turbulence flux measurements and investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.
Reducing model uncertainty effects in flexible manipulators through the addition of passive damping
NASA Technical Reports Server (NTRS)
Alberts, T. E.
1987-01-01
An important issue in the control of practical systems is the effect of model uncertainty on closed loop performance. This is of particular concern when flexible structures are to be controlled, due to the fact that states associated with higher frequency vibration modes are truncated in order to make the control problem tractable. Digital simulations of a single-link manipulator system are employed to demonstrate that passive damping added to the flexible member reduces adverse effects associated with model uncertainty. A controller was designed based on a model including only one flexible mode. This controller was applied to larger order systems to evaluate the effects of modal truncation. Simulations using a Linear Quadratic Regulator (LQR) design assuming full state feedback illustrate the effect of control spillover. Simulations of a system using output feedback illustrate the destabilizing effect of observation spillover. The simulations reveal that the system with passive damping is less susceptible to these effects than the untreated case.
The uncertainty of crop yield projections is reduced by improved temperature response functions.
Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rötter, Reimund P; Kimball, Bruce A; Ottman, Michael J; Wall, Gerard W; White, Jeffrey W; Reynolds, Matthew P; Alderman, Phillip D; Aggarwal, Pramod K; Anothai, Jakarat; Basso, Bruno; Biernath, Christian; Cammarano, Davide; Challinor, Andrew J; De Sanctis, Giacomo; Doltra, Jordi; Fereres, Elias; Garcia-Vila, Margarita; Gayler, Sebastian; Hoogenboom, Gerrit; Hunt, Leslie A; Izaurralde, Roberto C; Jabloun, Mohamed; Jones, Curtis D; Kersebaum, Kurt C; Koehler, Ann-Kristin; Liu, Leilei; Müller, Christoph; Naresh Kumar, Soora; Nendel, Claas; O'Leary, Garry; Olesen, Jørgen E; Palosuo, Taru; Priesack, Eckart; Eyshi Rezaei, Ehsan; Ripoche, Dominique; Ruane, Alex C; Semenov, Mikhail A; Shcherbak, Iurii; Stöckle, Claudio; Stratonovitch, Pierre; Streck, Thilo; Supit, Iwan; Tao, Fulu; Thorburn, Peter; Waha, Katharina; Wallach, Daniel; Wang, Zhimin; Wolf, Joost; Zhu, Yan; Asseng, Senthold
2017-07-17
Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.
The Uncertainty of Crop Yield Projections Is Reduced by Improved Temperature Response Functions
NASA Technical Reports Server (NTRS)
Wang, Enli; Martre, Pierre; Zhao, Zhigan; Ewert, Frank; Maiorano, Andrea; Rotter, Reimund P.; Kimball, Bruce A.; Ottman, Michael J.; White, Jeffrey W.; Reynolds, Matthew P.;
2017-01-01
Increasing the accuracy of crop productivity estimates is a key element in planning adaptation strategies to ensure global food security under climate change. Process-based crop models are effective means to project climate impact on crop yield, but have large uncertainty in yield simulations. Here, we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature response functions that when substituted in four wheat models reduced the error in grain yield simulations across seven global sites with different temperature regimes by 19% to 50% (42% average). We anticipate the improved temperature responses to be a key step to improve modelling of crops under rising temperature and climate change, leading to higher skill of crop yield projections.
NASA Astrophysics Data System (ADS)
Hamidi, Mohammadreza; Shahanaghi, Kamran; Jabbarzadeh, Armin; Jahani, Ehsan; Pousti, Zahra
2017-12-01
In every production plant, it is necessary to have an estimate of the production level, and many parameters can affect this estimate. In this paper, we seek an appropriate estimate of the production level for an industrial factory called Barez in an uncertain environment. We consider a part of the production line that has different production times for different kinds of products, which introduces both environmental and system uncertainty. To solve the problem we simulated the line and, because of the uncertainty in the times, fuzzy simulation was adopted. The required fuzzy numbers are estimated using the bootstrap technique. The results were used in the production planning process by factory experts with satisfactory outcomes, and the experts' opinions on the efficiency of this methodology are also reported.
NASA Astrophysics Data System (ADS)
Tompkins, A. M.; Thomson, M. C.
2017-12-01
Simulations of the impact of climate variations on a vector-borne disease such as malaria are subject to a number of sources of uncertainty. These include the model structure and parameter settings in addition to errors in the climate data and the neglect of their spatial heterogeneity, especially over complex terrain. We use a constrained genetic algorithm to confront these two sources of uncertainty for malaria transmission in the highlands of Kenya. The technique calibrates the parameter settings of a process-based, mathematical model of malaria transmission to vary within their assessed level of uncertainty and also allows the calibration of the driving climate data. The simulations show that in highland settings close to the threshold for sustained transmission, the uncertainty in climate is more important to address than the malaria model uncertainty. Applications of the coupled climate-malaria modelling system are briefly presented.
Radiometric properties of the NS001 Thematic Mapper Simulator aircraft multispectral scanner
NASA Technical Reports Server (NTRS)
Markham, Brian L.; Ahmad, Suraiya P.
1990-01-01
Laboratory tests of the NS001 TM are described emphasizing absolute calibration to determine the radiometry of the simulator's reflective channels. In-flight calibration of the data is accomplished with the NS001 internal integrating-sphere source because instabilities in the source can limit the absolute calibration. The data from 1987-89 indicate uncertainties of up to 25 percent with an apparent average uncertainty of about 15 percent. Also identified are dark current drift and sensitivity changes along the scan line, random noise, and nonlinearity which contribute errors of 1-2 percent. Uncertainties similar to hysteresis are also noted especially in the 2.08-2.35-micron range which can reduce sensitivity and cause errors. The NS001 TM Simulator demonstrates a polarization sensitivity that can generate errors of up to about 10 percent depending on the wavelength.
Drift induced by repeated hydropeaking waves in controlled conditions
NASA Astrophysics Data System (ADS)
Maiolini, Bruno; Bruno, M. Cristina; Biffi, Sofia; Cashman, Matthew J.
2014-05-01
Repeated hydropeaking events characterize most alpine rivers downstream of power plants fed by high-elevation reservoirs. The effects of hydropeaking on benthic communities are well known; each hydropeaking wave typically causes an increase in tractive force and changes in temperature and water quality. Simulations of hydropeaking in artificial systems can help to disentangle the direct effects of the modified flow regime from impacts associated with other physico-chemical changes, and from the effects of river regulation and land-use changes that often accompany water resource development. In September 2013 we conducted a set of controlled simulations in five steel flumes fed by an Alpine stream (Fersina stream, Adige River catchment, Trentino, Italy), where benthic invertebrates can freely colonize the flumes. One flume was used as a control with no change in flow; in the other four flumes we simulated a hydropeaking wave lasting six hours, repeated for five consecutive days. Flow was increased to twice baseflow in two flumes and to three times baseflow in the other two. We collected benthic samples before the beginning (morning of day 1) and after the end (afternoon of day 5) of the set of simulations to evaluate changes in the benthic communities due to induced drift migration. During each simulation, we collected drifting organisms at short time intervals to assess the responses to: 1) the initial discharge increase; 2) the persistence of high flows for several hours; 3) the decrease of discharge to baseflow; and 4) the change in drift on each successive day. Preliminary results indicate typical strong increases of catastrophic drift at the onset of each simulated hydropeaking, drift responses proportional to the absolute discharge increase, and a decrease in the drift responses over successive days. Different taxa responded with different patterns: taxa that resist the tractive force increased in drift only during the periods of baseflow that followed the habitat stress (behavioral drift; e.g., Simuliidae), whereas taxa that cannot resist the increase in tractive force drifted from the beginning of the simulation (catastrophic drift; e.g., Chironomidae).
Should catastrophic risks be included in a regulated competitive health insurance market?
van de Ven, W P; Schut, F T
1994-11-01
In 1988 the Dutch government launched a proposal for a national health insurance based on regulated competition. The mandatory benefits package should be offered by competing insurers and should cover both non-catastrophic risks (like hospital care, physician services and drugs) and catastrophic risks (like several forms of expensive long-term care). However, there are two arguments to exclude some of the catastrophic risks from the competitive insurance market, at least during the implementation process of the reforms. Firstly, the prospects for a workable system of risk-adjusted payments to the insurers that should take away the incentives for cream skimming are, at least during the next 5 years, more favorable for the non-catastrophic risks than for the catastrophic risks. Secondly, even if a workable system of risk-adjusted payments can be developed, the problem of quality skimping may be relevant for some of the catastrophic risks, but not for non-catastrophic risks. By 'quality skimping' we mean the reduction of the quality of care to a level which is below the minimum level that is acceptable to society. After 5 years of health care reforms in the Netherlands new insights have resulted in growing support to confine the implementation of the reforms to the non-catastrophic risks. In drawing (and redrawing) the exact boundaries between different regulatory regimes for catastrophic and non-catastrophic risks, the expected benefits of a cost-effective substitution of care have to be weighed against the potential harm caused by cream skimming and quality skimping.
Personality and Temperament Correlates of Pain Catastrophizing in Young Adolescents
ERIC Educational Resources Information Center
Muris, Peter; Meesters, Cor; van den Hout, Anja; Wessels, Sylvia; Franken, Ingmar; Rassin, Eric
2007-01-01
Pain catastrophizing is generally viewed as an important cognitive factor underlying chronic pain. The present study examined personality and temperament correlates of pain catastrophizing in a sample of young adolescents (N = 132). Participants completed the Pain Catastrophizing Scale for Children, as well as scales for measuring sensitivity of…
NASA Astrophysics Data System (ADS)
Rivera, Diego; Rivas, Yessica; Godoy, Alex
2015-02-01
Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, a situation known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variation of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range for the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3 s-1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite criticisms of the GLUE methodology, such as its lack of statistical formality, it proves to be a useful tool assisting the modeller with the identification of critical parameters.
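The GLUE workflow summarized above can be illustrated with a short, self-contained sketch. The toy one-parameter reservoir model, the Nash-Sutcliffe likelihood measure, the behavioural threshold of 0.7, and the 5-95% bounds are all assumptions made for illustration; they are not the Chillan River model or the settings used in the study.

```python
# Minimal GLUE-style sketch (illustrative toy model and thresholds only).
import numpy as np

rng = np.random.default_rng(0)

def simulate(k, rain):
    # toy linear-reservoir response; stands in for the lumped conceptual model
    q = np.zeros_like(rain)
    store = 0.0
    for i, p in enumerate(rain):
        store += p
        q[i] = k * store
        store -= q[i]
    return q

rain = rng.gamma(2.0, 5.0, size=200)
q_obs = simulate(0.3, rain) + rng.normal(0, 0.5, size=200)   # synthetic "observations"

# 1) sample candidate parameter sets from a prior range
k_samples = rng.uniform(0.05, 0.95, size=5000)
ns = np.array([1 - np.sum((simulate(k, rain) - q_obs) ** 2) /
                   np.sum((q_obs - q_obs.mean()) ** 2) for k in k_samples])

# 2) keep "behavioural" parameter sets above a likelihood threshold
behavioural = k_samples[ns > 0.7]

# 3) uncertainty bounds: quantiles of simulated runoff across behavioural sets
sims = np.array([simulate(k, rain) for k in behavioural])
lower, upper = np.percentile(sims, [5, 95], axis=0)
print(len(behavioural), float(np.mean(upper - lower)))
```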
NASA Astrophysics Data System (ADS)
Bermejo-Moreno, Ivan; Campo, Laura; Larsson, Johan; Emory, Mike; Bodart, Julien; Palacios, Francisco; Iaccarino, Gianluca; Eaton, John
2013-11-01
We study the interaction between an oblique shock wave and the turbulent boundary layers inside a nearly-square duct by combining wall-modeled LES, 2D and 3D RANS simulations, targeting the experiment of Campo, Helmer & Eaton, 2012 (nominal conditions: M = 2.05, Reθ = 6,500). A primary objective is to quantify the effect of aleatory and epistemic uncertainties on the STBLI. Aleatory uncertainties considered include the inflow conditions (Mach number of the incoming air stream and thickness of the boundary layers) and perturbations of the duct geometry upstream of the interaction. The epistemic uncertainty under consideration focuses on the RANS turbulence model form by injecting perturbations in the Reynolds stress anisotropy in regions of the flow where the model assumptions (in particular, the Boussinesq eddy-viscosity hypothesis) may be invalid. These perturbations are then propagated through the flow solver into the solution. The uncertainty quantification (UQ) analysis is done through 2D and 3D RANS simulations, assessing the importance of the three-dimensional effects imposed by the nearly-square duct geometry. Wall-modeled LES are used to verify elements of the UQ methodology and to explore the flow features and physics of the STBLI for multiple shock strengths. Financial support from the United States Department of Energy under the PSAAP program is gratefully acknowledged.
NASA Technical Reports Server (NTRS)
Fifrey, Priscilla
2010-01-01
Today, modeling and simulation, as an important practice, industry, and area of expertise, is at a complex crossroad, a sort of cyber-highway where complexities meet: technical, economic, environmental, geopolitical and cultural. They may converge or collide. Let's not kid ourselves: it is all too much for any one person or organization. Malcolm Gladwell said it: "We have constructed a world in which the potential for high tech catastrophe is embedded in the fabric of everyday life." We are surrounded by problems that scream at us from our television, Internet and social networks along with billboards and protest signs. We face not just high tech catastrophes but also landslides, earthquakes, tornados, floods and hurricanes, and large-scale criminality. Evil, war, famine and pestilence have not gone away. It is all too much to think about. My friend George Peabody, who taught me everything I know about power, said that addressing such issues requires that we constantly build our network, information resources and the credibility and visibility of our work. That is how we will build the power of simulation so it can change the world, even, maybe, save it. We need all the help we can get and give one another, because our human early warning systems appear to be out of kilter. We seem to have trouble imagining how small failings can continue to lead to catastrophic disaster. Think about O-rings and blowout preventers. One is reminded of the old nursery rhyme: "For want of a nail, a shoe was lost; for want of a shoe, the horse was lost; for want of a rider, the battle was lost; and so the kingdom fell." Although the investigation will take more time for real answers, it is worrisome that a rig worker reported to the BBC that, weeks before the explosion of the Deepwater Horizon, he identified a leak in the oil rig's safety equipment: the control pod of the blowout preventer, which has giant shears designed to cut and seal off the well's main pipe. With both electronics and hydraulics, these are effectively the brains of the blowout preventer. No one fixed it, he alleges; they just shut it down and relied on the other control pod, an act deemed unacceptable by petroleum expert Tad Patzek at the University of Texas. The US Congress has identified numerous other problems with the blowout preventer, including design problems and unexpected modifications.
Addressing uncertainty in atomistic machine learning.
Peterson, Andrew A; Christensen, Rune; Khorshidi, Alireza
2017-05-10
Machine-learning regression has been demonstrated to precisely emulate the potential energy and forces that are output from more expensive electronic-structure calculations. However, to predict new regions of the potential energy surface, an assessment must be made of the credibility of the predictions. In this perspective, we address the types of errors that might arise in atomistic machine learning, the unique aspects of atomistic simulations that make machine-learning challenging, and highlight how uncertainty analysis can be used to assess the validity of machine-learning predictions. We suggest this will allow researchers to more fully use machine learning for the routine acceleration of large, high-accuracy, or extended-time simulations. In our demonstrations, we use a bootstrap ensemble of neural network-based calculators, and show that the width of the ensemble can provide an estimate of the uncertainty when the width is comparable to that in the training data. Intriguingly, we also show that the uncertainty can be localized to specific atoms in the simulation, which may offer hints for the generation of training data to strategically improve the machine-learned representation.
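The ensemble-width idea described in this abstract can be sketched with a small bootstrap ensemble of regressors. The one-dimensional toy potential and the scikit-learn MLPs below are stand-ins for the atomistic descriptors and neural-network calculators used by the authors; the ensemble size and hyperparameters are illustrative only.

```python
# Sketch of bootstrap-ensemble uncertainty on a toy 1-D "potential".
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
x_train = rng.uniform(-2, 2, size=(200, 1))
y_train = np.sin(3 * x_train[:, 0]) + 0.05 * rng.normal(size=200)

ensemble = []
for seed in range(10):
    idx = rng.integers(0, len(x_train), size=len(x_train))   # bootstrap resample
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=seed)
    model.fit(x_train[idx], y_train[idx])
    ensemble.append(model)

x_test = np.linspace(-3, 3, 61).reshape(-1, 1)               # extends beyond the training range
preds = np.stack([m.predict(x_test) for m in ensemble])
mean, width = preds.mean(axis=0), preds.std(axis=0)
# large ensemble width flags regions (here |x| > 2) where predictions are untrustworthy
print(width[:5], width[-5:])
```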
NASA Astrophysics Data System (ADS)
Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.
2014-12-01
This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model, through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for the study. We used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classic Normal likelihood, r ~ N(0, σ²); and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a skew exponential power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach is adequate for the proposed objectives, reinforcing the importance of assessing the uncertainties associated with hydrological modeling.
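As a hedged illustration of the Bayesian idea only (not the DREAM sampler or the Rio Grande model used in the study), the following sketch fits a toy rainfall-runoff response with a plain random-walk Metropolis sampler and a Normal residual likelihood; all data, priors, and parameter ranges are synthetic.

```python
# Toy Bayesian parameter estimation with random-walk Metropolis (not DREAM).
import numpy as np

rng = np.random.default_rng(2)

def simulate(theta, rain):
    k, c = theta
    return c * (1 - np.exp(-k * rain))          # toy rainfall-runoff response

rain = rng.gamma(2.0, 5.0, size=150)
q_obs = simulate((0.2, 8.0), rain) + rng.normal(0, 0.4, size=150)

def log_post(theta, sigma=0.4):
    if not (0 < theta[0] < 1 and 0 < theta[1] < 20):
        return -np.inf                           # flat prior on a box
    resid = q_obs - simulate(theta, rain)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta, lp, chain = np.array([0.5, 5.0]), None, []
lp = log_post(theta)
for _ in range(20000):
    prop = theta + rng.normal(0, [0.02, 0.2])    # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain[5000:])                   # discard burn-in
print(chain.mean(axis=0), chain.std(axis=0))     # posterior means and uncertainties
```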
Probabilistic evaluation of uncertainties and risks in aerospace components
NASA Technical Reports Server (NTRS)
Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.
1992-01-01
A methodology is presented for the computational simulation of primitive variable uncertainties, and attention is given to the simulation of specific aerospace components. Specific examples treated encompass a probabilistic material behavior model, as well as static, dynamic, and fatigue/damage analyses of a turbine blade in a mistuned bladed rotor in the SSME turbopumps. An account is given of the use of the NESSUS probabilistic FEM analysis computer code.
Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry
NASA Technical Reports Server (NTRS)
Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak
2011-01-01
This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s era shock-tube and constricted-arc experimental cases. It is shown that experiments contain shock layer temperatures and radiative flux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength dependent and wavelength integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations.
Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that this existing data is not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.
Petrich, Nicholas T.; Spak, Scott N.; Carmichael, Gregory R.; Hu, Dingfei; Martinez, Andres; Hornbuckle, Keri C.
2013-01-01
Passive air samplers (PAS) including polyurethane foam (PUF) are widely deployed as an inexpensive and practical way to sample semi-volatile pollutants. However, concentration estimates from PAS rely on constant empirical mass transfer rates, which add unquantified uncertainties to concentrations. Here we present a method for modeling hourly sampling rates for semi-volatile compounds from hourly meteorology using first-principle chemistry, physics, and fluid dynamics, calibrated from depuration experiments. This approach quantifies and explains observed effects of meteorology on variability in compound-specific sampling rates and analyte concentrations; simulates nonlinear PUF uptake; and recovers synthetic hourly concentrations at a reference temperature. Sampling rates are evaluated for polychlorinated biphenyl congeners at a network of Harner model samplers in Chicago, Illinois during 2008, finding simulated average sampling rates within analytical uncertainty of those determined from loss of depuration compounds, and confirming quasi-linear uptake. Results indicate hourly, daily and interannual variability in sampling rates, sensitivity to temporal resolution in meteorology, and predictable volatility-based relationships between congeners. We quantify importance of each simulated process to sampling rates and mass transfer and assess uncertainty contributed by advection, molecular diffusion, volatilization, and flow regime within the PAS, finding PAS chamber temperature contributes the greatest variability to total process uncertainty (7.3%). PMID:23837599
NASA Astrophysics Data System (ADS)
Bhuiyan, M. A. E.; Nikolopoulos, E. I.; Anagnostou, E. N.
2017-12-01
Quantifying the uncertainty of global precipitation datasets is beneficial when using these precipitation products in hydrological applications, because precipitation uncertainty propagating through hydrologic modeling can significantly affect the accuracy of the simulated hydrologic variables. In this research the Iberian Peninsula is used as the study area, with a study period spanning eleven years (2000-2010). This study evaluates the performance of multiple hydrologic models forced with combined global rainfall estimates derived with a Quantile Regression Forests (QRF) technique. The QRF technique utilizes three satellite precipitation products (CMORPH, PERSIANN, and 3B42 (V7)); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. A high-resolution precipitation dataset driven by ground-based observations (named SAFRAN), available at 5 km/1 h resolution, is used as the reference. Through the QRF blending framework, the stochastic error model produces error-adjusted ensemble precipitation realizations, which are used to force four global hydrological models (JULES (Joint UK Land Environment Simulator), WaterGAP3 (Water-Global Assessment and Prognosis), ORCHIDEE (Organizing Carbon and Hydrology in Dynamic Ecosystems), and SURFEX (Surface Externalisée)) to simulate three hydrologic variables (surface runoff, subsurface runoff, and evapotranspiration). The models are also forced with the reference precipitation to generate reference-based hydrologic simulations. This study presents a comparative analysis of multiple hydrologic model simulations for different hydrologic variables and the impact of the blending algorithm on the simulated hydrologic variables. Results show how precipitation uncertainty propagates through the different hydrologic model structures and how the blending reduces the error in the simulated hydrologic variables.
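A rough sketch of the blending step follows: several noisy precipitation predictors are regressed against a reference, and the spread of per-tree predictions is used as an approximate predictive distribution. This is a crude stand-in for true quantile regression forests (which weight training responses within leaves rather than collecting per-tree outputs), and the synthetic predictors only mimic the satellite, reanalysis, and covariate inputs named above.

```python
# Approximate "quantile" estimates from per-tree random-forest predictions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n = 2000
truth = rng.gamma(2.0, 3.0, size=n)                       # "reference" precipitation
X = np.column_stack([truth * rng.lognormal(0, 0.3, n),    # satellite product 1
                     truth * rng.lognormal(0, 0.5, n),    # satellite product 2
                     truth + rng.normal(0, 2.0, n),       # reanalysis estimate
                     rng.normal(15, 8, n)])               # air temperature covariate

rf = RandomForestRegressor(n_estimators=200, min_samples_leaf=20, random_state=0)
rf.fit(X[:1500], truth[:1500])

per_tree = np.stack([t.predict(X[1500:]) for t in rf.estimators_])
q10, q90 = np.percentile(per_tree, [10, 90], axis=0)
# q10..q90 spread approximates error-adjusted precipitation realizations
print(float(np.mean(q90 - q10)))
```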
Carter, Tony
2010-01-01
Volatile business conditions have led to drastic corporate downsizing, meaning organizations are expected to do more with less. Managers must be more knowledgeable and possess a broader, more eclectic set of business skills, many of which were not even required until recently. Many internal and external changes have dictated the need for organizations to do business differently. Changes such as technological advances; globalization; catastrophic business crises; a more frantic competitive climate; and more demanding, sophisticated customers are examples of shifts in the external business environment. Internal changes have taken the form of reengineering, accompanied by structural realignments and downsizing; greater emphasis on quality levels in product and service output; faster communication channels; and a more educated, skilled employee base with higher expectations of management.
NASA Astrophysics Data System (ADS)
Gelati, Emiliano; Decharme, Bertrand; Calvet, Jean-Christophe; Minvielle, Marie; Polcher, Jan; Fairbairn, David; Weedon, Graham P.
2018-04-01
Physically consistent descriptions of land surface hydrology are crucial for planning human activities that involve freshwater resources, especially in light of the expected climate change scenarios. We assess how atmospheric forcing data uncertainties affect land surface model (LSM) simulations by means of an extensive evaluation exercise using a number of state-of-the-art remote sensing and station-based datasets. For this purpose, we use the CO2-responsive ISBA-A-gs LSM coupled with the CNRM version of the Total Runoff Integrated Pathways (CTRIP) river routing model. We perform multi-forcing simulations over the Euro-Mediterranean area (25-75.5° N, 11.5° W-62.5° E, at 0.5° resolution) from 1979 to 2012. The model is forced using four atmospheric datasets. Three of them are based on the ERA-Interim reanalysis (ERA-I). The fourth dataset is independent from ERA-Interim: PGF, developed at Princeton University. The hydrological impacts of atmospheric forcing uncertainties are assessed by comparing simulated surface soil moisture (SSM), leaf area index (LAI) and river discharge against observation-based datasets: SSM from the European Space Agency's Water Cycle Multi-mission Observation Strategy and Climate Change Initiative projects (ESA-CCI), LAI of the Global Inventory Modeling and Mapping Studies (GIMMS), and Global Runoff Data Centre (GRDC) river discharge. The atmospheric forcing data are also compared to reference datasets. Precipitation is the most uncertain forcing variable across datasets, while the most consistent are air temperature and SW and LW radiation. At the monthly timescale, SSM and LAI simulations are relatively insensitive to forcing uncertainties. Some discrepancies with ESA-CCI appear to be forcing-independent and may be due to different assumptions underlying the LSM and the remote sensing retrieval algorithm. All simulations overestimate average summer and early-autumn LAI. Forcing uncertainty impacts on simulated river discharge are larger on mean values and standard deviations than on correlations with GRDC data. Anomaly correlation coefficients are not inferior to those computed from raw monthly discharge time series, indicating that the model reproduces inter-annual variability fairly well. However, simulated river discharge time series generally feature larger variability compared to measurements. They also tend to overestimate winter-spring high flows and underestimate summer-autumn low flows. Considering that several differences emerge between simulations and reference data, which may not be completely explained by forcing uncertainty, we suggest several research directions. These range from further investigating the discrepancies between LSMs and remote sensing retrievals to developing new model components to represent physical and anthropogenic processes.
Extended Importance Sampling for Reliability Analysis under Evidence Theory
NASA Astrophysics Data System (ADS)
Yuan, X. K.; Chen, B.; Zhang, B. Q.
2018-05-01
In the early stages of engineering practice, the lack of data and information makes uncertainty difficult to deal with. However, evidence theory has been proposed as an alternative to traditional probability theory for handling uncertainty with limited information. In this contribution, a simulation-based approach, called ‘extended importance sampling’, is proposed based on evidence theory to handle problems with epistemic uncertainty. The proposed approach stems from traditional importance sampling for reliability analysis under probability theory and is developed to handle problems with epistemic uncertainty. It first introduces a nominal instrumental probability density function (PDF) for every epistemic uncertainty variable, so that an ‘equivalent’ reliability problem under probability theory is obtained. Samples of these variables are then generated by importance sampling. Based on these samples, the plausibility and belief (upper and lower bounds of the probability) can be estimated. The approach is more efficient than direct Monte Carlo simulation. Numerical and engineering examples are given to illustrate the efficiency and feasibility of the proposed approach.
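The belief/plausibility bounds targeted by the proposed method can be illustrated with a deliberately simplified sketch that samples inside each focal element by brute force, rather than using the paper's importance-sampling construction. The limit-state function, the focal intervals, and the basic probability assignments below are invented for illustration.

```python
# Brute-force estimate of belief/plausibility of failure under an evidence structure.
import numpy as np

rng = np.random.default_rng(4)

def g(x):                      # toy limit-state function; g(x) <= 0 denotes failure
    return 2.5 - x

# Evidence structure: focal elements (intervals) with basic probability assignments
focal = [((0.0, 2.0), 0.5), ((1.5, 3.0), 0.3), ((2.5, 4.0), 0.2)]

bel, pl = 0.0, 0.0
for (lo, hi), mass in focal:
    xs = rng.uniform(lo, hi, size=10000)
    fails = g(xs) <= 0
    if fails.any():            # failure possible somewhere in the focal element
        pl += mass
    if fails.all():            # failure certain everywhere in the focal element
        bel += mass
print(bel, pl)                 # lower/upper bounds on the failure probability
```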
Bei, Naifang; Li, Guohui; Meng, Zhiyong; Weng, Yonghui; Zavala, Miguel; Molina, L T
2014-11-15
The purpose of this study is to investigate the impact of using an ensemble Kalman filter (EnKF) on air quality simulations in the California-Mexico border region on two days (May 30 and June 04, 2010) during Cal-Mex 2010. The uncertainties in ozone (O3) and aerosol simulations in the border area due to uncertainties in the meteorological initial conditions were examined through ensemble simulations. The ensemble spread of surface O3 averaged over the coastal region was less than 10 ppb. The spreads in the nitrate and ammonium aerosols were substantial on both days, mostly caused by the large uncertainties in the surface temperature and humidity simulations. In general, the forecast initialized with the EnKF analysis (EnKF) improved the simulation of meteorological fields to some degree in the border region compared to the reference forecast initialized with NCEP analysis data (FCST) and the simulation with observation nudging (FDDA), in turn leading to more reasonable air quality simulations. The simulated surface O3 distributions from EnKF were consistently better than FCST and FDDA on both days. EnKF usually produced more reasonable simulations of nitrate and ammonium aerosols compared to the observations, but still had difficulty improving the simulations of organic and sulfate aerosols. However, discrepancies between the EnKF simulations and the measurements were still considerably large, particularly for sulfate and organic aerosols, indicating that there is still ample room for improvement in the present data assimilation and/or modeling systems. Copyright © 2014 Elsevier B.V. All rights reserved.
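For readers unfamiliar with the EnKF, the following generic sketch shows one common variant of the analysis (update) step, with perturbed observations, on a toy state vector; it is not the meteorological modeling system or observation network used in the study, and the observation operator, error covariance, and ensemble size are arbitrary.

```python
# Generic perturbed-observation EnKF analysis step on a toy state vector.
import numpy as np

rng = np.random.default_rng(5)
n_state, n_obs, n_ens = 8, 3, 40

ens = rng.normal(0.0, 1.0, size=(n_state, n_ens))      # forecast (prior) ensemble
H = np.zeros((n_obs, n_state)); H[0, 0] = H[1, 3] = H[2, 6] = 1.0
R = 0.25 * np.eye(n_obs)                               # observation-error covariance
y = np.array([1.0, -0.5, 0.8])                         # observations

x_mean = ens.mean(axis=1, keepdims=True)
A = ens - x_mean                                       # ensemble anomalies
P = A @ A.T / (n_ens - 1)                              # sample forecast covariance
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)           # Kalman gain

# each member assimilates a noisy copy of the observations
y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
analysis = ens + K @ (y_pert - H @ ens)
print(analysis.mean(axis=1))                           # analysis-mean state
```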
Multiple Sclerosis and Catastrophic Health Expenditure in Iran.
Juyani, Yaser; Hamedi, Dorsa; Hosseini Jebeli, Seyede Sedighe; Qasham, Maryam
2016-09-01
There are many disabling medical conditions that can result in catastrophic health expenditure. Multiple sclerosis is one of the most costly medical conditions in the world and can expose families to catastrophic health expenditures. This study aims to investigate to what extent multiple sclerosis patients face catastrophic costs. The study was carried out in Ahvaz, Iran (2014). The study population included households in which at least one member suffers from MS. To analyze the data, a logit regression model was estimated using Stata 12. 3.37% of families incurred catastrophic costs. Important variables including brand of drug, housing, income, and health insurance were significantly correlated with catastrophic expenditure. This study suggests that although a small proportion of MS patients met the catastrophic health expenditure threshold, mechanisms that pool risk and cost (e.g. health insurance) are required to protect them and improve financial and access equity in health care.
Pricing the property claim service (PCS) catastrophe insurance options using gamma distribution
NASA Astrophysics Data System (ADS)
Noviyanti, Lienda; Soleh, Achmad Zanbar; Setyanto, Gatot R.
2017-03-01
Catastrophic events like earthquakes, hurricanes, or flooding are characteristic of some areas; there, a properly calculated annual premium would be nearly as high as the loss insured. From an actuarial perspective, such events constitute risks that are not insurable. On the other hand, people living in such areas need protection. In order to securitize the catastrophe risk, futures or options based on a loss index can be considered. The Chicago Board of Trade launched a class of catastrophe insurance options based on indices provided by Property Claim Services (PCS). The PCS option is based on the Property Claim Services index (PCS index), which is used to determine payouts when writing index-based insurance derivatives. The objective of this paper is to price the PCS catastrophe insurance option based on the PCS catastrophe index. A gamma distribution is used to model the PCS catastrophe index.
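As a hedged illustration of the pricing idea, the sketch below values a capped call-spread payoff on a loss index whose distribution is taken to be Gamma, by Monte Carlo with an assumed discount rate. The strike, cap, rate, and Gamma parameters are invented; the paper's estimation of the PCS index distribution and its specific option payoff are not reproduced here.

```python
# Monte Carlo valuation of a capped call spread on a Gamma-distributed loss index.
import numpy as np

rng = np.random.default_rng(6)

shape, scale = 2.0, 50.0          # assumed Gamma parameters of the loss index
strike, cap = 80.0, 200.0         # call-spread layer (index points)
r, T = 0.03, 0.5                  # discount rate and time to settlement (years)

index = rng.gamma(shape, scale, size=200_000)            # simulated index outcomes
payoff = np.clip(index - strike, 0.0, cap - strike)      # capped call-spread payoff
price = np.exp(-r * T) * payoff.mean()                   # discounted expected payoff
print(round(float(price), 2))
```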
On "black swans" and "perfect storms": risk analysis and management when statistics are not enough.
Paté-Cornell, Elisabeth
2012-11-01
Two images, "black swans" and "perfect storms," have struck the public's imagination and are used--at times indiscriminately--to describe the unthinkable or the extremely unlikely. These metaphors have been used as excuses to wait for an accident to happen before taking risk management measures, both in industry and government. These two images represent two distinct types of uncertainties (epistemic and aleatory). Existing statistics are often insufficient to support risk management because the sample may be too small and the system may have changed. Rationality as defined by the von Neumann axioms leads to a combination of both types of uncertainties into a single probability measure--Bayesian probability--and accounts only for risk aversion. Yet, the decisionmaker may also want to be ambiguity averse. This article presents an engineering risk analysis perspective on the problem, using all available information in support of proactive risk management decisions and considering both types of uncertainty. These measures involve monitoring of signals, precursors, and near-misses, as well as reinforcement of the system and a thoughtful response strategy. It also involves careful examination of organizational factors such as the incentive system, which shape human performance and affect the risk of errors. In all cases, including rare events, risk quantification does not allow "prediction" of accidents and catastrophes. Instead, it is meant to support effective risk management rather than simply reacting to the latest events and headlines. © 2012 Society for Risk Analysis.
Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González
2016-01-01
Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering. PMID:27872840
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sung, Yixing; Adams, Brian M.; Witkowski, Walter R.
2011-04-01
The CASL Level 2 Milestone VUQ.Y1.03, 'Enable statistical sensitivity and UQ demonstrations for VERA,' was successfully completed in March 2011. The VUQ focus area led this effort, in close partnership with AMA, and with support from VRI. DAKOTA was coupled to VIPRE-W thermal-hydraulics simulations representing reactors of interest to address crud-related challenge problems in order to understand the sensitivity and uncertainty in simulation outputs with respect to uncertain operating and model form parameters. This report summarizes work coupling the software tools, characterizing uncertainties, selecting sensitivity and uncertainty quantification algorithms, and analyzing the results of iterative studies. These demonstration studies focused on sensitivity and uncertainty of mass evaporation rate calculated by VIPRE-W, a key predictor for crud-induced power shift (CIPS).
NASA Astrophysics Data System (ADS)
Pantazidou, Marina; Liu, Ke
2008-02-01
This paper focuses on parameters describing the distribution of dense nonaqueous phase liquid (DNAPL) contaminants and investigates the variability of these parameters that results from soil heterogeneity. In addition, it quantifies the uncertainty reduction that can be achieved with increased density of soil sampling. Numerical simulations of DNAPL releases were performed using stochastic realizations of hydraulic conductivity fields generated with the same geostatistical parameters and conditioning data at two sampling densities, thus generating two simulation ensembles of low and high density (three-fold increase) of soil sampling. The results showed that DNAPL plumes in aquifers identical in a statistical sense exhibit qualitatively different patterns, ranging from compact to finger-like. The corresponding quantitative differences were expressed by defining several alternative measures that describe the DNAPL plume and computing these measures for each simulation of the two ensembles. The uncertainty in the plume features under study was affected to different degrees by the variability of the soil, with coefficients of variation ranging from about 20% to 90%, for the low-density sampling. Meanwhile, the increased soil sampling frequency resulted in reductions of uncertainty varying from 7% to 69%, for low- and high-uncertainty variables, respectively. In view of the varying uncertainty in the characteristics of a DNAPL plume, remedial designs that require estimates of the less uncertain features of the plume may be preferred over others that need a more detailed characterization of the source zone architecture.
Relationship of Catastrophizing to Fatigue Among Women Receiving Treatment for Breast Cancer
ERIC Educational Resources Information Center
Jacobsen, Paul B.; Andrykowski, Michael A.; Thors, Christina L.
2004-01-01
This study examined the relationship of catastrophizing to fatigue in 80 women receiving chemotherapy (CT) or radiotherapy (RT) for treatment of early stage breast cancer. Findings revealed expected relationships between catastrophizing and fatigue among women receiving RT but not CT. Among RT patients, those high in catastrophizing reported…
Positive Traits Linked to Less Pain through Lower Pain Catastrophizing
Hood, Anna; Pulvers, Kim; Carrillo, Janet; Merchant, Gina; Thomas, Marie
2011-01-01
The present study examined the association between positive traits, pain catastrophizing, and pain perceptions. We hypothesized that pain catastrophizing would mediate the relationship between positive traits and pain. First, participants (n = 114) completed the Trait Hope Scale, the Life Orientation Test-Revised, and the Pain Catastrophizing Scale. Participants then completed the experimental pain stimulus, a cold pressor task, by submerging their hand in a circulating water bath (0° Celsius) for as long as tolerable. Immediately following the task, participants completed the Short-Form McGill Pain Questionnaire (MPQ-SF). Pearson correlation found associations between hope and pain catastrophizing (r = −.41, p < .01) and MPQ-SF scores (r = −.20, p < .05). Optimism was significantly associated with pain catastrophizing (r = −.44, p < .01) and MPQ-SF scores (r = −.19, p < .05). Bootstrapping, a non-parametric resampling procedure, tested for mediation and supported our hypothesis that pain catastrophizing mediated the relationship between positive traits and MPQ-SF pain report. To our knowledge, this investigation is the first to establish that the protective link between positive traits and experimental pain operates through lower pain catastrophizing. PMID:22199416
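The bootstrapped mediation test reported above can be sketched on synthetic data: resample, estimate the trait-to-mediator path (a) and the mediator-to-outcome path controlling for the trait (b), and form a percentile confidence interval for the indirect effect a*b. The coefficients, sample size, and ordinary-least-squares fitting below are illustrative assumptions, not the authors' exact procedure.

```python
# Percentile-bootstrap confidence interval for an indirect (mediated) effect.
import numpy as np

rng = np.random.default_rng(7)
n = 114
hope = rng.normal(0, 1, n)
catastrophizing = -0.4 * hope + rng.normal(0, 1, n)
pain = 0.5 * catastrophizing - 0.05 * hope + rng.normal(0, 1, n)

def indirect(idx):
    x, m, y = hope[idx], catastrophizing[idx], pain[idx]
    a = np.polyfit(x, m, 1)[0]                       # trait -> mediator path
    Xm = np.column_stack([np.ones(len(idx)), x, m])  # mediator -> outcome, trait controlled
    b = np.linalg.lstsq(Xm, y, rcond=None)[0][2]
    return a * b

boot = np.array([indirect(rng.integers(0, n, n)) for _ in range(2000)])
ci = np.percentile(boot, [2.5, 97.5])
print(ci)   # an interval excluding zero supports mediation by catastrophizing
```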
Walter, Donald A.; LeBlanc, Denis R.
2008-01-01
Historical weapons testing and disposal activities at Camp Edwards, which is located on the Massachusetts Military Reservation, western Cape Cod, have resulted in the release of contaminants into an underlying sand and gravel aquifer that is the sole source of potable water to surrounding communities. Ground-water models have been used at the site to simulate advective transport in the aquifer in support of field investigations. Reasonable models developed by different groups and calibrated by trial and error often yield different predictions of advective transport, and the predictions lack quantitative measures of uncertainty. A recently (2004) developed regional model of western Cape Cod, modified to include the sensitivity and parameter-estimation capabilities of MODFLOW-2000, was used in this report to evaluate the utility of inverse (statistical) methods to (1) improve model calibration and (2) assess model-prediction uncertainty. Simulated heads and flows were most sensitive to recharge and to the horizontal hydraulic conductivity of the Buzzards Bay and Sandwich Moraines and the Buzzards Bay and northern parts of the Mashpee outwash plains. Conversely, simulated heads and flows were much less sensitive to vertical hydraulic conductivity. Parameter estimation (inverse calibration) improved the match to observed heads and flows; the absolute mean residual for heads improved by 0.32 feet and the absolute mean residual for streamflows improved by about 0.2 cubic feet per second. Advective-transport predictions in Camp Edwards generally were most sensitive to the parameters with the highest precision (lowest coefficients of variation), indicating that the numerical model is adequate for evaluating prediction uncertainties in and around Camp Edwards. The incorporation of an advective-transport observation, representing the leading edge of a contaminant plume that had been difficult to match by using trial-and-error calibration, improved the match between an observed and simulated plume path; however, a modified representation of local geology was needed to simultaneously maintain a reasonable calibration to heads and flows and to the plume path. Advective-transport uncertainties were expressed as about 68-, 95-, and 99-percent confidence intervals on three dimensional simulated particle positions. The confidence intervals can be graphically represented as ellipses around individual particle positions in the X-Y (geographic) plane and in the X-Z or Y-Z (vertical) planes. The merging of individual ellipses allows uncertainties on forward particle tracks to be displayed in map or cross-sectional view as a cone of uncertainty around a simulated particle path; uncertainties on reverse particle-track endpoints - representing simulated recharge locations - can be geographically displayed as areas at the water table around the discrete particle endpoints. This information gives decisionmakers insight into the level of confidence they can have in particle-tracking results and can assist them in the efficient use of available field resources.
The Communal Coping Model of Pain Catastrophizing in Daily Life: A Within-Couples Daily Diary Study
Burns, John W.; Gerhart, James I.; Post, Kristina M.; Smith, David A.; Porter, Laura S.; Schuster, Erik; Buvanendran, Asokumar; Fras, Anne Marie; Keefe, Francis J.
2015-01-01
The Communal Coping Model (CCM) characterizes pain catastrophizing as a coping tactic whereby pain expression elicits assistance and empathic responses from others. Married couples (N = 105 couples; one spouse with chronic low back pain) completed electronic daily diary assessments 5 times/day for 14 days. On these diaries, patients reported pain catastrophizing, pain, function, and perceived spouse support, criticism and hostility. Non-patient spouses reported on their support, criticism, and hostility directed toward patients, as well as their observations of patient pain and pain behaviors. Hierarchical linear modeling tested concurrent and lagged (3 hours later) relationships. Principal findings included: a) within-person increases in pain catastrophizing were positively associated with spouse reports of patient pain behavior in concurrent and lagged analyses; b) within-person increases in pain catastrophizing were positively associated with patient perceptions of spouse support, criticism, and hostility in concurrent analyses; c) within-person increases in pain catastrophizing were negatively associated with spouse reports of criticism and hostility in lagged analyses. Spouses reported patient behaviors that were tied to elevated pain catastrophizing, and spouses changed their behavior during and following elevated pain catastrophizing episodes. Pain catastrophizing may affect the interpersonal environment of patients and spouses in ways consistent with the CCM. PMID:26320945
Kim, Younhee; Yang, Bongmin
2011-05-01
The compositions of health expenditures by households in South Korea with and without catastrophic health expenditures were compared. Also, relationships between catastrophic health expenditures and household incomes, and between such health expenditures and expenditure patterns, were explored. Data from the 2006 South Korean Household Income & Expenditure Survey, a representative survey of 90,696 households, were analyzed. We used a double-hurdle model to assess each income source and expenditure category. The independent variable was the presence of catastrophic health expenditure. After adjusting for household characteristics, the results showed that earned, business, and property incomes were significantly lower, but transfer and loan incomes were significantly higher, in households with catastrophic health expenditures than in those without such health expenditures. All consumption categories other than health expenditure were significantly lower in households with catastrophic health expenditures than in those without them. This suggests that households with catastrophic health expenditures faced challenges in offsetting the potentially excessive health expenditure and may have been obliged to reduce consumption of other items. The expansion of insurance coverage and lowering of out-of-pocket rates in the South Korean Health Insurance benefits could be a necessary first step in protecting households from the occurrence of health-related economic catastrophes. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Challinor, A. J.
2010-12-01
Recent progress in assessing the impacts of climate variability and change on crops using multiple regional-scale simulations of crop and climate (i.e. ensembles) is presented. Simulations for India and China used perturbed responses to elevated carbon dioxide constrained using observations from FACE studies and controlled environments. Simulations with crop parameter sets representing existing and potential future adapted varieties were also carried out. The results for India are compared to sensitivity tests on two other crop models. For China, a parallel approach used socio-economic data to account for autonomous farmer adaptation. Results for the USA analysed cardinal temperatures under a range of local warming scenarios for 2711 varieties of spring wheat. The results are as follows: 1. Quantifying and reducing uncertainty. The relative contribution of uncertainty in crop and climate simulation to the total uncertainty in projected yield changes is examined. The observational constraints from FACE and controlled environment studies are shown to be the likely critical factor in maintaining relatively low crop parameter uncertainty. Without these constraints, crop simulation uncertainty in a doubled CO2 environment would likely be greater than uncertainty in simulating climate. However, consensus across crop models in India varied across different biophysical processes. 2. The response of yield to changes in local mean temperature was examined and compared to that found in the literature. No consistent response to temperature change was found across studies. 3. Implications for adaptation. China. The simulations of spring wheat in China show the relative importance of tolerance to water and heat stress in avoiding future crop failures. The greatest potential for reducing the number of harvests less than one standard deviation below the baseline mean yield value comes from alleviating water stress; the greatest potential for reducing harvests less than two standard deviations below the mean comes from alleviation of heat stress. The socio-economic analysis suggests that adaptation is also possible through measures such as greater investment. India. The simulations of groundnut in India identified regions where heat stress will play an increasing role in limiting crop yields, and other regions where crops with greater thermal time requirement will be needed. The simulations were used, together with an observed dataset and a simple analysis of crop cardinal temperatures and thermal time, to estimate the potential for adaptation using existing cultivars. USA. Analysis of spring wheat in the USA showed that at +2 °C of local warming, 87% of the 2711 varieties examined, and all of the five most common varieties, could be used to maintain the crop duration of the current climate (i.e. successful adaptation to mean warming). At +4 °C this fell to 54% of all varieties, and two of the top five. 4. Future research. The results, and the limitations of the study, suggest directions for research to link climate and crop models, socio-economic analyses and crop variety trial data in order to prioritise adaptation options such as capacity building, plant breeding and biotechnology.
NASA Astrophysics Data System (ADS)
McSharry, Patrick; Mitchell, Andrew; Anderson, Rebecca
2010-05-01
Decision-makers in both public and private organisations depend on accurate data and scientific understanding to adequately address climate change and the impact of extreme events. The financial impacts of catastrophes on populations and infrastructure can be offset through effective risk transfer mechanisms, structured to reflect the specific perils and levels of exposure to be covered. Optimal strategies depend on the likely socio-economic impact, the institutional framework, the overall objectives of the covers placed and the level of both the frequency and severity of loss potential expected. The diversity of approaches across different countries has been documented by the Spanish "Consorcio de Compensación de Seguros". We discuss why international public/private partnerships are necessary for addressing the risk of natural catastrophes. International initiatives such as the Global Earthquake Model (GEM) and the World Forum of Catastrophe Programmes (WFCP) can provide effective guidelines for constructing natural catastrophe schemes. The World Bank has been instrumental in the creation of many of the existing schemes such as the Turkish Catastrophe Insurance Pool, the Caribbean Catastrophe Risk Insurance Facility and the Mongolian Index-Based Livestock Insurance Program. We review existing schemes and report on best practice in relation to providing protection against natural catastrophe perils. The suitability of catastrophe modelling approaches to support schemes across the world is discussed and we identify opportunities to improve risk assessment for such schemes through transparent frameworks for quantifying, pricing, sharing and financing catastrophe risk on a local and global basis.
Linguistic Indicators of Pain Catastrophizing in Patients With Chronic Musculoskeletal Pain.
Junghaenel, Doerte U; Schneider, Stefan; Broderick, Joan E
2017-05-01
The present study examined markers of pain catastrophizing in the word use of patients with chronic pain. Patients (N = 71) completed the Pain Catastrophizing Scale and wrote about their life with pain. Quantitative word count analysis examined whether the essays contained linguistic indicators of catastrophizing. Bivariate correlations showed that catastrophizing was associated with greater use of first person singular pronouns, such as "I" (r = .27, P ≤ .05) and pronouns referencing other people (r = .28, P ≤ .05). Catastrophizing was further significantly associated with greater use of sadness (r = .35, P ≤ .01) and anger (r = .30, P ≤ .05) words. No significant relationships with positive emotion and cognitive process words were evident. Controlling for patients' engagement in the writing task, gender, age, pain intensity, and neuroticism in multiple regression, the linguistic categories together uniquely explained 13.6% of the variance in catastrophizing (P ≤ .001). First person singular pronouns (β = .24, P ≤ .05) and words relating to sadness (β = .25, P ≤ .05) were significant, and pronouns referencing other people (β = .19, P ≤ .10) were trending. The results suggest that pain catastrophizing is associated with a "linguistic fingerprint" that can be discerned from patients' natural word use. Quantitative word count analysis examined whether pain catastrophizing is reflected in patients' written essays about living with pain. Catastrophizing was associated with more first person singular pronouns, more pronouns referencing other people, and more expressions of sadness and anger. The results can help understand how catastrophizing translates into communicative behaviors. Copyright © 2017 American Pain Society. Published by Elsevier Inc. All rights reserved.
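As a rough illustration of the quantitative word count approach described in the abstract above (not the authors' actual dictionary-based pipeline), the sketch below counts words from hypothetical category lists in each essay and correlates the resulting proportions with catastrophizing scores. The category word lists, essays, and scores are all placeholders.

```python
import re
import numpy as np
from scipy.stats import pearsonr

# Hypothetical category word lists (placeholders, not validated dictionaries).
CATEGORIES = {
    "first_person_singular": {"i", "me", "my", "mine", "myself"},
    "sadness": {"sad", "cry", "grief", "hopeless", "miserable"},
}

def category_proportions(essay):
    """Return the proportion of words in each category for one essay."""
    words = re.findall(r"[a-z']+", essay.lower())
    total = max(len(words), 1)
    return {name: sum(w in vocab for w in words) / total
            for name, vocab in CATEGORIES.items()}

def correlate_with_catastrophizing(essays, pcs_scores):
    """Correlate each linguistic category with Pain Catastrophizing Scale scores."""
    props = [category_proportions(e) for e in essays]
    results = {}
    for name in CATEGORIES:
        x = np.array([p[name] for p in props])
        r, p_value = pearsonr(x, np.asarray(pcs_scores, dtype=float))
        results[name] = (r, p_value)
    return results

# Toy usage with made-up essays and scores.
essays = ["I feel hopeless and my pain never stops.",
          "We went for a walk and managed the day well.",
          "My back hurts but I keep busy with friends."]
print(correlate_with_catastrophizing(essays, [38, 12, 22]))
```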
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those ...
Ronald E. McRoberts; Veronica C. Lessard
2001-01-01
Uncertainty in diameter growth predictions is attributed to three general sources: measurement error or sampling variability in predictor variables, parameter covariances, and residual or unexplained variation around model expectations. Using measurement error and sampling variability distributions obtained from the literature and Monte Carlo simulation methods, the...
Development of vulnerability curves to typhoon hazards based on insurance policy and claim dataset
NASA Astrophysics Data System (ADS)
Mo, Wanmei; Fang, Weihua; Li, Xinze; Wu, Peng; Tong, Xingwei
2016-04-01
Vulnerability refers to the characteristics and circumstances of an exposure that make it susceptible to the effects of certain hazards. It can be divided into physical, social, economic and environmental vulnerability. Physical vulnerability indicates the potential physical damage of exposure caused by natural hazards. Vulnerability curves, which quantify the loss ratio against hazard intensity with a horizontal axis for the intensity and a vertical axis for the Mean Damage Ratio (MDR), are essential to the vulnerability assessment and quantitative evaluation of disasters. Fragility refers to the probability of diverse damage states under different hazard intensities, revealing a characteristic of the exposure. Fragility curves are often used to quantify the probability of a given set of exposures being at or exceeding a certain damage state. The development of quantitative fragility and vulnerability curves is the basis of catastrophe modeling. Generally, methods for quantitative fragility and vulnerability assessment can be categorized into empirical, analytical and expert opinion or judgment-based ones. The empirical method is one of the most popular and relies heavily on the availability and quality of historical hazard and loss datasets, which has always been a great challenge. The analytical method is usually based on engineering experiments; it is time-consuming and lacks built-in validation, so its credibility is sometimes widely criticized. The expert opinion or judgment-based method is quite effective in the absence of data, but the results can be too subjective, so the uncertainty is likely to be underestimated. In this study, we present fragility and vulnerability curves developed with the empirical method based on simulated historical typhoon wind, rainfall and induced flood, and insurance policy and claim datasets of more than 100 historical typhoon events. Firstly, an insurance exposure classification system is built according to structure type, occupation type and insurance coverage. Then an MDR estimation method based on insurance policy structure and claim information is proposed and validated. Following that, fragility and vulnerability curves of the major exposure types for construction, homeowner insurance and enterprise property insurance are fitted with empirical functions based on the historical dataset. The results of this study can not only help understand catastrophe risk and manage insured disaster risks, but can also be applied in other disaster risk reduction efforts.
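As a minimal sketch of fitting an empirical vulnerability curve of the kind described above, the example below fits a logistic MDR-versus-wind-speed relationship to synthetic claim-derived points. The functional form, data values, and parameter names are illustrative assumptions, not the curves actually fitted in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def mdr_curve(v, v_half, k):
    """Logistic mean damage ratio (MDR) as a function of wind speed v (m/s).
    v_half is the speed at 50% damage; k controls the steepness (assumed form)."""
    return 1.0 / (1.0 + np.exp(-k * (v - v_half)))

# Synthetic claim-derived points: wind speed (m/s) vs observed mean damage ratio.
wind = np.array([20, 25, 30, 35, 40, 45, 50, 55], dtype=float)
mdr_obs = np.array([0.01, 0.02, 0.06, 0.15, 0.35, 0.60, 0.80, 0.92])

params, cov = curve_fit(mdr_curve, wind, mdr_obs, p0=[40.0, 0.2])
perr = np.sqrt(np.diag(cov))  # 1-sigma parameter uncertainty
print("v_half = %.1f ± %.1f m/s, k = %.3f ± %.3f" % (params[0], perr[0], params[1], perr[1]))
print("Predicted MDR at 42 m/s:", mdr_curve(42.0, *params))
```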
Alsamhi, Saeed Hamood; Ansari, Mohd Samar; Ma, Ou; Almalki, Faris; Gupta, Sachin Kumar
2018-05-23
The actions taken in the initial stages of a disaster are critical. Catastrophes occur because of terrorist acts or natural hazards, which have the potential to disrupt the infrastructure of wireless communication networks; essential emergency functions such as search, rescue, and recovery operations during a catastrophic event would therefore be disabled. We propose tethered balloon technology to provide efficient emergency communication services and reduce casualty mortality and morbidity during disaster recovery. The tethered balloon is an actively developed research area and offers a simple solution to support the performance, facilities, and services of emergency medical communication. The most critical requirement for rescue and relief teams is a high quality of communication services, which enables them to save people's lives. With the proposed technology, the performance of rescue and relief teams is reported to improve significantly. OPNET Modeler 14.5 is used to simulate the network with the help of ad hoc tools (Disaster Med Public Health Preparedness. 2018;page 1 of 8).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sig Drellack, Lance Prothro
2007-12-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility are considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
NASA Astrophysics Data System (ADS)
Doroszkiewicz, Joanna; Romanowicz, Renata
2016-04-01
Uncertainty in the results of the hydraulic model is not only associated with the limitations of that model and the shortcomings of data. An important factor that has a major impact on the uncertainty of flood risk assessment under changing climate conditions is associated with the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate future runoff required as an input to hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to a hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to different stages of the process of derivation of flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability might be very computer time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for the chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computer requirements related to the derivation of flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.
NASA Astrophysics Data System (ADS)
Lin, Tsungpo
Performance engineers face a major challenge in modeling and simulation of after-market power systems due to system degradation and measurement errors. Currently, the majority of the power generation industry utilizes the deterministic data matching method to calibrate the model and cascade system degradation, which causes significant calibration uncertainty and raises the risk of providing performance guarantees. In this research work, a maximum-likelihood based simultaneous data reconciliation and model calibration (SDRMC) is used for power system modeling and simulation. By replacing the current deterministic data matching with SDRMC one can reduce the calibration uncertainty and mitigate the error propagation to the performance simulation. A modeling and simulation environment for a complex power system with certain degradation has been developed. In this environment multiple data sets are imported when carrying out simultaneous data reconciliation and model calibration. Calibration uncertainties are estimated through error analyses and populated to the performance simulation by using the principle of error propagation. System degradation is then quantified by performance comparison between the calibrated model and its expected new & clean status. To mitigate smearing effects caused by gross errors, gross error detection (GED) is carried out in two stages. The first stage is a screening stage, in which serious gross errors are eliminated in advance. The GED techniques used in the screening stage are based on multivariate data analysis (MDA), including multivariate data visualization and principal component analysis (PCA). Subtle gross errors are treated in the second stage, in which serial bias compensation or a robust M-estimator is engaged. To achieve better efficiency in the combined scheme of least squares based data reconciliation and the GED technique based on hypothesis testing, the Levenberg-Marquardt (LM) algorithm is utilized as the optimizer. To reduce the computation time and stabilize the problem solving for a complex power system such as a combined cycle power plant, meta-modeling using the response surface equation (RSE) and system/process decomposition are incorporated with the simultaneous scheme of SDRMC. The goal of this research work is to reduce the calibration uncertainties and, thus, the risks of providing performance guarantees arising from uncertainties in performance simulation.
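A minimal sketch of the data reconciliation idea described above, assuming a toy three-stream mass balance rather than a real plant model: raw measurements are adjusted by measurement-error-weighted least squares so that they satisfy the balance, using the Levenberg-Marquardt option mentioned in the abstract. The constraint, values, and uncertainties are invented, and the model-calibration and gross-error-detection stages are omitted.

```python
import numpy as np
from scipy.optimize import least_squares

# Raw measurements of three flows (kg/s) that should satisfy m1 = m2 + m3.
measured = np.array([100.0, 61.0, 42.0])
sigma = np.array([2.0, 1.5, 1.5])  # assumed measurement standard deviations

def residuals(x):
    """Weighted measurement residuals plus a heavily weighted balance constraint."""
    meas_res = (x - measured) / sigma
    balance_res = 1e3 * (x[0] - x[1] - x[2])  # soft enforcement of m1 = m2 + m3
    return np.concatenate([meas_res, [balance_res]])

sol = least_squares(residuals, x0=measured, method="lm")  # Levenberg-Marquardt
print("Reconciled flows:", sol.x)
print("Imbalance after reconciliation:", sol.x[0] - sol.x[1] - sol.x[2])
```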
NASA Astrophysics Data System (ADS)
Ricciuto, D. M.; Mei, R.; Mao, J.; Hoffman, F. M.; Kumar, J.
2015-12-01
Uncertainties in land parameters could have important impacts on simulated water and energy fluxes and land surface states, which will consequently affect atmospheric and biogeochemical processes. Therefore, quantification of such parameter uncertainties using a land surface model is the first step towards better understanding of predictive uncertainty in Earth system models. In this study, we applied a random-sampling, high-dimensional model representation (RS-HDMR) method to analyze the sensitivity of simulated photosynthesis, surface energy fluxes and surface hydrological components to selected land parameters in version 4.5 of the Community Land Model (CLM4.5). Because of the large computational expense of conducting ensembles of global gridded model simulations, we used the results of a previous cluster analysis to select one thousand representative land grid cells for simulation. Plant functional type (PFT)-specific uniform prior ranges for land parameters were determined using expert opinion and a literature survey, and samples were generated with a quasi-Monte Carlo approach (the Sobol sequence). Preliminary analysis of 1024 simulations suggested that four PFT-dependent parameters (including slope of the conductance-photosynthesis relationship, specific leaf area at canopy top, leaf C:N ratio and fraction of leaf N in RuBisco) are the dominant sensitive parameters for photosynthesis, surface energy and water fluxes across most PFTs, but with varying importance rankings. On the other hand, for surface and sub-surface runoff, PFT-independent parameters, such as the depth-dependent decay factors for runoff, play more important roles than the previous four PFT-dependent parameters. Further analysis conditioning the results on different seasons and years is being conducted to provide guidance on how climate variability and change might affect such sensitivity. This is the first step toward coupled simulations including biogeochemical processes, atmospheric processes or both to determine the full range of sensitivity of Earth system modeling to land-surface parameters. This can facilitate sampling strategies in measurement campaigns targeted at reduction of climate modeling uncertainties and can also provide guidance on land parameter calibration for simulation optimization.
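As a sketch of the sampling step described above (not the actual CLM4.5 workflow), the example below draws 1024 quasi-random parameter sets from uniform prior ranges with a Sobol sequence. The parameter names and ranges are placeholders.

```python
import numpy as np
from scipy.stats import qmc

# Illustrative prior ranges for a few land parameters (placeholder values).
names = ["slope_conductance_photosynthesis", "sla_top", "leaf_cn", "frac_n_rubisco"]
lower = np.array([4.0, 0.01, 20.0, 0.05])
upper = np.array([9.0, 0.04, 60.0, 0.25])

sampler = qmc.Sobol(d=len(names), scramble=True, seed=0)
unit_samples = sampler.random_base2(m=10)        # 2**10 = 1024 quasi-random points
samples = qmc.scale(unit_samples, lower, upper)  # map to the prior ranges

print(samples.shape)  # (1024, 4) parameter sets to feed to the land surface model
print(dict(zip(names, samples[0])))
```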
NASA Astrophysics Data System (ADS)
Edwards, T.
2015-12-01
Modelling Antarctic marine ice sheet instability (MISI) - the potential for sustained grounding line retreat along downsloping bedrock - is very challenging because high resolution at the grounding line is required for reliable simulation. Assessing modelling uncertainties is even more difficult, because such models are very computationally expensive, restricting the number of simulations that can be performed. Quantifying uncertainty in future Antarctic instability has therefore so far been limited. There are several ways to tackle this problem, including: Simulating a small domain, to reduce expense and allow the use of ensemble methods; Parameterising response of the grounding line to the onset of MISI, for the same reasons; Emulating the simulator with a statistical model, to explore the impacts of uncertainties more thoroughly; Substituting physical models with expert-elicited statistical distributions. Methods 2-4 require rigorous testing against observations and high resolution models to have confidence in their results. We use all four to examine the dependence of MISI in the Amundsen Sea Embayment (ASE) on uncertain model inputs, including bedrock topography, ice viscosity, basal friction, model structure (sliding law and treatment of grounding line migration) and MISI triggers (including basal melting and risk of ice shelf collapse). We compare simulations from a 3000 member ensemble with GRISLI (methods 2, 4) with a 284 member ensemble from BISICLES (method 1) and also use emulation (method 3). Results from the two ensembles show similarities, despite very different model structures and ensemble designs. Basal friction and topography have a large effect on the extent of grounding line retreat, and the sliding law strongly modifies sea level contributions through changes in the rate and extent of grounding line retreat and the rate of ice thinning. Over 50 years, MISI in the ASE gives up to 1.1 mm/year (95% quantile) SLE in GRISLI (calibrated with ASE mass losses in a Bayesian framework), and up to 1.2 mm/year SLE (95% quantile) in the 270 completed BISICLES simulations (no calibration). We will show preliminary results emulating the models, calibrating with observations, and comparing them to assess structural uncertainty. We use these to improve MISI projections for the whole continent.
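A minimal sketch of the emulation idea mentioned above (method 3), assuming a toy stand-in simulator rather than GRISLI or BISICLES: a Gaussian process is fitted to a small design of runs and then queried cheaply at thousands of untried inputs, with predictive uncertainty. All parameter names and the toy response function are invented.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(42)

# Stand-in "simulator": sea-level contribution as a function of (basal friction, melt anomaly).
def toy_simulator(x):
    friction, melt = x[:, 0], x[:, 1]
    return 0.3 * melt**2 + 0.5 * melt / (friction + 0.5)

X_design = rng.uniform(0.0, 1.0, size=(40, 2))  # small ensemble of training runs
y_design = toy_simulator(X_design)

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.3, 0.3])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_design, y_design)

# Cheap predictions (with uncertainty) at many untried parameter settings.
X_new = rng.uniform(0.0, 1.0, size=(5000, 2))
mean, std = gp.predict(X_new, return_std=True)
print("Emulated 95% quantile of output:", np.quantile(mean, 0.95))
print("Typical emulator standard deviation:", std.mean())
```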
Market Dynamics and Optimal Timber Salvage After a Natural Catastrophe
Jeffrey P. Prestemon; Thomas P. Holmes
2004-01-01
Forest-based natural catastrophes are regular features of timber production in the United States, especially from hurricanes, fires, and insect and disease outbreaks. These catastrophes affect timber prices and result in economic transfers. We develop a model of timber market dynamics after such a catastrophe that shows how timber salvage affects the welfare of...
NASA Astrophysics Data System (ADS)
Karmalkar, A.
2017-12-01
Ensembles of dynamically downscaled climate change simulations are routinely used to capture uncertainty in projections at regional scales. I assess the reliability of two such ensembles for North America - NARCCAP and NA-CORDEX - by investigating the impact of model selection on representing uncertainty in regional projections, and the ability of the regional climate models (RCMs) to provide reliable information. These aspects - discussed for the six regions used in the US National Climate Assessment - provide an important perspective on the interpretation of downscaled results. I show that selecting general circulation models for downscaling based on their equilibrium climate sensitivities is a reasonable choice, but the six models chosen for NA-CORDEX do a poor job at representing uncertainty in winter temperature and precipitation projections in many parts of the eastern US, leading to overconfident projections. The RCM performance is highly variable across models, regions, and seasons, and the ability of the RCMs to provide improved seasonal mean performance relative to their parent GCMs seems limited in both RCM ensembles. Additionally, the ability of the RCMs to simulate historical climates is not strongly related to their ability to simulate climate change across the ensemble. This finding suggests limited use of models' historical performance to constrain their projections. Given these challenges in dynamical downscaling, the RCM results should not be used in isolation. Information on how well the RCM ensembles represent known uncertainties in regional climate change projections discussed here needs to be communicated clearly to inform management decisions.
NASA Astrophysics Data System (ADS)
Troldborg, Mads; Nowak, Wolfgang; Lange, Ida V.; Santos, Marta C.; Binning, Philip J.; Bjerg, Poul L.
2012-09-01
Mass discharge estimates are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Such estimates are, however, rather uncertain as they integrate uncertain spatial distributions of both concentration and groundwater flow. Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the cosimulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty are discussed.
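A bare-bones sketch of the final combination step described above: realizations of flux and concentration over a control-plane grid are multiplied cell by cell and summed to build a mass discharge distribution. Here the realizations are simple lognormal draws rather than the Bayesian geostatistical conditional simulations used in the study, and all values are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n_real, n_cells = 2000, 25 * 10   # realizations, control-plane cells
cell_area = 0.5 * 0.25            # m^2 per cell (assumed grid)

# Placeholder realizations: in the study these come from conditional geostatistical
# simulation of the flow field and of the (Box-Cox modelled) concentration field.
darcy_flux = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=(n_real, n_cells))  # m/d
conc = rng.lognormal(mean=np.log(2.0), sigma=1.0, size=(n_real, n_cells))         # g/m^3

# Mass discharge per realization: sum over cells of flux * concentration * area.
mass_discharge = (darcy_flux * conc * cell_area).sum(axis=1)  # g/d

print("Median mass discharge: %.1f g/d" % np.median(mass_discharge))
print("90%% interval: %.1f - %.1f g/d" % tuple(np.percentile(mass_discharge, [5, 95])))
```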
Mishra, Harshit; Karmakar, Subhankar; Kumar, Rakesh; Singh, Jitendra
2017-07-01
Landfilling is a cost-effective method, which makes it a widely used practice around the world, especially in developing countries. However, because of the improper management of landfills, high leachate leakage can have adverse impacts on soils, plants, groundwater, aquatic organisms, and, subsequently, human health. A comprehensive survey of the literature finds that the probabilistic quantification of uncertainty based on estimations of the human health risks due to landfill leachate contamination has rarely been reported. Hence, in the present study, the uncertainty about the human health risks from municipal solid waste landfill leachate contamination to children and adults was quantified to investigate its long-term risks by using a Monte Carlo simulation framework for selected heavy metals. The Turbhe sanitary landfill of Navi Mumbai, India, which was commissioned in the recent past, was selected to understand the fate and transport of heavy metals in leachate. A large residential area is located near the site, which makes the risk assessment problem both crucial and challenging. In this article, an integral approach in the form of a framework has been proposed to quantify the uncertainty that is intrinsic to human health risk estimation. A set of nonparametric cubic splines was fitted to identify the nonlinear seasonal trend in leachate quality parameters. LandSim 2.5, a landfill simulator, was used to simulate the landfill activities for various time slices, and further uncertainty in noncarcinogenic human health risk was estimated using a Monte Carlo simulation followed by univariate and multivariate sensitivity analyses. © 2016 Society for Risk Analysis.
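A minimal sketch of the Monte Carlo risk step described above, for a single metal: exposure factors and concentration are sampled, a chronic daily intake is computed, and the hazard quotient distribution is summarized. The distributions, reference dose, and exposure constants are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative input distributions (placeholders, not the study's values).
conc = rng.lognormal(mean=np.log(0.05), sigma=0.6, size=n)   # metal concentration, mg/L
intake = rng.normal(2.0, 0.4, size=n).clip(min=0.5)          # water intake, L/day
body_weight = rng.normal(70.0, 12.0, size=n).clip(min=30.0)  # kg
ef, ed, at = 350.0, 30.0, 30.0 * 365.0                       # days/yr, years, averaging days
rfd = 0.003                                                   # reference dose, mg/kg/day (assumed)

# Chronic daily intake and non-carcinogenic hazard quotient per realization.
cdi = conc * intake * ef * ed / (body_weight * at)
hq = cdi / rfd

print("Mean HQ: %.2f" % hq.mean())
print("P(HQ > 1): %.3f" % (hq > 1.0).mean())
```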
NASA Astrophysics Data System (ADS)
Mishra, H.; Karmakar, S.; Kumar, R.
2016-12-01
Risk assessment does not remain simple when it involves multiple uncertain variables. Uncertainties in risk assessment mainly result from (1) the lack of knowledge of input variables (mostly random), and (2) data obtained from expert judgment or subjective interpretation of available information (non-random). An integrated probabilistic-fuzzy health risk approach has been proposed for simultaneous treatment of random and non-random uncertainties associated with input parameters of the health risk model. LandSim 2.5, a landfill simulator, has been used to simulate the Turbhe landfill (Navi Mumbai, India) activities for various time horizons. Further, the LandSim-simulated concentrations of six heavy metals in groundwater have been used in the health risk model. The water intake, exposure duration, exposure frequency, bioavailability and averaging time are treated as fuzzy variables, while the heavy metal concentrations and body weight are considered as probabilistic variables. Identical alpha-cut and reliability levels are considered for fuzzy and probabilistic variables, respectively, and uncertainty in non-carcinogenic human health risk is estimated using ten thousand Monte Carlo simulations (MCS). This is the first effort in which all the health risk variables have been considered as non-deterministic for the estimation of uncertainty in risk output. The non-exceedance probability of the Hazard Index (HI), the summation of hazard quotients, of the heavy metals Co, Cu, Mn, Ni, Zn and Fe for the male and female population has been quantified and found to be high (HI>1) for all the considered time horizons, which clearly shows the possibility of adverse health effects on the population residing near the Turbhe landfill.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siranosian, Antranik Antonio; Schembri, Philip Edward; Miller, Nathan Andrew
The Benchmark Extensible Tractable Testbed Engineering Resource (BETTER) is proposed as a family of modular test bodies that are intended to support engineering capability development by helping to identify weaknesses and needs. Weapon systems, subassemblies, and components are often complex and difficult to test and analyze, resulting in low confidence and high uncertainties in experimental and simulated results. The complexities make it difficult to distinguish between inherent uncertainties and errors due to insufficient capabilities. BETTER test bodies will first use simplified geometries and materials such that testing, data collection, modeling and simulation can be accomplished with high confidence and low uncertainty. Modifications and combinations of simple and well-characterized BETTER test bodies can then be used to increase complexity in order to reproduce relevant mechanics and identify weaknesses. BETTER can provide both immediate and long-term improvements in testing and simulation capabilities. This document presents the motivation, concept, benefits and examples for BETTER.
A Cascade Approach to Uncertainty Estimation for the Hydrological Simulation of Droughts
NASA Astrophysics Data System (ADS)
Smith, Katie; Tanguy, Maliko; Parry, Simon; Prudhomme, Christel
2016-04-01
Uncertainty poses a significant challenge in environmental research, and the characterisation and quantification of uncertainty have become a research priority over the past decade. Studies of extreme events are particularly affected by issues of uncertainty. This study focusses on the sources of uncertainty in the modelling of streamflow droughts in the United Kingdom. Droughts are a poorly understood natural hazard with no universally accepted definition. Meteorological, hydrological and agricultural droughts have different meanings and vary both spatially and temporally, yet each is inextricably linked. The work presented here is part of two extensive interdisciplinary projects investigating drought reconstruction and drought forecasting capabilities in the UK. Lumped catchment models are applied to simulate streamflow drought, and uncertainties from 5 different sources are investigated: climate input data, potential evapotranspiration (PET) method, hydrological model, within model structure, and model parameterisation. Latin Hypercube sampling is applied to develop large parameter ensembles for each model structure, which are run using parallel computing on a high performance computer cluster. Parameterisations are assessed using multi-objective evaluation criteria which include both general and drought performance metrics. The effect of different climate input data and PET methods on model output is then considered using the accepted model parameterisations. The uncertainty from each of the sources creates a cascade, and when presented as such the relative importance of each aspect of uncertainty can be determined.
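As a sketch of the ensemble-generation step described above, the example below draws a large Latin hypercube sample over an assumed parameter space for a lumped model; the parameter names and ranges are placeholders, and real use would run each set through the hydrological model and score it with general and drought-specific metrics.

```python
import numpy as np
from scipy.stats import qmc

# Illustrative parameter ranges for a lumped rainfall-runoff model (placeholders).
names = ["storage_capacity_mm", "recession_coeff", "pet_adjustment"]
lower = np.array([50.0, 0.01, 0.8])
upper = np.array([500.0, 0.5, 1.2])

lhs = qmc.LatinHypercube(d=len(names), seed=123)
theta = qmc.scale(lhs.random(n=10_000), lower, upper)  # 10,000 parameter sets

# Each row of `theta` would be run through the hydrological model (on an HPC
# cluster in the study) and evaluated against multi-objective criteria.
print(theta.shape, theta.min(axis=0), theta.max(axis=0))
```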
NASA Astrophysics Data System (ADS)
Li, L.; Xu, C.-Y.; Engeland, K.
2012-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method is one of the most widely used methods for uncertainty assessment of hydrological models; it incorporates different sources of information into a single analysis through Bayes' theorem. However, none of these applications can adequately treat the uncertainty in extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models by considering the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian approach: the AR(1) plus Normal and time period independent model (Model 1), the AR(1) plus Normal and time period dependent model (Model 2) and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models within a Bayesian framework. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
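A bare-bones sketch of the Metropolis-Hastings machinery mentioned above, assuming a toy one-parameter model and an iid Gaussian error likelihood; the study's likelihoods add AR(1) error structure and the modularization step, which are omitted here, and all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "hydrological model": simulated flow is a scaling of an input signal.
inputs = rng.gamma(2.0, 3.0, size=200)
true_theta, sigma_obs = 0.7, 1.0
obs = true_theta * inputs + rng.normal(0.0, sigma_obs, size=200)

def log_likelihood(theta):
    """Gaussian iid error likelihood (simplified relative to the AR(1) models above)."""
    resid = obs - theta * inputs
    return -0.5 * np.sum((resid / sigma_obs) ** 2)

def metropolis_hastings(n_iter=20_000, step=0.02):
    theta, logl = 0.5, log_likelihood(0.5)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + rng.normal(0.0, step)        # symmetric random-walk proposal
        logl_prop = log_likelihood(prop)
        if np.log(rng.uniform()) < logl_prop - logl:  # flat prior assumed
            theta, logl = prop, logl_prop
        chain[i] = theta
    return chain

chain = metropolis_hastings()
print("Posterior mean: %.3f, 95%% CI: %s" %
      (chain[5000:].mean(), np.percentile(chain[5000:], [2.5, 97.5])))
```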
Noel, Melanie; Rabbitts, Jennifer A; Tai, Gabrielle G; Palermo, Tonya M
2015-05-01
Children's memories for pain play a powerful role in their pain experiences. Parents' memories may also influence children's pain experiences, by influencing parent-child interactions about pain and children's cognitions and behaviors. Pain catastrophizing of children and parents has been implicated as a factor underlying memory biases; however, this has not been empirically examined. The current longitudinal study is the first to examine the role of pain catastrophizing of children and parents in the development of their pain memories after surgery. Participants were 49 youth (32 girls) aged 10 to 18 years undergoing major surgery and their parents. One week before surgery, children and parents completed measures of pain catastrophizing. Two weeks after surgery (the acute recovery period), children and parents completed measures of child pain intensity and affect. Two to 4 months after surgery, children's and parents' memories of child pain intensity and affect were elicited. Hierarchical linear regression models revealed that over and above covariates, parent catastrophizing about their child's pain (magnification, rumination) accounted for a significant portion of variance in children's affective and parents' sensory pain memories. Although parent catastrophizing had a direct effect on pain memories, mediation analyses revealed that child catastrophizing (helplessness) indirectly influenced children's and parents' pain memories through the child's postoperative pain experience. Findings highlight that aspects of catastrophic thinking about child pain before surgery are linked to distressing pain memories several months later. Although both child and parent catastrophizing influence pain memory development, parent catastrophizing is most influential to both children's and parents' evolving cognitions about child pain.
Whitney, Colette A; Dorfman, Caroline S; Shelby, Rebecca A; Keefe, Francis J; Gandhi, Vicky; Somers, Tamara J
2018-04-20
First-degree relatives of women with breast cancer may experience increased worry or perceived risk when faced with reminders of their own cancer risk. Worry and risk reminders may include physical symptoms (e.g., persistent breast pain) and caregiving experiences. Women who engage in pain catastrophizing may be particularly likely to experience increased distress when risk reminders are present. We examined the degree to which persistent breast pain and experience as a cancer caregiver were related to cancer worry and perceived risk in first-degree relatives of women with breast cancer (N = 85) and how catastrophic thoughts about breast pain could impact these relationships. There was a significant interaction between persistent breast pain and pain catastrophizing in predicting cancer worry (p = .03); among women who engaged in pain catastrophizing, cancer worry remained high even in the absence of breast pain. Pain catastrophizing also moderated the relationships between caregiving involvement and cancer worry (p = .003) and perceived risk (p = .03). As the degree of caregiving responsibility increased, cancer worry and perceived risk increased for women who engaged in pain catastrophizing; levels of cancer worry and perceived risk remained low and stable for women who did not engage in pain catastrophizing regardless of caregiving experience. The results suggest that first-degree relatives of breast cancer survivors who engage in pain catastrophizing may experience greater cancer worry and perceived risk and may benefit from interventions aimed at reducing catastrophic thoughts about pain.
Ku-band antenna acquisition and tracking performance study, volume 4
NASA Technical Reports Server (NTRS)
Huang, T. C.; Lindsey, W. C.
1977-01-01
The results pertaining to the tradeoff analysis and performance of the Ku-band shuttle antenna pointing and signal acquisition system are presented. The square, hexagonal and spiral antenna trajectories were investigated assuming the TDRS postulated uncertainty region and a flexible statistical model for the location of the TDRS within the uncertainty volume. The scanning trajectories, shuttle/TDRS signal parameters and dynamics, and three signal acquisition algorithms were integrated into a hardware simulation. The hardware simulation is quite flexible in that it allows for the evaluation of signal acquisition performance for an arbitrary (programmable) antenna pattern, a large range of C/N sub O's, various TDRS/shuttle a priori uncertainty distributions, and three distinct signal search algorithms.
Revisiting drought impact on tree mortality and carbon fluxes in ORCHIDEE-CAN DGVM
NASA Astrophysics Data System (ADS)
Joetzjer, E.; Bartlett, M. K.; Sack, L.; Poulter, B.; Ciais, P.
2016-12-01
In the past decade, two extreme droughts in the Amazon rainforest led to a perturbation of carbon cycle dynamics and forest structure, partly through an increase in tree mortality. While there is a relatively strong consensus in CMIP5 projections for an increase in both frequency and intensity of droughts across the Amazon, the potential for forest die-off constitutes a large uncertainty in projections of climate impacts on terrestrial ecosystems and carbon cycle feedbacks. Two long-term throughfall exclusion experiments (TFE) provided novel observations of Amazonian ecosystem responses under drought. These experiments also provided a great opportunity to evaluate and improve models' behavior under drought. While current DGVMs use a wide array of algorithms to represent drought effects on ecosystems, most are associated with large uncertainty for representing drought-induced mortality, and require updating to include current knowledge of physiological processes. During very strong droughts, the leaves desiccate and stems may undergo catastrophic embolism. However, even before that point, stomata close to minimize excessive water loss and risk of hydraulic failure, which reduces carbon assimilation. Here, we describe a new parameterization of the stomatal conductance and mortality processes induced by drought using the ORCHIDEE-CAN dynamic vegetation model and test it using the two TFE results. We implemented a direct climate effect on mortality through catastrophic stem embolism using a new hydraulic architecture to represent the hydraulic potential gradient from the soil to the leaves based on vulnerability curves, and tree capacitance. In addition, gross primary productivity and transpiration are down-regulated by the hydraulic architecture in case of drought through stomatal conductance, which depends on the hydraulic potential of the leaf. We also explored the role of non-structural carbohydrates (NSC) in hydraulic failure and mortality, following the idea that stored NSC serves a critical osmotic function. Our results suggest that models have the capacity to represent drought-induced individual mortality from a mechanistic perspective, allowing a better understanding of drought impacts on the carbon cycle and forest structure in the tropics.
NASA Astrophysics Data System (ADS)
Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.
2012-04-01
The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both observed inputs (precipitation and temperature) and streamflow observations used in the calibration of the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation distributed HBV model, operating on daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure where the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day a random sample of precipitation and temperature inputs was drawn to be applied as inputs to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability for rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow; thus the rating curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM based MCMC routine. Effects of having less information (e.g. missing one streamflow measurement for defining the rating curve or missing one precipitation station) were also investigated.
The Key Role of Pain Catastrophizing in the Disability of Patients with Acute Back Pain.
Ramírez-Maestre, C; Esteve, R; Ruiz-Párraga, G; Gómez-Pérez, L; López-Martínez, A E
2017-04-01
This study investigated the role of anxiety sensitivity, resilience, pain catastrophizing, depression, pain fear-avoidance beliefs, and pain intensity in patients with acute back pain-related disability. Two hundred and thirty-two patients with acute back pain completed questionnaires on anxiety sensitivity, resilience, pain catastrophizing, fear-avoidance beliefs, depression, pain intensity, and disability. A structural equation modelling analysis revealed that anxiety sensitivity was associated with pain catastrophizing, and resilience was associated with lower levels of depression. Pain catastrophizing was positively associated with fear-avoidance beliefs and pain intensity. Depression was associated with fear-avoidance beliefs, but was not associated with pain intensity. Finally, catastrophizing, fear-avoidance beliefs, and pain intensity were positively and significantly associated with acute back pain-related disability. Although fear-avoidance beliefs and pain intensity were associated with disability, the results showed that pain catastrophizing was a central variable in the pain experience and had significant direct associations with disability when pain was acute. Anxiety sensitivity appeared to be an important antecedent of catastrophizing, whereas the influence of resilience on the acute back pain experience was limited to its relationship with depression.
Nonstationary envelope process and first excursion probability
NASA Technical Reports Server (NTRS)
Yang, J.
1972-01-01
A definition of the envelope of nonstationary random processes is proposed. The establishment of the envelope definition makes it possible to simulate the nonstationary random envelope directly. Envelope statistics, such as the density function, joint density function, moment function, and level crossing rate, which are relevant to analyses of catastrophic failure, fatigue, and crack propagation in structures, are derived. Applications of the envelope statistics to the prediction of structural reliability under random loadings are discussed in detail.
A proposed criterion for aircraft flight in turbulence
NASA Technical Reports Server (NTRS)
Porter, R. F.; Robinson, A. C.
1971-01-01
A proposed criterion for aircraft flight in turbulent conditions is presented. Subjects discussed are: (1) the problem of flight safety in turbulence, (2) new criterion for turbulence flight where existing ones seem adequate, and (3) computational problems associated with new criterion. Primary emphasis is placed on catastrophic occurrences in subsonic cruise with the aircraft under automatic control. A Monte Carlo simulation is used in the formulation and evaluation of probabilities of survival of an encounter with turbulence.
DEMOGRAPHIC UNCERTAINTY IN ECOLOGICAL RISK ASSESSMENTS. (R825347)
We built a Ricker's model incorporating demographic stochasticity to simulate the effects of demographic uncertainty on responses of gray-tailed vole (Microtus canicaudus) populations to pesticide applications. We constructed models with mark-recapture data collected from populat...
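A minimal sketch of a Ricker model with demographic stochasticity, in the spirit of the study described above but not its fitted model: offspring numbers are Poisson-distributed around the deterministic Ricker expectation, and a constant survival multiplier stands in crudely for pesticide effects. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_ricker(n0=50, r=1.2, k=200.0, pesticide_survival=1.0,
                    n_years=30, n_reps=1000):
    """Ricker dynamics with demographic stochasticity (Poisson offspring numbers).
    `pesticide_survival` scales expected recruitment each year (assumed effect)."""
    n = np.full(n_reps, n0, dtype=float)
    extinct = np.zeros(n_reps, dtype=bool)
    for _ in range(n_years):
        expected = n * np.exp(r * (1.0 - n / k)) * pesticide_survival
        n = rng.poisson(expected).astype(float)  # demographic stochasticity
        extinct |= (n == 0)
    return extinct.mean()  # fraction of replicate populations that went extinct

print("Extinction risk, no pesticide:  %.3f" % simulate_ricker())
print("Extinction risk, 40%% mortality: %.3f" % simulate_ricker(pesticide_survival=0.6))
```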
Rainfall runoff modelling of the Upper Ganga and Brahmaputra basins using PERSiST.
Futter, M N; Whitehead, P G; Sarkar, S; Rodda, H; Crossman, J
2015-06-01
There are ongoing discussions about the appropriate level of complexity and sources of uncertainty in rainfall runoff models. Simulations for operational hydrology, flood forecasting or nutrient transport all warrant different levels of complexity in the modelling approach. More complex model structures are appropriate for simulations of land-cover dependent nutrient transport, while more parsimonious model structures may be adequate for runoff simulation. The appropriate level of complexity is also dependent on data availability. Here, we use PERSiST, a simple, semi-distributed dynamic rainfall-runoff modelling toolkit, to simulate flows in the Upper Ganges and Brahmaputra rivers. We present two sets of simulations driven by single time series of daily precipitation and temperature using simple (A) and complex (B) model structures based on uniform and hydrochemically relevant land covers, respectively. Models were compared based on ensembles of Bayesian Information Criterion (BIC) statistics. Equifinality was observed for parameters but not for model structures. Model performance was better for the more complex (B) structural representations than for parsimonious model structures. The results show that structural uncertainty is more important than parameter uncertainty. The ensembles of BIC statistics suggested that neither structural representation was preferable in a statistical sense. Simulations presented here confirm that relatively simple models with limited data requirements can be used to credibly simulate flows and water balance components needed for nutrient flux modelling in large, data-poor basins.
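A minimal sketch of the BIC-based comparison described above, assuming an iid Gaussian error model with variance estimated from the residuals; the flow series, parameter counts, and error magnitudes are synthetic stand-ins, not PERSiST output.

```python
import numpy as np

def bic_gaussian(obs, sim, n_params):
    """BIC under an iid Gaussian error model (up to an additive constant)."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    n = len(obs)
    rss = float(np.sum((obs - sim) ** 2))
    return n * np.log(rss / n) + n_params * np.log(n)

rng = np.random.default_rng(5)
obs = rng.gamma(2.0, 10.0, size=365)                 # stand-in daily flows
sim_simple = obs + rng.normal(0.0, 6.0, size=365)    # parsimonious structure (A)
sim_complex = obs + rng.normal(0.0, 5.0, size=365)   # land-cover based structure (B)

print("BIC, structure A (5 params):  %.1f" % bic_gaussian(obs, sim_simple, 5))
print("BIC, structure B (18 params): %.1f" % bic_gaussian(obs, sim_complex, 18))
# Lower BIC is preferred; the penalty term guards against over-parameterisation.
```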
NASA Astrophysics Data System (ADS)
Debusschere, Bert J.; Najm, Habib N.; Matta, Alain; Knio, Omar M.; Ghanem, Roger G.; Le Maître, Olivier P.
2003-08-01
This paper presents a model for two-dimensional electrochemical microchannel flow including the propagation of uncertainty from model parameters to the simulation results. For a detailed representation of electroosmotic and pressure-driven microchannel flow, the model considers the coupled momentum, species transport, and electrostatic field equations, including variable zeta potential. The chemistry model accounts for pH-dependent protein labeling reactions as well as detailed buffer electrochemistry in a mixed finite-rate/equilibrium formulation. Uncertainty from the model parameters and boundary conditions is propagated to the model predictions using a pseudo-spectral stochastic formulation with polynomial chaos (PC) representations for parameters and field quantities. Using a Galerkin approach, the governing equations are reformulated into equations for the coefficients in the PC expansion. The implementation of the physical model with the stochastic uncertainty propagation is applied to protein-labeling in a homogeneous buffer, as well as in two-dimensional electrochemical microchannel flow. The results for the two-dimensional channel show strong distortion of sample profiles due to ion movement and consequent buffer disturbances. The uncertainty in these results is dominated by the uncertainty in the applied voltage across the channel.
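The paper above uses an intrusive Galerkin polynomial chaos formulation; as a simpler illustration of the same representation, the sketch below takes the non-intrusive route, projecting a scalar model output onto a one-dimensional Hermite chaos basis by Gauss-Hermite quadrature for a standard-normal uncertain parameter. The model function and truncation order are illustrative choices.

```python
import numpy as np
from numpy.polynomial import hermite_e as He
from math import factorial, sqrt, pi

def model_output(xi):
    """Stand-in for a scalar simulation result depending on one uncertain parameter."""
    return np.exp(0.3 * xi)

order, n_quad = 6, 20
nodes, weights = He.hermegauss(n_quad)  # quadrature for weight exp(-x^2/2)

# PC coefficients c_k = E[f(xi) He_k(xi)] / k!  for xi ~ N(0, 1).
coeffs = []
for k in range(order + 1):
    basis_k = He.hermeval(nodes, [0] * k + [1])  # He_k evaluated at the nodes
    ck = np.sum(weights * model_output(nodes) * basis_k) / (sqrt(2 * pi) * factorial(k))
    coeffs.append(ck)

mean = coeffs[0]
variance = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)
print("PC mean: %.4f (exact %.4f)" % (mean, np.exp(0.045)))
print("PC variance: %.5f" % variance)
```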
NASA Technical Reports Server (NTRS)
Sotiropoulou, Rafaella-Eleni P.; Nenes, Athanasios; Adams, Peter J.; Seinfeld, John H.
2007-01-01
In situ observations of aerosol and cloud condensation nuclei (CCN) and the GISS GCM Model II' with an online aerosol simulation and explicit aerosol-cloud interactions are used to quantify the uncertainty in radiative forcing and autoconversion rate from application of Köhler theory. Simulations suggest that application of Köhler theory introduces a 10-20% uncertainty in global average indirect forcing and 2-11% uncertainty in autoconversion. Regionally, the uncertainty in indirect forcing ranges between 10-20%, and 5-50% for autoconversion. These results are insensitive to the range of updraft velocity and water vapor uptake coefficient considered. This study suggests that Köhler theory (as implemented in climate models) is not a significant source of uncertainty for aerosol indirect forcing but can be substantial for assessments of aerosol effects on the hydrological cycle in climatically sensitive regions of the globe. This implies that improvements in the representation of GCM subgrid processes and aerosol size distribution will mostly benefit indirect forcing assessments. Predictions of autoconversion, by nature, will be subject to considerable uncertainty; its reduction may require explicit representation of size-resolved aerosol composition and mixing state.
NASA Astrophysics Data System (ADS)
Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.
2016-02-01
The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave and gauge-based products, using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show APHRODITE has high accuracy at a monthly scale compared with other products, and GSMAP-MVK+ shows a clear advantage and is better than TRMM3B42 in relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can have a similar magnitude of contribution to discharge uncertainty as the hydrological models. A better precipitation product does not guarantee a better discharge simulation because of these interactions. It is also found that a good discharge simulation depends on a good combination of a hydrological model and a precipitation product, suggesting that, although the satellite-based precipitation products are not as accurate as the gauge-based products, they could have better performance in discharge simulations when appropriately combined with hydrological models. This information is revealed for the first time and is very beneficial for precipitation product applications.
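As a rough illustration of how contributions of precipitation products, hydrological models, and their interaction can be separated (a generic two-way ANOVA-style decomposition, not necessarily the framework used in the paper), the sketch below partitions the variance of a synthetic discharge matrix; all numbers are invented.

```python
import numpy as np

# Synthetic peak-discharge matrix: rows = 6 precipitation products, cols = 2 models.
q = np.array([[520, 610], [480, 590], [700, 640],
              [530, 720], [450, 560], [610, 650]], dtype=float)

grand = q.mean()
prod_effect = q.mean(axis=1) - grand    # precipitation-product main effects
model_effect = q.mean(axis=0) - grand   # hydrological-model main effects
interaction = q - grand - prod_effect[:, None] - model_effect[None, :]

ss_total = ((q - grand) ** 2).sum()
ss_prod = q.shape[1] * (prod_effect ** 2).sum()
ss_model = q.shape[0] * (model_effect ** 2).sum()
ss_inter = (interaction ** 2).sum()

for name, ss in [("precipitation products", ss_prod),
                 ("hydrological models", ss_model),
                 ("interaction", ss_inter)]:
    print("%s: %.1f%% of variance" % (name, 100 * ss / ss_total))
```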
How predictable is the timing of a summer ice-free Arctic?
NASA Astrophysics Data System (ADS)
Jahn, Alexandra; Kay, Jennifer E.; Holland, Marika M.; Hall, David M.
2016-09-01
Climate model simulations give a large range of over 100 years for predictions of when the Arctic could first become ice free in the summer, and many studies have attempted to narrow this uncertainty range. However, given the chaotic nature of the climate system, what amount of spread in the prediction of an ice-free summer Arctic is inevitable? Based on results from large ensemble simulations with the Community Earth System Model, we show that internal variability alone leads to a prediction uncertainty of about two decades, while scenario uncertainty between the strong (Representative Concentration Pathway (RCP) 8.5) and medium (RCP4.5) forcing scenarios adds at least another 5 years. Common metrics of the past and present mean sea ice state (such as ice extent, volume, and thickness) as well as global mean temperatures do not allow a reduction of the prediction uncertainty from internal variability.
Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa
NASA Astrophysics Data System (ADS)
Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu
2013-04-01
Hurricane Mitch in 1998 left a devastating flood in Tegucigalpa, the capital city of Honduras. Due to the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS) and post-event surveyed high water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.
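A minimal sketch of the GLUE evaluation step outlined above: ensemble runs are scored against surveyed high-water marks with an informal likelihood, runs above a behavioural threshold are retained, and likelihood-weighted bounds on the simulated water level are formed. The ensemble, likelihood measure, and threshold are illustrative assumptions, not those used in the study.

```python
import numpy as np

rng = np.random.default_rng(8)
n_runs, n_marks = 5000, 12

# Synthetic ensemble: simulated water levels at 12 surveyed high-water-mark sites (m).
observed = rng.uniform(2.0, 6.0, size=n_marks)
simulated = (observed
             + rng.normal(0.0, 0.8, size=(n_runs, n_marks))
             + rng.normal(0.0, 0.5, size=(n_runs, 1)))      # run-specific bias

# Informal likelihood: inverse mean squared error; keep the best 10% as behavioural.
err_var = ((simulated - observed) ** 2).mean(axis=1)
likelihood = 1.0 / err_var
behavioural = likelihood > np.quantile(likelihood, 0.90)    # assumed cutoff

w = likelihood[behavioural] / likelihood[behavioural].sum()
levels = simulated[behavioural]

def weighted_quantile(vals, weights, q):
    """Approximate weighted quantile via the cumulative weight of sorted values."""
    order = np.argsort(vals)
    return np.interp(q, np.cumsum(weights[order]), vals[order])

lower = weighted_quantile(levels[:, 0], w, 0.05)
upper = weighted_quantile(levels[:, 0], w, 0.95)
print("5-95%% GLUE bounds at site 0: %.2f - %.2f m" % (lower, upper))
```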
Quantifying chemical uncertainties in simulations of the ISM
NASA Astrophysics Data System (ADS)
Glover, Simon
2018-06-01
The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as enabling us to generated detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data is most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2014-01-01
Simulation codes often utilize finite-dimensional approximation, resulting in numerical error. Some examples include numerical methods utilizing grids and finite-dimensional basis functions, and particle methods using a finite number of particles. These same simulation codes also often contain sources of uncertainty, for example, uncertain parameters and fields associated with the imposition of initial and boundary data, uncertain physical model parameters such as chemical reaction rates, mixture model parameters, material property parameters, etc.
NASA Astrophysics Data System (ADS)
Li, Y.; Kinzelbach, W.; Zhou, J.; Cheng, G. D.; Li, X.
2012-05-01
The hydrologic model HYDRUS-1D and the crop growth model WOFOST are coupled to efficiently manage water resources in agriculture and improve the prediction of crop production. The results of the coupled model are validated by experimental studies of irrigated maize conducted in the middle reaches of northwest China's Heihe River, a semi-arid to arid region. Good agreement is achieved between the simulated evapotranspiration, soil moisture and crop production and their respective field measurements made under current maize irrigation and fertilization. Based on the calibrated model, the scenario analysis reveals that the optimal amount of irrigation is 500-600 mm in this region. However, for regions without detailed observation, the results of the numerical simulation can be unreliable for irrigation decision making owing to the shortage of calibrated model boundary conditions and parameters. Therefore, we develop a method of combining model ensemble simulations and uncertainty/sensitivity analysis to estimate the probability of crop production. In our studies, the uncertainty analysis is used to reveal the risk of facing a loss of crop production as irrigation decreases. The global sensitivity analysis is used to test the coupled model and further quantitatively analyse the impact of the uncertainty of coupled model parameters and environmental scenarios on crop production. This method can be used for estimation in regions with no or reduced data availability.
Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic.
Salmanidou, D M; Guillas, S; Georgiopoulou, A; Dias, F
2017-04-01
Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments may often have a high computational cost. This forms a limiting factor for performing uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way. They can form a prominent solution to explore thousands of scenarios that would be otherwise numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed under a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model, and the discrepancy function between model and observations. Distributions of the calibrated input parameters are obtained as a result of the calibration. In a second step, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first step to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained.
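For readers unfamiliar with statistical emulation, a minimal Gaussian-process surrogate of an expensive simulator can be sketched as follows; the stand-in simulator, training design, kernel, and input distributions below are illustrative assumptions, not the configuration used for the Rockall Bank study.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical training design: landslide parameters (volume scale, friction
# coefficient) and the simulator's maximum free-surface elevation (m) at one gauge.
rng = np.random.default_rng(1)
X_train = rng.uniform([0.5, 0.05], [2.0, 0.30], size=(40, 2))

def expensive_simulator(x):          # stand-in for the coupled landslide-tsunami code
    return 3.0 * x[:, 0] * np.exp(-4.0 * x[:, 1]) + 0.1 * rng.normal(size=len(x))

y_train = expensive_simulator(X_train)

# Fit the GP emulator (surrogate) to the training runs.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.5, 0.1])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_train, y_train)

# Propagate uncertainty: sample inputs from assumed calibrated distributions
# (placeholder lognormal/beta choices) and predict with the cheap emulator.
X_new = np.column_stack([rng.lognormal(0.0, 0.2, 5000),
                         rng.beta(2.0, 8.0, 5000) * 0.3])
mean, std = gp.predict(X_new, return_std=True)
print(f"predicted max elevation: {mean.mean():.2f} m, "
      f"input-uncertainty spread {mean.std():.2f} m, emulator std {std.mean():.2f} m")
```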
Uncertainty analyses of CO2 plume expansion subsequent to wellbore CO2 leakage into aquifers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hou, Zhangshuan; Bacon, Diana H.; Engel, David W.
2014-08-01
In this study, we apply an uncertainty quantification (UQ) framework to CO2 sequestration problems. In one scenario, we look at the risk of wellbore leakage of CO2 into a shallow unconfined aquifer in an urban area; in another scenario, we study the effects of reservoir heterogeneity on CO2 migration. We combine various sampling approaches (quasi-Monte Carlo, probabilistic collocation, and adaptive sampling) in order to reduce the number of forward calculations while trying to fully explore the input parameter space and quantify the input uncertainty. The CO2 migration is simulated using the PNNL-developed simulator STOMP-CO2e (the water-salt-CO2 module). For computationally demanding simulations with 3D heterogeneity fields, we combined the framework with a scalable version module, eSTOMP, as the forward modeling simulator. We built response curves and response surfaces of model outputs with respect to input parameters to look at the individual and combined effects, and to identify and rank the significance of the input parameters.
Paracousti-UQ: A Stochastic 3-D Acoustic Wave Propagation Algorithm.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, Leiph
Acoustic full waveform algorithms, such as Paracousti, provide deterministic solutions in complex, 3-D variable environments. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected sound levels within an environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. Performing Monte Carlo (MC) simulations is one method of assessing this uncertainty, but it can quickly become computationally intractable for realistic problems. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a fraction of the computational cost of MC. Paracousti-UQ solves the SPDE system of 3-D acoustic wave propagation equations and provides estimates of the uncertainty of the output simulated wave field (e.g., amplitudes, waveforms) based on estimated probability distributions of the input medium and source parameters. This report describes the derivation of the stochastic partial differential equations, their implementation, and comparison of Paracousti-UQ results with MC simulations using simple models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael
The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g., a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy-to-expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
NASA Astrophysics Data System (ADS)
Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang
2010-05-01
CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation, and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al., Computational Geosciences 13, 2009). A reasonable compromise between computational effort and precision was already reached with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification for modeling CO2 injection, and the consequences can be stronger than those of neglecting several physical phenomena (e.g., phase transition, convective mixing, capillary forces, etc.). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
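The response-surface idea can be sketched, in much simplified form, as fitting a second-order polynomial surrogate to a small number of model runs and then sampling the surrogate cheaply; the stand-in model, design size, and standard-normal inputs below are assumptions, and the full probabilistic collocation rule is replaced by a simple random design.

```python
import numpy as np
from itertools import combinations_with_replacement

# Minimal second-order polynomial-chaos-style response surface, assuming two
# standardized uncertain inputs (e.g. log-permeability, porosity) and a scalar
# output (e.g. predicted leakage rate). The "model" below is a stand-in.
rng = np.random.default_rng(2)

def model(x):                          # placeholder for the expensive CO2 simulator
    return np.exp(0.8 * x[:, 0]) * (1.0 + 0.3 * x[:, 1] ** 2)

# A small random Gaussian design stands in for the collocation points.
X = rng.standard_normal((30, 2))
y = model(X)

# Build the second-order polynomial basis 1, x1, x2, x1^2, x1*x2, x2^2.
def design(X):
    cols = [np.ones(len(X))]
    for deg in (1, 2):
        for idx in combinations_with_replacement(range(X.shape[1]), deg):
            cols.append(np.prod(X[:, list(idx)], axis=1))
    return np.column_stack(cols)

coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# Cheap Monte Carlo on the response surface instead of the full simulator.
X_mc = rng.standard_normal((100_000, 2))
y_mc = design(X_mc) @ coef
print(f"mean = {y_mc.mean():.3f}, 95th percentile = {np.percentile(y_mc, 95):.3f}")
```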
Coupled Electro-Thermal Simulations of Single Event Burnout in Power Diodes
NASA Astrophysics Data System (ADS)
Albadri, A. M.; Schrimpf, R. D.; Walker, D. G.; Mahajan, S. V.
2005-12-01
Power diodes may undergo destructive failures when they are struck by high-energy particles during the off state (high reverse-bias voltage). This paper describes the failure mechanism using a coupled electro-thermal model. The specific case of a 3500-V diode is considered and it is shown that the temperatures reached when high voltages are applied are sufficient to cause damage to the constituent materials of the diode. The voltages at which failure occurs (e.g., 2700 V for a 17-MeV carbon ion) are consistent with previously reported data. The simulation results indicate that the catastrophic failures result from local heating caused by avalanche multiplication of ion-generated carriers.
NASA Astrophysics Data System (ADS)
Jianjun, X.; Bingjie, Y.; Rongji, W.
2018-03-01
The purpose of this paper is to improve the level of catastrophe insurance. Firstly, earthquake predictions were carried out using mathematical analysis methods. Secondly, foreign catastrophe insurance policies and models were compared. Thirdly, suggestions on catastrophe insurance for China were discussed. Further study should pay more attention to earthquake prediction by introducing big data.
NASA Astrophysics Data System (ADS)
Xiong, Wei; Skalský, Rastislav; Porter, Cheryl H.; Balkovič, Juraj; Jones, James W.; Yang, Di
2016-09-01
Understanding the interactions between agricultural production and climate is necessary for sound decision-making in climate policy. Gridded, high-resolution crop simulation has emerged as a useful tool for building this understanding. Large uncertainty exists in this application, obstructing its capacity as a tool to devise adaptation strategies. Increasing focus has been given to uncertainties arising from climate scenarios, input data, and models, but uncertainties due to model parameters and calibration are still unknown. Here, we use publicly available geographical data sets as input to the Environmental Policy Integrated Climate model (EPIC) for simulating global-gridded maize yield. Impacts of climate change are assessed up to the year 2099 under a climate scenario generated by HadGEM2-ES under RCP 8.5. We apply five strategies, shifting one specific parameter in each simulation, to calibrate the model and understand the effects of calibration. Regionalizing crop phenology or harvest index appears effective for calibrating the model for the globe, but using various values of phenology generates pronounced differences in estimated climate impact. However, projected impacts of climate change on global maize production are consistently negative regardless of the parameter being adjusted. Different values of model parameters result in a modest uncertainty at the global level, with differences in the global yield change of less than 30% by the 2080s. The uncertainty tends to decrease if model calibration or input-data quality control is applied. Calibration has a larger effect at local scales, implying the possible types and locations for adaptation.
Johnson, Aaron W; Duda, Kevin R; Sheridan, Thomas B; Oman, Charles M
2017-03-01
This article describes a closed-loop, integrated human-vehicle model designed to help understand the underlying cognitive processes that influenced changes in subject visual attention, mental workload, and situation awareness across control mode transitions in a simulated human-in-the-loop lunar landing experiment. Control mode transitions from autopilot to manual flight may cause total attentional demands to exceed operator capacity. Attentional resources must be reallocated and reprioritized, which can increase the average uncertainty in the operator's estimates of low-priority system states. We define this increase in uncertainty as a reduction in situation awareness. We present a model built upon the optimal control model for state estimation, the crossover model for manual control, and the SEEV (salience, effort, expectancy, value) model for visual attention. We modify the SEEV attention executive to direct visual attention based, in part, on the uncertainty in the operator's estimates of system states. The model was validated using the simulated lunar landing experimental data, demonstrating an average difference in the percentage of attention ≤3.6% for all simulator instruments. The model's predictions of mental workload and situation awareness, measured by task performance and system state uncertainty, also mimicked the experimental data. Our model supports the hypothesis that visual attention is influenced by the uncertainty in system state estimates. Conceptualizing situation awareness around the metric of system state uncertainty is a valuable way for system designers to understand and predict how reallocations in the operator's visual attention during control mode transitions can produce reallocations in situation awareness of certain states.
NASA Astrophysics Data System (ADS)
Lu, D.; Ricciuto, D. M.; Evans, K. J.
2017-12-01
Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As the data-worth analysis involves a great deal of expectation estimation, the cost savings from MLMC in the assessment can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimates obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational costs of the uncertainty reduction estimation, with up to 600 days of cost savings when one processor is used.
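A minimal sketch of the multilevel Monte Carlo telescoping estimator, with a toy model family standing in for the multifidelity subsurface models (the levels, sample allocation, and model itself are illustrative):

```python
import numpy as np

# Minimal multilevel Monte Carlo (MLMC) mean estimator, assuming a family of
# models level_model(x, level) of increasing fidelity and cost. The "levels"
# here are stand-ins whose bias shrinks geometrically with the level index.
rng = np.random.default_rng(3)

def level_model(x, level):
    return np.sin(x) + 2.0 ** (-level) * np.cos(3.0 * x)

def mlmc_mean(n_samples_per_level):
    estimate = 0.0
    for level, n in enumerate(n_samples_per_level):
        x = rng.standard_normal(n)
        fine = level_model(x, level)
        if level == 0:
            correction = fine                      # E[P_0]
        else:
            coarse = level_model(x, level - 1)     # same samples on both levels
            correction = fine - coarse             # E[P_l - P_{l-1}]
        estimate += correction.mean()
    return estimate

# Many cheap coarse samples, few expensive fine samples.
print(mlmc_mean([20000, 4000, 800, 160]))
```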
2011-01-01
Background: Although exercise is currently advocated as one of the most effective management strategies for fibromyalgia syndrome (FMS), the implementation of exercise as an FMS treatment in practice is significantly hampered by patients' poor compliance. The inference that pain catastrophizing is a key predictor of poor compliance in FMS patients justifies considering the alteration of pain catastrophizing in improving compliance towards exercises in FMS patients. The aim of this study is to provide proof-of-concept for the development and testing of a novel virtual reality exposure therapy (VRET) program as treatment for exercise-related pain catastrophizing in FMS patients. Methods: Two interlinked experimental studies will be conducted. Study 1 aims to objectively ascertain whether neurophysiological changes occur in the functional brain areas associated with pain catastrophizing when catastrophizing FMS subjects are exposed to visuals of exercise activities. Study 2 aims to ascertain the preliminary efficacy and feasibility of exposure to visuals of exercise activities as a treatment for exercise-related pain catastrophizing in FMS subjects. Twenty subjects will be selected from a group of FMS patients attending the Tygerberg Hospital in Cape Town, South Africa, and randomly allocated to either the VRET (intervention) group or the waiting list (control) group. Baseline neurophysiological activity for subjects will be collected in Study 1 using functional magnetic resonance imaging (fMRI). In Study 2, clinical improvement in pain catastrophizing will be measured using fMRI (objective) and the pain catastrophizing scale (subjective). Discussion: The premise is that if exposing FMS patients to visuals of various exercise activities triggers the functional brain areas associated with pain catastrophizing, then, as a treatment, repeated exposure to visuals of the exercise activities using a VRET program could possibly decrease exercise-related pain catastrophizing in FMS patients. Proof-of-concept will either be established or negated. The results of this project are envisaged to revolutionize FMS and pain catastrophizing research and, in the future, assist health professionals and FMS patients in reducing despondency regarding FMS management. Trial registration: PACTR201011000264179
Uncertainty of fast biological radiation dose assessment for emergency response scenarios.
Ainsbury, Elizabeth A; Higueras, Manuel; Puig, Pedro; Einbeck, Jochen; Samaga, Daniel; Barquinero, Joan Francesc; Barrios, Lleonard; Brzozowska, Beata; Fattibene, Paola; Gregoire, Eric; Jaworska, Alicja; Lloyd, David; Oestreicher, Ursula; Romm, Horst; Rothkamm, Kai; Roy, Laurence; Sommer, Sylwester; Terzoudi, Georgia; Thierens, Hubert; Trompier, Francois; Vral, Anne; Woda, Clemens
2017-01-01
Reliable dose estimation is an important factor in appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The relative accuracy and relevance of techniques for calculating uncertainty and combining assay results to produce single dose and uncertainty estimates are further discussed. Finally, it is demonstrated that whatever uncertainty estimation method is employed, ignoring the uncertainty on fast dose assessments can have an important impact on rapid biodosimetric categorization.
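By way of illustration only, a Bayesian dose estimate from a single cytogenetic assay can be sketched with a Poisson likelihood on dicentric counts and a grid posterior over dose; the linear-quadratic calibration coefficients and counts below are hypothetical and are not the calibration data used in the MULTIBIODOSE/RENEB exercises.

```python
import numpy as np

# Illustrative Bayesian dose estimate from a dicentric count, assuming a
# linear-quadratic calibration curve y(D) = c + a*D + b*D^2 dicentrics/cell and
# Poisson-distributed counts. All coefficients and counts are hypothetical.
c, a, b = 0.001, 0.03, 0.06          # dicentrics per cell, per Gy, per Gy^2
n_cells, n_dicentrics = 500, 45      # scored cells and observed dicentrics

dose = np.linspace(0.0, 6.0, 1201)   # Gy, grid for the posterior
dd = dose[1] - dose[0]
expected = n_cells * (c + a * dose + b * dose ** 2)

# Flat prior on dose; Poisson log-likelihood up to a constant.
log_post = n_dicentrics * np.log(expected) - expected
post = np.exp(log_post - log_post.max())
post /= post.sum() * dd              # normalize on the dose grid

cdf = np.cumsum(post) * dd
lo, hi = np.interp([0.025, 0.975], cdf, dose)
mean_dose = np.sum(dose * post) * dd
print(f"posterior mean {mean_dose:.2f} Gy, 95% interval [{lo:.2f}, {hi:.2f}] Gy")
```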
Tsunami hazard assessments with consideration of uncertain earthquakes characteristics
NASA Astrophysics Data System (ADS)
Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.
2017-12-01
The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, it must adopt an uncertainty propagation method to determine tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen-Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
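A bare-bones Karhunen-Loeve sampling step (illustrative only; the translation process for non-Gaussian marginals used in the study is omitted and replaced by a crude clipping) might look like this for a 1-D fault discretization with an assumed exponential covariance:

```python
import numpy as np

# Minimal Karhunen-Loeve sketch for generating correlated earthquake slip samples
# on a 1-D fault discretization; the covariance model and parameters are illustrative.
rng = np.random.default_rng(4)
n_patches = 100
x = np.linspace(0.0, 200.0, n_patches)          # along-strike position (km)
corr_len, sigma, mean_slip = 40.0, 2.0, 5.0     # km, m, m

# Exponential covariance and its eigen-decomposition (the K-L modes).
cov = sigma ** 2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
eigval, eigvec = np.linalg.eigh(cov)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]   # descending order
eigval = np.clip(eigval, 0.0, None)              # guard against round-off negatives
k = np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.95) + 1  # keep 95% variance

# Truncated K-L expansion: slip = mean + sum_i sqrt(lambda_i) * phi_i * xi_i.
xi = rng.standard_normal((1000, k))
slip = mean_slip + xi @ (eigvec[:, :k] * np.sqrt(eigval[:k])).T
slip = np.clip(slip, 0.0, None)                  # crude non-negativity (illustrative)
print(slip.shape, k)
```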
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Heng, E-mail: hengli@mdanderson.org; Zhu, X. Ronald; Zhang, Xiaodong
Purpose: To develop and validate a novel delivery strategy for reducing the respiratory motion–induced dose uncertainty of spot-scanning proton therapy. Methods and Materials: The spot delivery sequence was optimized to reduce dose uncertainty. The effectiveness of the delivery sequence optimization was evaluated using measurements and patient simulation. One hundred ninety-one 2-dimensional measurements using different delivery sequences of a single-layer uniform pattern were obtained with a detector array on a 1-dimensional moving platform. Intensity modulated proton therapy plans were generated for 10 lung cancer patients, and dose uncertainties for different delivery sequences were evaluated by simulation. Results: Without delivery sequence optimization, the maximum absolute dose error can be up to 97.2% in a single measurement, whereas the optimized delivery sequence results in a maximum absolute dose error of ≤11.8%. In patient simulation, the optimized delivery sequence reduces the mean of fractional maximum absolute dose error compared with the regular delivery sequence by 3.3% to 10.6% (32.5-68.0% relative reduction) for different patients. Conclusions: Optimizing the delivery sequence can reduce dose uncertainty due to respiratory motion in spot-scanning proton therapy, assuming the 4-dimensional CT is a true representation of the patients' breathing patterns.
NASA Astrophysics Data System (ADS)
Jie, M.; Zhang, J.; Guo, B. B.
2017-12-01
As a typical distributed hydrological model, the SWAT model also presents challenges in calibrating parameters and analyzing their uncertainty. This paper takes the Chaohe River Basin, China, as the study area. After the SWAT model is established and the DEM data of the Chaohe River Basin are loaded, the watershed is automatically divided into several sub-basins. Land use, soil, and slope are analyzed on the basis of the sub-basins to delineate the hydrological response units (HRUs) of the study area; after running the SWAT model, simulated runoff values for the watershed are obtained. On this basis, weather data and known daily runoff at three hydrological stations, combined with the SWAT-CUP automatic program and manual adjustment, are used for multi-site calibration of the model parameters. Furthermore, the GLUE algorithm is used to analyze the parameter uncertainty of the SWAT model. Through the sensitivity analysis, calibration, and uncertainty study of SWAT, the results indicate that the parameterization of the hydrological characteristics of the Chaohe River is successful and feasible and can be used to simulate the Chaohe River Basin.
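A small sketch of a multi-site calibration objective of the kind such a calibration loop might maximize, using the Nash-Sutcliffe efficiency per station on toy runoff series (the data and combination rule are illustrative, not the SWAT-CUP objective actually used):

```python
import numpy as np

# Simple multi-site calibration objective, assuming daily simulated and observed
# runoff at three hydrological stations; the per-station Nash-Sutcliffe efficiency
# (NSE) is combined into a single score a calibration loop could maximize.
def nse(sim, obs):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

rng = np.random.default_rng(9)
obs = [rng.gamma(2.0, 3.0, 365) for _ in range(3)]           # toy observed runoff
sim = [o * rng.normal(1.0, 0.15, o.size) for o in obs]       # toy simulated runoff

scores = [nse(s, o) for s, o in zip(sim, obs)]
print([round(s, 3) for s in scores], "combined:", round(float(np.mean(scores)), 3))
```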
NASA Astrophysics Data System (ADS)
Chigira, M.; Matsushi, Y.; Tsou, C.
2013-12-01
Our experience of catastrophic landslides induced by rainstorms and earthquakes in recent years suggests that many of them are preceded by deep-seated gravitational slope deformation. Deep-seated gravitational slope deformation proceeds slowly and continually; some deforming slopes transform into catastrophic failures, which cause devastating damage over wide areas, whereas others do not. Deep-seated gravitational slope deformation that preceded catastrophic failures induced by Typhoon Talas (2011, Japan) had been surveyed beforehand with an airborne laser scanner, and the resulting high-resolution DEMs gave us an important clue for identifying which topographic features of gravitational slope deformation are susceptible to catastrophic failure. We found that 26 of 39 deep-seated catastrophic landslides had small scarps along the heads of the future landslides. These scarps were caused by gravitational slope deformation that preceded the catastrophic failure. Although the scarps may have been enlarged by degradation, their sizes relative to the whole slopes suggest that minimal slope deformation had occurred in the period immediately before the catastrophic failure. The scarp ratio, defined as the ratio of the length of a scarp to that of the whole slope, both measured along the slope line, ranged from 1% to 23%. Of the landslides with small scarps, 38% had scarp ratios of less than 4%, and half had ratios of less than 8%. This suggests that the gravitational slope deformation preceding catastrophic failure was relatively small, and may indicate that those slopes were under critical conditions just before catastrophic failure. The above scarp ratios may be characteristic of accretionary complexes with undulating, anastomosing thrust faults, which were the major sliding surfaces of the typhoon-induced landslides. Eleven of the remaining 13 landslides occurred within scars of previous landslides or as extensions of such scars at the lower parts of gravitationally deformed slopes. One of the remaining two landslides had been preceded by a linear depression at its top, and no topographic precursor could be identified for the last one.
NASA Astrophysics Data System (ADS)
Raj, R.; Hamm, N. A. S.; van der Tol, C.; Stein, A.
2015-08-01
Gross primary production (GPP), separated from flux tower measurements of net ecosystem exchange (NEE) of CO2, is used increasingly to validate process-based simulators and remote sensing-derived estimates of GPP at various time steps. Proper validation should include the uncertainty associated with this separation at different time steps. This can be achieved by using a Bayesian framework. In this study, we estimated the uncertainty in GPP at half-hourly time steps. We used a non-rectangular hyperbola (NRH) model to separate GPP from flux tower measurements of NEE at the Speulderbos forest site, The Netherlands. The NRH model included the variables that influence GPP, in particular radiation and temperature. In addition, the NRH model provided a robust empirical relationship between radiation and GPP by including the degree of curvature of the light response curve. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. Adopting a Bayesian approach, we defined the prior distribution of each NRH parameter. Markov chain Monte Carlo (MCMC) simulation was used to update the prior distribution of each NRH parameter. This yielded the posterior distribution of GPP at each half hour and allowed the quantification of its uncertainty; the resulting time series of posterior distributions also allowed us to estimate the uncertainty at daily time steps. We compared informative with non-informative prior distributions of the NRH parameters. The results showed that both choices of prior produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
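A simplified sketch of this kind of Bayesian NRH fit, using a Metropolis sampler on synthetic half-hourly data (the NRH light-response form is standard, but the priors, noise level, step sizes, and data below are illustrative assumptions, not the Speulderbos configuration):

```python
import numpy as np

# Metropolis sketch for fitting a non-rectangular hyperbola (NRH) light-response
# model to half-hourly NEE, assuming NEE = R_eco - GPP(I) with
# GPP(I) = (alpha*I + P_max - sqrt((alpha*I + P_max)**2 - 4*theta*alpha*I*P_max)) / (2*theta).
rng = np.random.default_rng(5)

def gpp_nrh(I, alpha, pmax, theta):
    s = alpha * I + pmax
    return (s - np.sqrt(s ** 2 - 4.0 * theta * alpha * I * pmax)) / (2.0 * theta)

I = rng.uniform(0.0, 1500.0, 480)                      # radiation over a 10-day window
true = dict(alpha=0.05, pmax=25.0, theta=0.7, reco=4.0)
nee_obs = true["reco"] - gpp_nrh(I, true["alpha"], true["pmax"], true["theta"]) \
          + rng.normal(0.0, 1.5, I.size)

def log_post(p):
    alpha, pmax, theta, reco = p
    if not (0 < alpha < 0.2 and 0 < pmax < 60 and 0.01 < theta < 0.99 and 0 < reco < 20):
        return -np.inf                                  # flat priors on bounded ranges
    resid = nee_obs - (reco - gpp_nrh(I, alpha, pmax, theta))
    return -0.5 * np.sum((resid / 1.5) ** 2)

chain, p = [], np.array([0.03, 20.0, 0.5, 3.0])
lp = log_post(p)
for _ in range(20000):
    prop = p + rng.normal(0.0, [0.003, 0.8, 0.03, 0.2])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        p, lp = prop, lp_prop
    chain.append(p.copy())
chain = np.array(chain[5000:])                          # discard burn-in

# Posterior of GPP at each half hour gives the uncertainty in the separated GPP.
gpp_post = np.array([gpp_nrh(I, *c[:3]) for c in chain[::50]])
print(gpp_post.mean(axis=0)[:3], gpp_post.std(axis=0)[:3])
```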
An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon
NASA Technical Reports Server (NTRS)
Rutherford, Brian
2000-01-01
The ability to make credible system assessments, predictions, and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided by each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components - one component that could be explained through the additional computational simulation runs and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems. These examples give encouraging results. Directions for further research are indicated.
Performance of Trajectory Models with Wind Uncertainty
NASA Technical Reports Server (NTRS)
Lee, Alan G.; Weygandt, Stephen S.; Schwartz, Barry; Murphy, James R.
2009-01-01
Typical aircraft trajectory predictors use wind forecasts but do not account for the forecast uncertainty. A method for generating estimates of wind prediction uncertainty is described and its effect on aircraft trajectory prediction uncertainty is investigated. The procedure for estimating the wind prediction uncertainty relies on a time-lagged ensemble of weather model forecasts from the hourly updated Rapid Update Cycle (RUC) weather prediction system. Forecast uncertainty is estimated using measures of the spread amongst various RUC time-lagged ensemble forecasts. This proof-of-concept study illustrates the estimated uncertainty and the actual wind errors, and documents the validity of the assumed ensemble-forecast accuracy relationship. Aircraft trajectory predictions are made using RUC winds with provision for the estimated uncertainty. Results for a set of simulated flights indicate this simple approach effectively translates the wind uncertainty estimate into an aircraft trajectory uncertainty. A key strength of the method is the ability to relate uncertainty to specific weather phenomena (contained in the various ensemble members), allowing identification of regional variations in uncertainty.
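The core of the time-lagged ensemble idea can be sketched in a few lines: successive forecast cycles valid at the same time are treated as an ensemble, and their spread is mapped onto a trajectory uncertainty (the forecast values and the spread-to-position mapping below are hypothetical):

```python
import numpy as np

# Illustrative time-lagged ensemble spread as a wind-uncertainty proxy: several
# successive forecast cycles valid at the same hour are treated as an ensemble.
rng = np.random.default_rng(6)

# Hypothetical u-wind forecasts (kt) valid at the same time from 5 earlier cycles.
lagged_u = np.array([42.0, 45.5, 39.0, 44.0, 41.5]) + rng.normal(0, 0.5, 5)
u_mean, u_spread = lagged_u.mean(), lagged_u.std(ddof=1)

# Crude along-track position uncertainty after flying for `minutes` with that wind,
# assuming the spread acts like a 1-sigma error on groundspeed.
minutes = 30.0
pos_sigma_nm = u_spread * minutes / 60.0
print(f"wind {u_mean:.1f} kt, spread {u_spread:.1f} kt, "
      f"~{pos_sigma_nm:.1f} nm 1-sigma position uncertainty after {minutes:.0f} min")
```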
Reversing cooling flows with AGN jets: shock waves, rarefaction waves and trailing outflows
NASA Astrophysics Data System (ADS)
Guo, Fulai; Duan, Xiaodong; Yuan, Ye-Fei
2018-01-01
The cooling flow problem is one of the central problems in galaxy clusters, and active galactic nucleus (AGN) feedback is considered to play a key role in offsetting cooling. However, how AGN jets heat and suppress cooling flows remains highly debated. Using an idealized simulation of a cool-core cluster, we study the development of central cooling catastrophe and how a subsequent powerful AGN jet event averts cooling flows, with a focus on complex gasdynamical processes involved. We find that the jet drives a bow shock, which reverses cooling inflows and overheats inner cool-core regions. The shocked gas moves outward in a rarefaction wave, which rarefies the dense core and adiabatically transports a significant fraction of heated energy to outer regions. As the rarefaction wave propagates away, inflows resume in the cluster core, but a trailing outflow is uplifted by the AGN bubble, preventing gas accumulation and catastrophic cooling in central regions. Inflows and trailing outflows constitute meridional circulations in the cluster core. At later times, trailing outflows fall back to the cluster centre, triggering central cooling catastrophe and potentially a new generation of AGN feedback. We thus envisage a picture of cool cluster cores going through cycles of cooling-induced contraction and AGN-induced expansion. This picture naturally predicts an anti-correlation between the gas fraction (or X-ray luminosity) of cool cores and the central gas entropy, which may be tested by X-ray observations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Y. Z., E-mail: yzzhangmail@sohu.com
2015-02-10
Using a 2.5-dimensional MHD simulation, we investigate the role played by the inner coronal null point in the formation and evolution of solar quiescent prominences. The flux rope is characterized by its magnetic fluxes, the toroidal magnetic flux Φ_p and the poloidal flux Φ_ψ. It is found that for a given Φ_p, the catastrophe does not occur in the flux rope system until Φ_ψ increases to a critical point. Moreover, the magnetic flux of the null point is the maximum value of the magnetic flux in the quadrupole background magnetic field, and is represented by ψ_N. The results show that a bigger ψ_N usually corresponds to a smaller catastrophic point, a lower magnetic energy of the flux rope system, and less magnetic energy inside the flux rope. Our results confirm that catastrophic disruption of the prominence occurs more easily when there is a bigger ψ_N. However, ψ_N has little influence on the maximum speed of the coronal mass ejections (CMEs) with an erupted prominence. Thus we argue that a topological configuration with the inner coronal null point is a necessary structure for the formation and evolution of solar quiescent prominences. In conclusion, it is easier for the prominences to form and to erupt as a core part of the CMEs in a magnetic structure with a greater ψ_N.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henderson, Calen B., E-mail: henderson@astronomy.ohio-state.edu
2015-02-10
I investigate the possibility of constraining the flux of the lens (i.e., host star) for the types of planetary systems the Korean Microlensing Telescope Network is predicted to find. I examine the potential to obtain lens flux measurements by (1) imaging the lens once it is spatially resolved from the source, (2) measuring the elongation of the point-spread function of the microlensing target (lens+source) when the lens and source are still unresolved, and (3) taking prompt follow-up photometry. In each case I simulate the observing programs for a representative example of current ground-based adaptive optics (AO) facilities (specifically NACO on the Very Large Telescope), future ground-based AO facilities (GMTIFS on the Giant Magellan Telescope, GMT), and future space telescopes (NIRCAM on the James Webb Space Telescope, JWST). Given the predicted distribution of relative lens-source proper motions, I find that the lens flux could be measured to a precision of σ_{H_ℓ} ≤ 0.1 for ≳60% of planet detections ≥5 yr after each microlensing event for a simulated observing program using GMT, which images resolved lenses. NIRCAM on JWST would be able to carry out equivalently high-precision measurements for ∼28% of events Δt = 10 yr after each event by imaging resolved lenses. I also explore the effects various blend components would have on the mass derived from prompt follow-up photometry, including companions to the lens, companions to the source, and unassociated interloping stars. I find that undetected blend stars would cause catastrophic failures (i.e., >50% fractional uncertainty in the inferred lens mass) for ≲(16·f_bin)% of planet detections, where f_bin is the binary fraction, with the majority of these failures occurring for host stars with mass ≲0.3 M_☉.
Does catastrophic thinking enhance oesophageal pain sensitivity? An experimental investigation.
Martel, M O; Olesen, A E; Jørgensen, D; Nielsen, L M; Brock, C; Edwards, R R; Drewes, A M
2016-09-01
Gastro-oesophageal reflux disease (GORD) is a major health problem that is frequently accompanied by debilitating oesophageal pain symptoms. The first objective of the study was to examine the association between catastrophizing and oesophageal pain sensitivity. The second objective was to examine whether catastrophizing was associated with the magnitude of acid-induced oesophageal sensitization. Twenty-five healthy volunteers (median age: 24.0 years; range: 22-31) were recruited and were asked to complete the Pain Catastrophizing Scale (PCS). During two subsequent study visits, mechanical, thermal, and electrical pain sensitivity in the oesophagus was assessed before and after inducing oesophageal sensitization using a 30-min intraluminal oesophageal acid perfusion procedure. Analyses were conducted based on data averaged across the two study visits. At baseline, catastrophizing was significantly associated with mechanical (r = -0.42, p < 0.05) and electrical (r = -0.60, p < 0.01) pain thresholds. After acid perfusion, catastrophizing was also significantly associated with mechanical (r = -0.58, p < 0.01) and electrical (r = -0.50, p < 0.05) pain thresholds. Catastrophizing was not significantly associated with thermal pain thresholds. Subsequent analyses revealed that catastrophizing was not significantly associated with the magnitude of acid-induced oesophageal sensitization. Taken together, findings from the present study suggest that catastrophic thinking exerts an influence on oesophageal pain sensitivity, but not necessarily on the magnitude of acid-induced oesophageal sensitization. WHAT DOES THIS STUDY ADD?: Catastrophizing is associated with heightened pain sensitivity in the oesophagus. This was substantiated by assessing responses to noxious stimulation of the oesophagus using an experimental paradigm mimicking features and symptoms experienced by patients with gastro-oesophageal reflux disease (GORD).
Hospitalization and catastrophic medical payment: evidence from hospitals located in Tehran.
Ghiasvand, Hesam; Sha'baninejad, Hossein; Arab, Mohammad; Rashidian, Arash
2014-07-01
Hospitalized patients constitute the main fraction of users in any health system. The financial burden of paying for the services and care received by these users is sometimes unbearable and may lead to catastrophic medical payments. Designing and implementing effective health prepayment schemes therefore appears to be an effective governmental intervention to reduce catastrophic medical payments and protect households against them. We aimed to calculate the proportion of hospitalized patients exposed to catastrophic medical payments, its determinant factors, and its distribution. We conducted a cross-sectional study with 400 samples in five hospitals affiliated with Tehran University of Medical Sciences (TUMS). A self-administered questionnaire was distributed among respondents. Data were analyzed by logistic regression and χ² statistics. We also drew the Lorenz curve and calculated the Gini coefficient in order to present the distribution of the catastrophic medical payments burden across different income levels. About 15.05% of patients were exposed to catastrophic medical payments. We also found that the educational level and sex of the patient's family head, the number of hospitalization days, any out-of-hospital payments linked with the same admission, and the household's annual income level were associated with a higher likelihood of exposure to catastrophic medical payments. The Gini coefficient for the distribution of catastrophic medical payments is about 0.8. There is a high level of catastrophic medical payments among hospitalized patients; the weak economic status of households and poorly designed prepayment schemes may contribute to this. This paper gives a clear picture of catastrophic medical payments at the hospital level and offers applicable recommendations to Iranian health policymakers and planners.
Lazaridou, Asimina; Kim, Jieun; Cahalan, Christine M; Loggia, Marco L; Franceschelli, Olivia; Berna, Chantal; Schur, Peter; Napadow, Vitaly; Edwards, Robert R
2017-03-01
Fibromyalgia (FM) is a chronic, common pain disorder characterized by hyperalgesia. A key mechanism by which cognitive-behavioral therapy (CBT) fosters improvement in pain outcomes is via reductions in hyperalgesia and pain-related catastrophizing, a dysfunctional set of cognitive-emotional processes. However, the neural underpinnings of these CBT effects are unclear. Our aim was to assess CBT's effects on the brain circuitry underlying hyperalgesia in FM patients, and to explore the role of treatment-associated reduction in catastrophizing as a contributor to normalization of pain-relevant brain circuitry and clinical improvement. In total, 16 high-catastrophizing FM patients were enrolled in the study and randomized to 4 weeks of individual treatment with either CBT or a Fibromyalgia Education (control) condition. Resting state functional magnetic resonance imaging scans evaluated functional connectivity between key pain-processing brain regions at baseline and posttreatment. Clinical outcomes were assessed at baseline, posttreatment, and 6-month follow-up. Catastrophizing correlated with increased resting state functional connectivity between S1 and anterior insula. The CBT group showed larger reductions (compared with the education group) in catastrophizing at posttreatment (P<0.05), and CBT produced significant reductions in both pain and catastrophizing at the 6-month follow-up (P<0.05). Patients in the CBT group also showed reduced resting state connectivity between S1 and anterior/medial insula at posttreatment; these reductions in resting state connectivity were associated with concurrent treatment-related reductions in catastrophizing. The results add to the growing support for the clinically important associations between S1-insula connectivity, clinical pain, and catastrophizing, and suggest that CBT may, in part via reductions in catastrophizing, help to normalize pain-related brain responses in FM.
Probabilistic Simulation of Stress Concentration in Composite Laminates
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Murthy, P. L. N.; Liaw, D. G.
1994-01-01
A computational methodology is described to probabilistically simulate the stress concentration factors (SCFs) in composite laminates. This new approach consists of coupling probabilistic composite mechanics with probabilistic finite element structural analysis. The composite mechanics is used to probabilistically describe all the uncertainties inherent in composite material properties, whereas the finite element analysis is used to probabilistically describe the uncertainties associated with methods to experimentally evaluate SCFs, such as loads, geometry, and supports. The effectiveness of the methodology is demonstrated by using it to simulate the SCFs in three different composite laminates. Simulated results match experimental data for probability density and for cumulative distribution functions. The sensitivity factors indicate that the SCFs are influenced by local stiffness variables, by load eccentricities, and by initial stress fields.
Hirsh, Adam T; George, Steven Z; Bialosky, Joel E; Robinson, Michael E
2008-09-01
Pain-related fear and catastrophizing are important variables of consideration in an individual's pain experience. Methodological limitations of previous studies limit strong conclusions regarding these relationships. In this follow-up study, we examined the relationships between fear of pain, pain catastrophizing, and experimental pain perception. One hundred healthy volunteers completed the Fear of Pain Questionnaire (FPQ-III), Pain Catastrophizing Scale (PCS), and Coping Strategies Questionnaire-Catastrophizing scale (CSQ-CAT) before undergoing the cold pressor test (CPT). The CSQ-CAT and PCS were completed again after the CPT, with participants instructed to complete these measures based on their experience during the procedure. Measures of pain threshold, tolerance, and intensity were collected and served as dependent variables in separate regression models. Sex, pain catastrophizing, and pain-related fear were included as predictor variables. Results of regression analyses indicated that after controlling for sex, pain-related fear was a consistently stronger predictor of pain in comparison to catastrophizing. These results were consistent when separate measures (CSQ-CAT vs PCS) and time points (pretask vs "in vivo") of catastrophizing were used. These findings largely corroborate those from our previous study and are suggestive of the absolute and relative importance of pain-related fear in the experimental pain experience. Although pain-related fear has received less attention in the experimental literature than pain catastrophizing, results of the current study are consistent with clinical reports highlighting this variable as an important aspect of the experience of pain.
Vervoort, T; Goubert, L; Vandenbossche, H; Van Aken, S; Matthys, D; Crombez, G
2011-12-01
The contribution of the child's and parents' catastrophizing about pain was explored in explaining procedural pain and fear in children. Procedural fear and pain were investigated in 44 children with Type I diabetes undergoing a finger prick. The relationships between parents' catastrophizing and parents' own fear and estimates of their child's pain were also investigated. The children and their mothers completed questionnaires prior to a routine consultation with the diabetes physician. Children completed a situation-specific measure of the Pain Catastrophizing Scale for Children (PCS-C) and provided ratings of their experienced pain and fear on a 0-10 numerical rating scale (NRS). Parents completed a situation-specific measure of the Pain Catastrophizing Scale for Parents (PCS-P) and provided estimates of their child's pain and their own experienced fear on a 0-10 NRS. Analyses indicated that higher catastrophizing by children was associated with more fear and pain during the finger prick. Scores for parents' catastrophizing about their children's pain were positively related to parents' scores for their own fear, estimates of their children's pain, and child-reported fear, but not to the amount of pain reported by the child. The findings attest to the importance of assessing and targeting child and parent catastrophizing about pain. Addressing catastrophizing and related fears and concerns of both parents and children may be necessary to ensure appropriate self-management. Further investigation of the mechanisms relating catastrophizing to deleterious outcomes is warranted.
Riddle, Daniel L.; Keefe, Francis J.; Nay, William T.; McKee, Daphne; Attarian, David E.; Jensen, Mark P.
2011-01-01
Objectives To (1) describe a behavioral intervention designed for patients with elevated pain catastrophizing who are scheduled for knee arthroplasty, and (2) use a quasi-experimental design to evaluate the potential efficacy of the intervention on pain severity, catastrophizing cognitions, and disability. Design Quasi-experimental non-equivalent control group design with a 2 month follow-up. Setting Two university-based Orthopedic Surgery departments. Participants Adults scheduled for knee replacement surgery who reported elevated levels of pain catastrophizing. Patients were recruited from two clinics and were assessed prior to surgery and 2 months following surgery. Intervention A group of 18 patients received a psychologist directed pain coping skills training intervention comprising 8 sessions and the other group, a historical cohort of 45 patients, received usual care. Main Outcome Measures WOMAC Pain and Disability scores as well as scores on the Pain Catastrophizing Scale. Results Two months following surgery, the patients who received pain coping skills training reported significantly greater reductions in pain severity and catastrophizing, and greater improvements in function as compared to the usual care cohort. Conclusion Pain catastrophizing is known to increase risk of poor outcome following knee arthroplasty. The findings provide preliminary evidence that the treatment may be highly efficacious for reducing pain, catastrophizing, and disability, in patients reporting elevated catastrophizing prior to knee arthroplasty. A randomized clinical trial is warranted to confirm these effects.
Li, Ye; Wu, Qunhong; Xu, Ling; Legge, David; Hao, Yanhua; Gao, Lijun; Ning, Ning; Wan, Gang
2012-09-01
To assess the degree to which the Chinese people are protected from catastrophic household expenditure and impoverishment from medical expenses, and to explore the health system and structural factors influencing the first of these outcomes. Data were derived from the Fourth National Health Service Survey. An analysis of catastrophic health expenditure and impoverishment from medical expenses was undertaken with a sample of 55 556 households of different characteristics, located in rural and urban settings in different parts of the country. Logistic regression was used to identify the determinants of catastrophic health expenditure. The rate of catastrophic health expenditure was 13.0%; that of impoverishment was 7.5%. Rates of catastrophic health expenditure were higher among households having members who were hospitalized, elderly, or chronically ill, as well as among households in rural or poorer regions. A combination of adverse factors increased the risk of catastrophic health expenditure. Families enrolled in the urban employee or resident insurance schemes had lower rates of catastrophic health expenditure than those enrolled in the new rural cooperative scheme. The need for and use of health care, demographics, type of benefit package, and type of provider payment method were the determinants of catastrophic health expenditure. Although China has greatly expanded health insurance coverage, financial protection remains insufficient. Policy-makers should focus on designing improved insurance plans by expanding the benefit package, redesigning cost-sharing arrangements and provider payment methods, and developing more effective expenditure control strategies.
The role of pain catastrophizing in experimental pain perception.
Kristiansen, Frederik L; Olesen, Anne E; Brock, Christina; Gazerani, Parisa; Petrini, Laura; Mogil, Jeffrey S; Drewes, Asbjørn M
2014-03-01
Pain is a subjective experience influenced by multiple factors, and tremendous variability is present within individuals. To evaluate the emotional component of pain, the catastrophizing score can be used. This study investigated pain catastrophizing ratings in association with experimental pain perception. Experimental pain was induced using thermal heat and cold stimulation of skin, mechanical stimulation of muscle and bone, and thermal, mechanical, and electrical stimulation of the gastrointestinal tract in healthy participants (N = 41). Prior to the experimental sessions, a pain catastrophizing questionnaire was filled out by each participant. Based on the median catastrophizing score, participants were divided into two groups: noncatastrophizers and low-catastrophizers. No significant difference was found between low-catastrophizers and noncatastrophizers for thermal heat stimulation of skin, mechanical stimulation of muscle and bone, and rectal electrical stimulation (all P > 0.05). Low-catastrophizers were more sensitive to visceral thermal stimulation (4.7%, P = 0.02) and visceral mechanical stimulation (29.7%, P = 0.03). For participants who completed the 120-second ice water stimulation, noncatastrophizers reported 13.8% less pain than low-catastrophizers (P = 0.02). A positive correlation between PCS score and pain perception on the cold pressor test was found (r = 0.4, P = 0.02). By extrapolating the data, further analysis of the total group was performed and no differences (both P > 0.05) were observed. Even small increments in the pain catastrophizing score can influence pain perception in response to deep and tonic stimulation. Catastrophizing may partly explain the variability found in experimental pain studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Ye
The critical component of a risk assessment study in evaluating GCS is an analysis of uncertainty in CO2 modeling. In such analyses, direct numerical simulation of CO2 flow and leakage requires many time-consuming model runs. Alternatively, analytical methods have been developed which allow fast and efficient estimation of CO2 storage and leakage, although restrictive assumptions on formation rock and fluid properties are employed. In this study, an intermediate approach is proposed based on the Design of Experiment and Response Surface methodology, which consists of using a limited number of numerical simulations to estimate a prediction outcome as a combination of the most influential uncertain site properties. The methodology can be implemented within a Monte Carlo framework to efficiently assess parameter and prediction uncertainty while honoring the accuracy of numerical simulations. The choice of the uncertain properties is flexible and can include geologic parameters that influence reservoir heterogeneity, engineering parameters that influence gas trapping and migration, and reactive parameters that influence the extent of fluid/rock reactions. The method was tested and verified on modeling long-term CO2 flow, non-isothermal heat transport, and CO2 dissolution storage by coupling two-phase flow with explicit miscibility calculation using an accurate equation of state that gives rise to convective mixing of formation brine variably saturated with CO2. All simulations were performed using three-dimensional high-resolution models including a target deep saline aquifer, overlying caprock, and a shallow aquifer. To evaluate the uncertainty in representing reservoir permeability, the sediment hierarchy of a heterogeneous digital stratigraphy was mapped to create multiple irregularly shaped stratigraphic models of decreasing geologic resolution: heterogeneous (reference), lithofacies, depositional environment, and a (homogeneous) geologic formation. To ensure model equivalency, all the stratigraphic models were successfully upscaled from the reference heterogeneous model for bulk flow and transport predictions (Zhang & Zhang, 2015). GCS was then simulated with all models, yielding insights into the level of parameterization complexity that is needed for the accurate simulation of reservoir pore pressure, CO2 storage, leakage, footprint, and dissolution over both short (i.e., injection) and longer (monitoring) time scales. Important uncertainty parameters that impact these key performance metrics were identified for the stratigraphic models as well as for the heterogeneous model, leading to the development of reduced/simplified models at lower characterization cost that can be used for reservoir uncertainty analysis. All the CO2 modeling was conducted using PFLOTRAN – a massively parallel, multiphase, multi-component, and reactive transport simulator developed by a multi-laboratory DOE/SciDAC (Scientific Discovery through Advanced Computing) project (Zhang et al., 2017, in review). Within the uncertainty analysis framework, increasing reservoir depths were investigated to explore their effect on the uncertainty outcomes and the potential for developing gravity-stable injection with increased storage security (Dai et al., 2016; Dai et al., 2017, in review).
Finally, to accurately model CO2 fluid-rock reactions and resulting long-term storage as secondary carbonate minerals, a modified kinetic rate law for general mineral dissolution and precipitation was proposed and verified that is invariant to a scale transformation of the mineral formula weight. This new formulation will lead to more accurate assessment of mineral storage over geologic time scales (Lichtner, 2016).
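To make the Design of Experiment / Response Surface idea concrete, the following minimal Python sketch fits a quadratic response surface to a handful of design-point runs of a stand-in simulator and then propagates input uncertainty through the cheap surrogate by Monte Carlo. The two-parameter "simulator", ranges, and coefficients are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(p):
    """Stand-in for an expensive reservoir simulation (hypothetical response)."""
    x1, x2 = p
    return 2.0 + 0.8 * x1 - 0.3 * x2 + 0.2 * x1 * x2 + 0.1 * x1 ** 2

# Small 3x3 factorial design over two normalized uncertain properties
# (e.g. log-permeability and porosity, both scaled to [-1, 1]).
design = np.array([[x1, x2] for x1 in (-1.0, 0.0, 1.0) for x2 in (-1.0, 0.0, 1.0)])
y = np.array([simulator(p) for p in design])

def quad_features(X):
    """Full quadratic basis in two factors: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

# Fit the response surface by least squares, then Monte Carlo over the inputs.
coef, *_ = np.linalg.lstsq(quad_features(design), y, rcond=None)
samples = rng.normal(0.0, 0.5, size=(10_000, 2))
pred = quad_features(samples) @ coef
print("P10 / P50 / P90 of the predicted outcome:", np.percentile(pred, [10, 50, 90]).round(2))
```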
Dorratoltaj, Nargesalsadat; Marathe, Achla; Swarup, Samarth; Eubank, Stephen G.
2017-01-01
The study objective is to estimate the epidemiological and economic impact of vaccine interventions during influenza pandemics in Chicago and to assist in setting vaccine intervention priorities. Scenarios of delayed vaccine introduction with limited vaccine efficacy and limited supplies are not unlikely in future influenza pandemics, as in the 2009 H1N1 influenza pandemic. We simulated influenza pandemics in Chicago using agent-based transmission dynamic modeling. The population was divided into high-risk and non-high-risk groups within the 0–19, 20–64 and 65+ years subpopulations. Different attack rate scenarios for catastrophic (30.15%), strong (21.96%), and moderate (11.73%) influenza pandemics were compared against vaccine intervention scenarios at 40% coverage, 40% efficacy, and a unit cost of $28.62. Sensitivity analysis for vaccine compliance, vaccine efficacy, and vaccine start date was also conducted. Vaccine prioritization criteria include risk of death, total deaths, net benefits, and return on investment. The risk of death is highest among the high-risk 65+ years subpopulation in the catastrophic influenza pandemic, and highest among the high-risk 0–19 years subpopulation in the strong and moderate influenza pandemics. The proportion of total deaths and the net benefits are highest among the high-risk 20–64 years subpopulation in the catastrophic, strong, and moderate influenza pandemics. The return on investment is highest in the high-risk 0–19 years subpopulation in the catastrophic, strong, and moderate influenza pandemics. Based on risk of death and return on investment, high-risk groups of the three age subpopulations can be prioritized for vaccination, and the vaccine interventions are cost saving for all age and risk groups. The attack rates among children are higher than among adults and seniors in the catastrophic, strong, and moderate influenza pandemic scenarios, due to their larger social contact networks and homophilous interactions in school. Based on return on investment and the higher attack rates among children, we recommend prioritizing children (0–19 years) and seniors (65+ years) after high-risk groups for influenza vaccination during times of limited vaccine supplies. Based on risk of death, we recommend prioritizing seniors (65+ years) after high-risk groups for influenza vaccination during times of limited vaccine supplies. PMID:28570660
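The prioritization metrics described above reduce to simple arithmetic per subgroup. The sketch below illustrates net benefit and return on investment calculations; every population size, cost, and averted-outcome figure is invented for illustration and does not reproduce the study's values.

```python
# Hypothetical illustration: net benefit = averted costs - vaccination cost,
# ROI = net benefit / vaccination cost, computed per age/risk subgroup.
subgroups = {
    "high-risk 0-19":  {"pop": 100_000, "deaths_averted_per_100k": 12, "cases_averted": 8_000},
    "high-risk 20-64": {"pop": 400_000, "deaths_averted_per_100k": 9,  "cases_averted": 25_000},
    "high-risk 65+":   {"pop": 150_000, "deaths_averted_per_100k": 20, "cases_averted": 6_000},
}
COVERAGE, UNIT_COST = 0.40, 28.62                    # 40% coverage, $28.62 per vaccinee
COST_PER_CASE, COST_PER_DEATH = 500.0, 1_000_000.0   # assumed economic values, not from the study

for name, g in subgroups.items():
    vaccine_cost = g["pop"] * COVERAGE * UNIT_COST
    deaths_averted = g["pop"] * g["deaths_averted_per_100k"] / 100_000
    averted_costs = g["cases_averted"] * COST_PER_CASE + deaths_averted * COST_PER_DEATH
    net_benefit = averted_costs - vaccine_cost
    roi = net_benefit / vaccine_cost
    print(f"{name}: net benefit ${net_benefit:,.0f}, ROI {roi:.1f}")
```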
Model uncertainties of the 2002 update of California seismic hazard maps
Cao, T.; Petersen, M.D.; Frankel, A.D.
2005-01-01
In this article we present and explore the source and ground-motion model uncertainty and parametric sensitivity for the 2002 update of the California probabilistic seismic hazard maps. Our approach is to implement a Monte Carlo simulation that allows for independent sampling from fault to fault in each simulation. The source-distance dependent characteristics of the uncertainty maps of seismic hazard are explained by the fundamental uncertainty patterns from four basic test cases, in which the uncertainties from one-fault and two-fault systems are studied in detail. The California coefficient of variation (COV, ratio of the standard deviation to the mean) map for peak ground acceleration (10% probability of exceedance in 50 years) shows lower values (0.1-0.15) along the San Andreas fault system and other class A faults than along class B faults (0.2-0.3). High COV values (0.4-0.6) are found around the Garlock, Anacapa-Dume, and Palos Verdes faults in southern California and around the Maacama fault and Cascadia subduction zone in northern California.
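The COV map itself is a simple statistic over Monte Carlo hazard realizations. The sketch below shows the computation on a synthetic grid; the hazard field and the lognormal perturbation model are placeholders, not the study's source and ground-motion sampling scheme.

```python
import numpy as np

# Hypothetical sketch: Monte Carlo realizations of peak ground acceleration
# (10% in 50 yr) on a small grid, then COV = std / mean at each grid cell.
rng = np.random.default_rng(1)
n_sims, ny, nx = 1000, 20, 20
mean_pga = 0.3 + 0.2 * rng.random((ny, nx))          # stand-in mean hazard field [g]
# Each realization perturbs the hazard multiplicatively (assumed lognormal model error).
realizations = mean_pga * rng.lognormal(mean=0.0, sigma=0.2, size=(n_sims, ny, nx))

cov_map = realizations.std(axis=0) / realizations.mean(axis=0)
print("COV range over the grid:", cov_map.min().round(2), "-", cov_map.max().round(2))
```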
Frey, H Christopher; Zhao, Yuchao
2004-11-15
Probabilistic emission inventories were developed for urban air toxic emissions of benzene, formaldehyde, chromium, and arsenic for the example of Houston. Variability and uncertainty in emission factors were quantified for 71-97% of total emissions, depending upon the pollutant and data availability. Parametric distributions for interunit variability were fit using maximum likelihood estimation (MLE), and uncertainty in mean emission factors was estimated using parametric bootstrap simulation. For data sets containing one or more nondetected values, empirical bootstrap simulation was used to randomly sample detection limits for nondetected values and observations for sample values, and parametric distributions for variability were fit using maximum likelihood estimation for censored data. The goodness-of-fit for censored data was evaluated by comparison of cumulative distributions of bootstrap confidence intervals and empirical data. The emission inventory 95% uncertainty ranges vary from as small as -25% to +42% for chromium to as large as -75% to +224% for arsenic with correlated surrogates. Uncertainty was dominated by only a few source categories. Recommendations are made for future improvements to the analysis.
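The core recipe (MLE fit of a variability distribution, then a parametric bootstrap for the uncertainty of the mean) can be illustrated compactly. This is a generic sketch on synthetic data, assuming a lognormal variability distribution; the study's censored-data estimators and correlated surrogates are not reproduced here.

```python
import numpy as np
from scipy import stats

# Hedged sketch: fit a lognormal to inter-unit variability in an emission
# factor by MLE, then use a parametric bootstrap for a 95% CI on the mean.
rng = np.random.default_rng(2)
data = rng.lognormal(mean=np.log(5.0), sigma=0.8, size=30)   # synthetic emission factors

shape, loc, scale = stats.lognorm.fit(data, floc=0)          # MLE with location fixed at 0

boot_means = []
for _ in range(2000):
    resample = stats.lognorm.rvs(shape, loc=0, scale=scale, size=len(data), random_state=rng)
    s, _, sc = stats.lognorm.fit(resample, floc=0)
    boot_means.append(stats.lognorm.mean(s, loc=0, scale=sc))

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"Mean emission factor 95% CI: {lo:.2f} to {hi:.2f}")
```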
Damage assessment of composite plate structures with material and measurement uncertainty
NASA Astrophysics Data System (ADS)
Chandrashekhar, M.; Ganguli, Ranjan
2016-06-01
Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to large uncertainties associated with composite material properties. Also, composite structures can contain pre-existing imperfections such as delaminations, voids, or cracks introduced during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear deformable locking-free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with sliding window defuzzifier is used for delamination damage detection in composite plate type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in input data.
Eigenspace perturbations for uncertainty estimation of single-point turbulence closures
NASA Astrophysics Data System (ADS)
Iaccarino, Gianluca; Mishra, Aashwin Ananda; Ghili, Saman
2017-02-01
Reynolds-averaged Navier-Stokes (RANS) models represent the workhorse for predicting turbulent flows in complex industrial applications. However, RANS closures introduce a significant degree of epistemic uncertainty in predictions due to the potential lack of validity of the assumptions utilized in model formulation. Estimating this uncertainty is a fundamental requirement for building confidence in such predictions. We outline a methodology to estimate this structural uncertainty, incorporating perturbations to the eigenvalues and the eigenvectors of the modeled Reynolds stress tensor. The mathematical foundations of this framework are derived and explicated. This framework is then applied to a set of separated turbulent flows and compared to numerical and experimental data, and contrasted against the predictions of the eigenvalue-only perturbation methodology. It is shown that, for separated flows, this framework yields a significant enhancement over the established eigenvalue perturbation methodology in explaining the discrepancy with experimental observations and high-fidelity simulations. Furthermore, uncertainty bounds of potential engineering utility can be estimated by performing five specific RANS simulations, reducing the computational expenditure on such an exercise.
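A minimal sketch of the eigenvalue part of such a perturbation is shown below: form the anisotropy tensor from a modeled Reynolds stress, eigendecompose it, shift the eigenvalues a fraction of the way toward a limiting state, and reconstruct the perturbed stress. The example tensor and the 30% perturbation magnitude are assumptions for illustration; the full framework in the paper also perturbs the eigenvectors.

```python
import numpy as np

def perturb_reynolds_stress(R, delta, target=np.array([2 / 3, -1 / 3, -1 / 3])):
    """Shift the anisotropy eigenvalues of a Reynolds stress tensor R a fraction
    `delta` toward a limiting state (default: one-component limit), keeping the
    eigenvectors and the turbulent kinetic energy fixed."""
    k = 0.5 * np.trace(R)                          # turbulent kinetic energy
    a = R / (2.0 * k) - np.eye(3) / 3.0            # anisotropy tensor
    eigvals, eigvecs = np.linalg.eigh(a)           # symmetric -> real eigenpairs
    order = np.argsort(eigvals)[::-1]              # sort descending to match target ordering
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    eigvals_new = (1.0 - delta) * eigvals + delta * target
    a_new = eigvecs @ np.diag(eigvals_new) @ eigvecs.T
    return 2.0 * k * (a_new + np.eye(3) / 3.0)

# Example: perturb a mildly anisotropic stress 30% of the way toward the 1C limit.
R = np.array([[0.8, 0.1, 0.0],
              [0.1, 0.6, 0.0],
              [0.0, 0.0, 0.6]])
print(perturb_reynolds_stress(R, delta=0.3))
```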
Uncertainty in Ecohydrological Modeling in an Arid Region Determined with Bayesian Methods
Yang, Junjun; He, Zhibin; Du, Jun; Chen, Longfei; Zhu, Xi
2016-01-01
In arid regions, water resources are a key forcing factor in ecosystem circulation, and soil moisture is the critical link that constrains plant and animal life on the soil surface and underground. Simulation of soil moisture in arid ecosystems is inherently difficult due to high variability. We assessed the applicability of the process-oriented CoupModel for forecasting of soil water relations in arid regions. We used vertical soil moisture profiling for model calibration. We determined that model-structural uncertainty constituted the largest error; the model did not capture the extremes of low soil moisture in the desert-oasis ecotone (DOE), particularly below 40 cm soil depth. Our results showed that total uncertainty in soil moisture prediction was improved when input and output data, parameter value array, and structure errors were characterized explicitly. Bayesian analysis was applied with prior information to reduce uncertainty. The need to provide independent descriptions of uncertainty analysis (UA) in the input and output data was demonstrated. Application of soil moisture simulation in arid regions will be useful for dune-stabilization and revegetation efforts in the DOE. PMID:26963523
Evaluation of regional climate simulations for air quality modelling purposes
NASA Astrophysics Data System (ADS)
Menut, Laurent; Tripathi, Om P.; Colette, Augustin; Vautard, Robert; Flaounas, Emmanouil; Bessagnet, Bertrand
2013-05-01
In order to evaluate the future potential benefits of emission regulation on regional air quality, while taking into account the effects of climate change, off-line air quality projection simulations are driven using weather forcing taken from regional climate models. These regional models are themselves driven by simulations carried out using global climate models (GCM) and economic scenarios. Uncertainties and biases in climate models introduce an additional "climate modeling" source of uncertainty that is to be added to all other types of uncertainties in air quality modeling for policy evaluation. In this article we evaluate the changes in air quality-related weather variables induced by replacing reanalysis-forced regional climate simulations with GCM-forced ones. As an example we use GCM simulations carried out in the framework of the ERA-Interim programme and of the CMIP5 project using the Institut Pierre-Simon Laplace climate model (IPSLcm), driving regional simulations performed in the framework of the EURO-CORDEX programme. In summer, we found compensating deficiencies acting on photochemistry: an overestimation by GCM-driven weather due to a positive bias in short-wave radiation, a negative bias in wind speed, too many stagnant episodes, and a negative temperature bias. In winter, air quality is mostly driven by dispersion, and we could not identify significant differences in either wind or planetary boundary layer height statistics between GCM-driven and reanalysis-driven regional simulations. However, precipitation appears largely overestimated in GCM-driven simulations, which could significantly affect the simulation of aerosol concentrations. The identification of these biases will help in interpreting the results of future air quality simulations using these data. Despite these biases, we conclude that the identified differences should not lead to major difficulties in using GCM-driven regional climate simulations for air quality projections.
Reconstruction of droughts in India using multiple land-surface models (1951-2015)
NASA Astrophysics Data System (ADS)
Mishra, Vimal; Shah, Reepal; Azhar, Syed; Shah, Harsh; Modi, Parth; Kumar, Rohini
2018-04-01
India has witnessed some of the most severe historical droughts in the current decade, and severity, frequency, and areal extent of droughts have been increasing. As a large part of the population of India is dependent on agriculture, soil moisture drought affecting agricultural activities (crop yields) has significant impacts on socio-economic conditions. Due to limited observations, soil moisture is generally simulated using land-surface hydrological models (LSMs); however, these LSM outputs have uncertainty due to many factors, including errors in forcing data and model parameterization. Here we reconstruct agricultural drought events over India during the period of 1951-2015 based on simulated soil moisture from three LSMs, the Variable Infiltration Capacity (VIC), the Noah, and the Community Land Model (CLM). Based on simulations from the three LSMs, we find that major drought events occurred in 1987, 2002, and 2015 during the monsoon season (June through September). During the Rabi season (November through February), major soil moisture droughts occurred in 1966, 1973, 2001, and 2003. Soil moisture droughts estimated from the three LSMs are comparable in terms of their spatial coverage; however, differences are found in drought severity. Moreover, we find a higher uncertainty in simulated drought characteristics over a large part of India during the major crop-growing season (Rabi season, November to February: NDJF) compared to those of the monsoon season (June to September: JJAS). Furthermore, uncertainty in drought estimates is higher for severe and localized droughts. Higher uncertainty in the soil moisture droughts is largely due to the difference in model parameterizations (especially soil depth), resulting in different persistence of soil moisture simulated by the three LSMs. Our study highlights the importance of accounting for the LSMs' uncertainty and consideration of the multi-model ensemble system for the real-time monitoring and prediction of drought over India.
NASA Astrophysics Data System (ADS)
He, Xin; Koch, Julian; Sonnenborg, Torben O.; Jørgensen, Flemming; Schamper, Cyril; Christian Refsgaard, Jens
2014-04-01
Geological heterogeneity is a very important factor to consider when developing geological models for hydrological purposes. Using statistically based stochastic geological simulations, the spatial heterogeneity in such models can be accounted for. However, various types of uncertainties are associated with both the geostatistical method and the observation data. In the present study, TProGS is used as the geostatistical modeling tool to simulate structural heterogeneity for glacial deposits in a headwater catchment in Denmark. The focus is on how observation data uncertainty can be incorporated in the stochastic simulation process. The study uses two types of observation data: borehole data and airborne geophysical data. It is commonly acknowledged that borehole data are usually too sparse to characterize horizontal heterogeneity. The use of geophysical data gives an unprecedented opportunity to obtain high-resolution information and thus to identify geostatistical properties more accurately, especially in the horizontal direction. However, since such data are not a direct measurement of the lithology, larger uncertainty in point estimates can be expected compared to borehole data. We have proposed a histogram probability matching method in order to link the information on resistivity to hydrofacies while considering the data uncertainty at the same time. Transition probabilities and Markov chain models are established using the transformed geophysical data. It is shown that such a transformation is in fact practical; however, the cutoff value for dividing the resistivity data into facies is difficult to determine. The simulated geological realizations indicate significant differences in spatial structure depending on the type of conditioning data selected. To our knowledge, this is the first time that grid-to-grid airborne geophysical data, including their uncertainty, have been used in conditional geostatistical simulations in TProGS. The study therefore provides valuable insights regarding the advantages and challenges of using such comprehensive data.
The impact of lake and reservoir parameterization on global streamflow simulation.
Zajac, Zuzanna; Revilla-Romero, Beatriz; Salamon, Peter; Burek, Peter; Hirpa, Feyera A; Beck, Hylke
2017-05-01
Lakes and reservoirs affect the timing and magnitude of streamflow, and are therefore essential hydrological model components, especially in the context of global flood forecasting. However, the parameterization of lake and reservoir routines on a global scale is subject to considerable uncertainty due to lack of information on lake hydrographic characteristics and reservoir operating rules. In this study we estimated the effect of lakes and reservoirs on global daily streamflow simulations of a spatially-distributed LISFLOOD hydrological model. We applied state-of-the-art global sensitivity and uncertainty analyses for selected catchments to examine the effect of uncertain lake and reservoir parameterization on model performance. Streamflow observations from 390 catchments around the globe and multiple performance measures were used to assess model performance. Results indicate a considerable geographical variability in the lake and reservoir effects on the streamflow simulation. Nash-Sutcliffe Efficiency (NSE) and Kling-Gupta Efficiency (KGE) metrics improved for 65% and 38% of catchments respectively, with median skill score values of 0.16 and 0.2 while scores deteriorated for 28% and 52% of the catchments, with median values -0.09 and -0.16, respectively. The effect of reservoirs on extreme high flows was substantial and widespread in the global domain, while the effect of lakes was spatially limited to a few catchments. As indicated by global sensitivity analysis, parameter uncertainty substantially affected uncertainty of model performance. Reservoir parameters often contributed to this uncertainty, although the effect varied widely among catchments. The effect of reservoir parameters on model performance diminished with distance downstream of reservoirs in favor of other parameters, notably groundwater-related parameters and channel Manning's roughness coefficient. This study underscores the importance of accounting for lakes and, especially, reservoirs and using appropriate parameterization in large-scale hydrological simulations.
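The NSE and KGE performance measures cited above have standard closed forms, sketched below on a toy streamflow series; the "with lakes" and "without" simulations are invented stand-ins, not LISFLOOD output.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta Efficiency: 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()          # variability ratio
    beta = sim.mean() / obs.mean()         # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

# Toy daily streamflow example with and without a (hypothetical) lake routine.
obs = np.array([10, 12, 30, 80, 45, 20, 15, 12, 11, 10], float)
sim_with_lakes = obs * 0.95 + 1.0
sim_without = np.array([10, 14, 55, 110, 30, 15, 13, 12, 11, 10], float)
for name, sim in [("with lakes", sim_with_lakes), ("without lakes", sim_without)]:
    print(f"{name}: NSE={nse(obs, sim):.2f}  KGE={kge(obs, sim):.2f}")
```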
Application of Probabilistic Analysis to Aircraft Impact Dynamics
NASA Technical Reports Server (NTRS)
Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.
2003-01-01
Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as geometrically accurate models; human occupant models; and advanced material models that include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.
Risk and panic in late modernity: implications of the converging sites of social anxiety.
Hier, Sean P
2003-03-01
Comparing moral panic with the potential catastrophes of the risk society, Sheldon Ungar contends that new sites of social anxiety emerging around nuclear, medical, environmental and chemical threats have thrown into relief many of the questions motivating moral panic research agendas. He argues that shifting sites of social anxiety necessitate a rethinking of theoretical, methodological and conceptual issues related to processes of social control, claims making and general perceptions of public safety. This paper charts an alternative trajectory, asserting that analytic priority rests not with understanding the implications of changing sites of social anxiety but with their convergence. Concentrating on the converging sites of social anxiety in late modernity, the analysis forecasts a proliferation of moral panics as an exaggerated symptom of the heightened sense of uncertainty purported to accompany the ascendancy of the risk society.
Orbital debris environment for spacecraft in low earth orbit
NASA Technical Reports Server (NTRS)
Kessler, Donald J.
1990-01-01
Modeling and measurement results used in formulating an environment model that can be used for the engineering design of spacecraft are reviewed. Earth-based and space-based sensors are analyzed, and it is noted that the effects of satellite breakups can be modeled to predict an uncatalogued population if the nature of the breakup is understood. It is observed that the telescopic data indicate that the current model is too low for sizes slightly larger than 10 cm, and may be too low for sizes between 2 cm and 10 cm, while uncertainty remains in the current model, especially for sizes smaller than 10 cm and at altitudes different from 500 km. Projections for the catastrophic collision rate under different growth conditions are made, emphasizing that the rate of growth of fragments will be twice the rate of intact objects.
Technical, economic and legal aspects of wind energy utilization
NASA Astrophysics Data System (ADS)
Obermair, G. M.; Jarass, L.
Potentially problematic areas in the implementation of wind turbines for electricity production in West Germany are identified and briefly discussed. Variations in wind generator output due to resource variability may cause power regulation difficulties in the grid and also raise uncertainties in utility capacity planning for new construction. Catastrophic machine component failures, such as a thrown blade, are hazardous to life and property, while lulls in the wind resource strain power regulation capabilities once grid penetration reaches significant levels. Economically, the lack of actual data from large-scale wind projects is cited as a barrier to accurate cost comparisons of wind-derived power relative to other generating sources, although breakeven costs for wind power have been found to be $2000/kW installed capacity, i.e., a marginal cost of $0.10/kW.
Trask, Newell J.
1994-01-01
Concern with the threat posed by terrestrial asteroid and comet impacts has heightened as the catastrophic consequences of such events have become better appreciated. Although the probabilities of such impacts are very small, a reasonable question for debate is whether such phenomena should be taken into account in deciding policy for the management of spent fuel and high-level radioactive waste. The rate at which asteroid or comet impacts would affect areas of surface storage of radioactive waste is about the same as the estimated rate at which volcanic activity would affect the Yucca Mountain area. The Underground Retrievable Storage (URS) concept could satisfactorily reduce the risk from cosmic impact with its associated uncertainties in addition to providing other benefits described by previous authors.
Conner, William H.; Krauss, Ken W.; Baldwin, Andrew H.; Hutchinson, Stephen
2014-01-01
Tidal wetlands are some of the most dynamic areas of the Earth and are found at the interface between the land and sea. Salinity, regular tidal flooding, and infrequent catastrophic flooding due to storm events result in complex interactions among biotic and abiotic factors. The complexity of these interactions, along with the uncertainty of where one draws the line between tidal and nontidal, makes characterizing tidal wetlands a difficult task. The three primary types of tidal wetlands are tidal marshes, mangroves, and freshwater forested wetlands. Tidal marshes are dominated by herbaceous plants and are generally found at middle to high latitudes of both hemispheres. Mangrove forests dominate tropical coastlines around the world while tidal freshwater forests are global in distribution. All three wetland types are highly productive ecosystems, supporting abundant and diverse faunal communities. Unfortunately, these wetlands are subject to alteration and loss from both natural and anthropogenic causes.
NASA Astrophysics Data System (ADS)
Li, K. Betty; Goovaerts, Pierre; Abriola, Linda M.
2007-06-01
Contaminant mass discharge across a control plane downstream of a dense nonaqueous phase liquid (DNAPL) source zone has great potential to serve as a metric for the assessment of the effectiveness of source zone treatment technologies and for the development of risk-based source-plume remediation strategies. However, too often the uncertainty of mass discharge estimated in the field is not accounted for in the analysis. In this paper, a geostatistical approach is proposed to estimate mass discharge and to quantify its associated uncertainty using multilevel transect measurements of contaminant concentration (C) and hydraulic conductivity (K). The approach adapts the p-field simulation algorithm to propagate and upscale the uncertainty of mass discharge from the local uncertainty models of C and K. Application of this methodology to numerically simulated transects shows that, with a regular sampling pattern, geostatistics can provide an accurate model of uncertainty for the transects that are associated with low levels of source mass removal (i.e., transects that have a large percentage of contaminated area). For high levels of mass removal (i.e., transects with a few hot spots and large areas of near-zero concentration), a total sampling area equivalent to 6-7% of the transect is required to achieve accurate uncertainty modeling. A comparison of the results for different measurement supports indicates that samples taken with longer screen lengths may lead to less accurate models of mass discharge uncertainty. The quantification of mass discharge uncertainty, in the form of a probability distribution, will facilitate risk assessment associated with various remediation strategies.
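The quantity being upscaled is the transect mass discharge, Md = Σi Ci·(Ki·J)·Ai. The sketch below propagates local uncertainty in C and K to a distribution of Md; it treats the local distributions as independent lognormals rather than using the paper's spatially correlated p-field algorithm, and all parameter values are assumed for illustration.

```python
import numpy as np

# Hedged sketch (not the paper's p-field algorithm): propagate local uncertainty
# in concentration C and hydraulic conductivity K to transect-scale mass
# discharge  Md = sum_i C_i * (K_i * J) * A_i.
rng = np.random.default_rng(3)
n_cells, n_real = 200, 5000
cell_area = 0.25            # m^2 per transect cell (assumed)
gradient = 0.005            # hydraulic gradient J (assumed)

# Local means from (hypothetical) kriged estimates.
C_mean = rng.lognormal(np.log(10.0), 1.5, n_cells)    # mg/L
K_mean = rng.lognormal(np.log(5e-5), 1.0, n_cells)    # m/s

md = np.empty(n_real)
for r in range(n_real):
    C = C_mean * rng.lognormal(0.0, 0.3, n_cells)     # local uncertainty draws
    K = K_mean * rng.lognormal(0.0, 0.3, n_cells)
    # C in mg/L = g/m^3 -> kg/m^3 via 1e-3; result in kg/s, converted to kg/day.
    md[r] = np.sum(C * 1e-3 * K * gradient * cell_area) * 86400.0

print("Mass discharge P5/P50/P95 [kg/day]:", np.percentile(md, [5, 50, 95]).round(4))
```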
RESTSIM: A Simulation Model That Highlights Decision Making under Conditions of Uncertainty.
ERIC Educational Resources Information Center
Zinkhan, George M.; Taylor, James R.
1983-01-01
Describes RESTSIM, an interactive computer simulation program for graduate and upper-level undergraduate management, marketing, and retailing courses, which introduces naive users to simulation as a decision support technique, and provides a vehicle for studying various statistical procedures for evaluating simulation output. (MBR)
Guidance for Catastrophic Emergency Situations Involving Asbestos
This document addresses the types of asbestos issues that may arise during catastrophic events and how EPA has addressed such issues. It replaces the Guidelines for Catastrophic Emergency Situations Involving Asbestos which was issued in 1992.
NASA Astrophysics Data System (ADS)
Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare
In the last few years, the use of mathematical models of WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced inputs for their implementation that must be evaluated through an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of the uncertainty is therefore imperative. However, despite its importance, only a few uncertainty quantification studies have been carried out in the wastewater treatment field, and those studies included only some of the sources of model uncertainty. Seeking to advance the area, this paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has rarely been applied in the wastewater field. The model was based on activated-sludge models 1 (ASM1) and 2 (ASM2). Different approaches can be used for uncertainty analysis. The GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis provided useful insights for WWTP modelling, identifying the aspects where the greatest uncertainty lies and where, therefore, more effort should be devoted to both data gathering and modelling practice.
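The GLUE workflow (Monte Carlo parameter sampling, an informal likelihood for each run, rejection of non-behavioural runs, and likelihood-weighted prediction bounds) is sketched below on a toy decay model rather than the ASM1/ASM2 plant model; the priors, likelihood choice, and behavioural threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def toy_model(theta, t):
    """Stand-in for the WWTP model: exponential decay with rate theta[0] and scale theta[1]."""
    return theta[1] * np.exp(-theta[0] * t)

t = np.linspace(0.0, 10.0, 25)
obs = toy_model([0.3, 8.0], t) + rng.normal(0.0, 0.3, t.size)   # synthetic "observations"

n = 5000
theta = np.column_stack([rng.uniform(0.05, 1.0, n),    # prior range for the decay rate
                         rng.uniform(2.0, 15.0, n)])   # prior range for the initial value
sims = np.array([toy_model(th, t) for th in theta])

# Informal likelihood (inverse error sum of squares); keep only "behavioural" runs.
likelihood = 1.0 / np.sum((sims - obs) ** 2, axis=1)
keep = likelihood > np.percentile(likelihood, 90)       # retain the best 10% (a choice, not a rule)
w = likelihood[keep] / likelihood[keep].sum()
kept = sims[keep]

# Likelihood-weighted 5-95% prediction limits at each time step.
bounds = []
for j in range(t.size):
    idx = np.argsort(kept[:, j])
    cdf = np.cumsum(w[idx])
    s = kept[idx, j]
    bounds.append((s[np.searchsorted(cdf, 0.05)], s[np.searchsorted(cdf, 0.95)]))
print("GLUE 5-95% bounds at t=0:", tuple(round(b, 2) for b in bounds[0]))
```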
NASA Astrophysics Data System (ADS)
Cockx, K.; Van de Voorde, T.; Canters, F.; Poelmans, L.; Uljee, I.; Engelen, G.; de Jong, K.; Karssenberg, D.; van der Kwast, J.
2013-05-01
Building urban growth models typically involves a process of historic calibration based on time series of land-use maps, usually obtained from satellite imagery. Both the remote sensing data analysis to infer land use and the subsequent modelling of land-use change are subject to uncertainties, which may have an impact on the accuracy of future land-use predictions. Our research aims to quantify and reduce these uncertainties by means of a particle filter data assimilation approach that incorporates uncertainty in land-use mapping and land-use model parameter assessment into the calibration process. This paper focuses on part of this work, in particular the modelling of uncertainties associated with the impervious surface cover estimation and the urban land-use classification adopted in the land-use mapping approach. Both stages are subjected to a Monte Carlo simulation to assess their relative contribution to and their combined impact on the uncertainty in the derived land-use maps. The approach was applied to the central part of the Flanders region (Belgium), using a time series of Landsat/SPOT-HRV data covering the years 1987, 1996, 2005 and 2012. Although the most likely land-use map obtained from the simulation is very similar to the original classification, it is shown that the errors related to the impervious surface sub-pixel fraction estimation have a strong impact on the land-use map's uncertainty. Hence, incorporating uncertainty in the land-use change model calibration through particle filter data assimilation is proposed to address the uncertainty observed in the derived land-use maps and to reduce uncertainty in future land-use predictions.
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Wieseman, Carol D.
2012-01-01
Orthogonal harmonic multisine excitations were utilized in a wind tunnel test and in simulation of the SemiSpan Supersonic Transport model to assess aeroservoelastic characteristics. Fundamental issues associated with analyzing sinusoidal signals were examined, including spectral leakage, excitation truncation, and uncertainties on frequency response functions and mean-square coherence. Simulation allowed for evaluation of these issues relative to a truth model, while wind tunnel data introduced real-world implementation issues.
NASA Technical Reports Server (NTRS)
Bienert, Nancy; Mercer, Joey; Homola, Jeffrey; Morey, Susan; Prevot, Thomas
2014-01-01
This paper presents a case study of how factors such as wind prediction errors and metering delays can influence controller performance and workload in Human-In-The-Loop simulations. Retired air traffic controllers worked two arrival sectors adjacent to the terminal area. The main tasks were to provide safe air traffic operations and deliver the aircraft to the metering fix within +/- 25 seconds of the scheduled arrival time with the help of provided decision support tools. Analyses explore the potential impact of metering delays and system uncertainties on controller workload and performance. The results suggest that trajectory prediction uncertainties impact safety performance, while metering fix accuracy and workload appear subject to the scenario difficulty.
System Design under Uncertainty: Evolutionary Optimization of the Gravity Probe-B Spacecraft
NASA Technical Reports Server (NTRS)
Pullen, Samuel P.; Parkinson, Bradford W.
1994-01-01
This paper discusses the application of evolutionary random-search algorithms (Simulated Annealing and Genetic Algorithms) to the problem of spacecraft design under performance uncertainty. Traditionally, spacecraft performance uncertainty has been measured by reliability. Published algorithms for reliability optimization are seldom used in practice because they oversimplify reality. The algorithm developed here uses random-search optimization to allow us to model the problem more realistically. Monte Carlo simulations are used to evaluate the objective function for each trial design solution. These methods have been applied to the Gravity Probe-B (GP-B) spacecraft being developed at Stanford University for launch in 1999. Results of the algorithm developed here for GP-B are shown, and their implications for design optimization by evolutionary algorithms are discussed.
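A minimal sketch of this style of search is shown below: simulated annealing over a discrete design vector, with the objective estimated by a small Monte Carlo simulation at every trial design. The design variables, failure probabilities, costs, and utility model are all hypothetical and are not the GP-B formulation.

```python
import math
import random

random.seed(0)
COMPONENT_FAIL_P = [0.02, 0.05, 0.10]      # failure probability per redundancy "grade" (assumed)
COMPONENT_COST = [30.0, 12.0, 5.0]         # cost per grade (assumed units)

def mc_objective(design, n=400):
    """Estimated expected utility: mission value if all subsystems survive, minus cost."""
    cost = sum(COMPONENT_COST[g] for g in design)
    successes = sum(
        all(random.random() > COMPONENT_FAIL_P[g] for g in design) for _ in range(n)
    )
    return 100.0 * successes / n - cost

def neighbour(design):
    """Randomly change one subsystem's redundancy grade."""
    d = list(design)
    i = random.randrange(len(d))
    d[i] = random.choice([g for g in range(3) if g != d[i]])
    return d

design = [2] * 6                            # start with the cheapest grades
best, best_val = design, mc_objective(design)
T = 5.0
for step in range(2000):
    cand = neighbour(design)
    dv = mc_objective(cand) - mc_objective(design)
    if dv > 0 or random.random() < math.exp(dv / T):
        design = cand                       # accept uphill moves, and some downhill ones
    if (v := mc_objective(design)) > best_val:
        best, best_val = design, v
    T *= 0.999                              # geometric cooling schedule
print("best design (grades per subsystem):", best, "estimated utility:", round(best_val, 1))
```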
Diagnosing Model Errors in Simulations of Solar Radiation on Inclined Surfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit
2016-06-01
Transposition models have been widely used in the solar energy industry to simulate solar radiation on inclined PV panels. Following numerous studies comparing the performance of transposition models, this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty. Our results suggest that an isotropic transposition model developed by Badescu substantially underestimates diffuse plane-of-array (POA) irradiances when diffuse radiation is perfectly isotropic. In the empirical transposition models, the selection of empirical coefficients and land surface albedo can both result in uncertainty in the output. This study can be used as a guide for future development of physics-based transposition models.
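For readers unfamiliar with transposition, the classic isotropic-sky form (Liu-Jordan type) is sketched below; note that the Badescu model discussed in the paper uses a different isotropic weighting of the diffuse term, which is not reproduced here, and the input irradiances are example values only.

```python
import math

def poa_isotropic(dni, dhi, ghi, aoi_deg, tilt_deg, albedo=0.2):
    """Classic isotropic-sky transposition of irradiance onto a tilted plane."""
    aoi, tilt = math.radians(aoi_deg), math.radians(tilt_deg)
    beam = max(dni * math.cos(aoi), 0.0)                  # direct component on the panel
    sky_diffuse = dhi * (1.0 + math.cos(tilt)) / 2.0      # isotropic sky view factor
    ground = ghi * albedo * (1.0 - math.cos(tilt)) / 2.0  # ground-reflected component
    return beam + sky_diffuse + ground

# Example: clear-ish conditions on a 30-degree tilted panel (W/m^2).
print(round(poa_isotropic(dni=700.0, dhi=120.0, ghi=650.0, aoi_deg=35.0, tilt_deg=30.0), 1))
```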
Diagnosing Model Errors in Simulation of Solar Radiation on Inclined Surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit
2016-11-21
Transposition models have been widely used in the solar energy industry to simulate solar radiation on inclined PV panels. Following numerous studies comparing the performance of transposition models, this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty. Our results show significant differences between two highly used isotropic transposition models, with one substantially underestimating the diffuse plane-of-array (POA) irradiances when diffuse radiation is perfectly isotropic. In the empirical transposition models, the selection of empirical coefficients and land surface albedo can both result in uncertainty in the output. This study can be used as a guide for future development of physics-based transposition models.
The use of perturbed physics ensembles and emulation in palaeoclimate reconstruction (Invited)
NASA Astrophysics Data System (ADS)
Edwards, T. L.; Rougier, J.; Collins, M.
2010-12-01
Climate is a coherent process, with correlations and dependencies across space, time, and climate variables. However, reconstructions of palaeoclimate traditionally consider individual pieces of information independently, rather than making use of this covariance structure. Such reconstructions are at risk of being unphysical or at least implausible. Climate simulators such as General Circulation Models (GCMs), on the other hand, contain climate system theory in the form of dynamical equations describing physical processes, but are imperfect and computationally expensive. These two datasets - pointwise palaeoclimate reconstructions and climate simulator evaluations - contain complementary information, and a statistical synthesis can produce a palaeoclimate reconstruction that combines them while not ignoring their limitations. We use an ensemble of simulators with perturbed parameterisations, to capture the uncertainty about the simulator variant, and our method also accounts for structural uncertainty. The resulting reconstruction contains a full expression of climate uncertainty, not just pointwise but also jointly over locations. Such joint information is crucial in determining spatially extensive features such as isotherms, or the location of the tree-line. A second outcome of the statistical analysis is a refined distribution for the simulator parameters. In this way, information from palaeoclimate observations can be used directly in quantifying uncertainty in future climate projections. The main challenge is the expense of running a large scale climate simulator: each evaluation of an atmosphere-ocean GCM takes several months of computing time. The solution is to interpret the ensemble of evaluations within an 'emulator', which is a statistical model of the simulator. This technique has been used fruitfully in the statistical field of Computer Models for two decades, and has recently been applied in estimating uncertainty in future climate predictions in the UKCP09 (http://ukclimateprojections.defra.gov.uk). But only in the last couple of years has it developed to the point where it can be applied to large-scale spatial fields. We construct an emulator for the mid-Holocene (6000 calendar years BP) temperature anomaly over North America, at the resolution of our simulator (2.5° latitude by 3.75° longitude). This allows us to explore the behaviour of simulator variants that we could not afford to evaluate directly. We introduce the technique of 'co-emulation' of two versions of the climate simulator: the coupled atmosphere-ocean model HadCM3, and an equivalent with a simplified ocean, HadSM3. Running two different versions of a simulator is a powerful tool for increasing the information yield from a fixed budget of computer time, but the results must be combined statistically to account for the reduced fidelity of the quicker version. Emulators provide the appropriate framework.
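The emulation idea (fit a statistical surrogate to a small ensemble of expensive runs, then predict, with uncertainty, at untried parameter settings) can be illustrated with a Gaussian-process regressor. This is a generic sketch: the "simulator" is a cheap stand-in function of two perturbed parameters, not HadCM3/HadSM3, and the design size and kernel are arbitrary choices.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(5)

def fake_simulator(x):
    """Stand-in for a GCM: a smooth scalar response to two perturbed parameters."""
    return 2.0 * np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2

X_design = rng.uniform(-2, 2, size=(25, 2))     # perturbed-physics ensemble design
y_design = fake_simulator(X_design)

kernel = ConstantKernel(1.0) * RBF(length_scale=[1.0, 1.0])
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X_design, y_design)

X_new = rng.uniform(-2, 2, size=(5, 2))         # simulator variants never actually run
mean, std = emulator.predict(X_new, return_std=True)
for m, s, truth in zip(mean, std, fake_simulator(X_new)):
    print(f"emulated {m:+.2f} +/- {s:.2f}   (a direct run would give {truth:+.2f})")
```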
NASA Technical Reports Server (NTRS)
Gupta, Pramod; Guenther, Kurt; Hodgkinson, John; Jacklin, Stephen; Richard, Michael; Schumann, Johann; Soares, Fola
2005-01-01
Modern exploration missions require modern control systems: control systems that can handle catastrophic changes in the system's behavior, compensate for slow deterioration in sustained operations, and support fast system identification. Adaptive controllers based upon neural networks have these capabilities, but they can only be used safely if proper verification and validation (V&V) can be done. In this paper we present our V&V approach and simulation results within NASA's Intelligent Flight Control Systems (IFCS).
NASA Astrophysics Data System (ADS)
Tierz, Pablo; Sandri, Laura; Ramona Stefanescu, Elena; Patra, Abani; Marzocchi, Warner; Costa, Antonio; Sulpizio, Roberto
2014-05-01
Explosive volcanoes and, especially, Pyroclastic Density Currents (PDCs) pose an enormous threat to populations living in the surroundings of volcanic areas. Difficulties in the modeling of PDCs are related to (i) very complex and stochastic physical processes, intrinsic to their occurrence, and (ii) to a lack of knowledge about how these processes actually form and evolve. This means that there are deep uncertainties (namely, of aleatory nature due to point (i) above, and of epistemic nature due to point (ii) above) associated to the study and forecast of PDCs. Consequently, the assessment of their hazard is better described in terms of probabilistic approaches rather than by deterministic ones. What is actually done to assess probabilistic hazard from PDCs is to couple deterministic simulators with statistical techniques that can, eventually, supply probabilities and inform about the uncertainties involved. In this work, some examples of both PDC numerical simulators (Energy Cone and TITAN2D) and uncertainty quantification techniques (Monte Carlo sampling -MC-, Polynomial Chaos Quadrature -PCQ- and Bayesian Linear Emulation -BLE-) are presented, and their advantages, limitations and future potential are underlined. The key point in choosing a specific method leans on the balance between its related computational cost, the physical reliability of the simulator and the pursued target of the hazard analysis (type of PDCs considered, time-scale selected for the analysis, particular guidelines received from decision-making agencies, etc.). Although current numerical and statistical techniques have brought important advances in probabilistic volcanic hazard assessment from PDCs, some of them may be further applicable to more sophisticated simulators. In addition, forthcoming improvements could be focused on three main multidisciplinary directions: 1) Validate the simulators frequently used (through comparison with PDC deposits and other simulators), 2) Decrease simulator runtimes (whether by increasing the knowledge about the physical processes or by doing more efficient programming, parallelization, ...) and 3) Improve uncertainty quantification techniques.
NASA Astrophysics Data System (ADS)
Dolan, B.; Rutledge, S. A.; Barnum, J. I.; Matsui, T.; Tao, W. K.; Iguchi, T.
2017-12-01
POLarimetric Radar Retrieval and Instrument Simulator (POLARRIS) is a framework that has been developed to simulate radar observations from cloud resolving model (CRM) output and to subject model data and observations to the same retrievals, analysis, and visualization. This framework not only enables validation of bulk microphysical model simulated properties, but also offers an opportunity to study the uncertainties associated with retrievals such as hydrometeor classification (HID). For the CSU HID, membership beta functions (MBFs) are built using a set of simulations with realistic microphysical assumptions about axis ratio, density, canting angles, and size distributions for each of ten hydrometeor species. These assumptions are tested using POLARRIS to understand their influence on the resulting simulated polarimetric data and final HID classification. Several of these parameters (density, size distributions) are set by the model microphysics, and therefore the specific assumptions of axis ratio and canting angle are carefully studied. Through these sensitivity studies, we hope to be able to provide uncertainties in retrieved polarimetric variables and HID as applied to CRM output. HID retrievals assign a classification to each point by determining the highest score, thereby identifying the dominant hydrometeor type within a volume. However, in nature, there is rarely just a single hydrometeor type at a particular point. Models allow for mixing ratios of different hydrometeors within a grid point. We use the mixing ratios from CRM output in concert with the HID scores and classifications to understand how the HID algorithm can provide information about mixtures within a volume, as well as calculate a confidence in the classifications. We leverage the POLARRIS framework to additionally probe radar wavelength differences toward the possibility of a multi-wavelength HID which could utilize the strengths of different wavelengths to improve HID classifications. With these uncertainties and algorithm improvements, cases of convection are studied in a continental (Oklahoma) and maritime (Darwin, Australia) regime. Observations from C-band polarimetric data in both locations are compared to CRM simulations from NU-WRF using the POLARRIS framework.
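The scoring step in a fuzzy-logic HID is illustrated below. One common one-dimensional membership "beta" function used in fuzzy hydrometeor classification is beta(x) = 1 / (1 + ((x - m)/a)^(2b)); the class centers, widths, weights, and the two-variable aggregation in this sketch are invented for illustration and are not the CSU HID values.

```python
import numpy as np

def mbf(x, m, a, b):
    """One-dimensional membership beta function with center m, width a, slope b."""
    return 1.0 / (1.0 + ((x - m) / a) ** (2 * b))

# Toy membership parameters (m, a, b) for reflectivity Z [dBZ] and differential
# reflectivity Zdr [dB] for three hypothetical classes.
classes = {
    "rain":    {"Z": (45.0, 10.0, 2), "Zdr": (1.5, 1.0, 2)},
    "graupel": {"Z": (40.0, 8.0, 2),  "Zdr": (0.3, 0.5, 2)},
    "hail":    {"Z": (60.0, 8.0, 2),  "Zdr": (-0.3, 0.8, 2)},
}

def hid_scores(Z, Zdr):
    """Aggregate membership per class (equal weights); the maximum gives the dominant type."""
    return {c: (mbf(Z, *p["Z"]) + mbf(Zdr, *p["Zdr"])) / 2.0 for c, p in classes.items()}

scores = hid_scores(Z=55.0, Zdr=0.1)
print(scores, "->", max(scores, key=scores.get))
```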
INCORPORATING CATASTROPHES INTO INTEGRATED ASSESSMENT: SCIENCE, IMPACTS, AND ADAPTATION
Incorporating potential catastrophic consequences into integrated assessment models of climate change has been a top priority of policymakers and modelers alike. We review the current state of scientific understanding regarding three frequently mentioned geophysical catastrophes,...
Knaul, Felicia Marie; Wong, Rebeca; Arreola-Ornelas, Héctor; Méndez, Oscar
2011-01-01
To compare patterns of catastrophic health expenditures in 12 countries in Latin America and the Caribbean. Prevalence of catastrophic expenses was estimated uniformly at the household level using household surveys. Two types of prevalence indicators were used based on out-of-pocket health expenses: a) relative to an international poverty line, and b) relative to the household's ability to pay net of its food basket. Ratios of catastrophic expenditures were estimated across subgroups defined by economic and social variables. The percentage of households with catastrophic health expenditures ranged from 1 to 25% in the twelve countries. In general, rural residence, lowest quintile of income, presence of older adults, and lack of health insurance in the household are associated with a higher propensity for catastrophic health expenditures. However, there is vast heterogeneity by country. Cross-national studies may serve to examine how health systems contribute to the social protection of Latin American households.
Households encountering with catastrophic health expenditures in Ferdows, Iran.
Ghoddoosinejad, Javad; Jannati, Ali; Gholipour, Kamal; Baghban Baghestan, Elham
2014-08-01
Out-of-pocket payments are the main source of healthcare financing in most developing countries. Healthcare services can impose a massive cost burden on households, especially in developing countries. The purpose of this study was to calculate the proportion of households facing catastrophic healthcare expenditures in Ferdows, Iran. The sample included 100 households, representing 20% of all households in Ferdows, Iran. The data were collected using a self-administered questionnaire. The ability to pay of each household was calculated, and if household health costs amounted to at least 40% of the ability to pay, the expenditure was considered catastrophic. The rate of households facing catastrophic health expenditures was estimated to be 24%, with dentistry services accounting for the largest share. Households with low ability to pay should be protected against these expenditures. A more equitable health system would solve the problem, although more financial aid should be provided for households facing catastrophic costs.
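The 40% indicator used in these studies is a simple threshold rule, sketched below; the household records are invented for illustration, and the capacity-to-pay definition (total expenditure net of food spending) is one common convention rather than necessarily the exact definition used in each study.

```python
# Catastrophic health expenditure flag: out-of-pocket health payments reach at
# least 40% of the household's capacity to pay (expenditure net of food).
THRESHOLD = 0.40

households = [
    {"total_exp": 1200.0, "food_exp": 500.0, "oop_health": 350.0},
    {"total_exp": 900.0,  "food_exp": 600.0, "oop_health": 80.0},
    {"total_exp": 2000.0, "food_exp": 700.0, "oop_health": 200.0},
]

def is_catastrophic(h):
    capacity_to_pay = h["total_exp"] - h["food_exp"]
    return h["oop_health"] >= THRESHOLD * capacity_to_pay

rate = sum(is_catastrophic(h) for h in households) / len(households)
print(f"households facing catastrophic health expenditure: {rate:.0%}")
```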
Zhang, Z. Fred; White, Signe K.; Bonneville, Alain; ...
2014-12-31
Numerical simulations have been used for estimating CO2 injectivity, CO2 plume extent, pressure distribution, and Area of Review (AoR), and for the design of CO2 injection operations and the monitoring network for the FutureGen project. The simulation results are affected by uncertainties associated with numerous input parameters, the conceptual model, initial and boundary conditions, and factors related to injection operations. Furthermore, the uncertainties in the simulation results also vary in space and time. The key need is to identify those uncertainties that critically impact the simulation results and quantify their impacts. We introduce an approach to determine the local sensitivity coefficient (LSC), defined as the response of the output in percent, to rank the importance of model inputs on outputs. The uncertainty of an input with higher sensitivity has larger impacts on the output. The LSC is scalable by the error of an input parameter. The composite sensitivity of an output to a subset of inputs can be calculated by summing the individual LSC values. We propose a local sensitivity coefficient method and apply it to the FutureGen 2.0 Site in Morgan County, Illinois, USA, to investigate the sensitivity of input parameters and initial conditions. The conceptual model for the site consists of 31 layers, each of which has a unique set of input parameters. The sensitivity of 11 parameters for each layer and 7 inputs as initial conditions is then investigated. For CO2 injectivity and plume size, about half of the uncertainty is due to only 4 or 5 of the 348 inputs and three-quarters of the uncertainty is due to about 15 of the inputs. The initial conditions and the properties of the injection layer and its neighbouring layers contribute most of the sensitivity. Overall, the simulation outputs are very sensitive to only a small fraction of the inputs. However, the parameters that are important for controlling CO2 injectivity are not the same as those controlling the plume size. The three most sensitive inputs for injectivity were the horizontal permeability of Mt Simon 11 (the injection layer), the initial fracture-pressure gradient, and the residual aqueous saturation of Mt Simon 11, while those for the plume area were the initial salt concentration, the initial pressure, and the initial fracture-pressure gradient. The advantages of requiring only a single set of simulation results, scalability to the proper parameter errors, and easy calculation of the composite sensitivities make this approach very cost-effective for estimating AoR uncertainty and guiding cost-effective site characterization, injection well design, and monitoring network design for CO2 storage projects.
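A local sensitivity coefficient of this flavour can be computed by perturbing one input at a time and recording the percent response of the output, as sketched below. The stand-in model, base-case values, and the exact normalization are assumptions for illustration; the study's definition (scalable by the input error) may differ in detail.

```python
import numpy as np

def model(params):
    """Stand-in for the reservoir simulator: a scalar output (e.g. plume area)."""
    perm, salt, pressure = params
    return 0.6 * np.log(perm) + 0.3 * salt + 0.1 * pressure

base = np.array([100.0, 1.2, 150.0])        # hypothetical base-case inputs
y0 = model(base)

def lsc(i, rel_step=0.01):
    """Percent change in the output for a +1% perturbation of input i."""
    p = base.copy()
    p[i] *= (1.0 + rel_step)
    return 100.0 * (model(p) - y0) / abs(y0)

names = ["permeability", "initial salt", "initial pressure"]
coeffs = {n: lsc(i) for i, n in enumerate(names)}
print(coeffs)
# Composite sensitivity of the output to this subset of inputs: sum of the LSCs.
print("composite sensitivity (all three):", sum(abs(v) for v in coeffs.values()))
```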
'spup' - an R package for uncertainty propagation in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2016-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable, including to case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
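The underlying Monte Carlo propagation workflow is language-agnostic. The sketch below shows it generically in Python with Latin hypercube sampling; it is not the 'spup' R API, and the toy runoff model and parameter ranges are invented for illustration.

```python
import numpy as np
from scipy.stats import qmc

def runoff_model(rainfall_mm, curve_number):
    """Toy SCS-curve-number-style runoff model standing in for a real environmental model."""
    s = 25400.0 / curve_number - 254.0
    excess = np.maximum(rainfall_mm - 0.2 * s, 0.0)
    return excess ** 2 / (rainfall_mm + 0.8 * s)

# Latin hypercube sample of the two uncertain inputs, scaled to assumed ranges:
# rainfall 40-120 mm, curve number 60-90.
sampler = qmc.LatinHypercube(d=2, seed=42)
unit = sampler.random(n=2000)
samples = qmc.scale(unit, l_bounds=[40.0, 60.0], u_bounds=[120.0, 90.0])

# Push each realization through the model and summarize the output distribution.
runoff = runoff_model(samples[:, 0], samples[:, 1])
print("runoff mean and 90% interval [mm]:",
      runoff.mean().round(1), np.percentile(runoff, [5, 95]).round(1))
```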
'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2017-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable and able to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
Jamestown II: Building a New World.
ERIC Educational Resources Information Center
Sanchez, Tony
This simulation uses a science fiction setting to capture the unparalleled adventure, danger, and uncertainty of the colonization period in United States history. The simulation can be done in small groups or individually, and value judgments affect the outcome of the simulation. The premise of the simulation is that due to overpopulation,…
Cunningham, Natoshia R; Lynch-Jordan, Anne; Barnett, Kimberly; Peugh, James; Sil, Soumitri; Goldschneider, Kenneth; Kashikar-Zuck, Susmita
2014-12-01
Functional abdominal pain (FAP) in youth is associated with substantial impairment in functioning, and prior research has shown that overprotective parent responses can heighten impairment. Little is known about how a range of parental behaviors (overprotection, minimizing, and/or encouragement) in response to their child's pain interact with child coping characteristics (eg, catastrophizing) to influence functioning in youth with FAP. In this study, it was hypothesized that the relation between parenting factors and child disability would be mediated by children's levels of maladaptive coping (ie, pain catastrophizing). Seventy-five patients with FAP presenting to a pediatric pain clinic and their caregivers participated in the study. Youth completed measures of pain intensity (Numeric Rating Scale), pain catastrophizing (Pain Catastrophizing Scale), and disability (Functional Disability Inventory). Caregivers completed measures of parent pain catastrophizing (Pain Catastrophizing Scale), and parent responses to child pain behaviors (Adult Responses to Child Symptoms: Protection, Minimizing, and Encouragement/Monitoring subscales). Increased functional disability was significantly related to higher child pain intensity, increased child and parent pain catastrophizing, and higher levels of encouragement/monitoring and protection. Parent minimization was not related to disability. Child pain catastrophizing fully mediated the relation between parent encouragement/monitoring and disability and partially mediated the relation between parent protectiveness and disability. The impact of parenting behaviors in response to FAP on child disability is determined, in part, by the child's coping style. Findings highlight a more nuanced understanding of the parent-child interaction in determining pain-related disability levels, which should be taken into consideration in assessing and treating youth with FAP.
Song, Eun Cheol; Shin, Young Jeon
2010-09-01
The low benefit coverage rate of South Korea's health security system has been repeatedly pointed out. Low benefit coverage inevitably leads to catastrophic health expenditure, which can cause the transition to poverty and the persistence of poverty. This study was conducted to ascertain the effect of catastrophic health expenditure on the transition to poverty and the persistence of poverty in South Korea. To determine the degree of social mobility, the study examined the 6311 households that participated in the South Korea Welfare Panel Study in both 2006 and 2008. The effect of catastrophic health expenditure on the transition to poverty and the persistence of poverty was assessed via multiple logistic regression analysis. The poverty rate in South Korea was 21.6% in 2006 and 20.0% in 2008. Depending on the threshold applied, 7.3-25.1% of households were facing catastrophic health expenditure. At a threshold of 28% or above, catastrophic health expenditure was found to affect the transition to poverty even after adjusting for the characteristics of the household and of the household head. Overall, 25.1% of the households in this study were found to be facing catastrophic health expenditure, and catastrophic health expenditure was determined to be a cause of the transition to poverty. These results show that South Korea's health security system is not an effective social safety net. To prevent catastrophic health expenditure and the transition to poverty, the benefit coverage of South Korea's health security system needs to be strengthened.
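A hedged sketch of the type of analysis described, on simulated data: households are flagged as facing catastrophic health expenditure when out-of-pocket health spending exceeds a threshold share of capacity to pay, and a multiple logistic regression then relates that indicator to the transition to poverty. The data, variable names, covariates, and the 40% threshold used here are illustrative assumptions, not the study's definitions.

# Illustrative catastrophic-health-expenditure flag and logistic regression
# for transition to poverty; all data are simulated for the example.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "oop_health_spending": rng.gamma(2.0, 500.0, n),  # out-of-pocket spending
    "capacity_to_pay":     rng.gamma(5.0, 800.0, n),  # non-subsistence income
    "household_size":      rng.integers(1, 6, n),
    "head_age":            rng.integers(25, 85, n),
})

# Catastrophic health expenditure indicator at an assumed 40% threshold.
df["che"] = (df["oop_health_spending"] / df["capacity_to_pay"] >= 0.40).astype(int)

# Simulated outcome: transition to poverty between two panel waves,
# with CHE raising the odds (for illustration only).
logit_p = -2.0 + 1.0 * df["che"] - 0.01 * df["head_age"] + 0.1 * df["household_size"]
df["fell_into_poverty"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

model = smf.logit("fell_into_poverty ~ che + head_age + household_size", data=df).fit()
print(model.summary())
print("odds ratio for CHE:", np.exp(model.params["che"]).round(2))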
Van Denburg, Alyssa N; Shelby, Rebecca A; Caldwell, David S; O'Sullivan, Madeline L; Keefe, Francis J
2018-04-06
Pain catastrophizing (ie, the tendency to focus on and magnify pain sensations and feel helpless in the face of pain) is one of the most important and consistent psychological predictors of the pain experience. The present study examined, in 60 patients with osteoarthritis pain who were married or partnered: 1) the degree to which ambivalence over emotional expression and negative network orientation were associated with pain catastrophizing, and 2) whether self-efficacy for pain communication moderated these relations. Hierarchical multiple linear regression analyses revealed a significant main effect for the association between ambivalence over emotional expression and pain catastrophizing; as ambivalence over emotional expression increased, the degree of pain catastrophizing increased. In addition, the interaction between ambivalence over emotional expression and self-efficacy for pain communication was significant, such that as self-efficacy for pain communication increased, the association between ambivalence over emotional expression and pain catastrophizing became weaker. Negative network orientation was not significantly associated with pain catastrophizing. Findings suggest that higher levels of self-efficacy for pain communication may help weaken the effects of ambivalence over emotional expression on pain catastrophizing. In light of these results, patients may benefit from interventions that target pain communication processes and emotion regulation. This article examines interpersonal processes involved in pain catastrophizing. This study has the potential to lead to better understanding of maladaptive pain coping strategies and possibly better prevention and treatment strategies. Copyright © 2018 The American Pain Society. Published by Elsevier Inc. All rights reserved.
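The moderation finding reported above (self-efficacy for pain communication weakening the association between ambivalence over emotional expression and pain catastrophizing) can be illustrated with a small simulated regression with an interaction term; variable names and coefficients are hypothetical, not the study data.

# Illustrative moderated regression on simulated data: does SEPC weaken
# the association between AEE and pain catastrophizing (PCS)?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 60  # sample size reported in the abstract
aee = rng.normal(size=n)   # ambivalence over emotional expression
sepc = rng.normal(size=n)  # self-efficacy for pain communication
# Simulated outcome: a positive AEE effect that shrinks as SEPC increases.
pcs = 0.5 * aee - 0.3 * sepc - 0.4 * aee * sepc + rng.normal(size=n)

df = pd.DataFrame({"aee": aee, "sepc": sepc, "pcs": pcs})
# Mean-center predictors before forming the product term, as is
# conventional in moderated regression.
df["aee_c"] = df["aee"] - df["aee"].mean()
df["sepc_c"] = df["sepc"] - df["sepc"].mean()

model = smf.ols("pcs ~ aee_c * sepc_c", data=df).fit()
print(model.summary())

# Simple slopes of AEE at low (-1 SD) and high (+1 SD) self-efficacy.
b = model.params
for level, label in [(-df["sepc_c"].std(), "low SEPC"), (df["sepc_c"].std(), "high SEPC")]:
    print(label, "slope:", round(b["aee_c"] + b["aee_c:sepc_c"] * level, 3))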
Schlomann, Brandon H
2018-06-06
A central problem in population ecology is understanding the consequences of stochastic fluctuations. Analytically tractable models with Gaussian driving noise have led to important, general insights, but they fail to capture rare, catastrophic events, which are increasingly observed at scales ranging from global fisheries to intestinal microbiota. Due to mathematical challenges, growth processes with random catastrophes are less well characterized and it remains unclear how their consequences differ from those of Gaussian processes. In the face of a changing climate and predicted increases in ecological catastrophes, as well as increased interest in harnessing microbes for therapeutics, these processes have never been more relevant. To better understand them, I revisit here a differential equation model of logistic growth coupled to density-independent catastrophes that arrive as a Poisson process, and derive new analytic results that reveal its statistical structure. First, I derive exact expressions for the model's stationary moments, revealing a single effective catastrophe parameter that largely controls low order statistics. Then, I use weak convergence theorems to construct its Gaussian analog in a limit of frequent, small catastrophes, keeping the stationary population mean constant for normalization. Numerically computing statistics along this limit shows how they transform as the dynamics shifts from catastrophes to diffusions, enabling quantitative comparisons. For example, the mean time to extinction increases monotonically by orders of magnitude, demonstrating significantly higher extinction risk under catastrophes than under diffusions. Together, these results provide insight into a wide range of stochastic dynamical systems important for ecology and conservation. Copyright © 2018 Elsevier Ltd. All rights reserved.
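A minimal simulation sketch in the spirit of the model described above, assuming that each catastrophe removes a fixed fraction of the population and that growth follows the closed-form logistic solution between catastrophes; the parameter values and the fixed-fraction assumption are illustrative and not taken from the paper.

# Logistic growth punctuated by Poisson-distributed catastrophes:
# deterministic logistic growth between events, each event removing an
# assumed fixed fraction of the population.
import numpy as np

rng = np.random.default_rng(3)

r, K = 1.0, 1000.0    # logistic growth rate and carrying capacity
lam, frac = 0.2, 0.6  # catastrophe rate (per unit time) and fraction removed

def simulate(n0, T):
    """One trajectory: exact logistic growth between Poisson catastrophes."""
    t, n = 0.0, n0
    while True:
        wait = rng.exponential(1.0 / lam)  # time until the next catastrophe
        dt = min(wait, T - t)
        # Closed-form logistic solution over the catastrophe-free interval dt.
        n = K * n * np.exp(r * dt) / (K + n * (np.exp(r * dt) - 1.0))
        t += dt
        if t >= T:
            return n
        n *= (1.0 - frac)                  # catastrophe removes a fixed fraction

# Endpoint statistics across replicate trajectories as a rough view of the
# stationary behaviour.
finals = np.array([simulate(100.0, 200.0) for _ in range(500)])
print(f"long-run mean ~ {finals.mean():.1f}, sd ~ {finals.std(ddof=1):.1f}")

Estimating the mean time to extinction discussed in the abstract would additionally require a demographic extinction threshold, which is omitted from this sketch.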