Sample records for ensemble downscaling mred

  1. Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)

    NASA Astrophysics Data System (ADS)

    Arritt, R. W.

    2008-12-01

    The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question: Can regional climate models provide additional useful information from global seasonal forecasts? MRED will use a suite of regional climate models to downscale seasonal forecasts produced by the new National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the NASA GEOS5 system. The initial focus will be on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the potential usefulness of higher resolution, especially for near-surface fields influenced by high-resolution orography. Each regional model will cover the conterminous US (CONUS) at approximately 32 km resolution and will perform an ensemble of 15 runs for each year 1982-2003 over the forecast period 1 December - 30 April. MRED will compare individual regional and global forecasts as well as ensemble means, both for precipitation and temperature, which are currently being used to drive macroscale land surface models (LSMs), and for wind, humidity, radiation, and turbulent heat fluxes, which are important for more advanced coupled macroscale hydrologic models. Metrics of ensemble spread will also be evaluated. Extensive analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms. Our overarching goal is to determine what additional skill can be provided by a community ensemble of high-resolution regional models, which we believe will eventually define a strategy for more skillful and useful regional seasonal climate forecasts.

  2. Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)

    NASA Astrophysics Data System (ADS)

    Arritt, R.

    2009-04-01

    Regional climate models (RCMs) have long been used to downscale global climate simulations. In contrast, the ability of RCMs to downscale seasonal climate forecasts has received little attention. The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question: Does dynamical downscaling using RCMs provide additional useful information for seasonal forecasts made by global models? MRED is using a suite of RCMs to downscale seasonal forecasts produced by the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the NASA GEOS5 system. The initial focus is on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the usefulness of higher resolution for near-surface fields influenced by high-resolution orography. Each RCM covers the conterminous U.S. at approximately 32 km resolution, comparable to the scale of the North American Regional Reanalysis (NARR), which will be used to evaluate the models. The forecast ensemble for each RCM comprises 15 members over a period of 22+ years (from 1982 to 2003+) for the forecast period 1 December - 30 April. Each RCM will create a 15-member lagged ensemble by starting on different dates in the preceding November. This results in a 120-member ensemble for each projection (8 RCMs by 15 members per RCM). The RCMs will be continually updated at their lateral boundaries using 6-hourly output from CFS or GEOS5. Hydrometeorological output will be produced in a standard netCDF-based format on a common analysis grid, which simplifies both model intercomparison and the generation of ensembles. MRED will compare individual RCM and global forecasts as well as ensemble mean precipitation and temperature forecasts, which are currently being used to drive macroscale land surface models (LSMs). Metrics of ensemble spread will also be evaluated. Extensive process-oriented analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms.
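    The lagged-ensemble construction described above can be sketched as follows. The abstract says only that each RCM starts on different dates in the preceding November, so the specific choice of 15 consecutive days ending 30 November is an assumption, and the helper names are illustrative:

```python
from datetime import date, timedelta

def lagged_start_dates(start_year, n_members=15):
    """Start dates for one RCM's lagged ensemble: here, the n_members
    consecutive days in the November preceding the 1 December forecast."""
    last = date(start_year, 11, 30)
    return [last - timedelta(days=k) for k in reversed(range(n_members))]

# 8 RCMs x 15 lagged members -> 120 downscaled members per global driver
starts = lagged_start_dates(1982)
n_total = 8 * len(starts)
```

    With these assumptions, the 1982-83 season draws its members from 16-30 November 1982.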

  3. Ensemble Downscaling of Winter Seasonal Forecasts: The MRED Project

    NASA Astrophysics Data System (ADS)

    Arritt, R. W.; Mred Team

    2010-12-01

    The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional project that is producing large ensembles of downscaled winter seasonal forecasts from coupled atmosphere-ocean seasonal prediction models. Eight regional climate models are each downscaling 15-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the new NASA seasonal forecast system based on the GEOS5 atmospheric model coupled with the MOM4 ocean model. This produces 240-member ensembles, i.e., 8 regional models x 15 global ensemble members x 2 global models, for each winter season (December-April) of 1982-2003. Results to date show that combined global-regional downscaled forecasts have greatest skill for seasonal precipitation anomalies during strong El Niño events such as 1982-83 and 1997-98. Ensemble means of area-averaged seasonal precipitation for the regional models generally track the corresponding results for the global model, though there is considerable inter-model variability amongst the regional models. For seasons and regions where area-mean precipitation is accurately simulated, the regional models bring added value by extracting greater spatial detail from the global forecasts, mainly due to better resolution of terrain in the regional models. Our results also emphasize that an ensemble approach is essential to realizing the added value from the combined global-regional modeling system.

  4. Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting.

    PubMed

    Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M

    2014-06-01

    Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind "noise," which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical "downscaling" of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme are tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key Points: Solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; the magnetosphere responds nonlinearly to small-scale solar wind fluctuations.
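    The smooth-then-re-noise procedure can be sketched roughly as below. This is not the authors' noise model: i.i.d. resampling of residuals ignores the observed spectral characteristics the paper preserves, and all names and numbers here are invented for illustration:

```python
import random

def smooth(series, window=8):
    """Centered running mean, standing in for coarse solar wind model output
    (the paper smooths hourly observations with an 8 h filter)."""
    half = window // 2
    return [sum(series[max(0, i - half):i + half + 1]) /
            len(series[max(0, i - half):i + half + 1])
            for i in range(len(series))]

def downscale_ensemble(smoothed, residual_pool, n_members, seed=0):
    """Re-inject small-scale structure by resampling observed residuals.
    NOTE: drawing residuals independently is a deliberately crude sketch;
    it does not reproduce the observed spectral correlation."""
    rng = random.Random(seed)
    return [[s + rng.choice(residual_pool) for s in smoothed]
            for _ in range(n_members)]

obs = [400 + 30 * ((i % 7) - 3) for i in range(48)]  # synthetic wind speed, km/s
coarse = smooth(obs)
residuals = [o - c for o, c in zip(obs, coarse)]
members = downscale_ensemble(coarse, residuals, n_members=10)
```

    Each member is one plausible small-scale realization around the same large-scale state, which is what gives the magnetospheric forecast its spread.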

  5. Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting

    PubMed Central

    Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M

    2014-01-01

    Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind “noise,” which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical “downscaling” of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme are tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key Points: Solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; the magnetosphere responds nonlinearly to small-scale solar wind fluctuations. PMID:26213518

  6. Hydro-meteorological evaluation of downscaled global ensemble rainfall forecasts

    NASA Astrophysics Data System (ADS)

    Gaborit, Étienne; Anctil, François; Fortin, Vincent; Pelletier, Geneviève

    2013-04-01

    Ensemble rainfall forecasts are of high interest for decision making, as they provide an explicit and dynamic assessment of the uncertainty in the forecast (Ruiz et al. 2009). However, for hydrological forecasting, their low resolution currently limits their use to large watersheds (Maraun et al. 2010). In order to bridge this gap, various implementations of the statistical-stochastic multi-fractal downscaling technique presented by Perica and Foufoula-Georgiou (1996) were compared, bringing Environment Canada's global ensemble rainfall forecasts from a 100 by 70-km resolution down to 6 by 4-km, increasing each pixel's rainfall variance while preserving its original mean. For comparison purposes, simpler methods were also implemented, such as bi-linear interpolation, which disaggregates global forecasts without modifying their variance. The downscaled meteorological products were evaluated using different scores and diagrams, from both meteorological and hydrological viewpoints. The meteorological evaluation compared the forecasted rainfall depths against nine days of observed values taken from the Québec City rain gauge database; these nine days featured strong precipitation events during the summer of 2009. For the hydrologic evaluation, the hydrological models SWMM5 and (a modified version of) GR4J were implemented on a small 6 km2 urban catchment in the Québec City region. Ensemble hydrologic forecasts with a time step of 3 hours were then performed over a 3-month period of the summer of 2010 using the original and downscaled ensemble rainfall forecasts. The most important conclusions of this work are that the overall quality of the forecasts was preserved during the disaggregation procedure and that the disaggregated products using this variance-enhancing method were of similar quality to the bi-linear interpolation products. However, variance and dispersion of the different members were, of course, much improved for the
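    The core property of the disaggregation, increasing sub-pixel variance while preserving each coarse pixel's mean, can be illustrated with a toy multiplicative scheme. This is not the Perica and Foufoula-Georgiou cascade itself; the weight distribution and all parameters below are invented:

```python
import random
import statistics

def disaggregate(coarse_mm, n_fine, cv, rng):
    """Split one coarse rainfall value into n_fine sub-pixel values whose
    mean equals the coarse value exactly, with sub-pixel variance controlled
    by cv. Toy stand-in for a multi-fractal cascade, not the actual scheme."""
    weights = [max(0.0, rng.gauss(1.0, cv)) for _ in range(n_fine)]
    total = sum(weights)
    if total == 0.0:               # degenerate draw: fall back to uniform split
        return [coarse_mm] * n_fine
    # normalizing by the weight sum is what makes the mean exact
    return [coarse_mm * w * n_fine / total for w in weights]

rng = random.Random(42)
fine = disaggregate(12.0, n_fine=16, cv=0.5, rng=rng)
```

    Bi-linear interpolation, by contrast, leaves each pixel's variance essentially unchanged, which is the comparison the abstract draws.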

  7. Physical Processes and Applications of the Monte Carlo Radiative Energy Deposition (MRED) Code

    NASA Astrophysics Data System (ADS)

    Reed, Robert A.; Weller, Robert A.; Mendenhall, Marcus H.; Fleetwood, Daniel M.; Warren, Kevin M.; Sierawski, Brian D.; King, Michael P.; Schrimpf, Ronald D.; Auden, Elizabeth C.

    2015-08-01

    MRED is a Python-language scriptable computer application that simulates radiation transport. It is the computational engine for the on-line tool CRÈME-MC. MRED is based on C++ code from Geant4 with additional Fortran components to simulate electron transport and nuclear reactions with high precision. We provide a detailed description of the structure of MRED and the implementation of the simulation of physical processes used to simulate radiation effects in electronic devices and circuits. Extensive discussion and references are provided that illustrate the validation of models used to implement specific simulations of relevant physical processes. Several applications of MRED are summarized that demonstrate its ability to predict and describe basic physical phenomena associated with irradiation of electronic circuits and devices. These include effects from single particle radiation (including both direct ionization and indirect ionization effects), dose enhancement effects, and displacement damage effects. MRED simulations have also helped to identify new single event upset mechanisms not previously observed by experiment, but since confirmed, including upsets due to muons and energetic electrons.

  8. Downscaling RCP8.5 daily temperatures and precipitation in Ontario using localized ensemble optimal interpolation (EnOI) and bias correction

    NASA Astrophysics Data System (ADS)

    Deng, Ziwang; Liu, Jinliang; Qiu, Xin; Zhou, Xiaolan; Zhu, Huaiping

    2017-10-01

    A novel method for daily temperature and precipitation downscaling is proposed in this study which combines the Ensemble Optimal Interpolation (EnOI) and bias correction techniques. For downscaling temperature, the day-to-day seasonal cycle of high-resolution temperature from the NCEP Climate Forecast System Reanalysis (CFSR) is used as the background state. An enlarged ensemble of daily temperature anomalies relative to this seasonal cycle, together with information from global climate models (GCMs), is used to construct a gain matrix for each calendar day, so the relationship between large-scale and local-scale processes represented by the gain matrix changes accordingly. The gain matrix contains information on the realistic spatial correlation of temperature between different CFSR grid points, between CFSR grid points and GCM grid points, and between different GCM grid points. Therefore, this downscaling method keeps spatial consistency and reflects the interaction between local geographic and atmospheric conditions. Maximum and minimum temperatures are downscaled using the same method. For precipitation, because of the non-Gaussianity issue, a logarithmic transformation is applied to daily total precipitation prior to downscaling. Cross validation and independent data validation are used to evaluate this algorithm. Finally, data from a 29-member ensemble of phase 5 of the Coupled Model Intercomparison Project (CMIP5) GCMs are downscaled to CFSR grid points in Ontario for the period from 1981 to 2100. The results show that this method is capable of generating high-resolution details without changing large-scale characteristics. It results in much lower absolute errors in local-scale details at most grid points than simple spatial downscaling methods. Biases in the downscaled data inherited from GCMs are corrected with a linear method for temperatures and distribution mapping for precipitation. The downscaled ensemble projects significant warming with amplitudes of 3
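    The precipitation pre-transformation might look like the sketch below. The abstract specifies only "a logarithmic transformation", so the use of log1p (which keeps zero rainfall finite) is an assumption, as are the function names:

```python
import math

def to_gaussian_space(precip_mm):
    """Forward transform applied before the EnOI update; log1p maps
    zero rainfall to zero rather than to negative infinity."""
    return [math.log1p(p) for p in precip_mm]

def from_gaussian_space(z):
    """Inverse transform applied to the downscaled field."""
    return [math.expm1(v) for v in z]

daily = [0.0, 0.2, 5.0, 31.7]                 # mm/day: zero-bounded, skewed
round_trip = from_gaussian_space(to_gaussian_space(daily))
```

    The transform compresses the heavy right tail of daily totals so the Gaussian machinery of the EnOI update is less badly violated.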

  9. Assessment of a stochastic downscaling methodology in generating an ensemble of hourly future climate time series

    NASA Astrophysics Data System (ADS)

    Fatichi, S.; Ivanov, V. Y.; Caporali, E.

    2013-04-01

    This study extends a stochastic downscaling methodology to generation of an ensemble of hourly time series of meteorological variables that express possible future climate conditions at a point-scale. The stochastic downscaling uses general circulation model (GCM) realizations and an hourly weather generator, the Advanced WEather GENerator (AWE-GEN). Marginal distributions of factors of change are computed for several climate statistics using a Bayesian methodology that can weight GCM realizations based on the model relative performance with respect to a historical climate and a degree of disagreement in projecting future conditions. A Monte Carlo technique is used to sample the factors of change from their respective marginal distributions. As a comparison with traditional approaches, factors of change are also estimated by averaging GCM realizations. With either approach, the derived factors of change are applied to the climate statistics inferred from historical observations to re-evaluate parameters of the weather generator. The re-parameterized generator yields hourly time series of meteorological variables that can be considered to be representative of future climate conditions. In this study, the time series are generated in an ensemble mode to fully reflect the uncertainty of GCM projections, climate stochasticity, as well as uncertainties of the downscaling procedure. Applications of the methodology in reproducing future climate conditions for the periods of 2000-2009, 2046-2065 and 2081-2100, using the period of 1962-1992 as the historical baseline are discussed for the location of Firenze (Italy). The inferences of the methodology for the period of 2000-2009 are tested against observations to assess reliability of the stochastic downscaling procedure in reproducing statistics of meteorological variables at different time scales.
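    The factor-of-change sampling step can be sketched as follows, with an invented discrete marginal standing in for the Bayesian, performance-weighted distribution built from GCM realizations:

```python
import random
import statistics

def sample_factors(marginal, n, rng):
    """Monte Carlo draws from a discrete marginal of a factor of change;
    the weights play the role of the Bayesian, performance-based weighting
    of GCM realizations (a simplification of the paper's setup)."""
    values, weights = zip(*marginal)
    return rng.choices(values, weights=weights, k=n)

# Hypothetical marginal for a multiplicative precipitation factor
marginal = [(0.9, 0.2), (1.0, 0.5), (1.1, 0.3)]
rng = random.Random(1)
factors = sample_factors(marginal, 2000, rng)

baseline_mm_h = 2.4                 # historical statistic from observations
future_samples = [baseline_mm_h * f for f in factors]
```

    Each sampled factor re-parameterizes the weather generator, so the ensemble of generated hourly series inherits the spread of the GCM projections; averaging the realizations instead (the traditional approach) would collapse this spread to a single factor.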

  10. Model Independence in Downscaled Climate Projections: a Case Study in the Southeast United States

    NASA Astrophysics Data System (ADS)

    Gray, G. M. E.; Boyles, R.

    2016-12-01

    Downscaled climate projections are used to deduce how the climate will change in future decades at local and regional scales. It is important to use multiple models to characterize part of the future uncertainty given the impact on adaptation decision making. This is traditionally employed through an equally-weighted ensemble of multiple GCMs downscaled using one technique. Newer practices include several downscaling techniques in an effort to increase the ensemble's representation of future uncertainty. However, this practice may be adding statistically dependent models to the ensemble. Previous research has shown a dependence problem in the GCM ensemble in multiple generations, but has not been shown in the downscaled ensemble. In this case study, seven downscaled climate projections on the daily time scale are considered: CLAREnCE10, SERAP, BCCA (CMIP5 and CMIP3 versions), Hostetler, CCR, and MACA-LIVNEH. These data represent 83 ensemble members, 44 GCMs, and two generations of GCMs. Baseline periods are compared against the University of Idaho's METDATA gridded observation dataset. Hierarchical agglomerative clustering is applied to the correlated errors to determine dependent clusters. Redundant GCMs across different downscaling techniques show the most dependence, while smaller dependence signals are detected within downscaling datasets and across generations of GCMs. These results indicate that using additional downscaled projections to increase the ensemble size must be done with care to avoid redundant GCMs and the process of downscaling may increase the dependence of those downscaled GCMs. Climate model generation does not appear dissimilar enough to be treated as two separate statistical populations for ensemble building at the local and regional scales.
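    The clustering step, grouping ensemble members whose errors are highly correlated, can be sketched with single-linkage agglomeration on a 1 − correlation distance. The linkage choice and threshold are assumptions, and the 3x3 matrix is invented:

```python
def agglomerate(dist, threshold):
    """Single-linkage agglomerative clustering on a distance matrix:
    merge clusters while the closest pair is nearer than threshold."""
    clusters = [{i} for i in range(len(dist))]

    def linkage(a, b):
        return min(dist[i][j] for i in a for j in b)

    while True:
        best = None
        for x in range(len(clusters)):
            for y in range(x + 1, len(clusters)):
                d = linkage(clusters[x], clusters[y])
                if d < threshold and (best is None or d < best[0]):
                    best = (d, x, y)
        if best is None:
            return clusters
        _, x, y = best
        clusters[x] |= clusters[y]
        del clusters[y]

# Members 0 and 1 share a driving GCM (highly correlated errors); 2 does not.
corr = [[1.0, 0.9, 0.1],
        [0.9, 1.0, 0.2],
        [0.1, 0.2, 1.0]]
dist = [[1.0 - c for c in row] for row in corr]
clusters = agglomerate(dist, threshold=0.5)
```

    Members that land in the same cluster are statistically redundant, which is the dependence signal the case study reports for the same GCM downscaled by different techniques.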

  11. Simulation of SEU Cross-sections using MRED under Conditions of Limited Device Information

    NASA Technical Reports Server (NTRS)

    Lauenstein, J. M.; Reed, R. A.; Weller, R. A.; Mendenhall, M. H.; Warren, K. M.; Pellish, J. A.; Schrimpf, R. D.; Sierawski, B. D.; Massengill, L. W.; Dodd, P. E.; hide

    2007-01-01

    This viewgraph presentation reviews the simulation of Single Event Upset (SEU) cross sections using the membrane electrode assembly (MEA) resistance and electrode diffusion (MRED) tool using "Best guess" assumptions about the process and geometry, and direct ionization, low-energy beam test results. This work will also simulate SEU cross-sections including angular and high energy responses and compare the simulated results with beam test data for the validation of the model. Using MRED, we produced a reasonably accurate upset response model of a low-critical charge SRAM without detailed information about the circuit, device geometry, or fabrication process

  12. Assessment of hi-resolution multi-ensemble statistical downscaling regional climate scenarios over Japan

    NASA Astrophysics Data System (ADS)

    Dairaku, K.

    2017-12-01

    The Asia-Pacific regions are increasingly threatened by large-scale natural disasters. There is growing concern that losses and damages from natural disasters will be further exacerbated by climate change and socio-economic change. Climate information and services for risk assessments are therefore of great concern. Fundamental regional climate information is indispensable for understanding the changing climate and making decisions on when and how to act. To meet the needs of stakeholders such as national and local governments, spatio-temporally comprehensive and consistent information is necessary and useful for decision making. Multi-model ensemble regional climate scenarios with 1 km horizontal grid spacing over Japan are developed using 37 CMIP5 GCMs (RCP8.5) and a statistical downscaling method (Bias Corrected Spatial Disaggregation, BCSD) to investigate the uncertainty of projected change associated with structural differences among the GCMs for the periods of historical climate (1950-2005) and near-future climate (2026-2050). The statistically downscaled regional climate scenarios show good performance for annual and seasonal averages of precipitation and temperature. The scenarios systematically underestimate extreme events, such as hot days over 35 °C and annual maximum daily precipitation, because of the interpolation steps in the BCSD method. The models project different responses in the near-future climate because of structural differences, but most of the 37 CMIP5 models show a qualitatively consistent increase in average and extreme temperature and precipitation. The added value of statistical and dynamical downscaling methods is also investigated for locally forced nonlinear phenomena and extreme events.
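    The bias-correction half of BCSD is an empirical quantile mapping: a model value is replaced by the observed value at the same rank in the climatology. A minimal sketch with invented climatologies follows (the real method maps full daily distributions on a grid, per month or season):

```python
def quantile_map(value, model_clim, obs_clim):
    """Empirical quantile mapping: find value's rank within the model
    climatology and return the observed value at the same rank."""
    m = sorted(model_clim)
    o = sorted(obs_clim)
    rank = sum(1 for v in m if v <= value)      # empirical CDF position
    idx = min(max(rank - 1, 0), len(o) - 1)     # clamp to a valid index
    return o[idx]

model_clim = [10, 12, 14, 16, 18]   # model climatology: ~2 units too warm
obs_clim = [8, 10, 12, 14, 16]      # observed climatology
corrected = quantile_map(14, model_clim, obs_clim)
```

    The spatial-disaggregation half then interpolates the corrected coarse anomalies to the 1 km grid; it is this interpolation that smooths away extremes, consistent with the underestimation the abstract reports.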

  13. Using Analog Ensemble to generate spatially downscaled probabilistic wind power forecasts

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Shahriari, M.; Cervone, G.

    2017-12-01

    We use the Analog Ensemble (AnEn) method to generate probabilistic 80-m wind power forecasts. We use data from the NCEP GFS (approximately 28 km resolution) and NCEP NAM (12 km resolution). We use forecast data from NAM and GFS, and analysis data from NAM, which enables us to: 1) use a lower-resolution model to create higher-resolution forecasts, and 2) use a higher-resolution model to create higher-resolution forecasts. The former essentially increases computing speed and the latter increases forecast accuracy. An aggregated model of the former can be compared against the latter to measure the accuracy of the AnEn spatial downscaling. The AnEn works by taking a deterministic future forecast and comparing it with past forecasts. The model searches for the best-matching estimates within the past forecasts and selects the predictand values corresponding to those past forecasts as the ensemble prediction for the future forecast. Our study is based on predicting wind speed and air density at more than 13,000 grid points in the continental US. We run the AnEn model twice: 1) estimating 80-m wind speed using predictor variables such as temperature, pressure, geopotential height, and the U- and V-components of wind; 2) estimating air density using predictors such as temperature, pressure, and relative humidity. We use the air density values to correct the standard wind power curves for different values of air density. The standard deviation of the ensemble members (i.e., the ensemble spread) is used as a measure of how difficult wind power is to predict at each location, and the correlation coefficient between the ensemble spread and the forecast error determines the appropriateness of this measure. This measure is relevant for wind farm developers, as building wind farms in regions with higher predictability will reduce the real-time risks of operating in the electricity markets.
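    The analog search at the heart of AnEn can be sketched in a few lines. Here a single predictor and an absolute-difference distance stand in for the multi-predictor weighted metric used in practice, and all numbers are invented:

```python
def analog_ensemble(current_forecast, past_forecasts, past_observations, n_analogs):
    """Rank past forecasts by closeness to the current forecast and return
    the observations (or analyses) paired with the closest ones as the
    ensemble members."""
    ranked = sorted(range(len(past_forecasts)),
                    key=lambda i: abs(past_forecasts[i] - current_forecast))
    return [past_observations[i] for i in ranked[:n_analogs]]

past_fc = [5.0, 7.5, 6.0, 9.0, 6.2]     # past 80-m wind speed forecasts, m/s
past_obs = [4.8, 8.1, 6.3, 9.5, 6.0]    # analyses paired with those forecasts
ens = analog_ensemble(6.1, past_fc, past_obs, n_analogs=3)
```

    Because the returned members are analyses rather than forecasts, the same machinery downscales a coarse forecast to the resolution of the analysis archive, which is the spatial-downscaling use the abstract describes.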

  14. Characterizing sources of uncertainty from global climate models and downscaling techniques

    USGS Publications Warehouse

    Wootten, Adrienne; Terando, Adam; Reich, Brian J.; Boyles, Ryan; Semazzi, Fred

    2017-01-01

    In recent years climate model experiments have been increasingly oriented towards providing information that can support local and regional adaptation to the expected impacts of anthropogenic climate change. This shift has magnified the importance of downscaling as a means to translate coarse-scale global climate model (GCM) output to a finer scale that more closely matches the scale of interest. Applying this technique, however, introduces a new source of uncertainty into any resulting climate model ensemble. Here we present a method, based on a previously established variance decomposition method, to partition and quantify the uncertainty in climate model ensembles that is attributable to downscaling. We apply the method to the Southeast U.S. using five downscaled datasets that represent both statistical and dynamical downscaling techniques. The combined ensemble is highly fragmented, in that only a small portion of the complete set of downscaled GCMs and emission scenarios are typically available. The results indicate that the uncertainty attributable to downscaling approaches ~20% for large areas of the Southeast U.S. for precipitation and ~30% for extreme heat days (> 35°C) in the Appalachian Mountains. However, attributable quantities are significantly lower for time periods when the full ensemble is considered but only a sub-sample of all models are available, suggesting that overconfidence could be a serious problem in studies that employ a single set of downscaled GCMs. We conclude with recommendations to advance the design of climate model experiments so that the uncertainty that accrues when downscaling is employed is more fully and systematically considered.
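    One simple way to partition ensemble variance between GCM choice and downscaling technique is a two-way mean decomposition. This is a simplified analogue of the variance-decomposition approach the paper builds on, with invented numbers:

```python
import statistics

def decompose(projections):
    """Two-way mean decomposition of projected changes into a GCM component
    and a downscaling-technique component.
    projections[gcm][method] -> projected change."""
    gcms = list(projections)
    methods = list(next(iter(projections.values())))
    gcm_means = [statistics.mean(projections[g][m] for m in methods)
                 for g in gcms]
    met_means = [statistics.mean(projections[g][m] for g in gcms)
                 for m in methods]
    return statistics.pvariance(gcm_means), statistics.pvariance(met_means)

# Hypothetical temperature-change projections (°C), complete 2x2 design
proj = {"GCM-A": {"statistical": 1.0, "dynamical": 1.4},
        "GCM-B": {"statistical": 2.0, "dynamical": 2.4}}
var_gcm, var_downscaling = decompose(proj)
```

    Note the decomposition assumes the design is complete; the fragmentation the abstract describes (only some GCM-method pairs available) is exactly what complicates the real attribution.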

  15. Positioning cell wall synthetic complexes by the bacterial morphogenetic proteins MreB and MreD.

    PubMed

    White, Courtney L; Kitich, Aleksandar; Gober, James W

    2010-05-01

    In Caulobacter crescentus, intact cables of the actin homologue, MreB, are required for the proper spatial positioning of MurG which catalyses the final step in peptidoglycan precursor synthesis. Similarly, in the periplasm, MreC controls the spatial orientation of the penicillin binding proteins and a lytic transglycosylase. We have now found that MreB cables are required for the organization of several other cytosolic murein biosynthetic enzymes such as MraY, MurB, MurC, MurE and MurF. We also show these proteins adopt a subcellular pattern of localization comparable to MurG, suggesting the existence of cytoskeletal-dependent interactions. Through extensive two-hybrid analyses, we have now generated a comprehensive interaction map of components of the bacterial morphogenetic complex. In the cytosol, this complex contains both murein biosynthetic enzymes and morphogenetic proteins, including RodA, RodZ and MreD. We show that the integral membrane protein, MreD, is essential for lateral peptidoglycan synthesis, interacts with the precursor synthesizing enzymes MurG and MraY, and additionally, determines MreB localization. Our results suggest that the interdependent localization of MreB and MreD functions to spatially organize a complex of peptidoglycan precursor synthesis proteins, which is required for propagation of a uniform cell shape and catalytically efficient peptidoglycan synthesis.

  16. Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics

    NASA Astrophysics Data System (ADS)

    Lazarus, S. M.; Holman, B. P.; Splitt, M. E.

    2017-12-01

    A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates the post-processed forecasts are calibrated. Two downscaling case studies are presented: a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
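    The mean part of an EMOS calibration is a linear correction of the ensemble mean; a minimal least-squares sketch with invented training data follows. Full EMOS also links the predictive variance to the ensemble spread, typically by minimizing the CRPS, which is omitted here:

```python
import statistics

def fit_emos_mean(ens_means, obs):
    """Fit obs ~ a + b * ensemble_mean by ordinary least squares and take
    the predictive variance from the residuals. A stripped-down stand-in
    for an EMOS fit, not the full spread-adjusting estimator."""
    n = len(obs)
    mx = statistics.mean(ens_means)
    my = statistics.mean(obs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ens_means, obs))
    sxx = sum((x - mx) ** 2 for x in ens_means)
    b = sxy / sxx
    a = my - b * mx
    resid_var = sum((y - (a + b * x)) ** 2
                    for x, y in zip(ens_means, obs)) / n
    return a, b, resid_var

# Toy training set: raw ensemble mean reads roughly 1 m/s too high
a, b, v = fit_emos_mean([3.0, 5.0, 7.0, 9.0], [2.1, 3.9, 6.1, 7.9])
```

    In the gridded scheme described above, parameters like (a, b, v) fitted at stations are then spread across the grid using the flow-dependent relationships from the WRF winds.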

  17. Downscaling SMAP Radiometer Soil Moisture over the CONUS using Soil-Climate Information and Ensemble Learning

    NASA Astrophysics Data System (ADS)

    Abbaszadeh, P.; Moradkhani, H.

    2017-12-01

    Soil moisture contributes significantly to the improvement of weather and climate forecasts and to understanding terrestrial ecosystem processes. It is a key hydrologic variable in agricultural drought monitoring, flood modeling, and irrigation management. While satellite retrievals can provide unprecedented information on soil moisture at the global scale, the products are generally at coarse spatial resolutions (25-50 km). This often hampers their use in regional or local studies, which normally require a finer-resolution data set. This work presents a new framework based on an ensemble learning method that uses soil-climate information derived from remote sensing and ground-based observations to downscale the level 3 daily composite version (L3_SM_P) of SMAP radiometer soil moisture over the Continental U.S. (CONUS) at 1 km spatial resolution. In the proposed method, a suite of remotely sensed and in situ data sets, in addition to soil texture and topography data among others, were used. The downscaled product was validated against in situ soil moisture measurements collected from a limited number of core validation sites and several hundred sparse soil moisture networks throughout the CONUS. The results indicated a great potential of the proposed methodology to derive fine-resolution soil moisture information applicable to fine-resolution hydrologic modeling, data assimilation, and other regional studies.

  18. Applications of Machine Learning to Downscaling and Verification

    NASA Astrophysics Data System (ADS)

    Prudden, R.

    2017-12-01

    Downscaling, sometimes known as super-resolution, means converting model data into a more detailed local forecast. It is a problem which could be highly amenable to machine learning approaches, provided that sufficient historical forecast data and observations are available. It is also closely linked to the subject of verification, since improving a forecast requires a way to measure that improvement. This talk will describe some early work towards downscaling Met Office ensemble forecasts, and discuss how the output may be usefully evaluated.

  19. Inter-comparison of precipitable water among reanalyses and its effect on downscaling in the tropics

    NASA Astrophysics Data System (ADS)

    Takahashi, H. G.; Fujita, M.; Hara, M.

    2012-12-01

    This paper compared precipitable water (PW) among four major reanalyses. In addition, we investigated the effect of the boundary conditions on downscaling in the tropics using a regional climate model. The spatial pattern of PW in the reanalyses agreed closely with observations. However, the absolute amounts of PW in some reanalyses were very small compared with observations. The discrepancies in the 12-year mean PW in July over the Southeast Asian monsoon region exceeded the interannual standard deviation of the PW. There was also a discrepancy in tropical PW throughout the year, an indication that the problem is not regional but global. Downscaling experiments forced by the four different reanalyses were then conducted. Differences in the atmospheric circulation, including the monsoon westerlies and various disturbances, were very small among the reanalyses. However, simulated precipitation was only 60% of observed precipitation, although the dry bias in the boundary conditions was only 6%. This result indicates that a dry bias in the boundary conditions has large effects on precipitation in downscaling over the tropics. It also suggests that, in the tropics, a regional climate downscaled from ensemble-mean boundary conditions is quite different from the ensemble mean of the regional climates downscaled separately from the boundary conditions of each ensemble member. Downscaled models can provide realistic simulations of regional tropical climates only if the boundary conditions include realistic absolute amounts of PW; the use of such boundary conditions in tropical downscaling is therefore imperative at the present time. This work was partly supported by the Global Environment Research Fund (RFa-1101) of the Ministry of the Environment, Japan.

  20. New statistical downscaling for Canada

    NASA Astrophysics Data System (ADS)

    Murdock, T. Q.; Cannon, A. J.; Sobie, S.

    2013-12-01

    This poster will document the production of a set of statistically downscaled future climate projections for Canada based on the latest available RCM and GCM simulations - the North American Regional Climate Change Assessment Program (NARCCAP; Mearns et al. 2007) and the Coupled Model Intercomparison Project Phase 5 (CMIP5). The main stages of the project included (1) downscaling method evaluation, (2) scenario selection, (3) production of statistically downscaled results, and (4) application of results. We build upon a previous downscaling evaluation project (Bürger et al. 2012, Bürger et al. 2013) in which a quantile-based method (Bias Correction/Spatial Disaggregation - BCSD; Werner 2011) provided high skill compared with four other methods representing the majority of downscaling types used in Canada. Additional quantile-based methods (Bias-Correction/Constructed Analogues, Maurer et al. 2010, and Bias-Correction/Climate Imprint, Hunter and Meentemeyer 2005) were also evaluated. A subset of 12 CMIP5 simulations was chosen based on an objective set of selection criteria, including hemispheric skill assessment based on the CLIMDEX indices (Sillmann et al. 2013), historical criteria used previously at the Pacific Climate Impacts Consortium (Werner 2011), and refinement based on a modified clustering algorithm (Houle et al. 2012; Katsavounidis et al. 1994). Statistical downscaling was carried out on the NARCCAP ensemble and a subset of the CMIP5 ensemble. We produced downscaled scenarios over Canada at daily time resolution and 300 arc second (~10 km) spatial resolution from historical runs for 1951-2005 and from RCP 2.6, 4.5, and 8.5 projections for 2006-2100. The ANUSPLIN gridded daily dataset (McKenney et al. 2011) was used as a target: it has national coverage, spans the historical period of interest (1951-2005), has daily time resolution, and interpolates station data using thin-plate splines. This type of method has been shown to have
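    The quantile-mapping idea shared by BCSD and the other quantile-based methods above can be sketched in a few lines. The gamma distributions and the single-grid-cell framing are assumptions for illustration, not the project's actual configuration.

```python
# Minimal sketch of the quantile-mapping step at the core of BCSD-type
# methods: each model value is replaced by the observed value at the same
# empirical quantile, removing the model's distributional bias.
import numpy as np

rng = np.random.default_rng(1)
obs = rng.gamma(shape=2.0, scale=3.0, size=5000)         # observed daily precip
model_hist = rng.gamma(shape=2.0, scale=4.0, size=5000)  # biased model climate (too wet)

def quantile_map(x, model_ref, obs_ref):
    """Map values x through the model's empirical CDF into the observed distribution."""
    # empirical non-exceedance probability of each x within the model reference
    probs = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    probs = np.clip(probs, 0.0, 1.0)
    # look up the observed value at the same quantile
    return np.quantile(obs_ref, probs)

corrected = quantile_map(model_hist, model_hist, obs)
print(round(model_hist.mean(), 1), round(corrected.mean(), 1), round(obs.mean(), 1))
```

    After correction, the model series inherits the observed distribution's mean and shape while keeping the model's day-to-day sequencing.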

  1. High-Resolution Dynamical Downscaling Ensemble Projections of Future Extreme Temperature Distributions for the United States

    NASA Astrophysics Data System (ADS)

    Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.; Kotamarthi, V. Rao

    2017-12-01

    The aim of this study is to examine projections of extreme temperatures over the continental United States (CONUS) for the 21st century using an ensemble of high spatial resolution dynamically downscaled model simulations with different boundary conditions. The downscaling uses the Weather Research and Forecasting model at a spatial resolution of 12 km along with outputs from three different Coupled Model Intercomparison Project Phase 5 global climate models that provide boundary conditions under two different future greenhouse gas (GHG) concentration trajectories. The results from two decadal-length time slices (2045-2054 and 2085-2094) are compared with a historical decade (1995-2004). Probability density functions of daily maximum/minimum temperatures are analyzed over seven climatologically cohesive regions of the CONUS. The impacts of different boundary conditions as well as future GHG concentrations on extreme events such as heat waves and days with temperature higher than 95°F are also investigated. The results show that the intensity of extreme warm temperatures in future summers increases significantly, while the frequency of extreme cold temperatures in future winters decreases. The distribution of summer daily maximum temperature experiences a significant warm-side shift and increased variability, while the distribution of winter daily minimum temperature is projected to have a less significant warm-side shift with decreased variability. Under the "business-as-usual" scenario, 5-day heat waves are projected to occur at least 5-10 times per year over most of the CONUS, and the number of ≥95°F days will increase by 1-2 months by the end of the century.

  2. High-Resolution Dynamical Downscaling Ensemble Projections of Future Extreme Temperature Distributions for the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.

    The aim of this study is to examine projections of extreme temperatures over the continental United States (CONUS) for the 21st century using an ensemble of high spatial resolution dynamically downscaled model simulations with different boundary conditions. The downscaling uses the Weather Research and Forecasting model at a spatial resolution of 12 km along with outputs from three different Coupled Model Intercomparison Project Phase 5 global climate models that provide boundary conditions under two different future greenhouse gas (GHG) concentration trajectories. The results from two decadal-length time slices (2045-2054 and 2085-2094) are compared with a historical decade (1995-2004). Probability density functions of daily maximum/minimum temperatures are analyzed over seven climatologically cohesive regions of the CONUS. The impacts of different boundary conditions as well as future GHG concentrations on extreme events such as heat waves and days with temperature higher than 95°F are also investigated. The results show that the intensity of extreme warm temperatures in future summers increases significantly, while the frequency of extreme cold temperatures in future winters decreases. The distribution of summer daily maximum temperature experiences a significant warm-side shift and increased variability, while the distribution of winter daily minimum temperature is projected to have a less significant warm-side shift with decreased variability. Finally, under the "business-as-usual" scenario, 5-day heat waves are projected to occur at least 5-10 times per year over most of the CONUS, and the number of ≥95°F days will increase by 1-2 months by the end of the century.

  3. High-Resolution Dynamical Downscaling Ensemble Projections of Future Extreme Temperature Distributions for the United States

    DOE PAGES

    Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.; ...

    2017-11-20

    The aim of this study is to examine projections of extreme temperatures over the continental United States (CONUS) for the 21st century using an ensemble of high spatial resolution dynamically downscaled model simulations with different boundary conditions. The downscaling uses the Weather Research and Forecasting model at a spatial resolution of 12 km along with outputs from three different Coupled Model Intercomparison Project Phase 5 global climate models that provide boundary conditions under two different future greenhouse gas (GHG) concentration trajectories. The results from two decadal-length time slices (2045-2054 and 2085-2094) are compared with a historical decade (1995-2004). Probability density functions of daily maximum/minimum temperatures are analyzed over seven climatologically cohesive regions of the CONUS. The impacts of different boundary conditions as well as future GHG concentrations on extreme events such as heat waves and days with temperature higher than 95°F are also investigated. The results show that the intensity of extreme warm temperatures in future summers increases significantly, while the frequency of extreme cold temperatures in future winters decreases. The distribution of summer daily maximum temperature experiences a significant warm-side shift and increased variability, while the distribution of winter daily minimum temperature is projected to have a less significant warm-side shift with decreased variability. Finally, under the "business-as-usual" scenario, 5-day heat waves are projected to occur at least 5-10 times per year over most of the CONUS, and the number of ≥95°F days will increase by 1-2 months by the end of the century.

  4. Quantifying the Value of Downscaled Climate Model Information for Adaptation Decisions: When is Downscaling a Smart Decision?

    NASA Astrophysics Data System (ADS)

    Terando, A. J.; Wootten, A.; Eaton, M. J.; Runge, M. C.; Littell, J. S.; Bryan, A. M.; Carter, S. L.

    2015-12-01

    Two types of decisions face society with respect to anthropogenic climate change: (1) whether to enact a global greenhouse gas abatement policy, and (2) how to adapt to the local consequences of current and future climatic changes. The practice of downscaling global climate models (GCMs) is often used to address (2) because GCMs do not resolve key features that will mediate global climate change at the local scale. In response, the development of downscaling techniques and models has accelerated to aid decision makers seeking adaptation guidance. However, quantifiable estimates of the value of information are difficult to obtain, particularly in decision contexts characterized by deep uncertainty and low system controllability. Here we demonstrate a method to quantify the additional value that decision makers could expect if research investments were directed towards developing new downscaled climate projections. As a proof of concept we focus on a real-world management problem: whether to undertake assisted migration for an endangered tropical avian species. We also take advantage of recently published multivariate methods that account for three vexing issues in climate impacts modeling: maximizing climate model quality information, accounting for model dependence in ensembles of opportunity, and deriving probabilistic projections. We expand on these global methods by including regional (Caribbean Basin) and local (Puerto Rico) domains. In the local domain, we test whether a high-resolution (2 km) dynamically downscaled GCM reduces the multivariate error estimate compared to the original coarse-scale GCM. Initial tests show little difference between the downscaled and original GCM multivariate error. When propagated through to a species population model, the value-of-information analysis indicates that the expected utility that would accrue to the manager (and species) if this downscaling were completed may not justify its cost compared to alternative actions.
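    The core of a value-of-information calculation like the one described above can be illustrated with a toy decision table; the actions, states, probabilities, and utilities below are invented for the example, not taken from the study.

```python
# Toy expected-value-of-perfect-information (EVPI) calculation.
# Rows are candidate actions (e.g. assisted migration vs status quo),
# columns are possible future climate states.
import numpy as np

p_state = np.array([0.6, 0.4])             # belief over future climate states
utility = np.array([[50.0, 10.0],           # action 0: assisted migration
                    [30.0, 25.0]])          # action 1: status quo

# act now: pick the action with the best expected utility under uncertainty
ev_now = (utility @ p_state).max()
# perfect information: pick the best action for each state, then take expectation
ev_perfect = (utility.max(axis=0) * p_state).sum()
evpi = ev_perfect - ev_now                  # upper bound on what new data is worth
print(ev_now, ev_perfect, evpi)
```

    If the cost of producing the downscaled projections exceeds the EVPI (here 6 utility units), the investment cannot pay for itself even if the projections were perfect.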

  5. Consistency and Main Differences Between European Regional Climate Downscaling Intercomparison Results; From PRUDENCE and ENSEMBLES to CORDEX

    NASA Astrophysics Data System (ADS)

    Christensen, J. H.; Larsen, M. A. D.; Christensen, O. B.; Drews, M.

    2017-12-01

    For more than 20 years, coordinated efforts to apply regional climate models to downscale GCM simulations for Europe have been pursued by an ever increasing group of scientists. This endeavor showed its first results during EU framework supported projects such as RACCS and MERCURE, where the foundation for today's advanced worldwide CORDEX approach was laid by a core of six research teams, who conducted some of the first coordinated RCM simulations with the aim of assessing regional climate change for Europe. However, it was realized at this stage that model bias in GCMs as well as RCMs made this task very challenging. As an immediate outcome, the idea was conceived to make an even more coordinated effort by constructing a well-defined and structured set of common simulations; this led to the PRUDENCE project (2001-2004). Additional coordinated efforts involving ever increasing numbers of GCMs and RCMs followed in ENSEMBLES (2004-2009) and the ongoing Euro-CORDEX (officially commenced 2011). Along with the overall coordination, simulations have increased their standard resolution from 50 km (PRUDENCE) to about 12 km (Euro-CORDEX), moved from time slice simulations (PRUDENCE) to transient experiments (ENSEMBLES and CORDEX), and expanded from one driving model and emission scenario (PRUDENCE) to several (Euro-CORDEX). So far, this wealth of simulations has been used to assess the potential impacts of future climate change in Europe, providing a baseline change defined by the multi-model mean with associated uncertainties calculated from model spread in the ensemble. But how has the overall picture of state-of-the-art regional climate change projections changed over this period of almost two decades? Here we compare, across scenarios, model resolutions and model vintage, the results from PRUDENCE, ENSEMBLES and Euro-CORDEX. By appropriate scaling we identify robust findings about the projected future of European climate expressed by temperature and precipitation changes.

  6. Optimising predictor domains for spatially coherent precipitation downscaling

    NASA Astrophysics Data System (ADS)

    Radanovics, S.; Vidal, J.-P.; Sauquet, E.; Ben Daoud, A.; Bontron, G.

    2013-10-01

    Statistical downscaling is widely used to overcome the scale gap between predictors from numerical weather prediction models or global circulation models and predictands like local precipitation, required for example for medium-term operational forecasts or climate change impact studies. The predictors are considered over a given spatial domain that is rarely optimised with respect to the target predictand location. In this study, an extended version of the growing rectangular domain algorithm is proposed to provide an ensemble of near-optimum predictor domains for a statistical downscaling method. This algorithm is applied to find five-member ensembles of near-optimum geopotential predictor domains for an analogue downscaling method for 608 individual target zones covering France. Results first show that very similar downscaling performances based on the continuous ranked probability score (CRPS) can be achieved by different predictor domains for any specific target zone, demonstrating the need to consider alternative domains in this context of high equifinality. A second result is the large diversity of optimised predictor domains over the country, which questions the commonly made hypothesis of a common predictor domain for large areas. The domain centres mainly follow the geographical location of the target zone, but there are apparent differences between the windward and the lee sides of mountain ridges. Moreover, domains for target zones located in southeastern France are centred further east and south than those for target locations at the same longitude. The size of the optimised domains tends to be larger in the southeastern part of the country, while domains with a very small meridional extent can be found in an east-west band around 47° N. 
Sensitivity experiments finally show that results are rather insensitive to the starting point of the optimisation algorithm except for zones located in the transition area north of this east
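    The CRPS used above to score candidate predictor domains can be estimated directly from an ensemble via the standard identity CRPS = E|X - y| - ½ E|X - X'|. The sketch below uses invented numbers rather than the study's analogue ensembles.

```python
# Empirical CRPS of an ensemble forecast against a single observation.
# Lower is better; for a deterministic forecast it reduces to absolute error.
import numpy as np

def crps_ensemble(members, obs):
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()                       # E|X - y|
    term2 = np.abs(members[:, None] - members[None, :]).mean() # E|X - X'|
    return term1 - 0.5 * term2

ens = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # analogue ensemble (e.g. mm of rain)
print(crps_ensemble(ens, obs=2.5))
```

    Averaging this score over many forecast days for each candidate domain gives the objective function that the growing-domain algorithm minimises.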

  7. Understanding the effects of electromagnetic field emissions from Marine Renewable Energy Devices (MREDs) on the commercially important edible crab, Cancer pagurus (L.).

    PubMed

    Scott, Kevin; Harsanyi, Petra; Lyndon, Alastair R

    2018-06-01

    The effects of simulated electromagnetic fields (EMF), emitted from sub-sea power cables, on the commercially important decapod, edible crab (Cancer pagurus), were assessed. Stress related parameters were measured (l-Lactate, d-Glucose, Haemocyanin and respiration rate) along with behavioural and response parameters (antennular flicking, activity level, attraction/avoidance, shelter preference and time spent resting/roaming) during 24-h periods. Exposure to EMF had no effect on Haemocyanin concentrations, respiration rate, activity level or antennular flicking rate. EMF exposure significantly disrupted haemolymph l-Lactate and d-Glucose natural circadian rhythms. Crabs showed a clear attraction to EMF exposed shelter (69%) compared to control shelter (9%) and significantly reduced their time spent roaming by 21%. Consequently, EMF emitted from Marine Renewable Energy Devices (MREDs) will likely affect edible crabs both behaviourally and physiologically, suggesting that the impact of EMF on crustaceans must be considered when planning MREDs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Assessment of regional downscaling simulations for long term mean, excess and deficit Indian Summer Monsoons

    NASA Astrophysics Data System (ADS)

    Varikoden, Hamza; Mujumdar, M.; Revadekar, J. V.; Sooraj, K. P.; Ramarao, M. V. S.; Sanjay, J.; Krishnan, R.

    2018-03-01

    This study undertakes a comprehensive assessment of the dynamical downscaling of summer monsoon (June-September; JJAS) rainfall over heterogeneous regions, namely the Western Ghats (WG), Central India (CI) and the North-Eastern Region (NER), for the long term mean and for excess and deficit episodes over the historical period 1951-2005. The assessment is based on six Coordinated Regional Climate Downscaling Experiment (CORDEX) simulations for the South Asia (SAS) region and their five driving Global Climate Model (GCM) simulations, along with observations from the India Meteorological Department (IMD) and the Asian Precipitation Highly Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE). The analysis reveals an overall reduction of the dry bias in rainfall across the Indian subcontinent in most of the downscaled CORDEX-SAS models and in their ensemble mean, as compared with the driving GCMs. The interannual variability during the historical period is reasonably captured by the ensemble means of the CORDEX-SAS simulations, with underestimations of 0.43%, 38% and 52% for the WG, CI and NER, respectively. Careful examination of the CORDEX-SAS models and their driving GCMs revealed considerable improvement in the regionally downscaled rainfall. The value addition of dynamical downscaling is apparent over the WG in the Regional Climate Model (RCM) simulations, with an improvement of more than 30% over their driving GCMs for the long term mean, excess and deficit episodes. In the case of the NER, the improvement in the downscaled rainfall product is more than 10% for all episodes. However, the value addition of the CORDEX-SAS simulations for the CI region, which is dominantly influenced by synoptic scale processes, is not clear. Nevertheless, the reduction of the dry bias in the complex topographical regions is remarkable. 
The relative performance of dynamical downscaling of rainfall over complex topography in response to local forcing and orographic lifting depict the value

  9. Ensemble experiments using a nested LETKF system to reproduce intense vortices associated with tornadoes of 6 May 2012 in Japan

    NASA Astrophysics Data System (ADS)

    Seko, Hiromu; Kunii, Masaru; Yokota, Sho; Tsuyuki, Tadashi; Miyoshi, Takemasa

    2015-12-01

    Experiments simulating intense vortices associated with tornadoes that occurred on 6 May 2012 on the Kanto Plain, Japan, were performed with a nested local ensemble transform Kalman filter (LETKF) system. Intense vortices were reproduced by downscaling experiments with a 12-member ensemble in which the initial conditions were obtained from the nested LETKF system analyses. The downscaling experiments successfully generated intense vortices in three regions, similar to the observed vortices, whereas only one tornado was reproduced by a deterministic forecast. The intense vorticity of the strongest tornado, which was observed in the southernmost region, was successfully reproduced by 10 of the 12 ensemble members. An examination of the results of the ensemble downscaling experiments showed that the duration of intense vorticity tended to be longer when the vertical shear of the horizontal wind was larger and the lower airflow was more humid. Overall, the study results show that ensemble forecasts have the following merits: (1) probabilistic forecasts of the outbreak of intense vortices associated with tornadoes are possible; (2) the miss rate for outbreaks should decrease; and (3) environmental factors favoring outbreaks can be obtained by comparing the multiple possible scenarios of the ensemble forecasts.
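    Merit (1) above can be made concrete: the forecast outbreak probability is simply the fraction of ensemble members whose simulated vorticity exceeds a chosen threshold. The member values and the threshold below are invented for illustration, not taken from the LETKF runs.

```python
# Outbreak probability from a 12-member downscaling ensemble.
import numpy as np

# one simulated maximum low-level vorticity per member (s^-1); values invented
max_vorticity = np.array([0.014, 0.012, 0.018, 0.011, 0.010, 0.013,
                          0.016, 0.009, 0.015, 0.012, 0.008, 0.017])
threshold = 0.01                      # outbreak criterion (assumed)
p_outbreak = (max_vorticity >= threshold).mean()
print(f"outbreak probability: {p_outbreak:.2f}")   # here 10 of 12 members exceed
```

    A deterministic forecast can only answer yes or no; the ensemble's graded probability is what lowers the miss rate for rare events like tornado outbreaks.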

  10. Downscaling Satellite Data for Predicting Catchment-scale Root Zone Soil Moisture with Ground-based Sensors and an Ensemble Kalman Filter

    NASA Astrophysics Data System (ADS)

    Lin, H.; Baldwin, D. C.; Smithwick, E. A. H.

    2015-12-01

    Predicting root zone (0-100 cm) soil moisture (RZSM) content at the catchment scale is essential for drought and flood prediction, irrigation planning, weather forecasting, and many other applications. Satellites, such as NASA's Soil Moisture Active Passive (SMAP) mission, can estimate near-surface (0-5 cm) soil moisture content globally at coarse spatial resolutions. We develop a hierarchical Ensemble Kalman Filter (EnKF) data assimilation modeling system to downscale satellite-based near-surface soil moisture and to estimate RZSM content across the Shale Hills Critical Zone Observatory at a 1-m resolution in combination with ground-based soil moisture sensor data. In this example, a simple infiltration model within the EnKF system has been parameterized for six soil-terrain units to forecast daily RZSM content in the catchment from 2009 to 2012 based on AMSR-E retrievals. LiDAR-derived terrain variables define intra-unit RZSM variability using a novel covariance localization technique. This method also allows the uncertainty of the RZSM estimates to be mapped at each time step. A catchment-wide satellite-to-surface downscaling parameter, which nudges the satellite measurement closer to in situ near-surface data, is also calculated for each time step. We find significant differences in predicted root zone moisture storage for different terrain units across the experimental period. The root mean square error from a cross-validation analysis of RZSM predictions, using an independent dataset of catchment-wide in situ Time-Domain Reflectometry (TDR) measurements, ranges from 0.060 to 0.096 cm3 cm-3, and the RZSM predictions are significantly (p < 0.05) correlated with TDR measurements (r = 0.47-0.68). The predictive skill of this data assimilation system is similar to that of the Penn State Integrated Hydrologic Modeling (PIHM) system. Uncertainty estimates are significantly (p < 0.05) correlated with cross-validation error during both wet and dry conditions, but more so in dry summer seasons. 
Developing an
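    The EnKF update at the heart of such a system can be sketched for a two-component state (near-surface and root-zone soil moisture): a near-surface satellite observation updates the unobserved root zone through the ensemble cross-covariance. All numbers, the observation operator, and the stochastic (perturbed-observation) variant used here are assumptions for illustration, not the authors' configuration.

```python
# Minimal stochastic EnKF update with a 50-member ensemble.
import numpy as np

rng = np.random.default_rng(2)
n_ens = 50
# state per member: [near-surface SM, root-zone SM], with a correlated prior
prior = rng.multivariate_normal([0.20, 0.30],
                                [[0.0025, 0.0015],
                                 [0.0015, 0.0020]], size=n_ens)

obs, obs_var = 0.25, 0.0004            # satellite near-surface retrieval
H = np.array([1.0, 0.0])               # observe the near-surface component only

anomalies = prior - prior.mean(axis=0)
P = anomalies.T @ anomalies / (n_ens - 1)      # ensemble covariance estimate
K = P @ H / (H @ P @ H + obs_var)              # Kalman gain (2-vector)
perturbed_obs = obs + rng.normal(0, np.sqrt(obs_var), n_ens)
posterior = prior + np.outer(perturbed_obs - prior @ H, K)
print(prior.mean(axis=0).round(3), posterior.mean(axis=0).round(3))
```

    Because the prior covariance links the two layers, the surface observation pulls the root-zone estimate toward consistency as well; covariance localization, as used in the study, tapers such cross-covariances to suppress sampling noise.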

  11. Evaluations of high-resolution dynamically downscaled ensembles over the contiguous United States

    NASA Astrophysics Data System (ADS)

    Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.; Kotamarthi, V. Rao

    2018-02-01

    This study uses the Weather Research and Forecasting (WRF) model to evaluate the performance of six dynamically downscaled decadal historical simulations at 12-km resolution for a large domain (7200 × 6180 km) that covers most of North America. The initial and boundary conditions are from three global climate models (GCMs) and one reanalysis dataset. The GCMs employed in this study are the Geophysical Fluid Dynamics Laboratory Earth System Model with Generalized Ocean Layer Dynamics component; the Community Climate System Model, version 4; and the Hadley Centre Global Environment Model, version 2-Earth System. The reanalysis data are from the National Centers for Environmental Prediction-U.S. Department of Energy Reanalysis II. We analyze the effects of bias correcting the lateral boundary conditions and the effects of spectral nudging. We evaluate the model performance for seven surface variables and four upper-atmospheric variables based on their climatology and extremes for seven subregions across the United States. The results indicate that a simulation's performance depends on both the location and the feature/variable being tested. We find that the use of bias correction and/or nudging is beneficial in many situations, but employing these when running the RCM is not always an improvement relative to the reference data. The use of the ensemble mean and median leads to better performance for the climatology, while both are significantly biased for the extremes, showing much larger differences from the reference data than individual GCM-driven simulations. This study provides a comprehensive evaluation of these historical model runs in order to support informed decisions when making future projections.

  12. An application of hybrid downscaling model to forecast summer precipitation at stations in China

    NASA Astrophysics Data System (ADS)

    Liu, Ying; Fan, Ke

    2014-06-01

    A pattern-prediction hybrid downscaling method was applied to predict summer (June-July-August) precipitation at 160 stations in China. The predicted precipitation from the downscaling scheme is available one month in advance. Four predictors were chosen to establish the hybrid downscaling scheme. The 500-hPa geopotential height (GH5) and 850-hPa specific humidity (q85) were taken from the skillful predicted output of three DEMETER (Development of a European Multi-model Ensemble System for Seasonal to Interannual Prediction) general circulation models (GCMs). The 700-hPa geopotential height (GH7) and sea level pressure (SLP) were taken from reanalysis datasets. The hybrid downscaling scheme (HD-4P) shows better prediction skill than a conventional statistical downscaling model (SD-2P) containing two predictors derived from the output of the GCMs, although both downscaling schemes improved the seasonal prediction of summer rainfall compared with the original output of the DEMETER GCMs. In particular, the HD-4P downscaling predictions showed lower root mean square errors than those based on the SD-2P model. Furthermore, the HD-4P downscaling model reproduced the anomaly centers of China summer precipitation in 1998 more accurately than the SD-2P model. Hybrid downscaling should therefore be effective in improving the skill of summer rainfall prediction at stations in China.

  13. Downscaled projections of Caribbean coral bleaching that can inform conservation planning.

    PubMed

    van Hooidonk, Ruben; Maynard, Jeffrey Allen; Liu, Yanyun; Lee, Sang-Ki

    2015-09-01

    Projections of climate change impacts on coral reefs produced at the coarse resolution (~1°) of Global Climate Models (GCMs) have informed debate but have not helped target local management actions. Here, projections of the onset of annual coral bleaching conditions in the Caribbean under Representative Concentration Pathway (RCP) 8.5 are produced using an ensemble of 33 Coupled Model Intercomparison Project phase-5 models and via dynamical and statistical downscaling. A high-resolution (~11 km) regional ocean model (MOM4.1) is used for the dynamical downscaling. For statistical downscaling, sea surface temperature (SST) means and annual cycles in all the GCMs are replaced with observed data from the ~4-km NOAA Pathfinder SST dataset. Spatial patterns in all three projections are broadly similar; the average year for the onset of annual severe bleaching is 2040-2043 for all projections. However, downscaled projections show many locations where the onset of annual severe bleaching (ASB) varies 10 or more years within a single GCM grid cell. Managers in locations where this applies (e.g., Florida, Turks and Caicos, Puerto Rico, and the Dominican Republic, among others) can identify locations that represent relative albeit temporary refugia. Both downscaled projections are different for the Bahamas compared to the GCM projections. The dynamically downscaled projections suggest an earlier onset of ASB linked to projected changes in regional currents, a feature not resolved in GCMs. This result demonstrates the value of dynamical downscaling for this application and means statistically downscaled projections have to be interpreted with caution. However, aside from west of Andros Island, the projections for the two types of downscaling are mostly aligned; projected onset of ASB is within ±10 years for 72% of the reef locations. © 2015 The Authors. Global Change Biology Published by John Wiley & Sons Ltd.

  14. Reanalysis of the 1893 heat wave in France through offline data assimilation in a downscaled ensemble meteorological reconstruction

    NASA Astrophysics Data System (ADS)

    Devers, Alexandre; Vidal, Jean-Philippe; Lauvernet, Claire; Graff, Benjamin

    2017-04-01

    The knowledge of historical French weather has recently been improved through the development of the SCOPE (Spatially COherent Probabilistic Extended) Climate reconstruction, a probabilistic high-resolution daily reconstruction of precipitation and temperature covering the period 1871-2012 and based on the statistical downscaling of the Twentieth Century Reanalysis (Caillouet et al., 2016). However, historical surface observations - even though rather scarce and sparse - do exist from at least the beginning of this period, and this information does not currently feed the SCOPE Climate reconstructions. The goal of this study is therefore to assimilate these historical observations into the SCOPE Climate reconstructions in order to build a 150-year meteorological reanalysis over France. This study considers "offline" data assimilation methods - Kalman filtering methods like the Ensemble Square Root Filter - that have been used successfully in recent paleoclimate studies, i.e. at much larger temporal and spatial scales (see e.g. Bhend et al., 2012). These methods are applied here to reconstruct the 8-24 August 1893 heat wave in France, using all available daily temperature observations from that period; the temperatures reached that summer were indeed compared at the time to those of Senegal (Garnier, 2012). Results show a spatially coherent view of the heat wave at the national scale as well as a reduced uncertainty compared with the initial meteorological reconstructions, thus demonstrating the added value of data assimilation. In order to assess the performance of the assimilation methods in a more recent context, they are also used to reconstruct the well-known 3-14 August 2003 heat wave using (1) all available stations, and (2) the same station density as in August 1893, with the remaining observations saved for validation. This analysis allows a comparison of two heat waves that occurred 100 years apart in France with different associated uncertainties, in

  15. Simulation of an ensemble of future climate time series with an hourly weather generator

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.; Kim, J.

    2010-12-01

There is evidence that climate change is occurring in many regions of the world. Climate change predictions at the local scale and at fine temporal resolution are thus needed for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCMs) and regional analyses. Nevertheless, the temporal and spatial resolutions obtained, as well as the type of meteorological variables, may not be sufficient for detailed studies of climate change effects at local scales. In this context, this study presents a stochastic downscaling technique that uses an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to fully account for the probabilistic information obtained with the Bayesian multi-model ensemble. The factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes to low-frequency inter-annual variability. The final result of the procedure is an ensemble of hourly time series of meteorological variables that can be considered representative of future climate, as inferred from GCMs. The generated ensemble of scenarios also accounts for the uncertainty derived from the multiple GCMs used in downscaling. Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ

  16. Multisite rainfall downscaling and disaggregation in a tropical urban area

    NASA Astrophysics Data System (ADS)

    Lu, Y.; Qin, X. S.

    2014-02-01

A systematic downscaling-disaggregation study was conducted over Singapore Island, with the aim of generating high spatial and temporal resolution rainfall data under future climate-change conditions. The study consisted of two major components. The first part was an inter-comparison of various downscaling and disaggregation methods based on observed data. This included (i) single-site generalized linear model (GLM) plus K-nearest neighbor (KNN) (S-G-K) vs. multisite GLM (M-G) for spatial downscaling, (ii) HYETOS vs. KNN for single-site disaggregation, and (iii) KNN vs. MuDRain (Multivariate Rainfall Disaggregation tool) for multisite disaggregation. The results revealed that, for multisite downscaling, M-G performed better than S-G-K in covering the observed data with a lower RMSE value; for single-site disaggregation, KNN better preserved the basic statistics (i.e. standard deviation, lag-1 autocorrelation and probability of wet hour) than HYETOS; for multisite disaggregation, MuDRain outperformed KNN in fitting interstation correlations. In the second part of the study, an integrated downscaling-disaggregation framework based on M-G, KNN, and MuDRain was used to generate hourly rainfall at multiple sites. The results indicated that the downscaled and disaggregated rainfall data, based on multiple ensembles from HadCM3 for the period 1980-2010, covered the observed mean rainfall amounts and extremes well, and also reasonably preserved the spatial correlations at both daily and hourly timescales. The framework was also used to project future rainfall conditions under the HadCM3 SRES A2 and B2 scenarios. The results indicated that the annual rainfall amount could decrease by up to 5% by the end of this century, while wet-season rainfall and extreme hourly rainfall could notably increase.
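As a rough illustration of the KNN disaggregation idea compared above, the following sketch resamples an observed hourly profile from the k days with the closest daily totals and rescales it to match the target total (the toy data and the inverse-rank sampling weights are our own assumptions, not the study's exact configuration):

```python
import numpy as np

def knn_disaggregate(daily_total, hist_daily, hist_hourly, k=5, rng=None):
    """K-nearest-neighbour temporal disaggregation (a minimal sketch).

    daily_total : rainfall total (mm) for the day to disaggregate
    hist_daily  : (n_days,) observed daily totals
    hist_hourly : (n_days, 24) matching observed hourly values
    Picks one of the k historical days with the closest daily total and
    rescales its hourly profile so the hours sum to `daily_total`.
    """
    if rng is None:
        rng = np.random.default_rng()
    order = np.argsort(np.abs(hist_daily - daily_total))
    neighbours = order[:k]
    # inverse-rank sampling weights, as in many KNN resamplers
    w = 1.0 / np.arange(1, k + 1)
    pick = rng.choice(neighbours, p=w / w.sum())
    profile = hist_hourly[pick]
    total = profile.sum()
    if total == 0:                      # dry analogue: spread uniformly
        return np.full(24, daily_total / 24.0)
    return profile * (daily_total / total)

rng = np.random.default_rng(1)
hist_hourly = rng.gamma(0.3, 2.0, size=(200, 24))   # toy observed record
hist_daily = hist_hourly.sum(axis=1)
hours = knn_disaggregate(12.0, hist_daily, hist_hourly, k=5, rng=rng)
```

Because the hourly profile is borrowed from an observed day, statistics such as the lag-1 autocorrelation and the probability of a wet hour are inherited from the observations, which is why KNN preserves them well in the comparison above.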

  17. Downscaling global precipitation for local applications - a case for the Rhine basin

    NASA Astrophysics Data System (ADS)

    Sperna Weiland, Frederiek; van Verseveld, Willem; Schellekens, Jaap

    2017-04-01

Within the EU FP7 project eartH2Observe a global Water Resources Re-analysis (WRR) is being developed. This re-analysis consists of meteorological and hydrological water balance variables with global coverage, spanning the period 1979-2014 at 0.25 degrees resolution (Schellekens et al., 2016). The dataset can be of special interest in regions with limited in-situ data availability, yet for local-scale analysis, particularly in mountainous regions, a resolution of 0.25 degrees may be too coarse and downscaling the data to a higher resolution may be required. A downscaling toolbox has been developed that includes spatial downscaling of precipitation based on the global WorldClim dataset, which is available at 1 km resolution as a monthly climatology (Hijmans et al., 2005). The inputs of the downscaling tool are either the global eartH2Observe WRR1 and WRR2 datasets based on the WFDEI correction methodology (Weedon et al., 2014) or the global Multi-Source Weighted-Ensemble Precipitation (MSWEP) dataset (Beck et al., 2016). Here we present a validation of the datasets over the Rhine catchment by means of a distributed hydrological model (wflow, Schellekens et al., 2014) using a number of precipitation scenarios. (1) We start by running the model using the local reference dataset derived by spatial interpolation of gauge observations. Furthermore we use (2) the MSWEP dataset at the native 0.25-degree resolution, followed by (3) MSWEP downscaled with the WorldClim dataset and finally (4) MSWEP downscaled with the local reference dataset. The validation is based on comparison of the modeled river discharges as well as rainfall statistics. We expect that downscaling the MSWEP dataset with the WorldClim data to higher resolution will increase its performance. To test the performance of the downscaling routine we have added a run with MSWEP data downscaled with the local dataset and compare this with the run based on the local dataset itself. - Beck, H. E. et al., 2016. MSWEP
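Climatology-based spatial downscaling of precipitation can be sketched as redistributing each coarse cell over its fine sub-cells in proportion to the fine-scale climatology, so that the coarse-cell mean is preserved (a minimal sketch with made-up fields; the actual eartH2Observe toolbox may differ in detail):

```python
import numpy as np

def climatology_downscale(p_coarse, clim_fine, factor):
    """Downscale coarse precipitation with a fine-scale monthly climatology.

    p_coarse  : (ny, nx) coarse precipitation field
    clim_fine : (ny*factor, nx*factor) fine climatology (e.g. a WorldClim month)
    Each coarse value is redistributed over its factor x factor block in
    proportion to the fine climatology, so the block mean is preserved.
    """
    ny, nx = p_coarse.shape
    # block-average the fine climatology back to the coarse grid
    blocks = clim_fine.reshape(ny, factor, nx, factor)
    clim_coarse = blocks.mean(axis=(1, 3))
    # expand the coarse fields to the fine grid and apply the ratio
    p_big = np.kron(p_coarse, np.ones((factor, factor)))
    c_big = np.kron(clim_coarse, np.ones((factor, factor)))
    return p_big * clim_fine / c_big

p = np.array([[2.0, 4.0], [0.0, 8.0]])               # toy 2x2 coarse field
clim = np.linspace(1.0, 3.0, 16).reshape(4, 4)       # toy 4x4 fine climatology
p_fine = climatology_downscale(p, clim, factor=2)
```

The multiplicative form keeps dry coarse cells dry and sharpens gradients in mountainous terrain, which is where the abstract expects the largest benefit.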

  18. Added value of dynamical downscaling of winter seasonal forecasts over North America

    NASA Astrophysics Data System (ADS)

    Tefera Diro, Gulilat; Sushama, Laxmi

    2017-04-01

Skillful seasonal forecasts have enormous potential benefits for socio-economic sectors that are sensitive to weather and climate conditions, as early warnings could reduce the vulnerability of such sectors. In this study, individual ensemble members of the ECMWF global ensemble seasonal forecasts are dynamically downscaled to produce an ensemble of regional seasonal forecasts over North America using the fifth-generation Canadian Regional Climate Model (CRCM5). CRCM5 forecasts are initialized on November 1st of each year and integrated for four months over the 1991-2001 period at 0.22 degree resolution to produce one-month lead-time forecasts. The initial conditions for atmospheric variables are obtained from the ERA-Interim reanalysis, whereas the initial conditions for the land surface are obtained from a separate ERA-Interim-driven CRCM5 simulation with spectral nudging applied to the interior domain. The global and regional ensemble forecasts were then verified to investigate the skill and economic benefits of dynamical downscaling. Results indicate that both the global and regional climate models produce skillful precipitation forecasts over the southern Great Plains and the eastern coast of the U.S., and skillful temperature forecasts over the northern U.S. and most of Canada. In comparison to the ECMWF forecasts, the CRCM5 forecasts improve temperature forecast skill over most of the domain, but the improvements for precipitation are limited to regions with complex topography, where downscaling improves the frequency of intense daily precipitation. CRCM5 forecasts also yield a better economic value than ECMWF precipitation forecasts for users whose cost-loss ratio is smaller than 0.5.
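The economic value referred to in the last sentence is commonly computed from the hit rate, false-alarm rate, climatological base rate and the user's cost-loss ratio. A sketch of the standard potential-economic-value score (Richardson, 2000); the numbers below are illustrative, not taken from the study:

```python
def economic_value(hit_rate, false_alarm_rate, base_rate, cost_loss_ratio):
    """Potential economic value of a forecast (Richardson, 2000).

    Compares the mean expense of acting on the forecast with the best fixed
    (climatological) strategy and with a perfect forecast. Value 1 means
    perfect-forecast savings, 0 means no better than climatology.
    """
    r, s = cost_loss_ratio, base_rate
    H, F = hit_rate, false_alarm_rate
    e_clim = min(r, s)                 # always protect vs. never protect
    e_perfect = s * r                  # protect exactly when the event occurs
    # false alarms cost C, hits cost C, misses cost L (expenses in units of L)
    e_forecast = F * (1 - s) * r + H * s * r + (1 - H) * s
    return (e_clim - e_forecast) / (e_clim - e_perfect)

v = economic_value(hit_rate=0.7, false_alarm_rate=0.2, base_rate=0.3,
                   cost_loss_ratio=0.25)
```

Plotting this value as a function of the cost-loss ratio is how statements such as "better value for users with cost-loss ratio below 0.5" are obtained.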

  19. Development of hi-resolution regional climate scenarios in Japan by statistical downscaling

    NASA Astrophysics Data System (ADS)

    Dairaku, K.

    2016-12-01

Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. To meet the needs of stakeholders such as local governments, a Japanese national project, the Social Implementation Program on Climate Change Adaptation Technology (SI-CAT), was launched in December 2015 to develop reliable technologies for near-term climate change prediction. Multi-model ensemble regional climate scenarios with 1 km horizontal grid spacing over Japan are developed using CMIP5 GCMs and a statistical downscaling method to support municipal adaptation measures appropriate for possible regional climate changes. A statistical downscaling method, Bias Correction Spatial Disaggregation (BCSD), is employed to develop regional climate scenarios based on five CMIP5 RCP8.5 GCMs (MIROC5, MRI-CGCM3, GFDL-CM3, CSIRO-Mk3-6-0, HadGEM2-ES) for the periods of the historical climate (1970-2005) and the near-future climate (2020-2055). Downscaled variables are monthly/daily precipitation and temperature. The file format is NetCDF4 (conforming to CF-1.6, with HDF5 compression). The developed regional climate scenarios will be expanded to meet stakeholder needs, and interface applications to access and download the data are under development. Statistical downscaling does not necessarily represent well locally forced nonlinear phenomena and extreme events such as heavy rain and heavy snow. To complement the statistical method, a dynamical downscaling approach is also applied to specific regions of stakeholder interest. The added value of the statistical/dynamical downscaling methods compared with the parent GCMs is investigated.
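The bias-correction step of BCSD is an empirical quantile mapping: each model value is assigned its quantile in the model's historical distribution and replaced by the observed value at the same quantile. A minimal sketch with toy data (the SI-CAT implementation will differ in detail):

```python
import numpy as np

def quantile_map(model_fut, model_hist, obs_hist):
    """Empirical quantile-mapping bias correction (the BC step of BCSD).

    Each input model value is mapped to its quantile in the model's
    historical distribution, then to the observed value at that quantile.
    Values outside the historical range are clipped by np.interp.
    """
    mh = np.sort(model_hist)
    oh = np.sort(obs_hist)
    q = np.linspace(0.0, 1.0, mh.size)
    # model value -> quantile in model climate -> observed value
    quantiles = np.interp(model_fut, mh, q)
    return np.interp(quantiles, np.linspace(0.0, 1.0, oh.size), oh)

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 3.0, 1000)        # "observed" historical climate
model = obs * 0.5 + 1.0                # model with a systematic bias
corrected = quantile_map(model, model, obs)   # correcting the historical run
```

Applied to a future run instead of the historical one, the same mapping removes the systematic bias while letting the model's projected change in the distribution pass through; the spatial-disaggregation step then distributes the corrected anomalies onto the 1 km grid.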

  20. Evaluations of high-resolution dynamically downscaled ensembles over the contiguous United States (Climate Dynamics)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.

This study uses the Weather Research and Forecasting (WRF) model to evaluate the performance of six dynamically downscaled decadal historical simulations at 12-km resolution for a large domain (7200 x 6180 km) that covers most of North America. The initial and boundary conditions are from three global climate models (GCMs) and one reanalysis dataset. The GCMs employed in this study are the Geophysical Fluid Dynamics Laboratory Earth System Model with Generalized Ocean Layer Dynamics component; the Community Climate System Model, version 4; and the Hadley Centre Global Environment Model, version 2-Earth System. The reanalysis data are from the National Centers for Environmental Prediction-U.S. Department of Energy Reanalysis II. We analyze the effects of bias-correcting the lateral boundary conditions and the effects of spectral nudging. We evaluate model performance for seven surface variables and four upper-atmospheric variables based on their climatology and extremes for seven subregions across the United States. The results indicate that a simulation's performance depends on both the location and the feature/variable being tested. We find that the use of bias correction and/or nudging is beneficial in many situations, but employing these when running the RCM is not always an improvement when compared to the reference data. The use of an ensemble mean and median leads to better performance in measuring the climatology, but is significantly biased for the extremes, showing much larger differences from the reference data than the individual GCM-driven simulations. This study provides a comprehensive evaluation of these historical model runs in order to inform decisions when making future projections.

  1. Risk assessments of regional climate change over Europe: generation of probabilistic ensembles and uncertainty assessment for EURO-CORDEX

    NASA Astrophysics Data System (ADS)

    Yuan, J.; Kopp, R. E.

    2017-12-01

Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges to assessing the risks of climate change are that the CMIP5 model runs, which drive the EURO-CORDEX downscaling runs, do not cover the full range of uncertainty of future projections, and that climate models may underestimate the probability of tail risks (i.e. extreme events). To overcome these difficulties, this study offers a viable avenue, in which a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11°) EURO-CORDEX database, which were selected by the PESETA (Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw GCM ensemble and the spread of the five EURO-CORDEX runs under RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to the administrative level and integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) relative to the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving, e.g. the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX are small in winter. In summer, the temperatures of EURO

  2. Revealing Risks in Adaptation Planning: expanding Uncertainty Treatment and dealing with Large Projection Ensembles during Planning Scenario development

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Wood, A.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2015-12-01

Adaptation planning assessments often rely on single methods for climate projection downscaling and hydrologic analysis, do not reveal uncertainties from the associated method choices, and thus likely produce overly confident decision-support information. Recent work by the authors has highlighted this issue by identifying strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic impacts. This work has shown that many of the methodological choices made can alter the magnitude, and even the sign, of the climate change signal. Such results motivate consideration of both sources of method uncertainty within an impacts assessment. Consequently, the authors have pursued development of improved downscaling techniques spanning a range of method classes (quasi-dynamical and circulation-based statistical methods) and developed approaches to better account for hydrologic analysis uncertainty (multi-model; regional parameter estimation under forcing uncertainty). This presentation summarizes progress in the development of these methods, as well as the implications of pursuing these developments. First, having access to these methods creates an opportunity to better reveal impacts uncertainty through multi-method ensembles, expanding on present-practice ensembles which are often based only on emissions scenarios and GCM choices. Second, such expansion of uncertainty treatment, combined with an ever-expanding wealth of global climate projection information, creates a challenge of how to use such a large ensemble for local adaptation planning. To address this challenge, the authors are evaluating methods for ensemble selection (considering the principles of fidelity, diversity and sensitivity) that are compatible with present-practice approaches for abstracting change scenarios from any "ensemble of opportunity". Early examples from this development will also be presented.

  3. Accounting for Global Climate Model Projection Uncertainty in Modern Statistical Downscaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johannesson, G

    2010-03-17

Future climate change has emerged as a national and a global security threat. To carry out the needed adaptation and mitigation steps, a quantification of the expected level of climate change is needed, both at the global and the regional scale; in the end, the impact of climate change is felt at the local/regional level. An important part of such climate change assessment is uncertainty quantification. Decision and policy makers are not only interested in 'best guesses' of expected climate change, but rather in probabilistic quantification (e.g., Rougier, 2007). For example, consider the following question: What is the probability that the average summer temperature will increase by at least 4 °C in region R if global CO2 emissions increase by P% from current levels by time T? It is a simple question, but one that remains very difficult to answer. Answering these kinds of questions is the focus of this effort. The uncertainty associated with future climate change can be attributed to three major factors: (1) uncertainty about future emissions of greenhouse gases (GHG); (2) given a future GHG emission scenario, what is its impact on the global climate? (3) given a particular evolution of the global climate, what does it mean for a particular location/region? In what follows, we assume a particular GHG emission scenario has been selected. Given the GHG emission scenario, the current batch of state-of-the-art global climate models (GCMs) is used to simulate future climate under this scenario, yielding an ensemble of future climate projections (which reflects, to some degree, our uncertainty in being able to simulate future climate given a particular GHG scenario). Due to the coarse-resolution nature of the GCM projections, they need to be spatially downscaled for regional impact assessments. To downscale a given GCM projection, two methods have emerged: dynamical downscaling and statistical (empirical) downscaling (SDS). Dynamic downscaling
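In the simplest plug-in approach, the probabilistic question posed above is answered by the (weighted) fraction of ensemble members exceeding the threshold. A crude sketch with invented projections (the effort described here aims well beyond this baseline):

```python
import numpy as np

def exceedance_probability(ensemble_changes, threshold, weights=None):
    """Empirical probability that a regional change exceeds a threshold.

    ensemble_changes : projected summer-mean warming (deg C), one per GCM
    weights          : optional model weights (e.g. skill-based), renormalised
    Returns the (weighted) fraction of members at or above `threshold`.
    """
    x = np.asarray(ensemble_changes, dtype=float)
    w = np.ones_like(x) if weights is None else np.asarray(weights, float)
    w = w / w.sum()
    return float(w[x >= threshold].sum())

# e.g. hypothetical warming projections from a 10-member multi-GCM ensemble
p = exceedance_probability([2.1, 3.0, 3.7, 4.2, 4.5, 2.8, 5.1, 3.9, 4.0, 3.3],
                           threshold=4.0)
```

This treats the ensemble as a random sample of equally credible futures, which is exactly the assumption the abstract flags as problematic: the member count is small and the ensemble spread may understate the true uncertainty, motivating more careful probabilistic treatment.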

  4. The role of ensemble post-processing for modeling the ensemble tail

    NASA Astrophysics Data System (ADS)

    Van De Vyver, Hans; Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2016-04-01

    . Soc. 134: 2051-2066. Buizza and Leutbecher, 2015: The forecast skill horizon, Q. J. R. Meteorol. Soc. 141: 3366-3382. Ferro, 2007: A probability model for verifying deterministic forecasts of extreme events. Weather and Forecasting 22 (5), 1089-1100. Friederichs, 2010: Statistical downscaling of extreme precipitation events using extreme value theory. Extremes 13, 109-132. Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q. J. R. Meteorol. Soc., 141: 807-818.

  5. Analyzing the future climate change of Upper Blue Nile River basin using statistical downscaling techniques

    NASA Astrophysics Data System (ADS)

    Fenta Mekonnen, Dagnenet; Disse, Markus

    2018-04-01

Climate change is becoming one of the most threatening issues for the world today in terms of its global context and its response to environmental and socioeconomic drivers. However, large uncertainties between different general circulation models (GCMs) and coarse spatial resolutions make it difficult to use the outputs of GCMs directly, especially for sustainable water management at the regional scale, which introduces the need for downscaling techniques using a multimodel approach. This study aims (i) to evaluate the comparative performance of two widely used statistical downscaling techniques, namely the Long Ashton Research Station Weather Generator (LARS-WG) and the Statistical Downscaling Model (SDSM), and (ii) to downscale future climate scenarios of precipitation, maximum temperature (Tmax) and minimum temperature (Tmin) of the Upper Blue Nile River basin at finer spatial and temporal scales to suit further hydrological impact studies. The calibration and validation results illustrate that both downscaling techniques (LARS-WG and SDSM) show comparable and good ability to simulate the current local climate variables. Further quantitative and qualitative comparative performance evaluation was done with equally weighted and varying weights of statistical indexes for precipitation only. The evaluation showed that SDSM using the CanESM2 CMIP5 GCM was able to reproduce more accurate long-term mean monthly precipitation, but LARS-WG performed best in capturing the extreme events and the distribution of daily precipitation over the whole data range. Six selected multimodel CMIP3 GCMs, namely HadCM3, GFDL-CM2.1, ECHAM5-OM, CCSM3, MRI-CGCM2.3.2 and CSIRO-MK3, were used for downscaling climate scenarios with the LARS-WG model. The ensemble mean of the six GCMs showed an increasing trend for precipitation, Tmax and Tmin.
The relative change in precipitation ranged from 1.0 to 14.4 % while the change for mean annual Tmax may increase from 0.4 to 4.3

  6. The effects of climate downscaling technique and observational data set on modeled ecological responses.

    PubMed

    Pourmokhtarian, Afshin; Driscoll, Charles T; Campbell, John L; Hayhoe, Katharine; Stoner, Anne M K

    2016-07-01

Assessments of future climate change impacts on ecosystems typically rely on multiple climate model projections, but often utilize only one downscaling approach trained on one set of observations. Here, we explore the extent to which modeled biogeochemical responses to changing climate are affected by the selection of the climate downscaling method and the training observations used at the montane landscape of the Hubbard Brook Experimental Forest, New Hampshire, USA. We evaluated three downscaling methods: the delta method (or change factor method), monthly quantile mapping (Bias Correction-Spatial Disaggregation, or BCSD), and daily quantile regression (Asynchronous Regional Regression Model, or ARRM). Additionally, we trained outputs from four atmosphere-ocean general circulation models (AOGCMs) (CCSM3, HadCM3, PCM, and GFDL-CM2.1), driven by higher (A1fi) and lower (B1) future emissions scenarios, on two sets of observations (1/8° resolution grid vs. individual weather station) to generate the high-resolution climate input for the forest biogeochemical model PnET-BGC (eight ensembles of six runs). The choice of downscaling approach and the spatial resolution of the observations used to train the downscaling model affected modeled soil moisture and streamflow, which in turn affected forest growth, net N mineralization, net soil nitrification, and stream chemistry. All three downscaling methods were highly sensitive to the observations used, resulting in projections that were significantly different between station-based and grid-based observations. The choice of downscaling method also slightly affected the results, though not as much as the choice of observations. Using spatially smoothed gridded observations and/or methods that do not resolve sub-monthly shifts in the distribution of temperature and/or precipitation can produce biased results in model applications run at greater temporal and/or spatial resolutions. These results underscore the importance of
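The delta (change-factor) method evaluated here can be sketched in a few lines: monthly GCM changes are added to observed temperatures and multiplied onto observed precipitation (toy series below; actual PnET-BGC inputs are of course richer):

```python
import numpy as np

def delta_method(obs_daily, obs_months, t_change, p_change):
    """Delta (change-factor) downscaling: a minimal sketch.

    obs_daily  : dict with 'tmax' and 'prcp' daily observation arrays
    obs_months : month (1-12) of each daily value
    t_change   : (12,) additive monthly temperature change (GCM future - historical)
    p_change   : (12,) multiplicative monthly precipitation change ratio
    Temperature shifts are added and precipitation ratios multiplied,
    month by month, onto the observed record.
    """
    m = np.asarray(obs_months) - 1
    return {
        "tmax": obs_daily["tmax"] + np.asarray(t_change)[m],
        "prcp": obs_daily["prcp"] * np.asarray(p_change)[m],
    }

months = np.repeat(np.arange(1, 13), 30)             # toy 360-day year
obs = {"tmax": np.full(360, 10.0), "prcp": np.full(360, 2.0)}
fut = delta_method(obs, months, t_change=np.full(12, 3.0),
                   p_change=np.full(12, 0.9))
```

Because the method only shifts and scales the observed record, it preserves the observed day-to-day variability exactly; this is also why, as noted above, it cannot capture sub-monthly shifts in the distributions, unlike the daily quantile-regression approach (ARRM).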

  7. AI-based (ANN and SVM) statistical downscaling methods for precipitation estimation under climate change scenarios

    NASA Astrophysics Data System (ADS)

    Mehrvand, Masoud; Baghanam, Aida Hosseini; Razzaghzadeh, Zahra; Nourani, Vahid

    2017-04-01

Since statistical downscaling methods are the most widely used models in hydrologic impact studies under climate change scenarios, nonlinear regression models known as Artificial Intelligence (AI)-based models, such as the Artificial Neural Network (ANN) and the Support Vector Machine (SVM), have been used to spatially downscale the precipitation outputs of Global Climate Models (GCMs). The study was carried out using GCM and station data over GCM grid points located around the Peace-Tampa Bay watershed weather stations. Before downscaling with the AI-based models, correlation coefficients were computed between a few selected large-scale predictor variables and the local-scale predictands to select the most effective predictors. The selected predictors were then assessed considering the grid location for the site in question. To increase the accuracy of the AI-based downscaling models, pre-processing was applied to the precipitation time series. In this way, the precipitation data derived from the various GCMs were analyzed thoroughly to find the highest correlation coefficient between GCM-based historical data and station precipitation data. Both GCM and station precipitation time series were assessed by comparing means and variances over specific intervals. Results indicated that there is a similar trend between GCM and station precipitation data; however, the station data form a non-stationary time series while the GCM data do not. Finally, the AI-based downscaling models were applied to several GCMs with the selected predictors, targeting the local precipitation time series as the predictand. The outputs of this step were used to produce multiple ensembles of downscaled AI-based models.
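The predictor-screening step described above can be sketched as ranking candidate large-scale predictors by their absolute correlation with the local predictand (toy series; the predictor names here are illustrative, not the study's actual variable set):

```python
import numpy as np

def select_predictors(predictors, predictand, k=3):
    """Rank candidate large-scale predictors by |Pearson r| with the predictand.

    predictors : dict name -> (n_time,) series (e.g. SLP, Z500, humidity)
    predictand : (n_time,) local precipitation series
    Returns the k predictor names with the highest absolute correlation.
    """
    scores = {name: abs(np.corrcoef(series, predictand)[0, 1])
              for name, series in predictors.items()}
    return sorted(scores, key=scores.get, reverse=True)[:k]

rng = np.random.default_rng(3)
y = rng.normal(size=500)                             # toy local predictand
cands = {
    "slp":  y + rng.normal(scale=0.5, size=500),     # strongly related
    "z500": y + rng.normal(scale=2.0, size=500),     # weakly related
    "hus":  rng.normal(size=500),                    # unrelated noise
}
best = select_predictors(cands, y, k=2)
```

The retained predictors then form the input layer of the ANN or the feature vector of the SVM; screening first keeps the nonlinear models small and reduces overfitting.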

  8. Methodology for Air Quality Forecast Downscaling from Regional- to Street-Scale

    NASA Astrophysics Data System (ADS)

    Baklanov, Alexander; Nuterman, Roman; Mahura, Alexander; Amstrup, Bjarne; Hansen Saas, Bent; Havskov Sørensen, Jens; Lorenzen, Thomas; Weismann, Jakob

    2010-05-01

The most serious air pollution events occur in cities, where high population density and air pollution, e.g. from vehicles, are combined. The pollutants can lead to serious human health problems, including asthma, irritation of the lungs, bronchitis, pneumonia, decreased resistance to respiratory infections, and premature death. In particular, air pollution is associated with increases in cardiovascular disease and lung cancer. In 2000 the WHO estimated that between 2.5% and 11% of total annual deaths are caused by exposure to air pollution. However, European-scale air quality models are not suited for local forecasts, as their grid cell is typically of the order of 5 to 10 km and they generally lack a detailed representation of urban effects. Two suites are used in the framework of the EC FP7 project MACC (Monitoring of Atmosphere Composition and Climate) to demonstrate how downscaling from the European MACC ensemble to local-scale air quality forecasts will be carried out: one illustrates capabilities for the city of Copenhagen (Denmark); the second focuses on the city of Bucharest (Romania). This work is devoted to the first suite, in which methodological aspects of downscaling from the regional (European/Denmark) to the urban scale (Copenhagen), and from the urban down to the street scale, are addressed. The first results of downscaling according to the proposed methodology are presented. The potential for downscaling of European air quality forecasts by operating urban and street-level forecast models is evaluated. This will bring strong support for continuous improvement of the regional forecast modelling systems for air quality in Europe, and underline clear perspectives for the future regional air quality core and downstream services for end-users. At the end of the MACC project, requirements on "how to do" downscaling of European air-quality forecasts to the city and street levels with different approaches will be formulated.

  9. High resolution statistical downscaling of the EUROSIP seasonal prediction. Application for southeastern Romania

    NASA Astrophysics Data System (ADS)

    Busuioc, Aristita; Dumitrescu, Alexandru; Dumitrache, Rodica; Iriza, Amalia

    2017-04-01

Seasonal climate forecasts in Europe are currently issued by the European Centre for Medium-Range Weather Forecasts (ECMWF) in the form of multi-model ensemble predictions available within the "EUROSIP" system. Different statistical techniques to calibrate, downscale and combine the EUROSIP direct model output are used to optimize the quality of the final probabilistic forecasts. In this study, a statistical downscaling model (SDM) based on canonical correlation analysis (CCA) is used to downscale the EUROSIP seasonal forecast to a spatial resolution of 1 km x 1 km over the Movila farm, located in southeastern Romania. This application is carried out in the framework of the H2020 MOSES project (http://www.moses-project.eu). The combination of monthly standardized values of three climate variables (maximum/minimum temperature-Tmax/Tmin, total precipitation-Prec) is used as predictand, while combinations of various large-scale predictors are tested in terms of their availability as outputs of the EUROSIP probabilistic seasonal forecasting system (sea level pressure, temperature at 850 hPa and geopotential height at 500 hPa). The predictors are taken from the ECMWF system considering 15 members of the ensemble, for which hindcasts from 1991 to the present are available. The model was calibrated over the period 1991-2014 and predictions for the summers of 2015 and 2016 were produced. The calibration was made for the ensemble average as well as for each ensemble member. The model was developed for each lead time: one month anticipation for June, two months anticipation for July and three months anticipation for August.
The main conclusions from these preliminary results are: best predictions (in terms of the anomaly sign) for Tmax (July-2 months anticipation, August-3 months anticipation) for both years (2015, 2016); for Tmin, good predictions only for August (3 months anticipation) for both years; for precipitation, good predictions for July (2 months anticipation) in 2015 and

  10. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Roberts, J. Brent; Bosilovich, Michael; Lyon, Bradfield

    2013-01-01

The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA/USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue-based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.
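The analogue-based approach mentioned above can be sketched as picking the historical season whose large-scale anomaly pattern best matches the forecast anomaly and reusing its local observations (toy data; operational analogue schemes typically blend several neighbours rather than a single best match):

```python
import numpy as np

def analogue_downscale(fcst_anom, hist_anoms, hist_local):
    """Analogue downscaling of a seasonal anomaly forecast (a sketch).

    fcst_anom  : (n_grid,) forecast large-scale anomaly pattern for the season
    hist_anoms : (n_years, n_grid) historical large-scale anomaly patterns
    hist_local : (n_years, ...) matching local observations (e.g. daily rain)
    Returns the local observations from the historical season whose
    large-scale pattern is closest (smallest RMS difference) to the forecast.
    """
    d = np.sqrt(((hist_anoms - fcst_anom) ** 2).mean(axis=1))
    return hist_local[np.argmin(d)]

rng = np.random.default_rng(4)
hist_anoms = rng.normal(size=(30, 100))          # 30 past seasons, 100 grid pts
hist_local = rng.gamma(2.0, 1.5, size=(30, 90))  # 90 days of local rainfall
fcst = hist_anoms[7] + rng.normal(scale=0.05, size=100)  # pattern near year 7
local_seq = analogue_downscale(fcst, hist_anoms, hist_local)
```

Because the returned daily sequence is an actual observed season, its wet/dry intermittency and spatial covariances are realistic by construction, which is precisely the behaviour the evaluation above sets out to quantify.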

  11. Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa

    NASA Technical Reports Server (NTRS)

    Roberts, J. Brent; Robertson, Franklin R.; Bosilovich, Michael; Lyon, Bradfield; Funk, Chris

    2013-01-01

    The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue-based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.

  12. Regional Climate Models Downscaling in the Alpine Area with Multimodel SuperEnsemble

    NASA Astrophysics Data System (ADS)

    Cane, D.; Barbarino, S.; Renier, L.; Ronchi, C.

    2012-04-01

    The climatic scenarios show a strong signal of warming in the Alpine area already for the mid XXI century. The climate simulations, however, even when obtained with Regional Climate Models (RCMs), are affected by strong errors when compared with observations in the control period, due to their difficulties in representing the complex orography of the Alps and limitations in their physical parametrization. In this work we use a selection of RCM runs from the ENSEMBLES project, carefully chosen in order to maximise the variety of leading Global Climate Models and of the RCMs themselves, calculated on the SRES scenario A1B. The reference observations for the Greater Alpine Area are extracted from the European dataset E-OBS produced by the project ENSEMBLES with an available resolution of 25 km. For the study area of Piemonte, daily temperature and precipitation observations (1957-present) were carefully gridded on a 14-km grid over the Piemonte Region with an Optimal Interpolation technique. We applied the Multimodel SuperEnsemble technique to temperature fields, reducing the high biases of the RCM temperature fields compared to observations in the control period. We also propose the first application to RCMs of a brand new probabilistic Multimodel SuperEnsemble Dressing technique to estimate precipitation fields, already applied successfully to weather forecast models, with a careful description of precipitation Probability Density Functions conditioned on the model outputs. This technique reduces the strong precipitation overestimation by RCMs over the alpine chain and reproduces the monthly behaviour of observed precipitation in the control period far better than the direct model outputs.

  13. Projection of spatial and temporal changes of rainfall in Sarawak of Borneo Island using statistical downscaling of CMIP5 models

    NASA Astrophysics Data System (ADS)

    Sa'adi, Zulfaqar; Shahid, Shamsuddin; Chung, Eun-Sung; Ismail, Tarmizi bin

    2017-11-01

    This study assesses possible changes in the rainfall patterns of Sarawak in Borneo Island due to climate change through statistical downscaling of General Circulation Model (GCM) projections. Available in-situ observed rainfall data were used to downscale future rainfall from an ensemble of 20 GCMs of the Coupled Model Intercomparison Project phase 5 (CMIP5) for four Representative Concentration Pathway (RCP) scenarios, namely RCP2.6, RCP4.5, RCP6.0 and RCP8.5. Model Output Statistics (MOS) based downscaling models were developed using two data mining approaches known as Random Forest (RF) and Support Vector Machine (SVM). The SVM was found to downscale all GCMs with a normalized mean square error (NMSE) of 48.2-75.2 and a skill score (SS) of 0.94-0.98 during validation. The results show that projected annual rainfall increases in some regions and catchments and decreases in others, owing to the influence of the monsoon season affecting the coast of Sarawak. The ensemble mean of the GCM projections reveals increases in mean annual precipitation at 33 stations, at rates of 0.1% to 19.6%, and a decrease at one station, at rates of -7.9% to -3.1%, under all RCP scenarios. The remaining 15 stations show no consistent trend, varying between -5.6% and 5.2%, but mainly a decrease in rainfall during the first period (2010-2039) followed by an increase during 2070-2099.
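
The MOS workflow described above, fitting a statistical transfer function from GCM output to station rainfall and scoring it with NMSE, can be illustrated compactly. Below is a minimal numpy sketch that substitutes ordinary linear regression for the paper's Random Forest and SVM models; the GCM and station series, the calibration/validation split, and all coefficients are synthetic stand-ins, not values from the study.

```python
import numpy as np

def fit_mos(gcm, obs):
    """Least-squares linear MOS transfer function: obs ~ a * gcm + b."""
    A = np.vstack([gcm, np.ones_like(gcm)]).T
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    return coef  # (a, b)

def nmse(pred, obs):
    """Normalized mean square error: MSE divided by the variance of obs."""
    return np.mean((pred - obs) ** 2) / np.var(obs)

rng = np.random.default_rng(0)
gcm = rng.gamma(2.0, 5.0, size=200)             # hypothetical raw GCM rainfall
obs = 0.7 * gcm + 3.0 + rng.normal(0, 1, 200)   # hypothetical station series

a, b = fit_mos(gcm[:150], obs[:150])            # calibrate on the first 150 days
pred = a * gcm[150:] + b                        # validate on the remainder
```

The corrected series should show a much lower NMSE against the station data than the raw GCM output, which is the sense in which the paper's validation scores are read.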

  14. An intercomparison of a large ensemble of statistical downscaling methods for Europe: Overall results from the VALUE perfect predictor cross-validation experiment

    NASA Astrophysics Data System (ADS)

    Gutiérrez, Jose Manuel; Maraun, Douglas; Widmann, Martin; Huth, Radan; Hertig, Elke; Benestad, Rasmus; Roessler, Ole; Wibig, Joanna; Wilcke, Renate; Kotlarski, Sven

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. This framework is based on a user-focused validation tree, guiding the selection of relevant validation indices and performance measures for different aspects of the validation (marginal, temporal, spatial, multi-variable). Moreover, several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur (assessment of intrinsic performance, effect of errors inherited from the global models, effect of non-stationarity, etc.). The list of downscaling experiments includes 1) cross-validation with perfect predictors, 2) GCM predictors (aligned with the EURO-CORDEX experiment) and 3) pseudo-reality predictors (see Maraun et al. 2015, Earth's Future, 3, doi:10.1002/2014EF000259, for more details). The results of these experiments are gathered, validated and publicly distributed through the VALUE validation portal, allowing for a comprehensive community-open downscaling intercomparison study. In this contribution we describe the overall results from Experiment 1, consisting of a Europe-wide 5-fold cross-validation (with consecutive 6-year periods from 1979 to 2008) using predictors from ERA-Interim to downscale precipitation and temperatures (minimum and maximum) over a set of 86 ECA&D stations representative of the main geographical and climatic regions in Europe. As a result of the open call for contributions to this experiment (closed in Dec. 2015), over 40 methods representative of the main approaches (MOS and Perfect Prognosis, PP) and techniques (linear scaling, quantile mapping, analogs, weather typing, linear and generalized regression, weather generators, etc.) were submitted, including information both data
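
The 5-fold cross-validation design of Experiment 1, with consecutive 6-year test blocks spanning 1979-2008, can be sketched directly. This is a minimal numpy illustration of the fold bookkeeping only; the downscaling model itself is omitted.

```python
import numpy as np

years = np.arange(1979, 2009)        # 30 years, 1979-2008
folds = np.split(years, 5)           # five consecutive 6-year blocks

for k, test_years in enumerate(folds):
    # calibrate the statistical method on train_years, validate on test_years
    train_years = np.setdiff1d(years, test_years)
    print(k, test_years[0], test_years[-1], train_years.size)
```

Blocked (rather than random) folds keep serially correlated years out of both the training and test sets at once, which is why the experiment uses consecutive periods.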

  15. High resolution probabilistic precipitation forecast over Spain combining the statistical downscaling tool PROMETEO and the AEMET short range EPS system (AEMET/SREPS)

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Santos, C.; Garcia-Moya, J. A.; Gutierrez, J. M.; Orfila, B.

    2009-04-01

    The Short-Range Ensemble Prediction System (SREPS) is a multi-LAM (UM, HIRLAM, MM5, LM and HRM) multi analysis/boundary conditions (ECMWF, UKMetOffice, DWD and GFS) system run twice a day by AEMET (72 hours lead time) over a European domain, with a total of 5 (LAMs) x 4 (GCMs) = 20 members. One of the main goals of this project is to analyze the impact of models and boundary conditions on short-range high-resolution precipitation forecasts. A previous validation of this method has been done considering a set of climate networks in Spain, France and Germany, by interpolating the prediction to the gauge locations (SREPS, 2008). In this work we compare these results with those obtained by using a statistical downscaling method to post-process the global predictions, obtaining an "advanced interpolation" for the local precipitation using climate network precipitation observations. In particular, we apply the PROMETEO downscaling system based on analogs and compare the SREPS ensemble of 20 members with the PROMETEO statistical ensemble of 5 (analog ensemble) x 4 (GCMs) = 20 members. Moreover, we will also compare the performance of a combined approach post-processing the SREPS outputs using the PROMETEO system. References: SREPS 2008. 2008 EWGLAM-SRNWP Meeting (http://www.aemet.es/documentos/va/divulgacion/conferencias/prediccion/Ewglam/PRED_CSantos.pdf)

  16. Regional climate models downscaling in the Alpine area with Multimodel SuperEnsemble

    NASA Astrophysics Data System (ADS)

    Cane, D.; Barbarino, S.; Renier, L. A.; Ronchi, C.

    2012-08-01

    The climatic scenarios show a strong signal of warming in the Alpine area already for the mid XXI century. The climate simulations, however, even when obtained with Regional Climate Models (RCMs), are affected by strong errors when compared with observations, due to their difficulties in representing the complex orography of the Alps and limitations in their physical parametrization. The aim of this work is therefore to reduce these model biases using a specific post-processing statistical technique to obtain a more suitable projection of climate change scenarios in the Alpine area. For our purposes we use a selection of RCM runs from the ENSEMBLES project, carefully chosen in order to maximise the variety of leading Global Climate Models and of the RCMs themselves, calculated on the SRES scenario A1B. The reference observations for the Greater Alpine Area are extracted from the European dataset E-OBS produced by the project ENSEMBLES with an available resolution of 25 km. For the study area of Piedmont, daily temperature and precipitation observations (1957-present) were carefully gridded on a 14-km grid over the Piedmont Region with an Optimal Interpolation technique. We then applied the Multimodel SuperEnsemble technique to temperature fields, reducing the high biases of the RCM temperature fields compared to observations in the control period. We also propose the first application to RCMs of a brand new probabilistic Multimodel SuperEnsemble Dressing technique to estimate precipitation fields, already applied successfully to weather forecast models, with a careful description of precipitation Probability Density Functions conditioned on the model outputs. This technique reduces the strong precipitation overestimation by RCMs over the alpine chain and reproduces well the monthly behaviour of precipitation in the control period.
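
The Multimodel SuperEnsemble combines member models with regression weights trained against observations in a control period and then applies those weights to model anomalies afterwards. The following toy numpy sketch shows the principle; the three "RCMs", their biases, and the 300-step training period are synthetic assumptions, not the configuration used in the study.

```python
import numpy as np

def superensemble_weights(models, obs):
    """Control-period regression weights for a multimodel superensemble:
    obs ~ mean(obs) + sum_i w_i * (model_i - mean(model_i))."""
    anoms = models - models.mean(axis=1, keepdims=True)
    w, *_ = np.linalg.lstsq(anoms.T, obs - obs.mean(), rcond=None)
    return w

rng = np.random.default_rng(5)
truth = rng.normal(10, 3, 500)                        # "observed" temperature
models = np.vstack([truth + rng.normal(b, s, 500)     # biased, noisy members
                    for b, s in [(2, 1.0), (-3, 2.0), (5, 0.5)]])

w = superensemble_weights(models[:, :300], truth[:300])   # train period
anom = models[:, 300:] - models[:, :300].mean(axis=1, keepdims=True)
pred = truth[:300].mean() + w @ anom                      # application period
```

Because the weights are fitted to observations, the superensemble both removes the individual member biases and favors the less noisy members, which is how the technique reduces the high RCM temperature biases in the control period.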

  17. Downscaled rainfall projections in south Florida using self-organizing maps.

    PubMed

    Sinha, Palash; Mann, Michael E; Fuentes, Jose D; Mejia, Alfonso; Ning, Liang; Sun, Weiyi; He, Tao; Obeysekera, Jayantha

    2018-04-20

    We make future projections of seasonal precipitation characteristics in southern Florida using a statistical downscaling approach based on Self-Organizing Maps (SOMs). Our approach is applied separately to each three-month season: September-November; December-February; March-May; and June-August. We make use of 19 different simulations from the Coupled Model Intercomparison Project, phase 5 (CMIP5) and generate an ensemble of 1500 independent daily precipitation surrogates for each model simulation, yielding a grand ensemble of 28,500 total realizations for each season. The center and spread (25th and 75th percentiles) of this distribution are used to characterize most likely scenarios and their associated uncertainties. This approach is applied to 30-year windows of daily mean precipitation for both the CMIP5 historical simulations (1976-2005) and the CMIP5 future (RCP 4.5) projections. For the latter case, we examine both the "near future" (2021-2050) and "far future" (2071-2100) periods for three scenarios (RCP2.6, RCP4.5, and RCP8.5).
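
A self-organizing map clusters daily fields onto a small set of ordered nodes, which can then anchor precipitation surrogates. The following is a minimal numpy sketch of 1-D SOM training on toy data with two artificial "regimes"; the node count, learning-rate schedule, and synthetic clusters are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def train_som(data, n_nodes=4, epochs=50, lr0=0.5, seed=0):
    """Train a tiny 1-D self-organizing map on daily state vectors."""
    rng = np.random.default_rng(seed)
    nodes = rng.normal(size=(n_nodes, data.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                   # decaying learning rate
        radius = max(1.0, n_nodes / 2 * (1 - epoch / epochs))
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(nodes - x, axis=1))  # best-matching unit
            dist = np.abs(np.arange(n_nodes) - bmu)             # grid distance
            h = np.exp(-(dist / radius) ** 2)                   # neighborhood weights
            nodes += lr * h[:, None] * (x - nodes)              # pull toward sample
    return nodes

# two hypothetical "weather regimes": wet and dry days in a 3-variable space
rng = np.random.default_rng(1)
wet = rng.normal([5.0, 1.0, 2.0], 0.3, size=(100, 3))
dry = rng.normal([0.5, -1.0, -2.0], 0.3, size=(100, 3))
som = train_som(np.vstack([wet, dry]))
```

After training, distinct regimes map to distinct best-matching units, and daily surrogates can be drawn conditionally on the node each simulated day falls into.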

  18. Dynamical Downscaling of Global Circulation Models With the Weather Research and Forecast Model in the Northern Great Plains

    NASA Astrophysics Data System (ADS)

    Burtch, D.; Mullendore, G. L.; Kennedy, A. D.; Simms, M.; Kirilenko, A.; Coburn, J.

    2015-12-01

    Understanding the impacts of global climate change on regional scales is crucial for accurate decision-making by state and local governments. This is especially true in North Dakota, where climate change can have significant consequences on agriculture, its traditionally strongest economic sector. This region of the country shows high variability in precipitation, especially in the summer months, and so the focus of this study is on warm-season processes over decadal time scales. The Weather Research and Forecast (WRF) model is used to dynamically downscale two Global Circulation Models (GCMs) from the CMIP5 ensemble in order to determine the microphysical parameterization and nudging techniques (spectral or analysis) best suited for this region. The downscaled domain includes the entirety of North Dakota at a horizontal resolution of 5 km. In addition, smaller domains of 1 km horizontal resolution are centered over regions of focused hydrological importance. The dynamically downscaled simulations are compared with both gridded observational data and statistically downscaled data to evaluate the performance of the simulations. Preliminary results have shown a marked difference between the two downscaled GCMs in terms of temperature and precipitation bias. The choice of microphysical parameterization has not been shown to create any significant differences in the temperature fields. However, the precipitation fields do appear to be most affected by the microphysical parameterization, regardless of the choice of GCM. Implications for the unique water resource challenges faced in this region will also be discussed.

  19. Downscaling of RCM outputs for representative catchments in the Mediterranean region, for the 1951-2100 time-frame

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Marrocu, Marino; Pusceddu, Gabriella; Langousis, Andreas; Mascaro, Giuseppe; Caroletti, Giulio

    2013-04-01

    Within the activities of the EU FP7 CLIMB project (www.climb-fp7.eu), we developed downscaling procedures to reliably assess climate forcing at hydrologically relevant scales, and applied them to six representative hydrological basins located in the Mediterranean region: Riu Mannu and Noce in Italy, Chiba in Tunisia, Kocaeli in Turkey, Thau in France, and Gaza in Palestine. As a first step towards this aim, we used daily precipitation and temperature data from the gridded E-OBS project (www.ecad.eu/dailydata), as reference fields, to rank 14 Regional Climate Model (RCM) outputs from the ENSEMBLES project (http://ensembles-eu.metoffice.com). The four best performing model outputs were selected, with the additional constraint of maintaining 2 outputs obtained from running different RCMs driven by the same GCM, and 2 runs from the same RCM driven by different GCMs. For these four RCM-GCM model combinations, a set of downscaling techniques were developed and applied, for the period 1951-2100, to variables used in hydrological modeling (i.e. precipitation; mean, maximum and minimum daily temperatures; direct solar radiation, relative humidity, magnitude and direction of surface winds). The quality of the final products is discussed, together with the results obtained after applying a bias reduction procedure to daily temperature and precipitation fields.

  20. Quantification of downscaled precipitation uncertainties via Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nury, A. H.; Sharma, A.; Marshall, L. A.

    2017-12-01

    Prediction of precipitation from global climate model (GCM) outputs remains critical to decision-making in water-stressed regions. In this regard, downscaling of GCM output has been a useful tool for analysing future hydro-climatological states. Several approaches have been developed for precipitation downscaling, using either dynamical or statistical methods. Frequently, outputs from dynamical downscaling are not readily transferable across regions owing to significant methodological and computational difficulties. Statistical downscaling approaches provide a flexible and efficient alternative, producing hydro-climatological outputs across multiple temporal and spatial scales in many locations. However, these approaches are subject to significant uncertainty, arising from uncertainty in the downscaled model parameters and from the use of different reanalysis products for inferring appropriate model parameters. Consequently, these uncertainties affect simulation performance at the catchment scale. This study develops a Bayesian framework for modelling downscaled daily precipitation from GCM outputs, and characterizes downscaling uncertainties by evaluating reanalysis datasets against observed rainfall data over Australia. A consistent technique for quantifying downscaling uncertainties by means of this Bayesian downscaling framework is proposed. The results suggest that there are differences in downscaled precipitation occurrences and extremes.
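
A Bayesian treatment replaces point estimates of the downscaling parameters with posterior distributions, from which credible intervals follow directly. The sketch below uses a random-walk Metropolis sampler for a toy linear downscaling relation; the predictor, the "station" series, and the prior and proposal choices are illustrative assumptions, not the study's model.

```python
import numpy as np

def log_post(theta, x, y):
    """Log-posterior for y = a*x + b + N(0, sigma), flat priors on (a, b, log sigma)."""
    a, b, log_sigma = theta
    sigma = np.exp(log_sigma)
    resid = y - (a * x + b)
    return -len(y) * log_sigma - 0.5 * np.sum(resid ** 2) / sigma ** 2

rng = np.random.default_rng(2)
x = rng.normal(size=300)                          # hypothetical reanalysis predictor
y = 1.5 * x + 0.5 + rng.normal(0, 0.8, 300)       # hypothetical station response

theta, lp, chain = np.zeros(3), None, []
lp = log_post(theta, x, y)
for _ in range(5000):                             # random-walk Metropolis
    prop = theta + rng.normal(0, 0.05, 3)
    lp_prop = log_post(prop, x, y)
    if np.log(rng.uniform()) < lp_prop - lp:      # accept uphill / sometimes downhill
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[2500:]                    # drop burn-in
a_lo, a_hi = np.percentile(chain[:, 0], [2.5, 97.5])
```

The width of the interval (a_lo, a_hi) is the parameter-uncertainty component the abstract refers to; repeating the fit with different reanalysis predictors would expose the second, dataset-driven component.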

  1. Dynamical downscaling of regional climate over eastern China using RSM with multiple physics scheme ensembles

    NASA Astrophysics Data System (ADS)

    Peishu, Zong; Jianping, Tang; Shuyu, Wang; Lingyun, Xie; Jianwei, Yu; Yunqian, Zhu; Xiaorui, Niu; Chao, Li

    2017-08-01

    The parameterization of physical processes is one of the critical elements in properly simulating the regional climate over eastern China. Detailed analyses of the effect of physical parameterization schemes on regional climate simulation are therefore essential to provide more reliable regional climate change information. In this paper, we evaluate the 25-year (1983-2007) summer monsoon climate characteristics of precipitation and surface air temperature simulated by the regional spectral model (RSM) with different physical schemes. The ensemble results obtained with the reliability ensemble averaging (REA) method are also assessed. The results show that the RSM has the capacity to reproduce the spatial patterns, variations, and temporal tendency of surface air temperature and precipitation over eastern China, and that it tends to predict the climatological characteristics best over the Yangtze River basin and South China. The impact of different physical schemes on the RSM simulations is also investigated. Generally, the CLD3 cloud water prediction scheme tends to produce larger precipitation because of its overestimation of the low-level moisture. The systematic biases derived from the KF2 cumulus scheme are larger than those from the RAS scheme. The scale-selective bias correction (SSBC) method improves the simulation of the temporal and spatial characteristics of surface air temperature and precipitation and improves the simulated circulation. The REA ensemble results show significant improvement in the simulated temperature and precipitation distributions, with much higher correlation coefficients and lower root mean square errors. The REA result of selected experiments is better than that of non-selected experiments, indicating the necessity of choosing better samples for ensemble averaging.

  2. Climate Downscaling over Nordeste, Brazil, Using the NCEP RSM97.

    NASA Astrophysics Data System (ADS)

    Sun, Liqiang; Ferran Moncunill, David; Li, Huilan; Divino Moura, Antonio; de Assis de Souza Filho, Francisco

    2005-02-01

    The NCEP Regional Spectral Model (RSM), with horizontal resolution of 60 km, was used to downscale the ECHAM4.5 AGCM (T42) simulations forced with observed SSTs over northeast Brazil. An ensemble of 10 runs for the period January-June 1971-2000 was used in this study. The RSM can resolve the spatial patterns of observed seasonal precipitation and capture the interannual variability of observed seasonal precipitation as well. The AGCM bias in displacement of the Atlantic ITCZ is partially corrected in the RSM. The RSM probability distribution function of seasonal precipitation anomalies is in better agreement with observations than that of the driving AGCM. Good potential prediction skills are demonstrated by the RSM in predicting the interannual variability of regional seasonal precipitation. The RSM can also capture the interannual variability of observed precipitation at intraseasonal time scales, such as precipitation intensity distribution and dry spells. A drought index and a flooding index were adopted to indicate the severity of drought and flooding conditions, and their interannual variability was reproduced by the RSM. The overall RSM performance in the downscaled climate of the ECHAM4.5 AGCM is satisfactory over Nordeste. The primary deficiency is a systematic dry bias for precipitation simulation.

  3. Climate change effects on wildland fire risk in the Northeastern and Great Lakes states predicted by a downscaled multi-model ensemble

    NASA Astrophysics Data System (ADS)

    Kerr, Gaige Hunter; DeGaetano, Arthur T.; Stoof, Cathelijne R.; Ward, Daniel

    2018-01-01

    This study is among the first to investigate wildland fire risk in the Northeastern and the Great Lakes states under a changing climate. We use a multi-model ensemble (MME) of regional climate models from the Coordinated Regional Downscaling Experiment (CORDEX) together with the Canadian Forest Fire Weather Index System (CFFWIS) to understand changes in wildland fire risk through differences between historical simulations and future projections. Our results are relatively homogeneous across the focus region and indicate modest increases in the magnitude of fire weather indices (FWIs) during northern hemisphere summer. The most pronounced changes occur in the date of the initialization of CFFWIS and the peak of the wildland fire season, which in the future trend earlier in the year, and in the significant increases in the length of high-risk episodes, defined by the number of consecutive days with FWIs above the current 95th percentile. Further analyses show that these changes are most closely linked to expected changes in the focus region's temperature and precipitation. These findings relate to the current understanding of particulate matter vis-à-vis wildfires and have implications for human health and for local and regional changes in radiative forcing. As current fire management strategies could be challenged by increasing wildland fire risk, fire management agencies could adopt new strategies to improve awareness, prevention, and resilience and to mitigate potential impacts to critical infrastructure and populations.
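
The high-risk-episode metric above, the length of runs of consecutive days with FWIs above the current 95th percentile, is straightforward to compute. A minimal sketch (the daily FWI values and the threshold are hypothetical):

```python
import numpy as np

def longest_high_risk_run(fwi, threshold):
    """Length of the longest run of consecutive days with FWI above threshold."""
    above = np.asarray(fwi) > threshold
    best = run = 0
    for day in above:
        run = run + 1 if day else 0   # extend the current run or reset it
        best = max(best, run)
    return best

fwi = [3, 9, 12, 15, 14, 2, 11, 13, 1]   # hypothetical daily fire weather index
p95 = 10                                  # e.g. current-climate 95th percentile
print(longest_high_risk_run(fwi, p95))    # → 3 (the days with FWI 12, 15, 14)
```

Comparing this statistic between the historical and projection periods, with the threshold fixed at its current-climate value, yields the episode-length changes reported in the study.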

  4. A Novel approach for monitoring cyanobacterial blooms using an ensemble based system from MODIS imagery downscaled to 250 metres spatial resolution

    NASA Astrophysics Data System (ADS)

    El Alem, A.; Chokmani, K.; Laurion, I.; El-Adlouni, S. E.

    2014-12-01

    Because inland freshwaters are sensitive to harmful algal bloom (HAB) development and the coverage of standard monitoring programs is limited, remote sensing data have become increasingly used for monitoring HAB extension. Usually, HAB monitoring using remote sensing data is based on empirical and semi-empirical models. Developing such models requires a great number of continuous in situ measurements to reach an acceptable accuracy. However, ministries and water management organizations often use two thresholds, established by the World Health Organization, to determine water quality. Consequently, the available data are ordinal ("semi-qualitative") and mostly unexploited. Using such databases with remote sensing data and statistical classification algorithms can produce hazard management maps linked to the presence of cyanobacteria. Unlike standard classification algorithms, which are generally unstable, classifiers based on ensemble systems are more general and stable. In the present study, an ensemble-based classifier was developed and compared to a standard classification method called CART (Classification and Regression Tree) in a context of HAB monitoring in freshwaters using MODIS images downscaled to 250 m spatial resolution and ordinal in situ data. Calibration and validation data on cyanobacteria densities were collected by the Ministère du Développement durable, de l'Environnement et de la Lutte contre les changements climatiques on 22 water bodies between 2000 and 2010. These data comprise three density classes: waters poorly (< 20,000 cells mL-1), moderately (20,000-100,000 cells mL-1), and highly (> 100,000 cells mL-1) loaded with cyanobacteria. The results highlight that these inland waters exhibit distinct spectral responses, allowing them to be classified into the three above classes for water quality monitoring. On the other hand, even if the accuracy (Kappa-index = 0.86) of the proposed approach is relatively lower
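
The reported Kappa-index measures agreement beyond chance between predicted and observed density classes. A minimal numpy sketch of its computation on hypothetical three-class labels (not the study's data):

```python
import numpy as np

def cohens_kappa(y_true, y_pred, n_classes=3):
    """Cohen's kappa: agreement beyond chance between two class labelings."""
    cm = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                               # confusion matrix
    n = cm.sum()
    po = np.trace(cm) / n                           # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2   # chance agreement
    return (po - pe) / (1 - pe)

# hypothetical labels: 0 = low, 1 = moderate, 2 = high cyanobacteria density
y_true = [0, 0, 0, 1, 1, 1, 2, 2, 2, 0]
y_pred = [0, 0, 1, 1, 1, 1, 2, 2, 2, 0]
print(round(cohens_kappa(y_true, y_pred), 2))       # → 0.85
```

Kappa values near 1 indicate agreement well beyond what the class frequencies alone would produce, which is why a value of 0.86 is considered strong for a three-class problem.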

  5. Climate change effects on extreme flows of water supply area in Istanbul: utility of regional climate models and downscaling method.

    PubMed

    Kara, Fatih; Yucel, Ismail

    2015-09-01

    This study investigates the climate change impact on changes in mean and extreme flows under current and future climate conditions in the Omerli Basin of Istanbul, Turkey. The 15 regional climate model outputs from the EU-ENSEMBLES project and a downscaling method based on local implications from geophysical variables were used for the comparative analyses. An automated calibration algorithm is used to optimize the parameters of the Hydrologiska Byråns Vattenbalansavdelning (HBV) model for the study catchment using observed daily temperature and precipitation. The calibrated HBV model was implemented to simulate daily flows using precipitation and temperature data from climate models, with and without the downscaling method, for reference (1960-1990) and scenario (2071-2100) periods. Flood indices were derived from daily flows, and their changes throughout the four seasons and the year were evaluated by comparing their values derived from simulations corresponding to the current and future climate. All climate models strongly underestimate precipitation, while downscaling improves this underestimation, particularly for extreme events. Depending on the precipitation input from climate models with and without downscaling, the HBV model also significantly underestimates daily mean and extreme flows in all seasons. However, this underestimation is markedly improved for all seasons, especially for spring and winter, through the use of downscaled inputs. Changes in extreme flows from the reference to the future period increase for winter and spring and decrease for the fall and summer seasons. These changes are more significant with downscaled inputs. With respect to the current time, higher flow magnitudes for given return periods will be experienced in the future; hence, in the planning of the Omerli reservoir, effective storage and water use should be sustained.

  6. Spoilt for choice - A comparison of downscaling approaches for hydrological impact studies

    NASA Astrophysics Data System (ADS)

    Rössler, Ole; Fischer, Andreas; Kotlarski, Sven; Keller, Denise; Liniger, Mark; Weingartner, Rolf

    2017-04-01

    With the increasing number of available climate downscaling approaches, users are often faced with the luxury problem of which downscaling method to apply in a climate change impact assessment study. In Switzerland, for instance, the new generation of local-scale climate scenarios CH2018 will be based on quantile mapping (QM), replacing the previous delta change (DC) method. In parallel to those two methods, a multi-site weather generator (WG) was developed to meet specific user needs. The question arises which downscaling method is the most suitable for a given application. Here, we analyze the differences between the three approaches in terms of hydro-meteorological responses in the Swiss pre-Alps, considering mean values as well as indices of extremes. The comparison of the three approaches was carried out in the frame of a hydrological impact assessment study that focused on different runoff characteristics and their related meteorological indices in the meso-scale catchment of the river Thur (about 1700 km2), Switzerland. For this purpose, we set up the hydrological model WaSiM-ETH under present (1980-2009) and future conditions (2070-2099), assuming the SRES A1B emission scenario. Input to the three downscaling approaches were 10 GCM-RCM simulations of the ENSEMBLES project, while eight meteorological station observations served as the reference. All station data, observed and downscaled, were interpolated to obtain the meteorological fields of temperature and precipitation required by the hydrological model. For the present-day reference period we evaluated the ability of each downscaling method to reproduce today's hydro-meteorological patterns. In the scenario runs, we focused on the comparison of the change signals for each hydro-meteorological parameter generated by the three downscaling techniques. The evaluation exercise reveals that QM and WG perform equally well in representing present-day average conditions, but that QM outperforms WG in reproducing
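
Quantile mapping corrects a simulated value by matching its quantile in the model's historical distribution to the observed distribution. A minimal numpy sketch with synthetic precipitation (the gamma parameters and the imposed wet bias are illustrative assumptions, not CH2018's implementation):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: look up each future value's quantile in the
    historical model distribution, then map it onto the observed distribution."""
    q = np.linspace(0, 1, 101)
    m_q = np.quantile(model_hist, q)           # model transfer nodes
    o_q = np.quantile(obs_hist, q)             # observed transfer nodes
    ranks = np.interp(model_fut, m_q, q)       # quantile of each future value
    return np.interp(ranks, q, o_q)            # re-express in observed units

rng = np.random.default_rng(3)
obs_hist = rng.gamma(2.0, 4.0, 3000)                      # "observed" precip
model_hist = obs_hist * 1.4 + 2.0                         # model with wet bias
model_fut = rng.gamma(2.0, 4.0, 3000) * 1.4 + 2.0 + 1.0   # future: bias + signal
corrected = quantile_map(model_hist, obs_hist, model_fut)
```

Applied to the historical period itself, the mapping reproduces the observed distribution; applied to the future period, it removes the bias while retaining the climate change signal, which is the property that distinguishes QM from the simpler delta change method.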

  7. Precipitation Dynamical Downscaling Over the Great Plains

    NASA Astrophysics Data System (ADS)

    Hu, Xiao-Ming; Xue, Ming; McPherson, Renee A.; Martin, Elinor; Rosendahl, Derek H.; Qiao, Lei

    2018-02-01

    Detailed, regional climate projections, particularly for precipitation, are critical for many applications. Accurate precipitation downscaling in the United States Great Plains remains a great challenge for most regional climate models, particularly for warm months. Most previous dynamical downscaling simulations significantly underestimate warm-season precipitation in the region. This study aims to achieve better precipitation downscaling in the Great Plains with the Weather Research and Forecast (WRF) model. To this end, WRF simulations with different physics schemes and nudging strategies are first conducted for a representative warm season. Results show that different cumulus schemes lead to more pronounced differences in simulated precipitation than the other tested physics schemes. Simply choosing different physics schemes is not enough to alleviate the dry bias over the southern Great Plains, which is related to an anticyclonic circulation anomaly over the central and western parts of the continental U.S. in the simulations. Spectral nudging emerges as an effective solution for alleviating the precipitation bias. Spectral nudging ensures that large and synoptic-scale circulations are faithfully reproduced while still allowing WRF to develop small-scale dynamics, thus effectively suppressing the large-scale circulation anomaly in the downscaling. As a result, better precipitation downscaling is achieved. With the carefully validated configurations, WRF downscaling is conducted for 1980-2015. The downscaling captures well the spatial distribution of the monthly precipitation climatology and the monthly/yearly variability, showing improvement over at least two previously published precipitation downscaling studies. With the improved precipitation downscaling, a better hydrological simulation over the trans-state Oologah watershed is also achieved.
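
Spectral nudging relaxes only the low-wavenumber part of the regional field toward the driving field, leaving the fine scales free to develop. A minimal 2-D numpy sketch of the idea (the cutoff wavenumber, relaxation factor, and synthetic fields are illustrative, not WRF's implementation):

```python
import numpy as np

def spectral_nudge(regional, driving, cutoff=4, alpha=0.5):
    """Nudge only the large-scale (low-wavenumber) part of the regional field
    toward the driving field, leaving small scales untouched."""
    fr = np.fft.fft2(regional)
    fd = np.fft.fft2(driving)
    ny, nx = regional.shape
    ky = np.minimum(np.arange(ny), ny - np.arange(ny))[:, None]
    kx = np.minimum(np.arange(nx), nx - np.arange(nx))[None, :]
    large = (ky <= cutoff) & (kx <= cutoff)         # low-wavenumber mask
    fr[large] += alpha * (fd[large] - fr[large])    # relax toward the driver
    return np.real(np.fft.ifft2(fr))

rng = np.random.default_rng(4)
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
X, Y = np.meshgrid(x, x)
driving = np.sin(X) + np.cos(Y)                           # smooth synoptic pattern
regional = driving + 0.8 + rng.normal(0, 0.1, (64, 64))   # drifted + fine detail
nudged = spectral_nudge(regional, driving, alpha=1.0)
```

The nudged field recovers the driver's large-scale pattern (including removing the drift) while keeping the regional model's fine-scale detail, which is exactly the property the study exploits to suppress the anticyclonic circulation anomaly.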

  8. Downscaling Reanalysis over Continental Africa with a Regional Model: NCEP Versus ERA Interim Forcing

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.; Fulakeza, Matthew B.

    2013-01-01

    Five annual climate cycles (1998-2002) are simulated for continental Africa and adjacent oceans by a regional atmospheric model (RM3). RM3 horizontal grid spacing is 0.44 deg at 28 vertical levels. Each of 2 simulation ensembles is driven by lateral boundary conditions from each of 2 alternative reanalysis data sets. One simulation downscales National Centers for Environmental Prediction reanalysis 2 (NCPR2) and the other the European Centre for Medium-Range Weather Forecasts Interim reanalysis (ERA-I). NCPR2 data are archived at 2.5 deg grid spacing, while a recent version of ERA-I provides data at 0.75 deg spacing. ERA-I-forced simulations are recommended by the Coordinated Regional Downscaling Experiment (CORDEX). Comparisons of the 2 sets of simulations with each other and with observational evidence assess the relative performance of each downscaling system. A third simulation also uses ERA-I forcing, but degraded to the same horizontal resolution as NCPR2. RM3-simulated pentad and monthly mean precipitation data are compared to Tropical Rainfall Measuring Mission (TRMM) data, gridded at 0.5 deg, and RM3-simulated circulation is compared to both reanalyses. Results suggest that each downscaling system provides advantages and disadvantages relative to the other. The RM3/NCPR2 achieves a more realistic northward advance of summer monsoon rains over West Africa, but RM3/ERA-I creates the more realistic monsoon circulation. Both systems recreate some features of July-September 1999 minus 2002 precipitation differences. Degrading the resolution of ERA-I driving data unrealistically slows the monsoon circulation and considerably diminishes summer rainfall rates over West Africa. The high resolution of ERA-I data, therefore, contributes to the quality of the downscaling, but NCPR2 lateral boundary conditions nevertheless produce better simulations of some features.

  9. Technical note: 3-hourly temporal downscaling of monthly global terrestrial biosphere model net ecosystem exchange

    DOE PAGES

    Fisher, Joshua B.; Sikka, Munish; Huntzinger, Deborah N.; ...

    2016-07-29

    Here, the land surface provides a boundary condition to atmospheric forward and flux inversion models. These models require prior estimates of CO2 fluxes at relatively high temporal resolutions (e.g., 3-hourly) because of the high frequency of atmospheric mixing and wind heterogeneity. However, land surface model CO2 fluxes are often provided at monthly time steps, typically because the land surface modeling community focuses more on time steps associated with plant phenology (e.g., seasonal) than on sub-daily phenomena. Here, we describe a new dataset created from 15 global land surface models and 4 ensemble products in the Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP), temporally downscaled from monthly to 3-hourly output. We provide 3-hourly output for each individual model over 7 years (2004–2010), as well as an ensemble mean, a weighted ensemble mean, and the multi-model standard deviation. Output is provided in three different spatial resolutions for user preferences: 0.5° × 0.5°, 2.0° × 2.5°, and 4.0° × 5.0° (latitude × longitude).
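    As a sketch of the temporal-downscaling idea (not the MsTMIP algorithm itself), a monthly-mean flux can be redistributed into 3-hourly values using a normalized diurnal weight pattern so that the monthly mean is preserved; the weight values below are hypothetical:

```python
# Sketch: downscale a monthly-mean flux to 3-hourly values with a
# prescribed diurnal cycle, preserving the monthly mean. Illustrative
# only; the weight pattern here is hypothetical, not the MsTMIP one.

def downscale_monthly_to_3hourly(monthly_mean, n_days):
    """Return n_days*8 3-hourly values whose mean equals monthly_mean."""
    # Hypothetical diurnal weights for the 8 3-hour slots of a day
    # (e.g., stronger daytime uptake); normalized so their mean is 1.
    raw = [0.2, 0.2, 0.6, 1.8, 2.4, 1.8, 0.7, 0.3]
    mean_w = sum(raw) / len(raw)
    weights = [w / mean_w for w in raw]
    return [monthly_mean * w for _ in range(n_days) for w in weights]

# Example: a monthly-mean NEE of -1.5 (units arbitrary) over a 30-day month
series = downscale_monthly_to_3hourly(monthly_mean=-1.5, n_days=30)
```

    Because the weights are normalized to mean one, averaging the 3-hourly series recovers the monthly mean exactly, which is the key conservation property a temporal downscaling must respect.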

  10. European extra-tropical storm damage risk from a multi-model ensemble of dynamically-downscaled global climate models

    NASA Astrophysics Data System (ADS)

    Haylock, M. R.

    2011-10-01

    Uncertainty in the return levels of insured loss from European wind storms was quantified using storms derived from twenty-two 25 km regional climate model runs driven by either the ERA40 reanalyses or one of four coupled atmosphere-ocean global climate models. Storms were identified using a model-dependent storm severity index based on daily maximum 10 m wind speed. The wind speed from each model was calibrated to a set of 7 km historical storm wind fields using the 70 storms with the highest severity index in the period 1961-2000, employing a two stage calibration methodology. First, the 25 km daily maximum wind speed was downscaled to the 7 km historical model grid using the 7 km surface roughness length and orography, also adopting an empirical gust parameterisation. Secondly, downscaled wind gusts were statistically scaled to the historical storms to match the geographically-dependent cumulative distribution function of wind gust speed. The calibrated wind fields were run through an operational catastrophe reinsurance risk model to determine the return level of loss to a European population density-derived property portfolio. The risk model produced a 50-yr return level of loss of between 0.025% and 0.056% of the total insured value of the portfolio.
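    The second calibration stage described above, matching the cumulative distribution function of wind gust speed, is essentially empirical quantile mapping. A minimal sketch, with made-up gust values rather than the operational calibration:

```python
# Empirical quantile mapping: replace each model value with the value at
# the same empirical quantile of a reference (historical) distribution.
# Toy data; illustrative stand-in for the paper's CDF-matching step.

def quantile_map(model_vals, ref_vals):
    """Map each model value onto the reference distribution by rank."""
    srt_model = sorted(model_vals)
    srt_ref = sorted(ref_vals)
    n = len(srt_model)
    out = []
    for v in model_vals:
        # empirical quantile of v within the model distribution
        rank = sum(1 for m in srt_model if m <= v) - 1
        q = rank / (n - 1)
        # value at the corresponding quantile of the reference distribution
        idx = round(q * (len(srt_ref) - 1))
        out.append(srt_ref[idx])
    return out

model = [10.0, 14.0, 18.0, 22.0, 30.0]   # biased-low model gusts (m/s)
ref = [15.0, 20.0, 25.0, 30.0, 40.0]     # "historical" gusts (m/s)
mapped = quantile_map(model, ref)        # → [15.0, 20.0, 25.0, 30.0, 40.0]
```

    After mapping, the corrected series has exactly the reference distribution, which is the property exploited when calibrating model gusts to historical storm wind fields.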

  11. Assessing long-term hydrologic impact of climate change using ensemble approach and comparison with Global Gridded Model-A case study on Goodwater Creek Experimental Watershed

    USDA-ARS?s Scientific Manuscript database

    Potential impacts of climate change on hydrologic components of Goodwater Creek Experimental Watershed were assessed using climate datasets from the Coupled Model Intercomparison Project Phase 5 and Soil and Water Assessment Tool (SWAT). Historical and future ensembles of downscaled precipitation an...

  12. Atmospheric Downscaling using Genetic Programming

    NASA Astrophysics Data System (ADS)

    Zerenner, Tanja; Venema, Victor; Simmer, Clemens

    2013-04-01

    Coupling models for the different components of the Soil-Vegetation-Atmosphere system requires up- and downscaling procedures. The subject of our work is the downscaling scheme used to derive high-resolution forcing data for land-surface and subsurface models from coarser atmospheric model output. The current downscaling scheme [Schomburg et al. 2010, 2012] combines a bi-quadratic spline interpolation, deterministic rules, and autoregressive noise. For the development of the scheme, training and validation data sets were created by carrying out high-resolution runs of the atmospheric model. The deterministic rules in this scheme are partly based on known physical relations and partly determined by an automated search for linear relationships between the high-resolution fields of the atmospheric model output and high-resolution data on surface characteristics. Up to now, deterministic rules are available for downscaling surface pressure and, partially, depending on the prevailing weather conditions, for near-surface temperature and radiation. The aim of our work is to improve those rules and to find deterministic rules for the remaining variables that require downscaling, e.g. precipitation or near-surface specific humidity. To accomplish that, we broaden the search by allowing for interdependencies between different atmospheric parameters, non-linear relations, and non-local and time-lagged relations. To cope with the vast number of possible solutions, we use genetic programming, a method from machine learning based on the principles of natural evolution. We are currently working with GPLAB, a genetic programming toolbox for Matlab. At first we tested the GP system's ability to retrieve the known physical rule for downscaling surface pressure, i.e. the hydrostatic equation, from our training data. We found this to be a simple task for the GP system. Furthermore we have improved accuracy and efficiency of the GP solution by implementing constant variation and

  13. Optimal selection of MULTI-model downscaled ensembles for interannual and seasonal climate prediction in the eastern seaboard of Thailand

    NASA Astrophysics Data System (ADS)

    Bejranonda, W.; Koch, M.

    2010-12-01

    Because of the imminent threat to the water resources of the eastern seaboard of Thailand, a climate impact study has been carried out there. To that end, a hydrological watershed model is being used to simulate the future water availability in the wake of possible climate change in the region. The hydrological model is forced by predictions from global climate models (GCMs) that are to be downscaled in an appropriate manner. The challenge at that stage of the climate impact analysis then lies in the choice of the best GCM and the (statistical) downscaling method. In this study the selection of coarse-grid-resolution output of the GCMs, transferring information to the fine grid of local climate-hydrology, is achieved by cross-correlation and multiple linear regression using meteorological data in the eastern seaboard of Thailand observed between 1970 and 1999. The grids of 20 atmosphere/ocean global climate models (AOGCMs), covering latitude 12.5-15.0 N and longitude 100.0-102.5 E, were examined using the Climate-Change Scenario Generator (SCENGEN). With that tool the model efficiency of the prediction of daily precipitation and mean temperature was calculated by comparing the 1980-1999 ECMWF reanalysis predictions with the observed data during that time period. The root-mean-square errors of the predictions were considered and ranked to select the top 5 models, namely BCCR-BCM2.0, GISS-ER, ECHO-G, ECHAM5/MPI-OM and PCM. The daily time series of 338 predictors in 9 runs of the 5 selected models were gathered from the CMIP3 multi-model database. Monthly time-serial cross-correlations between the climate predictors and the meteorological measurements from 25 rainfall, 4 minimum and maximum temperature, 4 humidity and 2 solar radiation stations in the study area were then computed and ranked. Using the ranked predictors, a multiple linear regression model (downscaling transfer model) to forecast the local climate was set up. To improve the prediction power of this
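    A downscaling transfer model of the kind described, multiple linear regression from large-scale predictors to a local observable, can be sketched with ordinary least squares on toy data (the predictors and coefficients below are hypothetical):

```python
# Sketch of a regression-based downscaling transfer model: fit local
# rainfall as a linear function of two large-scale predictors by
# ordinary least squares via the normal equations. Toy data only.

def fit_ols(X, y):
    """Least-squares coefficients for y ≈ X·beta (X includes a 1s column)."""
    n, p = len(X), len(X[0])
    # Normal equations: (X^T X) beta = X^T y
    xtx = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    # Gauss-Jordan elimination (no pivoting; fine for this toy system)
    for i in range(p):
        piv = xtx[i][i]
        for j in range(i, p):
            xtx[i][j] /= piv
        xty[i] /= piv
        for r in range(p):
            if r != i:
                f = xtx[r][i]
                for j in range(i, p):
                    xtx[r][j] -= f * xtx[i][j]
                xty[r] -= f * xty[i]
    return xty

# Local rainfall generated from two large-scale predictors plus an intercept
X = [[1.0, x1, x2] for x1, x2 in [(0, 1), (1, 0), (1, 1), (2, 1), (2, 3)]]
y = [2.0 + 3.0 * row[1] - 1.0 * row[2] for row in X]
beta = fit_ols(X, y)   # recovers approximately [2.0, 3.0, -1.0]
```

    In practice the predictors would be the ranked GCM fields described in the abstract, and the fit would be cross-validated against the station observations.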

  14. Data Analysis, Modeling, and Ensemble Forecasting to Support NOWCAST and Forecast Activities at the Fallon Naval Station

    DTIC Science & Technology

    2011-09-30

    forecasting and use of satellite data assimilation for model evaluation (Jiang et al, 2011a). He is a task leader on another NSF EPSCoR project...K. Horvath, R. Belu, 2011a: Application of variational data assimilation to dynamical downscaling of regional wind energy resources in the western...1 DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Data Analysis, Modeling, and Ensemble Forecasting to

  15. A hybrid downscaling procedure for estimating the vertical distribution of ambient temperature in local scale

    NASA Astrophysics Data System (ADS)

    Yiannikopoulou, I.; Philippopoulos, K.; Deligiorgi, D.

    2012-04-01

    The vertical thermal structure of the atmosphere is defined by a combination of dynamic and radiative transfer processes and plays an important role in describing the meteorological conditions at local scales. The scope of this work is to develop and quantify the predictive ability of a hybrid dynamic-statistical downscaling procedure to estimate the vertical profile of ambient temperature at finer spatial scales. The study focuses on the warm period of the year (June - August) and the method is applied to an urban coastal site (Hellinikon), located in the eastern Mediterranean. The two-step methodology initially involves the dynamic downscaling of coarse-resolution climate data via the RegCM4.0 regional climate model and subsequently the statistical downscaling of the modeled outputs by developing and training site-specific artificial neural networks (ANNs). The 2.5° × 2.5° gridded NCEP-DOE Reanalysis 2 dataset is used as initial and boundary conditions for the dynamic downscaling element of the methodology, which enhances the regional representativeness of the dataset to 20 km and provides modeled fields at 18 vertical levels. The regional climate modeling results are compared against the upper-air Hellinikon radiosonde observations and the mean absolute error (MAE) is calculated between the four grid-point values nearest to the station and the ambient temperature at the standard and significant pressure levels. The statistical downscaling element of the methodology consists of an ensemble of ANN models, one for each pressure level, which are trained separately and employ the regional-scale RegCM4.0 output. The ANN models are theoretically capable of estimating any measurable input-output function to any desired degree of accuracy. In this study they are used as non-linear function approximators for identifying the relationship between a number of predictor variables and the ambient temperature at the various vertical levels. An insight into the statistically derived input

  16. User's Manual for Downscaler Fusion Software

    EPA Science Inventory

    Recently, a series of 3 papers has been published in the statistical literature that details the use of downscaling to obtain more accurate and precise predictions of air pollution across the conterminous U.S. This downscaling approach combines CMAQ gridded numerical model output...

  17. Quantifying Information Gain from Dynamic Downscaling Experiments

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Peters-Lidard, C. D.

    2015-12-01

    Dynamic climate downscaling experiments are designed to produce information at higher spatial and temporal resolutions. Such additional information is generated from the low-resolution initial and boundary conditions via the predictive power of the physical laws. However, errors and uncertainties in the initial and boundary conditions can be propagated and even amplified in the downscaled simulations. Additionally, the limit of predictability in nonlinear dynamical systems will also dampen the information gain, even if the initial and boundary conditions were error-free. Thus it is critical to quantitatively define and measure the amount of information increase from dynamic downscaling experiments, to better understand and appreciate their potential and limitations. We present a scheme to objectively measure the information gain from such experiments. The scheme is based on information theory, and we argue that if a downscaling experiment is to exhibit value, it has to produce more information than what can be simply inferred from information sources already available. These information sources include the initial and boundary conditions, the coarse-resolution model in which the higher-resolution models are embedded, and the same set of physical laws. These existing information sources define an "information threshold" as a function of the spatial and temporal resolution, and this threshold serves as a benchmark to quantify the information gain from the downscaling experiments, or any other approaches. For a downscaling experiment to show any value, the information has to be above this threshold. A recent NASA-supported downscaling experiment is used as an example to illustrate the application of this scheme.
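    The threshold idea can be caricatured with Shannon entropy: a downscaled field is only informative if its entropy exceeds that of a baseline obtained by trivially replicating the coarse field. This is a hedged illustration of the concept only; the actual scheme in the abstract is more elaborate:

```python
# Illustrative sketch: histogram-based Shannon entropy of a field, used
# to compare a downscaled field against a "no new information" baseline
# (coarse values replicated to the fine grid). Toy 1-D data.
import math

def shannon_entropy(values, n_bins=8):
    """Entropy (bits) of the empirical histogram of `values`."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins or 1.0
    counts = [0] * n_bins
    for v in values:
        counts[min(int((v - lo) / width), n_bins - 1)] += 1
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

coarse = [10.0, 14.0, 18.0, 22.0]
baseline = [v for v in coarse for _ in range(4)]          # replicated 4x: no new info
downscaled = [v + d for v in coarse for d in (-1.5, -0.5, 0.5, 1.5)]  # resolved sub-grid variation

gain = shannon_entropy(downscaled) - shannon_entropy(baseline)  # → 1.0 bit
```

    A positive gain here only says the fine field is more variable than the replicated baseline; the scheme in the abstract additionally demands that the extra variability not be inferable from the boundary conditions and physical laws alone.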

  18. Assessing data assimilation and model boundary error strategies for high resolution ocean model downscaling in the Northern North Sea

    NASA Astrophysics Data System (ADS)

    Sandvig Mariegaard, Jesper; Huiban, Méven Robin; Tornfeldt Sørensen, Jacob; Andersson, Henrik

    2017-04-01

    Determining the optimal domain size and associated position of open boundaries in local high-resolution downscaling ocean models is often difficult. As an important input data set for downscaling ocean modelling, the European Copernicus Marine Environment Monitoring Service (CMEMS) provides baroclinic initial and boundary conditions for local ocean models. Tidal dynamics are often neglected in CMEMS services at large scale, but tides are generally crucial for coastal ocean dynamics. To address this need, tides can be superposed via Flather (1976) boundary conditions and the combined flow downscaled using an unstructured mesh. The surge component is also only partially represented in selected CMEMS products and must either be modelled inside the domain or modelled independently and superposed if the domain becomes too small to model the effect in the downscaling model. The tide and surge components can generally be improved by assimilating water level from tide gauge and altimetry data. An intrinsic part of the problem is to find the limitations of local-scale data assimilation and the requirement for consistency between the larger-scale ocean models and the local-scale assimilation methodologies. This contribution investigates the impact of domain size and associated positions of open boundaries with and without data assimilation of water level. We have used the baroclinic ocean model MIKE 3 FM and its newly re-factored built-in data assimilation package. We consider boundary conditions of salinity, temperature, water level and depth-varying currents from the global CMEMS 1/4-degree resolution model from 2011, where in situ ADCP velocity data are available for validation. We apply data assimilation of in situ tide gauge water levels and along-track altimetry surface elevation data from selected satellites. The MIKE 3 FM data assimilation model, which uses the Ensemble Kalman filter, has recently been parallelized with MPI, allowing for much larger applications running on HPC
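    The ensemble Kalman filter update at the heart of such systems can be sketched in its simplest scalar form, updating an ensemble of modelled water levels with a single tide-gauge observation. This is the textbook perturbed-observation update, not the MIKE 3 FM implementation:

```python
# Minimal EnKF analysis step for one scalar observation: each ensemble
# member is nudged toward a perturbed copy of the observation, with the
# Kalman gain computed from the ensemble spread. Toy numbers throughout.
import random

def enkf_update_scalar(ensemble, obs, obs_var, rng):
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_var)                  # Kalman gain
    # Perturbed-observation form: each member sees obs + noise
    return [x + gain * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]

rng = random.Random(0)
prior = [0.8 + 0.1 * rng.gauss(0, 1) for _ in range(200)]  # modelled levels (m)
posterior = enkf_update_scalar(prior, obs=1.0, obs_var=0.01, rng=rng)
```

    With prior spread and observation error of comparable size, the posterior mean lands between the model mean and the gauge reading, and the ensemble spread shrinks, which is exactly the behaviour exploited when assimilating tide-gauge and altimetry water levels.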

  19. Multi-objective optimization for generating a weighted multi-model ensemble

    NASA Astrophysics Data System (ADS)

    Lee, H.

    2017-12-01

    Many studies have demonstrated that multi-model ensembles generally show better skill than each ensemble member. When generating weighted multi-model ensembles, the first step is measuring the performance of individual model simulations using observations. There is a consensus on the assignment of weighting factors based on a single evaluation metric. When considering only one evaluation metric, the weighting factor for each model is proportional to a performance score or inversely proportional to an error for the model. While this conventional approach can provide appropriate combinations of multiple models, the approach confronts a big challenge when there are multiple metrics under consideration. When considering multiple evaluation metrics, it is obvious that a simple averaging of multiple performance scores or model ranks does not address the trade-off problem between conflicting metrics. So far, there seems to be no best method to generate weighted multi-model ensembles based on multiple performance metrics. The current study applies the multi-objective optimization, a mathematical process that provides a set of optimal trade-off solutions based on a range of evaluation metrics, to combining multiple performance metrics for the global climate models and their dynamically downscaled regional climate simulations over North America and generating a weighted multi-model ensemble. NASA satellite data and the Regional Climate Model Evaluation System (RCMES) software toolkit are used for assessment of the climate simulations. Overall, the performance of each model differs markedly with strong seasonal dependence. Because of the considerable variability across the climate simulations, it is important to evaluate models systematically and make future projections by assigning optimized weighting factors to the models with relatively good performance. 
Our results indicate that the optimally weighted multi-model ensemble always shows better performance than an arithmetic
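    The basic contrast drawn in the abstract, an arithmetic multi-model mean versus a performance-weighted mean, can be sketched with single-metric weights inversely proportional to each model's RMSE (the study itself optimizes over several metrics; this is only the simplest illustration, with made-up model output):

```python
# Sketch: weighted multi-model ensemble with weights inversely
# proportional to RMSE against observations, versus a plain average.
# Toy data; the paper's multi-objective optimization is not reproduced.

def rmse(pred, obs):
    return (sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)) ** 0.5

obs     = [10.0, 12.0, 11.0, 13.0]
model_a = [10.2, 11.9, 11.1, 13.1]   # skilful model
model_b = [14.0, 16.0, 15.0, 17.0]   # strongly biased model

inv = [1.0 / rmse(m, obs) for m in (model_a, model_b)]
w = [x / sum(inv) for x in inv]      # weights sum to 1, favour model_a

weighted   = [w[0] * a + w[1] * b for a, b in zip(model_a, model_b)]
arithmetic = [(a + b) / 2 for a, b in zip(model_a, model_b)]
```

    Here the weighted mean stays close to the skilful model while the arithmetic mean inherits half of the biased model's error, which is why skill-based weighting generally outperforms equal weighting when member quality differs markedly.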

  20. Development of Spatiotemporal Bias-Correction Techniques for Downscaling GCM Predictions

    NASA Astrophysics Data System (ADS)

    Hwang, S.; Graham, W. D.; Geurink, J.; Adams, A.; Martinez, C. J.

    2010-12-01

    Accurately representing the spatial variability of precipitation is an important factor for predicting watershed response to climatic forcing, particularly in small, low-relief watersheds affected by convective storm systems. Although Global Circulation Models (GCMs) generally preserve spatial relationships between large-scale and local-scale mean precipitation trends, most GCM downscaling techniques focus on preserving only observed temporal variability on a point-by-point basis, not spatial patterns of events. Downscaled GCM results (e.g., CMIP3 ensembles) have been widely used to predict hydrologic implications of climate variability and climate change in large snow-dominated river basins in the western United States (Diffenbaugh et al., 2008; Adam et al., 2009). However, fewer applications to smaller rain-driven river basins in the southeastern US (where preserving spatial variability of rainfall patterns may be more important) have been reported. In this study a new method was developed to bias-correct GCMs to preserve both the long-term temporal mean and variance of the precipitation data and the spatial structure of daily precipitation fields. Forty-year retrospective simulations (1960-1999) from 16 GCMs were collected (IPCC, 2007; WCRP CMIP3 multi-model database: https://esg.llnl.gov:8443/), and the daily precipitation data at coarse resolution (i.e., 280 km) were interpolated to 12 km spatial resolution and bias-corrected using gridded observations over the state of Florida (Maurer et al., 2002; Wood et al., 2002; Wood et al., 2004). In this method spatial random fields which preserve the observed spatial correlation structure of the historic gridded observations and the spatial mean corresponding to the coarse-scale GCM daily rainfall were generated. The spatiotemporal variability of the spatiotemporally bias-corrected GCMs was evaluated against gridded observations, and compared to the original temporally bias-corrected and downscaled CMIP3 data for the
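    One ingredient of such a method, rescaling a GCM series so its long-term mean and variance match observations, can be sketched in a few lines (the spatial random-field component described in the abstract is omitted here):

```python
# Sketch of mean-and-variance bias correction: shift and rescale a raw
# GCM series to match an observed mean and standard deviation. Toy data.
import statistics

def bias_correct(gcm, obs_mean, obs_std):
    m = statistics.mean(gcm)
    s = statistics.pstdev(gcm)
    # Standardize against the GCM climatology, then rescale to observations
    return [obs_mean + (x - m) * obs_std / s for x in gcm]

gcm = [2.0, 4.0, 6.0, 8.0]                 # raw GCM precipitation (mm/day)
corrected = bias_correct(gcm, obs_mean=3.0, obs_std=1.0)
```

    The corrected series has exactly the target mean and standard deviation while keeping the GCM's temporal sequencing; preserving the spatial correlation structure as well is what the random-field step adds.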

  1. Downscaling of inundation extents

    NASA Astrophysics Data System (ADS)

    Aires, Filipe; Prigent, Catherine; Papa, Fabrice

    2014-05-01

    The Global Inundation Extent from Multi-Satellite (GIEMS) dataset provides multi-year monthly variations of the global surface water extent at about 25 km × 25 km resolution, from 1993 to 2007. It is derived from multiple satellite observations. Its spatial resolution is usually compatible with climate model outputs and with global land surface model grids, but is clearly not adequate for local applications that require the characterization of small individual water bodies. There is today a strong demand for high-resolution inundation extent datasets, for a large variety of applications such as water management, regional hydrological modeling, or the analysis of mosquito-related diseases. This paper presents three approaches to downscale GIEMS. The first is based on an image-processing technique using neighborhood constraints. The second uses a PCA representation to perform an algebraic inversion; the PCA representation is also very convenient for temporal and spatial interpolation of complex inundation fields. The third downscaling method uses topography information from the Hydroshed Digital Elevation Model (DEM): information such as the elevation, distance to river and flow accumulation is used to define a ``floodability index'' that is used by the downscaling. Three basins are considered for illustrative purposes: Amazon, Niger and Mekong. Aires, F., F. Papa, C. Prigent, J.-F. Cretaux and M. Berge-Nguyen, Characterization and downscaling of the inundation extent over the Inner Niger delta using multi-wavelength retrievals and Modis data, J. of Hydrometeorology, in press, 2014. Aires, F., F. Papa and C. Prigent, A long-term, high-resolution wetland dataset over the Amazon basin, downscaled from a multi-wavelength retrieval using SAR, J. of Hydrometeorology, 14, 594-607, 2013. Prigent, C., F. Papa, F. Aires, C. Jimenez, W.B. Rossow, and E. Matthews, Changes in land surface water dynamics since the 1990s and relation to population pressure

  2. Application of physical scaling towards downscaling climate model precipitation data

    NASA Astrophysics Data System (ADS)

    Gaur, Abhishek; Simonovic, Slobodan P.

    2018-04-01

    The physical scaling (SP) method downscales climate model data to local or regional scales taking into consideration the physical characteristics of the area under analysis. In this study, multiple SP-method-based models are tested for their effectiveness in downscaling North American Regional Reanalysis (NARR) daily precipitation data. Model performance is compared with two state-of-the-art downscaling methods: the statistical downscaling model (SDSM) and generalized linear modeling (GLM). The downscaled precipitation is evaluated with reference to recorded precipitation at 57 gauging stations located within the study region. The spatial and temporal robustness of the downscaling methods is evaluated using seven precipitation-based indices. Results indicate that SP-method-based models perform best in downscaling precipitation, followed by GLM and then the SDSM model. The best-performing models are thereafter used to downscale future precipitation projections from three global circulation models (GCMs) following two emission scenarios: representative concentration pathway (RCP) 2.6 and RCP 8.5, over the twenty-first century. The downscaled future precipitation projections indicate an increase in mean and maximum precipitation intensity as well as a decrease in the total number of dry days. Further, an increase in the frequency of short (1-day), moderately long (2-4 day), and long (more than 5-day) precipitation events is projected.
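    Precipitation-based indices of the kind used for this evaluation, dry-day counts, maximum intensity, and event frequencies by wet-spell length, are straightforward to compute; the sketch below uses a hypothetical 1 mm/day wet-day threshold, not necessarily the paper's definitions:

```python
# Sketch: simple precipitation indices from a daily series. The wet-day
# threshold and event-length classes are illustrative assumptions.

def precip_indices(daily, wet_threshold=1.0):
    dry_days = sum(1 for p in daily if p < wet_threshold)
    max_intensity = max(daily)
    # Wet-spell lengths: runs of consecutive days at or above the threshold
    spells, run = [], 0
    for p in daily:
        if p >= wet_threshold:
            run += 1
        elif run:
            spells.append(run)
            run = 0
    if run:
        spells.append(run)
    return {"dry_days": dry_days,
            "max_intensity": max_intensity,
            "short_events": sum(1 for s in spells if s == 1),   # 1-day events
            "long_events": sum(1 for s in spells if s >= 5)}    # 5-day-plus events

# Example 10-day series (mm/day): one 1-day event, one 5-day event, one 1-day event
idx = precip_indices([0.0, 5.0, 0.2, 2.0, 3.0, 1.5, 4.0, 6.0, 0.0, 12.0])
```

    Comparing such indices between downscaled and observed series, station by station, is what the abstract's spatial and temporal robustness assessment amounts to.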

  3. Improved large-scale hydrological modelling through the assimilation of streamflow and downscaled satellite soil moisture observations.

    NASA Astrophysics Data System (ADS)

    López López, Patricia; Wanders, Niko; Sutanudjaja, Edwin; Renzullo, Luigi; Sterk, Geert; Schellekens, Jaap; Bierkens, Marc

    2015-04-01

    The coarse spatial resolution of global hydrological models (typically > 0.25°) often limits their ability to resolve key water balance processes for many river basins and thus compromises their suitability for water resources management, especially when compared to locally-tuned river models. A possible solution to the problem may be to drive the coarse-resolution models with high-resolution meteorological data as well as to assimilate ground-based and remotely-sensed observations of key water cycle variables. While this would improve the modelling resolution of the global model, the impact on prediction accuracy remains largely an open question. In this study we investigated the impact that assimilating streamflow and satellite soil moisture observations has on global hydrological model estimation, driven by coarse- and high-resolution meteorological observations, for the Murrumbidgee river basin in Australia. The PCR-GLOBWB global hydrological model is forced with downscaled global climatological data (from 0.5° downscaled to 0.1° resolution) obtained from the WATCH Forcing Data (WFDEI) and local high-resolution gauging-station-based gridded datasets (0.05°), sourced from the Australian Bureau of Meteorology. Downscaled satellite-derived soil moisture (from 0.5° downscaled to 0.1° resolution) from AMSR-E and streamflow observations collected from 25 gauging stations are assimilated using an ensemble Kalman filter. Several scenarios are analysed to explore the added value of data assimilation considering both local and global climatological data. Results show that the assimilation of streamflow observations results in the largest improvement of the model estimates. The joint assimilation of both streamflow and downscaled soil moisture observations leads to further improvements in streamflow simulations (10% reduction in RMSE), mainly in the headwater catchments (up to 10,000 km2). Results also show that the added contribution of data assimilation, for both soil

  4. Satellite-Enhanced Dynamical Downscaling of Extreme Events

    NASA Astrophysics Data System (ADS)

    Nunes, A.

    2015-12-01

    Severe weather events can be the triggers of environmental disasters in regions particularly susceptible to changes in hydrometeorological conditions. In that regard, the reconstruction of past extreme weather events can help in the assessment of vulnerability and risk mitigation actions. Using novel modeling approaches, dynamical downscaling of long-term integrations from global circulation models can be useful for risk analysis, providing more accurate climate information at regional scales. Originally developed at the National Centers for Environmental Prediction (NCEP), the Regional Spectral Model (RSM) is being used in the dynamical downscaling of global reanalysis, within the South American Hydroclimate Reconstruction Project. Here, RSM combines scale-selective bias correction with assimilation of satellite-based precipitation estimates to downscale extreme weather occurrences. Scale-selective bias correction is a method employed in the downscaling, similar to the spectral nudging technique, in which the downscaled solution develops in agreement with its coarse boundaries. Precipitation assimilation acts on modeled deep-convection, drives the land-surface variables, and therefore the hydrological cycle. During the downscaling of extreme events that took place in Brazil in recent years, RSM continuously assimilated NCEP Climate Prediction Center morphing technique precipitation rates. As a result, RSM performed better than its global (reanalysis) forcing, showing more consistent hydrometeorological fields compared with more sophisticated global reanalyses. Ultimately, RSM analyses might provide better-quality initial conditions for high-resolution numerical predictions in metropolitan areas, leading to more reliable short-term forecasting of severe local storms.

  5. Testing a Weather Generator for Downscaling Climate Change Projections over Switzerland

    NASA Astrophysics Data System (ADS)

    Keller, Denise E.; Fischer, Andreas M.; Liniger, Mark A.; Appenzeller, Christof; Knutti, Reto

    2016-04-01

    Climate information provided by global or regional climate models (RCMs) is often too coarse and prone to substantial biases, making it impossible to directly use daily time series of the RCMs for local assessments and in climate impact models. Hence, statistical downscaling becomes necessary. For the Swiss National Climate Change Initiative (CH2011), a delta-change approach was used to provide daily climate projections at the local scale. These data have the main limitation that changes in variability, in extremes and in the temporal structure, such as changes in the wet-day frequency, are not reproduced. The latter is a considerable downside of the delta-change approach for many impact applications. In this regard, stochastic weather generators (WGs) are an appealing technique that allows the simulation of multiple realizations of synthetic weather sequences consistent with the locally observed weather statistics and their future changes. Here, we analyse a Richardson-type weather generator (WG) as an alternative method to downscale daily precipitation, minimum and maximum temperature. The WG is calibrated for 26 Swiss stations and the reference period 1980-2009. It is perturbed with change factors derived from 12 RCMs (ENSEMBLES) to represent the climate of 2070-2099 assuming the SRES A1B emission scenario. The WG can be run in multi-site mode, making it especially attractive for impact modelers that rely on a realistic spatial structure in downscaled time series. The results from the WG are benchmarked against the original delta-change approach that applies mean additive or multiplicative adjustments to the observations. According to both downscaling methods, the results reveal area-wide mean temperature increases and a precipitation decrease in summer, consistent with earlier studies. For the summer drying, the WG indicates primarily a decrease in wet-day frequency and correspondingly an increase in mean dry spell length by around 18% - 40% at low
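    A Richardson-type generator for precipitation pairs a two-state (wet/dry) first-order Markov chain for occurrence with a gamma distribution for wet-day amounts. A minimal sketch with hypothetical parameters (in the study, these would be calibrated per station and perturbed by the RCM-derived change factors):

```python
# Minimal Richardson-type precipitation generator: Markov-chain wet/dry
# occurrence plus gamma-distributed wet-day amounts. Parameters are
# hypothetical, not calibrated values from the study.
import random

def generate_precip(n_days, p_wet_wet, p_dry_wet, shape, scale, rng):
    """Simulate n_days of daily precipitation (mm); dry days are 0.0."""
    series, wet = [], False
    for _ in range(n_days):
        p = p_wet_wet if wet else p_dry_wet     # transition probability
        wet = rng.random() < p
        series.append(rng.gammavariate(shape, scale) if wet else 0.0)
    return series

rng = random.Random(42)
series = generate_precip(3600, p_wet_wet=0.6, p_dry_wet=0.2,
                         shape=0.8, scale=6.0, rng=rng)
wet_frac = sum(1 for p in series if p > 0) / len(series)
```

    The stationary wet-day fraction of this chain is p_dry_wet / (1 - p_wet_wet + p_dry_wet) = 1/3, so a long simulation should hover near that value; a delta-change perturbation would shift the transition probabilities and gamma parameters toward the projected climate.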

  6. Climate change indices for Greenland applied directly for other arctic regions - Enhanced and utilized climate information from one high resolution RCM downscaling for Greenland evaluated through pattern scaling and CMIP5

    NASA Astrophysics Data System (ADS)

    Olesen, M.; Christensen, J. H.; Boberg, F.

    2016-12-01

    Climate change affects Greenlandic society both advantageously and disadvantageously. Changes in temperature and precipitation patterns may result in changes in a number of derived society-related climate indices, such as the length of the growing season or the number of annual dry days, or a combination of the two; indices of substantial importance to society in a climate adaptation context. Detailed climate indices require high-resolution downscaling. We have carried out a very high resolution (5 km) simulation with the regional climate model HIRHAM5, forced by the global model EC-Earth. Evaluation of RCM output is usually done with an ensemble of downscaled output from multiple RCMs and GCMs. Here we introduce and test a new technique: a translation of the robustness of an ensemble of CMIP5 GCMs into a specific index from the HIRHAM5 downscaling, through a correlation between absolute temperatures and the corresponding index values from the HIRHAM5 output. The procedure consists of two steps. First, the correlation between temperature and a given index in the HIRHAM5 simulation is identified by a best fit to a second-order polynomial. Second, the standard deviation from the CMIP5 simulations is introduced to show the corresponding standard deviation of the index from the HIRHAM5 run. The change of specific climate indices due to global warming can then be evaluated elsewhere from the change in absolute temperature. Results based on selected indices, with focus on the future climate in Greenland calculated for the RCP4.5 and RCP8.5 scenarios, will be presented.

  7. Extreme Value Analysis of hydrometeorological extremes in the ClimEx Large-Ensemble

    NASA Astrophysics Data System (ADS)

    Wood, R. R.; Martel, J. L.; Willkofer, F.; von Trentini, F.; Schmid, F. J.; Leduc, M.; Frigon, A.; Ludwig, R.

    2017-12-01

    Many studies show an increase in the magnitude and frequency of hydrological extreme events in the course of climate change. However, the contribution of natural variability to the magnitude and frequency of hydrological extreme events is not yet settled. A reliable estimate of extreme events is of great interest for water management and public safety. In the course of the ClimEx project (www.climex-project.org), a new single-model large ensemble was created by dynamically downscaling the CanESM2 large ensemble with the Canadian Regional Climate Model version 5 (CRCM5) for a European domain and a northeastern North American domain. The ClimEx 50-member large ensemble (CRCM5 driven by the CanESM2 large ensemble) makes a thorough analysis of natural variability in extreme events possible. Are current extreme value statistical methods able to account for natural variability? How large is the natural variability of, e.g., a 1/100-year return period derived from a 50-member large ensemble for Europe and northeastern North America? These questions are addressed by fitting generalized extreme value (GEV) distributions to the ClimEx large ensemble. Various return levels (5-, 10-, 20-, 30-, 60- and 100-year) based on various lengths of time series (20, 30, 50, 100 and 1500 years) are analyzed for the maximum one-day precipitation (RX1d), the maximum three-hourly precipitation (RX3h) and the streamflow for selected catchments in Europe. The long time series of the ClimEx ensemble (7500 years) allows a first reliable estimate of the magnitude and frequency of certain extreme events.
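
    As a minimal sketch of the return-level estimation, the following fits the Gumbel special case of the GEV (shape parameter fixed at zero) by the method of moments to synthetic annual maxima; the study itself fits full GEV distributions. The member-to-member spread of the 100-year return level is the kind of natural-variability estimate the ensemble enables.

```python
import numpy as np

def gumbel_return_level(annual_maxima, T):
    """T-year return level from a method-of-moments Gumbel fit
    (the GEV special case with shape parameter = 0)."""
    x = np.asarray(annual_maxima, dtype=float)
    scale = np.sqrt(6.0) * x.std(ddof=1) / np.pi
    loc = x.mean() - 0.5772156649 * scale        # Euler-Mascheroni constant
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

rng = np.random.default_rng(0)
# synthetic stand-in for 50 members x 30 years of RX1d annual maxima (mm)
sample = rng.gumbel(loc=40.0, scale=8.0, size=(50, 30))
levels = np.array([gumbel_return_level(m, 100.0) for m in sample])
spread = levels.max() - levels.min()   # natural variability across members
```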

  8. Changes in Seasonal and Extreme Hydrologic Conditions of the Georgia Basin/Puget Sound in an Ensemble Regional Climate Simulation for the Mid-Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, Lai R.; Qian, Yun

    This study examines an ensemble of climate change projections simulated by a global climate model (GCM) and downscaled with a regional climate model (RCM) to 40 km spatial resolution for western North America. One control and three ensemble future climate simulations were produced by the GCM following a business-as-usual scenario for greenhouse gas and aerosol emissions from 1995 to 2100. The RCM was used to downscale the GCM control simulation (1995-2015) and each ensemble future GCM climate simulation (2040-2060). Analyses of the regional climate simulations for the Georgia Basin/Puget Sound showed a warming of 1.5-2°C and statistically insignificant changes in precipitation by the mid-century. Climate change has large impacts on snowpack (about 50% reduction) but relatively smaller impacts on the total runoff for the basin as a whole. However, climate change can strongly affect small watersheds such as those located in the transient snow zone, causing a higher likelihood of winter flooding as a higher percentage of precipitation falls as rain rather than snow, and reduced streamflow in early summer. In addition, there are large changes in the monthly total runoff above the upper 1% threshold (or flood volume) from October through May, and the December flood volume of the future climate is 60% above the maximum monthly flood volume of the control climate. Uncertainty of the climate change projections, as characterized by the spread among the ensemble future climate simulations, is relatively small for the basin-mean snowpack and runoff, but increases in smaller watersheds, especially in the transient snow zone, and in association with extreme events. This emphasizes the importance of characterizing uncertainty through ensemble simulations.

  9. Ensemble reconstruction of severe low flow events in France since 1871

    NASA Astrophysics Data System (ADS)

    Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; Devers, Alexandre; Graff, Benjamin

    2016-04-01

    This work presents a study of severe low flow events that have occurred from 1871 onwards for a large number of near-natural catchments in France. It aims at assessing and comparing their characteristics to improve our knowledge of historical events and to provide a selection of benchmark events for climate change adaptation purposes. The historical depth of streamflow observations is generally limited to the last 50 years and therefore offers too small a sample of severe low flow events to properly explore the long-term evolution of their characteristics and associated impacts. In order to overcome this limit, this work takes advantage of a 140-year ensemble hydrometeorological dataset over France based on: (1) a probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France (Caillouet et al., 2015), and (2) continuous hydrological modelling that uses the high-resolution meteorological reconstructions as forcings over the whole period. This dataset provides an ensemble of 25 equally plausible daily streamflow time series for a reference network of stations in France over the whole 1871-2012 period. Severe low flow events are identified based on a combination of a fixed threshold and a daily variable threshold. Each event is characterized by its deficit, duration and timing by applying the Sequent Peak Algorithm. The procedure is applied to the 25 simulated time series as well as to the observed time series, in order to compare observed and simulated events over the recent period and to characterize unrecorded historical events in a probabilistic way. The ensemble aspect of the reconstruction raises specific issues, both for properly defining events across ensemble simulations and for adequately comparing the simulated characteristics to the observed ones. This study brings forward the outstanding 1921 and 1940s events, but also older and less well known ones that occurred during the last decade of the 19th century.
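
    The threshold-based event identification described in record 9 can be sketched as follows. This simplified routine accumulates below-threshold deficits rather than implementing the full Sequent Peak Algorithm, and the flow values are invented.

```python
import numpy as np

def low_flow_events(q, threshold):
    """Identify below-threshold events in a daily streamflow series,
    returning (start index, duration, cumulative deficit) per event.
    A simplified stand-in for the Sequent Peak Algorithm."""
    events, start, deficit = [], None, 0.0
    for i, qi in enumerate(q):
        if qi < threshold:
            if start is None:
                start = i
            deficit += threshold - qi
        elif start is not None:
            events.append((start, i - start, float(deficit)))
            start, deficit = None, 0.0
    if start is not None:
        events.append((start, len(q) - start, float(deficit)))
    return events

flow = np.array([5.0, 3.0, 2.0, 4.0, 6.0, 3.5, 3.0, 5.0])  # invented m3/s
events = low_flow_events(flow, threshold=4.0)
# two events: days 1-2 (deficit 3.0) and days 5-6 (deficit 1.5)
```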

  10. Daily Reservoir Inflow Forecasting using Deep Learning with Downscaled Multi-General Circulation Models (GCMs) Platform

    NASA Astrophysics Data System (ADS)

    Li, D.; Fang, N. Z.

    2017-12-01

    The Dallas-Fort Worth Metroplex (DFW) has a population of over 7 million that depends on many water supply reservoirs. Reservoir inflow plays a vital role in the water supply decision-making process and in long-term strategic planning for the region. This paper demonstrates a method of utilizing deep learning algorithms and a multi-general circulation model (GCM) platform to forecast reservoir inflow for three reservoirs within the DFW: Eagle Mountain Lake, Lake Benbrook and Lake Arlington. Ensemble empirical mode decomposition was first employed to extract the features, which were then represented by deep belief networks (DBNs). The first 75 years of the historical data (1940-2015) were used to train the model, while the last 2 years (2016-2017) were used for model validation. The weights of each DBN gained from the training process were then applied to establish a neural network (NN) able to forecast reservoir inflow. Feature predictors for the forecasting model were generated from weather forecast results of the downscaled multi-GCM platform for the North Texas region. By comparing root mean square error (RMSE) and mean bias error (MBE) against the observed data, the authors found that deep learning with a downscaled multi-GCM platform is an effective approach to reservoir inflow forecasting.
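
    The two evaluation metrics named above are straightforward to compute; a minimal sketch with invented inflow values:

```python
import numpy as np

def rmse(sim, obs):
    """Root mean square error of simulated vs observed inflow."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)))

def mbe(sim, obs):
    """Mean bias error; positive means the forecast over-predicts."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.mean(sim - obs))

# invented daily inflow values (m3/s)
obs = [120.0, 95.0, 110.0, 130.0]
sim = [115.0, 100.0, 118.0, 127.0]
```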

  11. Improving GEFS Weather Forecasts for Indian Monsoon with Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Agrawal, Ankita; Salvi, Kaustubh; Ghosh, Subimal

    2014-05-01

    Weather forecasting has always been a challenging research problem, yet one of paramount importance, as it serves as the key input in formulating a modus operandi for the immediate future. Short-range rainfall forecasts influence a wide range of entities, from the agricultural industry to the common man. Accurate forecasts help minimize possible damage by implementing pre-decided plans of action, and hence it is necessary to gauge the quality of forecasts, which may vary with the complexity of the weather state and regional parameters. The Indian Summer Monsoon Rainfall (ISMR) is one such perfect arena for checking the quality of weather forecasts, not only because of the intricacy of the spatial and temporal patterns associated with it, but also because of the damage poor forecasts can cause to the Indian economy by affecting the agricultural industry. The present study is undertaken with the rationales of assessing the ability of the Global Ensemble Forecast System (GEFS) to predict ISMR over central India, and the skill of statistical downscaling in adding value to the predictions by taking them closer to the evidentiary target dataset. GEFS is a global numerical weather prediction system providing forecasts of different climate variables at fine resolution (0.5 degree and 1 degree). GEFS shows good skill in predicting different climatic variables but performs poorly for Indian summer monsoon rainfall, which is evident from very low to negative correlations between predicted and observed rainfall. Towards the second rationale, a statistical relationship is established between the reasonably well predicted climate variables (GEFS) and observed rainfall. The GEFS predictors are treated with multicollinearity and dimensionality reduction techniques, such as principal component analysis (PCA) and the least absolute shrinkage and selection operator (LASSO). Statistical relationship is
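
    The PCA step used to treat multicollinearity among predictors can be sketched with a numpy-only implementation; LASSO, the second technique named, would follow as a shrinkage regression on the resulting scores. The predictor matrix below is synthetic, built so that 10 collinear variables are driven by only 2 underlying factors.

```python
import numpy as np

def pca(X, n_components):
    """Principal component scores via SVD of the centred predictor matrix."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    explained = (s ** 2) / np.sum(s ** 2)   # variance fraction per component
    return scores, explained[:n_components]

rng = np.random.default_rng(1)
# synthetic predictor matrix: 365 days x 10 strongly collinear variables
# driven by 2 underlying factors plus weak noise
base = rng.normal(size=(365, 2))
X = base @ rng.normal(size=(2, 10)) + 0.05 * rng.normal(size=(365, 10))
scores, explained = pca(X, n_components=2)
# the first two components capture nearly all of the variance
```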

  12. Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric

    This work proposes a daily high-resolution probabilistic reconstruction of precipitation and temperature fields in France over the 1871–2012 period built on the NOAA Twentieth Century global extended atmospheric reanalysis (20CR). The objective is to fill in the spatial and temporal data gaps in surface observations in order to improve our knowledge of local-scale climate variability from the late nineteenth century onwards. The SANDHY (Stepwise ANalogue Downscaling method for HYdrology) statistical downscaling method, initially developed for quantitative precipitation forecasting, is used here to bridge the scale gap between large-scale 20CR predictors and local-scale predictands from the Safran high-resolution near-surface reanalysis, available from 1958 onwards only. SANDHY provides a daily ensemble of 125 analogue dates over the 1871–2012 period for 608 climatically homogeneous zones paving France. Large precipitation biases in intermediary seasons are shown to occur in regions with high seasonal asymmetry like the Mediterranean. Moreover, winter and summer temperatures are respectively over- and under-estimated over the whole of France. Two analogue subselection methods are therefore developed with the aim of keeping the structure of the SANDHY method unchanged while reducing those seasonal biases. The calendar selection keeps the analogues closest to the target calendar day. The stepwise selection applies two new analogy steps based on similarity of the sea surface temperature (SST) and the large-scale 2 m temperature (T). Comparisons to the Safran reanalysis over 1959–2007 and to homogenized series over the whole twentieth century show that biases in the interannual cycle of precipitation and temperature are reduced with both methods. The stepwise subselection moreover leads to a large improvement of interannual correlation and a reduction of errors in seasonal temperature time series. When the calendar subselection is an easily applicable
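
    The core of an analogue method like SANDHY, heavily simplified, is ranking archive days by the similarity of their large-scale predictor fields to a target day; the fields below are synthetic stand-ins.

```python
import numpy as np

def best_analogues(target, archive, k=5):
    """Rank archive days by Euclidean distance between their large-scale
    predictor fields and the target day's field; return the k closest."""
    d = np.sqrt(((archive - target) ** 2).sum(axis=1))
    order = np.argsort(d)[:k]
    return order, d[order]

rng = np.random.default_rng(2)
archive = rng.normal(size=(1000, 64))               # 1000 candidate days
target = archive[123] + 0.01 * rng.normal(size=64)  # near-copy of day 123
idx, dist = best_analogues(target, archive, k=5)
# day 123 is recovered as the closest analogue
```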

  13. Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France

    DOE PAGES

    Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; ...

    2016-03-16

    This work proposes a daily high-resolution probabilistic reconstruction of precipitation and temperature fields in France over the 1871–2012 period built on the NOAA Twentieth Century global extended atmospheric reanalysis (20CR). The objective is to fill in the spatial and temporal data gaps in surface observations in order to improve our knowledge of local-scale climate variability from the late nineteenth century onwards. The SANDHY (Stepwise ANalogue Downscaling method for HYdrology) statistical downscaling method, initially developed for quantitative precipitation forecasting, is used here to bridge the scale gap between large-scale 20CR predictors and local-scale predictands from the Safran high-resolution near-surface reanalysis, available from 1958 onwards only. SANDHY provides a daily ensemble of 125 analogue dates over the 1871–2012 period for 608 climatically homogeneous zones paving France. Large precipitation biases in intermediary seasons are shown to occur in regions with high seasonal asymmetry like the Mediterranean. Moreover, winter and summer temperatures are respectively over- and under-estimated over the whole of France. Two analogue subselection methods are therefore developed with the aim of keeping the structure of the SANDHY method unchanged while reducing those seasonal biases. The calendar selection keeps the analogues closest to the target calendar day. The stepwise selection applies two new analogy steps based on similarity of the sea surface temperature (SST) and the large-scale 2 m temperature (T). Comparisons to the Safran reanalysis over 1959–2007 and to homogenized series over the whole twentieth century show that biases in the interannual cycle of precipitation and temperature are reduced with both methods. The stepwise subselection moreover leads to a large improvement of interannual correlation and a reduction of errors in seasonal temperature time series. When the calendar subselection is an easily applicable

  14. Multi-site precipitation downscaling using a stochastic weather generator

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Chen, Hua; Guo, Shenglian

    2018-03-01

    Statistical downscaling is an efficient way to resolve the spatiotemporal mismatch between climate model outputs and the data requirements of hydrological models. However, the most commonly used downscaling methods only produce climate change scenarios for a specific site or a watershed average, which cannot drive distributed hydrological models to study the spatial variability of climate change impacts. By coupling a single-site downscaling method and a multi-site weather generator, this study proposes a multi-site downscaling approach for hydrological climate change impact studies. Multi-site downscaling is done in two stages. The first stage spatially downscales climate model-simulated monthly precipitation from the grid scale to a specific site using a quantile mapping method, and the second stage temporally disaggregates monthly precipitation to daily values by adjusting the parameters of a multi-site weather generator. The inter-station correlation is specifically considered using a distribution-free approach along with an iterative algorithm. The performance of the downscaling approach is illustrated using a 10-station watershed as an example. The precipitation time series derived from the National Centers for Environmental Prediction (NCEP) reanalysis dataset is used as the climate model simulation. The precipitation time series of each station is divided into the 30 odd-numbered years for calibration and the 29 even-numbered years for validation. Several metrics, including the frequencies of wet and dry spells and statistics of the daily, monthly and annual precipitation, are used as criteria to evaluate the multi-site downscaling approach. The results show that the frequencies of wet and dry spells are well reproduced for all stations. In addition, the multi-site downscaling approach performs well with respect to reproducing precipitation statistics, especially at monthly and annual timescales. The remaining biases mainly result from the non-stationarity of
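
    The first-stage quantile mapping can be sketched with a simple empirical implementation; the "model" and "observed" precipitation samples below are synthetic, with the model given a deliberate wet bias so the correction is visible.

```python
import numpy as np

def quantile_map(model, obs, values):
    """Empirical quantile mapping: look up each value's quantile in the
    model distribution and return the observed value at that quantile."""
    model_sorted = np.sort(model)
    n = len(model_sorted)
    q = np.searchsorted(model_sorted, values, side="right") / n
    q = np.clip(q, 1.0 / n, 1.0)
    return np.quantile(np.sort(obs), q)

rng = np.random.default_rng(3)
# synthetic monthly precipitation (mm): the "model" has a wet bias
model = rng.gamma(shape=2.0, scale=60.0, size=360)
obs = rng.gamma(shape=2.0, scale=45.0, size=360)
corrected = quantile_map(model, obs, model)   # bias-corrected model series
```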

  15. Projections of the Ganges-Brahmaputra precipitation: downscaled from GCM predictors

    USGS Publications Warehouse

    Pervez, Md Shahriar; Henebry, Geoffrey M.

    2014-01-01

    Downscaling Global Climate Model (GCM) projections of future climate is critical for impact studies. Downscaling enables the use of GCM experiments for regional-scale impact studies by generating regionally specific forecasts connecting global-scale predictions and regional-scale dynamics. We employed the Statistical Downscaling Model (SDSM) to downscale 21st-century precipitation for two data-sparse, hydrologically challenging river basins in South Asia: the Ganges and the Brahmaputra. We used predictors from version 3.1 of the Canadian Centre for Climate Modelling and Analysis GCM (CGCM3.1) in downscaling the precipitation. Downscaling was performed on the basis of established relationships between historical Global Summary of Day observed precipitation records from 43 stations and National Centers for Environmental Prediction reanalysis large-scale atmospheric predictors. Although the selection of predictors was challenging during the set-up of SDSM, they were found to be indicative of important physical forcings in the basins. The precipitation of both basins was largely influenced by geopotential height: the Ganges precipitation was modulated by the U component of the wind and specific humidity at the 500 and 1000 hPa pressure levels, whereas the Brahmaputra precipitation was modulated by the V component of the wind at the 850 and 1000 hPa pressure levels. The evaluation of SDSM performance indicated that model accuracy in reproducing precipitation at the monthly scale was acceptable, but at the daily scale the model inadequately simulated some daily extreme precipitation events. Therefore, while the downscaled precipitation may not be a suitable input for analyzing future extreme flooding or drought events, it could be adequate for analysis of future freshwater availability. Analysis of the CGCM3.1 downscaled precipitation projection with respect to observed precipitation reveals that the precipitation regime in each basin may be significantly impacted by climate change

  16. Projection of wave conditions in response to climate change: A community approach to global and regional wave downscaling

    USGS Publications Warehouse

    Erikson, Li H.; Hemer, M.; Lionello, Piero; Mendez, Fernando J.; Mori, Nobuhito; Semedo, Alvaro; Wang, Xiaolan; Wolf, Judith

    2015-01-01

    Future changes in wind-wave climate have broad implications for coastal geomorphology and management. General circulation models (GCMs) are now routinely used for assessing climatological parameters, but generally do not provide parameterizations of ocean wind-waves. To fill this information gap, a growing number of studies use GCM outputs to independently downscale wave conditions to global and regional levels. To consolidate these efforts and provide a robust picture of projected changes, we present strategies from the community-derived multi-model ensemble of wave climate projections (COWCLIP) and an overview of regional contributions. Results and strategies from one contributing regional study concerning changes along the eastern North Pacific coast are presented.

  17. Hydrological responses to dynamically and statistically downscaled climate model output

    USGS Publications Warehouse

    Wilby, R.L.; Hay, L.E.; Gutowski, W.J.; Arritt, R.W.; Takle, E.S.; Pan, Z.; Leavesley, G.H.; Clark, M.P.

    2000-01-01

    Daily rainfall and surface temperature series were simulated for the Animas River basin, Colorado, using dynamically and statistically downscaled output from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis. A distributed hydrological model was then applied to the downscaled data. Relative to raw NCEP output, downscaled climate variables provided more realistic simulations of basin-scale hydrology. However, the results highlight the sensitivity of modeled processes to the choice of downscaling technique, and point to the need for caution when interpreting future hydrological scenarios.

  18. Dynamically-downscaled temperature and precipitation changes over Saskatchewan using the PRECIS model

    NASA Astrophysics Data System (ADS)

    Zhou, Xiong; Huang, Guohe; Wang, Xiuquan; Cheng, Guanhui

    2018-02-01

    In this study, dynamically downscaled temperature and precipitation changes over Saskatchewan are developed with the Providing Regional Climates for Impacts Studies (PRECIS) model, which can resolve features within GCM grid cells such as topography, clouds, and land use in Saskatchewan. The PRECIS model is employed to carry out ensemble simulations for projections of temperature and precipitation changes over Saskatchewan. Temperature and precipitation variables at 14 weather stations for the baseline period are first extracted from each model run. Ranges of simulated temperature and precipitation variables are then obtained by combining the maximum and minimum values calculated from the five ensemble runs. The performance of the PRECIS ensemble simulations can be evaluated by checking whether observations of current temperature at each weather station fall within the simulated range. Future climate projections are analyzed over three time slices (i.e., the 2030s, 2050s, and 2080s) to help understand the plausible changes in temperature and precipitation over Saskatchewan in response to global warming. The evaluation results show that the PRECIS ensemble simulations perform very well in capturing the spatial patterns of temperature and precipitation variables. The results of future climate projections over the three time slices indicate an obvious warming trend from the 2030s to the 2050s and the 2080s over Saskatchewan. The projected changes of mean temperature over the whole Saskatchewan area are [0, 2] °C in the 2030s at the 10th percentile, [2, 5.5] °C in the 2050s at the 50th percentile, and [3, 10] °C in the 2080s at the 90th percentile. There are no significant changes in the spatial patterns of the projected total precipitation from the 2030s to the end of this century. The minimum change of the projected total precipitation over the whole Province of Saskatchewan is most likely to be -1.3% in the 2030s, and -0.2% in the 2050s, while

  19. New Physical Algorithms for Downscaling SMAP Soil Moisture

    NASA Astrophysics Data System (ADS)

    Sadeghi, M.; Ghafari, E.; Babaeian, E.; Davary, K.; Farid, A.; Jones, S. B.; Tuller, M.

    2017-12-01

    The NASA Soil Moisture Active Passive (SMAP) mission provides new means for estimating surface soil moisture at the global scale. However, for many hydrological and agricultural applications the spatial resolution of SMAP is too low. To address this scale issue we fused SMAP data with MODIS observations to generate soil moisture maps at 1-km spatial resolution. In the course of this study we improved several existing empirical algorithms and introduced a new physical approach for downscaling SMAP data. The universal triangle/trapezoid model was applied to relate soil moisture to optical/thermal observations such as NDVI, land surface temperature and surface reflectance. These algorithms were evaluated with in situ data measured at 5-cm depth. Our results demonstrate that downscaling SMAP soil moisture data based on physical indicators of soil moisture derived from the MODIS satellite leads to higher accuracy than that achievable with empirical downscaling algorithms. Keywords: soil moisture, microwave data, downscaling, MODIS, triangle/trapezoid model.

  20. Downscaling Coarse Actual ET Data Using Land Surface Resistance

    NASA Astrophysics Data System (ADS)

    Shen, T.

    2017-12-01

    This study proposes a new approach to downscaling the ETWATCH 1-km actual evapotranspiration (ET) product to a spatial resolution of 30 m using land surface resistance simulated mainly from monthly Landsat 8 data and the Jarvis method, combining the benefits of the high temporal resolution of the ETWATCH product and the fine spatial resolution of Landsat 8. Surface resistance (Rs) was chosen as the driving factor because it reflects the ability of vapor flow to transfer through the canopy. The combined resistance Rs depends on canopy conditions, atmospheric factors and the available water content of the soil, and remains stable inside one ETWATCH pixel (1 km). In this research, we used the ETWATCH 1-km ten-day actual ET product from April to October (twenty-one images in total) and monthly 30-m cloud-free NDVI for 2013 (with two images from HJ as a substitute due to cloud contamination), combined with meteorological indicators, for downscaling. Good agreement and correlation were obtained between the downscaled data and observations at three flux sites in the middle reach of the Heihe basin. The downscaling results show good consistency with the original ETWATCH 1-km data at both temporal and spatial scales over different land cover types, with R2 ranging from 0.8 to 0.98. Besides, the downscaled result captured the progression of vegetation transpiration well. This study demonstrates the practicability of the new downscaling method for water resource management.

  1. Analysis of the regional MiKlip decadal prediction system over Europe: skill, added value of regionalization, and ensemble size dependency

    NASA Astrophysics Data System (ADS)

    Reyers, Mark; Moemken, Julia; Pinto, Joaquim; Feldmann, Hendrik; Kottmeier, Christoph; MiKlip Module-C Team

    2017-04-01

    Decadal climate predictions can provide a useful basis for decision-making support systems for the public and private sectors. Several generations of decadal hindcasts and predictions have been generated throughout the German research program MiKlip. Together with the global climate predictions computed with MPI-ESM, the regional climate model (RCM) COSMO-CLM is used for regional downscaling in MiKlip Module-C. The RCMs provide climate information on spatial and temporal scales closer to the needs of potential users. In this study, two downscaled hindcast generations are analysed (named b0 and b1). The respective global generations are both initialized by nudging them towards different reanalysis anomaly fields. An ensemble of five starting years (1961, 1971, 1981, 1991, and 2001), each comprising ten ensemble members, is used for both generations in order to quantify the regional decadal prediction skill for precipitation, near-surface temperature and wind speed over Europe. All datasets (including hindcasts, observations, reanalyses, and historical MPI-ESM runs) are pre-processed in an analogous manner by (i) removing the long-term trend and (ii) re-gridding to a common grid. Our analysis shows that there is potential for skillful decadal predictions over Europe in the regional MiKlip ensemble, but the skill is not systematic and depends on the PRUDENCE region and the variable. Further, the differences between the two hindcast generations are mostly small. As we used detrended time series, the predictive skill found in our study can probably be attributed to reasonable predictions of anomalies associated with natural climate variability. In a sensitivity study, it is shown that the results may change strongly when the long-term trend is kept in the datasets, as the skill of predicting the long-term trend (e.g. for temperature) then also plays a major role.
The regionalization of the global ensemble provides an added value for decadal predictions for

  2. Downscaling GCM Output with Genetic Programming Model

    NASA Astrophysics Data System (ADS)

    Shi, X.; Dibike, Y. B.; Coulibaly, P.

    2004-05-01

    Climate change impact studies on watershed hydrology require reliable data at appropriate spatial and temporal resolution. However, the outputs of current global climate models (GCMs) cannot be used directly, because GCMs do not provide hourly or daily precipitation and temperature reliable enough for hydrological modeling. Nevertheless, more reliable data corresponding to future climate scenarios can be derived from GCM outputs using so-called downscaling techniques. This study applies a Genetic Programming (GP) based technique to downscale daily precipitation and temperature values at the Chute-du-Diable basin of the Saguenay watershed in Canada. In applying the GP downscaling technique, the objective is to find a relationship between the large-scale predictor variables (NCEP data, which provide daily information on the observed large-scale state of the atmosphere) and the predictand (meteorological data describing conditions at the site scale). The selection of the most relevant predictor variables is achieved using Pearson's coefficient of determination (R2) between the large-scale predictor variables and the daily meteorological data. The period 1961-2000 is identified as representing the current climate condition. Of the forty years of data, the first 30 years (1961-1990) are used to calibrate the models while the remaining ten years (1991-2000) are used to validate them. In general, the R2 between the predictor variables and each predictand is very low for precipitation compared to that for maximum and minimum temperature. Moreover, the strength of individual predictors varies for every month and for each GP grammar. Therefore, the most appropriate combination of predictors has to be chosen by analyzing the outputs for all twelve months and the different GP grammars. During the calibration of the GP model for precipitation downscaling, in addition to the mean daily

  3. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over recent years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians, has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from, currently, 23 European countries, running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  4. SDSM-DC: A smarter approach to downscaling for decision-making? (Invited)

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.; Dawson, C. W.

    2011-12-01

    General Circulation Model (GCM) output has been used for downscaling and impact assessments for at least 25 years. Downscaling methods raise awareness about risks posed by climate variability and change to human and natural systems. However, there are relatively few instances where these analyses have translated into actionable information for adaptation. One reason is that conventional 'top down' downscaling typically yields very large uncertainty bounds in projected impacts at regional and local scales. Consequently, there are growing calls to use downscaling tools in smarter ways that refocus attention on the decision problem rather than on the climate modelling per se. The talk begins with an overview of various applications of the Statistical DownScaling Model (SDSM) over the last decade. This sample offers insights into downscaling practice in terms of regions and sectors of interest, modes of application and adaptation outcomes. The decision-centred rationale and functionality of the latest version of SDSM is then explained. This new downscaling tool does not require GCM input but enables the user to generate plausible daily weather scenarios that may be informed by climate model and/or palaeoenvironmental information. Importantly, the tool is intended for stress-testing adaptation options rather than for exhaustive analysis of uncertainty components. The approach is demonstrated by downscaling multi-basin, multi-elevation temperature and precipitation scenarios for the Upper Colorado River Basin. These scenarios are used alongside other narratives of future conditions that might potentially affect the security of water supplies, and for evaluating steps that can be taken to manage these risks.
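    A GCM-free scenario generator of the kind described, producing plausible daily weather for stress-testing, can be sketched as a first-order Markov chain for wet/dry occurrence with exponential wet-day amounts. All parameter values below are illustrative, not taken from SDSM-DC:

```python
import numpy as np

def generate_precip(n_days, p_wet_given_dry, p_wet_given_wet, mean_wet, rng):
    """Toy first-order Markov-chain wet/dry generator with exponential
    wet-day amounts (mm). Transition probabilities and the mean wet-day
    amount are the knobs a user would perturb for stress-testing."""
    wet = False
    out = np.zeros(n_days)
    for d in range(n_days):
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p
        if wet:
            out[d] = rng.exponential(mean_wet)
    return out

# Ten years of synthetic daily precipitation with illustrative parameters.
rng = np.random.default_rng(4)
series = generate_precip(3650, 0.3, 0.6, 8.0, rng)
```

    Stress-testing then amounts to rerunning an impact model under systematically perturbed parameters (e.g. wetter persistence, heavier wet-day amounts) rather than under a particular GCM projection.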

  5. SDSM-DC: A smarter approach to downscaling for decision-making? (Invited)

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.; Dawson, C. W.

    2013-12-01

    General Circulation Model (GCM) output has been used for downscaling and impact assessments for at least 25 years. Downscaling methods raise awareness about risks posed by climate variability and change to human and natural systems. However, there are relatively few instances where these analyses have translated into actionable information for adaptation. One reason is that conventional 'top down' downscaling typically yields very large uncertainty bounds in projected impacts at regional and local scales. Consequently, there are growing calls to use downscaling tools in smarter ways that refocus attention on the decision problem rather than on the climate modelling per se. The talk begins with an overview of various applications of the Statistical DownScaling Model (SDSM) over the last decade. This sample offers insights into downscaling practice in terms of regions and sectors of interest, modes of application and adaptation outcomes. The decision-centred rationale and functionality of the latest version of SDSM is then explained. This new downscaling tool does not require GCM input but enables the user to generate plausible daily weather scenarios that may be informed by climate model and/or palaeoenvironmental information. Importantly, the tool is intended for stress-testing adaptation options rather than for exhaustive analysis of uncertainty components. The approach is demonstrated by downscaling multi-basin, multi-elevation temperature and precipitation scenarios for the Upper Colorado River Basin. These scenarios are used alongside other narratives of future conditions that might potentially affect the security of water supplies, and for evaluating steps that can be taken to manage these risks.

  6. Multi-model Ensemble Regional Climate Projection of the Maritime Continent using the MIT Regional Climate Model

    NASA Astrophysics Data System (ADS)

    Kang, S.; IM, E. S.; Eltahir, E. A. B.

    2016-12-01

    In this study, the future change in precipitation due to global warming is investigated over the Maritime Continent using the MIT Regional Climate Model (MRCM). A total of nine 30-year projections, combining three GCMs (CCSM, MPI, ACCESS) with three emission scenarios (Control, RCP4.5, RCP8.5), are dynamically downscaled using the MRCM at 12 km horizontal resolution. Since the downscaled results tend to systematically overestimate precipitation regardless of the GCM used for lateral boundary conditions, Parametric Quantile Mapping (PQM) is applied to reduce this wet bias. Cross validation for the control simulation shows that the PQM method retains the spatial pattern and temporal variability of the raw simulation while effectively reducing the wet bias. Based on the ensemble projections produced by dynamical downscaling and statistical bias correction, a reduction of future precipitation is discernible, in particular during the dry season (June-July-August). For example, intense precipitation in Singapore is expected to be reduced in the RCP8.5 projection compared to the control simulation. However, the geographical patterns and magnitude of the changes remain uncertain, suffering from statistical insignificance and a lack of model agreement. Acknowledgements: This research is supported by the National Research Foundation Singapore under its Campus for Research Excellence and Technological Enterprise programme. The Center for Environmental Sensing and Modeling is an interdisciplinary research group of the Singapore-MIT Alliance for Research and Technology.
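    The bias-correction step can be illustrated with a minimal parametric quantile mapping: map each model value through the model-climatology CDF, then back through the observed CDF. The sketch below uses Gaussian marginals via the standard library's NormalDist for brevity; precipitation work such as PQM would more typically fit a gamma distribution to wet-day amounts:

```python
from statistics import NormalDist
import numpy as np

def pqm_correct(model_hist, obs_hist, model_fut):
    """Parametric quantile mapping with Gaussian marginals (illustrative):
    x -> F_obs^{-1}(F_mod(x)), where both CDFs are fitted parametrically."""
    f_mod = NormalDist(float(np.mean(model_hist)), float(np.std(model_hist)))
    f_obs = NormalDist(float(np.mean(obs_hist)), float(np.std(obs_hist)))
    # Clip probabilities away from 0/1 so inv_cdf stays finite.
    return [f_obs.inv_cdf(min(max(f_mod.cdf(x), 1e-6), 1 - 1e-6))
            for x in model_fut]

# Model runs ~2 mm/day too wet; quantile mapping removes the offset.
rng = np.random.default_rng(1)
obs = rng.normal(5.0, 2.0, 1000)
mod = rng.normal(7.0, 2.0, 1000)   # wet-biased control run
corrected = pqm_correct(mod, obs, mod)
```

    Because the mapping is fitted on the control period and then applied to scenario runs, it corrects the systematic bias while leaving the simulated climate-change signal largely intact.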

  7. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Werner, A. T.; Cannon, A. J.

    2015-06-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e., correlation tests) and distributional properties (i.e., tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3 day peak flow and 7 day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational datasets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational dataset. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7 day low flow events, regardless of reanalysis or observational dataset. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event

  8. Hydrologic extremes - an intercomparison of multiple gridded statistical downscaling methods

    NASA Astrophysics Data System (ADS)

    Werner, Arelia T.; Cannon, Alex J.

    2016-04-01

    Gridded statistical downscaling methods are the main means of preparing climate model data to drive distributed hydrological models. Past work on the validation of climate downscaling methods has focused on temperature and precipitation, with less attention paid to the ultimate outputs from hydrological models. Also, as attention shifts towards projections of extreme events, downscaling comparisons now commonly assess methods in terms of climate extremes, but hydrologic extremes are less well explored. Here, we test the ability of gridded downscaling models to replicate historical properties of climate and hydrologic extremes, as measured in terms of temporal sequencing (i.e. correlation tests) and distributional properties (i.e. tests for equality of probability distributions). Outputs from seven downscaling methods - bias correction constructed analogues (BCCA), double BCCA (DBCCA), BCCA with quantile mapping reordering (BCCAQ), bias correction spatial disaggregation (BCSD), BCSD using minimum/maximum temperature (BCSDX), the climate imprint delta method (CI), and bias corrected CI (BCCI) - are used to drive the Variable Infiltration Capacity (VIC) model over the snow-dominated Peace River basin, British Columbia. Outputs are tested using split-sample validation on 26 climate extremes indices (ClimDEX) and two hydrologic extremes indices (3-day peak flow and 7-day peak flow). To characterize observational uncertainty, four atmospheric reanalyses are used as climate model surrogates and two gridded observational data sets are used as downscaling target data. The skill of the downscaling methods generally depended on reanalysis and gridded observational data set. However, CI failed to reproduce the distribution and BCSD and BCSDX the timing of winter 7-day low-flow events, regardless of reanalysis or observational data set. Overall, DBCCA passed the greatest number of tests for the ClimDEX indices, while BCCAQ, which is designed to more accurately resolve event
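    The distributional tests mentioned above (equality of probability distributions between downscaled and observed extremes) reduce to two-sample statistics such as Kolmogorov-Smirnov. A minimal numpy version of the statistic itself:

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical
    distance between the empirical CDFs of the two samples."""
    a, b = np.sort(a), np.sort(b)
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

# A method matching the observed distribution vs. a shifted (biased) one.
rng = np.random.default_rng(1)
obs = rng.normal(0.0, 1.0, 500)
good = rng.normal(0.0, 1.0, 500)    # same distribution as obs
biased = rng.normal(1.0, 1.0, 500)  # shifted, as a failed method would be
```

    A significance level then comes from the KS sampling distribution; the study's pass/fail bookkeeping simply counts indices for which the null of equal distributions is not rejected.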

  9. Projecting malaria hazard from climate change in eastern Africa using large ensembles to estimate uncertainty.

    PubMed

    Leedale, Joseph; Tompkins, Adrian M; Caminade, Cyril; Jones, Anne E; Nikulin, Grigory; Morse, Andrew P

    2016-03-31

    The effect of climate change on the spatiotemporal dynamics of malaria transmission is studied using an unprecedented ensemble of climate projections, employing three diverse bias correction and downscaling techniques, in order to partially account for uncertainty in climate-driven malaria projections. These large climate ensembles drive two dynamical and spatially explicit epidemiological malaria models to provide future hazard projections for the focus region of eastern Africa. While the two malaria models produce very distinct transmission patterns for the recent climate, their response to future climate change is similar in terms of sign and spatial distribution, with malaria transmission moving to higher altitudes in the East African Community (EAC) region, while transmission is reduced in lowland, marginal transmission zones such as South Sudan. The climate model ensemble generally projects warmer and wetter conditions over EAC. The simulated malaria response appears to be driven by temperature rather than precipitation effects. This reduces the uncertainty due to the climate models, as precipitation trends in tropical regions are very diverse, projecting both drier and wetter conditions with the current state-of-the-art climate model ensemble. The magnitude of the projected changes differed considerably between the two dynamical malaria models, with one much more sensitive to climate change, highlighting that uncertainty in the malaria projections is also associated with the disease modelling approach.

  10. A Multiplicative Cascade Model for High-Resolution Space-Time Downscaling of Rainfall

    NASA Astrophysics Data System (ADS)

    Raut, Bhupendra A.; Seed, Alan W.; Reeder, Michael J.; Jakob, Christian

    2018-02-01

    Distributions of rainfall with the time and space resolutions of minutes and kilometers, respectively, are often needed to drive the hydrological models used in a range of engineering, environmental, and urban design applications. The work described here is the first step in constructing a model capable of downscaling rainfall to scales of minutes and kilometers from time and space resolutions of several hours and a hundred kilometers. A multiplicative random cascade model known as the Short-Term Ensemble Prediction System is run with parameters from the radar observations at Melbourne (Australia). The orographic effects are added through a multiplicative correction factor after the model is run. In the first set of model calculations, 112 significant rain events over Melbourne are simulated 100 times. Because of the stochastic nature of the cascade model, the simulations represent 100 possible realizations of the same rain event. The cascade model produces realistic spatial and temporal patterns of rainfall at 6 min and 1 km resolution (the resolution of the radar data), the statistical properties of which are in close agreement with observations. In the second set of calculations, the cascade model is run continuously for all days from January 2008 to August 2015 and the rainfall accumulations are compared at 12 locations in the greater Melbourne area. The statistical properties of the observations lie within the envelope of the 100 ensemble members. The model successfully reproduces the frequency distribution of the 6 min rainfall intensities, storm durations, interarrival times, and autocorrelation function.
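    The core of a multiplicative random cascade is easy to sketch: each coarse amount is split repeatedly with random weights that are renormalised so rainfall mass is conserved. This is a toy one-dimensional version; the model used in the study is considerably more elaborate, with space-time correlation structure and radar-fitted parameters:

```python
import numpy as np

def cascade_disaggregate(coarse_total, levels, rng, sigma=0.4):
    """Disaggregate one coarse rainfall amount into 2**levels fine-scale
    values by repeated multiplicative branching with lognormal weights.
    Weights at each split are renormalised to sum to 1, so total rain
    is conserved at every cascade level."""
    field = np.array([coarse_total], dtype=float)
    for _ in range(levels):
        w = rng.lognormal(mean=0.0, sigma=sigma, size=(field.size, 2))
        w /= w.sum(axis=1, keepdims=True)   # mass conservation per split
        field = (field[:, None] * w).ravel()
    return field

# Spread 100 mm over 32 sub-cells; rerunning gives another realization.
rng = np.random.default_rng(2)
fine = cascade_disaggregate(100.0, levels=5, rng=rng)
```

    Running the same disaggregation many times with different random draws is what yields the 100-member ensemble of equally plausible realizations described above.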

  11. FORWINE - Statistical Downscaling of Seasonal forecasts for wine

    NASA Astrophysics Data System (ADS)

    Cardoso, Rita M.; Soares, Pedro M. M.; Miranda, Pedro M. A.

    2016-04-01

    The most renowned viticulture regions in the Iberian Peninsula have a long standing tradition in winemaking and are considered world-class grapevine (Vitis Vinifera L.) producing regions. Portugal is the 11th wine producer in the world, with internationally acclaimed wines, such as Port wine, and vineyards across the whole territory. Climate is widely acknowledged as one of the most important factors for grapevine development and growth (Fraga et al. 2014a and b; Jackson et al. 1993; Keller 2010). During the growing season (April-October in the Northern Hemisphere) of this perennial and deciduous crop, the climatic conditions are responsible for numerous morphological and physiological changes. Anomalously low February-March mean temperature, anomalously high May mean temperature and anomalously high March precipitation tend to be favourable to wine production in the Douro Valley. Seasonal forecasts of precipitation and temperature, tailored to critical thresholds in crucial seasons, can be used to inform management practices (viz. phytosanitary measures, land operations, marketing campaigns) and to develop a wine production forecast. Statistical downscaling of precipitation and maximum and minimum temperatures is used to model wine production following Santos et al. (2013) and to calculate bioclimatic indices. The skill of the ensemble forecast is evaluated through anomaly correlation, ROC area, spread-error ratio and CRPS.
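    Of the skill measures listed, the CRPS for an ensemble forecast has a direct sample form, E|X - y| - 0.5 * E|X - X'|; a minimal version as a sketch:

```python
import numpy as np

def crps_ensemble(ens, obs):
    """Sample continuous ranked probability score of ensemble `ens`
    against a scalar observation `obs`: E|X - y| - 0.5 * E|X - X'|.
    Lower is better; a perfect deterministic forecast scores 0."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.mean(np.abs(ens - obs))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return float(term1 - term2)
```

    Averaging this score over forecast-observation pairs, and comparing against a climatological reference, gives the kind of skill summary reported alongside anomaly correlation and ROC area.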

  12. CMIP5-downscaled projections for the NW European Shelf Seas: initial results and insights into uncertainties

    NASA Astrophysics Data System (ADS)

    Tinker, Jonathan; Palmer, Matthew; Lowe, Jason; Howard, Tom

    2017-04-01

    The North Sea, and the wider Northwest European Shelf seas (NWS), are economically, environmentally, and culturally important for a number of European countries. They are protected by European legislation, often with specific reference to the potential impacts of climate change. Coastal climate change projections are an important source of information for effective management of European shelf seas. For example, potential changes in the marine environment are a key component of the climate change risk assessments (CCRAs) carried out under the UK Climate Change Act. We use the NEMO shelf seas model combined with CMIP5 climate model and EURO-CORDEX regional atmospheric model data to generate new simulations of the NWS. Building on previous work using a climate model perturbed-physics ensemble and the POLCOMS model, this new model setup is used to provide a first indication of the uncertainties associated with: (i) the driving climate model; (ii) the atmospheric downscaling model; (iii) the shelf seas downscaling model; and (iv) the choice of climate change scenario. Our analysis considers a range of physical marine impacts and the drivers of coastal variability and change, including sea level and the propagation of open ocean signals onto the shelf. The simulations are being carried out as part of the UK Climate Projections 2018 (UKCP18) and will feed into the subsequent UK CCRA.

  13. Towards an Australian ensemble streamflow forecasting system for flood prediction and water management

    NASA Astrophysics Data System (ADS)

    Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.

    2016-12-01

    Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to reliably describe streamflow forecast uncertainty, particularly at shorter lead-times. We characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show that ensemble predictions of floods generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the needs for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.
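    The role of the data transformation in such an error model can be seen in a small sketch: multiplicative streamflow errors are heteroscedastic in natural units but roughly constant-variance after a log transform. This is a generic illustration; the operational system's actual transformation and mixture-Gaussian fitting are more involved:

```python
import numpy as np

def stabilize(q):
    """Simple variance-stabilizing transform for non-negative flows."""
    return np.log(q + 1.0)

rng = np.random.default_rng(5)
noise = rng.normal(size=2000)
q_low, q_high = 5.0, 100.0                  # low-flow and flood-flow levels
sim_low = q_low * (1.0 + 0.2 * noise)       # ~20% multiplicative error
sim_high = q_high * (1.0 + 0.2 * noise)
sim_low = np.clip(sim_low, 0.01, None)      # keep flows physically positive

# Residual spread grows with flow in natural units...
raw_ratio = float(np.std(sim_high - q_high) / np.std(sim_low - q_low))
# ...but is roughly level after the transform.
t_high = stabilize(sim_high) - stabilize(q_high)
t_low = stabilize(sim_low) - stabilize(q_low)
trans_ratio = float(np.std(t_high) / np.std(t_low))
```

    With near-homoscedastic transformed residuals, a single (here, mixture-Gaussian) residual distribution can serve across the full flow range.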

  14. VALUE - A Framework to Validate Downscaling Approaches for Climate Change Studies

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.

    2015-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. Here, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail to represent regional climate change? How well is regional climate represented overall, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is also intended to provide general guidance for other validation studies.

  15. VALUE: A framework to validate downscaling approaches for climate change studies

    NASA Astrophysics Data System (ADS)

    Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.

    2015-01-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. In this paper, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail to represent regional climate change? How well is regional climate represented overall, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is also intended to provide general guidance for other validation studies.

  16. Evaluation of Statistical Downscaling Skill at Reproducing Extreme Events

    NASA Astrophysics Data System (ADS)

    McGinnis, S. A.; Tye, M. R.; Nychka, D. W.; Mearns, L. O.

    2015-12-01

    Climate model outputs usually have much coarser spatial resolution than is needed by impacts models. Although higher resolution can be achieved using regional climate models for dynamical downscaling, further downscaling is often required. The final resolution gap is often closed with a combination of spatial interpolation and bias correction, which constitutes a form of statistical downscaling. We use this technique to downscale regional climate model data and evaluate its skill in reproducing extreme events. We downscale output from the North American Regional Climate Change Assessment Program (NARCCAP) dataset from its native 50-km spatial resolution to the 4-km resolution of the University of Idaho's METDATA gridded surface meteorological dataset, which derives from the PRISM and NLDAS-2 observational datasets. We operate on the major variables used in impacts analysis at a daily timescale: daily minimum and maximum temperature, precipitation, humidity, pressure, solar radiation, and winds. To interpolate the data, we use the patch recovery method from the Earth System Modeling Framework (ESMF) regridding package. We then bias correct the data using Kernel Density Distribution Mapping (KDDM), which has been shown to exhibit superior overall performance across multiple metrics. Finally, we evaluate the skill of this technique in reproducing extreme events by comparing raw and downscaled output with meteorological station data in different bioclimatic regions according to the skill scores defined by Perkins et al. in 2013 for evaluation of AR4 climate models. We also investigate techniques for improving bias correction of values in the tails of the distributions. These techniques include binned kernel density estimation, logspline kernel density estimation, and transfer functions constructed by fitting the tails with a generalized Pareto distribution.
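    A minimal form of the KDDM step can be sketched as follows: smooth both samples with Gaussian kernels, integrate to CDFs on a shared grid, and compose the two CDFs into a transfer function. This is a simplified stand-in for the published method, with an assumed Silverman-style bandwidth:

```python
import numpy as np

def kddm_transfer(model, obs, grid_size=512):
    """Kernel density distribution mapping (simplified): returns a
    transfer function mapping model values to bias-corrected values
    via smoothed CDFs of the two training samples."""
    lo = min(model.min(), obs.min()) - 1.0
    hi = max(model.max(), obs.max()) + 1.0
    grid = np.linspace(lo, hi, grid_size)

    def kde_cdf(sample):
        h = 1.06 * sample.std() * sample.size ** -0.2   # rule-of-thumb bandwidth
        z = (grid[:, None] - sample[None, :]) / h
        pdf = np.mean(np.exp(-0.5 * z ** 2), axis=1)
        cdf = np.cumsum(pdf)
        return cdf / cdf[-1]

    cdf_mod, cdf_obs = kde_cdf(model), kde_cdf(obs)

    def transfer(x):
        p = np.interp(x, grid, cdf_mod)     # model value -> probability
        return np.interp(p, cdf_obs, grid)  # probability -> observed value
    return transfer

# Correct a warm, over-dispersed model sample toward the observations.
rng = np.random.default_rng(6)
obs = rng.normal(5.0, 2.0, 800)
mod = rng.normal(8.0, 3.0, 800)
fixed = kddm_transfer(mod, obs)(mod)
```

    The kernel smoothing is what distinguishes KDDM from plain empirical quantile mapping; the tail-fitting refinements mentioned above would replace the Gaussian-kernel CDFs in the distribution tails.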

  17. Generalized canonical ensembles and ensemble equivalence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costeniuc, M.; Ellis, R.S.; Turkington, B.

    2006-02-15

    This paper is a companion piece to our previous work [J. Stat. Phys. 119, 1283 (2005)], which introduced a generalized canonical ensemble obtained by multiplying the usual Boltzmann weight factor e^(-βH) of the canonical ensemble by an exponential factor involving a continuous function g of the Hamiltonian H. We provide here a simplified introduction to our previous work, focusing now on a number of physical rather than mathematical aspects of the generalized canonical ensemble. The main result discussed is that, for suitable choices of g, the generalized canonical ensemble reproduces, in the thermodynamic limit, all the microcanonical equilibrium properties of the many-body system represented by H, even if this system has a nonconcave microcanonical entropy function. This is something that, in general, the standard (g=0) canonical ensemble cannot achieve. Thus a virtue of the generalized canonical ensemble is that it can often be made equivalent to the microcanonical ensemble in cases in which the canonical ensemble cannot. The case of quadratic g functions is discussed in detail; it leads to the so-called Gaussian ensemble.
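    In the notation of the abstract, the generalized weight can be written schematically as follows (normalization omitted; the sign convention for g is an assumption consistent with the description above):

```latex
P_{\beta,g}(d\omega) \;\propto\; e^{-\beta H(\omega) - g(H(\omega))}\, P(d\omega),
\qquad
g \equiv 0 \;\Rightarrow\; \text{canonical ensemble},
\qquad
g(H) = \gamma H^{2} \;\Rightarrow\; \text{Gaussian ensemble}.
```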

  18. Examining Extreme Events Using Dynamically Downscaled 12-km WRF Simulations

    EPA Science Inventory

    Continued improvements in the speed and availability of computational resources have allowed dynamical downscaling of global climate model (GCM) projections to be conducted at increasingly finer grid scales and over extended time periods. The implementation of dynamical downscal...

  19. Satellite-enhanced dynamical downscaling for the analysis of extreme events

    NASA Astrophysics Data System (ADS)

    Nunes, Ana M. B.

    2016-09-01

    The use of regional models in the downscaling of general circulation models provides a strategy to generate more detailed climate information. In that case, boundary-forcing techniques can be useful to maintain the large-scale features from the coarse-resolution global models in agreement with the inner modes of the higher-resolution regional models. Although those procedures might improve the dynamics, downscaling via regional modeling still aims for better representation of physical processes. With the purpose of improving dynamics and physical processes in regional downscaling of global reanalysis, the Regional Spectral Model, originally developed at the National Centers for Environmental Prediction, employs a newly reformulated scale-selective bias correction, together with the 3-hourly assimilation of satellite-based precipitation estimates constructed from the Climate Prediction Center morphing technique. The two-scheme technique for the dynamical downscaling of global reanalysis can be applied in analyses of environmental disasters and risk assessment, with hourly outputs and a resolution of about 25 km. Here the added value of satellite-enhanced dynamical downscaling is demonstrated in simulations of the first reported hurricane in the western South Atlantic Ocean basin, through comparisons with global reanalyses and satellite products available over ocean areas.
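    The idea behind a scale-selective correction can be sketched in one dimension: keep the regional model's small scales while imposing the driving analysis at large scales. This is a schematic spectral blend, not the Regional Spectral Model's actual formulation:

```python
import numpy as np

def scale_selective_blend(regional, driving, cutoff):
    """Replace the lowest `cutoff` Fourier modes of the regional field
    with those of the coarse driving field, preserving the regional
    model's fine-scale detail at higher wavenumbers."""
    R = np.fft.rfft(regional)
    D = np.fft.rfft(driving)
    R[:cutoff] = D[:cutoff]
    return np.fft.irfft(R, n=len(regional))

# A regional field that has drifted (constant offset) from its driver.
rng = np.random.default_rng(7)
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
driving = np.cos(x)                                       # large-scale signal
regional = np.cos(x) + 0.3 + 0.1 * rng.normal(size=128)   # drift + detail
blended = scale_selective_blend(regional, driving, cutoff=4)
```

    The blend removes the large-scale drift (including the mean offset, carried by wavenumber 0) while leaving the added small-scale structure untouched.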

  20. "Going the Extra Mile in Downscaling: Why Downscaling is not ...

    EPA Pesticide Factsheets

    This presentation provides an example of doing additional work when preprocessing global climate model data for use in regional climate modeling simulations with the Weather Research and Forecasting (WRF) model. In this presentation, results from 15 months of downscaling the Community Earth System Model (CESM) were shown, both using the out-of-the-box downscaling of CESM and with a modification setting the inland lake temperatures. The National Exposure Research Laboratory's (NERL) Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of the EPA's mission to protect human health and the environment. AMAD's research program is engaged in developing and evaluating predictive atmospheric models on all spatial and temporal scales for forecasting air quality and for assessing changes in air quality and air pollutant exposures, as affected by changes in ecosystem management and regulatory decisions. AMAD is responsible for providing a sound scientific and technical basis for regulatory policies based on air quality models to improve ambient air quality. The models developed by AMAD are used by EPA, NOAA, and the air pollution community in understanding and forecasting not only the magnitude of the air pollution problem, but also in developing emission control policies and regulations for air quality improvements.

  1. Going the Extra Mile in Downscaling: Why Downscaling is not just "Plug-and-Play"

    EPA Science Inventory

    This presentation provides an example of doing additional work for preprocessing global climate model data for use in regional climate modeling simulations with the Weather Research and Forecasting (WRF) model. In this presentation, results from 15 months of downscaling the Comm...

  2. Probabilistic regional climate projection in Japan using a regression model with CMIP5 multi-model ensemble experiments

    NASA Astrophysics Data System (ADS)

    Ishizaki, N. N.; Dairaku, K.; Ueno, G.

    2016-12-01

    We have developed a statistical downscaling method for probabilistic climate projection using multiple CMIP5 general circulation models (GCMs). A regression model was established so that the combination of GCM weights reflects the characteristics of the observed variability at each grid point. Cross-validation was conducted to select GCMs and to evaluate the regression model while avoiding multicollinearity. Using a spatially high-resolution observational dataset, we produced statistically downscaled probabilistic climate projections at 20-km horizontal grid spacing. Root-mean-squared errors for monthly mean surface air temperature and precipitation estimated by the regression method were smaller than those from a simple ensemble mean of GCMs and from a cumulative-distribution-function-based bias correction method. Projected changes in mean temperature and precipitation were broadly similar to those of the simple ensemble mean of GCMs. Mean precipitation was generally projected to increase in association with rising temperature and the consequent increase in atmospheric moisture content. Weakening of the winter monsoon may contribute to precipitation decreases in some areas. A temperature increase in excess of 4 K is expected in most areas of Japan by the end of the 21st century under the RCP8.5 scenario. The estimated probability of monthly precipitation exceeding 300 mm would increase on the Pacific side in summer and on the Japan Sea side in winter. This probabilistic climate projection based on the statistical method should provide useful information for impact studies and risk assessments.
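    The core of such a scheme is a per-grid-point least-squares fit of GCM weights against observations. A minimal, dependency-free sketch of that idea (toy series, weights-only regression with no intercept; not the authors' code):

```python
# Illustrative sketch: estimate weights for combining two GCM series by
# ordinary least squares against observations at one grid point.
# All series below are synthetic toy data.

def ols_weights(X, y):
    """Solve the normal equations X^T X w = X^T y for a small system."""
    k = len(X[0])
    xtx = [[sum(X[n][i] * X[n][j] for n in range(len(X))) for j in range(k)]
           for i in range(k)]
    xty = [sum(X[n][i] * y[n] for n in range(len(X))) for i in range(k)]
    # Gaussian elimination (no pivoting; X^T X is positive definite here)
    for i in range(k):
        for j in range(i + 1, k):
            f = xtx[j][i] / xtx[i][i]
            xtx[j] = [a - f * b for a, b in zip(xtx[j], xtx[i])]
            xty[j] -= f * xty[i]
    w = [0.0] * k
    for i in reversed(range(k)):
        w[i] = (xty[i] - sum(xtx[i][j] * w[j] for j in range(i + 1, k))) / xtx[i][i]
    return w

# Toy example: the observations track GCM A much more closely than GCM B,
# so the fitted weights should favour A.
gcm_a = [1.0, 2.0, 3.0, 4.0, 5.0]
gcm_b = [2.0, 1.0, 4.0, 3.0, 6.0]
obs   = [1.1, 1.9, 3.2, 3.9, 5.1]

X = [[a, b] for a, b in zip(gcm_a, gcm_b)]
w = ols_weights(X, obs)
blend = [sum(wi * xi for wi, xi in zip(w, row)) for row in X]
```

    By construction the weighted blend fits the observations at least as well as the equal-weight ensemble mean, which is the property the abstract's comparison against a simple ensemble mean exploits.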

  3. A seasonal hydrologic ensemble prediction system for water resource management

    NASA Astrophysics Data System (ADS)

    Luo, L.; Wood, E. F.

    2006-12-01

    A seasonal hydrologic ensemble prediction system, developed for the Ohio River basin, has been improved and expanded to several other regions including the eastern U.S., Africa, and East Asia. The prediction system adopts the traditional Extended Streamflow Prediction (ESP) approach, utilizing the VIC (Variable Infiltration Capacity) hydrological model as the central tool for producing ensemble predictions of soil moisture, snow, and streamflow with lead times of up to six months. VIC is forced by observed meteorology to estimate the hydrological initial condition prior to the forecast, but during the forecast period the atmospheric forcing comes from statistically downscaled seasonal forecasts from dynamical climate models. The system is currently producing real-time seasonal hydrologic forecasts for these regions on a monthly basis. Using hindcasts from a 19-year period (1981-1999), for which seasonal hindcasts from the NCEP Climate Forecast System (CFS) and the European Union DEMETER project are available, we evaluate the performance of the forecast system over our forecast regions. The evaluation shows that the prediction system is able to produce reliable and accurate precipitation, soil moisture, and streamflow predictions. The overall skill is much higher than that of traditional ESP. In particular, forecasts based on multiple climate models are more skillful than those based on a single model, which underscores the need to produce seasonal climate forecasts with multiple climate models for hydrologic applications. Forecasts from this system are expected to provide valuable information about future hydrologic states and associated risks for end users, including water resource management and the financial sector.

  4. Sub-daily Statistical Downscaling of Meteorological Variables Using Neural Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Jitendra; Brooks, Bjørn-Gustaf J.; Thornton, Peter E

    2012-01-01

    A new open-source neural network temporal downscaling model is described and tested using CRU-NCEP reanalysis and CCSM3 climate model output. We downscaled multiple meteorological variables in tandem from monthly to sub-daily time steps while retaining consistent correlations between variables. We found that our feed-forward, error-backpropagation approach produced synthetic 6-hourly meteorology with biases no greater than 0.6% across all variables and variance that was accurate within 1% for all variables except atmospheric pressure, wind speed, and precipitation. Correlations between downscaled output and the expected (original) monthly means exceeded 0.99 for all variables, which indicates that this approach would work well for generating atmospheric forcing data consistent with mass- and energy-conserved GCM output. Our neural network approach performed well for variables whose correlations with other variables were about 0.3 or better, and its skill was increased by downscaling multiple correlated variables together. Poor replication of precipitation intensity, however, required further post-processing to obtain the expected probability distribution. The concurrence of precipitation events with expected changes in subordinate variables (e.g., less incident shortwave radiation during precipitation events) was nearly as consistent in the downscaled data as in the training data, with probabilities that differed by no more than 6%. Our downscaling approach requires training data at the target time step and relies on the weak assumption that climate variability in the extrapolated data is similar to variability in the training data.
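    A feed-forward network trained by error backpropagation, as described above, can be sketched in a few dozen lines. This toy version (not the authors' model; synthetic data, hypothetical network size) maps a (monthly mean, phase within month) pair to a sub-daily value whose target is a smooth cycle around the monthly mean:

```python
# Tiny feed-forward / backpropagation sketch of temporal downscaling.
import math, random

random.seed(0)
H = 8                                     # hidden units (arbitrary choice)
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

def forward(x):
    h = [math.tanh(ws[0] * x[0] + ws[1] * x[1] + b)
         for ws, b in zip(w1, b1)]
    return h, sum(v * hi for v, hi in zip(w2, h)) + b2

# Synthetic training data: monthly mean m, phase t in [0,1);
# target = m + 0.5*sin(2*pi*t), a smooth sub-monthly cycle about the mean.
data = [((m, t / 20.0), m + 0.5 * math.sin(2 * math.pi * t / 20.0))
        for m in (0.0, 1.0) for t in range(20)]

def sse():
    return sum((forward(x)[1] - y) ** 2 for x, y in data)

loss_before = sse()
lr = 0.05
for _ in range(300):                      # plain stochastic gradient descent
    for x, y in data:
        h, out = forward(x)
        err = out - y                     # gradient of squared error wrt output
        grad_h = [err * w2[i] * (1 - h[i] ** 2) for i in range(H)]  # tanh' = 1 - h^2
        for i in range(H):
            w2[i] -= lr * err * h[i]
            w1[i][0] -= lr * grad_h[i] * x[0]
            w1[i][1] -= lr * grad_h[i] * x[1]
            b1[i] -= lr * grad_h[i]
        b2 -= lr * err
loss_after = sse()
```

    Training on several correlated variables at once, as the abstract reports, amounts to widening the input and output layers so the network learns the cross-variable structure jointly.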

  5. Downscaling SMAP Soil Moisture Using Geoinformation Data and Geostatistics

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Wang, L.

    2017-12-01

    Soil moisture is important for agricultural and hydrological studies, but ground-truth soil moisture data over wide areas are difficult to obtain. Microwave remote sensing such as Soil Moisture Active Passive (SMAP) offers wide coverage; however, existing global soil moisture products provide observations only at coarse spatial resolutions, which often limits their use in regional agricultural and hydrological studies. This paper therefore aims to generate fine-scale soil moisture information and extend its spatial availability. A statistical downscaling scheme is presented that incorporates multiple sources of fine-scale geoinformation into the downscaling of coarse-scale SMAP data in the absence of ground measurements. Geoinformation related to soil moisture patterns, including a digital elevation model (DEM), land surface temperature (LST), land use, and the normalized difference vegetation index (NDVI) at a fine scale, is used as auxiliary environmental variables. A generalized additive model (GAM) and a regression tree are first used to derive statistical relationships between SMAP data and the auxiliary geoinformation at the original coarse scale, and the residuals are then downscaled to a finer scale via area-to-point kriging (ATPK), which accounts for their spatial correlation. Results from standard validation scores as well as the triple collocation (TC) method against in-situ soil moisture measurements show that the downscaling method can significantly improve the spatial detail of SMAP soil moisture while maintaining its accuracy.
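    The regression-plus-residual structure of such a scheme can be sketched compactly. This simplified stand-in (hypothetical toy data) uses a single auxiliary variable and replaces the GAM with a linear fit and the area-to-point kriging with inverse-distance weighting, keeping only the two-step logic: regress at the coarse scale, then interpolate the residuals to the fine scale:

```python
# Two-step residual downscaling sketch (toy data, simplified stand-ins).

def fit_line(xs, ys):
    """Ordinary least-squares line: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

def idw(points, values, q, power=2.0):
    """Inverse-distance-weighted interpolation of residuals at point q."""
    num = den = 0.0
    for (px, py), v in zip(points, values):
        d2 = (px - q[0]) ** 2 + (py - q[1]) ** 2
        if d2 == 0.0:
            return v
        w = d2 ** (-power / 2)
        num += w * v
        den += w
    return num / den

# Coarse cells: centre coordinates, auxiliary NDVI, and SMAP soil moisture.
coarse_xy = [(0, 0), (1, 0), (0, 1), (1, 1)]
ndvi      = [0.2, 0.4, 0.3, 0.6]
smap      = [0.10, 0.22, 0.14, 0.31]

a, b = fit_line(ndvi, smap)
residuals = [s - (a + b * v) for s, v in zip(smap, ndvi)]

# Fine-scale prediction = regression on fine-scale NDVI + interpolated residual.
fine_point, fine_ndvi = (0.5, 0.5), 0.35
pred = a + b * fine_ndvi + idw(coarse_xy, residuals, fine_point)
```

    The residual step is what lets the fine-scale field honour local departures from the regression, which is the role ATPK plays in the paper.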

  6. Utilizing Multi-Ensemble of Downscaled CMIP5 GCMs to Investigate Trends and Spatial and Temporal Extent of Drought in Willamette Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, A.; Beal, B.; Moradkhani, H.

    2015-12-01

    Changing climate and potential future increases in global temperature are likely to affect drought characteristics and the hydrologic cycle. In this study, we analyze future changes and trends in the temporal and spatial extent of meteorological and hydrological droughts. Three statistically downscaled datasets, from the NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP), Multivariate Adaptive Constructed Analogs (MACA), and Bias Correction and Spatial Disaggregation (BCSD-PSU), each consisting of 10 CMIP5 global climate models (GCMs), are utilized for the RCP4.5 and RCP8.5 scenarios. Further, the Precipitation Runoff Modeling System (PRMS) hydrologic model is used to simulate streamflow from the GCM inputs and assess hydrological drought characteristics. The Standardized Precipitation Index (SPI) and the Streamflow Drought Index (SDI) are used to investigate meteorological and hydrological drought, respectively. The study is conducted for the Willamette Basin, which has a drainage area of 29,700 km2 and accommodates more than 3 million inhabitants and 25 dams. The analysis is performed at the annual time scale and for three future periods: near future (2010-2039), intermediate future (2040-2069), and far future (2070-2099). Large uncertainty is found across the GCM projections. Results reveal that meteorological drought events are expected to increase in the near future. Severe to extreme drought with large areal coverage lasting several years is predicted around the year 2030, with the likelihood of exceptional drought for both drought types. SPI generally shows positive trends, while SDI indicates negative trends in most cases.

  7. Analyzing the Multiscale Processes in Tropical Cyclone Genesis Associated with African Easterly Waves using the PEEMD. Part I: Downscaling Processes

    NASA Astrophysics Data System (ADS)

    Wu, Y.; Shen, B. W.; Cheung, S.

    2016-12-01

    Recent advances in high-resolution global hurricane simulations and visualizations have collectively suggested the importance of both downscaling and upscaling processes in the formation and intensification of TCs. To reveal multiscale processes in massive volumes of multi-year global data, a scalable Parallel Ensemble Empirical Mode Decomposition (PEEMD) method has been developed. In this study, the PEEMD is applied to 10 years (2004-2013) of ERA-Interim global 0.75° reanalysis data to explore the role of downscaling processes in tropical cyclogenesis associated with African Easterly Waves (AEWs). Using the PEEMD, the raw data are decomposed into oscillatory Intrinsic Mode Functions (IMFs), which represent atmospheric systems of various length scales, and a trend mode, which represents the non-oscillatory large-scale environmental flow. Among the oscillatory modes, results suggest that the third (IMF3) is statistically correlated with TC/AEW-scale systems; therefore, IMF3 and the trend mode are analyzed in detail. Our 10-year analysis shows that for more than 50% of the AEW-associated hurricanes, storm formation is associated with significant downscaling transfer of shear from the larger-scale trend mode to the smaller-scale IMF3. Future work will apply the PEEMD to higher-resolution datasets to explore the role of the upscaling processes provided by convection (or the TC) in the development of the TC (or AEW). Figure caption: The tendency of horizontal wind shear for the total winds (black line), IMF3 (blue line), and trend mode (red line), and SLP (black dotted line), along the storm track of Helene (2006).

  8. Incremental dynamical downscaling for probabilistic analysis based on multiple GCM projections

    NASA Astrophysics Data System (ADS)

    Wakazuki, Y.

    2015-12-01

    A dynamical downscaling method for probabilistic regional-scale climate change projections was developed to cover the uncertainty across multiple general circulation model (GCM) climate simulations. The climatological increments (future minus present climate states) estimated from the GCM simulations were statistically analyzed using singular value decomposition. Both positive and negative perturbations from the ensemble mean, with magnitudes equal to their standard deviations, were extracted and added to the ensemble mean of the climatological increments. The resulting multiple modal increments were used to create lateral boundary conditions for future-climate regional climate model (RCM) simulations by adding them to an objective analysis dataset. This approach can be regarded as an extension of the pseudo-global-warming (PGW) method developed by Kimura and Kitoh (2007). The incremental treatment of the GCM simulations yields approximate probabilistic climate change projections with a smaller number of RCM simulations. Three values of a climatological variable simulated by the RCMs for each mode were used to estimate the response to that mode's perturbation. For the probabilistic analysis, climatological variables of the RCMs were assumed to respond linearly to the modal perturbations, although non-linearity was seen for local-scale rainfall. The probability distribution of temperature could be estimated with perturbations of two modes, requiring five future-climate RCM simulations, whereas local-scale rainfall needed four modes, i.e., nine RCM simulations. The probabilistic method is expected to be used for regional-scale climate change impact assessment.

  9. Combining Statistics and Physics to Improve Climate Downscaling

    NASA Astrophysics Data System (ADS)

    Gutmann, E. D.; Eidhammer, T.; Arnold, J.; Nowak, K.; Clark, M. P.

    2017-12-01

    Getting useful information from climate models is an ongoing problem that has plagued climate science and hydrologic prediction for decades. While it is possible to develop statistical corrections that make climate models mimic the current climate almost perfectly, this does not guarantee that future changes are portrayed correctly. In contrast, convection-permitting regional climate models (RCMs) have begun to provide an excellent representation of the regional climate system purely from first principles, providing greater confidence in their change signal. However, the computational cost of such RCMs prohibits generating ensembles of simulations or long time periods, limiting their applicability for hydrologic applications. Here we discuss a new approach combining statistical corrections with physical relationships at modest computational cost. We have developed the Intermediate Complexity Atmospheric Research model (ICAR) to provide a climate and weather downscaling option that is based primarily on physics, at a fraction of the computational requirements of a traditional regional climate model. ICAR also enables the incorporation of statistical adjustments directly within the model. We demonstrate that applying even simple corrections to precipitation while the model is running can improve the simulation of land-atmosphere feedbacks in ICAR. For example, by incorporating statistical corrections earlier in the modeling chain, we allow the model physics to better represent the effect of mountain snowpack on air temperature changes.

  10. Assessing the Added Value of Dynamical Downscaling Using ...

    EPA Pesticide Factsheets

    In this study, the Standardized Precipitation Index (SPI) is used to ascertain the added value of dynamical downscaling over the contiguous United States. WRF is used as a regional climate model (RCM) to dynamically downscale reanalysis fields, and the resulting SPI values are compared over drought timescales that have implications for agriculture and water resources planning. The regional climate generated by WRF shows the largest improvement over the reanalysis in SPI correlation with observations as the drought timescale increases. This suggests that dynamically downscaled fields may be more reliable than larger-scale fields for water resource applications (e.g., water storage within reservoirs). WRF improves the timing and intensity of moderate to extreme wet and dry periods, even in regions with homogeneous terrain. This study also examines changes in SPI during the extreme drought of 1988 and three "drought-busting" tropical storms. Each of those events illustrates the importance of using downscaling to resolve the spatial extent of droughts. The analysis of the "drought-busting" tropical storms demonstrates that, while the RCM improves the representation of these storms' role in ending prolonged droughts relative to the reanalysis, that role remains underestimated. These results illustrate the importance, and some limitations, of using RCMs to project drought. The National Exposure Research Laboratory's Atmospheric Modeling and Analysis Division (AMAD) conducts research in support of the EPA's mission to protect human health and the environment.
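    The SPI idea is to rank each period's accumulated precipitation within its climatology and express it in standard-normal units. Operational SPI fits a gamma distribution; the dependency-free sketch below (toy data) substitutes the empirical CDF with Weibull plotting positions, which preserves the ranking-and-transform logic:

```python
# Empirical SPI sketch: accumulate precipitation over a timescale, rank each
# accumulation in the sample, and map the rank through the inverse standard
# normal CDF. (A gamma fit, as in the operational index, is the usual choice.)
from statistics import NormalDist

def spi(precip, scale=3):
    """SPI-like z-scores at the given accumulation scale (in time steps)."""
    acc = [sum(precip[i - scale + 1:i + 1])
           for i in range(scale - 1, len(precip))]
    ranked = sorted(acc)
    n = len(acc)
    nd = NormalDist()
    # Weibull plotting position: rank / (n + 1), then invert the normal CDF.
    return [nd.inv_cdf((ranked.index(a) + 1) / (n + 1)) for a in acc]

# Toy monthly rainfall: a dry spell early on, a wet spell in mid-record.
rain = [30, 10, 5, 0, 0, 2, 40, 60, 55, 20, 15, 35]
z = spi(rain, scale=3)
```

    Negative values flag drier-than-normal accumulations and positive values wetter-than-normal ones; longer `scale` values correspond to the longer drought timescales discussed in the abstract.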

  11. Inter-comparison of multiple statistically downscaled climate datasets for the Pacific Northwest, USA

    PubMed Central

    Jiang, Yueyang; Kim, John B.; Still, Christopher J.; Kerns, Becky K.; Kline, Jeffrey D.; Cunningham, Patrick G.

    2018-01-01

    Statistically downscaled climate data have been widely used to explore possible impacts of climate change in various fields of study. Although many studies have focused on characterizing differences among downscaling methods, few have evaluated the actual downscaled datasets being distributed publicly. Focusing spatially on the Pacific Northwest, we compare five statistically downscaled climate datasets distributed publicly in the US: ClimateNA, NASA NEX-DCP30, MACAv2-METDATA, MACAv2-LIVNEH and WorldClim. We compare the downscaled projections of climate change and the associated observational data used as training data for downscaling. We map and quantify the variability among the datasets and characterize the spatio-temporal patterns of agreement and disagreement among them. Pair-wise comparisons identify the coast and high-elevation areas as areas of disagreement for temperature. For precipitation, high-elevation areas, rain shadows and the dry, eastern portion of the study area show high dissimilarity among the datasets. By spatially aggregating the variability measures into watersheds, we develop guidance for selecting datasets for climate change impact studies in the Pacific Northwest. PMID:29461513

  12. Inter-comparison of multiple statistically downscaled climate datasets for the Pacific Northwest, USA.

    PubMed

    Jiang, Yueyang; Kim, John B; Still, Christopher J; Kerns, Becky K; Kline, Jeffrey D; Cunningham, Patrick G

    2018-02-20

    Statistically downscaled climate data have been widely used to explore possible impacts of climate change in various fields of study. Although many studies have focused on characterizing differences among downscaling methods, few have evaluated the actual downscaled datasets being distributed publicly. Focusing spatially on the Pacific Northwest, we compare five statistically downscaled climate datasets distributed publicly in the US: ClimateNA, NASA NEX-DCP30, MACAv2-METDATA, MACAv2-LIVNEH and WorldClim. We compare the downscaled projections of climate change and the associated observational data used as training data for downscaling. We map and quantify the variability among the datasets and characterize the spatio-temporal patterns of agreement and disagreement among them. Pair-wise comparisons identify the coast and high-elevation areas as areas of disagreement for temperature. For precipitation, high-elevation areas, rain shadows and the dry, eastern portion of the study area show high dissimilarity among the datasets. By spatially aggregating the variability measures into watersheds, we develop guidance for selecting datasets for climate change impact studies in the Pacific Northwest.

  13. A Dynamical Downscaling Approach with GCM Bias Corrections and Spectral Nudging

    NASA Astrophysics Data System (ADS)

    Xu, Z.; Yang, Z.

    2013-12-01

    To reduce biases in regional climate downscaling simulations, a dynamical downscaling approach with GCM bias corrections and spectral nudging is developed and assessed over North America. Regional climate simulations are performed with the Weather Research and Forecasting (WRF) model embedded in the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). To reduce the GCM biases, the GCM climatological means and the variances of interannual variations are adjusted toward the National Centers for Environmental Prediction-NCAR global reanalysis products (NNRP) before they are used to drive WRF, as in our previous method. In this study, we further introduce spectral nudging to reduce RCM-based biases. Sets of WRF experiments are performed with and without spectral nudging; the experiments are otherwise identical except that the initial and lateral boundary conditions are derived from the NNRP, the original GCM output, or the bias-corrected GCM output. The GCM-driven RCM simulations with bias corrections and spectral nudging (IDDng) are compared with those without spectral nudging (IDD) and with North American Regional Reanalysis (NARR) data to assess the additional reduction in RCM biases relative to the IDD approach. The results show that spectral nudging carries the effect of the GCM bias correction into the RCM domain, thereby minimizing the climate drift resulting from RCM biases. Together, the GCM bias corrections and spectral nudging significantly improve the downscaled mean climate and extreme temperature simulations. Our results suggest that both GCM bias correction and spectral nudging are necessary to reduce the error of the downscaled climate; either one alone does not guarantee a better downscaling simulation. The new method can be applied to regional projections of future climate or to downscaling of GCM sensitivity simulations.
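    The mean-and-variance adjustment described above has a simple closed form: shift and rescale the GCM series so its climatological mean and interannual standard deviation match the reference. A minimal sketch with toy values (not the study's code):

```python
# Rescale a GCM series so that its mean and standard deviation match a
# reanalysis reference: x' = mr + (x - mg) * sr / sg.
from statistics import mean, pstdev

def correct(gcm, ref):
    mg, sg = mean(gcm), pstdev(gcm)
    mr, sr = mean(ref), pstdev(ref)
    return [mr + (x - mg) * sr / sg for x in gcm]

gcm = [14.2, 15.1, 13.8, 16.0, 14.9]   # toy series: biased warm, too variable
ref = [12.0, 12.5, 11.8, 13.1, 12.4]   # toy reanalysis reference

out = correct(gcm, ref)
```

    The corrected series keeps the GCM's year-to-year ordering (its anomaly pattern) while inheriting the reference climatology, which is exactly what allows the spectral nudging to propagate the correction into the RCM interior without fighting the boundary forcing.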

  14. Actor groups, related needs, and challenges at the climate downscaling interface

    NASA Astrophysics Data System (ADS)

    Rössler, Ole; Benestad, Rasmus; Diamando, Vlachogannis; Heike, Hübener; Kanamaru, Hideki; Pagé, Christian; Margarida Cardoso, Rita; Soares, Pedro; Maraun, Douglas; Kreienkamp, Frank; Christodoulides, Paul; Fischer, Andreas; Szabo, Peter

    2016-04-01

    At the climate downscaling interface, numerous downscaling techniques and different philosophies compete to be the best method on their own terms. It remains unclear to what extent, and for which purposes, these downscaling techniques are valid or even the most appropriate choice. A common validation framework comparing all the available methods has been missing so far; the VALUE initiative closes this gap. An essential part of a validation framework for downscaling techniques is the definition of appropriate validation measures. The selection of validation measures should consider the needs of the stakeholders: some might need a temporal or spatial average of a certain variable, others temporal or spatial distributions of some variables, still others extremes of the variables of interest or even inter-variable dependencies. Hence, close interaction between climate data providers and climate data users is necessary, and the challenge of formulating a common validation framework mirrors the challenges between the climate data providers and the impact assessment community. This poster elaborates on the issues and challenges at the downscaling interface as seen within the VALUE community. It identifies three actor groups: climate data providers and two groups of climate data users (impact modellers and societal users). The downscaling interface thus faces classical transdisciplinary challenges. We depict a graphical illustration of the actors involved and their interactions. In addition, we identify four types of issues that need to be considered: data-based, knowledge-based, communication-based, and structural. All of these may, individually or jointly, hinder an optimal exchange of data and information between the actor groups at the downscaling interface. Finally, some possible ways to tackle these issues are

  15. Tethys – A Python Package for Spatial and Temporal Downscaling of Global Water Withdrawals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.

    Downscaling water withdrawals from the regional/national to the local scale is a fundamental step, and a common problem, when integrating large-scale economic and integrated assessment models with high-resolution, sectorally detailed models. Tethys, open-access software written in Python, implements statistical downscaling algorithms to downscale water withdrawal data spatially and temporally to a finer scale. The spatial resolution is downscaled from the region/basin scale to a 0.5-geographic-degree grid, and the temporal resolution from year to month. Tethys is used to produce monthly global gridded water withdrawal products based on estimates from the Global Change Assessment Model (GCAM).

  16. Tethys – A Python Package for Spatial and Temporal Downscaling of Global Water Withdrawals

    DOE PAGES

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.; ...

    2018-02-09

    Downscaling water withdrawals from the regional/national to the local scale is a fundamental step, and a common problem, when integrating large-scale economic and integrated assessment models with high-resolution, sectorally detailed models. Tethys, open-access software written in Python, implements statistical downscaling algorithms to downscale water withdrawal data spatially and temporally to a finer scale. The spatial resolution is downscaled from the region/basin scale to a 0.5-geographic-degree grid, and the temporal resolution from year to month. Tethys is used to produce monthly global gridded water withdrawal products based on estimates from the Global Change Assessment Model (GCAM).
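    The spatial and temporal steps of this kind of downscaling reduce to proportional allocation: split a region's annual total across grid cells by a spatial proxy, then across months by a seasonal profile. A simplified sketch with hypothetical weights (see the Tethys source for the actual, sector-specific algorithms):

```python
# Proportional spatial + temporal downscaling sketch.

def downscale(annual_total, cell_weights, monthly_profile):
    """Allocate an annual regional total to (cell, month) pairs."""
    wsum = sum(cell_weights)
    psum = sum(monthly_profile)
    return [[annual_total * (w / wsum) * (p / psum)
             for p in monthly_profile]          # 12 monthly values per cell
            for w in cell_weights]

annual = 120.0                                  # toy regional total, km^3/yr
cells  = [5, 3, 2]                              # spatial proxy, e.g. population
season = [1, 1, 1, 1, 2, 3, 3, 2, 1, 1, 1, 1]   # irrigation-style summer peak

grid = downscale(annual, cells, season)
```

    Because every share is a fraction of the same total, the gridded monthly values sum back exactly to the regional annual estimate, the consistency property that matters when the output feeds a climate or hydrology model.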

  17. Combining super-ensembles and statistical emulation to improve a regional climate and vegetation model

    NASA Astrophysics Data System (ADS)

    Hawkins, L. R.; Rupp, D. E.; Li, S.; Sarah, S.; McNeall, D. J.; Mote, P.; Betts, R. A.; Wallom, D.

    2017-12-01

    Changing regional patterns of surface temperature, precipitation, and humidity may cause ecosystem-scale changes in vegetation, altering the distribution of trees, shrubs, and grasses. A changing vegetation distribution, in turn, alters the albedo, latent heat flux, and carbon exchanged with the atmosphere, with resulting feedbacks onto the regional climate. However, a wide range of earth-system processes that affect the carbon, energy, and hydrologic cycles occur at subgrid scales in climate models and must be parameterized. The appropriate parameter values in such parameterizations are often poorly constrained, leading to uncertainty in predictions of how the ecosystem will respond to changes in forcing. To better understand the sensitivity of regional climate to parameter selection and to improve regional climate and vegetation simulations, we used a large perturbed-physics ensemble and a suite of statistical emulators. We dynamically downscaled a super-ensemble (multiple parameter sets and multiple initial conditions) of global climate simulations using the 25-km-resolution regional climate model HadRM3p with the land-surface scheme MOSES2 and the dynamic vegetation module TRIFFID, simultaneously perturbing land-surface parameters governing the exchange of carbon, water, and energy between the land surface and the atmosphere over the western US. Statistical emulation was used as a computationally cost-effective tool to explore uncertainties in the interactions. Regions of parameter space that did not satisfy observational constraints were eliminated, and an ensemble of parameter sets that reduce regional biases and span a range of plausible interactions among earth-system processes was selected. This study demonstrated that by combining super-ensemble simulations with statistical emulation, simulations of regional climate could be improved while simultaneously accounting for a range of plausible land

  18. Rainfall Downscaling Conditional on Upper-air Atmospheric Predictors: Improved Assessment of Rainfall Statistics in a Changing Climate

    NASA Astrophysics Data System (ADS)

    Langousis, Andreas; Mamalakis, Antonis; Deidda, Roberto; Marrocu, Marino

    2015-04-01

    regional level. This is done for an intermediate-sized catchment in Italy, the Flumendosa catchment, using climate model rainfall and atmospheric data from the ENSEMBLES project (http://ensembleseu.metoffice.com). In doing so, we split the historical record of mean areal precipitation (MAP) into a 15-year calibration period and a 45-year validation period, and compare the historical rainfall statistics to those obtained from: a) Q-Q corrected climate model rainfall products, and b) synthetic rainfall series generated by the suggested downscaling scheme. To our knowledge, this is the first time that climate model rainfall and statistically downscaled precipitation have been compared to catchment-averaged MAP at a daily resolution. The obtained results are promising, since the proposed downscaling scheme is more accurate and robust in reproducing a number of historical rainfall statistics, independent of the climate model used and the length of the calibration period. This is particularly the case for yearly rainfall maxima, where direct statistical correction of climate model rainfall outputs shows increased sensitivity to the length of the calibration period and the climate model used. The robustness of the suggested downscaling scheme in modeling rainfall extremes at a daily resolution is a notable feature that can effectively be used to assess hydrologic risk at a regional level under changing climatic conditions. Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General Secretariat for Research and Technology), and is co-financed by the European Social Fund (ESF) and the Greek State. CRS4 gratefully acknowledges the contribution of the Sardinian regional authorities.
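    The Q-Q correction mentioned above maps each model value onto the observed value that occupies the same quantile in the calibration period. A minimal empirical sketch (toy series, nearest order statistic rather than interpolation; not the ENSEMBLES processing chain):

```python
# Empirical quantile-quantile (Q-Q) correction sketch.
from bisect import bisect_left

def qq_correct(model_cal, obs_cal, value):
    """Map 'value' from the model distribution onto the observed one."""
    ms = sorted(model_cal)
    ob = sorted(obs_cal)
    # Empirical quantile of 'value' within the model calibration sample.
    q = bisect_left(ms, value) / (len(ms) - 1)
    q = min(max(q, 0.0), 1.0)
    # Read off the observed value at the same quantile (nearest order statistic).
    return ob[round(q * (len(ob) - 1))]

model = [0, 1, 2, 4, 8, 16, 32, 40, 50, 80]   # toy model rainfall, too extreme
obs   = [0, 1, 2, 3, 5, 8, 12, 15, 20, 30]    # toy observed rainfall

corrected = [qq_correct(model, obs, v) for v in model]
```

    The mapping is monotone, so it reshapes the distribution without reordering events; the abstract's point is that the tails of such a correction are sensitive to how long the calibration sample is.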

  19. The Ensembl REST API: Ensembl Data for Any Language.

    PubMed

    Yates, Andrew; Beal, Kathryn; Keenan, Stephen; McLaren, William; Pignatelli, Miguel; Ritchie, Graham R S; Ruffier, Magali; Taylor, Kieron; Vullo, Alessandro; Flicek, Paul

    2015-01-01

    We present a Web service to access Ensembl data using Representational State Transfer (REST). The Ensembl REST server enables the easy retrieval of a wide range of Ensembl data by most programming languages, using standard formats such as JSON and FASTA while minimizing client work. We also introduce bindings to the popular Ensembl Variant Effect Predictor tool permitting large-scale programmatic variant analysis independent of any specific programming language. The Ensembl REST API can be accessed at http://rest.ensembl.org and source code is freely available under an Apache 2.0 license from http://github.com/Ensembl/ensembl-rest. © The Author 2014. Published by Oxford University Press.
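    A typical interaction with the service is a GET against the lookup endpoint with `content-type=application/json`. The sketch below builds the request URL and reduces a response payload to a few fields; the payload shown is a toy stand-in in the shape the lookup endpoint returns (the field names follow the public docs; the coordinates here are placeholders, not real annotation):

```python
# Sketch of querying the Ensembl REST lookup endpoint.
import json
from urllib.request import Request, urlopen

SERVER = "https://rest.ensembl.org"

def lookup_url(stable_id):
    return f"{SERVER}/lookup/id/{stable_id}?content-type=application/json"

def fetch(url):
    """Perform the actual HTTP call (kept separate so it is easy to mock)."""
    return json.load(urlopen(Request(url, headers={"Accept": "application/json"})))

def gene_summary(payload):
    """Reduce a lookup response to the fields most analyses need."""
    return (payload["display_name"], payload["seq_region_name"],
            payload["start"], payload["end"])

url = lookup_url("ENSG00000157764")
# Toy payload in the shape of a lookup response (placeholder values):
sample = {"display_name": "GENE_X", "seq_region_name": "7",
          "start": 100, "end": 200}
name, chrom, start, end = gene_summary(sample)
```

    In a live session one would call `fetch(url)` instead of using the canned payload; keeping URL construction and parsing separate from the network call also makes the client testable offline.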

  20. Downscaling Global Emissions and Its Implications Derived from Climate Model Experiments

    PubMed Central

    Abe, Manabu; Kinoshita, Tsuguki; Hasegawa, Tomoko; Kawase, Hiroaki; Kushida, Kazuhide; Masui, Toshihiko; Oka, Kazutaka; Shiogama, Hideo; Takahashi, Kiyoshi; Tatebe, Hiroaki; Yoshikawa, Minoru

    2017-01-01

    In climate change research, future scenarios of greenhouse gas and air pollutant emissions generated by integrated assessment models (IAMs) are used in climate models (CMs) and earth system models to analyze future interactions and feedbacks between human activities and climate. However, the spatial resolutions of IAMs and CMs differ. IAMs usually disaggregate the world into 10–30 aggregated regions, whereas CMs require a grid-based spatial resolution. Therefore, downscaling the IAM emissions data to a finer scale is necessary before they can be input to CMs. In this study, we examined whether differences in downscaling methods significantly affect climate variables such as temperature and precipitation. We tested two downscaling methods using the same regionally aggregated sulfur emissions scenario obtained from the Asian-Pacific Integrated Model/Computable General Equilibrium (AIM/CGE) model, and fed the downscaled emissions into the Model for Interdisciplinary Research on Climate (MIROC). One method assumed a strong convergence of national emissions intensity (e.g., emissions per unit of gross domestic product), while the other was based on inertia (i.e., base-year spatial patterns remained unchanged). The emissions intensities in the downscaled spatial emissions generated by the two methods differed markedly, whereas the emissions densities (emissions per unit area) were similar. We investigated whether the projected changes in temperature and precipitation would differ significantly between the two methods by applying a field significance test, and found little evidence of a significant difference; nor was there clear evidence of a difference between the climate simulations based on the two downscaling methods. PMID:28076446

  1. Evaluation of downscaled, gridded climate data for the conterminous United States

    USGS Publications Warehouse

Behnke, Robert J.; Vavrus, Stephen J.; Allstadt, Andrew; Albright, Thomas P.; Thogmartin, Wayne E.; Radeloff, Volker C.

    2016-01-01

    Weather and climate affect many ecological processes, making spatially continuous yet fine-resolution weather data desirable for ecological research and predictions. Numerous downscaled weather data sets exist, but little attempt has been made to evaluate them systematically. Here we address this shortcoming by focusing on four major questions: (1) How accurate are downscaled, gridded climate data sets in terms of temperature and precipitation estimates?, (2) Are there significant regional differences in accuracy among data sets?, (3) How accurate are their mean values compared with extremes?, and (4) Does their accuracy depend on spatial resolution? We compared eight widely used downscaled data sets that provide gridded daily weather data for recent decades across the United States. We found considerable differences among data sets and between downscaled and weather station data. Temperature is represented more accurately than precipitation, and climate averages are more accurate than weather extremes. The data set exhibiting the best agreement with station data varies among ecoregions. Surprisingly, the accuracy of the data sets does not depend on spatial resolution. Although some inherent differences among data sets and weather station data are to be expected, our findings highlight how much different interpolation methods affect downscaled weather data, even for local comparisons with nearby weather stations located inside a grid cell. More broadly, our results highlight the need for careful consideration among different available data sets in terms of which variables they describe best, where they perform best, and their resolution, when selecting a downscaled weather data set for a given ecological application.

  2. NASA Downscaling Project

    NASA Technical Reports Server (NTRS)

    Ferraro, Robert; Waliser, Duane; Peters-Lidard, Christa

    2017-01-01

A team of researchers from NASA Ames Research Center, Goddard Space Flight Center, the Jet Propulsion Laboratory, and Marshall Space Flight Center, along with university partners at UCLA, conducted an investigation to explore whether downscaling coarse resolution global climate model (GCM) predictions might provide valid insights into the regional impacts sought by decision makers. Since the computational cost of running global models at high spatial resolution for any useful climate scale period is prohibitive, the hope for downscaling is that a coarse resolution GCM provides sufficiently accurate synoptic scale information for a regional climate model (RCM) to accurately develop the fine scale features that represent the regional impacts of a changing climate. As a proxy for a prognostic climate forecast model, and so that ground truth in the form of satellite and in-situ observations could be used for evaluation, the MERRA and MERRA-2 reanalyses were used to drive the NU-WRF regional climate model and a GEOS-5 replay. This was performed at resolutions 2 to 10 times higher than the reanalysis forcing. A number of experiments were conducted that varied resolution, model parameterizations, and intermediate-scale nudging, for simulations over the continental US during 2000-2010. The results of these experiments were compared against observational datasets to evaluate the output.

  3. Dynamical Downscaling of NASA/GISS ModelE: Continuous, Multi-Year WRF Simulations

    NASA Astrophysics Data System (ADS)

    Otte, T.; Bowden, J. H.; Nolte, C. G.; Otte, M. J.; Herwehe, J. A.; Faluvegi, G.; Shindell, D. T.

    2010-12-01

The WRF Model is being used at the U.S. EPA for dynamical downscaling of the NASA/GISS ModelE fields to assess regional impacts of climate change in the United States. The WRF model has been successfully linked to the ModelE fields in their raw hybrid vertical coordinate, and continuous, multi-year WRF downscaling simulations have been performed. WRF will be used to downscale decadal time slices of ModelE for recent past, current, and future climate as the simulations being conducted for the IPCC Fifth Assessment Report become available. This presentation will focus on the sensitivity to interior nudging within the RCM. The use of interior nudging for downscaled regional climate simulations has been somewhat controversial over the past several years but has recently been attracting attention. Several recent studies that have used reanalysis (i.e., verifiable) fields as a proxy for GCM input have shown that interior nudging can be beneficial toward achieving the desired downscaled fields. In this study, the value of nudging will be shown using fields from ModelE that are downscaled using WRF. Several different methods of nudging are explored, and it will be shown that the method of nudging and the choices made with respect to how nudging is used in WRF are critical to balance the constraint of ModelE against the freedom of WRF to develop its own fields.

  4. Projecting future precipitation and temperature at sites with diverse climate through multiple statistical downscaling schemes

    NASA Astrophysics Data System (ADS)

    Vallam, P.; Qin, X. S.

    2017-10-01

Anthropogenically driven climate change would affect the global ecosystem and is becoming a worldwide concern. Numerous studies have been undertaken to determine the future trends of meteorological variables at different scales. Despite these studies, there remains significant uncertainty in the prediction of future climates. To examine the uncertainty arising from using different schemes to downscale the meteorological variables for future horizons, projections from different statistical downscaling schemes were examined. These schemes included the Statistical DownScaling Model (SDSM), the change factor method incorporated with LARS-WG, and the bias-corrected disaggregation (BCD) method. Global circulation models (GCMs) based on CMIP3 (HadCM3) and CMIP5 (CanESM2) were utilized to perturb the changes in the future climate. Five study sites (i.e., Alice Springs, Edmonton, Frankfurt, Miami, and Singapore) with diverse climatic conditions were chosen for examining the spatial variability of applying various statistical downscaling schemes. The study results indicated that regions experiencing heavy precipitation intensities were most likely to demonstrate divergence between the predictions from the various statistical downscaling methods. Also, the variance computed in projecting the weather extremes indicated the uncertainty derived from the selection of downscaling tools and climate models. This study could help gain an improved understanding of the features of different downscaling approaches and the overall downscaling uncertainty.
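The change factor (delta) approach mentioned among these schemes can be sketched in a few lines: an observed series is perturbed by the GCM-simulated change signal, additively for temperature and multiplicatively for precipitation. The values and function names below are hypothetical illustrations, not code from any of the tools named in the abstract.

```python
import numpy as np

def change_factor(obs, gcm_hist, gcm_fut, multiplicative=False):
    """Perturb an observed series by the GCM-simulated change signal.

    Additive factors are typical for temperature; multiplicative
    factors are typical for precipitation (avoids negative values).
    """
    if multiplicative:
        factor = np.mean(gcm_fut) / np.mean(gcm_hist)
        return obs * factor
    delta = np.mean(gcm_fut) - np.mean(gcm_hist)
    return obs + delta

# Hypothetical monthly-mean values (degC)
obs_temp = np.array([10.0, 12.0, 15.0])   # observed station temperature
gcm_hist = np.array([9.0, 11.5, 14.0])    # GCM, historical period
gcm_fut = np.array([11.0, 13.5, 16.0])    # GCM, future period

# GCM warms by 2 degC on average, so the observations are shifted by +2
future_temp = change_factor(obs_temp, gcm_hist, gcm_fut)
```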

  5. The Ensembl REST API: Ensembl Data for Any Language

    PubMed Central

    Yates, Andrew; Beal, Kathryn; Keenan, Stephen; McLaren, William; Pignatelli, Miguel; Ritchie, Graham R. S.; Ruffier, Magali; Taylor, Kieron; Vullo, Alessandro; Flicek, Paul

    2015-01-01

    Motivation: We present a Web service to access Ensembl data using Representational State Transfer (REST). The Ensembl REST server enables the easy retrieval of a wide range of Ensembl data by most programming languages, using standard formats such as JSON and FASTA while minimizing client work. We also introduce bindings to the popular Ensembl Variant Effect Predictor tool permitting large-scale programmatic variant analysis independent of any specific programming language. Availability and implementation: The Ensembl REST API can be accessed at http://rest.ensembl.org and source code is freely available under an Apache 2.0 license from http://github.com/Ensembl/ensembl-rest. Contact: ayates@ebi.ac.uk or flicek@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25236461
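As a minimal illustration of the service described above, the Ensembl REST lookup endpoint can be queried from Python's standard library alone; the stable ID below (human BRAF) is just an example.

```python
import json
import urllib.request

SERVER = "https://rest.ensembl.org"

def lookup_url(stable_id):
    """Build the REST URL for looking up an Ensembl stable ID as JSON."""
    return f"{SERVER}/lookup/id/{stable_id}?content-type=application/json"

def lookup(stable_id):
    """Fetch metadata for a stable ID as a dict (requires network access)."""
    with urllib.request.urlopen(lookup_url(stable_id)) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Returns species, biotype, genomic coordinates, etc. for the gene
    print(lookup("ENSG00000157764"))
```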

  6. Downscaled climate projections for the Southeast United States: evaluation and use for ecological applications

    USGS Publications Warehouse

Wootten, Adrienne; Smith, Kara; Boyles, Ryan; Terando, Adam; Stefanova, Lydia; Misra, Vasu; Smith, Tom; Blodgett, David L.; Semazzi, Fredrick

    2014-01-01

Climate change is likely to have many effects on natural ecosystems in the Southeast U.S. The National Climate Assessment Southeast Technical Report (SETR) indicates that natural ecosystems in the Southeast are likely to be affected by warming temperatures, ocean acidification, sea-level rise, and changes in rainfall and evapotranspiration. To better assess how these climate changes could affect multiple sectors, including ecosystems, climatologists have created several downscaled climate projections (or downscaled datasets) that translate information from the global climate models (GCMs) to regional or local scales. The process of creating these downscaled datasets, known as downscaling, can be carried out using a broad range of statistical or numerical modeling techniques. The rapid proliferation of techniques that can be used for downscaling and the number of downscaled datasets produced in recent years present many challenges for scientists and decision makers in assessing the impact or vulnerability of a given species or ecosystem to climate change. Given the number of available downscaled datasets, how do these model outputs compare to each other? Which variables are available, and are certain downscaled datasets more appropriate for assessing the vulnerability of a particular species? Given the desire to use these datasets for impact and vulnerability assessments and the lack of comparison between them, the goal of this report is to synthesize the information available in these downscaled datasets and provide guidance to scientists and natural resource managers with specific interests in ecological modeling and conservation planning related to climate change in the Southeast U.S. This report enables the Southeast Climate Science Center (SECSC) to address an important strategic goal of providing scientific information and guidance that will enable resource managers and other participants in Landscape Conservation Cooperatives to make science

  7. Dynamical Downscaling of Climate Change over the Hawaiian Islands

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Zhang, C.; Hamilton, K. P.; Lauer, A.

    2015-12-01

The pseudo-global-warming (PGW) method was applied to the Hawaii Regional Climate Model (HRCM) to dynamically downscale the projected climate in the late 21st century over the Hawaiian Islands. The initial and boundary conditions were adopted from MERRA reanalysis and NOAA SST data for the present-day simulations. Global warming increments constructed from the CMIP3 multi-model ensemble mean were added to the reanalysis and SST data to perform the future climate simulations. We found that the Hawaiian Islands are vulnerable to global warming effects and that the changes are diverse due to the varied topography. The windward side will have more clouds and receive more rainfall, with the increase of moisture in the boundary layer making the major contribution; in contrast, the leeward side will have fewer clouds and less rainfall. Clouds and rain can slightly slow the warming trend over the windward side. Temperature increases almost linearly with terrain height. Cloud base and top heights will decline slightly in response to the slightly lower trade wind inversion base height, while trade wind occurrence frequency will increase by about 8% in the future. More extreme rainfall events will occur in the warming climate over the Hawaiian Islands, and the snow cover on the tops of Mauna Kea and Mauna Loa will nearly disappear in future winters.

  8. Assessing Fire Weather Index using statistical downscaling and spatial interpolation techniques in Greece

    NASA Astrophysics Data System (ADS)

    Karali, Anna; Giannakopoulos, Christos; Frias, Maria Dolores; Hatzaki, Maria; Roussos, Anargyros; Casanueva, Ana

    2013-04-01

Forest fires have always been present in Mediterranean ecosystems, and thus constitute a major ecological and socio-economic issue. Over the last few decades, though, the number of forest fires has significantly increased, as have their severity and impact on the environment. Local fire danger projections are often required when dealing with wildfire research. In the present study, statistical downscaling and spatial interpolation methods were applied to the Canadian Fire Weather Index (FWI) in order to assess forest fire risk in Greece. The FWI is used worldwide (including the Mediterranean basin) to estimate fire danger in a generalized fuel type, based solely on weather observations. The meteorological inputs to the FWI System are noon values of dry-bulb temperature, air relative humidity, 10 m wind speed, and precipitation during the previous 24 hours. The statistical downscaling methods are based on a statistical model that takes into account empirical relationships between large-scale variables (used as predictors) and local-scale variables. In the framework of the current study, the statistical downscaling portal developed by the Santander Meteorology Group (https://www.meteo.unican.es/downscaling) within the EU project CLIMRUN (www.climrun.eu) was used to downscale non-standard parameters related to forest fire risk. Two different approaches were adopted. Firstly, the analogue downscaling technique was applied directly to the FWI values; secondly, the same technique was applied indirectly through the meteorological inputs of the index. In both cases, the statistical downscaling portal was used with the ERA-Interim reanalysis as predictands, due to the lack of observations at noon. Additionally, a three-dimensional (3D) interpolation method of position and elevation, based on Thin Plate Splines (TPS), was used to interpolate the ERA-Interim data used to calculate the index
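Stated generically, the analogue technique used here amounts to a nearest-neighbour search: for a target large-scale state, find the most similar historical large-scale pattern in a calibration catalogue and return its collocated local value. The following is a toy sketch with made-up numbers, not code from the downscaling portal.

```python
import numpy as np

def analogue_downscale(target, catalog_predictors, catalog_local):
    """Return the local value whose large-scale predictor pattern is
    closest (Euclidean distance) to the target pattern."""
    dists = np.linalg.norm(catalog_predictors - target, axis=1)
    return catalog_local[np.argmin(dists)]

# Hypothetical toy catalogue: 4 historical days, 3 large-scale predictors each
predictors = np.array([[0.0, 1.0, 2.0],
                       [5.0, 5.0, 5.0],
                       [1.0, 1.0, 1.0],
                       [9.0, 0.0, 3.0]])
local_fwi = np.array([4.2, 30.5, 8.1, 17.0])  # collocated local FWI values

today = np.array([4.8, 5.1, 5.2])  # target large-scale state
result = analogue_downscale(today, predictors, local_fwi)
print(result)  # nearest catalogue day is the second one -> 30.5
```

In practice the predictors would be standardized fields (e.g. pressure, temperature) rather than raw toy numbers, and more than one analogue is often retained.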

  9. Multi objective climate change impact assessment using multi downscaled climate scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-04-01

Output from Global Climate Models (GCMs) is often downscaled to provide climatic parameters at regional and local scales. In the present study, we analyzed changes in precipitation and temperature for the future period 2070-2099 relative to the historical period 1970-2000, using a set of statistically downscaled GCM projections for the Columbia River Basin (CRB). The analysis used two different statistically downscaled climate projections, namely the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. Analysis was performed on spatial, temporal, and frequency-based parameters in the future period at a scale of 1/16th of a degree for the entire CRB. Results indicate varied spatial change patterns across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Frequency analysis provided insights into possible explanations for the changes in precipitation.

  10. Stochastic Downscaling of Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Rasera, Luiz Gustavo; Mariethoz, Gregoire; Lane, Stuart N.

    2016-04-01

High-resolution digital elevation models (HR-DEMs) are extremely important for the understanding of small-scale geomorphic processes in Alpine environments. In the last decade, remote sensing techniques have experienced a major technological evolution, enabling fast and precise acquisition of HR-DEMs. However, sensors designed to measure elevation data still feature different spatial resolution and coverage capabilities. Terrestrial altimetry allows the acquisition of HR-DEMs with centimeter- to millimeter-level precision, but only within small spatial extents and often with dead ground problems. Conversely, satellite radiometric sensors are able to gather elevation measurements over large areas but with limited spatial resolution. In the present study, we propose an algorithm to downscale low-resolution satellite-based DEMs using topographic patterns extracted from HR-DEMs derived, for example, from ground-based and airborne altimetry. The method consists of a multiple-point geostatistical simulation technique able to generate high-resolution elevation data from low-resolution digital elevation models (LR-DEMs). Initially, two collocated DEMs with different spatial resolutions serve as an input to construct a database of topographic patterns, which is also used to infer the statistical relationships between the two scales. High-resolution elevation patterns are then retrieved from the database to downscale a LR-DEM through a stochastic simulation process. The outputs of the simulations are multiple equally probable DEMs with higher spatial resolution that also depict the large-scale geomorphic structures present in the original LR-DEM. As these multiple models reflect the uncertainty related to the downscaling, they can be employed to quantify the uncertainty of phenomena that are dependent on fine topography, such as catchment hydrological processes. The proposed methodology is illustrated for a case study in the Swiss Alps. 
A swissALTI3D HR-DEM (with 5 m resolution

  11. Ensemble Simulation of Sierra Nevada Snowmelt Runoff Using a Regional Climate Modeling Approach

    NASA Astrophysics Data System (ADS)

    Holtzman, N.; Pavelsky, T.; Wrzesien, M.

    2017-12-01

    The snowmelt-dominated watersheds on the western slopes of the California Sierra Nevada drain into reservoirs that generate electricity and help irrigate Central Valley farms. At the end of the wet season of each year, around April 1, most of the water that will become runoff in these basins is stored as snow at high elevations. Snow measurements provide a good estimate of the total annual runoff to come. For efficient water management, however, it is also useful to know the timing of runoff. When and how large will the peak flow into a reservoir be, and how fast will the flow decline after it peaks? We address such questions using a coupled regional climate and land surface model, WRF and Noah-MP, to dynamically downscale the North American Regional Reanalysis (NARR) with an ensemble approach. First, we assess several methods of deriving melt-season runoff from WRF. We run WRF for a complete water year, and also test initializing WRF snow from observation-based datasets at the approximate date of peak snow water equivalent. By aggregating the modeled runoffs over the drainage basins of reservoirs and comparing to naturalized flow data, we can assess the basin-scale snow accumulation accuracy of WRF and the other datasets in the Sierra. After choosing a procedure to set the model snow at the end of the wet season, we apply in WRF the melt-season meteorology from 20 different past years of NARR to produce an ensemble of simulations, each with modeled flows into 8 reservoirs spanning the Sierra. We use the ensemble to characterize the likely spread in the timing and magnitude of hydrologic outcomes during the melt season. Probabilistic forecasts can help water-energy systems operate more efficiently. The ensemble also shows the effect of warm-season temperature extremes on flow timing, allowing human systems to prepare for those possibilities. Finally, the ensemble provides a baseline estimate of the maximum variability in runoff timing that could be generated by

  12. Assessing the Added Value of Dynamical Downscaling Using the Standardized Precipitation Index

    EPA Science Inventory

    In this study, the Standardized Precipitation Index (SPI) is used to ascertain the added value of dynamical downscaling over the contiguous United States. WRF is used as a regional climate model (RCM) to dynamically downscale reanalysis fields to compare values of SPI over drough...
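For readers unfamiliar with the SPI, the index maps precipitation accumulations onto a standard normal scale via their cumulative probabilities. The sketch below uses a nonparametric variant (empirical ranks in place of the usual fitted gamma distribution) with illustrative values; it is not the EPA study's implementation.

```python
import statistics
import numpy as np

def spi_empirical(precip):
    """Nonparametric SPI: replace the usual fitted gamma CDF with the
    empirical CDF (Weibull plotting positions), then map each cumulative
    probability onto the standard normal distribution."""
    n = len(precip)
    ranks = np.argsort(np.argsort(precip)) + 1   # 1..n, assumes distinct values
    prob = ranks / (n + 1.0)                     # strictly inside (0, 1)
    normal = statistics.NormalDist()
    return np.array([normal.inv_cdf(p) for p in prob])

# Hypothetical monthly precipitation accumulations (mm)
monthly_precip = np.array([12.0, 80.0, 33.0, 5.0, 61.0,
                           47.0, 20.0, 95.0, 41.0, 28.0, 55.0])
index = spi_empirical(monthly_precip)
# Negative index values flag dry months, positive values wet months;
# values below about -1.5 are commonly read as severe drought.
```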

  13. Modelling climate impact on floods under future emission scenarios using an ensemble of climate model projections

    NASA Astrophysics Data System (ADS)

    Wetterhall, F.; Cloke, H. L.; He, Y.; Freer, J.; Pappenberger, F.

    2012-04-01

Evidence provided by modelled assessments of climate change impact on flooding is fundamental to water resource and flood risk decision making. Impact models usually rely on climate projections from Global and Regional Climate Models, and there is no doubt that these provide a useful assessment of future climate change. However, cascading ensembles of climate projections into impact models is not straightforward because of the coarse resolution of Global and Regional Climate Models (GCM/RCM) and their deficiencies in modelling high-intensity precipitation events. Thus decisions must be made on how to appropriately pre-process the meteorological variables from GCM/RCMs, such as the selection of downscaling methods and the application of Model Output Statistics (MOS). In this paper a grand ensemble of projections from several GCM/RCMs is used to drive a hydrological model and analyse the resulting future flood projections for the Upper Severn, UK. The impact and implications of applying MOS techniques to precipitation, as well as hydrological model parameter uncertainty, are taken into account. The resultant grand ensemble of future river discharge projections from the RCM/GCM-hydrological model chain is evaluated against a response surface technique combined with a perturbed physics experiment creating a probabilistic ensemble of climate model outputs. The ensemble distribution of results shows that the future risk of flooding in the Upper Severn increases compared to present conditions; however, the study highlights that the uncertainties are large and that strong assumptions were made in using Model Output Statistics to produce the estimates of future discharge. The importance of analysing on a seasonal rather than just an annual basis is highlighted. The inability of the RCMs (and GCMs) to produce realistic precipitation patterns, even under present conditions, is a major caveat of local climate impact studies on flooding, and this should be a focus for future development.

  14. Evaluating the Appropriateness of Downscaled Climate Information for Projecting Risks of Salmonella

    PubMed Central

    Guentchev, Galina S.; Rood, Richard B.; Ammann, Caspar M.; Barsugli, Joseph J.; Ebi, Kristie; Berrocal, Veronica; O’Neill, Marie S.; Gronlund, Carina J.; Vigh, Jonathan L.; Koziol, Ben; Cinquini, Luca

    2016-01-01

Foodborne diseases have large economic and societal impacts worldwide. To evaluate how the risks of foodborne diseases might change in response to climate change, credible and usable climate information tailored to the specific application question is needed. Global Climate Model (GCM) data generally need both to be downscaled to the scales of the application to be usable and to represent well the key characteristics that drive health impacts. This study presents an evaluation of temperature-based heat indices for the Washington D.C. area derived from statistically downscaled GCM simulations for 1971–2000, a necessary step in establishing the credibility of these data. The indices approximate high weekly mean temperatures linked previously to occurrences of Salmonella infections. Owing to the bias correction included in the Asynchronous Regional Regression Model (ARRM) and the Bias Correction Constructed Analogs (BCCA) downscaling methods, the observed 30-year means of the heat indices were reproduced reasonably well. In April and May, however, some of the statistically downscaled data misrepresent the increase in the number of hot days towards the summer months. This study demonstrates the dependence of the outcomes on the selection of downscaled climate data and the potential for misinterpretation of future estimates of Salmonella infections. PMID:26938544

  15. Evaluating the Appropriateness of Downscaled Climate Information for Projecting Risks of Salmonella.

    PubMed

    Guentchev, Galina S; Rood, Richard B; Ammann, Caspar M; Barsugli, Joseph J; Ebi, Kristie; Berrocal, Veronica; O'Neill, Marie S; Gronlund, Carina J; Vigh, Jonathan L; Koziol, Ben; Cinquini, Luca

    2016-02-29

Foodborne diseases have large economic and societal impacts worldwide. To evaluate how the risks of foodborne diseases might change in response to climate change, credible and usable climate information tailored to the specific application question is needed. Global Climate Model (GCM) data generally need both to be downscaled to the scales of the application to be usable and to represent well the key characteristics that drive health impacts. This study presents an evaluation of temperature-based heat indices for the Washington D.C. area derived from statistically downscaled GCM simulations for 1971-2000, a necessary step in establishing the credibility of these data. The indices approximate high weekly mean temperatures linked previously to occurrences of Salmonella infections. Owing to the bias correction included in the Asynchronous Regional Regression Model (ARRM) and the Bias Correction Constructed Analogs (BCCA) downscaling methods, the observed 30-year means of the heat indices were reproduced reasonably well. In April and May, however, some of the statistically downscaled data misrepresent the increase in the number of hot days towards the summer months. This study demonstrates the dependence of the outcomes on the selection of downscaled climate data and the potential for misinterpretation of future estimates of Salmonella infections.

  16. Downscaling climate model output for water resources impacts assessment (Invited)

    NASA Astrophysics Data System (ADS)

    Maurer, E. P.; Pierce, D. W.; Cayan, D. R.

    2013-12-01

    Water agencies in the U.S. and around the globe are beginning to wrap climate change projections into their planning procedures, recognizing that ongoing human-induced changes to hydrology can affect water management in significant ways. Future hydrology changes are derived using global climate model (GCM) projections, though their output is at a spatial scale that is too coarse to meet the needs of those concerned with local and regional impacts. Those investigating local impacts have employed a range of techniques for downscaling, the process of translating GCM output to a more locally-relevant spatial scale. Recent projects have produced libraries of publicly-available downscaled climate projections, enabling managers, researchers and others to focus on impacts studies, drawing from a shared pool of fine-scale climate data. Besides the obvious advantage to data users, who no longer need to develop expertise in downscaling prior to examining impacts, the use of the downscaled data by hundreds of people has allowed a crowdsourcing approach to examining the data. The wide variety of applications employed by different users has revealed characteristics not discovered during the initial data set production. This has led to a deeper look at the downscaling methods, including the assumptions and effect of bias correction of GCM output. Here new findings are presented related to the assumption of stationarity in the relationships between large- and fine-scale climate, as well as the impact of quantile mapping bias correction on precipitation trends. The validity of these assumptions can influence the interpretations of impacts studies using data derived using these standard statistical methods and help point the way to improved methods.
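Quantile mapping, the bias-correction method whose assumptions are examined above, can be sketched as follows: each model value is assigned its quantile within the historical model distribution and then replaced by the same quantile of the observations. The data below are synthetic (a uniform 2 degC warm bias) and the function is a simplified empirical version, not any specific library's implementation.

```python
import numpy as np

def quantile_map(model, obs, model_hist):
    """Map each model value through its empirical quantile in the
    historical model run onto the same quantile of the observations."""
    q = np.searchsorted(np.sort(model_hist), model) / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs, q)

# Synthetic example: the model runs uniformly 2 degC too warm
obs = np.arange(10.0, 20.0, 0.5)           # observed temperatures (degC)
model_hist = obs + 2.0                      # biased historical simulation
model_fut = np.array([14.0, 18.0, 21.0])    # biased future values

# Correction should bring each future value close to "value - 2"
corrected = quantile_map(model_fut, obs, model_hist)
```

Note that quantile mapping corrects the full distribution, not just the mean; the stationarity assumption discussed in the abstract is that this quantile relationship still holds in the future climate.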

  17. What model resolution is required in climatological downscaling over complex terrain?

    NASA Astrophysics Data System (ADS)

    El-Samra, Renalda; Bou-Zeid, Elie; El-Fadel, Mutasem

    2018-05-01

This study presents results from the Weather Research and Forecasting (WRF) model applied for climatological downscaling simulations over highly complex terrain along the Eastern Mediterranean. We sequentially downscale general circulation model results, for a mild and wet year (2003) and a hot and dry year (2010), to three local horizontal resolutions of 9, 3 and 1 km. Simulated near-surface hydrometeorological variables are compared at different time scales against data from an observational network over the study area comprising rain gauges, anemometers, and thermometers. The overall performance of WRF at 1 and 3 km horizontal resolution was satisfactory, with significant improvement over the 9 km downscaling simulation. The total yearly precipitation from WRF's 1 km and 3 km domains exhibited < 10% bias with respect to observational data. The errors in minimum and maximum temperatures were reduced by the downscaling, along with a high-quality delineation of temperature variability and extremes for both the 1 and 3 km resolution runs. Wind speeds, on the other hand, are generally overestimated at all model resolutions in comparison with observational data, particularly on the coast (up to 50%) compared to inland stations (up to 40%). The findings therefore indicate that a 3 km resolution is sufficient for the downscaling, especially as it would allow more years and scenarios to be investigated than the higher 1 km resolution at the same computational effort. In addition, the results provide a quantitative measure of the potential errors for various hydrometeorological variables.

  18. Simulation's Ensemble is Better Than Ensemble Simulation

    NASA Astrophysics Data System (ADS)

    Yan, X.

    2017-12-01

State Key Laboratory of Earth Surface Processes and Resource Ecology (ESPRE), Beijing Normal University, 19 Xinjiekouwai Street, Haidian District, Beijing 100875, China. Email: yxd@bnu.edu.cn. A dynamical system is simulated forward from an initial state, but initial-state data carry great uncertainty, which leads to uncertainty in the simulation. Simulation from multiple possible initial states has therefore been used widely in atmospheric science and has indeed been shown to lower this uncertainty; it is named a simulation's ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble, compared with community-based simulation (most ecosystem models). In this talk, we will address the advantages of individual-based simulation and even of its ensembles.

  19. Improvement of downscaled rainfall and temperature across generations over the Western Himalayan region of India

    NASA Astrophysics Data System (ADS)

    Das, L.; Dutta, M.; Akhter, J.; Meher, J. K.

    2016-12-01

It is a challenging task to create station-level (local-scale) climate change information over the mountainous locations of the Western Himalayan Region (WHR) of India because of limited data availability and poor data quality. In the present study, missing values in the station data were handled through the Multiple Imputation by Chained Equations (MICE) technique. Finally, 22 rain gauge stations with continuous records during 1901-2005 and 16 temperature stations with continuous records during 1969-2009 were considered as reference stations for developing downscaled rainfall and temperature time series from five commonly available GCMs in the IPCC's different generations of assessment reports, namely the 2nd, 3rd, 4th and 5th, hereafter known as SAR, TAR, AR4 and AR5 respectively. Downscaling models were developed using combined data from the ERA-Interim reanalysis and the GCMs' historical runs (even though the forcings were not identical across generations) as predictors, and station-level rainfall and temperature as predictands. Station-level downscaled rainfall and temperature time series were constructed for the five GCMs available in each generation. A regionally averaged downscaled time series comprising all stations was prepared for each model and generation, and the downscaled results were compared with the observed time series. Finally, an Overall Model Improvement Index (OMII) was developed from the downscaling results and used to investigate model improvement across generations, as well as the improvement of downscaling results obtained from the Empirical Statistical Downscaling (ESD) methods. For temperature, models have improved from SAR to AR5 over the study area, although in almost all GCMs, TAR shows the worst performance over the WHR according to the statistical indices used in this study. For precipitation, no model has shown gradual improvement from SAR to AR5, for either interpolated or downscaled values.
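A minimal sketch of the kind of regression-based ESD transfer function described above, with hypothetical numbers: a station series is regressed on a large-scale predictor during a calibration period, and the fitted coefficients are then applied to new predictor values. Real ESD models typically use several standardized predictors; this toy uses one.

```python
import numpy as np

def fit_esd(predictor, predictand):
    """Least-squares fit of a station series onto a large-scale
    predictor; returns (intercept, slope) of the transfer function."""
    A = np.column_stack([np.ones_like(predictor), predictor])
    coeffs, *_ = np.linalg.lstsq(A, predictand, rcond=None)
    return coeffs

def apply_esd(coeffs, predictor):
    """Apply the calibrated transfer function to new predictor values."""
    return coeffs[0] + coeffs[1] * predictor

# Hypothetical calibration: the station runs 3 degC cooler than its grid cell
grid = np.array([10.0, 12.0, 14.0, 16.0, 18.0])   # large-scale temperature
station = grid - 3.0                               # station observations
coeffs = fit_esd(grid, station)

# Downscale two new large-scale values to the station
downscaled = apply_esd(coeffs, np.array([20.0, 22.0]))
```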

  20. High-resolution downscaling for hydrological management

    NASA Astrophysics Data System (ADS)

    Ulbrich, Uwe; Rust, Henning; Meredith, Edmund; Kpogo-Nuwoklo, Komlan; Vagenas, Christos

    2017-04-01

    Hydrological modellers and water managers require high-resolution climate data to model regional hydrologies and how these may respond to future changes in the large-scale climate. The ability to successfully model such changes and, by extension, critical infrastructure planning is often impeded by a lack of suitable climate data. This typically takes the form of too-coarse data from climate models, which are not sufficiently detailed in either space or time to be able to support water management decisions and hydrological research. BINGO (Bringing INnovation in onGOing water management) aims to bridge the gap between the needs of hydrological modellers and planners, and the currently available range of climate data, with the overarching aim of providing adaptation strategies for climate change-related challenges. Producing the kilometre- and sub-daily-scale climate data needed by hydrologists through continuous simulations is generally computationally infeasible. To circumvent this hurdle, we adopt a two-pronged approach involving (1) selective dynamical downscaling and (2) conditional stochastic weather generators, with the former presented here. We take an event-based approach to downscaling in order to achieve the kilometre-scale input needed by hydrological modellers. Computational expenses are minimized by identifying extremal weather patterns for each BINGO research site in lower-resolution simulations and then only downscaling to the kilometre-scale (convection permitting) those events during which such patterns occur. Here we (1) outline the methodology behind the selection of the events, and (2) compare the modelled precipitation distribution and variability (preconditioned on the extremal weather patterns) with that found in observations.

  1. Soil moisture downscaling using a simple thermal based proxy

    NASA Astrophysics Data System (ADS)

    Peng, Jian; Loew, Alexander; Niesel, Jonathan

    2016-04-01

    Microwave remote sensing has been widely applied to retrieve soil moisture (SM) from active and passive sensors. The obvious advantage of microwave sensors is that SM can be obtained regardless of atmospheric conditions. However, existing global SM products only provide observations at coarse spatial resolutions, which often hampers their application in regional hydrological studies. Therefore, various downscaling methods have been proposed to enhance the spatial resolution of satellite soil moisture products. The aim of this study is to investigate the validity and robustness of a simple Vegetation Temperature Condition Index (VTCI) downscaling scheme over different climates and regions. Both polar-orbiting (MODIS) and geostationary (MSG SEVIRI) satellite data are used to improve the spatial resolution of the European Space Agency's Water Cycle Multi-mission Observation Strategy and Climate Change Initiative (ESA CCI) soil moisture, a merged product based on both active and passive microwave observations. The results from direct validation against in-situ soil moisture measurements, spatial pattern comparison, and seasonal and land use analyses show that the downscaling method can significantly improve the spatial detail of CCI soil moisture while maintaining its accuracy. The application of the scheme with different satellite platforms and over different regions further demonstrates the robustness and effectiveness of the proposed method. Therefore, the VTCI downscaling method has the potential to facilitate hydrological applications that require soil moisture at high spatial and temporal resolution.

  2. Regional reanalysis without local data: Exploiting the downscaling paradigm

    NASA Astrophysics Data System (ADS)

    von Storch, Hans; Feser, Frauke; Geyer, Beate; Klehmet, Katharina; Li, Delei; Rockel, Burkhardt; Schubert-Frisius, Martina; Tim, Nele; Zorita, Eduardo

    2017-08-01

    This paper demonstrates two important aspects of regional dynamical downscaling of multidecadal atmospheric reanalyses. First, that skillful regional descriptions of multidecadal climate variability may be constructed in this way for regions with little or no local data. Second, that the concept of large-scale constraining allows global downscaling, so that global reanalyses may be complemented by consistent detail in all regions of the world. Global reanalyses suffer from inhomogeneities. However, their large-scale components are mostly homogeneous; therefore, the concept of downscaling may be applied to homogeneously complement the large-scale state of the reanalyses with regional detail, wherever the condition of homogeneity of the large-scale description is fulfilled. Technically, this is done by dynamical downscaling with a regional or global climate model whose large scales are constrained by spectral nudging. This approach has been developed and tested for the region of Europe, and a skillful representation of regional weather risks, in particular marine risks, was identified. We have run this system in regions with reduced or absent local data coverage, such as Central Siberia, the Bohai and Yellow Seas, Southwestern Africa, and the South Atlantic. Also, a global simulation was computed, which adds regional features to prescribed global dynamics. Our cases demonstrate that spatially detailed reconstructions of the climate state and its change over the recent three to six decades add useful supplementary information to existing observational data for midlatitude and subtropical regions of the world.

  3. Ideas for a pattern-oriented approach towards a VERA analysis ensemble

    NASA Astrophysics Data System (ADS)

    Gorgas, T.; Dorninger, M.

    2010-09-01

    For many applications in meteorology, and especially for verification purposes, it is important to have information about the uncertainties of observation and analysis data. A high quality of these "reference data" is an absolute necessity, as the uncertainties are reflected in verification measures. The VERA (Vienna Enhanced Resolution Analysis) scheme includes a sophisticated quality control tool which corrects observational data and provides an estimate of the observation uncertainty; it is crucial for meteorologically and physically reliable analysis fields. VERA is based on a variational principle and does not need any first-guess fields. It is therefore independent of NWP models and can also be used as an unbiased reference for real-time model verification. For downscaling purposes VERA uses a priori knowledge of small-scale physical processes over complex terrain, the so-called "fingerprint technique", which transfers information from data-rich to data-sparse regions. The enhanced joint D-PHASE and COPS data set forms the data base for the analysis ensemble study. For the WWRP projects D-PHASE and COPS, a joint activity was started to collect GTS and non-GTS data from the national and regional meteorological services in Central Europe for 2007. Data from more than 11,000 stations are available for high-resolution analyses. The usage of random numbers as perturbations for ensemble experiments is a common approach in meteorology. In most implementations, as for NWP-model ensemble systems, the focus lies on error growth and propagation on the spatial and temporal scale. When defining errors in analysis fields we have to consider that analyses are not time dependent and that no perturbation method aimed at temporal evolution is possible. Further, the method applied should respect two major sources of analysis errors: observation errors AND analysis or

  4. Ensemble Methods

    NASA Astrophysics Data System (ADS)

    Re, Matteo; Valentini, Giorgio

    2012-03-01

    Ensemble methods are statistical and computational learning procedures reminiscent of the human social behavior of seeking several opinions before making any crucial decision. The idea of combining the opinions of different "experts" to obtain an overall "ensemble" decision has been rooted in our culture at least since the classical age of ancient Greece, and it was formalized during the Enlightenment with the Condorcet Jury Theorem [45], which proved that the judgment of a committee is superior to those of individuals, provided the individuals have reasonable competence. Ensembles are sets of learning machines that combine in some way their decisions, their learning algorithms, different views of the data, or other specific characteristics to obtain more reliable and more accurate predictions in supervised and unsupervised learning problems [48,116]. A simple example is the majority-vote ensemble, in which the decisions of different learning machines are combined and the class that receives the majority of "votes" (i.e., the class predicted by the majority of the learning machines) is the class predicted by the overall ensemble [158]. In the literature, a plethora of terms other than ensembles has been used, such as fusion, combination, aggregation, and committee, to indicate sets of learning machines that work together to solve a machine learning problem [19,40,56,66,99,108,123], but in this chapter we maintain the term ensemble in its widest meaning, in order to include the whole range of combination methods. Nowadays, ensemble methods represent one of the main research lines in machine learning [48,116], and the interest of the research community in ensemble methods is witnessed by conferences and workshops specifically devoted to ensembles, first of all the multiple classifier systems (MCS) conference organized by Roli, Kittler, Windeatt, and other researchers of this area [14,62,85,149,173]. Several theories have been
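    The majority-vote ensemble described above is simple enough to sketch directly; the classifiers and labels below are invented for illustration:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine class labels from several learners; ties resolve to the
    label seen first among the tied counts (Counter preserves order)."""
    return Counter(predictions).most_common(1)[0][0]

# three hypothetical classifiers voting on four samples
votes = [["cat", "dog", "dog", "cat"],
         ["cat", "cat", "dog", "dog"],
         ["dog", "cat", "dog", "cat"]]
ensemble = [majority_vote(col) for col in zip(*votes)]
assert ensemble == ["cat", "cat", "dog", "cat"]
```

    Under the Condorcet Jury Theorem's assumptions (independent voters, each better than chance), such a vote is more accurate than any single voter.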

  5. Using Four Downscaling Techniques to Characterize Uncertainty in Updating Intensity-Duration-Frequency Curves Under Climate Change

    NASA Astrophysics Data System (ADS)

    Cook, L. M.; Samaras, C.; McGinnis, S. A.

    2017-12-01

    Intensity-duration-frequency (IDF) curves are a common input to urban drainage design, and are used to represent extreme rainfall in a region. As rainfall patterns shift into a non-stationary regime as a result of climate change, these curves will need to be updated with future projections of extreme precipitation. Many regions have begun to update these curves to reflect trends from downscaled climate models; however, few studies have compared the methods for doing so, or the uncertainty that results from the selection of the native grid scale and temporal resolution of the climate model. This study examines the variability in updated IDF curves for Pittsburgh using four different methods for adjusting gridded regional climate model (RCM) outputs into station-scale precipitation extremes: (1) a simple change factor applied to observed return levels, (2) a naïve adjustment of stationary and non-stationary Generalized Extreme Value (GEV) distribution parameters, (3) a transfer function of the GEV parameters from the annual maximum series, and (4) kernel density distribution mapping bias correction of the RCM time series. Return level estimates (rainfall intensities) and confidence intervals from these methods for the 1-hour to 48-hour durations are tested for sensitivity to the underlying spatial and temporal resolution of the climate ensemble from the NA-CORDEX project, as well as to the future time period used for updating. The first goal is to determine whether uncertainty is highest for: (i) the downscaling method, (ii) the climate model resolution, (iii) the climate model simulation, (iv) the GEV parameters, or (v) the future time period examined. Initial results for the 6-hour, 10-year return level adjusted with the simple change factor method, using four climate model simulations at two different spatial resolutions, show that uncertainty is highest in the estimation of the GEV parameters. The second goal is to determine if complex downscaling methods and high
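    Method (1), the simple change factor, can be sketched with synthetic annual-maximum series. The empirical quantile below stands in for a fitted GEV distribution, and all numbers are illustrative rather than from the study:

```python
import numpy as np

def return_level(annual_max, T):
    """Empirical T-year return level: the 1 - 1/T quantile of the
    annual-maximum series (a crude stand-in for a fitted GEV)."""
    return np.quantile(annual_max, 1.0 - 1.0 / T)

rng = np.random.default_rng(1)
obs_ams = rng.gumbel(30.0, 8.0, size=50)    # observed annual maxima (mm/h)
rcm_hist = rng.gumbel(25.0, 7.0, size=50)   # RCM annual maxima, historical period
rcm_fut = rng.gumbel(29.0, 8.5, size=50)    # RCM annual maxima, future period

T = 10
cf = return_level(rcm_fut, T) / return_level(rcm_hist, T)  # change factor
updated = cf * return_level(obs_ams, T)     # projected IDF intensity
assert updated > 0
```

    The change factor scales the observed return level by the ratio of the model's future to historical return levels, so RCM biases largely cancel; the other three methods adjust the distribution itself rather than a single quantile.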

  6. Residue-level global and local ensemble-ensemble comparisons of protein domains.

    PubMed

    Clark, Sarah A; Tronrud, Dale E; Karplus, P Andrew

    2015-09-01

    Many methods of protein structure generation such as NMR-based solution structure determination and template-based modeling do not produce a single model, but an ensemble of models consistent with the available information. Current strategies for comparing ensembles lose information because they use only a single representative structure. Here, we describe the ENSEMBLATOR and its novel strategy to directly compare two ensembles containing the same atoms to identify significant global and local backbone differences between them on per-atom and per-residue levels, respectively. The ENSEMBLATOR has four components: eePREP (ee for ensemble-ensemble), which selects atoms common to all models; eeCORE, which identifies atoms belonging to a cutoff-distance dependent common core; eeGLOBAL, which globally superimposes all models using the defined core atoms and calculates for each atom the two intraensemble variations, the interensemble variation, and the closest approach of members of the two ensembles; and eeLOCAL, which performs a local overlay of each dipeptide and, using a novel measure of local backbone similarity, reports the same four variations as eeGLOBAL. The combination of eeGLOBAL and eeLOCAL analyses identifies the most significant differences between ensembles. We illustrate the ENSEMBLATOR's capabilities by showing how using it to analyze NMR ensembles and to compare NMR ensembles with crystal structures provides novel insights compared to published studies. One of these studies leads us to suggest that a "consistency check" of NMR-derived ensembles may be a useful analysis step for NMR-based structure determinations in general. The ENSEMBLATOR 1.0 is available as a first generation tool to carry out ensemble-ensemble comparisons. © 2015 The Protein Society.

  7. Residue-level global and local ensemble-ensemble comparisons of protein domains

    PubMed Central

    Clark, Sarah A; Tronrud, Dale E; Andrew Karplus, P

    2015-01-01

    Many methods of protein structure generation such as NMR-based solution structure determination and template-based modeling do not produce a single model, but an ensemble of models consistent with the available information. Current strategies for comparing ensembles lose information because they use only a single representative structure. Here, we describe the ENSEMBLATOR and its novel strategy to directly compare two ensembles containing the same atoms to identify significant global and local backbone differences between them on per-atom and per-residue levels, respectively. The ENSEMBLATOR has four components: eePREP (ee for ensemble-ensemble), which selects atoms common to all models; eeCORE, which identifies atoms belonging to a cutoff-distance dependent common core; eeGLOBAL, which globally superimposes all models using the defined core atoms and calculates for each atom the two intraensemble variations, the interensemble variation, and the closest approach of members of the two ensembles; and eeLOCAL, which performs a local overlay of each dipeptide and, using a novel measure of local backbone similarity, reports the same four variations as eeGLOBAL. The combination of eeGLOBAL and eeLOCAL analyses identifies the most significant differences between ensembles. We illustrate the ENSEMBLATOR's capabilities by showing how using it to analyze NMR ensembles and to compare NMR ensembles with crystal structures provides novel insights compared to published studies. One of these studies leads us to suggest that a “consistency check” of NMR-derived ensembles may be a useful analysis step for NMR-based structure determinations in general. The ENSEMBLATOR 1.0 is available as a first generation tool to carry out ensemble-ensemble comparisons. PMID:26032515

  8. Downscaling scheme to drive soil-vegetation-atmosphere transfer models

    NASA Astrophysics Data System (ADS)

    Schomburg, Annika; Venema, Victor; Lindau, Ralf; Ament, Felix; Simmer, Clemens

    2010-05-01

    The earth's surface is characterized by heterogeneity at a broad range of scales. Weather forecast models and climate models are not able to resolve this heterogeneity at the smaller scales. Many processes in the soil or at the surface, however, are highly nonlinear. This holds, for example, for evaporation processes, where stomatal or aerodynamic resistances are nonlinear functions of the local micro-climate. Other examples are threshold-dependent processes, e.g., the generation of runoff or the melting of snow. It has been shown that using averaged parameters in the computation of these processes leads to errors and especially biases, due to the involved nonlinearities. Thus it is necessary to account for the sub-grid scale surface heterogeneities in atmospheric modeling. One approach to take the variability of the earth's surface into account is the mosaic approach. Here the soil-vegetation-atmosphere transfer (SVAT) model is run at an explicitly higher resolution than the atmospheric part of a coupled model, which is feasible because a SVAT model is generally computationally cheaper than the atmospheric part. The question arises of how to deal with the scale differences at the interface between the two resolutions. Usually the assumption of a homogeneous forcing for all sub-pixels is made. However, over a heterogeneous surface the boundary layer is usually also heterogeneous; assuming a constant atmospheric forcing may therefore again introduce biases in the turbulent heat fluxes, due to the neglected forcing variability. Therefore we have developed and tested a downscaling scheme to disaggregate the atmospheric variables of the lower atmosphere that are used as input to force a SVAT model.
Our downscaling scheme consists of three steps: 1) a bi-quadratic spline interpolation of the coarse-resolution field; 2) a "deterministic" part, where relationships between surface and near-surface variables are exploited; and 3) a noise-generation step, in which the
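    The three-step scheme can be sketched on a toy temperature field. Bilinear interpolation stands in here for the bi-quadratic spline, and the lapse-rate rule and noise amplitude are illustrative assumptions, not the authors' calibrated surface-atmosphere relationships:

```python
import numpy as np

def bilinear_upsample(coarse, f):
    """Step 1: smooth interpolation of a coarse field to an f-times
    finer grid (bilinear, standing in for the bi-quadratic spline)."""
    ny, nx = coarse.shape
    yc, xc = np.arange(ny), np.arange(nx)
    yf = np.linspace(0, ny - 1, ny * f)
    xf = np.linspace(0, nx - 1, nx * f)
    tmp = np.array([np.interp(xf, xc, row) for row in coarse])
    return np.array([np.interp(yf, yc, col) for col in tmp.T]).T

rng = np.random.default_rng(2)
t_coarse = 288.0 + rng.normal(0, 1.5, size=(4, 4))  # coarse 2-m temperature (K)
elev_anom = rng.normal(0, 120.0, size=(16, 16))     # sub-grid elevation anomaly (m)

t_fine = bilinear_upsample(t_coarse, 4)             # step 1: interpolation
t_fine += -0.0065 * elev_anom                       # step 2: deterministic rule (lapse rate)
t_fine += rng.normal(0, 0.3, size=t_fine.shape)     # step 3: unresolved-scale noise
assert t_fine.shape == (16, 16)
```

    Step 2 exploits a known surface/near-surface relationship (here a 6.5 K/km lapse rate against elevation anomalies), and step 3 restores small-scale variance that the deterministic part cannot explain.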

  9. Downscaling global land-use/land-cover projections for use in region-level state-and-transition simulation modeling

    USGS Publications Warehouse

    Sherba, Jason T.; Sleeter, Benjamin M.; Davis, Adam W.; Parker, Owen P.

    2015-01-01

    Global land-use/land-cover (LULC) change projections and historical datasets are typically available at coarse grid resolutions and are often incompatible with modeling applications at local to regional scales. The difficulty of downscaling and reapportioning global gridded LULC change projections to regional boundaries is a barrier to the use of these datasets in a state-and-transition simulation model (STSM) framework. Here we compare three downscaling techniques to transform gridded LULC transitions into spatial scales and thematic LULC classes appropriate for use in a regional STSM. For each downscaling approach, Intergovernmental Panel on Climate Change (IPCC) Representative Concentration Pathway (RCP) LULC projections, at 0.5° × 0.5° cell resolution, were downscaled to seven Level III ecoregions in the Pacific Northwest, United States. RCP transition values at each cell were downscaled based on the proportional distribution between ecoregions of (1) cell area, (2) land-cover composition derived from remotely-sensed imagery, and (3) historical LULC transition values from a LULC history database. The resulting downscaled LULC transition values were aggregated according to their bounding ecoregion and "cross-walked" to relevant LULC classes. Ecoregion-level LULC transition values were applied in a STSM projecting LULC change between 2005 and 2100. While each downscaling method had advantages and disadvantages, downscaling using the LULC history database consistently apportioned RCP LULC transitions in agreement with historical observations. Regardless of the downscaling method, some LULC projections remain improbable and require further investigation.
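    The area-based apportionment (method 1) reduces to a weighted split of each coarse cell's transition value among the ecoregions it overlaps. The cells, fractions, and transition class below are hypothetical:

```python
import numpy as np

# Hypothetical forest-to-cropland transition area (km2) in three coarse cells
transitions = np.array([120.0, 80.0, 45.0])

# Hypothetical fraction of each coarse cell falling inside ecoregions A and B;
# rows are cells, columns are ecoregions, and each row sums to 1.
frac = np.array([[0.7, 0.3],
                 [0.2, 0.8],
                 [1.0, 0.0]])

per_ecoregion = transitions @ frac   # area-weighted reapportionment
assert np.isclose(per_ecoregion.sum(), transitions.sum())  # mass is conserved
```

    Methods (2) and (3) follow the same matrix pattern but replace the area fractions with weights derived from remotely sensed land-cover composition or from historical transition values.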

  10. A comparison of breeding and ensemble transform vectors for global ensemble generation

    NASA Astrophysics Data System (ADS)

    Deng, Guo; Tian, Hua; Li, Xiaoli; Chen, Jing; Gong, Jiandong; Jiao, Meiyan

    2012-02-01

    To compare the initial perturbation techniques using breeding vectors and ensemble transform vectors, three ensemble prediction systems using both initial perturbation methods but with different ensemble member sizes, based on the spectral model T213/L31, were constructed at the National Meteorological Center, China Meteorological Administration (NMC/CMA). A series of ensemble verification scores, such as forecast skill of the ensemble mean, ensemble resolution, and ensemble reliability, are introduced to identify the most important attributes of ensemble forecast systems. The results indicate that the ensemble transform technique is superior to the breeding vector method in terms of the anomaly correlation coefficient (ACC) of the ensemble mean (a deterministic attribute), the root-mean-square error (RMSE) and spread (probabilistic attributes), and the continuous ranked probability score (CRPS) and its decomposition. The advantage of the ensemble transform approach is attributed to the orthogonality among its ensemble perturbations as well as its consistency with the data assimilation system. Therefore, this study may serve as a reference for configuring the best ensemble prediction system for operational use.
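    The CRPS used in such verification can be estimated directly from ensemble members via the standard sample formula CRPS = E|X - y| - 0.5 E|X - X'|. A minimal sketch (not the NMC/CMA implementation) with invented numbers:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS: E|X - y| - 0.5 E|X - X'| over ensemble members.
    Lower is better; it rewards both accuracy and sharpness."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

sharp = crps_ensemble([1.0, 1.1, 0.9, 1.05], obs=1.0)   # tight, well centred
broad = crps_ensemble([1.0, 3.0, -1.0, 2.0], obs=1.0)   # wide, poorly centred
assert sharp < broad
```

    A perfect deterministic forecast (one member equal to the observation) scores exactly zero, which is one reason CRPS generalizes the mean absolute error to ensembles.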

  11. Spatial Downscaling of Alien Species Presences using Machine Learning

    NASA Astrophysics Data System (ADS)

    Daliakopoulos, Ioannis N.; Katsanevakis, Stelios; Moustakas, Aristides

    2017-07-01

    Large-scale, high-resolution data on alien species distributions are essential for spatially explicit assessments of their environmental and socio-economic impacts, and for management interventions aimed at mitigation. However, these data are often unavailable. This paper presents a method that relies on Random Forest (RF) models to distribute alien species presence counts on a finer-resolution grid, thus achieving spatial downscaling. A sufficiently large number of RF models are trained using random subsets of the dataset as predictors, in a bootstrapping approach that accounts for the uncertainty introduced by the subset selection. The method is tested on an approximately 8×8 km2 grid containing floral alien species presences and several climatic, habitat, and land-use covariates for the Mediterranean island of Crete, Greece. Alien species presence is aggregated at 16×16 km2 and used as a predictor of presence at the original resolution, thus simulating spatial downscaling. Potential explanatory variables included habitat types, land cover richness, endemic species richness, soil type, temperature, precipitation, and freshwater availability. Uncertainty assessment of the spatially downscaled alien species occurrences was also performed, and true/false presences and absences were quantified. The approach is promising for downscaling alien species datasets of larger spatial scale but coarse resolution, where the underlying environmental information is available at a finer resolution than the alien species data. Furthermore, the RF architecture allows tuning towards operationally optimal sensitivity and specificity, thus providing a decision support tool for designing a resource-efficient alien species census.
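    The bootstrapped-RF idea can be sketched with scikit-learn on synthetic covariates. The variable names and the data-generating rule are invented for illustration, and this is far smaller than the Crete dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
n_fine = 400
elev = rng.uniform(0, 2000, n_fine)       # hypothetical fine-scale covariate (m)
precip = rng.uniform(300, 1200, n_fine)   # hypothetical fine-scale covariate (mm)
coarse_count = rng.poisson(5, n_fine)     # presence count of the parent coarse cell
# synthetic "true" fine-cell counts driven by the covariates
true_fine = np.clip(coarse_count * precip / 1200 - elev / 1000, 0, None)

X = np.column_stack([coarse_count, elev, precip])
models = []
for _ in range(10):                       # bootstrap over random data subsets
    idx = rng.choice(n_fine, n_fine // 2, replace=False)
    m = RandomForestRegressor(n_estimators=50, random_state=0)
    m.fit(X[idx], true_fine[idx])
    models.append(m)

# averaging over the subset-trained models; their spread would give the
# uncertainty estimate described in the abstract
preds = np.mean([m.predict(X) for m in models], axis=0)
assert preds.shape == (n_fine,)
```

    Thresholding `preds` at different cut-offs is what allows the sensitivity/specificity tuning mentioned at the end of the abstract.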

  12. Statistical downscaling of mean temperature, maximum temperature, and minimum temperature on the Loess Plateau, China

    NASA Astrophysics Data System (ADS)

    Jiang, L.

    2017-12-01

    Climate change is considered to be one of the greatest environmental threats. Global climate models (GCMs) are the primary tool used for studying climate change. However, GCMs are limited by their coarse spatial resolution and inability to resolve important sub-grid scale features such as terrain and clouds. Statistical downscaling methods can be used to downscale large-scale variables to the local scale. In this study, we assess the applicability of the Statistical Downscaling Model (SDSM) in downscaling the outputs of the Beijing Normal University Earth System Model (BNU-ESM). The study focuses on the Loess Plateau, China, and the variables for downscaling include daily mean temperature (TMEAN), maximum temperature (TMAX) and minimum temperature (TMIN). The results show that SDSM performs well for these three climatic variables on the Loess Plateau. After downscaling, the root mean square errors of TMEAN, TMAX, and TMIN for BNU-ESM were reduced by 70.9%, 75.1%, and 67.2%, respectively. All rates of change in TMEAN, TMAX and TMIN during the 21st century decreased after SDSM downscaling. We also show that SDSM can effectively reduce uncertainty compared with the raw model outputs. TMEAN uncertainty was reduced by 27.1%, 26.8%, and 16.3% for the future scenarios RCP 2.6, RCP 4.5 and RCP 8.5, respectively. The corresponding reductions in uncertainty were 23.6%, 30.7%, and 18.7% for TMAX, and 37.6%, 31.8%, and 23.2% for TMIN.

  13. Downscaling Land Surface Temperature in Complex Regions by Using Multiple Scale Factors with Adaptive Thresholds

    PubMed Central

    Yang, Yingbao; Li, Xiaolong; Pan, Xin; Zhang, Yong; Cao, Chen

    2017-01-01

    Many downscaling algorithms have been proposed to address the issue of coarse-resolution land surface temperature (LST) derived from available satellite-borne sensors. However, few studies have focused on improving LST downscaling in urban areas with several mixed surface types. In this study, LST was downscaled by a multiple linear regression model between LST and multiple scale factors in mixed areas with three or four surface types. The correlation coefficients (CCs) between LST and the scale factors were used to assess the importance of the scale factors within a moving window, and CC thresholds determined which factors participated in the fitting of the regression equation. The proposed downscaling approach, which involves an adaptive selection of the scale factors, was evaluated using LST derived from four Landsat 8 thermal images of Nanjing City in different seasons. Results of the visual and quantitative analyses show that the proposed approach achieves relatively satisfactory downscaling results on 11 August, with a coefficient of determination and root-mean-square error of 0.87 and 1.13 °C, respectively. Relative to other approaches, our approach shows similar accuracy and is applicable in all seasons, performing best over vegetation and worst over water. Thus, the approach is an efficient and reliable LST downscaling method. Future tasks include reliable LST downscaling in challenging regions and the application of our model at middle and low spatial resolutions. PMID:28368301
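    The adaptive factor selection can be sketched for a single moving window: factors failing the correlation-coefficient threshold are dropped before the multiple linear regression is fitted. The factors, threshold, and synthetic data below are illustrative, not the paper's configuration:

```python
import numpy as np

def fit_with_cc_threshold(lst, factors, cc_min=0.3):
    """Within one window: keep only scale factors whose |CC| with LST
    exceeds cc_min, then fit an OLS regression on the survivors."""
    keep = [j for j in range(factors.shape[1])
            if abs(np.corrcoef(factors[:, j], lst)[0, 1]) >= cc_min]
    A = np.column_stack([np.ones(len(lst)), factors[:, keep]])
    beta, *_ = np.linalg.lstsq(A, lst, rcond=None)
    return keep, beta

rng = np.random.default_rng(4)
ndvi = rng.uniform(0.1, 0.8, 200)          # vegetation index
ndbi = rng.uniform(-0.3, 0.3, 200)         # built-up index
noise_factor = rng.normal(size=200)        # factor unrelated to LST
lst = 310.0 - 12.0 * ndvi + 6.0 * ndbi + rng.normal(0, 0.3, 200)

keep, beta = fit_with_cc_threshold(
    lst, np.column_stack([ndvi, ndbi, noise_factor]))
assert 2 not in keep                       # the uncorrelated factor is screened out
```

    Applying such a fit window by window, then evaluating the regression with fine-resolution factor values, yields the downscaled LST field.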

  14. Which complexity of regional climate system models is essential for downscaling anthropogenic climate change in the Northwest European Shelf?

    NASA Astrophysics Data System (ADS)

    Mathis, Moritz; Elizalde, Alberto; Mikolajewicz, Uwe

    2018-04-01

    Climate change impact studies for the Northwest European Shelf (NWES) make use of various dynamical downscaling strategies in the experimental setup of regional ocean circulation models. Projected change signals from coupled and uncoupled downscalings with different domain sizes and different forcing global and regional models show substantial uncertainty. In this paper, we investigate the influence of the downscaling strategy on projected changes in the physical and biogeochemical conditions of the NWES. Our results indicate that uncertainties due to different downscaling strategies are similar to uncertainties due to the choice of the parent global model and the downscaling regional model. The downscaled change signals turn out to depend more strongly on the downscaling strategy than on the models' skill in simulating present-day conditions. Uncoupled downscalings of sea surface temperature (SST) changes are found to be tightly constrained by the atmospheric forcing. The incorporation of coupled air-sea interaction, by contrast, allows the regional model system to develop independently. Changes in salinity show a higher sensitivity to open lateral boundary conditions and river runoff than to coupled or uncoupled atmospheric forcings. Dependencies on the downscaling strategy for changes in SST, salinity, stratification and circulation collectively affect changes in nutrient import and biological primary production.

  15. Downscaling Thermal Infrared Radiance for Subpixel Land Surface Temperature Retrieval

    PubMed Central

    Liu, Desheng; Pu, Ruiliang

    2008-01-01

    Land surface temperature (LST) retrieved from satellite thermal sensors often consists of mixed temperature components. Retrieving subpixel LST is therefore needed in various environmental and ecological studies. In this paper, we developed two methods for downscaling coarse resolution thermal infrared (TIR) radiance for the purpose of subpixel temperature retrieval. The first method was developed on the basis of a scale-invariant physical model on TIR radiance. The second method was based on a statistical relationship between TIR radiance and land cover fraction at high spatial resolution. The two methods were applied to downscale simulated 990-m ASTER TIR data to 90-m resolution. When validated against the original 90-m ASTER TIR data, the results revealed that both downscaling methods were successful in capturing the general patterns of the original data and resolving considerable spatial details. Further quantitative assessments indicated a strong agreement between the true values and the estimated values by both methods. PMID:27879844

  16. Downscaling Thermal Infrared Radiance for Subpixel Land Surface Temperature Retrieval.

    PubMed

    Liu, Desheng; Pu, Ruiliang

    2008-04-06

    Land surface temperature (LST) retrieved from satellite thermal sensors often consists of mixed temperature components. Retrieving subpixel LST is therefore needed in various environmental and ecological studies. In this paper, we developed two methods for downscaling coarse resolution thermal infrared (TIR) radiance for the purpose of subpixel temperature retrieval. The first method was developed on the basis of a scale-invariant physical model on TIR radiance. The second method was based on a statistical relationship between TIR radiance and land cover fraction at high spatial resolution. The two methods were applied to downscale simulated 990-m ASTER TIR data to 90-m resolution. When validated against the original 90-m ASTER TIR data, the results revealed that both downscaling methods were successful in capturing the general patterns of the original data and resolving considerable spatial details. Further quantitative assessments indicated a strong agreement between the true values and the estimated values by both methods.

  17. Land surface temperature downscaling using random forest regression: primary result and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Pan, Xin; Cao, Chen; Yang, Yingbao; Li, Xiaolong; Shan, Liangliang; Zhu, Xi

    2018-04-01

    The land surface temperature (LST) derived from thermal infrared satellite images is a meaningful variable in many remote sensing applications. However, the spatial resolution of current satellite thermal infrared sensors is coarse and cannot meet many needs. In this study, an LST image was downscaled using a random forest model between LST and multiple predictors in an arid region with an oasis-desert ecotone. The proposed downscaling approach was evaluated using LST derived from the MODIS LST product for Zhangye City in the Heihe Basin. The primary result shows that the distribution of the downscaled LST matches that of the oasis and desert ecosystems. Sensitivity analysis showed that the factors to which downscaled LST is most sensitive are the modified normalized difference water index (MNDWI)/normalized multi-band drought index (NMDI), soil adjusted vegetation index (SAVI)/shortwave infrared reflectance (SWIR)/normalized difference vegetation index (NDVI), normalized difference building index (NDBI)/SAVI, and SWIR/NDBI/MNDWI/NDWI for the regions of water, vegetation, building and desert, with LST variation (at most) of 0.20/-0.22 K, 0.92/0.62/0.46 K, 0.28/-0.29 K and 3.87/-1.53/-0.64/-0.25 K under +/-0.02 predictor perturbations, respectively.

  18. Development and Application of a Soil Moisture Downscaling Method for Mobility Assessment

    DTIC Science & Technology

    2011-05-01

    Soil ... cells). Thus, a method is required to downscale intermediate-resolution patterns to finer resolutions. Fortunately, fine-resolution variations in

  19. Optimising predictor domains for spatially coherent precipitation downscaling

    NASA Astrophysics Data System (ADS)

    Radanovics, S.; Vidal, J.-P.; Sauquet, E.; Ben Daoud, A.; Bontron, G.

    2012-04-01

    Relationships between local precipitation (predictands) and large-scale circulation (predictors) are used for statistical downscaling purposes in various contexts, from medium-term forecasting to climate change impact studies. For hydrological purposes such as flood forecasting, the downscaled precipitation fields must furthermore be spatially coherent over possibly large basins. This first requires knowing which predictor domain can be associated with the precipitation over each part of the studied basin. This study addresses the issue by identifying the optimum predictor domains over the whole of France for a specific downscaling method based on an analogue approach and developed by Ben Daoud et al. (2011). The downscaling method used here is based on analogies in several variables: temperature, relative humidity, vertical velocity and geopotentials. The optimum predictor domain has been found to consist of the nearest grid cell for all variables except geopotentials (Ben Daoud et al., 2011). Moreover, geopotential domains have been found by Obled et al. (2002) to be sensitive to the target location, and the present study therefore focuses on optimising the domains of this specific predictor over France. The predictor domains for geopotential at 500 hPa and 1000 hPa are optimised for 608 climatologically homogeneous zones in France, using ERA-40 reanalysis data for the large-scale predictors and local precipitation from the Safran near-surface atmospheric reanalysis (Vidal et al., 2010). The similarity of geopotential fields is measured by the Teweles and Wobus shape criterion. The predictive skill of different predictor domains for the different regions is tested with the Continuous Ranked Probability Score (CRPS) for the 25 best analogue days found with the statistical downscaling method. Rectangular predictor domains of different sizes, shapes and locations are tested, and the one that leads to the smallest CRPS for the zone in question is retained. The
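
    The CRPS used to rank the candidate domains can be computed directly from its empirical form for an ensemble of analogue days: the mean absolute error of the members minus half the mean pairwise member spread. A minimal sketch with toy precipitation values:

```python
def crps_ensemble(members, obs):
    """Empirical CRPS for an ensemble forecast:
    mean |x_i - y| minus half the mean pairwise member difference."""
    n = len(members)
    term1 = sum(abs(x - obs) for x in members) / n
    term2 = sum(abs(a - b) for a in members for b in members) / (2 * n * n)
    return term1 - term2

# 25 analogue-day precipitation values (mm) vs. an observed 3.0 mm (toy data):
analogues = [i * 0.25 for i in range(25)]
print(crps_ensemble(analogues, 3.0))  # 0.52
```

    Summing this score over all days and zones and minimising it is what drives the choice of the rectangular geopotential domain.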

  20. Examining the Stationarity Assumption for Statistically Downscaled Climate Projections of Precipitation

    NASA Astrophysics Data System (ADS)

    Wootten, A.; Dixon, K. W.; Lanzante, J. R.; Mcpherson, R. A.

    2017-12-01

    Empirical statistical downscaling (ESD) approaches attempt to refine global climate model (GCM) information via statistical relationships between observations and GCM simulations. The aim of such downscaling efforts is to create added-value climate projections by adding finer spatial detail and reducing biases. The results of statistical downscaling exercises are often used in impact assessments under the assumption that past performance provides an indicator of future results. Given prior research describing the danger of this assumption with regard to temperature, this study expands the perfect model experimental design from previous case studies to test the stationarity assumption with respect to precipitation. Assuming stationarity implies that the performance of ESD methods is similar between future projections and historical training. Case study results from four quantile-mapping-based ESD methods demonstrate violations of the stationarity assumption for both the central tendency and the extremes of precipitation. These violations vary geographically and seasonally. For the four ESD methods tested, the greatest challenges for downscaling daily total precipitation projections occur in regions with limited precipitation and for extremes of precipitation along Southeast coastal regions. We conclude with a discussion of future expansion of the perfect model experimental design and the implications for improving ESD methods and providing guidance on the use of ESD techniques for impact assessments and decision support.
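
    The quantile-mapping family of methods tested here shares one core step: find a value's quantile in the model distribution and replace it with the observed value at the same quantile. A minimal empirical sketch (illustrative only, not any of the four specific methods evaluated in the study):

```python
import bisect

def quantile_map(value, model_sorted, obs_sorted):
    """Empirical quantile mapping: locate `value`'s quantile in the model
    distribution and return the observation at the same quantile."""
    n = len(model_sorted)
    rank = min(bisect.bisect_left(model_sorted, value), n - 1)
    q = rank / (n - 1) if n > 1 else 0.5
    idx = round(q * (len(obs_sorted) - 1))
    return obs_sorted[idx]

# Toy daily precipitation samples (mm): the model drizzles too much,
# so mapping corrects the wet bias of a given model value.
model = sorted([0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8])
obs   = sorted([0.0, 0.0, 0.1, 0.5, 1.0, 2.5, 7.0, 15.0])
print(quantile_map(0.8, model, obs))  # 0.5
```

    The stationarity assumption enters here: the mapping is fitted on the historical pair of distributions and then applied unchanged to future model output.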

  1. Intercomparison of Downscaling Methods on Hydrological Impact for Earth System Model of NE United States

    NASA Astrophysics Data System (ADS)

    Yang, P.; Fekete, B. M.; Rosenzweig, B.; Lengyel, F.; Vorosmarty, C. J.

    2012-12-01

    Atmospheric dynamics are essential inputs to Regional-scale Earth System Models (RESMs). Variables including surface air temperature, total precipitation, solar radiation, wind speed and humidity must be downscaled from coarse-resolution global General Circulation Models (GCMs) to the high temporal and spatial resolution required for regional modeling. However, this downscaling procedure can be challenging due to the need to correct for bias from the GCM and to capture the spatiotemporal heterogeneity of the regional dynamics. In this study, the results obtained using several downscaling techniques and observational datasets were compared for a RESM of the Northeast Corridor of the United States. Previous efforts have enhanced GCM outputs through bias correction using novel techniques. For example, the Potsdam Institute for Climate Impact Research developed a series of bias-corrected GCMs for the next generation of climate change scenarios (Schiermeier, 2012; Moss et al., 2010). Techniques to better represent the heterogeneity of climate variables have also been improved using statistical approaches (Maurer, 2008; Abatzoglou, 2011). For this study, four downscaling approaches to transform bias-corrected HADGEM2-ES model output (daily at 0.5 x 0.5 degree) to the 3' x 3' (longitude x latitude) daily and monthly resolution required for the Northeast RESM were compared: 1) bilinear interpolation, 2) daily bias-corrected spatial downscaling (D-BCSD) with gridded meteorological datasets (developed by Abatzoglou, 2011), 3) monthly bias-corrected spatial disaggregation (M-BCSD) with CRU (Climatic Research Unit) data and 4) dynamical downscaling based on the Weather Research and Forecasting (WRF) model. Spatio-temporal analysis of the variability in precipitation was conducted over the study domain. Validation of the variables from the different downscaling methods against observational datasets was carried out to assess the downscaled climate model outputs. The effects of using the
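
    The first of the four approaches, bilinear interpolation, weights the four surrounding coarse-grid values by their distance to the target point. A self-contained sketch on a toy grid:

```python
def bilinear(grid, x, y):
    """Bilinear interpolation inside a 2-D grid (row index = y, column = x)
    at fractional coordinates within the grid bounds."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(grid[0]) - 1)
    y1 = min(y0 + 1, len(grid) - 1)
    fx, fy = x - x0, y - y0
    top = grid[y0][x0] * (1 - fx) + grid[y0][x1] * fx
    bot = grid[y1][x0] * (1 - fx) + grid[y1][x1] * fx
    return top * (1 - fy) + bot * fy

# Coarse 2x2 temperature field (deg C), sampled at its centre:
coarse = [[10.0, 12.0],
          [14.0, 16.0]]
print(bilinear(coarse, 0.5, 0.5))  # 13.0
```

    Interpolation alone adds no information at the fine scale, which is why the other three approaches combine it with bias correction or a regional model.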

  2. Utilizing Machine Learning to Downscale SMAP L3_SM_P Brightness Temperatures in Iowa for Agricultural Applications

    NASA Astrophysics Data System (ADS)

    Chakrabarti, S.; Judge, J.; Bindlish, R.; Bongiovanni, T.; Jackson, T. J.

    2016-12-01

    The NASA Soil Moisture Active Passive (SMAP) mission provides global observations of brightness temperatures (TB) at 36 km. For these observations to be relevant to studies in agricultural regions, the TB values need to be downscaled to finer resolutions. In this study, a machine learning algorithm is introduced for downscaling TB from 36 km to 9 km. The algorithm uses image segmentation to cluster the study region based on meteorological and land cover similarity, followed by a support vector machine based regression that computes the value of the disaggregated TB at all pixels. High resolution remote sensing products such as land surface temperature, normalized difference vegetation index, enhanced vegetation index, precipitation, soil texture, and land cover were used for downscaling. The algorithm was implemented in Iowa, United States, during the growing season from April to July 2015, when the SMAP L3_SM_AP TB product at 9 km was available for comparison. In addition, the downscaled estimates from the algorithm were compared with 9 km TB obtained by resampling the 36 km SMAP L1B_TB product. It was found that the downscaled TB were very similar to the SMAP L3_SM_AP TB product, even for vegetated areas, with a mean difference ≤ 5 K. However, the standard deviation of the downscaled TB was lower by 7 K than that of the AP product. The probability density functions of the downscaled TB were similar to those of the SMAP TB. The results indicate that these downscaling algorithms may be used for downscaling TB using complex non-linear correlations on a grid without using active microwave observations.

  3. From ENSEMBLES to CORDEX: exploring the progress for hydrological impact research for the upper Danube basin

    NASA Astrophysics Data System (ADS)

    Stanzel, Philipp; Kling, Harald

    2017-04-01

    EURO-CORDEX Regional Climate Model (RCM) data are available as a result of the latest initiative of the climate modelling community to provide ever-improved simulations of past and future climate in Europe. The spatial resolution of the climate models increased from 25 x 25 km in the previous coordinated initiative, ENSEMBLES, to 12 x 12 km in the CORDEX EUR-11 simulations. This higher spatial resolution might yield improved representation of the historic climate, especially in complex mountainous terrain, improving applicability in impact studies. CORDEX scenario simulations are based on Representative Concentration Pathways, while ENSEMBLES applied the SRES greenhouse gas emission scenarios. The new emission scenarios might lead to different projections of future climate. In this contribution we explore these two dimensions of development from ENSEMBLES to CORDEX - representation of the past and projections for the future - in the context of a hydrological climate change impact study for the Danube River. We replicated previous hydrological simulations that used ENSEMBLES data from 21 RCM simulations under the SRES A1B emission scenario as meteorological input (Kling et al. 2012), and now applied CORDEX EUR-11 data from 16 RCM simulations under the RCP4.5 and RCP8.5 emission scenarios. The climate variables precipitation and temperature were used to drive a monthly hydrological model of the upper Danube basin upstream of Vienna (100,000 km2). RCM data were bias corrected and downscaled to the scale of the hydrological model units. Results with CORDEX data were compared with results with ENSEMBLES data, analysing both the driving meteorological input and the resulting discharge projections. Results with CORDEX data show no general improvement in the accuracy of representing historic climatic features, despite the increase in spatial model resolution. The tendency of ENSEMBLES scenario projections of increasing precipitation in winter and decreasing precipitation in summer is

  4. Nine Hundred Years of Weekly Streamflows: Stochastic Downscaling of Ensemble Tree-Ring Reconstructions

    NASA Astrophysics Data System (ADS)

    Sauchyn, David; Ilich, Nesa

    2017-11-01

    We combined the methods and advantages of stochastic hydrology and paleohydrology to estimate 900 years of weekly flows for the North and South Saskatchewan Rivers at Edmonton and Medicine Hat, Alberta, respectively. Regression models of water-year streamflow were constructed using historical naturalized flow data and a pool of 196 tree-ring (earlywood, latewood, and annual) ring-width chronologies from 76 sites. The tree-ring models accounted for up to 80% of the interannual variability in historical naturalized flows. We developed a new algorithm for generating stochastic time series of weekly flows constrained by the statistical properties of both the historical record and proxy streamflow data, and by the necessary condition that weekly flows correlate between the end of a year and the start of the next. A second innovation, enabled by the density of our tree-ring network, is to derive the paleohydrology from an ensemble of 100 statistically significant reconstructions at each gauge. Using paleoclimatic data to generate long series of weekly flow estimates augments the short historical record with an expanded range of hydrologic variability, including sequences of wet and dry years of greater length and severity. This unique hydrometric time series will enable evaluation of the reliability of current water supply and management systems given the range of hydroclimatic variability and extremes contained in the stochastic paleohydrology. It also could inform evaluation of the uncertainty in climate model projections, given that internal hydroclimatic variability is the dominant source of uncertainty.
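
    The paper's constrained generator is beyond the scope of an abstract, but its core idea of producing stochastic weekly flows with week-to-week persistence can be sketched with a simple lag-1 autoregressive model. This is an illustrative simplification: the actual algorithm additionally constrains the series to the statistics of the historical and tree-ring-derived records and to correlation across year boundaries.

```python
import random

def ar1_flows(n, mean, sd, rho, seed=0):
    """Lag-1 autoregressive series: each weekly flow is correlated (rho)
    with the previous week's flow around a long-term mean."""
    rng = random.Random(seed)
    flows = [mean]
    for _ in range(n - 1):
        # Innovation variance chosen so the series keeps variance sd**2.
        shock = rng.gauss(0.0, sd * (1 - rho ** 2) ** 0.5)
        flows.append(mean + rho * (flows[-1] - mean) + shock)
    return flows

# One synthetic year of weekly flows (toy parameters, arbitrary units):
weekly = ar1_flows(52, mean=200.0, sd=60.0, rho=0.8)
print(len(weekly))  # 52
```

    Repeating such a generator over the 900-year reconstructed record, with the paper's additional constraints, yields the long stochastic paleohydrology described above.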

  5. Spatial Downscaling of TRMM Precipitation using MODIS product in the Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Cho, H.; Choi, M.

    2013-12-01

    Precipitation is a major driving force in the water cycle, but it is difficult to obtain spatially distributed precipitation data from isolated individual in situ measurements. The Tropical Rainfall Measuring Mission (TRMM) satellite can provide precipitation data at relatively coarse spatial resolution (0.25° scale) on a daily basis. In order to overcome the coarse spatial resolution of TRMM precipitation products, we applied a downscaling technique using a scaling parameter from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor. In this study, statistical relations between precipitation estimates derived from the TRMM satellite and the normalized difference vegetation index (NDVI) obtained from the MODIS sensor on the TERRA satellite are found for different spatial scales on the Korean peninsula in northeast Asia. We obtain the downscaled precipitation map via a regression equation between yearly TRMM precipitation values and annual average NDVI aggregated from 1 km to 0.25°. The downscaled precipitation is validated using time series of the ground-measured precipitation dataset provided by the Korea Meteorological Administration (KMA) from 2002 to 2005. To improve the spatial downscaling of precipitation, we will conduct a study of the correlation between precipitation and land surface temperature, precipitable water and other hydrological parameters.
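
    The regression step described above can be sketched in two moves: fit precipitation against aggregated NDVI at the coarse scale, then apply the fitted relation to fine-scale NDVI pixels. Toy values and plain ordinary least squares; the study's actual regression form may differ.

```python
def fit_line(x, y):
    """Ordinary least squares slope and intercept for paired samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Coarse-cell (0.25 degree) annual NDVI vs TRMM precipitation (mm, toy values):
ndvi_coarse = [0.2, 0.4, 0.6, 0.8]
precip_coarse = [400.0, 800.0, 1200.0, 1600.0]
slope, intercept = fit_line(ndvi_coarse, precip_coarse)

# Apply the coarse-scale relation at 1-km NDVI pixels:
ndvi_fine = [0.3, 0.5, 0.7]
precip_fine = [slope * v + intercept for v in ndvi_fine]
print(precip_fine)
```

    In practice a residual correction is often added so that the fine-scale field re-aggregates to the original TRMM values.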

  6. Forecasting European Droughts using the North American Multi-Model Ensemble (NMME)

    NASA Astrophysics Data System (ADS)

    Thober, Stephan; Kumar, Rohini; Samaniego, Luis; Sheffield, Justin; Schäfer, David; Mai, Juliane

    2015-04-01

    Soil moisture droughts have the potential to diminish crop yields causing economic damage or even threatening the livelihood of societies. State-of-the-art drought forecasting systems incorporate seasonal meteorological forecasts to estimate future drought conditions. Meteorological forecasting skill (in particular that of precipitation), however, is limited to a few weeks because of the chaotic behaviour of the atmosphere. One of the most important challenges in drought forecasting is to understand how the uncertainty in the atmospheric forcings (e.g., precipitation and temperature) is further propagated into hydrologic variables such as soil moisture. The North American Multi-Model Ensemble (NMME) provides the latest collection of a multi-institutional seasonal forecasting ensemble for precipitation and temperature. In this study, we analyse the skill of NMME forecasts for predicting European drought events. The monthly NMME forecasts are downscaled to daily values to force the mesoscale hydrological model (mHM). The mHM soil moisture forecasts obtained with the forcings of the dynamical models are then compared against those obtained with the Ensemble Streamflow Prediction (ESP) approach. ESP recombines historical meteorological forcings to create a new ensemble forecast. Both forecasts are compared against reference soil moisture conditions obtained using observation based meteorological forcings. The study is conducted for the period from 1982 to 2009 and covers a large part of the Pan-European domain (10°W to 40°E and 35°N to 55°N). Results indicate that NMME forecasts are better at predicting the reference soil moisture variability as compared to ESP. For example, NMME explains 50% of the variability in contrast to only 31% by ESP at a six-month lead time. The Equitable Threat Skill Score (ETS), which combines the hit and false alarm rates, is analysed for drought events using a 0.2 threshold of a soil moisture percentile index. On average, the NMME
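
    The Equitable Threat Score mentioned above combines hits and false alarms while discounting hits expected by chance. A minimal sketch from a toy drought-event contingency table:

```python
def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    """ETS: threat score with the random-hit expectation removed."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    return (hits - hits_random) / (hits + misses + false_alarms - hits_random)

# Toy counts for forecast/observed drought (soil moisture percentile < 0.2):
print(equitable_threat_score(30, 10, 20, 140))  # 0.4
```

    A score of 1 is perfect, 0 is no better than chance, and negative values are worse than chance, which makes ETS convenient for comparing NMME against ESP.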

  7. Application of Unmanned Aerial Systems in Spatial Downscaling of Landsat VIR imageries of Agricultural Fields

    NASA Astrophysics Data System (ADS)

    Torres, A.; Hassan Esfahani, L.; Ebtehaj, A.; McKee, M.

    2016-12-01

    While the coarse space-time resolution of satellite observations in the visible to near infrared (VIR) is a serious limiting factor for applications in precision agriculture, high resolution remote sensing observations by Unmanned Aerial Systems (UAS) are site-specific and still practically restrictive for widespread applications in precision agriculture. We present a modern spatial downscaling approach that relies on new sparse approximation techniques. The downscaling approach learns from a large set of coincident low- and high-resolution satellite and UAS observations to effectively downscale the satellite imagery in VIR bands. We focus on field experiments using the AggieAirTM platform and Landsat 7 ETM+ and Landsat 8 OLI observations obtained in an intensive field campaign in 2013 over an agricultural field in Scipio, Utah. The results show that the downscaling methods can effectively increase the resolution of Landsat VIR imagery by factors of 2 to 4, from 30 m to 15 and 7.5 m, respectively. Specifically, on average, the downscaling method reduces the root mean squared error by up to 26%, taking bias-corrected AggieAir imagery as the reference.

  8. Structural uncertainty of downscaled climate model output in a difficult-to-resolve environment: data sparseness and parameterization error contribution to statistical and dynamical downscaling output in the U.S. Caribbean region

    NASA Astrophysics Data System (ADS)

    Terando, A. J.; Grade, S.; Bowden, J.; Henareh Khalyani, A.; Wootten, A.; Misra, V.; Collazo, J.; Gould, W. A.; Boyles, R.

    2016-12-01

    Sub-tropical island nations may be particularly vulnerable to anthropogenic climate change because of predicted changes in the hydrologic cycle that would lead to significant drying in the future. However, decision makers in these regions have seen their adaptation planning efforts frustrated by the lack of island-resolving climate model information. Recently, two investigations have used statistical and dynamical downscaling techniques to develop climate change projections for the U.S. Caribbean region (Puerto Rico and U.S. Virgin Islands). We compare the results from these two studies with respect to three commonly downscaled CMIP5 global climate models (GCMs). The GCMs were dynamically downscaled at a convective-permitting scale using two different regional climate models. The statistical downscaling approach was conducted at locations with long-term climate observations and then further post-processed using climatologically aided interpolation (yielding two sets of projections). Overall, both approaches face unique challenges. The statistical approach suffers from a lack of observations necessary to constrain the model, particularly at the land-ocean boundary and in complex terrain. The dynamically downscaled model output has a systematic dry bias over the island despite ample availability of moisture in the atmospheric column. Notwithstanding these differences, both approaches are consistent in projecting a drier climate that is driven by the strong global-scale anthropogenic forcing.

  9. Validation of spatial variability in downscaling results from the VALUE perfect predictor experiment

    NASA Astrophysics Data System (ADS)

    Widmann, Martin; Bedia, Joaquin; Gutiérrez, Jose Manuel; Maraun, Douglas; Huth, Radan; Fischer, Andreas; Keller, Denise; Hertig, Elke; Vrac, Mathieu; Wibig, Joanna; Pagé, Christian; Cardoso, Rita M.; Soares, Pedro MM; Bosshard, Thomas; Casado, Maria Jesus; Ramos, Petra

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research. Within VALUE a systematic validation framework has been developed to enable the assessment and comparison of both dynamical and statistical downscaling methods. In the first validation experiment the downscaling methods are validated in a setup with perfect predictors taken from the ERA-Interim reanalysis for the period 1997 - 2008. This allows the isolated skill of downscaling methods to be investigated without further error contributions from the large-scale predictors. One aspect of the validation is the representation of spatial variability. As part of the VALUE validation we have compared various properties of the spatial variability of downscaled daily temperature and precipitation with the corresponding properties in observations. We have used two validation datasets: a European-wide set of 86 stations, and a higher-density network of 50 stations in Germany. Here we present results based on three approaches, namely the analysis of i.) correlation matrices, ii.) pairwise joint threshold exceedances, and iii.) regions of similar variability. We summarise the information contained in correlation matrices by calculating the dependence of the correlations on distance and deriving decorrelation lengths, as well as by determining the independent degrees of freedom. Probabilities for joint threshold exceedances and (where appropriate) non-exceedances are calculated for various user-relevant thresholds related for instance to extreme precipitation or frost and heat days. The dependence of these probabilities on distance is again characterised by calculating typical length scales that separate dependent from independent exceedances. Regionalisation is based on rotated Principal Component Analysis. The results indicate which downscaling methods are preferable if the dependency of variability at different locations is relevant for the user.

  10. Multifractal Downscaling of Rainfall Using Normalized Difference Vegetation Index (NDVI) in the Andes Plateau.

    PubMed

    Duffaut Espinosa, L A; Posadas, A N; Carbajal, M; Quiroz, R

    2017-01-01

    In this paper, a multifractal downscaling technique is applied to adequately transformed and lag corrected normalized difference vegetation index (NDVI) in order to obtain daily estimates of rainfall in an area of the Peruvian Andean high plateau. This downscaling procedure is temporal in nature since the original NDVI information is provided at an irregular temporal sampling period between 8 and 11 days, and the desired final scale is 1 day. The spatial resolution of approximately 1 km remains the same throughout the downscaling process. The results were validated against on-site measurements of meteorological stations distributed in the area under study.

  12. HLPI-Ensemble: Prediction of human lncRNA-protein interactions based on ensemble strategy.

    PubMed

    Hu, Huan; Zhang, Li; Ai, Haixin; Zhang, Hui; Fan, Yetian; Zhao, Qi; Liu, Hongsheng

    2018-03-27

    LncRNAs play an important role in many biological processes and in disease progression by binding to related proteins. However, experimental methods for studying lncRNA-protein interactions are time-consuming and expensive. Although a few models have been designed to predict ncRNA-protein interactions, they all have some common drawbacks that limit their predictive performance. In this study, we present a model called HLPI-Ensemble designed specifically for human lncRNA-protein interactions. HLPI-Ensemble adopts an ensemble strategy based on three mainstream machine learning algorithms, Support Vector Machines (SVM), Random Forests (RF) and Extreme Gradient Boosting (XGB), to generate HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble, respectively. The results of 10-fold cross-validation show that HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble achieved AUCs of 0.95, 0.96 and 0.96, respectively, on the test dataset. Furthermore, we compared the performance of the HLPI-Ensemble models with previous models on an external validation dataset. The results show that the false positives (FPs) of the HLPI-Ensemble models are much lower than those of the previous models, and the other evaluation indicators of the HLPI-Ensemble models are also higher than those of the previous models. This further shows that the HLPI-Ensemble models are superior in predicting human lncRNA-protein interactions. HLPI-Ensemble is publicly available at: http://ccsipb.lnu.edu.cn/hlpiensemble/

  13. On the downscaling of actual evapotranspiration maps based on combination of MODIS and landsat-based actual evapotranspiration estimates

    USGS Publications Warehouse

    Singh, Ramesh K.; Senay, Gabriel B.; Velpuri, Naga Manohar; Bohms, Stefanie; Verdin, James P.

    2014-01-01

    Downscaling is one of the important ways of utilizing the combined benefits of the high temporal resolution of Moderate Resolution Imaging Spectroradiometer (MODIS) images and the fine spatial resolution of Landsat images. We evaluated the regression with intercept method and developed the Linear with Zero Intercept (LinZI) method for downscaling MODIS-based monthly actual evapotranspiration (AET) maps to Landsat-scale monthly AET maps for the Colorado River Basin for 2010. We used the 8-day MODIS land surface temperature product (MOD11A2) and 328 cloud-free Landsat images for computing AET maps and downscaling. The regression with intercept method does have limitations in downscaling if the slope and intercept are computed over a large area. A good agreement was obtained between downscaled monthly AET using the LinZI method and the eddy covariance measurements from seven flux sites within the Colorado River Basin. The mean bias ranged from −16 mm (underestimation) to 22 mm (overestimation) per month, and the coefficient of determination varied from 0.52 to 0.88. Some discrepancies between measured and downscaled monthly AET at two flux sites were found to be due to the prevailing flux footprint. A reasonable comparison was also obtained between downscaled monthly AET using the LinZI method and the gridded FLUXNET dataset. The downscaled monthly AET nicely captured the temporal variation in the sampled land cover classes. The proposed LinZI method can be used at finer temporal resolution (such as 8 days) with further evaluation. The proposed downscaling method will be very useful in advancing the application of remotely sensed images in water resources planning and management.
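
    A zero-intercept linear relation of the kind underlying LinZI reduces to a one-parameter least-squares fit through the origin. A sketch with toy AET values (the paper's actual fitting windows and details may differ):

```python
def zero_intercept_slope(coarse, fine):
    """Least-squares slope through the origin: sum(x*y) / sum(x*x)."""
    return (sum(x * y for x, y in zip(coarse, fine))
            / sum(x * x for x in coarse))

# Paired MODIS-scale and Landsat-scale monthly AET samples (mm, toy values):
modis_aet = [50.0, 80.0, 120.0]
landsat_aet = [55.0, 88.0, 132.0]
slope = zero_intercept_slope(modis_aet, landsat_aet)

# Downscale a new MODIS AET value with the zero-intercept relation:
print(round(slope * 100.0, 1))  # 110.0
```

    Dropping the intercept avoids the large-area slope/intercept problem noted above, since the relation is forced through zero AET.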

  14. a Spiral-Based Downscaling Method for Generating 30 M Time Series Image Data

    NASA Astrophysics Data System (ADS)

    Liu, B.; Chen, J.; Xing, H.; Wu, H.; Zhang, J.

    2017-09-01

    The spatial detail and update frequency of land cover data are important factors influencing land surface dynamic monitoring applications at high spatial resolution. However, the fragmented patches and seasonal variability of some land cover types (e.g. small crop fields, wetlands) make the generation of land cover data labor-intensive and difficult. Utilizing high spatial resolution multi-temporal image data is a possible solution. Unfortunately, the spatial and temporal resolutions of available remote sensing data such as the Landsat or MODIS datasets can hardly satisfy the minimum mapping unit and the update frequency of current land cover mapping/updating at the same time. Generating a high-resolution time series may be a compromise to cover this shortage in the land cover updating process. One popular way is to downscale multi-temporal MODIS data with high spatial resolution auxiliary data such as Landsat. However, the usual manner of downscaling a pixel based on a window may lead to an underdetermined problem in heterogeneous areas, resulting in uncertainty for some high spatial resolution pixels; the downscaled multi-temporal data can therefore hardly reach a spatial resolution as high as that of Landsat data. A spiral-based method was introduced to downscale image data of low spatial and high temporal resolution to high spatial and high temporal resolution. By searching for similar pixels in the adjacent region along a spiral, a pixel set is built up pixel by pixel, and solving the linear system with this pixel set largely prevents the underdetermined problem. With the help of ordinary least squares, the method inverts the endmember values of the linear system. The high spatial resolution image is reconstructed band by band on the basis of a high spatial resolution class map and the endmember values. Then, the high spatial resolution time series is formed with these high spatial
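
    The core inversion, recovering per-class endmember values from coarse pixel values and known class fractions by ordinary least squares, can be sketched for two classes via the normal equations (toy fractions and values; the spiral search that assembles the pixel set is omitted):

```python
def solve_endmembers(fractions, coarse_values):
    """Solve coarse_value_i = f_i1 * m1 + f_i2 * m2 for the endmember
    values (m1, m2) by ordinary least squares (normal equations)."""
    s11 = sum(f[0] * f[0] for f in fractions)
    s12 = sum(f[0] * f[1] for f in fractions)
    s22 = sum(f[1] * f[1] for f in fractions)
    b1 = sum(f[0] * v for f, v in zip(fractions, coarse_values))
    b2 = sum(f[1] * v for f, v in zip(fractions, coarse_values))
    det = s11 * s22 - s12 * s12
    return ((s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det)

# Three coarse pixels mixing two classes (e.g. crop / wetland) with known
# fractions; the coarse values are exact mixtures of endmembers 0.6 and 0.2.
fractions = [(0.8, 0.2), (0.5, 0.5), (0.2, 0.8)]
coarse = [0.8 * 0.6 + 0.2 * 0.2,
          0.5 * 0.6 + 0.5 * 0.2,
          0.2 * 0.6 + 0.8 * 0.2]
print(solve_endmembers(fractions, coarse))  # ~ (0.6, 0.2)
```

    With the endmember values recovered, the fine-resolution image is rebuilt by assigning each fine pixel the endmember value of its class.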

  15. Uncertainty Assessment of the NASA Earth Exchange Global Daily Downscaled Climate Projections (NEX-GDDP) Dataset

    NASA Technical Reports Server (NTRS)

    Wang, Weile; Nemani, Ramakrishna R.; Michaelis, Andrew; Hashimoto, Hirofumi; Dungan, Jennifer L.; Thrasher, Bridget L.; Dixon, Keith W.

    2016-01-01

    The NASA Earth Exchange Global Daily Downscaled Projections (NEX-GDDP) dataset comprises downscaled climate projections derived from 21 General Circulation Model (GCM) runs conducted under the Coupled Model Intercomparison Project Phase 5 (CMIP5), across two of the four greenhouse gas emissions scenarios (RCP4.5 and RCP8.5). Each of the climate projections includes daily maximum temperature, minimum temperature, and precipitation for the period from 1950 through 2100, at a spatial resolution of 0.25 degrees (approximately 25 km x 25 km). The GDDP dataset has been warmly received by the science community for conducting studies of climate change impacts at local to regional scales, but a comprehensive evaluation of its uncertainties is still missing. In this study, we apply the Perfect Model Experiment framework (Dixon et al. 2016) to quantify the key sources of uncertainty from the observational baseline dataset, the downscaling algorithm, and some intrinsic assumptions (e.g., the stationarity assumption) inherent in the statistical downscaling techniques. We developed a set of metrics to evaluate downscaling errors resulting from bias correction ("quantile mapping"), spatial disaggregation, and the temporal-spatial non-stationarity of climate variability. Our results highlight the spatial disaggregation (or interpolation) errors, which dominate the overall uncertainties of the GDDP dataset, especially over heterogeneous and complex terrain (e.g., mountains and coastal areas). In comparison, the temporal errors in the GDDP dataset tend to be more constrained. Our results also indicate that the downscaled daily precipitation has relatively larger uncertainties than the temperature fields, reflecting the rather stochastic nature of precipitation in space. These results thus provide insights for improving statistical downscaling algorithms and products in the future.

  16. Individual differences in ensemble perception reveal multiple, independent levels of ensemble representation.

    PubMed

    Haberman, Jason; Brady, Timothy F; Alvarez, George A

    2015-04-01

    Ensemble perception, including the ability to "see the average" from a group of items, operates in numerous feature domains (size, orientation, speed, facial expression, etc.). Although the ubiquity of ensemble representations is well established, the large-scale cognitive architecture of this process remains poorly defined. We address this using an individual differences approach. In a series of experiments, observers saw groups of objects and reported either a single item from the group or the average of the entire group. High-level ensemble representations (e.g., average facial expression) showed complete independence from low-level ensemble representations (e.g., average orientation). In contrast, low-level ensemble representations (e.g., orientation and color) were correlated with each other, but not with high-level ensemble representations (e.g., facial expression and person identity). These results suggest that there is not a single domain-general ensemble mechanism, and that the relationship among various ensemble representations depends on how proximal they are in representational space.

  17. Deriving temporally continuous soil moisture estimations at fine resolution by downscaling remotely sensed product

    NASA Astrophysics Data System (ADS)

    Jin, Yan; Ge, Yong; Wang, Jianghao; Heuvelink, Gerard B. M.

    2018-06-01

    Land surface soil moisture (SSM) plays important roles in the energy balance of the land surface and in the water cycle. Downscaling of coarse-resolution SSM remote sensing products is an efficient way of producing fine-resolution data. However, the most widely used downscaling methods require full-coverage visible/infrared satellite data as ancillary information. These methods are restricted to cloud-free days, making them unsuitable for continuous monitoring. The purpose of this study is to overcome this limitation and obtain temporally continuous fine-resolution SSM estimates. The local spatial heterogeneity of SSM and multiscale ancillary variables were considered in the downscaling process, both to address the strong variability of SSM and to benefit from the fusion of ancillary information. The generation of continuous downscaled remote sensing data was achieved in two principal steps. For cloud-free days, a stepwise hybrid geostatistical downscaling approach based on geographically weighted area-to-area regression kriging (GWATARK) was employed, combining multiscale ancillary variables with passive microwave remote sensing data. Then, the GWATARK-estimated SSM and the China Soil Moisture Dataset from Microwave Data Assimilation SSM data were combined to estimate fine-resolution data for cloudy days. The developed methodology was validated by application to the 25-km resolution daily AMSR-E SSM product to produce continuous SSM estimates at 1-km resolution over the Tibetan Plateau. In comparison with ground-based observations, the downscaled estimates showed correlation (R ≥ 0.7) for both ascending and descending overpasses. The analysis indicates the high potential of the proposed approach for producing a temporally continuous SSM product at fine spatial resolution.

  18. Ensembl regulation resources

    PubMed Central

    Zerbino, Daniel R.; Johnson, Nathan; Juetteman, Thomas; Sheppard, Dan; Wilder, Steven P.; Lavidas, Ilias; Nuhn, Michael; Perry, Emily; Raffaillac-Desfosses, Quentin; Sobral, Daniel; Keefe, Damian; Gräf, Stefan; Ahmed, Ikhlak; Kinsella, Rhoda; Pritchard, Bethan; Brent, Simon; Amode, Ridwan; Parker, Anne; Trevanion, Steven; Birney, Ewan; Dunham, Ian; Flicek, Paul

    2016-01-01

    New experimental techniques in epigenomics allow researchers to assay a diversity of highly dynamic features such as histone marks, DNA modifications or chromatin structure. The study of their fluctuations should provide insights into gene expression regulation, cell differentiation and disease. The Ensembl project collects and maintains the Ensembl regulation data resources on epigenetic marks, transcription factor binding and DNA methylation for human and mouse, as well as microarray probe mappings and annotations for a variety of chordate genomes. From this data, we produce a functional annotation of the regulatory elements along the human and mouse genomes with plans to expand to other species as data becomes available. Starting from well-studied cell lines, we will progressively expand our library of measurements to a greater variety of samples. Ensembl’s regulation resources provide a central and easy-to-query repository for reference epigenomes. As with all Ensembl data, it is freely available at http://www.ensembl.org, from the Perl and REST APIs and from the public Ensembl MySQL database server at ensembldb.ensembl.org. Database URL: http://www.ensembl.org PMID:26888907

  19. Identification of robust statistical downscaling methods based on a comprehensive suite of performance metrics for South Korea

    NASA Astrophysics Data System (ADS)

    Eum, H. I.; Cannon, A. J.

    2015-12-01

    Climate models are a key resource for investigating the impacts of projected future climate conditions on regional hydrologic systems. However, there is a considerable mismatch in spatial resolution between GCMs and regional applications, particularly for regions characterized by complex terrain such as the Korean Peninsula. A downscaling procedure is therefore essential to assess regional impacts of climate change. Numerous statistical downscaling methods have been used, mainly because of their computational efficiency and simplicity. In this study, four statistical downscaling methods [Bias-Correction/Spatial Disaggregation (BCSD), Bias-Correction/Constructed Analogue (BCCA), Multivariate Adaptive Constructed Analogs (MACA), and Bias-Correction/Climate Imprint (BCCI)] are applied to downscale the latest Climate Forecast System Reanalysis (CFSR) data to stations for precipitation, maximum temperature, and minimum temperature over South Korea. Using a split-sampling scheme, all methods are calibrated with observational station data for the 19 years from 1973 to 1991 and tested on the recent 19 years from 1992 to 2010. To assess the skill of the downscaling methods, we construct a comprehensive suite of performance metrics that measure the ability to reproduce temporal correlation, distributions, spatial correlation, and extreme events. In addition, we employ the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) to identify robust statistical downscaling methods based on the performance metrics for each season. The results show that downscaling skill is considerably affected by the skill of CFSR, and all methods lead to large improvements across all performance metrics. When TOPSIS is applied to the seasonal performance metrics, MACA is identified as the most reliable and robust method for all variables and seasons. Note that this result is derived from CFSR output, which is regarded as near-perfect climate data in climate studies. Therefore, the

  20. NASA Downscaling Project: Final Report

    NASA Technical Reports Server (NTRS)

    Ferraro, Robert; Waliser, Duane; Peters-Lidard, Christa

    2017-01-01

    A team of researchers from NASA Ames Research Center, Goddard Space Flight Center, the Jet Propulsion Laboratory, and Marshall Space Flight Center, along with university partners at UCLA, conducted an investigation to explore whether downscaling coarse resolution global climate model (GCM) predictions might provide valid insights into the regional impacts sought by decision makers. Since the computational cost of running global models at high spatial resolution for any useful climate scale period is prohibitive, the hope for downscaling is that a coarse resolution GCM provides sufficiently accurate synoptic scale information for a regional climate model (RCM) to accurately develop fine scale features that represent the regional impacts of a changing climate. As a proxy for a prognostic climate forecast model, and so that ground truth in the form of satellite and in-situ observations could be used for evaluation, the MERRA and MERRA-2 reanalyses were used to drive the NU-WRF regional climate model and a GEOS-5 replay. This was performed at various resolutions that were at factors of 2 to 10 higher than the reanalysis forcing. A number of experiments were conducted that varied resolution, model parameterizations, and intermediate scale nudging, for simulations over the continental US during the period from 2000-2010. The results of these experiments were compared to observational datasets to evaluate the output.

  1. Reliability of the North America CORDEX and NARCCAP simulations in the context of uncertainty in regional climate change projections

    NASA Astrophysics Data System (ADS)

    Karmalkar, A.

    2017-12-01

    Ensembles of dynamically downscaled climate change simulations are routinely used to capture uncertainty in projections at regional scales. I assess the reliability of two such ensembles for North America - NARCCAP and NA-CORDEX - by investigating the impact of model selection on representing uncertainty in regional projections, and the ability of the regional climate models (RCMs) to provide reliable information. These aspects - discussed for the six regions used in the US National Climate Assessment - provide an important perspective on the interpretation of downscaled results. I show that selecting general circulation models for downscaling based on their equilibrium climate sensitivities is a reasonable choice, but the six models chosen for NA-CORDEX do a poor job of representing uncertainty in winter temperature and precipitation projections in many parts of the eastern US, which leads to overconfident projections. RCM performance is highly variable across models, regions, and seasons, and the ability of the RCMs to provide improved seasonal mean performance relative to their parent GCMs seems limited in both RCM ensembles. Additionally, the ability of the RCMs to simulate historical climates is not strongly related to their ability to simulate climate change across the ensemble. This finding suggests limited use of models' historical performance to constrain their projections. Given these challenges in dynamical downscaling, RCM results should not be used in isolation. Information on how well the RCM ensembles represent known uncertainties in regional climate change projections, as discussed here, needs to be communicated clearly to inform management decisions.

  2. The effects of climate downscaling technique and observational data set on modeled ecological responses

    Treesearch

    Afshin Pourmokhtarian; Charles T. Driscoll; John L. Campbell; Katharine Hayhoe; Anne M. K. Stoner

    2016-01-01

    Assessments of future climate change impacts on ecosystems typically rely on multiple climate model projections, but often utilize only one downscaling approach trained on one set of observations. Here, we explore the extent to which modeled biogeochemical responses to changing climate are affected by the selection of the climate downscaling method and training...

  3. Importance of Preserving Cross-correlation in developing Statistically Downscaled Climate Forcings and in estimating Land-surface Fluxes and States

    NASA Astrophysics Data System (ADS)

    Das Bhowmik, R.; Arumugam, S.

    2015-12-01

    Multivariate downscaling techniques have exhibited superiority over univariate regression schemes in preserving cross-correlations between multiple variables - precipitation and temperature - from GCMs. This study focuses on two aspects: (a) developing analytical solutions for estimating biases in cross-correlations from univariate downscaling approaches, and (b) quantifying the uncertainty in land-surface states and fluxes due to biases in the cross-correlations of downscaled climate forcings. Both aspects are evaluated using climate forcings from historical climate simulations and CMIP5 hindcasts over the entire US. The analytical solution relates the univariate regression parameters, the coefficient of determination of the regression, and the covariance ratio between GCM and downscaled values. The analytical solutions are compared with the downscaled univariate forcings by choosing the desired p-value (Type-1 error) for preserving the observed cross-correlation. To quantify the impacts of cross-correlation biases on streamflow and groundwater estimates, we corrupt the downscaled climate forcings with different cross-correlation structures.

  4. MSEBAG: a dynamic classifier ensemble generation based on 'minimum-sufficient ensemble' and bagging

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Kamel, Mohamed S.

    2016-01-01

    In this paper, we propose a dynamic classifier system, MSEBAG, which is characterised by searching for the 'minimum-sufficient ensemble' and bagging at the ensemble level. It adopts an 'over-generation and selection' strategy and aims to achieve a good bias-variance trade-off. In the training phase, MSEBAG first searches for the 'minimum-sufficient ensemble', which maximises the in-sample fitness with the minimal number of base classifiers. Then, starting from the 'minimum-sufficient ensemble', a backward stepwise algorithm is employed to generate a collection of ensembles. The objective is to create a collection of ensembles with a descending fitness on the data, as well as a descending complexity in the structure. MSEBAG dynamically selects the ensembles from the collection for the decision aggregation. The extended adaptive aggregation (EAA) approach, a bagging-style algorithm performed at the ensemble level, is employed for this task. EAA searches for the competent ensembles using a score function, which takes into consideration both the in-sample fitness and the confidence of the statistical inference, and averages the decisions of the selected ensembles to label the test pattern. The experimental results show that the proposed MSEBAG outperforms the benchmarks on average.
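
    The 'minimum-sufficient ensemble' search described above can be approximated by a greedy forward selection that keeps a base classifier only if it improves in-sample majority-vote accuracy. This is a hypothetical simplification for illustration, not the authors' exact algorithm (which also includes backward stepwise generation and the EAA aggregation step).

```python
import numpy as np

def minimum_sufficient_ensemble(preds, y):
    """Greedy sketch of the 'minimum-sufficient ensemble' idea: scan base
    classifiers (rows of `preds`, each a vector of 0/1 predictions on the
    training set) and keep each one only if adding it improves in-sample
    majority-vote accuracy. Returns the chosen indices and their accuracy."""
    def vote_acc(idx):
        # Majority vote of the selected classifiers, compared against labels
        votes = preds[idx].mean(axis=0) >= 0.5
        return (votes == y).mean()

    chosen, best = [], 0.0
    for i in range(len(preds)):
        trial = chosen + [i]
        acc = vote_acc(trial)
        if acc > best:
            chosen, best = trial, acc
    return chosen, best
```

    Because the scan stops adding classifiers once accuracy saturates, the selected subset stays small, mirroring the paper's goal of maximal in-sample fitness with a minimal number of base classifiers.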

  5. Expansion of the On-line Archive "Statistically Downscaled WCRP CMIP3 Climate Projections"

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Das, T.; Duffy, P.; White, K.

    2009-12-01

    response, archive developers are adding content in 2010, teaming with the Scripps Institution of Oceanography (through their NOAA-RISA California-Nevada Applications Program and the California Climate Change Center) to apply a new daily downscaling technique to a sub-ensemble of the archive’s CMIP3 projections. The new technique, Bias-Corrected Constructed Analogs, combines the BC part of BCSD with a recently developed technique that preserves the daily sequencing structure of CMIP3 projections (Constructed Analogs, or CA). Such data will more easily serve hydrologic and ecological impacts assessments, and offer an opportunity to evaluate projection uncertainty associated with the downscaling technique. Looking ahead to the arrival of CMIP5 projections, archive collaborators plan to apply both BCSD and BCCA over the contiguous U.S., consistent with the CMIP3 applications above, and also to apply BCSD globally at a 0.5-degree spatial resolution. The latter effort involves collaboration with the U.S. Army Corps of Engineers (USACE) and Climate Central.

  6. The fundamental downscaling limit of field effect transistors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mamaluy, Denis, E-mail: mamaluy@sandia.gov; Gao, Xujiao

    2015-05-11

    We predict that within the next 15 years a fundamental downscaling limit for CMOS technology and other field-effect transistors (FETs) will be reached. Specifically, we show that at room temperature all FETs, irrespective of their channel material, will start experiencing unacceptable levels of thermally induced errors around 5-nm gate lengths. These findings were confirmed by performing quantum mechanical transport simulations for a variety of 6-, 5-, and 4-nm gate length Si devices, optimized to satisfy the high-performance logic specifications of the ITRS. Different channel materials and wafer/channel orientations have also been studied; it is found that altering channel-source-drain materials achieves only an insignificant increase in switching energy, which overall cannot sufficiently delay the approaching downscaling limit. Alternative possibilities are discussed for continuing the increase of logic element densities for room-temperature operation below the said limit.

  7. The fundamental downscaling limit of field effect transistors

    DOE PAGES

    Mamaluy, Denis; Gao, Xujiao

    2015-05-12

    We predict that within the next 15 years a fundamental downscaling limit for CMOS technology and other field-effect transistors (FETs) will be reached. Specifically, we show that at room temperature all FETs, irrespective of their channel material, will start experiencing unacceptable levels of thermally induced errors around 5-nm gate lengths. These findings were confirmed by performing quantum mechanical transport simulations for a variety of 6-, 5-, and 4-nm gate length Si devices, optimized to satisfy the high-performance logic specifications of the ITRS. Different channel materials and wafer/channel orientations have also been studied; it is found that altering channel-source-drain materials achieves only an insignificant increase in switching energy, which overall cannot sufficiently delay the approaching downscaling limit. Alternative possibilities are discussed for continuing the increase of logic element densities for room-temperature operation below the said limit.

  8. Ensembl 2004.

    PubMed

    Birney, E; Andrews, D; Bevan, P; Caccamo, M; Cameron, G; Chen, Y; Clarke, L; Coates, G; Cox, T; Cuff, J; Curwen, V; Cutts, T; Down, T; Durbin, R; Eyras, E; Fernandez-Suarez, X M; Gane, P; Gibbins, B; Gilbert, J; Hammond, M; Hotz, H; Iyer, V; Kahari, A; Jekosch, K; Kasprzyk, A; Keefe, D; Keenan, S; Lehvaslaiho, H; McVicker, G; Melsopp, C; Meidl, P; Mongin, E; Pettett, R; Potter, S; Proctor, G; Rae, M; Searle, S; Slater, G; Smedley, D; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Storey, R; Ureta-Vidal, A; Woodwark, C; Clamp, M; Hubbard, T

    2004-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organize biology around the sequences of large genomes. It is a comprehensive and integrated source of annotation of large genome sequences, available via interactive website, web services or flat files. As well as being one of the leading sources of genome annotation, Ensembl is an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements. The facilities of the system range from sequence analysis to data storage and visualization and installations exist around the world both in companies and at academic sites. With a total of nine genome sequences available from Ensembl and more genomes to follow, recent developments have focused mainly on closer integration between genomes and external data.

  9. Optimizing dynamic downscaling in one-way nesting using a regional ocean model

    NASA Astrophysics Data System (ADS)

    Pham, Van Sy; Hwang, Jin Hwan; Ku, Hyeyun

    2016-10-01

    Dynamical downscaling with nested regional oceanographic models has been demonstrated to be an effective approach both for operational regional-scale marine weather forecasting and for projections of future climate change and its impact on the ocean. However, when nesting procedures are carried out in dynamic downscaling from a larger-scale model or set of observations to a smaller scale, errors are unavoidable due to differences in grid sizes and updating intervals. The present work assesses the impact of errors produced by nesting procedures on the downscaled results of Ocean Regional Circulation Models (ORCMs). Errors are identified and evaluated by their sources and characteristics using the Big-Brother Experiment (BBE). The BBE uses the same model to produce both the nesting and nested simulations, so it addresses each error source separately (i.e., without combining the contributions of errors from different sources). Here, we focus on errors resulting from differences in spatial grids, updating times, and domain sizes. After the BBE was run separately for diverse cases, a Taylor diagram was used to analyze the results and recommend an optimal combination of grid size, updating period, and domain size. Finally, the suggested setups for downscaling were evaluated by examining the spatial correlations of variables and the relative magnitudes of variances between the nested model and the original data.

  10. Downscaling GISS ModelE Boreal Summer Climate over Africa

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.; Fulakeza, Matthew

    2015-01-01

    The study examines the perceived added value of downscaling atmosphere-ocean global climate model simulations over Africa and adjacent oceans with a nested regional climate model. NASA/Goddard Institute for Space Studies (GISS) coupled ModelE simulations for June-September 1998-2002 are used to form lateral boundary conditions for synchronous simulations by the GISS RM3 regional climate model. The ModelE computational grid spacing is 2deg latitude by 2.5deg longitude and the RM3 grid spacing is 0.44deg. ModelE precipitation climatology for June-September 1998-2002 is shown to be a good proxy for 30-year means, so results based on the 5-year sample are presumed to be generally representative. Comparison with observational evidence shows several discrepancies in the ModelE configuration of the boreal summer inter-tropical convergence zone (ITCZ). One glaring shortcoming is that ModelE simulations do not advance the West African rain band northward during the summer to represent monsoon precipitation onset over the Sahel. Results for 1998-2002 show that onset simulation is an important added value produced by downscaling with RM3. ModelE computed sea-surface temperatures (SST) in the eastern South Atlantic Ocean are some 4 K warmer than reanalysis, contributing to large positive biases in the overlying surface air temperatures (Tsfc). ModelE Tsfc are also too warm over most of Africa. RM3 downscaling somewhat mitigates the magnitude of Tsfc biases over the African continent, eliminates the ModelE double ITCZ over the Atlantic, and produces more realistic orographic precipitation maxima. Parallel ModelE and RM3 simulations with observed SST forcing (in place of the predicted ocean) lower Tsfc errors but have mixed impacts on circulation and precipitation biases. Downscaling improvements in the meridional movement of the rain band over West Africa and in the configuration of orographic precipitation maxima are realized irrespective of the SST biases.

  11. Spatial downscaling of soil prediction models based on weighted generalized additive models in smallholder farm settings.

    PubMed

    Xu, Yiming; Smith, Scot E; Grunwald, Sabine; Abd-Elrahman, Amr; Wani, Suhas P; Nair, Vimala D

    2017-09-11

    Digital soil mapping (DSM) is gaining momentum as a technique to help smallholder farmers achieve soil security and food security in developing regions. However, communication of digital soil mapping information between diverse audiences becomes problematic due to the inconsistent scale of DSM information. Spatial downscaling can make use of accessible soil information at relatively coarse spatial resolution to provide valuable soil information at relatively fine spatial resolution. The objective of this research was to disaggregate coarse-spatial-resolution base maps of soil exchangeable potassium (Kex) and soil total nitrogen (TN) into fine-spatial-resolution downscaled soil maps using weighted generalized additive models (GAMs) in two smallholder villages in South India. By incorporating fine-spatial-resolution spectral indices in the downscaling process, the downscaled soil maps not only conserve the spatial information of the coarse-resolution soil maps but also depict the spatial details of soil properties at fine resolution. The results of this study demonstrated that the difference between the fine-resolution downscaled maps and fine-resolution base maps is smaller than the difference between the coarse-resolution base maps and fine-resolution base maps. An appropriate and economical strategy for promoting the DSM technique on smallholder farms is to develop relatively coarse-resolution soil prediction maps (or utilize available coarse-resolution soil maps) at the regional scale and to disaggregate these maps into fine-resolution downscaled soil maps at the farm scale.

  12. Statistical Downscaling and Bias Correction of Climate Model Outputs for Climate Change Impact Assessment in the U.S. Northeast

    NASA Technical Reports Server (NTRS)

    Ahmed, Kazi Farzan; Wang, Guiling; Silander, John; Wilson, Adam M.; Allen, Jenica M.; Horton, Radley; Anyah, Richard

    2013-01-01

    Statistical downscaling can be used to efficiently downscale a large number of General Circulation Model (GCM) outputs to a fine temporal and spatial scale. To facilitate regional impact assessments, this study statistically downscales (to 1/8deg spatial resolution) and corrects the bias of daily maximum and minimum temperature and daily precipitation data from six GCMs and four Regional Climate Models (RCMs) for the northeast United States (US) using the Statistical Downscaling and Bias Correction (SDBC) approach. Based on these downscaled data from multiple models, five extreme indices were analyzed for the future climate to quantify future changes of climate extremes. For a subset of models and indices, results based on raw and bias corrected model outputs for the present-day climate were compared with observations, which demonstrated that bias correction is important not only for GCM outputs, but also for RCM outputs. For future climate, bias correction led to a higher level of agreements among the models in predicting the magnitude and capturing the spatial pattern of the extreme climate indices. We found that the incorporation of dynamical downscaling as an intermediate step does not lead to considerable differences in the results of statistical downscaling for the study domain.
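
    As one concrete illustration of the kind of extreme climate index such studies compute (the five indices used in this work are not named in the abstract, so the index and threshold below are a generic, hypothetical example):

```python
import numpy as np

def warm_days_index(tmax_daily, base_period_tmax):
    """Count of days whose daily maximum temperature exceeds the 90th
    percentile of a base period -- a common style of warm-extreme index,
    computed here from downscaled daily temperature series."""
    threshold = np.percentile(base_period_tmax, 90)
    return int((tmax_daily > threshold).sum())
```

    Applying such an index to both raw and bias-corrected model output, as the study does, makes the effect of bias correction on the tails of the distribution directly visible.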

  13. Downscaled and debiased climate simulations for North America from 21,000 years ago to 2100AD

    PubMed Central

    Lorenz, David J.; Nieto-Lugilde, Diego; Blois, Jessica L.; Fitzpatrick, Matthew C.; Williams, John W.

    2016-01-01

    Increasingly, ecological modellers are integrating paleodata with future projections to understand climate-driven biodiversity dynamics from the past through the current century. Climate simulations from earth system models are necessary to this effort, but must be debiased and downscaled before they can be used by ecological models. Downscaling methods and observational baselines vary among researchers, which produces confounding biases among downscaled climate simulations. We present unified datasets of debiased and downscaled climate simulations for North America from 21 ka BP to 2100AD, at 0.5° spatial resolution. Temporal resolution is decadal averages of monthly data until 1950AD, average climates for 1950–2005 AD, and monthly data from 2010 to 2100AD, with decadal averages also provided. This downscaling includes two transient paleoclimatic simulations and 12 climate models for the IPCC AR5 (CMIP5) historical (1850–2005), RCP4.5, and RCP8.5 21st-century scenarios. Climate variables include primary variables and derived bioclimatic variables. These datasets provide a common set of climate simulations suitable for seamlessly modelling the effects of past and future climate change on species distributions and diversity. PMID:27377537

  14. Downscaled and debiased climate simulations for North America from 21,000 years ago to 2100AD.

    PubMed

    Lorenz, David J; Nieto-Lugilde, Diego; Blois, Jessica L; Fitzpatrick, Matthew C; Williams, John W

    2016-07-05

    Increasingly, ecological modellers are integrating paleodata with future projections to understand climate-driven biodiversity dynamics from the past through the current century. Climate simulations from earth system models are necessary to this effort, but must be debiased and downscaled before they can be used by ecological models. Downscaling methods and observational baselines vary among researchers, which produces confounding biases among downscaled climate simulations. We present unified datasets of debiased and downscaled climate simulations for North America from 21 ka BP to 2100AD, at 0.5° spatial resolution. Temporal resolution is decadal averages of monthly data until 1950AD, average climates for 1950-2005 AD, and monthly data from 2010 to 2100AD, with decadal averages also provided. This downscaling includes two transient paleoclimatic simulations and 12 climate models for the IPCC AR5 (CMIP5) historical (1850-2005), RCP4.5, and RCP8.5 21st-century scenarios. Climate variables include primary variables and derived bioclimatic variables. These datasets provide a common set of climate simulations suitable for seamlessly modelling the effects of past and future climate change on species distributions and diversity.

  15. Statistical downscaling and future scenario generation of temperatures for Pakistan Region

    NASA Astrophysics Data System (ADS)

    Kazmi, Dildar Hussain; Li, Jianping; Rasul, Ghulam; Tong, Jiang; Ali, Gohar; Cheema, Sohail Babar; Liu, Luliu; Gemmer, Marco; Fischer, Thomas

    2015-04-01

    Impact studies require climate change information at finer spatial scales than is presently provided by global or regional climate models. This is especially true for regions like South Asia, with complex topography, coastal or island locations, and areas of highly heterogeneous land cover. To deal with this situation, an inexpensive method, statistical downscaling, has been adopted. The Statistical DownScaling Model (SDSM) was employed to downscale daily minimum and maximum temperature data from 44 national stations for the base period (1961-1990), and future scenarios were then generated up to 2099. Observed data as well as predictor data (a product of the National Oceanic and Atmospheric Administration) were calibrated and tested on an individual/multiple basis through linear regression. Future scenarios were generated from HadCM3 daily data for the A2 and B2 storylines. The downscaled data were tested and showed a relatively strong relationship with observations in comparison to ECHAM5 data. Generally, the southern half of the country is considered vulnerable in terms of increasing temperatures, but this study projects that in the future the northern belt in particular faces a possible threat of rising air temperatures. In particular, the northern areas (hosting the third-largest ice reserves after the Polar Regions), an important feeding source for the Indus River, are projected to be vulnerable to increasing temperatures. Consequently, not only the hydro-agricultural sector but also the environmental conditions in the area may be at risk in the future.
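
    The regression step at the core of SDSM-style statistical downscaling can be sketched as a least-squares linear model linking large-scale predictors to a station temperature series. This is a minimal illustration, not the SDSM software itself; the function names and data are hypothetical.

```python
import numpy as np

def fit_sdsm_like(predictors, station_temp):
    """Fit a least-squares linear model from large-scale predictor fields
    (e.g. reanalysis-derived variables, one column each) to an observed
    station temperature series. Returns [intercept, slope_1, ..., slope_k]."""
    A = np.column_stack([np.ones(len(predictors)), predictors])
    coef, *_ = np.linalg.lstsq(A, station_temp, rcond=None)
    return coef

def apply_sdsm_like(coef, predictors):
    """Drive the fitted model with new predictors (e.g. GCM output for a
    future storyline) to generate a downscaled temperature scenario."""
    A = np.column_stack([np.ones(len(predictors)), predictors])
    return A @ coef
```

    Calibration uses observed-period predictors; scenario generation simply swaps in GCM predictors for the future period, which is the essence of the calibrate-then-project workflow described above.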

  16. Downscaling of Seasonal Landsat-8 and MODIS Land Surface Temperature (LST) in Kolkata, India

    NASA Astrophysics Data System (ADS)

    Garg, R. D.; Guha, S.; Mondal, A.; Lakshmi, V.; Kundu, S.

    2017-12-01

    The quality of life of urban residents is affected by the urban heat environment. Urban heat studies can be carried out using remotely sensed thermal infrared imagery to retrieve Land Surface Temperature (LST). Currently, high spatial resolution (<200 m) thermal images are limited and their temporal resolution is low (e.g., 17 days for Landsat-8). Coarse spatial resolution (1000 m) but high temporal resolution (daily) thermal images from MODIS (Moderate Resolution Imaging Spectroradiometer) are frequently available. The present study downscales a spatially coarser-resolution thermal image to a finer-resolution thermal image using a regression-based downscaling technique. This method is based on the relationship between LST and vegetation indices (e.g., the Normalized Difference Vegetation Index, NDVI) over a heterogeneous landscape. Kolkata, a metropolitan city that experiences a tropical wet-and-dry climate, was selected for the study. The study applied seasonal open-source satellite images from Landsat-8 and Terra MODIS. The Landsat-8 images were aggregated to 960 m resolution and downscaled to 480, 240, 120, and 60 m. The optical and thermal resolutions are 30 m and 60 m for Landsat-8, and 250 m and 1000 m for MODIS, respectively. Homogeneous land cover areas showed better accuracy than heterogeneous land cover areas. The downscaling method plays a crucial role when the coarse spatial resolution of the thermal band renders it unsuitable for advanced study. Key words: Land Surface Temperature (LST), Downscale, MODIS, Landsat, Kolkata
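
    The regression-based LST downscaling described above (LST regressed on NDVI) can be sketched as follows. This is a generic TsHARP-style illustration under simplified assumptions (a single global linear fit, block-replicated residuals), not the authors' exact procedure.

```python
import numpy as np

def regression_downscale(lst_coarse, ndvi_coarse, ndvi_fine, scale):
    """Sharpen a coarse LST grid using the LST-NDVI relationship:
    1) fit LST = a*NDVI + b at coarse resolution,
    2) predict LST from fine-resolution NDVI,
    3) add back the coarse-scale residual (block-replicated) so that
       coarse-scale information is preserved."""
    a, b = np.polyfit(ndvi_coarse.ravel(), lst_coarse.ravel(), 1)
    resid = lst_coarse - (a * ndvi_coarse + b)
    resid_fine = np.kron(resid, np.ones((scale, scale)))  # replicate blocks
    return a * ndvi_fine + b + resid_fine
```

    When the coarse LST is exactly linear in NDVI, the residual term vanishes and the output is simply the fitted relation evaluated on the fine NDVI grid.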

  16. Ensemble Canonical Correlation Prediction of Seasonal Precipitation Over the US

    NASA Technical Reports Server (NTRS)

    Lau, William K. M.; Kim, Kyu-Myong; Shen, Samuel; Einaudi, Franco (Technical Monitor)

    2001-01-01

    This paper presents preliminary results of an ensemble canonical correlation (ECC) prediction scheme developed at the Climate and Radiation Branch, NASA/Goddard Space Flight Center for determining the potential predictability of regional precipitation and for climate downscaling studies. The scheme is tested on seasonal hindcasts of anomalous precipitation over the continental United States using global sea surface temperature (SST) for 1951-2000. To maximize the forecast skill derived from SST, the world ocean is divided into nonoverlapping sectors. The canonical SST modes for each sector are used as the predictors for the ensemble hindcasts. Results show that the ECC yields a substantial (10-25%) increase in prediction skill for all regions of the US and for all seasons compared to traditional CCA prediction schemes. For the boreal winter, the tropical Pacific contributes the largest potential predictability to precipitation in the southwestern and southeastern regions, while the North Pacific and the North Atlantic are responsible for enhanced forecast skills in the Pacific Northwest, the northern Great Plains and the Ohio Valley. Most importantly, the ECC increases skill for summertime precipitation prediction and substantially reduces the spring predictability barrier over all regions of the US continent. Besides SST, the ECC is designed with the flexibility to include any number of predictor fields, such as soil moisture, snow cover and regional data. Moreover, the ECC forecasts can be applied to other climate subsystems and, in conjunction with further diagnostic or model studies, will enable a better understanding of the dynamic links between climate variations and precipitation, not only for the US, but also for other regions of the world.

  18. The Ensemble Canon

    NASA Technical Reports Server (NTRS)

    Mittman, David S.

    2011-01-01

    Ensemble is an open architecture for the development, integration, and deployment of mission operations software. Fundamentally, it is an adaptation of the Eclipse Rich Client Platform (RCP), a widespread, stable, and supported framework for component-based application development. By capitalizing on the maturity and availability of the Eclipse RCP, Ensemble offers a low-risk, politically neutral path towards a tighter integration of operations tools. The Ensemble project is a highly successful, ongoing collaboration among NASA Centers. Since 2004, the Ensemble project has supported the development of mission operations software for NASA's Exploration Systems, Science, and Space Operations Directorates.

  19. Climate change projections for winter precipitation over Tropical America using statistical downscaling

    NASA Astrophysics Data System (ADS)

    Palomino-Lemus, Reiner; Córdoba-Machado, Samir; Quishpe-Vásquez, César; García-Valdecasas-Ojeda, Matilde; Raquel Gámiz-Fortis, Sonia; Castro-Díez, Yolanda; Jesús Esteban-Parra, María

    2017-04-01

    In this study the Principal Component Regression (PCR) method has been used as a statistical downscaling technique for simulating boreal winter precipitation in Tropical America during the period 1950-2010, and then for generating climate change projections for the 2071-2100 period. The study uses the Global Precipitation Climatology Centre (GPCC, version 6) dataset over the Tropical America region [30°N-30°S, 120°W-30°W] as the predictand variable in the downscaling model. The mean monthly sea level pressure (SLP) from the National Centers for Environmental Prediction - National Center for Atmospheric Research (NCEP-NCAR) reanalysis project has been used as the predictor variable, covering a more extended area [30°N-30°S, 180°W-30°W]. Also, the SLP outputs from 20 GCMs taken from the Coupled Model Intercomparison Project (CMIP5) have been used. The model data include simulations with historical atmospheric concentrations and future projections for the representative concentration pathways RCP2.6, RCP4.5, and RCP8.5. The ability of the different GCMs to simulate the winter precipitation in the study area for the present climate (1971-2000) was analyzed by calculating the differences between the simulated and observed precipitation values. Additionally, the statistical significance at the 95% confidence level of these differences has been estimated by means of the two-sided Wilcoxon-Mann-Whitney rank-sum test. Finally, to project winter precipitation in the area for the period 2071-2100, the downscaling model, recalibrated for the total period 1950-2010, was applied to the SLP outputs of the GCMs under the RCP2.6, RCP4.5, and RCP8.5 scenarios. The results show that, generally, for the present climate the statistical downscaling reproduces the precipitation field with high fidelity, while simulations performed directly using non-downscaled GCM outputs strongly distort the precipitation field. For the future climate, the projected predictions under the RCP4

  20. Ensemble Models

    EPA Science Inventory

    Ensemble forecasting has been used for operational numerical weather prediction in the United States and Europe since the early 1990s. An ensemble of weather or climate forecasts is used to characterize the two main sources of uncertainty in computer models of physical systems: ...

  1. Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project

    NASA Technical Reports Server (NTRS)

    Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa

    2013-01-01

    This software simplifies the process of setting up an Eclipse IDE programming environment for members of the cross-NASA-center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically includes the source code repositories and other vital information and settings. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up correctly for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error-prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.

  2. Dynamically downscaling predictions for deciduous tree leaf emergence in California under current and future climate.

    PubMed

    Medvigy, David; Kim, Seung Hee; Kim, Jinwon; Kafatos, Menas C

    2016-07-01

    Models that predict the timing of deciduous tree leaf emergence are typically very sensitive to temperature. However, many temperature data products, including those from climate models, have been developed at a very coarse spatial resolution. Such coarse-resolution temperature products can lead to highly biased predictions of leaf emergence. This study investigates how dynamical downscaling of climate models impacts simulations of deciduous tree leaf emergence in California. Models for leaf emergence are forced with temperatures simulated by a general circulation model (GCM) at ~200-km resolution for 1981-2000 and 2031-2050 conditions. GCM simulations are then dynamically downscaled to 32- and 8-km resolution, and leaf emergence is again simulated. For 1981-2000, the regional average leaf emergence date is 30.8 days earlier in 32-km simulations than in ~200-km simulations. Differences between the 32 and 8 km simulations are small and mostly local. The impact of downscaling from 200 to 8 km is ~15 % smaller in 2031-2050 than in 1981-2000, indicating that the impacts of downscaling are unlikely to be stationary.
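    Leaf-emergence models of the kind referenced here are commonly thermal-time (growing degree-day) models, which accumulate daily temperature above a base value until a threshold is crossed. The sketch below is a generic illustration of why a cold-biased coarse temperature product delays predicted emergence; the parameters and temperature series are invented, not those of the study.

```python
# Generic growing-degree-day (thermal time) leaf-emergence model.
# Base temperature and threshold are illustrative placeholders.

def emergence_day(daily_tmean, t_base=5.0, threshold=60.0):
    """Return the day (1-indexed) when accumulated degree-days
    above t_base first reach the threshold, or None if never."""
    gdd = 0.0
    for day, t in enumerate(daily_tmean, start=1):
        gdd += max(0.0, t - t_base)
        if gdd >= threshold:
            return day
    return None

fine   = [8.0] * 40   # downscaled (warmer) daily mean temperatures, deg C
coarse = [7.0] * 40   # coarse-grid temperatures with a 1 K cold bias

d_fine, d_coarse = emergence_day(fine), emergence_day(coarse)
```

    A 1 K cold bias translates here into a 10-day delay in predicted emergence, illustrating the sensitivity to temperature resolution that the study quantifies.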

  3. A real-time evaluation and demonstration of strategies for 'Over-The-Loop' ensemble streamflow forecasting in US watersheds

    NASA Astrophysics Data System (ADS)

    Wood, Andy; Clark, Elizabeth; Mendoza, Pablo; Nijssen, Bart; Newman, Andy; Clark, Martyn; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    ' (SHARP) to implement, assess and demonstrate real-time over-the-loop ensemble flow forecasts in a range of US watersheds. The system relies on fully ensemble-based techniques, including: a 100-member ensemble of meteorological model forcings and ensemble particle filter data assimilation for initializing watershed states; analog/regression-based downscaling of ensemble weather forecasts from GEFS; and statistical post-processing of ensemble forecast outputs, all of which run in real time within a workflow managed by ECMWF's ecFlow libraries over large US regional domains. We describe SHARP and present early hindcast and verification results for short-range to seasonal-range streamflow forecasts in a number of US case-study watersheds.

  4. Climate downscaling effects on predictive ecological models: a case study for threatened and endangered vertebrates in the southeastern United States

    USGS Publications Warehouse

    Bucklin, David N.; Watling, James I.; Speroterra, Carolina; Brandt, Laura A.; Mazzotti, Frank J.; Romañach, Stephanie S.

    2013-01-01

    High-resolution (downscaled) projections of future climate conditions are critical inputs to a wide variety of ecological and socioeconomic models and are created using numerous different approaches. Here, we conduct a sensitivity analysis of spatial predictions from climate envelope models for threatened and endangered vertebrates in the southeastern United States to determine whether two different downscaling approaches (with and without the use of a regional climate model) affect climate envelope model predictions when all other sources of variation are held constant. We found that prediction maps differed spatially between downscaling approaches and that the variation attributable to downscaling technique was comparable to variation between maps generated using different general circulation models (GCMs). Precipitation variables tended to show greater discrepancies between downscaling techniques than temperature variables, and for one GCM, there was evidence that more poorly resolved precipitation variables contributed relatively more to model uncertainty than better-resolved variables. Our work suggests that ecological modelers requiring high-resolution climate projections should carefully consider the type of downscaling applied to the climate projections prior to their use in predictive ecological modeling. The uncertainty associated with alternative downscaling methods may rival that of other, more widely appreciated sources of variation, such as the general circulation model or emissions scenario with which future climate projections are created.

  5. TopoSCALE v.1.0: downscaling gridded climate data in complex terrain

    NASA Astrophysics Data System (ADS)

    Fiddes, J.; Gruber, S.

    2014-02-01

    Simulation of land surface processes is problematic in heterogeneous terrain, both because of the high resolution required of model grids to capture strong lateral variability caused by, for example, topography, and because of the lack of accurate meteorological forcing data at the site or scale required. Gridded data products produced by atmospheric models can fill this gap; however, they are often not at an appropriate spatial resolution to drive land-surface simulations. In this study we describe a method that uses the well-resolved description of the atmospheric column provided by climate models, together with high-resolution digital elevation models (DEMs), to downscale coarse-grid climate variables to a fine-scale subgrid. The main aim of this approach is to provide high-resolution driving data for a land-surface model (LSM). The method makes use of an interpolation of pressure-level data according to the topographic height of the subgrid. An elevation and topography correction is used to downscale short-wave radiation. Long-wave radiation is downscaled by deriving a cloud component of all-sky emissivity at grid level and using downscaled temperature and relative humidity fields to describe variability with elevation. Precipitation is downscaled with a simple non-linear lapse rate and optionally disaggregated using a climatology approach. We test the method against unscaled grid-level data and a set of reference methods, using a large evaluation dataset (up to 210 stations per variable) in the Swiss Alps. We demonstrate that the method can be used to derive meteorological inputs in complex terrain, with the most significant improvements (with respect to reference methods) seen in variables derived from pressure levels: air temperature, relative humidity, wind speed and incoming long-wave radiation. This method may be of use in improving inputs to numerical simulations in heterogeneous and/or remote terrain, especially when statistical methods are not possible, due to lack of
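    The pressure-level step of TopoSCALE-like downscaling can be illustrated with a minimal sketch: air temperature at a fine-grid elevation is obtained by interpolating between the two pressure levels whose geopotential heights bracket that elevation. The column and DEM values below are invented for illustration, not taken from the paper.

```python
# Toy pressure-level interpolation to subgrid elevations,
# in the spirit of TopoSCALE's temperature downscaling step.

def interp_to_elevation(levels, target_z):
    """levels: list of (geopotential_height_m, temperature_K) tuples,
    sorted by increasing height. Linear interpolation in height."""
    for (z0, t0), (z1, t1) in zip(levels, levels[1:]):
        if z0 <= target_z <= z1:
            w = (target_z - z0) / (z1 - z0)
            return t0 + w * (t1 - t0)
    raise ValueError("target elevation outside the available levels")

# Coarse-grid atmospheric column (e.g., from reanalysis pressure levels).
column = [(500.0, 288.0), (1500.0, 281.5), (3000.0, 272.0)]

# Subgrid DEM elevations within this coarse cell.
dem = [700.0, 1200.0, 2400.0]
t_fine = [interp_to_elevation(column, z) for z in dem]
```

    Unlike a fixed surface lapse rate, this uses the actual modeled column, so inversions and varying lapse rates are captured automatically, which is one stated advantage of pressure-level-based downscaling.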

  6. Entropy of network ensembles

    NASA Astrophysics Data System (ADS)

    Bianconi, Ginestra

    2009-03-01

    In this paper we generalize the concept of random networks to describe network ensembles with nontrivial features by a statistical mechanics approach. This framework is able to describe undirected and directed network ensembles as well as weighted network ensembles. These networks might have nontrivial community structure or, in the case of networks embedded in a given space, they might have a link probability with a nontrivial dependence on the distance between the nodes. These ensembles are characterized by their entropy, which evaluates the cardinality of networks in the ensemble. In particular, in this paper we define and evaluate the structural entropy, i.e., the entropy of the ensembles of undirected uncorrelated simple networks with given degree sequence. We stress the apparent paradox that scale-free degree distributions are characterized by having small structural entropy while they are so widely encountered in natural, social, and technological complex systems. We propose a solution to the paradox by proving that scale-free degree distributions are the most likely degree distribution with the corresponding value of the structural entropy. Finally, the general framework we present in this paper is able to describe microcanonical ensembles of networks as well as canonical or hidden-variable network ensembles with significant implications for the formulation of network-constructing algorithms.
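    For canonical (hidden-variable) ensembles in which each link (i, j) is drawn independently with probability p_ij, the ensemble entropy takes the standard Shannon form S = -Σ_{i<j} [p_ij ln p_ij + (1 - p_ij) ln(1 - p_ij)]. The sketch below evaluates this expression for a toy ensemble; it illustrates only the canonical case, not the paper's structural (microcanonical) entropy for a fixed degree sequence.

```python
# Shannon entropy of a canonical network ensemble with independent
# link probabilities p_ij (natural log). Toy matrix for illustration.
import math

def ensemble_entropy(p):
    """p: symmetric matrix (list of lists) of link probabilities."""
    s = 0.0
    n = len(p)
    for i in range(n):
        for j in range(i + 1, n):
            q = p[i][j]
            if 0.0 < q < 1.0:  # deterministic links contribute zero
                s -= q * math.log(q) + (1 - q) * math.log(1 - q)
    return s

# A 3-node ensemble where every link has probability 1/2 maximizes
# the per-link entropy at ln 2, giving S = 3 ln 2 in total.
p_half = [[0, 0.5, 0.5], [0.5, 0, 0.5], [0.5, 0.5, 0]]
s = ensemble_entropy(p_half)
```

    The entropy is largest for maximally uncertain link probabilities and shrinks as constraints (such as a prescribed degree sequence) concentrate probability on fewer networks, which is the sense in which entropy "evaluates the cardinality" of the ensemble.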

  7. Ensembl variation resources

    PubMed Central

    2010-01-01

    Background The maturing field of genomics is rapidly increasing the number of sequenced genomes and producing more information from those previously sequenced. Much of this additional information is variation data derived from sampling multiple individuals of a given species with the goal of discovering new variants and characterising the population frequencies of the variants that are already known. These data have immense value for many studies, including those designed to understand evolution and connect genotype to phenotype. Maximising the utility of the data requires that it be stored in an accessible manner that facilitates the integration of variation data with other genome resources such as gene annotation and comparative genomics. Description The Ensembl project provides comprehensive and integrated variation resources for a wide variety of chordate genomes. This paper provides a detailed description of the sources of data and the methods for creating the Ensembl variation databases. It also explores the utility of the information by explaining the range of query options available, from using interactive web displays, to online data mining tools and connecting directly to the data servers programmatically. It gives a good overview of the variation resources and future plans for expanding the variation data within Ensembl. Conclusions Variation data is an important key to understanding the functional and phenotypic differences between individuals. The development of new sequencing and genotyping technologies is greatly increasing the amount of variation data known for almost all genomes. The Ensembl variation resources are integrated into the Ensembl genome browser and provide a comprehensive way to access this data in the context of a widely used genome bioinformatics system. All Ensembl data is freely available at http://www.ensembl.org and from the public MySQL database server at ensembldb.ensembl.org. PMID:20459805

  8. Dynamical Downscaling of Typhoon Vera (1959) and related Storm Surge based on JRA-55 Reanalysis

    NASA Astrophysics Data System (ADS)

    Ninomiya, J.; Takemi, T.; Mori, N.; Shibutani, Y.; Kim, S.

    2015-12-01

    Typhoon Vera (1959) was a historically extreme typhoon that caused the most severe typhoon damage in Japan, mainly due to a storm surge of up to 389 cm. Vera deepened to 895 hPa offshore and made landfall at 929.2 hPa. There are many studies of the dynamical downscaling of Vera, but it is difficult to simulate accurately because of the limited accuracy of global reanalysis data. This study carried out a dynamical downscaling experiment for Vera using WRF forced by JRA-55, the latest atmospheric model and reanalysis data, respectively. The reproducibility of Typhoon Vera in five global reanalysis datasets was compared. The comparison shows that none of the reanalysis datasets except JRA-55 contains a strong typhoon signal, so downscaling with conventional reanalysis data fails. Dynamical downscaling methods for storm surge have been studied extensively (e.g., choice of physics schemes, nudging, 4D-Var, bogus vortices, and so on). In this study, the size and resolution of the coarse domain were considered. The coarse domain size influences the typhoon track and central pressure, and a larger domain restrains typhoon intensification. Simulations with different domain sizes show that the threshold for restrained development is whether the coarse domain fully includes the area of wind speeds above 15 m/s around the typhoon. Simulations with different resolutions show that resolution does not affect the typhoon track, and higher resolution gives a stronger simulated typhoon.

  9. The role of observational reference data for climate downscaling: Insights from the VALUE COST Action

    NASA Astrophysics Data System (ADS)

    Kotlarski, Sven; Gutiérrez, José M.; Boberg, Fredrik; Bosshard, Thomas; Cardoso, Rita M.; Herrera, Sixto; Maraun, Douglas; Mezghani, Abdelkader; Pagé, Christian; Räty, Olle; Stepanek, Petr; Soares, Pedro M. M.; Szabo, Peter

    2016-04-01

    VALUE is an open European network to validate and compare downscaling methods for climate change research (http://www.value-cost.eu). A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of downscaling methods. Such assessments can be expected to depend crucially on the existence of accurate and reliable observational reference data. In dynamical downscaling, observational data can influence model development itself and, later on, model evaluation, parameter calibration and added-value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. We here present a comprehensive assessment of the influence of uncertainties in observational reference data, and of scale-related issues, on several of the above-mentioned aspects. First, temperature and precipitation characteristics as simulated by a set of reanalysis-driven EURO-CORDEX RCM experiments are validated against three different gridded reference data products, namely (1) the E-OBS dataset, (2) the recently developed EURO4M-MESAN regional reanalysis, and (3) several national high-resolution, quality-controlled gridded datasets that recently became available. The analysis reveals a considerable influence of the choice of reference data on the evaluation results, especially for precipitation. It is also illustrated how differences between the reference datasets influence the ranking of RCMs according to a comprehensive set of performance measures.

  10. Spatial, temporal and frequency based climate change assessment in Columbia River Basin using multi downscaled-scenarios

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid

    2016-07-01

    Uncertainties in climate modelling are well documented in the literature. Global Climate Model (GCM) outputs are often downscaled to provide climatic parameters on a regional scale. In the present work, we analyzed the changes in precipitation and temperature for the future scenario period 2070-2099 with respect to the historical period 1970-2000 using statistically downscaled GCM projections for the Columbia River Basin (CRB). The analysis is performed using two different statistically downscaled climate projections (each with ten downscaled GCM products, for RCP4.5 and RCP8.5, from the CMIP5 dataset): the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University, and the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, totaling 40 different scenarios. The two datasets, BCSD and MACA, are downscaled using observed data for both scenario projections, i.e., RCP4.5 and RCP8.5. Analysis is performed using spatial change (yearly scale), temporal change (monthly scale), percentile change (seasonal scale), quantile change (yearly scale), and wavelet analysis (yearly scale) of the future period relative to the historical period, at a scale of 1/16th of a degree for the entire CRB region. The results indicate varied spatial change patterns across the Columbia River Basin, especially in the western part of the basin. At temporal scales, winter precipitation has higher variability than summer, and vice versa for temperature. Most of the models indicate a considerable positive change in quantiles and percentiles for both precipitation and temperature. Wavelet analysis provided insights into possible explanations for the changes in precipitation.
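    The quantile-change diagnostic mentioned above reduces to comparing the same empirical quantile of the future and historical series. A minimal sketch with synthetic data (not values from the study):

```python
# Empirical quantile change between a historical and a future series,
# using linear interpolation between order statistics.

def quantile(data, q):
    """Empirical quantile for 0 <= q <= 1 via sorted-list interpolation."""
    s = sorted(data)
    pos = q * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (pos - lo) * (s[hi] - s[lo])

hist   = [10, 12, 15, 18, 20, 22, 25, 30, 33, 40]   # e.g., mm/day
future = [11, 14, 18, 21, 24, 27, 31, 36, 41, 50]

q90_change = quantile(future, 0.9) - quantile(hist, 0.9)
```

    Computing this at several quantiles (and per season for percentile change) reproduces the kind of distribution-wide comparison the study reports.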

  11. Statistical downscaling rainfall using artificial neural network: significantly wetter Bangkok?

    NASA Astrophysics Data System (ADS)

    Vu, Minh Tue; Aribarg, Thannob; Supratid, Siriporn; Raghavan, Srivatsan V.; Liong, Shie-Yui

    2016-11-01

    Artificial neural networks (ANNs) are an established technique with a flexible mathematical structure capable of identifying complex nonlinear relationships between input and output data. The present study utilizes an ANN to statistically downscale global climate models (GCMs) during the rainy season at meteorological site locations in Bangkok, Thailand. The study illustrates the application of feed-forward back-propagation networks using large-scale predictor variables derived from both ERA-Interim reanalysis data and present-day/future GCM data. The predictors are first selected over different grid boxes surrounding the Bangkok region and then screened using principal component analysis (PCA) to filter the best-correlated predictors for ANN training. The downscaled reanalysis results for the present-day climate show good agreement with station precipitation, with a correlation coefficient of 0.8 and a Nash-Sutcliffe efficiency of 0.65. The final downscaled results for four GCMs show an increasing trend of rainy-season precipitation over Bangkok by the end of the twenty-first century. The extreme values of precipitation determined using statistical indices show strong increases in wetness. These findings will be useful for policy makers in considering adaptation measures for flooding, such as whether the current drainage network system is sufficient to meet the changing climate, and in planning a range of related adaptation/mitigation measures.

  12. The Schaake shuffle: A method for reconstructing space-time variability in forecasted precipitation and temperature fields

    USGS Publications Warehouse

    Clark, M.R.; Gangopadhyay, S.; Hay, L.; Rajagopalan, B.; Wilby, R.

    2004-01-01

    A number of statistical methods that are used to provide local-scale ensemble forecasts of precipitation and temperature do not contain realistic spatial covariability between neighboring stations or realistic temporal persistence for subsequent forecast lead times. To demonstrate this point, output from a global-scale numerical weather prediction model is used in a stepwise multiple linear regression approach to downscale precipitation and temperature to individual stations located in and around four study basins in the United States. Output from the forecast model is downscaled for lead times up to 14 days. Residuals in the regression equation are modeled stochastically to provide 100 ensemble forecasts. The precipitation and temperature ensembles from this approach have a poor representation of the spatial variability and temporal persistence. The spatial correlations for downscaled output are considerably lower than observed spatial correlations at short forecast lead times (e.g., less than 5 days) when there is high accuracy in the forecasts. At longer forecast lead times, the downscaled spatial correlations are close to zero. Similarly, the observed temporal persistence is only partly present at short forecast lead times. A method is presented for reordering the ensemble output in order to recover the space-time variability in precipitation and temperature fields. In this approach, the ensemble members for a given forecast day are ranked and matched with the rank of precipitation and temperature data from days randomly selected from similar dates in the historical record. The ensembles are then reordered to correspond to the original order of the selection of historical data. Using this approach, the observed intersite correlations, intervariable correlations, and the observed temporal persistence are almost entirely recovered. This reordering methodology also has applications for recovering the space-time variability in modeled streamflow.
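    The reordering step of the Schaake shuffle can be sketched compactly: ensemble members are sorted, and the r-th smallest member is placed where the historical record held its r-th smallest value. Applying the same set of historical dates across all stations, variables and lead times is what recovers spatial and temporal correlation; the single-site toy below shows only the rank-matching mechanics, with invented numbers.

```python
# Core rank-matching step of the Schaake shuffle for one variable,
# one site and one lead time.

def schaake_shuffle(ensemble, historical):
    """Reorder `ensemble` so its ranks follow those of `historical`.
    Both are equal-length lists of values."""
    sorted_ens = sorted(ensemble)
    # rank of each historical value (0 = smallest)
    order = sorted(range(len(historical)), key=lambda i: historical[i])
    ranks = [0] * len(historical)
    for r, i in enumerate(order):
        ranks[i] = r
    # place the r-th smallest ensemble value where the historical
    # record had its r-th smallest value
    return [sorted_ens[r] for r in ranks]

ensemble   = [5.1, 0.0, 12.3, 2.2]   # raw (unordered) ensemble members
historical = [1.0, 7.5, 0.2, 3.3]    # same-date values from past years

shuffled = schaake_shuffle(ensemble, historical)
```

    The shuffled ensemble contains exactly the original member values (so marginal forecast statistics are untouched) but in the historical rank order, which is how the method restores space-time structure without altering the forecast distribution.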

  13. Locally Weighted Ensemble Clustering.

    PubMed

    Huang, Dong; Wang, Chang-Dong; Lai, Jian-Huang

    2018-05-01

    Due to its ability to combine multiple base clusterings into a probably better and more robust clustering, the ensemble clustering technique has been attracting increasing attention in recent years. Despite this significant success, one limitation of most existing ensemble clustering methods is that they generally treat all base clusterings equally regardless of their reliability, which makes them vulnerable to low-quality base clusterings. Although some efforts have been made to (globally) evaluate and weight the base clusterings, these methods tend to view each base clustering as an indivisible whole and neglect the local diversity of clusters inside the same base clustering. It remains an open problem how to evaluate the reliability of clusters and exploit the local diversity in the ensemble to enhance the consensus performance, especially when there is no access to data features or specific assumptions on the data distribution. To address this, in this paper we propose a novel ensemble clustering approach based on ensemble-driven cluster uncertainty estimation and a local weighting strategy. In particular, the uncertainty of each cluster is estimated by considering the cluster labels in the entire ensemble via an entropic criterion. A novel ensemble-driven cluster validity measure is introduced, and a locally weighted co-association matrix is presented to serve as a summary for the ensemble of diverse clusters. With the local diversity in ensembles exploited, two novel consensus functions are further proposed. Extensive experiments on a variety of real-world datasets demonstrate the superiority of the proposed approach over the state of the art.
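    The co-association matrix that such consensus methods build on can be sketched as follows. This toy version is unweighted; the paper's contribution is precisely to weight each cluster's contribution by an entropy-based uncertainty measure, which is omitted here for brevity.

```python
# Plain (unweighted) co-association matrix: entry (i, j) is the
# fraction of base clusterings that put samples i and j together.

def co_association(base_clusterings, n):
    """base_clusterings: list of label lists, each of length n."""
    m = [[0.0] * n for _ in range(n)]
    for labels in base_clusterings:
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    m[i][j] += 1.0 / len(base_clusterings)
    return m

# Three base clusterings of four samples (toy labels).
base = [
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]
M = co_association(base, 4)
```

    A consensus clustering is then typically obtained by running a standard clustering algorithm (e.g., hierarchical clustering) on this matrix, treating the entries as pairwise similarities.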

  14. Some Advances in Downscaling Probabilistic Climate Forecasts for Agricultural Decision Support

    NASA Astrophysics Data System (ADS)

    Han, E.; Ines, A.

    2015-12-01

    Seasonal climate forecasts, commonly provided in tercile-probability format (below-, near- and above-normal), need to be translated into more meaningful information for decision support of practitioners in agriculture. In this paper, we present two novel approaches to temporally downscale probabilistic seasonal climate forecasts: one non-parametric and one parametric. First, the non-parametric downscaling approach, called FResampler1, uses the concept of 'conditional block sampling' of weather data to create daily weather realizations of a tercile-based seasonal climate forecast. FResampler1 randomly draws time series of daily weather parameters (e.g., rainfall, maximum and minimum temperature and solar radiation) from historical records for the season of interest, from years that belong to a certain rainfall tercile category (below-, near- or above-normal). In this way, FResampler1 preserves the covariance between rainfall and the other weather parameters, as if conditionally sampling maximum and minimum temperature and solar radiation according to whether the day is wet or dry. The second approach, called predictWTD, is a parametric method based on a conditional stochastic weather generator. The tercile-based seasonal climate forecast is converted into a theoretical forecast cumulative probability curve. The deviate for each percentile is then converted into rainfall amount, frequency or intensity to downscale the 'full' distribution of the probabilistic seasonal climate forecast. These seasonal deviates are then disaggregated on a monthly basis and used to constrain the downscaling of forecast realizations at different percentile values of the theoretical forecast curve. In addition to the theoretical basis of the approaches, we discuss their sensitivity to the length of the data record and the size of the samples. Their potential applications for managing climate-related risks in agriculture will be shown through a couple of case studies based on
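    The 'conditional block sampling' idea behind FResampler1 can be sketched as: pick a tercile category according to the forecast probabilities, then draw an entire season of daily weather from a historical year in that category, so day-to-day covariance among variables is preserved. The function name, data layout and numbers below are invented for illustration, not the actual FResampler1 implementation.

```python
# Toy conditional block sampler in the spirit of FResampler1.
import random

random.seed(42)

# Historical seasons keyed by year: list of (rain_mm, tmax_C) days.
history = {
    1991: [(0.0, 33.0), (5.2, 30.1)],
    1992: [(12.0, 28.5), (8.8, 29.0)],
    1993: [(0.0, 34.2), (1.1, 32.6)],
}
# Which years fall in each rainfall tercile (toy assignment).
terciles = {"below": [1993], "near": [1991], "above": [1992]}
# Tercile-probability forecast to be downscaled.
forecast = {"below": 0.2, "near": 0.3, "above": 0.5}

def sample_season(forecast, terciles, history):
    """Draw one daily-weather realization consistent with the forecast."""
    cat = random.choices(list(forecast), weights=list(forecast.values()))[0]
    year = random.choice(terciles[cat])
    return history[year]  # whole-season block keeps daily covariance

realization = sample_season(forecast, terciles, history)
```

    Repeating the draw many times yields an ensemble of daily realizations whose tercile frequencies converge to the forecast probabilities, which is the property the resampler relies on.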

  15. Genetic particle filter application to land surface temperature downscaling

    NASA Astrophysics Data System (ADS)

    Mechri, Rihab; Ottlé, Catherine; Pannekoucke, Olivier; Kallel, Abdelaziz

    2014-03-01

    Thermal infrared data are widely used for surface flux estimation, giving the possibility to assess water and energy budgets through land surface temperature (LST). Many applications require both high spatial resolution (HSR) and high temporal resolution (HTR), which are not presently available from space. It is therefore necessary to develop methodologies that use the coarse-spatial/high-temporal resolution LST remote-sensing products for better monitoring of fluxes at appropriate scales. For that purpose, a data assimilation method was developed to downscale LST based on particle filtering. The basic tenet of our approach is to constrain LST dynamics simulated at both HSR and HTR through the optimization of aggregated temperatures at the coarse observation scale. Thus, a genetic particle filter (GPF) data assimilation scheme was implemented and applied to a land surface model which simulates prior subpixel temperatures. First, the GPF downscaling scheme was tested on pseudo-observations generated in the framework of the study area landscape (Crau-Camargue, France) and climate for the year 2006. The GPF performance was evaluated against observation errors and temporal sampling. Results show that the GPF outperforms prior model estimations. Finally, the GPF method was applied to Spinning Enhanced Visible and InfraRed Imager time series and evaluated against HSR data provided by an Advanced Spaceborne Thermal Emission and Reflection Radiometer image acquired on 26 July 2006. The temperatures of seven land cover classes present in the study area were estimated with root-mean-square errors below 2.4 K, which is a very promising result for downscaling LST satellite products.
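    The aggregation constraint at the heart of the scheme can be sketched with a plain (non-genetic) particle filter step: each particle holds a candidate fine-scale temperature field, is weighted by how well its coarse-scale aggregate matches the observation, and is then resampled. The Gaussian likelihood, the numbers and the three-particle setup are illustrative only, not the paper's GPF.

```python
# Toy particle-filter update constrained at the coarse observation scale.
import math
import random

random.seed(7)

particles = [
    [300.0, 302.0, 304.0],   # candidate subpixel LST fields (K)
    [296.0, 297.0, 298.0],
    [305.0, 306.0, 307.0],
]
obs_coarse = 301.0   # coarse-pixel LST observation (K)
sigma = 1.5          # assumed observation error (K)

def weight(p):
    """Gaussian likelihood of the coarse observation given particle p."""
    agg = sum(p) / len(p)                      # aggregate to coarse scale
    return math.exp(-0.5 * ((agg - obs_coarse) / sigma) ** 2)

w = [weight(p) for p in particles]
total = sum(w)
w = [x / total for x in w]                     # normalize weights

# Multinomial resampling: high-weight particles are likely to survive.
resampled = random.choices(particles, weights=w, k=len(particles))
```

    The genetic variant in the paper additionally applies selection/mutation-style steps to fight particle degeneracy, which this minimal sketch omits.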

  16. Impacts of precipitation and potential evapotranspiration patterns on downscaling soil moisture in regions with large topographic relief

    NASA Astrophysics Data System (ADS)

    Cowley, Garret S.; Niemann, Jeffrey D.; Green, Timothy R.; Seyfried, Mark S.; Jones, Andrew S.; Grazaitis, Peter J.

    2017-02-01

    Soil moisture can be estimated at coarse resolutions (>1 km) using satellite remote sensing, but that resolution is poorly suited for many applications. The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales coarse-resolution soil moisture using fine-resolution topographic, vegetation, and soil data to produce fine-resolution (10-30 m) estimates of soil moisture. The EMT+VS model performs well at catchments with low topographic relief (≤124 m), but it has not been applied to regions with larger ranges of elevation. Large relief can produce substantial variations in precipitation and potential evapotranspiration (PET), which might affect the fine-resolution patterns of soil moisture. In this research, simple methods to downscale temporal average precipitation and PET are developed and included in the EMT+VS model, and the effects of spatial variations in these variables on the surface soil moisture estimates are investigated. The methods are tested against ground truth data at the 239 km2 Reynolds Creek watershed in southern Idaho, which has 1145 m of relief. The precipitation and PET downscaling methods are able to capture the main features in the spatial patterns of both variables. The space-time Nash-Sutcliffe coefficients of efficiency of the fine-resolution soil moisture estimates improve from 0.33 to 0.36 and 0.41 when the precipitation and PET downscaling methods are included, respectively. PET downscaling provides a larger improvement in the soil moisture estimates than precipitation downscaling likely because the PET pattern is more persistent through time, and thus more predictable, than the precipitation pattern.
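The space-time Nash-Sutcliffe coefficient used above to score the fine-resolution estimates pools all grid cells and dates into one skill number; a minimal sketch (illustrative only, hypothetical names):

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.

    1.0 is a perfect match; 0.0 means no more skill than predicting the
    observed mean. Both inputs are flattened, so pooling grid cells and
    dates yields a "space-time" coefficient.
    """
    obs = np.asarray(observed, dtype=float).ravel()
    sim = np.asarray(simulated, dtype=float).ravel()
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

On this scale, the reported improvement from 0.33 to 0.41 corresponds to a meaningful reduction in pooled squared error relative to the observed variance.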

  17. Multi-model ensemble projections of European river floods and high flows at 1.5, 2, and 3 degrees global warming

    NASA Astrophysics Data System (ADS)

    Thober, Stephan; Kumar, Rohini; Wanders, Niko; Marx, Andreas; Pan, Ming; Rakovec, Oldrich; Samaniego, Luis; Sheffield, Justin; Wood, Eric F.; Zink, Matthias

    2018-01-01

    Severe river floods often result in huge economic losses and fatalities. Since 1980, almost 1500 such events have been reported in Europe. This study investigates climate change impacts on European floods under 1.5, 2, and 3 K global warming. The impacts are assessed employing a multi-model ensemble containing three hydrologic models (HMs: mHM, Noah-MP, PCR-GLOBWB) forced by five CMIP5 general circulation models (GCMs) under three Representative Concentration Pathways (RCPs 2.6, 6.0, and 8.5). This multi-model ensemble is unprecedented with respect to the combination of its size (45 realisations) and its spatial resolution, which is 5 km over the entirety of Europe. Climate change impacts are quantified for high flows and flood events, represented by 10% exceedance probability and annual maxima of daily streamflow, respectively. The multi-model ensemble points to the Mediterranean region as a hotspot of changes with significant decrements in high flows from -11% at 1.5 K up to -30% at 3 K global warming mainly resulting from reduced precipitation. Small changes (< ±10%) are observed for river basins in Central Europe and the British Isles under different levels of warming. Projected higher annual precipitation increases high flows in Scandinavia, but reduced snow melt equivalent decreases flood events in this region. Neglecting uncertainties originating from internal climate variability, downscaling technique, and hydrologic model parameters, the contribution by the GCMs to the overall uncertainties of the ensemble is in general higher than that by the HMs. The latter, however, have a substantial share in the Mediterranean and Scandinavia. Adaptation measures for limiting the impacts of global warming could be similar under 1.5 K and 2 K global warming, but have to account for significantly higher changes under 3 K global warming.
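The two indicators used in the study, high flows (the flow with 10% exceedance probability) and floods (annual maxima of daily streamflow), can be computed from a daily series as follows (an illustrative sketch; names are hypothetical):

```python
import numpy as np

def high_flow_threshold(daily_q):
    """Flow exceeded 10% of the time: the study's 'high flow' indicator."""
    return np.quantile(np.asarray(daily_q, dtype=float), 0.9)

def annual_maxima(daily_q, years):
    """Annual maximum daily streamflow: the study's 'flood' indicator."""
    daily_q = np.asarray(daily_q, dtype=float)
    years = np.asarray(years)
    return {int(y): daily_q[years == y].max() for y in np.unique(years)}
```

Relative changes in these two quantities between warming levels then give the percentage changes quoted in the abstract.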

  18. Verification of GCM-generated regional seasonal precipitation for current climate and of statistical downscaling estimates under changing climate conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busuioc, A.; Storch, H. von; Schnur, R.

    Empirical downscaling procedures relate large-scale atmospheric features to local features such as station rainfall in order to facilitate local scenarios of climate change. The purpose of the present paper is twofold: first, a downscaling technique is used as a diagnostic tool to verify the performance of climate models on the regional scale; second, a technique is proposed for verifying the validity of empirical downscaling procedures in climate change applications. The case considered is regional seasonal precipitation in Romania. The downscaling model is a regression based on canonical correlation analysis between observed station precipitation and European-scale sea level pressure (SLP). The climate models considered here are the T21 and T42 versions of the Hamburg ECHAM3 atmospheric GCM run in time-slice mode. The climate change scenario refers to the expected time of doubled carbon dioxide concentrations around the year 2050. Generally, applications of statistical downscaling to climate change scenarios have been based on the assumption that the empirical link between the large-scale and regional parameters remains valid under a changed climate. In this study, a rationale is proposed for this assumption by showing the consistency of the 2 x CO2 GCM scenarios in winter, derived directly from the gridpoint data, with the regional scenarios obtained through empirical downscaling. Since the skill of the GCMs in regional terms is already established, it is concluded that the downscaling technique is adequate for describing climatically changing regional and local conditions, at least for precipitation in Romania during winter.

  19. Understanding the joint behavior of temperature and precipitation for climate change impact studies

    NASA Astrophysics Data System (ADS)

    Rana, Arun; Moradkhani, Hamid; Qin, Yueyue

    2017-07-01

    The multiple downscaled scenario products allow us to assess the uncertainty of the variations of precipitation and temperature in the current and future periods. Probabilistic assessment of both climatic variables helps to better understand their interdependence and thus, in turn, to assess the future with confidence. In the present study, we use an ensemble of statistically downscaled precipitation and temperature from various models. The dataset used is a multi-model ensemble of 10 global climate models (GCMs) downscaled from the CMIP5 daily dataset using the Bias Correction and Spatial Downscaling (BCSD) technique, generated at Portland State University. The multi-model ensemble of both precipitation and temperature is evaluated for dry and wet periods for 10 sub-basins across the Columbia River Basin (CRB). Thereafter, a copula is applied to establish the joint distribution of the two variables on the multi-model ensemble data. The joint distribution is then used to estimate the change in trends of said variables in the future, along with the probabilities of the given change. The joint distribution trends vary, but are consistently positive, for dry and wet periods in the sub-basins of the CRB. The dry season generally indicates a higher positive change in precipitation than temperature (compared to historical) across sub-basins, with the wet season indicating the opposite. Probabilities of future changes, as estimated from the joint distribution, show varied degrees and forms during the dry season, whereas the wet season is rather constant across all the sub-basins.
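A common way to establish such a joint distribution is to fit a copula to rank-transformed margins. The abstract does not state which copula family was used, so the sketch below assumes a Gaussian copula purely for illustration; all names are hypothetical:

```python
import numpy as np
from scipy import stats

def fit_gaussian_copula(x, y):
    """Fit a Gaussian copula to two variables via their rank transforms."""
    u = stats.rankdata(x) / (len(x) + 1)   # pseudo-observations in (0, 1)
    v = stats.rankdata(y) / (len(y) + 1)
    z1, z2 = stats.norm.ppf(u), stats.norm.ppf(v)
    rho = np.corrcoef(z1, z2)[0, 1]        # copula correlation parameter
    return rho

def sample_gaussian_copula(rho, n, rng=None):
    """Draw dependent uniform pairs from the fitted copula."""
    rng = np.random.default_rng(rng)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    return stats.norm.cdf(z)               # back to uniform margins
```

The sampled uniforms can then be pushed through the fitted marginal distributions of precipitation and temperature to generate dependent joint draws, from which change probabilities can be estimated.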

  20. Comparison of Grid Nudging and Spectral Nudging Techniques for Dynamical Climate Downscaling within the WRF Model

    NASA Astrophysics Data System (ADS)

    Fan, X.; Chen, L.; Ma, Z.

    2010-12-01

    Climate downscaling has been an active research and application area in the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. Using numerical models ensures that a full set of climate variables is generated in the process of downscaling, dynamically consistent due to the constraints of physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. Studies have demonstrated the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged toward, and thus the conclusions remain controversial. Building on a companion effort to develop approaches for quantitative assessment of downscaled climate, in this study the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments provides an objective comparison. Three types of downscaling experiments were performed for a selected month. The first type serves as a baseline, in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and nudging toward different variables in grid analysis nudging, while in spectral nudging we focus on testing the nudging coefficients and different wave numbers on different model levels.
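The essential difference between the two techniques is what part of the model state is relaxed toward the analysis: grid nudging relaxes the full field, while spectral nudging relaxes only the largest-scale Fourier modes. A minimal 1-D sketch of the two tendency terms (illustrative only; names are hypothetical):

```python
import numpy as np

def grid_nudging_tendency(x, x_analysis, g):
    """Grid (analysis) nudging: relax the whole model state toward the
    analysis. g is the nudging coefficient (1/s); larger g pulls harder."""
    return -g * (x - x_analysis)

def spectral_nudging_tendency(x, x_analysis, g, n_waves):
    """Spectral nudging: relax only the n_waves largest-scale Fourier
    modes, leaving smaller scales free to develop."""
    diff = np.fft.rfft(x - x_analysis)
    diff[n_waves:] = 0.0                      # keep only the large scales
    return -g * np.fft.irfft(diff, n=len(x))
```

In spectral nudging, the cutoff wavenumber (here `n_waves`) plays the role of the cutoff wavelength tested in the experiments: small-scale departures from the driving analysis are never damped.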

  1. A method for deterministic statistical downscaling of daily precipitation at a monsoonal site in Eastern China

    NASA Astrophysics Data System (ADS)

    Liu, Yonghe; Feng, Jinming; Liu, Xiu; Zhao, Yadi

    2017-12-01

    Statistical downscaling (SD) is a method that acquires the local information required for hydrological impact assessment from large-scale atmospheric variables. Very few statistical, deterministic downscaling models for daily precipitation have been developed for local sites influenced by the East Asian monsoon. In this study, SD models were constructed by selecting the best predictors and using generalized linear models (GLMs) for Feixian, a site in the Yishu River Basin, Shandong Province. By calculating and mapping Spearman rank correlation coefficients between the gridded standardized values of five large-scale variables and daily observed precipitation, different cyclonic circulation patterns were found for monsoonal precipitation in summer (June-September) and winter (November-December and January-March); the values of the gridded boxes with the highest absolute correlations with observed precipitation were selected as predictors. Data for predictors and predictands covered the period 1979-2015, and the record was divided into different calibration and validation periods when fitting and validating the models. The bootstrap method was also used to fit the GLM. These thorough validations indicated that the models were robust and not sensitive to different samples or periods. Pearson correlations between downscaled and observed precipitation (logarithmically transformed) on a daily scale reached 0.54-0.57 in summer and 0.56-0.61 in winter, and the Nash-Sutcliffe efficiency between downscaled and observed precipitation reached 0.1 in summer and 0.41 in winter. The downscaled precipitation partially reproduced the exact interannual variations in winter and the main trends in summer of total precipitation. For the number of wet days, both winter and summer models were able to reflect interannual variations. Other comparisons were also made in this study. These results demonstrate that, when downscaling, it is appropriate to combine a correlation
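The predictor-selection step, picking the grid box whose large-scale series has the highest absolute Spearman rank correlation with observed precipitation, can be sketched as follows (a minimal illustration, not the authors' code; names are hypothetical):

```python
import numpy as np
from scipy import stats

def select_predictor_box(grid_series, precip):
    """Pick the grid box best correlated (absolute Spearman rank
    correlation) with observed daily precipitation.

    grid_series : array (n_days, n_boxes), standardized large-scale field
    precip      : array (n_days,), observed daily precipitation
    Returns (index of best box, its correlation)."""
    corrs = np.array([stats.spearmanr(grid_series[:, j], precip)[0]
                      for j in range(grid_series.shape[1])])
    best = int(np.argmax(np.abs(corrs)))
    return best, corrs[best]
```

Rank correlation is a natural choice here because daily precipitation is heavily skewed, so a monotone but nonlinear link to the circulation field is still detected.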

  2. Downscaling Satellite Precipitation with Emphasis on Extremes: A Variational 1-Norm Regularization in the Derivative Domain

    NASA Technical Reports Server (NTRS)

    Foufoula-Georgiou, E.; Ebtehaj, A. M.; Zhang, S. Q.; Hou, A. Y.

    2013-01-01

    The increasing availability of precipitation observations from space, e.g., from the Tropical Rainfall Measuring Mission (TRMM) and the forthcoming Global Precipitation Measuring (GPM) Mission, has fueled renewed interest in developing frameworks for downscaling and multi-sensor data fusion that can handle large data sets in computationally efficient ways while optimally reproducing desired properties of the underlying rainfall fields. Of special interest is the reproduction of extreme precipitation intensities and gradients, as these are directly relevant to hazard prediction. In this paper, we present a new formalism for downscaling satellite precipitation observations, which explicitly allows for the preservation of some key geometrical and statistical properties of spatial precipitation. These include sharp intensity gradients (due to high-intensity regions embedded within lower-intensity areas), coherent spatial structures (due to regions of slowly varying rainfall), and thicker-than-Gaussian tails of precipitation gradients and intensities. Specifically, we pose the downscaling problem as a discrete inverse problem and solve it via a regularized variational approach (variational downscaling) where the regularization term is selected to impose the desired smoothness in the solution while allowing for some steep gradients (called 1-norm or total variation regularization). We demonstrate the duality between this geometrically inspired solution and its Bayesian statistical interpretation, which is equivalent to assuming a Laplace prior distribution for the precipitation intensities in the derivative (wavelet) space. When the observation operator is not known, we discuss the effect of its misspecification and explore a previously proposed dictionary-based sparse inverse downscaling methodology to indirectly learn the observation operator from a database of coincidental high- and low-resolution observations. The proposed method and ideas are illustrated in case

  3. Stochastic downscaling of numerically simulated spatial rain and cloud fields using a transient multifractal approach

    NASA Astrophysics Data System (ADS)

    Nogueira, M.; Barros, A. P.; Miranda, P. M.

    2012-04-01

    Atmospheric fields can be extremely variable over wide ranges of spatial scales, with a scale ratio of 10^9-10^10 between the largest (planetary) and smallest (viscous dissipation) scales. Furthermore, atmospheric fields with strong variability over wide ranges of scale most likely should not be artificially split into large and small scales, as in reality there is no scale separation between resolved and unresolved motions. Usually the effects of the unresolved scales are modeled by a deterministic bulk formula representing an ensemble of incoherent subgrid processes acting on the resolved flow. This is a pragmatic approach to the problem, not a complete solution to it. These models are expected to underrepresent the small-scale spatial variability of both dynamical and scalar fields due to implicit and explicit numerical diffusion as well as physically based subgrid-scale turbulent mixing, resulting in smoother and less intermittent fields compared to observations. Thus, a fundamental change in the way we formulate our models is required. Stochastic approaches equipped with a possible realization of subgrid processes, potentially coupled to the resolved scales over the range of significant scale interactions, provide one alternative to address the problem. Stochastic multifractal models based on the cascade phenomenology of the atmosphere and its governing equations in particular are the focus of this research. Previous results have shown that rain and cloud fields resulting from both idealized and realistic numerical simulations display multifractal behavior in the resolved scales. This result is observed even in the absence of scaling in the initial conditions or terrain forcing, suggesting that multiscaling is a general property of the nonlinear solutions of the Navier-Stokes equations governing atmospheric dynamics. Our results also show that the corresponding multiscaling parameters for rain and cloud fields exhibit complex nonlinear behavior

  4. Statistical Downscaling of WRF-Chem Model: An Air Quality Analysis over Bogota, Colombia

    NASA Astrophysics Data System (ADS)

    Kumar, Anikender; Rojas, Nestor

    2015-04-01

    Statistical downscaling is a technique used to extract high-resolution information from regional-scale variables produced by coarse-resolution models such as Chemical Transport Models (CTMs). The fully coupled WRF-Chem (Weather Research and Forecasting with Chemistry) model is used to simulate air quality over Bogota, a tropical Andean megacity located on a high-altitude plateau in the middle of very complex terrain. The WRF-Chem model was adopted for simulating hourly ozone concentrations. The computational domains comprised 120x120x32, 121x121x32 and 121x121x32 grid points with horizontal resolutions of 27, 9 and 3 km, respectively. The model was initialized with real boundary conditions using NCAR-NCEP's Final Analysis (FNL) at 1°x1° (~111 km x 111 km) resolution. Boundary conditions were updated every 6 hours using reanalysis data. The emission rates were obtained from global inventories, namely the REanalysis of the TROpospheric (RETRO) chemical composition and the Emission Database for Global Atmospheric Research (EDGAR). Multiple linear regression and artificial neural network techniques are used to downscale the model output at each monitoring station. The results confirm that the statistically downscaled outputs reduce simulation errors by up to 25%. This study provides a general overview of statistical downscaling of chemical transport models and can constitute a reference for future air quality modeling exercises over Bogota and other Colombian cities.
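The multiple-linear-regression variant of this station-level correction can be sketched in a few lines: regress station observations on the CTM output (and possibly co-located meteorology) over a training period, then apply the fit to new model output. This is a generic sketch, not the authors' configuration; names are hypothetical.

```python
import numpy as np

def mlr_downscale(ctm_fields, station_obs):
    """Fit a multiple linear regression from coarse CTM output to station
    observations via ordinary least squares.

    ctm_fields  : array (n_times, n_predictors), e.g. simulated ozone at
                  the station's grid cell plus nearby meteorology
    station_obs : array (n_times,) of observed concentrations
    Returns coefficients, intercept first."""
    X = np.column_stack([np.ones(len(ctm_fields)), ctm_fields])
    coef, *_ = np.linalg.lstsq(X, station_obs, rcond=None)
    return coef

def mlr_predict(coef, ctm_fields):
    """Apply the fitted regression to (new) CTM output."""
    X = np.column_stack([np.ones(len(ctm_fields)), ctm_fields])
    return X @ coef
```

The artificial-neural-network variant mentioned in the abstract replaces the linear map with a nonlinear one, but the train-at-station, apply-to-model-output pattern is the same.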

  5. Evaluation of TIGGE Ensemble Forecasts of Precipitation in Distinct Climate Regions in Iran

    NASA Astrophysics Data System (ADS)

    Aminyavari, Saleh; Saghafian, Bahram; Delavar, Majid

    2018-04-01

    The application of numerical weather prediction (NWP) products is increasing dramatically. Existing reports indicate that ensemble predictions have better skill than deterministic forecasts. In this study, numerical ensemble precipitation forecasts in the TIGGE database were evaluated using deterministic, dichotomous (yes/no), and probabilistic techniques over Iran for the period 2008-16. Thirteen rain gauges spread over eight homogeneous precipitation regimes were selected for evaluation. The Inverse Distance Weighting and Kriging methods were adopted for interpolation of the prediction values, downscaled to the stations at lead times of one to three days. To enhance the forecast quality, NWP values were post-processed via Bayesian Model Averaging. The results showed that ECMWF had better scores than other products. However, products of all centers underestimated precipitation in high precipitation regions while overestimating precipitation in other regions. This points to a systematic bias in forecasts and demands application of bias correction techniques. Based on dichotomous evaluation, NCEP did better at most stations, although all centers overpredicted the number of precipitation events. Compared to those of ECMWF and NCEP, UKMO yielded higher scores in mountainous regions, but performed poorly at other selected stations. Furthermore, the evaluations showed that all centers had better skill in wet than in dry seasons. The quality of post-processed predictions was better than those of the raw predictions. In conclusion, the accuracy of the NWP predictions made by the selected centers could be classified as medium over Iran, while post-processing of predictions is recommended to improve the quality.
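Full Bayesian Model Averaging fits member weights and spread jointly by expectation-maximization; as a much-simplified stand-in, the sketch below weights each center's forecasts by its average Gaussian likelihood against training observations. This is illustrative only, not the study's implementation; all names and the Gaussian error assumption are hypothetical.

```python
import numpy as np

def simple_bma_weights(forecasts, obs, sigma):
    """Toy stand-in for BMA weight estimation: weight each ensemble
    member by its mean Gaussian likelihood over a training period.

    forecasts : array (n_members, n_times)
    obs       : array (n_times,) of verifying observations
    sigma     : assumed forecast-error standard deviation
    """
    lik = np.exp(-0.5 * ((forecasts - obs) / sigma) ** 2)
    w = lik.mean(axis=1)
    return w / w.sum()          # weights sum to one

def bma_mean(forecasts, weights):
    """Weighted ensemble-mean forecast."""
    return weights @ forecasts
```

Members that track the observations closely in training receive larger weights, so the post-processed combination down-weights systematically biased centers.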

  6. Uncertainty Analysis of Downscaled CMIP5 Precipitation Data for Louisiana, USA

    NASA Astrophysics Data System (ADS)

    Sumi, S. J.; Tamanna, M.; Chivoiu, B.; Habib, E. H.

    2014-12-01

    The downscaled CMIP3 and CMIP5 Climate and Hydrology Projections dataset contains fine spatial resolution translations of climate projections over the contiguous United States developed using two downscaling techniques (monthly Bias Correction Spatial Disaggregation (BCSD) and daily Bias Correction Constructed Analogs (BCCA)). The objective of this study is to assess the uncertainty of the CMIP5 downscaled general circulation models (GCMs). We performed an analysis of the daily, monthly, seasonal and annual variability of precipitation downloaded from the Downscaled CMIP3 and CMIP5 Climate and Hydrology Projections website for the state of Louisiana, USA at 0.125° x 0.125° resolution. A data set of daily gridded observations of precipitation over a rectangular boundary covering Louisiana is used to assess the validity of 21 downscaled GCMs for the 1950-1999 period. The following statistics are computed for the 21 models with respect to the observed dataset: the correlation coefficient, the bias, the normalized bias, the mean absolute error (MAE), the mean absolute percentage error (MAPE), and the root mean square error (RMSE). A measure of the variability simulated by each model is computed as the ratio of its standard deviation, in both space and time, to the corresponding standard deviation of the observations. The correlation and MAPE statistics are also computed for each of the nine climate divisions of Louisiana. Some of the patterns we observed are: 1) The average annual precipitation rate shows a similar spatial distribution for all the models, within a range of 3.27 to 4.75 mm/day from northwest to southeast. 2) The standard deviation of summer (JJA) precipitation (mm/day) for the models remains lower than observed, whereas the spatial patterns and ranges of values are similar in winter (NDJ). 3) Correlation coefficients of annual precipitation of the models against observations range from -0.48 to 0.36, with spatial distribution varying by model
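The battery of verification statistics listed above can be computed in a few lines; a minimal sketch (illustrative, hypothetical names):

```python
import numpy as np

def verification_stats(model, obs):
    """Bias, MAE, RMSE, correlation, and variability ratio of a model
    series against observations."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    err = model - obs
    return {
        "bias": err.mean(),
        "mae": np.abs(err).mean(),
        "rmse": np.sqrt((err ** 2).mean()),
        "corr": np.corrcoef(model, obs)[0, 1],
        # Ratio of model to observed standard deviation: the study's
        # measure of simulated variability.
        "std_ratio": model.std() / obs.std(),
    }
```

Applying this to each of the 21 downscaled GCMs, per climate division, reproduces the kind of comparison tabulated in the study.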

  7. Imprinting and recalling cortical ensembles.

    PubMed

    Carrillo-Reid, Luis; Yang, Weijian; Bando, Yuki; Peterka, Darcy S; Yuste, Rafael

    2016-08-12

    Neuronal ensembles are coactive groups of neurons that may represent building blocks of cortical circuits. These ensembles could be formed by Hebbian plasticity, whereby synapses between coactive neurons are strengthened. Here we report that repetitive activation with two-photon optogenetics of neuronal populations from ensembles in the visual cortex of awake mice builds neuronal ensembles that recur spontaneously after being imprinted and do not disrupt preexisting ones. Moreover, imprinted ensembles can be recalled by single-cell stimulation and remain coactive on consecutive days. Our results demonstrate the persistent reconfiguration of cortical circuits by two-photon optogenetics into neuronal ensembles that can perform pattern completion. Copyright © 2016, American Association for the Advancement of Science.

  8. Downscaling of Remotely Sensed Land Surface Temperature with multi-sensor based products

    NASA Astrophysics Data System (ADS)

    Jeong, J.; Baik, J.; Choi, M.

    2016-12-01

    Remotely sensed satellite data provide a bird's-eye view that allows us to understand the spatiotemporal behavior of hydrologic variables at global scale. In particular, geostationary satellites continuously observing specific regions are useful for monitoring fluctuations of hydrologic variables as well as meteorological factors. However, problems remain regarding spatial resolution: fine-scale land cover may not be represented at the spatial resolution of the satellite sensor, especially in areas of complex topography. To address these problems, many researchers have tried to establish relationships among various hydrological factors and to combine images from multiple sensors to downscale land surface products. One geostationary satellite, the Communication, Ocean and Meteorological Satellite (COMS), carries a Meteorological Imager (MI) and a Geostationary Ocean Color Imager (GOCI). MI, which performs the meteorological mission, produces Rainfall Intensity (RI), Land Surface Temperature (LST), and many other products every 15 minutes. Although its temporal resolution is high, the low spatial resolution of MI data is treated as a major limitation in many studies. This study suggests a methodology to downscale 4 km LST datasets derived from MI to a finer resolution (500 m) using GOCI datasets over Northeast Asia. The Normalized Difference Vegetation Index (NDVI), a variable with a recognized, significant relationship with LST, is chosen to estimate LST at the finer resolution. Pixels of NDVI and LST are stratified according to land cover provided by the MODerate resolution Imaging Spectroradiometer (MODIS) to achieve a more accurate relationship. The downscaled LST is compared with LST observed at Automated Synoptic Observing System (ASOS) stations to assess its accuracy. The downscaled LST results of this study, coupled with the advantages of geostationary satellites, can be applied to observe hydrologic processes efficiently.

  9. World Music Ensemble: Kulintang

    ERIC Educational Resources Information Center

    Beegle, Amy C.

    2012-01-01

    As instrumental world music ensembles such as steel pan, mariachi, gamelan and West African drums are becoming more the norm than the exception in North American school music programs, there are other world music ensembles just starting to gain popularity in particular parts of the United States. The kulintang ensemble, a drum and gong ensemble…

  10. Statistical downscaling of precipitation using long short-term memory recurrent neural networks

    NASA Astrophysics Data System (ADS)

    Misra, Saptarshi; Sarkar, Sudeshna; Mitra, Pabitra

    2017-11-01

    Hydrological impacts of global climate change on a regional scale are generally assessed by downscaling large-scale climatic variables, simulated by General Circulation Models (GCMs), to regional, small-scale hydrometeorological variables like precipitation and temperature. In this study, we propose a new statistical downscaling model based on a Recurrent Neural Network with Long Short-Term Memory which captures the spatio-temporal dependencies in local rainfall. Previous studies have used several other methods, such as linear regression, quantile regression, kernel regression, beta regression, and artificial neural networks. Deep neural networks and recurrent neural networks have been shown to be highly promising in modeling complex and highly non-linear relationships between input and output variables in different domains, and hence we investigated their performance in the task of statistical downscaling. We have tested this model on two datasets: one on precipitation in the Mahanadi basin in India and the other on precipitation in the Campbell River basin in Canada. Our autoencoder-coupled long short-term memory recurrent neural network model performs the best compared to other existing methods on both datasets with respect to temporal cross-correlation, mean squared error, and capturing the extremes.

  11. Influence of reanalysis datasets on dynamically downscaling the recent past

    NASA Astrophysics Data System (ADS)

    Moalafhi, Ditiro B.; Evans, Jason P.; Sharma, Ashish

    2017-08-01

    Multiple reanalysis datasets currently exist that can provide boundary conditions for dynamical downscaling and simulating local hydro-climatic processes at finer spatial and temporal resolutions. Previous work has suggested that two reanalysis alternatives provide the best lateral boundary conditions for downscaling over southern Africa. This study dynamically downscales these reanalyses (ERA-I and MERRA) over southern Africa to a high-resolution (10 km) grid using the WRF model. Simulations cover the period 1981-2010. Multiple observation datasets were used for both surface temperature and precipitation to account for observational uncertainty when assessing results. Generally, temperature is simulated quite well, except over the Namibian coastal plain, where the simulations show anomalously warm temperatures related to the failure to propagate the influence of the cold Benguela current inland. Precipitation tends to be overestimated in high-altitude areas and most of southern Mozambique. This could be attributed to challenges in handling complex topography and capturing large-scale circulation patterns. While MERRA-driven WRF exhibits slightly less bias in temperature, especially for La Nina years, ERA-I-driven simulations are on average superior in terms of RMSE. When considering multiple variables and metrics, ERA-I is found to produce the best simulation of the climate over the domain. The influence of the regional model appears to be large enough to overcome the small difference in relative errors present in the lateral boundary conditions derived from these two reanalyses.

  12. The Practitioner's Dilemma: How to Assess the Credibility of Downscaled Climate Projections

    NASA Technical Reports Server (NTRS)

    Barsugli, Joseph J.; Guentchev, Galina; Horton, Radley M.; Wood, Andrew; Mearns, Linda O.; Liang, Xin-Zhong; Winkler, Julia A.; Dixon, Keith; Hayhoe, Katharine; Rood, Richard B.

    2013-01-01

    Suppose you are a city planner, regional water manager, or wildlife conservation specialist who is asked to include the potential impacts of climate variability and change in your risk management and planning efforts. What climate information would you use? The choice is often regional or local climate projections downscaled from global climate models (GCMs; also known as general circulation models) to include detail at spatial and temporal scales that align with those of the decision problem. A few years ago this information was hard to come by. Now there is Web-based access to a proliferation of high-resolution climate projections derived with differing downscaling methods.

  13. Evaluation of cool season precipitation event characteristics over the Northeast US in a suite of downscaled climate model hindcasts

    NASA Astrophysics Data System (ADS)

    Loikith, Paul C.; Waliser, Duane E.; Kim, Jinwon; Ferraro, Robert

    2017-08-01

    Cool season precipitation event characteristics are evaluated across a suite of downscaled climate models over the northeastern US. Downscaled hindcast simulations are produced by dynamically downscaling the Modern-Era Retrospective Analysis for Research and Applications version 2 (MERRA2) using the National Aeronautics and Space Administration (NASA)-Unified Weather Research and Forecasting (NU-WRF) regional climate model (RCM) and the Goddard Earth Observing System Model, Version 5 (GEOS-5) global climate model. NU-WRF RCM simulations are produced at 24, 12, and 4-km horizontal resolutions using a range of spectral nudging schemes, while the MERRA2 global downscaled run is provided at 12.5 km. All model runs are evaluated using four metrics designed to capture key features of precipitation events: event frequency, event intensity, event total, and event duration. Overall, the downscaling approaches result in a reasonable representation of many of the key features of precipitation events over the region; however, considerable biases exist in the magnitude of each metric. Based on this evaluation there is no clear indication that higher-resolution simulations produce more realistic results in general; however, many small-scale features such as orographic enhancement of precipitation are only captured at higher resolutions, suggesting some added value over coarser resolutions. While the differences between simulations produced with and without nudging are small, there is some improvement in model fidelity when nudging is introduced, especially at a cutoff wavelength of 600 km compared to 2000 km. Based on the results of this evaluation, dynamical regional downscaling using NU-WRF results in a more realistic representation of precipitation event climatology than the global downscaling of MERRA2 using GEOS-5.
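The four event metrics can be computed from a daily precipitation series by identifying runs of consecutive wet days; a minimal sketch (illustrative only; the wet-day threshold and names are hypothetical):

```python
import numpy as np

def event_metrics(daily_precip, wet_threshold=1.0):
    """Event frequency, mean intensity, total, and mean duration from a
    daily precipitation series. An 'event' is a run of consecutive days
    at or above wet_threshold (mm/day)."""
    p = np.asarray(daily_precip, dtype=float)
    wet = p >= wet_threshold
    # Locate run starts/ends of consecutive wet days via edge detection.
    edges = np.diff(np.concatenate(([0], wet.astype(int), [0])))
    starts = np.where(edges == 1)[0]
    ends = np.where(edges == -1)[0]
    durations = ends - starts
    totals = np.array([p[s:e].sum() for s, e in zip(starts, ends)])
    n = len(starts)
    return {
        "frequency": n,                                       # events
        "total": totals.sum(),                                # mm
        "intensity": totals.sum() / durations.sum() if n else 0.0,  # mm/day
        "duration": durations.mean() if n else 0.0,           # days
    }
```

Evaluating these four numbers for each model run against an observed series gives the per-metric bias comparison described in the abstract.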

  14. High-resolution dynamical downscaling of the future Alpine climate

    NASA Astrophysics Data System (ADS)

    Bozhinova, Denica; José Gómez-Navarro, Juan; Raible, Christoph

    2017-04-01

    The Alpine region and Switzerland are a challenging area for simulating and analysing Global Climate Model (GCM) results. This is mostly due to the combination of very complex topography and the still rather coarse horizontal resolution of current GCMs, in which not all of the multi-scale processes that drive the local weather and climate can be resolved. In our study, the Weather Research and Forecasting (WRF) model is used to dynamically downscale a GCM simulation to a resolution as high as 2 km x 2 km. WRF is driven by initial and boundary conditions produced with the Community Earth System Model (CESM) for the recent past (control run) and until 2100 under the RCP8.5 climate scenario (future run). The control run downscaled with WRF covers the period 1976-2005, while the future run investigates a 20-year time slice, 2080-2099. We compare the control WRF-CESM simulations to an observational product provided by MeteoSwiss and to an additional WRF simulation driven by the ERA-Interim reanalysis, to estimate the bias introduced by the extra modelling step of our framework. Several bias-correction methods are evaluated, including a quantile mapping technique, to ameliorate the bias in the control WRF-CESM simulation. In the next step of our study these corrections are applied to our future WRF-CESM run. The resulting downscaled and bias-corrected data are analysed for the properties of precipitation and wind speed in the future climate. Our special interest focuses on the absolute quantities simulated for these meteorological variables, as these are used to identify extreme events such as wind storms and situations that can lead to floods.
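    The quantile mapping technique mentioned above has a compact empirical form. As a minimal sketch (function name and synthetic setup are illustrative, not taken from the study), each value to be corrected is mapped to the observed value at the same quantile of the model's historical distribution:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: map each future model value to the
    observed value at the same quantile of the historical model CDF."""
    # Non-exceedance probability of each future value within the
    # historical model distribution.
    ranks = np.searchsorted(np.sort(model_hist), model_future, side="right")
    probs = np.clip(ranks / len(model_hist), 0.0, 1.0)
    # Read off the observed quantile function at those probabilities.
    return np.quantile(obs_hist, probs)
```

    For a model that is uniformly biased high, the mapping effectively subtracts the bias while preserving the shape of the observed distribution.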

  15. CMIP5 based downscaled temperature over Western Himalayan region

    NASA Astrophysics Data System (ADS)

    Dutta, M.; Das, L.; Meher, J. K.

    2016-12-01

    Only a limited number of reliable temperature records are available for assessing warming over the Western Himalayan Region (WHR) of India: many of the stations provided by the India Meteorological Department have more than 30% missing values. For stations with <30% missing values, the gaps were filled using the Multiple Imputation Chained Equation (MICE) technique. Finally, 16 stations with continuous records during 1969-2009 were taken as the "reference stations" for assessing trends and for evaluating the Coupled Model Intercomparison Project, Phase 5 (CMIP5) global climate models (GCMs). The station data indicate higher and more rapid warming in winter (1.41 °C) than in the other seasons, with the least warming observed in the post-monsoon season (0.31 °C). The mean annual warming is 0.84 °C during 1969-2009, indicating that the warming over the WHR is more than double the global warming (0.85 °C during 1880-2012). The performance of 34 CMIP5 models was evaluated through three approaches, namely comparison of (i) the mean seasonal cycle, (ii) temporal trends, and (iii) spatial correlation, and a rank was assigned to each GCM. The ability of the better-performing GCMs to reproduce the observed spatial details was verified against the ERA-Interim reanalysis data. Finally, station-level future downscaled winter temperatures were constructed using an Empirical Statistical Downscaling (ESD) technique, with 2-m air temperature (T2m) as the predictor and station temperature as the predictand. The future range of downscaled temperature change for the stations Dehradun, Manali, and Gulmarg is 1.3-6.1 °C, 1.1-5.8 °C, and 0.5-5.8 °C, respectively, at the end of the 21st century.

  16. Downscaling future climate scenarios to fine scales for hydrologic and ecological modeling and analysis

    USGS Publications Warehouse

    Flint, Lorraine E.; Flint, Alan L.

    2012-01-01

    The methodology, which includes a sequence of rigorous analyses and calculations, is intended to reduce the addition of uncertainty to the climate data as a result of the downscaling while providing the fine-scale climate information necessary for ecological analyses. It results in new but consistent data sets for the US at 4 km, the southwest US at 270 m, and California at 90 m and illustrates the utility of fine-scale downscaling to analyses of ecological processes influenced by topographic complexity.

  17. Performance of Statistical Temporal Downscaling Techniques of Wind Speed Data Over Aegean Sea

    NASA Astrophysics Data System (ADS)

    Gokhan Guler, Hasan; Baykal, Cuneyt; Ozyurt, Gulizar; Kisacik, Dogan

    2016-04-01

    Wind speed data are a key input for many meteorological and engineering applications. Many institutions provide wind speed data with temporal resolutions ranging from one hour to twenty-four hours. Higher temporal resolution is generally required for some applications, such as reliable wave hindcasting studies. One solution for generating wind data at high sampling frequencies is to use statistical downscaling techniques to interpolate values at the finer sampling intervals from the available data. In this study, the major aim is to assess the temporal downscaling performance of nine statistical interpolation techniques by quantifying the inherent uncertainty due to the selection of different techniques. For this purpose, hourly 10-m wind speed data at 227 data points over the Aegean Sea between 1979 and 2010, with a spatial resolution of approximately 0.3 degrees, are analyzed from the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis database. Additionally, hourly 10-m wind speed data from two in-situ measurement stations between June 2014 and June 2015 are considered to understand the effect of dataset properties on the uncertainty generated by the interpolation technique. The nine statistical interpolation techniques are: w0 (left constant) interpolation, w6 (right constant) interpolation, averaging step function interpolation, linear interpolation, 1D Fast Fourier Transform interpolation, 2nd- and 3rd-degree Lagrange polynomial interpolation, cubic spline interpolation, and piecewise cubic Hermite interpolating polynomials. The original data are downsampled to 6 hours (i.e. the wind speeds at the 0th, 6th, 12th, and 18th hours of each day are selected), the 6-hourly data are then temporally downscaled to hourly data (i.e. the wind speeds at each hour within the intervals are computed) using the nine interpolation techniques, and finally the original data are compared with the temporally downscaled data. A penalty point system based on
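    The downsample-then-reconstruct design described above can be sketched for a few of the listed techniques (linear, cubic spline, and piecewise cubic Hermite). This is an illustrative sketch assuming SciPy is available, not the authors' code or their full penalty-point scoring:

```python
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

def downscale_errors(hourly):
    """Downsample an hourly series to 6-hourly, interpolate back to
    hourly, and return the RMSE of each interpolation technique.
    The series should end on a sampled hour to avoid extrapolation."""
    t = np.arange(len(hourly))
    t6 = t[::6]                      # keep hours 0, 6, 12, 18, ...
    w6 = hourly[::6]
    methods = {
        "linear": np.interp(t, t6, w6),
        "cubic_spline": CubicSpline(t6, w6)(t),
        "pchip": PchipInterpolator(t6, w6)(t),
    }
    return {name: float(np.sqrt(np.mean((est - hourly) ** 2)))
            for name, est in methods.items()}
```

    On a smooth diurnal signal the spline-based techniques reconstruct the hourly values far more closely than linear interpolation, which is the kind of ranking the penalty-point system formalizes.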

  18. Statistical downscaling of mean temperature, maximum temperature, and minimum temperature on the Loess Plateau, China

    NASA Astrophysics Data System (ADS)

    Lin, Jiang; Miao, Chiyuan

    2017-04-01

    Climate change is considered to be one of the greatest environmental threats, and this has urged scientific communities to focus on the topic. Global climate models (GCMs) are the primary tool used for studying climate change. However, GCMs are limited by their coarse spatial resolution and inability to resolve important sub-grid-scale features such as terrain and clouds. Statistical downscaling methods can be used to downscale large-scale variables to the local scale. In this study, we assess the applicability of the widely used Statistical Downscaling Model (SDSM) for the Loess Plateau, China. The observed variables were daily mean temperature (TMEAN), maximum temperature (TMAX), and minimum temperature (TMIN) from 1961 to 2005. The daily atmospheric predictors were taken from reanalysis data for 1961-2005 and from the outputs of the Beijing Normal University Earth System Model (BNU-ESM) for 1961-2099. The results show that SDSM performs well for these three climatic variables on the Loess Plateau. After downscaling, the root mean square errors for TMEAN, TMAX, and TMIN for BNU-ESM were reduced by 70.9%, 75.1%, and 67.2%, respectively. All the rates of change in TMEAN, TMAX, and TMIN during the 21st century decreased after SDSM downscaling. We also show that SDSM can effectively reduce uncertainty compared with the raw model outputs. TMEAN uncertainty was reduced by 27.1%, 26.8%, and 16.3% for the future scenarios RCP 2.6, RCP 4.5, and RCP 8.5, respectively. The corresponding reductions in uncertainty were 23.6%, 30.7%, and 18.7% for TMAX, and 37.6%, 31.8%, and 23.2% for TMIN.
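    At its core, SDSM builds regression-based transfer functions between large-scale predictors and station variables (combined, in the full model, with a stochastic weather generator). Below is a minimal deterministic sketch of that regression step, with synthetic names and data; it is not the SDSM code itself:

```python
import numpy as np

def train_transfer(predictors, station):
    """Least-squares transfer function: station series as a linear
    combination of large-scale predictors, plus an intercept."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    beta, *_ = np.linalg.lstsq(X, station, rcond=None)
    return beta

def downscale(beta, predictors):
    """Apply the calibrated transfer function to new predictors
    (e.g. GCM output for a future scenario)."""
    X = np.column_stack([np.ones(len(predictors)), predictors])
    return X @ beta
```

    The coefficients are calibrated against reanalysis predictors over the observational period and then applied to GCM predictors for the scenario runs.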

  19. Sensitivity of Statistical Downscaling Techniques to Reanalysis Choice and Implications for Regional Climate Change Scenarios

    NASA Astrophysics Data System (ADS)

    Manzanas, R., Sr.; Brands, S.; San Martin, D., Sr.; Gutiérrez, J. M., Sr.

    2014-12-01

    This work shows that local-scale climate projections obtained by means of statistical downscaling are sensitive to the choice of reanalysis used for calibration. To this aim, a Generalized Linear Model (GLM) approach is applied to downscale daily precipitation in the Philippines. First, the GLMs are trained and tested, under a cross-validation scheme, separately for two distinct reanalyses (ERA-Interim and JRA-25) for the period 1981-2000. When the observed and downscaled time series are compared, the attained performance is found to be sensitive to the reanalysis considered if variables bearing a climate change signal (temperature and/or specific humidity) are included in the predictor field. Moreover, the performance differences are shown to correspond to the disagreement found between the raw predictors from the two reanalyses. Second, the regression coefficients calibrated with either ERA-Interim or JRA-25 are subsequently applied to the output of a Global Climate Model (MPI-ECHAM5) in order to assess the sensitivity of local-scale climate change projections (up to 2100) to reanalysis choice. In this case, the differences detected in present climate conditions are considerably amplified, leading to "delta-change" estimates differing by up to 35% (on average for the entire country) depending on the reanalysis used for calibration. Therefore, reanalysis choice is shown to contribute importantly to the uncertainty of local-scale climate change projections and, consequently, should be treated with the same care as other well-known sources of uncertainty, e.g., the choice of GCM and/or downscaling method. Implications of the results for the entire tropics, as well as for the Model Output Statistics downscaling approach, are also briefly discussed.
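    The "delta-change" estimates referred to above follow a simple recipe. A one-function sketch (illustrative only, not the authors' GLM pipeline) for a multiplicative change applied to precipitation:

```python
import numpy as np

def delta_change(obs, gcm_hist, gcm_future):
    """Scale an observed precipitation series by the GCM-simulated
    relative change in the mean (multiplicative delta change)."""
    factor = np.mean(gcm_future) / np.mean(gcm_hist)
    return np.asarray(obs, dtype=float) * factor
```

    A 35% disagreement in the change factor between two calibrations, as reported above, translates directly into a 35% difference in the projected local series.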

  20. Using a Coupled Lake Model with WRF for Dynamical Downscaling

    EPA Science Inventory

    The Weather Research and Forecasting (WRF) model is used to downscale a coarse reanalysis (National Centers for Environmental Prediction–Department of Energy Atmospheric Model Intercomparison Project reanalysis, hereafter R2) as a proxy for a global climate model (GCM) to examine...

  1. A review of downscaling procedures - a contribution to the research on climate change impacts at city scale

    NASA Astrophysics Data System (ADS)

    Smid, Marek; Costa, Ana; Pebesma, Edzer; Granell, Carlos; Bhattacharya, Devanjan

    2016-04-01

    Humankind is now predominantly urban-based, and the majority of continuing population growth will take place in urban agglomerations. Urban systems are not only major drivers of climate change, but also impact hot spots. Furthermore, climate change impacts are commonly managed at the city scale. Therefore, assessing climate change impacts on urban systems is a highly relevant subject of research. Climate and its impacts at all levels (local, meso, and global scales), as well as the inter-scale dependencies of these processes, should be subject to detailed analysis. While global and regional projections of future climate are currently available, local-scale information is lacking. Hence, statistical downscaling methodologies represent a potentially efficient way to help close this gap. In general, methodological reviews of downscaling procedures cover the various methods according to their application (e.g. downscaling for hydrological modelling). Some of the most recent and comprehensive studies, such as the ESSEM COST Action ES1102 (VALUE), use the concepts of Perfect Prognosis and Model Output Statistics (MOS). Other classification schemes for downscaling techniques consider three main categories: linear methods, weather classifications, and weather generators. Downscaling and climate modelling represent a multidisciplinary field in which researchers from various backgrounds intersect their efforts, resulting in specific terminology that may be somewhat confusing. For instance, Polynomial Regression (also called Surface Trend Analysis) is a statistical technique; in the context of spatial interpolation procedures, however, it is commonly classified as a deterministic technique, while kriging approaches are classified as stochastic. Furthermore, the terms "statistical" and "stochastic" (frequently used as names of sub-classes in downscaling methodological reviews) are not always considered synonymous, even though both terms could be seen as identical since they are

  2. Ensemble data assimilation in the Red Sea: sensitivity to ensemble selection and atmospheric forcing

    NASA Astrophysics Data System (ADS)

    Toye, Habib; Zhan, Peng; Gopalakrishnan, Ganesh; Kartadikaria, Aditya R.; Huang, Huang; Knio, Omar; Hoteit, Ibrahim

    2017-07-01

    We present our efforts to build an ensemble data assimilation and forecasting system for the Red Sea. The system consists of the high-resolution Massachusetts Institute of Technology general circulation model (MITgcm) to simulate ocean circulation and the Data Assimilation Research Testbed (DART) for ensemble data assimilation. DART has been configured to integrate all members of an ensemble adjustment Kalman filter (EAKF) in parallel; building on this, we adapted the ensemble operations in DART to use an invariant ensemble, i.e., an ensemble Optimal Interpolation (EnOI) algorithm. This approach requires only a single forward model integration in the forecast step and therefore saves substantial computational cost. To deal with the strong seasonal variability of the Red Sea, the EnOI ensemble is seasonally selected from a climatology of long-term model outputs. Remote sensing observations of sea surface height (SSH) and sea surface temperature (SST) are assimilated every 3 days. Real-time atmospheric fields from the National Centers for Environmental Prediction (NCEP) and the European Centre for Medium-Range Weather Forecasts (ECMWF) are used as forcing in different assimilation experiments. We investigate the behaviors of the EAKF and the (seasonal) EnOI and compare their performances for assimilating and forecasting the circulation of the Red Sea. We further assess the sensitivity of the assimilation system to various filtering parameters (ensemble size, inflation) and to the atmospheric forcing.
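    The computational appeal of EnOI is that the analysis step uses a static ensemble in a standard Kalman-type update. A toy sketch of that update (all names and dimensions illustrative; the real system works with DART and MITgcm state vectors):

```python
import numpy as np

def enoi_update(xb, ens, y, H, obs_var, alpha=1.0):
    """Ensemble Optimal Interpolation analysis step.

    xb      : background state, shape (n,)
    ens     : static (e.g. seasonally selected) ensemble, shape (n, m)
    y       : observations, shape (p,)
    H       : linear observation operator, shape (p, n)
    obs_var : observation-error variance (scalar)
    alpha   : covariance scaling (inflation-like) factor
    """
    m = ens.shape[1]
    A = ens - ens.mean(axis=1, keepdims=True)       # ensemble anomalies
    Pb = alpha * (A @ A.T) / (m - 1)                # static background covariance
    R = obs_var * np.eye(len(y))
    K = Pb @ H.T @ np.linalg.inv(H @ Pb @ H.T + R)  # Kalman gain
    return xb + K @ (y - H @ xb)
```

    Because `ens` never changes between cycles, only the single background trajectory `xb` has to be integrated forward, which is the cost saving noted above.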

  3. Weather extremes in very large, high-resolution ensembles: the weatherathome experiment

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.

    2011-12-01

    Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models or on statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office regional climate model HadRM3P to run at 25 and 50 km resolution on personal computers volunteered by the general public, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe, and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments focus on the event attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al. (2011) but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.

  4. Downscaling Satellite Precipitation with Emphasis on Extremes: A Variational ℓ1-Norm Regularization in the Derivative Domain

    NASA Astrophysics Data System (ADS)

    Foufoula-Georgiou, E.; Ebtehaj, A. M.; Zhang, S. Q.; Hou, A. Y.

    2014-05-01

    The increasing availability of precipitation observations from space, e.g., from the Tropical Rainfall Measuring Mission (TRMM) and the forthcoming Global Precipitation Measurement (GPM) mission, has fueled renewed interest in developing frameworks for downscaling and multi-sensor data fusion that can handle large data sets in computationally efficient ways while optimally reproducing desired properties of the underlying rainfall fields. Of special interest is the reproduction of extreme precipitation intensities and gradients, as these are directly relevant to hazard prediction. In this paper, we present a new formalism for downscaling satellite precipitation observations, which explicitly allows for the preservation of some key geometrical and statistical properties of spatial precipitation. These include sharp intensity gradients (due to high-intensity regions embedded within lower-intensity areas), coherent spatial structures (due to regions of slowly varying rainfall), and thicker-than-Gaussian tails of precipitation gradients and intensities. Specifically, we pose the downscaling problem as a discrete inverse problem and solve it via a regularized variational approach (variational downscaling), where the regularization term is selected to impose the desired smoothness in the solution while allowing for some steep gradients (called ℓ1-norm or total variation regularization). We demonstrate the duality between this geometrically inspired solution and its Bayesian statistical interpretation, which is equivalent to assuming a Laplace prior distribution for the precipitation intensities in the derivative (wavelet) space. When the observation operator is not known, we discuss the effect of its misspecification and explore a previously proposed dictionary-based sparse inverse downscaling methodology to indirectly learn the observation operator from a database of coincident high- and low-resolution observations. 
The proposed method and ideas are illustrated in case
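    In one dimension, the variational downscaling idea (data fidelity to the coarse observation plus an edge-preserving ℓ1-type penalty) can be sketched with plain gradient descent on a smoothed total-variation term. This is an illustrative toy with a block-averaging observation operator and a smoothed absolute value, not the authors' wavelet-domain formulation:

```python
import numpy as np

def tv_downscale(y, s, lam=0.1, eps=1e-3, iters=3000, lr=0.05):
    """Variational downscaling of a 1-D coarse field y to s-times-finer
    resolution: minimize ||Hx - y||^2 + lam * sum sqrt((dx)^2 + eps),
    where H averages each block of s fine cells (smoothed TV penalty)."""
    x = np.repeat(y, s).astype(float)              # start from block replication
    for _ in range(iters):
        # Data-fit gradient: H^T (Hx - y), with H the block-mean operator.
        r = x.reshape(len(y), s).mean(axis=1) - y
        g = np.repeat(r, s) / s
        # Smoothed total-variation gradient (preserves sharp jumps).
        d = np.diff(x)
        w = d / np.sqrt(d * d + eps)
        g[:-1] -= lam * w
        g[1:] += lam * w
        x -= lr * g
    return x
```

    For a piecewise-constant truth, the TV penalty keeps the jump sharp instead of smearing it, while the data term keeps the fine field consistent with the coarse block means.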

  5. A framework for evaluating statistical downscaling performance under changing climatic conditions (Invited)

    NASA Astrophysics Data System (ADS)

    Dixon, K. W.; Balaji, V.; Lanzante, J.; Radhakrishnan, A.; Hayhoe, K.; Stoner, A. K.; Gaitan, C. F.

    2013-12-01

    Statistical downscaling (SD) methods may be viewed as generating a value-added product: a refinement of global climate model (GCM) output designed to add finer-scale detail and to address GCM shortcomings via a process that gleans information from a combination of observations and GCM-simulated climate change responses. Making use of observational data sets and GCM simulations representing the same historical period, cross-validation techniques allow one to assess how well an SD method meets this goal. However, lacking observations of the future, the extent to which a particular SD method's skill might degrade when applied to future climate projections cannot be assessed in the same manner. Here we illustrate and describe extensions to a 'perfect model' experimental design that seeks to quantify aspects of SD method performance both for a historical period (1979-2008) and for late 21st century climate projections. Examples highlighting cases in which downscaling performance deteriorates in future climate projections will be discussed. Also, results will be presented showing how synthetic datasets having known statistical properties may be used to further isolate factors responsible for degradations in SD method skill under changing climatic conditions. We will describe a set of input files used to conduct these analyses that are being made available to researchers who wish to use this experimental framework to evaluate SD methods they have developed. The gridded data sets cover a region centered on the contiguous 48 United States with a grid spacing of approximately 25 km, have daily time resolution (e.g., maximum and minimum near-surface temperature and precipitation), and represent a total of 120 years of model simulations. 
This effort is consistent with the 2013 National Climate Predictions and Projections Platform Quantitative Evaluation of Downscaling Workshop goal of supporting a community approach to promote the informed use of downscaled climate projections.

  6. Using Random Forest to Improve the Downscaling of Global Livestock Census Data

    PubMed Central

    Nicolas, Gaëlle; Robinson, Timothy P.; Wint, G. R. William; Conchedda, Giulia; Cinardi, Giuseppina; Gilbert, Marius

    2016-01-01

    Large-scale, high-resolution global data on farm animal distributions are essential for spatially explicit assessments of the epidemiological, environmental, and socio-economic impacts of the livestock sector. This has been the major motivation behind the development of the Gridded Livestock of the World (GLW) database, which has been extensively used since its first publication in 2007. The database relies on a downscaling methodology whereby census counts of animals in sub-national administrative units are redistributed at the level of grid cells as a function of a series of spatial covariates. The recent upgrade of GLW1 to GLW2 involved automating the processing, improving the input data, and downscaling at a spatial resolution of 1 km per cell (5 km per cell in the earlier version). The underlying statistical methodology, however, remained unchanged. In this paper, we evaluate new methods to downscale census data with higher accuracy and increased processing efficiency. Two main factors were evaluated, based on sample census datasets of cattle in Africa and chickens in Asia. First, we implemented and evaluated Random Forest (RF) models instead of stratified regressions. Second, we investigated whether models that predict the number of animals per rural person (per capita) could provide better downscaled estimates than the previous approach, which predicted absolute densities (animals per km²). RF models consistently provided better predictions than the stratified regressions for both continents and species. The benefit of per capita over absolute density models varied according to the species and continent. In addition, different technical options were evaluated to reduce the processing time while maintaining predictive power. Future GLW runs (GLW 3.0) will apply the new RF methodology with optimized modelling options. 
The potential benefit of per capita models will need to be further investigated with a better distinction between rural and agricultural
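    The per-capita RF approach can be sketched with scikit-learn (assuming it is available): predict animals per rural person from covariates, convert to counts, then rescale each administrative unit so the gridded counts still sum to the census total. All variable names and the synthetic data are illustrative, not the GLW code:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def downscale_livestock(cov_train, percap_train, cov_all, rural_pop,
                        admin_id, admin_totals):
    """RF downscaling sketch: per-capita prediction + mass preservation."""
    rf = RandomForestRegressor(n_estimators=100, random_state=0)
    rf.fit(cov_train, percap_train)
    counts = rf.predict(cov_all) * rural_pop        # per capita -> head counts
    for a, total in admin_totals.items():
        mask = admin_id == a
        s = counts[mask].sum()
        if s > 0:                                   # keep census totals exact
            counts[mask] *= total / s
    return counts
```

    The final rescaling step is what makes the product consistent with the sub-national census counts regardless of the statistical model used.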

  7. Downscaled soil moisture from SMAP evaluated using high density observations

    USDA-ARS?s Scientific Manuscript database

    Recently, a soil moisture downscaling algorithm based on a regression relationship between daily temperature changes and daily average soil moisture was developed to produce an enhanced-spatial-resolution soil moisture product for the Advanced Microwave Scanning Radiometer–EOS (AMSR-E) satellite ...

  8. Downscaling Indicators of Forest Habitat Structure from National Assessments

    Treesearch

    Kurt H. Riitters

    2005-01-01

    Downscaling is an important problem because consistent large-area assessments of forest habitat structure are feasible only when using relatively coarse data and indicators. Techniques are needed to enable more detailed and local interpretations of the national statistics. Using the results of national assessments from land-cover maps, this paper...

  9. Extended-Range High-Resolution Dynamical Downscaling over a Continental-Scale Domain

    NASA Astrophysics Data System (ADS)

    Husain, S. Z.; Separovic, L.; Yu, W.; Fernig, D.

    2014-12-01

    High-resolution mesoscale simulations, when applied to downscale meteorological fields over large spatial domains and for extended time periods, can provide valuable information for many practical applications, including the weather-dependent renewable energy industry. In the present study, a strategy is proposed to dynamically downscale coarse-resolution meteorological fields from Environment Canada's regional analyses for a period of multiple years over the entire Canadian territory. The study demonstrates that a continuous mesoscale simulation over the entire domain is the most suitable approach in this regard. Large-scale deviations in the different meteorological fields pose the biggest challenge for extended-range simulations over continental-scale domains, and the enforcement of the lateral boundary conditions is not sufficient to restrict such deviations. A scheme has therefore been developed to spectrally nudge the simulated high-resolution meteorological fields at the different model vertical levels towards those embedded in the coarse-resolution driving fields derived from the regional analyses. A series of experiments was carried out to determine the optimal nudging strategy, including the appropriate nudging length scales, nudging vertical profile, and temporal relaxation. A forcing strategy based on grid nudging of the different surface fields, including surface temperature, soil moisture, and snow conditions, towards their expected values obtained from a high-resolution offline surface scheme was also devised to limit any considerable drift in the evolving surface fields during extended-range temporal integrations. The study shows that ensuring large-scale atmospheric similarity helps to deliver near-surface statistical scores for temperature, dew point temperature, and horizontal wind speed that are better than or comparable to the operational regional forecasts issued by Environment Canada. Furthermore, the meteorological fields
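    Spectral nudging relaxes only the long-wavelength part of the simulated fields toward the driving analyses, leaving the fine scales free to evolve. A 1-D FFT-based toy sketch (names, scalar relaxation strength, and setup are illustrative; the actual scheme operates on model levels in a 3-D model):

```python
import numpy as np

def spectral_nudge(field, driving, cutoff_km, dx_km, strength=0.1):
    """Relax the large-scale (long-wavelength) part of a 1-D model field
    toward the driving field, leaving shorter scales untouched."""
    n = len(field)
    k = np.fft.rfftfreq(n, d=dx_km)            # spatial frequency, cycles/km
    keep = k <= 1.0 / cutoff_km                # wavelengths >= cutoff
    diff = np.fft.rfft(driving - field)
    diff[~keep] = 0.0                          # nudge only the large scales
    return field + strength * np.fft.irfft(diff, n)
```

    Applied repeatedly, this drives the retained scales toward the driving field at a rate set by `strength`, which plays the role of the temporal relaxation discussed above, while scales shorter than the cutoff are untouched.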

  10. Downscale climate change scenarios over the Western Himalayan region of India using multi-generation CMIP experiments

    NASA Astrophysics Data System (ADS)

    Das, Lalu; Meher, Jitendra K.; Akhter, Javed

    2017-04-01

    Assessing climate change over the Western Himalayan Region (WHR) of India is a crucial but challenging task due to the limited number of stations with data free of large gaps. Missing values in the station records were filled using the Multiple Imputation Chained Equation (MICE) technique. Finally, 22 rain gauge stations with continuous data during 1901-2005 and 16 stations with continuous temperature data during 1969-2009 were taken as "reference stations" for assessing rainfall and temperature trends and for evaluating the GCMs available in the Coupled Model Intercomparison Project, Phase 3 (CMIP3) and Phase 5 (CMIP5) over the WHR. The station data indicate that winter warming is higher and more rapid (1.05 °C) than in the other seasons, with the least warming in the post-monsoon season over the last 41 years. Area averages from the 22 stations indicate that monsoon and winter rainfall decreased by 5 mm and 320 mm, respectively, during 1901-2000, while pre-monsoon and post-monsoon rainfall showed increasing trends of 21 mm and 13 mm, respectively. The present study constructs downscaled climate change information at station locations (22 stations for rainfall and 16 for temperature) over the WHR from the GCMs commonly available in the IPCC's different generations of assessment reports, namely the 2nd, 3rd, 4th, and 5th, hereafter SAR, TAR, AR4, and AR5, respectively. Once the downscaled results are obtained for each generation of model outputs, a comparison is carried out across the results of each generation. Finally, an overall model improvement index (OMII) is developed from the downscaling results and used to investigate model improvement across generations, as well as the improvement of downscaling results obtained from the empirical statistical downscaling (ESD) methods. In general, the results indicate that there is a gradual improvement of GCM simulations, as well as of downscaling results, across generations

  11. A review of spatial downscaling of satellite remotely sensed soil moisture

    NASA Astrophysics Data System (ADS)

    Peng, Jian; Loew, Alexander; Merlin, Olivier; Verhoest, Niko E. C.

    2017-06-01

    Satellite remote sensing technology has been widely used to estimate surface soil moisture, and numerous efforts have been devoted to developing global soil moisture products. However, these global soil moisture products, normally retrieved from microwave remote sensing data, are typically not suitable for regional hydrological and agricultural applications such as irrigation management and flood prediction, due to their coarse spatial resolution. Therefore, various downscaling methods have been proposed to improve the coarse-resolution soil moisture products. The purpose of this paper is to review existing methods for downscaling satellite remotely sensed soil moisture. These methods are assessed and compared in terms of their advantages and limitations. This review also provides the accuracy level of these methods based on published validation studies. In the final part, problems and future trends associated with these methods are analyzed.

  12. Distributed HUC-based modeling with SUMMA for ensemble streamflow forecasting over large regional domains.

    NASA Astrophysics Data System (ADS)

    Saharia, M.; Wood, A.; Clark, M. P.; Bennett, A.; Nijssen, B.; Clark, E.; Newman, A. J.

    2017-12-01

    Most operational streamflow forecasting systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow require an experienced human forecaster. But this approach faces challenges surrounding process reproducibility, hindcasting capability, and extension to large domains. The operational hydrologic community is increasingly moving towards `over-the-loop' (completely automated) large-domain simulations, yet recent developments indicate a widespread lack of community knowledge about the strengths and weaknesses of such systems for forecasting. A realistic representation of land surface hydrologic processes is a critical element for improving forecasts, but often comes at a substantial cost in forecast system agility and efficiency. While popular grid-based models support the distributed representation of land surface processes, intermediate-scale Hydrologic Unit Code (HUC)-based modeling could provide a more efficient and process-aligned spatial discretization, reducing the need for tradeoffs between model complexity and critical forecasting requirements such as ensemble methods and comprehensive model calibration. The National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation, and the USACE to implement, assess, and demonstrate real-time, over-the-loop distributed streamflow forecasting for several large western US river basins and regions. In this presentation, we present early results from short- to medium-range hydrologic and streamflow forecasts for the Pacific Northwest (PNW). We employ real-time 1/16th-degree daily ensemble model forcings as well as downscaled Global Ensemble Forecast System (GEFS) meteorological forecasts. These datasets drive an intermediate-scale configuration of the Structure for Unifying Multiple Modeling Alternatives (SUMMA) model, which represents the PNW using over 11,700 HUCs. 
The system produces not only streamflow forecasts (using the Mizu

  13. Downscaling essential climate variable soil moisture using multisource data from 2003 to 2010 in China

    NASA Astrophysics Data System (ADS)

    Wang, Hui-Lin; An, Ru; You, Jia-jun; Wang, Ying; Chen, Yuehong; Shen, Xiao-ji; Gao, Wei; Wang, Yi-nan; Zhang, Yu; Wang, Zhe; Quaye-Ballard, Jonathan Arthur

    2017-10-01

    Soil moisture plays an important role in the water cycle within the surface ecosystem, and it is the basic condition for the growth of plants. Currently, the spatial resolutions of most soil moisture data from remote sensing range from ten to several tens of km, while those observed in situ and simulated for watershed hydrology, ecology, agriculture, weather, and drought research are generally <1 km. Therefore, the existing coarse-resolution remotely sensed soil moisture data need to be downscaled. This paper proposes a universal and multitemporal soil moisture downscaling method suitable for large areas. The datasets comprise land surface temperature, brightness temperature, precipitation, and soil and topographic parameters from high-resolution data, together with active/passive microwave remotely sensed essential climate variable soil moisture (ECV_SM) data with a spatial resolution of 25 km. Using this method, a total of 288 soil moisture maps at 1-km resolution, from the first 10-day period of January 2003 to the last 10-day period of December 2010, were derived. The in-situ observations were used to validate the downscaled ECV_SM. In general, the downscaled soil moisture values for different land cover and land use types are consistent with the in-situ observations. Root mean square error is reduced from 0.070 to 0.061 using 1970 in-situ time series observations from 28 sites distributed over different land use and land cover types. The performance was also assessed using the GDOWN metric, a measure of the overall performance of downscaling methods, based on the same dataset. GDOWN was positive in 71.429% of cases, indicating that the proposed method generally improves the representation of soil moisture at 1-km resolution.
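
    Since the GDOWN metric's exact definition is not restated above, a minimal stand-in for this style of validation is the fraction of sites at which downscaling lowers the RMSE against in-situ series. A hedged numpy sketch with toy data (the site counts and values are illustrative only):

```python
import numpy as np

def rmse(pred, obs):
    """Root mean square error of one site's time series."""
    return np.sqrt(np.mean((pred - obs) ** 2))

def fraction_improved(coarse, fine, obs):
    """Fraction of sites where the downscaled (fine) product has lower RMSE
    than the coarse product; a stand-in for a GDOWN-style positivity count.
    All arrays have shape (sites, times)."""
    gains = [rmse(c, o) - rmse(f, o) for c, f, o in zip(coarse, fine, obs)]
    return float(np.mean(np.array(gains) > 0))

# Toy data: 4 sites x 5 dates; downscaling helps at 3 of the 4 sites
obs = np.zeros((4, 5))
fine = np.full((4, 5), 0.1)
coarse = np.full((4, 5), 0.2)
coarse[0] = 0.05          # downscaling degrades this one site
score = fraction_improved(coarse, fine, obs)   # 0.75
```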

  14. Impact of climate change on hydrological conditions in a tropical West African catchment using an ensemble of climate simulations

    NASA Astrophysics Data System (ADS)

    Yira, Yacouba; Diekkrüger, Bernd; Steup, Gero; Yaovi Bossa, Aymar

    2017-04-01

    This study evaluates climate change impacts on water resources using an ensemble of six regional climate models (RCMs)-global climate models (GCMs) in the Dano catchment (Burkina Faso). The applied climate datasets were produced within the framework of the COordinated Regional climate Downscaling Experiment (CORDEX-Africa) project.

    After evaluation of the historical runs of the climate models' ensemble, a statistical bias correction (empirical quantile mapping) was applied to daily precipitation. Temperature and bias corrected precipitation data from the ensemble of RCMs-GCMs was then used as input for the Water flow and balance Simulation Model (WaSiM) to simulate water balance components.
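
    The empirical quantile mapping used for the bias correction can be sketched as follows; this is a minimal illustration of the technique, not the study's actual implementation:

```python
import numpy as np

def quantile_map(obs_hist, mod_hist, mod_fut):
    """Empirical quantile mapping: find each model value's quantile in the
    historical model distribution, then read off the same quantile from the
    observed distribution."""
    q = np.searchsorted(np.sort(mod_hist), mod_fut) / len(mod_hist)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_hist, q)

# Toy example: the "model" underestimates rainfall; mapping removes the bias
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 5.0, 1000)    # pseudo-observed daily precipitation
mod = rng.gamma(2.0, 3.0, 1000)    # biased pseudo-model precipitation
corrected = quantile_map(obs, mod, mod)
```

    Correcting the historical run onto itself reproduces the observed distribution; in practice the mapping fitted on the historical period is applied to the scenario run, which is why only the magnitude (not the sign) of the change signal is affected.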

    The mean hydrological and climate variables for two periods (1971-2000 and 2021-2050) were compared to assess the potential impact of climate change on water resources up to the middle of the 21st century under two greenhouse gas concentration scenarios, the Representative Concentration Pathways (RCPs) 4.5 and 8.5. The results indicate (i) a clear signal of temperature increase of about 0.1 to 2.6 °C for all members of the RCM-GCM ensemble; (ii) high uncertainty about how the catchment precipitation will evolve over the period 2021-2050; (iii) the applied bias correction method only affected the magnitude of the climate change signal; (iv) individual climate model results lead to opposite discharge change signals; and (v) the results for the RCM-GCM ensemble are too uncertain to give any clear direction for future hydrological development. Therefore, potential increase and

  15. Downscaling and hydrological uncertainties in 20th century hydrometeorological reconstructions over France

    NASA Astrophysics Data System (ADS)

    Vidal, Jean-Philippe; Caillouet, Laurie; Dayon, Gildas; Boé, Julien; Sauquet, Eric; Thirel, Guillaume; Graff, Benjamin

    2017-04-01

    The record length of streamflow observations is generally limited to the last 50 years, which is not enough to properly explore the natural hydrometeorological variability, a key to better understand the effects of anthropogenic climate change. This work proposes a comparison of different hydrometeorological reconstruction datasets over France built on the downscaling of the NOAA 20th century global extended reanalysis (20CR, Compo et al., 2011). It aims at assessing the uncertainties related to these reconstructions and improving our knowledge of the multi-decadal hydrometeorological variability over the 20th century. High-resolution daily meteorological reconstructions over the period 1871-2012 are obtained with two statistical downscaling methods based on the analogue approach: the deterministic ANALOG method (Dayon et al., 2015) and the probabilistic SCOPE method (Caillouet et al., 2016). These reconstructions are then used as forcings for the GR6J lumped conceptual rainfall-runoff model and the SIM physically-based distributed hydrological model, in order to derive daily streamflow reconstructions over a set of around 70 reference near-natural catchments. Results show a large multi-decadal streamflow variability over the last 140 years, which is however relatively consistent over France. Empirical estimates of three types of uncertainty - structure of the downscaling method, small-scale internal variability, and hydrological model structure - show roughly equal contributions to the streamflow uncertainty at the annual time scale, with values as high as 20% of the interannual mean. Caillouet, L., Vidal, J.-P., Sauquet, E., and Graff, B.: Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France, Clim. Past, 12, 635-662, doi:10.5194/cp-12-635-2016, 2016. Compo, G. P., Whitaker, J. S., Sardeshmukh, P. D., Matsui, N., Allan, R. J., Yin, X., Gleason, B. E., Vose, R. S., Rutledge, G., Bessemoulin, P., Brönnimann, S

  16. Potential interactions between diadromous fishes of U.K. conservation importance and the electromagnetic fields and subsea noise from marine renewable energy developments.

    PubMed

    Gill, A B; Bartlett, M; Thomsen, F

    2012-07-01

    The considerable extent of construction and operation of marine renewable energy developments (MRED) within U.K. and adjacent waters will lead, among other things, to the emission of electromagnetic fields (EMF) and subsea sounds into the marine environment. Migratory fishes that respond to natural environmental cues, such as the Earth's geomagnetic field or underwater sounds, move through the same waters that the MRED occupy, thereby raising the question of whether there are any effects of MRED on migratory fishes. Diadromous species, such as the Salmonidae and Anguillidae, which undertake large-scale migrations through coastal and offshore waters, are already significantly affected by other human activities leading to national and international conservation efforts to manage any existing threats and to minimize future concerns, including the potential effect of MRED. Here, the current state of knowledge with regard to the potential for diadromous fishes of U.K. conservation importance to be affected by MRED is reviewed. The information on which to base the review was found to be limited with respect to all aspects of these fishes' migratory behaviour and activity, especially with regards to MRED deployment, making it difficult to establish cause and effect relationships. The main findings, however, were that diadromous species can use the Earth's magnetic field for orientation and direction finding during migrations. Juveniles of anadromous brown trout (sea trout) Salmo trutta and close relatives of S. trutta respond to both the Earth's magnetic field and artificial magnetic fields. Current knowledge suggests that EMFs from subsea cables may interact with migrating Anguilla sp. (and possibly other diadromous fishes) if their movement routes take them over the cables, particularly in shallow water (<20 m). The only known effect is a temporary change in swimming direction. 
Whether this will represent a biologically significant effect, for example delayed migration

  17. Projections of rising heat stress over the western Maritime Continent from dynamically downscaled climate simulations

    NASA Astrophysics Data System (ADS)

    Im, Eun-Soon; Kang, Suchul; Eltahir, Elfatih A. B.

    2018-06-01

    This study assesses the future changes in heat stress in response to different emission scenarios over the western Maritime Continent. To better resolve region-specific changes and to enhance the performance in simulating extreme events, the MIT Regional Climate Model with a 12-km horizontal resolution is used for the dynamical downscaling of three carefully selected CMIP5 global projections forced by two Representative Concentration Pathway (RCP4.5 and RCP8.5) scenarios. Daily maximum wet-bulb temperature (TWmax), which includes the effect of humidity, is examined to describe heat stress as regulated by future changes in temperature and humidity. An ensemble of projections reveals a robust pattern in which a large increase in temperature is accompanied by a reduction in relative humidity but a significant increase in wet-bulb temperature. This increase in TWmax is relatively smaller over flat and coastal regions than over mountainous regions. However, the flat and coastal regions, characterized by a warm and humid present-day climate, will be at risk even under a modest increase in TWmax. The regional extent exposed to higher TWmax and the number of days on which TWmax exceeds its threshold value are projected to be much higher under the RCP8.5 scenario than under RCP4.5, highlighting the importance of controlling greenhouse gas emissions to reduce the adverse impacts on human health and heat-related mortality.
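
    The abstract does not state how TWmax is computed; one common single-equation approximation for wet-bulb temperature from air temperature and relative humidity, valid near standard sea-level pressure, is Stull's (2011) empirical fit:

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Stull (2011) empirical wet-bulb temperature (deg C) from air
    temperature (deg C) and relative humidity (%); an approximation valid
    near standard sea-level pressure, not the study's own method."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

tw = wet_bulb_stull(30.0, 80.0)   # a warm, humid tropical afternoon
```

    Because wet-bulb temperature folds humidity into a single number, modest warming in an already humid climate can push TWmax past critical thresholds even as relative humidity falls slightly.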

  18. Ensembl comparative genomics resources

    PubMed Central

    Muffato, Matthieu; Beal, Kathryn; Fitzgerald, Stephen; Gordon, Leo; Pignatelli, Miguel; Vilella, Albert J.; Searle, Stephen M. J.; Amode, Ridwan; Brent, Simon; Spooner, William; Kulesha, Eugene; Yates, Andrew; Flicek, Paul

    2016-01-01

    Evolution provides the unifying framework with which to understand biology. The coherent investigation of genic and genomic data often requires comparative genomics analyses based on whole-genome alignments, sets of homologous genes and other relevant datasets in order to evaluate and answer evolutionary-related questions. However, the complexity and computational requirements of producing such data are substantial: this has led to only a small number of reference resources that are used for most comparative analyses. The Ensembl comparative genomics resources are one such reference set that facilitates comprehensive and reproducible analysis of chordate genome data. Ensembl computes pairwise and multiple whole-genome alignments from which large-scale synteny, per-base conservation scores and constrained elements are obtained. Gene alignments are used to define Ensembl Protein Families, GeneTrees and homologies for both protein-coding and non-coding RNA genes. These resources are updated frequently and have a consistent informatics infrastructure and data presentation across all supported species. Specialized web-based visualizations are also available including synteny displays, collapsible gene tree plots, a gene family locator and different alignment views. The Ensembl comparative genomics infrastructure is extensively reused for the analysis of non-vertebrate species by other projects including Ensembl Genomes and Gramene and much of the information here is relevant to these projects. The consistency of the annotation across species and the focus on vertebrates makes Ensembl an ideal system to perform and support vertebrate comparative genomic analyses. We use robust software and pipelines to produce reference comparative data and make it freely available. Database URL: http://www.ensembl.org. PMID:26896847

  19. Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France

    NASA Astrophysics Data System (ADS)

    Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; Graff, Benjamin

    2015-04-01

    This work proposes a daily high-resolution probabilistic reconstruction of precipitation and temperature fields in France over the last century built on the NOAA 20th century global extended atmospheric reanalysis (20CR, Compo et al., 2011). It aims at delivering appropriate meteorological forcings for continuous distributed hydrological modelling over the last 140 years. The longer term objective is to improve our knowledge of major historical hydrometeorological events having occurred outside of the last 50-year period, over which comprehensive reconstructions and observations are available. It would constitute a perfect framework for assessing the recent observed events but also future events projected by climate change impact studies. The Sandhy (Stepwise ANalogue Downscaling method for Hydrology) statistical downscaling method (Radanovics et al., 2013), initially developed for quantitative precipitation forecast, is used here to bridge the scale gap between 20CR predictors - temperature, geopotential shape, vertical velocity and relative humidity - and local predictands - precipitation and temperature - relevant for catchment-scale hydrology. Multiple predictor domains for geopotential shape are retained from a local optimisation over France using the Safran near-surface reanalysis (Vidal et al., 2010). Sandhy gives an ensemble of 125 equally plausible gridded precipitation and temperature time series over the whole 1871-2012 period. Previous studies showed that Sandhy precipitation outputs are very slightly biased at the annual time scale. Nevertheless, the seasonal precipitation signal for areas with a high interannual variability is not well simulated. Moreover, winter and summer temperatures are respectively over- and underestimated. Reliable seasonal precipitation and temperature signals are however necessary for hydrological modelling, especially for evapotranspiration and snow accumulation/snowmelt processes. Two different post-processing methods are
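
    The analogue principle behind Sandhy reduces to a short sketch: for each target day, select the k archive days whose large-scale predictors are closest and use their local observations as an ensemble. The distance metric and toy data below are illustrative; the actual method uses several predictors and locally optimised predictor domains:

```python
import numpy as np

def analogue_ensemble(target_pred, archive_preds, archive_obs, k=5):
    """Return local observations from the k archive days whose predictor
    vectors are closest (Euclidean distance) to the target day's."""
    dist = np.linalg.norm(archive_preds - target_pred, axis=1)
    nearest = np.argsort(dist)[:k]
    return archive_obs[nearest]

# Toy archive: 1-D predictor on a line, local obs equal to 10x the predictor
preds = np.arange(10.0).reshape(-1, 1)
obs = 10.0 * np.arange(10.0)
members = analogue_ensemble(np.array([4.2]), preds, obs, k=3)
```

    Repeating the selection with several plausible predictor configurations is one way to obtain the kind of 125-member equally plausible ensemble described above.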

  20. Use of dynamical downscaling to improve the simulation of Central U.S. warm season precipitation in CMIP5 models

    NASA Astrophysics Data System (ADS)

    Harding, Keith J.; Snyder, Peter K.; Liess, Stefan

    2013-11-01

    While supporting exceptionally productive agricultural lands, the Central U.S. is susceptible to severe droughts and floods. Such precipitation extremes are expected to worsen with climate change. However, future projections are highly uncertain as global climate models (GCMs) generally fail to resolve precipitation extremes. In this study, we assess how well models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) simulate the summer means, variability, extremes, and diurnal cycle of Central U.S. summer rainfall. Output from a subset of historical CMIP5 simulations is used to drive the Weather Research and Forecasting model to determine whether dynamical downscaling improves the representation of Central U.S. rainfall. We investigate which boundary conditions influence dynamically downscaled precipitation estimates and identify GCMs that can reasonably simulate precipitation when downscaled. The CMIP5 models simulate the seasonal mean and variability of summer rainfall reasonably well but fail to resolve extremes, the diurnal cycle, and the dynamic forcing of precipitation. Downscaling to 30 km improves these characteristics of precipitation, with the greatest improvement in the representation of extremes. Additionally, sizeable diurnal cycle improvements occur with higher (10 km) resolution and convective parameterization disabled, as the daily rainfall peak shifts 4 h closer to observations than in the 30-km simulations. This lends greater confidence that the mechanisms responsible for producing rainfall are better simulated. Because dynamical downscaling can more accurately simulate these aspects of Central U.S. summer rainfall, policymakers can have added confidence in dynamically downscaled rainfall projections, allowing for more targeted adaptation and mitigation.

  1. Impacts of calibration strategies and ensemble methods on ensemble flood forecasting over Lanjiang basin, Southeast China

    NASA Astrophysics Data System (ADS)

    Liu, Li; Xu, Yue-Ping

    2017-04-01

    Ensemble flood forecasting driven by numerical weather prediction products is becoming more commonly used in operational flood forecasting applications. In this study, a hydrological ensemble flood forecasting system based on the Variable Infiltration Capacity (VIC) model and quantitative precipitation forecasts from the TIGGE dataset is constructed for the Lanjiang Basin, Southeast China. The impacts of calibration strategies and ensemble methods on the performance of the system are then evaluated. The hydrological model is optimized by a parallel-programmed ɛ-NSGAII multi-objective algorithm, and two separately parameterized models are determined to simulate daily flows and peak flows, coupled with a modular approach. The results indicate that the ɛ-NSGAII algorithm permits more efficient optimization and rational determination of parameter settings. It is demonstrated that the multimodel ensemble streamflow mean has better skill than the best single-model ensemble mean (ECMWF), and that multimodel ensembles weighted by members and skill scores outperform other multimodel ensembles. For a typical flood event, the flood can be predicted 3-4 days in advance, but the flows in the rising limb can be captured only 1-2 days ahead due to their flashy nature. With respect to peak flows selected by a Peaks Over Threshold approach, the ensemble means from either a single model or multiple models are generally underestimated, as the extreme values are smoothed out by the ensemble process.
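
    Skill-weighted multimodel combination of the kind evaluated above can be sketched as follows; the model names and skill scores here are hypothetical, and the study's actual weighting scheme may differ:

```python
import numpy as np

def weighted_multimodel_mean(forecasts, skills):
    """Skill-weighted combination of per-model ensemble means.
    forecasts: {model: array of shape (members, times)};
    skills: {model: score}, e.g. hindcast skill scores."""
    names = list(forecasts)
    w = np.array([skills[m] for m in names], dtype=float)
    w /= w.sum()                                    # normalize the weights
    means = np.stack([forecasts[m].mean(axis=0) for m in names])
    return w @ means

# Toy case: two 3-member ensembles over 4 time steps, one trusted 3:1
fc = {"modelA": np.full((3, 4), 2.0), "modelB": np.full((3, 4), 4.0)}
combined = weighted_multimodel_mean(fc, {"modelA": 3.0, "modelB": 1.0})
```

    With weights 0.75 and 0.25, the combined forecast sits closer to the more skillful model, which is the behavior the weighted ensembles above exploit.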

  2. Assessment of the scale effect on statistical downscaling quality at a station scale using a weather generator-based model

    USDA-ARS?s Scientific Manuscript database

    The resolution of General Circulation Models (GCMs) is too coarse to assess the fine scale or site-specific impacts of climate change. Downscaling approaches including dynamical and statistical downscaling have been developed to meet this requirement. As the resolution of climate model increases, it...

  3. Ensembl comparative genomics resources.

    PubMed

    Herrero, Javier; Muffato, Matthieu; Beal, Kathryn; Fitzgerald, Stephen; Gordon, Leo; Pignatelli, Miguel; Vilella, Albert J; Searle, Stephen M J; Amode, Ridwan; Brent, Simon; Spooner, William; Kulesha, Eugene; Yates, Andrew; Flicek, Paul

    2016-01-01

    Evolution provides the unifying framework with which to understand biology. The coherent investigation of genic and genomic data often requires comparative genomics analyses based on whole-genome alignments, sets of homologous genes and other relevant datasets in order to evaluate and answer evolutionary-related questions. However, the complexity and computational requirements of producing such data are substantial: this has led to only a small number of reference resources that are used for most comparative analyses. The Ensembl comparative genomics resources are one such reference set that facilitates comprehensive and reproducible analysis of chordate genome data. Ensembl computes pairwise and multiple whole-genome alignments from which large-scale synteny, per-base conservation scores and constrained elements are obtained. Gene alignments are used to define Ensembl Protein Families, GeneTrees and homologies for both protein-coding and non-coding RNA genes. These resources are updated frequently and have a consistent informatics infrastructure and data presentation across all supported species. Specialized web-based visualizations are also available including synteny displays, collapsible gene tree plots, a gene family locator and different alignment views. The Ensembl comparative genomics infrastructure is extensively reused for the analysis of non-vertebrate species by other projects including Ensembl Genomes and Gramene and much of the information here is relevant to these projects. The consistency of the annotation across species and the focus on vertebrates makes Ensembl an ideal system to perform and support vertebrate comparative genomic analyses. We use robust software and pipelines to produce reference comparative data and make it freely available. Database URL: http://www.ensembl.org. © The Author(s) 2016. Published by Oxford University Press.

  4. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    NASA Astrophysics Data System (ADS)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes, and they therefore trigger an increased importance of climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios, and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties must be assessed. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function, and advanced perturbation-based approaches. Through an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated for cases in Belgium, Suriname, Vietnam, and Bangladesh. Time series representing future local-scale precipitation, temperature, and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties on the future meteorological conditions are represented in two different ways: through an ensemble of time series, and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily
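
    A perturbation-based method of the kind the tool combines can be illustrated with quantile-dependent change factors applied to an observed series; the factors and quantile edges below are hypothetical, not values from the tool:

```python
import numpy as np

def quantile_perturbation(obs, factors, edges):
    """Perturb an observed series with quantile-dependent change factors:
    values in higher empirical quantiles are amplified more, mimicking
    intensified extremes (illustrative factors only)."""
    ranks = np.searchsorted(np.sort(obs), obs) / len(obs)
    return obs * np.asarray(factors)[np.digitize(ranks, edges)]

obs = np.arange(1.0, 11.0)   # toy daily precipitation series
future = quantile_perturbation(obs, [1.0, 1.1, 1.3], [0.5, 0.9])
```

    Here values below the median are left unchanged, values between the 50th and 90th empirical percentiles are scaled by 1.1, and the top decile by 1.3.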

  5. Precipitation projections under GCMs perspective and Turkish Water Foundation (TWF) statistical downscaling model procedures

    NASA Astrophysics Data System (ADS)

    Dabanlı, İsmail; Şen, Zekai

    2018-04-01

    The statistical climate downscaling model by the Turkish Water Foundation (TWF) is further developed and applied to a set of monthly precipitation records. The model is structured in two phases, spatial (regional) and temporal downscaling of global circulation model (GCM) scenarios. The TWF model takes into consideration the regional dependence function (RDF) for the spatial structure and the Markov whitening process (MWP) for the temporal characteristics of the records to set projections. The impact of climate change on monthly precipitation is studied by downscaling Intergovernmental Panel on Climate Change-Special Report on Emission Scenarios (IPCC-SRES) A2 and B2 emission scenarios from the Max Planck Institute (EH40PYC) and the Hadley Centre (HadCM3). The main purposes are to explain the TWF statistical climate downscaling model procedures and to present the validation tests, which are rated as "very good" for all stations except one (Suhut) in the Akarcay basin in the west central part of Turkey. Even though the validation score is slightly lower at the Suhut station, the results there are still "satisfactory." It is therefore possible to say that the TWF model has reasonably good skill for accurate estimation with respect to the standard deviation ratio (SDR), Nash-Sutcliffe efficiency (NSE), and percent bias (PBIAS) criteria. Based on the validated model, precipitation predictions are generated from 2011 to 2100 using a 30-year reference observation period (1981-2010). Precipitation arithmetic averages and standard deviations have less than 5% error for the EH40PYC and HadCM3 SRES (A2 and B2) scenarios.

  6. Spatial Scale Gap Filling Using an Unmanned Aerial System: A Statistical Downscaling Method for Applications in Precision Agriculture.

    PubMed

    Hassan-Esfahani, Leila; Ebtehaj, Ardeshir M; Torres-Rua, Alfonso; McKee, Mac

    2017-09-14

    Applications of satellite-borne observations in precision agriculture (PA) are often limited due to the coarse spatial resolution of satellite imagery. This paper uses high-resolution airborne observations to increase the spatial resolution of satellite data for related applications in PA. A new variational downscaling scheme is presented that uses coincident aerial imagery products from "AggieAir", an unmanned aerial system, to increase the spatial resolution of Landsat satellite data. This approach is primarily tested for downscaling individual-band Landsat images that can be used to derive the normalized difference vegetation index (NDVI) and surface soil moisture (SSM). Quantitative and qualitative results demonstrate promising capabilities of the downscaling approach, enabling an effective increase of the spatial resolution of Landsat imagery by factors of 2 to 4. Specifically, the downscaling scheme retrieved the missing high-resolution features of the imagery and reduced the root mean squared error by 15, 11, and 10 percent in the visible, near-infrared, and thermal-infrared bands, respectively. This metric is reduced by 9% in the derived NDVI and remains negligible for the soil moisture products.

  7. Spatial Scale Gap Filling Using an Unmanned Aerial System: A Statistical Downscaling Method for Applications in Precision Agriculture

    PubMed Central

    Hassan-Esfahani, Leila; Ebtehaj, Ardeshir M.; McKee, Mac

    2017-01-01

    Applications of satellite-borne observations in precision agriculture (PA) are often limited due to the coarse spatial resolution of satellite imagery. This paper uses high-resolution airborne observations to increase the spatial resolution of satellite data for related applications in PA. A new variational downscaling scheme is presented that uses coincident aerial imagery products from "AggieAir", an unmanned aerial system, to increase the spatial resolution of Landsat satellite data. This approach is primarily tested for downscaling individual-band Landsat images that can be used to derive the normalized difference vegetation index (NDVI) and surface soil moisture (SSM). Quantitative and qualitative results demonstrate promising capabilities of the downscaling approach, enabling an effective increase of the spatial resolution of Landsat imagery by factors of 2 to 4. Specifically, the downscaling scheme retrieved the missing high-resolution features of the imagery and reduced the root mean squared error by 15, 11, and 10 percent in the visible, near-infrared, and thermal-infrared bands, respectively. This metric is reduced by 9% in the derived NDVI and remains negligible for the soil moisture products. PMID:28906428

  8. EnsembleGraph: Interactive Visual Analysis of Spatial-Temporal Behavior for Ensemble Simulation Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shu, Qingya; Guo, Hanqi; Che, Limei

    We present a novel visualization framework, EnsembleGraph, for analyzing ensemble simulation data, in order to help scientists understand behavior similarities between ensemble members over space and time. A graph-based representation is used to visualize individual spatiotemporal regions with similar behaviors, which are extracted by hierarchical clustering algorithms. A user interface with multiple linked views is provided, which enables users to explore, locate, and compare regions that have similar behaviors, and then to investigate and analyze the selected regions in detail. The driving application of this paper is the study of regional emission influences on tropospheric ozone, based on ensemble simulations conducted with different anthropogenic emission absences using the MOZART-4 (model of ozone and related tracers, version 4) model. We demonstrate the effectiveness of our method by visualizing the MOZART-4 ensemble simulation data and evaluating the relative regional emission influences on tropospheric ozone concentrations. Positive feedback from domain experts and two case studies demonstrate the efficiency of our method.
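
    The region-grouping step can be illustrated with a simple correlation-threshold clustering of time series; this greedy sketch is only a stand-in for the paper's hierarchical clustering, with toy signals in place of simulation output:

```python
import numpy as np

def cluster_by_correlation(series, threshold=0.8):
    """Greedily group rows whose time series correlate above a threshold;
    a simplified stand-in for hierarchical clustering of regions."""
    corr = np.corrcoef(series)
    labels = np.full(len(series), -1)
    next_label = 0
    for i in range(len(series)):
        if labels[i] == -1:              # start a new cluster
            labels[i] = next_label
            next_label += 1
        for j in range(i + 1, len(series)):
            if labels[j] == -1 and corr[i, j] > threshold:
                labels[j] = labels[i]    # join i's cluster
    return labels

# Toy: six regions driven by two distinct signals plus small noise
rng = np.random.default_rng(1)
sig_a, sig_b = rng.normal(size=50), rng.normal(size=50)
regions = np.stack([sig_a + 0.05 * rng.normal(size=50) for _ in range(3)]
                   + [sig_b + 0.05 * rng.normal(size=50) for _ in range(3)])
labels = cluster_by_correlation(regions)
```

    The resulting labels group regions with similar temporal behavior, which is what the graph nodes in such a framework would represent.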

  9. Downscaling an Eddy-Resolving Global Model for the Continental Shelf off South Eastern Australia

    NASA Astrophysics Data System (ADS)

    Roughan, M.; Baird, M.; MacDonald, H.; Oke, P.

    2008-12-01

    The Australian Bluelink collaboration between CSIRO, the Bureau of Meteorology and the Royal Australian Navy has made available to the research community the output of BODAS (Bluelink ocean data assimilation system), an ensemble optimal interpolation reanalysis system with ~10 km resolution around Australia. Within the Bluelink project, BODAS fields are assimilated into a dynamic ocean model of the same resolution to produce BRAN (BlueLink ReANalysis, a hindcast of water properties around Australia from 1992 to 2004). In this study, BODAS hydrographic fields are assimilated into a ~ 3 km resolution Princeton Ocean Model (POM) configuration of the coastal ocean off SE Australia. Experiments were undertaken to establish the optimal strength and duration of the assimilation of BODAS fields into the 3 km resolution POM configuration for the purpose of producing hindcasts of ocean state. It is shown that the resultant downscaling of Bluelink products is better able to reproduce coastal features, particularly velocities and hydrography over the continental shelf off south eastern Australia. The BODAS-POM modelling system is used to provide a high-resolution simulation of the East Australian Current over the period 1992 to 2004. One of the applications that we will present is an investigation of the seasonal and inter-annual variability in the dispersion of passive particles in the East Australian Current. The practical outcome is an estimate of the connectivity of estuaries along the coast of southeast Australia, which is relevant for the dispersion of marine pests.

  10. Measuring social interaction in music ensembles

    PubMed Central

    D'Ausilio, Alessandro; Badino, Leonardo; Camurri, Antonio; Fadiga, Luciano

    2016-01-01

    Music ensembles are an ideal test-bed for quantitative analysis of social interaction. Music is an inherently social activity, and music ensembles offer a broad variety of scenarios which are particularly suitable for investigation. Small ensembles, such as string quartets, are deemed a significant example of self-managed teams, where all musicians contribute equally to a task. In bigger ensembles, such as orchestras, the relationship between a leader (the conductor) and a group of followers (the musicians) clearly emerges. This paper presents an overview of recent research on social interaction in music ensembles with a particular focus on (i) studies from cognitive neuroscience; and (ii) studies adopting a computational approach for carrying out automatic quantitative analysis of ensemble music performances. PMID:27069054

  11. Measuring social interaction in music ensembles.

    PubMed

    Volpe, Gualtiero; D'Ausilio, Alessandro; Badino, Leonardo; Camurri, Antonio; Fadiga, Luciano

    2016-05-05

    Music ensembles are an ideal test-bed for quantitative analysis of social interaction. Music is an inherently social activity, and music ensembles offer a broad variety of scenarios which are particularly suitable for investigation. Small ensembles, such as string quartets, are deemed a significant example of self-managed teams, where all musicians contribute equally to a task. In bigger ensembles, such as orchestras, the relationship between a leader (the conductor) and a group of followers (the musicians) clearly emerges. This paper presents an overview of recent research on social interaction in music ensembles with a particular focus on (i) studies from cognitive neuroscience; and (ii) studies adopting a computational approach for carrying out automatic quantitative analysis of ensemble music performances. © 2016 The Author(s).

  12. Statistical Ensemble of Large Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Carati, Daniele; Rogers, Michael M.; Wray, Alan A.; Mansour, Nagi N. (Technical Monitor)

    2001-01-01

    A statistical ensemble of large eddy simulations (LES) is run simultaneously for the same flow. The information provided by the different large scale velocity fields is used to propose an ensemble averaged version of the dynamic model. This produces local model parameters that only depend on the statistical properties of the flow. An important property of the ensemble averaged dynamic procedure is that it does not require any spatial averaging and can thus be used in fully inhomogeneous flows. Also, the ensemble of LESs provides statistics of the large scale velocity that can be used for building new models for the subgrid-scale stress tensor. The ensemble averaged dynamic procedure has been implemented with various models for three flows: decaying isotropic turbulence, forced isotropic turbulence, and the time-developing plane wake. It is found that the results are almost independent of the number of LESs in the statistical ensemble provided that the ensemble contains at least 16 realizations.

  13. A downscaled 1 km dataset of daily Greenland ice sheet surface mass balance components (1958-2014)

    NASA Astrophysics Data System (ADS)

    Noel, B.; Van De Berg, W. J.; Fettweis, X.; Machguth, H.; Howat, I. M.; van den Broeke, M. R.

    2015-12-01

    The current spatial resolution in regional climate models (RCMs), typically around 5 to 20 km, remains too coarse to accurately reproduce the spatial variability in surface mass balance (SMB) components over the narrow ablation zones, marginal outlet glaciers and neighbouring ice caps of the Greenland ice sheet (GrIS). In these topographically rough terrains, the SMB components are highly dependent on local variations in topography. However, the relatively low-resolution elevation and ice mask prescribed in RCMs contribute to a significant underestimation of melt and runoff in these regions, owing to unresolved valley glaciers and fjords. Therefore, near-km resolution topography is essential to better capture SMB variability in these spatially restricted regions. We present a 1 km resolution dataset of daily GrIS SMB covering the period 1958-2014, which is statistically downscaled from data of the polar regional climate model RACMO2.3 at 11 km, using an elevation dependence. The dataset includes all individual SMB components projected on the elevation and ice mask from the GIMP DEM, down-sampled to 1 km. Daily runoff and sublimation are interpolated to the 1 km topography using a local regression to elevation valid for each day specifically; daily precipitation is bi-linearly downscaled without elevation corrections. The daily SMB dataset is then reconstructed by summing downscaled precipitation, sublimation and runoff. High-resolution elevation and ice mask allow for properly resolving the narrow ablation zones and valley glaciers at the GrIS margins, leading to a significant increase in estimated runoff. In these regions, and especially over narrow glacier tongues, the downscaled products improve on the original RACMO2.3 outputs by better representing local SMB patterns through a gradual ablation increase towards the GrIS margins. We discuss the impact of downscaling on the SMB components in a case study for a spatially restricted region, where large elevation
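    The elevation-dependence step described in this record (a daily regression of runoff on elevation, evaluated on the 1 km topography) can be sketched as follows. This is an illustrative toy with made-up numbers, not the RACMO2.3 pipeline:

```python
def linfit(xs, ys):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def downscale_runoff(coarse_elev, coarse_runoff, fine_elev):
    """Regress runoff on elevation at the coarse grid (one day shown),
    evaluate at the high-resolution topography, clip runoff at zero."""
    a, b = linfit(coarse_elev, coarse_runoff)
    return [max(0.0, a + b * z) for z in fine_elev]

# Hypothetical 11 km cells: runoff (mm/day) decreasing with elevation (m).
coarse_elev = [200.0, 600.0, 1000.0, 1400.0]
coarse_runoff = [8.0, 6.0, 4.0, 2.0]
# 1 km points resolving a valley glacier below the lowest coarse cell.
fine = downscale_runoff(coarse_elev, coarse_runoff, [100.0, 800.0, 2000.0])
print(fine)  # low elevations get extra melt; high points clip to zero
```

    In the real dataset the regression is local (fitted from nearby 11 km cells) and recomputed for each day, while precipitation is bi-linearly interpolated without an elevation correction.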

  14. Examining the Performance of Statistical Downscaling Methods: Toward Matching Applications to Data Products

    NASA Astrophysics Data System (ADS)

    Dixon, K. W.; Lanzante, J. R.; Adams-Smith, D.

    2017-12-01

    Several challenges exist when seeking to use future climate model projections in a climate impacts study. A not uncommon approach is to utilize climate projection data sets derived from more than one future emissions scenario and from multiple global climate models (GCMs). The range of future climate responses represented in the set is sometimes taken to be indicative of levels of uncertainty in the projections. Yet GCM outputs are deemed unsuitable for direct use in many climate impacts applications. GCM grids typically are viewed as being too coarse. Additionally, regional or local-scale biases in a GCM's simulation of the contemporary climate that may not be problematic from a global climate modeling perspective may be unacceptably large for a climate impacts application. Statistical downscaling (SD) of climate projections - a type of post-processing that uses observations to inform the refinement of GCM projections - is often used in an attempt to account for GCM biases and to provide additional spatial detail. "What downscaled climate projection is the best one to use" is a frequently asked question, but one that is not always easy to answer, as it can be dependent on stakeholder needs and expectations. Here we present results from a perfect model experimental design illustrating that performance can vary not only by SD method, but also by location, season, climate variable of interest, amount of projected climate change, SD configuration choices, and whether one is interested in central tendencies or the tails of the distribution. Awareness of these factors can be helpful when seeking to determine the suitability of downscaled climate projections for specific climate impacts applications. It also points to the potential value of considering more than one SD data product in a study, so as to acknowledge uncertainties associated with the strengths and weaknesses of different downscaling methods.

  15. Downscaling CESM1 climate change projections for the MENA-CORDEX domain using WRF

    NASA Astrophysics Data System (ADS)

    Zittis, George; Hadjinicolaou, Panos; Lelieveld, Jos

    2017-04-01

    According to analyses of observations and global climate model projections, the broader Middle East, North Africa and Mediterranean region is a climate change hotspot. Substantial changes in precipitation amounts and patterns and strong summer warming (including an intensification of heat extremes) are a likely future scenario for the region, although a recent uncertainty analysis indicated good model agreement for temperature but much less for precipitation. Although the horizontal resolution of global models has increased over recent years, it is still not adequate for impact and adaptation assessments at regional or national levels, and further downscaling of the climate information is required. The region is now studied within the CORDEX initiative (Coordinated Regional Climate Downscaling Experiment) through the establishment of a domain covering the Middle East - North Africa (MENA-CORDEX) region (http://mena-cordex.cyi.ac.cy/). In this study, we present the first climate change projections for the MENA domain produced by dynamically downscaling bias-corrected output of the CESM1 global earth system model. For the downscaling, we use a climate configuration of the Weather Research and Forecasting (WRF) model. We perform three simulations on the standard CORDEX Phase I 50-km grid: a historical run (1950-2005) and two scenario runs (2006-2100) with greenhouse gas forcing following RCP4.5 and RCP8.5. We evaluate precipitation, temperature and other surface meteorological variables from the historical run using gridded and station observational datasets. Maps of projected changes are constructed for different periods in the future as differences between the two scenario runs and the historical run.
The main spatial and temporal patterns of change are discussed, especially in the context of the United Nations Framework Convention on Climate Change agreement in Paris to limit the global average temperature increase to 1.5 degrees above pre-industrial levels.

  16. Multi-Site and Multi-Variables Statistical Downscaling Technique in the Monsoon Dominated Region of Pakistan

    NASA Astrophysics Data System (ADS)

    Khan, Firdos; Pilz, Jürgen

    2016-04-01

    South Asia is under severe impacts of climate change and global warming. The last two decades have shown that climate change is happening, and the first decade of the 21st century was the warmest ever recorded over Pakistan, where the temperature reached 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation have been adversely affected, causing floods, cyclones and hurricanes in the region, which in turn affect agriculture, water, health, etc. To cope with this situation, it is important to conduct impact assessment studies and adopt adaptation and mitigation measures. Impact assessment studies require climate variables at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon-dominated region of Pakistan, chosen in part because monsoon rains contribute more than 80% of the total rainfall in this area. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, i.e. quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions will be used. Advantages of this new method are: (1) it is more robust to outliers than ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites; and (3) it can be used to combine different types of distributions. 
This is important in our case because we are dealing with climatic variables having different distributions over different meteorological
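    The quantile-regression building block mentioned in this record minimizes the pinball (check) loss. A minimal scalar illustration with a synthetic sample (not the study's multi-site model) shows that the loss minimizer is an empirical quantile and is robust to an outlier:

```python
def pinball_loss(q_hat, ys, tau):
    """Check loss for quantile level tau at candidate value q_hat."""
    return sum((tau if y >= q_hat else tau - 1.0) * (y - q_hat) for y in ys)

def fit_quantile(ys, tau):
    """Scalar 'quantile regression': the data point minimizing the loss."""
    return min(ys, key=lambda c: pinball_loss(c, ys, tau))

sample = [1.0, 2.0, 3.0, 4.0, 10.0]   # skewed sample with an outlier at 10
print(fit_quantile(sample, 0.5))      # → 3.0, the median, unmoved by the outlier
print(fit_quantile(sample, 0.9))      # → 10.0, an upper-tail quantile
```

    In a full quantile-regression model the same loss is minimized over regression coefficients rather than a single scalar, which is what gives the method its robustness relative to ordinary least squares.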

  17. Statistical downscaling of CMIP5 outputs for projecting future maximum and minimum temperature over the Haihe River Basin, China

    NASA Astrophysics Data System (ADS)

    Yan, Tiezhu; Shen, Zhenyao; Heng, Lee; Dercon, Gerd

    2016-04-01

    Future climate change information is important for formulating adaptation and mitigation strategies. In this study, a statistical downscaling model (SDSM) was established using both NCEP reanalysis data and ground observations (daily maximum and minimum temperature) for the period 1971-2010, and the calibrated model was then applied to generate future maximum and minimum temperature projections using predictors from two CMIP5 models (MPI-ESM-LR and CNRM-CM5) under two Representative Concentration Pathways (RCP2.6 and RCP8.5) for the period 2011-2100 over the Haihe River Basin, China. Relative to the baseline period, future changes in annual and seasonal maximum and minimum temperature were computed after bias correction. The spatial distribution and trends of annual maximum and minimum temperature were also analyzed using ensemble projections. The results show that: (1) the downscaling model reproduced daily and monthly mean maximum and minimum temperature well over the whole basin; (2) bias was observed when using historical predictors from the CMIP5 models, and the performance of CNRM-CM5 was slightly worse than that of MPI-ESM-LR; (3) annual mean maximum and minimum temperature under the two scenarios in the 2020s, 2050s and 2070s will increase, with larger increases for maximum than for minimum temperature; (4) the increase in temperature in the mountains and along the coastline is markedly higher than in other parts of the study basin; (5) for annual maximum and minimum temperature, a significant upward trend of 0.37 and 0.39 °C per decade, respectively, is projected under the RCP8.5 scenario; under the RCP2.6 scenario the trend is upward in the 2020s and then decreases in the 2050s and 2070s, with a magnitude of 0.01 °C per decade for both.

  18. Joys of Community Ensemble Playing: The Case of the Happy Roll Elastic Ensemble in Taiwan

    ERIC Educational Resources Information Center

    Hsieh, Yuan-Mei; Kao, Kai-Chi

    2012-01-01

    The Happy Roll Elastic Ensemble (HREE) is a community music ensemble supported by Tainan Culture Centre in Taiwan. With enjoyment and friendship as its primary goals, it aims to facilitate the joys of ensemble playing and the spirit of social networking. This article highlights the key aspects of HREE's development in its first two years…

  19. Climatological attribution of wind power ramp events in East Japan and their probabilistic forecast based on multi-model ensembles downscaled by analog ensemble using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Ohba, Masamichi; Nohara, Daisuke; Kadokura, Shinji

    2016-04-01

    Severe storms or other extreme weather events can interrupt the operation of wind turbines on a large scale, causing unexpected "wind ramp events". In this study, we present an application of self-organizing maps (SOMs) for climatological attribution of wind ramp events and their probabilistic prediction. The SOM is an automatic data-mining clustering technique which allows us to summarize a high-dimensional data space in terms of a set of reference vectors. The SOM is applied to analyze and connect the relationship between atmospheric patterns over Japan and wind power generation. The SOM is employed on sea level pressure derived from the JRA55 reanalysis over the target area (the Tohoku region of Japan), whereby a two-dimensional lattice of weather patterns (WPs) classified over the 1977-2013 period is obtained. For comparison with the atmospheric data, long-term wind power generation is reconstructed using the high-resolution surface observation network AMeDAS (Automated Meteorological Data Acquisition System) in Japan. Our analysis extracts seven typical WPs which are linked to frequent occurrences of wind ramp events. Probabilistic forecasts of wind power generation and ramps are conducted using the obtained SOM. The probabilities are derived from multiple SOM lattices based on the matching of output from the TIGGE multi-model global forecasts to the WPs on the lattices. Since this method effectively accounts for the empirical uncertainties in the historical data, wind power generation and ramps are probabilistically forecast from the global model forecasts. The forecasts of wind power generation and ramp events show relatively good skill scores under this downscaling technique. It is expected that the results of this study will provide better guidance to the user community and contribute to the future development of a system operation model for the transmission grid operator.
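    The core SOM training loop can be sketched in a few lines. This toy fits a small one-dimensional lattice to two artificial "pressure patterns"; the data, lattice size and parameters are invented, not the JRA55/TIGGE configuration:

```python
import math
import random

def train_som(data, n_units=4, epochs=200, lr0=0.5, sigma0=1.5, seed=0):
    """Train a 1-D self-organizing map: the best-matching unit (BMU) and
    its lattice neighbours move toward each presented sample, with the
    learning rate and neighbourhood width shrinking over time."""
    rng = random.Random(seed)
    dim = len(data[0])
    units = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n_units)]
    for t in range(epochs):
        frac = t / epochs
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 1e-3
        x = rng.choice(data)
        bmu = min(range(n_units),
                  key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))
        for i in range(n_units):
            h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))  # neighbourhood
            units[i] = [u + lr * h * (v - u) for u, v in zip(units[i], x)]
    return units

def classify(units, x):
    """Index of the best-matching weather pattern for sample x."""
    return min(range(len(units)),
               key=lambda i: sum((u - v) ** 2 for u, v in zip(units[i], x)))

# Two artificial 'patterns', e.g. low vs high pressure anomalies.
data = [[-1.0, -0.8], [-0.9, -1.1], [1.0, 0.9], [1.1, 1.2]]
som = train_som(data)
print(classify(som, [-1.0, -1.0]), classify(som, [1.0, 1.0]))
```

    After training, opposite anomalies map to different reference vectors, which is how samples (or global-model forecasts) are matched to WPs on the lattice.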

  20. Hydrologic Implications of Dynamical and Statistical Approaches to Downscaling Climate Model Outputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wood, Andrew W; Leung, Lai R; Sridhar, V

    Six approaches for downscaling climate model outputs for use in hydrologic simulation were evaluated, with particular emphasis on each method's ability to produce precipitation and other variables used to drive a macroscale hydrology model applied at much higher spatial resolution than the climate model. Comparisons were made on the basis of a twenty-year retrospective (1975–1995) climate simulation produced by the NCAR-DOE Parallel Climate Model (PCM), and the implications of the comparison for a future (2040–2060) PCM climate scenario were also explored. The six approaches comprised three relatively simple statistical downscaling methods – linear interpolation (LI), spatial disaggregation (SD), and bias-correction and spatial disaggregation (BCSD) – each applied both to PCM output directly (at T42 spatial resolution) and after dynamical downscaling via a Regional Climate Model (RCM, at ½-degree spatial resolution), to downscale the climate model outputs to the 1/8-degree spatial resolution of the hydrological model. For the retrospective climate simulation, results were compared to an observed gridded climatology of temperature and precipitation, and to gridded hydrologic variables obtained by forcing the hydrologic model with observations. The most significant finding is that the BCSD method successfully reproduced the main features of the observed hydrometeorology from the retrospective climate simulation when applied to both PCM and RCM outputs. Linear interpolation produced better results using RCM output than PCM output, but both methods (PCM-LI and RCM-LI) led to unacceptably biased hydrologic simulations. Spatial disaggregation of the PCM output produced results similar to those achieved with the RCM interpolated output; nonetheless, neither PCM nor RCM output was useful for hydrologic simulation purposes without a bias-correction step. For the future climate scenario, only the BCSD method (using PCM or RCM) was able
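    The bias-correction step at the heart of BCSD maps each simulated value through the model's empirical distribution onto the observed one. A minimal rank-based sketch with synthetic numbers (not PCM output; assumes equal-length climatologies):

```python
from bisect import bisect_left

def quantile_map(value, model_clim, obs_clim):
    """Empirical quantile mapping: find the value's rank in the sorted
    model climatology and return the observed value at the same rank."""
    m = sorted(model_clim)
    o = sorted(obs_clim)
    rank = min(bisect_left(m, value), len(o) - 1)
    return o[rank]

# Model climatology is systematically 2 mm/day wetter than observed.
model = [2.0, 4.0, 6.0, 8.0, 10.0]
obs = [0.0, 2.0, 4.0, 6.0, 8.0]
corrected = [quantile_map(v, model, obs) for v in model]
print(corrected)  # → [0.0, 2.0, 4.0, 6.0, 8.0]: the wet bias is removed
```

    Production implementations interpolate between quantiles and handle values outside the training range; this sketch only shows the quantile-to-quantile mapping idea.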

  1. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    NASA Astrophysics Data System (ADS)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information for assessing the impacts of climate change at regional and global scales. Statistical downscaling methods are applied to prepare climate model data for applications such as hydrologic and ecologic modelling at the watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data for regional modelling systems. However, inconsistencies among these products - for example, different combinations of climate variables, varying data domains and lengths, and accuracy that varies with the physiographic characteristics of the landscape - pose significant challenges in selecting the most suitable reference climate data for environmental studies and modelling. Employing several observation-based daily gridded climate products available in the public domain, i.e. thin-plate-spline regression products (ANUSPLIN and TPS), an inverse-distance method (Alberta Townships), a numerical climate model (the North American Regional Reanalysis) and an optimum interpolation technique (the Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations of precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available products, with station elevations discretized into several classes. According to the rank of the climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system
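    In outline, the ranking procedure scores each gridded product against station observations within each elevation class. A sketch with hypothetical station values and product names (the real study uses AHCCD stations and the products listed above):

```python
import math

def rank_products(stations, products):
    """Rank candidate gridded products by RMSE against station
    observations, separately for each elevation class."""
    ranks = {}
    for c in sorted({s["class"] for s in stations}):
        sub = [s for s in stations if s["class"] == c]
        rmse = {}
        for name, values in products.items():
            errs = [(values[s["id"]] - s["obs"]) ** 2 for s in sub]
            rmse[name] = math.sqrt(sum(errs) / len(errs))
        ranks[c] = sorted(rmse, key=rmse.get)  # best (lowest RMSE) first
    return ranks

# Hypothetical stations with low/high elevation classes.
stations = [
    {"id": 0, "class": "low", "obs": 10.0},
    {"id": 1, "class": "low", "obs": 12.0},
    {"id": 2, "class": "high", "obs": 30.0},
]
# Product A accurate at low elevations, product B at high elevations.
products = {"A": {0: 10.1, 1: 11.9, 2: 24.0}, "B": {0: 12.0, 1: 14.0, 2: 29.5}}
best = rank_products(stations, products)
print(best)  # → {'high': ['B', 'A'], 'low': ['A', 'B']}
```

    The outcome mirrors the study's conclusion that the most reliable product can differ by elevation class, so the recommended reference depends on the elevation of the target point.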

  2. Ensemble Data Mining Methods

    NASA Technical Reports Server (NTRS)

    Oza, Nikunj C.

    2004-01-01

    Ensemble Data Mining Methods, also known as Committee Methods or Model Combiners, are machine learning methods that leverage the power of multiple models to achieve better prediction accuracy than any of the individual models could on their own. The basic goal when designing an ensemble is the same as when establishing a committee of people: each member of the committee should be as competent as possible, but the members should be complementary to one another. If the members are not complementary, i.e., if they always agree, then the committee is unnecessary---any one member is sufficient. If the members are complementary, then when one or a few members make an error, the probability is high that the remaining members can correct this error. Research in ensemble methods has largely revolved around designing ensembles consisting of competent yet complementary models.
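    The committee intuition is easy to demonstrate: three members that each err on a different case are corrected by majority vote (hypothetical predictions):

```python
def majority_vote(predictions):
    """Combine member predictions case-by-case by majority vote."""
    return [max(set(col), key=col.count) for col in zip(*predictions)]

truth = [1, 0, 1, 1, 0, 1]
# Complementary members: each is wrong on a different single case.
m1 = [0, 0, 1, 1, 0, 1]   # errs on case 0
m2 = [1, 1, 1, 1, 0, 1]   # errs on case 1
m3 = [1, 0, 0, 1, 0, 1]   # errs on case 2
ensemble = majority_vote([m1, m2, m3])
print(ensemble == truth)  # → True: every individual error is outvoted
```

    Had the members always agreed (non-complementary), the vote would add nothing; it is the diversity of errors that lets the committee outperform its members.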

  3. Assessing the impact of land use change on hydrology by ensemble modelling (LUCHEM) II: Ensemble combinations and predictions

    USGS Publications Warehouse

    Viney, N.R.; Bormann, H.; Breuer, L.; Bronstert, A.; Croke, B.F.W.; Frede, H.; Graff, T.; Hubrechts, L.; Huisman, J.A.; Jakeman, A.J.; Kite, G.W.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Willems, P.

    2009-01-01

    This paper reports on a project to compare predictions from a range of catchment models applied to a mesoscale river basin in central Germany and to assess various ensemble predictions of catchment streamflow. The models encompass a large range in inherent complexity and input requirements. In approximate order of decreasing complexity, they are DHSVM, MIKE-SHE, TOPLATS, WASIM-ETH, SWAT, PRMS, SLURP, HBV, LASCAM and IHACRES. The models are calibrated twice using different sets of input data. The two predictions from each model are then combined by simple averaging to produce a single-model ensemble. The 10 resulting single-model ensembles are combined in various ways to produce multi-model ensemble predictions. Both the single-model ensembles and the multi-model ensembles are shown to give predictions that are generally superior to those of their respective constituent models, both during a 7-year calibration period and a 9-year validation period. This occurs despite a considerable disparity in performance of the individual models. Even the weakest of models is shown to contribute useful information to the ensembles it is part of. The best model combination methods are a trimmed mean (constructed using the central four or six predictions each day) and a weighted mean ensemble (with weights calculated from calibration performance) that places relatively large weights on the better performing models. Conditional ensembles, in which separate model weights are used in different system states (e.g. summer and winter, high and low flows), generally yield little improvement over the weighted mean ensemble. However, a conditional ensemble that discriminates between rising and receding flows shows moderate improvement. An analysis of ensemble predictions shows that the best ensembles are not necessarily those containing the best individual models. Conversely, it appears that some models that predict well individually do not necessarily combine well with other models in
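    The two best-performing combination methods named above, a central trimmed mean and a skill-weighted mean, can be sketched as follows (hypothetical streamflow values and skill weights, not the LUCHEM models):

```python
def trimmed_mean(preds, keep=4):
    """Mean of the central `keep` predictions, discarding extremes."""
    s = sorted(preds)
    drop = (len(s) - keep) // 2
    central = s[drop:len(s) - drop]
    return sum(central) / len(central)

def weighted_mean(preds, skills):
    """Weighted mean with weights proportional to calibration skill."""
    total = sum(skills)
    return sum(p * w for p, w in zip(preds, skills)) / total

# Hypothetical daily streamflow predictions from 6 models (m3/s).
preds = [10.0, 12.0, 11.0, 50.0, 13.0, 9.0]   # one model is a wild outlier
print(trimmed_mean(preds))                     # → 11.5, outlier trimmed away
# Calibration-skill weights: the outlier model scored poorly in calibration.
skills = [0.9, 0.8, 0.85, 0.1, 0.8, 0.7]
print(round(weighted_mean(preds, skills), 2))  # outlier is heavily down-weighted
```

    Both schemes limit the damage a single poor model can do, which is consistent with the finding that even weak members can be retained without degrading the ensemble.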

  4. The Weighted-Average Lagged Ensemble.

    PubMed

    DelSole, T; Trenary, L; Tippett, M K

    2017-11-01

    A lagged ensemble is an ensemble of forecasts from the same model initialized at different times but verifying at the same time. The skill of a lagged ensemble mean can be improved by assigning weights to different forecasts in such a way as to maximize skill. If the forecasts are bias corrected, then an unbiased weighted lagged ensemble requires the weights to sum to one. Such a scheme is called a weighted-average lagged ensemble. In the limit of uncorrelated errors, the optimal weights are positive and decay monotonically with lead time, so that the least skillful forecasts have the least weight. In more realistic applications, the optimal weights do not always behave this way. This paper presents a series of analytic examples designed to illuminate conditions under which the weights of an optimal weighted-average lagged ensemble become negative or depend nonmonotonically on lead time. It is shown that negative weights are most likely to occur when the errors grow rapidly and are highly correlated across lead time. The weights are most likely to behave nonmonotonically when the mean square error is approximately constant over the range of forecasts included in the lagged ensemble. An extreme example of the latter behavior is presented in which the optimal weights vanish everywhere except at the shortest and longest lead times.
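    In the uncorrelated-error limit described above, the sum-to-one weights that minimize mean square error have the closed form w_i ∝ 1/σ_i², which decays monotonically when error variance grows with lead time. A small sketch with hypothetical variances (correlated errors would require the full error covariance and are exactly the regime where negative weights can arise):

```python
def lagged_ensemble_weights(error_vars):
    """Optimal weighted-average lagged ensemble weights in the
    uncorrelated-error limit: w_i proportional to 1/variance, sum = 1."""
    inv = [1.0 / v for v in error_vars]
    s = sum(inv)
    return [w / s for w in inv]

# Error variance grows with lead time (older initializations verify worse).
variances = [1.0, 2.0, 4.0]
w = lagged_ensemble_weights(variances)
print([round(x, 3) for x in w])  # → [0.571, 0.286, 0.143]: decay with lead time
print(abs(sum(w) - 1.0) < 1e-12)
```

    The least skillful (oldest) forecast still receives positive weight here; the paper's point is that this tidy behavior breaks down once errors are correlated across lead times.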

  5. The Personal Software Process: Downscaling the factory

    NASA Technical Reports Server (NTRS)

    Roy, Daniel M.

    1994-01-01

    It is argued that the next wave of software process improvement (SPI) activities will be based on a people-centered paradigm. The most promising such paradigm, Watts Humphrey's personal software process (PSP), is summarized and its advantages are listed. The concepts of the PSP are shown also to fit a down-scaled version of Basili's experience factory. The author's data and lessons learned while practicing the PSP are presented along with personal experience, observations, and advice from the perspective of a consultant and teacher for the personal software process.

  6. Adaptive correction of ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane

    2017-04-01

    Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site-specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS) and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members; however, the parameters of the regression equations are retrieved by exploiting the second-order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread, based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy. We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO
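    The adaptive principle can be illustrated with a textbook scalar Kalman filter that tracks a slowly varying forecast bias online and subtracts it from each new forecast. The noise variances below are made-up tuning constants, and this is a generic sketch, not the authors' ensemble formulation:

```python
def kalman_bias_correction(forecasts, observations, q=0.01, r=1.0):
    """Track forecast bias b with a scalar Kalman filter: the predict step
    inflates the estimate variance by q (bias may drift), the update step
    blends in the newest forecast-minus-observation innovation with
    gain p / (p + r)."""
    b, p = 0.0, 1.0                 # initial bias estimate and its variance
    corrected = []
    for f, o in zip(forecasts, observations):
        corrected.append(f - b)     # correct using the current bias estimate
        p += q                      # predict: allow the bias to drift
        k = p / (p + r)             # Kalman gain
        b += k * ((f - o) - b)      # update from the observed error
        p *= (1 - k)
    return corrected

# Forecasts run 2 degrees warm; the filter learns the bias as data arrive.
obs = [10.0] * 10
fcst = [12.0] * 10
out = kalman_bias_correction(fcst, obs)
print(round(out[0], 2), round(out[-1], 2))  # first still biased, last near 10
```

    Because the parameters update sequentially, no long training archive is needed, which is the property the abstract highlights for adaptive methods.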

  7. On the generation of climate model ensembles

    NASA Astrophysics Data System (ADS)

    Haughton, Ned; Abramowitz, Gab; Pitman, Andy; Phipps, Steven J.

    2014-10-01

    Climate model ensembles are used to estimate uncertainty in future projections, typically by interpreting the ensemble distribution for a particular variable probabilistically. There are, however, different ways to produce climate model ensembles that yield different results, and therefore different probabilities for a future change in a variable. Perhaps equally importantly, there are different approaches to interpreting the ensemble distribution that lead to different conclusions. Here we use a reduced-resolution climate system model to compare three common ways to generate ensembles: initial conditions perturbation, physical parameter perturbation, and structural changes. Despite these three approaches conceptually representing very different categories of uncertainty within a modelling system, when comparing simulations to observations of surface air temperature they can be very difficult to separate. Using the twentieth century CMIP5 ensemble for comparison, we show that initial conditions ensembles, in theory representing internal variability, significantly underestimate observed variance. Structural ensembles, perhaps less surprisingly, exhibit over-dispersion in simulated variance. We argue that future climate model ensembles may need to include parameter or structural perturbation members in addition to perturbed initial conditions members to ensure that they sample uncertainty due to internal variability more completely. We note that where ensembles are over- or under-dispersive, such as for the CMIP5 ensemble, estimates of uncertainty need to be treated with care.
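    The initial-conditions category of ensemble can be illustrated with the classic Lorenz-63 toy system: members that differ only by tiny perturbations decorrelate and thereby sample internal variability. This uses forward-Euler integration and the standard textbook parameters, and is unrelated to the reduced-resolution model used in the study:

```python
import random

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def ic_ensemble(n=10, steps=2000, eps=1e-4, seed=1):
    """Initial-conditions ensemble: identical model, tiny x-perturbations."""
    rng = random.Random(seed)
    members = [(1.0 + rng.uniform(-eps, eps), 1.0, 1.0) for _ in range(n)]
    for _ in range(steps):
        members = [lorenz_step(m) for m in members]
    return members

xs = [m[0] for m in ic_ensemble()]
spread = max(xs) - min(xs)
print(spread)  # O(1)-or-larger spread grown from O(1e-4) perturbations
```

    Whether the spread so generated matches observed variance is exactly the question the study poses for initial-conditions ensembles; structural or parameter perturbations would change the model itself, not just the starting state.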

  8. Ensembl 2002: accommodating comparative genomics.

    PubMed

    Clamp, M; Andrews, D; Barker, D; Bevan, P; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Hubbard, T; Kasprzyk, A; Keefe, D; Lehvaslaiho, H; Iyer, V; Melsopp, C; Mongin, E; Pettett, R; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Birney, E

    2003-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of human, mouse and other genome sequences, available as either an interactive web site or as flat files. Ensembl also integrates manually annotated gene structures from external sources where available. As well as being one of the leading sources of genome annotation, Ensembl is an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements. These range from sequence analysis to data storage and visualisation, and installations exist around the world in both companies and academic sites. With both human and mouse genome sequences available and more vertebrate sequences to follow, many of the recent developments in Ensembl have focused on developing automatic comparative genome analysis and visualisation.

  9. Imprinting and Recalling Cortical Ensembles

    PubMed Central

    Carrillo-Reid, Luis; Yang, Weijian; Bando, Yuki; Peterka, Darcy S.; Yuste, Rafael

    2017-01-01

    Neuronal ensembles are coactive groups of neurons that may represent emergent building blocks of neural circuits. They could be formed by Hebbian plasticity, whereby synapses between coactive neurons are strengthened. Here we report that repetitive activation with two-photon optogenetics of neuronal populations in visual cortex of awake mice generates artificially induced ensembles which recur spontaneously after being imprinted and do not disrupt preexistent ones. Moreover, imprinted ensembles can be recalled by single cell stimulation and remain coactive on consecutive days. Our results demonstrate the persistent reconfiguration of cortical circuits by two-photon optogenetics into neuronal ensembles that can perform pattern completion. PMID:27516599

  10. Evaluating the ClimEx Single Model large ensemble in comparison with EURO-CORDEX results of heatwave and drought indicators

    NASA Astrophysics Data System (ADS)

    von Trentini, F.; Schmid, F. J.; Braun, M.; Frigon, A.; Leduc, M.; Martel, J. L.; Willkofer, F.; Wood, R. R.; Ludwig, R.

    2017-12-01

    Meteorological extreme events appear to be becoming more frequent in the present and future climate, and separating natural climate variability from a clear climate change effect on these extreme events is of growing interest. Since there is only one realisation of historical events, observational data cannot provide the very long time series needed for a robust statistical analysis of natural variability. A new single model large ensemble (SMLE), developed for the ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec), is designed to overcome this lack of data by downscaling 50 members of the CanESM2 (RCP 8.5) with the Canadian CRCM5 regional model (using the EURO-CORDEX grid specifications) for time series of 1950-2099 each, resulting in 7500 years of simulated climate. This allows for a better probabilistic analysis of rare and extreme events than any preceding dataset. Besides seasonal sums, several indicators concerning heatwave frequency, duration and mean temperature, as well as the number and maximum length of dry periods (consecutive days < 1 mm), are calculated for the ClimEx ensemble and several EURO-CORDEX runs. This enables us to investigate the interaction between natural variability (as it appears in the CanESM2-CRCM5 members) and the climate change signal of those members for past, present and future conditions. Adding the EURO-CORDEX results, we can also assess the role of internal model variability (or natural variability) in climate change simulations. A first comparison shows similar magnitudes of variability of climate change signals between the ClimEx large ensemble and the CORDEX runs for some indicators, while for most indicators the spread of the SMLE is smaller than the spread of the different CORDEX models.
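
    The dry-period indicator mentioned above (number and maximum length of runs of consecutive days with precipitation below 1 mm) can be computed directly from a daily series. A minimal sketch; the function name and toy data are illustrative, not from the ClimEx analysis code:

```python
import numpy as np

def dry_spells(precip, wet_threshold=1.0):
    """Count dry periods and find the longest one.

    A dry day is a day with precipitation below `wet_threshold`
    (1 mm, as in the indicator definition above).
    """
    dry = np.asarray(precip) < wet_threshold
    lengths, run = [], 0
    for is_dry in dry:
        if is_dry:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return len(lengths), (max(lengths) if lengths else 0)

# toy daily series (mm): dry spells of length 2, 3 and 1
n_spells, max_len = dry_spells([0.0, 0.2, 5.0, 0.0, 0.0, 0.0, 2.0, 0.5])
```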

  11. Ensemble Canonical Correlation Prediction of Seasonal Precipitation Over the United States: Raising the Bar for Dynamical Model Forecasts

    NASA Technical Reports Server (NTRS)

    Lau, William K. M.; Kim, Kyu-Myong; Shen, S. P.

    2001-01-01

    This paper presents preliminary results of an ensemble canonical correlation (ECC) prediction scheme developed at the Climate and Radiation Branch, NASA/Goddard Space Flight Center for determining the potential predictability of regional precipitation and for climate downscaling studies. The scheme is tested on seasonal hindcasts of anomalous precipitation over the continental United States using global sea surface temperature (SST) for 1951-2000. To maximize the forecast skill derived from SST, the world ocean is divided into non-overlapping sectors. The canonical SST modes for each sector are used as the predictor for the ensemble hindcasts. Results show that the ECC yields a substantial (10-25%) increase in prediction skill for all regions of the US in every season compared to traditional CCA prediction schemes. For the boreal winter, the tropical Pacific contributes the largest potential predictability to precipitation in the southwestern and southeastern regions, while the North Pacific and the North Atlantic are responsible for the enhanced forecast skill in the Pacific Northwest, the northern Great Plains and the Ohio Valley. Most importantly, the ECC increases skill for summertime precipitation prediction and substantially reduces the spring predictability barrier over all regions of the US continent. Besides SST, the ECC is designed with the flexibility to include any number of predictor fields, such as soil moisture, snow cover and additional local observations. The enhanced ECC forecast skill provides a new benchmark for evaluating dynamical model forecasts.

  12. Catalina Eddy as revealed by the historical downscaling of reanalysis

    NASA Astrophysics Data System (ADS)

    Kanamitsu, Masao; Yulaeva, Elena; Li, Haiqin; Hong, Song-You

    2013-08-01

    Climatological properties and the dynamical and thermodynamical characteristics of the Catalina Eddy are examined from the 61-year NCEP/NCAR Reanalysis downscaled to hourly 10 km resolution. The eddy is identified as a mesoscale cyclonic circulation confined to the Southern California Bight. Pattern correlation of wind direction against the canonical Catalina Eddy is used to extract cases from the downscaled analysis. Validation against published cases and various observations confirmed that the downscaled analysis accurately reproduces Catalina Eddy events. A composite analysis of the initiation phase of the eddy indicates that no apparent large-scale cyclonic/anti-cyclonic forcing is associated with the eddy formation or decay. The source of the vorticity is located at the coast of the Santa Barbara Channel. It is generated by the convergence of the wind system crossing over the San Rafael Mountains and the large-scale northwesterly flow associated with the subtropical high. This vorticity is advected towards the southeast by the northwesterly flow, which contributes to the formation of a streak of positive vorticity. Six hours prior to the mature stage, there is an explosive generation of positive vorticity along the coast, coincident with the phase change of the sea breeze circulation (wind turning from onshore to offshore), resulting in convergence all along the California coast. The generation of vorticity due to convergence along the coast, together with the advection of vorticity from the north, results in the formation of southerly flow along the coast, forming the Catalina Eddy. The importance of diurnal variation and the lack of large-scale forcing are new findings, which are in sharp contrast to prior studies. These differences are due to the inclusion in our study of many short-lived eddy events which have not been included in other studies.

  13. Uncertainties of statistical downscaling from predictor selection: Equifinality and transferability

    NASA Astrophysics Data System (ADS)

    Fu, Guobin; Charles, Stephen P.; Chiew, Francis H. S.; Ekström, Marie; Potter, Nick J.

    2018-05-01

    The nonhomogeneous hidden Markov model (NHMM) statistical downscaling model, 38 catchments in southeast Australia, and 19 general circulation models (GCMs) were used in this study to demonstrate statistical downscaling uncertainties caused by equifinality and transferability. That is to say, there can be multiple sets of predictors that give similar daily rainfall simulation results for both calibration and validation periods, but project different magnitudes (or even directions) of rainfall change in the future. Results indicated that two sets of predictors (Set 1 with predictors of sea level pressure north-south gradient, u-wind at 700 hPa, v-wind at 700 hPa, and specific humidity at 700 hPa; Set 2 with predictors of sea level pressure north-south gradient, u-wind at 700 hPa, v-wind at 700 hPa, and dewpoint temperature depression at 850 hPa) as inputs to the NHMM produced satisfactory simulations of seasonal rainfall in comparison with observations. For example, during the model calibration period, the relative errors across the 38 catchments ranged from 0.48 to 1.76% with a mean value of 1.09% for predictor Set 1, and from 0.22 to 2.24% with a mean value of 1.16% for predictor Set 2. However, NHMM projections based on the 19 GCMs differed in sign between the two sets of predictors: Set 1 predictors project an increase of future rainfall, with magnitudes depending on future time periods and emission scenarios, whereas Set 2 predictors project a decline. Such divergent projections may present a significant challenge for applications of statistical downscaling as well as climate change impact studies, and could potentially imply caveats in many existing studies in the literature.

  14. Ensemble Pulsar Time Scale

    NASA Astrophysics Data System (ADS)

    Yin, Dong-shan; Gao, Yu-ping; Zhao, Shu-hong

    2017-07-01

    Millisecond pulsars can generate another type of time scale that is totally independent of the atomic time scale, because the physical mechanisms of the pulsar time scale and the atomic time scale are quite different from each other. Usually the pulsar timing observations are not evenly sampled, and the intervals between two data points range from several hours to more than half a month. Furthermore, these data sets are sparse. All this makes it difficult to generate an ensemble pulsar time scale. Hence, a new algorithm to calculate the ensemble pulsar time scale is proposed. Firstly, a cubic spline interpolation is used to densify the data set and make the intervals between data points uniform. Then, the Vondrak filter is employed to smooth the data set and remove high-frequency noise, and finally the weighted average method is adopted to generate the ensemble pulsar time scale. The newly released NANOGrav (North American Nanohertz Observatory for Gravitational Waves) 9-year data set is used to generate the ensemble pulsar time scale. This data set includes the 9-year observational data of 37 millisecond pulsars observed by the 100-meter Green Bank telescope and the 305-meter Arecibo telescope. It is found that the algorithm used in this paper can effectively reduce the influence of the noise in pulsar timing residuals and improve the long-term stability of the ensemble pulsar time scale. Results indicate that the long-term (> 1 yr) stability of the ensemble pulsar time scale is better than 3.4 × 10⁻¹⁵.
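
    The three-step algorithm (spline densification, smoothing, weighted averaging) can be sketched as follows. This is an illustrative stand-in on synthetic residuals: a simple moving average replaces the Vondrak filter, and all names and data are hypothetical, not NANOGrav data:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def ensemble_timescale(times_list, residuals_list, weights, grid):
    """Ensemble time scale from unevenly sampled residuals:
    (1) cubic-spline interpolation onto a uniform grid,
    (2) smoothing (moving average standing in for the Vondrak filter),
    (3) weighted averaging across pulsars."""
    dense = np.array([CubicSpline(t, r)(grid)
                      for t, r in zip(times_list, residuals_list)])
    kernel = np.ones(5) / 5.0  # crude low-pass stand-in
    smooth = np.array([np.convolve(d, kernel, mode="same") for d in dense])
    w = np.asarray(weights, float)
    return (w / w.sum()) @ smooth

# two synthetic "pulsars" sharing a sine-like signal, unevenly sampled
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 10.0, 101)
t1 = np.sort(rng.uniform(0, 10, 40)); t1[0] = 0.0; t1[-1] = 10.0
t2 = np.sort(rng.uniform(0, 10, 30)); t2[0] = 0.0; t2[-1] = 10.0
r1 = np.sin(t1) + 0.05 * rng.standard_normal(t1.size)
r2 = np.sin(t2) + 0.05 * rng.standard_normal(t2.size)
ept = ensemble_timescale([t1, t2], [r1, r2], [1.0, 1.0], grid)
```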

  15. Downscaling 20th century flooding events in complex terrain (Switzerland) using the WRF regional climate model

    NASA Astrophysics Data System (ADS)

    Heikkilä, Ulla; Gómez Navarro, Juan Jose; Franke, Jörg; Brönnimann, Stefan; Cattin, Réne

    2016-04-01

    Switzerland has experienced a number of severe precipitation events during the last few decades, such as during 14-16 November 2002 or 21-22 August 2005. Both events, and the subsequent extreme floods, caused fatalities and severe financial losses, and have been well studied both in terms of the atmospheric conditions leading to extreme precipitation and their consequences [e.g. Hohenegger et al., 2008, Stucki et al., 2012]. These examples highlight the need to better characterise the frequency and severity of flooding in the Alpine area. In a larger framework we will ultimately produce a high-resolution data set covering the entire 20th century to be used for detailed hydrological studies, including all atmospheric parameters relevant for flooding events. In a first step, we downscale the aforementioned two events of 2002 and 2005 to assess the model performance regarding precipitation extremes. The complexity of the topography in the Alpine area demands high resolution datasets. To achieve sufficient detail in resolution we employ the Weather Research and Forecasting regional climate model (WRF). A set of 4 nested domains is used, with a 2-km horizontal resolution over Switzerland. The NCAR 20th century reanalysis (20CR) with a horizontal resolution of 2.5° serves as the boundary condition [Compo et al., 2011]. First results of downscaling the 2002 and 2005 extreme precipitation events show that, compared to station observations provided by the Swiss Meteorological Office MeteoSwiss, the model strongly underestimates the strength of these events. This is mainly due to the coarse resolution of the 20CR data, which underestimates the moisture fluxes during these events. We tested driving WRF with the higher-resolution NCEP reanalysis and found a significant improvement in the amount of precipitation of the 2005 event. In a next step we will downscale the precipitation and wind fields during a 6-year period 2002-2007 to investigate and

  16. Non-Gaussian spatiotemporal simulation of multisite daily precipitation: downscaling framework

    NASA Astrophysics Data System (ADS)

    Ben Alaya, M. A.; Ouarda, T. B. M. J.; Chebana, F.

    2018-01-01

    Probabilistic regression approaches for downscaling daily precipitation are very useful. They provide the whole conditional distribution at each forecast step to better represent the temporal variability. The question addressed in this paper is: how can spatiotemporal characteristics of multisite daily precipitation be simulated from probabilistic regression models? Recent publications point out the complexity of multisite properties of daily precipitation and highlight the need for a flexible non-Gaussian tool. This work proposes a reasonable compromise between simplicity and flexibility that avoids model misspecification. A suitable nonparametric bootstrapping (NB) technique is adopted. A downscaling model which merges a vector generalized linear model (VGLM, as a probabilistic regression tool) and the proposed bootstrapping technique is introduced to simulate realistic multisite precipitation series. The model is applied to data sets from the southern part of the province of Quebec, Canada. It is shown that the model is capable of reproducing both at-site properties and the spatial structure of daily precipitation. Results indicate the superiority of the proposed NB technique over a multivariate autoregressive Gaussian framework (i.e. Gaussian copula).
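
    The role of nonparametric bootstrapping in preserving inter-site dependence can be illustrated by resampling whole days (site vectors) from an observed record. This is a deliberately simplified stand-in for the VGLM-driven scheme, using toy data rather than the Quebec data sets:

```python
import numpy as np

def spatial_bootstrap(daily_fields, n_days, rng):
    """Resample whole days (rows = days, columns = sites) with
    replacement, so the observed inter-site dependence is inherited
    without assuming a Gaussian copula."""
    idx = rng.integers(0, daily_fields.shape[0], size=n_days)
    return daily_fields[idx]

rng = np.random.default_rng(1)
# toy record: 1000 days at 3 sites with a shared rainfall component
shared = rng.gamma(2.0, 3.0, size=(1000, 1))
obs = shared + rng.gamma(1.0, 1.0, size=(1000, 3))
sim = spatial_bootstrap(obs, 5000, rng)
c_obs = np.corrcoef(obs.T)[0, 1]   # observed site-1/site-2 correlation
c_sim = np.corrcoef(sim.T)[0, 1]   # simulated series inherits it
```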

  17. Quantum ensembles of quantum classifiers.

    PubMed

    Schuld, Maria; Petruccione, Francesco

    2018-02-09

    Quantum machine learning witnesses an increasing amount of quantum algorithms for data-driven decision making, a problem with potential applications ranging from automated image recognition to medical diagnosis. Many of those algorithms are implementations of quantum classifiers, or models for the classification of data inputs with a quantum computer. Following the success of collective decision making with ensembles in classical machine learning, this paper introduces the concept of quantum ensembles of quantum classifiers. Creating the ensemble corresponds to a state preparation routine, after which the quantum classifiers are evaluated in parallel and their combined decision is accessed by a single-qubit measurement. This framework naturally allows for exponentially large ensembles in which - similar to Bayesian learning - the individual classifiers do not have to be trained. As an example, we analyse an exponentially large quantum ensemble in which each classifier is weighed according to its performance in classifying the training data, leading to new results for quantum as well as classical machine learning.
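
    A classical analogue of the weighted ensemble decision can be sketched as follows: untrained member classifiers vote, each weighted by its training performance, and the sign of the weighted sum gives the collective decision (the role played by the single-qubit measurement in the quantum scheme). Everything here (the classifiers, data, and weighting details) is an illustrative assumption, not the paper's quantum routine:

```python
import numpy as np

def weighted_ensemble_predict(classifiers, weights, X):
    """Collective decision of a weighted ensemble: each classifier
    votes in {-1, +1} and the sign of the weighted vote sum is
    returned (the classical counterpart of the qubit readout)."""
    votes = np.array([clf(X) for clf in classifiers])  # (n_clf, n_points)
    return np.sign(weights @ votes)

rng = np.random.default_rng(8)
X_train = rng.standard_normal((200, 2))
y_train = np.sign(X_train[:, 0])            # toy ground truth: sign of x0
# untrained member classifiers: thresholds along random directions
dirs = rng.standard_normal((25, 2))
classifiers = [lambda X, d=d: np.sign(X @ d) for d in dirs]
# weigh each member by its accuracy on the training data
weights = np.array([np.mean(clf(X_train) == y_train) for clf in classifiers])
X_test = rng.standard_normal((500, 2))
pred = weighted_ensemble_predict(classifiers, weights, X_test)
acc = np.mean(pred == np.sign(X_test[:, 0]))
```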

  18. Ensemble reconstruction of spatio-temporal extreme low-flow events in France since 1871

    NASA Astrophysics Data System (ADS)

    Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; Devers, Alexandre; Graff, Benjamin

    2017-06-01

    The length of streamflow observations is generally limited to the last 50 years even in data-rich countries like France. It therefore offers too small a sample of extreme low-flow events to properly explore the long-term evolution of their characteristics and associated impacts. To overcome this limit, this work first presents a daily 140-year ensemble reconstructed streamflow dataset for a reference network of near-natural catchments in France. This dataset, called SCOPE Hydro (Spatially COherent Probabilistic Extended Hydrological dataset), is based on (1) a probabilistic precipitation, temperature, and reference evapotranspiration downscaling of the Twentieth Century Reanalysis over France, called SCOPE Climate, and (2) continuous hydrological modelling using SCOPE Climate as forcings over the whole period. This work then introduces tools for defining spatio-temporal extreme low-flow events. Extreme low-flow events are first locally defined through the sequent peak algorithm using a novel combination of a fixed threshold and a daily variable threshold. A dedicated spatial matching procedure is then established to identify spatio-temporal events across France. This procedure is furthermore adapted to the SCOPE Hydro 25-member ensemble to characterize in a probabilistic way unrecorded historical events at the national scale. Extreme low-flow events are described and compared in a spatially and temporally homogeneous way over 140 years on a large set of catchments. Results highlight well-known recent events like 1976 or 1989-1990, but also older and relatively forgotten ones like the 1878 and 1893 events. These results contribute to improving our knowledge of historical events and provide a selection of benchmark events for climate change adaptation purposes. Moreover, this study allows for further detailed analyses of the effect of climate variability and anthropogenic climate change on low-flow hydrology at the scale of France.
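
    A threshold-based event definition like the one described above can be sketched as follows. This is a simplified stand-in for the sequent peak algorithm with its combined fixed and daily variable thresholds; the function and data are illustrative:

```python
import numpy as np

def low_flow_events(flow, threshold):
    """Identify low-flow events as runs of days with flow below a
    threshold, returning (start, end, deficit volume) per event."""
    below = flow < threshold
    events, start = [], None
    for i, b in enumerate(below):
        if b and start is None:
            start = i
        elif not b and start is not None:
            events.append((start, i - 1, float((threshold - flow[start:i]).sum())))
            start = None
    if start is not None:
        events.append((start, len(flow) - 1,
                       float((threshold - flow[start:]).sum())))
    return events

# toy daily streamflow with two sub-threshold episodes
flow = np.array([5.0, 4.0, 2.0, 1.0, 3.0, 6.0, 2.5, 2.0, 4.0])
events = low_flow_events(flow, threshold=3.0)
```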

  19. Multi-Model Ensemble Wake Vortex Prediction

    NASA Technical Reports Server (NTRS)

    Koerner, Stephan; Holzaepfel, Frank; Ahmad, Nash'at N.

    2015-01-01

    Several multi-model ensemble methods are investigated for predicting wake vortex transport and decay. This study is a joint effort between National Aeronautics and Space Administration and Deutsches Zentrum fuer Luft- und Raumfahrt to develop a multi-model ensemble capability using their wake models. An overview of different multi-model ensemble methods and their feasibility for wake applications is presented. The methods include Reliability Ensemble Averaging, Bayesian Model Averaging, and Monte Carlo Simulations. The methodologies are evaluated using data from wake vortex field experiments.

  20. Can bias correction and statistical downscaling methods improve the skill of seasonal precipitation forecasts?

    NASA Astrophysics Data System (ADS)

    Manzanas, R.; Lucero, A.; Weisheimer, A.; Gutiérrez, J. M.

    2018-02-01

    Statistical downscaling methods are popular post-processing tools which are widely used in many sectors to adapt the coarse-resolution biased outputs from global climate simulations to the regional-to-local scale typically required by users. They range from simple and pragmatic Bias Correction (BC) methods, which directly adjust the model outputs of interest (e.g. precipitation) according to the available local observations, to more complex Perfect Prognosis (PP) ones, which indirectly derive local predictions (e.g. precipitation) from appropriate upper-air large-scale model variables (predictors). Statistical downscaling methods have been extensively used and critically assessed in climate change applications; however, their advantages and limitations in seasonal forecasting are not well understood yet. In particular, a key problem in this context is whether they serve to improve the forecast quality/skill of raw model outputs beyond the adjustment of their systematic biases. In this paper we analyze this issue by applying two state-of-the-art BC and two PP methods to downscale precipitation from a multimodel seasonal hindcast in a challenging tropical region, the Philippines. To properly assess the potential added value beyond the reduction of model biases, we consider two validation scores which are not sensitive to changes in the mean (correlation and reliability categories). Our results show that, whereas BC methods maintain or worsen the skill of the raw model forecasts, PP methods can yield significant skill improvement (worsening) in cases for which the large-scale predictor variables considered are better (worse) predicted by the model than precipitation. For instance, PP methods are found to increase (decrease) model reliability in nearly 40% of the stations considered in boreal summer (autumn). Therefore, the choice of a convenient downscaling approach (either BC or PP) depends on the region and the season.
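
    As a concrete example of the BC family, empirical quantile mapping adjusts each forecast value to the observed value at the same empirical quantile of the calibration period. A minimal sketch with synthetic data, not one of the specific BC methods assessed in the paper:

```python
import numpy as np

def quantile_map(model_cal, obs_cal, model_new):
    """Empirical quantile mapping: map each model value to the observed
    value at the same empirical quantile of the calibration period."""
    q = np.searchsorted(np.sort(model_cal), model_new, side="right")
    q = np.clip(q / len(model_cal), 1e-6, 1.0 - 1e-6)
    return np.quantile(obs_cal, q)

rng = np.random.default_rng(2)
obs = rng.gamma(2.0, 5.0, 3000)          # "observed" precipitation
model = 0.5 * rng.gamma(2.0, 5.0, 3000)  # raw model output with a dry bias
corrected = quantile_map(model, obs, model)
```

By construction this adjusts the marginal distribution (and hence the systematic bias) but cannot add skill that is absent from the raw forecast, which is exactly the limitation the paper examines with mean-insensitive scores.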

  1. Run-up parameterization and beach vulnerability assessment on a barrier island: a downscaling approach

    NASA Astrophysics Data System (ADS)

    Medellín, G.; Brinkkemper, J. A.; Torres-Freyermuth, A.; Appendini, C. M.; Mendoza, E. T.; Salles, P.

    2016-01-01

    We present a downscaling approach for the study of wave-induced extreme water levels at a location on a barrier island in Yucatán (Mexico). Wave information from a 30-year wave hindcast is validated with in situ measurements at 8 m water depth. The maximum dissimilarity algorithm is employed for the selection of 600 representative cases, encompassing different combinations of wave characteristics and tidal level. The selected cases are propagated from 8 m water depth to the shore using the coupling of a third-generation wave model and a phase-resolving non-hydrostatic nonlinear shallow-water equation model. Extreme wave run-up, R2%, is estimated for the simulated cases and can be further employed to reconstruct the 30-year time series using an interpolation algorithm. Downscaling results show run-up saturation during more energetic wave conditions and modulation owing to tides. The latter suggests that the R2% can be parameterized using a hyperbolic-like formulation with dependency on both wave height and tidal level. The new parametric formulation is in agreement with the downscaling results (r2 = 0.78), allowing a fast calculation of wave-induced extreme water levels at this location. Finally, an assessment of beach vulnerability to wave-induced extreme water levels is conducted at the study area by employing the two approaches (reconstruction/parameterization) and a storm impact scale. The 30-year extreme water level hindcast allows the calculation of beach vulnerability as a function of return periods. It is shown that the downscaling-derived parameterization provides reasonable results as compared with the numerical approach. This methodology can be extended to other locations and can be further improved by incorporating the storm surge contributions to the extreme water level.
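
    The maximum dissimilarity algorithm used to pick the 600 representative cases can be sketched as a greedy farthest-point selection. An illustrative implementation with toy sea states, not the authors' code:

```python
import numpy as np

def max_dissimilarity_select(data, n_select):
    """Greedy maximum-dissimilarity selection: repeatedly add the case
    farthest (Euclidean) from the already selected subset."""
    # seed with the case farthest from the sample mean
    chosen = [int(np.argmax(np.linalg.norm(data - data.mean(0), axis=1)))]
    dmin = np.linalg.norm(data - data[chosen[0]], axis=1)
    while len(chosen) < n_select:
        nxt = int(np.argmax(dmin))  # farthest from current subset
        chosen.append(nxt)
        dmin = np.minimum(dmin, np.linalg.norm(data - data[nxt], axis=1))
    return chosen

rng = np.random.default_rng(3)
# toy hindcast: normalized (Hs, Tp, tide) for 2000 sea states
hindcast = rng.standard_normal((2000, 3))
subset = max_dissimilarity_select(hindcast, 25)
```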

  2. A downscaling scheme for atmospheric variables to drive soil-vegetation-atmosphere transfer models

    NASA Astrophysics Data System (ADS)

    Schomburg, A.; Venema, V.; Lindau, R.; Ament, F.; Simmer, C.

    2010-09-01

    For driving soil-vegetation-atmosphere transfer models or hydrological models, high-resolution atmospheric forcing data are needed. For most applications the resolution of atmospheric model output is too coarse. To avoid biases due to non-linear processes, a downscaling system should predict the unresolved variability of the atmospheric forcing. For this purpose we derived a disaggregation system consisting of three steps: (1) a bi-quadratic spline interpolation of the low-resolution data, (2) a so-called `deterministic' part, based on statistical rules between high-resolution surface variables and the desired atmospheric near-surface variables, and (3) an autoregressive noise-generation step. The disaggregation system has been developed and tested on high-resolution model output (400 m horizontal grid spacing). A novel automatic search algorithm has been developed for deriving the deterministic downscaling rules of step 2. When applied to the atmospheric variables of the lowest layer of the atmospheric COSMO model, the disaggregation is able to adequately reconstruct the reference fields. Applying downscaling steps 1 and 2 decreases root mean square errors. Step 3 finally leads to a close match of the subgrid variability and temporal autocorrelation with the reference fields. The scheme can be applied to the output of atmospheric models, both for stand-alone offline simulations and for a fully coupled model system.
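
    Step (3), the autoregressive noise generation, can be sketched as an AR(1) process tuned to a target subgrid standard deviation and lag-1 autocorrelation. The parameters and scale are illustrative, not those of the COSMO-based scheme:

```python
import numpy as np

def ar1_noise(n, sigma, rho, rng):
    """AR(1) noise with prescribed standard deviation and lag-1
    autocorrelation, restoring subgrid variability that interpolation
    (step 1) smooths away."""
    eps = sigma * np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n)
    x = np.empty(n)
    x[0] = sigma * rng.standard_normal()  # start in the stationary state
    for t in range(1, n):
        x[t] = rho * x[t - 1] + eps[t]
    return x

rng = np.random.default_rng(4)
noise = ar1_noise(20000, sigma=0.8, rho=0.6, rng=rng)
sd = noise.std()                               # close to the target 0.8
r1 = np.corrcoef(noise[:-1], noise[1:])[0, 1]  # close to the target 0.6
```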

  3. A Statistical Description of Neural Ensemble Dynamics

    PubMed Central

    Long, John D.; Carmena, Jose M.

    2011-01-01

    The growing use of multi-channel neural recording techniques in behaving animals has produced rich datasets that hold immense potential for advancing our understanding of how the brain mediates behavior. One limitation of these techniques is that they do not provide important information about the underlying anatomical connections among the recorded neurons within an ensemble. Inferring these connections is often intractable because the set of possible interactions grows exponentially with ensemble size. This is a fundamental challenge one confronts when interpreting these data. Unfortunately, the combination of expert knowledge and ensemble data is often insufficient for selecting a unique model of these interactions. Our approach shifts away from modeling the network diagram of the ensemble toward analyzing changes in the dynamics of the ensemble as they relate to behavior. Our contribution consists of adapting techniques from signal processing and Bayesian statistics to track the dynamics of ensemble data on time-scales comparable with behavior. We employ a Bayesian estimator to weigh prior information against the available ensemble data, and use an adaptive quantization technique to aggregate poorly estimated regions of the ensemble data space. Importantly, our method is capable of detecting changes in both the magnitude and structure of correlations among neurons missed by firing rate metrics. We show that this method is scalable across a wide range of time-scales and ensemble sizes. Lastly, the performance of this method on both simulated and real ensemble data is used to demonstrate its utility. PMID:22319486

  4. Monthly ENSO Forecast Skill and Lagged Ensemble Size

    PubMed Central

    DelSole, T.; Tippett, M.K.; Pegion, K.

    2018-01-01

    Abstract The mean square error (MSE) of a lagged ensemble of monthly forecasts of the Niño 3.4 index from the Climate Forecast System (CFSv2) is examined with respect to ensemble size and configuration. Although the real‐time forecast is initialized 4 times per day, it is possible to infer the MSE for arbitrary initialization frequency and for burst ensembles by fitting error covariances to a parametric model and then extrapolating to arbitrary ensemble size and initialization frequency. Applying this method to real‐time forecasts, we find that the MSE consistently reaches a minimum for a lagged ensemble size between one and eight days, when four initializations per day are included. This ensemble size is consistent with the 8–10 day lagged ensemble configuration used operationally. Interestingly, the skill of both ensemble configurations is close to the estimated skill of the infinite ensemble. The skill of the weighted, lagged, and burst ensembles are found to be comparable. Certain unphysical features of the estimated error growth were tracked down to problems with the climatology and data discontinuities. PMID:29937973

  5. Monthly ENSO Forecast Skill and Lagged Ensemble Size

    NASA Astrophysics Data System (ADS)

    Trenary, L.; DelSole, T.; Tippett, M. K.; Pegion, K.

    2018-04-01

    The mean square error (MSE) of a lagged ensemble of monthly forecasts of the Niño 3.4 index from the Climate Forecast System (CFSv2) is examined with respect to ensemble size and configuration. Although the real-time forecast is initialized 4 times per day, it is possible to infer the MSE for arbitrary initialization frequency and for burst ensembles by fitting error covariances to a parametric model and then extrapolating to arbitrary ensemble size and initialization frequency. Applying this method to real-time forecasts, we find that the MSE consistently reaches a minimum for a lagged ensemble size between one and eight days, when four initializations per day are included. This ensemble size is consistent with the 8-10 day lagged ensemble configuration used operationally. Interestingly, the skill of both ensemble configurations is close to the estimated skill of the infinite ensemble. The skill of the weighted, lagged, and burst ensembles are found to be comparable. Certain unphysical features of the estimated error growth were tracked down to problems with the climatology and data discontinuities.
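
    The trade-off behind the interior MSE minimum can be reproduced with a toy lagged ensemble in which older initializations carry larger error variance: averaging more members reduces noise, but the added members are staler. Synthetic data, not CFSv2 output, and the parametric covariance-fitting/extrapolation step is not shown:

```python
import numpy as np

def lagged_ensemble_mse(forecasts, truth, max_lag):
    """MSE of the lagged-ensemble mean for ensemble sizes 1..max_lag,
    where forecasts[k] was initialized k steps before verification."""
    return [float(np.mean((forecasts[:L].mean(axis=0) - truth) ** 2))
            for L in range(1, max_lag + 1)]

rng = np.random.default_rng(5)
truth = rng.standard_normal(500)
# toy members: error std grows with initialization lag (staleness)
lags = np.arange(12)
members = truth + (0.4 + 0.1 * lags)[:, None] * rng.standard_normal((12, 500))
mse = lagged_ensemble_mse(members, truth, 12)
```

With these error parameters the single freshest member is beaten by a moderate lagged ensemble, mirroring the minimum found in the paper.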

  6. Similarity Measures for Protein Ensembles

    PubMed Central

    Lindorff-Larsen, Kresten; Ferkinghoff-Borg, Jesper

    2009-01-01

    Analyses of similarities and changes in protein conformation can provide important information regarding protein function and evolution. Many scores, including the commonly used root mean square deviation, have therefore been developed to quantify the similarities of different protein conformations. However, instead of examining individual conformations it is in many cases more relevant to analyse ensembles of conformations that have been obtained either through experiments or from methods such as molecular dynamics simulations. We here present three approaches that can be used to compare conformational ensembles in the same way as the root mean square deviation is used to compare individual pairs of structures. The methods are based on the estimation of the probability distributions underlying the ensembles and subsequent comparison of these distributions. We first validate the methods using a synthetic example from molecular dynamics simulations. We then apply the algorithms to revisit the problem of ensemble averaging during structure determination of proteins, and find that an ensemble refinement method is able to recover the correct distribution of conformations better than standard single-molecule refinement. PMID:19145244
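
    The general recipe (estimate the probability distributions underlying two ensembles, then compare the distributions) can be sketched in one dimension with Gaussian fits and a symmetrized Kullback-Leibler divergence. This is an illustrative simplification, not one of the three scores proposed in the paper:

```python
import numpy as np

def gaussian_kl(mu1, var1, mu2, var2):
    """KL divergence between two one-dimensional Gaussians."""
    return 0.5 * (np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def ensemble_divergence(a, b):
    """Symmetrized KL between Gaussian fits to two ensembles of a
    scalar structural coordinate (e.g. one inter-atomic distance)."""
    ma, va, mb, vb = a.mean(), a.var(), b.mean(), b.var()
    return gaussian_kl(ma, va, mb, vb) + gaussian_kl(mb, vb, ma, va)

rng = np.random.default_rng(6)
ens_a = rng.normal(10.0, 1.0, 5000)   # distances from one simulation
ens_b = rng.normal(10.1, 1.0, 5000)   # a very similar ensemble
ens_c = rng.normal(13.0, 2.0, 5000)   # a clearly different ensemble
d_similar = ensemble_divergence(ens_a, ens_b)
d_different = ensemble_divergence(ens_a, ens_c)
```

Unlike a pairwise RMSD, the score compares whole distributions, so two ensembles with the same mean structure but different spreads are still distinguished.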

  7. Implications of the methodological choices for hydrologic portrayals of climate change over the contiguous United States: Statistically downscaled forcing data and hydrologic models

    USGS Publications Warehouse

    Mizukami, Naoki; Clark, Martyn P.; Gutmann, Ethan D.; Mendoza, Pablo A.; Newman, Andrew J.; Nijssen, Bart; Livneh, Ben; Hay, Lauren E.; Arnold, Jeffrey R.; Brekke, Levi D.

    2016-01-01

    Continental-domain assessments of climate change impacts on water resources typically rely on statistically downscaled climate model outputs to force hydrologic models at a finer spatial resolution. This study examines the effects of four statistical downscaling methods [bias-corrected constructed analog (BCCA), bias-corrected spatial disaggregation applied at daily (BCSDd) and monthly scales (BCSDm), and asynchronous regression (AR)] on retrospective hydrologic simulations using three hydrologic models with their default parameters (the Community Land Model, version 4.0; the Variable Infiltration Capacity model, version 4.1.2; and the Precipitation–Runoff Modeling System, version 3.0.4) over the contiguous United States (CONUS). Biases of hydrologic simulations forced by statistically downscaled climate data relative to the simulation with observation-based gridded data are presented. Each statistical downscaling method produces different meteorological portrayals including precipitation amount, wet-day frequency, and the energy input (i.e., shortwave radiation), and their interplay affects estimations of precipitation partitioning between evapotranspiration and runoff, extreme runoff, and hydrologic states (i.e., snow and soil moisture). The analyses show that BCCA underestimates annual precipitation by as much as −250 mm, leading to unreasonable hydrologic portrayals over the CONUS for all models. Although the other three statistical downscaling methods produce a comparable precipitation bias ranging from −10 to 8 mm across the CONUS, BCSDd severely overestimates the wet-day fraction by up to 0.25, leading to different precipitation partitioning compared to the simulations with other downscaled data. Overall, the choice of downscaling method contributes to less spread in runoff estimates (by a factor of 1.5–3) than the choice of hydrologic model with use of the default parameters if BCCA is excluded.

  8. Improving database enrichment through ensemble docking

    NASA Astrophysics Data System (ADS)

    Rao, Shashidhar; Sanschagrin, Paul C.; Greenwood, Jeremy R.; Repasky, Matthew P.; Sherman, Woody; Farid, Ramy

    2008-09-01

    While it may seem intuitive that using an ensemble of multiple conformations of a receptor in structure-based virtual screening experiments would necessarily yield improved enrichment of actives relative to using just a single receptor, it turns out that, at least in the p38 MAP kinase model system studied here, a very large majority of all possible ensembles do not yield improved enrichment of actives. However, there are combinations of receptor structures that do lead to improved enrichment results. We present here a method to select the ensembles that produce the best enrichments, one that does not rely on knowledge of active compounds or sophisticated analyses of the 3D receptor structures. In the system studied here, the small fraction of ensembles of up to 3 receptors that do yield good enrichment of actives was identified by selecting ensembles that have the best mean GlideScore for the top 1% of the docked ligands in a database screen of actives and drug-like "decoy" ligands. Ensembles of two receptors identified using this mean GlideScore metric generally outperform single receptors, while ensembles of three receptors identified using this metric consistently give optimal enrichment factors in which, for example, 40% of the known actives outrank all the other ligands in the database.
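    The selection metric described above can be sketched in a few lines: for each candidate ensemble of receptors, keep each ligand's best (most negative) docking score across the ensemble, rank the ligands, and average the top 1%. A minimal sketch with synthetic numbers — the score matrix, sizes, and distributions below are invented stand-ins, not Glide output; the only assumption carried over is that lower scores indicate better predicted binding, as with GlideScore.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical docking scores: rows = 500 ligands, cols = 5 receptor
# conformations. More negative = better predicted binding.
scores = rng.normal(loc=-6.0, scale=1.5, size=(500, 5))

def ensemble_mean_top_score(scores, receptors, top_frac=0.01):
    """Mean of the best (lowest) per-ligand score over the chosen
    receptors, restricted to the top `top_frac` of ranked ligands."""
    # Ensemble docking: each ligand keeps its best score over the ensemble.
    best_per_ligand = scores[:, receptors].min(axis=1)
    n_top = max(1, int(round(top_frac * len(best_per_ligand))))
    top = np.sort(best_per_ligand)[:n_top]  # most negative = best
    return float(top.mean())

# Evaluate every ensemble of up to 3 receptors and keep the best-scoring one.
candidates = [c for r in (1, 2, 3)
              for c in itertools.combinations(range(scores.shape[1]), r)]
best = min(candidates, key=lambda c: ensemble_mean_top_score(scores, list(c)))
```

In an actual screen the matrix would hold GlideScores from docking actives plus decoys against each receptor conformation, and enrichment would then be computed only for the selected ensemble.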

  9. Improve projections of changes in southern African summer rainfall through comprehensive multi-timescale empirical statistical downscaling

    NASA Astrophysics Data System (ADS)

    Dieppois, B.; Pohl, B.; Eden, J.; Crétat, J.; Rouault, M.; Keenlyside, N.; New, M. G.

    2017-12-01

    The water management community has hitherto neglected or underestimated many of the uncertainties in climate impact scenarios, in particular those associated with decadal climate variability. Uncertainty in state-of-the-art global climate models (GCMs) is time-scale-dependent, e.g. stronger at decadal than at interannual timescales, in response to the different parameterizations and to internal climate variability. In addition, non-stationarity in statistical downscaling is widely recognized as a key problem, in which the time-scale dependency of predictors plays an important role. As with global climate modelling, therefore, the selection of downscaling methods must proceed with caution to avoid unintended consequences of over-correcting the noise in GCMs (e.g. interpreting internal climate variability as a model bias). GCM outputs from the Coupled Model Intercomparison Project 5 (CMIP5) have therefore first been selected based on their ability to reproduce southern African summer rainfall variability and its teleconnections with Pacific sea-surface temperature across the dominant timescales. In observations, southern African summer rainfall has recently been shown to exhibit significant periodicities at interannual (2-8 years), quasi-decadal (8-13 years) and inter-decadal (15-28 years) timescales, which can be interpreted as the signatures of ENSO, the IPO, and the PDO over the region. Most CMIP5 GCMs underestimate southern African summer rainfall variability and its teleconnections with Pacific SSTs at these three timescales. In addition, according to a more in-depth analysis of historical and pi-control runs, this bias might result from internal climate variability in some of the CMIP5 GCMs, suggesting potential for bias-corrected prediction based on empirical statistical downscaling. A multi-timescale regression-based downscaling procedure, which determines the predictors across the different timescales, has thus been used to

  10. Influence of Scale Effect and Model Performance in Downscaling ASTER Land Surface Temperatures to a Very High Spatial Resolution in an Agricultural Area

    NASA Astrophysics Data System (ADS)

    Zhou, J.; Li, G.; Liu, S.; Zhan, W.; Zhang, X.

    2015-12-01

    At present, land surface temperatures (LSTs) can be generated from thermal infrared remote sensing with spatial resolutions from ~100 m to tens of kilometers. However, LSTs with high spatial resolution, e.g. tens of meters, are still lacking. The purpose of LST downscaling is to generate LSTs at finer spatial resolutions than their native ones. Statistical linear or nonlinear regression models are the most frequently used for LST downscaling. The basic assumption of these models is a scale-invariant relationship between LST and its descriptors, an assumption that is questionable but has rarely been examined. In addition, few studies have addressed downscaling satellite LST or TIR data to a high spatial resolution, i.e. 100 m or finer. This lack of high-spatial-resolution LSTs cannot satisfy the requirements of applications such as evapotranspiration mapping at the field scale. Taking a dynamically developing agricultural oasis as the study area, the aim of this study is to downscale Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) LSTs to 15 m, to satisfy the requirements of evapotranspiration mapping at the field scale. Twelve ASTER images from May to September 2012, covering the entire growth stage of maize, were selected. Four statistical models were evaluated: one global model, one piecewise model, and two local models. The influence of scale effects in downscaling LST was quantified, and the downscaled LSTs were evaluated for accuracy and image quality. Results demonstrate that the influence of scale effects varies with the model and the maize growth stage: a significant influence of about -4 K to 6 K existed at the early stage, with a weaker influence in the middle stage. When compared with ground-measured LSTs, the downscaled LSTs from the global and piecewise models yielded higher accuracies and better image quality than those from the local models. In addition to the

  11. Applying downscaled global climate model data to a hydrodynamic surface-water and groundwater model

    USGS Publications Warehouse

    Swain, Eric; Stefanova, Lydia; Smith, Thomas

    2014-01-01

    Precipitation data from Global Climate Models have been downscaled to smaller regions. Adapting these downscaled precipitation data to a coupled hydrodynamic surface-water/groundwater model of southern Florida allows an examination of future conditions and their effect on groundwater levels, inundation patterns, surface-water stage and flows, and salinity. The downscaled rainfall data include the 1996-2001 time series from the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-40 simulation and both the 1996-1999 and 2038-2057 time series from two global climate models: the Community Climate System Model (CCSM) and the Geophysical Fluid Dynamics Laboratory (GFDL) model. Synthesized surface-water inflow datasets were developed for the 2038-2057 simulations. The resulting hydrologic simulations, with and without a 30-cm sea-level rise, were compared with each other and with field data to analyze a range of projected conditions. Simulations predicted generally higher future stage, groundwater levels, and surface-water flows, with sea-level rise inducing higher coastal salinities. A coincident rise in sea level, precipitation and surface-water flows resulted in a narrower inland saline/fresh transition zone. The inland areas were affected more by the rainfall differences than by the sea-level rise; the rainfall differences made little difference in coastal inundation but a larger difference in coastal salinities.

  12. Hydrologic downscaling of soil moisture using global data without site-specific calibration

    USDA-ARS?s Scientific Manuscript database

    Numerous applications require fine-resolution (10-30 m) soil moisture patterns, but most satellite remote sensing and land-surface models provide coarse-resolution (9-60 km) soil moisture estimates. The Equilibrium Moisture from Topography, Vegetation, and Soil (EMT+VS) model downscales soil moistu...

  13. New technique for ensemble dressing combining Multimodel SuperEnsemble and precipitation PDF

    NASA Astrophysics Data System (ADS)

    Cane, D.; Milelli, M.

    2009-09-01

    The Multimodel SuperEnsemble technique (Krishnamurti et al., Science 285, 1548-1550, 1999) is a postprocessing method for the estimation of weather forecast parameters that reduces direct model output errors. It differs from other ensemble analysis techniques in its use of an adequate weighting of the input forecast models to obtain a combined estimate of meteorological parameters. Weights are calculated by least-squares minimization of the difference between the model and the observed field during a so-called training period. Although it has been applied successfully to continuous parameters such as temperature, humidity, wind speed and mean sea level pressure (Cane and Milelli, Meteorologische Zeitschrift, 15, 2, 2006), the Multimodel SuperEnsemble also gives good results when applied to precipitation, a parameter that is quite difficult to handle with standard post-processing methods. Here we present our methodology for Multimodel precipitation forecasts, applied to a wide spectrum of results over the very dense non-GTS weather station network of Piemonte. We will focus particularly on an accurate statistical method for bias correction and on ensemble dressing in agreement with the observed forecast-conditioned precipitation PDF. Acknowledgement: this work is supported by the Italian Civil Defence Department.
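    The weight calculation described above reduces to a least-squares regression of observed anomalies on model anomalies over the training period. A minimal one-point sketch with synthetic data — the member biases, noise levels, and sample size are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training period: 200 forecast times, 4 member models,
# all values synthetic stand-ins for a single grid point.
truth = rng.normal(15.0, 5.0, size=200)                   # observed field
members = np.stack([truth + rng.normal(b, 1.0, size=200)  # biased, noisy models
                    for b in (1.5, -2.0, 0.5, 3.0)], axis=1)

# SuperEnsemble weights: least-squares fit of observed anomalies on
# model anomalies over the training period.
obs_mean = truth.mean()
mem_mean = members.mean(axis=0)
w, *_ = np.linalg.lstsq(members - mem_mean, truth - obs_mean, rcond=None)

def superensemble(new_members):
    """Combined forecast = observed climatology + weighted model anomalies."""
    return obs_mean + (new_members - mem_mean) @ w

def rmse(forecast):
    return float(np.sqrt(np.mean((forecast - truth) ** 2)))
```

On the training period this weighted combination cannot do worse than the plain ensemble mean, since equal weights are one point in the least-squares search space.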

  14. Downscaling of Aircraft-, Landsat-, and MODIS-based Land Surface Temperature Images with Support Vector Machines

    NASA Astrophysics Data System (ADS)

    Ha, W.; Gowda, P. H.; Oommen, T.; Howell, T. A.; Hernandez, J. E.

    2010-12-01

    High spatial resolution Land Surface Temperature (LST) images are required to estimate evapotranspiration (ET) at the field scale for irrigation scheduling purposes. Satellite sensors such as the Landsat 5 Thematic Mapper (TM) and the Moderate Resolution Imaging Spectroradiometer (MODIS) can offer images at several spectral bandwidths, including visible, near-infrared (NIR), shortwave-infrared, and thermal-infrared (TIR). The TIR images usually have coarser spatial resolutions than those from non-thermal-infrared bands. Due to this technical constraint of the satellite sensors on these platforms, image downscaling has been proposed in the field of ET remote sensing. This paper explores the potential of Support Vector Machines (SVM) to perform downscaling of LST images derived from aircraft (4 m spatial resolution), TM (120 m), and MODIS (1000 m) using normalized difference vegetation index images derived from simultaneously acquired high-resolution visible and NIR data (1 m for aircraft, 30 m for TM, and 250 m for MODIS). The SVM is a new-generation machine learning algorithm that has found wide application in pattern recognition and time series analysis. It is well suited to downscaling problems due to its generalization ability in capturing non-linear regression relationships between the predictand and multiple predictors. Remote sensing data acquired over the Texas High Plains during the 2008 summer growing season will be used in this study. Accuracy assessment of the downscaled 1, 30, and 250 m LST images will be made by comparing them with LST data measured with infrared thermometers at a small spatial scale, upscaled 30 m aircraft-based LST images, and upscaled 250 m TM-based LST images, respectively.

  15. A Spatial Allocation Procedure to Downscale Regional Crop Production Estimates from an Integrated Assessment Model

    NASA Astrophysics Data System (ADS)

    Moulds, S.; Djordjevic, S.; Savic, D.

    2017-12-01

    The Global Change Assessment Model (GCAM), an integrated assessment model, provides insight into the interactions and feedbacks between physical and human systems. The land system component of GCAM, which simulates land use activities and the production of major crops, produces output at the subregional level, which must be spatially downscaled in order to be used with gridded impact assessment models. However, existing downscaling routines typically treat cropland as a homogeneous class and do not provide information about land use intensity or specific management practices such as irrigation and multiple cropping. This paper presents a spatial allocation procedure to downscale crop production data from GCAM to a spatial grid, producing a time series of maps that show the spatial distribution of specific crops (e.g. rice, wheat, maize) at four input levels (subsistence, low-input rainfed, high-input rainfed and high-input irrigated). The model algorithm is constrained by the available cropland at each time point and therefore implicitly balances extensification and intensification processes in order to meet global food demand. It utilises a stochastic approach such that an increase in production of a particular crop is more likely to occur in grid cells with high biophysical suitability and neighbourhood influence, while a fall in production will occur more often in cells with lower suitability. User-supplied rules define the order in which specific crops are downscaled as well as the allowable transitions. A regional case study demonstrates the ability of the model to reproduce historical trends in India by comparing the model output with district-level agricultural inventory data. Lastly, the model is used to predict the spatial distribution of crops globally under various GCAM scenarios.
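    The stochastic allocation idea can be illustrated on a toy grid: each increment of production lands in a random cell with probability proportional to biophysical suitability, skipping cells already at their cropland capacity. The grid size, suitability values, and capacities below are invented; this sketches only the core allocation logic, not the neighbourhood term, transition rules, or crop ordering of the actual procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 100-cell grid: suitability in [0, 1], fixed cropland capacity.
suitability = rng.random(100)
capacity = np.full(100, 10.0)      # max production units per cell
allocated = np.zeros(100)

def allocate(total, steps=1000):
    """Stochastically allocate `total` production units: each unit goes to a
    random open cell, with probability proportional to its suitability."""
    unit = total / steps
    for _ in range(steps):
        open_cells = allocated < capacity        # respect the cropland constraint
        p = suitability * open_cells
        p = p / p.sum()
        cell = rng.choice(len(allocated), p=p)
        allocated[cell] += unit

allocate(total=300.0)
```

A fall in production would be handled symmetrically, removing units preferentially from low-suitability cells.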

  16. Regional Climate Change across the Continental U.S. Projected from Downscaling IPCC AR5 Simulations

    NASA Astrophysics Data System (ADS)

    Otte, T. L.; Nolte, C. G.; Otte, M. J.; Pinder, R. W.; Faluvegi, G.; Shindell, D. T.

    2011-12-01

    Projecting climate change scenarios to local scales is important for understanding and mitigating the effects of climate change on society and the environment. Many of the general circulation models (GCMs) that are participating in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) do not fully resolve regional-scale processes and therefore cannot capture local changes in temperature and precipitation extremes. We seek to project the GCM's large-scale climate change signal to the local scale using a regional climate model (RCM) by applying dynamical downscaling techniques. The RCM will be used to better understand the local changes of temperature and precipitation extremes that may result from a changing climate. Preliminary results from downscaling NASA/GISS ModelE simulations of the IPCC AR5 Representative Concentration Pathway (RCP) scenario 6.0 will be shown. The Weather Research and Forecasting (WRF) model will be used as the RCM to downscale decadal time slices for ca. 2000 and ca. 2030 and illustrate potential changes in regional climate for the continental U.S. that are projected by ModelE and WRF under RCP6.0.

  17. On the improvement of wave and storm surge hindcasts by downscaled atmospheric forcing: application to historical storms

    NASA Astrophysics Data System (ADS)

    Bresson, Émilie; Arbogast, Philippe; Aouf, Lotfi; Paradis, Denis; Kortcheva, Anna; Bogatchev, Andrey; Galabov, Vasko; Dimitrova, Marieta; Morvan, Guillaume; Ohl, Patrick; Tsenova, Boryana; Rabier, Florence

    2018-04-01

    Winds, waves and storm surges can inflict severe damage in coastal areas. In order to improve preparedness for such events, a better understanding of storm-induced coastal flooding episodes is necessary. To this end, this paper highlights the use of atmospheric downscaling techniques in order to improve wave and storm surge hindcasts. The downscaling techniques used here are based on existing European Centre for Medium-Range Weather Forecasts reanalyses (ERA-20C, ERA-40 and ERA-Interim). The results show that the 10 km resolution data forcing provided by a downscaled atmospheric model gives a better wave and surge hindcast compared to using data directly from the reanalysis. Furthermore, the analysis of the most extreme mid-latitude cyclones indicates that a four-dimensional blending approach improves the whole process, as it assimilates more small-scale processes in the initial conditions. Our approach has been successfully applied to ERA-20C (the 20th century reanalysis).

  18. Verifying and Postprocessing the Ensemble Spread-Error Relationship

    NASA Astrophysics Data System (ADS)

    Hopson, Tom; Knievel, Jason; Liu, Yubao; Roux, Gregory; Wu, Wanli

    2013-04-01

    With the increased utilization of ensemble forecasts in weather and hydrologic applications, there is a need to verify their benefit over less expensive deterministic forecasts. One such potential benefit of ensemble systems is their capacity to forecast their own forecast error through the ensemble spread-error relationship. The paper begins by revisiting the limitations of the Pearson correlation alone in assessing this relationship. Next, we introduce two new metrics to consider in assessing the utility of an ensemble's varying dispersion. We argue that there are two aspects of an ensemble's dispersion that should be assessed. First, and perhaps more fundamentally: is there enough variability in the ensemble's dispersion to justify the maintenance of an expensive ensemble prediction system (EPS), irrespective of whether the EPS is well calibrated or not? To diagnose this, the factor that controls the theoretical upper limit of the spread-error correlation can be useful. Second, does the variable dispersion of an ensemble relate to a variable expectation of forecast error? Representing the spread-error correlation in relation to its theoretical limit can provide a simple diagnostic of this attribute. A context for these concepts is provided by assessing two operational ensembles: 30-member western U.S. temperature forecasts for the U.S. Army Test and Evaluation Command and 51-member Brahmaputra River flow forecasts of the Climate Forecast and Applications Project for Bangladesh. Both of these systems utilize a postprocessing technique based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. In addition, the methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships.
We will describe both ensemble systems briefly, review the steps used to calibrate the ensemble forecast, and present
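    The spread-error diagnostics described above can be sketched with a synthetic, perfectly reliable ensemble: the absolute error of each forecast is drawn with standard deviation equal to that case's ensemble spread, and the realized spread-error correlation is compared to a Monte Carlo estimate of its theoretical upper limit. The sample size and spread range are illustrative, not taken from either operational system.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 5000
# Case-to-case ensemble standard deviations (the "varying dispersion").
spread = rng.uniform(0.5, 3.0, size=n)
# For a perfectly reliable ensemble, the error of the ensemble mean is
# drawn with standard deviation equal to the spread.
error = np.abs(rng.normal(0.0, spread))

def pearson(x, y):
    return float(np.corrcoef(x, y)[0, 1])

# Realized spread-error correlation ...
r = pearson(spread, error)

# ... versus a Monte Carlo estimate of its theoretical upper limit:
# regenerate errors from the spreads themselves and average the result.
r_limit = float(np.mean([pearson(spread, np.abs(rng.normal(0.0, spread)))
                         for _ in range(20)]))
```

Note that even for this perfectly calibrated ensemble the correlation sits well below 1, which is why the abstract argues against judging the spread-error relationship by the raw Pearson correlation alone.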

  19. Mixture models for protein structure ensembles.

    PubMed

    Hirsch, Michael; Habeck, Michael

    2008-10-01

    Protein structure ensembles provide important insight into the dynamics and function of a protein and contain information that is not captured with a single static structure. However, it is not clear a priori to what extent the variability within an ensemble is caused by internal structural changes. Additional variability results from overall translations and rotations of the molecule, and most experimental data do not provide the information needed to relate the structures to a common reference frame. To report meaningful values of intrinsic dynamics, structural precision, conformational entropy, etc., it is therefore important to disentangle local from global conformational heterogeneity. We treat the task of disentangling local from global heterogeneity as an inference problem. We use probabilistic methods to infer from the protein ensemble missing information on reference frames and stable conformational sub-states. To this end, we model a protein ensemble as a mixture of Gaussian probability distributions of either entire conformations or structural segments. We learn these models from a protein ensemble using the expectation-maximization algorithm. Our first model can be used to find multiple conformers in a structure ensemble. The second model partitions the protein chain into locally stable structural segments or core elements and less structured regions typically found in loops. Both models are simple to implement and contain only a single free parameter: the number of conformers or structural segments. Our models can be used to analyse experimental ensembles, molecular dynamics trajectories and conformational change in proteins. The Python source code for protein ensemble analysis is available from the authors upon request.
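    The first model (finding conformers in an ensemble) can be illustrated with a minimal expectation-maximization fit of a two-component Gaussian mixture to a single structural coordinate. The coordinate values and sub-state parameters below are synthetic; the authors' models operate on entire conformations or segments, not a 1-D feature, so this is only a sketch of the EM machinery.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical "ensemble": one structural coordinate (e.g. an inter-atomic
# distance) sampled from two conformational sub-states.
x = np.concatenate([rng.normal(3.0, 0.3, 300), rng.normal(6.0, 0.4, 200)])

def em_two_gaussians(x, iters=100):
    """Minimal EM for a two-component 1-D Gaussian mixture."""
    mu = np.array([x.min(), x.max()])          # spread-out initialization
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        dens = pi * np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return pi, mu, sigma

pi, mu, sigma = em_two_gaussians(x)
```

The fitted responsibilities assign each structure to a conformer, which is the sub-state labelling the abstract describes.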

  20. Image Change Detection via Ensemble Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Benjamin W; Vatsavai, Raju

    2013-01-01

    The concept of geographic change detection is relevant in many areas. Changes in geography can reveal much information about a particular location. For example, analysis of changes in geography can identify regions of population growth, change in land use, and potential environmental disturbance. A common way to perform change detection is to use a simple method such as differencing to detect regions of change. Though these techniques are simple, their application is often very limited. Recently, the use of machine learning methods such as neural networks for change detection has been explored with great success. In this work, we explore the use of ensemble learning methodologies for detecting changes in bitemporal synthetic aperture radar (SAR) images. Ensemble learning uses a collection of weak machine learning classifiers to create a stronger classifier with higher accuracy than the individual classifiers in the ensemble. The strength of the ensemble lies in the fact that its individual classifiers form a mixture of experts, in which the final classification made by the ensemble classifier is calculated from the outputs of the individual classifiers. Our methodology leverages this aspect of ensemble learning by training collections of weak decision-tree-based classifiers to identify regions of change in SAR images collected over a region in the Staten Island, New York area during Hurricane Sandy. Preliminary studies show that the ensemble method has approximately 11.5% higher change detection accuracy than an individual classifier.
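    A toy version of this approach — bagged weak learners voting on a per-pixel change feature — might look like the following. The SAR backscatter values, change rate, and one-threshold stump learner are invented for illustration and are far simpler than the decision trees used in the work described above.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical bitemporal SAR backscatter for 1000 pixels (before/after),
# with ~30% of pixels truly changed.
before = rng.normal(0.0, 1.0, 1000)
change_mask = rng.random(1000) < 0.3
after = before + np.where(change_mask,
                          rng.normal(3.0, 1.0, 1000),   # changed pixels
                          rng.normal(0.0, 0.3, 1000))   # unchanged: noise only
feature = np.abs(after - before)                        # per-pixel change feature

def train_stump(x, y):
    """Weak learner: pick the threshold on x that best separates y."""
    thresholds = np.quantile(x, np.linspace(0.05, 0.95, 19))
    accs = [((x > t) == y).mean() for t in thresholds]
    return thresholds[int(np.argmax(accs))]

# Bagging: train each weak classifier on a bootstrap sample, then take a
# majority vote -- the "mixture of experts" the abstract describes.
stumps = []
for _ in range(25):
    idx = rng.integers(0, len(feature), len(feature))
    stumps.append(train_stump(feature[idx], change_mask[idx]))

votes = np.stack([feature > t for t in stumps])
predicted = votes.mean(axis=0) > 0.5
accuracy = float((predicted == change_mask).mean())
```

Real SAR change detection would use richer per-pixel features (texture, multi-band statistics) and deeper trees, but the vote-of-bootstrap-learners structure is the same.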

  1. Activity-Dependent Downscaling of Subthreshold Synaptic Inputs during Slow-Wave-Sleep-like Activity In Vivo.

    PubMed

    González-Rueda, Ana; Pedrosa, Victor; Feord, Rachael C; Clopath, Claudia; Paulsen, Ole

    2018-03-21

    Activity-dependent synaptic plasticity is critical for cortical circuit refinement. The synaptic homeostasis hypothesis suggests that synaptic connections are strengthened during wake and downscaled during sleep; however, it is not obvious how the same plasticity rules could explain both outcomes. Using whole-cell recordings and optogenetic stimulation of presynaptic input in urethane-anesthetized mice, which exhibit slow-wave-sleep (SWS)-like activity, we show that synaptic plasticity rules are gated by cortical dynamics in vivo. While Down states support conventional spike timing-dependent plasticity, Up states are biased toward depression such that presynaptic stimulation alone leads to synaptic depression, while connections contributing to postsynaptic spiking are protected against this synaptic weakening. We find that this novel activity-dependent and input-specific downscaling mechanism has two important computational advantages: (1) improved signal-to-noise ratio, and (2) preservation of previously stored information. Thus, these synaptic plasticity rules provide an attractive mechanism for SWS-related synaptic downscaling and circuit refinement. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  2. A Theoretical Analysis of Why Hybrid Ensembles Work.

    PubMed

    Hsu, Kuo-Wei

    2017-01-01

    Inspired by the group decision making process, ensembles or combinations of classifiers have been found favorable in a wide variety of application domains. Some researchers propose to use a mixture of two different types of classification algorithms to create a hybrid ensemble. Why does such an ensemble work? The question remains. Following the concept of diversity, which is one of the fundamental elements of the success of ensembles, we conduct a theoretical analysis of why hybrid ensembles work, connecting the use of different algorithms to accuracy gain. We also conduct experiments on the classification performance of hybrid ensembles of classifiers created by the decision tree and naïve Bayes classification algorithms, each of which is a top data mining algorithm often used to create non-hybrid ensembles. Through this paper, we thus provide a complement to the theoretical foundation of creating and using hybrid ensembles.

  3. A new dynamical downscaling approach with GCM bias corrections and spectral nudging

    NASA Astrophysics Data System (ADS)

    Xu, Zhongfeng; Yang, Zong-Liang

    2015-04-01

    To improve confidence in regional projections of future climate, a new dynamical downscaling (NDD) approach with both general circulation model (GCM) bias corrections and spectral nudging is developed and assessed over North America. GCM biases are corrected by adjusting the GCM climatological means and variances based on reanalysis data before the GCM output is used to drive a regional climate model (RCM). Spectral nudging is also applied to constrain RCM-based biases. Three sets of RCM experiments are integrated over a 31 year period. In the first set of experiments, the model configurations are identical except that the initial and lateral boundary conditions are derived from either the original GCM output, the bias-corrected GCM output, or the reanalysis data. The second set of experiments is the same as the first set except that spectral nudging is applied. The third set of experiments includes two sensitivity runs with both GCM bias corrections and nudging in which the nudging strength is progressively reduced. All RCM simulations are assessed against the North American Regional Reanalysis. The results show that NDD significantly improves the downscaled mean climate and climate variability relative to other GCM-driven RCM downscaling approaches in terms of climatological mean air temperature, geopotential height, wind vectors, and surface air temperature variability. In the NDD approach, spectral nudging introduces the effects of GCM bias corrections throughout the RCM domain rather than limiting them to the initial and lateral boundary conditions, thereby minimizing climate drifts resulting from both the GCM and RCM biases.
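    The first step of the NDD approach — adjusting the GCM climatological mean and variance toward a reanalysis before the output drives the RCM — reduces to a simple rescaling. The series below are synthetic one-point stand-ins for the full 3-D fields, with invented bias values:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical 31-year daily temperature series at one grid point.
reanalysis = rng.normal(10.0, 4.0, size=31 * 365)
gcm = rng.normal(12.5, 2.5, size=31 * 365)   # biased in both mean and variance

def bias_correct(gcm, ref):
    """Adjust the GCM climatological mean and variance to match the
    reference (reanalysis) climatology."""
    scaled = (gcm - gcm.mean()) * (ref.std() / gcm.std())  # fix the variance
    return ref.mean() + scaled                             # fix the mean

corrected = bias_correct(gcm, reanalysis)
```

In the full method this adjustment is applied per variable and per calendar period to the fields supplying the RCM's initial and lateral boundary conditions, with spectral nudging then propagating the correction into the domain interior.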

  4. Advances in snow cover distributed modelling via ensemble simulations and assimilation of satellite data

    NASA Astrophysics Data System (ADS)

    Revuelto, J.; Dumont, M.; Tuzet, F.; Vionnet, V.; Lafaysse, M.; Lecourt, G.; Vernay, M.; Morin, S.; Cosme, E.; Six, D.; Rabatel, A.

    2017-12-01

    Nowadays snowpack models show good capability in simulating the evolution of snow in mountain areas. However, deviations in the meteorological forcing and shortcomings in the modelling of snow physical processes, when accumulated over a snow season, can produce large departures from the real snowpack state. These deviations are usually evaluated against on-site observations from automatic weather stations. Nevertheless, the location of these stations can strongly influence the results of such evaluations, since local topography may have a marked influence on snowpack evolution. Although evaluations of snowpack models against automatic weather stations usually show good results, large-scale evaluations of simulation results over heterogeneous alpine terrain subject to local topographic effects are lacking. This work firstly presents a complete evaluation of the detailed snowpack model Crocus over an extended mountain area, the upper Arve catchment (western European Alps). This catchment has a wide elevation range, with a large area above 2000 m a.s.l. and/or glaciated. The evaluation compares results obtained with distributed and semi-distributed simulations (the latter currently used in operational forecasting). Daily observations of the snow-covered area from the MODIS satellite sensor, the seasonal glacier surface mass balance measured at more than 65 locations, and the glaciers' annual equilibrium-line altitude from Landsat/SPOT/ASTER satellites have been used for model evaluation. Additionally, the latest advances in producing ensemble snowpack simulations for assimilating satellite reflectance data over extended areas will be presented. These advances comprise the generation of an ensemble of downscaled high-resolution meteorological forcings from meso-scale meteorological models and the application of a particle filter scheme for assimilating satellite observations. Although the results are preliminary, they show a good

  5. Downscaling the Local Weather Above Glaciers in Complex Topography

    NASA Astrophysics Data System (ADS)

    Horak, Johannes; Hofer, Marlis; Gutmann, Ethan; Gohm, Alexander; Rotach, Mathias

    2017-04-01

    Glaciers have experienced a substantial ice-volume loss during the 20th century. To study their response to climate change, process-based glacier mass-balance models (PBGMs) are employed, which require a faithful representation of the state of the atmosphere above the glacier at high spatial and temporal resolution. Glaciers are usually located in complex topography where weather stations are scarce or non-existent, owing to the remoteness of such sites and the associated high cost of maintenance. Furthermore, the effective resolution of global circulation models is too coarse to adequately capture the local topography and represent the local weather, which is a prerequisite for the atmospheric input used by PBGMs. Dynamical downscaling is a physically consistent but computationally expensive approach to bridge the scale gap between GCM output and the input needed by PBGMs, while statistical downscaling is faster but requires measurements for training. Both methods have their merits; however, a computationally frugal approach that does not rely on measurements is desirable, especially for long-term studies of glacier response to future climate. In this study the Intermediate Complexity Atmospheric Research model (ICAR) is employed (Gutmann et al., 2016). It simplifies the wind-field physics by relying on analytical solutions derived with linear theory, and then advects atmospheric quantities within this wind field. This allows for computationally fast downscaling and yields a physically consistent set of atmospheric variables. First results obtained from downscaling air temperature, precipitation amount, relative humidity and wind speed to 4 × 4 km2 are presented. As a preliminary test, ICAR is applied for a six-month simulation period in each of five years and evaluated for three domains located in very distinct climates, namely the Southern Alps of New Zealand, the Cordillera Blanca in Peru and the European Alps, using ERA-Interim reanalysis data (ERAI) as the forcing data set. The

  6. Key Findings from the U.S.-India Partnership for Climate Resilience Workshop on Development and Application of Downscaling Climate Projections

    NASA Astrophysics Data System (ADS)

    Kunkel, K.; Dissen, J.; Easterling, D. R.; Kulkarni, A.; Akhtar, F. H.; Hayhoe, K.; Stoner, A. M. K.; Swaminathan, R.; Thrasher, B. L.

    2017-12-01

    As part of the Department of State U.S.-India Partnership for Climate Resilience (PCR), scientists from NOAA NCEI, CICS-NC, Texas Tech University (TTU), Stanford University (SU), and the Indian Institute of Tropical Meteorology (IITM) held a workshop at IITM in Pune, India during 7-9 March 2017 on the development, techniques, and applications of downscaled climate projections. Workshop participants from TTU, SU, and IITM presented state-of-the-art climate downscaling techniques using the ARRM method, NASA NEX climate products, CORDEX-South Asia, and analysis tools for resilience planning and sustainable development. PCR collaborators in attendance included Indian practitioners, researchers, and other NGOs, including the WRI Partnership for Resilience and Preparedness (PREP), The Energy and Resources Institute (TERI), and NIH. The scientific techniques were provided to workshop participants in a software package written in R by TTU scientists, and several sessions were devoted to hands-on experience with the software package. The workshop further examined case studies on the use of downscaled climate data for decision making in a range of sectors, including human health, agriculture, and water resources management, as well as to inform the development of the India State Action Plans. This talk will discuss key outcomes, including information needs for downscaling climate projections, the importance of QA/QC of the data, key findings from select case studies, and the importance of collaborations and partnerships in applying downscaled projections to help inform the development of the India State Action Plans.

  7. Downscaling Coarse Scale Microwave Soil Moisture Product using Machine Learning

    NASA Astrophysics Data System (ADS)

    Abbaszadeh, P.; Moradkhani, H.; Yan, H.

    2016-12-01

    Soil moisture (SM) is a key variable in partitioning and examining the global water-energy cycle, agricultural planning, and water resource management. It is also strongly coupled with climate change, playing an important role in weather forecasting, drought monitoring and prediction, flood modeling, and irrigation management. Although satellite retrievals can provide unprecedented information on soil moisture at a global scale, the products might be inadequate for basin-scale study or regional assessment. To improve the spatial resolution of SM, this work presents a novel approach based on a Machine Learning (ML) technique that allows for downscaling of the satellite soil moisture to fine resolution. For this purpose, the SMAP L-band radiometer SM products were used and conditioned on the Variable Infiltration Capacity (VIC) model prediction to describe the relationship between the coarse and fine scale soil moisture data. The proposed downscaling approach was applied to a western US basin and the products were compared against the available SM data from in-situ gauge stations. The obtained results indicate the great potential of the machine learning technique to derive the fine-resolution soil moisture information that is currently used for land data assimilation applications.
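    The coarse-to-fine conditioning described in this abstract can be sketched as follows. This is a minimal illustration with synthetic data: the covariates, coefficients, grid size, and the plain least-squares model are stand-ins for the authors' actual ML setup, not a reproduction of it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: one coarse radiometer pixel containing a 9 x 9 grid
# of fine cells. Covariates and coefficients are hypothetical.
n_fine = 81
elev = rng.uniform(0.0, 1.0, n_fine)      # normalized elevation (stand-in)
ndvi = rng.uniform(0.1, 0.8, n_fine)      # vegetation index (stand-in)

# Fine-scale soil moisture from a land surface model (plays the VIC role).
sm_fine_vic = 0.30 - 0.10 * elev + 0.15 * ndvi + rng.normal(0.0, 0.01, n_fine)
sm_coarse = sm_fine_vic.mean()            # coarse pixel value (plays the SMAP role)

# Learn the fine-scale relationship: a plain least-squares fit of fine-scale SM
# on the fine-scale covariates (a stand-in for the trained ML model).
X = np.column_stack([np.ones(n_fine), elev, ndvi])
beta, *_ = np.linalg.lstsq(X, sm_fine_vic, rcond=None)

# Downscale: predict fine-scale SM, then shift so the fine-cell mean
# reproduces the coarse observation (preserves the satellite signal).
sm_fine_pred = X @ beta + (sm_coarse - (X @ beta).mean())
```

    The mean-preserving shift in the last step is one common design choice; other schemes redistribute the coarse value multiplicatively.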

  8. Input Decimated Ensembles

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Oza, Nikunj C.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    Using an ensemble of classifiers instead of a single classifier has been shown to improve generalization performance in many pattern recognition problems. However, the extent of such improvement depends greatly on the amount of correlation among the errors of the base classifiers. Therefore, reducing those correlations while keeping the classifiers' performance levels high is an important area of research. In this article, we explore input decimation (ID), a method which selects feature subsets for their ability to discriminate among the classes and uses them to decouple the base classifiers. We provide a summary of the theoretical benefits of correlation reduction, along with results of our method on two underwater sonar data sets, three benchmarks from the Proben1/UCI repositories, and two synthetic data sets. The results indicate that input decimated ensembles (IDEs) outperform ensembles whose base classifiers use all the input features; randomly selected subsets of features; and features created using principal components analysis, on a wide range of domains.
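    The idea of input decimation can be illustrated with a small synthetic sketch: features are ranked by their ability to discriminate the classes, each base classifier sees a different subset, and the ensemble votes. The data, the nearest-class-mean base learner, and the particular subset choices below are all hypothetical, chosen only to keep the sketch self-contained; they are not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic 2-class data: 20 features, of which only the first two drive the label.
n, d = 400, 20
X = rng.normal(size=(n, d))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Input decimation: rank features by |correlation with the class label|
# and give each base classifier a different top subset.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(d)])
top = np.argsort(corr)[::-1]
subsets = [top[:4], top[1:5], top[2:6]]    # three partially decoupled learners

# Base classifier: nearest class-mean on the selected features (simple stand-in).
def fit_predict(cols):
    mu0 = X[y == 0][:, cols].mean(axis=0)
    mu1 = X[y == 1][:, cols].mean(axis=0)
    d0 = ((X[:, cols] - mu0) ** 2).sum(axis=1)
    d1 = ((X[:, cols] - mu1) ** 2).sum(axis=1)
    return (d1 < d0).astype(int)

# Ensemble by majority vote over the decimated base classifiers.
votes = np.stack([fit_predict(c) for c in subsets])
pred = (votes.mean(axis=0) > 0.5).astype(int)
acc = (pred == y).mean()
```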

  9. Passive microwave soil moisture downscaling using vegetation index and skin surface temperature

    USDA-ARS?s Scientific Manuscript database

    Soil moisture satellite estimates are available from a variety of passive microwave satellite sensors, but their spatial resolution is frequently too coarse for use by land managers and other decision makers. In this paper, a soil moisture downscaling algorithm based on a regression relationship bet...

  10. Evaluating the ClimEx Single Model Large Ensemble in Comparison with EURO-CORDEX Results of Seasonal Means and Extreme Precipitation Indicators

    NASA Astrophysics Data System (ADS)

    von Trentini, F.; Schmid, F. J.; Braun, M.; Brisette, F.; Frigon, A.; Leduc, M.; Martel, J. L.; Willkofer, F.; Wood, R. R.; Ludwig, R.

    2017-12-01

    Meteorological extreme events seem to be becoming more frequent in the present and future, and a separation of natural climate variability from a clear climate change effect on these extreme events gains more and more interest. Since there is only one realisation of historical events, assessing natural variability in terms of very long time series for a robust statistical analysis is not possible with observation data. A new single model large ensemble (SMLE), developed for the ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec), is intended to overcome this lack of data by downscaling 50 members of the CanESM2 (RCP 8.5) with the Canadian CRCM5 regional model (using the EURO-CORDEX grid specifications) for time series of 1950-2099 each, resulting in 7500 years of simulated climate. This allows for a better probabilistic analysis of rare and extreme events than any preceding dataset. Besides seasonal sums, several extreme indicators like R95pTOT, RX5day and others are calculated for the ClimEx ensemble and several EURO-CORDEX runs. This enables us to investigate the interaction between natural variability (as it appears in the CanESM2-CRCM5 members) and the climate change signal of those members for past, present and future conditions. Adding the EURO-CORDEX results, we can also assess the role of internal model variability (or natural variability) in climate change simulations. A first comparison shows similar magnitudes of variability of climate change signals between the ClimEx large ensemble and the CORDEX runs for some indicators, while for most indicators the spread of the SMLE is smaller than the spread of the different CORDEX models.

  11. Downscaling climate information for local disease mapping.

    PubMed

    Bernardi, M; Gommes, R; Grieser, J

    2006-06-01

    The study of the impacts of climate on human health requires the interdisciplinary efforts of health professionals, climatologists, biologists, and social scientists to analyze the relationships among physical, biological, ecological, and social systems. As disease dynamics respond to variations in regional and local climate, climate variability affects every region of the world, and diseases are not necessarily limited to specific regions, so that vectors may become endemic in other regions. Climate data at the local level are thus essential to evaluate the dynamics of vector-borne disease through health-climate models, and most of the time the climatological databases are not adequate. Climate data at high spatial resolution can be derived by statistical downscaling using historical observations, but the method is limited by the availability of historical data at the local level. Since the 1990s, the statistical interpolation of climate data has been an important priority of the Agrometeorology Group of the Food and Agriculture Organization of the United Nations (FAO), as such data are required for agricultural planning and operational activities at the local level. Since 1995, when the first FAO spatial interpolation software for climate data was released, more advanced applications have been developed, such as SEDI (Satellite Enhanced Data Interpolation) for the downscaling of climate data, and LOCCLIM (Local Climate Estimator) and NEW_LOCCLIM, developed in collaboration with the Deutscher Wetterdienst (German Weather Service), to estimate climatic conditions at locations for which no observations are available. In parallel, an important effort has been made to improve the FAO climate database, which at present includes more than 30,000 stations worldwide, expanding the database from developing-country coverage to global coverage.
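    A local climate estimator in the spirit of LOCCLIM can be sketched under the assumption of simple inverse-distance weighting; the actual FAO software uses more elaborate interpolation, and the station coordinates and values here are hypothetical.

```python
import numpy as np

# Hypothetical station coordinates (deg), observed mean temperature (°C),
# and a target location with no observations.
stations = np.array([[10.0, 45.0], [10.5, 45.2], [9.8, 44.7], [10.2, 44.9]])
temps = np.array([12.1, 11.4, 13.0, 12.5])
target = np.array([10.1, 45.0])

# Inverse-distance weighting: nearer stations get larger weights.
dist = np.linalg.norm(stations - target, axis=1)
w = 1.0 / dist ** 2
t_est = (w * temps).sum() / w.sum()

# The estimate is a convex combination, so it stays within the observed range.
```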

  12. A Theoretical Analysis of Why Hybrid Ensembles Work

    PubMed Central

    2017-01-01

    Inspired by the group decision making process, ensembles or combinations of classifiers have been found favorable in a wide variety of application domains. Some researchers propose to use a mixture of two different types of classification algorithms to create a hybrid ensemble. Why does such an ensemble work? The question remains open. Following the concept of diversity, which is one of the fundamental elements of the success of ensembles, we conduct a theoretical analysis of why hybrid ensembles work, connecting the use of different algorithms to accuracy gain. We also conduct experiments on the classification performance of hybrid ensembles of classifiers created by the decision tree and naïve Bayes classification algorithms, each of which is a top data mining algorithm often used to create non-hybrid ensembles. Through this paper, we therefore provide a complement to the theoretical foundation for creating and using hybrid ensembles. PMID:28255296

  13. Statistical downscaling of daily precipitation over Llobregat river basin in Catalonia (Spain) using three downscaling methods.

    NASA Astrophysics Data System (ADS)

    Ballinas, R.; Versini, P.-A.; Sempere, D.; Escaler, I.

    2009-09-01

    environmental impact studies. Downscaling methods have been developed to assess the effect of large-scale circulations on local parameters. Statistical downscaling methods are based on the view that regional climate is conditioned by two factors: the large-scale climatic state and regional/local features. Local climate information is derived by first developing a statistical model which relates large-scale variables or "predictors", for which GCMs are reliable, to regional or local surface "predictands", for which models are less skilful. The main advantage of these methods is that they are computationally inexpensive and can be applied to outputs from different GCM experiments. Three statistical downscaling methods are applied: the analogue method, delta change and direct forcing. These methods have been used to determine daily precipitation projections at rain gauge locations to study the intensity, frequency and variability of storms in a context of climate change in the Llobregat River Basin in Catalonia, Spain. This work is part of the European project "Water Change" (included in the LIFE+ Environment Policy and Governance program), which deals with medium- and long-term water resources modelling as a tool for planning and global change adaptation. Two stakeholders involved in the project provided the historical time series: the Catalan Water Agency (ACA) and the State Meteorological Agency (AEMET).
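    Of the three methods, the delta change approach is the simplest to sketch: the GCM's own change signal is applied to the observed series. The synthetic series and the purely multiplicative perturbation below are illustrative assumptions, not the project's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily precipitation (mm) for one month: observed, GCM control, GCM future.
obs = rng.gamma(2.0, 3.0, 30)
gcm_ctrl = rng.gamma(2.0, 3.0, 30)
gcm_fut = rng.gamma(2.0, 3.6, 30)   # wetter future in this synthetic example

# Multiplicative delta-change factor taken from the GCM's own climate change signal.
delta = gcm_fut.mean() / gcm_ctrl.mean()

# Perturb the observed series: local day-to-day structure is preserved,
# only the mean is scaled by the GCM change signal.
obs_fut = obs * delta
```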

  14. Effect of land model ensemble versus coupled model ensemble on the simulation of precipitation climatology and variability

    NASA Astrophysics Data System (ADS)

    Wei, Jiangfeng; Dirmeyer, Paul A.; Yang, Zong-Liang; Chen, Haishan

    2017-10-01

    Through a series of model simulations with an atmospheric general circulation model coupled to three different land surface models, this study investigates the impacts of land model ensembles and the coupled model ensemble on precipitation simulation. It is found that coupling an ensemble of land models to an atmospheric model has a very minor impact on the improvement of precipitation climatology and variability, but a simple ensemble average of the precipitation from three individually coupled land-atmosphere models produces better results, especially for precipitation variability. The generally weak impact of land processes on precipitation should be the main reason that the land model ensembles do not improve precipitation simulation. However, if there are large biases in the land surface model or land surface data set, correcting them could improve the simulated climate, especially for well-constrained regional climate simulations.

  15. Climate Change Impact Assessment in Pacific North West Using Copula based Coupling of Temperature and Precipitation variables

    NASA Astrophysics Data System (ADS)

    Qin, Y.; Rana, A.; Moradkhani, H.

    2014-12-01

    Multi-model downscaled-scenario products allow us to better assess the uncertainty of the changes/variations of precipitation and temperature in the current and future periods. Joint probability distribution functions (PDFs) of both climatic variables might help better understand the interdependence of the two, and thus in turn help in assessing the future with confidence. Using the joint distribution of temperature and precipitation is also of significant importance in hydrological applications and climate change studies. In the present study, we have used a multi-model statistically downscaled-scenario ensemble of precipitation and temperature variables built from 2 different statistically downscaled climate datasets. The datasets used are 10 Global Climate Model (GCM) downscaled products from the CMIP5 daily dataset, namely those from the Bias Correction and Spatial Downscaling (BCSD) technique generated at Portland State University and from the Multivariate Adaptive Constructed Analogs (MACA) technique generated at the University of Idaho, leading to 2 ensemble time series from 20 GCM products. Thereafter the ensemble PDFs of both precipitation and temperature are evaluated for summer, winter, and yearly periods for all 10 sub-basins across the Columbia River Basin (CRB). Eventually, a copula is applied to establish the joint distribution of the two variables, enabling users to model the joint behavior of the variables with any level of correlation and dependency. Moreover, the probabilistic distribution helps remove the limitations on the marginal distributions of the variables in question. The joint distribution is then used to estimate the change trends of joint precipitation and temperature in the current and future periods, along with estimation of the probabilities of the given change. Results have indicated varied change trends of the joint distribution at summer, winter, and yearly time scales, respectively, in all 10 sub-basins. Probabilities of changes, as estimated
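    The copula step can be sketched with a Gaussian copula on synthetic temperature-precipitation data. The abstract does not specify the copula family, so the Gaussian choice, the margins, and the sample sizes below are assumptions made only for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic monthly temperature (°C) and precipitation (mm) with negative dependence.
n = 2000
z = rng.multivariate_normal([0, 0], [[1, -0.6], [-0.6, 1]], size=n)
temp = 15 + 5 * z[:, 0]
precip = stats.gamma(2.0, scale=40).ppf(stats.norm.cdf(z[:, 1]))

# Gaussian copula fit: transform margins to uniforms via empirical ranks,
# then estimate the dependence on the normal-score scale.
u_t = stats.rankdata(temp) / (n + 1)
u_p = stats.rankdata(precip) / (n + 1)
rho = np.corrcoef(stats.norm.ppf(u_t), stats.norm.ppf(u_p))[0, 1]

# Sample jointly from the fitted copula with the empirical margins,
# preserving the temperature-precipitation dependence.
z_new = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
temp_new = np.quantile(temp, stats.norm.cdf(z_new[:, 0]))
precip_new = np.quantile(precip, stats.norm.cdf(z_new[:, 1]))
```

    The rank transform makes the copula fit independent of the marginal distributions, which is the property the abstract highlights.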

  16. Statistical downscaling of summer precipitation over northwestern South America

    NASA Astrophysics Data System (ADS)

    Palomino Lemus, Reiner; Córdoba Machado, Samir; Raquel Gámiz Fortis, Sonia; Castro Díez, Yolanda; Jesús Esteban Parra, María

    2015-04-01

    In this study a statistical downscaling (SD) model using Principal Component Regression (PCR) for simulating summer precipitation in Colombia during the period 1950-2005 has been developed, and climate projections for the 2071-2100 period have been obtained by applying the SD model. To this end, the principal components (PCs) of the SLP reanalysis data from NCEP were used as predictor variables, while the observed gridded summer precipitation was the predictand variable. The period 1950-1993 was utilized for calibration and 1994-2010 for validation. Bootstrapping with replacement was applied to provide estimates of the statistical errors. All models perform reasonably well at regional scales, and the spatial distribution of the correlation coefficients between predicted and observed gridded precipitation values shows high values (between 0.5 and 0.93) along the Andes range and in the north and northern Pacific of Colombia. Additionally, the ability of the MIROC5 GCM to simulate summer precipitation in Colombia for the present climate (1971-2005) has been analyzed by calculating the differences between the simulated and observed precipitation values. The simulation obtained by this GCM strongly overestimates the precipitation along a horizontal sector through the center of Colombia, especially in the east and west of the country. However, the SD model applied to the SLP of the GCM shows its ability to faithfully reproduce the rainfall field. Finally, in order to obtain summer precipitation projections in Colombia for the period 1971-2100, the downscaled model, recalibrated for the total period 1950-2010, has been applied to the SLP output from the MIROC5 model under the RCP2.6, RCP4.5 and RCP8.5 scenarios. The changes estimated by the SD models are not significant under the RCP2.6 scenario, while for the RCP4.5 and RCP8.5 scenarios a significant increase of precipitation appears with regard to present values in all the regions, reaching around 27% in northern
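    A minimal principal component regression (PCR) of the kind described can be sketched as follows, with synthetic SLP and precipitation data standing in for the NCEP reanalysis and the observed gridded field; the grid size, number of retained PCs, and signal structure are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic training data: 44 seasons of a large-scale SLP field (200 grid points)
# and co-located summer precipitation at one site.
n_t, n_grid = 44, 200
slp = rng.normal(size=(n_t, n_grid))
true_pattern = rng.normal(size=n_grid)
precip = slp @ true_pattern / n_grid + rng.normal(0, 0.05, n_t)

# PCR step 1: EOF/PC decomposition of the predictor anomaly field via SVD.
slp_anom = slp - slp.mean(axis=0)
U, S, Vt = np.linalg.svd(slp_anom, full_matrices=False)
k = 5                                # retain the leading 5 PCs (assumed)
pcs = U[:, :k] * S[:k]

# PCR step 2: ordinary least squares of the predictand on the retained PCs.
X = np.column_stack([np.ones(n_t), pcs])
beta, *_ = np.linalg.lstsq(X, precip, rcond=None)
precip_hat = X @ beta

# In-sample skill only; a real application needs split-sample validation,
# as in the paper's 1950-1993 calibration / 1994-2010 validation design.
fit_corr = np.corrcoef(precip, precip_hat)[0, 1]
```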

  17. Impacts of high resolution model downscaling in coastal regions

    NASA Astrophysics Data System (ADS)

    Bricheno, Lucy; Wolf, Judith

    2013-04-01

    With model development and cheaper computational resources, ocean forecasts are becoming readily available, and high-resolution coastal forecasting is now a reality. This can only be achieved, however, by downscaling global or basin-scale products such as the MyOcean reanalyses and forecasts. These model products have resolutions ranging from 1/16 to 1/4 degree, which are often insufficient for coastal scales but can provide initialisation and boundary data. We present applications of downscaling the MyOcean products for use in shelf seas and the nearshore. We will address the question 'Do coastal predictions improve with higher resolution modelling?' with a few focused examples, while also discussing what is meant by an improved result. Increasing resolution appears to be an obvious route to more accurate forecasts in operational coastal models. However, when models resolve finer scales, this may lead to the introduction of high-frequency variability which is not necessarily deterministic. Thus a flow may appear more realistic by generating eddies, but simple statistics like rms error and correlation may degrade because the model variability is not exactly in phase with the observations (Hoffman et al., 1995). By deciding on a specific process to simulate (rather than concentrating on reducing rms error) we can better assess the improvements gained by downscaling. In this work we select two processes which are dominant in our case-study site, Liverpool Bay. Firstly we consider the magnitude and timing of a peak in tide-surge elevations, by separating the event into timing (or displacement) and intensity (or amplitude) errors. The model can thus be evaluated on how well it predicts the timing and magnitude of the surge. The second important characteristic of Liverpool Bay is the position of the freshwater front. To evaluate model performance in this case, the location, sharpness, and temperature difference across the front will be

  18. Mid-Century Warming in the Los Angeles Region and its Uncertainty using Dynamical and Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Sun, F.; Hall, A. D.; Walton, D.; Capps, S. B.; Qu, X.; Huang, H. J.; Berg, N.; Jousse, A.; Schwartz, M.; Nakamura, M.; Cerezo-Mota, R.

    2012-12-01

    Using a combination of dynamical and statistical downscaling techniques, we projected mid-21st century warming in the Los Angeles region at 2-km resolution. To account for uncertainty associated with the trajectory of future greenhouse gas emissions, we examined projections for both "business-as-usual" (RCP8.5) and "mitigation" (RCP2.6) emissions scenarios from the Fifth Coupled Model Intercomparison Project (CMIP5). To account for the considerable uncertainty associated with the choice of global climate model, we downscaled results for all available global climate models in CMIP5. For the business-as-usual scenario, we find that by the mid-21st century, the most likely warming is roughly 2.6°C averaged over the region's land areas, with 95% confidence that the warming lies between 0.9 and 4.2°C. The high resolution of the projections reveals a pronounced spatial pattern in the warming: high elevations and inland areas separated from the coast by at least one mountain complex warm 20 to 50% more than the areas near the coast or within the Los Angeles basin. This warming pattern is especially apparent in summertime. The summertime warming contrast between the inland and coastal zones has a large effect on the most likely expected number of extremely hot days per year. Coastal locations and areas within the Los Angeles basin see roughly two to three times the number of extremely hot days, while high elevations and inland areas typically experience approximately three to five times the number of extremely hot days. Under the mitigation emissions scenario, the most likely warming and increase in heat extremes are somewhat smaller. However, the majority of the warming seen in the business-as-usual scenario still occurs at all locations in the most likely case under the mitigation scenario, and heat extremes still increase significantly. This warming study is the first in a series of studies in our project. More climate change impacts on the Santa Ana winds, rainfall

  19. CWRF performance at downscaling China climate characteristics

    NASA Astrophysics Data System (ADS)

    Liang, Xin-Zhong; Sun, Chao; Zheng, Xiaohui; Dai, Yongjiu; Xu, Min; Choi, Hyun I.; Ling, Tiejun; Qiao, Fengxue; Kong, Xianghui; Bi, Xunqiang; Song, Lianchun; Wang, Fang

    2018-05-01

    The performance of the regional Climate-Weather Research and Forecasting model (CWRF) for downscaling China climate characteristics is evaluated using a 1980-2015 simulation at 30 km grid spacing driven by the ECMWF Interim reanalysis (ERI). It is shown that CWRF outperforms the popular Regional Climate Modeling system (RegCM4.6) in key features including monsoon rain bands, diurnal temperature ranges, surface winds, interannual precipitation and temperature anomalies, humidity couplings, and 95th percentile daily precipitation. Even compared with ERI, which assimilates surface observations, CWRF better represents the geographic distributions of seasonal mean climate and extreme precipitation. These results indicate that CWRF may significantly enhance China climate modeling capabilities.

  20. Evaluation of an Ensemble Dispersion Calculation.

    NASA Astrophysics Data System (ADS)

    Draxler, Roland R.

    2003-02-01

    A Lagrangian transport and dispersion model was modified to generate multiple simulations from a single meteorological dataset. Each member of the simulation was computed by assuming a ±1-gridpoint shift in the horizontal direction and a ±250-m shift in the vertical direction of the particle position, with respect to the meteorological data. The configuration resulted in 27 ensemble members. Each member was assumed to have an equal probability. The model was tested by creating an ensemble of daily average air concentrations for 3 months at 75 measurement locations over the eastern half of the United States during the Across North America Tracer Experiment (ANATEX). Two generic graphical displays were developed to summarize the ensemble prediction and the resulting concentration probabilities for a specific event: a probability-exceed plot and a concentration-probability plot. Although a cumulative distribution of the ensemble probabilities compared favorably with the measurement data, the resulting distribution was not uniform. This result was attributed to release height sensitivity. The trajectory ensemble approach accounts for about 41%-47% of the variance in the measurement data. This residual uncertainty is caused by other model and data errors that are not included in the ensemble design.
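    The 27-member construction described above can be sketched directly: every combination of a ±1 gridpoint horizontal shift (in both directions) and a ±250 m vertical shift, including the unshifted case, defines one ensemble member. The particle indices below are hypothetical.

```python
import itertools

# All (dx, dy, dz) offsets: ±1 grid point horizontally, ±250 m vertically.
shifts = list(itertools.product((-1, 0, 1), (-1, 0, 1), (-250.0, 0.0, 250.0)))
assert len(shifts) == 27   # 3 x 3 x 3 = 27 ensemble members

# Sketch of use: perturb a particle's (i, j, z) position when sampling the
# meteorological fields, one shifted sampling per ensemble member.
i, j, z = 40, 55, 500.0    # hypothetical particle grid indices and height (m)
members = [(i + dx, j + dy, z + dz) for dx, dy, dz in shifts]
```

    Each member is then assigned equal probability, as in the abstract.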

  1. Assessing the Added Value of Dynamical Downscaling in the Context of Hydrologic Implication

    NASA Astrophysics Data System (ADS)

    Lu, M.; IM, E. S.; Lee, M. H.

    2017-12-01

    There is a scientific consensus that high-resolution climate simulations downscaled by Regional Climate Models (RCMs) can provide valuable refined information over the target region. However, a significant body of hydrologic impact assessment has been performed using climate information provided by Global Climate Models (GCMs) in spite of a fundamental spatial scale gap. This is probably based on the assumption that the substantial biases and spatial scale gap in GCM raw data can simply be removed by applying statistical bias correction and spatial disaggregation. Indeed, many previous studies argue that the benefit of dynamical downscaling using RCMs is minimal when linking climate data with a hydrological model, based on comparisons of the impacts of bias-corrected GCMs and bias-corrected RCMs on hydrologic simulations. That may be true for the long-term averaged climatological pattern, but it is not necessarily the case when looking into variability across the temporal spectrum. In this study, we investigate the added value of dynamical downscaling, focusing on the performance in capturing climate variability. To do this, we evaluate the performance of a distributed hydrological model over a Korean river basin using the raw output from a GCM and an RCM, and bias-corrected output from the GCM and RCM. The impacts of climate input data on streamflow simulation are comprehensively analyzed. [Acknowledgements] This research is supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant 17AWMP-B083066-04).

  2. Spatial downscaling algorithm of TRMM precipitation based on multiple high-resolution satellite data for Inner Mongolia, China

    NASA Astrophysics Data System (ADS)

    Duan, Limin; Fan, Keke; Li, Wei; Liu, Tingxi

    2017-12-01

    Daily precipitation data from 42 stations in Inner Mongolia, China for the 10-year period from 1 January 2001 to 31 December 2010 were utilized along with downscaled data from the Tropical Rainfall Measuring Mission (TRMM), with a spatial resolution of 0.25° × 0.25° for the same period, based on statistical relationships between the normalized difference vegetation index (NDVI), meteorological variables, and digital elevation models (DEM), using the leave-one-out (LOO) cross-validation method and multivariate stepwise regression. The results indicate that (1) TRMM data can indeed be used to estimate annual precipitation in Inner Mongolia and there is a linear relationship between annual TRMM and observed precipitation; (2) there is a significant relationship between TRMM-based precipitation and predicted precipitation, with a spatial resolution of 0.50° × 0.50°; (3) NDVI and temperature are important factors influencing the downscaling of TRMM precipitation data, while for the DEM the slope is not the most significant factor affecting the downscaled TRMM data; and (4) the downscaled TRMM data reflect spatial patterns in annual precipitation reasonably well, showing less precipitation falling in western Inner Mongolia and more in the south and southeast. The new approach proposed here provides a useful alternative for evaluating spatial patterns in precipitation and can thus be applied to generate a more accurate precipitation dataset to support both irrigation management and the conservation of this fragile grassland ecosystem.
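    The leave-one-out validation of a multivariate regression downscaling model can be sketched on synthetic station data; the predictors, coefficients, and noise levels below are invented for illustration and do not reproduce the study's fitted model.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stations: coarse TRMM precipitation plus NDVI and temperature predictors.
n = 42
trmm = rng.gamma(3.0, 60.0, n)                      # annual TRMM precip (mm)
ndvi = 0.002 * trmm + rng.normal(0, 0.05, n)        # vegetation responds to precip
temp = rng.normal(5, 3, n)                          # mean temperature (°C)
obs = 0.9 * trmm + 20 * ndvi - 2 * temp + rng.normal(0, 10, n)

X = np.column_stack([np.ones(n), trmm, ndvi, temp])

# Leave-one-out cross-validation: refit the regression n times,
# each time predicting the single held-out station.
preds = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i
    beta, *_ = np.linalg.lstsq(X[keep], obs[keep], rcond=None)
    preds[i] = X[i] @ beta

loo_corr = np.corrcoef(obs, preds)[0, 1]
```

    LOO is attractive here because, with only 42 stations, holding out a larger validation set would leave too little data for fitting.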

  3. On the predictability of outliers in ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Siegert, S.; Bröcker, J.; Kantz, H.

    2012-03-01

    In numerical weather prediction, ensembles are used to retrieve probabilistic forecasts of future weather conditions. We consider events where the verification is smaller than the smallest, or larger than the largest ensemble member of a scalar ensemble forecast. These events are called outliers. In a statistically consistent K-member ensemble, outliers should occur with a base rate of 2/(K+1). In operational ensembles this base rate tends to be higher. We study the predictability of outlier events in terms of the Brier Skill Score and find that forecast probabilities can be calculated which are more skillful than the unconditional base rate. This is shown analytically for statistically consistent ensembles. Using logistic regression, forecast probabilities for outlier events in an operational ensemble are calculated. These probabilities exhibit positive skill which is quantitatively similar to the analytical results. Possible causes of these results as well as their consequences for ensemble interpretation are discussed.
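    The 2/(K+1) base rate for a statistically consistent ensemble follows from exchangeability: the verification is equally likely to fall in any of the K+1 rank positions, and exactly 2 of those positions lie outside the ensemble range. A quick Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(4)

# Statistically consistent ensemble: verification and the K members are
# exchangeable draws from the same distribution.
K, n_cases = 15, 200_000
draws = rng.normal(size=(n_cases, K + 1))
verif, ens = draws[:, 0], draws[:, 1:]

# An outlier event: verification below the smallest or above the largest member.
outlier = (verif < ens.min(axis=1)) | (verif > ens.max(axis=1))
rate = outlier.mean()
expected = 2 / (K + 1)   # = 0.125 for K = 15
```

    Operational ensembles, as the abstract notes, tend to exceed this base rate because they are under-dispersive.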

  4. Rethinking the Default Construction of Multimodel Climate Ensembles

    DOE PAGES

    Rauser, Florian; Gleckler, Peter; Marotzke, Jochem

    2015-07-21

    Here, we discuss the current code of practice in the climate sciences to routinely create climate model ensembles as ensembles of opportunity from the newest phase of the Coupled Model Intercomparison Project (CMIP). We give a two-step argument to rethink this process. First, the differences between generations of ensembles corresponding to different CMIP phases in key climate quantities are not large enough to warrant an automatic separation into generational ensembles for CMIP3 and CMIP5. Second, we suggest that climate model ensembles cannot continue to be mere ensembles of opportunity but should always be based on a transparent scientific decision process. If ensembles can be constrained by observation, then they should be constructed as target ensembles that are specifically tailored to a physical question. If model ensembles cannot be constrained by observation, then they should be constructed as cross-generational ensembles, including all available model data to enhance structural model diversity and to better sample the underlying uncertainties. To facilitate this, CMIP should guide the necessarily ongoing process of updating experimental protocols for the evaluation and documentation of coupled models. Finally, with an emphasis on easy access to model data and facilitating the filtering of climate model data across all CMIP generations and experiments, our community could return to the underlying idea of using model data ensembles to improve uncertainty quantification, evaluation, and cross-institutional exchange.

  5. Long-range Prediction of climatic Change in the Eastern Seaboard of Thailand over the 21st Century using various Downscaling Approaches

    NASA Astrophysics Data System (ADS)

    Bejranonda, Werapol; Koch, Manfred; Koontanakulvong, Sucharit

    2010-05-01

    Due to the different scales of the hydrological model (local to regional) and of the GCM (global), one is faced with the problem of 'downscaling' the coarse grid resolution output of the GCM to the fine grid of the hydrological model. Although numerous downscaling approaches have been proposed to that end over the last decade, the jury is still out on the best method to use in a particular application. The focus here is on the downscaling part of the investigation, i.e. the proper preparation of the GCM's output to serve as input, i.e. the driving force, to the hydrological model (which is not further discussed here). Daily ensembles of climate variables computed by means of the CGCM3 model of the Canadian Climate Centre, which has a horizontal grid resolution of approximately the size of the whole study basin, are used here, indicating clearly the need for downscaling. Daily observations of local climate variables available since 1971 are used as additional input to the various downscaling tools proposed, namely the stochastic weather generator (LARS-WG), the statistical downscaling model (SDSM), and a multiple linear regression model between the observed variables and the CGCM3 predictors. Both the 2D and the 3D versions of the CGCM3 model are employed to predict, 100 years ahead up to year 2100, the monthly rainfall and temperatures, based on the past calibration (training) period 1971-2000. To investigate the prediction performance, multiple linear regression, autoregressive (AR) and autoregressive integrated moving average (ARIMA) models are applied to the time series of the observation data, which are aggregated into monthly time steps to be able to compare them with the downscaling results above. Likewise, multiple linear regression and ARIMA models are also executed on the CGCM3 predictors and the Pacific/Indian ocean indices as external regressors to predict short-term local climate variations. The results of the various downscaling methods are

  6. Ensemble Data Assimilation Without Ensembles: Methodology and Application to Ocean Data Assimilation

    NASA Technical Reports Server (NTRS)

    Keppenne, Christian L.; Rienecker, Michele M.; Kovach, Robin M.; Vernieres, Guillaume

    2013-01-01

    Two methods to estimate background error covariances for data assimilation are introduced. While both share properties with the ensemble Kalman filter (EnKF), they differ from it in that they do not require the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The first method is referred to as SAFE (Space Adaptive Forecast error Estimation) because it estimates error covariances from the spatial distribution of model variables within a single state vector. It can thus be thought of as sampling an ensemble in space. The second method, named FAST (Flow Adaptive error Statistics from a Time series), constructs an ensemble sampled from a moving window along a model trajectory. The underlying assumption in these methods is that forecast errors in data assimilation are primarily phase errors in space and/or time.
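    The FAST idea of sampling an "ensemble" along a single trajectory can be sketched as follows. The AR(1) toy trajectory, window length, and state size are illustrative assumptions only; the operational method applies this to a full ocean model state.

```python
import numpy as np

rng = np.random.default_rng(5)

# Single model trajectory of a 3-variable state (synthetic AR(1) for illustration).
n_steps, n_state = 500, 3
traj = np.zeros((n_steps, n_state))
for t in range(1, n_steps):
    traj[t] = 0.9 * traj[t - 1] + rng.normal(0, 0.1, n_state)

# FAST-like step: treat lagged states in a moving window as ensemble members
# and estimate a flow-dependent background error covariance from them,
# with no parallel model integrations required.
window = 30
t0 = 400
ens = traj[t0 - window:t0]        # "ensemble" sampled along the trajectory
B = np.cov(ens, rowvar=False)     # background covariance estimate at time t0
```

    This rests on the abstract's stated assumption that forecast errors are primarily phase errors in time, so that nearby states sample the error structure.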

  7. Ensembl genomes 2016: more genomes, more complexity

    USDA-ARS?s Scientific Manuscript database

    Ensembl Genomes (http://www.ensemblgenomes.org) is an integrating resource for genome-scale data from non-vertebrate species, complementing the resources for vertebrate genomics developed in the context of the Ensembl project (http://www.ensembl.org). Together, the two resources provide a consistent...

  8. Hybrid Data Assimilation without Ensemble Filtering

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflations to compensate for sampling and modeling errors, respectively, and to maintain the small-member ensemble solution close to the variational solution; we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we have found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at higher resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this so-called filter-free hybrid procedure with an EnKF-based hybrid procedure and a traditional, non-hybrid control scheme show both hybrid strategies to provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.

  9. Motor-motor interactions in ensembles of muscle myosin: using theory to connect single molecule to ensemble measurements

    NASA Astrophysics Data System (ADS)

    Walcott, Sam

    2013-03-01

    Interactions between the proteins actin and myosin drive muscle contraction. Properties of a single myosin interacting with an actin filament are largely known, but a trillion myosins work together in muscle. We are interested in how single-molecule properties relate to ensemble function. Myosin's reaction rates depend on force, so ensemble models keep track of both molecular state and force on each molecule. These models make subtle predictions, e.g. that myosin, when part of an ensemble, moves actin faster than when isolated. This acceleration arises because forces between molecules speed reaction kinetics. Experiments support this prediction and allow parameter estimates. A model based on this analysis describes experiments from single molecule to ensemble. In vivo, actin is regulated by proteins that, when present, cause the binding of one myosin to speed the binding of its neighbors; binding becomes cooperative. Although such interactions preclude the mean field approximation, a set of linear ODEs describes these ensembles under simplified experimental conditions. In these experiments cooperativity is strong, with the binding of one molecule affecting ten neighbors on either side. We progress toward a description of myosin ensembles under physiological conditions.

  10. Comparison Of Downscaled CMIP5 Precipitation Datasets For Projecting Changes In Extreme Precipitation In The San Francisco Bay Area.

    NASA Technical Reports Server (NTRS)

    Milesi, Cristina; Costa-Cabral, Mariza; Rath, John; Mills, William; Roy, Sujoy; Thrasher, Bridget; Wang, Weile; Chiang, Felicia; Loewenstein, Max; Podolske, James

    2014-01-01

    Water resource managers planning for the adaptation to future events of extreme precipitation now have access to high resolution downscaled daily projections derived from statistical bias correction and constructed analogs. We also show that along the Pacific Coast the Northern Oscillation Index (NOI) is a reliable predictor of storm likelihood, and therefore a predictor of seasonal precipitation totals and likelihood of extremely intense precipitation. Such time series can be used to project intensity duration curves into the future or input into stormwater models. However, few climate projection studies have explored the impact of the type of downscaling method used on the range and uncertainty of predictions for local flood protection studies. Here we present a study of the future climate flood risk at NASA Ames Research Center, located in South Bay Area, by comparing the range of predictions in extreme precipitation events calculated from three sets of time series downscaled from CMIP5 data: 1) the Bias Correction Constructed Analogs method dataset downscaled to a 1/8 degree grid (12km); 2) the Bias Correction Spatial Disaggregation method downscaled to a 1km grid; 3) a statistical model of extreme daily precipitation events and projected NOI from CMIP5 models. In addition, predicted years of extreme precipitation are used to estimate the risk of overtopping of the retention pond located on the site through simulations of the EPA SWMM hydrologic model. Preliminary results indicate that the intensity of extreme precipitation events is expected to increase and flood the NASA Ames retention pond. The results from these estimations will assist flood protection managers in planning for infrastructure adaptations.

  11. The Ensembl genome database project.

    PubMed

    Hubbard, T; Barker, D; Birney, E; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Huminiecki, L; Kasprzyk, A; Lehvaslaiho, H; Lijnzaad, P; Melsopp, C; Mongin, E; Pettett, R; Pocock, M; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Clamp, M

    2002-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of the human genome sequence, with confirmed gene predictions that have been integrated with external data sources, and is available as either an interactive web site or as flat files. It is also an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements from sequence analysis to data storage and visualisation. The Ensembl site is one of the leading sources of human genome sequence annotation and provided much of the analysis for publication by the international human genome project of the draft genome. The Ensembl system is being installed around the world in both companies and academic sites on machines ranging from supercomputers to laptops.

  12. Decadal climate prediction in the large ensemble limit

    NASA Astrophysics Data System (ADS)

    Yeager, S. G.; Rosenbloom, N. A.; Strand, G.; Lindsay, K. T.; Danabasoglu, G.; Karspeck, A. R.; Bates, S. C.; Meehl, G. A.

    2017-12-01

    In order to quantify the benefits of initialization for climate prediction on decadal timescales, two parallel sets of historical simulations are required: one "initialized" ensemble that incorporates observations of past climate states and one "uninitialized" ensemble whose internal climate variations evolve freely and without synchronicity. In the large ensemble limit, ensemble averaging isolates potentially predictable forced and internal variance components in the "initialized" set, but only the forced variance remains after averaging the "uninitialized" set. The ensemble size needed to achieve this variance decomposition, and to robustly distinguish initialized from uninitialized decadal predictions, remains poorly constrained. We examine a large ensemble (LE) of initialized decadal prediction (DP) experiments carried out using the Community Earth System Model (CESM). This 40-member CESM-DP-LE set of experiments represents the "initialized" complement to the CESM large ensemble of 20th century runs (CESM-LE) documented in Kay et al. (2015). Both simulation sets share the same model configuration, historical radiative forcings, and large ensemble sizes. The twin experiments afford an unprecedented opportunity to explore the sensitivity of DP skill assessment, and in particular the skill enhancement associated with initialization, to ensemble size. This talk will highlight the benefits of a large ensemble size for initialized predictions of seasonal climate over land in the Atlantic sector as well as predictions of shifts in the likelihood of climate extremes that have large societal impact.
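The variance decomposition described above, ensemble averaging isolating the forced component while the spread about the mean estimates internal variability, can be sketched on synthetic data. A toy illustration only, not the CESM analysis; the signal and noise levels are assumed.

```python
import numpy as np

def variance_components(ensemble):
    """Decompose an ensemble (n_members, n_times) into a forced signal
    (the ensemble mean) and internal variability (member spread)."""
    forced = ensemble.mean(axis=0)
    internal_var = ensemble.var(axis=0, ddof=1).mean()
    return forced, internal_var

# Toy "uninitialized" ensemble: shared forced signal + member-specific noise
rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
truth = np.sin(t)                                   # forced signal
ens = truth + rng.normal(0, 1.0, size=(40, 200))    # 40 members
forced, internal_var = variance_components(ens)
```

With 40 members the ensemble mean recovers the forced signal well even though each member is noise-dominated, which is the point of the large-ensemble limit; with only a handful of members the forced estimate would remain contaminated by internal variability.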

  13. Downscaling SSPs in Bangladesh - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    NASA Astrophysics Data System (ADS)

    Allan, A.; Barbour, E.; Salehin, M.; Hutton, C.; Lázár, A. N.; Nicholls, R. J.; Rahman, M. M.

    2015-12-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A 2-stage process was used to facilitate the connection between stakeholders concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as being priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations and recommendations on SSP refinement at local levels.

  14. Layered Ensemble Architecture for Time Series Forecasting.

    PubMed

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble on different training sets with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This indicates LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network forecasting competitions. It has also been tested on several standard benchmark time series datasets. In terms of forecasting accuracy, our experimental results clearly show that LEA is better than other ensemble and nonensemble methods.
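The core mechanics, building a lagged design matrix and averaging an ensemble of base learners trained on different resamples, can be sketched as below. Ridge-regression base learners stand in for the paper's MLP networks, and the lag-selection layer is omitted; a hedged sketch of the general pattern only.

```python
import numpy as np

def make_lagged(series, lag):
    """Design matrix of `lag` past values -> next value."""
    X = np.column_stack([series[i:len(series) - lag + i] for i in range(lag)])
    return X, series[lag:]

def ensemble_forecast(series, lag, n_models=10, alpha=1e-3, seed=0):
    """Bagged one-step forecast with a fixed lag (ridge base learners)."""
    rng = np.random.default_rng(seed)
    X, y = make_lagged(series, lag)
    Xb = np.column_stack([X, np.ones(len(X))])      # add bias column
    x_last = np.append(series[-lag:], 1.0)          # features for next step
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, len(y), len(y))       # bootstrap resample
        A, b = Xb[idx], y[idx]
        # Ridge solution: (A^T A + alpha*I)^-1 A^T b
        w = np.linalg.solve(A.T @ A + alpha * np.eye(A.shape[1]), A.T @ b)
        preds.append(x_last @ w)
    return float(np.mean(preds))

series = np.sin(0.2 * np.arange(200))               # deterministic toy series
pred = ensemble_forecast(series, lag=4)
```

Training each member on a different bootstrap sample is one simple way to obtain the diversity the paper emphasizes; the paper's LEA additionally selects the lag with a first ensemble layer.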

  15. Perception of ensemble statistics requires attention.

    PubMed

    Jackson-Nielsen, Molly; Cohen, Michael A; Pitts, Michael A

    2017-02-01

    To overcome inherent limitations in perceptual bandwidth, many aspects of the visual world are represented as summary statistics (e.g., average size, orientation, or density of objects). Here, we investigated the relationship between summary (ensemble) statistics and visual attention. Recently, it was claimed that one ensemble statistic in particular, color diversity, can be perceived without focal attention. However, a broader debate exists over the attentional requirements of conscious perception, and it is possible that some form of attention is necessary for ensemble perception. To test this idea, we employed a modified inattentional blindness paradigm and found that multiple types of summary statistics (color and size) often go unnoticed without attention. In addition, we found attentional costs in dual-task situations, further implicating a role for attention in statistical perception. Overall, we conclude that while visual ensembles may be processed efficiently, some amount of attention is necessary for conscious perception of ensemble statistics. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Cluster Analysis of Downscaled and Explicitly Simulated North Atlantic Tropical Cyclone Tracks

    DOE PAGES

    Daloz, Anne S.; Camargo, S. J.; Kossin, J. P.; ...

    2015-02-11

    A realistic representation of North Atlantic tropical cyclone tracks is crucial as it allows, for example, explanation of potential changes in U.S. landfalling systems. Here, the authors present a tentative study that examines the ability of recent climate models to represent North Atlantic tropical cyclone tracks. Tracks from two types of climate models are evaluated: explicit tracks are obtained from tropical cyclones simulated in regional or global climate models with moderate to high horizontal resolution (1°–0.25°), and downscaled tracks are obtained using a downscaling technique with large-scale environmental fields from a subset of these models. Here, for both configurations, tracks are objectively separated into four groups using a cluster technique, leading to a zonal and a meridional separation of the tracks. The meridional separation largely captures the separation between deep tropical and subtropical, hybrid or baroclinic cyclones, while the zonal separation segregates Gulf of Mexico and Cape Verde storms. The properties of the tracks' seasonality, intensity, and power dissipation index in each cluster are documented for both configurations. The authors' results show that, except for the seasonality, the downscaled tracks better capture the observed characteristics of the clusters. The authors also use three different idealized scenarios to examine the possible future changes of tropical cyclone tracks under 1) warming sea surface temperature, 2) increasing carbon dioxide, and 3) a combination of the two. The response to each scenario is highly variable depending on the simulation considered. Lastly, the authors examine the role of each cluster in these future changes and find no preponderant contribution of any single cluster over the others.
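As a toy stand-in for the clustering step, k-means can separate synthetic "Gulf of Mexico" and "Cape Verde" genesis points. The study clusters full track shapes with a dedicated cluster technique; this only illustrates the separation idea, and all coordinates are invented.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

# Synthetic genesis points (lon, lat) for two storm populations
rng = np.random.default_rng(2)
gulf = np.column_stack([rng.normal(-90, 3, 50), rng.normal(25, 2, 50)])
cape_verde = np.column_stack([rng.normal(-30, 3, 50), rng.normal(12, 2, 50)])
feats = np.vstack([gulf, cape_verde])

# Deterministic initial centroids (rough guesses), then k-means iterations
init = np.array([[-80.0, 20.0], [-40.0, 15.0]])
centroids, labels = kmeans2(feats, init, minit="matrix")
```

With the two genesis regions 60° of longitude apart, the zonal separation the abstract describes falls out immediately; real tracks require richer features (full positions along the track) before clustering.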

  17. A parametric approach for simultaneous bias correction and high-resolution downscaling of climate model rainfall

    NASA Astrophysics Data System (ADS)

    Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto; Marrocu, Marino

    2017-03-01

    Distribution mapping has been identified as the most efficient approach to bias-correct climate model rainfall, while reproducing its statistics at spatial and temporal resolutions suitable to run hydrologic models. Yet its implementation based on empirical distributions derived from control samples (referred to as nonparametric distribution mapping) makes the method's performance sensitive to sample length variations, the presence of outliers, and the spatial resolution of climate model results, and may lead to biases, especially in extreme rainfall estimation. To address these shortcomings, we propose a methodology for simultaneous bias correction and high-resolution downscaling of climate model rainfall products that uses: (a) a two-component theoretical distribution model (i.e., a generalized Pareto (GP) model for rainfall intensities above a specified threshold u*, and an exponential model for lower rain rates), and (b) proper interpolation of the corresponding distribution parameters on a user-defined high-resolution grid, using kriging for uncertain data. We assess the performance of the suggested parametric approach relative to the nonparametric one, using daily raingauge measurements from a dense network on the island of Sardinia (Italy), and rainfall data from four GCM/RCM model chains of the ENSEMBLES project. The obtained results shed light on the competitive advantages of the parametric approach, which proves more accurate and considerably less sensitive to the characteristics of the calibration period, independent of the GCM/RCM combination used. This is especially the case for extreme rainfall estimation, where the GP assumption allows for more accurate and robust estimates, also beyond the range of the available data.
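A simplified version of the two-component idea, exponential body below a threshold, generalized Pareto tail above it, with CDF matching between model and observed fits, can be sketched as follows. This is not the authors' exact formulation (no kriging, no mixture normalization, simplified truncated-exponential fit); the threshold quantile and data are assumptions.

```python
import numpy as np
from scipy import stats

def parametric_qmap(model, obs, u_quantile=0.95):
    """Simplified two-component quantile mapping: exponential body,
    GP tail, fitted separately to model and observed samples."""
    u_m, u_o = np.quantile(model, u_quantile), np.quantile(obs, u_quantile)
    # Body: crude exponential fits to values below the thresholds
    scale_m = np.mean(model[model <= u_m])
    scale_o = np.mean(obs[obs <= u_o])
    # Tail: GP fits to threshold excesses (location fixed at 0)
    c_m, _, s_m = stats.genpareto.fit(model[model > u_m] - u_m, floc=0)
    c_o, _, s_o = stats.genpareto.fit(obs[obs > u_o] - u_o, floc=0)

    def correct(x):
        x = np.asarray(x, dtype=float)
        out = np.empty_like(x)
        body = x <= u_m
        # Body: exponential CDF matching, model -> observed
        p = stats.expon.cdf(x[body], scale=scale_m)
        out[body] = stats.expon.ppf(p, scale=scale_o)
        # Tail: GP CDF matching of threshold excesses
        p = stats.genpareto.cdf(x[~body] - u_m, c_m, scale=s_m)
        out[~body] = u_o + stats.genpareto.ppf(p, c_o, scale=s_o)
        return out

    return correct

# Toy check: model underestimates rainfall intensities by a factor of two
rng = np.random.default_rng(1)
obs = rng.exponential(2.0, 5000)
model = rng.exponential(1.0, 5000)
corrected = parametric_qmap(model, obs)(model)
```

The parametric fits are what make the mapping extrapolate beyond the calibration range, which is the advantage the abstract claims for extreme rainfall.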

  18. AUC-Maximizing Ensembles through Metalearning.

    PubMed

    LeDell, Erin; van der Laan, Mark J; Petersen, Maya

    2016-05-01

    Area Under the ROC Curve (AUC) is often used to measure the performance of an estimator in binary classification problems. An AUC-maximizing classifier can have significant advantages in cases where ranking correctness is valued or if the outcome is rare. In a Super Learner ensemble, maximization of the AUC can be achieved by the use of an AUC-maximizing metalearning algorithm. We discuss an implementation of an AUC-maximization technique that is formulated as a nonlinear optimization problem. We also evaluate the effectiveness of a large number of different nonlinear optimization algorithms to maximize the cross-validated AUC of the ensemble fit. The results provide evidence that AUC-maximizing metalearners can, and often do, outperform non-AUC-maximizing metalearning methods, with respect to ensemble AUC. The results also demonstrate that as the level of imbalance in the training data increases, the Super Learner ensemble outperforms the top base algorithm by a larger degree.
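The general shape of an AUC-maximizing metalearner, derivative-free optimization of ensemble weights against the (non-smooth) AUC, can be sketched as below. This is an illustrative sketch, not the paper's Super Learner implementation; the synthetic base learners and the Nelder-Mead choice are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def auc(y, scores):
    """AUC via the Mann-Whitney U statistic (assumes no tied scores)."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = y.sum()
    n_neg = len(y) - n_pos
    return (ranks[y == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

def auc_metalearner(base_preds, y):
    """Nonnegative, sum-to-one weights over base-learner predictions,
    chosen by derivative-free search to maximize ensemble AUC."""
    k = base_preds.shape[1]

    def neg_auc(w):
        w = np.abs(w)
        return -auc(y, base_preds @ (w / w.sum()))

    res = minimize(neg_auc, np.ones(k) / k, method="Nelder-Mead")
    w = np.abs(res.x)
    return w / w.sum()

# Toy data: one informative base learner, one pure-noise base learner
rng = np.random.default_rng(0)
n = 500
y = rng.integers(0, 2, n)
strong = y + rng.normal(0, 0.5, n)
weak = rng.normal(size=n)
base_preds = np.column_stack([strong, weak])
w = auc_metalearner(base_preds, y)
ensemble_auc = auc(y, base_preds @ w)
```

A derivative-free method is used because AUC is a rank statistic: it is piecewise constant in the weights, so gradient-based optimizers have nothing to follow.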

  19. AUC-Maximizing Ensembles through Metalearning

    PubMed Central

    LeDell, Erin; van der Laan, Mark J.; Peterson, Maya

    2016-01-01

    Area Under the ROC Curve (AUC) is often used to measure the performance of an estimator in binary classification problems. An AUC-maximizing classifier can have significant advantages in cases where ranking correctness is valued or if the outcome is rare. In a Super Learner ensemble, maximization of the AUC can be achieved by the use of an AUC-maximizing metalearning algorithm. We discuss an implementation of an AUC-maximization technique that is formulated as a nonlinear optimization problem. We also evaluate the effectiveness of a large number of different nonlinear optimization algorithms to maximize the cross-validated AUC of the ensemble fit. The results provide evidence that AUC-maximizing metalearners can, and often do, outperform non-AUC-maximizing metalearning methods, with respect to ensemble AUC. The results also demonstrate that as the level of imbalance in the training data increases, the Super Learner ensemble outperforms the top base algorithm by a larger degree. PMID:27227721

  20. Anticipating Future Extreme Climate Events for Alaska Using Dynamical Downscaling and Quantile Mapping

    NASA Astrophysics Data System (ADS)

    Lader, R.; Walsh, J. E.

    2016-12-01

    Alaska is projected to experience major changes in extreme climate during the 21st century, due to greenhouse warming and exacerbated by polar amplification, wherein the Arctic is warming at twice the rate of the Northern Hemisphere as a whole. Given its complex topography, Alaska displays extreme gradients of temperature and precipitation. However, global climate models (GCMs), which typically have a spatial resolution on the order of 100 km, struggle to replicate these extremes. To help resolve this issue, this study employs dynamically downscaled regional climate simulations and quantile-mapping methodologies to provide a full suite of daily model variables at 20 km spatial resolution for Alaska, from 1970 to 2100. These data include downscaled products of the ERA-Interim reanalysis from 1979 to 2015, the GFDL-CM3 historical run from 1970 to 2005, and GFDL-CM3 RCP 8.5 from 2006 to 2100. Due to the limited nature of long-term observations and high-resolution modeling in Alaska, these data enable a broad expansion of extremes analysis. This study uses these data to highlight a subset of the 27 climate extremes indices, previously defined by the Expert Team on Climate Change Detection and Indices, as they pertain to climate change in Alaska. These indices are based on the statistical distributions of daily surface temperature and precipitation and focus on threshold exceedances and percentiles. For example, the annual number of days with a daily maximum temperature greater than 25°C is anticipated to triple in many locations in Alaska by the end of the century. Climate extremes can also refer to long duration events, such as the record-setting warmth that defined the 2015-16 cold season in Alaska. The downscaled climate model simulations indicate that this past winter will be considered normal by as early as the mid-2040s, if we continue to warm according to the business-as-usual RCP 8.5 emissions scenario. This represents an accelerated warming as compared to projections
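The quantile-mapping step mentioned above can be illustrated with the standard empirical form: each model value is replaced by the observed value at the same quantile of the historical model distribution. A generic sketch on synthetic data, not the study's code; the bias magnitude and sample sizes are assumed.

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_q=99):
    """Empirical quantile mapping bias correction."""
    qs = np.linspace(0.01, 0.99, n_q)
    mq = np.quantile(model_hist, qs)    # historical model quantiles
    oq = np.quantile(obs_hist, qs)      # historical observed quantiles
    # Map each future value through the model -> observed quantile relation
    return np.interp(model_future, mq, oq)

# Toy check: the model runs 2 degrees cold; mapping removes the bias
rng = np.random.default_rng(0)
obs = rng.normal(0.0, 1.0, 2000)
hist = rng.normal(-2.0, 1.0, 2000)
future = rng.normal(-2.0, 1.0, 500)
corrected = quantile_map(hist, obs, future)
```

Note that `np.interp` clamps values outside the fitted quantile range, which is exactly the extrapolation weakness that parametric variants of quantile mapping try to address for extremes.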

  1. CMIP5 downscaling and its uncertainty in China

    NASA Astrophysics Data System (ADS)

    Yue, TianXiang; Zhao, Na; Fan, ZeMeng; Li, Jing; Chen, ChuanFa; Lu, YiMin; Wang, ChenLiang; Xu, Bing; Wilson, John

    2016-11-01

    A comparison between the Coupled Model Intercomparison Project Phase 5 (CMIP5) data and observations at 735 meteorological stations indicated that mean annual temperature (MAT) was underestimated by about 1.8 °C while mean annual precipitation (MAP) was overestimated by about 263 mm in general across the whole of China. A statistical analysis of China-CMIP5 data demonstrated that MAT exhibits spatial stationarity, while MAP exhibits spatial non-stationarity. MAT and MAP data from the China-CMIP5 dataset were downscaled by combining statistical approaches with a method for high accuracy surface modeling (HASM). A statistical transfer function (STF) of MAT was formulated using minimized residuals output by HASM with an ordinary least squares (OLS) linear equation that used latitude and elevation as independent variables, abbreviated as HASM-OLS. The STF of MAP under a Box-Cox transformation was derived as a combination of minimized residuals output by HASM with a geographically weighted regression (GWR) using latitude, longitude, elevation and impact coefficient of aspect as independent variables, abbreviated as HASM-GB. Cross validation, using observational data from the 735 meteorological stations across China for the period 1976 to 2005, indicates that the largest uncertainty occurred on the Tibetan Plateau, with mean absolute errors (MAEs) of MAT and MAP as high as 4.64 °C and 770.51 mm, respectively. The downscaling processes of HASM-OLS and HASM-GB generated MAEs of MAT and MAP that were 67.16% and 77.43% lower, respectively, across the whole of China on average, and 88.48% and 97.09% lower for the Tibetan Plateau.

  2. Technical Challenges and Solutions in Representing Lakes when using WRF in Downscaling Applications

    EPA Science Inventory

    The Weather Research and Forecasting (WRF) model is commonly used to make high resolution future projections of regional climate by downscaling global climate model (GCM) outputs. Because the GCM fields are typically at a much coarser spatial resolution than the target regional ...

  3. Advanced Atmospheric Ensemble Modeling Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, R.; Chiswell, S.; Kurzeja, R.

    Ensemble modeling (EM), the creation of multiple atmospheric simulations for a given time period, has become an essential tool for characterizing uncertainties in model predictions. We explore two novel ensemble modeling techniques: (1) perturbation of model parameters (Adaptive Programming, AP), and (2) data assimilation (Ensemble Kalman Filter, EnKF). The current research is an extension to work from last year and examines transport on a small spatial scale (<100 km) in complex terrain, for more rigorous testing of the ensemble technique. Two different release cases were studied, a coastal release (SF6) and an inland release (Freon) which consisted of two release times. Observations of tracer concentration and meteorology are used to judge the ensemble results. In addition, adaptive grid techniques have been developed to reduce required computing resources for transport calculations. Using a 20-member ensemble, the standard approach generated downwind transport that was quantitatively good for both releases; however, the EnKF method produced additional improvement for the coastal release where the spatial and temporal differences due to interior valley heating lead to the inland movement of the plume. The AP technique showed improvements for both release cases, with more improvement shown in the inland release. This research demonstrated that transport accuracy can be improved when models are adapted to a particular location/time or when important local data is assimilated into the simulation, and enhances SRNL's capability in atmospheric transport modeling in support of its current customer base and local site missions, as well as our ability to attract new customers within the intelligence community.
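The EnKF technique named above has a compact textbook form: a stochastic analysis step that nudges each ensemble member toward perturbed observations with a gain built from the sample covariance. A generic sketch, not the study's configuration; the toy state, observation operator, and error variance are assumptions.

```python
import numpy as np

def enkf_update(X, y, H, r, rng):
    """One stochastic EnKF analysis step.
    X : (n_state, n_members) forecast ensemble
    y : (n_obs,) observations
    H : (n_obs, n_state) observation operator
    r : observation error variance (scalar)"""
    n = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)
    Pf = A @ A.T / (n - 1)                 # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + r * np.eye(len(y)))
    # Perturb the observations so the analysis spread stays statistically correct
    Y = y[:, None] + rng.normal(0.0, np.sqrt(r), (len(y), n))
    return X + K @ (Y - H @ X)

# Two-variable toy state, observing only the first component (true value 1.0)
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(2, 100))    # forecast ensemble centered near 0
Xa = enkf_update(X, np.array([1.0]), np.array([[1.0, 0.0]]), 0.01, rng)
```

With an accurate observation (small `r`), the analysis mean of the observed component moves nearly all the way from the forecast toward the observation, as the Kalman gain dictates.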

  4. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    NASA Astrophysics Data System (ADS)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th deg. spatial resolution from two different downscaling procedures are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as the future period of 2010-2099, simulated with representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of GCMs and observations suggests a more accurate representation for BMA than for individual models. Furthermore, BMA projections are used to investigate future seasonal to annual changes of climate variables. Projections indicate significant increases in annual precipitation and temperature, with varying degrees of change across different sub-basins of the CRB. We then characterized the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the dominant source of uncertainty. However, downscaling uncertainty considerably contributes to the total uncertainty of future projections, especially in summer. Moreover, for precipitation, downscaling uncertainty appears to be higher than scenario uncertainty.
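The BMA step, weighting competing models by how well they reproduce historical observations, can be illustrated with the common BIC shortcut for posterior model probabilities, w_k ∝ exp(-ΔBIC_k / 2). This is a simplified sketch; the study uses a full BMA formulation (e.g. an EM-trained mixture), not this approximation, and the synthetic data are assumptions.

```python
import numpy as np

def bma_weights(preds, obs):
    """Approximate BMA weights from historical fit via BIC.
    preds : (n_models, n_times) historical model predictions
    obs   : (n_times,) observations"""
    n = obs.size
    sse = ((preds - obs) ** 2).sum(axis=1)
    # Gaussian likelihood with MLE variance: BIC = n*log(SSE/n) + k*log(n);
    # the parameter-count term is identical across models, so drop it
    bic = n * np.log(sse / n)
    w = np.exp(-0.5 * (bic - bic.min()))
    return w / w.sum()

# Toy ensemble: one skillful model, one mediocre, one with no skill
rng = np.random.default_rng(0)
obs = rng.normal(size=300)
preds = np.vstack([
    obs + rng.normal(0, 0.1, 300),
    obs + rng.normal(0, 0.5, 300),
    rng.normal(size=300),
])
w = bma_weights(preds, obs)
```

The exponential dependence on fit quality means weights concentrate sharply on the best-performing model when skill differences are large, which is why BMA projections track the skillful members.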

  5. A multimodal wave spectrum-based approach for statistical downscaling of local wave climate

    USGS Publications Warehouse

    Hegermiller, Christie; Antolinez, Jose A A; Rueda, Ana C.; Camus, Paula; Perez, Jorge; Erikson, Li; Barnard, Patrick; Mendez, Fernando J.

    2017-01-01

    Characterization of wave climate by bulk wave parameters is insufficient for many coastal studies, including those focused on assessing coastal hazards and long-term wave climate influences on coastal evolution. This issue is particularly relevant for studies using statistical downscaling of atmospheric fields to local wave conditions, which are often multimodal in large ocean basins (e.g. the Pacific). Swell may be generated in vastly different wave generation regions, yielding complex wave spectra that are inadequately represented by a single set of bulk wave parameters. Furthermore, the relationship between atmospheric systems and local wave conditions is complicated by variations in arrival time of wave groups from different parts of the basin. Here, we address these two challenges by improving upon the spatiotemporal definition of the atmospheric predictor used in statistical downscaling of local wave climate. The improved methodology separates the local wave spectrum into "wave families," defined by spectral peaks and discrete generation regions, and relates atmospheric conditions in distant regions of the ocean basin to local wave conditions by incorporating travel times computed from effective energy flux across the ocean basin. When applied to locations with multimodal wave spectra, including Southern California and Trujillo, Peru, the new methodology improves the ability of the statistical model to project significant wave height, peak period, and direction for each wave family, retaining more information from the full wave spectrum. This work forms the basis of statistical downscaling by weather types, which has recently been applied to coastal flooding and morphodynamic applications.
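The travel-time idea has a standard first approximation: deep-water swell propagates at the group speed c_g = gT/(4π), so crossing time is distance over c_g. The paper derives travel times from effective energy flux across the basin; the textbook deep-water formula below is a hedged stand-in, and the distance and period are illustrative.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s^-2

def swell_travel_time_s(distance_m, peak_period_s):
    """Deep-water swell travel time using group speed c_g = g*T/(4*pi)."""
    cg = G * peak_period_s / (4.0 * np.pi)   # group speed, m/s
    return distance_m / cg

# A 15 s Southern Ocean swell crossing ~8000 km to Southern California
days = swell_travel_time_s(8.0e6, 15.0) / 86400.0
```

The roughly week-long lag this yields is why the atmospheric predictor must be shifted in time per generation region: today's local waves reflect storms that occurred days earlier on the far side of the basin.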

  6. Bayesian ensemble refinement by replica simulations and reweighting.

    PubMed

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-28

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.
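The maximum-entropy reweighting at the core of the EROS-style formulation can be sketched for the simplest case of a single ensemble-averaged observable and a uniform prior: weights w_i ∝ exp(λ s_i), with λ chosen so the weighted average matches the experimental value. A minimal sketch of the idea, not the authors' method (which handles many observables, error models, and replica simulations); the data and target are assumptions.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_reweight(s, target):
    """Maximum-entropy reweighting for one observable.
    s : per-configuration values of the observable
    target : experimental ensemble average (must lie inside (min(s), max(s)))"""
    s = np.asarray(s, dtype=float)

    def weights(lam):
        z = lam * s
        z -= z.max()                       # shift for numerical stability
        w = np.exp(z)
        return w / w.sum()

    # 1-D root-find for the Lagrange multiplier lam
    lam = brentq(lambda lam: weights(lam) @ s - target, -50.0, 50.0)
    return weights(lam)

# Toy ensemble of 1000 configurations; shift the average observable to 0.5
rng = np.random.default_rng(0)
s = rng.normal(0.0, 1.0, 1000)
w = maxent_reweight(s, 0.5)
```

The exponential form is exactly what maximizing entropy subject to a linear constraint produces, which is why the discrete optimal Bayesian ensemble in the abstract coincides with the maximum-entropy result.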

  7. Bayesian ensemble refinement by replica simulations and reweighting

    NASA Astrophysics Data System (ADS)

    Hummer, Gerhard; Köfinger, Jürgen

    2015-12-01

    We describe different Bayesian ensemble refinement methods, examine their interrelation, and discuss their practical application. With ensemble refinement, the properties of dynamic and partially disordered (bio)molecular structures can be characterized by integrating a wide range of experimental data, including measurements of ensemble-averaged observables. We start from a Bayesian formulation in which the posterior is a functional that ranks different configuration space distributions. By maximizing this posterior, we derive an optimal Bayesian ensemble distribution. For discrete configurations, this optimal distribution is identical to that obtained by the maximum entropy "ensemble refinement of SAXS" (EROS) formulation. Bayesian replica ensemble refinement enhances the sampling of relevant configurations by imposing restraints on averages of observables in coupled replica molecular dynamics simulations. We show that the strength of the restraints should scale linearly with the number of replicas to ensure convergence to the optimal Bayesian result in the limit of infinitely many replicas. In the "Bayesian inference of ensembles" method, we combine the replica and EROS approaches to accelerate the convergence. An adaptive algorithm can be used to sample directly from the optimal ensemble, without replicas. We discuss the incorporation of single-molecule measurements and dynamic observables such as relaxation parameters. The theoretical analysis of different Bayesian ensemble refinement approaches provides a basis for practical applications and a starting point for further investigations.

  8. Inundation downscaling for the development of a long-term and global inundation database compatible to SWOT mission

    NASA Astrophysics Data System (ADS)

    Aires, Filipe; Prigent, Catherine; Papa, Fabrice

    2014-05-01

    The Global Inundation Extent from Multi-Satellite (GIEMS) dataset provides multi-year monthly variations of the global surface water extent at about 25 km x 25 km resolution, from 1993 to 2007. It is derived from multiple satellite observations. Its spatial resolution is usually compatible with climate model outputs and with global land surface model grids but is clearly not adequate for local applications that require the characterization of small individual water bodies. There is today a strong demand for high-resolution inundation extent datasets, for a large variety of applications such as water management, regional hydrological modeling, or the analysis of mosquito-related diseases. Even for climate applications, the GIEMS resolution might be limiting, given recent results on the key importance of the smallest ponds, as compared to the largest ones, in the emission of CH4. If the inundation extent is combined with altimetry measurements to obtain water volume changes, and finally river discharge to the ocean (Frappart et al. 2011), then a better-resolved inundation extent will also improve the accuracy of these estimates. In the context of the SWOT mission, the downscaling of GIEMS has multiple uses, but a major one will be to use the SWOT retrievals to develop a downscaling of GIEMS. This SWOT-compatible downscaling could then be used to build a SWOT-compatible high-resolution database back in time from 1993 to the SWOT launch date. This extension of the SWOT record is necessary to perform climate studies related to climate change. This paper presents three approaches to downscaling GIEMS. Three basins are considered for illustrative purposes: the Amazon, the Niger, and the Mekong. - Aires, F., F. Papa, C. Prigent, J.-F. Cretaux and M. Berge-Nguyen, Characterization and downscaling of the inundation extent over the Inner Niger delta using multi-wavelength retrievals and MODIS data, J. of Hydrometeorology, in press, 2014. - Aires, F., F. Papa and C. Prigent, A long

  9. Examining Projected Changes in Weather & Air Quality Extremes Between 2000 & 2030 using Dynamical Downscaling

    EPA Science Inventory

    Climate change may alter regional weather extremes resulting in a range of environmental impacts including changes in air quality, water quality and availability, energy demands, agriculture, and ecology. Dynamical downscaling simulations were conducted with the Weather Research...

  10. Trend analysis of watershed-scale precipitation over Northern California by means of dynamically-downscaled CMIP5 future climate projections.

    PubMed

    Ishida, K; Gorguner, M; Ercan, A; Trinh, T; Kavvas, M L

    2017-08-15

    The impacts of climate change on watershed-scale precipitation through the 21st century were investigated over eight study watersheds in Northern California based on dynamically downscaled CMIP5 future climate projections from three GCMs (CCSM4, HadGEM2-ES, and MIROC5) under the RCP4.5 and RCP8.5 future climate scenarios. After evaluating the modeling capability of the WRF model, the six future climate projections were dynamically downscaled by means of the WRF model over Northern California at 9 km grid resolution and hourly temporal resolution during a 94-year period (2006-2100). The biases in the model simulations were corrected, and basin-average precipitation over the eight study watersheds was calculated from the dynamically downscaled precipitation data. Based on the dynamically downscaled basin-average precipitation, trends in annual depth and annual peaks of basin-average precipitation during the 21st century were analyzed over the eight study watersheds. The analyses in this study indicate that there may be differences between trends of annual depths and annual peaks of watershed-scale precipitation during the 21st century. Furthermore, trends in watershed-scale precipitation under future climate conditions may be different for different watersheds depending on their location and topography even if they are in the same region. Copyright © 2017 Elsevier B.V. All rights reserved.
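Trend analyses of annual precipitation series such as those described above are commonly based on the nonparametric Mann-Kendall test; the abstract does not name the method used, so the following is only an illustrative sketch of the Mann-Kendall S statistic (the count of increasing minus decreasing pairs):

```python
import itertools

def mann_kendall_s(series):
    """Mann-Kendall S statistic for a monotonic trend: for every pair of
    values (earlier, later), add +1 if the later value is larger, -1 if
    smaller, 0 if tied. Large positive S suggests an upward trend."""
    s = 0
    for x, y in itertools.combinations(series, 2):
        s += (y > x) - (y < x)  # sign of each pairwise difference
    return s
```

A full test would also compute the variance of S (adjusting for ties) to obtain a significance level.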

  11. A proxy for high-resolution regional reanalysis for the Southeast United States: assessment of precipitation variability in dynamically downscaled reanalyses

    USGS Publications Warehouse

    Stefanova, Lydia; Misra, Vasubandhu; Chan, Steven; Griffin, Melissa; O'Brien, James J.; Smith, Thomas J.

    2012-01-01

    We present an analysis of the seasonal, subseasonal, and diurnal variability of rainfall from the COAPS Land-Atmosphere Regional Reanalysis for the Southeast at 10-km resolution (CLARReS10). Most of our assessment focuses on the representation of summertime subseasonal and diurnal variability. Summer precipitation in the Southeast United States is a particularly challenging modeling problem because of the variety of regional-scale phenomena, such as sea breeze, thunderstorms and squall lines, which are not adequately resolved in coarse atmospheric reanalyses but contribute significantly to the hydrological budget over the region. We find that the dynamically downscaled reanalyses are in good agreement with station and gridded observations in terms of both the relative seasonal distribution and the diurnal structure of precipitation, although total precipitation amounts tend to be systematically overestimated. The diurnal cycle of summer precipitation in the downscaled reanalyses is in very good agreement with station observations and a clear improvement both over their "parent" reanalyses and over newer-generation reanalyses. The seasonal cycle of precipitation is particularly well simulated in Florida; this we attribute to the ability of the regional model to provide a more accurate representation of the spatial and temporal structure of finer-scale phenomena such as fronts and sea breezes. Over the northern portion of the domain summer precipitation in the downscaled reanalyses remains, as in the "parent" reanalyses, overestimated. Given the degree of success that dynamical downscaling of reanalyses demonstrates in the simulation of the characteristics of regional precipitation, its favorable comparison to conventional newer-generation reanalyses and its cost-effectiveness, we conclude that for the Southeast United States such downscaling is a viable proxy for high-resolution conventional reanalysis.

  12. Ensemble perception of color in autistic adults.

    PubMed

    Maule, John; Stanworth, Kirstie; Pellicano, Elizabeth; Franklin, Anna

    2017-05-01

    Dominant accounts of visual processing in autism posit that autistic individuals have an enhanced access to details of scenes [e.g., weak central coherence] which is reflected in a general bias toward local processing. Furthermore, the attenuated priors account of autism predicts that the updating and use of summary representations is reduced in autism. Ensemble perception describes the extraction of global summary statistics of a visual feature from a heterogeneous set (e.g., of faces, sizes, colors), often in the absence of local item representation. The present study investigated ensemble perception in autistic adults using a rapidly presented (500 msec) ensemble of four, eight, or sixteen elements representing four different colors. We predicted that autistic individuals would be less accurate when averaging the ensembles, but more accurate in recognizing individual ensemble colors. The results were consistent with the predictions. Averaging was impaired in autism, but only when ensembles contained four elements. Ensembles of eight or sixteen elements were averaged equally accurately across groups. The autistic group also showed a corresponding advantage in rejecting colors that were not originally seen in the ensemble. The results demonstrate the local processing bias in autism, but also suggest that the global perceptual averaging mechanism may be compromised under some conditions. The theoretical implications of the findings and future avenues for research on summary statistics in autism are discussed. Autism Res 2017, 10: 839-851. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  13. Ensemble perception of color in autistic adults

    PubMed Central

    Stanworth, Kirstie; Pellicano, Elizabeth; Franklin, Anna

    2016-01-01

    Dominant accounts of visual processing in autism posit that autistic individuals have an enhanced access to details of scenes [e.g., weak central coherence] which is reflected in a general bias toward local processing. Furthermore, the attenuated priors account of autism predicts that the updating and use of summary representations is reduced in autism. Ensemble perception describes the extraction of global summary statistics of a visual feature from a heterogeneous set (e.g., of faces, sizes, colors), often in the absence of local item representation. The present study investigated ensemble perception in autistic adults using a rapidly presented (500 msec) ensemble of four, eight, or sixteen elements representing four different colors. We predicted that autistic individuals would be less accurate when averaging the ensembles, but more accurate in recognizing individual ensemble colors. The results were consistent with the predictions. Averaging was impaired in autism, but only when ensembles contained four elements. Ensembles of eight or sixteen elements were averaged equally accurately across groups. The autistic group also showed a corresponding advantage in rejecting colors that were not originally seen in the ensemble. The results demonstrate the local processing bias in autism, but also suggest that the global perceptual averaging mechanism may be compromised under some conditions. The theoretical implications of the findings and future avenues for research on summary statistics in autism are discussed. Autism Res 2017, 10: 839–851. © 2016 The Authors Autism Research published by Wiley Periodicals, Inc. on behalf of International Society for Autism Research PMID:27874263

  14. Evaluation of medium-range ensemble flood forecasting based on calibration strategies and ensemble methods in Lanjiang Basin, Southeast China

    NASA Astrophysics Data System (ADS)

    Liu, Li; Gao, Chao; Xuan, Weidong; Xu, Yue-Ping

    2017-11-01

    Ensemble flood forecasts by hydrological models using numerical weather prediction products as forcing data are becoming more commonly used in operational flood forecasting applications. In this study, a hydrological ensemble flood forecasting system comprised of an automatically calibrated Variable Infiltration Capacity model and quantitative precipitation forecasts from the TIGGE dataset is constructed for Lanjiang Basin, Southeast China. The impacts of calibration strategies and ensemble methods on the performance of the system are then evaluated. The hydrological model is optimized by the parallel programmed ε-NSGA II multi-objective algorithm. According to the solutions by ε-NSGA II, two differently parameterized models are determined to simulate daily flows and peak flows at each of the three hydrological stations. Then a simple yet effective modular approach is proposed to combine these daily and peak flows at the same station into one composite series. Five ensemble methods and various evaluation metrics are adopted. The results show that ε-NSGA II can provide an objective determination on parameter estimation, and the parallel program permits a more efficient simulation. It is also demonstrated that the forecasts from ECMWF have more favorable skill scores than other Ensemble Prediction Systems. The multimodel ensembles have advantages over all the single model ensembles, and the multimodel methods weighted on members and skill scores outperform other methods. Furthermore, the overall performance at the three stations can be satisfactory up to ten days; however, hydrological errors can degrade the skill score by approximately 2 days, and the influence persists until a lead time of 10 days with a weakening trend. With respect to peak flows selected by the Peaks Over Threshold approach, the ensemble means from single models or multimodels are generally underestimated, indicating that the ensemble mean can bring overall improvement in forecasting of flows. For

  15. Future intensification of hydro-meteorological extremes: downscaling using the weather research and forecasting model

    NASA Astrophysics Data System (ADS)

    El-Samra, R.; Bou-Zeid, E.; Bangalath, H. K.; Stenchikov, G.; El-Fadel, M.

    2017-12-01

    A set of ten downscaling simulations at high spatial resolution (3 km horizontally) were performed using the Weather Research and Forecasting (WRF) model to generate future climate projections of annual and seasonal temperature and precipitation changes over the Eastern Mediterranean (with a focus on Lebanon). The model was driven with the High Resolution Atmospheric Model (HiRAM), running over the whole globe at a resolution of 25 km, under the conditions of two Representative Concentration Pathways (RCP 4.5 and 8.5). Each downscaling simulation spanned one year. Two past years (2003 and 2008), also forced by HiRAM without data assimilation, were simulated to evaluate the model's ability to capture the cold and wet (2003) and hot and dry (2008) extremes. The downscaled data were in the range of recent observed climatic variability, and were corrected for the cold bias of HiRAM. Eight future years were then selected based on an anomaly score that relies on the mean annual temperature and accumulated precipitation to identify the worst year per decade from a water resources perspective. One hot and dry year per decade, from 2011 to 2050, and per scenario was simulated and compared to the historic 2008 reference. The results indicate that hot and dry future extreme years will be exacerbated and the study area might be exposed to a significant decrease in annual precipitation (rain and snow), reaching up to 30% relative to the current extreme conditions.
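The selection of one "worst" hot-and-dry year per decade could be sketched as follows. The abstract does not give the exact anomaly-score formula, so the standardized temperature-minus-precipitation score below is an assumption for illustration only:

```python
import statistics

def hot_dry_anomaly_scores(mean_temps, annual_precips):
    """Illustrative anomaly score per year: standardized mean annual
    temperature minus standardized accumulated precipitation, so hot and
    dry years score high. (Assumed form; not the authors' definition.)"""
    t_mu, t_sd = statistics.mean(mean_temps), statistics.stdev(mean_temps)
    p_mu, p_sd = statistics.mean(annual_precips), statistics.stdev(annual_precips)
    return [(t - t_mu) / t_sd - (p - p_mu) / p_sd
            for t, p in zip(mean_temps, annual_precips)]

# The highest-scoring year in each decade would be chosen for downscaling.
scores = hot_dry_anomaly_scores([10.0, 12.0, 14.0], [800.0, 600.0, 400.0])
```

Under this assumed score, the warmest and driest year ranks first.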

  16. Ensemble training to improve recognition using 2D ear

    NASA Astrophysics Data System (ADS)

    Middendorff, Christopher; Bowyer, Kevin W.

    2009-05-01

    The ear has gained popularity as a biometric feature due to the robustness of the shape over time and across emotional expression. Popular methods of ear biometrics analyze the ear as a whole, leaving these methods vulnerable to error due to occlusion. Many researchers explore ear recognition using an ensemble, but none present a method for designing the individual parts that comprise the ensemble. In this work, we introduce a method of modifying the ensemble shapes to improve performance. We determine how different properties of an ensemble training system can affect overall performance. We show that ensembles built from small parts will outperform ensembles built with larger parts, and that incorporating a large number of parts improves the performance of the ensemble.

  17. Assessing drought risk under climate change in the US Great Plains via evaporative demand from downscaled GCM projections

    NASA Astrophysics Data System (ADS)

    Dewes, C.; Rangwala, I.; Hobbins, M.; Barsugli, J. J.

    2016-12-01

    Drought conditions in the US Great Plains occur primarily in response to periods of low precipitation, but they can be exacerbated by enhanced evaporative demand (E0) during periods of elevated temperatures, radiation, advection, and/or decreased humidity. A number of studies project severe to unprecedented drought conditions for this region later in the 21st century. Yet, we have found that methodological choices in the estimation of E0 and the selection of global climate model (GCM) output account for large uncertainties in projections of drought risk. Furthermore, the coarse resolution of GCMs offers little usability for drought risk assessments applied to socio-ecological systems, and users of climate data for that purpose tend to prefer existing downscaled products. Here we derive a physically based estimation of E0 - the FAO56 Penman-Monteith reference evapotranspiration - using driving variables from the Multivariate Adaptive Constructed Analogs (MACA) dataset, which have a spatial resolution of approximately 4 km. We select downscaled outputs from five CMIP5 GCMs, whereby we aim to represent different scenarios for the future of the Great Plains region (e.g. warm/wet, hot/dry, etc.). While this downscaling methodology removes GCM bias relative to a gridded product for historical data (METDATA), we first examine the remaining bias relative to ground (point) estimates of E0. Next we assess whether the downscaled products preserve the variability of their parent GCMs, in both historical and future (RCP8.5) projections. We then use the E0 estimates to compute multi-scale time series of drought indices such as the Evaporative Demand Drought Index (EDDI) and the Standardized Precipitation-Evaporation Index (SPEI) over the Great Plains region. We also attribute variability and drought anomalies to each of the driving parameters, to tease out the influence of specific model biases and evaluate geographical nuances of E0 drivers. 
Aside from improved understanding of
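The FAO-56 Penman-Monteith reference evapotranspiration used above as the physically based E0 estimate can be sketched in a condensed daily form. The preparation of inputs (net radiation, vapour pressure from humidity, wind height adjustment) is omitted, and the example values are illustrative:

```python
import math

def fao56_et0(t_mean_c, rn, u2, ea, g=0.0, gamma=0.066):
    """Daily FAO-56 Penman-Monteith reference evapotranspiration (mm/day).
    t_mean_c: mean air temperature (deg C); rn: net radiation (MJ m-2 d-1);
    u2: wind speed at 2 m (m/s); ea: actual vapour pressure (kPa);
    g: soil heat flux (MJ m-2 d-1); gamma: psychrometric constant (kPa/degC)."""
    # Saturation vapour pressure and the slope of its curve at t_mean_c
    es = 0.6108 * math.exp(17.27 * t_mean_c / (t_mean_c + 237.3))
    delta = 4098.0 * es / (t_mean_c + 237.3) ** 2
    num = (0.408 * delta * (rn - g)
           + gamma * 900.0 / (t_mean_c + 273.0) * u2 * (es - ea))
    return num / (delta + gamma * (1.0 + 0.34 * u2))

# Illustrative warm-season inputs
et0 = fao56_et0(t_mean_c=25.0, rn=15.0, u2=2.0, ea=1.5)
```

Because E0 depends jointly on temperature, radiation, wind, and humidity, biases in any of the downscaled driving variables propagate into the drought indices computed from it.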

  18. Ensemble method for dengue prediction

    PubMed Central

    Baugher, Benjamin; Moniz, Linda J.; Bagley, Thomas; Babin, Steven M.; Guven, Erhan

    2018-01-01

    Background In the 2015 NOAA Dengue Challenge, participants made three dengue target predictions for two locations (Iquitos, Peru, and San Juan, Puerto Rico) during four dengue seasons: 1) peak height (i.e., maximum weekly number of cases during a transmission season); 2) peak week (i.e., week in which the maximum weekly number of cases occurred); and 3) total number of cases reported during a transmission season. A dengue transmission season is the 12-month period commencing with the location-specific, historical week with the lowest number of cases. At the beginning of the Dengue Challenge, participants were provided with the same input data for developing the models, with the prediction testing data provided at a later date. Methods Our approach used ensemble models created by combining three disparate types of component models: 1) two-dimensional Method of Analogues models incorporating both dengue and climate data; 2) additive seasonal Holt-Winters models with and without wavelet smoothing; and 3) simple historical models. Of the individual component models created, those with the best performance on the prior four years of data were incorporated into the ensemble models. There were separate ensembles for predicting each of the three targets at each of the two locations. Principal findings Our ensemble models scored higher for peak height and total dengue case counts reported in a transmission season for Iquitos than all other models submitted to the Dengue Challenge. However, the ensemble models did not do nearly as well when predicting the peak week. Conclusions The Dengue Challenge organizers scored the dengue predictions of the Challenge participant groups. Our ensemble approach was the best in predicting the total number of dengue cases reported for transmission season and peak height for Iquitos, Peru. PMID:29298320

  19. Ensemble method for dengue prediction.

    PubMed

    Buczak, Anna L; Baugher, Benjamin; Moniz, Linda J; Bagley, Thomas; Babin, Steven M; Guven, Erhan

    2018-01-01

    In the 2015 NOAA Dengue Challenge, participants made three dengue target predictions for two locations (Iquitos, Peru, and San Juan, Puerto Rico) during four dengue seasons: 1) peak height (i.e., maximum weekly number of cases during a transmission season); 2) peak week (i.e., week in which the maximum weekly number of cases occurred); and 3) total number of cases reported during a transmission season. A dengue transmission season is the 12-month period commencing with the location-specific, historical week with the lowest number of cases. At the beginning of the Dengue Challenge, participants were provided with the same input data for developing the models, with the prediction testing data provided at a later date. Our approach used ensemble models created by combining three disparate types of component models: 1) two-dimensional Method of Analogues models incorporating both dengue and climate data; 2) additive seasonal Holt-Winters models with and without wavelet smoothing; and 3) simple historical models. Of the individual component models created, those with the best performance on the prior four years of data were incorporated into the ensemble models. There were separate ensembles for predicting each of the three targets at each of the two locations. Our ensemble models scored higher for peak height and total dengue case counts reported in a transmission season for Iquitos than all other models submitted to the Dengue Challenge. However, the ensemble models did not do nearly as well when predicting the peak week. The Dengue Challenge organizers scored the dengue predictions of the Challenge participant groups. Our ensemble approach was the best in predicting the total number of dengue cases reported for transmission season and peak height for Iquitos, Peru.
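Combining disparate component models into one ensemble forecast, as above, can be sketched minimally as a skill-weighted average. The weighting rule below is illustrative (the abstract describes selecting the best-performing components, not a specific combination formula):

```python
def skill_weighted_ensemble(predictions, skills):
    """Combine component-model predictions of a single target (e.g. peak
    height) into one ensemble forecast, weighting each model in proportion
    to its skill score on held-out seasons. Illustrative scheme only."""
    total = float(sum(skills))
    return sum(p * s / total for p, s in zip(predictions, skills))

# Three hypothetical component predictions of seasonal case totals,
# with the third model judged twice as skillful on past data.
forecast = skill_weighted_ensemble([100.0, 120.0, 140.0], [1.0, 1.0, 2.0])
```

Separate ensembles per target and per location, as in the study, would simply call this with different component sets.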

  20. Modeling diurnal land temperature cycles over Los Angeles using downscaled GOES imagery

    NASA Astrophysics Data System (ADS)

    Weng, Qihao; Fu, Peng

    2014-11-01

    Land surface temperature is a key parameter for monitoring urban heat islands, assessing heat-related risks, and estimating building energy consumption. These environmental issues are characterized by high temporal variability. A possible solution from the remote sensing perspective is to utilize geostationary satellite images, for instance from the Geostationary Operational Environmental Satellite (GOES) system and Meteosat Second Generation (MSG). These satellite systems, however, offer coarse spatial but high temporal resolution (sub-hourly imagery at 3-10 km resolution), which often limits their usage to meteorological forecasting and global climate modeling. Therefore, efficient and effective methods to disaggregate these coarse-resolution images to a scale suitable for regional and local studies need to be explored. In this study, we propose a least squares support vector machine (LSSVM) method to downscale GOES image data to half-hourly 1-km LSTs by fusing it with MODIS data products and Shuttle Radar Topography Mission (SRTM) digital elevation data. The results suggest that the proposed method successfully disaggregated GOES images to half-hourly 1-km LSTs with an accuracy of approximately 2.5 K when validated against MODIS LSTs at the same overpass time. The synthetic LST datasets were further explored for monitoring of the surface urban heat island (UHI) in the Los Angeles region by extracting key diurnal temperature cycle (DTC) parameters. The datasets and DTC-derived parameters were found more suitable for monitoring daytime than nighttime UHI. With the downscaled GOES 1-km LSTs, diurnal temperature variations can be well characterized. An accuracy of about 2.5 K was achieved for the fitted results at both 1 km and 5 km resolutions.

  1. Mass Balance Modelling of Saskatchewan Glacier, Canada Using Empirically Downscaled Reanalysis Data

    NASA Astrophysics Data System (ADS)

    Larouche, O.; Kinnard, C.; Demuth, M. N.

    2017-12-01

    Observations show that glaciers around the world are retreating. As sites with long-term mass balance observations are scarce, models are needed to reconstruct glacier mass balance and assess its sensitivity to climate. In regions with discontinuous and/or sparse meteorological data, high-resolution climate reanalysis data provide a convenient alternative to in situ weather observations, but can also suffer from strong bias due to the spatial and temporal scale mismatch. In this study we used data from the North American Regional Reanalysis (NARR) project with a 30 x 30 km spatial resolution and 3-hour temporal resolution to produce the meteorological forcings needed to drive a physically-based, distributed glacier mass balance model (DEBAM, Hock and Holmgren 2005) for the historical period 1979-2016. A two-year record from an automatic weather station (AWS) operated on Saskatchewan Glacier (2014-2016) was used to downscale air temperature, relative humidity, wind speed and incoming solar radiation from the nearest NARR gridpoint to the glacier AWS site. A homogenized historical precipitation record was produced using data from two nearby, low-elevation weather stations and used to downscale the NARR precipitation data. Three bias correction methods were applied (scaling, delta and empirical quantile mapping - EQM) and evaluated using split-sample cross-validation. The EQM method gave better results for precipitation and for air temperature. Only a slight improvement in the relative humidity was obtained using the scaling method, while none of the methods improved the wind speed. The latter correlates poorly with AWS observations, probably because the local glacier wind is decoupled from the larger-scale NARR wind field. The downscaled data was used to drive the DEBAM model in order to reconstruct the mass balance of Saskatchewan Glacier over the past 30 years. 
The model was validated using recent snow thickness measurements and previously published geodetic mass
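Of the three bias-correction methods evaluated above, empirical quantile mapping (EQM) can be sketched as follows. This is a minimal version for illustration; operational EQM handles distribution tails, wet-day frequency for precipitation, and seasonality more carefully:

```python
import numpy as np

def eqm_correct(model_hist, obs_hist, model_new):
    """Empirical quantile mapping bias correction: each new model value is
    mapped to the observed value at the same empirical quantile of the
    calibration period."""
    mh = np.sort(np.asarray(model_hist, dtype=float))
    oh = np.sort(np.asarray(obs_hist, dtype=float))
    pm = (np.arange(mh.size) + 0.5) / mh.size  # plotting positions, model
    po = (np.arange(oh.size) + 0.5) / oh.size  # plotting positions, obs
    q = np.interp(model_new, mh, pm)           # quantile in model climatology
    return np.interp(q, po, oh)                # observed quantile function

# A model series with a constant +2 bias is pulled back onto the observations.
corrected = eqm_correct([3, 4, 5, 6, 7], [1, 2, 3, 4, 5], [5.0])
```

The delta and scaling methods correct only the mean (additively or multiplicatively), whereas EQM corrects the whole distribution, which is consistent with it performing best here for precipitation and temperature.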

  2. Developing a regional retrospective ensemble precipitation dataset for watershed hydrology modeling, Idaho, USA

    NASA Astrophysics Data System (ADS)

    Flores, A. N.; Smith, K.; LaPorte, P.

    2011-12-01

    Applications like flood forecasting, military trafficability assessment, and slope stability analysis necessitate the use of models capable of resolving hydrologic states and fluxes at spatial scales of hillslopes (e.g., 10s to 100s m). These models typically require precipitation forcings at spatial scales of kilometers or better and time intervals of hours. Yet in especially rugged terrain that typifies much of the Western US and throughout much of the developing world, precipitation data at these spatiotemporal resolutions is difficult to come by. Ground-based weather radars have significant problems in high-relief settings and are sparsely located, leaving significant gaps in coverage and high uncertainties. Precipitation gages provide accurate data at points but are very sparsely located and their placement is often not representative, yielding significant coverage gaps in a spatial and physiographic sense. Numerical weather prediction efforts have made precipitation data, including critically important information on precipitation phase, available globally and in near real-time. However, these datasets present watershed modelers with two problems: (1) spatial scales of many of these datasets are tens of kilometers or coarser, (2) numerical weather models used to generate these datasets include a land surface parameterization that in some circumstances can significantly affect precipitation predictions. We report on the development of a regional precipitation dataset for Idaho that leverages: (1) a dataset derived from a numerical weather prediction model, (2) gages within Idaho that report hourly precipitation data, and (3) a long-term precipitation climatology dataset. Hourly precipitation estimates from the Modern Era Retrospective-analysis for Research and Applications (MERRA) are stochastically downscaled using a hybrid orographic and statistical model from their native resolution (1/2 x 2/3 degrees) to a resolution of approximately 1 km. Downscaled

  3. Toward Robust and Efficient Climate Downscaling for Wind Energy

    NASA Astrophysics Data System (ADS)

    Vanvyve, E.; Rife, D.; Pinto, J. O.; Monaghan, A. J.; Davis, C. A.

    2011-12-01

    This presentation describes a more accurate and economical (less time, money and effort) wind resource assessment technique for the renewable energy industry, that incorporates innovative statistical techniques and new global mesoscale reanalyses. The technique judiciously selects a collection of "case days" that accurately represent the full range of wind conditions observed at a given site over a 10-year period, in order to estimate the long-term energy yield. We will demonstrate that this new technique provides a very accurate and statistically reliable estimate of the 10-year record of the wind resource by intelligently choosing a sample of approximately 120 case days. This means that the expense of downscaling to quantify the wind resource at a prospective wind farm can be cut by two thirds from the current industry practice of downscaling a randomly chosen 365-day sample to represent winds over a "typical" year. This new estimate of the long-term energy yield at a prospective wind farm also has far less statistical uncertainty than the current industry standard approach. This key finding has the potential to reduce significantly market barriers to both onshore and offshore wind farm development, since insurers and financiers charge prohibitive premiums on investments that are deemed to be high risk. Lower uncertainty directly translates to lower perceived risk, and therefore far more attractive financing terms could be offered to wind farm developers who employ this new technique.

  4. An information-theoretical perspective on weighted ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Weijs, Steven V.; van de Giesen, Nick

    2013-08-01

    This paper presents an information-theoretical method for weighting ensemble forecasts with new information. Weighted ensemble forecasts can be used to adjust the distribution that an existing ensemble of time series represents, without modifying the values in the ensemble itself. The weighting can, for example, add new seasonal forecast information in an existing ensemble of historically measured time series that represents climatic uncertainty. A recent article in this journal compared several methods to determine the weights for the ensemble members and introduced the pdf-ratio method. In this article, a new method, the minimum relative entropy update (MRE-update), is presented. Based on the principle of minimum discrimination information, an extension of the principle of maximum entropy (POME), the method ensures that no more information is added to the ensemble than is present in the forecast. This is achieved by minimizing relative entropy, with the forecast information imposed as constraints. From this same perspective, an information-theoretical view on the various weighting methods is presented. The MRE-update is compared with the existing methods and the parallels with the pdf-ratio method are analysed. The paper provides a new, information-theoretical justification for one version of the pdf-ratio method that turns out to be equivalent to the MRE-update. All other methods result in sets of ensemble weights that, seen from the information-theoretical perspective, add either too little or too much (i.e. fictitious) information to the ensemble.
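The MRE-update described above has a simple closed form when the forecast information is an exceedance probability: minimizing relative entropy to uniform weights subject to P(x > threshold) = p means the members above the threshold share probability mass p equally, and the members below share 1 - p. A hedged sketch of that special case (it assumes members fall on both sides of the threshold):

```python
def mre_update_exceedance(members, threshold, p_exceed):
    """Minimum-relative-entropy update of uniform ensemble weights given a
    forecast exceedance probability P(x > threshold) = p_exceed. For an
    indicator constraint the optimal weights are piecewise constant:
    members above the threshold split p_exceed, the rest split 1 - p_exceed.
    Assumes at least one member on each side of the threshold."""
    above = [m > threshold for m in members]
    n_above = sum(above)
    n_below = len(members) - n_above
    return [p_exceed / n_above if a else (1.0 - p_exceed) / n_below
            for a in above]

# A seasonal forecast saying wet conditions (x > 2.5) have probability 0.7
w = mre_update_exceedance([1.0, 2.0, 3.0, 4.0], 2.5, 0.7)
```

The member values themselves are untouched; only their weights change, exactly as the weighting framework above requires.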

  5. Contact planarization of ensemble nanowires

    NASA Astrophysics Data System (ADS)

    Chia, A. C. E.; LaPierre, R. R.

    2011-06-01

    The viability of four organic polymers (S1808, SC200, SU8 and Cyclotene) as filling materials to achieve planarization of ensemble nanowire arrays is reported. Analysis of the porosity, surface roughness and thermal stability of each filling material was performed. Sonication was used as an effective method to remove the tops of the nanowires (NWs) to achieve complete planarization. Ensemble nanowire devices were fully fabricated and I-V measurements confirmed that Cyclotene effectively planarizes the NWs while still serving as an insulating layer between the top and bottom contacts. These processes and analysis can be easily implemented into future characterization and fabrication of ensemble NWs for optoelectronic device applications.

  6. Contact planarization of ensemble nanowires.

    PubMed

    Chia, A C E; LaPierre, R R

    2011-06-17

    The viability of four organic polymers (S1808, SC200, SU8 and Cyclotene) as filling materials to achieve planarization of ensemble nanowire arrays is reported. Analysis of the porosity, surface roughness and thermal stability of each filling material was performed. Sonication was used as an effective method to remove the tops of the nanowires (NWs) to achieve complete planarization. Ensemble nanowire devices were fully fabricated and I-V measurements confirmed that Cyclotene effectively planarizes the NWs while still serving as an insulating layer between the top and bottom contacts. These processes and analysis can be easily implemented into future characterization and fabrication of ensemble NWs for optoelectronic device applications.

  7. Challenges in Visual Analysis of Ensembles

    DOE PAGES

    Crossno, Patricia

    2018-04-12

    Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.

  8. Challenges in Visual Analysis of Ensembles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crossno, Patricia

    Modeling physical phenomena through computational simulation increasingly relies on generating a collection of related runs, known as an ensemble. In this paper, we explore the challenges we face in developing analysis and visualization systems for large and complex ensemble data sets, which we seek to understand without having to view the results of every simulation run. Implementing approaches and ideas developed in response to this goal, we demonstrate the analysis of a 15K run material fracturing study using Slycat, our ensemble analysis system.

  9. Random matrix ensembles for many-body quantum systems

    NASA Astrophysics Data System (ADS)

    Vyas, Manan; Seligman, Thomas H.

    2018-04-01

    Classical random matrix ensembles were originally introduced in physics to approximate quantum many-particle nuclear interactions. However, there exists a plethora of quantum systems whose dynamics is explained in terms of few-particle (predominantly two-particle) interactions. The random matrix models incorporating the few-particle nature of interactions are known as embedded random matrix ensembles. In the present paper, we provide a brief overview of these two ensembles and illustrate how the embedded ensembles can be successfully used to study decoherence of a qubit interacting with an environment, both for fermionic and bosonic embedded ensembles. Numerical calculations show the dependence of decoherence on the nature of the environment.

  10. Reproducing multi-model ensemble average with Ensemble-averaged Reconstructed Forcings (ERF) in regional climate modeling

    NASA Astrophysics Data System (ADS)

    Erfanian, A.; Fomenko, L.; Wang, G.

    2016-12-01

    The multi-model ensemble (MME) average is widely considered the most reliable approach for simulating both present-day and future climates, and it has been a primary reference for drawing conclusions in major coordinated studies such as the IPCC Assessment Reports and CORDEX. The biases of individual models cancel each other out in the MME average, enabling the ensemble mean to outperform individual members in simulating the mean climate. This enhancement, however, comes with tremendous computational cost, which is especially inhibiting for regional climate modeling, where model uncertainties can originate from both RCMs and the driving GCMs. Here we propose the Ensemble-averaged Reconstructed Forcings (ERF) approach to regional climate modeling, which achieves a similar level of bias reduction at a fraction of the cost of the conventional MME approach. The new method constructs a single set of initial and boundary conditions (IBCs) by averaging the IBCs of multiple GCMs, and drives the RCM with this ensemble average of IBCs in a single run. Using a regional climate model (RegCM4.3.4-CLM4.5), we tested the method over West Africa for multiple combinations of (up to six) GCMs. Our results indicate that the performance of the ERF method is comparable to that of the MME average in simulating the mean climate. The bias reduction seen in ERF simulations is achieved by using more realistic IBCs in solving the system of equations underlying the RCM physics and dynamics. This endows the new method with a theoretical advantage in addition to reducing computational cost: the ERF output is an unaltered solution of the RCM, as opposed to a climate state that might not be physically plausible due to the averaging of multiple solutions in the conventional MME approach. The ERF approach should be considered for use in major international efforts such as CORDEX. Keywords: multi-model ensemble, ensemble analysis, ERF, regional climate modeling
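The cost argument can be illustrated with a toy contrast between the two strategies (this is not the authors' code; the stand-in "RCM" and the numbers are invented): the conventional MME averages the outputs of N runs, while ERF averages the forcings and performs a single run.

```python
import numpy as np

rng = np.random.default_rng(42)

def rcm(ibc):
    """Stand-in for one RCM integration: a nonlinear map from initial/boundary
    conditions (IBCs) to a simulated field. Purely illustrative."""
    return np.tanh(ibc) + 0.1 * ibc ** 2

# IBC "fields" from six hypothetical driving GCMs (1-D toy arrays)
gcm_ibcs = [rng.normal(size=100) for _ in range(6)]

# Conventional MME: six RCM runs, then average the outputs
mme_mean = np.mean([rcm(ibc) for ibc in gcm_ibcs], axis=0)

# ERF: average the IBCs first, then one RCM run (1/6 of the cost here)
erf_run = rcm(np.mean(gcm_ibcs, axis=0))
```

Because the model is nonlinear, averaging inputs and averaging outputs give different fields; the ERF result is a single dynamically consistent model solution rather than an average of several solutions.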

  11. Minimalist ensemble algorithms for genome-wide protein localization prediction.

    PubMed

    Lin, Jhih-Rong; Mondal, Ananda Mohan; Liu, Rong; Hu, Jianjun

    2012-07-03

    Computational prediction of protein subcellular localization can greatly help to elucidate protein function. Despite the existence of dozens of protein localization prediction algorithms, prediction accuracy and coverage remain low. Several ensemble algorithms have been proposed to improve the prediction performance, usually including as many as 10 or more individual localization algorithms. However, their performance is still limited by running complexity and redundancy among the individual prediction algorithms. This paper proposes a novel method for the rational design of minimalist ensemble algorithms for practical genome-wide protein subcellular localization prediction. The algorithm combines a feature-selection-based filter with a logistic regression classifier. Using a novel concept of contribution scores, we analyzed issues of algorithm redundancy, consensus mistakes, and algorithm complementarity in designing ensemble algorithms. We applied the proposed minimalist logistic regression (LR) ensemble algorithm to two genome-wide datasets of Yeast and Human and compared its performance with current ensemble algorithms. Experimental results showed that the minimalist ensemble algorithm can achieve high prediction accuracy with only 1/3 to 1/2 of the individual predictors of current ensemble algorithms, which greatly reduces computational complexity and running time. High-performance ensemble algorithms were found to be usually composed of predictors that together cover most of the available features. Compared to the best individual predictor, our ensemble algorithm improved the prediction accuracy from an AUC score of 0.558 to 0.707 for the Yeast dataset and from 0.628 to 0.646 for the Human dataset. Compared with popular weighted-voting-based ensemble algorithms, our classifier-based ensemble algorithms achieved much better performance without suffering from the inclusion of too many individual predictors.
We proposed a method for rational design
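The combiner at the heart of such a classifier-based ensemble, i.e. a logistic regression over the outputs of a few individual predictors, can be sketched in a few lines. Everything below (the synthetic predictor scores and the plain gradient-descent fit) is illustrative and is not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Scores from three hypothetical individual localization predictors for 200
# proteins: one informative, one weaker, one uninformative (redundant)
n = 200
truth = rng.integers(0, 2, size=n)   # 1 = protein localizes to the compartment
scores = np.column_stack([
    truth + rng.normal(0.0, 0.8, n),
    truth + rng.normal(0.0, 1.5, n),
    rng.normal(0.0, 1.0, n),
])

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression used as the ensemble combiner."""
    Xb = np.column_stack([np.ones(len(X)), X])   # prepend an intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(Xb @ w)))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

w = fit_logistic(scores, truth)
p = 1.0 / (1.0 + np.exp(-(np.column_stack([np.ones(n), scores]) @ w)))
acc = np.mean((p > 0.5) == truth)
```

Inspecting the fitted weights shows the minimalist idea in miniature: the classifier learns to rely on the informative predictors, so an uninformative or redundant one contributes little and could be dropped from the ensemble.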

  12. Minimalist ensemble algorithms for genome-wide protein localization prediction

    PubMed Central

    2012-01-01

    Background Computational prediction of protein subcellular localization can greatly help to elucidate its functions. Despite the existence of dozens of protein localization prediction algorithms, the prediction accuracy and coverage are still low. Several ensemble algorithms have been proposed to improve the prediction performance, which usually include as many as 10 or more individual localization algorithms. However, their performance is still limited by the running complexity and redundancy among individual prediction algorithms. Results This paper proposed a novel method for rational design of minimalist ensemble algorithms for practical genome-wide protein subcellular localization prediction. The algorithm is based on combining a feature selection based filter and a logistic regression classifier. Using a novel concept of contribution scores, we analyzed issues of algorithm redundancy, consensus mistakes, and algorithm complementarity in designing ensemble algorithms. We applied the proposed minimalist logistic regression (LR) ensemble algorithm to two genome-wide datasets of Yeast and Human and compared its performance with current ensemble algorithms. Experimental results showed that the minimalist ensemble algorithm can achieve high prediction accuracy with only 1/3 to 1/2 of the individual predictors of current ensemble algorithms, which greatly reduces computational complexity and running time. It was found that high-performance ensemble algorithms are usually composed of predictors that together cover most of the available features. Compared to the best individual predictor, our ensemble algorithm improved the prediction accuracy from an AUC score of 0.558 to 0.707 for the Yeast dataset and from 0.628 to 0.646 for the Human dataset. Compared with popular weighted voting based ensemble algorithms, our classifier-based ensemble algorithms achieved much better performance without suffering from inclusion of too many individual predictors. Conclusions We

  13. Assessing the cumulative environmental effects of marine renewable energy developments: Establishing common ground.

    PubMed

    Willsteed, Edward; Gill, Andrew B; Birchenough, Silvana N R; Jude, Simon

    2017-01-15

    Assessing and managing the cumulative impacts of human activities on the environment remains a major challenge to sustainable development. This challenge is highlighted by the worldwide expansion of marine renewable energy developments (MREDs) in areas already subject to multiple activities and climate change. Cumulative effects assessments in theory provide decision makers with adequate information about how the environment will respond to the incremental effects of licensed activities and are a legal requirement in many nations. In practice, however, such assessments are beset by uncertainties, resulting in substantial delays during the licensing process that reduce MRED investor confidence and limit progress towards meeting climate change targets. In light of these targets and ambitions to manage the marine environment sustainably, reducing the uncertainty surrounding MRED effects and cumulative effects assessment is timely and vital. This review investigates the origins and evolution of cumulative effects assessment to identify why the multitude of approaches and pertinent research have emerged, and discusses key considerations and challenges relevant to assessing the cumulative effects of MREDs and other activities on ecosystems. The review recommends a shift away from the current reliance on disparate environmental impact assessments and limited strategic environmental assessments, and a move towards establishing a common system of coordinated data and research relative to ecologically meaningful areas, focussed on the needs of decision makers tasked with protecting and conserving marine ecosystems and services. Copyright © 2016. Published by Elsevier B.V.

  14. Statistical Downscaling of General Circulation Model Outputs to Precipitation Accounting for Non-Stationarities in Predictor-Predictand Relationships

    PubMed Central

    Sachindra, D. A.; Perera, B. J. C.

    2016-01-01

    This paper presents a novel approach to incorporate the non-stationarities characterised in the GCM outputs, into the Predictor-Predictand Relationships (PPRs) in statistical downscaling models. In this approach, a series of 42 PPRs based on multi-linear regression (MLR) technique were determined for each calendar month using a 20-year moving window moved at a 1-year time step on the predictor data obtained from the NCEP/NCAR reanalysis data archive and observations of precipitation at 3 stations located in Victoria, Australia, for the period 1950–2010. Then the relationships between the constants and coefficients in the PPRs and the statistics of reanalysis data of predictors were determined for the period 1950–2010, for each calendar month. Thereafter, using these relationships with the statistics of the past data of HadCM3 GCM pertaining to the predictors, new PPRs were derived for the periods 1950–69, 1970–89 and 1990–99 for each station. This process yielded a non-stationary downscaling model consisting of a PPR per calendar month for each of the above three periods for each station. The non-stationarities in the climate are characterised by the long-term changes in the statistics of the climate variables, and the above process enabled relating the non-stationarities in the climate to the PPRs. These new PPRs were then used with the past data of HadCM3, to reproduce the observed precipitation. It was found that the non-stationary MLR based downscaling model was able to produce more accurate simulations of observed precipitation more often than conventional stationary downscaling models developed with MLR and Genetic Programming (GP). PMID:27997609
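The moving-window regression underlying the PPRs can be sketched as follows. The data and function names are hypothetical, but with a 20-year window stepped annually over 1950-2010 the procedure yields the 42 windows the abstract mentions:

```python
import numpy as np

def moving_window_pprs(X, y, years, window=20, step=1):
    """Fit one multi-linear PPR per moving window (default 20 years, stepped
    1 year). Returns window start years and [intercept, coefs] per window."""
    starts, params = [], []
    y0, y1 = int(years.min()), int(years.max())
    for start in range(y0, y1 - window + 2, step):
        mask = (years >= start) & (years < start + window)
        Xb = np.column_stack([np.ones(mask.sum()), X[mask]])
        beta, *_ = np.linalg.lstsq(Xb, y[mask], rcond=None)
        starts.append(start)
        params.append(beta)
    return np.array(starts), np.array(params)

# Synthetic example: monthly rows for 1950-2010, two predictors, known PPR
rng = np.random.default_rng(1)
years = np.repeat(np.arange(1950, 2011), 12)
X = rng.normal(size=(years.size, 2))
y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(0, 0.1, years.size)

starts, params = moving_window_pprs(X, y, years)
```

Tracking how the fitted constants and coefficients drift from window to window is what lets the method relate non-stationarities in the climate statistics to the PPRs.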

  15. Statistical Downscaling of General Circulation Model Outputs to Precipitation Accounting for Non-Stationarities in Predictor-Predictand Relationships.

    PubMed

    Sachindra, D A; Perera, B J C

    2016-01-01

    This paper presents a novel approach to incorporate the non-stationarities characterised in the GCM outputs, into the Predictor-Predictand Relationships (PPRs) in statistical downscaling models. In this approach, a series of 42 PPRs based on multi-linear regression (MLR) technique were determined for each calendar month using a 20-year moving window moved at a 1-year time step on the predictor data obtained from the NCEP/NCAR reanalysis data archive and observations of precipitation at 3 stations located in Victoria, Australia, for the period 1950-2010. Then the relationships between the constants and coefficients in the PPRs and the statistics of reanalysis data of predictors were determined for the period 1950-2010, for each calendar month. Thereafter, using these relationships with the statistics of the past data of HadCM3 GCM pertaining to the predictors, new PPRs were derived for the periods 1950-69, 1970-89 and 1990-99 for each station. This process yielded a non-stationary downscaling model consisting of a PPR per calendar month for each of the above three periods for each station. The non-stationarities in the climate are characterised by the long-term changes in the statistics of the climate variables, and the above process enabled relating the non-stationarities in the climate to the PPRs. These new PPRs were then used with the past data of HadCM3, to reproduce the observed precipitation. It was found that the non-stationary MLR based downscaling model was able to produce more accurate simulations of observed precipitation more often than conventional stationary downscaling models developed with MLR and Genetic Programming (GP).

  16. HEPS4Power - Extended-range Hydrometeorological Ensemble Predictions for Improved Hydropower Operations and Revenues

    NASA Astrophysics Data System (ADS)

    Bogner, Konrad; Monhart, Samuel; Liniger, Mark; Spririg, Christoph; Jordan, Fred; Zappa, Massimiliano

    2015-04-01

    In recent years, considerable progress has been achieved in the operational prediction of floods and hydrological droughts with lead times of up to ten days. Both the public and the private sectors currently use probabilistic runoff forecasts to monitor water resources and take action when critical conditions are expected. The use of extended-range predictions with lead times exceeding 10 days is not yet established. The hydropower sector in particular might benefit greatly from hydrometeorological forecasts for the next 15 to 60 days in order to optimize the operations and revenues of its watersheds, dams, intakes, turbines and pumps. The new Swiss Competence Centers in Energy Research (SCCER) aim at boosting energy-related research in Switzerland. The objective of HEPS4POWER is to demonstrate that operational extended-range hydrometeorological forecasts have the potential to become very valuable tools for fine-tuning the production of energy from hydropower systems. The project team covers a specific system-oriented value chain, starting from the collection and forecasting of meteorological data (MeteoSwiss), leading to the operational application of state-of-the-art hydrological models (WSL), and terminating with experience in data presentation and power production forecasts for end-users (e-dric.ch). The first task of HEPS4POWER will be the downscaling and post-processing of ensemble extended-range meteorological forecasts (EPS). The goal is to provide well-tailored forecasts of a probabilistic nature that are statistically reliable and localized at catchment or even station level. The hydrology-related task will consist of feeding the post-processed meteorological forecasts into a HEPS using a multi-model approach, implementing models of different complexity.
Also in the case of the hydrological ensemble predictions, post-processing techniques need to be tested in order to improve the quality of the

  17. Examining South Atlantic Subtropical Cyclone Anita using the Satellite-Enhanced Regional Downscaling for Applied Studies Hourly Outputs

    NASA Astrophysics Data System (ADS)

    Vaicberg, H.; Palmeira, A. C. P. A.; Nunes, A.

    2017-12-01

    Studies on South Atlantic cyclones are mainly compromised by the scarcity of observations. Therefore, remote sensing and global (re)analysis products are usually employed in investigations of their evolution. However, the frequent use of global reanalyses can hinder the assessment of the characteristics of the cyclones found in the South Atlantic. In that regard, studies on "subtropical" cyclones have been performed using the 25-km resolution Satellite-enhanced Regional Downscaling for Applied Studies (SRDAS), a product developed at the Federal University of Rio de Janeiro in Brazil. In SRDAS, the Regional Spectral Model assimilates precipitation estimates from environmental satellites while dynamically downscaling a global reanalysis, using the spectral nudging technique to keep the large-scale features of the regional model solution in agreement with the driving reanalysis. The use of regional models in the downscaling of general circulation models provides more detailed information on weather and climate. As a way of illustrating the usefulness of SRDAS in the study of subtropical South Atlantic cyclones, the subtropical cyclone Anita was selected because of its intensity. Anita developed near the Brazilian south/southeast coast, causing damage to local communities. Comparisons with available observations demonstrated the skill of SRDAS in simulating such an extreme event.

  18. Post-processing Seasonal Precipitation Forecasts via Integrating Climate Indices and the Analog Approach

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhang, Y.; Wood, A.; Lee, H. S.; Wu, L.; Schaake, J. C.

    2016-12-01

    Seasonal precipitation forecasts are a primary driver for seasonal streamflow prediction, which is critical for a range of water resources applications such as reservoir operations and drought management. However, it is well known that seasonal precipitation forecasts from climate models are often biased and too coarse in spatial resolution for hydrologic applications. Therefore, post-processing procedures such as downscaling and bias correction are often needed. In this presentation, we discuss results from a recent study that applies a two-step methodology to downscale and correct the ensemble mean precipitation forecasts from the Climate Forecast System (CFS). First, CFS forecasts are downscaled and bias corrected using monthly reforecast analogs: we identify past precipitation forecasts that are similar to the current forecast, and then use the finer-scale observational analysis fields from the corresponding dates to represent the post-processed ensemble forecasts. Second, we construct the posterior distribution of forecast precipitation from the post-processed ensemble by integrating climate indices: a correlation analysis is performed to identify dominant climate indices for the study region, which are then used to weight the analysis analogs selected in the first step using a Bayesian approach. The methodology is applied to the California Nevada River Forecast Center (CNRFC) and the Middle Atlantic River Forecast Center (MARFC) regions for 1982-2015, using the North American Land Data Assimilation System (NLDAS-2) precipitation as the analysis. The results from cross validation show that the post-processed CFS precipitation forecasts are considerably more skillful than the raw CFS with the analog approach alone. Integrating climate indices can further improve the skill if the number of ensemble members considered is large enough; however, the improvement is generally limited to the first couple of months when compared against climatology. Impacts of
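The first (analog) step can be sketched as follows: rank past forecasts by their distance to the current forecast and return the observational analyses from the matching dates as the post-processed ensemble. The data and helper below are invented for illustration and are not the study's code:

```python
import numpy as np

def analog_ensemble(current_fcst, past_fcsts, past_obs, k=15):
    """Pick the k past forecasts closest (by RMSE) to the current coarse
    forecast; return the matching finer-scale observed/analysis fields
    as the post-processed ensemble."""
    rmse = np.sqrt(np.mean((past_fcsts - current_fcst) ** 2, axis=1))
    idx = np.argsort(rmse)[:k]
    return past_obs[idx]

rng = np.random.default_rng(7)
past_fcsts = rng.gamma(2.0, 2.0, size=(400, 50))  # 400 past monthly forecasts
past_obs = past_fcsts / 1.5                        # toy "analysis": biased forecasts
current = past_fcsts[123]                          # pretend this is today's forecast

post = analog_ensemble(current, past_fcsts, past_obs, k=15)
post_mean = post.mean(axis=0)
```

The second step of the study would then reweight these analog members using correlations with dominant climate indices; with uniform weights the sketch reduces to the plain analog approach.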

  19. Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2017-04-01

    Ensemble forecasting has a long history in meteorological modelling as an indication of forecast uncertainty. However, it is necessary to calibrate and post-process the ensembles, as they often exhibit both bias and dispersion errors. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters varying in space and time, while giving a spatially and temporally consistent output. However, their method is computationally too complex for our large number of stations, which makes it unsuitable for our purpose. Our post-processing method for the ensembles is developed in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu), where we are making forecasts for the whole of Europe based on observations from around 700 catchments. As the target is flood forecasting, we are more interested in improving the forecast skill for high flows than in a good prediction of the entire flow regime. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different meteorological forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to estimate the total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but we add a spatial penalty in the calibration process to force a spatial correlation of the parameters. The penalty takes
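The flavour of such post-processing can be illustrated with a drastically simplified, EMOS-like calibration: an affine correction of the ensemble mean plus a single spread-inflation factor fitted to training data. This is not the EFAS implementation, nor full EMOS (which fits a predictive distribution by minimizing the CRPS); all data and names below are invented:

```python
import numpy as np

def emos_affine(ens_mean, ens_sd, obs):
    """Minimal EMOS-style calibration: predictive mean a + b * ens_mean, with
    the predictive spread scaled so residual variance is matched on training
    data (a single inflation factor instead of fitted spread coefficients)."""
    A = np.column_stack([np.ones_like(ens_mean), ens_mean])
    (a, b), *_ = np.linalg.lstsq(A, obs, rcond=None)
    resid = obs - (a + b * ens_mean)
    scale = resid.std() / ens_sd.mean()
    return a, b, scale

# Synthetic training data: a biased, under-dispersive raw ensemble
rng = np.random.default_rng(3)
raw_mean = rng.normal(10.0, 3.0, 500)
obs = 2.0 + 1.1 * raw_mean + rng.normal(0.0, 1.0, 500)
raw_sd = np.full(500, 0.5)   # raw ensemble spread far too small

a, b, scale = emos_affine(raw_mean, raw_sd, obs)
```

In this toy setup the fit recovers the bias (a near 2, b near 1.1) and an inflation factor near 2, i.e. it both debiases the mean and widens the under-dispersive ensemble.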

  20. Modality-Driven Classification and Visualization of Ensemble Variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bensema, Kevin; Gosink, Luke; Obermaier, Harald

    Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space. While this approach helps address conceptual and parametric uncertainties, the ensemble datasets produced by this technique present a special challenge to visualization researchers as the ensemble dataset records a distribution of possible values for each location in the domain. Contemporary visualization approaches that rely solely on summary statistics (e.g., mean and variance) cannot convey the detailed information encoded in ensemble distributions that is paramount to ensemble analysis; summary statistics provide no information about modality classification and modality persistence. To address this problem, we propose a novel technique that classifies high-variance locations based on the modality of the distribution of ensemble predictions. Additionally, we develop a set of confidence metrics to inform the end-user of the quality of fit between the distribution at a given location and its assigned class. We apply a similar method to time-varying ensembles to illustrate the relationship between peak variance and bimodal or multimodal behavior. These classification schemes enable a deeper understanding of the behavior of the ensemble members by distinguishing between distributions that can be described by a single tendency and distributions which reflect divergent trends in the ensemble.

  1. Simulation studies of the fidelity of biomolecular structure ensemble recreation

    NASA Astrophysics Data System (ADS)

    Lätzer, Joachim; Eastwood, Michael P.; Wolynes, Peter G.

    2006-12-01

    We examine the ability of Bayesian methods to recreate structural ensembles for partially folded molecules from averaged data. Specifically, we test the ability of various algorithms to recreate different transition state ensembles for folding proteins using a multiple replica simulation algorithm, with input from "gold standard" reference ensembles that were first generated with a Gō-like Hamiltonian having nonpairwise additive terms. A set of low resolution data, which function as the "experimental" ϕ values, was first constructed from this reference ensemble. The resulting ϕ values were then treated as one would treat laboratory experimental data and were used as input to the replica reconstruction algorithm. The resulting ensembles of structures obtained by the replica algorithm were compared to the gold standard reference ensemble from which those "data" were, in fact, obtained. It is found that for a unimodal transition state ensemble with a low barrier, the multiple replica algorithm recreates the reference ensemble fairly successfully when no experimental error is assumed. The Kolmogorov-Smirnov test as well as principal component analysis show that the overlap of the recovered and reference ensembles is significantly enhanced when multiple replicas are used. Reduction of the multiple replica ensembles by clustering successfully yields subensembles with close similarity to the reference ensembles. On the other hand, for a high barrier transition state with two distinct transition state ensembles, the single replica algorithm only samples a few structures of one of the reference ensemble basins. This is because the ϕ values are intrinsically ensemble averaged quantities. The replica algorithm with multiple copies does sample both reference ensemble basins. In contrast to the single replica case, the multiple replicas are constrained to reproduce the average ϕ values, but allow fluctuations in ϕ for each individual copy. These

  2. Evaluating Alignment of Shapes by Ensemble Visualization

    PubMed Central

    Raj, Mukund; Mirzargar, Mahsa; Preston, J. Samuel; Kirby, Robert M.; Whitaker, Ross T.

    2016-01-01

    The visualization of variability in surfaces embedded in 3D, which is a type of ensemble uncertainty visualization, provides a means of understanding the underlying distribution of a collection or ensemble of surfaces. Although ensemble visualization for isosurfaces has been described in the literature, we conduct an expert-based evaluation of various ensemble visualization techniques in a particular medical imaging application: the construction of atlases or templates from a population of images. In this work, we extend contour boxplot to 3D, allowing us to evaluate it against an enumeration-style visualization of the ensemble members and other conventional visualizations used by atlas builders, namely examining the atlas image and the corresponding images/data provided as part of the construction process. We present feedback from domain experts on the efficacy of contour boxplot compared to other modalities when used as part of the atlas construction and analysis stages of their work. PMID:26186768

  3. Bioactive focus in conformational ensembles: a pluralistic approach

    NASA Astrophysics Data System (ADS)

    Habgood, Matthew

    2017-12-01

    Computational generation of conformational ensembles is key to contemporary drug design. Selecting the members of the ensemble that will approximate the conformation most likely to bind to a desired target (the bioactive conformation) is difficult, given that the potential energy usually used to generate and rank the ensemble is a notoriously poor discriminator between bioactive and non-bioactive conformations. In this study an approach to generating a focused ensemble is proposed in which each conformation is assigned multiple rankings based not just on potential energy but also on solvation energy, hydrophobic or hydrophilic interaction energy, radius of gyration, and on a statistical potential derived from Cambridge Structural Database data. The best ranked structures derived from each system are then assembled into a new ensemble that is shown to be better focused on bioactive conformations. This pluralistic approach is tested on ensembles generated by the Molecular Operating Environment's Low Mode Molecular Dynamics module, and by the Cambridge Crystallographic Data Centre's conformation generator software.

  4. Fusing MODIS with Landsat 8 data to downscale weekly normalized difference vegetation index estimates for central Great Basin rangelands, USA

    USGS Publications Warehouse

    Boyte, Stephen; Wylie, Bruce K.; Rigge, Matthew B.; Dahal, Devendra

    2018-01-01

    Data fused from distinct but complementary satellite sensors mitigate tradeoffs that researchers make when selecting between spatial and temporal resolutions of remotely sensed data. We integrated data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor aboard the Terra satellite and the Operational Land Imager sensor aboard the Landsat 8 satellite into four regression-tree models and applied those data to a mapping application. This application produced downscaled maps that utilize the 30-m spatial resolution of Landsat in conjunction with daily acquisitions of MODIS normalized difference vegetation index (NDVI) that are composited and temporally smoothed. We produced four weekly, atmospherically corrected, and nearly cloud-free, downscaled 30-m synthetic MODIS NDVI predictions (maps) built from these models. Model results were strong, with R2 values ranging from 0.74 to 0.85. The correlation coefficients (r ≥ 0.89) were strong for all predictions when compared to corresponding original MODIS NDVI data. Downscaled products incorporated into independently developed sagebrush ecosystem models yielded mixed results. The visual quality of the downscaled 30-m synthetic MODIS NDVI predictions was remarkable when compared to the original 250-m MODIS NDVI. These 30-m maps improve knowledge of dynamic rangeland seasonal processes in the central Great Basin, United States, and provide land managers with improved resource maps.

  5. Evaluating characteristics of dry spell changes in Lake Urmia Basin using an ensemble of CMIP5 GCM models

    NASA Astrophysics Data System (ADS)

    Fazel, Nasim; Berndtsson, Ronny; Bertacchi Uvo, Cintia; Klove, Bjorn; Madani, Kaveh

    2015-04-01

    Drought is a natural phenomenon that can cause significant environmental, ecological, and socio-economic losses in water-scarce regions. Studies of drought under climate change are essential for water resources planning and management. Dry spells, i.e., runs of consecutive days with precipitation below a certain threshold, can be used to identify the severity of hydrological drought. In this study, we analyzed the projected changes in the number of dry days in two future periods, 2011-2040 and 2071-2100, at both seasonal and annual time scales in the Lake Urmia Basin. The lake and its wetlands, located in northwestern Iran, have invaluable environmental, social, and economic importance for the region. The lake level has been dropping dramatically since 1995, and the water volume is now less than 30% of its original value. Moreover, frequent dry spells have struck the region, affecting its water resources and the lake ecosystem, as in other parts of Iran. Analyzing future drought and dry spell characteristics in the region is therefore crucial for sustainable water management and lake restoration plans. We used daily projected precipitation from 20 climate models participating in CMIP5 (Coupled Model Intercomparison Project Phase 5), driven by three representative concentration pathways: RCP2.6, RCP4.5, and RCP8.5. The model outputs were statistically downscaled and validated against the historical observation period 1980-2010. We defined days with precipitation less than 1 mm as dry days for both the observation period and the model projections. The validation showed that all models underestimated the number of dry days. Based on the validation results, an ensemble of the five models in best agreement with observations was used to assess changes in the number of future dry days in the Lake Urmia Basin. The entire ensemble showed an increase in the number of dry days for all seasons, with larger projected changes in winter and spring than in summer and autumn. All models projected
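
    The dry-day definition used above (daily precipitation below 1 mm) reduces to a simple count over a daily series. A minimal sketch with an illustrative toy series:

```python
import numpy as np

def count_dry_days(precip_mm, threshold=1.0):
    """Count dry days: days with precipitation below the threshold
    (1 mm in the study's definition)."""
    return int(np.sum(np.asarray(precip_mm, dtype=float) < threshold))

# Toy daily series: 0.0, 0.4 and 0.9 mm fall below 1 mm and count as dry.
series = [0.0, 0.4, 1.0, 5.2, 0.9, 12.0, 0.0]
print(count_dry_days(series))  # 4
```

    Applied per season to each downscaled model series, such counts give the seasonal dry-day statistics compared across the ensemble.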

  6. Entropy of spatial network ensembles

    NASA Astrophysics Data System (ADS)

    Coon, Justin P.; Dettmann, Carl P.; Georgiou, Orestis

    2018-04-01

    We analyze complexity in spatial network ensembles through the lens of graph entropy. Mathematically, we model a spatial network as a soft random geometric graph, i.e., a graph with two sources of randomness, namely nodes located randomly in space and links formed independently between pairs of nodes with probability given by a specified function (the "pair connection function") of their mutual distance. We consider the general case where randomness arises in node positions as well as pairwise connections (i.e., for a given pair distance, the corresponding edge state is a random variable). Classical random geometric graph and exponential graph models can be recovered in certain limits. We derive a simple bound for the entropy of a spatial network ensemble and calculate the conditional entropy of an ensemble given the node location distribution for hard and soft (probabilistic) pair connection functions. Under this formalism, we derive the connection function that yields maximum entropy under general constraints. Finally, we apply our analytical framework to study two practical examples: ad hoc wireless networks and the US flight network. Through the study of these examples, we illustrate that both exhibit properties that are indicative of nearly maximally entropic ensembles.
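
    The conditional entropy given node positions described above is a sum of per-pair Bernoulli entropies under the pair connection function. A minimal numerical sketch, in which the Gaussian-type connection function and its range parameter are illustrative assumptions:

```python
import numpy as np

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) edge variable (0 at p = 0 or 1)."""
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    return float(-(p * np.log2(p) + (1.0 - p) * np.log2(1.0 - p)))

def conditional_graph_entropy(points, connect_fn):
    """Entropy of the edge set given fixed node positions: edges are
    independent Bernoulli(p_ij) with p_ij = connect_fn(distance), so the
    conditional entropy is the sum of per-pair binary entropies."""
    n = len(points)
    return sum(binary_entropy(connect_fn(np.linalg.norm(points[i] - points[j])))
               for i in range(n) for j in range(i + 1, n))

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(50, 2))    # nodes uniform in the unit square
connect = lambda d: np.exp(-(d / 0.3) ** 2)  # a soft pair connection function
H = conditional_graph_entropy(pts, connect)
print(H)  # bounded above by one bit per pair, i.e. 50*49/2 bits
```

    A hard (unit-step) connection function makes every pair entropy zero, recovering the deterministic random geometric graph limit discussed in the abstract.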

  7. Ensemble Weight Enumerators for Protograph LDPC Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush

    2006-01-01

    Recently, LDPC codes with projected graph, or protograph, structures have been proposed. In this paper, finite-length ensemble weight enumerators are obtained for LDPC codes with protograph structures. Asymptotic results are derived as the block size goes to infinity. In particular, we are interested in obtaining ensemble average weight enumerators for protograph LDPC codes whose minimum distance grows linearly with block size. As with irregular ensembles, the linear minimum distance property is sensitive to the proportion of degree-2 variable nodes. The derived ensemble weight enumerators show that the linear minimum distance condition on the degree distribution of unstructured irregular LDPC codes is a sufficient but not a necessary condition for protograph LDPC codes.

  8. Total probabilities of ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-04-01

    Ensemble forecasting has long been used in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework that can estimate post-processing parameters which differ in space and time while still giving spatially and temporally consistent output. However, their method is computationally demanding for our large number of stations and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS; http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than for lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to find a total probability, the post-processed mean and uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while assuring that they have some spatial correlation by adding a spatial penalty in the calibration process. This can in some cases have a slight negative
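
    EMOS, one of the post-processing methods cited above, fits a single parametric predictive distribution to the raw ensemble. A minimal Gaussian sketch, with coefficients that are illustrative placeholders rather than fitted values (in practice they are estimated by minimizing the CRPS over a training period):

```python
import numpy as np
from statistics import NormalDist

def emos_gaussian(ens, a, b, c, d):
    """Gaussian EMOS predictive distribution (after Gneiting et al., 2005):
    predictive mean affine in the ensemble mean, predictive variance affine
    in the ensemble variance."""
    ens = np.asarray(ens, dtype=float)
    mu = a + b * ens.mean()
    sigma = float(np.sqrt(c + d * ens.var(ddof=1)))
    return NormalDist(mu, sigma)

ens = [120.0, 135.0, 128.0, 150.0, 119.0]  # raw runoff ensemble, m3/s
dist = emos_gaussian(ens, a=2.0, b=1.0, c=25.0, d=1.2)
# Calibrated median and 90% predictive interval for this forecast.
print([round(dist.inv_cdf(q), 1) for q in (0.05, 0.5, 0.95)])
```

    Inflating the variance term (c, d) is what corrects the underdispersion that raw ensembles typically exhibit.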

  9. Application of SDSM and LARS-WG for simulating and downscaling of rainfall and temperature

    NASA Astrophysics Data System (ADS)

    Hassan, Zulkarnain; Shamsudin, Supiah; Harun, Sobri

    2014-04-01

    Climate change is believed to have significant impacts on water basins and regions, such as on runoff and the hydrological system. However, such impact studies are difficult, since the general circulation models (GCMs) widely used to simulate future climate scenarios do not provide reliable daily series of rainfall and temperature for hydrological modeling. Downscaling techniques can derive reliable daily series of rainfall and temperature under climate scenarios from GCM output. In this study, statistical downscaling models are used to generate possible future values of local meteorological variables such as rainfall and temperature at selected stations in Peninsular Malaysia. The models are: (1) the statistical downscaling model (SDSM), which utilizes regression models and stochastic weather generators, and (2) the Long Ashton Research Station weather generator (LARS-WG), which utilizes only stochastic weather generators. Both LARS-WG and SDSM are feasible tools for quantifying the effects of climate change at a local scale. SDSM yields a better performance than LARS-WG, although it slightly underestimates wet and dry spell lengths. While the two models do not provide identical results, the time series generated by both indicate a general increasing trend in mean daily temperature. The daily rainfall trends, however, differ from each other, with SDSM giving a relatively larger change in annual rainfall than LARS-WG.

  10. Ensembl BioMarts: a hub for data retrieval across taxonomic space.

    PubMed

    Kinsella, Rhoda J; Kähäri, Andreas; Haider, Syed; Zamora, Jorge; Proctor, Glenn; Spudich, Giulietta; Almeida-King, Jeff; Staines, Daniel; Derwent, Paul; Kerhornou, Arnaud; Kersey, Paul; Flicek, Paul

    2011-01-01

    For a number of years the BioMart data warehousing system has proven to be a valuable resource for scientists seeking a fast and versatile means of accessing the growing volume of genomic data provided by the Ensembl project. The launch of the Ensembl Genomes project in 2009 complemented the Ensembl project by utilizing the same visualization, interactive and programming tools to provide users with a means for accessing genome data from a further five domains: protists, bacteria, metazoa, plants and fungi. The Ensembl and Ensembl Genomes BioMarts provide a point of access to the high-quality gene annotation, variation data, functional and regulatory annotation and evolutionary relationships from genomes spanning the taxonomic space. This article aims to give a comprehensive overview of the Ensembl and Ensembl Genomes BioMarts as well as some useful examples and a description of current data content and future objectives. Database URLs: http://www.ensembl.org/biomart/martview/; http://metazoa.ensembl.org/biomart/martview/; http://plants.ensembl.org/biomart/martview/; http://protists.ensembl.org/biomart/martview/; http://fungi.ensembl.org/biomart/martview/; http://bacteria.ensembl.org/biomart/martview/.

  11. Application of an Ensemble Smoother to Precipitation Assimilation

    NASA Technical Reports Server (NTRS)

    Zhang, Sara; Zupanski, Dusanka; Hou, Arthur; Zupanski, Milija

    2008-01-01

    Assimilation of precipitation in a global modeling system poses a special challenge in that the observation operators for precipitation processes are highly nonlinear. In the variational approach, substantial development work and model simplifications are required to include precipitation-related physical processes in the tangent linear model and its adjoint. An ensemble based data assimilation algorithm "Maximum Likelihood Ensemble Smoother (MLES)" has been developed to explore the ensemble representation of the precipitation observation operator with nonlinear convection and large-scale moist physics. An ensemble assimilation system based on the NASA GEOS-5 GCM has been constructed to assimilate satellite precipitation data within the MLES framework. The configuration of the smoother takes the time dimension into account for the relationship between state variables and observable rainfall. The full nonlinear forward model ensembles are used to represent components involving the observation operator and its transpose. Several assimilation experiments using satellite precipitation observations have been carried out to investigate the effectiveness of the ensemble representation of the nonlinear observation operator and the data impact of assimilating rain retrievals from the TMI and SSM/I sensors. Preliminary results show that this ensemble assimilation approach is capable of extracting information from nonlinear observations to improve the analysis and forecast if ensemble size is adequate, and a suitable localization scheme is applied. In addition to a dynamically consistent precipitation analysis, the assimilation system produces a statistical estimate of the analysis uncertainty.

  12. Contribution of crop model structure, parameters and climate projections to uncertainty in climate change impact assessments.

    PubMed

    Tao, Fulu; Rötter, Reimund P; Palosuo, Taru; Gregorio Hernández Díaz-Ambrona, Carlos; Mínguez, M Inés; Semenov, Mikhail A; Kersebaum, Kurt Christian; Nendel, Claas; Specka, Xenia; Hoffmann, Holger; Ewert, Frank; Dambreville, Anaelle; Martre, Pierre; Rodríguez, Lucía; Ruiz-Ramos, Margarita; Gaiser, Thomas; Höhn, Jukka G; Salo, Tapio; Ferrise, Roberto; Bindi, Marco; Cammarano, Davide; Schulman, Alan H

    2018-03-01

    Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or the inadequacies in structure and parameters of the impact model. Previous studies tried to account for the uncertainty from one or two of these. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together to comprehensively account for uncertainties from these three important sources. We demonstrated the approach in assessing climate change impact on barley growth and yield at Jokioinen, Finland, in the Boreal climatic zone and Lleida, Spain, in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contribution of crop model structure, crop model parameters and climate projections to the total variance of ensemble output using Analysis of Variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the median of simulated yield change was -4% and +16%, and the probability of decreasing yield was 63% and 31% in the 2050s, at Jokioinen and Lleida, respectively, relative to 1981-2010. The contribution of crop model structure to the total variance of ensemble output was larger than that from downscaled climate projections and model parameters. The relative contribution of crop model parameters and downscaled climate projections to the total variance of ensemble output varied greatly among the seven crop models and between the two sites. The contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can be quite useful for model users to decide where to put the most effort when preparing or choosing models or parameters for impact analyses. We concluded that the triple-ensemble probabilistic approach that accounts for the uncertainties from multiple important sources provides more comprehensive
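
    The ANOVA partitioning described above can be sketched on a synthetic factorial ensemble. The factor sizes (7 crop models × 5 parameter sets × 8 climate projections) follow the abstract, but the effect magnitudes and the purely additive structure are illustrative assumptions, not the study's data:

```python
import numpy as np

def anova_contributions(y):
    """Share of total ensemble variance explained by each factor's main
    effect (first-order ANOVA; interactions are ignored in this sketch).

    y: simulated yields of shape (n_models, n_param_sets, n_climates).
    """
    grand = y.mean()
    total = ((y - grand) ** 2).mean()
    shares = {}
    for axes, name in [((1, 2), "model"), ((0, 2), "parameters"),
                       ((0, 1), "climate")]:
        main = y.mean(axis=axes)  # factor-level means
        shares[name] = float(((main - grand) ** 2).mean() / total)
    return shares

# Synthetic additive ensemble with the model-structure effect made largest.
rng = np.random.default_rng(0)
y = (5000.0
     + rng.normal(0.0, 200.0, (7, 1, 1))   # crop model structure
     + rng.normal(0.0, 50.0, (1, 5, 1))    # model parameters
     + rng.normal(0.0, 100.0, (1, 1, 8)))  # climate projections
print(anova_contributions(y))              # shares sum to 1 for additive data
```

    For purely additive data the main-effect shares sum exactly to one; with real crop-model output, the remainder measures factor interactions.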

  13. Dynamically-downscaled projections of changes in temperature extremes over China

    NASA Astrophysics Data System (ADS)

    Guo, Junhong; Huang, Guohe; Wang, Xiuquan; Li, Yongping; Lin, Qianguo

    2018-02-01

    In this study, likely changes in extreme temperatures (16 indices) over China in response to global warming throughout the twenty-first century are investigated with the PRECIS regional climate modeling system. The PRECIS experiment is conducted at a spatial resolution of 25 km and is driven by a perturbed-physics ensemble to reflect spatial variations and model uncertainties. Simulations of the present climate (1961-1990) are compared with observations to validate model performance in reproducing the historical climate over China. Results indicate that PRECIS demonstrates reasonable skill in reproducing the spatial patterns of observed extreme temperatures over most regions of China, especially in the east. Nevertheless, PRECIS performs relatively poorly in simulating the spatial patterns of extreme temperatures in the western mountainous regions, where its driving GCM exhibits more uncertainty owing to insufficient observations, leading to larger errors in the downscaling. Future spatio-temporal changes in the extreme temperature indices are then analyzed for three successive periods (i.e., 2020s, 2050s and 2080s). The changes projected by PRECIS are consistent with the results of the major global climate models in both spatial and temporal patterns. Furthermore, PRECIS demonstrates a distinct advantage in providing more detailed spatial information on the extreme indices. In general, all extreme indices show a similar spatial pattern of change: large changes are projected in the north and small changes in the south. In contrast, the temporal patterns vary among indices over future periods: the warm indices, such as SU, TR, WSDI, TX90p, TN90p and GSL, are likely to increase, while the cold indices, such as ID, FD, CSDI, TX10p and TN10p, are likely to decrease with time in response to global warming. Nevertheless, the magnitudes of changes in all indices tend to

  14. Improving land resource evaluation using fuzzy neural network ensembles

    USGS Publications Warehouse

    Xue, Yue-Ju; HU, Y.-M.; Liu, S.-G.; YANG, J.-F.; CHEN, Q.-C.; BAO, S.-T.

    2007-01-01

    Land evaluation factors often contain continuous-, discrete- and nominal-valued attributes. In traditional land evaluation, these different attributes are usually graded into categorical indexes by land resource experts, and the evaluation results rely heavily on the experts' experience. To overcome this shortcoming, we presented a fuzzy neural network ensemble method that did not require grading the evaluation factors into categorical indexes and could evaluate land resources by using the three kinds of attribute values directly. A fuzzy back propagation neural network (BPNN), a fuzzy radial basis function neural network (RBFNN), a fuzzy BPNN ensemble, and a fuzzy RBFNN ensemble were used to evaluate the land resources in Guangdong Province. The evaluation results by using the fuzzy BPNN ensemble and the fuzzy RBFNN ensemble were much better than those by using the single fuzzy BPNN and the single fuzzy RBFNN, and the error rate of the single fuzzy RBFNN or fuzzy RBFNN ensemble was lower than that of the single fuzzy BPNN or fuzzy BPNN ensemble, respectively. By using the fuzzy neural network ensembles, the validity of land resource evaluation was improved and reliance on land evaluators' experiences was considerably reduced. © 2007 Soil Science Society of China.

  15. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
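
    The EnKF analysis step at the core of the method can be sketched as follows. This is the textbook stochastic-EnKF update for a linear observation operator, not the paper's SEOD code, and the one-parameter example is illustrative:

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic ensemble Kalman filter analysis step.

    X: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
    H: (n_obs, n_state) linear observation operator; R: obs error covariance.
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)         # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                     # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
    # Perturb the observations so the analysis ensemble keeps correct spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(1)
X = rng.normal(2.0, 1.0, size=(1, 100))  # prior ensemble for one soil parameter
Xa = enkf_update(X, y=np.array([3.0]), H=np.array([[1.0]]),
                 R=np.array([[0.25]]), rng=rng)
print(X.mean(), "->", Xa.mean())  # analysis mean pulled toward the observation
```

    An optimal-design layer such as SEOD would wrap this update, scoring candidate measurement locations by an information metric (SD, DFS or RE) before each assimilation cycle.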

  16. Parametric analysis of a down-scaled turbo jet engine suitable for drone and UAV propulsion

    NASA Astrophysics Data System (ADS)

    Wessley, G. Jims John; Chauhan, Swati

    2018-04-01

    This paper presents a detailed study on the need for downscaling gas turbine engines for UAV and drone propulsion, together with the downscaling procedure and a parametric analysis of a downscaled engine using the Gas Turbine Simulation Program (GSP 11) software. The need for a micro gas turbine engine in the thrust range of 0.13 to 4.45 kN to power UAVs and drones weighing 4.5 to 25 kg is considered; to meet this requirement, a parametric analysis of the scaled-down Allison J33-A-35 turbojet engine is performed. The analysis shows that the thrust developed by the scaled engine and the thrust-specific fuel consumption (TSFC) depend on the pressure ratio, the mass flow rate of air and the Mach number. A scaling factor of 0.195, corresponding to an air mass flow rate of 7.69 kg/s, produces a thrust in the range of 4.57 to 5.6 kN while operating at a Mach number of 0.3 at altitudes of 5000 to 9000 m. The thermal and overall efficiencies of the scaled engine are found to be 67% and 75%, respectively, for a pressure ratio of 2. The outcomes of this analysis form a strong base for further analysis, design and fabrication of micro gas turbine engines to propel future UAVs and drones.
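
    The first-order scaling idea can be illustrated with a toy calculation: at fixed cycle parameters (pressure ratio, Mach number), thrust scales roughly linearly with air mass flow. The reference thrust and mass flow below are assumed placeholder values, not the J33-A-35 data sheet, chosen only so that a 0.195 scale factor reproduces a mass flow near the abstract's 7.69 kg/s:

```python
def scale_engine(thrust_ref_kN, mdot_ref_kgs, scale_factor):
    """First-order engine downscaling: thrust and air mass flow both
    scale linearly with the scale factor at fixed cycle parameters."""
    return mdot_ref_kgs * scale_factor, thrust_ref_kN * scale_factor

# Hypothetical full-size reference values (NOT measured J33-A-35 figures).
mdot, thrust = scale_engine(thrust_ref_kN=23.1, mdot_ref_kgs=39.4,
                            scale_factor=0.195)
print(f"scaled engine: {mdot:.2f} kg/s airflow, {thrust:.2f} kN thrust")
```

    A full cycle analysis, as done in GSP 11, refines this linear estimate by recomputing component efficiencies and TSFC at the scaled size.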

  17. Temporal Downscaling of Crop Coefficient and Crop Water Requirement from Growing Stage to Substage Scales

    PubMed Central

    Shang, Songhao

    2012-01-01

    Crop water requirement is essential for agricultural water management, which is usually available for crop growing stages. However, crop water requirement values of monthly or weekly scales are more useful for water management. A method was proposed to downscale crop coefficient and water requirement from growing stage to substage scales, which is based on the interpolation of accumulated crop and reference evapotranspiration calculated from their values in growing stages. The proposed method was compared with two straightforward methods, that is, direct interpolation of crop evapotranspiration and crop coefficient by assuming that stage average values occurred in the middle of the stage. These methods were tested with a simulated daily crop evapotranspiration series. Results indicate that the proposed method is more reliable, showing that the downscaled crop evapotranspiration series is very close to the simulated ones. PMID:22619572
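
    The accumulation-interpolation idea described above can be sketched numerically: accumulate crop ET at stage boundaries, interpolate the accumulated curve smoothly against accumulated reference ET, then difference back to daily values. This sketch uses SciPy's monotone PCHIP interpolant as the smooth curve, which is an implementation assumption, not necessarily the paper's exact scheme; the stage lengths and Kc values are illustrative:

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

def downscale_kc(stage_end_day, kc_stage, et0_daily):
    """Downscale stage-average crop coefficients to daily values by
    interpolating accumulated crop ET against accumulated reference ET,
    then differencing. Stage totals of crop ET are conserved exactly."""
    et0_daily = np.asarray(et0_daily, dtype=float)
    et0_cum = np.concatenate([[0.0], np.cumsum(et0_daily)])
    bounds = np.concatenate([[0], np.asarray(stage_end_day)])
    # Accumulated crop ET at stage boundaries: Kc times stage ET0 totals.
    etc_steps = [kc * (et0_cum[b1] - et0_cum[b0])
                 for kc, b0, b1 in zip(kc_stage, bounds[:-1], bounds[1:])]
    etc_cum = np.concatenate([[0.0], np.cumsum(etc_steps)])
    # Smooth monotone interpolation of the accumulated curve, then difference.
    smooth = PchipInterpolator(et0_cum[bounds], etc_cum)
    etc_daily = np.diff(smooth(et0_cum))
    return etc_daily / et0_daily  # daily crop coefficient

# Three stages ending on days 30, 70 and 100; constant 5 mm/day reference ET.
kc_daily = downscale_kc([30, 70, 100], kc_stage=[0.4, 1.1, 0.7],
                        et0_daily=np.full(100, 5.0))
print(kc_daily[0], kc_daily[50], kc_daily[-1])
```

    Because the interpolant passes through the stage-boundary knots, weekly or monthly crop water requirements summed from the daily values remain consistent with the original stage totals.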

  18. Fine-Tuning Your Ensemble's Jazz Style.

    ERIC Educational Resources Information Center

    Garcia, Antonio J.

    1991-01-01

    Proposes instructional strategies for directors of jazz groups, including guidelines for developing the skills necessary for good performance. Includes effective methods for positive changes in ensemble style. Addresses jazz group problems such as beat, tempo, staying in tune, wind power, and solo/ensemble lines. Discusses percussionists, bassists,…

  19. Thermodynamic-ensemble independence of solvation free energy.

    PubMed

    Chong, Song-Ho; Ham, Sihyun

    2015-02-10

    Solvation free energy is the fundamental thermodynamic quantity in solution chemistry. Recently, it has been suggested that the partial molar volume correction is necessary to convert the solvation free energy determined in different thermodynamic ensembles. Here, we demonstrate ensemble-independence of the solvation free energy on general thermodynamic grounds. Theoretical estimates of the solvation free energy based on the canonical or grand-canonical ensemble are pertinent to experiments carried out under constant pressure without any conversion.

  20. Very high resolution surface mass balance over Greenland modeled by the regional climate model MAR with a downscaling technique

    NASA Astrophysics Data System (ADS)

    Kittel, Christoph; Lang, Charlotte; Agosta, Cécile; Prignon, Maxime; Fettweis, Xavier; Erpicum, Michel

    2016-04-01

    This study presents surface mass balance (SMB) results at 5 km resolution from the MAR regional climate model over the Greenland ice sheet. We use the latest MAR version (v3.6), in which the land-ice module (SISVAT) runs fully coupled on a high-resolution (5 km) grid for surface variables while the MAR atmospheric module runs at a lower resolution of 10 km. This online downscaling technique makes it possible to correct MAR's near-surface temperature and humidity with an elevation-based gradient before forcing SISVAT. The 10 km precipitation is not corrected. Corrections are strongest over the ablation zone, where the topography varies most. The model was forced by ERA-Interim between 1979 and 2014. We will show the advantages of this online SMB downscaling technique with respect to an offline downscaling extrapolation based on local SMB vertical gradients. The 5 km results show better agreement with the PROMICE surface mass balance database than the extrapolated 10 km MAR SMB results.
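
    The elevation-gradient correction described above amounts to adjusting each coarse-grid value by the elevation difference between coarse and fine orography. A minimal sketch using a constant -6.5 K/km lapse rate; MAR derives its gradients internally, so the rate and the numbers here are illustrative assumptions:

```python
import numpy as np

def downscale_temperature(t_coarse, z_coarse, z_fine, lapse=-0.0065):
    """Adjust coarse-grid near-surface temperature to finer orography
    using an elevation gradient (constant lapse rate, K/m)."""
    return t_coarse + lapse * (z_fine - z_coarse)

t10 = np.array([268.0, 265.5])    # K, two 10 km atmospheric cells
z10 = np.array([1200.0, 1500.0])  # m, coarse model orography
z5 = np.array([900.0, 1700.0])    # m, 5 km orography at the same locations
print(downscale_temperature(t10, z10, z5))  # warmer where the fine grid is lower
```

    The same gradient-based correction applied to humidity, but not precipitation, is what distinguishes the online scheme described in the abstract.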

  1. Evaluation of bottom-up and downscaled emission inventories for Paris and consequences for estimating urban air pollution increments

    NASA Astrophysics Data System (ADS)

    Timmermans, R.; Denier van der Gon, H.; Segers, A.; Honore, C.; Perrussel, O.; Builtjes, P.; Schaap, M.

    2012-04-01

    Since a major part of the Earth's population lives in cities, it is of great importance to correctly characterise the air pollution levels over these urban areas. Many past studies have been dedicated to this subject and have determined so-called urban increments: the impact of large cities on air pollution levels. These increments are usually determined with models driven by so-called downscaled emission inventories, in which official country-total emissions are gridded using information such as population density and the locations of industries and roads. The question is how accurate such downscaled inventories are over cities or large urban areas. Within the EU FP7 MEGAPOLI project, a new emission inventory has been produced that includes refined local emission data for two European megacities (Paris, London) and two urban conglomerations (the Po Valley, Italy, and the Rhine-Ruhr region, Germany) based on a bottom-up approach. The inventory has comparable national totals but remarkable differences at the city scale. Such a bottom-up inventory is thought to be more accurate, as it contains local knowledge. In this study we compared modelled nitrogen dioxide (NO2) and particulate matter (PM) concentrations from the LOTOS-EUROS chemistry transport model driven by a conventional downscaled emission inventory (TNO-MACC) with concentrations from the same model driven by the new MEGAPOLI bottom-up emission inventory, focusing on the Paris region. Model predictions for Paris improve significantly with the new MEGAPOLI inventory. Both the emissions and the simulated average PM concentrations over urban sites in Paris are much lower owing to the different spatial distribution of the anthropogenic emissions. The difference at nearby rural stations is small, implying that the urban increment for PM simulated using the bottom-up emission inventory is also much smaller than

  2. Long-range interacting systems in the unconstrained ensemble.

    PubMed

    Latella, Ivan; Pérez-Madrid, Agustín; Campa, Alessandro; Casetti, Lapo; Ruffo, Stefano

    2017-01-01

    Completely open systems can exchange heat, work, and matter with the environment. While energy, volume, and number of particles fluctuate under completely open conditions, the equilibrium states of the system, if they exist, can be specified using the temperature, pressure, and chemical potential as control parameters. The unconstrained ensemble is the statistical ensemble describing completely open systems and the replica energy is the appropriate free energy for these control parameters from which the thermodynamics must be derived. It turns out that macroscopic systems with short-range interactions cannot attain equilibrium configurations in the unconstrained ensemble, since temperature, pressure, and chemical potential cannot be taken as a set of independent variables in this case. In contrast, we show that systems with long-range interactions can reach states of thermodynamic equilibrium in the unconstrained ensemble. To illustrate this fact, we consider a modification of the Thirring model and compare the unconstrained ensemble with the canonical and grand-canonical ones: The more the ensemble is constrained by fixing the volume or number of particles, the larger the space of parameters defining the equilibrium configurations.

  3. Downscaling SSPs in the GBM Delta - Integrating Science, Modelling and Stakeholders Through Qualitative and Quantitative Scenarios

    NASA Astrophysics Data System (ADS)

    Allan, Andrew; Barbour, Emily; Salehin, Mashfiqus; Munsur Rahman, Md.; Hutton, Craig; Lazar, Attila

    2016-04-01

    A downscaled scenario development process was adopted in the context of a project seeking to understand relationships between ecosystem services and human well-being in the Ganges-Brahmaputra delta. The aim was to link the concerns and priorities of relevant stakeholders with the integrated biophysical and poverty models used in the project. A two-stage process was used to facilitate the connection between stakeholders' concerns and available modelling capacity: the first to qualitatively describe what the future might look like in 2050; the second to translate these qualitative descriptions into the quantitative form required by the numerical models. An extended, modified SSP approach was adopted, with stakeholders downscaling issues identified through interviews as priorities for the southwest of Bangladesh. Detailed qualitative futures were produced, before modellable elements were quantified in conjunction with an expert stakeholder cadre. Stakeholder input, using the methods adopted here, allows the top-down focus of the RCPs to be aligned with the bottom-up approach needed to make the SSPs appropriate at the more local scale, and also facilitates the translation of qualitative narrative scenarios into a quantitative form that lends itself to the incorporation of biophysical and socio-economic indicators. The presentation will describe the downscaling process in detail, and conclude with findings regarding the importance of stakeholder involvement (and logistical considerations), balancing model capacity with expectations, and recommendations on SSP refinement at local levels.

  4. Study of Regional Downscaled Climate and Air Quality in the United States

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Fu, J. S.; Drake, J.; Lamarque, J.; Lam, Y.; Huang, K.

    2011-12-01

    Due to increasing anthropogenic greenhouse gas emissions, global and regional climate patterns have significantly changed. Climate change has exerted a strong impact on ecosystems, air quality and human life. The global Community Earth System Model (CESM v1.0) was used to predict future climate and chemistry under projected emission scenarios. Two new emission scenarios, Representative Concentration Pathways (RCP) 4.5 and RCP 8.5, were used in this study for climate and chemistry simulations. The projected global mean temperature will increase by 1.2 and 1.7 degrees Celsius for the RCP 4.5 and RCP 8.5 scenarios in the 2050s, respectively. To take advantage of detailed local topography and land use data and to assess local climate impacts on air quality, we downscaled CESM outputs to a 4 km by 4 km Eastern US domain using the Weather Research and Forecasting (WRF) Model and the Community Multi-scale Air Quality modeling system (CMAQ). Evaluations of regional model outputs against global model outputs, and of regional model outputs against observational data, were conducted to verify the downscaling methodology. Future climate change and air quality impacts were also examined at the 4 km by 4 km high-resolution scale.

  5. Ensemble habitat mapping of invasive plant species

    USGS Publications Warehouse

    Stohlgren, T.J.; Ma, P.; Kumar, S.; Rocca, M.; Morisette, J.T.; Jarnevich, C.S.; Benson, N.

    2010-01-01

    Ensemble species distribution models combine the strengths of several species-environment matching models, while minimizing the weaknesses of any one model. Ensemble models may be particularly useful in risk analysis of recently arrived, harmful invasive species because such species may not yet have spread to all suitable habitats, leaving species-environment relationships difficult to determine. We tested five individual models (logistic regression, boosted regression trees, random forest, multivariate adaptive regression splines (MARS), and the maximum entropy model, Maxent) and ensemble modeling for selected nonnative plant species in Yellowstone and Grand Teton National Parks, Wyoming; Sequoia and Kings Canyon National Parks, California; and areas of interior Alaska. The models are based on field data provided by the park staffs, combined with topographic, climatic, and vegetation predictors derived from satellite data. For the four invasive plant species tested, ensemble models were the only models that ranked in the top three for both field validation and test data. Ensemble models may be more robust than individual species-environment matching models for risk analysis. © 2010 Society for Risk Analysis.
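    The core ensemble step described above, averaging habitat-suitability predictions from several species-environment models, can be sketched in a few lines. This is a minimal illustration, not the study's implementation; the per-cell probabilities below are hypothetical placeholders.

    ```python
    # Minimal sketch: combine habitat-suitability probabilities from several
    # species-environment models into an unweighted ensemble prediction.

    def ensemble_suitability(model_probs):
        """Average per-cell suitability probabilities across models."""
        n_models = len(model_probs)
        n_cells = len(model_probs[0])
        return [sum(m[i] for m in model_probs) / n_models for i in range(n_cells)]

    # Hypothetical suitability maps (one probability per grid cell) from five
    # models: logistic regression, BRT, random forest, MARS, Maxent.
    probs = [
        [0.9, 0.2, 0.6],
        [0.8, 0.3, 0.5],
        [0.7, 0.1, 0.7],
        [0.9, 0.2, 0.4],
        [0.8, 0.2, 0.6],
    ]
    print(ensemble_suitability(probs))  # per-cell ensemble mean suitability
    ```

    In practice the individual models would be weighted by their validation skill; the unweighted mean shown here is the simplest variant.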

  6. An Archive of Downscaled WCRP CMIP3 Climate Projections for Planning Applications in the Contiguous United States

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Pruitt, T.; Maurer, E. P.; Duffy, P. B.

    2007-12-01

    Incorporating climate change information into long-term evaluations of water and energy resources requires analysts to have access to climate projection data that have been spatially downscaled to "basin-relevant" resolution. This is necessary in order to develop system-specific hydrology and demand scenarios consistent with projected climate scenarios. Analysts currently have access to "climate model" resolution data (e.g., at LLNL PCMDI), but not to spatially downscaled translations of these datasets. Motivated by a common interest in supporting regional and local assessments, the U.S. Bureau of Reclamation and LLNL (through support from the DOE National Energy Technology Laboratory) have teamed to develop an archive of downscaled climate projections (temperature and precipitation) with geographic coverage consistent with the North American Land Data Assimilation System domain, encompassing the contiguous United States. A web-based information service, hosted at the LLNL Green Data Oasis, has been developed to provide Reclamation, LLNL, and other interested analysts free access to archive content. A contemporary statistical method was used to bias-correct and spatially disaggregate projection datasets, and was applied to 112 projections included in the WCRP CMIP3 multi-model dataset hosted by LLNL PCMDI (i.e., 16 GCMs and their multiple simulations of the SRES A2, A1B, and B1 emissions pathways).
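    The bias-correction half of such bias-correction/spatial-disaggregation methods is commonly an empirical quantile mapping: a model value is located in the model's own climatological distribution and replaced by the observed value at the same quantile. A minimal rank-matching sketch, with hypothetical temperature values (not data from the archive):

    ```python
    from bisect import bisect_left

    def quantile_map(value, model_clim, obs_clim):
        """Bias-correct `value` by matching its rank (empirical quantile) in
        the model climatology to the same rank in the observed climatology."""
        ms, obs = sorted(model_clim), sorted(obs_clim)
        i = min(bisect_left(ms, value), len(ms) - 1)  # clamp above the sample
        return obs[i]

    # Hypothetical monthly-mean temperatures (deg C): the model runs cold.
    model = [8, 9, 10, 11, 12, 13]
    observed = [10, 11, 12, 13, 14, 15]
    print(quantile_map(9.5, model, observed))
    ```

    Production implementations interpolate between order statistics and handle out-of-sample values more carefully; nearest-order-statistic lookup is used here only to keep the idea visible.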

  7. Comparison of initial perturbation methods for the mesoscale ensemble prediction system of the Meteorological Research Institute for the WWRP Beijing 2008 Olympics Research and Development Project (B08RDP)

    NASA Astrophysics Data System (ADS)

    Saito, Kazuo; Hara, Masahiro; Kunii, Masaru; Seko, Hiromu; Yamaguchi, Munehiko

    2011-05-01

    Different initial perturbation methods for mesoscale ensemble prediction were compared by the Meteorological Research Institute (MRI) as a part of the intercomparison of mesoscale ensemble prediction systems (EPSs) of the World Weather Research Programme (WWRP) Beijing 2008 Olympics Research and Development Project (B08RDP). Five initial perturbation methods for mesoscale ensemble prediction were developed for B08RDP and compared at MRI: (1) a downscaling method of the Japan Meteorological Agency (JMA)'s operational one-week EPS (WEP), (2) a targeted global model singular vector (GSV) method, (3) a mesoscale model singular vector (MSV) method based on the adjoint model of the JMA non-hydrostatic model (NHM), (4) a mesoscale breeding of growing modes (MBD) method based on NHM forecasts and (5) a local ensemble transform (LET) method based on the local ensemble transform Kalman filter (LETKF) using NHM. These perturbation methods were applied to the preliminary experiments of the B08RDP Tier-1 mesoscale ensemble prediction with a horizontal resolution of 15 km. To make the comparison easier, the same horizontal resolution (40 km) was employed for the three mesoscale model-based initial perturbation methods (MSV, MBD and LET). The GSV method clearly outperformed the WEP method, confirming the advantage of targeting in mesoscale EPS. The GSV method generally performed well with regard to root mean square errors of the ensemble mean, large growth rates of ensemble spreads throughout the 36-h forecast period, and high detection rates and high Brier skill scores (BSSs) for weak rains. On the other hand, the mesoscale model-based initial perturbation methods showed good detection rates and BSSs for intense rains. The MSV method showed a rapid growth in the ensemble spread of precipitation up to a forecast time of 6 h, which suggests the suitability of the mesoscale SV for short-range EPSs, but the initial large growth of the perturbation did not last long. The

  8. Skill of Ensemble Seasonal Probability Forecasts

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk

    2010-05-01

    In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead time of 14 months. The nature of this skill is discussed, and prospects for application are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of "zero-skill" models at long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
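    Kernel dressing and climatology blending, as described above, can be sketched directly: each ensemble member is replaced by a Gaussian kernel, and the resulting mixture is weighted against a Gaussian climatology. This is a schematic with hypothetical member values, bandwidth and blend weight, not the ENSEMBLES configuration:

    ```python
    import math

    def kernel_dressed_pdf(x, ensemble, sigma):
        """Density at x of a Gaussian-kernel-dressed ensemble (bandwidth sigma)."""
        k = lambda u: math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
        return sum(k((x - m) / sigma) for m in ensemble) / (len(ensemble) * sigma)

    def blended_pdf(x, ensemble, sigma, clim_mean, clim_sd, alpha):
        """Blend the dressed ensemble with a Gaussian climatology; alpha is the
        weight on the ensemble (alpha = 0 recovers pure climatology)."""
        clim = (math.exp(-0.5 * ((x - clim_mean) / clim_sd) ** 2)
                / (clim_sd * math.sqrt(2 * math.pi)))
        return alpha * kernel_dressed_pdf(x, ensemble, sigma) + (1 - alpha) * clim

    # Hypothetical Nino 3.4 temperature anomaly forecast (deg C)
    members = [0.4, 0.6, 0.8, 0.5]
    print(blended_pdf(0.5, members, sigma=0.2, clim_mean=0.0, clim_sd=0.7, alpha=0.8))
    ```

    In practice sigma and alpha are fitted on past forecast-verification pairs; as skill decays with lead time, the fitted alpha typically shrinks toward climatology, which is the behaviour the abstract discusses.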

  9. Dynamical Downscaling of Meteorology from a Global Model by WRF towards Resolving US PM2.5 Distributions for the Mid 21st Century

    NASA Astrophysics Data System (ADS)

    Kunwar, S.; Bowden, J.; Milly, G.; Previdi, M. J.; Fiore, A. M.; West, J. J.

    2017-12-01

    In the coming decades, anthropogenically induced climate change will likely impact PM2.5 through both changing meteorology and feedbacks in natural emissions. A major goal of our project is to assess changes in PM2.5 levels over the continental US due to climate variability and change for the period 2005-2065. We will achieve this by using regional models to dynamically downscale coarse-resolution (approximately 2° × 2°) meteorology and air chemistry from a global model to finer spatial resolution (12 km), improving air quality projections for regions and subregions of the US (NE, SE, SW, NW, Midwest, Intermountain West). We downscale from GFDL CM3 simulations of the RCP8.5 scenario for the years 2006-2100, with aerosol and ozone precursor emissions fixed at 2005 levels. We carefully select model years from the global simulations that sample the range of PM2.5 distributions for different US regions at mid-21st century (2050-2065). Here we will show results for the meteorological downscaling (using WRF version 3.8.1) for this project, including a performance evaluation for meteorological variables with respect to the global model. In the future, the downscaled meteorology presented here will be used to drive air quality downscaling in CMAQ (version 5.2). Analysis of the resulting PM2.5 statistics for US regions, as well as the drivers of PM2.5 changes, will be important in supporting informed air quality (also health and visibility) planning for different US regions over the next five decades.

  10. Ensemble-based assimilation of fractional snow-covered area satellite retrievals to estimate the snow distribution at Arctic sites

    NASA Astrophysics Data System (ADS)

    Aalstad, Kristoffer; Westermann, Sebastian; Vikhamar Schuler, Thomas; Boike, Julia; Bertino, Laurent

    2018-01-01

    With its high albedo, low thermal conductivity and large water-storing capacity, snow strongly modulates the surface energy and water balance, which makes it a critical factor in mid- to high-latitude and mountain environments. However, estimating the snow water equivalent (SWE) is challenging in remote-sensing applications, even at medium spatial resolutions of 1 km. We present an ensemble-based data assimilation framework that estimates the peak subgrid SWE distribution (SSD) at the 1 km scale by assimilating fractional snow-covered area (fSCA) satellite retrievals into a simple snow model forced by downscaled reanalysis data. The basic idea is to relate the timing of the snow cover depletion (accessible from satellite products) to the peak SSD. Peak subgrid SWE is assumed to be lognormally distributed, which can be translated into a modeled time series of fSCA through the snow model. Assimilation of satellite-derived fSCA facilitates the estimation of the peak SSD, while taking into account uncertainties in both the model and the assimilated data sets. As an extension to previous studies, our method makes use of the novel (to snow data assimilation) ensemble smoother with multiple data assimilation (ES-MDA) scheme combined with analytical Gaussian anamorphosis to assimilate time series of Moderate Resolution Imaging Spectroradiometer (MODIS) and Sentinel-2 fSCA retrievals. The scheme is applied to Arctic sites near Ny-Ålesund (79° N, Svalbard, Norway), where field measurements of fSCA and SWE distributions are available. The method is able to successfully recover accurate estimates of peak SSD on most of the occasions considered. Through the ES-MDA assimilation, the root-mean-square error (RMSE) for the fSCA, peak mean SWE and peak subgrid coefficient of variation is improved by around 75 %, 60 % and 20 %, respectively, when compared to the prior, yielding RMSEs of 0.01, 0.09 m water equivalent (w.e.) and 0.13, respectively. The ES-MDA either outperforms or at least
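    The ES-MDA update itself is compact: the same observation is assimilated several times with an inflated error variance, so the total information content matches a single analysis. The sketch below is deliberately minimal, assuming a scalar state observed directly (identity observation operator) and uniform inflation coefficients; the ensemble and observation values are hypothetical, and the real scheme additionally uses Gaussian anamorphosis and a full observation operator through the snow model.

    ```python
    import random

    def es_mda(prior, obs, obs_var, n_assim=4, seed=0):
        """Ensemble smoother with multiple data assimilation (ES-MDA) for a
        scalar state observed directly. The observation error variance is
        inflated by n_assim in each of the n_assim update steps, so that
        sum(1/alpha) = 1 and the total update matches a single analysis."""
        rng = random.Random(seed)
        ens = list(prior)
        for _ in range(n_assim):
            alpha = n_assim                 # uniform inflation coefficient
            pred = list(ens)                # predicted obs (identity operator)
            mx = sum(ens) / len(ens)
            md = sum(pred) / len(pred)
            c_xd = sum((x - mx) * (d - md) for x, d in zip(ens, pred)) / (len(ens) - 1)
            c_dd = sum((d - md) ** 2 for d in pred) / (len(pred) - 1)
            gain = c_xd / (c_dd + alpha * obs_var)
            # update each member against a perturbed observation
            ens = [x + gain * (obs + rng.gauss(0, (alpha * obs_var) ** 0.5) - d)
                   for x, d in zip(ens, pred)]
        return ens

    # Hypothetical prior ensemble of peak SWE (m w.e.) and one observation.
    prior = [0.1, 0.2, 0.4, 0.5, 0.3, 0.6]
    post = es_mda(prior, obs=0.30, obs_var=0.01)
    print(sum(post) / len(post))  # posterior mean, pulled toward the observation
    ```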

  11. A multiphysical ensemble system of numerical snow modelling

    NASA Astrophysics Data System (ADS)

    Lafaysse, Matthieu; Cluzet, Bertrand; Dumont, Marie; Lejeune, Yves; Vionnet, Vincent; Morin, Samuel

    2017-05-01

    Physically based multilayer snowpack models suffer from various modelling errors. To represent these errors, we built the new multiphysical ensemble system ESCROC (Ensemble System Crocus) by implementing new representations of different physical processes in the deterministic coupled multilayer ground/snowpack model SURFEX/ISBA/Crocus. This ensemble was driven and evaluated at Col de Porte (1325 m a.s.l., French Alps) over 18 years with a high-quality meteorological and snow data set. A total of 7776 simulations were evaluated separately, accounting for the uncertainties in the evaluation data. The ability of the ensemble to capture the uncertainty associated with modelling errors is assessed for snow depth, snow water equivalent, bulk density, albedo and surface temperature. Different sub-ensembles of the ESCROC system were studied with probabilistic tools to compare their performance. Results show that optimal members of the ESCROC system are able to explain more than half of the total simulation errors. Integrating members with biases exceeding the range corresponding to observational uncertainty is necessary to obtain an optimal dispersion, but this may also be a consequence of the fact that meteorological forcing uncertainties were not accounted for. The ESCROC system paves the way for integrating numerical snow-modelling errors into ensemble forecasting and ensemble assimilation systems, in support of avalanche hazard forecasting and other snowpack-modelling applications.
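    Probabilistic comparisons of sub-ensembles like those above are commonly summarized with the continuous ranked probability score (CRPS). A minimal sketch using the standard sample estimator, with hypothetical snow-depth values (the study's own metrics and data are not reproduced here):

    ```python
    def ensemble_crps(members, obs):
        """Sample-based continuous ranked probability score (lower is better):
        CRPS = E|X - y| - 0.5 * E|X - X'|, estimated from ensemble members."""
        n = len(members)
        term1 = sum(abs(x - obs) for x in members) / n
        term2 = sum(abs(a - b) for a in members for b in members) / (n * n)
        return term1 - 0.5 * term2

    # Hypothetical snow-depth forecasts (m) versus an observed 0.8 m
    print(ensemble_crps([0.6, 0.7, 0.9, 1.0], 0.8))
    ```

    For a single member the score reduces to the absolute error, and a perfect, zero-spread ensemble scores 0; averaging the score over many cases rewards ensembles whose spread matches their actual error.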

  12. Bidirectional Modulation of Intrinsic Excitability in Rat Prelimbic Cortex Neuronal Ensembles and Non-Ensembles after Operant Learning.

    PubMed

    Whitaker, Leslie R; Warren, Brandon L; Venniro, Marco; Harte, Tyler C; McPherson, Kylie B; Beidel, Jennifer; Bossert, Jennifer M; Shaham, Yavin; Bonci, Antonello; Hope, Bruce T

    2017-09-06

    Learned associations between environmental stimuli and rewards drive goal-directed learning and motivated behavior. These memories are thought to be encoded by alterations within specific patterns of sparsely distributed neurons called neuronal ensembles that are activated selectively by reward-predictive stimuli. Here, we use the Fos promoter to identify strongly activated neuronal ensembles in rat prelimbic cortex (PLC) and assess altered intrinsic excitability after 10 d of operant food self-administration training (1 h/d). First, we used the Daun02 inactivation procedure in male FosLacZ-transgenic rats to selectively ablate Fos-expressing PLC neurons that were active during operant food self-administration. Selective ablation of these neurons decreased food seeking. We then used male FosGFP-transgenic rats to assess selective alterations of intrinsic excitability in Fos-expressing neuronal ensembles (FosGFP+) that were activated during food self-administration and compared these with alterations in less activated non-ensemble neurons (FosGFP-). Using whole-cell recordings of layer V pyramidal neurons in an ex vivo brain slice preparation, we found that operant self-administration increased excitability of FosGFP+ neurons and decreased excitability of FosGFP- neurons. Increased excitability of FosGFP+ neurons was driven by increased steady-state input resistance. Decreased excitability of FosGFP- neurons was driven by an increased contribution of small-conductance calcium-activated potassium (SK) channels. Injections of the specific SK channel antagonist apamin into the PLC increased Fos expression but had no effect on food seeking. Overall, operant learning increased the intrinsic excitability of PLC Fos-expressing neuronal ensembles that play a role in food seeking but decreased the intrinsic excitability of Fos- non-ensembles. SIGNIFICANCE STATEMENT Prefrontal cortex activity plays a critical role in operant learning, but the underlying cellular mechanisms are

  13. Downscaling soil moisture over East Asia through multi-sensor data fusion and optimization of regression trees

    NASA Astrophysics Data System (ADS)

    Park, Seonyoung; Im, Jungho; Park, Sumin; Rhee, Jinyoung

    2017-04-01

    Soil moisture is one of the most important keys to understanding regional and global climate systems. Soil moisture is directly related to agricultural processes as well as hydrological processes, because soil moisture strongly influences vegetation growth and determines water supply in the agroecosystem. Accurate monitoring of the spatiotemporal pattern of soil moisture is therefore important. Soil moisture has generally been provided through in situ measurements at stations. Although field survey from in situ measurements provides accurate soil moisture with high temporal resolution, it is costly and does not provide the spatial distribution of soil moisture over large areas. Microwave satellite-based approaches (e.g., the Advanced Microwave Scanning Radiometer 2 (AMSR2), the Advanced Scatterometer (ASCAT), and Soil Moisture Active Passive (SMAP)) and numerical models such as the Global Land Data Assimilation System (GLDAS) and the Modern-Era Retrospective Analysis for Research and Applications (MERRA) provide spatiotemporally continuous soil moisture products at global scale. However, since those global soil moisture products have coarse spatial resolution (~25-40 km), their applications for agriculture and water resources at local and regional scales are very limited. Thus, soil moisture downscaling is needed to overcome the limitation of the spatial resolution of soil moisture products. In this study, GLDAS soil moisture data were downscaled to 1 km spatial resolution through the integration of AMSR2 and ASCAT soil moisture data, Shuttle Radar Topography Mission (SRTM) Digital Elevation Model (DEM) data, and Moderate Resolution Imaging Spectroradiometer (MODIS) data—Land Surface Temperature, Normalized Difference Vegetation Index, and Land Cover—using modified regression trees over East Asia from 2013 to 2015. The modified regression trees were implemented using Cubist, a commercial software tool based on machine learning. An

  14. Exploring the calibration of a wind forecast ensemble for energy applications

    NASA Astrophysics Data System (ADS)

    Heppelmann, Tobias; Ben Bouallegue, Zied; Theis, Susanne

    2015-04-01

    In the German research project EWeLiNE, Deutscher Wetterdienst (DWD) and the Fraunhofer Institute for Wind Energy and Energy System Technology (IWES) are collaborating with three German Transmission System Operators (TSOs) in order to provide the TSOs with improved probabilistic power forecasts. Probabilistic power forecasts are derived from probabilistic weather forecasts, themselves derived from ensemble prediction systems (EPS). Since the considered raw ensemble wind forecasts suffer from underdispersiveness and bias, calibration methods are developed for the correction of the model bias and the ensemble spread bias. The overall aim is to improve the ensemble forecasts such that the uncertainty of the possible weather development is depicted by the ensemble spread from the first forecast hours onward. Additionally, the ensemble members after calibration should remain physically consistent scenarios. We focus on probabilistic hourly wind forecasts with a horizon of 21 h delivered by the convection-permitting high-resolution ensemble system COSMO-DE-EPS, which became operational at DWD in 2012. The ensemble consists of 20 members driven by four different global models. The model area includes the whole of Germany and parts of Central Europe, with a horizontal resolution of 2.8 km and a vertical resolution of 50 model levels. For verification we use wind mast measurements at around 100 m height, which corresponds to the hub height of the wind turbines in wind farms within the model area. Calibration of the ensemble forecasts can be performed by different statistical methods applied to the raw ensemble output. Here, we explore local bivariate Ensemble Model Output Statistics at individual sites and quantile regression with different predictors. Applying these methods, we already show an improvement of the COSMO-DE-EPS ensemble wind forecasts for energy applications. In addition, an ensemble copula coupling approach transfers the time-dependencies of the raw
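    The EMOS idea referenced above can be sketched in its simplest univariate Gaussian form: the calibrated forecast is a normal distribution whose mean is an affine function of the ensemble mean and whose variance is an affine function of the ensemble variance. The local bivariate EMOS used in the project is more elaborate; the coefficients below are illustrative, not fitted values.

    ```python
    import math

    def emos_normal(ens, a, b, c, d):
        """Univariate Gaussian EMOS (after Gneiting et al., 2005): calibrated
        mean = a + b * ens_mean, calibrated variance = c + d * ens_var.
        The coefficients a, b, c, d would be fitted on a training period."""
        m = sum(ens) / len(ens)
        v = sum((x - m) ** 2 for x in ens) / (len(ens) - 1)
        return a + b * m, c + d * v

    def normal_cdf(x, mu, var):
        return 0.5 * (1 + math.erf((x - mu) / math.sqrt(2 * var)))

    # Hypothetical 100 m wind-speed ensemble (m/s), illustrative coefficients
    ens = [8.0, 8.5, 9.0, 9.5]
    mu, var = emos_normal(ens, a=0.3, b=1.0, c=0.5, d=1.2)
    print(mu, var)                        # calibrated mean and widened variance
    print(1 - normal_cdf(12.0, mu, var))  # probability of exceeding 12 m/s
    ```

    With c > 0 and d > 1 the calibrated variance exceeds the raw ensemble variance, which is exactly the correction needed for an underdispersive ensemble.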

  15. Evaluating the utility of dynamical downscaling in agricultural impacts projections

    PubMed Central

    Glotter, Michael; Elliott, Joshua; McInerney, David; Best, Neil; Foster, Ian; Moyer, Elisabeth J.

    2014-01-01

    Interest in estimating the potential socioeconomic costs of climate change has led to the increasing use of dynamical downscaling—nested modeling in which regional climate models (RCMs) are driven with general circulation model (GCM) output—to produce fine-spatial-scale climate projections for impacts assessments. We evaluate here whether this computationally intensive approach significantly alters projections of agricultural yield, one of the greatest concerns under climate change. Our results suggest that it does not. We simulate US maize yields under current and future CO2 concentrations with the widely used Decision Support System for Agrotechnology Transfer crop model, driven by a variety of climate inputs including two GCMs, each in turn downscaled by two RCMs. We find that no climate model output can reproduce yields driven by observed climate unless a bias correction is first applied. Once a bias correction is applied, GCM- and RCM-driven US maize yields are essentially indistinguishable in all scenarios (<10% discrepancy, equivalent to error from observations). Although RCMs correct some GCM biases related to fine-scale geographic features, errors in yield are dominated by broad-scale (100s of kilometers) GCM systematic errors that RCMs cannot compensate for. These results support previous suggestions that the benefits for impacts assessments of dynamically downscaling raw GCM output may not be sufficient to justify its computational demands. Progress on fidelity of yield projections may benefit more from continuing efforts to understand and minimize systematic error in underlying climate projections. PMID:24872455

  16. Century long observation constrained global dynamic downscaling and hydrologic implication

    NASA Astrophysics Data System (ADS)

    Kim, H.; Yoshimura, K.; Chang, E.; Famiglietti, J. S.; Oki, T.

    2012-12-01

    It has been suggested that greenhouse-gas-induced climate warming causes an acceleration of large-scale hydrologic cycles, and, indeed, many regions on Earth have suffered from increasingly frequent hydrologic extremes. However, historical observations cannot provide enough information in a comprehensive manner to understand their long-term variability and/or global distributions. In this study, a century-long, high-resolution global climate dataset is developed in order to break through these limitations. The 20th Century Reanalysis (20CR), which has relatively low spatial resolution (~2.0°) and long-term availability (140 years), is dynamically downscaled to global T248 (~0.5°) resolution using the Experimental Climate Prediction Center (ECPC) Global Spectral Model (GSM) with a spectral nudging data assimilation technique. Also, Global Precipitation Climatology Centre (GPCC) and Climatic Research Unit (CRU) observational data are adopted to reduce model-dependent uncertainty. The downscaled product successfully represents realistic geographical detail while keeping the low-frequency signal in the mean state and spatiotemporal variability, whereas a previous bias-correction method fails to reproduce high-frequency variability. The newly developed dataset is used to investigate how long-term, large-scale terrestrial hydrologic cycles have changed globally and how they have interacted with various climate modes, such as the El Niño-Southern Oscillation (ENSO) and the Atlantic Multidecadal Oscillation (AMO). As a further application, it will be used to provide the atmospheric boundary condition for multiple land surface models in the Global Soil Wetness Project Phase 3 (GSWP3).

  17. Statistical Downscaling of Gusts During Extreme European Winter Storms Using Radial-Basis-Function Networks

    NASA Astrophysics Data System (ADS)

    Voigt, M.; Lorenz, P.; Kruschke, T.; Osinski, R.; Ulbrich, U.; Leckebusch, G. C.

    2012-04-01

    Winter storms and related gusts can cause extensive socio-economic damage. Knowledge about the occurrence and the small-scale structure of such events may help to make regional estimates of storm losses. For a high spatial and temporal representation, the use of dynamical downscaling methods (RCMs) is a cost-intensive and time-consuming option and therefore only applicable to a limited number of events. The current study explores a methodology to provide a statistical downscaling, which offers small-scale structured gust fields from an extended set of large-scale structured events. Radial-basis-function (RBF) networks in combination with bidirectional Kohonen (BDK) maps are used to generate the gust fields at a spatial resolution of 7 km from the 6-hourly mean sea level pressure field from ECMWF reanalysis data. BDK maps are a kind of neural network that handles supervised classification problems. In this study they are used to provide prototypes for the RBF network and give a first-order approximation for the output data. A further interpolation is done by the RBF network. For the training process the 50 most extreme storm events over the North Atlantic area from 1957 to 2011 are used, selected from the ECMWF reanalysis datasets ERA40 and ERA-Interim by an objective wind-based tracking algorithm. These events were downscaled dynamically by application of the DWD model chain GME → COSMO-EU. Different model parameters and their influence on the quality of the generated high-resolution gust fields are studied. It is shown that the statistical RBF network approach delivers reasonable results in modeling the regional gust fields for untrained events.
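    The RBF interpolation at the heart of such a network can be illustrated in one dimension: Gaussian basis functions are centred on the training points, and the weights come from a linear solve so that the network reproduces the training values exactly. This toy sketch uses hypothetical station positions and gust values, not the study's pressure-field predictors or BDK prototypes.

    ```python
    import math

    def gauss_solve(A, b):
        """Solve A w = b by Gaussian elimination with partial pivoting."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        w = [0.0] * n
        for r in range(n - 1, -1, -1):
            w[r] = (M[r][n] - sum(M[r][c] * w[c] for c in range(r + 1, n))) / M[r][r]
        return w

    def rbf_fit(centers, values, width):
        """Weights for Gaussian RBFs centred on the training points."""
        phi = lambda r: math.exp(-(r / width) ** 2)
        A = [[phi(abs(ci - cj)) for cj in centers] for ci in centers]
        return gauss_solve(A, values)

    def rbf_eval(x, centers, weights, width):
        phi = lambda r: math.exp(-(r / width) ** 2)
        return sum(w * phi(abs(x - c)) for w, c in zip(weights, centers))

    # Hypothetical 1-D example: gust speeds (m/s) at four station positions
    centers = [0.0, 1.0, 2.0, 3.0]
    gusts = [20.0, 25.0, 23.0, 28.0]
    w = rbf_fit(centers, gusts, width=1.0)
    print(rbf_eval(1.5, centers, w, width=1.0))  # interpolated between stations
    ```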

  18. Application of multi-scale wavelet entropy and multi-resolution Volterra models for climatic downscaling

    NASA Astrophysics Data System (ADS)

    Sehgal, V.; Lakhanpal, A.; Maheswaran, R.; Khosa, R.; Sridhar, Venkataramana

    2018-01-01

    This study proposes a wavelet-based multi-resolution modeling approach for statistical downscaling of GCM variables to mean monthly precipitation for five locations in the Krishna Basin, India. A climatic dataset from NCEP is used to train the proposed models (Jan. '69 to Dec. '94), which are then applied to the corresponding CanCM4 GCM variables to simulate precipitation for the validation (Jan. '95-Dec. '05) and forecast (Jan. '06-Dec. '35) periods. The observed precipitation data are obtained from the India Meteorological Department (IMD) gridded precipitation product at 0.25 degree spatial resolution. This paper proposes a novel Multi-Scale Wavelet Entropy (MWE) based approach for clustering climatic variables into suitable clusters using the k-means methodology. Principal Component Analysis (PCA) is used to obtain the representative Principal Components (PC) explaining 90-95% of the variance for each cluster. A multi-resolution non-linear approach combining the Discrete Wavelet Transform (DWT) and Second Order Volterra (SoV) models is used to model the representative PCs to obtain the downscaled precipitation for each downscaling location (the W-P-SoV model). The results establish that the wavelet-based multi-resolution SoV models perform significantly better than the traditional Multiple Linear Regression (MLR) and Artificial Neural Network (ANN) based frameworks. It is observed that the proposed MWE-based clustering and subsequent PCA help reduce the dimensionality of the input climatic variables, while capturing more variability compared to stand-alone k-means (no MWE). The proposed models perform better in estimating the number of precipitation events during the non-monsoon periods, whereas the models with clustering but without MWE over-estimate the rainfall during the dry season.
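    The multi-resolution decomposition underlying such wavelet-based models can be illustrated with a single level of the Haar discrete wavelet transform, the simplest DWT: the series is split into a low-pass (approximation) and a high-pass (detail) half, and each band can then be modelled separately. This is a generic sketch with hypothetical values; the study's actual wavelet filters, entropy measure and Volterra models are not reproduced here.

    ```python
    import math

    def haar_step(signal):
        """One level of the Haar DWT: split an even-length series into
        approximation (low-pass) and detail (high-pass) coefficients."""
        s = math.sqrt(2)
        approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
        detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
        return approx, detail

    def haar_inverse(approx, detail):
        """Perfect reconstruction of the original series from the two bands."""
        s = math.sqrt(2)
        out = []
        for a, d in zip(approx, detail):
            out += [(a + d) / s, (a - d) / s]
        return out

    # Hypothetical monthly precipitation anomalies
    x = [1.0, 3.0, 2.0, 2.0, 4.0, 0.0, 1.0, 5.0]
    a, d = haar_step(x)
    print(a)  # coarse-scale (smoothed) component
    print(d)  # fine-scale (detail) component
    ```

    Applying `haar_step` recursively to the approximation band yields the full multi-resolution pyramid that the downscaling models operate on.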

  19. Development and evaluation of climatologically-downscaled AFWA AGRMET precipitation products over the continental U.S.

    NASA Astrophysics Data System (ADS)

    Garcia, M.; Peters-Lidard, C. D.; Eylander, J. B.; Daly, C.; Gibson, W.; Tian, Y.; Zeng, J.; Kato, H.

    2008-05-01

    Collaborations between the Air Force Weather Agency (AFWA), the Hydrological Sciences Branch at NASA-GSFC, and the PRISM Group at Oregon State University have led to improvements in the processing of meteorological forcing inputs for the NASA-GSFC Land Information System (LIS; Kumar et al. 2006), a sophisticated framework for LSM operation and model coupling experiments. Efforts at AFWA toward the production of surface hydrometeorological products are currently in transition from the legacy Agricultural Meteorology modeling system (AGRMET) to use of the LIS framework and procedures. Recent enhancements to meteorological input processing for application to land surface models in LIS include the assimilation of climate-based information for the spatial interpolation and downscaling of precipitation fields. Climatological information included in the LIS-based downscaling procedure for North America is provided by a monthly high-resolution PRISM (Daly et al. 1994, 2002; Daly 2006) dataset based on a 30-year analysis period. The combination of these sources and methods attempts to address the strengths and weaknesses of available legacy products, objective interpolation methods, and the PRISM knowledge-based methodology. All of these efforts are oriented toward an operational need for timely estimation of spatial precipitation fields at adequate spatial resolution for customer dissemination and near-real-time simulations in regions of interest. This work focuses on the value added to the AGRMET precipitation product by the inclusion of high-quality climatological information on a monthly time scale. The AGRMET method uses microwave-based satellite precipitation estimates from various polar-orbiting platforms (NOAA POES and DMSP), infrared-based estimates from geostationary platforms (GOES, METEOSAT, etc.), related cloud analysis products, and surface gauge observations in a complex and hierarchical blending process. Results from processing of the legacy AGRMET precipitation

  20. Development of probabilistic regional climate scenario in East Asia

    NASA Astrophysics Data System (ADS)

    Dairaku, K.; Ueno, G.; Ishizaki, N. N.

    2015-12-01

    Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in East Asia (CORDEX-EA and Japan), the probability distribution of 2 m air temperature was estimated using a newly developed regression model. The method is easily applicable to other regions and other physical quantities, and can also downscale to finer scales depending on the availability of observation datasets. Probabilistic climate information for the present (1969-1998) and future (2069-2098) climate was developed using 21 models from the CMIP3 SRES A1B scenarios and observation data (CRU_TS3.22 and University of Delaware data in CORDEX-EA; NIAES AMeDAS mesh data in Japan). The prototype of probabilistic information for CORDEX-EA and Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Appropriate combinations of statistical methods and optimization of climate ensemble experiments using multiple General Circulation Models (GCMs) and multiple regional climate model (RCM) ensemble downscaling experiments are investigated.

  1. Nationwide validation of ensemble streamflow forecasts from the Hydrologic Ensemble Forecast Service (HEFS) of the U.S. National Weather Service

    NASA Astrophysics Data System (ADS)

    Lee, H. S.; Liu, Y.; Ward, J.; Brown, J.; Maestre, A.; Herr, H.; Fresch, M. A.; Wells, E.; Reed, S. M.; Jones, E.

    2017-12-01

    The National Weather Service's (NWS) Office of Water Prediction (OWP) recently launched a nationwide effort to verify streamflow forecasts from the Hydrologic Ensemble Forecast Service (HEFS) for a majority of forecast locations across the 13 River Forecast Centers (RFCs). Known as the HEFS Baseline Validation (BV), the project is a joint effort between the OWP and the RFCs. It aims to provide a geographically consistent, statistically robust validation and benchmark to guide the operational implementation of the HEFS, to inform practical applications such as impact-based decision support services, and to provide an objective framework for evaluating strategic investments in the HEFS. For the BV, HEFS hindcasts are issued once per day on a 12Z cycle for the period 1985-2015 with a forecast horizon of 30 days. For the first two weeks, the hindcasts are forced with precipitation and temperature ensemble forecasts from the Global Ensemble Forecast System of the National Centers for Environmental Prediction, and by resampled climatology for the remaining period. The HEFS-generated ensemble streamflow hindcasts are verified using the Ensemble Verification System. Skill is assessed relative to streamflow hindcasts generated from the NWS's current operational system, namely climatology-based Ensemble Streamflow Prediction. In this presentation, we summarize the results and findings to date.
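    Skill "relative to" a climatology baseline, as described above, is conventionally expressed as a skill score: one minus the ratio of the forecast's score to the reference's score. A minimal sketch using the Brier score for a binary event (e.g., flow exceeding some threshold); the probabilities and outcomes below are hypothetical, not BV results.

    ```python
    def brier_score(probs, outcomes):
        """Mean squared difference between forecast probabilities and binary
        outcomes (1 if the event occurred, 0 otherwise)."""
        return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

    def skill_score(forecast, reference):
        """Fractional improvement over the reference score: 1 is perfect,
        0 matches the baseline, negative is worse than the baseline."""
        return 1.0 - forecast / reference

    # Hypothetical exceedance probabilities from ensemble hindcasts, scored
    # against a fixed climatological frequency of 0.2 on the same events.
    outcomes = [1, 0, 0, 1, 0]
    fcst = brier_score([0.8, 0.1, 0.3, 0.6, 0.2], outcomes)
    clim = brier_score([0.2] * 5, outcomes)
    print(skill_score(fcst, clim))
    ```

    The same construction works with any proper score (e.g., the CRPS) in place of the Brier score, which is how ensemble streamflow hindcasts are typically benchmarked against climatology-based ESP.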

  2. Argumentation Based Joint Learning: A Novel Ensemble Learning Approach

    PubMed Central

    Xu, Junyi; Yao, Li; Li, Le

    2015-01-01

    Recently, ensemble learning methods have been widely used to improve classification performance in machine learning. In this paper, we present a novel ensemble learning method: argumentation based multi-agent joint learning (AMAJL), which integrates ideas from multi-agent argumentation, ensemble learning, and association rule mining. In AMAJL, argumentation technology is introduced as an ensemble strategy to integrate multiple base classifiers and generate a high-performance ensemble classifier. We design an argumentation framework named Arena as a communication platform for knowledge integration. Through argumentation based joint learning, high quality individual knowledge can be extracted, and thus a refined global knowledge base can be generated and used independently for classification. We perform numerous experiments on multiple public datasets using AMAJL and other benchmark methods. The results demonstrate that our method can effectively extract high quality knowledge for the ensemble classifier and improve classification performance. PMID:25966359

  3. Regionalization of post-processed ensemble runoff forecasts

    NASA Astrophysics Data System (ADS)

    Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian

    2016-05-01

    For many years, meteorological models have been run with perturbed initial conditions or parameters to produce ensemble forecasts that are used as a proxy of the uncertainty of the forecasts. However, the ensembles are usually both biased (the mean is systematically too high or too low compared with the observed weather) and subject to dispersion errors (the ensemble variance indicates too low or too high a confidence in the forecast compared with the observed weather). The ensembles are therefore commonly post-processed to correct for these shortcomings. Here we look at one of these techniques, referred to as Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). Originally, the post-processing parameters were identified as a fixed set of parameters for a region. The application of our work is the European Flood Awareness System (http://www.efas.eu), where a distributed model is run with meteorological ensembles as input. We are therefore dealing with a considerably larger data set than previous analyses. We also want to regionalize the parameters themselves for locations other than the calibration gauges. The post-processing parameters are therefore estimated for each calibration station, but with a spatial penalty for deviations from neighbouring stations, depending on the expected semivariance between the calibration catchment and these stations. The estimated post-processing parameters can then be regionalized to uncalibrated locations using top-kriging in the rtop package (Skøien et al., 2006, 2014). We will show results from cross-validation of the methodology, and although our interest is mainly in identifying exceedance probabilities for certain return levels, we will also show how the rtop package can be used to create a set of post-processed ensembles through simulations.
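    In the Gaussian EMOS of Gneiting et al. (2005), the predictive distribution is N(a + b·(ensemble mean), c + d·(ensemble variance)), with one parameter set per station. The sketch below fits a and b by least squares and c and d from squared residuals, a moment-based shortcut for illustration (the original method minimizes the CRPS), using made-up calibration data:

```python
import statistics

def _ols(x, y):
    """Least-squares slope and intercept of y regressed on x."""
    mx, my = statistics.mean(x), statistics.mean(y)
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

def fit_emos(ens_means, ens_vars, obs):
    """Fit predictive mean a + b*m and variance c + d*s2 per station."""
    b, a = _ols(ens_means, obs)
    resid_sq = [(o - (a + b * m)) ** 2 for m, o in zip(ens_means, obs)]
    d, c = _ols(ens_vars, resid_sq)
    return a, b, max(c, 0.0), max(d, 0.0)  # variance params kept non-negative

# Made-up history of 5 forecasts at one station: ensemble means, ensemble
# variances, and verifying observations
a, b, c, d = fit_emos([1.0, 2.0, 3.0, 4.0, 5.0],
                      [0.5, 0.6, 0.4, 0.7, 0.5],
                      [1.5, 2.5, 3.5, 4.5, 5.5])
```

    The spatial penalty described in the abstract would enter as an extra term in the fitting objective, pulling each station's (a, b, c, d) toward those of its neighbours.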

  4. Downscaling Land Surface Temperature in an Urban Area: A Case Study for Hamburg, Germany

    NASA Astrophysics Data System (ADS)

    Bechtel, Benjamin; Zakšek, Klemen

    2013-04-01

    Land surface temperature (LST) is an important parameter for the urban radiation and heat balance and a boundary condition for the atmospheric urban heat island (UHI). The increase in urban surface temperatures compared to the surrounding area (surface urban heat island, SUHI) has been described and analysed with satellite-based measurements for several decades. Despite continuous progress in the development of new sensors, operational monitoring is still severely limited by physical constraints on the spatial and temporal resolution of satellite data. Essentially, two measurement concepts must be distinguished: sensors on geostationary platforms have high temporal resolution (several times per hour) but poor spatial resolution (~5 km), while those on low Earth orbiters have high spatial resolution (~100-1000 m) but a long return period (one day to several weeks). To enable observation with both high temporal and high spatial resolution, a downscaling scheme for LST from the Spinning Enhanced Visible Infra-Red Imager (SEVIRI) sensor onboard the geostationary meteorological satellite Meteosat 9 to spatial resolutions between 100 and 1000 m was developed and tested for Hamburg in this case study. To this end, various predictor sets (including parameters derived from multi-temporal thermal data, NDVI, and morphological parameters) were tested. The relationship between predictors and LST was empirically calibrated in the low-resolution domain and then transferred to the high-resolution domain. The downscaling was validated with concurrent LST data from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER). Aggregated parameters from multi-temporal thermal data (in particular annual cycle parameters and principal components) proved particularly suitable. The results for the highest resolution of 100 m showed a high explained variance (R² = 0.71) and relatively low root mean square errors (RMSE = 2.2 K). Larger predictor sets resulted in higher errors
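    The central trick, calibrating a predictor-LST relationship at the coarse SEVIRI scale and transferring it to the fine scale, can be sketched with a single linear predictor (the study itself uses larger predictor sets; every number here is invented):

```python
def fit_linear(x, y):
    """Least-squares fit y ~ a + b*x, calibrated in the coarse domain."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Coarse-pixel predictor (e.g. an annual-cycle parameter) and SEVIRI LST (K)
coarse_pred = [0.2, 0.4, 0.6, 0.8]
coarse_lst = [291.0, 293.0, 295.0, 297.0]
a, b = fit_linear(coarse_pred, coarse_lst)

# Transfer the calibrated relationship to the high-resolution predictor field
fine_pred = [0.25, 0.55, 0.75]
fine_lst = [a + b * p for p in fine_pred]
```

    Validation against an independent high-resolution product (ASTER in the study) then measures how much variance the transferred relationship actually explains.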

  5. Climatological Downscaling and Evaluation of AGRMET Precipitation Analyses Over the Continental U.S.

    NASA Astrophysics Data System (ADS)

    Garcia, M.; Peters-Lidard, C. D.; Eylander, J. B.; Daly, C.; Tian, Y.; Zeng, J.

    2007-05-01

    The spatially distributed application of a land surface model (LSM) over a region of interest requires similarly distributed precipitation fields, which can be derived from various sources, including surface gauge networks, surface-based radar, and orbital platforms. The spatial variability of precipitation influences the spatial organization of soil temperature and moisture states and, consequently, the spatial variability of land-atmosphere fluxes. The accuracy of spatially distributed precipitation fields can contribute significantly to the uncertainty of model-based hydrological states and fluxes at the land surface. Collaborations between the Air Force Weather Agency (AFWA), NASA, and Oregon State University have led to improvements in the processing of meteorological forcing inputs for the NASA-GSFC Land Information System (LIS; Kumar et al. 2006), a sophisticated framework for LSM operation and model coupling experiments. Efforts at AFWA toward the production of surface hydrometeorological products are currently in transition from the legacy Agricultural Meteorology modeling system (AGRMET) to the LIS framework and procedures. Recent enhancements to meteorological input processing for land surface models in LIS include the assimilation of climate-based information for the spatial interpolation and downscaling of precipitation fields. Climatological information included in the LIS-based downscaling procedure for North America is provided by a monthly high-resolution PRISM (Daly et al. 1994, 2002; Daly 2006) dataset based on a 30-year analysis period. The combination of these sources and methods attempts to address the strengths and weaknesses of available legacy products, objective interpolation methods, and the PRISM knowledge-based methodology. All of these efforts are driven by an operational need for timely estimation of spatial precipitation fields at adequate spatial resolution for customer dissemination and
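    One common way to use a high-resolution climatology such as PRISM for downscaling (a generic sketch, not necessarily the exact LIS procedure) is mass-conserving multiplicative scaling: each coarse-cell precipitation value is spread over its fine cells in proportion to the monthly climatology:

```python
def downscale_precip(coarse_value, clim_fine, eps=1e-9):
    """Distribute one coarse-cell precipitation value over its fine cells
    in proportion to a high-resolution monthly climatology. The mean of
    the fine cells equals the coarse value, so mass is conserved."""
    clim_mean = sum(clim_fine) / len(clim_fine)
    return [coarse_value * c / max(clim_mean, eps) for c in clim_fine]

# Made-up example: 10 mm over a coarse cell whose three fine cells have
# PRISM-like monthly climatology values of 50, 100, and 150 mm
fine = downscale_precip(10.0, [50.0, 100.0, 150.0])
```

    The windward fine cell with the wettest climatology receives proportionally more of the coarse total, which is exactly the orographic structure an objective interpolation alone would miss.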

  6. Pauci ex tanto numero: reducing redundancy in multi-model ensembles

    NASA Astrophysics Data System (ADS)

    Solazzo, E.; Riccio, A.; Kioutsioukis, I.; Galmarini, S.

    2013-02-01

    We explicitly address the fundamental issue of member diversity in multi-model ensembles. To date, no attempts in this direction have been documented within the air quality (AQ) community, despite the extensive use of ensembles in this field. Common biases and redundancy are the two issues directly deriving from lack of independence, undermining the significance of a multi-model ensemble, and are the subject of this study. Shared biases among models will produce a biased ensemble; it is therefore essential that the errors of the ensemble members be independent so that biases can cancel out. Redundancy derives from having too large a portion of common variance among the members of the ensemble, producing overconfidence in the predictions and underestimation of the uncertainty. The two issues of common biases and redundancy are analysed in detail using the AQMEII ensemble of AQ model results for four air pollutants in two European regions. We show that models share large portions of bias and variance, extending well beyond those induced by common inputs. We make use of several techniques to further show that subsets of models can explain the same amount of variance as the full ensemble with the advantage of being poorly correlated. Selecting the members for generating skilful, non-redundant ensembles from such subsets proved, however, non-trivial. We propose and discuss various methods of member selection and rate the ensemble performance they produce. In most cases, the full ensemble is outscored by the reduced ones. We conclude that, although independence of outputs may not always guarantee enhancement of scores (but this depends upon the skill being investigated), we discourage selecting the members of the ensemble simply on the basis of scores; that is, independence and skills need to be considered disjointly.
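    Member selection for a non-redundant sub-ensemble can be illustrated with a simple greedy rule that at each step adds the member least correlated with those already chosen (one of many possible criteria, and not necessarily among the selection methods the authors rate):

```python
def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

def select_diverse(members, k):
    """Greedy subset: seed with member 0, then repeatedly add the member
    whose worst-case |correlation| with the chosen set is smallest."""
    chosen = [0]
    while len(chosen) < k:
        rest = [i for i in range(len(members)) if i not in chosen]
        best = min(rest, key=lambda i: max(abs(pearson(members[i], members[j]))
                                           for j in chosen))
        chosen.append(best)
    return chosen

# Toy "model output" series: member 1 is nearly a copy of member 0
members = [[1.0, 2.0, 3.0, 4.0],
           [1.1, 2.0, 3.1, 3.9],
           [1.0, 3.0, 2.0, 5.0]]
```

    Asking for two members skips the near-duplicate and picks the least correlated one, which is precisely the redundancy-avoidance behaviour the abstract argues for.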

  7. EnsembleGASVR: a novel ensemble method for classifying missense single nucleotide polymorphisms.

    PubMed

    Rapakoulia, Trisevgeni; Theofilatos, Konstantinos; Kleftogiannis, Dimitrios; Likothanasis, Spiros; Tsakalidis, Athanasios; Mavroudi, Seferina

    2014-08-15

    Single nucleotide polymorphisms (SNPs) are considered the most frequently occurring DNA sequence variations. Several computational methods have been proposed for the classification of missense SNPs as neutral or disease-associated. However, existing computational approaches fail to select relevant features, choosing them arbitrarily without sufficient documentation. Moreover, they are hampered by missing values and by imbalance between the learning datasets, and most of them do not support their predictions with confidence scores. To overcome these limitations, a novel ensemble computational methodology is proposed. EnsembleGASVR is a two-step algorithm: in its first step, it applies a novel evolutionary embedded algorithm to locate close-to-optimal Support Vector Regression (SVR) models; in its second step, these models are combined to extract a universal predictor, which is less prone to overfitting, systematizes the rebalancing of the learning sets, and uses an internal approach for solving the missing-values problem without loss of information. Confidence scores support all predictions, and the model can be tuned by modifying the classification thresholds. An extensive study was performed to collect the most relevant features for the problem of classifying SNPs, and a superset of 88 features was constructed. Experimental results show that the proposed framework outperforms well-known algorithms in terms of classification performance on the examined datasets. Finally, the proposed algorithmic framework was able to uncover the significant role of certain features, such as solvent accessibility, and the top-scored predictions were further validated by linking them with disease phenotypes. Datasets and codes are freely available on the Web at http://prlab.ceid.upatras.gr/EnsembleGASVR/dataset-codes.zip. All the required information about the article is available through http://prlab.ceid.upatras.gr/Ensemble
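    The way a combined predictor can attach confidence scores and stay tunable via a classification threshold can be sketched as follows (a generic illustration of the idea, not the paper's actual combination rule):

```python
def ensemble_classify(scores, threshold=0.5):
    """Average continuous outputs of several regressors (assumed in [0, 1]);
    the label comes from a tunable threshold, and the distance from that
    threshold serves as a crude confidence score."""
    mean = sum(scores) / len(scores)
    label = "disease-associated" if mean >= threshold else "neutral"
    confidence = abs(mean - threshold) / max(threshold, 1.0 - threshold)
    return label, round(confidence, 3)

# Three hypothetical SVR-style outputs for one missense SNP
label, conf = ensemble_classify([0.9, 0.8, 0.7])
```

    Raising the threshold trades sensitivity for specificity, which is what makes the classifier "tunable" in the sense the abstract describes.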

  8. 20060530 - Global Ensemble Upgrade - NWS ftp

    Science.gov Websites

    Notice of major product changes to NCEP global ensemble output on the NWS ftp server, updating the data locations (.{YYYYMMDD}/ paths) described in an earlier message: starting with the 12 UTC cycle on 30 May 2006, NCEP Central Operations will implement changes to the global ensemble output.

  9. Ensemble pharmacophore meets ensemble docking: a novel screening strategy for the identification of RIPK1 inhibitors

    NASA Astrophysics Data System (ADS)

    Fayaz, S. M.; Rajanikant, G. K.

    2014-07-01

    Programmed cell death has been a fascinating area of research, since it continues to raise new challenges and questions despite the tremendous ongoing research in this field. Recently, necroptosis, a programmed form of necrotic cell death, has been implicated in many diseases including neurological disorders. Receptor-interacting serine/threonine protein kinase 1 (RIPK1) is an important regulatory protein involved in necroptosis, and inhibition of this protein is essential to stop the necroptotic process and eventually cell death. Current structure-based virtual screening methods involve a wide range of strategies, and recently the use of multiple protein structures for pharmacophore extraction has been emphasized as a way to improve the outcome. However, it is important to use the pharmacophoric information fully during docking. Further, in such methods, using the appropriate protein structures for docking is desirable; if not, potential compound hits obtained through pharmacophore-based screening may not have correct ranks and scores after docking. Therefore, a comprehensive integration of different ensemble methods is essential and may provide better virtual screening results. In this study, dual ensemble screening, a novel computational strategy, was used to identify diverse and potent inhibitors against RIPK1. All the pharmacophore features present in the binding site were captured using both the apo and holo protein structures, and an ensemble pharmacophore was built by combining these features. This ensemble pharmacophore was employed in pharmacophore-based screening of the ZINC database. The compound hits thus obtained were subjected to ensemble docking. The leads acquired through docking were further validated through feature evaluation and molecular dynamics simulation.

  10. The Hydrologic Ensemble Prediction Experiment (HEPEX)

    NASA Astrophysics Data System (ADS)

    Wood, A. W.; Thielen, J.; Pappenberger, F.; Schaake, J. C.; Hartman, R. K.

    2012-12-01

    The Hydrologic Ensemble Prediction Experiment was established in March 2004 at a workshop hosted by the European Centre for Medium-Range Weather Forecasts (ECMWF). With support from the US National Weather Service (NWS) and the European Commission (EC), the HEPEX goal was to bring the international hydrological and meteorological communities together to advance the understanding and adoption of hydrological ensemble forecasts for decision support in the emergency management and water resources sectors. The strategy to meet this goal includes meetings that connect the user, forecast producer and research communities to exchange ideas, data and methods; the coordination of experiments to address specific challenges; and the formation of testbeds to facilitate shared experimentation. HEPEX has organized about a dozen international workshops, as well as sessions at scientific meetings (including AMS, AGU and EGU) and special issues of scientific journals where workshop results have been published. Today, the HEPEX mission is to demonstrate the added value of hydrological ensemble prediction systems (HEPS) for emergency management and water resources sectors to make decisions that have important consequences for economy, public health, safety, and the environment. HEPEX is now organised around six major themes that represent core elements of a hydrologic ensemble prediction enterprise: input and pre-processing, ensemble techniques, data assimilation, post-processing, verification, and communication and use in decision making. This poster presents an overview of recent and planned HEPEX activities, highlighting case studies that exemplify the focus and objectives of HEPEX.

  11. Pauci ex tanto numero: reduce redundancy in multi-model ensembles

    NASA Astrophysics Data System (ADS)

    Solazzo, E.; Riccio, A.; Kioutsioukis, I.; Galmarini, S.

    2013-08-01

    We explicitly address the fundamental issue of member diversity in multi-model ensembles. To date, no attempts in this direction have been documented within the air quality (AQ) community despite the extensive use of ensembles in this field. Common biases and redundancy are the two issues directly deriving from lack of independence, undermining the significance of a multi-model ensemble, and are the subject of this study. Shared, dependent biases among models do not cancel out but will instead determine a biased ensemble. Redundancy derives from having too large a portion of common variance among the members of the ensemble, producing overconfidence in the predictions and underestimation of the uncertainty. The two issues of common biases and redundancy are analysed in detail using the AQMEII ensemble of AQ model results for four air pollutants in two European regions. We show that models share large portions of bias and variance, extending well beyond those induced by common inputs. We make use of several techniques to further show that subsets of models can explain the same amount of variance as the full ensemble with the advantage of being poorly correlated. Selecting the members for generating skilful, non-redundant ensembles from such subsets proved, however, non-trivial. We propose and discuss various methods of member selection and rate the ensemble performance they produce. In most cases, the full ensemble is outscored by the reduced ones. We conclude that, although independence of outputs may not always guarantee enhancement of scores (but this depends upon the skill being investigated), we discourage selecting the members of the ensemble simply on the basis of scores; that is, independence and skills need to be considered disjointly.

  12. The Road to DLCZ Protocol in Rubidium Ensemble

    NASA Astrophysics Data System (ADS)

    Li, Chang; Pu, Yunfei; Jiang, Nan; Chang, Wei; Zhang, Sheng; CenterQuantum Information, InstituteInterdisciplinary Information Sciences, Tsinghua Univ Team

    2017-04-01

    Quantum communication is a powerful approach to achieving fully secure information transfer. The DLCZ protocol ensures that the photon signal decays only linearly with increasing transmission distance, which improves the success probability and shortens the time needed to build up an entangled channel. Beyond that, it offers the appealing prospect of a quantum internet built from nodes connected to different sites and to one another. In our laboratory, three laser-cooled rubidium-87 ensembles have been built. Two of them serve as single-photon emitters, generating entanglement between ensemble and photon. In addition, crossed acousto-optic deflectors (AODs) multiplex and demultiplex the optical circuit so that each ensemble is divided into two hundred 2D sub-memory cells. The third ensemble is used for quantum telecommunication, converting 780 nm photons into telecom-wavelength photons. We have also been building a double-MOT system, which provides more atoms in the ensemble and larger optical density.

  13. Quantum Gibbs ensemble Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fantoni, Riccardo; Moroni, Saverio

    We present a path integral Monte Carlo method which is the full quantum analogue of the Gibbs ensemble Monte Carlo method of Panagiotopoulos to study the gas-liquid coexistence line of a classical fluid. Unlike previous extensions of Gibbs ensemble Monte Carlo to include quantum effects, our scheme is viable even for systems with strong quantum delocalization in the degenerate regime of temperature. This is demonstrated by an illustrative application to the gas-superfluid transition of ⁴He in two dimensions.
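    For reference, the particle-transfer move of the underlying classical Gibbs ensemble method of Panagiotopoulos (the scheme this work extends to the quantum case) accepts a transfer from box 1 to box 2 with probability min(1, N1·V2 / ((N2+1)·V1) · exp(-βΔU)). A minimal sketch of that acceptance rule:

```python
import math
import random

def accept_transfer(n_from, n_to, v_from, v_to, delta_u, beta,
                    rng=random.random):
    """Metropolis acceptance for moving one particle between the two boxes
    of a classical Gibbs ensemble Monte Carlo simulation. delta_u is the
    total potential-energy change of the move; beta = 1/(kT)."""
    log_ratio = (math.log(n_from * v_to / ((n_to + 1) * v_from))
                 - beta * delta_u)
    return rng() < math.exp(min(log_ratio, 0.0))  # cap acceptance at 1

# Equal boxes, zero energy cost, 10 -> (9+1): acceptance probability is 1
always = accept_transfer(10, 9, 1.0, 1.0, 0.0, beta=1.0)
# Huge energy penalty: acceptance probability underflows to 0
never = accept_transfer(10, 9, 1.0, 1.0, 1e6, beta=1.0)
```

    Alternating such transfer moves with volume-exchange and displacement moves drives the two boxes toward coexisting gas and liquid densities; the quantum version replaces the Boltzmann energy change with path-integral weights.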

  14. Decadal climate predictions improved by ocean ensemble dispersion filtering

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kröner, I.; Ulbrich, U.; Cubasch, U.

    2017-06-01

    Decadal predictions by Earth system models aim to capture the state and phase of the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. While short-term weather forecasts represent an initial value problem and long-term climate projections represent a boundary condition problem, the decadal climate prediction falls in-between these two time scales. In recent years, more precise initialization techniques of coupled Earth system models and increased ensemble sizes have improved decadal predictions. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Here we show that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, called ensemble dispersion filter, results in more accurate results than the standard decadal prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution. Our results demonstrate how decadal climate predictions benefit from ocean ensemble dispersion filtering toward the ensemble mean.Plain Language SummaryDecadal predictions aim to predict the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. The ocean memory due to its heat capacity holds big potential skill. In recent years, more precise initialization techniques of coupled Earth system models (incl. atmosphere and ocean) have improved decadal predictions. <span class="hlt">Ensembles</span> are another important aspect. Applying slightly perturbed predictions to trigger the famous butterfly effect results in an <span class="hlt">ensemble</span>. 
Instead of evaluating one</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://pubs.usgs.gov/of/2016/1102/ofr20161102.pdf','USGSPUBS'); return false;" href="https://pubs.usgs.gov/of/2016/1102/ofr20161102.pdf"><span>Report from the workshop on climate <span class="hlt">downscaling</span> and its application in high Hawaiian Islands, September 16–17, 2015</span></a></p> <p><a target="_blank" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Helweg, David A.; Keener, Victoria; Burgett, Jeff M.</p> <p>2016-07-14</p> <p>In the subtropical and tropical Pacific islands, changing climate is predicted to influence precipitation and freshwater availability, and thus is predicted to impact ecosystems goods and services available to ecosystems and human communities. The small size of high Hawaiian Islands, plus their complex microlandscapes, require <span class="hlt">downscaling</span> of global climate models to provide future projections of greater skill and spatial resolution. Two different climate modeling approaches (physics-based dynamical <span class="hlt">downscaling</span> and statistics-based <span class="hlt">downscaling</span>) have produced dissimilar projections. Because of these disparities, natural resource managers and decision makers have low confidence in using the modeling results and are therefore are unwilling to include climate-related projections in their decisions. In September 2015, the Pacific Islands Climate Science Center (PICSC), the Pacific Islands Climate Change Cooperative (PICCC), and the Pacific Regional Integrated Sciences and Assessments (Pacific RISA) program convened a 2-day facilitated workshop in which the two modeling teams, plus key model users and resource managers, were brought together for a comparison of the two approaches, culminating with a discussion of how to provide predictions that are useable by resource managers. 
The proceedings, discussions, and outcomes of this Workshop are summarized in this Open-File Report.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..1510233G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..1510233G"><span>On the impact of using <span class="hlt">downscaled</span> reanalysis data instead of direct measurements for modeling the mass balance of a tropical glacier (Cordillera Blanca, Peru)</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Galos, Stephan; Hofer, Marlis; Marzeion, Ben; Mölg, Thomas; Großhauser, Martin</p> <p>2013-04-01</p> <p>Due to their setting, tropical glaciers are sensitive indicators of mid-tropospheric meteorological variability and climate change. Furthermore these glaciers are of particular interest because they respond faster to climatic changes than glaciers located in mid- or high-latitudes. As long-term direct meteorological measurements in such remote environments are scarce, reanalysis data (e.g. ERA-Interim) provide a highly valuable source of information. Reanalysis datasets (i) enable a temporal extension of data records gained by direct measurements and (ii) provide information from regions where direct measurements are not available. In order to properly derive the physical exchange processes between glaciers and atmosphere from reanalysis data, <span class="hlt">downscaling</span> procedures are required. In the present study we investigate if <span class="hlt">downscaled</span> atmospheric variables (air temperature and relative humidity) from a reanalysis dataset can be used as input for a physically based, high resolution energy and mass balance model. 
We apply a well validated empirical-statistical <span class="hlt">downscaling</span> model, fed with ERA-Interim data, to an automated weather station (AWS) on the surface of Glaciar Artesonraju (8.96° S | 77.63° W). The <span class="hlt">downscaled</span> data is then used to replace measured air temperature and relative humidity in the input for the energy and mass balance model, which was calibrated using ablation data from stakes and a sonic ranger. In order to test the sensitivity of the modeled mass balance to the <span class="hlt">downscaled</span> data, the results are compared to a reference model run driven solely with AWS data as model input. We finally discuss the results and present future perspectives for further developing this method.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20160003588&hterms=risk+climate&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Drisk%2Bclimate','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20160003588&hterms=risk+climate&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Drisk%2Bclimate"><span>Development and Evaluation of High-Resolution Climate Simulations Over the Mountainous Northeastern United States</span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Winter, Jonathan M.; Beckage, Brian; Bucini, Gabriela; Horton, Radley M.; Clemins, Patrick J.</p> <p>2016-01-01</p> <p>The mountain regions of the northeastern United States are a critical socioeconomic resource for Vermont, New York State, New Hampshire, Maine, and southern Quebec. 
While global climate models (GCMs) are important tools for climate change risk assessment at regional scales, even the increased spatial resolution of statistically <span class="hlt">downscaled</span> GCMs (commonly approximately 1/ 8 deg) is not sufficient for hydrologic, ecologic, and land-use modeling of small watersheds within the mountainous Northeast. To address this limitation, an <span class="hlt">ensemble</span> of topographically <span class="hlt">downscaled</span>, high-resolution (30"), daily 2-m maximum air temperature; 2-m minimum air temperature; and precipitation simulations are developed for the mountainous Northeast by applying an additional level of <span class="hlt">downscaling</span> to intermediately <span class="hlt">downscaled</span> (1/ 8 deg) data using high-resolution topography and station observations. First, observed relationships between 2-m air temperature and elevation and between precipitation and elevation are derived. Then, these relationships are combined with spatial interpolation to enhance the resolution of intermediately <span class="hlt">downscaled</span> GCM simulations. The resulting topographically <span class="hlt">downscaled</span> dataset is analyzed for its ability to reproduce station observations. Topographic <span class="hlt">downscaling</span> adds value to intermediately <span class="hlt">downscaled</span> maximum and minimum 2-m air temperature at high-elevation stations, as well as moderately improves domain-averaged maximum and minimum 2-m air temperature. Topographic <span class="hlt">downscaling</span> also improves mean precipitation but not daily probability distributions of precipitation. Overall, the utility of topographic <span class="hlt">downscaling</span> is dependent on the initial bias of the intermediately <span class="hlt">downscaled</span> product and the magnitude of the elevation adjustment. 
As the initial bias or elevation adjustment increases, more value is added to the topographically <span class="hlt">downscaled</span> product.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016NatCo...710788Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016NatCo...710788Z"><span>Phase-selective entrainment of nonlinear oscillator <span class="hlt">ensembles</span></span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zlotnik, Anatoly; Nagao, Raphael; Kiss, István Z.; Li-Shin, Jr.</p> <p>2016-03-01</p> <p>The ability to organize and finely manipulate the hierarchy and timing of dynamic processes is important for understanding and influencing brain functions, sleep and metabolic cycles, and many other natural phenomena. However, establishing spatiotemporal structures in biological oscillator <span class="hlt">ensembles</span> is a challenging task that requires controlling large collections of complex nonlinear dynamical units. In this report, we present a method to design entrainment signals that create stable phase patterns in <span class="hlt">ensembles</span> of heterogeneous nonlinear oscillators without using state feedback information. We demonstrate the approach using experiments with electrochemical reactions on multielectrode arrays, in which we selectively assign <span class="hlt">ensemble</span> subgroups into spatiotemporal patterns with multiple phase clusters. 
The experimentally confirmed mechanism elucidates the connection between the phases and natural frequencies of a collection of dynamical elements, the spatial and temporal information that is encoded within this <span class="hlt">ensemble</span>, and how external signals can be used to retrieve this information.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27222203','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27222203"><span>Effects of <span class="hlt">ensembles</span> on methane hydrate nucleation kinetics.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zhang, Zhengcai; Liu, Chan-Juan; Walsh, Matthew R; Guo, Guang-Jun</p> <p>2016-06-21</p> <p>By performing molecular dynamics simulations to form a hydrate with a methane nano-bubble in liquid water at 250 K and 50 MPa, we report how different <span class="hlt">ensembles</span>, such as the NPT, NVT, and NVE <span class="hlt">ensembles</span>, affect the nucleation kinetics of the methane hydrate. The nucleation trajectories are monitored using the face-saturated incomplete cage analysis (FSICA) and the mutually coordinated guest (MCG) order parameter (OP). The nucleation rate and the critical nucleus are obtained using the mean first-passage time (MFPT) method based on the FS cages and the MCG-1 OPs, respectively. The fitting results of MFPT show that hydrate nucleation and growth are coupled together, consistent with the cage adsorption hypothesis which emphasizes that the cage adsorption of methane is a mechanism for both hydrate nucleation and growth. For the three different <span class="hlt">ensembles</span>, the hydrate nucleation rate is quantitatively ordered as follows: NPT > NVT > NVE, while the sequence of hydrate crystallinity is exactly reversed. 
However, the largest size of the critical nucleus appears in the NVT <span class="hlt">ensemble</span>, rather than in the NVE <span class="hlt">ensemble</span>. These results are helpful for choosing a suitable <span class="hlt">ensemble</span> when studying hydrate formation via computer simulations, and emphasize the importance of the degree of order of the critical nucleus.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018InvPr..34e5009C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018InvPr..34e5009C"><span>Parameterizations for <span class="hlt">ensemble</span> Kalman inversion</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chada, Neil K.; Iglesias, Marco A.; Roininen, Lassi; Stuart, Andrew M.</p> <p>2018-05-01</p> <p>The use of <span class="hlt">ensemble</span> methods to solve inverse problems is attractive because it is a derivative-free methodology which is also well-adapted to parallelization. In its basic iterative form the method produces an <span class="hlt">ensemble</span> of solutions which lie in the linear span of the initial <span class="hlt">ensemble</span>. Choice of the parameterization of the unknown field is thus a key component of the success of the method. We demonstrate how both geometric ideas and hierarchical ideas can be used to design effective parameterizations for a number of applied inverse problems arising in electrical impedance tomography, groundwater flow and source inversion. In particular we show how geometric ideas, including the level set method, can be used to reconstruct piecewise continuous fields, and we show how hierarchical methods can be used to learn key parameters in continuous fields, such as length-scales, resulting in improved reconstructions. 
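The basic iterative form of ensemble Kalman inversion described in this record can be sketched as follows. This is a minimal sketch for a toy linear inverse problem, not the paper's implementation; the `forward` map, matrix `A`, and ensemble sizes are illustrative assumptions.

```python
import numpy as np

def eki_step(U, y, forward, gamma):
    """One ensemble Kalman inversion update.

    U: (J, d) ensemble of parameter vectors.
    y: (m,) observed data.
    forward: maps a parameter vector to predicted data of shape (m,).
    gamma: (m, m) observation-noise covariance.
    """
    G = np.array([forward(u) for u in U])        # (J, m) ensemble predictions
    du = U - U.mean(axis=0)                      # parameter deviations
    dg = G - G.mean(axis=0)                      # prediction deviations
    J = len(U)
    C_ug = du.T @ dg / (J - 1)                   # cross-covariance (d, m)
    C_gg = dg.T @ dg / (J - 1)                   # prediction covariance (m, m)
    K = C_ug @ np.linalg.inv(C_gg + gamma)       # Kalman-type gain (d, m)
    return U + (y - G) @ K.T                     # updated ensemble (J, d)

# Toy linear inverse problem: recover u_true from y = A @ u_true.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 3))
u_true = np.array([1.0, -2.0, 0.5])
y = A @ u_true
U = rng.normal(size=(40, 3))                     # initial ensemble
for _ in range(20):
    U = eki_step(U, y, lambda u: A @ u, 0.01 * np.eye(5))
print(np.round(U.mean(axis=0), 2))               # ensemble-mean estimate
```

As the abstract notes, the iterates stay in the linear span of the initial ensemble, which is why the parameterization of the unknown field matters so much.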
Geometric and hierarchical ideas are combined in the level set method to find piecewise constant reconstructions with interfaces of unknown topology.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_24 --> <div id="page_25" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="481"> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.osti.gov/biblio/21583318-crossover-ensembles-random-matrices-skew-orthogonal-polynomials','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/21583318-crossover-ensembles-random-matrices-skew-orthogonal-polynomials"><span>Crossover <span class="hlt">ensembles</span> of random matrices and skew-orthogonal polynomials</span></a></p> <p><a target="_blank" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Kumar, Santosh, E-mail: skumar.physics@gmail.com; Pandey, Akhilesh, 
E-mail: ap0700@mail.jnu.ac.in</p> <p>2011-08-15</p> <p>Highlights: > We study crossover <span class="hlt">ensembles</span> of Jacobi family of random matrices. > We consider correlations for orthogonal-unitary and symplectic-unitary crossovers. > We use the method of skew-orthogonal polynomials and quaternion determinants. > We prove universality of spectral correlations in crossover <span class="hlt">ensembles</span>. > We discuss applications to quantum conductance and communication theory problems. - Abstract: In a recent paper (S. Kumar, A. Pandey, Phys. Rev. E, 79, 2009, p. 026211) we considered Jacobi family (including Laguerre and Gaussian cases) of random matrix <span class="hlt">ensembles</span> and reported exact solutions of crossover problems involving time-reversal symmetry breaking. In the present paper we give more details of the work. We start with Dyson's Brownian motion description of random matrix <span class="hlt">ensembles</span> and obtain universal hierarchic relations among the unfolded correlation functions. For arbitrary dimensions we derive the joint probability density (jpd) of eigenvalues for all transitions leading to unitary <span class="hlt">ensembles</span> as equilibrium <span class="hlt">ensembles</span>. We focus on the orthogonal-unitary and symplectic-unitary crossovers and give generic expressions for jpd of eigenvalues, two-point kernels and n-level correlation functions. This involves generalization of the theory of skew-orthogonal polynomials to crossover <span class="hlt">ensembles</span>. We also consider crossovers in the circular <span class="hlt">ensembles</span> to show the generality of our method. In the large dimensionality limit, correlations in spectra with arbitrary initial density are shown to be universal when expressed in terms of a rescaled symmetry breaking parameter. 
Applications of our crossover results to communication theory and quantum conductance problems are also briefly discussed.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28862923','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28862923"><span>Representing Color <span class="hlt">Ensembles</span>.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Chetverikov, Andrey; Campana, Gianluca; Kristjánsson, Árni</p> <p>2017-10-01</p> <p>Colors are rarely uniform, yet little is known about how people represent color distributions. We introduce a new method for studying color <span class="hlt">ensembles</span> based on intertrial learning in visual search. Participants looked for an oddly colored diamond among diamonds with colors taken from either uniform or Gaussian color distributions. On test trials, the targets had various distances in feature space from the mean of the preceding distractor color distribution. Targets on test trials therefore served as probes into probabilistic representations of distractor colors. Test-trial response times revealed a striking similarity between the physical distribution of colors and their internal representations. 
The results demonstrate that the visual system represents color <span class="hlt">ensembles</span> in a more detailed way than previously thought, coding not only mean and variance but, most surprisingly, the actual shape (uniform or Gaussian) of the distribution of colors in the environment.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=339384&Lab=NERL&keyword=One+AND+case+AND+study+AND+approach&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50','EPA-EIMS'); return false;" href="https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=339384&Lab=NERL&keyword=One+AND+case+AND+study+AND+approach&actType=&TIMSType=+&TIMSSubTypeID=&DEID=&epaNumber=&ntisID=&archiveStatus=Both&ombCat=Any&dateBeginCreated=&dateEndCreated=&dateBeginPublishedPresented=&dateEndPublishedPresented=&dateBeginUpdated=&dateEndUpdated=&dateBeginCompleted=&dateEndCompleted=&personID=&role=Any&journalID=&publisherID=&sortBy=revisionDate&count=50"><span>Examining the Effects of Mosaic Land Cover on Extreme Events in Historical <span class="hlt">Downscaled</span> WRF Simulations</span></a></p> <p><a target="_blank" href="http://oaspub.epa.gov/eims/query.page">EPA Science Inventory</a></p> <p></p> <p></p> <p>The representation of land use and land cover (hereby referred to as “LU”) is a challenging aspect of dynamically <span class="hlt">downscaled</span> simulations, as a mesoscale model that is utilized as a regional climate model (RCM) may be limited in its ability to represent LU over multi-d...</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1175481','DOE-PATENT-XML'); return false;" 
href="https://www.osti.gov/servlets/purl/1175481"><span>Creating <span class="hlt">ensembles</span> of decision trees through sampling</span></a></p> <p><a target="_blank" href="http://www.osti.gov/doepatents">DOEpatents</a></p> <p>Kamath, Chandrika; Cantu-Paz, Erick</p> <p>2005-08-30</p> <p>A system for decision tree <span class="hlt">ensembles</span> that includes a module to read the data, a module to sort the data, a module to evaluate a potential split of the data according to some criterion using a random sample of the data, a module to split the data, and a module to combine multiple decision trees in <span class="hlt">ensembles</span>. The decision tree method is based on statistical sampling techniques and includes the steps of reading the data; sorting the data; evaluating a potential split according to some criterion using a random sample of the data, splitting the data, and combining multiple decision trees in <span class="hlt">ensembles</span>.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24024194','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24024194"><span>An efficient <span class="hlt">ensemble</span> learning method for gene microarray classification.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Osareh, Alireza; Shadgar, Bita</p> <p>2013-01-01</p> <p>The gene microarray analysis and classification have demonstrated an effective way for the effective diagnosis of diseases and cancers. However, it has been also revealed that the basic classification techniques have intrinsic drawbacks in achieving accurate gene classification and cancer diagnosis. On the other hand, classifier <span class="hlt">ensembles</span> have received increasing attention in various applications. Here, we address the gene classification issue using RotBoost <span class="hlt">ensemble</span> methodology. 
This method is a combination of Rotation Forest and AdaBoost techniques which in turn preserve both desirable features of an <span class="hlt">ensemble</span> architecture, that is, accuracy and diversity. To select a concise subset of informative genes, 5 different feature selection algorithms are considered. To assess the efficiency of the RotBoost, other nonensemble/<span class="hlt">ensemble</span> techniques including Decision Trees, Support Vector Machines, Rotation Forest, AdaBoost, and Bagging are also deployed. Experimental results have revealed that the combination of the fast correlation-based feature selection method with ICA-based RotBoost <span class="hlt">ensemble</span> is highly effective for gene classification. In fact, the proposed method can create <span class="hlt">ensemble</span> classifiers which outperform not only the classifiers produced by the conventional machine learning but also the classifiers generated by two widely used conventional <span class="hlt">ensemble</span> learning methods, that is, Bagging and AdaBoost.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27557880','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27557880"><span>An <span class="hlt">ensemble</span> framework for identifying essential proteins.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zhang, Xue; Xiao, Wangxin; Acencio, Marcio Luis; Lemke, Ney; Wang, Xujing</p> <p>2016-08-25</p> <p>Many centrality measures have been proposed to mine and characterize the correlations between network topological properties and protein essentiality. However, most of them show limited prediction accuracy, and the number of common predicted essential proteins by different methods is very small. 
In this paper, an <span class="hlt">ensemble</span> framework is proposed which integrates gene expression data and protein-protein interaction networks (PINs). It aims to improve the prediction accuracy of basic centrality measures. The idea behind this <span class="hlt">ensemble</span> framework is that different protein-protein interactions (PPIs) may show different contributions to protein essentiality. Five standard centrality measures (degree centrality, betweenness centrality, closeness centrality, eigenvector centrality, and subgraph centrality) are integrated into the <span class="hlt">ensemble</span> framework respectively. We evaluated the performance of the proposed <span class="hlt">ensemble</span> framework using yeast PINs and gene expression data. The results show that it can considerably improve the prediction accuracy of the five centrality measures individually. It can also remarkably increase the number of common predicted essential proteins among those predicted by each centrality measure individually and enable each centrality measure to find more low-degree essential proteins. This paper demonstrates that it is valuable to differentiate the contributions of different PPIs for identifying essential proteins based on network topological characteristics. 
The proposed <span class="hlt">ensemble</span> framework is a successful paradigm to this end.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4415763','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4415763"><span>A Bayesian <span class="hlt">Ensemble</span> Approach for Epidemiological Projections</span></a></p> <p><a target="_blank" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Lindström, Tom; Tildesley, Michael; Webb, Colleen</p> <p>2015-01-01</p> <p>Mathematical models are powerful tools for epidemiology and can be used to compare control actions. However, different models and model parameterizations may provide different prediction of outcomes. In other fields of research, <span class="hlt">ensemble</span> modeling has been used to combine multiple projections. We explore the possibility of applying such methods to epidemiology by adapting Bayesian techniques developed for climate forecasting. We exemplify the implementation with single model <span class="hlt">ensembles</span> based on different parameterizations of the Warwick model run for the 2001 United Kingdom foot and mouth disease outbreak and compare the efficacy of different control actions. This allows us to investigate the effect that discrepancy among projections based on different modeling assumptions has on the <span class="hlt">ensemble</span> prediction. A sensitivity analysis showed that the choice of prior can have a pronounced effect on the posterior estimates of quantities of interest, in particular for <span class="hlt">ensembles</span> with large discrepancy among projections. However, by using a hierarchical extension of the method we show that prior sensitivity can be circumvented. 
We further extend the method to include a priori beliefs about different modeling assumptions and demonstrate that the effect of this can have different consequences depending on the discrepancy among projections. We propose that the method is a promising analytical tool for <span class="hlt">ensemble</span> modeling of disease outbreaks. PMID:25927892</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015JHyd..524..789H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015JHyd..524..789H"><span><span class="hlt">Ensemble</span> Bayesian forecasting system Part I: Theory and algorithms</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Herr, Henry D.; Krzysztofowicz, Roman</p> <p>2015-05-01</p> <p>The <span class="hlt">ensemble</span> Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input <span class="hlt">ensemble</span> forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). 
It works as a Monte Carlo simulator: an <span class="hlt">ensemble</span> of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an <span class="hlt">ensemble</span> of time series of outputs, which is next transformed stochastically by the HUP into an <span class="hlt">ensemble</span> of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the <span class="hlt">ensemble</span> size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the <span class="hlt">ensemble</span> Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple <span class="hlt">ensemble</span> members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input <span class="hlt">ensemble</span> and makes it operationally feasible to generate a Bayesian <span class="hlt">ensemble</span> forecast of large size. 
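The Monte Carlo structure just described (input ensemble → deterministic hydrologic model → stochastic hydrologic uncertainty processor) can be sketched as follows. The `hydro_model` function and the additive-noise stand-in for the HUP are illustrative assumptions, not the operational EBFS components.

```python
import numpy as np

rng = np.random.default_rng(42)

def hydro_model(precip):
    # Stand-in deterministic hydrologic model: maps a precipitation
    # series to a runoff-like series (illustrative only).
    return 0.6 * np.cumsum(precip) / (1 + np.arange(precip.size))

n_members, n_steps = 500, 30
# IEF stand-in: ensemble of input time series (precipitation amounts).
inputs = rng.gamma(shape=2.0, scale=3.0, size=(n_members, n_steps))
# Deterministic transformation of each input member.
outputs = np.array([hydro_model(p) for p in inputs])
# HUP stand-in: add hydrologic (non-input) uncertainty stochastically.
predictands = outputs + rng.normal(0.0, 0.2, size=outputs.shape)
# Ensemble quantiles give the probabilistic forecast at each time step.
q10, q50, q90 = np.quantile(predictands, [0.1, 0.5, 0.9], axis=0)
print(q50.shape)
```

The need for hundreds of members per forecast, as noted above, is exactly why rerunning the hydrologic model for every member becomes infeasible and motivates the analytic randomization in the EBFSR.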
Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018GMD....11..453Q','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018GMD....11..453Q"><span>Online dynamical <span class="hlt">downscaling</span> of temperature and precipitation within the iLOVECLIM model (version 1.1)</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Quiquet, Aurélien; Roche, Didier M.; Dumas, Christophe; Paillard, Didier</p> <p>2018-02-01</p> <p>This paper presents the inclusion of an online dynamical <span class="hlt">downscaling</span> of temperature and precipitation within the model of intermediate complexity iLOVECLIM v1.1. We describe the following methodology to generate temperature and precipitation fields on a 40 km × 40 km Cartesian grid of the Northern Hemisphere from the T21 native atmospheric model grid. Our scheme is not grid specific and conserves energy and moisture in the same way as the original climate model. We show that we are able to generate a high-resolution field which presents a spatial variability in better agreement with the observations compared to the standard model. Although the large-scale model biases are not corrected, for selected model parameters, the <span class="hlt">downscaling</span> can induce a better overall performance compared to the standard version on both the high-resolution grid and on the native grid. 
Foreseen applications of this new model feature include the improvement of ice sheet model coupling and high-resolution land surface models.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009EGUGA..11..967H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009EGUGA..11..967H"><span>Empirical <span class="hlt">downscaling</span> of atmospheric key variables above a tropical glacier surface (Cordillera Blanca, Peru)</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hofer, M.; Kaser, G.; Mölg, T.; Juen, I.; Wagnon, P.</p> <p>2009-04-01</p> <p>Glaciers in the outer tropical Cordillera Blanca (Peru, South America) are of major socio-economic importance, since glacier runoff represents the primary water source during the dry season, when little or no rainfall occurs. Due to their location at high elevations, the glaciers moreover provide important information about climate change in the tropical troposphere, where measurements are sparse. This study targets the local reconstruction of air temperature, specific humidity and wind speed above the surface of an outer tropical glacier from NCEP/NCAR reanalysis data as large scale predictors. Since a farther scope is to provide input data for process based glacier mass balance modelling, the reconstruction pursues a high temporal resolution. Hence an empirical <span class="hlt">downscaling</span> scheme is developed, based on a few years' time series of hourly observations from automatic weather stations, located at the glacier Artesonraju and nearby moraines (Northern Cordillera Blanca). Principal component and multiple regression analyses are applied to define the appropriate spatial <span class="hlt">downscaling</span> domain, suitable predictor variables, and the statistical transfer functions. The model performance is verified using an independent data set. 
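The empirical scheme described above, principal components of large-scale predictors feeding multiple-regression transfer functions verified on held-out data, can be sketched with scikit-learn. The synthetic reanalysis-style predictor array and local observation series are illustrative assumptions, not the Cordillera Blanca data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)

# Synthetic stand-ins: hourly large-scale predictors (e.g. reanalysis
# temperature/humidity at several grid points) and a local observation.
n_hours, n_gridpoints = 2000, 12
X = rng.normal(size=(n_hours, n_gridpoints))
true_w = rng.normal(size=n_gridpoints)
y = X @ true_w + 0.1 * rng.normal(size=n_hours)   # local air temperature

# Split into a training period and an independent verification period.
X_train, X_test = X[:1500], X[1500:]
y_train, y_test = y[:1500], y[1500:]

# PCA condenses the predictor field; the regression on the leading
# components is the statistical transfer function.
model = make_pipeline(PCA(n_components=8), LinearRegression())
model.fit(X_train, y_train)
r2 = model.score(X_test, y_test)   # explained variance on held-out data
print(round(r2, 2))
```

The held-out score plays the role of the independent-data verification mentioned in the abstract: a low r² (as the authors found for wind speed) signals that the large-scale predictors carry little information about the local variable.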
The best predictors are lower tropospheric air temperature and specific humidity, at reanalysis model grid points that represent the Bolivian Altiplano, located to the south of the Cordillera Blanca. The developed <span class="hlt">downscaling</span> model explains a considerable portion (more than 60%) of the diurnal variance of air temperature and specific humidity at the moraine stations, and air temperature above the glacier surface. Specific humidity above the glacier surface, however, can be reconstructed well at seasonal, but not at the required diurnal, time resolution. Wind speed can only be poorly determined by the large-scale predictors (r² lower than 0.3) at both sites. We assume a complex local interaction between the valley and glacier wind systems to be the main</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014GMDD....7.7525M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014GMDD....7.7525M"><span>Reduction of predictive uncertainty in estimating irrigation water requirement through multi-model <span class="hlt">ensembles</span> and <span class="hlt">ensemble</span> averaging</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Multsch, S.; Exbrayat, J.-F.; Kirby, M.; Viney, N. R.; Frede, H.-G.; Breuer, L.</p> <p>2014-11-01</p> <p>Irrigation agriculture plays an increasingly important role in food supply. Many evapotranspiration models are used today to estimate the water demand for irrigation. They consider different stages of crop growth by empirical crop coefficients to adapt evapotranspiration throughout the vegetation period. We investigate the importance of the model structural vs. 
model parametric uncertainty for irrigation simulations by considering six evapotranspiration models and five crop coefficient sets to estimate irrigation water requirements for growing wheat in the Murray-Darling Basin, Australia. The study is carried out using the spatial decision support system SPARE:WATER. We find that structural model uncertainty is far more important than model parametric uncertainty for estimating irrigation water requirements. Using the Reliability <span class="hlt">Ensemble</span> Averaging (REA) technique, we are able to reduce the overall predictive model uncertainty by more than 10%. The exceedance probability curve of irrigation water requirements shows that a certain threshold, e.g. an irrigation water limit due to water rights of 400 mm, would be exceeded less frequently for the REA <span class="hlt">ensemble</span> average (45%) than for the equally weighted <span class="hlt">ensemble</span> average (66%). We conclude that multi-model <span class="hlt">ensemble</span> predictions and sophisticated model averaging techniques are helpful in predicting irrigation demand and provide relevant information for decision making.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1980STIN...8032586A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1980STIN...8032586A"><span>Project fires. Volume 2: Protective <span class="hlt">ensemble</span> performance standards, phase 1B</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Abeles, F. J.</p> <p>1980-05-01</p> <p>The design of the prototype protective <span class="hlt">ensemble</span> was finalized. Prototype <span class="hlt">ensembles</span> were fabricated and then subjected to a series of qualification tests which were based upon the protective <span class="hlt">ensemble</span> performance standards (PEPS) requirements. 
Engineering drawings and purchase specifications were prepared for the new protective <span class="hlt">ensemble</span>.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20120016072','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20120016072"><span>Two Topics in Seasonal Streamflow Forecasting: Soil Moisture Initialization Error and Precipitation <span class="hlt">Downscaling</span></span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Koster, Randal; Walker, Greg; Mahanama, Sarith; Reichle, Rolf</p> <p>2012-01-01</p> <p>Continental-scale offline simulations with a land surface model are used to address two important issues in the forecasting of large-scale seasonal streamflow: (i) the extent to which errors in soil moisture initialization degrade streamflow forecasts, and (ii) the extent to which the <span class="hlt">downscaling</span> of seasonal precipitation forecasts, if it could be done accurately, would improve streamflow forecasts. The reduction in streamflow forecast skill (with forecasted streamflow measured against observations) associated with adding noise to a soil moisture field is found to be, to first order, proportional to the average reduction in the accuracy of the soil moisture field itself. This result has implications for streamflow forecast improvement under satellite-based soil moisture measurement programs. In the second and more idealized ("perfect model") analysis, precipitation <span class="hlt">downscaling</span> is found to have an impact on large-scale streamflow forecasts only if two conditions are met: (i) evaporation variance is significant relative to the precipitation variance, and (ii) the subgrid spatial variance of precipitation is adequately large. 
In the large-scale continental region studied (the conterminous United States), these two conditions are met in only a somewhat limited area.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://www.emc.ncep.noaa.gov/gmb/ens','SCIGOVWS'); return false;" href="http://www.emc.ncep.noaa.gov/gmb/ens"><span>NCEP <span class="hlt">ENSEMBLE</span> PRODUCTS</span></a></p> <p><a target="_blank" href="http://www.science.gov/aboutsearch.html">Science.gov Websites</a></p> <p></p> <p></p> <p>://www.emc.ncep.noaa.gov/GEFS/ The links to the various Forecast Plots are listed under <em>Experimental</em> Data on the new GEFS page. This NCEP <span class="hlt">Ensemble</span> Home Page is a collection of <em>experimental</em> analysis and forecast products</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018JNEng..15c6006S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018JNEng..15c6006S"><span>Modeling task-specific neuronal <span class="hlt">ensembles</span> improves decoding of grasp</span></a></p> <p><a target="_blank" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Smith, Ryan J.; Soares, Alcimar B.; Rouse, Adam G.; Schieber, Marc H.; Thakor, Nitish V.</p> <p>2018-06-01</p> <p>Objective. Dexterous movement involves the activation and coordination of networks of neuronal populations across multiple cortical regions. Attempts to model firing of individual neurons commonly treat the firing rate as directly modulating with motor behavior. However, motor behavior may additionally be associated with modulations in the activity and functional connectivity of neurons in a broader <span class="hlt">ensemble</span>. Accounting for variations in neural <span class="hlt">ensemble</span> connectivity may provide additional information about the behavior being performed. Approach. 
In this study, we examined neural <span class="hlt">ensemble</span> activity in primary motor cortex (M1) and premotor cortex (PM) of two male rhesus monkeys during performance of a center-out reach, grasp and manipulate task. We constructed point process encoding models of neuronal firing that incorporated task-specific variations in the baseline firing rate as well as variations in functional connectivity with the neural <span class="hlt">ensemble</span>. Models were evaluated both in terms of their encoding capabilities and their ability to properly classify the grasp being performed. Main results. Task-specific <span class="hlt">ensemble</span> models correctly predicted the performed grasp with over 95% accuracy and were shown to outperform models of neuronal activity that assume only a variable baseline firing rate. Task-specific <span class="hlt">ensemble</span> models exhibited superior decoding performance in 82% of units in both monkeys (p  <  0.01). Inclusion of <span class="hlt">ensemble</span> activity also broadly improved the ability of models to describe observed spiking. Encoding performance of task-specific <span class="hlt">ensemble</span> models, measured by spike timing predictability, improved upon baseline models in 62% of units. Significance. 
These results suggest that additional discriminative information about motor behavior found in the variations in functional connectivity of neuronal <span class="hlt">ensembles</span> located in motor-related cortical regions is relevant to decode complex tasks such as grasping objects, and may serve as the basis for more</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70197124','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70197124"><span>Do <span class="hlt">downscaled</span> general circulation models reliably simulate historical climatic conditions?</span></a></p> <p><a target="_blank" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Bock, Andrew R.; Hay, Lauren E.; McCabe, Gregory J.; Markstrom, Steven L.; Atkinson, R. Dwight</p> <p>2018-01-01</p> <p>The accuracy of statistically <span class="hlt">downscaled</span> (SD) general circulation model (GCM) simulations of monthly surface climate for historical conditions (1950–2005) was assessed for the conterminous United States (CONUS). The SD monthly precipitation (PPT) and temperature (TAVE) from 95 GCMs from phases 3 and 5 of the Coupled Model Intercomparison Project (CMIP3 and CMIP5) were used as inputs to a monthly water balance model (MWBM). Distributions of MWBM input (PPT and TAVE) and output [runoff (RUN)] variables derived from gridded station data (GSD) and historical SD climate were compared using the Kolmogorov–Smirnov (KS) test. For all three variables considered, the KS test results showed that variables simulated using CMIP5 generally are more reliable than those derived from CMIP3, likely due to improvements in PPT simulations. At most locations across the CONUS, the largest differences between GSD and SD PPT and RUN occurred in the lowest part of the distributions (i.e., low-flow RUN and low-magnitude PPT). 
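A distribution comparison of the kind used in this study, testing downscaled values against gridded station data with the Kolmogorov-Smirnov statistic at the 0.05 level, can be sketched with SciPy. The synthetic monthly-precipitation samples and the "good"/"biased" candidate labels are illustrative assumptions, not the CMIP archives.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Stand-ins for 1950-2005 monthly precipitation at one grid cell:
# "observed" gridded station data vs. two downscaled GCM candidates.
gsd = rng.gamma(shape=2.0, scale=40.0, size=672)        # 56 yr x 12 mo
sd_good = rng.gamma(shape=2.0, scale=40.0, size=672)    # similar climate
sd_biased = rng.gamma(shape=2.0, scale=60.0, size=672)  # wet bias

for name, sd in [("good", sd_good), ("biased", sd_biased)]:
    ks = stats.ks_2samp(gsd, sd)
    # Reject distributional similarity at the 0.05 level when p < 0.05.
    verdict = "reliable" if ks.pvalue >= 0.05 else "rejected"
    print(f"{name}: D={ks.statistic:.3f}, p={ks.pvalue:.3g} -> {verdict}")
```

Applied per grid cell and per variable (PPT, TAVE, RUN), this is the kind of simple screening the authors propose as an initial criterion for GCM selection.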
Results indicate that for the majority of the CONUS, there are <span class="hlt">downscaled</span> GCMs that can reliably simulate historical climatic conditions. However, in some geographic locations, none of the SD GCMs replicated historical conditions for two of the three variables (PPT and RUN) based on the KS test at a significance level of 0.05. In these locations, improved GCM simulations of PPT are needed to more reliably estimate components of the hydrologic cycle. Simple metrics and statistical tests, such as those described here, can provide an initial set of criteria to help simplify GCM selection.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25810748','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25810748"><span>Genetic programming based <span class="hlt">ensemble</span> system for microarray data classification.</span></a></p> <p><a target="_blank" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To</p> <p>2015-01-01</p> <p>Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a new genetic programming (GP) based <span class="hlt">ensemble</span> system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this <span class="hlt">ensemble</span> framework with three operators: Min, Max, and Average. Each individual of the GP is an <span class="hlt">ensemble</span> system, and individuals become increasingly accurate over the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each <span class="hlt">ensemble</span> system. 
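The Min, Max, and Average combination operators named in this abstract can be illustrated outside the GP framework. The sketch below is not code from GPES; it simply applies each operator to made-up per-class probability estimates from three hypothetical base classifiers.

```python
# Combining base-classifier class probabilities with Min / Max / Average
# operators, then predicting the class with the highest combined score.
import numpy as np

def combine(probs, op):
    """probs: (n_classifiers, n_classes) array of class probabilities."""
    if op == "Min":
        return probs.min(axis=0)
    if op == "Max":
        return probs.max(axis=0)
    if op == "Average":
        return probs.mean(axis=0)
    raise ValueError(f"unknown operator: {op}")

# Invented outputs of three base classifiers scoring two classes:
probs = np.array([[0.7, 0.3],
                  [0.6, 0.4],
                  [0.9, 0.1]])
for op in ("Min", "Max", "Average"):
    scores = combine(probs, op)
    print(op, scores, "-> predicted class", int(scores.argmax()))
```

In a GP-evolved ensemble such as the one described, these operators would appear as internal nodes of the program tree, with base classifiers at the leaves.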
The final <span class="hlt">ensemble</span> committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other <span class="hlt">ensemble</span> systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('http://hdl.handle.net/2060/20160007388','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20160007388"><span>Improving Climate Projections Using "Intelligent" <span class="hlt">Ensembles</span></span></a></p> <p><a target="_blank" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Baker, Noel C.; Taylor, Patrick C.</p> <p>2015-01-01</p> <p>Recent changes in the climate system have led to growing concern, especially in communities which are highly vulnerable to resource shortages and weather extremes. There is an urgent need for better climate information to develop solutions and strategies for adapting to a changing climate. Climate models provide excellent tools for studying the current state of climate and making future projections. However, these models are subject to biases created by structural uncertainties. Performance metrics, or the systematic determination of model biases, succinctly quantify aspects of climate model behavior. Efforts to standardize climate model experiments and collect simulation data, such as the Coupled Model Intercomparison Project (CMIP), provide the means to directly compare and assess model performance. Performance metrics have been used to show that some models reproduce present-day climate better than others. 
Simulation data from multiple models are often used to add value to projections by creating a consensus projection from the model <span class="hlt">ensemble</span>, in which each model is given an equal weight. It has been shown that the <span class="hlt">ensemble</span> mean generally outperforms any single model. It is possible to use unequal weights to produce <span class="hlt">ensemble</span> means, in which models are weighted based on performance (called "intelligent" <span class="hlt">ensembles</span>). Can performance metrics be used to improve climate projections? Previous work introduced a framework for comparing the utility of model performance metrics, showing that the best metrics are related to the variance of top-of-atmosphere outgoing longwave radiation. These metrics improve present-day climate simulations of Earth's energy budget using the "intelligent" <span class="hlt">ensemble</span> method. The current project identifies several approaches for testing whether performance metrics can be applied to future simulations to create "intelligent" <span class="hlt">ensemble</span>-mean climate projections. 
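The equal-weight versus performance-weighted ("intelligent") ensemble means described above can be sketched in a few lines. The model projections and skill metrics below are invented placeholders, not CMIP values; the weighting scheme (inverse of an error metric, normalized to sum to one) is one simple choice among many.

```python
# Equal-weight vs. performance-weighted ensemble mean of model projections.
# All numbers are illustrative placeholders.
import numpy as np

projections = np.array([2.1, 2.9, 3.4, 2.6])   # hypothetical per-model projections
errors = np.array([0.8, 0.2, 0.5, 0.4])        # hypothetical present-day error metric

equal_mean = projections.mean()                 # every model weighted equally

weights = 1.0 / errors                          # better-performing models weigh more
weights /= weights.sum()                        # normalize weights to sum to 1
weighted_mean = np.dot(weights, projections)

print(f"equal-weight mean:   {equal_mean:.3f}")
print(f"skill-weighted mean: {weighted_mean:.3f}")
```

Here the skill-weighted mean is pulled toward the projection of the model with the smallest error, which is the intended effect of performance weighting.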
It is shown that certain performance metrics test key climate processes in the models, and</p> </li> <li> <p><a target="_blank" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1291228-phase-selective-entrainment-nonlinear-oscillator-ensembles','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1291228-phase-selective-entrainment-nonlinear-oscillator-ensembles"><span>Phase-selective entrainment of nonlinear oscillator <span class="hlt">ensembles</span></span></a></p> <p><a target="_blank" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Zlotnik, Anatoly V.; Nagao, Raphael; Kiss, Istvan Z.; ...</p> <p>2016-03-18</p> <p>The ability to organize and finely manipulate the hierarchy and timing of dynamic processes is important for understanding and influencing brain functions, sleep and metabolic cycles, and many other natural phenomena. However, establishing spatiotemporal structures in biological oscillator <span class="hlt">ensembles</span> is a challenging task that requires controlling large collections of complex nonlinear dynamical units. In this report, we present a method to design entrainment signals that create stable phase patterns in <span class="hlt">ensembles</span> of heterogeneous nonlinear oscillators without using state feedback information. We demonstrate the approach using experiments with electrochemical reactions on multielectrode arrays, in which we selectively assign <span class="hlt">ensemble</span> subgroups into spatiotemporal patterns with multiple phase clusters. 
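The core entrainment effect in this record, a single periodic input imprinting distinct phase offsets on oscillators with different natural frequencies, can be sketched with a toy forced phase model. This is not the paper's electrochemical system or control design; it is a minimal illustration in which an oscillator with phase dynamics d&theta;/dt = &omega; + K sin(&Omega;t - &theta;) locks to the forcing at a frequency-dependent offset sin&psi;* = (&Omega; - &omega;)/K.

```python
# Toy forced phase oscillator: each natural frequency locks to the common
# input at its own phase offset, so one signal creates a phase pattern
# across a heterogeneous ensemble. Parameters are illustrative.
import math

def entrained_offset(omega, Omega=1.0, K=0.5, dt=0.01, steps=100_000):
    """Euler-integrate d(theta)/dt = omega + K*sin(Omega*t - theta) and
    return the final forcing-relative phase psi = Omega*t - theta (mod 2*pi)."""
    theta, t = 0.0, 0.0
    for _ in range(steps):
        theta += dt * (omega + K * math.sin(Omega * t - theta))
        t += dt
    return (Omega * t - theta) % (2 * math.pi)

slow = entrained_offset(omega=0.9)   # natural frequency below the forcing
fast = entrained_offset(omega=1.1)   # natural frequency above the forcing
print(f"slow-oscillator offset: {slow:.3f} rad")   # near asin(0.2)
print(f"fast-oscillator offset: {fast:.3f} rad")   # near 2*pi - asin(0.2)
```

Both oscillators end up locked to the input frequency &Omega; but at different fixed offsets, which is the mechanism that lets a common entrainment signal encode a spatial phase pattern in an ensemble.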
As a result, the experimentally confirmed mechanism elucidates the connection between the phases and natural frequencies of a collection of dynamical elements, the spatial and temporal information that is encoded within this <span class="hlt">ensemble</span>, and how external signals can be used to retrieve this information.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_25 --> </div><!-- container --> </body> </html>