Sample records for high resolution ensemble

  1. Machine Learning Predictions of a Multiresolution Climate Model Ensemble

    NASA Astrophysics Data System (ADS)

    Anderson, Gemma J.; Lucas, Donald D.

    2018-05-01

Statistical models of high-resolution climate models are useful for many purposes, including sensitivity and uncertainty analyses, but building them can be computationally prohibitive. We generate a unique multiresolution perturbed-parameter ensemble of a global climate model, and use a novel application of a machine learning technique known as random forests to train a statistical model on the ensemble that makes high-resolution model predictions of two important quantities: global mean top-of-atmosphere energy flux and precipitation. The random forests leverage cheaper low-resolution simulations, greatly reducing the number of high-resolution simulations required to train the statistical model. We demonstrate that high-resolution predictions of these quantities can be obtained by training on an ensemble that includes only a small number of high-resolution simulations. We also find that global annually averaged precipitation is more sensitive to resolution changes than to any of the model parameters considered.
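The resolution-as-a-feature idea described in this abstract can be sketched as follows. This is a toy illustration, not the authors' code: the synthetic response function, parameter counts, and run counts are all invented, with resolution encoded as an extra input so that many cheap low-resolution runs and a few high-resolution runs train a single random forest.

```python
# Sketch: emulating a multiresolution perturbed-parameter ensemble with a
# random forest, treating resolution as an input feature (all data synthetic).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def fake_flux(params, res):
    # Hypothetical model response with a resolution-dependent offset.
    return params @ np.array([1.5, -0.7, 0.3]) + 0.5 * res

# Hypothetical design: 200 low-res runs, only 20 high-res runs, 3 parameters.
X_lo = rng.uniform(0, 1, (200, 3)); res_lo = np.zeros(200)
X_hi = rng.uniform(0, 1, (20, 3));  res_hi = np.ones(20)

X = np.column_stack([np.vstack([X_lo, X_hi]),
                     np.concatenate([res_lo, res_hi])])   # last column = resolution flag
y = np.array([fake_flux(p, r) for p, r in zip(X[:, :3], X[:, 3])])

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Predict the high-resolution response at new parameter settings.
X_new = np.column_stack([rng.uniform(0, 1, (5, 3)), np.ones(5)])
pred = rf.predict(X_new)
```

The forest can then be queried at high-resolution settings far more cheaply than running the climate model itself.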

  2. Multi-Resolution Climate Ensemble Parameter Analysis with Nested Parallel Coordinates Plots.

    PubMed

    Wang, Junpeng; Liu, Xiaotong; Shen, Han-Wei; Lin, Guang

    2017-01-01

Due to the uncertain nature of weather prediction, climate simulations are usually performed multiple times with different spatial resolutions, and the outputs are multi-resolution spatiotemporal ensembles. Each simulation run uses a unique set of values for multiple convective parameters, and the distinct parameter settings from runs at different resolutions constitute a multi-resolution, high-dimensional parameter space. Understanding the correlations between the convective parameters, and establishing a connection between the parameter settings and the ensemble outputs, are crucial to domain scientists. This multi-resolution, high-dimensional parameter space, however, presents a unique challenge to existing correlation visualization techniques. We present the Nested Parallel Coordinates Plot (NPCP), a new type of parallel coordinates plot that enables visualization of intra-resolution and inter-resolution parameter correlations. With flexible user control, NPCP integrates superimposition, juxtaposition, and explicit encodings in a single view for comparative data visualization and analysis. We develop an integrated visual analytics system to help domain scientists understand the connection between multi-resolution convective parameters and the large spatiotemporal ensembles. Our system presents intricate climate ensembles with a comprehensive overview and on-demand geographic details. We demonstrate NPCP, along with the climate ensemble visualization system, on real-world use cases from our collaborators in computational and predictive science.

  3. Ensemble flood simulation for a small dam catchment in Japan using 10 and 2 km resolution nonhydrostatic model rainfalls

    NASA Astrophysics Data System (ADS)

    Kobayashi, Kenichiro; Otsuka, Shigenori; Apip; Saito, Kazuo

    2016-08-01

This paper presents a study on short-term ensemble flood forecasting for small dam catchments in Japan. Numerical ensemble simulations of rainfall from the Japan Meteorological Agency nonhydrostatic model (JMA-NHM) are used as input to a rainfall-runoff model for predicting river discharge into a dam. The ensemble weather simulations are performed at a conventional 10 km and a high-resolution 2 km spatial resolution. A distributed rainfall-runoff model is constructed for the Kasahori dam catchment (approx. 70 km2) and driven with the ensemble rainfalls. The results show that the hourly maximum and cumulative catchment-average rainfalls of the 2 km resolution JMA-NHM ensemble simulation are more appropriate than the 10 km resolution rainfalls. All simulated inflows based on the 2 and 10 km rainfalls exceed the flood discharge of 140 m3 s-1, a threshold value for flood control. The inflows with the 10 km resolution ensemble rainfall are all considerably smaller than the observations, while at least one simulated discharge out of the 11 ensemble members with the 2 km resolution rainfalls reproduces the first peak of the inflow at the Kasahori dam with an amplitude similar to observations, although there are spatiotemporal lags between simulation and observation. To account for positional lags in the ensemble discharge simulation, the rainfall distribution in each ensemble member is shifted so that the catchment-averaged cumulative rainfall over the Kasahori dam catchment is maximized. The runoff simulation with the position-shifted rainfalls shows much better results than the original ensemble discharge simulations.
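The lag-correction step at the end of this abstract can be sketched as a brute-force search: shift each member's cumulative rainfall field until the catchment-averaged rainfall is maximized. The field, catchment mask, and search radius below are invented for illustration; the paper's actual grids and shift limits may differ.

```python
# Sketch: position-shifting a rainfall field to maximize catchment-average
# cumulative rainfall, as in the paper's lag-correction step (toy data).
import numpy as np

def best_shift(cum_rain, mask, max_shift=5):
    """Find the (dy, dx) grid shift maximizing mean cumulative rainfall
    over the catchment mask."""
    best, best_val = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(cum_rain, dy, axis=0), dx, axis=1)
            val = shifted[mask].mean()
            if val > best_val:
                best, best_val = (dy, dx), val
    return best, best_val

rain = np.zeros((20, 20)); rain[4:8, 4:8] = 10.0      # rain cell NW of catchment
mask = np.zeros((20, 20), bool); mask[8:12, 8:12] = True
(dy, dx), val = best_shift(rain, mask)
```

Here the simulated rain cell sits one cell-width northwest of the catchment, so the search recovers a (4, 4) shift that places it directly over the catchment.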

  4. Assessment of Multiple Daily Precipitation Statistics in ERA-Interim Driven Med-CORDEX and EURO-CORDEX Experiments Against High Resolution Observations

    NASA Astrophysics Data System (ADS)

    Coppola, E.; Fantini, A.; Raffaele, F.; Torma, C. Z.; Bacer, S.; Giorgi, F.; Ahrens, B.; Dubois, C.; Sanchez, E.; Verdecchia, M.

    2017-12-01

We assess the statistics of different daily precipitation indices in ensembles of Med-CORDEX and EURO-CORDEX experiments at high resolution (grid spacing of ≈0.11°, or RCM11) and medium resolution (grid spacing of ≈0.44°, or RCM44) with regional climate models (RCMs) driven by the ERA-Interim reanalysis for the period 1989-2008. The assessment is carried out by comparison with a set of high resolution observational datasets for 9 European subregions. The statistics analyzed include quantitative metrics for mean precipitation, daily precipitation Probability Density Functions (PDFs), daily precipitation intensity, frequency, 95th percentile, and 95th percentile of dry spell length. We assess both an ensemble including all Med-CORDEX and EURO-CORDEX models and one including the Med-CORDEX models alone. For the All Models ensembles, the RCM11 ensemble shows a remarkable performance in reproducing the spatial patterns and seasonal cycle of mean precipitation over all regions, with a consistent and marked improvement compared to the RCM44 ensemble and the ERA-Interim reanalysis. Good consistency with observations by the RCM11 ensemble (and a substantial improvement compared to RCM44 and ERA-Interim) is found also for the daily precipitation PDFs, mean intensity and, to a lesser extent, the 95th percentile. However, for some regions the RCM11 ensemble overestimates the occurrence of very high intensity events, while for one region the models underestimate the occurrence of the largest extremes. The RCM11 ensemble still shows a general tendency to underestimate the dry day frequency and the 95th percentile of dry spell length over wetter regions, with only a marginal improvement compared to the lower resolution models. This indicates that the problem of excessive production of low precipitation events found in many climate models persists even at relatively high resolutions, at least in wet climate regimes.
Concerning the Med-CORDEX model ensembles, we find that their performance over the Mediterranean regions analyzed is comparable to that of the All Models ensembles. Finally, we stress the need for consistent, quality-checked fine-scale observational datasets for the assessment of RCMs run at increasingly high horizontal resolutions.
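The daily precipitation statistics assessed in this record can be computed straightforwardly from a daily series. The sketch below uses a 1 mm/day wet-day threshold and a made-up ten-day series; the paper's exact thresholds and index definitions may differ.

```python
# Sketch: daily precipitation indices (mean, wet-day frequency, intensity,
# 95th percentile of wet days, longest dry spell) for a toy series.
import numpy as np

def precip_indices(daily_mm, wet_thresh=1.0):
    wet = daily_mm >= wet_thresh
    # Longest consecutive run of dry days.
    longest, run = 0, 0
    for dry in ~wet:
        run = run + 1 if dry else 0
        longest = max(longest, run)
    return {
        "mean": float(daily_mm.mean()),
        "wet_day_freq": float(wet.mean()),
        "intensity": float(daily_mm[wet].mean()) if wet.any() else 0.0,
        "p95_wet": float(np.percentile(daily_mm[wet], 95)) if wet.any() else 0.0,
        "max_dry_spell": longest,
    }

series = np.array([0.0, 0.2, 5.0, 12.0, 0.0, 0.0, 0.0, 3.0, 0.5, 20.0])
idx = precip_indices(series)
```

For this series, 4 of 10 days are wet, the mean wet-day intensity is 10 mm, and the longest dry spell is 3 days.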

  5. Impact assessment of climate change on tourism in the Pacific small islands based on the database of long-term high-resolution climate ensemble experiments

    NASA Astrophysics Data System (ADS)

    Watanabe, S.; Utsumi, N.; Take, M.; Iida, A.

    2016-12-01

This study aims to develop a new approach to assess the impact of climate change on small oceanic islands in the Pacific. Rather than projecting the single most probable situation, the new approach projects the change in the probabilities of various situations, taking into account the spread of projections derived from ensemble simulations. We utilized the database for Policy Decision making for Future climate change (d4PDF), a database of long-term high-resolution climate ensemble experiments containing the results of 100 ensemble simulations. A new methodology, Multi Threshold Ensemble Assessment (MTEA), was developed using the d4PDF in order to assess the impact of climate change. We focused on the impact of climate change on tourism because it plays an important role in the economy of the Pacific islands. The Yaeyama Region, one of the tourist destinations in Okinawa, Japan, was selected as the case study site. Two kinds of impact were assessed: changes in the probability of extreme climate phenomena, and tourist satisfaction associated with weather. The d4PDF ensemble experiments and a questionnaire survey conducted by a local government were used for the assessment. The results indicate that the strength of extreme events would increase, whereas their probability of occurrence would decrease. This change would also increase the number of clear days, which could contribute to improved tourist satisfaction.

  6. Regional sea level variability in a high-resolution global coupled climate model

    NASA Astrophysics Data System (ADS)

    Palko, D.; Kirtman, B. P.

    2016-12-01

The prediction of trends at regional scales is essential in order to adapt to and prepare for the effects of climate change, but GCMs are unable to make reliable predictions at these scales. The prediction of local sea level trends is particularly critical. The main goal of this research is to utilize high-resolution (HR; 0.1° resolution in the ocean) coupled model runs of CCSM4 to analyze regional sea surface height (SSH) trends. Unlike typical, lower resolution (1.0°) GCM runs, these HR runs resolve features in the ocean, such as the Gulf Stream, which may have a large effect on regional sea level. We characterize the variability of regional SSH along the Atlantic coast of the US using tide gauge observations along with fixed radiative forcing runs of CCSM4 and HR interactive ensemble runs. The interactive ensemble couples an ensemble-mean atmosphere with a single ocean realization. This coupling results in a 30% decrease in the strength of the Atlantic meridional overturning circulation; the HR interactive ensemble is therefore analogous to a HR hosing experiment. By characterizing the variability in these high-resolution GCM runs and in observations, we seek to understand which processes influence coastal SSH along the East Coast of the United States and to better predict future sea level rise.

  7. Uncertainty of global summer precipitation in the CMIP5 models: a comparison between high-resolution and low-resolution models

    NASA Astrophysics Data System (ADS)

    Huang, Danqing; Yan, Peiwen; Zhu, Jian; Zhang, Yaocun; Kuang, Xueyuan; Cheng, Jing

    2018-04-01

The uncertainty of global summer precipitation simulated by 23 CMIP5 CGCMs, and the possible impacts of model resolution, are investigated in this study. Large uncertainties exist over the tropical and subtropical regions, which can be attributed mainly to the simulation of convective precipitation. High-resolution models (HRMs) and low-resolution models (LRMs) are further investigated to demonstrate their different contributions to the uncertainties of the ensemble mean. We show that the high-resolution model ensemble mean (HMME) and the low-resolution model ensemble mean (LMME) mitigate the biases between the multi-model ensemble mean (MME) and observations over most continents and oceans, respectively. The HMME simulates more precipitation than the LMME over most oceans, but less precipitation over some continents. The dominant precipitation category in the HRMs (LRMs) is heavy precipitation (moderate precipitation) over the tropical regions. The combinations of convective and stratiform precipitation are also quite different: the HMME has a much higher ratio of stratiform precipitation, while the LMME has more convective precipitation. Finally, differences in precipitation between the HMME and LMME can be traced to their differences in SST simulations via local and remote air-sea interaction.

  8. Spatial Ensemble Postprocessing of Precipitation Forecasts Using High Resolution Analyses

    NASA Astrophysics Data System (ADS)

    Lang, Moritz N.; Schicker, Irene; Kann, Alexander; Wang, Yong

    2017-04-01

Ensemble prediction systems are designed to account for errors or uncertainties in the initial and boundary conditions, imperfect parameterizations, etc. However, due to sampling errors and underestimation of the model errors, these ensemble forecasts tend to be underdispersive and to lack both reliability and sharpness. To overcome such limitations, statistical postprocessing methods are commonly applied to these forecasts. In this study, a full-distributional spatial postprocessing method is applied to short-range precipitation forecasts over Austria using Standardized Anomaly Model Output Statistics (SAMOS). Following Stauffer et al. (2016), observation and forecast fields are transformed into standardized anomalies by subtracting a site-specific climatological mean and dividing by the climatological standard deviation. Because only a single regression model has to be fitted for the whole domain, the SAMOS framework provides a computationally inexpensive method to create operationally calibrated probabilistic forecasts for any arbitrary location, or for all grid points in the domain simultaneously. Taking advantage of the INCA system (Integrated Nowcasting through Comprehensive Analysis), high resolution analyses are used for the computation of the observed climatology and for model training. The INCA system operationally combines station measurements and remote sensing data into real-time objective analysis fields at 1 km horizontal and 1 h temporal resolution. The precipitation forecast used in this study is obtained from a limited-area ensemble prediction system, ALADIN-LAEF, also operated by ZAMG, which provides, by applying a multi-physics approach, a 17-member forecast at a horizontal resolution of 10.9 km and a temporal resolution of 1 h. The SAMOS approach thus statistically combines the in-house developed high resolution analysis and ensemble prediction system.
The station-based validation of 6-hour precipitation sums shows a mean improvement of more than 40% in CRPS compared to bilinearly interpolated, uncalibrated ensemble forecasts. The validation on randomly selected grid points, representing the true height distribution over Austria, still indicates a mean improvement of 35%. The statistical model is currently set up for 6-hourly and daily accumulation periods, but will be extended to a temporal resolution of 1-3 h within a new probabilistic nowcasting system operated by ZAMG.
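The core SAMOS transformation described above can be sketched with synthetic data: standardize forecasts and observations by a site-specific climatology, pool all sites, and fit one regression for the whole domain. The climatology, toy relationship, and mean-only regression below are assumptions for illustration; the operational SAMOS fits a full predictive distribution, not just the mean.

```python
# Sketch: SAMOS-style standardized-anomaly regression (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n_sites, n_days = 50, 100

clim_mean = rng.uniform(0, 5, n_sites)      # site climatological mean
clim_sd = rng.uniform(0.5, 2.0, n_sites)    # site climatological std dev

# Toy truth: observed anomalies are a damped, shifted version of forecast anomalies.
fc_anom = rng.normal(0, 1, (n_days, n_sites))
obs_anom = 0.2 + 0.8 * fc_anom + rng.normal(0, 0.3, (n_days, n_sites))

fc = clim_mean + clim_sd * fc_anom          # back to physical units
obs = clim_mean + clim_sd * obs_anom

# SAMOS step: standardize, pool all sites, fit a single linear model.
x = ((fc - clim_mean) / clim_sd).ravel()
y = ((obs - clim_mean) / clim_sd).ravel()
b, a = np.polyfit(x, y, 1)                  # slope, intercept

# Calibrated forecast: regress in anomaly space, transform back.
calibrated = clim_mean + clim_sd * (a + b * ((fc - clim_mean) / clim_sd))
```

Because the regression is fitted once in anomaly space, the same coefficients calibrate every site (or grid point) simultaneously, which is what makes the approach cheap enough for operations.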

  9. Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)

    NASA Astrophysics Data System (ADS)

    Arritt, R. W.

    2008-12-01

The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question: can regional climate models provide additional useful information from global seasonal forecasts? MRED will use a suite of regional climate models to downscale seasonal forecasts produced by the new National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the NASA GEOS5 system. The initial focus will be on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the potential usefulness of higher resolution, especially for near-surface fields influenced by high resolution orography. Each regional model will cover the conterminous US (CONUS) at approximately 32 km resolution and will perform an ensemble of 15 runs for each year 1982-2003 for the forecast period 1 December - 30 April. MRED will compare individual regional and global forecasts as well as ensemble mean precipitation and temperature forecasts, which are currently being used to drive macroscale land surface models (LSMs), along with wind, humidity, radiation, and turbulent heat fluxes, which are important for more advanced coupled macroscale hydrologic models. Metrics of ensemble spread will also be evaluated. Extensive analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms. Our overarching goal is to determine what additional skill can be provided by a community ensemble of high resolution regional models, which we believe will eventually define a strategy for more skillful and useful regional seasonal climate forecasts.

  10. Addressing model uncertainty through stochastic parameter perturbations within the High Resolution Rapid Refresh (HRRR) ensemble

    NASA Astrophysics Data System (ADS)

    Wolff, J.; Jankov, I.; Beck, J.; Carson, L.; Frimel, J.; Harrold, M.; Jiang, H.

    2016-12-01

It is well known that global and regional numerical weather prediction ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable approach to these deficiencies in ensemble modeling is the use of stochastic physics to represent model-related uncertainty. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), Stochastic Perturbation of Physics Tendencies (SPPT), or some combination of the three. The focus of this study is to assess model performance within a convection-permitting ensemble at 3 km grid spacing across the Contiguous United States (CONUS) when using stochastic approaches. For this purpose, the test utilized a single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model, with ensemble members produced by employing stochastic methods. Parameter perturbations were applied in the Rapid Update Cycle (RUC) land surface model and the Mellor-Yamada-Nakanishi-Niino (MYNN) planetary boundary layer scheme. Results will be presented in terms of bias, error, spread, skill, accuracy, reliability, and sharpness using the Model Evaluation Tools (MET) verification package. Due to the high level of complexity of running a frequently updating (hourly), high spatial resolution (3 km), large domain (CONUS) ensemble system, extensive high performance computing (HPC) resources were needed to meet this objective.
Supercomputing resources were provided through the National Center for Atmospheric Research (NCAR) Strategic Capability (NSC) project support, allowing for a more extensive set of tests over multiple seasons, consequently leading to more robust results. Through the use of these stochastic innovations and powerful supercomputing at NCAR, further insights and advancements in ensemble forecasting at convection-permitting scales will be possible.
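The SPP idea in this record can be sketched as follows: each ensemble member runs with physics parameters drawn from multiplicative perturbations around the scheme defaults, here evolved with an AR(1) update so they vary smoothly in time. The parameter names, bounds, and correlation settings below are invented for illustration and are not the HRRR ensemble's actual configuration.

```python
# Sketch: stochastic parameter perturbations (SPP) as smooth multiplicative
# factors on physics parameters, one trajectory per ensemble member.
import numpy as np

rng = np.random.default_rng(7)

defaults = {"mynn_mixing_length": 1.0, "ruc_soil_moisture_scale": 1.0}
n_members, n_steps, rho, sigma = 8, 24, 0.95, 0.3

def spp_trajectories(default, lo, hi):
    """AR(1) log-perturbation per member, clipped to physical bounds."""
    z = np.zeros((n_members, n_steps))
    for t in range(1, n_steps):
        z[:, t] = rho * z[:, t - 1] + np.sqrt(1 - rho**2) * rng.normal(0, sigma, n_members)
    return np.clip(default * np.exp(z), lo, hi)

params = {name: spp_trajectories(val, 0.5, 2.0) for name, val in defaults.items()}
```

Each member then calls its physics schemes with its own slowly varying parameter values, which spreads the ensemble without requiring multiple physics suites.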

The effect of ocean resolution and external forcing on the correlation between SLP and Sea Ice Concentration in the Pre-PRIMAVERA GCMs

    NASA Astrophysics Data System (ADS)

    Fuentes-Franco, Ramon; Koenigk, Torben

    2017-04-01

Recently, an observational study showed that sea ice variations in the Barents Sea appear to be important for the sign of the following winter NAO (Koenigk et al. 2016). It has also been found that the amplitude and extension of the Sea Level Pressure (SLP) patterns are modulated by the Greenland and Labrador Seas ice areas. Earth System Models participating in the PRIMAVERA project are therefore used to study the impact of ocean model resolution on reproducing the previously mentioned observed correlation patterns between Sea Ice Concentration (SIC) and SLP. Comparing ensembles of high ocean resolution (0.25 degrees) and low ocean resolution (1 degree) simulations, we find that the sign of the correlation between sea ice concentration over the Central Arctic and the Barents/Kara Seas and SLP over the Northern Hemisphere is similar to observations in the higher ocean resolution ensemble, although the amplitude is underestimated. In contrast, the low resolution ensemble shows correlation patterns opposite to observations. In general, the high ocean resolution simulations agree better with observations than the low resolution simulations. To further study the observed SIC-SLP relationship reported by Koenigk et al. (2016), we also analyzed the impact of using pre-industrial versus historical external forcing in the simulations. Comparing ensembles run under different forcings, we find that the sign of the SIC-SLP correlation does not show a systematic dependence on the external forcing (pre-industrial or present day) as it does on ocean resolution.

  12. Stochastic Forcing for High-Resolution Regional and Global Ocean and Atmosphere-Ocean Coupled Ensemble Forecast System

    NASA Astrophysics Data System (ADS)

    Rowley, C. D.; Hogan, P. J.; Martin, P.; Thoppil, P.; Wei, M.

    2017-12-01

An extended-range ensemble forecast system is being developed in the US Navy Earth System Prediction Capability (ESPC), and a global ocean ensemble generation capability has been developed to represent uncertainty in the ocean initial conditions. At extended forecast times, uncertainty due to model error overtakes the initial conditions as the primary source of forecast uncertainty. Recently, stochastic parameterization or stochastic forcing techniques have been applied to represent model error in research and operational atmospheric, ocean, and coupled ensemble forecasts. A simple stochastic forcing technique has been developed for application to US Navy high resolution regional and global ocean models, for use in ocean-only and coupled atmosphere-ocean-ice-wave ensemble forecast systems. Perturbation forcing is added to the tendency equations for the state variables, with the forcing defined by random 3- or 4-dimensional fields whose horizontal, vertical, and temporal correlations are specified to characterize different possible kinds of error. Here, we demonstrate the stochastic forcing in regional and global ensemble forecasts with varying perturbation amplitudes and length and time scales, and assess the change in ensemble skill as measured by a range of deterministic and probabilistic metrics.
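A minimal 2-D sketch of tendency-perturbation stochastic forcing follows: a random field with prescribed horizontal and temporal correlation is added to a state variable's tendency at each step. The smoothing scale, amplitude, and toy damped dynamics are assumptions for illustration, not the Navy ESPC formulation.

```python
# Sketch: stochastic forcing added to a tendency equation, with horizontal
# correlation from Gaussian smoothing and temporal correlation from AR(1).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
ny, nx, rho, amp, L = 40, 60, 0.9, 0.05, 4.0   # grid, AR(1) coeff, amplitude, length scale

def correlated_field(prev):
    """AR(1) time update of a horizontally smoothed white-noise field."""
    white = gaussian_filter(rng.normal(0, 1, (ny, nx)), sigma=L)
    white /= white.std()                        # re-normalize after smoothing
    return rho * prev + np.sqrt(1 - rho**2) * white

state = np.zeros((ny, nx))
forcing = correlated_field(np.zeros((ny, nx)))
for _ in range(10):
    tendency = -0.1 * state + amp * forcing     # toy damped dynamics + stochastic forcing
    state = state + tendency
    forcing = correlated_field(forcing)
```

Varying `amp`, `L`, and `rho` corresponds to the perturbation amplitudes and length/time scales explored in the abstract.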

  13. Exploring the calibration of a wind forecast ensemble for energy applications

    NASA Astrophysics Data System (ADS)

    Heppelmann, Tobias; Ben Bouallegue, Zied; Theis, Susanne

    2015-04-01

In the German research project EWeLiNE, Deutscher Wetterdienst (DWD) and the Fraunhofer Institute for Wind Energy and Energy System Technology (IWES) are collaborating with three German Transmission System Operators (TSOs) to provide the TSOs with improved probabilistic power forecasts. Probabilistic power forecasts are derived from probabilistic weather forecasts, themselves derived from ensemble prediction systems (EPS). Since the considered raw ensemble wind forecasts suffer from underdispersion and bias, calibration methods are developed to correct the model bias and the ensemble spread bias. The overall aim is to improve the ensemble forecasts such that the uncertainty of the possible weather development is captured by the ensemble spread from the first forecast hours onward. Additionally, the ensemble members should remain physically consistent scenarios after calibration. We focus on probabilistic hourly wind forecasts with a horizon of 21 h delivered by the convection-permitting high-resolution ensemble system COSMO-DE-EPS, which has been operational at DWD since 2012. The ensemble consists of 20 members driven by four different global models. The model area covers all of Germany and parts of Central Europe with a horizontal resolution of 2.8 km and a vertical resolution of 50 model levels. For verification we use wind mast measurements at around 100 m height, which corresponds to the hub height of the wind turbines in wind farms within the model area. Calibration of the ensemble forecasts can be performed by different statistical methods applied to the raw ensemble output. Here, we explore local bivariate Ensemble Model Output Statistics at individual sites, and quantile regression with different predictors. Applying these methods, we already show an improvement of the COSMO-DE-EPS ensemble wind forecasts for energy applications.
In addition, an ensemble copula coupling approach transfers the time-dependencies of the raw ensemble to the calibrated ensemble. The calibrated wind forecasts are evaluated first with univariate probabilistic scores and additionally with diagnostics of wind ramps in order to assess the time-consistency of the calibrated ensemble members.
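The ensemble copula coupling (ECC) step mentioned above can be sketched in a few lines: calibrated quantiles are reordered at each lead time according to the rank structure of the raw ensemble, so the calibrated members inherit the raw members' time-dependencies. The tiny 3-member, 2-step example is invented for illustration.

```python
# Sketch: ensemble copula coupling (ECC) — reorder calibrated quantiles
# by the ranks of the raw ensemble members at each lead time.
import numpy as np

def ecc(raw, calibrated_quantiles):
    """raw, calibrated_quantiles: (n_members, n_times) arrays.
    Returns calibrated scenarios with the raw ensemble's rank structure."""
    out = np.empty_like(calibrated_quantiles)
    for t in range(raw.shape[1]):
        ranks = raw[:, t].argsort().argsort()   # rank of each raw member
        out[:, t] = np.sort(calibrated_quantiles[:, t])[ranks]
    return out

raw = np.array([[3.0, 1.0], [1.0, 2.0], [2.0, 3.0]])   # 3 members, 2 lead times
cal = np.array([[10.0, 40.0], [20.0, 50.0], [30.0, 60.0]])
scen = ecc(raw, cal)
```

Member 1 is the largest raw value at the first step and the smallest at the second, so after ECC it receives the largest calibrated quantile first and the smallest next, preserving ramps present in the raw member.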

  14. High Resolution Global Topography of Eros from NEAR Imaging and LIDAR Data

    NASA Technical Reports Server (NTRS)

    Gaskell, Robert W.; Konopliv, A.; Barnouin-Jha, O.; Scheeres, D.

    2006-01-01

    Principal Data Products: Ensemble of L-maps from SPC, Spacecraft state, Asteroid pole and rotation. Secondary Products: Global topography model, inertia tensor, gravity. Composite high resolution topography. Three dimensional image maps.

  15. Multi-RCM ensemble downscaling of global seasonal forecasts (MRED)

    NASA Astrophysics Data System (ADS)

    Arritt, R.

    2009-04-01

Regional climate models (RCMs) have long been used to downscale global climate simulations. In contrast, the ability of RCMs to downscale seasonal climate forecasts has received little attention. The Multi-RCM Ensemble Downscaling (MRED) project was recently initiated to address the question: does dynamical downscaling using RCMs provide additional useful information for seasonal forecasts made by global models? MRED is using a suite of RCMs to downscale seasonal forecasts produced by the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the NASA GEOS5 system. The initial focus is on wintertime forecasts in order to evaluate topographic forcing, snowmelt, and the usefulness of higher resolution for near-surface fields influenced by high resolution orography. Each RCM covers the conterminous U.S. at approximately 32 km resolution, comparable to the scale of the North American Regional Reanalysis (NARR), which will be used to evaluate the models. The forecast ensemble for each RCM comprises 15 members over a period of 22+ years (from 1982 to 2003+) for the forecast period 1 December - 30 April. Each RCM will create a 15-member lagged ensemble by starting on different dates in the preceding November, resulting in a 120-member ensemble for each projection (8 RCMs by 15 members per RCM). The RCMs will be continually updated at their lateral boundaries using 6-hourly output from CFS or GEOS5. Hydrometeorological output will be produced in a standard netCDF-based format on a common analysis grid, which simplifies both model intercomparison and the generation of ensembles. MRED will compare individual RCM and global forecasts as well as ensemble mean precipitation and temperature forecasts, which are currently being used to drive macroscale land surface models (LSMs). Metrics of ensemble spread will also be evaluated.
Extensive process-oriented analysis will be performed to link improvements in downscaled forecast skill to regional forcings and physical mechanisms. Our overarching goal is to determine what additional skill can be provided by a community ensemble of high resolution regional models, which we believe will define a strategy for more skillful and useful regional seasonal climate forecasts.

  16. Local-scale changes in mean and heavy precipitation in Western Europe, climate change or internal variability?

    NASA Astrophysics Data System (ADS)

    Aalbers, Emma E.; Lenderink, Geert; van Meijgaard, Erik; van den Hurk, Bart J. J. M.

    2018-06-01

High-resolution climate information provided by e.g. regional climate models (RCMs) is valuable for exploring the changing weather under global warming and assessing the local impact of climate change. While there is generally more confidence in the representativeness of simulated processes at higher resolutions, internal variability of the climate system—`noise', intrinsic to the chaotic nature of atmospheric and oceanic processes—is larger at smaller spatial scales as well, limiting the predictability of the climate signal. To quantify the internal variability and robustly estimate the climate signal, large initial-condition ensembles of climate simulations conducted with a single model provide essential information. We analyze a regional downscaling of a 16-member initial-condition ensemble over western Europe and the Alps at 0.11° resolution, similar to the highest resolution EURO-CORDEX simulations. We examine the strength of the forced climate response (signal) in mean and extreme daily precipitation with respect to noise due to internal variability, and find robust small-scale geographical features in the forced response, indicating regional differences in changes in the probability of events. However, individual ensemble members provide only limited information on the forced climate response, even for high levels of global warming. Although the results are based on a single RCM-GCM chain, we believe that they have general value in providing insight into the fraction of the uncertainty in high-resolution climate information that is irreducible, and can assist in the correct interpretation of fine-scale information in multi-model ensembles in terms of a forced response and noise due to internal variability.
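The signal-versus-noise decomposition described in this abstract can be sketched with synthetic fields: the forced response is estimated as the ensemble-mean change, and internal variability as the across-member spread of the change. The "present" and "future" precipitation fields, the imposed signal, and the noise level below are all invented for a 16-member toy ensemble.

```python
# Sketch: forced response (signal) vs. internal variability (noise) from a
# 16-member initial-condition ensemble, on synthetic gridded data.
import numpy as np

rng = np.random.default_rng(11)
n_members, ny, nx = 16, 30, 30

# Imposed "true" forced response: a weak west-east gradient in the change.
forced = 0.1 * np.linspace(-1, 1, nx)[None, :] * np.ones((ny, 1))

present = rng.normal(5.0, 0.5, (n_members, ny, nx))
future = present + forced + rng.normal(0, 0.5, (n_members, ny, nx))

change = future - present                     # per-member climate change
signal = change.mean(axis=0)                  # forced-response estimate
noise = change.std(axis=0, ddof=1)            # internal variability estimate
snr = signal / noise
```

With only 16 members, `signal` is still contaminated by noise of order `noise / 4` at each grid point, which is why a single member says little about the forced response.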

  17. Local-scale changes in mean and heavy precipitation in Western Europe, climate change or internal variability?

    NASA Astrophysics Data System (ADS)

    Aalbers, Emma E.; Lenderink, Geert; van Meijgaard, Erik; van den Hurk, Bart J. J. M.

    2017-09-01

High-resolution climate information provided by e.g. regional climate models (RCMs) is valuable for exploring the changing weather under global warming and assessing the local impact of climate change. While there is generally more confidence in the representativeness of simulated processes at higher resolutions, internal variability of the climate system—`noise', intrinsic to the chaotic nature of atmospheric and oceanic processes—is larger at smaller spatial scales as well, limiting the predictability of the climate signal. To quantify the internal variability and robustly estimate the climate signal, large initial-condition ensembles of climate simulations conducted with a single model provide essential information. We analyze a regional downscaling of a 16-member initial-condition ensemble over western Europe and the Alps at 0.11° resolution, similar to the highest resolution EURO-CORDEX simulations. We examine the strength of the forced climate response (signal) in mean and extreme daily precipitation with respect to noise due to internal variability, and find robust small-scale geographical features in the forced response, indicating regional differences in changes in the probability of events. However, individual ensemble members provide only limited information on the forced climate response, even for high levels of global warming. Although the results are based on a single RCM-GCM chain, we believe that they have general value in providing insight into the fraction of the uncertainty in high-resolution climate information that is irreducible, and can assist in the correct interpretation of fine-scale information in multi-model ensembles in terms of a forced response and noise due to internal variability.

  18. The ARPAL operational high resolution Poor Man's Ensemble, description and validation

    NASA Astrophysics Data System (ADS)

    Corazza, Matteo; Sacchetti, Davide; Antonelli, Marta; Drofa, Oxana

    2018-05-01

The Meteo-Hydrological Functional Center for Civil Protection of the Environmental Protection Agency of the Liguria Region (ARPAL) is responsible for issuing forecasts primarily aimed at civil protection needs. Several deterministic high-resolution models, run every 6 or 12 h, are regularly used in the Center to prepare weather forecasts at short to medium range. The region is frequently affected by severe flash floods over its very small basins, which are characterized by steep orography close to the sea. These conditions have led the Center in past years to pay particular attention to the use and development of high-resolution model chains for the explicit simulation of convective phenomena. For years, forecasters have drawn on the availability of several models for subjective analyses of the potential evolution of the atmosphere and of its uncertainty. More recently, an Interactive Poor Man's Ensemble has been developed, aimed at providing statistical ensemble variables to support forecasters' evaluations. In this paper the structure of this system is described, and results are validated against the dense regional ground observation network.

  19. On the skill of various ensemble spread estimators for probabilistic short range wind forecasting

    NASA Astrophysics Data System (ADS)

    Kann, A.

    2012-05-01

A variety of applications, ranging from civil protection during severe weather to economic interests, depend heavily on meteorological information. For example, precise planning of an energy supply with a high share of renewables requires detailed meteorological information at high temporal and spatial resolution. With respect to wind power, detailed analyses and forecasts of wind speed are of crucial interest for energy management. Although the applicability and skill of state-of-the-art probabilistic short-range forecasts have increased in recent years, ensemble systems still show systematic deficiencies that limit their practical use. This paper presents methods to improve the ensemble skill of 10-m wind speed forecasts by combining deterministic information from a nowcasting system at very high horizontal resolution with uncertainty estimates from a limited-area ensemble system. It is shown for a one-month validation period that a statistical post-processing procedure (a modified non-homogeneous Gaussian regression) adds further skill to the probabilistic forecasts, especially beyond the nowcasting range after +6 h.
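The post-processing step mentioned above can be illustrated with a toy non-homogeneous Gaussian regression (NGR): the predictive distribution is N(a + b·m, c² + d²·s²), with m the ensemble mean and s the ensemble spread, and the four coefficients fitted on past forecast-observation pairs. This is a minimal sketch on synthetic data, not the paper's modified scheme; the data, variable names, and starting values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 500
obs = rng.gamma(3.0, 2.0, n)                    # verifying 10-m wind speeds (synthetic)
ens_mean = obs + rng.normal(0.5, 1.0, n)        # biased raw ensemble mean
ens_spread = np.abs(rng.normal(1.0, 0.3, n))    # raw (under-dispersive) ensemble spread

def nll(params, m, s, y):
    """Negative log-likelihood of the NGR predictive density N(a + b*m, c^2 + d^2*s^2)."""
    a, b, c, d = params
    mu = a + b * m
    sigma = np.sqrt(c ** 2 + (d * s) ** 2)      # squared coefficients keep the variance positive
    return -norm.logpdf(y, mu, sigma).sum()

res = minimize(nll, x0=[0.0, 1.0, 1.0, 1.0],
               args=(ens_mean, ens_spread, obs), method="Nelder-Mead")
a, b, c, d = res.x                              # calibrated NGR coefficients
```

In practice the coefficients would be refit per lead time, and a distribution truncated at zero is more appropriate for wind speed than a plain Gaussian.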

  20. Development of Gridded Ensemble Precipitation and Temperature Datasets for the Contiguous United States Plus Hawai'i and Alaska

    NASA Astrophysics Data System (ADS)

    Newman, A. J.; Clark, M. P.; Nijssen, B.; Wood, A.; Gutmann, E. D.; Mizukami, N.; Longman, R. J.; Giambelluca, T. W.; Cherry, J.; Nowak, K.; Arnold, J.; Prein, A. F.

    2016-12-01

Gridded precipitation and temperature products are inherently uncertain for myriad reasons, including interpolation from a sparse observation network, measurement representativeness, and measurement errors. Despite this, uncertainty is typically not quantified, or is estimated in a dataset-specific way with little general applicability across datasets. The lack of quantitative uncertainty estimates for hydrometeorological forcing fields limits their utility for land surface and hydrologic modeling techniques such as data assimilation, probabilistic forecasting, and verification. To address this gap, we have developed a first-of-its-kind gridded, observation-based ensemble of precipitation and temperature at a daily increment for the period 1980-2012 over the United States (including Alaska and Hawaii). A longer, higher-resolution version (1970-present, 1/16th degree) has also been implemented to support real-time hydrologic monitoring and prediction in several regional US domains. We will present the development and evaluation of the dataset, along with initial applications for ensemble data assimilation and probabilistic evaluation of high-resolution regional climate model simulations. We will also present results on the new high-resolution products for Alaska and Hawaii (2 km and 250 m, respectively), completing the first ensemble observation-based product suite for all 50 states. Finally, we will present plans to improve the ensemble dataset, focusing on better methods for station interpolation and ensemble generation, as well as methods to fuse station data with numerical weather prediction model output.

  1. High resolution wind measurements for offshore wind energy development

    NASA Technical Reports Server (NTRS)

    Nghiem, Son Van (Inventor); Neumann, Gregory (Inventor)

    2013-01-01

A method, apparatus, system, article of manufacture, and computer-readable storage medium provide the ability to measure wind. Data at a first (low) resolution are collected by a satellite scatterometer. Thin slices of the data are determined. A collocation of the data slices is determined at each grid cell center to obtain ensembles of collocated data slices. Each ensemble of collocated data slices is decomposed into a mean part and a fluctuating part. The data are reconstructed at a second resolution from the mean part and a residue of the fluctuating part. A wind measurement is determined from the data at the second resolution using a wind model function. A description of the wind measurement is output.
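The decomposition step can be sketched as follows: at one grid cell, the ensemble of collocated slice measurements is split into its mean and a fluctuating part, and a finer-resolution estimate is formed from the mean plus a retained residue of the fluctuations. The synthetic data, the retention factor `alpha`, and the reconstruction rule are illustrative assumptions, not the patented algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n_slices = 40
ens = 8.0 + rng.normal(0.0, 1.5, n_slices)    # collocated slice measurements (m/s), synthetic

mean_part = ens.mean()                        # large-scale (mean) part
fluct = ens - mean_part                       # fluctuating part
alpha = 0.5                                   # fraction of the fluctuation residue retained (assumed)
recon = mean_part + alpha * fluct             # reconstructed finer-resolution values
```

By construction the reconstruction preserves the grid-cell mean while damping slice-to-slice noise.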

  2. Insights in time dependent cross compartment sensitivities from ensemble simulations with the fully coupled subsurface-land surface-atmosphere model TerrSysMP

    NASA Astrophysics Data System (ADS)

    Schalge, Bernd; Rihani, Jehan; Haese, Barbara; Baroni, Gabriele; Erdal, Daniel; Haefliger, Vincent; Lange, Natascha; Neuweiler, Insa; Hendricks-Franssen, Harrie-Jan; Geppert, Gernot; Ament, Felix; Kollet, Stefan; Cirpka, Olaf; Saavedra, Pablo; Han, Xujun; Attinger, Sabine; Kunstmann, Harald; Vereecken, Harry; Simmer, Clemens

    2017-04-01

Currently, an integrated approach to simulating the Earth system is evolving in which several compartment models are coupled to achieve the best possible physically consistent representation. We used the model TerrSysMP, which fully couples subsurface, land surface and atmosphere, in a synthetic study mimicking the Neckar catchment in southern Germany. A virtual-reality run was made at a high resolution of 400 m for the land surface and subsurface and 1.1 km for the atmosphere. Ensemble runs were made at a lower resolution (800 m for the land surface and subsurface). The ensemble was generated by systematically varying soil and vegetation parameters and lateral atmospheric forcing among the members. We found that for some variables and time periods the ensemble runs deviated strongly from the virtual-reality reference run (which was not covered by the ensemble), which could be related to the different model resolutions; this was, for example, the case for river discharge in summer. We also analyzed the spread of model states as a function of time and found clear relations between the spread and the time of year and weather conditions. For example, the ensemble spread of latent heat flux related to uncertain soil parameters was larger under dry soil conditions than under wet soil conditions. Another example is that the ensemble spread of atmospheric states was more strongly influenced by uncertain soil and vegetation parameters under conditions of weak air pressure gradients (in summer) than under the stronger air pressure gradients of winter. The analysis of this ensemble of fully coupled model simulations provided valuable insights into the dynamics of land-atmosphere feedbacks, which we will further highlight in the presentation.

  3. Is high-resolution inverse characterization of heterogeneous river bed hydraulic conductivities needed and possible?

    NASA Astrophysics Data System (ADS)

    Kurtz, W.; Hendricks Franssen, H.-J.; Brunner, P.; Vereecken, H.

    2013-10-01

River-aquifer exchange fluxes influence local and regional water balances and affect groundwater and river water quality and quantity. Unfortunately, river-aquifer exchange fluxes tend to be strongly spatially variable, and it is an open research question to what degree river bed heterogeneity has to be represented in a model in order to achieve reliable estimates of river-aquifer exchange fluxes. This question is addressed here with the help of synthetic simulation experiments that mimic the Limmat aquifer in Zurich (Switzerland), where river-aquifer exchange fluxes and groundwater management activities play an important role. The solution of the unsaturated-saturated subsurface hydrological flow problem, including river-aquifer interaction, is calculated for ten different synthetic realities in which the strongly heterogeneous river bed hydraulic conductivities (L) are perfectly known. Hydraulic head data (100 in the default scenario) are sampled from the synthetic realities. In subsequent data assimilation experiments, in which L is unknown, the hydraulic head data are used as conditioning information with the help of the ensemble Kalman filter (EnKF). For each of the ten synthetic realities, four different ensembles of L are tested in the EnKF experiments: one ensemble estimates high-resolution L fields with a different L value for each element, and the other three estimate effective L values for 5, 3 or 2 zones. The calibration of higher-resolution L fields (i.e. fully heterogeneous or 5 zones) gives better results than the calibration of L for only 3 or 2 zones in terms of the reproduction of states, stream-aquifer exchange fluxes and parameters. Effective L for a limited number of zones cannot always reproduce the true states and fluxes well and results in biased estimates of net exchange fluxes between aquifer and stream.
Even when only 10 head data are used for conditioning, the high-resolution characterization of L fields with EnKF is still feasible. For less heterogeneous river bed hydraulic conductivities, a high-resolution characterization of L is less important. When uncertainties in the hydraulic parameters of the aquifer are also considered in the assimilation, the errors in state and flux predictions increase, but the ensemble with a high spatial resolution for L still outperforms the ensembles with effective L values. We conclude that for strongly heterogeneous river beds the commonly applied simplified representation of the streambed, with spatially homogeneous parameters or constant parameters for a few zones, may yield significant biases in the characterization of the water balance. For strongly heterogeneous river beds, we suggest adopting a stochastic field approach that models the spatially heterogeneous river bed geostatistically. The paper illustrates that EnKF is able to calibrate such heterogeneous streambeds on the basis of hydraulic head measurements, outperforming zonation approaches.
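The EnKF conditioning step described above can be sketched with a linear toy forward model: an ensemble of log-conductivity fields is updated toward head observations through the ensemble cross-covariance (a stochastic, perturbed-observation EnKF). The linear head operator `H`, the dimensions, and the error level are illustrative assumptions, not the study's groundwater flow model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_ens, n_par, n_obs = 50, 100, 10

# toy linear forward model: heads respond linearly to log-conductivity (assumption)
H = rng.normal(size=(n_obs, n_par)) / np.sqrt(n_par)
L_true = rng.normal(size=n_par)                   # synthetic "true" log-conductivity field
obs_err = 0.05
y = H @ L_true + rng.normal(0.0, obs_err, n_obs)  # sampled hydraulic head data

L_ens = rng.normal(size=(n_ens, n_par))           # prior ensemble of log-conductivity fields
h_ens = L_ens @ H.T                               # simulated heads for each member

dL = L_ens - L_ens.mean(axis=0)                   # parameter anomalies
dh = h_ens - h_ens.mean(axis=0)                   # simulated-observation anomalies
C_Lh = dL.T @ dh / (n_ens - 1)                    # parameter-observation cross-covariance
C_hh = dh.T @ dh / (n_ens - 1) + obs_err ** 2 * np.eye(n_obs)
K = C_Lh @ np.linalg.inv(C_hh)                    # Kalman gain

y_pert = y + rng.normal(0.0, obs_err, (n_ens, n_obs))  # perturbed observations
L_post = L_ens + (y_pert - h_ens) @ K.T           # conditioned (posterior) ensemble
```

The posterior ensemble mean fits the head data more closely than the prior, which is the essence of the conditioning used in the paper.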

  4. Semantic labeling of high-resolution aerial images using an ensemble of fully convolutional networks

    NASA Astrophysics Data System (ADS)

    Sun, Xiaofeng; Shen, Shuhan; Lin, Xiangguo; Hu, Zhanyi

    2017-10-01

High-resolution remote sensing data classification has been a challenging and promising research topic in the remote sensing community. In recent years, the rapid advances of deep learning have brought remarkable progress to this field, facilitating a transition from hand-crafted feature design to automatic end-to-end learning. A deep fully convolutional network (FCN)-based ensemble learning method is proposed to label high-resolution aerial images. To fully tap the potential of FCNs, both the Visual Geometry Group network and a deeper residual network, ResNet, are employed. Furthermore, to enlarge the training set with diverse samples and gain better generalization, in addition to the commonly used data augmentation methods (e.g., rotation, multiscale, and aspect ratio), aerial images from other datasets are also collected for cross-scene learning. Finally, we combine the learned models to form an effective FCN ensemble and refine the results with a fully connected conditional random field graph model. Experiments on the ISPRS 2-D Semantic Labeling Contest dataset show that the proposed end-to-end classification method achieves an overall accuracy of 90.7%, a state-of-the-art result in the field.

  5. Changing precipitation in western Europe, climate change or natural variability?

    NASA Astrophysics Data System (ADS)

    Aalbers, Emma; Lenderink, Geert; van Meijgaard, Erik; van den Hurk, Bart

    2017-04-01

Multi-model RCM-GCM ensembles provide high-resolution climate projections, valuable for, among other uses, climate impact assessment studies. While the application of multiple models (both GCMs and RCMs) provides a certain robustness with respect to model uncertainty, the interpretation of differences between ensemble members—the combined result of model uncertainty and natural variability of the climate system—is not straightforward. Natural variability is intrinsic to the climate system and a potentially large source of uncertainty in climate change projections, especially at the local to regional scale. To quantify the natural variability and obtain a robust estimate of the forced climate change response (given a certain model and forcing scenario), large ensembles of simulations with the same model provide essential information. While for global climate models (GCMs) a number of such large single-model ensembles exist and have been analyzed, for regional climate models (RCMs) the number and size of single-model ensembles is limited, and the predictability of the forced climate response at the local to regional scale is still rather uncertain. We present a regional downscaling of a 16-member single-model ensemble over western Europe and the Alps at a resolution of 0.11 degrees (~12 km), similar to the highest-resolution EURO-CORDEX simulations. The 16-member ensemble was generated by the GCM EC-EARTH and downscaled with the RCM RACMO for the period 1951-2100. This single-model ensemble has been investigated in terms of the ensemble mean response (our estimate of the forced climate response) as well as the differences between ensemble members, which measure natural variability. We focus on the response in seasonal mean and extreme precipitation (seasonal maxima and extremes with return periods of up to 20 years) for the near to far future.
For most precipitation indices we can reliably determine the climate change signal, given the applied model chain and forcing scenario. However, the analysis also shows how limited the information in single ensemble members is regarding the local-scale forced climate response, even for high levels of global warming when the forced response has emerged from natural variability. Analysis and application of multi-model ensembles like EURO-CORDEX should go hand in hand with single-model ensembles like the one presented here, in order to correctly interpret fine-scale information in terms of a forced signal and random noise due to natural variability.

  6. Quasi-most unstable modes: a window to 'À la carte' ensemble diversity?

    NASA Astrophysics Data System (ADS)

    Homar Santaner, Victor; Stensrud, David J.

    2010-05-01

The atmospheric science community currently faces the ambitious challenge of providing useful forecasts of atmospheric events with high societal impact. The low level of social resilience to false alarms puts tremendous pressure on forecasting offices to issue accurate, timely and reliable warnings. Currently, no operational numerical forecasting system is able to respond to the societal demand for high-resolution (in time and space) predictions in the 12-72 h time span. The main reasons for this deficiency are the lack of adequate observations and the high non-linearity of the numerical models currently in use. The weather forecasting problem is intrinsically probabilistic, and current methods aim at coping with the various sources of uncertainty and the propagation of errors through the forecasting system. This probabilistic perspective is often realized by generating ensembles of deterministic predictions designed to sample the most important sources of uncertainty in the forecasting system. The generation/sampling strategy is a crucial aspect of ensemble performance, and various methods have been proposed. Although global forecasting offices have used ensembles of perturbed initial conditions for medium-range operational forecasts since 1994, no consensus exists on the optimum sampling strategy for high-resolution short-range ensemble forecasts. Bred vectors, however, have been hypothesized to capture the growing modes in the highly nonlinear mesoscale dynamics of severe episodes better than singular vectors or observation perturbations. Yet even this technique does not produce enough diversity in the ensembles to accurately and routinely predict extreme phenomena such as severe weather. We therefore propose a new method, based on the breeding technique, to generate ensembles of initial-condition perturbations.
Given a standard bred mode, a set of customized perturbations is derived with specified amplitudes and horizontal scales. This allows the ensemble to excite growing modes across a wider range of scales. Results show that this approach produces significantly more spread in the ensemble prediction than standard bred modes alone. Several examples illustrating the benefits of this approach for severe weather forecasts will be provided.
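The breeding technique underlying the proposed method can be illustrated on the Lorenz-63 system: a perturbed run is integrated alongside a control run, and the grown difference is repeatedly rescaled to a fixed amplitude, so the perturbation aligns with fast-growing modes. The model, time step, amplitude and cycle length are illustrative choices, and the final line only hints at the authors' customization of amplitudes (their horizontal-scale customization has no analogue in this three-variable toy).

```python
import numpy as np

def lorenz63(x, dt=0.005, steps=1):
    # forward-Euler integration of the Lorenz-63 system (toy "forecast model")
    for _ in range(steps):
        dxdt = np.array([10.0 * (x[1] - x[0]),
                         x[0] * (28.0 - x[2]) - x[1],
                         x[0] * x[1] - (8.0 / 3.0) * x[2]])
        x = x + dt * dxdt
    return x

rng = np.random.default_rng(2)
amp = 0.1                                                # breeding amplitude (assumed)
ctrl = lorenz63(np.array([1.0, 1.0, 1.05]), steps=2000)  # spin up onto the attractor
bred = amp * rng.normal(size=3)                          # initial random perturbation

for _ in range(50):                                      # breeding cycles
    ctrl_next = lorenz63(ctrl, steps=20)
    pert_next = lorenz63(ctrl + bred, steps=20)
    diff = pert_next - ctrl_next                         # grown perturbation
    bred = amp * diff / np.linalg.norm(diff)             # rescale to the fixed amplitude
    ctrl = ctrl_next

# 'customized' members: the same bred mode applied at several amplitudes (sketch of the idea)
members = [ctrl + s * bred for s in (0.5, 1.0, 2.0)]
```

Each rescaling keeps the perturbation small while preserving its orientation along the growing directions of the flow.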

  7. A Prototype Regional GSI-based EnKF-Variational Hybrid Data Assimilation System for the Rapid Refresh Forecasting System: Dual-Resolution Implementation and Testing Results

    NASA Astrophysics Data System (ADS)

    Pan, Yujie; Xue, Ming; Zhu, Kefeng; Wang, Mingjun

    2018-05-01

A dual-resolution (DR) version of a regional ensemble Kalman filter (EnKF)-3D ensemble variational (3DEnVar) coupled hybrid data assimilation system is implemented as a prototype for the operational Rapid Refresh forecasting system. The DR 3DEnVar system combines a high-resolution (HR) deterministic background forecast with lower-resolution (LR) EnKF ensemble perturbations, used for the flow-dependent background error covariance, to produce an HR analysis. The computational cost is substantially reduced by running the ensemble forecasts and EnKF analyses at LR. The DR 3DEnVar system is tested with 3-h cycles over a 9-day period using a 40-km/~13-km grid-spacing combination. The HR forecasts from the DR hybrid analyses are compared with forecasts launched from HR Gridpoint Statistical Interpolation (GSI) 3D variational (3DVar) analyses and from single LR hybrid analyses interpolated to the HR grid. With the DR 3DEnVar system, a 90% weight for the ensemble covariance yields the lowest forecast errors, and the DR hybrid system clearly outperforms the HR GSI 3DVar. Humidity and wind forecasts are also better than those launched from interpolated LR hybrid analyses, though the temperature forecasts are slightly worse; the humidity forecasts improve most. For precipitation forecasts, the DR 3DEnVar always outperforms the HR GSI 3DVar, and it outperforms the LR 3DEnVar except during the initial forecast period and at lower thresholds.
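The covariance weighting at the heart of a hybrid 3DEnVar system can be sketched as a linear blend of a static 3DVar covariance and a flow-dependent ensemble covariance; the 90% ensemble weight reported above corresponds to `beta = 0.9` below. The dimensions, static covariance and synthetic ensemble are placeholders, and operational GSI applies the weights through extended control variables rather than forming the matrix explicitly.

```python
import numpy as np

rng = np.random.default_rng(3)
n_state, n_ens = 50, 20
B_static = np.eye(n_state)                    # static (3DVar) background error covariance
X = rng.normal(size=(n_state, n_ens))
X -= X.mean(axis=1, keepdims=True)            # LR ensemble perturbations about their mean
P_ens = X @ X.T / (n_ens - 1)                 # flow-dependent ensemble covariance

beta = 0.9                                    # weight given to the ensemble covariance
B_hybrid = (1.0 - beta) * B_static + beta * P_ens
```

Because both terms are positive semi-definite and the static part is full rank, the blend stays a valid, invertible covariance even with a rank-deficient ensemble.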

  8. A Simple Ensemble Simulation Technique for Assessment of Future Variations in Specific High-Impact Weather Events

    NASA Astrophysics Data System (ADS)

    Taniguchi, Kenji

    2018-04-01

To investigate future variations in high-impact weather events, numerous samples are required; for a detailed assessment in a specific region, a high spatial resolution is also required. A simple ensemble simulation technique is proposed in this paper, in which new ensemble members are generated from one basic state vector and two perturbation vectors obtained by lagged average forecasting simulations. Sensitivity experiments with different numbers of ensemble members, different simulation lengths, and different perturbation magnitudes were performed, and the technique was also applied to a global warming study of a typhoon event. Ensemble-mean results and ensemble spreads of total precipitation and atmospheric conditions showed similar characteristics across the sensitivity experiments, and the frequencies of the maximum total and hourly precipitation showed similar distributions; these results indicate the robustness of the proposed technique. On the other hand, considerable ensemble spread was found in each ensemble experiment, and the application to the global warming study showed possible future variations. These results indicate that the proposed technique is useful for investigating various meteorological phenomena and the impacts of global warming; the ensemble simulations also enable a stochastic evaluation of differences in high-impact weather events. In addition, the impact of a spectral nudging technique was examined. The tracks of a typhoon were quite different between cases with and without spectral nudging, but the ranges of the tracks among ensemble members were comparable, indicating that spectral nudging does not necessarily suppress ensemble spread.
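The member-generation idea above (one basic state vector plus two perturbation vectors) can be sketched by combining the two perturbations with weights on the unit circle, so every member carries a comparable perturbation magnitude. The weighting scheme, vector sizes and data are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(4)
nx = 1000
base = rng.normal(size=nx)                   # basic state vector (synthetic)
p1 = 0.1 * rng.normal(size=nx)               # perturbation from lagged forecast 1 (synthetic)
p2 = 0.1 * rng.normal(size=nx)               # perturbation from lagged forecast 2 (synthetic)

def make_members(base, p1, p2, n_members, scale=1.0):
    # combine the two perturbation vectors with weights on the unit circle
    # (an assumed weighting, chosen so members have comparable perturbation size)
    angles = np.linspace(0.0, 2.0 * np.pi, n_members, endpoint=False)
    return [base + scale * (np.cos(a) * p1 + np.sin(a) * p2) for a in angles]

members = make_members(base, p1, p2, 8)
```

With evenly spaced angles the perturbations cancel on average, so the ensemble mean stays at the basic state.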

  9. In the eye of the beholder: Inhomogeneous distribution of high-resolution shapes within the random-walk ensemble

    NASA Astrophysics Data System (ADS)

    Müller, Christian L.; Sbalzarini, Ivo F.; van Gunsteren, Wilfred F.; Žagrović, Bojan; Hünenberger, Philippe H.

    2009-06-01

The concept of high-resolution shapes (also referred to as folds or states, depending on the context) of a polymer chain plays a central role in polymer science, structural biology, bioinformatics, and biopolymer dynamics. However, although the idea of shape is intuitively very useful, there is no unambiguous mathematical definition for this concept. In the present work, the distributions of high-resolution shapes within the ideal random-walk ensembles with N = 3,…,6 beads (or up to N = 10 for some properties) are investigated using a systematic (grid-based) approach, based on a simple working definition of shapes that relies on the root-mean-square atomic positional deviation as a metric (i.e., to define the distance between pairs of structures) and a single cutoff criterion for the shape assignment. Although the random-walk ensemble might appear to be the epitome of homogeneity and randomness, this analysis reveals that the distribution of shapes within this ensemble, i.e., in the total absence of the interatomic interactions characteristic of a specific polymer (beyond the generic connectivity constraint), is significantly inhomogeneous. In particular, a specific (densest) shape occurs with a local probability that is 1.28, 1.79, 2.94, and 10.05 times (N = 3,…,6) higher than the corresponding average over all possible shapes (these results can tentatively be extrapolated to a factor as large as about 10^28 for N = 100). The qualitative results of this analysis lead to a few rather counterintuitive suggestions, namely that, e.g., (i) a fold classification analysis applied to the random-walk ensemble would lead to the identification of random-walk "folds"; (ii) a clustering analysis applied to the random-walk ensemble would also lead to the identification of random-walk "states" and associated relative free energies; and (iii) a random-walk ensemble of polymer chains could lead to well-defined diffraction patterns in hypothetical fiber or crystal diffraction experiments.
The inhomogeneous nature of the shape probability distribution identified here for random walks may represent a significant underlying baseline effect in the analysis of real polymer chain ensembles (i.e., in the presence of specific interatomic interactions). As a consequence, a part of what is called a polymer shape may actually reside just "in the eye of the beholder" rather than in the nature of the interactions between the constituting atoms, and the corresponding observation-related bias should be taken into account when drawing conclusions from shape analyses as applied to real structural ensembles.
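The working definition above (an RMSD metric plus a single cutoff for shape assignment) can be sketched with a greedy leader-clustering pass over sampled unit-step random walks. The cutoff value, the absence of optimal superposition, and the greedy assignment are simplifying assumptions relative to the paper's systematic grid-based approach.

```python
import numpy as np

def rmsd(a, b):
    # root-mean-square positional deviation between two bead-coordinate arrays
    # (no rotational/translational superposition, a simplification of the paper's metric)
    return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))

def assign_shapes(walks, cutoff):
    # greedy leader clustering: each walk joins the first shape representative
    # within the cutoff, otherwise it founds a new shape
    reps, labels = [], []
    for w in walks:
        for i, r in enumerate(reps):
            if rmsd(w, r) < cutoff:
                labels.append(i)
                break
        else:
            reps.append(w)
            labels.append(len(reps) - 1)
    return labels, reps

rng = np.random.default_rng(5)
N = 5                                        # beads per random walk
steps = rng.normal(size=(200, N - 1, 3))
steps /= np.linalg.norm(steps, axis=2, keepdims=True)        # unit-length steps
walks = np.concatenate([np.zeros((200, 1, 3)),
                        np.cumsum(steps, axis=1)], axis=1)   # bead positions from the origin
labels, reps = assign_shapes(walks, cutoff=1.0)
```

Counting how many walks fall into each shape then gives the (inhomogeneous) shape occupancy the paper analyzes.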


  11. The added value of stochastic spatial disaggregation for short-term rainfall forecasts currently available in Canada

    NASA Astrophysics Data System (ADS)

    Gagnon, Patrick; Rousseau, Alain N.; Charron, Dominique; Fortin, Vincent; Audet, René

    2017-11-01

Several businesses and industries rely on rainfall forecasts to support their day-to-day operations. To deal with the uncertainty associated with rainfall forecasts, some meteorological organisations have developed products such as ensemble forecasts. However, due to the intensive computational requirements of ensemble forecasting, the spatial resolution remains coarse. For example, Environment and Climate Change Canada's (ECCC) Global Ensemble Prediction System (GEPS) data are freely available on a 1-degree grid (about 100 km), while those of the High Resolution Deterministic Prediction System (HRDPS) are available on a 2.5-km grid (about 40 times finer). Potential users are thus left with the choice between a high-resolution rainfall forecast without uncertainty estimation and an ensemble spanning a spectrum of plausible rainfall values at a coarser spatial scale. The objective of this study was to evaluate the added value of coupling the Gibbs Sampling Disaggregation Model (GSDM) with ECCC products to provide accurate, precise and consistent rainfall estimates at a fine spatial resolution (10 km) within a forecast framework (6 h). For thirty 6-h rainfall events occurring within a 40,000-km2 area (Québec, Canada), results show that, using 100-km aggregated reference rainfall depths as input, the statistics of the rainfall fields generated by GSDM were close to those of the 10-km reference field. In forecast mode, however, GSDM outcomes inherit the ECCC forecast biases, resulting in poor performance when GEPS data were used as input, mainly due to the rainfall depth distribution inherent in that product. Better performance was achieved when the Regional Deterministic Prediction System (RDPS), available on a 10-km grid and aggregated to 100 km, was used as input to GSDM. Nevertheless, most of the analyzed ensemble forecasts were only weakly consistent. Some areas of improvement are identified herein.
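A key constraint in any spatial disaggregation scheme, including GSDM, is that the fine-scale field must aggregate back to the coarse value. A minimal stochastic sketch of that constraint follows; it is not GSDM's Gibbs sampler, and the weight distribution is an assumption with no spatial correlation.

```python
import numpy as np

rng = np.random.default_rng(6)
coarse_depth = 12.0                             # 6-h rainfall depth on one 100-km cell (mm)
n = 10                                          # 10 x 10 fine cells of 10 km
weights = rng.gamma(0.8, 1.0, size=(n, n))      # positive random weights (assumed intermittency)
fine = coarse_depth * weights / weights.mean()  # rescale so the coarse-cell mean is conserved
```

GSDM instead samples the fine field from a conditional distribution honoring spatial correlation; the rescaling line shows only the aggregation constraint that any such sample must satisfy.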

  12. Assimilation of glider and mooring data into a coastal ocean model

    NASA Astrophysics Data System (ADS)

    Jones, Emlyn M.; Oke, Peter R.; Rizwi, Farhan; Murray, Lawrence M.

We have applied an ensemble optimal interpolation (EnOI) data assimilation system to a high-resolution coastal ocean model of south-east Tasmania, Australia. The region is characterised by a complex coastline with water masses influenced by riverine input and the interaction between two offshore current systems. Using a large static ensemble to estimate the system's background error covariance, data from a coastal observing network of fixed moorings and a Slocum glider are assimilated into the model at daily intervals. We demonstrate that the EnOI algorithm can successfully correct a biased high-resolution coastal model. In areas with dense observations, the assimilation scheme reduces the RMS difference between the model and independent GHRSST observations by 90%, while the domain-wide RMS difference is reduced by a more modest 40%. Our findings show that errors introduced by surface forcing and boundary conditions can be identified and reduced by a relatively sparse observing array using an inexpensive ensemble-based data assimilation system.
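The EnOI update can be sketched in a few lines: a single background state is corrected with a gain built from a static ensemble of model anomalies, scaled by a factor `alpha` that accounts for the static ensemble overestimating the instantaneous error. The dimensions, observation operator, data and `alpha` are illustrative assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_state, n_ens, n_obs = 200, 60, 15

# static ensemble of model anomalies (e.g. drawn from a long free model run)
A = rng.normal(size=(n_state, n_ens))
A -= A.mean(axis=1, keepdims=True)

# observe 15 state elements directly (mooring/glider sampling, simplified)
idx = np.arange(0, n_state, n_state // n_obs)[:n_obs]
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), idx] = 1.0

x_b = rng.normal(size=n_state)               # biased background state
y = H @ x_b + 1.0                            # observations sit 1 unit above the background
R = 0.1 * np.eye(n_obs)                      # observation error covariance

alpha = 0.5                                  # scaling of the static covariance (assumed)
HA = H @ A
S = alpha * HA @ HA.T / (n_ens - 1) + R      # innovation covariance
K = alpha * A @ HA.T @ np.linalg.inv(S) / (n_ens - 1)
x_a = x_b + K @ (y - H @ x_b)                # EnOI analysis
```

Unlike the EnKF, the ensemble itself is never updated, which is what makes EnOI inexpensive.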

  13. A new aircraft hurricane wind climatology and applications in assessing the predictive skill of tropical cyclone intensity using high-resolution ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Judt, Falko; Chen, Shuyi S.

    2015-07-01

Hurricane surface wind is a key measure of storm intensity. However, a climatology of hurricane winds has been lacking to date, largely because hurricanes are relatively rare events that are difficult to observe over the open ocean. Here we present a new hurricane wind climatology based on objective surface wind analyses derived from Stepped Frequency Microwave Radiometer measurements acquired by NOAA WP-3D and U.S. Air Force WC-130J hurricane hunter aircraft. The wind data were collected during 72 aircraft reconnaissance missions into 21 western Atlantic hurricanes from 1998 to 2012. This climatology provides an opportunity to validate hurricane intensity forecasts beyond the simplistic maximum-wind-speed metric and allows evaluation of the predictive skill of probabilistic hurricane intensity forecasts using high-resolution model ensembles. An example application is presented using a 1.3-km grid spacing Weather Research and Forecasting model ensemble forecast of Hurricane Earl (2010).

  14. A 12-year (1987-1998) Ensemble Simulation of the US Climate with a Variable Resolution Stretched Grid GCM

    NASA Technical Reports Server (NTRS)

    Fox-Rabinovitz, Michael S.; Takacs, Lawrence L.; Govindaraju, Ravi C.

    2002-01-01

    The variable-resolution stretched-grid (SG) GEOS (Goddard Earth Observing System) GCM has been used for limited ensemble integrations with a relatively coarse, 60 to 100 km, regional resolution over the U.S. The experiments were run for the 12-year period 1987-1998, which includes the recent ENSO cycles. Initial conditions 1-2 days apart are used for the ensemble members. The goal of the experiments is to analyze the long-term SG-GCM ensemble integrations in terms of their potential for reducing the uncertainties of regional climate simulation while producing realistic mesoscales. The ensemble integration results are analyzed for both prognostic and diagnostic fields. Special attention is devoted to analyzing the variability of precipitation over the U.S. The internal variability of the SG-GCM has been assessed. The ensemble means appear to be closer to the verifying analyses than the individual ensemble members, and they capture realistic mesoscale patterns, especially those induced by orography. Two ENSO cycles have been analyzed in terms of their impact on the U.S. climate, especially on precipitation. The ability of the SG-GCM simulations to produce regional climate anomalies has been confirmed. However, the optimal ensemble size, which depends on the fine regional resolution used, is still to be determined. The SG-GCM ensemble simulations are performed as a preparatory stage for the international SGMIP (Stretched-Grid Model Intercomparison Project), which is under way with the participation of the major centers and groups employing the SG approach for regional climate modeling.

  15. On the incidence of meteorological and hydrological processors: Effect of resolution, sharpness and reliability of hydrological ensemble forecasts

    NASA Astrophysics Data System (ADS)

    Abaza, Mabrouk; Anctil, François; Fortin, Vincent; Perreault, Luc

    2017-12-01

    Meteorological and hydrological ensemble prediction systems are imperfect. Their outputs could often be improved through the use of a statistical processor, opening up the question of the necessity of using both processors (meteorological and hydrological), only one of them, or none. This experiment compares the predictive distributions from four hydrological ensemble prediction systems (H-EPS) utilising the Ensemble Kalman filter (EnKF) probabilistic sequential data assimilation scheme. They differ in the inclusion or not of the Distribution Based Scaling (DBS) method for post-processing meteorological forecasts and the ensemble Bayesian Model Averaging (ensemble BMA) method for hydrological forecast post-processing. The experiment is implemented on three large watersheds and relies on the combination of two meteorological reforecast products: the 4-member Canadian reforecasts from the Canadian Centre for Meteorological and Environmental Prediction (CCMEP) and the 10-member American reforecasts from the National Oceanic and Atmospheric Administration (NOAA), leading to 14 members at each time step. Results show that all four tested H-EPS lead to resolution and sharpness values that are quite similar, with an advantage to DBS + EnKF. The ensemble BMA is unable to compensate for any bias left in the precipitation ensemble forecasts. On the other hand, it succeeds in calibrating ensemble members that are otherwise under-dispersed. If reliability is preferred over resolution and sharpness, DBS + EnKF + ensemble BMA performs best, making use of both processors in the H-EPS system. Conversely, for enhanced resolution and sharpness, DBS is the preferred method.
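
    The ensemble BMA post-processor mentioned above forms a predictive distribution as a weighted mixture of kernels, one per (bias-corrected) ensemble member. Operational ensemble BMA estimates the weights and kernel spread by maximum likelihood (EM), and hydrological applications often use gamma kernels; the sketch below is a simplified illustration with fixed weights and normal kernels, not the authors' implementation.

```python
import math

def bma_predictive_cdf(y, members, weights, sigma):
    """Ensemble-BMA predictive CDF: a weighted mixture of normal kernels,
    each centred on one ensemble member forecast."""
    def phi(z):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sum(w * phi((y - f) / sigma) for w, f in zip(weights, members))

def bma_mean(members, weights):
    """Mean of the BMA predictive mixture."""
    return sum(w * f for w, f in zip(weights, members))

# Toy 3-member forecast with fixed weights and spread (made-up values).
members = [1.0, 2.0, 3.0]
weights = [0.2, 0.5, 0.3]
mean = bma_mean(members, weights)
```

Because the mixture spread comes from the kernels as well as the member locations, BMA can widen an under-dispersed ensemble — the calibration effect the abstract reports.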

  16. Estimation of wind regime from combination of RCM and NWP data in the Gulf of Riga (Baltic Sea)

    NASA Astrophysics Data System (ADS)

    Sile, T.; Sennikovs, J.; Bethers, U.

    2012-04-01

    The Gulf of Riga is a semi-enclosed gulf located in the eastern part of the Baltic Sea. Reliable wind climate data are crucial for the development of wind energy. The objective of this study is to create high-resolution wind parameter datasets for the Gulf of Riga using climate and numerical weather prediction (NWP) models as an alternative to methods that rely on observations, with the added benefit of comparing different approaches. The models used for the estimation of the wind regime are an ensemble of Regional Climate Models (RCM, ENSEMBLES; 23 runs are considered) and high-resolution NWP data. Future projections provided by RCMs are of interest; however, their spatial resolution is unsatisfactory. We describe a method of spatial refinement of RCM data using NWP data to resolve small-scale features. We apply the method of RCM bias correction (Sennikovs and Bethers, 2009), previously used for temperature and precipitation, to wind data, using NWP data instead of observations. The refinement function is calculated for the contemporary climate (1981-2010) and later applied to RCM near-future (2021-2050) projections to produce a dataset with the same resolution as the NWP data. This method corrects for RCM biases that were shown to be present in the initial analysis, and an inter-model statistical analysis was carried out to estimate uncertainty. Using the datasets produced by this method, the current and future projections of wind speed and wind energy density are calculated. Acknowledgments: This research is part of the GORWIND (The Gulf of Riga as a Resource for Wind Energy) project (EU34711). The ENSEMBLES data used in this work was funded by the EU FP6 Integrated Project ENSEMBLES (Contract number 505539), whose support is gratefully acknowledged.
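
    A common way to implement the kind of RCM-to-NWP refinement described above is empirical quantile mapping: each RCM value is mapped through its quantile in the contemporary-climate RCM distribution to the NWP value at the same quantile. The sketch below is a generic stand-in under that assumption, not the actual Sennikovs and Bethers (2009) formulation.

```python
import numpy as np

def quantile_map(rcm_hist, nwp_ref, rcm_future):
    """Empirical quantile mapping: learn a correction from contemporary-climate
    RCM values to the higher-resolution NWP reference, then apply it to future
    RCM values."""
    q = np.linspace(0.0, 1.0, 101)
    rcm_q = np.quantile(rcm_hist, q)   # RCM quantile function
    nwp_q = np.quantile(nwp_ref, q)    # NWP quantile function
    # value -> its RCM quantile -> NWP value at that quantile
    return np.interp(rcm_future, rcm_q, nwp_q)

# Toy check: if the reference is a 20% amplification of the RCM winds,
# the mapping should amplify future values by the same factor.
rcm_hist = np.linspace(1.0, 10.0, 100)
nwp_ref = 1.2 * rcm_hist
corrected = quantile_map(rcm_hist, nwp_ref, np.array([5.0]))
```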

  17. National Centers for Environmental Prediction

    Science.gov Websites

    NAM Specifications/References · Rapid Refresh (RAP) · High-Resolution Rapid Refresh (HRRR) · Short-range Ensemble Forecast (SREF) system

  18. Robust isotropic super-resolution by maximizing a Laplace posterior for MRI volumes

    NASA Astrophysics Data System (ADS)

    Han, Xian-Hua; Iwamoto, Yutaro; Shiino, Akihiko; Chen, Yen-Wei

    2014-03-01

    Magnetic resonance imaging can only acquire volume data with finite resolution due to various factors. In particular, the resolution in one direction (such as the slice direction) is much lower than in the others (such as the in-plane directions), yielding unrealistic visualizations. This study explores reconstructing isotropic-resolution MRI volumes from three orthogonal scans. The proposed super-resolution reconstruction is formulated as a maximum a posteriori (MAP) problem, which relies on the generation model of the acquired scans from the unknown high-resolution volume. Generally, the ensemble of deviations of the reconstructed high-resolution (HR) volume from the available low-resolution (LR) ones is represented in the MAP formulation as a Gaussian distribution, which usually results in noise and artifacts in the reconstructed HR volume. This paper therefore investigates a robust super-resolution approach that formulates the deviation set as a Laplace distribution, which assumes sparsity in the deviation ensemble based on the insight that large deviations appear only around some unexpected regions. In addition, in order to achieve a reliable HR MRI volume, we integrate priors such as bilateral total variation (BTV) and non-local means (NLM) into the proposed MAP framework to suppress artifacts and enrich visual detail. We validate the proposed robust SR strategy using MRI mouse data with high resolution in two directions and low resolution in the third, imaged in three orthogonal scans: axial, coronal and sagittal planes. Experiments verify that the proposed strategy achieves much better HR MRI volumes than the conventional MAP method, even with a very high magnification factor of 10.
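
    The key point of the Laplace posterior is that an L1 data term is robust to outlying deviations where an L2 (Gaussian) term is not. A standard way to minimise an L1 data term is iteratively reweighted least squares (IRLS); the tiny sketch below demonstrates the robustness effect on a scalar problem. It is an illustration of the principle only, not the paper's volumetric reconstruction.

```python
import numpy as np

def irls_l1(A, b, n_iter=50, eps=1e-6):
    """Minimise ||A x - b||_1 by iteratively reweighted least squares (IRLS).
    The L1 (Laplace) data term down-weights large residuals, which is the
    robustness argument behind the Laplace-posterior formulation."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]       # start from the L2 solution
    for _ in range(n_iter):
        r = A @ x - b
        w = 1.0 / np.maximum(np.abs(r), eps)       # large residual -> small weight
        Aw = A * w[:, None]                        # row-weighted design matrix
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)    # weighted normal equations
    return x

# Repeated measurements of one value with a gross outlier: the L1 fit lands
# near the median (1.0), while the L2 fit gives the mean (2.8).
A = np.ones((5, 1))
b = np.array([1.0, 1.0, 1.0, 1.0, 10.0])
x = irls_l1(A, b)
```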

  19. Application of the LEPS technique for Quantitative Precipitation Forecasting (QPF) in Southern Italy: a preliminary study

    NASA Astrophysics Data System (ADS)

    Federico, S.; Avolio, E.; Bellecci, C.; Colacino, M.; Walko, R. L.

    2006-03-01

    This paper reports preliminary results for a Limited-area model Ensemble Prediction System (LEPS), based on RAMS (Regional Atmospheric Modelling System), for eight case studies of moderate-to-intense precipitation over Calabria, the southernmost tip of the Italian peninsula. LEPS aims to transfer the benefits of a probabilistic forecast from global to regional scales in countries where local orographic forcing is a key factor in triggering convection. To accomplish this task and to limit computational time in an operational implementation of LEPS, we perform a cluster analysis of ECMWF-EPS runs. Starting from the 51 members that form the ECMWF-EPS, we generate five clusters. For each cluster a representative member is selected and used to provide initial and dynamic boundary conditions to RAMS, whose integrations generate LEPS. RAMS runs have 12-km horizontal resolution. To analyze the impact of enhanced horizontal resolution on quantitative precipitation forecasts, LEPS forecasts are compared to a full Brute Force (BF) ensemble. This ensemble is based on RAMS, has 36-km horizontal resolution, and is generated by 51 members, each nested in one ECMWF-EPS member. LEPS and BF results are compared subjectively and by objective scores. Subjective analysis is based on precipitation and probability maps of the case studies, whereas objective analysis uses deterministic and probabilistic scores. Scores and maps are calculated by comparing ensemble precipitation forecasts against reports from the Calabria regional raingauge network. Results show that LEPS provided better rainfall predictions than BF for all selected case studies. This strongly suggests that, for these cases over Calabria, enhanced horizontal resolution is more important than ensemble population. To further explore the impact of local physiographic features on QPF (Quantitative Precipitation Forecasting), LEPS results are also compared with a 6-km horizontal resolution deterministic forecast. 
    Due to local and mesoscale forcing, the high-resolution forecast (Hi-Res) performs better than the ensemble mean for rainfall thresholds larger than 10 mm, but it tends to overestimate precipitation for lower amounts. This yields more false alarms, which have a detrimental effect on objective scores at lower thresholds. To exploit the advantages of a probabilistic forecast over a deterministic one, the relation between the ECMWF-EPS 700 hPa geopotential height spread and LEPS performance is analyzed. Results are promising, even if additional studies are required.
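
    The cluster-then-select step described above (reduce 51 EPS members to five representatives) can be sketched with a small k-means: cluster the member fields, then pick the member nearest each centroid as the cluster's representative. The clustering variables and method used operationally by the authors are not specified here, so this is a generic sketch with a naive stride initialisation.

```python
import numpy as np

def representative_members(members, k=5, n_iter=20):
    """Cluster ensemble members (rows = members, columns = flattened fields)
    with a tiny k-means and return, for each cluster, the index of the member
    closest to the centroid. Initialisation is a naive stride through rows."""
    X = np.asarray(members, dtype=float)
    centroids = X[:: max(1, len(X) // k)][:k].copy()
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = np.argmin(d2, axis=1)             # assign members to clusters
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    # representative = member nearest to each final centroid
    return sorted({int(np.argmin(((X - c) ** 2).sum(axis=1))) for c in centroids})

# Toy demo: 15 "members" in three well-separated groups of five.
rng = np.random.default_rng(2)
pts = np.concatenate([rng.normal(loc, 0.1, (5, 2))
                      for loc in ([0.0, 0.0], [10.0, 10.0], [-10.0, 10.0])])
reps = representative_members(pts, k=3)
```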

  20. Evolution of Precipitation Extremes in Three Large Ensembles of Climate Simulations - Impact of Spatial and Temporal Resolutions

    NASA Astrophysics Data System (ADS)

    Martel, J. L.; Brissette, F.; Mailhot, A.; Wood, R. R.; Ludwig, R.; Frigon, A.; Leduc, M.; Turcotte, R.

    2017-12-01

    Recent studies indicate that the frequency and intensity of extreme precipitation will increase in a future climate due to global warming. In this study, we compare annual maxima precipitation series from three large ensembles of climate simulations at various spatial and temporal resolutions. The first two are at the global scale: the Canadian Earth System Model (CanESM2) 50-member large ensemble (CanESM2-LE) at a 2.8° resolution and the Community Earth System Model (CESM1) 40-member large ensemble (CESM1-LE) at a 1° resolution. The third ensemble is at the regional scale over both Eastern North America and Europe: the Canadian Regional Climate Model (CRCM5) 50-member large ensemble (CRCM5-LE) at a 0.11° resolution, driven at its boundaries by the CanESM2-LE. The CRCM5-LE is a new ensemble issued from the ClimEx project (http://www.climex-project.org), a Québec-Bavaria collaboration. Using these three large ensembles, changes in extreme precipitation between the historical (1980-2010) and future (2070-2100) periods are investigated. This results in 1500 (30 years x 50 members for CanESM2-LE and CRCM5-LE) and 1200 (30 years x 40 members for CESM1-LE) simulated years over each of the historical and future periods. Using these large datasets, the empirical daily (and, for CRCM5-LE, sub-daily) extreme precipitation quantiles for return periods ranging from 2 to 100 years are computed. Results indicate that daily extreme precipitation will generally increase over most land grid points of both domains according to the three large ensembles. For the CRCM5-LE, the increase in sub-daily extreme precipitation is even larger than that in daily extreme precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for the service levels of water infrastructures and for public safety.
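
    The empirical return-level computation that the large ensembles make possible is simple: pool annual maxima across members and years, then read the empirical quantile at probability 1 − 1/T for a T-year return period. The sketch below assumes this pooling approach; the stand-in data are made up.

```python
import numpy as np

def empirical_return_levels(annual_maxima, return_periods):
    """Pool annual maxima across members and years and read the empirical
    quantile at probability 1 - 1/T for each return period T."""
    pooled = np.ravel(np.asarray(annual_maxima, dtype=float))
    probs = [1.0 - 1.0 / T for T in return_periods]
    return dict(zip(return_periods, np.quantile(pooled, probs)))

# 1500 pooled simulated years (e.g. 30 years x 50 members); stand-in values.
pooled_years = np.arange(1.0, 1501.0)
levels = empirical_return_levels(pooled_years, [2, 100])
```

With 1500 pooled years, even the 100-year level is estimated from well inside the sample rather than by extrapolating a fitted distribution tail — the main advantage of single-model large ensembles for extremes.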

  1. Modulating RNA Alignment Using Directional Dynamic Kinks: Application in Determining an Atomic-Resolution Ensemble for a Hairpin using NMR Residual Dipolar Couplings.

    PubMed

    Salmon, Loïc; Giambaşu, George M; Nikolova, Evgenia N; Petzold, Katja; Bhattacharya, Akash; Case, David A; Al-Hashimi, Hashim M

    2015-10-14

    Approaches that combine experimental data and computational molecular dynamics (MD) to determine atomic-resolution ensembles of biomolecules require the measurement of abundant experimental data. NMR residual dipolar couplings (RDCs) carry rich dynamics information; however, difficulties in modulating the overall alignment of nucleic acids have limited the ability to fully extract this information. We present a strategy for modulating RNA alignment that is based on introducing variable dynamic kinks in terminal helices. With this strategy, we measured seven sets of RDCs in a cUUCGg apical loop and used this rich data set to test the accuracy of a 0.8 μs MD simulation computed using the Amber ff10 force field, as well as to determine an atomic-resolution ensemble. The MD-generated ensemble quantitatively reproduces the measured RDCs, but selection of a sub-ensemble was required to satisfy the RDCs within error. The largest discrepancies between the RDC-selected and MD-generated ensembles are observed for the most flexible loop residues and the backbone angles connecting the loop to the helix, with the RDC-selected ensemble exhibiting more uniform dynamics. Comparison of the RDC-selected ensemble with NMR spin relaxation data suggests that the dynamics occur on the ps-ns time scales, as verified by measurements of R(1ρ) relaxation-dispersion data. The RDC-satisfying ensemble samples many conformations adopted by the hairpin in crystal structures, indicating that intrinsic plasticity may play important roles in conformational adaptation. The approach presented here can be applied to test nucleic acid force fields and to characterize dynamics in diverse RNA motifs at atomic resolution.

  2. Using ensembles in water management: forecasting dry and wet episodes

    NASA Astrophysics Data System (ADS)

    van het Schip-Haverkamp, Tessa; van den Berg, Wim; van de Beek, Remco

    2015-04-01

    Extreme weather situations such as droughts and extensive precipitation are becoming more frequent, which makes it more important to obtain accurate weather forecasts for the short and long term. Ensembles can provide a solution in the form of scenario forecasts. MeteoGroup uses ensembles in a new forecasting technique that presents a number of weather scenarios for a dynamical water management project, called Water-Rijk, in which water storage and water retention play a large role. The Water-Rijk is part of Park Lingezegen, which is located between Arnhem and Nijmegen in the Netherlands. In collaboration with the University of Wageningen, Alterra and Eijkelkamp, a forecasting system has been developed for this area which can provide water boards with a number of weather and hydrology scenarios in order to assist in the decision whether or not water retention or water storage is necessary in the near future. In order to forecast drought and extensive precipitation, the difference 'precipitation - evaporation' is used as a measure of drought in the weather forecasts. In case of an upcoming drought this difference will take larger negative values; in case of a wet episode, it will be positive. The Makkink potential evaporation is used, which gives the most accurate potential evaporation values during the summer, when evaporation plays an important role in the availability of surface water. Scenarios are determined by reducing the large number of forecasts in the ensemble to a number of averaged members, each with its own likelihood of occurrence. For the Water-Rijk project five scenario forecasts are calculated: extreme dry, dry, normal, wet and extreme wet. These scenarios are constructed for two forecasting periods, each using its own ensemble technique: up to 48 hours ahead and up to 15 days ahead. 
    The 48-hour forecast uses an ensemble constructed from forecasts of multiple high-resolution regional models: UKMO's Euro4 model, the ECMWF model, WRF and Hirlam. Using multiple model runs and additional post-processing, an ensemble can be created from non-ensemble models. The 15-day forecast uses the ECMWF Ensemble Prediction System forecast, from which scenarios can be deduced directly. A combination of the ensembles from the two forecasting periods is used in order to have the highest possible resolution for the first 48 hours, followed by the lower-resolution long-term forecast.
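
    One simple way to reduce an ensemble of P − E values to five averaged scenarios with likelihoods, as described above, is to sort the members and group them into equal-probability bins, averaging within each bin. The exact reduction technique MeteoGroup uses is not given in the abstract, so the sketch below is an illustrative assumption.

```python
import numpy as np

def scenario_forecasts(p_minus_e, n_scenarios=5):
    """Reduce an ensemble of accumulated 'precipitation - evaporation' values
    to a few averaged scenarios with likelihoods by grouping the sorted members
    into equal-count bins (extreme dry, dry, normal, wet, extreme wet)."""
    x = np.sort(np.asarray(p_minus_e, dtype=float))
    bins = np.array_split(x, n_scenarios)          # dry -> wet groups
    return [(float(b.mean()), len(b) / len(x)) for b in bins]

# Toy 10-member ensemble of P - E values (mm): negative = dry, positive = wet.
scenarios = scenario_forecasts(np.arange(-5.0, 5.0))
```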

  3. Evolution of extreme temperature events in short term climate projection for Iberian Peninsula.

    NASA Astrophysics Data System (ADS)

    Rodriguez, Alfredo; Tarquis, Ana M.; Sanchez, Enrique; Dosio, Alessandro; Ruiz-Ramos, Margarita

    2014-05-01

    Extreme events of maximum and minimum temperatures are a main hazard for agricultural production in the Iberian Peninsula. In this study we analyze projections of their evolution that could be valid for the next decade, represented here by the 30-year period 2004-2034 (target period). Two kinds of data were used: 1) observations from the station network of AEMET (Spanish National Meteorological Agency) for five Spanish locations, and 2) simulated data at a resolution of a 50 × 50 km horizontal grid derived from the outputs of twelve Regional Climate Models (RCMs) taken from the project ENSEMBLES (van der Linden and Mitchell, 2009), with a bias correction (Dosio and Paruolo, 2011; Dosio et al., 2012) relative to the observational dataset Spain02 (Herrera et al., 2012). To validate the simulated climate, the available period of observations was compared to a baseline period (1964-1994) of simulated climate for all locations. Then, to analyze the changes for the present/very near future, probabilities of extreme temperature events for 2004-2034 were compared to those of the baseline period. Although only minor changes are expected, small variations in variability may have a significant impact on crop performance. The objective of the work is to evaluate the utility of these short-term projections for potential users, such as insurance companies. References: Dosio A. and Paruolo P., 2011. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Evaluation on the present climate. Journal of Geophysical Research, Vol. 116, D16106, doi:10.1029/2011JD015934. Dosio A., Paruolo P. and Rojas R., 2012. Bias correction of the ENSEMBLES high resolution climate change projections for use by impact models: Analysis of the climate change signal. Journal of Geophysical Research, Vol. 117, D17, doi:10.1029/2012JD017968. Herrera et al. (2012) Development and Analysis of a 50 year high-resolution daily gridded precipitation dataset over Spain (Spain02). International Journal of Climatology 32:74-85, doi:10.1002/joc.2256. van der Linden, P., and J. F. B. Mitchell (Eds.) (2009), ENSEMBLES: Climate Change and Its Impacts: Summary of Research and Results From the ENSEMBLES Project, Met Off. Hadley Cent., Exeter, U.K.

  4. A stochastic ensemble-based model to predict crop water requirements from numerical weather forecasts and VIS-NIR high resolution satellite images in Southern Italy

    NASA Astrophysics Data System (ADS)

    Pelosi, Anna; Falanga Bolognesi, Salvatore; De Michele, Carlo; Medina Gonzalez, Hanoi; Villani, Paolo; D'Urso, Guido; Battista Chirico, Giovanni

    2015-04-01

    Irrigation agriculture is one of the biggest consumers of water in Europe, especially in southern regions, where it accounts for up to 70% of the total water consumption. The EU Common Agricultural Policy, combined with the Water Framework Directive, requires farmers and irrigation managers to substantially increase the efficiency of water use in agriculture over the next decade. Ensemble numerical weather predictions can be valuable data for developing operational advisory irrigation services. We propose a stochastic ensemble-based model providing spatial and temporal estimates of crop water requirements, implemented within an advisory service offering detailed maps of irrigation water requirements and crop water consumption estimates, to be used by water irrigation managers and farmers. The stochastic model combines estimates of crop potential evapotranspiration retrieved from ensemble numerical weather forecasts (COSMO-LEPS, 16 members, 7 km resolution) and canopy parameters (LAI, albedo, fractional vegetation cover) derived from high-resolution satellite images in the visible and near-infrared wavelengths. The service provides users with daily estimates of crop water requirements for lead times up to five days. The temporal evolution of the crop potential evapotranspiration is simulated with autoregressive models. An ensemble Kalman filter is employed for updating model states by assimilating both ground-based meteorological variables (where available) and numerical weather forecasts. The model has been applied in the Campania region (Southern Italy), where a satellite-assisted irrigation advisory service has been operating since 2006. This work presents the results of the system performance for one year of experimental service. The results suggest that the proposed model can be an effective support for a sustainable use and management of irrigation water under conditions of water scarcity and drought. Since the evapotranspiration term is a major component of the water balance of a catchment, a promising future development is that the model could also offer advanced support for water resources management decisions at the catchment scale.

  5. Non-covalent nanodiamond-polymer dispersions and electrostatic immobilization of bovine serum albumin protein

    NASA Astrophysics Data System (ADS)

    Skaltsas, T.; Pispas, S.; Tagmatarchis, N.

    2015-11-01

    Nanodiamonds (NDs) lack efficient dispersion, not only in solvents but also in aqueous media. The latter is of great importance, considering the inherent biocompatibility of NDs and the plethora of suitable strategies for immobilizing functional biomolecules. In this work, a series of polymers was non-covalently associated with NDs, forming ND-polymer ensembles, and their dispersibility and stability were examined. Dynamic light scattering gave valuable information on the size of the ensembles in the liquid phase, while their morphology was further examined by high-resolution transmission electron microscopy imaging. In addition, thermal analysis measurements were used to collect information on the thermal behavior of NDs and their ensembles and to calculate the amount of polymer interacting with the NDs, as well as the dispersibility values of the ND-polymer ensembles. Finally, bovine serum albumin protein was electrostatically bound to a ND-polymer ensemble in which the polymeric moiety carried quaternized pyridine units.

  6. A target recognition method for maritime surveillance radars based on hybrid ensemble selection

    NASA Astrophysics Data System (ADS)

    Fan, Xueman; Hu, Shengliang; He, Jingbo

    2017-11-01

    In order to improve the generalisation ability of the maritime surveillance radar, a novel ensemble selection technique, termed Optimisation and Dynamic Selection (ODS), is proposed. During the optimisation phase, the non-dominated sorting genetic algorithm II for multi-objective optimisation is used to find the Pareto front, i.e. a set of classifier ensembles representing different trade-offs between classification error and diversity. During the dynamic selection phase, the meta-learning method is used to predict whether a candidate ensemble is competent enough to classify a query instance based on three different aspects, namely the feature space, the decision space and the extent of consensus. The classification performance and time complexity of ODS are compared against nine other ensemble methods using a self-built fully polarimetric high-resolution range profile dataset. The experimental results clearly show the effectiveness of ODS. In addition, the influence of the selection of diversity measures is studied concurrently.

  7. An Efficient Local Correlation Matrix Decomposition Approach for the Localization Implementation of Ensemble-Based Assimilation Methods

    NASA Astrophysics Data System (ADS)

    Zhang, Hongqin; Tian, Xiangjun

    2018-04-01

    Ensemble-based data assimilation methods often use the so-called localization scheme to improve the representation of the ensemble background error covariance (Be). Extensive research has been undertaken to reduce the computational cost of these methods by using localized ensemble samples to localize Be by means of a direct decomposition of the local correlation matrix C. However, the computational cost of directly decomposing the local correlation matrix C is still extremely high due to its high dimension. In this paper, we propose an efficient local correlation matrix decomposition approach based on the concept of alternating directions, intended to avoid direct decomposition of the correlation matrix. Instead, we first decompose the correlation matrix into 1-D correlation matrices in the three coordinate directions, then construct their empirical orthogonal function decompositions at low resolution. This procedure is followed by a 1-D spline interpolation step to transform these decompositions to the high-resolution grid. Finally, an efficient decomposition of the correlation matrix is achieved by computing the Kronecker product of the 1-D decompositions, which closely approximates the direct decomposition. We conducted a series of comparison experiments to illustrate the validity and accuracy of the proposed approach. The effectiveness of the proposed correlation matrix decomposition and its efficient localization implementation in the nonlinear least-squares four-dimensional variational assimilation are further demonstrated by several groups of numerical experiments based on the Advanced Research Weather Research and Forecasting model.
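
    The algebraic identity that makes the per-direction approach cheap is that, for a separable correlation matrix, the eigen-decomposition of the full matrix is the Kronecker product of the 1-D decompositions. The sketch below verifies this in 2-D with assumed Gaussian correlation functions; it omits the paper's low-resolution EOF truncation and spline-interpolation steps.

```python
import numpy as np

def corr1d(n, length_scale):
    """Gaussian 1-D correlation matrix for n grid points."""
    i = np.arange(n)
    return np.exp(-((i[:, None] - i[None, :]) ** 2) / (2.0 * length_scale ** 2))

# 1-D correlation matrices in two coordinate directions (a third works the same).
Cx, Cy = corr1d(6, 2.0), corr1d(5, 1.5)
wx, Vx = np.linalg.eigh(Cx)                        # cheap small decompositions
wy, Vy = np.linalg.eigh(Cy)

# If the full correlation matrix is separable, C = Cx kron Cy, then its
# eigen-decomposition is the Kronecker product of the 1-D decompositions.
C_full = np.kron(Cx, Cy)
w_kron = np.kron(wx, wy)
V_kron = np.kron(Vx, Vy)
reconstructed = (V_kron * w_kron) @ V_kron.T       # V diag(w) V^T
```

Decomposing the 6x6 and 5x5 factors is far cheaper than decomposing the 30x30 full matrix, and the gap grows rapidly with grid size in three dimensions.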

  8. Hybrid Data Assimilation without Ensemble Filtering

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflation, to compensate for sampling and modeling errors, respectively, and to keep the small-member ensemble solution close to the variational solution; we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at higher resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this so-called filter-free hybrid procedure with an EnKF-based hybrid procedure and a control non-hybrid, traditional scheme show both hybrid strategies to provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.
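
    The filter-free member generation described above amounts to: take the (resolution-converted) variational analysis, add additive perturbations, and keep the members centred on the analysis. The sketch below assumes white-noise perturbations for simplicity; in practice additive inflation draws from a climatological perturbation database.

```python
import numpy as np

def filter_free_members(central_analysis, n_members, pert_std, seed=0):
    """'Filter-free' ensemble generation: take the variational analysis
    (assumed already converted to the ensemble's resolution) and add zero-mean
    additive-inflation perturbations, so the members stay centred on it."""
    rng = np.random.default_rng(seed)
    perts = rng.normal(0.0, pert_std, size=(n_members, central_analysis.size))
    perts -= perts.mean(axis=0)                    # re-centre: exact zero mean
    return central_analysis[None, :] + perts

xa = np.array([1.0, 2.0, 3.0])                     # toy analysis state
members = filter_free_members(xa, n_members=8, pert_std=0.5)
```

Subtracting the perturbation mean enforces exactly the re-centering about the variational analysis that the abstract emphasises.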

  9. Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics

    NASA Astrophysics Data System (ADS)

    Lazarus, S. M.; Holman, B. P.; Splitt, M. E.

    2017-12-01

    A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid using flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withholding and 28 east central Florida stations, the method is applied to one year of 24-h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates the post-processed forecasts are calibrated. Two downscaling case studies are presented: a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
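
    EMOS produces a parametric predictive distribution whose mean is an affine function of the ensemble mean and whose variance is an affine function of the ensemble variance. The toy fit below is a simplified stand-in for a single scalar wind component: the mean coefficients come from least squares and the variance coefficients from regressing squared errors on the ensemble variance, whereas operational EMOS (including the bivariate wind-vector version used in the paper) estimates all coefficients by minimising the CRPS.

```python
import numpy as np

def fit_emos(ens_mean, ens_var, obs):
    """Toy EMOS fit for a scalar forecast variable: predictive distribution
    N(a + b * ens_mean, c + d * ens_var)."""
    Am = np.column_stack([np.ones_like(ens_mean), ens_mean])
    a, b = np.linalg.lstsq(Am, obs, rcond=None)[0]     # mean coefficients
    sq_err = (obs - (a + b * ens_mean)) ** 2
    Av = np.column_stack([np.ones_like(ens_var), ens_var])
    c, d = np.linalg.lstsq(Av, sq_err, rcond=None)[0]  # variance coefficients
    return a, b, c, d

# Synthetic noise-free training data: obs = 1 + 0.9 * ensemble mean.
ens_mean = np.linspace(0.0, 10.0, 200)
ens_var = np.linspace(0.5, 2.0, 200)
obs = 1.0 + 0.9 * ens_mean
a, b, c, d = fit_emos(ens_mean, ens_var, obs)
```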

  10. Insight and Evidence Motivating the Simplification of Dual-Analysis Hybrid Systems into Single-Analysis Hybrid Systems

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Diniz, F. L. R.; Takacs, L. L.; Suarez, M. J.

    2018-01-01

    Many hybrid data assimilation systems currently used for NWP employ some form of dual-analysis approach. Typically, a hybrid variational analysis is responsible for creating initial conditions for high-resolution forecasts, and an ensemble analysis system is responsible for creating the sample perturbations used to form the flow-dependent part of the background error covariance required in the hybrid analysis component. In many of these, the two analysis components employ different methodologies, e.g., variational and ensemble Kalman filter. In such cases, it is not uncommon for observations to be treated rather differently by the two analysis components; recentering of the ensemble analysis around the hybrid analysis is used to compensate for such differences. Furthermore, in many cases, the hybrid variational high-resolution system implements some type of four-dimensional approach, whereas the underlying ensemble system relies on a three-dimensional approach, which again introduces discrepancies into the overall system. Connected to these issues is the expectation that one can reliably estimate observation impact on forecasts issued from hybrid analyses by using an ensemble approach based on the underlying ensemble strategy of dual-analysis systems. The mere fact that the ensemble analysis makes substantially different use of the observations than its hybrid counterpart should be evidence enough of the implausibility of this expectation. This presentation assembles considerable anecdotal evidence to illustrate that hybrid dual-analysis systems must, at the very minimum, strive for consistent use of the observations in both analysis sub-components. More simply, this work suggests that hybrid systems can be reliably constructed without the need for a dual-analysis approach. In practice, the idea of relying on a single analysis system is appealing from a cost and maintenance perspective. More generally, single-analysis systems avoid contradictions such as having to choose one sub-component to generate performance diagnostics for another, possibly not fully consistent, component.
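    The recentering step described above (shifting the ensemble so that its mean coincides with the hybrid analysis while preserving the member perturbations) can be sketched in a few lines. This is an editor's illustration in NumPy, not code from any of the systems discussed:

```python
import numpy as np

def recenter(ensemble, hybrid_analysis):
    """Shift ensemble members so that their mean equals the hybrid analysis.

    ensemble: (n_members, n_state) array of ensemble analyses.
    hybrid_analysis: (n_state,) hybrid variational analysis.
    The perturbations (deviations from the ensemble mean) are unchanged.
    """
    perturbations = ensemble - ensemble.mean(axis=0)
    return hybrid_analysis + perturbations

# toy example: 4 members, 3-variable state
rng = np.random.default_rng(0)
ens = rng.normal(size=(4, 3))
xa = np.array([1.0, 2.0, 3.0])
recentered = recenter(ens, xa)
```

    Note that recentering adjusts only the first moment of the ensemble; any inconsistency in how the two analyses used the observations is still carried by the perturbations.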

  11. The WASCAL high-resolution climate projection ensemble for West Africa

    NASA Astrophysics Data System (ADS)

    Kunstmann, Harald; Heinzeller, Dominikus; Dieng, Diarra; Smiatek, Gerhard; Bliefernicht, Jan; Hamann, Ilse; Salack, Seyni

    2017-04-01

    With climate change being one of the most severe challenges to rural Africa in the 21st century, West Africa is facing an urgent need to develop effective adaptation and mitigation measures to protect its constantly growing population. We perform ensemble-based regional climate simulations at a high resolution of 12 km for West Africa to allow a scientifically sound derivation of climate change adaptation measures. Based on the RCP4.5 scenario, our ensemble consists of three simulation experiments with the Weather Research and Forecasting (WRF) model and one additional experiment with the Consortium for Small-scale Modelling model in climate mode (COSMO-CLM). We discuss the model performance over the validation period 1980-2010, including a novel, station-based precipitation database for West Africa obtained within the WASCAL (West African Science Service Centre for Climate Change and Adapted Land Use) program. Particular attention is paid to the representation of the dynamics of the West African Summer Monsoon and to the added value of our high-resolution models over existing data sets. We further present results on the climate change signal obtained for the two future periods 2020-2050 and 2070-2100 and compare them to current state-of-the-art projections from the CORDEX-Africa project. While the temperature change signal is similar to that obtained within CORDEX-Africa, our simulations predict a wetter future for the Coast of Guinea and the southern Soudano area and a slight drying in the northernmost part of the Sahel.

  12. An ensemble forecast of the South China Sea monsoon

    NASA Astrophysics Data System (ADS)

    Krishnamurti, T. N.; Tewari, Mukul; Bensman, Ed; Han, Wei; Zhang, Zhan; Lau, William K. M.

    1999-05-01

    This paper presents a generalized ensemble forecast procedure for the tropical latitudes. Here we propose an empirical orthogonal function-based procedure for the definition of a seven-member ensemble. The wind and the temperature fields are perturbed over the global tropics. Although the forecasts are made over the global belt with a high-resolution model, the emphasis of this study is on the South China Sea monsoon. This South China Sea domain includes the passage of Tropical Storm Gary, which moved eastward to the north of the Philippines. The ensemble forecast handled the precipitation of this storm reasonably well. A global model at a resolution of Triangular Truncation 126 waves is used to carry out these seven forecasts. The evaluation of the ensemble of forecasts is carried out via standard root mean square errors of the precipitation and the wind fields. The ensemble average is shown to have a higher skill compared to a control experiment, which was a first analysis based on operational data sets over both the global tropical and South China Sea domains. All of these experiments were subjected to physical initialization, which provides a spin-up of the model rain close to that obtained from satellite and gauge-based estimates. The results furthermore show that inherently much higher skill resides in the forecast precipitation fields if they are averaged over area elements of the order of 4° latitude by 4° longitude squares.
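    An EOF-based definition of a seven-member ensemble (a control plus paired positive/negative perturbations along the leading EOFs of an analysis history) can be sketched as follows. The SVD-based EOF computation and the amplitude scaling are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def eof_perturbations(history, n_pairs=3, amplitude=1.0):
    """Control state plus n_pairs +/- perturbations along leading EOFs,
    giving a (2 * n_pairs + 1)-member ensemble.

    history: (n_samples, n_state) past states used to estimate the EOFs.
    """
    mean = history.mean(axis=0)
    anomalies = history - mean
    # leading EOFs are the right singular vectors of the anomaly matrix
    _, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    members = [mean]
    for j in range(n_pairs):
        pert = amplitude * s[j] / np.sqrt(len(history)) * vt[j]
        members.append(mean + pert)
        members.append(mean - pert)
    return np.array(members)

# toy "analysis history" on an 8-variable state; yields a 7-member ensemble
rng = np.random.default_rng(1)
hist = rng.normal(size=(30, 8))
ens = eof_perturbations(hist, n_pairs=3)
```

    Because the perturbations come in symmetric pairs, the ensemble mean equals the control state by construction.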

  13. Spread-Spectrum Beamforming and Clutter Filtering for Plane-Wave Color Doppler Imaging.

    PubMed

    Mansour, Omar; Poepping, Tamie L; Lacefield, James C

    2016-07-21

    Plane-wave imaging is desirable for its ability to achieve high frame rates, allowing the capture of fast dynamic events and continuous Doppler data. In most implementations of plane-wave imaging, multiple low-resolution images from different plane wave tilt angles are compounded to form a single high-resolution image, thereby reducing the frame rate. Compounding improves the lateral beam profile in the high-resolution image, but it also acts as a low-pass filter in slow time that causes attenuation and aliasing of signals with high Doppler shifts. This paper introduces a spread-spectrum color Doppler imaging method that produces high-resolution images without the use of compounding, thereby eliminating the tradeoff between beam quality, maximum unaliased Doppler frequency, and frame rate. The method uses a long, random sequence of transmit angles rather than a linear sweep of plane wave directions. The random angle sequence randomizes the phase of off-focus (clutter) signals, thereby spreading the clutter power in the Doppler spectrum, while keeping the spectrum of the in-focus signal intact. The ensemble of randomly tilted low-resolution frames also acts as the Doppler ensemble, so it can be much longer than a conventional linear sweep, thereby improving beam formation while also making the slow-time Doppler sampling frequency equal to the pulse repetition frequency. Experiments performed using a carotid artery phantom with constant flow demonstrate that the spread-spectrum method more accurately measures the parabolic flow profile of the vessel and outperforms conventional plane-wave Doppler in both contrast resolution and estimation of high flow velocities. The spread-spectrum method is expected to be valuable for Doppler applications that require measurement of high velocities at high frame rates.
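    The core idea, that a random tilt-angle sequence randomizes the slow-time phase of off-focus echoes and spreads their power across the Doppler spectrum while an in-focus tone stays intact, can be illustrated with a toy slow-time simulation (all signal parameters here are hypothetical, chosen only to show the effect):

```python
import numpy as np

n = 256                    # slow-time ensemble length (pulses)
fd = 0.2                   # in-focus Doppler frequency, cycles per pulse
rng = np.random.default_rng(2)
angles_linear = np.tile(np.linspace(-8.0, 8.0, 16), n // 16)  # repeated sweep
angles_random = rng.uniform(-8.0, 8.0, n)                     # random sequence

def slow_time(angles, clutter_phase_per_deg=0.6):
    """Blood tone (angle-independent) plus strong clutter whose phase
    tracks the transmit angle (a crude model of an off-focus echo)."""
    t = np.arange(len(angles))
    blood = np.exp(2j * np.pi * fd * t)
    clutter = 5.0 * np.exp(1j * clutter_phase_per_deg * angles)
    return blood + clutter

def peak_fraction(x):
    """Fraction of total power in the strongest Doppler bin."""
    p = np.abs(np.fft.fft(x)) ** 2
    return p.max() / p.sum()

pf_linear = peak_fraction(slow_time(angles_linear))   # clutter stays a tone
pf_random = peak_fraction(slow_time(angles_random))   # clutter power is spread
```

    With the repeated linear sweep the clutter phase is periodic and its power collapses into a few spectral lines; with random angles the same power is spread nearly flat, leaving the in-focus Doppler tone as the dominant coherent component.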

  14. Tunable allosteric library of caspase-3 identifies coupling between conserved water molecules and conformational selection

    PubMed Central

    Maciag, Joseph J.; Mackenzie, Sarah H.; Tucker, Matthew B.; Schipper, Joshua L.; Swartz, Paul; Clark, A. Clay

    2016-01-01

    The native ensemble of caspases is described globally by a complex energy landscape where the binding of substrate selects for the active conformation, whereas targeting an allosteric site in the dimer interface selects an inactive conformation that contains disordered active-site loops. Mutations and posttranslational modifications stabilize high-energy inactive conformations, with mostly formed, but distorted, active sites. To examine the interconversion of active and inactive states in the ensemble, we used detection of related solvent positions to analyze 4,995 waters in 15 high-resolution (<2.0 Å) structures of wild-type caspase-3, resulting in 450 clusters with the most highly conserved set containing 145 water molecules. The data show that regions of the protein that contact the conserved waters also correspond to sites of posttranslational modifications, suggesting that the conserved waters are an integral part of allosteric mechanisms. To test this hypothesis, we created a library of 19 caspase-3 variants through saturation mutagenesis in a single position of the allosteric site of the dimer interface, and we show that the enzyme activity varies by more than four orders of magnitude. Altogether, our database consists of 37 high-resolution structures of caspase-3 variants, and we demonstrate that the decrease in activity correlates with a loss of conserved water molecules. The data show that the activity of caspase-3 can be fine-tuned through globally desolvating the active conformation within the native ensemble, providing a mechanism for cells to repartition the ensemble and thus fine-tune activity through conformational selection. PMID:27681633
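    The grouping of solvent positions across superposed structures can be mimicked with a simple greedy distance clustering. The 1 Å radius and the conservation threshold below are placeholders, not the parameters of the authors' detection-of-related-solvent-positions analysis:

```python
import numpy as np

def cluster_waters(coords, labels, radius=1.0):
    """Greedy distance clustering of water positions pooled from many
    superposed structures.

    coords: (n_waters, 3) coordinates; labels: (n_waters,) structure index.
    Returns a list of (cluster_center, set_of_structure_indices).
    """
    unassigned = set(range(len(coords)))
    clusters = []
    for i in range(len(coords)):
        if i not in unassigned:
            continue
        d = np.linalg.norm(coords - coords[i], axis=1)
        members = [j for j in unassigned if d[j] <= radius]
        unassigned -= set(members)
        clusters.append((coords[members].mean(axis=0),
                         {int(labels[j]) for j in members}))
    return clusters

def conserved(clusters, n_structures, fraction=0.9):
    """Clusters occupied in at least `fraction` of the structures."""
    return [c for c in clusters if len(c[1]) >= fraction * n_structures]

# toy data: one site shared by 3 structures, plus one stray water
coords = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0],
                   [0.0, 0.1, 0.0], [10.0, 0.0, 0.0]])
labels = np.array([0, 1, 2, 0])
clusters = cluster_waters(coords, labels)
```

    Counting how many distinct structures populate each cluster is what distinguishes a conserved hydration site from an incidental water.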

  16. A hybrid variational ensemble data assimilation for the HIgh Resolution Limited Area Model (HIRLAM)

    NASA Astrophysics Data System (ADS)

    Gustafsson, N.; Bojarova, J.; Vignes, O.

    2014-02-01

    A hybrid variational ensemble data assimilation has been developed on top of the HIRLAM variational data assimilation. It provides the possibility of applying a flow-dependent background error covariance model during the data assimilation, while at the same time the full-rank characteristics of the variational data assimilation are preserved. The hybrid formulation is based on an augmentation of the assimilation control variable with localised weights to be assigned to a set of ensemble member perturbations (deviations from the ensemble mean). The flow-dependency of the hybrid assimilation is demonstrated in single simulated observation impact studies, and the improved performance of the hybrid assimilation in comparison with pure 3-dimensional variational as well as pure ensemble assimilation is also demonstrated in assimilation experiments with real observations. The performance of the hybrid assimilation is comparable to that of the 4-dimensional variational data assimilation. The sensitivity to various parameters of the hybrid assimilation scheme and the sensitivity to the applied ensemble generation techniques are also examined. In particular, the inclusion of ensemble perturbations with a lagged validity time has been examined, with encouraging results.
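    The augmented-control-variable formulation effectively applies a background error covariance that blends a static term with a localized ensemble term. A minimal covariance-space sketch is below; the Gaussian localization on grid-index distance and the blending weights are illustrative assumptions, and the HIRLAM scheme works through the control variable rather than by forming B explicitly:

```python
import numpy as np

def hybrid_covariance(b_static, perturbations, loc_length, beta_e=0.5):
    """Blend a static background covariance with a localized ensemble term.

    perturbations: (n_members, n_state) deviations from the ensemble mean.
    Localization is a Gaussian correlation of grid-index distance applied
    as a Schur (element-wise) product; beta_e is a free blending weight.
    """
    k = perturbations.shape[0]
    p_ens = perturbations.T @ perturbations / (k - 1)
    idx = np.arange(b_static.shape[0])
    loc = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / loc_length) ** 2)
    return (1.0 - beta_e) * b_static + beta_e * (loc * p_ens)

# toy example: 10-member ensemble on a 6-point grid
rng = np.random.default_rng(3)
perts = rng.normal(size=(10, 6))
perts -= perts.mean(axis=0)
B = hybrid_covariance(np.eye(6), perts, loc_length=2.0)
```

    Both terms are positive semi-definite (the Schur product of two PSD matrices is PSD), so the blend remains a valid covariance.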

  17. High resolution probabilistic precipitation forecast over Spain combining the statistical downscaling tool PROMETEO and the AEMET short range EPS system (AEMET/SREPS)

    NASA Astrophysics Data System (ADS)

    Cofino, A. S.; Santos, C.; Garcia-Moya, J. A.; Gutierrez, J. M.; Orfila, B.

    2009-04-01

    The Short-Range Ensemble Prediction System (SREPS) is a multi-LAM (UM, HIRLAM, MM5, LM and HRM), multi-analysis/boundary-condition (ECMWF, UKMetOffice, DWD and GFS) system run twice a day by AEMET (72 hours lead time) over a European domain, with a total of 5 (LAMs) x 4 (GCMs) = 20 members. One of the main goals of this project is to analyze the impact of models and boundary conditions on the short-range high-resolution precipitation forecasts. A previous validation of this method was carried out on a set of climate networks in Spain, France and Germany, by interpolating the predictions to the gauge locations (SREPS, 2008). In this work we compare these results with those obtained by using a statistical downscaling method to post-process the global predictions, obtaining an "advanced interpolation" for the local precipitation using climate network precipitation observations. In particular, we apply the PROMETEO downscaling system based on analogs and compare the SREPS ensemble of 20 members with the PROMETEO statistical ensemble of 5 (analog ensemble) x 4 (GCMs) = 20 members. Moreover, we also compare the performance of a combined approach post-processing the SREPS outputs using the PROMETEO system. References: SREPS 2008. 2008 EWGLAM-SRNWP Meeting (http://www.aemet.es/documentos/va/divulgacion/conferencias/prediccion/Ewglam/PRED_CSantos.pdf)

  18. Impact of Climate Change on Potential, Attainable, and Actual Wheat Yield in Oklahoma

    NASA Astrophysics Data System (ADS)

    Dhakal, K.; Linde, E.; Kakani, V. G.; Alderman, P. D.; Brunson, D.; Ochsner, T. E.; Carver, B.

    2017-12-01

    Gradually developing climatic and weather anomalies due to increasing atmospheric greenhouse gas concentrations can pose a threat to farmers and resource managers. This study investigated the effects of climate change on winter wheat (Triticum aestivum L.) under the Representative Concentration Pathways (RCPs) 6.0 and 8.5 using downscaled climate projections from different models and their ensembles. Daily data of maximum and minimum air temperature, rainfall, and solar radiation for four General Circulation Models (MIROC5, MRI-CGCM3, HadGEM2-ES, CSIRO-Mk3.6.0), an ensemble of the four models, and an ensemble of 17 GCMs, at 800 m resolution, were developed for the two RCPs using MarkSim. We describe a methodology for rapid synthesis of GCM-based, spatially explicit, high-resolution future weather data inputs for the DSSAT crop model, for the cropland area across the wheat-growing regions of Oklahoma for the future period 2040-2060. The potential impacts of climate change and variability on potential, attainable, and actual winter wheat yield in Oklahoma are discussed.

  19. Evaluation of ensemble forecast uncertainty using a new proper score: application to medium-range and seasonal forecasts

    NASA Astrophysics Data System (ADS)

    Christensen, Hannah; Moroz, Irene; Palmer, Tim

    2015-04-01

    Forecast verification is important across scientific disciplines as it provides a framework for evaluating the performance of a forecasting system. In the atmospheric sciences, probabilistic skill scores are often used for verification as they provide a way of unambiguously ranking the performance of different probabilistic forecasts. In order to be useful, a skill score must be proper -- it must encourage honesty in the forecaster, and reward forecasts which are reliable and which have good resolution. A new score, the Error-spread Score (ES), is proposed which is particularly suitable for evaluation of ensemble forecasts. It is formulated with respect to the moments of the forecast. The ES is confirmed to be a proper score, and is therefore sensitive to both resolution and reliability. The ES is tested on forecasts made using the Lorenz '96 system, and found to be useful for summarising the skill of the forecasts. The European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system (EPS) is evaluated using the ES. Its performance is compared to a perfect statistical probabilistic forecast -- the ECMWF high resolution deterministic forecast dressed with the observed error distribution. This generates a forecast that is perfectly reliable if considered over all time, but which does not vary from day to day with the predictability of the atmospheric flow. The ES distinguishes between the dynamically reliable EPS forecasts and the statically reliable dressed deterministic forecasts. Other skill scores are tested and found to be comparatively insensitive to this desirable forecast quality. The ES is used to evaluate seasonal range ensemble forecasts made with the ECMWF System 4. The ensemble forecasts are found to be skilful when compared with climatological or persistence forecasts, though this skill is dependent on region and time of year.
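    A sketch of computing a moment-based score of this kind is below. The specific combination ES = (s^2 - e^2 - e*s*g)^2, with s the ensemble spread, e the ensemble-mean error and g the ensemble skewness, is our reading of the Error-spread Score; consult the paper for the authoritative definition:

```python
import numpy as np

def error_spread_score(ens, verif):
    """Average Error-spread Score over forecast cases.

    ens: (n_cases, n_members) ensemble forecasts; verif: (n_cases,) outcomes.
    Per case: spread s, ensemble-mean error e, skewness g; the combination
    (s^2 - e^2 - e*s*g)^2 is an assumption based on our reading of the paper.
    """
    m = ens.mean(axis=1)
    s = ens.std(axis=1, ddof=1)
    g = ((ens - m[:, None]) ** 3).mean(axis=1) / s ** 3
    e = m - verif
    return np.mean((s ** 2 - e ** 2 - e * s * g) ** 2)
```

    For a symmetric three-member ensemble [-1, 0, 1] verifying at 0, the spread is 1, the error and skewness are 0, and the score evaluates to 1.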

  20. Context dependent anti-aliasing image reconstruction

    NASA Technical Reports Server (NTRS)

    Beaudet, Paul R.; Hunt, A.; Arlia, N.

    1989-01-01

    Image reconstruction has been mostly confined to context-free linear processes; the traditional continuum interpretation of digital array data uses a linear interpolator with or without an enhancement filter. Here, anti-aliasing context dependent interpretation techniques are investigated for image reconstruction. Pattern classification is applied to each neighborhood to assign it a context class; a different interpolation/filter is applied to neighborhoods of differing context. It is shown how the context dependent interpolation is computed through ensemble average statistics using high resolution training imagery from which the lower resolution image array data is obtained (simulation). A quadratic least squares (LS) context-free image quality model is described from which the context dependent interpolation coefficients are derived. It is shown how ensembles of high-resolution images can be used to capture the a priori special character of different context classes. As a consequence, a priori information such as the translational invariance of edges along the edge direction, edge discontinuity, and the character of corners is captured and can be used to interpret image array data with greater spatial resolution than would be expected from the Nyquist limit. A Gibbs-like artifact associated with this super-resolution is discussed. More realistic context dependent image quality models are needed, and a suggestion is made for using a quality model which is now finding application in data compression.
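    A 1-D toy version of the approach (classify each low-resolution neighborhood into a context class, then apply per-class least-squares interpolation coefficients learned from high-resolution training signals) might look like the following. The two-class edge detector and 4-tap neighborhood are drastic simplifications of the paper's pattern classification:

```python
import numpy as np

def make_pairs(hr):
    """Low-res samples (every other point) and the high-res midpoints."""
    lr = hr[::2]
    X, y = [], []
    for i in range(1, len(lr) - 2):
        X.append(lr[i - 1:i + 3])       # 4-tap low-res neighborhood
        y.append(hr[2 * i + 1])         # missing high-res sample
    return np.array(X), np.array(y)

def context(X, thresh=0.5):
    """0 = smooth neighborhood, 1 = edge-like (large local range)."""
    return (X.max(axis=1) - X.min(axis=1) > thresh).astype(int)

def train(X, y, thresh=0.5):
    """Per-class least-squares interpolation coefficients (ensemble stats)."""
    c = context(X, thresh)
    coeffs = {}
    for cls in (0, 1):
        m = c == cls
        if m.any():
            coeffs[cls], *_ = np.linalg.lstsq(X[m], y[m], rcond=None)
    return coeffs

def interpolate(X, coeffs, thresh=0.5):
    """Apply the coefficients of each neighborhood's context class."""
    c = context(X, thresh)
    out = np.empty(len(X))
    for cls, w in coeffs.items():
        out[c == cls] = X[c == cls] @ w
    return out

# train on a smooth ramp and a step edge (toy high-resolution "ensemble")
X1, y1 = make_pairs(np.linspace(0.0, 1.0, 64))
X2, y2 = make_pairs(np.concatenate([np.zeros(32), np.ones(32)]))
coeffs = train(np.vstack([X1, X2]), np.concatenate([y1, y2]))
```

    The edge class learns a one-sided rule instead of averaging across the discontinuity, which is the mechanism behind the anti-aliasing behavior described above.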

  1. Technical note: 3-hourly temporal downscaling of monthly global terrestrial biosphere model net ecosystem exchange

    DOE PAGES

    Fisher, Joshua B.; Sikka, Munish; Huntzinger, Deborah N.; ...

    2016-07-29

    Here, the land surface provides a boundary condition to atmospheric forward and flux inversion models. These models require prior estimates of CO2 fluxes at relatively high temporal resolutions (e.g., 3-hourly) because of the high frequency of atmospheric mixing and wind heterogeneity. However, land surface model CO2 fluxes are often provided at monthly time steps, typically because the land surface modeling community focuses more on time steps associated with plant phenology (e.g., seasonal) than on sub-daily phenomena. Here, we describe a new dataset created from 15 global land surface models and 4 ensemble products in the Multi-scale Synthesis and Terrestrial Model Intercomparison Project (MsTMIP), temporally downscaled from monthly to 3-hourly output. We provide 3-hourly output for each individual model over 7 years (2004–2010), as well as an ensemble mean, a weighted ensemble mean, and the multi-model standard deviation. Output is provided in three different spatial resolutions for user preferences: 0.5° × 0.5°, 2.0° × 2.5°, and 4.0° × 5.0° (latitude × longitude).
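    Mean-preserving temporal downscaling of a monthly flux can be sketched as follows. The 3-hourly diurnal weight profile here is hypothetical (in practice it would come from sub-daily drivers such as radiation and temperature), and this is not the MsTMIP downscaling algorithm itself:

```python
import numpy as np

def downscale_month(monthly_mean, n_days, diurnal_shape):
    """Expand one monthly-mean flux into 3-hourly values (8 slots per day).

    diurnal_shape: length-8 relative weights for the 3-hourly slots; the
    series is normalized so that its time mean equals the monthly mean.
    """
    w = np.tile(np.asarray(diurnal_shape, dtype=float), n_days)
    w *= len(w) / w.sum()          # mean(w) == 1, so the mean is preserved
    return monthly_mean * w

# hypothetical diurnal uptake cycle (mid-day peak) for a 30-day month
shape = [0.2, 0.1, 0.3, 1.5, 2.5, 2.0, 1.0, 0.4]
series = downscale_month(-2.0, 30, shape)
```

    Normalizing the weights guarantees that aggregating the 3-hourly series back to monthly recovers the land surface model's original flux.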

  2. Evaluation of uncertainties in mean and extreme precipitation under climate change for northwestern Mediterranean watersheds from high-resolution Med and Euro-CORDEX ensembles

    NASA Astrophysics Data System (ADS)

    Colmet-Daage, Antoine; Sanchez-Gomez, Emilia; Ricci, Sophie; Llovel, Cécile; Borrell Estupina, Valérie; Quintana-Seguí, Pere; Llasat, Maria Carmen; Servat, Eric

    2018-01-01

    The climate change impact on mean and extreme precipitation events in the northern Mediterranean region is assessed using high-resolution EuroCORDEX and MedCORDEX simulations. The focus is on three catchments, Lez and Aude located in France and Muga located in northeastern Spain, and eight pairs of global and regional climate models are analyzed with respect to the SAFRAN product. First, model skill is evaluated in terms of bias in the annual precipitation cycle over the historical period. Then future changes in extreme precipitation, under two emission scenarios, are estimated through the computation of past/future change coefficients of quantile-ranked model precipitation outputs. Over the 1981-2010 period, the cumulative precipitation is overestimated for most models over the mountainous regions and underestimated over the coastal regions in autumn and at higher-order quantiles. The ensemble mean and spread for the future period remain unchanged under the RCP4.5 scenario and decrease under the RCP8.5 scenario. Extreme precipitation events are intensified over the three catchments, with a smaller ensemble spread under RCP8.5 revealing more evident changes, especially in the later part of the 21st century.
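    The past/future change coefficients of quantile-ranked precipitation mentioned above amount to ratios of matching quantiles between the two periods; a minimal sketch:

```python
import numpy as np

def quantile_change_coefficients(hist, future, quantiles=(0.5, 0.9, 0.99)):
    """Future/historical ratios at matching quantiles of daily precipitation.

    A coefficient above 1 at the 0.99 quantile indicates intensified extremes.
    """
    q = np.asarray(quantiles)
    return np.quantile(future, q) / np.quantile(hist, q)

# toy check: uniformly intensified precipitation gives a constant coefficient
hist = np.linspace(1.0, 100.0, 200)
coeff = quantile_change_coefficients(hist, 1.2 * hist)
```

    Applying this per model and per catchment yields the spread of change coefficients across the ensemble discussed in the abstract.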

  3. Assimilation of the AVISO Altimetry Data into the Ocean Dynamics Model with a High Spatial Resolution Using Ensemble Optimal Interpolation (EnOI)

    NASA Astrophysics Data System (ADS)

    Kaurkin, M. N.; Ibrayev, R. A.; Belyaev, K. P.

    2018-01-01

    A parallel implementation of the Ensemble Optimal Interpolation (EnOI) data assimilation (DA) method in conjunction with an eddy-resolving global circulation model is presented. The results of DA experiments in the North Atlantic with the assimilation of Archiving, Validation and Interpretation of Satellite Oceanographic (AVISO) altimetry data from the Jason-1 satellite are analyzed. The simulation results are compared with independent temperature and salinity data from Argo floats.
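    The EnOI analysis step combines a single background state with a stationary ensemble-derived covariance. A small-matrix sketch is below (it forms the covariance explicitly, which real high-resolution systems avoid; the observation operator and the scaling factor alpha are illustrative):

```python
import numpy as np

def enoi_update(xb, anomalies, y, H, obs_var, alpha=1.0):
    """One EnOI analysis step.

    xb: (n,) background state; anomalies: (k, n) deviations of a stationary
    (e.g. climatological) ensemble from its mean; y: (p,) observations;
    H: (p, n) linear observation operator; alpha scales the covariance.
    """
    k = anomalies.shape[0]
    Pb = alpha * anomalies.T @ anomalies / (k - 1)    # stationary covariance
    R = obs_var * np.eye(len(y))
    K = Pb @ H.T @ np.linalg.inv(H @ Pb @ H.T + R)    # Kalman gain
    return xb + K @ (y - H @ xb)

# toy case: 2-variable state, one observation of the first variable
anoms = np.array([[1.0, 0.0], [-1.0, 0.0], [0.0, 1.0], [0.0, -1.0]])
xa = enoi_update(np.zeros(2), anoms, np.array([2.0]),
                 np.array([[1.0, 0.0]]), obs_var=1.0, alpha=1.5)
```

    Because the anomaly ensemble is fixed in time, only the single background integration is needed per cycle, which is the computational advantage over a full ensemble filter.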

  4. Application of Enhanced Sampling Monte Carlo Methods for High-Resolution Protein-Protein Docking in Rosetta

    PubMed Central

    Zhang, Zhe; Schindler, Christina E. M.; Lange, Oliver F.; Zacharias, Martin

    2015-01-01

    The high-resolution refinement of docked protein-protein complexes can provide valuable structural and mechanistic insight into protein complex formation, complementing experimental studies. Monte Carlo (MC) based approaches are frequently applied to sample putative interaction geometries of proteins, including possible conformational changes of the binding partners. In order to explore efficiency improvements of the MC sampling, several enhanced sampling techniques, including temperature or Hamiltonian replica exchange and well-tempered ensemble approaches, have been combined with the MC method and evaluated on 20 protein complexes using unbound partner structures. The well-tempered ensemble method combined with a 2-dimensional temperature and Hamiltonian replica exchange scheme (WTE-H-REMC) was identified as the most efficient search strategy. Comparison with prolonged MC searches indicates that the WTE-H-REMC approach requires approximately 5 times fewer MC steps to identify near-native docking geometries compared to conventional MC searches. PMID:26053419
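    The temperature dimension of such replica-exchange schemes decides swaps with the standard Metropolis criterion p = min(1, exp[(1/T_i - 1/T_j)(E_i - E_j)]), in units where the Boltzmann constant is 1; a sketch (the Hamiltonian and well-tempered dimensions of the scheme are not reproduced here):

```python
import numpy as np

def swap_probability(e_i, e_j, t_i, t_j):
    """Metropolis acceptance probability for exchanging the configurations
    of two temperature replicas (k_B = 1 units)."""
    delta = (1.0 / t_i - 1.0 / t_j) * (e_i - e_j)
    return min(1.0, float(np.exp(delta)))

# a swap that lowers the energy at the colder replica is always accepted;
# the reverse move is accepted only with reduced probability
p_unfavorable = swap_probability(-10.0, 0.0, 1.0, 2.0)
```

    Accepted swaps let configurations trapped at low temperature escape via the high-temperature replicas, which is what accelerates the docking search.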

  5. Comparison of numerical weather prediction based deterministic and probabilistic wind resource assessment methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jie; Draxl, Caroline; Hopson, Thomas

    Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low-resolution NWP model in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse-resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine-resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on the combination of the analog ensemble with intermediate-resolution (e.g., 10-15 km) NWP estimates, to considerably reduce the computational burden while providing accurate deterministic estimates and reliable probabilistic assessments.

  6. Ensemble data assimilation in the Red Sea: sensitivity to ensemble selection and atmospheric forcing

    NASA Astrophysics Data System (ADS)

    Toye, Habib; Zhan, Peng; Gopalakrishnan, Ganesh; Kartadikaria, Aditya R.; Huang, Huang; Knio, Omar; Hoteit, Ibrahim

    2017-07-01

    We present our efforts to build an ensemble data assimilation and forecasting system for the Red Sea. The system consists of the high-resolution Massachusetts Institute of Technology general circulation model (MITgcm) to simulate ocean circulation and of the Data Assimilation Research Testbed (DART) for ensemble data assimilation. DART has been configured to integrate all members of an ensemble adjustment Kalman filter (EAKF) in parallel, based on which we adapted the ensemble operations in DART to use an invariant ensemble, i.e., an ensemble optimal interpolation (EnOI) algorithm. This approach requires only a single forward model integration in the forecast step and therefore saves substantial computational cost. To deal with the strong seasonal variability of the Red Sea, the EnOI ensemble is then seasonally selected from a climatology of long-term model outputs. Observations of remote sensing sea surface height (SSH) and sea surface temperature (SST) are assimilated every 3 days. Real-time atmospheric fields from the National Centers for Environmental Prediction (NCEP) and the European Centre for Medium-Range Weather Forecasts (ECMWF) are used as forcing in different assimilation experiments. We investigate the behaviors of the EAKF and (seasonal-) EnOI and compare their performances for assimilating and forecasting the circulation of the Red Sea. We further assess the sensitivity of the assimilation system to various filtering parameters (ensemble size, inflation) and atmospheric forcing.

  7. Random-Forest Classification of High-Resolution Remote Sensing Images and nDSM over Urban Areas

    NASA Astrophysics Data System (ADS)

    Sun, X. F.; Lin, X. G.

    2017-09-01

    As an intermediate step between raw remote sensing data and digital urban maps, remote sensing data classification has been a challenging and long-standing research problem in the remote sensing community. In this work, an effective method is proposed for classifying high-resolution remote sensing data over urban areas. Starting from high-resolution multi-spectral images and 3D geometry data, our method proceeds in three main stages: feature extraction, classification, and refinement of the classified result. First, we extract color, vegetation index and texture features from the multi-spectral image and compute the height, elevation texture and differential morphological profile (DMP) features from the 3D geometry data. Then, in the classification stage, multiple random forest (RF) classifiers are trained separately and combined to form an RF ensemble that estimates each sample's category probabilities. Finally, the probabilities, along with the feature importance indicator output by the RF ensemble, are used to construct a fully connected conditional random field (FCCRF) graph model, by which the classification results are refined through mean-field-based statistical inference. Experiments on the ISPRS Semantic Labeling Contest dataset show that our proposed 3-stage method achieves 86.9% overall accuracy on the test data.
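    Combining separately trained classifiers into an ensemble of per-sample category probabilities reduces, in the simplest case, to (weighted) averaging of the individual probability estimates; a sketch (the paper's RF training and FCCRF refinement are not reproduced here):

```python
import numpy as np

def combine_probabilities(prob_maps, weights=None):
    """Average per-classifier class probabilities into ensemble estimates.

    prob_maps: (n_classifiers, n_samples, n_classes), rows summing to 1.
    Returns the ensemble probabilities and the arg-max labels.
    """
    p = np.average(prob_maps, axis=0, weights=weights)
    return p, p.argmax(axis=1)

# two classifiers disagreeing on one sample
probs = np.array([[[0.6, 0.4]],
                  [[0.2, 0.8]]])
p, labels = combine_probabilities(probs)
```

    The averaged probabilities, rather than hard labels, are what a downstream CRF-style refinement would consume as unary potentials.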

  8. Using Analog Ensemble to generate spatially downscaled probabilistic wind power forecasts

    NASA Astrophysics Data System (ADS)

    Delle Monache, L.; Shahriari, M.; Cervone, G.

    2017-12-01

    We use the Analog Ensemble (AnEn) method to generate probabilistic 80-m wind power forecasts. We use data from the NCEP GFS (~28 km resolution) and NCEP NAM (12 km resolution): forecast data from both NAM and GFS, and analysis data from NAM, which enables us to 1) use a lower-resolution model to create higher-resolution forecasts, and 2) use a higher-resolution model to create higher-resolution forecasts. The former essentially increases computing speed and the latter increases forecast accuracy. An aggregated model of the former can be compared against the latter to measure the accuracy of the AnEn spatial downscaling. The AnEn works by taking a deterministic future forecast and comparing it with past forecasts. The model searches for the best matching estimates within the past forecasts and selects the predictand values corresponding to these past forecasts as the ensemble prediction for the future forecast. Our study is based on predicting wind speed and air density at more than 13,000 grid points in the continental US. We run the AnEn model twice: 1) estimating 80-m wind speed using predictor variables such as temperature, pressure, geopotential height, and the U- and V-components of wind, and 2) estimating air density using predictors such as temperature, pressure, and relative humidity. We use the air density values to correct the standard wind power curves for different values of air density. The standard deviation of the ensemble members (i.e., the ensemble spread) is used as the degree of difficulty of predicting wind power at different locations. The value of the correlation coefficient between the ensemble spread and the forecast error determines the appropriateness of this measure. This measure is important for wind farm developers, as building wind farms in regions with higher predictability will reduce the real-time risks of operating in the electricity markets.
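    The analog search described above can be sketched as a nearest-neighbour lookup in predictor space. The plain Euclidean metric and the fixed number of analogs are simplifications (operational AnEn implementations weight predictors and match over a short time window around each lead time):

```python
import numpy as np

def analog_ensemble(target, past_forecasts, past_observations, n_analogs=5):
    """AnEn members for one forecast.

    target: (n_predictors,) current NWP forecast; past_forecasts:
    (n_past, n_predictors); past_observations: (n_past,) verifying values
    of the predictand (e.g. 80-m wind speed). Returns the analog members
    and the ensemble spread.
    """
    dist = np.linalg.norm(past_forecasts - target, axis=1)
    idx = np.argsort(dist)[:n_analogs]
    members = past_observations[idx]
    return members, members.std()

# toy archive where the observed predictand is twice the forecast predictor
past_f = np.arange(10.0)[:, None]
past_o = 2.0 * np.arange(10.0)
members, spread = analog_ensemble(np.array([4.0]), past_f, past_o, n_analogs=3)
```

    The spread of the selected members is the quantity the abstract proposes as a site-by-site predictability measure.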

  9. Climate Change Signals in the EURO-CORDEX Simulations

    NASA Astrophysics Data System (ADS)

    Jacob, Daniela; Preuschmann, Swantje

    2014-05-01

    A new high-resolution regional climate change ensemble has been established for Europe within the World Climate Research Program Coordinated Regional Downscaling Experiment (EURO-CORDEX) initiative. Within this presentation, the first results on climate change signals based on simulations with a horizontal resolution of 12.5 km for the new emission scenarios RCP4.5 and RCP8.5 will be presented. The new EURO-CORDEX ensemble results have been compared to the SRES A1B simulation results achieved within the ENSEMBLES project. The presentation is based on the results of the paper by Jacob et al. (2013). We concentrated on the statistical analysis of robustness and significance of the climate change signals for mean annual and seasonal temperature, total annual and seasonal precipitation, heavy precipitation, heat waves and dry spells, by using daily data for three time periods: 1971-2000, 2021-2050 and 2071-2100. The analysis of impact indices shows that for RCP8.5, there is a substantially larger change projected for temperature-based indices than for RCP4.5. The difference is less pronounced for precipitation-based indices. Two effects of the increased resolution can be regarded as an added value of regional climate simulations. Regional climate model simulations provide higher daily precipitation intensities, which are completely missing in the global climate model simulations, and they provide a significantly different climate change signal for daily precipitation intensities, resulting in a smoother shift from weak to moderate and high intensities. The analysis of projected changes in the 95th percentile of the mean length of dry spells shows similar patterns for all scenarios. The climate projections from the new ensemble indicate a reduced northwards shift of the Mediterranean drying evolution and slightly stronger mean precipitation increases over most of Europe.
Within the high-resolution EURO-CORDEX simulations, changes in the pattern of heavy precipitation events are clearly visible (Jacob et al., 2013). Jacob, D.; Petersen, J.; Eggert, B.; Alias, A.; Christensen, O. B.; Bouwer, L.; Braun, A.; Colette, A.; Déqué, M.; Georgievski, G.; Georgopoulou, E.; Gobiet, A.; Menut, L.; Nikulin, G.; Haensler, A.; Hempelmann, N.; Jones, C.; Keuler, K.; Kovats, S.; Kröner, N.; Kotlarski, S.; Kriegsmann, A.; Martin, E.; Meijgaard, E.; Moseley, C.; Pfeifer, S.; Preuschmann, S.; Radermacher, C.; Radtke, K.; Rechid, D.; Rounsevell, M.; Samuelsson, P.; Somot, S.; Soussana, J.-F.; Teichmann, C.; Valentini, R.; Vautard, R.; Weber, B. & Yiou, P. (2013): EURO-CORDEX: new high-resolution climate change projections for European impact research. Regional Environmental Change, Springer Berlin Heidelberg, 1-16.

  10. Is inversion based high resolution characterization of spatially heterogeneous river bed hydraulic conductivity needed and possible?

    NASA Astrophysics Data System (ADS)

    Kurtz, W.; Hendricks Franssen, H.-J.; Brunner, P.; Vereecken, H.

    2013-05-01

    River-aquifer exchange fluxes influence local and regional water balances and affect groundwater and river water quality and quantity. Unfortunately, river-aquifer exchange fluxes tend to be strongly spatially variable, and it is an open research question to what degree river bed heterogeneity has to be represented in a model in order to achieve reliable estimates of river-aquifer exchange fluxes. This research question is addressed in this paper with the help of synthetic simulation experiments, which mimic the Limmat aquifer in Zurich (Switzerland), where river-aquifer exchange fluxes and groundwater management activities play an important role. The solution of the unsaturated-saturated subsurface hydrological flow problem, including river-aquifer interaction, is calculated for ten different synthetic realities in which the strongly heterogeneous river bed hydraulic conductivities (L) are perfectly known. Hydraulic head data (100 in the default scenario) are sampled from the synthetic realities. In subsequent data assimilation experiments, in which L is treated as unknown, the hydraulic head data are used as conditioning information with the help of the Ensemble Kalman Filter (EnKF). For each of the ten synthetic realities, four different ensembles of L are tested in the experiments with EnKF: one ensemble estimates high resolution L-fields with different L values for each element, and the other three ensembles estimate effective L values for 5, 3 or 2 zones. The calibration of higher resolution L-fields (i.e., fully heterogeneous or 5 zones) gives better results than the calibration of L for only 3 or 2 zones in terms of the reproduction of states, stream-aquifer exchange fluxes and parameters. Effective L for a limited number of zones cannot always reproduce the true states and fluxes well and results in biased estimates of net exchange fluxes between aquifer and stream. Also in the case where only 10 head data are used for conditioning, the high resolution L-fields outperform the others. 
For less heterogeneous river bed hydraulic conductivities, a high-resolution characterization of L is less important. We conclude that for strongly heterogeneous river beds the commonly applied simplified representation of the streambed, with spatially homogeneous parameters or constant parameters for a few zones, might yield significant biases in the characterization of the water balance. For strongly heterogeneous river beds, we suggest adopting a stochastic field approach that models the spatially heterogeneous river beds geostatistically. The paper illustrates that EnKF is able to calibrate such heterogeneous streambeds on the basis of hydraulic head measurements, outperforming classical approaches.
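    The EnKF analysis step that conditions the parameter ensemble on head data can be made concrete with a small sketch. This is a minimal stochastic-EnKF example on a toy linear "head model", not the paper's actual groundwater setup; the ensemble sizes, the linear operator `G` and the function name are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def enkf_update(params, sim_obs, obs, obs_err_std):
        """One stochastic EnKF analysis step.

        params  : (n_ens, n_param) prior ensemble (e.g. log river-bed conductivities)
        sim_obs : (n_ens, n_obs)   simulated hydraulic heads per member
        obs     : (n_obs,)         observed heads
        """
        n_ens, n_obs = sim_obs.shape
        A = params - params.mean(axis=0)          # parameter anomalies
        Y = sim_obs - sim_obs.mean(axis=0)        # predicted-observation anomalies
        C_xy = A.T @ Y / (n_ens - 1)              # cross-covariance from the ensemble
        C_yy = Y.T @ Y / (n_ens - 1) + obs_err_std**2 * np.eye(n_obs)
        K = C_xy @ np.linalg.inv(C_yy)            # Kalman gain
        obs_pert = obs + rng.normal(0.0, obs_err_std, size=(n_ens, n_obs))
        return params + (obs_pert - sim_obs) @ K.T

    # toy "head model": heads depend linearly on the parameters
    n_ens, n_param, n_obs = 50, 4, 3
    G = rng.normal(size=(n_obs, n_param))
    truth = np.ones(n_param)
    obs = G @ truth
    prior = rng.normal(0.0, 1.0, size=(n_ens, n_param))
    post = enkf_update(prior, prior @ G.T, obs, obs_err_std=0.05)
    # the analysis ensemble mean moves towards the true parameters
    ```

    In the paper's fully heterogeneous case, `params` would hold one log-conductivity per river-bed element rather than a handful of zone values.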

  11. Car-Parrinello molecular dynamics study of the intramolecular vibrational mode-sensitive double proton-transfer mechanisms in porphycene.

    PubMed

    Walewski, Łukasz; Waluk, Jacek; Lesyng, Bogdan

    2010-02-18

    Car-Parrinello molecular dynamics simulations were carried out to help interpret proton-transfer processes observed experimentally in porphycene under thermodynamic equilibrium conditions (NVT ensemble) as well as during selective, nonequilibrium vibrational excitations of the molecular scaffold (NVE ensemble). In the NVT ensemble, the population of the trans form in the gas phase at 300 K is 96.5%, and that of the cis-1 form is 3.5%, in agreement with experimental data. Approximately 70% of the proton-transfer events are asynchronous double proton transfers. According to the high resolution simulation data, they consist of two single transfer events that rapidly take place one after the other. The average time period between the two consecutive jumps is 220 fs. The estimated gas-phase reaction time at 300 K is 3.6 ps, which is comparable to experimentally determined values. The NVE-ensemble nonequilibrium ab initio MD simulations, which correspond to selective vibrational excitations of the molecular scaffold generated with high resolution laser spectroscopy techniques, exhibit an enhancing property of the 182 cm⁻¹ vibrational mode and an inhibiting property of the 114 cm⁻¹ one. Both of them influence the proton-transfer rate, in qualitative agreement with experimental findings. Our ab initio simulations provide new predictions regarding the influence of double-mode vibrational excitations on proton-transfer processes. They can help in setting up future programmable spectroscopic experiments for the proton-transfer translocations.

  12. Stochastic Approaches Within a High Resolution Rapid Refresh Ensemble

    NASA Astrophysics Data System (ADS)

    Jankov, I.

    2017-12-01

    It is well known that global and regional numerical weather prediction (NWP) ensemble systems are under-dispersive, producing unreliable and overconfident ensemble forecasts. Typical approaches to alleviate this problem include the use of multiple dynamic cores, multiple physics suite configurations, or a combination of the two. While these approaches may produce desirable results, they have practical and theoretical deficiencies and are more difficult and costly to maintain. An active area of research that promotes a more unified and sustainable system is the use of stochastic physics. Stochastic approaches include Stochastic Parameter Perturbations (SPP), Stochastic Kinetic Energy Backscatter (SKEB), and Stochastic Perturbation of Physics Tendencies (SPPT). The focus of this study is to assess model performance within a convection-permitting ensemble at 3-km grid spacing across the Contiguous United States (CONUS) using a variety of stochastic approaches. A single physics suite configuration based on the operational High-Resolution Rapid Refresh (HRRR) model was utilized, and ensemble members were produced by employing stochastic methods. Parameter perturbations (using SPP) for select fields were employed in the Rapid Update Cycle (RUC) land surface model (LSM) and Mellor-Yamada-Nakanishi-Niino (MYNN) Planetary Boundary Layer (PBL) schemes. Within MYNN, SPP was applied to sub-grid cloud fraction, mixing length, roughness length, mass fluxes and Prandtl number. In the RUC LSM, SPP was applied to hydraulic conductivity, and perturbing soil moisture at the initial time was also tested. Iterative testing was first conducted to assess the initial performance of several configuration settings (e.g., a variety of spatial and temporal de-correlation lengths). Upon selection of the most promising candidate configurations using SPP, a 10-day time period was run and more robust statistics were gathered. 
SKEB and SPPT were included in additional retrospective tests to assess the impact of using all three stochastic approaches to address model uncertainty. Results from the stochastic perturbation testing were compared to a baseline multi-physics control ensemble. Probabilistic forecast performance was assessed using the Model Evaluation Tools (MET) verification package.
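    The SPP patterns described above are random fields with prescribed spatial and temporal de-correlation scales, applied multiplicatively to a model parameter. A minimal sketch of generating such a pattern on a 1-D periodic grid (Gaussian spatial smoothing plus an AR(1) temporal evolution) is given below; the function name and all scale values are illustrative, not the HRRR configuration.

    ```python
    import numpy as np

    def spp_pattern(n_steps, n_grid, l_space, tau_time, sigma, rng):
        """Random perturbation pattern with prescribed spatial and temporal
        de-correlation scales, for multiplicative use: param * exp(pattern)."""
        x = np.arange(n_grid)
        d = np.minimum(x, n_grid - x)                      # periodic distance
        kernel = np.exp(-0.5 * (d / l_space) ** 2)         # Gaussian smoother
        kernel_hat = np.fft.fft(kernel)

        def draw():
            w = rng.normal(size=n_grid)                    # white noise
            f = np.real(np.fft.ifft(np.fft.fft(w) * kernel_hat))
            return sigma * f / f.std()                     # rescale to std sigma

        alpha = np.exp(-1.0 / tau_time)                    # AR(1) temporal memory
        pattern = np.empty((n_steps, n_grid))
        pattern[0] = draw()
        for t in range(1, n_steps):
            pattern[t] = alpha * pattern[t - 1] + np.sqrt(1 - alpha**2) * draw()
        return pattern

    rng = np.random.default_rng(7)
    pat = spp_pattern(n_steps=24, n_grid=120, l_space=10.0, tau_time=6.0,
                      sigma=0.3, rng=rng)
    # e.g. a perturbed Prandtl-number field at step t: prandtl * np.exp(pat[t])
    ```

    Each ensemble member would draw its own pattern with a different seed, so members share the model physics but differ in the perturbed parameter values.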

  13. Chemical Structure, Ensemble and Single-Particle Spectroscopy of Thick-Shell InP-ZnSe Quantum Dots.

    PubMed

    Reid, Kemar R; McBride, James R; Freymeyer, Nathaniel J; Thal, Lucas B; Rosenthal, Sandra J

    2018-02-14

    Thick-shell (>5 nm) InP-ZnSe colloidal quantum dots (QDs) grown by a continuous-injection shell growth process are reported. The growth of a thick crystalline shell is attributed to the high temperature of the growth process and the relatively low lattice mismatch between the InP core and ZnSe shell. In addition to a narrow ensemble photoluminescence (PL) line-width (∼40 nm), ensemble and single-particle emission dynamics measurements indicate that blinking and Auger recombination are reduced in these heterostructures. More specifically, high single-dot ON-times (>95%) were obtained for the core-shell QDs, and measured ensemble biexciton lifetimes, τ2x ≈ 540 ps, represent a 7-fold increase compared to InP-ZnS QDs. Further, high-resolution energy dispersive X-ray (EDX) chemical maps directly show for the first time significant incorporation of indium into the shell of the InP-ZnSe QDs. Examination of the atomic structure of the thick-shell QDs by high-angle annular dark-field scanning transmission electron microscopy (HAADF-STEM) reveals structural defects in subpopulations of particles that may limit PL efficiencies (∼40% in ensemble), providing insight toward further synthetic refinement. These InP-ZnSe heterostructures represent progress toward fully cadmium-free QDs with superior photophysical properties important in biological labeling and other emission-based technologies.

  14. Group for High Resolution Sea Surface Temperature (GHRSST) Analysis Fields Inter-Comparisons. Part 1: A GHRSST Multi-Product Ensemble (GMPE)

    DTIC Science & Technology

    2012-05-02

    Le Borgne, P., Poulter, D., Vazquez-Cuervo, J., Armstrong, E., Beggs, H., Llewellyn-Jones, D., Minnett, P., Merchant, C., Evans, R., 2009. The GODAE...Donlon i, Chelle Gentemann j, Robert Grumbine k, Shiro Ishizaki l, Eileen Maturi b, Richard W. Reynolds m, Jonah Roberts-Jones a a Met Office, Exeter...high-resolution sea surface temperature pilot project. Oceanography 22, 34–45. Donlon, C.J., Martin, M., Stark, J.D., Roberts-Jones, J., Fiedler, E

  15. Weather extremes in very large, high-resolution ensembles: the weatherathome experiment

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Rosier, S.; Massey, N.; Rye, C.; Bowery, A.; Miller, J.; Otto, F.; Jones, R.; Wilson, S.; Mote, P.; Stone, D. A.; Yamazaki, Y. H.; Carrington, D.

    2011-12-01

    Resolution and ensemble size are often seen as alternatives in climate modelling. Models with sufficient resolution to simulate many classes of extreme weather cannot normally be run often enough to assess the statistics of rare events, still less how these statistics may be changing. As a result, assessments of the impact of external forcing on regional climate extremes must be based either on statistical downscaling from relatively coarse-resolution models, or on statistical extrapolation from 10-year to 100-year events. Under the weatherathome experiment, part of the climateprediction.net initiative, we have compiled the Met Office Regional Climate Model HadRM3P to run on personal computers volunteered by the general public at 25 and 50 km resolution, embedded within the HadAM3P global atmosphere model. With a global network of about 50,000 volunteers, this allows us to run time-slice ensembles of essentially unlimited size, exploring the statistics of extreme weather under a range of scenarios for surface forcing and atmospheric composition, allowing for uncertainty in both boundary conditions and model parameters. Current experiments, developed with the support of Microsoft Research, focus on three regions: the Western USA, Europe and Southern Africa. We initially simulate the period 1959-2010 to establish which variables are realistically simulated by the model and on what scales. Our next experiments are focussing on the event attribution problem, exploring how the probability of various types of extreme weather would have been different over the recent past in a world unaffected by human influence, following the design of Pall et al (2011), but extended to a longer period and higher spatial resolution. We will present the first results of this unique, global, participatory experiment and discuss the implications for the attribution of recent weather events to anthropogenic influence on climate.
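    The link between ensemble size and resolvable rarity can be made concrete: with N independent members, the empirical frequency of an event only becomes a stable estimate when its probability is well above 1/N. A small sketch, using an invented Gumbel toy distribution rather than weatherathome output:

    ```python
    import numpy as np

    def exceedance_prob(samples, threshold):
        """Empirical probability that a weather variable exceeds `threshold`."""
        samples = np.asarray(samples)
        return np.count_nonzero(samples > threshold) / samples.size

    rng = np.random.default_rng(0)
    # toy daily-maximum temperature distribution (Gumbel); values are invented
    small = rng.gumbel(loc=20.0, scale=3.0, size=50)        # typical model ensemble
    large = rng.gumbel(loc=20.0, scale=3.0, size=50_000)    # distributed computing

    p = exceedance_prob(large, 35.0)        # a rare, roughly 1-in-150 event
    return_period = 1.0 / p
    # with only 50 members this event is often never sampled at all
    ```

    This is why very large volunteer-computed ensembles change what can be said about 100-year events without resorting to statistical extrapolation.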

  16. Projecting the effects of climate change on Calanus finmarchicus distribution within the U.S. Northeast Continental Shelf.

    PubMed

    Grieve, Brian D; Hare, Jon A; Saba, Vincent S

    2017-07-24

    Calanus finmarchicus is vital to pelagic ecosystems in the North Atlantic Ocean. Previous studies suggest the species is vulnerable to the effects of global warming, particularly on the Northeast U.S. Shelf, which is in the southern portion of its range. In this study, we evaluate an ensemble of six different downscaled climate models and a high-resolution global climate model, and create a generalized additive model (GAM) to examine how future changes in temperature and salinity could affect the distribution and density of C. finmarchicus. By 2081-2100, we project average C. finmarchicus density will decrease by as much as 50% under a high greenhouse gas emissions scenario. These decreases are particularly pronounced in the spring and summer in the Gulf of Maine and Georges Bank. When compared to a high-resolution global climate model, the ensemble showed a more uniform change throughout the Northeast U.S. Shelf, while the high-resolution model showed larger decreases in the Northeast Channel, Shelf Break, and Central Gulf of Maine. C. finmarchicus is an important link between primary production and higher trophic levels, and the decrease projected here could be detrimental to the North Atlantic Right Whale and a host of important fishery species.

  17. Predictability of short-range forecasting: a multimodel approach

    NASA Astrophysics Data System (ADS)

    García-Moya, Jose-Antonio; Callado, Alfons; Escribà, Pau; Santos, Carlos; Santos-Muñoz, Daniel; Simarro, Juan

    2011-05-01

    Numerical weather prediction (NWP) models (including mesoscale) have limitations when it comes to dealing with severe weather events because extreme weather is highly unpredictable, even in the short range. A probabilistic forecast based on an ensemble of slightly different model runs may help to address this issue. Among other ensemble techniques, Multimodel ensemble prediction systems (EPSs) are proving to be useful for adding probabilistic value to mesoscale deterministic models. A Multimodel Short Range Ensemble Prediction System (SREPS) focused on forecasting the weather up to 72 h has been developed at the Spanish Meteorological Service (AEMET). The system uses five different limited area models (LAMs), namely HIRLAM (HIRLAM Consortium), HRM (DWD), the UM (UKMO), MM5 (PSU/NCAR) and COSMO (COSMO Consortium). These models run with initial and boundary conditions provided by five different global deterministic models, namely IFS (ECMWF), UM (UKMO), GME (DWD), GFS (NCEP) and CMC (MSC). AEMET-SREPS (AE) validation on the large-scale flow, using ECMWF analysis, shows a consistent and slightly underdispersive system. For surface parameters, the system shows high skill forecasting binary events. 24-h precipitation probabilistic forecasts are verified using an up-scaling grid of observations from European high-resolution precipitation networks, and compared with ECMWF-EPS (EC).

  18. Village Building Identification Based on Ensemble Convolutional Neural Networks

    PubMed Central

    Guo, Zhiling; Chen, Qi; Xu, Yongwei; Shibasaki, Ryosuke; Shao, Xiaowei

    2017-01-01

    In this study, we present the Ensemble Convolutional Neural Network (ECNN), an elaborate CNN framework formulated by ensembling state-of-the-art CNN models, to identify village buildings from open high-resolution remote sensing (HRRS) images. First, to optimize and mine the capability of CNNs for village mapping and to ensure compatibility with our classification targets, a few state-of-the-art models were carefully optimized and enhanced based on a series of rigorous analyses and evaluations. Second, rather than directly implementing building identification by using these models, we exploited most of their advantages by ensembling their feature extractor parts into a stronger model called ECNN based on the multiscale feature learning method. Finally, the generated ECNN was applied to a pixel-level classification framework to implement object identification. The proposed method can serve as a viable tool for village building identification with high accuracy and efficiency. The experimental results obtained from the test area in Savannakhet province, Laos, prove that the proposed ECNN model significantly outperforms existing methods, improving overall accuracy from 96.64% to 99.26%, and kappa from 0.57 to 0.86. PMID:29084154
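    The ECNN fuses the models' feature extractors inside one network; a much simpler, related form of CNN ensembling is sketched below to make the idea concrete: weighted averaging of per-pixel class probabilities from several models. This is an illustrative sketch, not the paper's architecture; all shapes and names are invented.

    ```python
    import numpy as np

    def ensemble_pixel_probs(prob_maps, weights=None):
        """Fuse per-pixel class probabilities from several CNNs and
        return the per-pixel class labels.

        prob_maps : (n_models, H, W, n_classes), each summing to 1 over classes
        """
        prob_maps = np.asarray(prob_maps, dtype=float)
        n_models = prob_maps.shape[0]
        if weights is None:
            weights = np.full(n_models, 1.0 / n_models)
        fused = np.tensordot(weights, prob_maps, axes=1)   # (H, W, n_classes)
        return fused.argmax(axis=-1)

    # two toy "models", one 1x2 image, classes {0: non-building, 1: building}
    maps = [
        [[[0.9, 0.1], [0.4, 0.6]]],   # model A
        [[[0.6, 0.4], [0.2, 0.8]]],   # model B
    ]
    labels = ensemble_pixel_probs(maps)   # -> [[0, 1]]
    ```

    Feature-level ensembling as in the ECNN goes further, learning a joint classifier on the concatenated multiscale features instead of averaging final outputs.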

  19. Deep neural network convolution (NNC) for three-class classification of diffuse lung disease opacities in high-resolution CT (HRCT): consolidation, ground-glass opacity (GGO), and normal opacity

    NASA Astrophysics Data System (ADS)

    Hashimoto, Noriaki; Suzuki, Kenji; Liu, Junchi; Hirano, Yasushi; MacMahon, Heber; Kido, Shoji

    2018-02-01

    Consolidation and ground-glass opacity (GGO) are two major types of opacities associated with diffuse lung diseases. Accurate detection and classification of such opacities are crucially important in the diagnosis of lung diseases, but the process is subjective and suffers from interobserver variability. Our study purpose was to develop a deep neural network convolution (NNC) system for distinguishing among consolidation, GGO, and normal lung tissue in high-resolution CT (HRCT). We developed an ensemble of two deep NNC models, each of which was composed of neural network regression (NNR) with an input layer, a convolution layer, a fully-connected hidden layer, and a fully-connected output layer followed by a thresholding layer. The output layer of each NNC provided a map of the likelihood of the corresponding lung opacity of interest. The two NNC models in the ensemble were connected in a class-selection layer. We trained our NNC ensemble with pairs of input 2D axial slices and "teaching" probability maps for the corresponding lung opacity, which were obtained by combining three radiologists' annotations. We randomly selected 10 and 40 slices from HRCT scans of 172 patients for each class as a training and test set, respectively. Our NNC ensemble achieved areas under the receiver-operating-characteristic (ROC) curve (AUC) of 0.981 and 0.958 in distinguishing consolidation and GGO, respectively, from normal opacity, yielding a classification accuracy of 93.3% among the 3 classes. Thus, our deep-NNC-based system for classifying diffuse lung diseases achieved high accuracies for classification of consolidation, GGO, and normal opacity.
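    The class-selection step, which combines the two single-opacity likelihood maps into one three-class label map, can be sketched as below. This is a plausible reading of the abstract, not the authors' exact layer; the threshold value and the tie-breaking rule are assumptions.

    ```python
    import numpy as np

    def class_selection(p_consolidation, p_ggo, threshold=0.5):
        """Combine two single-opacity likelihood maps into one label map.

        Labels: 0 = normal, 1 = consolidation, 2 = GGO.
        A pixel is 'opaque' if either likelihood reaches the threshold;
        ties between the two opacities go to consolidation.
        """
        p_c = np.asarray(p_consolidation, dtype=float)
        p_g = np.asarray(p_ggo, dtype=float)
        labels = np.zeros(p_c.shape, dtype=int)
        opaque = (p_c >= threshold) | (p_g >= threshold)
        labels[opaque & (p_c >= p_g)] = 1
        labels[opaque & (p_g > p_c)] = 2
        return labels

    # toy 2x2 likelihood maps from the two NNC models
    p_c = np.array([[0.9, 0.2], [0.4, 0.6]])
    p_g = np.array([[0.1, 0.8], [0.3, 0.7]])
    labels = class_selection(p_c, p_g)   # -> [[1, 2], [0, 2]]
    ```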

  20. Short-term Temperature Prediction Using Adaptive Computing on Dynamic Scales

    NASA Astrophysics Data System (ADS)

    Hu, W.; Cervone, G.; Jha, S.; Balasubramanian, V.; Turilli, M.

    2017-12-01

    When predicting temperature, there are specific places and times when high-accuracy predictions are harder to obtain. For example, not all the sub-regions in the domain require the same amount of computing resources to generate an accurate prediction. Plateau areas might require fewer computing resources than mountainous areas because of the steeper gradient of temperature change in the latter. However, it is difficult to estimate beforehand the optimal allocation of computational resources because several parameters, in addition to orography, play a role in determining the accuracy of the forecasts. The allocation of resources to perform simulations can become a bottleneck because it requires human intervention to stop jobs or start new ones. The goal of this project is to design and develop a dynamic approach to generate short-term temperature predictions that can automatically determine the required computing resources and the geographic scales of the predictions based on the spatial and temporal uncertainties. The predictions and the prediction quality metrics are computed using the Analog Ensemble (AnEn) technique, and the parallelization on high performance computing systems is accomplished using Ensemble Toolkit, one component of the RADICAL-Cybertools family of tools. RADICAL-Cybertools decouple the science needs from the computational capabilities by building an intermediate layer to run general ensemble patterns, regardless of the science. In this research, we show how the Ensemble Toolkit allows generating high resolution temperature forecasts at different spatial and temporal resolutions. The AnEn algorithm is run using NAM analysis and forecast data for the continental United States for a period of 2 years. AnEn results show that the temperature forecasts perform well according to different probabilistic and deterministic statistical tests.
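    The core of the AnEn technique is simple: find the past model forecasts most similar to the current one and let their verifying observations form the ensemble. A minimal sketch with invented toy data (real AnEn implementations use multi-variable, time-windowed similarity metrics):

    ```python
    import numpy as np

    def analog_ensemble(hist_fcst, hist_obs, current_fcst, n_analogs=5):
        """Analog Ensemble: return the observations that verified the past
        forecasts closest (in Euclidean distance) to the current forecast."""
        dist = np.linalg.norm(hist_fcst - current_fcst, axis=1)
        idx = np.argsort(dist)[:n_analogs]
        return hist_obs[idx]

    # toy history: 1-feature forecasts and the temperatures actually observed
    hist_fcst = np.array([[14.0], [18.0], [22.0], [30.0]])
    hist_obs = np.array([13.5, 17.8, 21.6, 29.1])
    ens = analog_ensemble(hist_fcst, hist_obs, np.array([19.0]), n_analogs=2)
    # the two closest past forecasts (18.0 and 22.0) supply the members
    ```

    The spread of `ens` gives the local uncertainty estimate that, in this project, drives the allocation of computing resources per sub-region.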

  1. Consistency of climate change projections from multiple global and regional model intercomparison projects

    NASA Astrophysics Data System (ADS)

    Fernández, J.; Frías, M. D.; Cabos, W. D.; Cofiño, A. S.; Domínguez, M.; Fita, L.; Gaertner, M. A.; García-Díez, M.; Gutiérrez, J. M.; Jiménez-Guerrero, P.; Liguori, G.; Montávez, J. P.; Romera, R.; Sánchez, E.

    2018-03-01

    We present an unprecedented ensemble of 196 future climate projections arising from different global and regional model intercomparison projects (MIPs): CMIP3, CMIP5, ENSEMBLES, ESCENA, EURO- and Med-CORDEX. This multi-MIP ensemble includes all regional climate model (RCM) projections publicly available to date, along with their driving global climate models (GCMs). We illustrate consistent and conflicting messages using continental Spain and the Balearic Islands as the target region. The study considers near-future (2021-2050) changes and their dependence on several uncertainty sources sampled in the multi-MIP ensemble: GCM, future scenario, internal variability, RCM, and spatial resolution. This initial work focuses on mean seasonal precipitation and temperature changes. The results show that the potential GCM-RCM combinations have been explored very unevenly, with favoured GCMs and large ensembles of a few RCMs that do not correspond to any systematic ensemble design. Therefore, the grand-ensemble is weighted towards a few models. The selection of a balanced, credible sub-ensemble is challenged in this study by illustrating several conflicting responses between the RCM and its driving GCM and among different RCMs. Sub-ensembles from different initiatives are dominated by different uncertainty sources, with the driving GCM being the main contributor to uncertainty in the grand-ensemble. For this analysis of near-future changes, the emission scenario does not contribute strongly to the uncertainty. Despite the extra computational effort, for mean seasonal changes, the increase in resolution does not lead to important changes.

  2. Evaluation of local adaptation strategies to climate change of maize crop in Andalusia for the first half of 21st century

    NASA Astrophysics Data System (ADS)

    Gabaldón, Clara; Lorite, Ignacio J.; Inés Mínguez, M.; Dosio, Alessandro; Sánchez-Sánchez, Enrique; Ruiz-Ramos, Margarita

    2013-04-01

    The objective of this work is to generate and analyse adaptation strategies to cope with impacts of climate change on cereal cropping systems in Andalusia (Southern Spain) in a semi-arid environment, with a focus on extreme events. In Andalusia, located in the south of the Iberian Peninsula, cereal crops may be affected by the increase in average temperatures, precipitation variability and possible extreme events. Those impacts may cause a decrease in both water availability and the pollination rate, resulting in a decrease in yield and in farmers' profitability. Designing local and regional adaptation strategies to reduce these negative impacts is necessary. This study is focused on irrigated maize at five Andalusian locations. The Andalusia Network of Agricultural Trials (RAEA in Spanish) provided the experimental crop and soil data, and the observed climate data were obtained from the Agroclimatic Information Network of Andalusia and the Spanish National Meteorological Agency (AEMET in Spanish). The data for future climate scenarios (2013-2050) were generated by Dosio and Paruolo (2011) and Dosio et al. (2012), who corrected the bias of ENSEMBLES data for maximum and minimum temperatures and precipitation. ENSEMBLES data were the results of numerical simulations obtained from a group of regional climate models at high resolution (25 km) from the European Project ENSEMBLES (http://www.ensembles-eu.org/). The crop models considered were CERES-maize (Jones and Kiniry, 1986) under the DSSAT platform, and CropSyst (Stockle et al., 2003). Those crop models were applied only at locations where calibration and validation had been done. The effects of the adaptation strategies, such as changes in sowing dates or choice of cultivar, were evaluated regarding water consumption; changes in phenological dates were also analysed for comparison with the occurrence of extreme events of maximum temperature. 
These events represent a threat to summer crops due to the reduction in the duration of the grain-filling period, with the consequent reduction in yield (Ruiz-Ramos et al., 2011), and due to supraoptimal temperatures during pollination. Finally, results of simulated impacts and adaptations were compared to previous studies done without bias correction of climatic projections, at lower resolution and with previous versions of the crop models (Mínguez et al., 2007). This study will contribute to the MACSUR knowledge hub within the Joint Programming Initiative on Agriculture, Food Security and Climate Change (FACCE-JPI) of the EU and is financed by the MULCLIVAR project (CGL2012-38923-C02-02) and IFAPA project AGR6126 from the Junta de Andalucía, Spain. References: Dosio, A. and Paruolo, P., 2011. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Evaluation on the present climate. Journal of Geophysical Research 116, D16106, doi:10.1029/2011JD015934. Dosio, A., Paruolo, P. and Rojas, R., 2012. Bias correction of the ENSEMBLES high resolution climate change projections for use by impact models: Analysis of the climate change signal. Journal of Geophysical Research 117, D17, doi:10.1029/2012JD017968. Jones, C.A., and J.R. Kiniry. 1986. CERES-Maize: A simulation model of maize growth and development. Texas A&M Univ. Press, College Station. Mínguez, M.I., M. Ruiz-Ramos, C.H. Díaz-Ambrona, and M. Quemada. 2007. First-order impacts on winter and summer crops assessed with various high-resolution climate models in the Iberian Peninsula. Climatic Change 81: 343-355. Ruiz-Ramos, M., E. Sanchez, C. Gallardo, and M.I. Minguez. 2011. Impacts of projected maximum temperature extremes for C21 by an ensemble of regional climate models on cereal cropping systems in the Iberian Peninsula. Natural Hazards and Earth System Science 11: 3275-3291. Stockle, C.O., M. Donatelli, and R. Nelson. 2003. CropSyst, a cropping systems simulation model. 
European Journal of Agronomy 18: 289-307.

  3. Wave ensemble forecast system for tropical cyclones in the Australian region

    NASA Astrophysics Data System (ADS)

    Zieger, Stefan; Greenslade, Diana; Kepert, Jeffrey D.

    2018-05-01

    Forecasting of waves under extreme conditions such as tropical cyclones is vitally important for many offshore industries, but many challenges remain. For Northwest Western Australia (NW WA), wave forecasts issued by the Australian Bureau of Meteorology have previously been limited to products from deterministic operational wave models forced by deterministic atmospheric models. The wave models are run over global (1/4° resolution) and regional (1/10° resolution) domains with forecast ranges of +7 and +3 days, respectively. Because of this relatively coarse resolution (both in the wave models and in the forcing fields), the accuracy of these products is limited under tropical cyclone conditions. Given this limited accuracy, a new ensemble-based wave forecasting system for the NW WA region has been developed. To achieve this, a new dedicated 8-km resolution grid was nested in the global wave model. Over this grid, the wave model is forced with winds from a bias-corrected European Centre for Medium-Range Weather Forecasts atmospheric ensemble that comprises 51 ensemble members to take into account the uncertainties in the location, intensity and structure of a tropical cyclone system. A unique technique is used to select restart files for each wave ensemble member. The system is designed to operate in real time during the cyclone season, providing +10-day forecasts. This paper will describe the wave forecast components of this system and present the verification metrics and skill for specific events.

  4. Sensitivity of modeled estuarine circulation to spatial and temporal resolution of input meteorological forcing of a cold frontal passage

    NASA Astrophysics Data System (ADS)

    Weaver, Robert J.; Taeb, Peyman; Lazarus, Steven; Splitt, Michael; Holman, Bryan P.; Colvin, Jeffrey

    2016-12-01

    In this study, a four-member ensemble of meteorological forcing is generated using the Weather Research and Forecasting (WRF) model in order to simulate a frontal passage event that impacted the Indian River Lagoon (IRL) during March 2015. The WRF model is run to provide high- and low-resolution spatial (0.005° and 0.1°) and temporal (30 min and 6 h) input wind and pressure fields. The four-member ensemble is used to force the Advanced Circulation model (ADCIRC) coupled with Simulating Waves Nearshore (SWAN) and compute the hydrodynamic and wave response. Results indicate that increasing the spatial resolution of the meteorological forcing has a greater impact on the results than increasing the temporal resolution in coastal systems like the IRL, where the length scales are smaller than the resolution of the operational meteorological model being used to generate the forecast. Changes in predicted water elevations are due in part to the upwind and downwind behavior of the input wind forcing. The significant wave height is more sensitive to the meteorological forcing, exhibited by a greater ensemble spread throughout the simulation. It is important that the land mask seen by the meteorological model is representative of the geography of the coastal estuary as resolved by the hydrodynamic model. As long as the temporal resolution of the wind field captures the bulk characteristics of the frontal passage, computational resources should be focused on ensuring that the meteorological model resolves the spatial complexities, such as the land-water interface, that drive the dynamic downscaling of the winds.

  5. A 3-month long operational implementation of an ensemble prediction system of storm surge for the city of Venice

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Lionello, Piero

    2014-05-01

    Advantages of an ensemble prediction forecast (EPF) technique that has been used for sea level (SL) prediction at the Northern Adriatic coast are investigated. The aim is to explore whether EPF is more precise than the traditional deterministic forecast (DF), and to assess the value of the added information, mainly on forecast uncertainty. Improving the SL forecast for the city of Venice is of paramount importance for the management and maintenance of this historical city and for operating the movable barriers that are presently being built for its protection. The operational practice is simulated for three months, from 1st October to 31st December 2010. The EPF is based on the HYPSE model, a standard single-layer nonlinear shallow water model whose equations are derived from the depth-averaged momentum equations and which predicts the SL. A description of the model is available in the scientific literature. Forcings for HYPSE are provided by three different sets of 3-hourly ECMWF 10m-wind and MSLP fields: the high resolution meteorological forecast (which is used for the deterministic SL forecast, DF), the control run forecast (CRF, which differs from the DF only in the lower resolution of its meteorological fields) and the 50 ensemble members of the ECMWF EPS (which are used for the SL-EPS). The resolution of the DF fields is T1279, and the resolution of both the CRF and ECMWF EPS fields is T639. The 10m wind and MSLP fields have been downloaded at 0.125° (DF) and 0.25° (CRF and EPS) and linearly interpolated to the HYPSE grid (which is the same for all simulations). The version of HYPSE used in the SL-EPS uses a rectangular mesh grid of variable size, with the minimum grid step (0.03°) in the northern part of the Adriatic Sea, from which the grid step increases by a factor of 1.01 in both latitude and longitude (in practice, the resolution varies in the range from 3.3 to 7 km). 
Results are analyzed considering the EPS spread, the RMS error of the simulations, and the Brier Skill Score, and are compared to observations at tide gauges distributed along the Croatian and Italian coasts of the Adriatic Sea. It is shown that the ensemble spread is indeed a reliable indicator of the uncertainty of the storm surge prediction. Further, results show how uncertainty depends on the predicted value of sea level and how it increases with the forecast time range. The accuracy of the ensemble mean forecast is actually higher than that of the deterministic forecast, even though the latter is produced by meteorological forcings at higher resolution.
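    The two diagnostics named above can be sketched in a few lines. The following is a minimal illustration (with invented numbers, not the paper's data) of how ensemble spread and a Brier Skill Score are typically computed for a threshold-exceedance forecast:

```python
import numpy as np

def ensemble_spread(members):
    """Standard deviation across ensemble members at each forecast lead time."""
    return np.std(members, axis=0, ddof=1)

def brier_score(prob_forecasts, outcomes):
    """Mean squared difference between forecast probabilities and 0/1 outcomes."""
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(prob_forecasts, outcomes):
    """Skill relative to a climatological forecast (the base rate)."""
    o = np.asarray(outcomes, dtype=float)
    bs = brier_score(prob_forecasts, outcomes)
    bs_clim = brier_score(np.full_like(o, o.mean()), outcomes)
    return 1.0 - bs / bs_clim

# Toy example: 5 members forecasting sea level (cm) at 3 lead times,
# and the member fraction exceeding a 100 cm threshold.
members = np.array([[98., 105., 112.],
                    [95., 102., 108.],
                    [99., 107., 115.],
                    [96., 101., 104.],
                    [97., 104., 110.]])
spread = ensemble_spread(members)            # grows with lead time
p_exceed = np.mean(members > 100.0, axis=0)  # exceedance probability per lead time
```

    The growing spread with lead time is exactly the behaviour the abstract reports as a reliable uncertainty indicator.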

  6. Ensemble forecasting for renewable energy applications - status and current challenges for their generation and verification

    NASA Astrophysics Data System (ADS)

    Pinson, Pierre

    2016-04-01

    The operational management of renewable energy generation in power systems and electricity markets requires forecasts in various forms, e.g., deterministic or probabilistic, continuous or categorical, depending upon the decision process at hand. Such forecasts may also be necessary at various spatial and temporal scales, from high temporal resolutions (on the order of minutes) and very localized extents for an offshore wind farm, to coarser temporal resolutions (hours) covering a whole country for day-ahead power scheduling problems. As of today, weather predictions are a common input to forecasting methodologies for renewable energy generation. Since for most decision processes optimal decisions can only be made by accounting for forecast uncertainties, ensemble predictions and density forecasts are increasingly seen as the product of choice. After discussing some of the basic approaches to obtaining ensemble forecasts of renewable power generation, it will be argued that space-time trajectories of renewable power production may or may not necessitate post-processing of ensemble forecasts for relevant weather variables. Example approaches and test-case applications will be covered, e.g., looking at the Horns Rev offshore wind farm in Denmark, or gridded forecasts for the whole of continental Europe. Finally, we will illustrate some of the limitations of current frameworks for forecast verification, which make it difficult to fully assess the quality of post-processing approaches used to obtain renewable energy predictions.
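    A standard building block for verifying the density forecasts mentioned above is the quantile (pinball) loss, scored quantile by quantile. The sketch below uses invented wind-power numbers and is not tied to any specific system from the abstract:

```python
import numpy as np

def pinball_loss(y_true, y_quantile, tau):
    """Pinball (quantile) loss for a forecast of the tau-quantile.

    Under-forecasts are penalized with weight tau and over-forecasts
    with weight (1 - tau); averaging over many cases scores one
    quantile of a probabilistic wind-power forecast.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_quantile = np.asarray(y_quantile, dtype=float)
    diff = y_true - y_quantile
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# Toy example: observed wind power (MW) vs. a forecast 0.9-quantile.
observed = np.array([12.0, 30.0, 25.0])
q90 = np.array([20.0, 28.0, 27.0])
loss = pinball_loss(observed, q90, 0.9)
```

    For tau = 0.5 the pinball loss reduces to half the mean absolute error, which links the probabilistic score back to a familiar deterministic one.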

  7. Future changes in peak river flows across northern Eurasia as inferred from an ensemble of regional climate projections under the IPCC RCP8.5 scenario

    NASA Astrophysics Data System (ADS)

    Shkolnik, Igor; Pavlova, Tatiana; Efimov, Sergey; Zhuravlev, Sergey

    2018-01-01

    A climate change simulation based on a 30-member ensemble of the Voeikov Main Geophysical Observatory RCM (resolution 25 km) for northern Eurasia is used to drive the hydrological model CaMa-Flood. Using this modeling framework, we evaluate the uncertainties in the future projection of peak river discharge and flood hazard by 2050-2059 relative to 1990-1999 under the IPCC RCP8.5 scenario. The large ensemble size, along with the reasonably high modeling resolution, allows one to efficiently sample natural climate variability and increases our ability to predict future changes in hydrological extremes. It is shown that the annual maximum river discharge can almost double by the mid-21st century in the outlets of major Siberian rivers. In the western regions, there is a weak signal in river discharge and flood hazard, hardly discernible above climate variability. The annual maximum flood area is projected to increase across Siberia, mostly by 2-5% relative to the baseline period. The contribution of natural climate variability at different temporal scales to the uncertainty of the ensemble prediction is discussed. The analysis shows that considerable changes are expected in the extreme river discharge probability at the locations of key hydropower facilities. This suggests that extensive impact studies are required to develop recommendations for maintaining regional energy security.
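    Peak-flow statistics of the kind discussed above are commonly summarized as empirical return periods of annual maxima. As a minimal, generic illustration (Weibull plotting positions, toy discharge values unrelated to the study's rivers):

```python
import numpy as np

def empirical_return_periods(annual_maxima):
    """Weibull plotting positions: return period T = (n + 1) / rank,
    with rank 1 assigned to the largest annual maximum."""
    x = np.sort(np.asarray(annual_maxima, dtype=float))[::-1]
    ranks = np.arange(1, len(x) + 1)
    T = (len(x) + 1) / ranks
    return x, T

# Toy annual maximum discharges (m^3/s) for a 9-year record.
amax = [5200., 4100., 6100., 3900., 4800., 7200., 4500., 5600., 4300.]
flows, T = empirical_return_periods(amax)
```

    A large ensemble multiplies the number of model years available, which is what lets rare, long-return-period events appear in the sample at all.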

  8. Assimilating every-30-second 100-m-mesh radar observations for convective weather: implications to non-Gaussian PDF

    NASA Astrophysics Data System (ADS)

    Miyoshi, T.; Teramura, T.; Ruiz, J.; Kondo, K.; Lien, G. Y.

    2016-12-01

    Convective weather is known to be highly nonlinear and chaotic, and it is hard to predict its location and timing precisely. Our Big Data Assimilation (BDA) effort has been exploring the use of dense and frequent observations to avoid non-Gaussian probability density functions (PDFs) and to apply an ensemble Kalman filter under the Gaussian error assumption. The phased array weather radar (PAWR) can observe a dense three-dimensional volume scan with 100-m range resolution and 100 elevation angles in only 30 seconds. The BDA system assimilates the PAWR reflectivity and Doppler velocity observations every 30 seconds into 100 ensemble members of a storm-scale numerical weather prediction (NWP) model at 100-m grid spacing. The 30-second-update, 100-m-mesh BDA system has been quite successful in multiple case studies of local severe rainfall events. However, with 1000 ensemble members, the reduced-resolution BDA system at 1-km grid spacing showed significant non-Gaussian PDFs with every-30-second updates. With a 10240-member ensemble Kalman filter and a global NWP model at 112-km grid spacing, we found roughly 1000 members satisfactory for capturing the non-Gaussian error structures. With these in mind, we explore how the density of observations in space and time affects the non-Gaussianity in an ensemble Kalman filter with a simple toy model. In this presentation, we will present the most up-to-date results of the BDA research, as well as the investigation with the toy model on the non-Gaussianity with dense and frequent observations.
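    The Gaussian-assumption update at the heart of an ensemble Kalman filter can be written compactly. Below is a generic perturbed-observation EnKF analysis step on a toy 3-variable state; it is a textbook sketch, not the BDA system's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(X, y, H, R):
    """Perturbed-observation EnKF analysis step.

    X : (n_state, n_members) forecast ensemble
    y : (n_obs,) observation vector
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs) observation-error covariance
    """
    n_members = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
    Pf = A @ A.T / (n_members - 1)                   # sample forecast covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # Perturb observations so the analysis ensemble keeps the correct spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_members).T
    return X + K @ (Y - H @ X)

# Toy: 3-variable state, 40 members, observing only the first variable.
X = rng.normal(0.0, 1.0, size=(3, 40))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
Xa = enkf_update(X, np.array([0.5]), H, R)
```

    The update pulls the ensemble mean toward the observation and shrinks the spread of the observed variable, which is optimal only while the forecast PDF stays near-Gaussian; this is precisely why update frequency matters in the abstract's argument.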

  9. Evaluation of WRF-based convection-permitting multi-physics ensemble forecasts over China for an extreme rainfall event on 21 July 2012 in Beijing

    NASA Astrophysics Data System (ADS)

    Zhu, Kefeng; Xue, Ming

    2016-11-01

    On 21 July 2012, an extreme rainfall event that recorded a maximum 24-hour rainfall amount of 460 mm occurred in Beijing, China. Most operational models failed to predict such an extreme amount. In this study, a convection-permitting ensemble forecast system (CEFS) at 4-km grid spacing, covering the entire mainland of China, is applied to this extreme rainfall case. CEFS consists of 22 members and uses multiple physics parameterizations. For the event, the predicted maximum is 415 mm d-1 in the probability-matched ensemble mean. The predicted high-probability heavy rain region is located in southwest Beijing, as was observed. Ensemble-based verification scores are then investigated. For a small verification domain covering Beijing and its surrounding areas, the precipitation rank histogram of CEFS is much flatter than that of a reference global ensemble. CEFS has a lower Brier score and a higher resolution than the global ensemble for precipitation, indicating more reliable probabilistic forecasting by CEFS. Additionally, forecasts of different ensemble members are compared and discussed. Most of the extreme rainfall comes from convection in the warm sector east of an approaching cold front. A few members of CEFS successfully reproduce such precipitation, and orographic lift of highly moist low-level flows with a significant southeasterly component is suggested to have played an important role in producing the initial convection. Comparisons between good and bad forecast members indicate a strong sensitivity of the extreme rainfall to the mesoscale environmental conditions and, to a lesser extent, the model physics.
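    The probability-matched ensemble mean mentioned above avoids the amplitude smoothing of a plain average. One common construction (in the spirit of Ebert's probability matching; the operational variant may differ in detail) on a toy precipitation field:

```python
import numpy as np

def probability_matched_mean(members):
    """Probability-matched ensemble mean.

    Keeps the spatial pattern (ranking) of the ensemble-mean field but
    restores the amplitude distribution pooled from all members, so that
    extreme values are not averaged away.
    members : (n_members, n_points) precipitation fields
    """
    mean_field = members.mean(axis=0)
    n_members, n_points = members.shape
    # Pool all member values, sort descending, keep every n_members-th value
    # so the pooled sample matches the size of one field.
    pooled = np.sort(members.ravel())[::-1][::n_members]
    # Rank the mean field (descending) and assign the pooled amplitudes.
    order = np.argsort(mean_field)[::-1]
    pm = np.empty(n_points)
    pm[order] = pooled[:n_points]
    return pm

# Toy: 3 members over 4 grid points (mm/day).
members = np.array([[0., 10., 50., 5.],
                    [0., 20., 30., 5.],
                    [0., 15., 70., 10.]])
pm = probability_matched_mean(members)
```

    Here the plain ensemble mean tops out at 50 mm/day while the probability-matched mean recovers the 70 mm/day extreme at the same location, illustrating why the 415 mm d-1 maximum is reported from this product.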

  10. Predicting nucleic acid binding interfaces from structural models of proteins

    PubMed Central

    Dror, Iris; Shazman, Shula; Mukherjee, Srayanta; Zhang, Yang; Glaser, Fabian; Mandel-Gutfreund, Yael

    2011-01-01

    The function of DNA- and RNA-binding proteins can be inferred from the characterization and accurate prediction of their binding interfaces. However, the main pitfall of various structure-based methods for predicting nucleic acid binding function is that they are all limited to a relatively small number of proteins for which high-resolution three-dimensional structures are available. In this study, we developed a pipeline for extracting functional electrostatic patches from the surfaces of protein structural models, obtained using the I-TASSER protein structure predictor. The largest positive patches are extracted from the protein surface using the PatchFinder algorithm. We show that functional electrostatic patches extracted from an ensemble of structural models highly overlap the patches extracted from high-resolution structures. Furthermore, by testing our pipeline on a set of 55 known nucleic acid binding proteins for which I-TASSER produces high-quality models, we show that the method accurately identifies the nucleic acid binding interface on structural models of proteins. Employing a combined patch approach, we show that patches extracted from an ensemble of models better predict the real nucleic acid binding interfaces than patches extracted from independent models. Overall, these results suggest that combining information from a collection of low-resolution structural models could be a valuable approach for functional annotation. We suggest that our method will be further applicable to predicting other functional surfaces of proteins with unknown structure. PMID:22086767
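    The patch-overlap claim above is naturally quantified as a set overlap between residue sets. A minimal sketch, with hypothetical residue IDs (the paper's actual overlap measure may differ):

```python
def patch_overlap(patch_a, patch_b):
    """Jaccard overlap between two surface patches given as residue-ID sets."""
    a, b = set(patch_a), set(patch_b)
    if not a and not b:
        return 1.0  # two empty patches trivially coincide
    return len(a & b) / len(a | b)

# Hypothetical residue IDs: a patch from a structural model vs. the patch
# extracted from the corresponding high-resolution structure.
model_patch = {12, 13, 14, 40, 41, 42, 77}
xray_patch = {13, 14, 15, 40, 41, 42, 78}
overlap = patch_overlap(model_patch, xray_patch)
```

    A combined-patch approach would pool residues across an ensemble of models before computing such an overlap, which is why it can outperform patches from any single model.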

  11. Benefits of an ultra large and multiresolution ensemble for estimating available wind power

    NASA Astrophysics Data System (ADS)

    Berndt, Jonas; Hoppe, Charlotte; Elbern, Hendrik

    2016-04-01

    In this study we investigate the benefits of an ultra-large ensemble with up to 1000 members, including multiple nesting with a target horizontal resolution of 1 km. The ensemble shall serve as a basis for detecting events of extreme errors in wind power forecasting. The forecast quantity is the wind vector at wind-turbine hub height (~100 m) in the short range (1 to 24 hours). Current wind power forecast systems already rest on NWP ensemble models. However, only calibrated ensembles from meteorological institutions have served as input so far, with limited spatial resolution (~10-80 km) and member number (~50), and perturbations related to the specific merits of wind power production are still missing. Thus, single extreme error events that are not detected by such ensemble power forecasts still occur, albeit infrequently. The numerical forecast model used in this study is the Weather Research and Forecasting (WRF) model. Model uncertainties are represented by stochastic parametrization of sub-grid processes via stochastically perturbed parametrization tendencies, in conjunction with the complementary stochastic kinetic-energy backscatter scheme already provided by WRF. We perform continuous ensemble updates by comparing each ensemble member with available observations using a sequential importance resampling filter, which improves model accuracy while maintaining ensemble spread. Additionally, we use different ensemble systems from global models (ECMWF and GFS) as input and boundary conditions to capture different synoptic conditions. Critical weather situations connected to extreme error events are located, and corresponding perturbation techniques are applied. The demanding computational effort is met by utilising the supercomputer JUQUEEN at Forschungszentrum Juelich.
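    The sequential importance resampling step mentioned above can be sketched generically: weight each member by its likelihood given an observation, then resample with replacement. This is a one-observation toy, not the study's WRF configuration:

```python
import numpy as np

rng = np.random.default_rng(42)

def sir_resample(members, obs, obs_err):
    """One sequential importance resampling (SIR) step.

    Weights each member by its Gaussian likelihood given the observation,
    then resamples members (with replacement) proportionally to weight,
    concentrating the ensemble near the data.
    members : (n_members,) forecast values at the observation location
    """
    w = np.exp(-0.5 * ((members - obs) / obs_err) ** 2)
    w /= w.sum()
    idx = rng.choice(len(members), size=len(members), p=w)
    return members[idx]

# Toy: hub-height wind speed members (m/s) vs. an observation of 8 m/s.
members = np.array([4.0, 6.0, 8.0, 10.0, 14.0])
resampled = sir_resample(members, obs=8.0, obs_err=1.5)
```

    Resampling duplicates well-fitting members and discards outliers; in practice the duplicated members are then re-perturbed so that ensemble spread is maintained, as the abstract notes.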

  12. Throwing the Uncertainty Toolbox at Antarctica: Multi-model Ensemble Simulation, Emulation and Bayesian Calibration of Marine Ice Sheet Instability

    NASA Astrophysics Data System (ADS)

    Edwards, T.

    2015-12-01

    Modelling Antarctic marine ice sheet instability (MISI) - the potential for sustained grounding line retreat along downsloping bedrock - is very challenging, because high resolution at the grounding line is required for reliable simulation. Assessing modelling uncertainties is even more difficult, because such models are very computationally expensive, restricting the number of simulations that can be performed. Quantification of uncertainty in future Antarctic instability has therefore so far been limited. There are several ways to tackle this problem, including: (1) simulating a small domain, to reduce expense and allow the use of ensemble methods; (2) parameterising the response of the grounding line to the onset of MISI, for the same reasons; (3) emulating the simulator with a statistical model, to explore the impacts of uncertainties more thoroughly; and (4) substituting physical models with expert-elicited statistical distributions. Methods 2-4 require rigorous testing against observations and high-resolution models for their results to be trusted. We use all four to examine the dependence of MISI in the Amundsen Sea Embayment (ASE) on uncertain model inputs, including bedrock topography, ice viscosity, basal friction, model structure (sliding law and treatment of grounding line migration) and MISI triggers (including basal melting and risk of ice shelf collapse). We compare simulations from a 3000-member ensemble with GRISLI (methods 2, 4) with a 284-member ensemble from BISICLES (method 1), and also use emulation (method 3). Results from the two ensembles show similarities, despite very different model structures and ensemble designs. Basal friction and topography have a large effect on the extent of grounding line retreat, and the sliding law strongly modifies sea level contributions through changes in the rate and extent of grounding line retreat and the rate of ice thinning. 
Over 50 years, MISI in the ASE gives up to 1.1 mm/year sea-level equivalent (SLE; 95% quantile) in GRISLI (calibrated with ASE mass losses in a Bayesian framework), and up to 1.2 mm/year SLE (95% quantile) in the 270 completed BISICLES simulations (no calibration). We will show preliminary results from emulating the models, calibrating them with observations, and comparing them to assess structural uncertainty. We use these to improve MISI projections for the whole continent.
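    The emulation-plus-calibration workflow described above can be sketched end to end on a toy problem. Everything here is invented for illustration (a fake one-parameter "simulator", a quadratic fit standing in for a Gaussian-process emulator, and likelihood weighting standing in for the full Bayesian machinery):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "simulator": sea-level contribution (mm/yr) as an unknown
# function of a single friction parameter. In reality this is an expensive
# ice-sheet model run, which is why an emulator is needed.
def simulator(theta):
    return 0.4 + 0.8 * theta ** 2

# 1) Small ensemble of simulator runs (the design points).
theta_train = np.linspace(0.0, 1.0, 8)
y_train = simulator(theta_train)

# 2) Emulator: a cheap quadratic fit standing in for a Gaussian process.
coeffs = np.polyfit(theta_train, y_train, deg=2)
emulate = lambda theta: np.polyval(coeffs, theta)

# 3) Bayesian calibration by likelihood-weighting a dense prior sample
#    against a hypothetical observed mass-loss rate.
theta_prior = rng.uniform(0.0, 1.0, 20000)
obs, obs_err = 0.6, 0.05
w = np.exp(-0.5 * ((emulate(theta_prior) - obs) / obs_err) ** 2)
posterior_mean = np.sum(w * theta_prior) / np.sum(w)
```

    The emulator makes the dense prior sweep affordable; calibration then concentrates the parameter distribution on values consistent with the observation, exactly the role the observed ASE mass losses play in the abstract.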

  13. High-resolution magnetic resonance spectroscopy using a solid-state spin sensor

    NASA Astrophysics Data System (ADS)

    Glenn, David R.; Bucher, Dominik B.; Lee, Junghyun; Lukin, Mikhail D.; Park, Hongkun; Walsworth, Ronald L.

    2018-03-01

    Quantum systems that consist of solid-state electronic spins can be sensitive detectors of nuclear magnetic resonance (NMR) signals, particularly from very small samples. For example, nitrogen–vacancy centres in diamond have been used to record NMR signals from nanometre-scale samples, with sensitivity sufficient to detect the magnetic field produced by a single protein. However, the best reported spectral resolution for NMR of molecules using nitrogen–vacancy centres is about 100 hertz. This is insufficient to resolve the key spectral identifiers of molecular structure that are critical to NMR applications in chemistry, structural biology and materials research, such as scalar couplings (which require a resolution of less than ten hertz) and small chemical shifts (which require a resolution of around one part per million of the nuclear Larmor frequency). Conventional, inductively detected NMR can provide the necessary high spectral resolution, but its limited sensitivity typically requires millimetre-scale samples, precluding applications that involve smaller samples, such as picolitre-volume chemical analysis or correlated optical and NMR microscopy. Here we demonstrate a measurement technique that uses a solid-state spin sensor (a magnetometer) consisting of an ensemble of nitrogen–vacancy centres in combination with a narrowband synchronized readout protocol to obtain NMR spectral resolution of about one hertz. We use this technique to observe NMR scalar couplings in a micrometre-scale sample volume of approximately ten picolitres. We also use the ensemble of nitrogen–vacancy centres to apply NMR to thermally polarized nuclear spins and resolve chemical-shift spectra from small molecules. Our technique enables analytical NMR spectroscopy at the scale of single cells.
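    The resolution requirements quoted above follow directly from the Larmor relation. A small sketch using the proton gyromagnetic ratio (a standard constant; the 0.1 T field below is a hypothetical choice, not the paper's setup):

```python
GAMMA_P = 42.577478e6   # proton gyromagnetic ratio / 2*pi, in Hz per tesla

def larmor_hz(b0_tesla):
    """Proton Larmor frequency at a given static field."""
    return GAMMA_P * b0_tesla

def ppm_in_hz(b0_tesla, ppm=1.0):
    """Spectral width corresponding to a chemical shift of `ppm`
    at field b0 - the resolution needed to separate such shifts."""
    return larmor_hz(b0_tesla) * ppm * 1e-6

# At a hypothetical 0.1 T bias field, a 1 ppm chemical shift spans only
# a few hertz, so roughly 1 Hz spectral resolution is required.
shift_width = ppm_in_hz(0.1)
```

    Scalar couplings are field-independent and of order a few hertz, so they impose a similar sub-10 Hz requirement regardless of field, which is why the ~100 Hz resolution of earlier nitrogen-vacancy protocols was insufficient.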

  14. A comparison of breeding and ensemble transform vectors for global ensemble generation

    NASA Astrophysics Data System (ADS)

    Deng, Guo; Tian, Hua; Li, Xiaoli; Chen, Jing; Gong, Jiandong; Jiao, Meiyan

    2012-02-01

    To compare initial perturbation techniques using breeding vectors and ensemble transform vectors, three ensemble prediction systems using both initial perturbation methods, but with different ensemble member sizes, were constructed at the National Meteorological Center, China Meteorological Administration (NMC/CMA), based on the spectral model T213/L31. A series of ensemble verification scores, such as forecast skill of the ensemble mean, ensemble resolution, and ensemble reliability, are introduced to identify the most important attributes of ensemble forecast systems. The results indicate that the ensemble transform technique is superior to the breeding vector method in light of the anomaly correlation coefficient (ACC), a deterministic attribute of the ensemble mean; the root-mean-square error (RMSE) and spread, which are probabilistic attributes; and the continuous ranked probability score (CRPS) and its decomposition. The advantage of the ensemble transform approach is attributed to the orthogonality among its ensemble perturbations as well as its consistency with the data assimilation system. This study may therefore serve as a reference for configuring the best ensemble prediction system for operational use.
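    The CRPS used above has a convenient closed form for a finite ensemble. A minimal sketch with toy numbers (not the study's data), using the standard identity CRPS = E|X - y| - 0.5 E|X - X'|:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one ensemble forecast and a scalar observation.

    Uses CRPS = E|X - y| - 0.5 * E|X - X'|, where X, X' are independent
    draws from the ensemble; lower is better.
    """
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# A sharp, well-centred ensemble scores better than a biased one.
obs = 2.0
sharp = np.array([1.8, 1.9, 2.0, 2.1, 2.2])
biased = np.array([2.8, 2.9, 3.0, 3.1, 3.2])
score_sharp = crps_ensemble(sharp, obs)
score_biased = crps_ensemble(biased, obs)
```

    Averaging such scores over many forecast cases, and decomposing the result into reliability and resolution terms, gives the kind of comparison reported between the two perturbation techniques.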

  15. Utilization of Short-Simulations for Tuning High-Resolution Climate Model

    NASA Astrophysics Data System (ADS)

    Lin, W.; Xie, S.; Ma, P. L.; Rasch, P. J.; Qian, Y.; Wan, H.; Ma, H. Y.; Klein, S. A.

    2016-12-01

    Many physical parameterizations in atmospheric models are sensitive to resolution. Tuning models that involve a multitude of parameters at high resolution is computationally expensive, particularly when relying primarily on multi-year simulations. This work describes a complementary set of strategies for tuning high-resolution atmospheric models, using ensembles of short simulations to reduce the computational cost and elapsed time. Specifically, we utilize the hindcast approach developed through the DOE Cloud Associated Parameterization Testbed (CAPT) project for high-resolution model tuning, guided by a combination of short (< 10 days) and longer (~1 year) Perturbed Parameter Ensemble (PPE) simulations at low resolution to identify the sensitivity of model features to parameter changes. The CAPT tests have been found effective in numerous previous studies at identifying model biases due to parameterized fast physics, and we demonstrate that they are also useful for tuning. After the most egregious errors are addressed through an initial "rough" tuning phase, longer simulations are performed to "home in" on model features that evolve over longer timescales. We explore these strategies to tune the DOE ACME (Accelerated Climate Modeling for Energy) model. For the ACME model at 0.25° resolution, it is confirmed that, given the same parameters, major biases in global mean statistics and many spatial features are consistent between Atmospheric Model Intercomparison Project (AMIP)-type simulations and CAPT-type hindcasts, with just a small number of short-term simulations for the latter over the corresponding season. Using CAPT hindcasts to find parameter choices that reduce large model biases dramatically improves the turnaround time for tuning at high resolution. Improvement seen in CAPT hindcasts generally translates to improved AMIP-type simulations. 
An iterative CAPT-AMIP tuning approach is therefore adopted during each major tuning cycle, with the former used to survey the likely responses and narrow the parameter space, and the latter to verify the results in a climate context, along with assessment in greater detail once an educated set of parameter choices is selected. Limitations of using short-term simulations for tuning climate models are also discussed.

  16. Climate Change and Hydrological Extreme Events - Risks and Perspectives for Water Management in Bavaria and Québec

    NASA Astrophysics Data System (ADS)

    Ludwig, R.

    2017-12-01

    There is as yet no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events, and how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High-performance computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high-resolution data (12 km) from the CRCM5 RCM for Central Europe and north-eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to account for the assessment of both natural climate variability and climate change. The large ensemble, with several thousand model years, provides the potential to catch rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and southern Québec at high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. This specific data situation makes it possible to establish a new method of `virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data that reveal preferential triggers of hydrological extreme events. 
The presentation will highlight first results from the analysis of the large-scale ClimEx model ensemble, showing the current and future ratio of natural variability to climate change impacts on meteorological extreme events. Selected data from the ensemble are used to drive a hydrological model experiment that illustrates the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change.

  17. An operational search and rescue model for the Norwegian Sea and the North Sea

    NASA Astrophysics Data System (ADS)

    Breivik, Øyvind; Allen, Arthur A.

    A new operational, ensemble-based search and rescue model for the Norwegian Sea and the North Sea is presented. The stochastic trajectory model computes the net motion of a range of search and rescue objects. A new, robust formulation for the relation between the wind and the motion of the drifting object (termed the leeway of the object) is employed. Empirically derived coefficients for 63 categories of search objects, compiled by the US Coast Guard, are ingested to estimate the leeway of the drifting objects. A Monte Carlo technique is employed to generate an ensemble that accounts for the uncertainties in the forcing fields (wind and current), the leeway drift properties, and the initial position of the search object. The ensemble yields an estimate of the time-evolving probability density function of the location of the search object, and its envelope defines the search area. Forcing fields from the operational oceanic and atmospheric forecast system of the Norwegian Meteorological Institute are used as input to the trajectory model. This allows, for the first time, high-resolution wind and current fields to be used to forecast search areas up to 60 h into the future. A limited set of field exercises shows good agreement between model trajectories, search areas, and observed trajectories for life rafts and other search objects. Comparison with older methods shows that search areas expand much more slowly using the new ensemble method with high-resolution forcing fields and the new leeway formulation. It is found that going to higher-order stochastic trajectory models will not significantly improve the forecast skill or the rate of expansion of search areas.
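    The Monte Carlo idea above can be sketched with a deliberately simplified drift model: perturb the leeway coefficient and the start position, integrate each member forward, and read the search area off the ensemble spread. All coefficients and the constant wind below are invented for illustration (the operational model uses the full USCG leeway tables, currents, and crosswind components):

```python
import numpy as np

rng = np.random.default_rng(7)

def leeway_ensemble(x0, y0, wind, a, b, sigma_a, sigma_pos,
                    n_members, dt, n_steps):
    """Monte Carlo leeway drift; positions in metres, wind in m/s.

    Each member gets a perturbed leeway slope (drift-property uncertainty)
    and a perturbed start position. Drift speed uses the linear leeway
    relation v_leeway = a * W10 + b, directed downwind.
    """
    a_m = rng.normal(a, sigma_a, n_members)      # perturbed leeway slope
    x = rng.normal(x0, sigma_pos, n_members)     # perturbed initial position
    y = rng.normal(y0, sigma_pos, n_members)
    wx, wy = wind                                # constant wind for simplicity
    speed = np.hypot(wx, wy)
    for _ in range(n_steps):
        if speed > 0:
            lee = a_m * speed + b
            x = x + lee * wx / speed * dt
            y = y + lee * wy / speed * dt
    return x, y

# Toy: 500 members, 10 m/s westerly wind, 12 hours of drift.
x, y = leeway_ensemble(0.0, 0.0, wind=(10.0, 0.0), a=0.03, b=0.0,
                       sigma_a=0.005, sigma_pos=500.0,
                       n_members=500, dt=3600.0, n_steps=12)
search_half_width = 2 * np.std(x)   # crude along-wind search-area half-width
```

    The cloud of final positions is the sampled probability density function of the object's location; its envelope is the search area, which grows with forecast range as the coefficient uncertainty accumulates.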

  18. Assessing a local ensemble Kalman filter: perfect model experiments with the National Centers for Environmental Prediction global model

    NASA Astrophysics Data System (ADS)

    Szunyogh, Istvan; Kostelich, Eric J.; Gyarmati, G.; Patil, D. J.; Hunt, Brian R.; Kalnay, Eugenia; Ott, Edward; Yorke, James A.

    2005-08-01

    The accuracy and computational efficiency of the recently proposed local ensemble Kalman filter (LEKF) data assimilation scheme are investigated on a state-of-the-art operational numerical weather prediction model using simulated observations. The model selected for this purpose is the T62 horizontal-resolution, 28-level version of the Global Forecast System (GFS) of the National Centers for Environmental Prediction. The performance of the data assimilation system is assessed for different configurations of the LEKF scheme. It is shown that a modest-size (40-member) ensemble is sufficient to track the evolution of the atmospheric state with high accuracy. For this ensemble size, the computational time per analysis is less than 9 min on a cluster of PCs. The analyses are extremely accurate in the mid-latitude storm track regions. The largest analysis errors, which are typically much smaller than the observational errors, occur where parametrized physical processes play important roles. Because these are also the regions where model errors are expected to be largest, limitations of a real-data implementation of the ensemble-based Kalman filter may easily be mistaken for model errors. In light of these results, the importance of testing ensemble-based Kalman filter data assimilation systems on simulated observations is stressed.
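    Simulated-observation ("perfect model" or OSSE) experiments of the kind described above generate observations from a known truth run, so analysis errors can be measured exactly. A minimal generic sketch, with a toy truth state rather than the GFS:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulated_observations(truth, H, obs_err_std):
    """Perfect-model (OSSE) observations: apply the observation operator
    to the truth state and add Gaussian noise with known statistics."""
    clean = H @ truth
    return clean + rng.normal(0.0, obs_err_std, size=clean.shape)

# Toy truth state on 10 grid points, observed at every other point.
truth = np.sin(np.linspace(0.0, np.pi, 10))
H = np.zeros((5, 10))
H[np.arange(5), np.arange(0, 10, 2)] = 1.0
obs = simulated_observations(truth, H, obs_err_std=0.1)
```

    Because the truth and the observation-error statistics are both known by construction, any residual analysis error is attributable to the assimilation scheme itself, which is the point of testing on simulated observations before real data.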

  19. Hybrid vs Adaptive Ensemble Kalman Filtering for Storm Surge Forecasting

    NASA Astrophysics Data System (ADS)

    Altaf, M. U.; Raboudi, N.; Gharamti, M. E.; Dawson, C.; McCabe, M. F.; Hoteit, I.

    2014-12-01

    Recent storm surge events due to hurricanes in the Gulf of Mexico have motivated efforts to accurately forecast water levels. Toward this goal, a parallel architecture has been implemented based on a high-resolution storm surge model, ADCIRC. However, the accuracy of the model depends notably on the quality and recentness of the input data (mainly winds and bathymetry), the model parameters (e.g. wind and bottom drag coefficients), and the resolution of the model grid. Given all these uncertainties in the system, the challenge is to build an efficient prediction system capable of providing accurate forecasts far enough ahead of time for the authorities to evacuate the areas at risk. We have developed an ensemble-based data assimilation system to frequently assimilate available data into the ADCIRC model in order to improve its accuracy. In this contribution we study and analyze the performance of different ensemble Kalman filter methodologies for efficient short-range storm surge forecasting, the aim being to produce the most accurate forecasts at the lowest possible computational cost. Using Hurricane Ike meteorological data to force the ADCIRC model over a domain including the Gulf of Mexico coastline, we implement and compare the forecasts of the standard EnKF, a hybrid EnKF, and an adaptive EnKF. The last two schemes have been introduced as efficient tools for enhancing the behavior of the EnKF when implemented with small ensembles, by exploiting information from a static background covariance matrix. Covariance inflation and localization are implemented in all these filters. Our results suggest that both the hybrid and the adaptive approaches provide significantly better forecasts than the standard EnKF, even when implemented with much smaller ensembles.
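    The core trick of a hybrid EnKF, blending the flow-dependent sample covariance with a static one, can be shown in a few lines. The identity static covariance and the blending weight below are illustrative choices, not those of the storm surge system:

```python
import numpy as np

def hybrid_covariance(ensemble_cov, static_cov, alpha):
    """Hybrid EnKF background covariance: a convex combination of the
    flow-dependent sample covariance and a static climatological one.
    alpha = 1 recovers the standard EnKF; smaller alpha leans on the
    static covariance, which helps when the ensemble is small."""
    return alpha * ensemble_cov + (1.0 - alpha) * static_cov

rng = np.random.default_rng(5)
# Rank-deficient sample covariance from a tiny 3-member ensemble...
X = rng.normal(size=(6, 3))
A = X - X.mean(axis=1, keepdims=True)
P_ens = A @ A.T / 2
# ...blended with a full-rank static covariance.
B = hybrid_covariance(P_ens, np.eye(6), alpha=0.5)
```

    With only 3 members the sample covariance is rank-deficient, so the plain EnKF cannot correct errors outside the ensemble subspace; the blended matrix is full rank, which is why hybrid and adaptive schemes tolerate much smaller ensembles.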

  20. Ensemble assimilation of ARGO temperature profile, sea surface temperature and Altimetric satellite data into an eddy permitting primitive equation model of the North Atlantic ocean

    NASA Astrophysics Data System (ADS)

    Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre

    2015-04-01

    Sea surface height, sea surface temperature and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the ensemble Kalman filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to temperature. Before the assimilation experiments, the ensemble is diagnosed and validated by comparing the ensemble spread with the model/observation difference, as well as by rank histograms. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics, using observations assimilated in the experiments as well as independent observations, which goes further than most previous studies and constitutes one of the original points of this paper. For the deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. For the probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system in terms of reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement due to assimilation is demonstrated using these validation metrics. Finally, the deterministic and probabilistic validations are analysed jointly, and their consistency and complementarity are highlighted. Highly reliable situations, in which the RMS error and the CRPS give the same information, are identified for the first time in this paper.
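    The RCRV diagnostic above has a simple definition: normalize each innovation by the combined observation and ensemble standard deviation. For a reliable system the normalized sample has mean 0 (no bias) and standard deviation 1 (correct dispersion). A synthetic sketch with invented numbers:

```python
import numpy as np

def rcrv(obs, ens_mean, ens_spread, obs_err_std):
    """Reduced centred random variable for each (observation, ensemble) pair:
    y = (o - m) / sqrt(sigma_o^2 + s^2)."""
    obs = np.asarray(obs, dtype=float)
    ens_mean = np.asarray(ens_mean, dtype=float)
    ens_spread = np.asarray(ens_spread, dtype=float)
    return (obs - ens_mean) / np.sqrt(obs_err_std ** 2 + ens_spread ** 2)

rng = np.random.default_rng(11)
# Synthetic *reliable* system: the truth is drawn from the predicted
# distribution, and the observation adds known-variance noise.
n = 5000
ens_mean = rng.normal(20.0, 2.0, n)        # e.g. SST forecasts (deg C)
ens_spread = np.full(n, 0.5)
obs_err = 0.3
truth = ens_mean + rng.normal(0.0, 0.5, n)
obs = truth + rng.normal(0.0, obs_err, n)
y = rcrv(obs, ens_mean, ens_spread, obs_err)
bias, dispersion = y.mean(), y.std(ddof=1)
```

    Departures of the bias from 0 or of the dispersion from 1 flag, respectively, a systematic error and an over- or under-dispersive ensemble, which is exactly the reliability decomposition used in the paper.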

  1. Ideas for a pattern-oriented approach towards a VERA analysis ensemble

    NASA Astrophysics Data System (ADS)

    Gorgas, T.; Dorninger, M.

    2010-09-01

    For many applications in meteorology, and especially for verification purposes, it is important to have information about the uncertainties of observation and analysis data. A high quality of these "reference data" is an absolute necessity, as their uncertainties are reflected in verification measures. The VERA (Vienna Enhanced Resolution Analysis) scheme includes a sophisticated quality control tool which accounts for the correction of observational data and provides an estimate of the observation uncertainty. It is crucial for meteorologically and physically reliable analysis fields. VERA is based on a variational principle and does not need any first-guess fields. It is therefore independent of any NWP model and can also be used as an unbiased reference for real-time model verification. For downscaling purposes VERA uses a priori knowledge of small-scale physical processes over complex terrain, the so-called "fingerprint technique", which transfers information from data-rich to data-sparse regions. The enhanced joint D-PHASE and COPS data set forms the basis for the analysis ensemble study. For the WWRP projects D-PHASE and COPS, a joint activity was started to collect GTS and non-GTS data from the national and regional meteorological services in Central Europe for 2007. Data from more than 11,000 stations are available for high-resolution analyses. The use of random numbers as perturbations for ensemble experiments is a common approach in meteorology. In most implementations, as in NWP-model ensemble systems, the focus lies on error growth and propagation on the spatial and temporal scale. When defining errors in analysis fields, we have to consider that analyses are not time dependent and that no perturbation method aimed at temporal evolution is possible. 
Furthermore, the method applied should respect the two major sources of analysis error: observation errors AND analysis (interpolation) errors. With the concept of an analysis ensemble we hope to get a more detailed view of both error sources. For the computation of the VERA ensemble members, a sample of Gaussian random perturbations is produced for each station and parameter. The standard deviation of the perturbations is based on the correction proposals of the VERA QC scheme, which provides "natural" limits for the ensemble. In order to put more emphasis on the weather situation, we aim to integrate the main synoptic field structures as weighting factors for the perturbations. Two widely used approaches are employed for the definition of these main field structures: principal component analysis and a 2D discrete wavelet transform. The results of tests concerning the implementation of this pattern-supported analysis ensemble system, and a comparison of the different approaches, are given in the presentation.
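
    The perturbation recipe described above (Gaussian noise per station, scaled by the QC correction proposals and weighted by the dominant field patterns) can be sketched as follows. All arrays, sizes, and the synthetic pattern weights are illustrative stand-ins, not VERA's actual data or its PCA/wavelet output:

```python
import numpy as np

rng = np.random.default_rng(42)
n_stations, n_members = 200, 20

# Hypothetical station analyses and QC correction proposals (stand-ins
# for VERA output); units e.g. degrees Celsius for 2 m temperature.
analysis = rng.normal(10.0, 3.0, n_stations)
qc_correction = rng.normal(0.0, 0.5, n_stations)

# Perturbation standard deviation taken from the magnitude of the
# QC correction proposals ("natural" limits for the ensemble).
sigma = np.abs(qc_correction)

# Pattern weights (stand-in for PCA / wavelet main field structures):
# a smooth synthetic weight between 0.5 and 1.
weights = 0.5 + 0.5 * np.sin(np.linspace(0, np.pi, n_stations))

# One perturbed analysis per ensemble member
members = analysis + weights * sigma * rng.normal(size=(n_members, n_stations))
print(members.shape)  # (20, 200)
```

    Each row of `members` would then be fed through the analysis scheme to yield one ensemble member.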

  2. A SPAD-based 3D imager with in-pixel TDC for 145ps-accuracy ToF measurement

    NASA Astrophysics Data System (ADS)

    Vornicu, I.; Carmona-Galán, R.; Rodríguez-Vázquez, Á.

    2015-03-01

    The design and measurements of a CMOS 64 × 64 Single-Photon Avalanche-Diode (SPAD) array with in-pixel Time-to-Digital Converter (TDC) are presented. This paper thoroughly describes the imager at the architectural and circuit level, with particular emphasis on the characterization of the SPAD-detector ensemble. It is aimed at 2D imaging and 3D image reconstruction in low-light environments. It has been fabricated in a standard 0.18 μm CMOS process, i.e. without high-voltage or low-noise features. In these circumstances we face a high number of dark counts and low photon detection efficiency. Several techniques have been applied to ensure proper functionality, namely: i) a time-gated SPAD front-end with a fast active-quenching/recharge circuit featuring tunable dead time, ii) a reverse start-stop scheme, iii) programmable time resolution of the TDC based on a novel pseudo-differential voltage-controlled ring oscillator with fast start-up, and iv) a global calibration scheme against temperature and process variation. Measurement results of individual SPAD-TDC ensemble jitter, array uniformity and time-resolution programmability are also provided.

  3. Structural anomalies in undoped Gallium Arsenide observed in high resolution diffraction imaging with monochromatic synchrotron radiation

    NASA Technical Reports Server (NTRS)

    Steiner, B.; Kuriyama, M.; Dobbyn, R. C.; Laor, U.; Larson, D.; Brown, M.

    1988-01-01

    Novel, streak-like disruption features restricted to the plane of diffraction have recently been observed in images obtained by synchrotron radiation diffraction from undoped, semi-insulating gallium arsenide crystals. These features were identified as ensembles of very thin platelets or interfaces lying in (110) planes, and a structural model consisting of antiphase domain boundaries was proposed. We report here the other principal features observed in high resolution monochromatic synchrotron radiation diffraction images: (quasi) cellular structure; linear, very low-angle subgrain boundaries in (110) directions, and surface stripes in a (110) direction. In addition, we report systematic differences in the acceptance angle for images involving various diffraction vectors. When these observations are considered together, a unifying picture emerges. The presence of ensembles of thin (110) antiphase platelet regions or boundaries is generally consistent not only with the streak-like diffraction features but with the other features reported here as well. For the formation of such regions we propose two mechanisms, operating in parallel, that appear to be consistent with the various defect features observed by a variety of techniques.

  4. Structural anomalies in undoped gallium arsenide observed in high-resolution diffraction imaging with monochromatic synchrotron radiation

    NASA Technical Reports Server (NTRS)

    Steiner, B.; Kuriyama, M.; Dobbyn, R. C.; Laor, U.; Larson, D.

    1989-01-01

    Novel, streak-like disruption features restricted to the plane of diffraction have recently been observed in images obtained by synchrotron radiation diffraction from undoped, semi-insulating gallium arsenide crystals. These features were identified as ensembles of very thin platelets or interfaces lying in (110) planes, and a structural model consisting of antiphase domain boundaries was proposed. We report here the other principal features observed in high resolution monochromatic synchrotron radiation diffraction images: (quasi) cellular structure; linear, very low-angle subgrain boundaries in (110) directions, and surface stripes in a (110) direction. In addition, we report systematic differences in the acceptance angle for images involving various diffraction vectors. When these observations are considered together, a unifying picture emerges. The presence of ensembles of thin (110) antiphase platelet regions or boundaries is generally consistent not only with the streak-like diffraction features but with the other features reported here as well. For the formation of such regions we propose two mechanisms, operating in parallel, that appear to be consistent with the various defect features observed by a variety of techniques.

  5. Simulation studies of the fidelity of biomolecular structure ensemble recreation

    NASA Astrophysics Data System (ADS)

    Lätzer, Joachim; Eastwood, Michael P.; Wolynes, Peter G.

    2006-12-01

    We examine the ability of Bayesian methods to recreate structural ensembles for partially folded molecules from averaged data. Specifically, we test the ability of various algorithms to recreate different transition state ensembles for folding proteins using a multiple-replica simulation algorithm, with input from "gold standard" reference ensembles that were first generated with a Gō-like Hamiltonian having nonpairwise additive terms. A set of low-resolution data, which function as the "experimental" ϕ values, was first constructed from this reference ensemble. The resulting ϕ values were then treated as one would treat laboratory experimental data and were used as input in the replica reconstruction algorithm. The resulting ensembles of structures obtained by the replica algorithm were compared to the gold standard reference ensemble from which those "data" were, in fact, obtained. It is found that for a unimodal transition state ensemble with a low barrier, the multiple-replica algorithm recreates the reference ensemble fairly successfully when no experimental error is assumed. The Kolmogorov-Smirnov test as well as principal component analysis show that the overlap of the recovered and reference ensembles is significantly enhanced when multiple replicas are used. Reduction of the multiple-replica ensembles by clustering successfully yields subensembles with close similarity to the reference ensembles. On the other hand, for a high-barrier transition state with two distinct transition state ensembles, the single-replica algorithm samples only a few structures of one of the reference ensemble basins. This is due to the fact that the ϕ values are intrinsically ensemble-averaged quantities. The replica algorithm with multiple copies does sample both reference ensemble basins. In contrast to the single-replica case, the multiple replicas are constrained to reproduce the average ϕ values but allow fluctuations in ϕ for each individual copy. 
These fluctuations facilitate a more faithful sampling of the reference ensemble basins. Finally, we test how robustly the reconstruction algorithm can function by introducing errors in ϕ comparable in magnitude to those suggested by some authors. In this circumstance we observe that the chances of ensemble recovery with the replica algorithm are poor using a single replica, but are improved when multiple copies are used. A multimodal transition state ensemble, however, turns out to be more sensitive to large errors in ϕ (if appropriately gauged) and attempts at successful recreation of the reference ensemble with simple replica algorithms can fall short.
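
    The key contrast drawn above, restraining only the replica average to the ensemble-averaged ϕ while individual copies keep fluctuating, can be illustrated with a minimal numerical sketch. The harmonic restraint strength, step size, and replica count are invented for illustration and this is not the authors' simulation algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

phi_exp = 0.5        # "experimental" ensemble-averaged phi value
n_replicas = 16
phi = rng.uniform(0.0, 1.0, n_replicas)   # per-replica phi values

# Harmonic restraint on the REPLICA AVERAGE only: each copy feels the
# same weak pull, so the mean converges to phi_exp while the spread
# between copies (their individual fluctuations) is preserved.
k, step = 50.0, 0.01
spread_before = phi.std()
for _ in range(1000):
    grad = k * (phi.mean() - phi_exp) / n_replicas  # d/dphi_i of k/2*(mean - phi_exp)**2
    phi -= step * grad

print(round(phi.mean(), 3))                 # pulled to ~0.5
print(round(phi.std() - spread_before, 6))  # spread unchanged: 0.0
```

    Restraining every copy individually would instead collapse the spread, which is exactly what makes a single replica unable to cover two distinct basins.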

  6. The Ensemble Space Weather Modeling System (eSWMS): Status, Capabilities and Challenges

    NASA Astrophysics Data System (ADS)

    Fry, C. D.; Eccles, J. V.; Reich, J. P.

    2010-12-01

    Marking a milestone in space weather forecasting, the Space Weather Modeling System (SWMS) successfully completed validation testing in advance of operational testing at Air Force Weather Agency’s primary space weather production center. This is the first coupling of stand-alone, physics-based space weather models that are currently in operations at AFWA supporting the warfighter. Significant development effort went into ensuring the component models were portable and scalable while maintaining consistent results across diverse high performance computing platforms. Coupling was accomplished under the Earth System Modeling Framework (ESMF). The coupled space weather models are the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and GAIM1, the ionospheric forecast component of the Global Assimilation of Ionospheric Measurements (GAIM) model. The SWMS was developed by team members from AFWA, Explorations Physics International, Inc. (EXPI) and Space Environment Corporation (SEC). The successful development of the SWMS provides new capabilities beyond enabling extended lead-time, data-driven ionospheric forecasts. These include ingesting diverse data sets at higher resolution, incorporating denser computational grids at finer time steps, and performing probability-based ensemble forecasts. Work of the SWMS development team now focuses on implementing the ensemble-based probability forecast capability by feeding multiple scenarios of 5 days of solar wind forecasts to the GAIM1 model based on the variation of the input fields to the HAFv2 model. The ensemble SWMS (eSWMS) will provide the most-likely space weather scenario with uncertainty estimates for important forecast fields. The eSWMS will allow DoD mission planners to consider the effects of space weather on their systems with more advance warning than is currently possible. 
The payoff is enhanced, tailored support to the warfighter with improved capabilities, such as point-to-point HF propagation forecasts, single-frequency GPS error corrections, and high cadence, high-resolution Space Situational Awareness (SSA) products. We present the current status of eSWMS, its capabilities, limitations and path of transition to operational use.

  7. Coral Ensemble Estimates of Central Pacific Mean Climate During the Little Ice Age

    NASA Astrophysics Data System (ADS)

    Sayani, H. R.; Cobb, K. M.; O'Connor, G.; Khare, A.; Atwood, A. R.; Grothe, P. R.; Chen, T.; Hagos, M. M.; Hitt, N. T.; Thompson, D. M.; Deocampo, D.; Lu, Y.; Cheng, H.; Edwards, R. L.

    2016-12-01

    Multi-century, robust records of tropical Pacific sea-surface temperature (SST) and salinity (SSS) variability from the pre-industrial era are needed to quantify anthropogenic contributions to present-day climate trends and to improve the accuracy of regional climate projections. However, high-resolution reconstructions of tropical Pacific climate are scarce prior to the 20th century, and only a handful exist from the Little Ice Age (LIA, 1500-1850 CE) immediately prior to the documented rise of anthropogenic greenhouse gases. Modern and fossil corals from the northern Line Islands (2-6°N, 157-162°W) have been used to extend the instrumental climate record back into the LIA and beyond, primarily for paleo-ENSO investigations [Cobb et al., 2003, 2013]. However, large offsets in mean coral Sr/Ca and δ18O values observed across overlapping coral colonies translate into 1-2°C (1σ) uncertainties for mean climate reconstructions based on any single fossil coral colony. Here we present the results of a new approach to reconstructing mean climate during the LIA using a large ensemble (N>10) of relatively short (7-15 yr long), U/Th-dated fossil corals from Christmas Island (2°N, 157°W). We employ pseudo-coral estimates of paleo-SST and paleo-seawater δ18O variations as benchmarks for our reconstructions, with a focus on quantifying the maximum and minimum potential tropical Pacific SST changes during the LIA that are consistent with our new ensemble of coral data. Lastly, by comparing bulk and high-resolution coral δ18O and Sr/Ca records, we identify the strengths and limitations of using a high-N ensemble approach to climate reconstruction from fossil corals. References: Cobb, K. M., et al. (2003), Nature, doi:10.1038/nature01779; Cobb, K. M., et al. (2013), Science, doi:10.1126/science.1228246.
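
    The benefit of the high-N ensemble approach can be quantified with a back-of-the-envelope calculation: if a single colony carries a roughly 1.5 °C (1σ) offset, averaging N independent colonies shrinks the uncertainty of the ensemble mean approximately as σ/√N. The numbers below are illustrative, taken from the 1-2 °C single-colony range quoted above:

```python
import numpy as np

rng = np.random.default_rng(5)

sigma_single = 1.5   # degC, within the 1-2 degC single-colony range
n_corals = 12        # an "N > 10" ensemble; value illustrative

# Analytic standard error of the ensemble mean
analytic_se = sigma_single / np.sqrt(n_corals)

# Monte Carlo check: many synthetic ensembles of colony offsets
ens_means = rng.normal(0.0, sigma_single, size=(100_000, n_corals)).mean(axis=1)
print(round(analytic_se, 2), round(float(ens_means.std()), 2))  # 0.43 0.43
```

    This simple scaling assumes the inter-colony offsets are independent and unbiased, which is itself part of what the bulk-versus-high-resolution comparison tests.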

  8. Predicting nucleic acid binding interfaces from structural models of proteins.

    PubMed

    Dror, Iris; Shazman, Shula; Mukherjee, Srayanta; Zhang, Yang; Glaser, Fabian; Mandel-Gutfreund, Yael

    2012-02-01

    The function of DNA- and RNA-binding proteins can be inferred from the characterization and accurate prediction of their binding interfaces. However, the main pitfall of various structure-based methods for predicting nucleic acid binding function is that they are all limited to a relatively small number of proteins for which high-resolution three-dimensional structures are available. In this study, we developed a pipeline for extracting functional electrostatic patches from surfaces of protein structural models, obtained using the I-TASSER protein structure predictor. The largest positive patches are extracted from the protein surface using the patchfinder algorithm. We show that functional electrostatic patches extracted from an ensemble of structural models highly overlap the patches extracted from high-resolution structures. Furthermore, by testing our pipeline on a set of 55 known nucleic acid binding proteins for which I-TASSER produces high-quality models, we show that the method accurately identifies the nucleic acid binding interface on structural models of proteins. Employing a combined patch approach, we show that patches extracted from an ensemble of models better predict the real nucleic acid binding interfaces than patches extracted from independent models. Overall, these results suggest that combining information from a collection of low-resolution structural models could be a valuable approach for functional annotation. We suggest that our method will be further applicable for predicting other functional surfaces of proteins with unknown structure.
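
    As a toy illustration of the patch-extraction step (not the actual patchfinder algorithm, which operates on a 3D molecular surface), the following sketch finds the largest 4-connected patch of positive potential on a 2D grid:

```python
import numpy as np
from collections import deque

def largest_positive_patch(potential):
    """Return the cells of the largest 4-connected patch with potential > 0."""
    pos = potential > 0
    seen = np.zeros_like(pos, dtype=bool)
    best = []
    for i in range(pos.shape[0]):
        for j in range(pos.shape[1]):
            if pos[i, j] and not seen[i, j]:
                comp, q = [], deque([(i, j)])   # BFS over one component
                seen[i, j] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < pos.shape[0] and 0 <= nx < pos.shape[1]
                                and pos[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) > len(best):
                    best = comp
    return best

grid = np.array([[ 1.0,  2.0, -1.0],
                 [-0.5,  3.0, -1.0],
                 [-1.0, -1.0,  0.5]])
print(len(largest_positive_patch(grid)))  # 3
```

    On a real surface the connectivity would come from the surface mesh and the values from a Poisson-Boltzmann electrostatics calculation, but the connected-component idea is the same.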

  9. Cloudy Windows: What GCM Ensembles, Reanalyses and Observations Tell Us About Uncertainty in Greenland's Future Climate and Surface Melting

    NASA Astrophysics Data System (ADS)

    Reusch, D. B.

    2016-12-01

    Any analysis that wants to use a GCM-based scenario of future climate benefits from knowing how much uncertainty the GCM's inherent variability adds to the development of climate change predictions. This is especially relevant in the polar regions because of the potential for global impacts (e.g., sea level rise) from local (ice sheet) climate changes such as more frequent or intense surface melting. High-resolution, regional-scale models using GCMs for boundary/initial conditions in future scenarios inherit a measure of GCM-derived, externally driven uncertainty. We investigate these uncertainties for the Greenland ice sheet using the 30-member CESM1.0-CAM5-BGC Large Ensemble (CESMLE) for recent (1981-2000) and future (2081-2100, RCP 8.5) decades. Recent simulations are skill-tested against the ERA-Interim reanalysis and AWS observations, with results informing future scenarios. We focus on key variables influencing surface melting through decadal climatologies, nonlinear analysis of variability with self-organizing maps (SOMs), regional-scale modeling (Polar WRF), and simple melt models. Relative to the ensemble average, spatially averaged climatological July temperature anomalies over a Greenland ice-sheet/ocean domain are mostly within ±0.2 °C. The spatial average hides larger local anomalies of up to ±2 °C. The ensemble average itself is 2 °C cooler than ERA-Interim. SOMs extend our diagnostics by providing a concise, objective summary of model variability as a set of generalized patterns. For CESMLE, the SOM patterns summarize the variability of multiple realizations of climate. Changes in pattern frequency by ensemble member show the influence of initial conditions. For example, basic statistical analysis of pattern frequency yields interquartile ranges of 2-4% for individual patterns across the ensemble. 
In climate terms, this tells us about climate state variability through the range of the ensemble, a potentially significant source of melt-prediction uncertainty. SOMs can also capture the different trajectories of climate due to intramodel variability over time. Polar WRF provides higher resolution regional modeling with improved, polar-centric model physics. Simple melt models allow us to characterize impacts of the upstream uncertainties on estimates of surface melting.

  10. Efficient Approaches for Propagating Hydrologic Forcing Uncertainty: High-Resolution Applications Over the Western United States

    NASA Astrophysics Data System (ADS)

    Hobbs, J.; Turmon, M.; David, C. H.; Reager, J. T., II; Famiglietti, J. S.

    2017-12-01

    NASA's Western States Water Mission (WSWM) combines remote sensing of the terrestrial water cycle with hydrological models to provide high-resolution state estimates for multiple variables. The effort includes both land surface and river routing models that are subject to several sources of uncertainty, including errors in the model forcing and model structural uncertainty. Computational and storage constraints prohibit extensive ensemble simulations, so this work outlines efficient but flexible approaches for estimating and reporting uncertainty. Using estimates calibrated against remote sensing and in situ data where available, we illustrate the application of these techniques in producing state estimates with associated uncertainties at kilometer-scale resolution for key variables such as soil moisture, groundwater, and streamflow.

  11. Analyzing Tropical Waves Using the Parallel Ensemble Empirical Mode Decomposition Method: Preliminary Results from Hurricane Sandy

    NASA Technical Reports Server (NTRS)

    Shen, Bo-Wen; Cheung, Samson; Li, Jui-Lin F.; Wu, Yu-ling

    2013-01-01

    In this study, we discuss the performance of parallel ensemble empirical mode decomposition in the analysis of tropical waves that are associated with tropical cyclone (TC) formation. To efficiently analyze high-resolution, global, multi-dimensional data sets, we first implement multilevel parallelism into the ensemble empirical mode decomposition (EEMD) and obtain a parallel speedup of 720 using 200 eight-core processors. We then apply the parallel EEMD (PEEMD) to extract the intrinsic mode functions (IMFs) from preselected data sets that represent (1) idealized tropical waves and (2) large-scale environmental flows associated with Hurricane Sandy (2012). Results indicate that the PEEMD is efficient and effective in revealing the major wave characteristics of the data, such as wavelengths and periods, by sifting out the dominant (wave) components. This approach has potential for hurricane climate studies, by examining the statistical relationship between tropical waves and TC formation.
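
    The noise-assisted averaging at the heart of EEMD, and the ensemble loop that the PEEMD distributes across processors, can be sketched as below. The real EMD sifting is replaced here by a crude running-mean split into a fast and a slow component; the signal, noise level, and ensemble size are all illustrative:

```python
import numpy as np

def toy_decompose(x, window=25):
    """Stand-in for EMD sifting: split a signal into a 'fast' part
    (signal minus running mean) and a 'slow' residual."""
    slow = np.convolve(x, np.ones(window) / window, mode="same")
    return np.stack([x - slow, slow])

def toy_eemd(x, n_ensemble=50, noise_std=0.2, seed=0):
    """Core EEMD idea: decompose many noise-perturbed copies of the
    signal and average the resulting components. This ensemble loop is
    what a parallel implementation distributes across processors."""
    rng = np.random.default_rng(seed)
    modes = np.zeros((2, x.size))
    for _ in range(n_ensemble):
        modes += toy_decompose(x + noise_std * rng.normal(size=x.size))
    return modes / n_ensemble

t = np.linspace(0, 4 * np.pi, 400)
signal = np.sin(8 * t) + 0.5 * t     # fast wave plus slow trend
fast, slow = toy_eemd(signal)
print(fast.shape, slow.shape)        # (400,) (400,)
```

    In a real EEMD, `toy_decompose` would be the full sifting procedure producing several IMFs rather than a two-way split, but the noise-perturb-and-average structure is the same.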

  12. Rain radar measurement error estimation using data assimilation in an advection-based nowcasting system

    NASA Astrophysics Data System (ADS)

    Merker, Claire; Ament, Felix; Clemens, Marco

    2017-04-01

    The quantification of measurement uncertainty for rain radar data remains challenging. Radar reflectivity measurements are affected by, amongst other things, calibration errors, noise, blocking and clutter, and attenuation. Their combined impact on measurement accuracy is difficult to quantify due to incomplete process understanding and complex interdependencies. An improved quality assessment of rain radar measurements is of interest for applications in both meteorology and hydrology, for example for precipitation ensemble generation, rainfall-runoff simulations, or data assimilation for numerical weather prediction. A detailed description of the spatial and temporal structure of errors is especially beneficial in order to make the best use of the areal precipitation information provided by radars. Radar precipitation ensembles are one promising approach to represent spatially variable radar measurement errors. We present a method combining ensemble radar precipitation nowcasting with data assimilation to estimate radar measurement uncertainty at each pixel. This combination of ensemble forecast and observation yields a consistent spatial and temporal evolution of the radar error field. We use an advection-based nowcasting method to generate an ensemble reflectivity forecast from initial data of a rain radar network. Subsequently, reflectivity data from single radars are assimilated into the forecast using the Local Ensemble Transform Kalman Filter. The spread of the resulting analysis ensemble provides a flow-dependent, spatially and temporally correlated reflectivity error estimate at each pixel. We will present first case studies that illustrate the method using data from a high-resolution X-band radar network.
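
    A drastically simplified stand-in for the assimilation step can illustrate how the analysis-ensemble spread yields a per-pixel error estimate. The sketch below uses a scalar perturbed-observation ensemble Kalman update per pixel rather than the LETKF of the study, and all fields are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
ny, nx, n_ens = 40, 40, 30
obs_err_std = 1.0

# Synthetic "truth", ensemble reflectivity forecast (dBZ) and a
# single-radar observation field (all invented for the sketch).
truth = 20.0 + 5.0 * rng.random((ny, nx))
forecast = truth + rng.normal(0.0, 2.0, (n_ens, ny, nx))
obs = truth + rng.normal(0.0, obs_err_std, (ny, nx))

# Scalar perturbed-observation EnKF update per pixel: K = P_f / (P_f + R)
p_f = forecast.var(axis=0, ddof=1)
gain = p_f / (p_f + obs_err_std**2)
obs_pert = obs + rng.normal(0.0, obs_err_std, (n_ens, ny, nx))
analysis = forecast + gain * (obs_pert - forecast)

# The analysis-ensemble spread is the flow-dependent per-pixel error estimate
spread = analysis.std(axis=0, ddof=1)
print(round(float(forecast.std(axis=0, ddof=1).mean()), 2),
      round(float(spread.mean()), 2))
```

    The update pulls each member toward the observation and shrinks the spread where observations are informative, which is exactly the behaviour exploited to map measurement uncertainty.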

  13. Multi-model ensemble projections of European river floods and high flows at 1.5, 2, and 3 degree global warming

    NASA Astrophysics Data System (ADS)

    Thober, S.; Kumar, R.; Wanders, N.; Marx, A.; Pan, M.; Rakovec, O.; Samaniego, L. E.; Sheffield, J.; Wood, E. F.; Zink, M.

    2017-12-01

    Severe river floods often result in huge economic losses and fatalities. Since 1980, almost 1500 such events have been reported in Europe. This study investigates climate change impacts on European floods under 1.5, 2, and 3 K global warming. The impacts are assessed employing a multi-model ensemble containing three hydrologic models (HMs: mHM, Noah-MP, PCR-GLOBWB) forced by five CMIP5 General Circulation Models (GCMs) under three Representative Concentration Pathways (RCPs 2.6, 6.0, and 8.5). This multi-model ensemble is unprecedented with respect to the combination of its size (45 realisations) and its spatial resolution, which is 5 km across all of Europe. Climate change impacts are quantified for high flows and flood events, represented by the 10% exceedance probability and annual maxima of daily streamflow, respectively. The multi-model ensemble points to the Mediterranean region as a hotspot of changes, with significant decreases in high flows from -11% at 1.5 K to -30% at 3 K global warming, mainly resulting from reduced precipitation. Small changes (< ±10%) are observed for river basins in Central Europe and the British Isles under the different levels of warming. Projected higher annual precipitation increases high flows in Scandinavia, but reduced snow water equivalent decreases flood events in this region. The contribution of the GCMs to the overall uncertainty of the ensemble is in general higher than that of the HMs. The latter, however, have a substantial share of the overall uncertainty and exceed GCM uncertainty in the Mediterranean and Scandinavia. Adaptation measures for limiting the impacts of global warming could be similar under 1.5 K and 2 K global warming, but have to account for significantly larger changes under 3 K global warming.
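
    The two flood indicators used in the study, the flow exceeded 10% of the time and the annual maxima of daily streamflow, are straightforward to compute from a daily series. The synthetic gamma-distributed flows and the imposed 20% decrease below are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(7)
days_per_year, n_years = 365, 30

# Synthetic daily streamflow (m3/s) for one river point
q = rng.gamma(shape=2.0, scale=50.0, size=n_years * days_per_year)

# High flow: the daily discharge exceeded 10% of the time (90th percentile)
q_high = np.quantile(q, 0.90)

# Flood events: annual maxima of daily streamflow
annual_max = q.reshape(n_years, days_per_year).max(axis=1)

# Relative change between warming levels; the uniform 20% reduction
# mimics a Mediterranean-style decrease and is purely illustrative.
q_future = 0.8 * q
change_pct = 100.0 * (np.quantile(q_future, 0.90) / q_high - 1.0)
print(round(change_pct, 1))  # -20.0
```

    In the study these indicators are computed per 5 km grid cell and per GCM-HM realisation, and the ensemble of changes is then tested for significance.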

  14. Comparison of initial perturbation methods for the mesoscale ensemble prediction system of the Meteorological Research Institute for the WWRP Beijing 2008 Olympics Research and Development Project (B08RDP)

    NASA Astrophysics Data System (ADS)

    Saito, Kazuo; Hara, Masahiro; Kunii, Masaru; Seko, Hiromu; Yamaguchi, Munehiko

    2011-05-01

    Different initial perturbation methods for the mesoscale ensemble prediction were compared by the Meteorological Research Institute (MRI) as a part of the intercomparison of mesoscale ensemble prediction systems (EPSs) of the World Weather Research Programme (WWRP) Beijing 2008 Olympics Research and Development Project (B08RDP). Five initial perturbation methods for mesoscale ensemble prediction were developed for B08RDP and compared at MRI: (1) a downscaling method of the Japan Meteorological Agency (JMA)'s operational one-week EPS (WEP), (2) a targeted global model singular vector (GSV) method, (3) a mesoscale model singular vector (MSV) method based on the adjoint model of the JMA non-hydrostatic model (NHM), (4) a mesoscale breeding growing mode (MBD) method based on the NHM forecast and (5) a local ensemble transform (LET) method based on the local ensemble transform Kalman filter (LETKF) using NHM. These perturbation methods were applied to the preliminary experiments of the B08RDP Tier-1 mesoscale ensemble prediction with a horizontal resolution of 15 km. To make the comparison easier, the same horizontal resolution (40 km) was employed for the three mesoscale model-based initial perturbation methods (MSV, MBD and LET). The GSV method completely outperformed the WEP method, confirming the advantage of targeting in mesoscale EPS. The GSV method generally performed well with regard to root mean square errors of the ensemble mean, large growth rates of ensemble spreads throughout the 36-h forecast period, and high detection rates and high Brier skill scores (BSSs) for weak rains. On the other hand, the mesoscale model-based initial perturbation methods showed good detection rates and BSSs for intense rains. The MSV method showed a rapid growth in the ensemble spread of precipitation up to a forecast time of 6 h, which suggests suitability of the mesoscale SV for short-range EPSs, but the initial large growth of the perturbation did not last long. 
The performance of the MBD method was good for ensemble prediction of intense rain with a relatively small computing cost. The LET method showed similar characteristics to the MBD method, but the spread and growth rate were slightly smaller and the relative operating characteristic area skill score and BSS did not surpass those of MBD. These characteristic features of the five methods were confirmed by checking the evolution of the total energy norms and their growth rates. Characteristics of the initial perturbations obtained by four methods (GSV, MSV, MBD and LET) were examined for the case of a synoptic low-pressure system passing over eastern China. With GSV and MSV, the regions of large spread were near the low-pressure system, but with MSV, the distribution was more concentrated on the mesoscale disturbance. On the other hand, large-spread areas were observed southwest of the disturbance in MBD and LET. The horizontal pattern of LET perturbation was similar to that of MBD, but the amplitude of the LET perturbation reflected the observation density.
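
    Of the methods compared, the breeding idea behind MBD lends itself to a compact sketch: propagate a control and a perturbed forecast, rescale their difference to a fixed amplitude each cycle, and the perturbation converges onto the fastest-growing direction of the dynamics. The two-variable linear "model" below is a toy, not the NHM:

```python
import numpy as np

# Toy linear "forecast model" with one growing (x) and one decaying (y)
# direction; the numbers are invented for illustration.
A = np.array([[1.2, 0.1],
              [0.0, 0.7]])

rng = np.random.default_rng(3)
control = rng.normal(size=2)
pert = control + 0.01 * rng.normal(size=2)

amplitude = 0.01
for _ in range(40):                              # breeding cycles
    control, pert = A @ control, A @ pert        # propagate both runs
    diff = pert - control
    diff *= amplitude / np.linalg.norm(diff)     # rescale to fixed norm
    pert = control + diff

bred = diff / np.linalg.norm(diff)
print(np.round(np.abs(bred), 3))  # aligns with the growing direction [1, 0]
```

    With a nonlinear forecast model in place of `A`, the same cycle yields flow-dependent bred vectors, which is what makes the method cheap relative to singular-vector computations that need an adjoint model.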

  15. Regional projections of North Indian climate for adaptation studies.

    PubMed

    Mathison, Camilla; Wiltshire, Andrew; Dimri, A P; Falloon, Pete; Jacob, Daniela; Kumar, Pankaj; Moors, Eddy; Ridley, Jeff; Siderius, Christian; Stoffel, Markus; Yasunari, T

    2013-12-01

    Adaptation is increasingly important for regions around the world where large changes in climate could have an impact on populations and industry. The Brahmaputra-Ganges catchments have a large population, agriculture as the main industry and a growing hydro-power industry, making the region susceptible to changes in the Indian Summer Monsoon, annually the main water source. The HighNoon project has completed four regional climate model simulations for India and the Himalaya at high resolution (25 km) from 1960 to 2100 to provide an ensemble of simulations for the region. In this paper we have assessed the ensemble for these catchments, comparing the simulations with observations, to give credence that the simulations provide a realistic representation of atmospheric processes and therefore of future climate. We have illustrated how these simulations could be used to provide information on potential future climate impacts, and therefore aid decision-making, using climatology and threshold analysis. The ensemble analysis shows an increase in temperature between the baseline (1970-2000) and the 2050s (2040-2070) of between 2 and 4 °C and an increase in the number of days with maximum temperatures above 28 °C and 35 °C. There is less certainty for precipitation and runoff, which show considerable variability even in this relatively small ensemble, with projected changes spanning zero. The HighNoon ensemble is the most complete data set for the region, providing useful information on a wide range of variables for the regional climate of the Brahmaputra-Ganges region; however, there are processes not yet included in the models that could have an impact on the simulations of future climate. We have discussed these processes and show that the range from the HighNoon ensemble is similar in magnitude to potential changes in projections where these processes are included. Therefore strategies for adaptation must be robust and flexible, allowing for advances in the science and natural environmental changes. 

  16. Climate SPHINX: evaluating the impact of resolution and stochastic physics parameterisations in the EC-Earth global climate model

    NASA Astrophysics Data System (ADS)

    Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Christensen, Hannah M.; Juricke, Stephan; Subramanian, Aneesh; Watson, Peter A. G.; Weisheimer, Antje; Palmer, Tim N.

    2017-03-01

    The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 km down to 16 km). The project includes more than 120 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), together with coupled transient runs (1850-2100). A total of 20.4 million core hours have been used, made available through a single-year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on the SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored in the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including details of the forcings used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following the resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate, specifically the Madden-Julian Oscillation and tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).

  17. An "Ensemble Approach" to Modernizing Extreme Precipitation Estimation for Dam Safety Decision-Making

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Mahoney, K. M.; Webb, R. S.; McCormick, B.

    2017-12-01

    To ensure structural and operational safety of dams and other water management infrastructure, water resources managers and engineers require information about the potential for heavy precipitation. The methods and data used to estimate extreme rainfall amounts for managing risk are based on 40-year-old science and in need of improvement. The need to evaluate new approaches based on the best science available has led the states of Colorado and New Mexico to engage a body of scientists and engineers in an innovative "ensemble approach" to updating extreme precipitation estimates. NOAA is at the forefront of one of three technical approaches that make up the "ensemble study"; the three approaches are conducted concurrently and in collaboration with each other. One approach is the conventional deterministic, "storm-based" method, another is a risk-based regional precipitation frequency estimation tool, and the third is an experimental approach utilizing NOAA's state-of-the-art High Resolution Rapid Refresh (HRRR) physically-based dynamical weather prediction model. The goal of the overall project is to use the individual strengths of these different methods to define an updated and broadly acceptable state of the practice for evaluation and design of dam spillways. This talk will highlight the NOAA research and NOAA's role in the overarching goal to better understand and characterizing extreme precipitation estimation uncertainty. The research led by NOAA explores a novel high-resolution dataset and post-processing techniques using a super-ensemble of hourly forecasts from the HRRR model. We also investigate how this rich dataset may be combined with statistical methods to optimally cast the data in probabilistic frameworks. NOAA expertise in the physical processes that drive extreme precipitation is also employed to develop careful testing and improved understanding of the limitations of older estimation methods and assumptions. 
    The process of decision making in the midst of uncertainty is a major part of this study. We will speak to how the three approaches may be used in concert with one another to manage risk and enhance resiliency in the midst of uncertainty. Finally, the presentation will also address the implications of including climate change in future extreme precipitation estimation studies.
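
    As a rough illustration of how a super-ensemble of hourly forecasts can be cast in a probabilistic framework, the sketch below pools member-hours of synthetic rainfall to estimate exceedance probabilities and a rare-event quantile. All data, sizes, and thresholds here are hypothetical stand-ins, not HRRR output:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a super-ensemble of hourly precipitation
# forecasts: 100 members x 8760 hours (one year), in mm/h.
ensemble = rng.gamma(shape=0.1, scale=2.0, size=(100, 8760))

def exceedance_prob(ens, threshold):
    """Empirical probability that hourly rainfall exceeds a threshold,
    pooled over all members and hours."""
    return float((np.asarray(ens) > threshold).mean())

# Pooling the annual maxima of all members gives a far larger sample for
# estimating rare-event quantiles than a single deterministic run would.
annual_maxima = ensemble.max(axis=1)          # one annual maximum per member
extreme_level = np.quantile(annual_maxima, 0.99)
```

    The pooled annual-maximum series is the kind of input a precipitation frequency analysis would then fit with an extreme-value distribution.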

  18. From ENSEMBLES to CORDEX: exploring the progress for hydrological impact research for the upper Danube basin

    NASA Astrophysics Data System (ADS)

    Stanzel, Philipp; Kling, Harald

    2017-04-01

    EURO-CORDEX Regional Climate Model (RCM) data are available as a result of the latest initiative of the climate modelling community to provide ever improved simulations of past and future climate in Europe. The spatial resolution of the climate models increased from 25 x 25 km in the previous coordinated initiative, ENSEMBLES, to 12 x 12 km in the CORDEX EUR-11 simulations. This higher spatial resolution might yield improved representation of the historic climate, especially in complex mountainous terrain, improving applicability in impact studies. CORDEX scenario simulations are based on Representative Concentration Pathways, while ENSEMBLES applied the SRES greenhouse gas emission scenarios. The new emission scenarios might lead to different projections of future climate. In this contribution we explore these two dimensions of development from ENSEMBLES to CORDEX - representation of the past and projections for the future - in the context of a hydrological climate change impact study for the Danube River. We replicated previous hydrological simulations that used ENSEMBLES data of 21 RCM simulations under the SRES A1B emission scenario as meteorological input data (Kling et al. 2012), and now applied CORDEX EUR-11 data of 16 RCM simulations under the RCP4.5 and RCP8.5 emission scenarios. The climate variables precipitation and temperature were used to drive a monthly hydrological model of the upper Danube basin upstream of Vienna (100,000 km2). RCM data were bias-corrected and downscaled to the scale of the hydrological model units. Results obtained with CORDEX data were compared with those obtained with ENSEMBLES data, analysing both the driving meteorological input and the resulting discharge projections. Results with CORDEX data show no general improvement in the accuracy of representing historic climatic features, despite the increase in spatial model resolution.
The tendency of ENSEMBLES scenario projections of increasing precipitation in winter and decreasing precipitation in summer is reproduced with the CORDEX RCMs, albeit with slightly higher precipitation in the CORDEX data. The distinct pattern of future change in discharge seasonality - increasing winter discharge and decreasing summer discharge - is confirmed with the new CORDEX data, with a range of projections very similar to the range projected by the ENSEMBLES RCMs. References: Kling, H., Fuchs, M., Paulin, M. 2012. Runoff conditions in the upper Danube basin under an ensemble of climate change scenarios. Journal of Hydrology 424-425, 264-277.
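
    The abstract does not specify which bias-correction method was applied to the RCM data; a common choice for precipitation and temperature is empirical quantile mapping, sketched here under that assumption:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: each future model value is replaced by
    the observed value found at the same quantile of the historical model
    distribution.  (An assumed, generic correction, not the study's exact
    method.)"""
    qs = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_hist, qs)    # historical model quantiles
    obs_q = np.quantile(obs_hist, qs)        # historical observed quantiles
    ranks = np.interp(model_future, model_q, qs)  # quantile of each value
    return np.interp(ranks, qs, obs_q)       # observed value at that quantile
```

    Applied to the historical model series itself, the mapping reproduces the observed distribution, which is the usual sanity check before correcting scenario data.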

  19. Analyzing and leveraging self-similarity for variable resolution atmospheric models

    NASA Astrophysics Data System (ADS)

    O'Brien, Travis; Collins, William

    2015-04-01

    Variable resolution modeling techniques are rapidly becoming a popular strategy for achieving high resolution in global atmospheric models without the computational cost of global high resolution. However, recent studies have demonstrated a variety of resolution-dependent, and seemingly artificial, features. We argue that the scaling properties of the atmosphere are key to understanding how the statistics of an atmospheric model should change with resolution. We provide two such examples. In the first example we show that the scaling properties of the cloud number distribution define how the ratio of resolved to unresolved clouds should increase with resolution. We show that the loss of resolved clouds in the high resolution region of variable resolution simulations with the Community Atmosphere Model version 4 (CAM4) is an artifact of the model's treatment of condensed water (this artifact is significantly reduced in CAM5). In the second example we show that the scaling properties of the horizontal velocity field, combined with the incompressibility assumption, necessarily result in an intensification of vertical mass flux as resolution increases. We show that such an increase is present in a wide variety of models, including CAM and the regional climate models of the ENSEMBLES intercomparison. We present theoretical arguments linking this increase to the intensification of precipitation with increasing resolution.
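
    A minimal sketch of the first example's logic, assuming (purely for illustration) that cloud sizes follow a truncated power-law number distribution with exponent alpha: the fraction of clouds larger than the grid spacing dx, and hence potentially resolvable, grows predictably as dx shrinks. The exponent and size range below are illustrative, not values from the study:

```python
import numpy as np

ALPHA = 2.0                  # assumed power-law exponent (illustrative)
S_MIN, S_MAX = 0.1, 1000.0   # assumed cloud-size range in km (illustrative)

def resolved_fraction(dx, alpha=ALPHA, s_min=S_MIN, s_max=S_MAX):
    """Fraction of clouds with size > dx under a truncated power-law
    number distribution n(s) ~ s**-alpha (requires alpha != 1)."""
    def cdf(s):
        return (s ** (1 - alpha) - s_min ** (1 - alpha)) / (
            s_max ** (1 - alpha) - s_min ** (1 - alpha))
    return 1.0 - cdf(float(np.clip(dx, s_min, s_max)))
```

    Halving the grid spacing raises the resolved fraction, which is the self-similar expectation that a model's cloud statistics can be checked against.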

  20. A WRF/Chem sensitivity study using ensemble modelling for a high ozone episode in Slovenia and the Northern Adriatic area

    NASA Astrophysics Data System (ADS)

    Žabkar, Rahela; Koračin, Darko; Rakovec, Jože

    2013-10-01

    A high ozone (O3) concentration episode during a heat wave event in the Northeastern Mediterranean was investigated using the WRF/Chem model. To understand the major model uncertainties and errors as well as the impacts of model inputs on the model accuracy, an ensemble modelling experiment was conducted. The 51-member ensemble was designed by varying model physics parameterization options (PBL schemes with different surface layer and land-surface modules, and radiation schemes); chemical initial and boundary conditions; anthropogenic and biogenic emission inputs; and model domain setup and resolution. The main impacts of the geographical and emission characteristics of three distinct regions (suburban Mediterranean, continental urban, and continental rural) on the model accuracy and O3 predictions were investigated. In spite of the large ensemble set size, the model generally failed to simulate the extremes; however, as expected from probabilistic forecasting, the ensemble spread improved the results for extremes compared with the reference run. Noticeable model nighttime overestimations at the Mediterranean and some urban and rural sites can be explained by too strong simulated winds, which reduce the impact of dry deposition and O3 titration in the near-surface layers during the nighttime. Another possible explanation could be inaccuracies in the chemical mechanisms, which are suggested also by model insensitivity to variations in the nitrogen oxides (NOx) and volatile organic compounds (VOC) emissions. Major impact factors for underestimations of the daytime O3 maxima at the Mediterranean and some rural sites include overestimation of the PBL depths, a lack of information on forest fires, too strong surface winds, and also possible inaccuracies in biogenic emissions. This numerical experiment with the ensemble runs also provided guidance on an optimum model setup and input data.

  1. The Rise of Complexity in Flood Forecasting: Opportunities, Challenges and Tradeoffs

    NASA Astrophysics Data System (ADS)

    Wood, A. W.; Clark, M. P.; Nijssen, B.

    2017-12-01

    Operational flood forecasting is currently undergoing a major transformation. Most national flood forecasting services have relied for decades on lumped, highly calibrated conceptual hydrological models running on local office computing resources, providing deterministic streamflow predictions at gauged river locations that are important to stakeholders and emergency managers. A variety of recent technological advances now make it possible to run complex, high-to-hyper-resolution models for operational hydrologic prediction over large domains, and the US National Weather Service is now attempting to use hyper-resolution models to create new forecast services and products. Yet other "increased-complexity" forecasting strategies also exist that pursue different tradeoffs between model complexity (i.e., spatial resolution, physics) and streamflow forecast system objectives. There is currently a pressing need for a greater understanding in the hydrology community of the opportunities, challenges and tradeoffs associated with these different forecasting approaches, and for a greater participation by the hydrology community in evaluating, guiding and implementing these approaches. Intermediate-resolution forecast systems, for instance, use distributed land surface model (LSM) physics but retain the agility to deploy ensemble methods (including hydrologic data assimilation and hindcast-based post-processing). Fully coupled numerical weather prediction (NWP) systems, another example, use still coarser LSMs to produce ensemble streamflow predictions either at the model scale or after sub-grid scale runoff routing. Based on the direct experience of the authors and colleagues in research and operational forecasting, this presentation describes examples of different streamflow forecast paradigms, from the traditional to the recent hyper-resolution, to illustrate the range of choices facing forecast system developers.
We also discuss the degree to which the strengths and weaknesses of each strategy map onto the requirements for different types of forecasting services (e.g., flash flooding, river flooding, seasonal water supply prediction).

  2. An Ensemble Method for Classifying Regional Disease Patterns of Diffuse Interstitial Lung Disease Using HRCT Images from Different Vendors.

    PubMed

    Jun, Sanghoon; Kim, Namkug; Seo, Joon Beom; Lee, Young Kyung; Lynch, David A

    2017-12-01

    We propose the use of ensemble classifiers to overcome inter-scanner variations in the differentiation of regional disease patterns in high-resolution computed tomography (HRCT) images of diffuse interstitial lung disease patients obtained from different scanners. A total of 600 rectangular 20 × 20-pixel regions of interest (ROIs) on HRCT images obtained from two different scanners (GE and Siemens) and the whole lung area of 92 HRCT images were classified as one of six regional pulmonary disease patterns by two expert radiologists. Textural and shape features were extracted from each ROI and the whole lung parenchyma. For automatic classification, individual and ensemble classifiers were trained and tested with the ROI dataset. We designed the following three experimental sets: an intra-scanner study in which the training and test sets were from the same scanner, an integrated scanner study in which the data from the two scanners were merged, and an inter-scanner study in which the training and test sets were acquired from different scanners. In the ROI-based classification, the ensemble classifiers showed better (p < 0.001) accuracy (89.73%, SD = 0.43) than the individual classifiers (88.38%, SD = 0.31) in the integrated scanner test. The ensemble classifiers also showed partial improvements in the intra- and inter-scanner tests. In the whole lung classification experiment, the quantification accuracies of the ensemble classifiers with integrated training (49.57%) were higher (p < 0.001) than those of the individual classifiers (48.19%). Furthermore, the ensemble classifiers also showed better performance in both the intra- and inter-scanner experiments. We concluded that the ensemble classifiers provide better performance when using integrated scanner images.
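
    The abstract does not detail the specific combination scheme of the ensemble classifiers; the simplest illustration of combining several trained classifiers is plain majority voting over their hard labels, sketched below:

```python
import numpy as np

def majority_vote(predictions):
    """Combine hard class labels from several classifiers (one row per
    classifier, one column per sample) into a single ensemble label per
    sample by majority vote."""
    predictions = np.asarray(predictions)
    n_classes = predictions.max() + 1
    # For each sample (column), count the votes each class received.
    counts = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=n_classes), 0, predictions)
    return counts.argmax(axis=0)     # most-voted class per sample
```

    In practice, voting over classifiers trained on images from different scanners is one way such an ensemble can dilute scanner-specific biases.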

  3. Probabilistic versus deterministic skill in predicting the western North Pacific-East Asian summer monsoon variability with multimodel ensembles

    NASA Astrophysics Data System (ADS)

    Yang, Xiu-Qun; Yang, Dejian; Xie, Qian; Zhang, Yaocun; Ren, Xuejuan; Tang, Youmin

    2017-04-01

    Based on historical forecasts of three quasi-operational multi-model ensemble (MME) systems, this study assesses the superiority of the coupled MME over the contributing single-model ensembles (SMEs) and over the uncoupled atmospheric MME in predicting the Western North Pacific-East Asian summer monsoon variability. The probabilistic and deterministic forecast skills are measured by the Brier skill score (BSS) and anomaly correlation (AC), respectively. A forecast-format-dependent MME superiority over the SMEs is found. The probabilistic forecast skill of the MME is always significantly better than that of each SME, while the deterministic forecast skill of the MME can be lower than that of some SMEs. The MME superiority arises from both the model diversity and the ensemble size increase in the tropics, and primarily from the ensemble size increase in the subtropics. The BSS is composed of reliability and resolution, two attributes characterizing probabilistic forecast skill. The probabilistic skill increase of the MME is dominated by the dramatic improvement in reliability, while resolution is not always improved, similar to AC. A monotonic resolution-AC relationship is further found and qualitatively explained, whereas little relationship can be identified between reliability and AC. It is argued that the MME's success in improving the reliability arises from an effective reduction of the overconfidence in forecast distributions. Moreover, the seasonal predictions with the coupled MME are shown to be more skillful than those with the uncoupled atmospheric MME forced by persisting sea surface temperature (SST) anomalies, since the coupled MME better predicts the SST anomaly evolution in three key regions.
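
    The reliability-resolution decomposition of the Brier score mentioned above follows Murphy's classic partition, BS = reliability - resolution + uncertainty, computed over bins of forecast probability. A binned implementation might look like this sketch:

```python
import numpy as np

def brier_decomposition(p_forecast, outcome, n_bins=10):
    """Murphy decomposition of the Brier score for binary events:
    BS = reliability - resolution + uncertainty.
    p_forecast: forecast probabilities in [0, 1]; outcome: 0/1 observations."""
    p = np.asarray(p_forecast, float)
    o = np.asarray(outcome, float)
    base = o.mean()                               # climatological frequency
    bins = np.clip((p * n_bins).astype(int), 0, n_bins - 1)
    rel = res = 0.0
    for k in range(n_bins):
        m = bins == k
        if m.any():
            w = m.mean()                          # weight of this bin
            rel += w * (p[m].mean() - o[m].mean()) ** 2
            res += w * (o[m].mean() - base) ** 2
    unc = base * (1.0 - base)
    return rel, res, unc
```

    Lower reliability (closer to zero) and higher resolution both raise the BSS, which is why an MME can improve reliability dramatically without necessarily improving resolution.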

  4. Uncertainties and coupled error covariances in the CERA-20C, ECMWF's first coupled reanalysis ensemble

    NASA Astrophysics Data System (ADS)

    Feng, Xiangbo; Haines, Keith

    2017-04-01

    ECMWF has produced its first ensemble ocean-atmosphere coupled reanalysis, the 20th century Coupled ECMWF ReAnalysis (CERA-20C), with 10 ensemble members at 3-hour resolution. Here the analysis uncertainties (ensemble spread) of lower atmospheric variables and sea surface temperature (SST), and their correlations, are quantified on diurnal, seasonal and longer timescales. The 2-m air temperature (T2m) spread is always larger than the SST spread at high frequencies, but smaller on monthly timescales, except in deep convection areas, indicating increasing SST control at longer timescales. Spatially, the T2m-SST ensemble correlations are strongest where ocean mixed layers are shallow and can respond to atmospheric variability. Where atmospheric convection is strong with a deep precipitating boundary layer, T2m-SST correlations are greatly reduced. As the 20th century progresses, more observations become available and ensemble spreads decline at all variability timescales. The T2m-SST correlations increase through the 20th century, except in the tropics. As winds become better constrained over the oceans with less spread, T2m and SST become more correlated. In the tropics, strong ENSO-related inter-annual variability is found in the correlations, as atmospheric convection centres move. These ensemble spreads have been used to provide background errors for the assimilation throughout the reanalysis, have implications for the weights given to observations, and are a general measure of the uncertainties in the analysed product. Although cross-boundary covariances are not currently used, they offer considerable potential for strengthening the ocean-atmosphere coupling in future reanalyses.

  5. Use of Ensemble Numerical Weather Prediction Data for Inversely Determining Atmospheric Refractivity in Surface Ducting Conditions

    NASA Astrophysics Data System (ADS)

    Greenway, D. P.; Hackett, E.

    2017-12-01

    Under certain atmospheric refractivity conditions, propagated electromagnetic (EM) waves can become trapped between the surface and the bottom of the atmosphere's mixed layer, which is referred to as surface duct propagation. Being able to predict the presence of these surface ducts can bring many benefits to users and developers of sensing technologies and communication systems because they significantly influence the performance of these systems. However, the ability to directly measure or model a surface ducting layer is challenging due to the high spatial resolution and large spatial coverage needed to make accurate refractivity estimates for EM propagation; thus, inverse methods have become an increasingly popular way of determining atmospheric refractivity. This study uses data from the Coupled Ocean/Atmosphere Mesoscale Prediction System developed by the Naval Research Laboratory and instrumented helicopter (helo) measurements taken during the Wallops Island Field Experiment to evaluate the use of ensemble forecasts in refractivity inversions. Helo measurements and ensemble forecasts are optimized to a parametric refractivity model, and three experiments are performed to evaluate whether incorporation of ensemble forecast data aids in more timely and accurate inverse solutions using genetic algorithms. The results suggest that using optimized ensemble members as an initial population for the genetic algorithms generally enhances the accuracy and speed of the inverse solution; however, use of the ensemble data to restrict the parameter search space yields mixed results. Inaccurate results are related to the parameterization of the ensemble members' refractivity profiles and the subsequent extraction of the parameter ranges to limit the search space.
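
    As a sketch of the inversion idea, the toy genetic algorithm below fits a hypothetical two-parameter duct profile (standing in for the study's actual parametric refractivity model) by elitist selection and Gaussian mutation. Seeding the initial population from optimized NWP ensemble members, rather than from random draws, is the variation the study evaluates; here `init_pop` is simply whatever the caller supplies:

```python
import numpy as np

rng = np.random.default_rng(1)
z = np.linspace(0.0, 200.0, 50)          # heights in m (illustrative grid)

def misfit(params, observed):
    """Squared error of a toy two-parameter profile (duct height and
    in-duct gradient) against 'measured' refractivity values."""
    duct_height, gradient = params
    profile = 330.0 - gradient * np.minimum(z, duct_height)
    return float(((profile - observed) ** 2).sum())

def ga_minimize(observed, init_pop, n_gen=100, sigma=(2.0, 0.05)):
    """Tiny elitist real-coded GA: keep the best half of the population
    each generation and refill it with mutated copies."""
    pop = np.array(init_pop, float)
    for _ in range(n_gen):
        scores = np.array([misfit(p, observed) for p in pop])
        parents = pop[np.argsort(scores)[: len(pop) // 2]]
        children = parents + rng.normal(0.0, 1.0, parents.shape) * sigma
        pop = np.vstack([parents, children])
    scores = np.array([misfit(p, observed) for p in pop])
    return pop[scores.argmin()]
```

    Because the best member is never discarded, the final misfit can only improve on the best initial guess, which is why a well-placed ensemble-seeded population speeds convergence.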

  6. A Multiplicative Cascade Model for High-Resolution Space-Time Downscaling of Rainfall

    NASA Astrophysics Data System (ADS)

    Raut, Bhupendra A.; Seed, Alan W.; Reeder, Michael J.; Jakob, Christian

    2018-02-01

    Distributions of rainfall with time and space resolutions of minutes and kilometers, respectively, are often needed to drive the hydrological models used in a range of engineering, environmental, and urban design applications. The work described here is the first step in constructing a model capable of downscaling rainfall to scales of minutes and kilometers from time and space resolutions of several hours and a hundred kilometers. A multiplicative random cascade model known as the Short-Term Ensemble Prediction System is run with parameters from the radar observations at Melbourne (Australia). Orographic effects are added through a multiplicative correction factor after the model is run. In the first set of model calculations, 112 significant rain events over Melbourne are simulated 100 times. Because of the stochastic nature of the cascade model, the simulations represent 100 possible realizations of the same rain event. The cascade model produces realistic spatial and temporal patterns of rainfall at 6 min and 1 km resolution (the resolution of the radar data), the statistical properties of which are in close agreement with observation. In the second set of calculations, the cascade model is run continuously for all days from January 2008 to August 2015 and the rainfall accumulations are compared at 12 locations in the greater Melbourne area. The statistical properties of the observations lie within the envelope of the 100 ensemble members. The model successfully reproduces the frequency distribution of the 6 min rainfall intensities, storm durations, interarrival times, and autocorrelation function.
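
    The core of a multiplicative random cascade can be sketched in one dimension: each level splits every rainfall accumulation in two with random lognormal weights renormalized to conserve total rainfall, so each run is a different stochastic realization. The weight distribution and level count here are illustrative, not the STEPS parameters fitted to the Melbourne radar:

```python
import numpy as np

rng = np.random.default_rng(2)

def cascade_downscale(coarse, levels=4, sigma=0.3):
    """Downscale a 1-D series of rainfall accumulations by repeatedly
    splitting each interval in two with mass-conserving random weights."""
    field = np.asarray(coarse, float)
    for _ in range(levels):
        w = rng.lognormal(0.0, sigma, size=(field.size, 2))
        w /= w.sum(axis=1, keepdims=True)     # the two children sum to 1
        field = (field[:, None] * w).ravel()  # split each accumulation
    return field
```

    Dry intervals stay dry (zero times any weight is zero), and repeated calls give an ensemble of equally plausible fine-scale realizations of the same coarse total.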

  7. Extended Range Prediction of Indian Summer Monsoon: Current status

    NASA Astrophysics Data System (ADS)

    Sahai, A. K.; Abhilash, S.; Borah, N.; Joseph, S.; Chattopadhyay, R.; S, S.; Rajeevan, M.; Mandal, R.; Dey, A.

    2014-12-01

    The main focus of this study is to develop forecast consensus in the extended range prediction (ERP) of monsoon intraseasonal oscillations using a suite of different variants of the Climate Forecast System (CFS) model. In this CFS-based Grand MME prediction system (CGMME), the ensemble members are generated by perturbing the initial conditions and using different configurations of CFSv2. This is to address the role of different physical mechanisms known to have control on the error growth in the ERP on the 15-20 day time scale. The final formulation of CGMME is based on 21 ensembles of the standalone Global Forecast System (GFS) forced with bias-corrected forecasted SST from CFS, 11 low resolution CFST126 and 11 high resolution CFST382. Thus, we develop the multi-model consensus forecast for the ERP of the Indian summer monsoon (ISM) using a suite of different variants of the CFS model. This coordinated international effort led towards the development of specific tailor-made regional forecast products over the Indian region. Skill of deterministic and probabilistic categorical rainfall forecasts, as well as the verification of large-scale low frequency monsoon intraseasonal oscillations, has been assessed using hindcasts from 2001-2012 during the monsoon season, in which all models are initialized every five days from 16 May to 28 September. The skill of the deterministic forecast from CGMME is better than that of the best participating single model ensemble configuration (SME). The CGMME approach is believed to quantify the uncertainty in both initial conditions and model formulation. The main improvement is attained in the probabilistic forecast, which is because of an increase in the ensemble spread, thereby reducing the error due to over-confident ensembles in a single model configuration. For the probabilistic forecast, three tercile ranges are determined by a ranking method based on the percentage of ensemble members from all the participating models that fall in those three categories.
    CGMME further adds value to both the deterministic and probabilistic forecasts compared to the raw SMEs, and this better skill probably flows from the larger spread and improved spread-error relationship. The CGMME system is currently capable of generating ER predictions in real time and has successfully delivered its experimental operational ER forecast of the ISM for the last few years.
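
    The tercile ranking method described above amounts to counting the fraction of pooled ensemble members that fall below, between, or above climatological tercile bounds; a minimal sketch:

```python
import numpy as np

def tercile_probs(members, clim_lower, clim_upper):
    """Probabilistic tercile forecast: fractions of pooled multi-model
    ensemble members in the below-normal, near-normal and above-normal
    categories defined by climatological tercile bounds."""
    m = np.asarray(members, float)
    below = float((m < clim_lower).mean())
    above = float((m > clim_upper).mean())
    return below, 1.0 - below - above, above
```

    Pooling members from all participating model variants into one count is what lets the multi-model spread widen the probabilities relative to a single over-confident configuration.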

  8. On the generation of climate model ensembles

    NASA Astrophysics Data System (ADS)

    Haughton, Ned; Abramowitz, Gab; Pitman, Andy; Phipps, Steven J.

    2014-10-01

    Climate model ensembles are used to estimate uncertainty in future projections, typically by interpreting the ensemble distribution for a particular variable probabilistically. There are, however, different ways to produce climate model ensembles that yield different results, and therefore different probabilities for a future change in a variable. Perhaps equally importantly, there are different approaches to interpreting the ensemble distribution that lead to different conclusions. Here we use a reduced-resolution climate system model to compare three common ways to generate ensembles: initial conditions perturbation, physical parameter perturbation, and structural changes. Despite these three approaches conceptually representing very different categories of uncertainty within a modelling system, when comparing simulations to observations of surface air temperature they can be very difficult to separate. Using the twentieth century CMIP5 ensemble for comparison, we show that initial conditions ensembles, in theory representing internal variability, significantly underestimate observed variance. Structural ensembles, perhaps less surprisingly, exhibit over-dispersion in simulated variance. We argue that future climate model ensembles may need to include parameter or structural perturbation members in addition to perturbed initial conditions members to ensure that they sample uncertainty due to internal variability more completely. We note that where ensembles are over- or under-dispersive, such as for the CMIP5 ensemble, estimates of uncertainty need to be treated with care.
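
    Under- or over-dispersion of an ensemble can be diagnosed by comparing the mean ensemble variance with the squared error of the ensemble mean against observations. The simple ratio statistic below is a sketch of that idea, not the paper's exact metric:

```python
import numpy as np

rng = np.random.default_rng(4)

def dispersion_ratio(ensemble, obs):
    """Mean ensemble variance divided by the mean squared error of the
    ensemble mean: ~1 for a well-dispersed ensemble, well below 1 for an
    under-dispersive one, above 1 for an over-dispersive one."""
    ens = np.asarray(ensemble, float)            # shape: members x time
    spread = ens.var(axis=0, ddof=1).mean()
    err = ((np.asarray(obs, float) - ens.mean(axis=0)) ** 2).mean()
    return spread / err
```

    An initial-conditions ensemble that underestimates observed variance, as reported above for surface air temperature, would score well below one on such a diagnostic.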

  9. Performance of a coupled lagged ensemble weather and river runoff prediction model system for the Alpine Ammer River catchment

    NASA Astrophysics Data System (ADS)

    Smiatek, G.; Kunstmann, H.; Werhahn, J.

    2012-04-01

    The Ammer River catchment, located in the Bavarian Ammergau Alps and alpine forelands, Germany, represents, with elevations reaching 2185 m and annual mean precipitation between 1100 and 2000 mm, a very demanding test ground for a river runoff prediction system. Large flooding events in 1999 and 2005 motivated the development of a physically based prediction tool in this area. Such a tool is the coupled high resolution numerical weather and river runoff forecasting system AM-POE, which has been studied in several configurations in various experiments starting from the year 2005. Cornerstones of the coupled system are the hydrological water balance model WaSiM-ETH run at 100 m grid resolution, the numerical weather prediction (NWP) model MM5 run at 3.5 km grid cell resolution, and the Perl Object Environment (POE) framework. POE implements the input data download from various sources and the input data provision via SOAP-based web services, as well as the runs of the hydrology model both with observed and with NWP-predicted meteorology input. The one-way coupled system utilizes a lagged ensemble prediction system (EPS) taking into account combinations of recent and previous NWP forecasts. Results obtained in the years 2005-2011 reveal that river runoff simulations show high correlation with observed runoff when driven with monitored observations in hindcast experiments. The accuracy of the runoff forecasts depends on lead time in the lagged ensemble prediction and still shows limitations resulting from errors in the timing and total amount of the predicted precipitation in the complex mountainous area. The presentation describes the system implementation, and demonstrates the application of the POE framework in networking, distributed computing and the setup of various experiments, as well as long-term results of the system application in the years 2005-2011.

  10. Ocean state and uncertainty forecasts using HYCOM with Local Ensemble Transform Kalman Filter (LETKF)

    NASA Astrophysics Data System (ADS)

    Wei, Mozheng; Hogan, Pat; Rowley, Clark; Smedstad, Ole-Martin; Wallcraft, Alan; Penny, Steve

    2017-04-01

    An ensemble forecast system based on the US Navy's operational HYCOM using Local Ensemble Transform Kalman Filter (LETKF) technology has been developed for ocean state and uncertainty forecasts. One of the advantages is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates the operational observations using an ensemble method. The background covariance during this assimilation process is supplied by the ensemble, thus avoiding the difficulty of developing tangent linear and adjoint models for 4D-VAR from the complicated hybrid isopycnal vertical coordinate in HYCOM. Another advantage is that the ensemble system provides a valuable uncertainty estimate corresponding to every state forecast from HYCOM. Uncertainty forecasts have proven to be critical for downstream users and managers to make more scientifically sound decisions in the numerical prediction community. In addition, the ensemble mean is generally more accurate and skilful than a single traditional deterministic forecast with the same resolution. We will introduce the ensemble system design and setup, present some results from a 30-member ensemble experiment, and discuss scientific, technical and computational issues and challenges, such as covariance localization, inflation, model-related uncertainties and sensitivity to the ensemble size.
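
    The LETKF itself computes a deterministic square-root transform locally in ensemble space; as a simpler stand-in, the sketch below implements a stochastic (perturbed-observations) ensemble Kalman update for a linear observation operator. It shares the key property mentioned above: the background covariance, and hence the gain, comes entirely from the ensemble, with no tangent linear or adjoint model:

```python
import numpy as np

rng = np.random.default_rng(3)

def enkf_update(ens, y_obs, obs_var, H):
    """Stochastic (perturbed-observations) ensemble Kalman update.
    ens: state_dim x n_members; y_obs: obs_dim; H: obs_dim x state_dim."""
    X = np.asarray(ens, float)
    n = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)            # state anomalies
    HX = H @ X
    HA = HX - HX.mean(axis=1, keepdims=True)         # obs-space anomalies
    P_hh = HA @ HA.T / (n - 1) + obs_var * np.eye(HX.shape[0])
    K = (A @ HA.T / (n - 1)) @ np.linalg.inv(P_hh)   # ensemble Kalman gain
    y_pert = np.asarray(y_obs, float)[:, None] + rng.normal(
        0.0, np.sqrt(obs_var), HX.shape)             # perturbed observations
    return X + K @ (y_pert - HX)
```

    After the update, the ensemble mean moves toward the observation and the spread shrinks, providing both the analysis and its uncertainty estimate.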

  11. NMR Studies of Dynamic Biomolecular Conformational Ensembles

    PubMed Central

    Torchia, Dennis A.

    2015-01-01

    Multidimensional heteronuclear NMR approaches can provide nearly complete sequential signal assignments of isotopically enriched biomolecules. The availability of assignments together with measurements of spin relaxation rates, residual spin interactions, J-couplings and chemical shifts provides information at atomic resolution about internal dynamics on timescales ranging from ps to ms, both in solution and in the solid state. However, due to the complexity of biomolecules, it is not possible to extract a unique atomic-resolution description of biomolecular motions even from extensive NMR data when many conformations are sampled on multiple timescales. For this reason, powerful computational approaches are increasingly applied to large NMR data sets to elucidate conformational ensembles sampled by biomolecules. In the past decade, considerable attention has been directed at an important class of biomolecules that function by binding to a wide variety of target molecules. Questions of current interest are: “Does the free biomolecule sample a conformational ensemble that encompasses the conformations found when it binds to various targets; and if so, on what time scale is the ensemble sampled?” This article reviews recent efforts to answer these questions, with a focus on comparing ensembles obtained for the same biomolecules by different investigators. A detailed comparison of results obtained is provided for three biomolecules: ubiquitin, calmodulin and the HIV-1 trans-activation response RNA. PMID:25669739

  12. New Developments in the Data Assimilation Research Testbed

    NASA Astrophysics Data System (ADS)

    Hoar, T. J.; Anderson, J. L.; Raeder, K.; Karspeck, A. R.; Romine, G.; Liu, H.; Collins, N.

    2011-12-01

    NCAR's Data Assimilation Research Testbed (DART) is a community facility that provides ensemble data assimilation tools for geophysical applications. DART works with an expanding set of models and a wide range of conventional and novel observations, and provides a variety of assimilation algorithms and diagnostic tools. The Kodiak release of DART became available in July 2011 and includes more than 20 major feature enhancements, support for 24 models, support for (at least) 14 observation formats, expanded documentation and diagnostic tools, and 12 new utilities. A few examples of research projects that demonstrate the effectiveness and flexibility of the DART are described. The Community Atmosphere Model (CAM) and DART assimilated all the observations that were used in the NCEP/NCAR Reanalysis to produce a global, 6-hourly, 80-member ensemble reanalysis for 1998 through the present. The dataset is ideal for research applications that would benefit from an ensemble of equally-likely atmospheric states that are consistent with observations. Individual ensemble members may be used as a "data atmosphere" in any Community Earth System Model (CESM) experiment. The CESM interfaces for the Parallel Ocean Program (POP) and the Community Land Model (CLM) also support multiple instances, allowing data assimilation experiments exploiting unique atmospheric forcing for each POP or CLM model instance. A multi-year DART ocean assimilation has been completed and provides valuable insight into the successes and challenges of oceanic data assimilation. The DART/CLM research focuses on snow cover fraction and snow depth. The Weather Research and Forecasting (WRF) model was used with DART to perform a real-time CONUS domain mesoscale ensemble analysis with continuous cycling for 47 days. 
    A member was selected once daily for high-resolution convective forecasts supporting a test phase of the Deep Convective Clouds and Chemistry experiment and the Storm Prediction Center spring experiment. The impacts of Moderate Resolution Imaging Spectroradiometer (MODIS) infrared and Advanced Microwave Scanning Radiometer (AMSR) microwave total precipitable water (TPW) observations on analyses and forecasts of tropical cyclone Sinlaku (2008) are investigated by performing assimilations with a 45-km resolution WRF model over the Western Pacific domain for 8-14 September 2008. Particular emphasis is on the performance of the assimilation algorithms in the hurricane core and the impact of novel observations in the hurricane core.

  13. Ensemble Solar Forecasting Statistical Quantification and Sensitivity Analysis: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheung, WanYin; Zhang, Jie; Florita, Anthony

    2015-12-08

    Uncertainties associated with solar forecasts present challenges to maintaining grid reliability, especially at high solar penetrations. This study aims to quantify the errors associated with the day-ahead solar forecast parameters and the theoretical solar power output for a 51-kW solar power plant in a utility area in the state of Vermont, U.S. Forecasts were generated by three numerical weather prediction (NWP) models, including the Rapid Refresh, the High Resolution Rapid Refresh, and the North American Model, and a machine-learning ensemble model. A photovoltaic (PV) performance model was adopted to calculate theoretical solar power generation using the forecast parameters (e.g., irradiance, cell temperature, and wind speed). Errors of the power outputs were quantified using statistical moments and a suite of metrics, such as the normalized root mean squared error (NRMSE). In addition, the PV model's sensitivity to different forecast parameters was quantified and analyzed. Results showed that the ensemble model yielded forecasts in all parameters with the smallest NRMSE. The NRMSE of the solar irradiance forecasts of the ensemble NWP model was reduced by 28.10% compared to the best of the three NWP models. Further, the sensitivity analysis indicated that the errors of the forecasted cell temperature contributed only approximately 0.12% to the NRMSE of the power output, as opposed to 7.44% from the forecasted solar irradiance.
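
    The exact normalization of the NRMSE is not specified in the preprint abstract; one common convention divides the RMSE by the observed range, as in this sketch:

```python
import numpy as np

def nrmse(forecast, observed):
    """Root-mean-square error normalized by the observed range (one common
    convention; others divide by the mean or by installed capacity)."""
    f = np.asarray(forecast, float)
    o = np.asarray(observed, float)
    rmse = np.sqrt(((f - o) ** 2).mean())
    return rmse / (o.max() - o.min())
```

    Normalizing makes error levels comparable across parameters with different units, such as irradiance, cell temperature, and wind speed.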

  14. Opportunities and challenges for extended-range predictions of tropical cyclone impacts on hydrological predictions

    NASA Astrophysics Data System (ADS)

    Tsai, Hsiao-Chung; Elsberry, Russell L.

    2013-12-01

    An opportunity exists to extend support to the decision-making processes of water resource management and hydrological operations by providing extended-range tropical cyclone (TC) formation and track forecasts in the western North Pacific from the 51-member ECMWF 32-day ensemble. A new objective verification technique demonstrates that the ECMWF ensemble can predict most of the formations and tracks of the TCs from July 2009 through December 2010, even for most of the tropical depressions. Because the relatively large number of false-alarm TCs in the ECMWF ensemble forecasts would cause problems for hydrological operations, the characteristics of these false alarms are discussed. Special attention is given to the ability of the ECMWF ensemble to predict TC-free periods in the Taiwan area, since water resource management decisions also depend on the absence of typhoon-related rainfall. A three-tier approach is proposed to provide support for hydrological operations via extended-range forecasts twice weekly on the 30-day timescale, twice daily on the 15-day timescale, and up to four times a day with a consensus of high-resolution deterministic models.

  15. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting.

    PubMed

    Men, Zhongxian; Yee, Eugene; Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an "optimal" weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds.
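    The ensemble-combination step described above can be illustrated with a minimal sketch: given forecasts from several trained members (here stand-in random numbers rather than the paper's PSO-trained ANNs), the ensemble mean provides the combined forecast and the member spread provides uncertainty bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in wind speed forecasts (m/s) from 10 ensemble members over 6 hours;
# in the paper each member would be an ANN with a different weight initialization.
members = 8.0 + rng.normal(0.0, 0.6, size=(10, 6))

ensemble_mean = members.mean(axis=0)        # combined forecast
ensemble_std = members.std(axis=0, ddof=1)  # spread -> forecast uncertainty

# Approximate 95% prediction bounds from the member spread (Gaussian assumption).
lower = ensemble_mean - 1.96 * ensemble_std
upper = ensemble_mean + 1.96 * ensemble_std
```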

  16. Ensemble Nonlinear Autoregressive Exogenous Artificial Neural Networks for Short-Term Wind Speed and Power Forecasting

    PubMed Central

    Lien, Fue-Sang; Yang, Zhiling; Liu, Yongqian

    2014-01-01

    Short-term wind speed and wind power forecasts (for a 72 h period) are obtained using a nonlinear autoregressive exogenous artificial neural network (ANN) methodology which incorporates either numerical weather prediction or high-resolution computational fluid dynamics wind field information as an exogenous input. An ensemble approach is used to combine the predictions from many candidate ANNs in order to provide improved forecasts for wind speed and power, along with the associated uncertainties in these forecasts. More specifically, the ensemble ANN is used to quantify the uncertainties arising from the network weight initialization and from the unknown structure of the ANN. All members forming the ensemble of neural networks were trained using an efficient particle swarm optimization algorithm. The results of the proposed methodology are validated using wind speed and wind power data obtained from an operational wind farm located in Northern China. The assessment demonstrates that this methodology for wind speed and power forecasting generally provides an improvement in predictive skills when compared to the practice of using an “optimal” weight vector from a single ANN while providing additional information in the form of prediction uncertainty bounds. PMID:27382627

  17. Ensemble modeling of very small ZnO nanoparticles.

    PubMed

    Niederdraenk, Franziska; Seufert, Knud; Stahl, Andreas; Bhalerao-Panajkar, Rohini S; Marathe, Sonali; Kulkarni, Sulabha K; Neder, Reinhard B; Kumpf, Christian

    2011-01-14

    The detailed structural characterization of nanoparticles is a very important issue since it enables a precise understanding of their electronic, optical and magnetic properties. Here we introduce a new method for modeling the structure of very small particles by means of powder X-ray diffraction. Using thioglycerol-capped ZnO nanoparticles with a diameter of less than 3 nm as an example, we demonstrate that our ensemble modeling method is superior to standard XRD methods such as Rietveld refinement. Besides fundamental properties (size, anisotropic shape and atomic structure), more sophisticated properties can be obtained, such as imperfections in the lattice, a size distribution, and strain and relaxation effects in the particles and, in particular, at their surface (surface relaxation effects). Ensemble properties, i.e., distributions of the particle size and other properties, can also be investigated, which makes this method superior to imaging techniques such as (high resolution) transmission electron microscopy or atomic force microscopy, in particular for very small nanoparticles. For the particles under study, excellent agreement between calculated and experimental X-ray diffraction patterns was obtained with an ensemble of anisotropic polyhedral particles of three dominant sizes, wurtzite structure and a significant relaxation of Zn atoms close to the surface.

  18. SHORT RANGE ENSEMBLE Products

    Science.gov Websites

    Products from the NEMS Non-hydrostatic Multiscale Model on the B grid, available on AWIPS grid 212, Regional - CONUS Double Resolution (Lambert Conformal, 40 km), and AWIPS grid 132, Double Resolution (Lambert Conformal, 16 km).

  19. Does the uncertainty in the representation of terrestrial water flows affect precipitation predictability? A WRF-Hydro ensemble analysis for Central Europe

    NASA Astrophysics Data System (ADS)

    Arnault, Joel; Rummler, Thomas; Baur, Florian; Lerch, Sebastian; Wagner, Sven; Fersch, Benjamin; Zhang, Zhenyu; Kerandi, Noah; Keil, Christian; Kunstmann, Harald

    2017-04-01

    Precipitation predictability can be assessed by the spread within an ensemble of atmospheric simulations perturbed, within a range of uncertainty, in the initial conditions, lateral boundary conditions, and/or modeled processes. Surface-related processes are more likely to change precipitation when synoptic forcing is weak. This study investigates the effect of uncertainty in the representation of terrestrial water flows on precipitation predictability. The tools used for this investigation are the Weather Research and Forecasting (WRF) model and its hydrologically-enhanced version WRF-Hydro, applied over Central Europe during April-October 2008. The WRF grid is that of COSMO-DE, with a resolution of 2.8 km. In WRF-Hydro, the WRF grid is coupled with a sub-grid at 280 m resolution to resolve lateral terrestrial water flows. Vertical flow uncertainty is considered by modifying the parameter controlling the partitioning between surface runoff and infiltration in WRF, and horizontal flow uncertainty is considered by comparing WRF with WRF-Hydro. Precipitation predictability is deduced from the spread of an ensemble based on three turbulence parameterizations. Model results are validated with E-OBS precipitation and surface temperature, ESA-CCI soil moisture, FLUXNET-MTE surface evaporation and GRDC discharge. It is found that the uncertainty in the representation of terrestrial water flows is more likely to significantly affect precipitation predictability when surface flux spatial variability is high. In comparison to the WRF ensemble, WRF-Hydro slightly improves the adjusted continuous ranked probability score of daily precipitation. The reproduction of observed daily discharge with Nash-Sutcliffe model efficiency coefficients up to 0.91 demonstrates the potential of WRF-Hydro for flood forecasting.
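    The Nash-Sutcliffe model efficiency coefficient cited for the discharge validation compares model errors against the variance of the observations. A minimal sketch with hypothetical discharge values:

```python
import numpy as np

def nse(simulated, observed):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model is
    no better than always predicting the observed mean."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical daily discharge (m^3/s).
observed = [12.0, 15.0, 30.0, 55.0, 40.0, 22.0, 16.0]
simulated = [11.0, 16.0, 27.0, 50.0, 43.0, 24.0, 15.0]

efficiency = nse(simulated, observed)
```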

  20. Scientific assessment of accuracy, skill and reliability of ocean probabilistic forecast products.

    NASA Astrophysics Data System (ADS)

    Wei, M.; Rowley, C. D.; Barron, C. N.; Hogan, P. J.

    2016-02-01

    As ocean operational centers increasingly adopt and generate probabilistic forecast products that convey valuable forecast uncertainties to their customers, objectively assessing and measuring these complicated products is challenging. The first challenge is how to deal with the huge amount of data from the ensemble forecasts. The second is how to describe the scientific quality of probabilistic products. In fact, probabilistic forecast accuracy, skill, reliability, and resolution are different attributes of a forecast system. We briefly introduce some of the fundamental metrics, such as the Reliability Diagram, Reliability, Resolution, Brier Score (BS), Brier Skill Score (BSS), Ranked Probability Score (RPS), Ranked Probability Skill Score (RPSS), Continuous Ranked Probability Score (CRPS), and Continuous Ranked Probability Skill Score (CRPSS). The values and significance of these metrics are demonstrated for forecasts from the US Navy's regional ensemble system with different numbers of ensemble members. The advantages and differences of these metrics are studied and clarified.
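    Two of the metrics named above, the Brier score and Brier skill score, reduce to a few lines. The probabilities and outcomes below are hypothetical, and the reference forecast used here is the climatological base rate, one common choice.

```python
import numpy as np

def brier_score(prob_forecasts, outcomes):
    """Mean squared difference between forecast probability and the 0/1 outcome."""
    p = np.asarray(prob_forecasts, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(prob_forecasts, outcomes):
    """Skill relative to a climatological reference forecast (the base rate)."""
    o = np.asarray(outcomes, dtype=float)
    bs_ref = brier_score(np.full_like(o, o.mean()), o)
    return 1.0 - brier_score(prob_forecasts, outcomes) / bs_ref

# Hypothetical forecast probabilities of an event and the observed outcomes.
probs = [0.9, 0.7, 0.2, 0.1, 0.8, 0.3]
obs = [1, 1, 0, 0, 1, 0]
```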

  1. Ensemble-based evaluation of extreme water levels for the eastern Baltic Sea

    NASA Astrophysics Data System (ADS)

    Eelsalu, Maris; Soomere, Tarmo

    2016-04-01

    The risks and damages associated with coastal flooding, which naturally increase with the magnitude of extreme storm surges, are among the largest concerns of countries with extensive low-lying nearshore areas. The relevant risks are even more pronounced for semi-enclosed water bodies such as the Baltic Sea, where subtidal (weekly-scale) variations in the water volume of the sea substantially contribute to the water level and lead to a large spread in projections of future extreme water levels. We explore the options for using large ensembles of projections to more reliably evaluate return periods of extreme water levels. Individual projections of the ensemble are constructed by fitting several sets of block maxima with various extreme value distributions. The ensemble is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute: a hindcast by the Rossby Centre Ocean model sampled with a resolution of 6 h, and a similar hindcast by the circulation model NEMO with a resolution of 1 h. As the annual maxima of water levels in the Baltic Sea are not always uncorrelated, we employ maxima for calendar years and for stormy seasons. As the shape parameter of the Generalised Extreme Value distribution changes its sign and substantially varies in magnitude along the eastern coast of the Baltic Sea, the use of a single distribution for the entire coast is inappropriate. The ensemble involves projections based on the Generalised Extreme Value, Gumbel and Weibull distributions. The parameters of these distributions are evaluated in three different ways: the maximum likelihood method, and the method of moments based on both biased and unbiased estimates. The total number of projections in the ensemble is 40. As some of the resulting estimates contain limited additional information, the members of pairs of projections that are highly correlated are assigned weights of 0.6.
A comparison of the ensemble-based projection of extreme water levels and their return periods with similar estimates derived from local observations reveals an interesting pattern of match and mismatch. The match is almost perfect at measurement sites where local effects (e.g., wave-induced set-up, or local surge in very shallow areas that are not resolved by circulation models) do not contribute to the observed values of water level. There is, however, substantial mismatch between projected and observed extreme values for most of the Estonian coast. The mismatch is largest for sections that are open to high waves and for several bays that are deeply cut into the mainland but open to predominant strong wind directions. Detailed quantification of this mismatch eventually makes it possible to develop substantially improved estimates of extreme water levels in sections where local effects contribute considerably to the total water level.
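    As an illustration of a single member of such an ensemble, the sketch below fits a Gumbel distribution to hypothetical annual maxima by the method of moments (one of the estimator choices named above) and evaluates a 100-year return level. The station values are invented.

```python
import numpy as np

EULER_GAMMA = 0.5772156649

def gumbel_fit_moments(annual_maxima):
    """Fit a Gumbel distribution to block maxima by the method of moments."""
    maxima = np.asarray(annual_maxima, dtype=float)
    scale = maxima.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = maxima.mean() - EULER_GAMMA * scale
    return loc, scale

def gumbel_return_level(loc, scale, return_period):
    """Level exceeded on average once per `return_period` blocks (years)."""
    p = 1.0 / return_period  # annual exceedance probability
    return loc - scale * np.log(-np.log(1.0 - p))

# Hypothetical annual maximum water levels (cm) at one coastal station.
maxima = [96, 104, 88, 133, 120, 101, 142, 110, 95, 127]
loc, scale = gumbel_fit_moments(maxima)
level_100yr = gumbel_return_level(loc, scale, 100)
```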

  2. A Coral Ensemble Approach to Reconstructing Central Pacific Climate Change During the Holocene

    NASA Astrophysics Data System (ADS)

    Atwood, A. R.; Cobb, K. M.; Grothe, P. R.; Sayani, H. R.; Southon, J. R.; Edwards, R. L.; Deocampo, D.; Chen, T.; Townsend, K. J.; Hagos, M. M.; Chiang, J. C. H.

    2016-12-01

    The processes that control El Niño-Southern Oscillation (ENSO) variability on long timescales are still poorly understood. As a consequence, limited progress has been made in understanding how ENSO will change under greenhouse gas forcing. The mid-Holocene provides a well-defined target to study the fundamental controls of ENSO variability. A large number of paleo-ENSO records spanning the tropical Pacific indicate that ENSO variability was reduced by as much as 50% between 3000-6000 yr BP, relative to modern times. Dynamical models of ENSO suggest that ENSO properties can shift in response to changes in the tropical Pacific mean state and/or seasonal cycle, but few proxy records can resolve such changes during the interval in question with enough accuracy. While decades of research have demonstrated the fidelity of tropical Pacific coral d18O records to quantify interannual temperature and precipitation anomalies associated with ENSO, substantial mean offsets exist across overlapping coral sequences that have made it difficult to quantify past changes in mean climate. Here, we test a new approach to reconstruct changes in mean climate from coral records using a large ensemble of bulk d18O measurements on radiometrically-dated fossil corals from Christmas Island that span the Holocene. In contrast to the traditional method of high-resolution sampling to reconstruct monthly climate conditions, we implement a bulk approach, which dramatically reduces the analysis time needed to estimate mean coral d18O and enables a large number of corals to be analyzed in the production of an ensemble of mean climate estimates. A pseudo-coral experiment based on simulations with a Linear Inverse Model and a coupled GCM is used to determine the number of bulk coral estimates that are required to resolve a given mean climate perturbation. 
In addition to these bulk measurements, short transects are sampled at high resolution to constrain changes in the amplitude of the seasonal cycle. We present preliminary results from our joint bulk/high-resolution sampling approach that provide new constraints on changes in mean climate and seasonality in the central equatorial Pacific over the last 6,000 years.
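    The question of how many bulk coral estimates are needed to resolve a mean perturbation is, at its core, a standard-error-of-the-mean calculation. The back-of-envelope sketch below (with invented scatter and signal sizes) is far simpler than the paper's pseudo-coral experiment, which additionally accounts for climate variability sampled by the Linear Inverse Model and the coupled GCM.

```python
import math

def corals_needed(sigma, delta, z=1.96):
    """Number of independent bulk d18O estimates n such that the half-width of
    a 95% confidence interval on the mean, z * sigma / sqrt(n), equals delta."""
    return math.ceil((z * sigma / delta) ** 2)

# Invented values: inter-coral scatter of 0.2 permil, target signal of 0.1 permil.
n = corals_needed(sigma=0.2, delta=0.1)  # 16 corals
```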

  3. Multiphysics superensemble forecast applied to Mediterranean heavy precipitation situations

    NASA Astrophysics Data System (ADS)

    Vich, M.; Romero, R.

    2010-11-01

    The high-impact precipitation events that regularly affect the western Mediterranean coastal regions are still difficult to predict with current prediction systems. Bearing this in mind, this paper focuses on the superensemble technique applied to the precipitation field. Encouraged by the skill shown by a previous multiphysics ensemble prediction system applied to western Mediterranean precipitation events, the superensemble is fed with this ensemble. The training phase of the superensemble contributes to the actual forecast with weights obtained by comparing the past performance of the ensemble members and the corresponding observed states. The non-hydrostatic MM5 mesoscale model is used to run the multiphysics ensemble. Simulations are performed on a 22.5 km resolution domain (Domain 1 in http://mm5forecasts.uib.es) nested in the ECMWF forecast fields. The period between September and December 2001 is used to train the superensemble, and a collection of 19 MEDEX cyclones is used to test it. The verification procedure involves testing the superensemble performance and comparing it with that of the poor man's and bias-corrected ensemble means and the multiphysics EPS control member. The results emphasize the need for a well-behaved training phase to obtain good results with the superensemble technique. A strategy to obtain this improved training phase is already outlined.
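    The core of the superensemble training phase is a regression of observations on ensemble members over the training period. The sketch below uses synthetic data and ordinary least squares on anomalies, which captures the spirit of the technique, though the operational details (seasonal training windows, per-gridpoint weights) are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training set: observed precipitation and forecasts from four
# ensemble members over 30 forecast times (units arbitrary).
obs = rng.gamma(2.0, 5.0, size=30)
members = obs[None, :] + rng.normal(0.0, 2.0, size=(4, 30))

# Training phase: regress observed anomalies on member anomalies, yielding one
# weight per member; the weights are then applied to future forecasts.
obs_anom = obs - obs.mean()
mem_anom = members - members.mean(axis=1, keepdims=True)
weights, *_ = np.linalg.lstsq(mem_anom.T, obs_anom, rcond=None)

# Superensemble forecast on the training period (bias-corrected by construction).
forecast = obs.mean() + weights @ mem_anom
```

By construction, the trained weights cannot do worse on the training period than the equal-weight ensemble mean; the test of the technique is its skill on independent cases.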

  4. Decadal climate predictions improved by ocean ensemble dispersion filtering

    NASA Astrophysics Data System (ADS)

    Kadow, C.; Illing, S.; Kröner, I.; Ulbrich, U.; Cubasch, U.

    2017-06-01

    Decadal predictions by Earth system models aim to capture the state and phase of the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. While short-term weather forecasts represent an initial value problem and long-term climate projections represent a boundary condition problem, decadal climate prediction falls in-between these two time scales. In recent years, more precise initialization techniques of coupled Earth system models and increased ensemble sizes have improved decadal predictions. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Here we show that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, called the ensemble dispersion filter, results in more accurate results than the standard decadal prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution. Our results demonstrate how decadal climate predictions benefit from ocean ensemble dispersion filtering toward the ensemble mean.

    Plain Language Summary: Decadal predictions aim to predict the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. The ocean's memory, a consequence of its large heat capacity, holds substantial potential skill. In recent years, more precise initialization techniques of coupled Earth system models (incl. atmosphere and ocean) have improved decadal predictions. Ensembles are another important aspect. Running slightly perturbed predictions to trigger the famous butterfly effect results in an ensemble.
Evaluating the whole ensemble with its ensemble average, rather than a single prediction, improves the prediction system. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Our study shows that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, which applies the average during the model run and is called the ensemble dispersion filter, yields more accurate results than the standard prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution.

  5. Spatial Modeling and Uncertainty Assessment of Fine Scale Surface Processes Based on Coarse Terrain Elevation Data

    NASA Astrophysics Data System (ADS)

    Rasera, L. G.; Mariethoz, G.; Lane, S. N.

    2017-12-01

    Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface but at coarser spatial and temporal resolutions. Although useful for large-scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution.
In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small-scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.

  6. Estimation of the uncertainty of a climate model using an ensemble simulation

    NASA Astrophysics Data System (ADS)

    Barth, A.; Mathiot, P.; Goosse, H.

    2012-04-01

    The atmospheric forcings play an important role in the study of the ocean and sea-ice dynamics of the Southern Ocean. Errors in the atmospheric forcings will inevitably result in uncertain model results. The sensitivity of the model results to errors in the atmospheric forcings is studied with ensemble simulations using multivariate perturbations of the atmospheric forcing fields. The numerical ocean model used is NEMO-LIM in a global configuration with a horizontal resolution of 2°. NCEP reanalyses are used to provide air temperature and wind data to force the ocean model over the last 50 years.
A climatological mean is used to prescribe relative humidity, cloud cover and precipitation. In a first step, the model results are compared with OSTIA SST and OSI SAF sea ice concentration of the southern hemisphere. The seasonal behavior of the RMS difference and bias in SST and ice concentration is highlighted, as well as the regions with relatively high RMS errors and biases, such as the Antarctic Circumpolar Current and near the ice edge. Ensemble simulations are performed to statistically characterize the model error due to uncertainties in the atmospheric forcings. Such information is a crucial element for future data assimilation experiments. Ensemble simulations are performed with perturbed air temperature and wind forcings. A Fourier decomposition of the NCEP wind vectors and air temperature for 2007 is used to generate ensemble perturbations. The perturbations are scaled such that the resulting ensemble spread approximately matches the RMS differences between the model and the satellite SST and sea ice concentration. The ensemble spread and covariance are analyzed for the minimum and maximum sea ice extent.
It is shown that errors in the atmospheric forcings can extend to several hundred meters in depth near the Antarctic Circumpolar Current.

  7. Multi-model ensemble projections of European river floods and high flows at 1.5, 2, and 3 degrees global warming

    NASA Astrophysics Data System (ADS)

    Thober, Stephan; Kumar, Rohini; Wanders, Niko; Marx, Andreas; Pan, Ming; Rakovec, Oldrich; Samaniego, Luis; Sheffield, Justin; Wood, Eric F.; Zink, Matthias

    2018-01-01

    Severe river floods often result in huge economic losses and fatalities. Since 1980, almost 1500 such events have been reported in Europe. This study investigates climate change impacts on European floods under 1.5, 2, and 3 K global warming. The impacts are assessed employing a multi-model ensemble containing three hydrologic models (HMs: mHM, Noah-MP, PCR-GLOBWB) forced by five CMIP5 general circulation models (GCMs) under three Representative Concentration Pathways (RCPs 2.6, 6.0, and 8.5). This multi-model ensemble is unprecedented with respect to the combination of its size (45 realisations) and its spatial resolution, which is 5 km over the entirety of Europe. Climate change impacts are quantified for high flows and flood events, represented by 10% exceedance probability and annual maxima of daily streamflow, respectively. The multi-model ensemble points to the Mediterranean region as a hotspot of changes, with significant decrements in high flows from -11% at 1.5 K up to -30% at 3 K global warming, mainly resulting from reduced precipitation.
Small changes (< ±10%) are observed for river basins in Central Europe and the British Isles under different levels of warming. Projected higher annual precipitation increases high flows in Scandinavia, but reduced snow melt equivalent decreases flood events in this region. Neglecting uncertainties originating from internal climate variability, downscaling technique, and hydrologic model parameters, the contribution by the GCMs to the overall uncertainties of the ensemble is in general higher than that by the HMs. The latter, however, have a substantial share in the Mediterranean and Scandinavia. Adaptation measures for limiting the impacts of global warming could be similar under 1.5 K and 2 K global warming, but have to account for significantly higher changes under 3 K global warming.

  8. Sensitivity of worst-case storm surge considering influence of climate change

    NASA Astrophysics Data System (ADS)

    Takayabu, Izuru; Hibino, Kenshi; Sasaki, Hidetaka; Shiogama, Hideo; Mori, Nobuhito; Shibutani, Yoko; Takemi, Tetsuya

    2016-04-01

    There are two standpoints when assessing risk caused by climate change. One is disaster prevention, for which we need probabilistic information on meteorological elements from a sufficiently large number of ensemble simulations. The other is disaster mitigation, for which we have to use a very high resolution, sophisticated model to represent a worst-case event in detail. If we could use enough computing resources to drive many ensemble runs with a very high resolution model, we could address all of these aims at once.
However, resources are unfortunately limited in most cases, and we have to choose between resolution and the number of simulations when designing the experiment. Applying the PGWD (Pseudo Global Warming Downscaling) method is one solution for analyzing a worst-case event in detail. Here we introduce an example of finding the climate change influence on a worst-case storm surge by applying PGWD to super typhoon Haiyan (Takayabu et al., 2015). A 1 km grid WRF model could represent both the intensity and structure of a super typhoon. By adopting the PGWD method, we can only estimate the influence of climate change on the development process of the typhoon; the changes in genesis could not be estimated. Finally, we drove the SU-WAT model (which includes a shallow water equation model) to obtain the signal of storm surge height. The result indicates that the height of the storm surge increased by up to 20% owing to 150 years of climate change.

  9. Climate change indices for Greenland applied directly for other arctic regions - Enhanced and utilized climate information from one high resolution RCM downscaling for Greenland evaluated through pattern scaling and CMIP5

    NASA Astrophysics Data System (ADS)

    Olesen, M.; Christensen, J. H.; Boberg, F.

    2016-12-01

    Climate change affects the Greenlandic society both advantageously and disadvantageously.
Changes in temperature and precipitation patterns may result in changes in a number of derived society-related climate indices, such as the length of the growing season, the number of annual dry days, or a combination of the two: indices of substantial importance to society in a climate adaptation context. Detailed climate indices require high-resolution downscaling. We have carried out a very high resolution (5 km) simulation with the regional climate model HIRHAM5, forced by the global model EC-Earth. Evaluation of RCM output is usually done with an ensemble of downscaled output from multiple RCMs and GCMs. Here we have introduced and tested a new technique: a translation of the robustness of an ensemble of GCM models from CMIP5 into the specific index from the HIRHAM5 downscaling, through a correlation between absolute temperatures and the corresponding index values from the HIRHAM5 output. The procedure is conducted in two steps. First, the correlation between temperature and a given index for the HIRHAM5 simulation is identified by a best fit to a second-order polynomial. Second, the standard deviation from the CMIP5 simulations is introduced to show the corresponding standard deviation of the index from the HIRHAM5 run.
It will then be possible to evaluate the change of specific climate indices due to global warming elsewhere, corresponding to the change in absolute temperature. Results based on selected indices, with focus on the future climate in Greenland calculated for the rcp4.5 and rcp8.5 scenarios, will be presented.

  10. ClimEx - Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec

    NASA Astrophysics Data System (ADS)

    Ludwig, Ralf; Baese, Frank; Braun, Marco; Brietzke, Gilbert; Brissette, Francois; Frigon, Anne; Giguère, Michel; Komischke, Holger; Kranzlmueller, Dieter; Leduc, Martin; Martel, Jean-Luc; Ricard, Simon; Schmid, Josef; von Trentini, Fabian; Turcotte, Richard; Weismueller, Jens; Willkofer, Florian; Wood, Raul

    2017-04-01

    The recent accumulation of extreme hydrological events in Bavaria and Québec has stimulated scientific and also societal interest. In addition to the challenges of an improved prediction of such situations and the implications for the associated risk management, there is, as yet, no confirmed knowledge of whether and how climate change contributes to the magnitude and frequency of hydrological extreme events, and how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec.
High Performance Computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution, transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) Within the climate module, a large ensemble of high-resolution (12 km) CRCM5 RCM data has been produced for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to support the assessment of both natural climate variability and climate change. The large ensemble, with several thousand model years, provides the potential to capture rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec at high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analysis of hydrological extreme events based on the inputs from the large climate model dataset. The specific data situation enables the establishment of a new method for 'virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data that reveal preferential triggers of hydrological extreme events. The presentation will highlight first results from the analysis of the large-scale ClimEx model ensemble, showing the current and future ratio of natural variability and climate change impacts on meteorological extreme events. Selected data from the ensemble are used to drive a hydrological model experiment to illustrate the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change. 
[The authors acknowledge funding for the project from the Bavarian State Ministry for the Environment and Consumer Protection].</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ClDy...49..753T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ClDy...49..753T"><span>Intercomparison and validation of the mixed layer depth fields of global ocean syntheses</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Toyoda, Takahiro; Fujii, Yosuke; Kuragano, Tsurane; Kamachi, Masafumi; Ishikawa, Yoichi; Masuda, Shuhei; Sato, Kanako; Awaji, Toshiyuki; Hernandez, Fabrice; Ferry, Nicolas; Guinehut, Stéphanie; Martin, Matthew J.; Peterson, K. Andrew; Good, Simon A.; Valdivieso, Maria; Haines, Keith; Storto, Andrea; Masina, Simona; Köhl, Armin; Zuo, Hao; Balmaseda, Magdalena; Yin, Yonghong; Shi, Li; Alves, Oscar; Smith, Gregory; Chang, You-Soon; Vernieres, Guillaume; Wang, Xiaochun; Forget, Gael; Heimbach, Patrick; Wang, Ou; Fukumori, Ichiro; Lee, Tong</p> <p>2017-08-01</p> <p>Intercomparison and evaluation of the global ocean surface mixed layer depth (MLD) fields estimated from a suite of major ocean syntheses are conducted. Compared with the reference MLDs calculated from individual profiles, MLDs calculated from monthly mean and gridded profiles show negative biases of 10-20 m in early spring related to the re-stratification process of relatively deep mixed layers. Vertical resolution of profiles also influences the MLD estimation. MLDs are underestimated by approximately 5-7 (14-16) m with the vertical resolution of 25 (50) m when the criterion of potential density exceeding the 10-m value by 0.03 kg m-3 is used for the MLD estimation. Using the larger criterion (0.125 kg m-3) generally reduces the underestimations. 
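The density-threshold MLD criterion described above (potential density exceeding its 10-m value by 0.03 kg m-3) can be sketched on a synthetic profile; the interpolation scheme and the illustrative profile values below are assumptions, not from the ocean syntheses being compared.

```python
# Hedged sketch of the density-threshold MLD criterion: the mixed layer
# depth is the first depth below the 10-m reference level at which
# potential density exceeds the 10-m value by the threshold.

def mld_density_threshold(depths, sigma, threshold=0.03, ref_depth=10.0):
    """Return the MLD by linear interpolation between bracketing levels."""
    # Reference density interpolated to the 10-m level.
    sigma_ref = None
    for i in range(len(depths) - 1):
        z0, z1 = depths[i], depths[i + 1]
        if z0 <= ref_depth <= z1:
            w = (ref_depth - z0) / (z1 - z0)
            sigma_ref = sigma[i] + w * (sigma[i + 1] - sigma[i])
            break
    if sigma_ref is None:
        raise ValueError("profile does not span the reference depth")
    # First crossing of sigma_ref + threshold below the reference level.
    target = sigma_ref + threshold
    for i in range(len(depths) - 1):
        z0, z1, s0, s1 = depths[i], depths[i + 1], sigma[i], sigma[i + 1]
        if z1 > ref_depth and s0 < target <= s1:
            return z0 + (target - s0) / (s1 - s0) * (z1 - z0)
    return float(depths[-1])  # mixed to the bottom of the profile

# Synthetic profile: uniform density to 50 m, then weakly stratified.
depths = [0, 10, 20, 30, 40, 50, 60, 70]
sigma = [25.00, 25.00, 25.00, 25.00, 25.00, 25.00, 25.06, 25.12]
mld = mld_density_threshold(depths, sigma)   # crossing lies between 50 and 60 m
```

Coarsening the vertical grid shifts the detected crossing downward in density but upward in resolved detail, which is one way to picture the resolution-dependent underestimation quoted in the abstract.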
In addition, positive biases greater than 100 m are found in wintertime subpolar regions when MLD criteria based on temperature are used. Biases of the reanalyses are due to both model errors and errors related to differences between the assimilation methods. The result shows that these errors are partially cancelled out through the ensemble averaging. Moreover, the bias in the ensemble mean field of the reanalyses is smaller than in the observation-only analyses. This is largely attributed to comparably higher resolutions of the reanalyses. The robust reproduction of both the seasonal cycle and interannual variability by the ensemble mean of the reanalyses indicates a great potential of the ensemble mean MLD field for investigating and monitoring upper ocean processes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ClDy..tmp..791D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ClDy..tmp..791D"><span>Downscaling RCP8.5 daily temperatures and precipitation in Ontario using localized ensemble optimal interpolation (EnOI) and bias correction</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Deng, Ziwang; Liu, Jinliang; Qiu, Xin; Zhou, Xiaolan; Zhu, Huaiping</p> <p>2017-10-01</p> <p>A novel method for daily temperature and precipitation downscaling is proposed in this study which combines the Ensemble Optimal Interpolation (EnOI) and bias correction techniques. For downscaling temperature, the day to day seasonal cycle of high resolution temperature of the NCEP climate forecast system reanalysis (CFSR) is used as background state. An enlarged ensemble of daily temperature anomaly relative to this seasonal cycle and information from global climate models (GCMs) are used to construct a gain matrix for each calendar day. 
The relationship between large-scale and local-scale processes represented by the gain matrix therefore changes from day to day. The gain matrix contains information on the realistic spatial correlation of temperature between different CFSR grid points, between CFSR grid points and GCM grid points, and between different GCM grid points. This downscaling method therefore preserves spatial consistency and reflects the interaction between local geographic and atmospheric conditions. Maximum and minimum temperatures are downscaled using the same method. For precipitation, because of its non-Gaussianity, a logarithmic transformation is applied to daily total precipitation prior to downscaling. Cross validation and independent data validation are used to evaluate this algorithm. Finally, data from a 29-member ensemble of phase 5 of the Coupled Model Intercomparison Project (CMIP5) GCMs are downscaled to CFSR grid points in Ontario for the period from 1981 to 2100. The results show that this method is capable of generating high-resolution details without changing large-scale characteristics. It yields much lower absolute errors in local-scale details at most grid points than simple spatial downscaling methods. Biases in the downscaled data inherited from GCMs are corrected with a linear method for temperatures and distribution mapping for precipitation. The downscaled ensemble projects significant warming, with amplitudes of 3.9 and 6.5 °C for the 2050s and 2080s relative to the 1990s in Ontario, respectively. Cooling degree days and hot days will significantly increase over southern Ontario, and heating degree days and cold days will significantly decrease in northern Ontario. Annual total precipitation will increase over Ontario, and heavy precipitation events will increase as well. 
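Two of the precipitation ingredients mentioned above can be sketched with synthetic data: the logarithmic transform applied before the analysis step, and an empirical distribution (quantile) mapping that corrects model precipitation against observations. The offset `eps` and all sample values are illustrative assumptions, not the study's actual configuration.

```python
import math

def log_transform(p, eps=0.1):
    """log(p + eps); eps keeps zero totals finite (an assumed tuning choice)."""
    return math.log(p + eps)

def inv_log_transform(x, eps=0.1):
    return math.exp(x) - eps

def quantile_map(value, model_sample, obs_sample):
    """Map a model value to the observed value at the same empirical quantile."""
    ms, os = sorted(model_sample), sorted(obs_sample)
    rank = sum(1 for m in ms if m <= value)            # empirical CDF count
    idx = min(rank * len(os) // len(ms), len(os) - 1)  # same quantile in obs
    return os[idx]

# Synthetic calibration samples: the model drizzles too often and too
# weakly, while the observed distribution has a heavier tail.
model = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5]
obs = [0.0, 0.0, 0.5, 1.0, 2.0, 3.0, 4.5, 6.0, 8.0, 12.0]
corrected = quantile_map(3.0, model, obs)     # model 3.0 sits at the 0.7 quantile
round_trip = inv_log_transform(log_transform(2.0))
```

Distribution mapping preserves each value's rank in the model climatology while replacing its magnitude with the observed magnitude at that rank, which is why it corrects both mean bias and tail behaviour.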
These results are consistent with the conclusions of many other studies in the literature.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A13N..05H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A13N..05H"><span>Low-wave number analysis of observations and ensemble forecasts to develop metrics for the selection of most realistic members to study multi-scale interactions between the environment and the convective organization of hurricanes: Focus on Rapid Intensification</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hristova-Veleva, S. M.; Chen, H.; Gopalakrishnan, S.; Haddad, Z. S.</p> <p>2017-12-01</p> <p>Tropical cyclones (TCs) are the product of complex multi-scale processes and interactions. The role of the environment has long been recognized. However, recent research has shown that convective-scale processes in the hurricane core might also play a crucial role in determining TC intensity and size. Several studies have linked Rapid Intensification to the characteristics of the convective clouds (shallow versus deep), their organization (isolated versus widespread) and their location with respect to dynamical controls (the vertical shear, the radius of maximum wind). A third set of controls reflects the interaction between storm-scale and large-scale processes. Our goal is to use observations and models to advance the still-incomplete understanding of these processes. Recently, hurricane models have improved significantly. However, deterministic forecasts have limitations due to uncertainty in the representation of physical processes and initial conditions. A crucial step forward is the use of high-resolution ensembles. 
We adopt the following approach: i) generate a high-resolution ensemble forecast using HWRF; ii) produce synthetic data (e.g. brightness temperature) from the model fields for direct comparison to satellite observations; iii) develop metrics that allow us to sub-select the realistic members of the ensemble, based on objective measures of the similarity between observed and forecasted structures; iv) for these most-realistic members, determine the skill in forecasting TCs to provide "guidance on guidance"; v) use the members with the best predictive skill to untangle the complex multi-scale interactions. We will report on the first three goals of our research, using forecasts and observations of hurricane Edouard (2014), focusing on rapid intensification (RI). We will focus on describing the metrics for the selection of the most appropriate ensemble members, based on applying low-wave number analysis (WNA - Hristova-Veleva et al., 2016) to the observed and forecasted 2D fields to develop objective criteria for consistency. We investigate the WNA cartoons of environmental moisture, precipitation structure and surface convergence. 
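A low-wavenumber analysis of a storm-centred 2D field can be sketched as follows: sample the field on a ring around the storm centre, Fourier-decompose in azimuth, and retain only the low wavenumbers. This is a generic illustration of the idea with a synthetic ring signal, not the specific WNA implementation of Hristova-Veleva et al. (2016).

```python
import math

def azimuthal_coefficients(ring_values, max_wavenumber=1):
    """Discrete Fourier coefficients (a_k, b_k) of values sampled on a ring."""
    n = len(ring_values)
    coeffs = {}
    for k in range(max_wavenumber + 1):
        a = sum(v * math.cos(2 * math.pi * k * j / n)
                for j, v in enumerate(ring_values)) * (2 / n)
        b = sum(v * math.sin(2 * math.pi * k * j / n)
                for j, v in enumerate(ring_values)) * (2 / n)
        coeffs[k] = (a, b)
    return coeffs

def low_wavenumber_reconstruction(coeffs, n):
    """Rebuild the ring signal from the retained wavenumbers only."""
    out = []
    for j in range(n):
        v = coeffs[0][0] / 2  # wavenumber 0: the azimuthal mean
        for k, (a, b) in coeffs.items():
            if k == 0:
                continue
            ang = 2 * math.pi * k * j / n
            v += a * math.cos(ang) + b * math.sin(ang)
        out.append(v)
    return out

# Synthetic ring: azimuthal mean 5, a wavenumber-1 asymmetry of amplitude 2,
# plus a wavenumber-4 wiggle that the truncation should discard.
n = 64
ring = [5 + 2 * math.cos(2 * math.pi * j / n) + 0.7 * math.cos(8 * math.pi * j / n)
        for j in range(n)]
coeffs = azimuthal_coefficients(ring, max_wavenumber=1)
smooth = low_wavenumber_reconstruction(coeffs, n)
```

Comparing the retained wavenumber-0 and wavenumber-1 structure between observed and forecast fields is one objective way to score the similarity of convective organization while ignoring small-scale noise.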
We will present the preliminary selection of the most skillful members and outline our future goals: analyzing the multi-scale interactions using these members.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1233231-diagnosing-isopycnal-diffusivity-eddying-idealized-midlatitude-ocean-basin-via-lagrangian-situ-global-high-performance-particle-tracking-light','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1233231-diagnosing-isopycnal-diffusivity-eddying-idealized-midlatitude-ocean-basin-via-lagrangian-situ-global-high-performance-particle-tracking-light"><span>Diagnosing isopycnal diffusivity in an eddying, idealized midlatitude ocean basin via Lagrangian, in Situ, Global, High-Performance Particle Tracking (LIGHT)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Wolfram, Phillip J.; Ringler, Todd D.; Maltrud, Mathew E.; ...</p> <p>2015-08-01</p> <p>Isopycnal diffusivity due to stirring by mesoscale eddies in an idealized, wind-forced, eddying, midlatitude ocean basin is computed using Lagrangian, in Situ, Global, High-Performance Particle Tracking (LIGHT). The simulation is performed via LIGHT within the Model for Prediction across Scales Ocean (MPAS-O). Simulations are performed at 4-, 8-, 16-, and 32-km resolution, where the first Rossby radius of deformation (RRD) is approximately 30 km. Scalar and tensor diffusivities are estimated at each resolution based on 30 ensemble members using particle cluster statistics. Each ensemble member is composed of 303 665 particles distributed across five potential density surfaces. Diffusivity dependence upon model resolution, velocity spatial scale, and buoyancy surface is quantified and compared with mixing length theory. 
The spatial structure of diffusivity ranges over approximately two orders of magnitude, with values of O(10^5) m^2 s^-1 in the region of western boundary current separation to O(10^3) m^2 s^-1 in the eastern region of the basin. Dominant mixing occurs at scales twice the size of the first RRD. Model resolution at scales finer than the RRD is necessary to obtain sufficient model fidelity at scales between one and four RRD to accurately represent mixing. Mixing length scaling with eddy kinetic energy and the Lagrangian time scale yields mixing efficiencies that typically range between 0.4 and 0.8. In conclusion, a reduced mixing length in the eastern region of the domain relative to the west suggests there are different mixing regimes outside the baroclinic jet region.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A21F2211K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A21F2211K"><span>Can decadal climate predictions be improved by ocean ensemble dispersion filtering?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kadow, C.; Illing, S.; Kröner, I.; Ulbrich, U.; Cubasch, U.</p> <p>2017-12-01</p> <p>Decadal predictions by Earth system models aim to capture the state and phase of the climate several years in advance. Atmosphere-ocean interaction plays an important role for such climate forecasts. While short-term weather forecasts represent an initial value problem and long-term climate projections represent a boundary condition problem, decadal climate prediction falls in between these two time scales. The ocean memory due to its heat capacity holds large potential skill on the decadal scale. In recent years, more precise initialization techniques of coupled Earth system models (incl. 
atmosphere and ocean) have improved decadal predictions. Ensembles are another important aspect. Applying slightly perturbed predictions results in an ensemble. Using and evaluating the whole ensemble or its ensemble average, instead of a single prediction, improves a prediction system. However, climate models in general start losing the initialized signal and its predictive skill from one forecast year to the next. Here we show that the climate prediction skill of an Earth system model can be improved by a shift of the ocean state toward the ensemble mean of its individual members at seasonal intervals. We found that this procedure, called the ensemble dispersion filter, yields more accurate results than the standard decadal prediction. Global mean and regional temperature, precipitation, and winter cyclone predictions show an increased skill up to 5 years ahead. Furthermore, the novel technique outperforms predictions with larger ensembles and higher resolution. Our results demonstrate how decadal climate predictions benefit from ocean ensemble dispersion filtering toward the ensemble mean. 
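The ensemble dispersion filter idea can be sketched with a toy scalar "ocean state" per member: at fixed intervals, every member is shifted onto the current ensemble mean before integration continues. The damped-drift dynamics, the noise level, and the full (rather than partial) shift toward the mean are all illustrative assumptions, not the MPI-ESM implementation.

```python
import random

random.seed(2)

def step(state):
    """Toy member dynamics: damped drift plus member-dependent noise."""
    return 0.9 * state + random.gauss(0.0, 0.1)

def run(members, n_steps, filter_interval=None):
    for t in range(1, n_steps + 1):
        members = [step(m) for m in members]
        if filter_interval and t % filter_interval == 0:
            # Ensemble dispersion filter: shift members onto the ensemble mean.
            mean = sum(members) / len(members)
            members = [mean] * len(members)
    return members

def spread(ms):
    mu = sum(ms) / len(ms)
    return (sum((m - mu) ** 2 for m in ms) / len(ms)) ** 0.5

initial = [random.gauss(1.0, 0.5) for _ in range(10)]
free = run(list(initial), 20)
filtered = run(list(initial), 20, filter_interval=5)
```

The filtered run keeps the members clustered around their common signal instead of letting member-specific noise disperse them, which is the mechanism the abstract credits for the improved skill.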
This study is part of MiKlip (fona-miklip.de) - a major project on decadal climate prediction in Germany. We focus on the Max-Planck-Institute Earth System Model in its low-resolution version (MPI-ESM-LR) and MiKlip's basic initialization strategy, as in the decadal climate forecast published in 2017: http://www.fona-miklip.de/decadal-forecast-2017-2026/decadal-forecast-for-2017-2026/ More information about this study is available in JAMES: DOI 10.1002/2016MS000787</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..16.7139S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..16.7139S"><span>Shallow cumuli ensemble statistics for development of a stochastic parameterization</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sakradzija, Mirjana; Seifert, Axel; Heus, Thijs</p> <p>2014-05-01</p> <p>According to the conventional deterministic approach to the parameterization of moist convection in numerical atmospheric models, a given large-scale forcing produces a unique response from the unresolved convective processes. This representation leaves out the small-scale variability of convection: as is known from empirical studies of deep and shallow convective cloud ensembles, there is a whole distribution of sub-grid states corresponding to a given large-scale forcing. Moreover, this distribution gets broader with increasing model resolution. This behavior is also consistent with our theoretical understanding of a coarse-grained nonlinear system. We propose an approach to represent the variability of the unresolved shallow-convective states, including the dependence of the spread and shape of the sub-grid state distribution on the model horizontal resolution. 
Starting from the Gibbs canonical ensemble theory, Craig and Cohen (2006) developed a theory for the fluctuations in a deep convective ensemble. The micro-states of a deep convective cloud ensemble are characterized by the cloud-base mass flux, which, according to the theory, is exponentially distributed (Boltzmann distribution). Following their work, we study the shallow cumulus ensemble statistics and the distribution of the cloud-base mass flux. We employ a Large-Eddy Simulation model (LES) and a cloud tracking algorithm, followed by conditional sampling of clouds at the cloud-base level, to retrieve information about the individual cloud life cycles and the cloud ensemble as a whole. In the case of a shallow cumulus cloud ensemble, the distribution of micro-states is a generalized exponential distribution. Based on the empirical and theoretical findings, a stochastic model has been developed to simulate the shallow convective cloud ensemble and to test the convective ensemble theory. The stochastic model simulates a compound random process, with the number of convective elements drawn from a Poisson distribution and cloud properties sub-sampled from a generalized ensemble distribution. We study the role of the different cloud subtypes in a shallow convective ensemble and how the diverse cloud properties and cloud lifetimes affect the system macro-state. To what extent does the cloud-base mass flux distribution deviate from the simple Boltzmann distribution, and how does this affect the results from the stochastic model? Is the memory provided by the finite lifetime of individual clouds of importance for the ensemble statistics? We also test for the minimal information that, given as input to the stochastic model, is able to reproduce the ensemble-mean statistics and the variability of a convective ensemble. 
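The compound random process described above can be sketched directly: the number of clouds in a grid box is Poisson-distributed, and each cloud's mass flux is drawn from an exponential (Boltzmann-like) distribution. The parameter values below are illustrative, not fitted to the LES data of the study, and a plain exponential stands in for the generalized distribution.

```python
import math
import random

random.seed(0)

def poisson(lam):
    """Knuth's inversion algorithm for a Poisson draw (fine for moderate lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

def sample_subgrid_mass_flux(mean_n_clouds, mean_flux_per_cloud):
    """One realisation of the total cloud-base mass flux in a grid box."""
    n = poisson(mean_n_clouds)
    return sum(random.expovariate(1.0 / mean_flux_per_cloud) for _ in range(n))

# Scale-adaptivity: a smaller grid box holds fewer clouds on average, so the
# compound distribution of the total flux is relatively broader.
draws_large = [sample_subgrid_mass_flux(100, 0.01) for _ in range(4000)]
draws_small = [sample_subgrid_mass_flux(10, 0.01) for _ in range(4000)]

def rel_std(xs):
    mu = sum(xs) / len(xs)
    return (sum((x - mu) ** 2 for x in xs) / len(xs)) ** 0.5 / mu
```

For a compound Poisson process with exponential increments, the relative spread scales like 1/sqrt(mean cloud number), so shrinking the grid box by a factor of ten roughly triples the relative variability of the sub-grid flux.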
An important property of the resulting distribution of the sub-grid convective states is its scale-adaptivity - the smaller the grid size, the broader the compound distribution of the sub-grid states.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.H53N..03O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.H53N..03O"><span>Evaluating a Local Ensemble Transform Kalman Filter snow cover data assimilation method to estimate SWE within a high-resolution hydrologic modeling framework across Western US mountainous regions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Oaida, C. M.; Andreadis, K.; Reager, J. T., II; Famiglietti, J. S.; Levoe, S.</p> <p>2017-12-01</p> <p>Accurately estimating how much snow water equivalent (SWE) is stored in mountainous regions characterized by complex terrain and snowmelt-driven hydrologic cycles is not only greatly desirable but also a major challenge. Mountain snowpack exhibits high spatial variability across a broad range of spatial and temporal scales due to a multitude of physical and climatic factors, making it difficult to observe or estimate in its entirety. Combining remotely sensed data with high-resolution hydrologic modeling through data assimilation (DA) has the potential to provide a spatially and temporally continuous SWE dataset at horizontal scales that capture sub-grid snow spatial variability and are also relevant to stakeholders such as water resource managers. Here, we present the evaluation of a new snow DA approach that uses a Local Ensemble Transform Kalman Filter (LETKF) in tandem with the Variable Infiltration Capacity macro-scale hydrologic model across the Western United States, at a daily temporal resolution and a horizontal resolution of 1.75 km x 1.75 km. 
The LETKF is chosen for its relative simplicity, ease of implementation, and computational efficiency and scalability. The modeling/DA system assimilates daily MODIS Snow Covered Area and Grain Size (MODSCAG) fractional snow cover, and has been developed to efficiently calculate SWE estimates over extended periods of time and over large regional-scale areas at relatively high spatial resolution, ultimately producing a snow reanalysis-type dataset. Here we focus on the assessment of SWE produced by the DA scheme over several basins in California's Sierra Nevada Mountain range where Airborne Snow Observatory data are available, during the last five water years (2013-2017), which include both one of the driest and one of the wettest years. Comparison against such a spatially distributed SWE observational product provides a greater understanding of the model's ability to estimate SWE and SWE spatial variability, and highlights under which conditions snow cover DA can add value in estimating SWE.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1912661M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1912661M"><span>Blending of Radial HF Radar Surface Current and Model Using ETKF Scheme For The Sunda Strait</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mujiasih, Subekti; Riyadi, Mochammad; Wandono, Dr; Wayan Suardana, I.; Nyoman Gede Wiryajaya, I.; Nyoman Suarsa, I.; Hartanto, Dwi; Barth, Alexander; Beckers, Jean-Marie</p> <p>2017-04-01</p> <p>A preliminary study of surface-current data blending for the Sunda Strait, Indonesia, has been carried out using the analysis scheme of the Ensemble Transform Kalman Filter (ETKF). 
The method is used to combine radial velocities from HF radar with the u and v velocity components from the global Copernicus Marine Environment Monitoring Service (CMEMS) model. The initial ensemble is based on the time variability of the CMEMS model result. The data tested are from two CODAR SeaSonde radar sites in the Sunda Strait on two dates, 09 September 2013 and 08 February 2016, at 12.00 UTC. The radial HF radar data have an hourly temporal resolution, a 20-60 km spatial range, a 3 km range resolution, a 5-degree angular resolution, and an 11.5-14 MHz frequency range. The u and v components of the model velocity represent a daily mean with 1/12 degree spatial resolution. The radial data from one HF radar site are analyzed and the result is compared to the equivalent radial velocity from CMEMS for the second HF radar site. Errors are quantified by the root mean squared error (RMSE). The ensemble analysis and ensemble mean are calculated using the Sangoma software package. The tested observation error covariance matrix R is diagonal, with diagonal elements equal to 0.05, 0.5 or 1.0 m2/s2. The initial ensemble members come from a model simulation spanning a month (September 2013 or February 2016), one year (2013) or four years (2013-2016). The spatial distribution of the radial current is analyzed and the RMSE values obtained from the independent HF radar station are optimized. It was verified that the analysis reproduces well the structure contained in the analyzed HF radar data. More importantly, the analysis was also improved relative to the second, independent HF radar site: the RMSE of the improved analysis is lower than that of the first HF radar site analysis. The best result of the blending exercise was obtained for an observation error variance equal to 0.05 m2/s2. This study is still a preliminary step, but it gives promising results for larger datasets, for combining other models, and for further development. 
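The ensemble analysis step underlying such a blending can be sketched in its simplest form: the ensemble supplies the background covariance, and a single radial-velocity observation with error variance R updates the state. This is the generic ensemble Kalman update for one scalar observation, not the exact Sangoma ETKF implementation; the beam geometry and all numbers are synthetic.

```python
import math

def ensemble_update(ensemble, obs, obs_operator, R):
    """Return the analysis mean after assimilating one scalar observation."""
    n = len(ensemble)
    dim = len(ensemble[0])
    mean = [sum(m[i] for m in ensemble) / n for i in range(dim)]
    hx = [obs_operator(m) for m in ensemble]          # model equivalents of the obs
    hx_mean = sum(hx) / n
    # Ensemble covariance between each state component and the projected obs.
    cov_xy = [sum((m[i] - mean[i]) * (h - hx_mean)
                  for m, h in zip(ensemble, hx)) / (n - 1) for i in range(dim)]
    var_y = sum((h - hx_mean) ** 2 for h in hx) / (n - 1)
    gain = [c / (var_y + R) for c in cov_xy]          # Kalman gain
    return [m + g * (obs - hx_mean) for m, g in zip(mean, gain)]

# Two-component current state (u, v); the "radar" observes the radial
# component along a 45-degree beam; the ensemble spans model time variability.
beam = (math.cos(math.pi / 4), math.sin(math.pi / 4))
H = lambda state: beam[0] * state[0] + beam[1] * state[1]
ensemble = [[0.3, 0.1], [0.5, 0.2], [0.4, 0.0], [0.6, 0.3], [0.2, 0.15]]
analysis = ensemble_update(ensemble, obs=0.6, obs_operator=H, R=0.05)
```

Even though only the radial component is observed, the ensemble cross-covariances spread the innovation to both velocity components, which is how the blending can improve the current field at the second, unassimilated radar site.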
Keywords: HF Radar, Sunda Strait, ETKF, CMEMS</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1712188O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1712188O"><span>The total probabilities from high-resolution ensemble forecasting of floods</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian</p> <p>2015-04-01</p> <p>Ensemble forecasting has long been used in meteorological modelling to give an indication of the uncertainty of the forecasts. As meteorological ensemble forecasts often show bias and dispersion errors, there is a need for calibration and post-processing of the ensembles. Typical methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). To make optimal predictions of floods along the stream network in hydrology, we can easily use the ensemble members as input to the hydrological models. However, some of the post-processing methods will need modifications when regionalizing the forecasts outside the calibration locations, as done by Hemri et al. (2013). We present a method for spatial regionalization of the post-processed forecasts based on EMOS and top-kriging (Skøien et al., 2006). We will also look into different methods for handling the non-normality of runoff and the effect on forecast skill in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. 
Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005. Skøien, J. O., Merz, R. and Blöschl, G.: Top-kriging - Geostatistics on stream networks, Hydrol. Earth Syst. Sci., 10(2), 277-287, 2006.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..16.6575B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..16.6575B"><span>Bayesian quantitative precipitation forecasts in terms of quantiles</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bentzien, Sabrina; Friederichs, Petra</p> <p>2014-05-01</p> <p>Ensemble prediction systems (EPS) for numerical weather predictions on the mesoscale are developed particularly to obtain probabilistic guidance for high-impact weather. An EPS issues not only a deterministic future state of the atmosphere but a sample of possible future states. Ensemble postprocessing then translates such a sample of forecasts into probabilistic measures. This study focuses on probabilistic quantitative precipitation forecasts in terms of quantiles. Quantiles are particularly suitable for describing precipitation at various locations, since no assumption is required on the distribution of precipitation. 
The focus is on prediction during high-impact events, related to the project WEX-MOP (Mesoscale Weather Extremes - Theory, Spatial Modeling and Prediction) funded by the Volkswagen Stiftung. Quantile forecasts are derived from the raw ensemble and via quantile regression. The neighborhood method and time-lagging are effective tools to inexpensively increase the ensemble spread, which results in more reliable forecasts, especially for extreme precipitation events. Since an EPS provides a large number of potentially informative predictors, a variable selection is required in order to obtain a stable statistical model. A Bayesian formulation of quantile regression allows for inference about the selection of predictive covariates through the use of appropriate prior distributions. Moreover, the implementation of an additional process layer for the regression parameters accounts for spatial variations of the parameters. Bayesian quantile regression and its spatially adaptive extension are illustrated for the German-focused mesoscale weather prediction ensemble COSMO-DE-EPS, which has run (pre)operationally since December 2010 at the German Meteorological Service (DWD). Objective out-of-sample verification uses the quantile score (QS), a weighted absolute error between quantile forecasts and observations. The QS is a proper scoring function and can be decomposed into reliability, resolution and uncertainty parts. 
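The quantile score used for this verification is the standard pinball loss; a minimal sketch with synthetic forecast/observation values:

```python
def quantile_score(q_forecast, obs, tau):
    """Pinball loss: the weighted absolute error that is proper for the tau-quantile."""
    u = obs - q_forecast
    return u * tau if u >= 0 else u * (tau - 1)

# For a high quantile (tau = 0.9), under-prediction costs tau per unit while
# over-prediction costs only (1 - tau) per unit, so misses below the
# observation are penalized much harder.
qs_under = quantile_score(5.0, 10.0, 0.9)   # forecast below the observation
qs_over = quantile_score(15.0, 10.0, 0.9)   # forecast above the observation

# Averaging the score over many cases gives the verification statistic.
mean_qs = sum(quantile_score(8.0, y, 0.5) for y in [6.0, 8.0, 10.0]) / 3
```

Because the expected pinball loss is minimized by the true tau-quantile, averaging this score over a verification sample rewards exactly the calibrated quantile forecasts the abstract describes.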
A quantile reliability plot gives detailed insights into the predictive performance of the quantile forecasts.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_8 --> <div id="page_9" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="161"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/10370237','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/10370237"><span>Optical mapping and its potential for large-scale sequencing projects.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Aston, C; Mishra, B; Schwartz, D C</p> <p>1999-07-01</p> <p>Physical mapping has been rediscovered as an important component of large-scale sequencing projects. 
Restriction maps provide landmark sequences at defined intervals, and high-resolution restriction maps can be assembled from ensembles of single molecules by optical means. Such optical maps can be constructed from both large-insert clones and genomic DNA, and are used as a scaffold for accurately aligning sequence contigs generated by shotgun sequencing.

162. A Comparison Between Heliosat-2 and Artificial Neural Network Methods for Global Horizontal Irradiance Retrievals over Desert Environments

    NASA Astrophysics Data System (ADS)

    Ghedira, H.; Eissa, Y.

    2012-12-01

    Global horizontal irradiance (GHI) retrievals at the surface of any given location can be used for preliminary solar resource assessments. For greater accuracy, the direct normal irradiance (DNI) and diffuse horizontal irradiance (DHI) are also required to estimate the global tilt irradiance, mainly used for fixed flat-plate collectors. Two satellite-based models for solar irradiance retrieval have been applied over the desert environment of the United Arab Emirates (UAE). Both models employ channels of the SEVIRI instrument, on board the geostationary satellite Meteosat Second Generation, as their main inputs. The satellite images used in this study have a temporal resolution of 15 min and a spatial resolution of 3 km. The objective of this study is to compare the GHI retrieved using the Heliosat-2 method with that retrieved using an artificial neural network (ANN) ensemble method over the UAE. The high-resolution visible channel of SEVIRI is used in the Heliosat-2 method to derive the cloud index.
The cloud index is then used to compute the cloud transmission, while the cloud-free GHI is computed from the Linke turbidity factor. The product of the cloud transmission and the cloud-free GHI gives the estimated GHI. A consistent underestimation was observed in the estimated GHI over the dataset available for the UAE; therefore, the cloud-free DHI equation in the model was recalibrated to remove the bias. After recalibration, results over the UAE show a root mean square error (RMSE) of 10.1% and a mean bias error (MBE) of -0.5%. As for the ANN approach, six thermal channels of SEVIRI were used to estimate the DHI and the total optical depth of the atmosphere (δ). An ensemble approach is employed to obtain better generalization, as opposed to using a single weak network. The DNI is then computed from the estimated δ using the Beer-Bouguer-Lambert law, and the GHI is computed from the DNI and DHI estimates. The RMSE for the estimated GHI over an independent dataset for the UAE is 7.2% and the MBE is +1.9%. These results show that both the recalibrated Heliosat-2 and the ANN ensemble methods estimate the GHI at 15-min resolution with high accuracy. The advantage of the ANN ensemble approach is that it derives the GHI from accurate DNI and DHI estimates, which are valuable when computing the global tilt irradiance.
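The two relations the ANN pipeline relies on, the Beer-Bouguer-Lambert attenuation of the direct beam and the closure of GHI from DNI and DHI, can be sketched as follows (a simplified plane-parallel air mass is assumed; the paper's exact formulation may differ):

```python
import math

def dni_from_optical_depth(e0, delta, solar_zenith_deg):
    """Beer-Bouguer-Lambert law: direct beam attenuated along the slant path.

    Sketch under simplifying assumptions: e0 is the extraterrestrial normal
    irradiance and the air mass is approximated as 1 / cos(theta_z).
    """
    air_mass = 1.0 / math.cos(math.radians(solar_zenith_deg))
    return e0 * math.exp(-delta * air_mass)

def ghi_from_components(dni, dhi, solar_zenith_deg):
    # Closure relation: GHI = DNI * cos(theta_z) + DHI
    return dni * math.cos(math.radians(solar_zenith_deg)) + dhi
```

With delta = 0 the beam is unattenuated, and with the sun at zenith the whole beam contributes to the horizontal plane; both limits are easy sanity checks on an implementation.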
Also, accurate DNI estimates are beneficial for preliminary site selection for concentrating solar power plants.

163. Examination of elevation dependency in observed and projected temperature change in the Upper Indus Basin and Western Himalaya

    NASA Astrophysics Data System (ADS)

    Fowler, H. J.; Forsythe, N. D.; Blenkinsop, S.; Archer, D.; Hardy, A.; Janes, T.; Jones, R. G.; Holderness, T.

    2013-12-01

    We present results of two distinct, complementary analyses assessing evidence of elevation dependency in temperature change in the Upper Indus Basin (UIB: Karakoram, Eastern Hindu Kush) and the wider Western Himalaya (WH). The first analysis component examines historical remotely sensed land surface temperature (LST) from the second and third generations of the Advanced Very High Resolution Radiometer (AVHRR/2, AVHRR/3) instrument, flown on NOAA satellite platforms from the mid-1980s through the present day. The high spatial resolution (<4 km) of the AVHRR instrument enables precise consideration of the relationship between estimated LST and surface topography. The LST data product was developed as part of an initiative to produce continuous time series for key remotely sensed spatial products (LST, snow-covered area, cloud cover, NDVI) extending as far back into the historical record as feasible. Context for the AVHRR LST data product is provided by the results of bias assessment and validation against available local observations from both manned and automatic weather stations.
Local observations provide meaningful validation and bias assessment of the vertical gradients found in the AVHRR LST, as the elevation range from the lowest manned meteorological station (at 1460 m asl) to the highest automatic weather station (4733 m asl) covers much of the key range yielding runoff from seasonal snowmelt. Furthermore, the common available record period of these stations (1995 to 2007) enables assessment not only of the AVHRR LST but also performance comparisons with the more recent MODIS LST data product. A range of spatial aggregations (from minor tributary catchments to primary basin headwaters) is performed to assess regional homogeneity and identify potential latitudinal or longitudinal gradients in elevation dependency. The second analysis component investigates elevation dependency, including its uncertainty, in projected temperature change trajectories in the downscaling of a seventeen-member Global Climate Model (GCM) perturbed physics ensemble (PPE) of transient (130-year) simulations using a moderate-resolution (25 km) regional climate model (RCM). The GCM ensemble is the 17-member QUMP (Quantifying Uncertainty in Model Projections) ensemble, and the downscaling is done using HadRM3P, part of the PRECIS regional climate modelling system. Both the RCM and the GCMs were developed by the UK Met Office Hadley Centre and are based on the HadCM3 GCM. Use of the multi-member PPE enables quantification of uncertainty in projected temperature change, while the spatial resolution of the RCM improves insight into the role of elevation in projected rates of change.
Furthermore, comparison with the results of the remote sensing analysis component - considered to provide an 'observed climatology' - permits evaluation of individual ensemble members with regard to biases in spatial gradients in temperature as well as the timing and magnitude of annual cycles.

164. Using a High-Resolution Ensemble Modeling Method to Inform Risk-Based Decision-Making at Taylor Park Dam, Colorado

    NASA Astrophysics Data System (ADS)

    Mueller, M.; Mahoney, K. M.; Holman, K. D.

    2015-12-01

    The Bureau of Reclamation (Reclamation) is responsible for the safety of Taylor Park Dam, located in central Colorado at an elevation of 9300 feet. A key aspect of dam safety is anticipating extreme precipitation, runoff and the associated inflow of water to the reservoir within a probabilistic framework for risk analyses. The Cooperative Institute for Research in Environmental Sciences (CIRES) has partnered with Reclamation to improve understanding and estimation of precipitation in the western United States, including the Taylor Park watershed. A significant challenge is that Taylor Park Dam is located in a relatively data-sparse region, surrounded by mountains exceeding 12,000 feet. To better estimate heavy precipitation events in this basin, a high-resolution modeling approach is used. The Weather Research and Forecasting (WRF) model is employed to simulate events that have produced observed peaks in streamflow at the location of interest.
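Turning an ensemble of such event simulations into a best estimate with uncertainty bounds is, at its simplest, a matter of order statistics over the members. A minimal sketch with invented values (all numbers and percentile choices are illustrative, not from the study):

```python
import numpy as np

# Hypothetical basin-average precipitation totals (mm) from repeated
# high-resolution simulations of a single event; values are invented.
ensemble_precip = np.array([42.0, 55.0, 48.0, 61.0, 50.0, 45.0, 58.0, 52.0])

# Best estimate and an uncertainty band for use in a risk framework.
best_estimate = np.median(ensemble_precip)
low, high = np.percentile(ensemble_precip, [10.0, 90.0])
```

The resulting band, rather than a single deterministic value, is what a rainfall-runoff model can propagate into a probability of reservoir inflows.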
Importantly, an ensemble of model simulations is run for each event so that uncertainty bounds (i.e., forecast error) can be provided, allowing the model outputs to be used more effectively in Reclamation's risk assessment framework. Model estimates of precipitation (and the uncertainty thereof) are then used in rainfall-runoff models to determine the probability of inflows to the reservoir for use in Reclamation's dam safety risk analyses.

165. Seasonal re-emergence of North Atlantic subsurface ocean temperature anomalies and Northern hemisphere climate

    NASA Astrophysics Data System (ADS)

    Sinha, Bablu; Blaker, Adam; Duchez, Aurelie; Grist, Jeremy; Hewitt, Helene; Hirschi, Joel; Hyder, Patrick; Josey, Simon; Maclachlan, Craig; New, Adrian

    2017-04-01

    A high-resolution coupled ocean-atmosphere model is used to study the effects of seasonal re-emergence of North Atlantic subsurface ocean temperature anomalies on Northern Hemisphere winter climate. A 50-member control simulation is integrated from 1 September to 28 February and compared with a similar ensemble with perturbed ocean initial conditions. The perturbation consists of a density-compensated subsurface (deeper than 180 m) temperature anomaly corresponding to the observed subsurface temperature anomaly for September 2010, which is known to have re-emerged at the ocean surface in subsequent months. The perturbation is confined to the North Atlantic Ocean between the Equator and 65 degrees North.
The model has 1/4-degree horizontal resolution in the ocean, and the experiment is repeated for two atmosphere horizontal resolutions (60 km and 25 km) in order to determine whether the sensitivity of the atmosphere to re-emerging temperature anomalies is resolution dependent. The ensembles display a wide range of re-emergence behaviour: in some cases re-emergence occurs by November, in others it is delayed or does not occur at all. A wide range of amplitudes of the re-emergent temperature anomalies is observed. In cases where re-emergence occurs, there is a marked effect on both the regional (North Atlantic and Europe) and hemispheric surface pressure and temperature patterns. The results highlight a potentially important process whereby ocean memory of conditions up to a year earlier can significantly enhance seasonal forecast skill.

166. (title unavailable)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfram, Phillip J.; Ringler, Todd D.; Maltrud, Mathew E.

    Isopycnal diffusivity due to stirring by mesoscale eddies in an idealized, wind-forced, eddying, midlatitude ocean basin is computed using Lagrangian, In Situ, Global, High-Performance Particle Tracking (LIGHT). Simulation is performed via LIGHT within the Model for Prediction Across Scales Ocean (MPAS-O). Simulations are performed at 4-, 8-, 16-, and 32-km resolution, where the first Rossby radius of deformation (RRD) is approximately 30 km. Scalar and tensor diffusivities are estimated at each resolution based on 30 ensemble members using particle cluster statistics. Each ensemble member is composed of 303,665 particles distributed across five potential density surfaces.
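The cluster-statistics idea behind such diffusivity estimates can be illustrated with the standard dispersion relation kappa ≈ (1/2) d Var(x)/dt, here as a finite difference between two snapshots of one position component (a generic sketch, not LIGHT's exact estimator):

```python
import numpy as np

def cluster_diffusivity(x_t0, x_t1, dt):
    """One-component diffusivity from the spreading of a particle cluster.

    Uses kappa ~ 0.5 * d Var(x) / dt with a finite difference between two
    times; illustrative of particle-cluster statistics in general.
    """
    return 0.5 * (np.var(x_t1) - np.var(x_t0)) / dt

# A cluster whose spread doubles over one time unit:
x0 = np.array([-1.0, 0.0, 1.0])
x1 = np.array([-2.0, 0.0, 2.0])
kappa = cluster_diffusivity(x0, x1, 1.0)
```

In practice the tensor diffusivity is obtained the same way from covariances of the position components, averaged over many clusters and ensemble members.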
Diffusivity dependence upon model resolution, velocity spatial scale, and buoyancy surface is quantified and compared with mixing length theory. The spatial structure of diffusivity ranges over approximately two orders of magnitude, from values of O(10^5) m^2 s^-1 in the region of western boundary current separation to O(10^3) m^2 s^-1 in the eastern region of the basin. Dominant mixing occurs at scales twice the size of the first RRD. Model resolution at scales finer than the RRD is necessary to obtain sufficient model fidelity at scales between one and four RRD to accurately represent mixing. Mixing length scaling with eddy kinetic energy and the Lagrangian time scale yields mixing efficiencies that typically range between 0.4 and 0.8. In conclusion, a reduced mixing length in the eastern region of the domain relative to the west suggests different mixing regimes outside the baroclinic jet region.

167. High-resolution inversion of methane emissions in the Southeast US using SEAC4RS aircraft observations of atmospheric methane: anthropogenic and wetland sources

    NASA Astrophysics Data System (ADS)

    Sheng, Jian-Xiong; Jacob, Daniel J.; Turner, Alexander J.; Maasakkers, Joannes D.; Sulprizio, Melissa P.; Bloom, A. Anthony; Andrews, Arlyn E.; Wunch, Debra

    2018-05-01

    We use observations of boundary layer methane from the SEAC4RS aircraft campaign over the Southeast US in August-September 2013 to estimate methane emissions in that region through an inverse analysis with up to 0.25° × 0.3125° (25 × 25 km^2) resolution and with full error characterization.
The Southeast US is a major source region for methane, including large contributions from oil and gas production and from wetlands. Our inversion uses state-of-the-art emission inventories as prior estimates, including a gridded version of the anthropogenic EPA Greenhouse Gas Inventory and the mean of the WetCHARTs ensemble for wetlands. Inversion results are independently verified by comparison with surface (NOAA/ESRL) and column (TCCON) methane observations. Our posterior estimates for the Southeast US are 12.8 ± 0.9 Tg a-1 for anthropogenic sources (no significant change from the gridded EPA inventory) and 9.4 ± 0.8 Tg a-1 for wetlands (27% decrease from the mean in the WetCHARTs ensemble). The largest source of error in the WetCHARTs wetlands ensemble is the land cover map specification of wetland areal extent. Our results support the accuracy of the EPA anthropogenic inventory on a regional scale, but there are significant local discrepancies for oil and gas production fields, suggesting that emission factors are more variable than assumed in the EPA inventory.

168. A nonparametric statistical technique for combining global precipitation datasets: development and hydrological evaluation over the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Abul Ehsan Bhuiyan, Md; Nikolopoulos, Efthymios I.; Anagnostou, Emmanouil N.; Quintana-Seguí, Pere; Barella-Ortiz, Anaïs

    2018-02-01

    This study investigates the use of a nonparametric, tree-based model, quantile regression forests (QRF), for combining multiple global precipitation datasets and characterizing the uncertainty of the combined product.
We used the Iberian Peninsula as the study area, with a study period spanning 11 years (2000-2010). Inputs to the QRF model included three satellite precipitation products, CMORPH, PERSIANN, and 3B42 (V7); an atmospheric reanalysis precipitation and air temperature dataset; satellite-derived near-surface daily soil moisture data; and a terrain elevation dataset. We calibrated the QRF model for two seasons and two terrain elevation categories and used it to generate ensembles for these conditions. Evaluation of the combined product was based on a high-resolution, ground-reference precipitation dataset (SAFRAN) available at 5 km / 1 h resolution. Furthermore, to evaluate relative improvements and the overall impact of the combined product on hydrological response, we used the generated ensembles to force a distributed hydrological model (the SURFEX land surface model and the RAPID river routing scheme) and compared its streamflow simulation results with the corresponding simulations from the individual global precipitation and reference datasets.
We concluded that the proposed technique can generate realizations that successfully encapsulate the reference precipitation and provide significant improvement in streamflow simulations, with reductions in systematic and random error on the order of 20-99% and 44-88%, respectively, when considering the ensemble mean.

169. Application Bayesian Model Averaging method for ensemble system for Poland

    NASA Astrophysics Data System (ADS)

    Guzikowski, Jakub; Czerwinska, Agnieszka

    2014-05-01

    The aim of the project is to evaluate methods for generating numerical ensemble weather predictions using meteorological data from the Weather Research and Forecasting (WRF) model and calibrating these data by means of a Bayesian Model Averaging (WRF BMA) approach. We construct high-resolution short-range ensemble forecasts using meteorological data (temperature) generated by nine WRF model configurations. The WRF models have 35 vertical levels and 2.5 km x 2.5 km horizontal resolution. The main emphasis is that each ensemble member uses a different parameterization of the physical phenomena occurring in the boundary layer. To calibrate the ensemble forecast we use the Bayesian Model Averaging (BMA) approach. The BMA predictive Probability Density Function (PDF) is a weighted average of the predictive PDFs associated with the individual ensemble members, with weights that reflect each member's relative skill. As a test case we chose a period of heat wave and convective weather conditions in the Poland area from 23 July to 1 August 2013.
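The BMA predictive PDF described above is a mixture density. A minimal sketch, assuming Gaussian member kernels (a common choice for temperature; the parameter values are invented for illustration):

```python
import math

def bma_pdf(y, means, sigmas, weights):
    """BMA predictive density: weighted average of per-member predictive PDFs.

    Illustrative sketch with Gaussian kernels; the weights reflect each
    member's relative skill and are assumed to sum to one.
    """
    dens = 0.0
    for m, s, w in zip(means, sigmas, weights):
        dens += w * math.exp(-0.5 * ((y - m) / s) ** 2) / (s * math.sqrt(2.0 * math.pi))
    return dens

# With a single member and unit weight, the mixture reduces to that
# member's own predictive PDF.
single = bma_pdf(0.0, [0.0], [1.0], [1.0])
```

In the full method the weights and kernel spreads are fitted to a training set of forecast-observation pairs (typically by maximum likelihood via EM), which is what calibrates the raw ensemble.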
From 23 to 29 July 2013, temperatures oscillated around 30 °C at many meteorological stations, and new temperature records were set. During this time, an increase in patients hospitalized with cardiovascular problems was registered. On 29 July 2013, advection of moist tropical air masses over Poland caused a strong convective event with a mesoscale convective system (MCS). The MCS caused local flooding, damage to transport infrastructure, destroyed buildings and trees, and led to injuries and a direct threat to life. A comparison of the meteorological data from the ensemble system with data recorded at 74 weather stations in Poland is made. We prepare a set of model-observation pairs. The data from the single ensemble members and the median from the WRF BMA system are then evaluated on the basis of deterministic error statistics: root mean square error (RMSE) and mean absolute error (MAE). To evaluate the probabilistic data, the Brier Score (BS) and Continuous Ranked Probability Score (CRPS) are used. Finally, a comparison between the BMA-calibrated data and the data from the individual ensemble members is presented.

170. Convergence in France facing Big Data era and Exascale challenges for Climate Sciences

    NASA Astrophysics Data System (ADS)

    Denvil, Sébastien; Dufresne, Jean-Louis; Salas, David; Meurdesoif, Yann; Valcke, Sophie; Caubel, Arnaud; Foujols, Marie-Alice; Servonnat, Jérôme; Sénési, Stéphane; Derouillat, Julien; Voury, Pascal

    2014-05-01

    The presentation introduces CONVERGENCE, a French national project funded for four years.
This project tackles the big data and computational challenges faced by the climate modeling community in an HPC context. Model simulations are central to the study of complex mechanisms and feedbacks in the climate system and to providing estimates of future and past climate changes. Recent trends in climate modelling are to add more physical components to the modelled system, to increase the resolution of each individual component, and to make more systematic use of large suites of simulations to address many scientific questions. Climate simulations may therefore differ in their initial state, parameter values, representation of physical processes, spatial resolution, model complexity, and degree of realism or idealisation. In addition, there is a strong need for evaluating, improving and monitoring the performance of climate models using a large ensemble of diagnostics, and for better integration of model outputs and observational data. High-performance computing is currently reaching the exascale and has the potential to produce this exponential increase in the size and number of simulations. However, post-processing, analysis, and exploration of the generated data have stalled, and there is a strong need for new tools to cope with the growing size and complexity of the underlying simulations and datasets. Exascale simulations require new scalable software tools to generate, manage and mine those simulations and data, to extract the relevant information and to make the correct decisions. The primary purpose of this project is to develop a platform capable of running large ensembles of simulations with a suite of models, handling the complex and voluminous datasets generated, facilitating the evaluation and validation of the models, and supporting the use of higher-resolution models.
We propose to gather interdisciplinary skills to design, using a component-based approach, a specific programming environment for scalable scientific simulations and analytics, integrating new and efficient ways of deploying and analysing the applications on High Performance Computing (HPC) systems. CONVERGENCE, gathering HPC and informatics expertise that cuts across the individual partners and the broader HPC community, will allow the national climate community to leverage information technology (IT) innovations to address its specific needs. Our methodology consists of developing an ensemble of generic elements needed to run the French climate models with different grids and different resolutions, ensuring efficient and reliable execution of these models, managing a large volume and number of datasets, and allowing analysis of the results and precise evaluation of the models. These elements include data structure definition and input-output (IO), code coupling and interpolation, as well as runtime and pre/post-processing environments. A common data and metadata structure will allow consistent information to be transferred between the various elements. All these generic elements will be open source and publicly available. The IPSL-CM and CNRM-CM climate models will make use of these elements, which will constitute a national platform for climate modelling. This platform will be used, in its entirety, to optimise and tune the next version of the IPSL-CM model and to develop a global coupled climate model with regional grid refinement. It will also be used, at least partially, to run ensembles of the CNRM-CM model at relatively high resolution and to run a very high resolution prototype of this model. The climate models we develop are already involved in many international projects.
For instance, we participate in the CMIP (Coupled Model Intercomparison Project) exercise, which is very demanding but highly visible: its results are widely used and are in particular synthesised in the IPCC (Intergovernmental Panel on Climate Change) assessment reports. The CONVERGENCE project will constitute an invaluable step for the French climate community in preparing for and better contributing to the next phase of CMIP.

171. Can Atmospheric Reanalysis Data Sets Be Used to Reproduce Flooding Over Large Scales?

    NASA Astrophysics Data System (ADS)

    Andreadis, Konstantinos M.; Schumann, Guy J.-P.; Stampoulis, Dimitrios; Bates, Paul D.; Brakenridge, G. Robert; Kettner, Albert J.

    2017-10-01

    Floods are costly to global economies and can be exceptionally lethal. The ability to produce consistent flood hazard maps over large areas could contribute significantly to reducing such losses, as the lack of knowledge concerning flood risk is a major factor in the transformation of river floods into flood disasters. In order to accurately reproduce flooding in river channels and floodplains, high-spatial-resolution hydrodynamic models are needed. Despite being computationally expensive, recent advances have made their continental-to-global implementation feasible, although inputs for long-term simulations may require the use of reanalysis meteorological products, especially in data-poor regions.
We employ a coupled hydrologic/hydrodynamic model cascade forced by the 20CRv2 reanalysis dataset and evaluate its ability to reproduce flood inundation area and volume for Australia during the 1973-2012 period. Ensemble simulations using the reanalysis data were performed to account for uncertainty in the meteorology and compared with a validated benchmark simulation. Results show that the reanalysis ensemble captures the inundated areas and volumes relatively well, with correlations for the ensemble mean of 0.82 and 0.85 for area and volume, respectively, although the meteorological ensemble spread propagates into large uncertainty in the simulated flood characteristics.

172. Probabilistic flood warning using grand ensemble weather forecasts

    NASA Astrophysics Data System (ADS)

    He, Y.; Wetterhall, F.; Cloke, H.; Pappenberger, F.; Wilson, M.; Freer, J.; McGregor, G.

    2009-04-01

    As the severity of floods increases, possibly due to climate and land-use change, there is an urgent need for more effective and reliable warning systems. The incorporation of numerical weather predictions (NWP) into a flood warning system can increase forecast lead times from a few hours to a few days. A single NWP forecast from a single forecast centre, however, is insufficient, as it involves considerable non-predictable uncertainties and can lead to a high number of false or missed warnings. An ensemble of weather forecasts from one Ensemble Prediction System (EPS), when used to force catchment hydrology, can provide improved early flood warning, as some of the uncertainties can be quantified.
EPS forecasts from a single weather centre only account for part of the uncertainties, those originating from initial conditions and stochastic physics. Other sources of uncertainty, including numerical implementation and/or data assimilation, can only be assessed if a grand ensemble of EPSs from different weather centres is used. When the EPSs produced by different weather centres are aggregated, the probabilistic nature of the ensemble precipitation forecasts can be better retained and accounted for. The availability of twelve global EPSs through the 'THORPEX Interactive Grand Global Ensemble' (TIGGE) offers a new opportunity for the design of an improved probabilistic flood forecasting framework. This work presents a case study using the TIGGE database for flood warning on a meso-scale catchment. The upper reach of the River Severn catchment, located in the Midlands Region of England, was selected due to its abundant data and its relatively small size (4062 km2) compared with the resolution of the NWPs. This choice was deliberate, as we hypothesize that the uncertainty in the forcing of smaller catchments cannot be represented by a single EPS with a very limited number of ensemble members, but only through the variance given by a large number of ensemble members and ensemble systems. A coupled atmospheric-hydrologic-hydraulic cascade system driven by the TIGGE ensemble forecasts is set up to study the potential benefits of using the TIGGE database in early flood warning. The physically based and fully distributed LISFLOOD suite of models is selected to simulate discharge and flood inundation consecutively. The results show that the TIGGE database is a promising tool for producing forecasts of discharge and flood inundation comparable with the observed discharge and with simulated inundation driven by the observed discharge. The spread of discharge forecasts varies from centre to centre, but it is generally large, implying a significant level of uncertainty.
Precipitation input uncertainties dominate and propagate through the cascade chain. The current NWPs fall short of representing the spatial variability of precipitation over a comparatively small catchment. This perhaps indicates the need to improve NWP resolution and/or disaggregation techniques to narrow the spatial gap between meteorology and hydrology. It is not necessarily true that early flood warning becomes more reliable when more ensemble forecasts are employed. It is difficult to identify the best forecast centre(s), but in general the chance of detecting floods is increased by using the TIGGE database. Only one flood event was studied, because most of the TIGGE data became available after October 2007. It is necessary to test the TIGGE ensemble forecasts on other flood events in other catchments, with different hydrological and climatic regimes, before general conclusions can be drawn about its robustness and applicability.

173. Streamflow hindcasting in European river basins via multi-parametric ensemble of the mesoscale hydrologic model (mHM)

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; Rakovec, Oldrich; Kumar, Rohini; Samaniego, Luis

    2016-04-01

    There have been tremendous improvements in distributed hydrologic modeling (DHM), which have made process-based simulation with high spatiotemporal resolution applicable on large spatial scales. Despite increasing information on the heterogeneous properties of a catchment, DHM is still subject to uncertainties inherent in model structure, parameters and input forcing.
Sequential data assimilation (DA) may facilitate improved streamflow prediction via DHM by using real-time observations to correct internal model states. In conventional DA methods such as state updating, however, parametric uncertainty is often ignored, mainly because of practical methodological limitations in specifying modelling uncertainty with limited ensemble members. If the parametric uncertainty related to routing and runoff components is not incorporated properly, the predictive uncertainty of DHM may be insufficient to capture the dynamics of observations, which may deteriorate predictability. Recently, a multi-scale parameter regionalization (MPR) method was proposed to make hydrologic predictions at different scales using the same set of model parameters without losing much of the model performance. The MPR method incorporated within the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) can effectively represent and control the uncertainty of high-dimensional parameters in a distributed model using global parameters. In this study, we present a global multi-parametric ensemble approach to incorporate the parametric uncertainty of DHM into DA to improve streamflow predictions. To effectively represent and control the uncertainty of high-dimensional parameters with a limited number of ensemble members, the MPR method is incorporated into DA. Lagged particle filtering is utilized to consider the response times and non-Gaussian characteristics of internal hydrologic processes. Hindcasting experiments are implemented to evaluate the impacts of the proposed DA method on streamflow predictions in multiple European river basins with different climate and catchment characteristics. 
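The core of a particle-filter analysis step like the one described above is likelihood-based reweighting of ensemble members. Below is a minimal sketch assuming a Gaussian observation error on a single observed discharge; the lagged variant would evaluate the likelihood over a window of past observations rather than one instant.

```python
import numpy as np

def update_particle_weights(weights, sim_discharge, obs_discharge, obs_std):
    """One particle-filter analysis step: reweight particles by the
    Gaussian likelihood of the observed discharge given each particle's
    simulated discharge, then renormalise the weights to sum to one."""
    weights = np.asarray(weights, float)
    sim = np.asarray(sim_discharge, float)
    likelihood = np.exp(-0.5 * ((sim - obs_discharge) / obs_std) ** 2)
    w = weights * likelihood
    return w / w.sum()

# Three particles with equal prior weight; the particle nearest the
# observation (50 m^3/s) should gain weight.
w = update_particle_weights([1/3, 1/3, 1/3], [48.0, 60.0, 30.0],
                            obs_discharge=50.0, obs_std=5.0)
```

In practice this step is followed by resampling when the effective ensemble size collapses, which is where the limited-member concerns discussed above bite.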
Because augmentation of parameters is not required within an assimilation window, the approach remains stable with limited ensemble members and is viable for practical use.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A12H..05M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A12H..05M"><span>Variability of North Atlantic Hurricane Frequency in a Large Ensemble of High-Resolution Climate Simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mei, W.; Kamae, Y.; Xie, S. P.</p> <p>2017-12-01</p> <p>Forced and internal variability of North Atlantic hurricane frequency during 1951-2010 is studied using a large ensemble of climate simulations by a 60-km atmospheric general circulation model forced by observed sea surface temperatures (SSTs). The simulations capture well the interannual-to-decadal variability of hurricane frequency in best track data, and further suggest a possible underestimate of hurricane counts in the current best track data prior to 1966, when satellite measurements were unavailable. A genesis potential index (GPI) averaged over the Main Development Region (MDR) accounts for more than 80% of the forced variations in hurricane frequency, with potential intensity and vertical wind shear being the dominant factors. In line with previous studies, the difference between MDR SST and tropical mean SST is a simple but useful predictor: a one-degree increase in this SST difference produces 7.1±1.4 more hurricanes. The hurricane frequency also exhibits internal variability that is comparable in magnitude to the interannual variability. The 100-member ensemble allows us to address the following important questions: (1) Are the observations equivalent to one realization of such a large ensemble? 
(2) How many ensemble members are needed to reproduce the variability in observations and in the forced component of the simulations? The sources of the internal variability in hurricane frequency will be identified and discussed. The results provide an explanation for the relatively weak correlation (~0.6) between MDR GPI and hurricane frequency on interannual timescales in observations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFM.H13J1551C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFM.H13J1551C"><span>Short-term ensemble radar rainfall forecasts for hydrological applications</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Codo de Oliveira, M.; Rico-Ramirez, M. A.</p> <p>2016-12-01</p> <p>Flooding is a very common natural disaster around the world, putting local populations and economies at risk. Forecasting floods several hours ahead and issuing warnings are of major importance for a proper response in emergency situations. However, it is important to know the uncertainties related to rainfall forecasting in order to produce more reliable forecasts. Nowcasting models (short-term rainfall forecasts) are able to produce predictions of high spatial and temporal resolution that are useful in hydrological applications. Nonetheless, they are subject to uncertainties mainly due to the nowcasting model used, errors in radar rainfall estimation, the temporal development of the velocity field, and the fact that precipitation processes such as growth and decay are not taken into account. In this study, an ensemble generation scheme using rain gauge data as a reference to estimate radar errors is used to produce forecasts with up to 3 h lead time. 
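One common way to build a radar-error ensemble of the kind just described is to perturb the radar field with errors whose statistics come from radar-gauge comparison. The sketch below assumes multiplicative log-space errors; it is a simplified illustration, not the authors' exact scheme.

```python
import numpy as np

def radar_rainfall_ensemble(radar_field, error_mean, error_std,
                            n_members=20, seed=0):
    """Generate an ensemble of rainfall fields by applying multiplicative
    log-space perturbations to a radar estimate. error_mean/error_std are
    assumed to be the mean and std of log10(gauge/radar) ratios, as might
    be estimated from a radar-gauge comparison."""
    rng = np.random.default_rng(seed)
    radar = np.asarray(radar_field, float)
    members = []
    for _ in range(n_members):
        pert = rng.normal(error_mean, error_std, size=radar.shape)
        members.append(radar * 10.0 ** pert)  # multiplicative perturbation
    return np.stack(members)

# 2x2 toy rainfall field (mm/h), 50 perturbed members
ens = radar_rainfall_ensemble([[2.0, 5.0], [0.5, 1.0]],
                              error_mean=0.0, error_std=0.1, n_members=50)
```

Spatially and temporally correlated perturbations (rather than the independent draws above) are what make such ensembles realistic for hydrological use.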
The ensembles try to assess in a realistic way the residual uncertainties that remain even after correction algorithms are applied to the radar data. The ensembles produced are compared to those from a stochastic ensemble generator. Furthermore, the rainfall forecast output was used as an input to a hydrodynamic sewer network model and also to a hydrological model for catchments of different sizes in northern England. A comparative analysis was carried out to assess how the radar uncertainties propagate into these models. The first named author is grateful to CAPES - Ciencia sem Fronteiras for funding this PhD research.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016JGRD..121.5213L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016JGRD..121.5213L"><span>High-resolution atmospheric inversion of urban CO2 emissions during the dormant season of the Indianapolis Flux Experiment (INFLUX)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lauvaux, Thomas; Miles, Natasha L.; Deng, Aijun; Richardson, Scott J.; Cambaliza, Maria O.; Davis, Kenneth J.; Gaudet, Brian; Gurney, Kevin R.; Huang, Jianhua; O'Keefe, Darragh; Song, Yang; Karion, Anna; Oda, Tomohiro; Patarasuk, Risa; Razlivanov, Igor; Sarmiento, Daniel; Shepson, Paul; Sweeney, Colm; Turnbull, Jocelyn; Wu, Kai</p> <p>2016-05-01</p> <p>Based on a uniquely dense network of surface towers continuously measuring the atmospheric concentrations of greenhouse gases (GHGs), we developed the first comprehensive monitoring system of CO2 emissions at high resolution over the city of Indianapolis. 
The urban inversion, evaluated over the 2012-2013 dormant season, showed a statistically significant increase of about 20% (from 4.5 to 5.7 MtC ± 0.23 MtC) compared to the Hestia CO2 emission estimate, a state-of-the-art building-level emission product. Spatial structures in prior emission errors, mostly undetermined, appeared to affect the spatial pattern of the inverse solution and the total carbon budget over the entire area by up to 15%, while the inverse solution remains fairly insensitive to the CO2 boundary inflow and to the different prior emissions (i.e., ODIAC). Prior to the surface emission optimization, we improved the atmospheric simulations using a meteorological data assimilation system that also informs our Bayesian inversion system through updated observation error variances. Finally, we estimated the uncertainties associated with undetermined parameters using an ensemble of inversions. The total CO2 emissions based on the ensemble mean and quartiles (5.26-5.91 MtC) were statistically different from the prior total emissions (4.1 to 4.5 MtC). 
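Summarising an ensemble of inversions by its mean and quartiles, as above, amounts to simple order statistics over the member totals. A minimal sketch follows; the values are illustrative, not the INFLUX results.

```python
import numpy as np

def ensemble_summary(totals):
    """Mean and interquartile range of total emissions (MtC) across an
    ensemble of inversions, characterising the uncertainty contributed by
    undetermined parameters. Input values here are made up."""
    t = np.asarray(totals, float)
    q1, q3 = np.percentile(t, [25, 75])  # lower/upper quartiles
    return t.mean(), q1, q3

mean, q1, q3 = ensemble_summary([5.3, 5.5, 5.9, 5.4, 5.7, 5.6])
```

Reporting quartiles rather than a standard deviation avoids assuming the ensemble of totals is Gaussian.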
Considering the relatively small sensitivity to the different parameters, we conclude that atmospheric inversions are potentially able to constrain the carbon budget of the city, assuming sufficient data to measure the inflow of GHGs over the city, but additional information on prior emission error structures is required to determine the spatial structures of urban emissions at high resolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..1513512R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..1513512R"><span>Improving modelled impacts on the flowering of temperate fruit trees in the Iberian Peninsula of climate change projections for 21st century</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ruiz-Ramos, Margarita; Pérez-Lopez, David; Sánchez-Sánchez, Enrique; Centeno, Ana; Dosio, Alessandro; Lopez-de-la-Franca, Noelia</p> <p>2013-04-01</p> <p>Flowering of temperate trees requires winter chilling, with the specific requirements depending on the variety. This work studied the trends and changes in chilling-hour values for some representative agricultural locations in Spain over the last three decades, and their projected changes under climate change scenarios. According to our previous results (Pérez-López et al., 2012), areas traditionally producing fruit such as the Ebro (NE Spain) and Guadalquivir (SW) valleys, Murcia (SE) and Extremadura (SW) could experience a major reduction of chill hours. This would drive a change of varieties or species and may increase the use of chemicals to compensate for unmet chilling requirements for flowering. However, these results showed high uncertainty, partly due to the bias of the climate data used, generated by Regional Climate Models. 
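In its simplest form, chilling accumulation is a count of hours in a chilling band. The sketch below uses the classic 0-7.2 °C chill-hour definition; the Utah and North Carolina models used in this work instead assign band-specific weights (including negative ones at warm temperatures).

```python
def chilling_hours(hourly_temps_c, threshold=7.2):
    """Count hours with temperature between 0 C and a chilling threshold.
    This is the simple chill-hour definition; weighted-band models such
    as Utah or North Carolina refine it per temperature range."""
    return sum(1 for t in hourly_temps_c if 0.0 <= t <= threshold)

# Seven synthetic hourly temperatures (C); sub-zero and warm hours
# contribute nothing under this definition.
hours = chilling_hours([-1.0, 2.0, 5.0, 7.0, 9.0, 12.0, 6.5])  # -> 4
```

Shifting the date window over which this count runs is exactly the phenological-feedback refinement the abstract describes.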
The chilling hours were calculated with different methods according to the species considered: the North Carolina method (Shaltout and Unrath, 1983) was used for apples, the Utah method (Richardson et al., 1974) for peach and grapevine, and the approach of De Melo-Abreu et al. (2004) for olive trees. The climate data used as inputs were the results of numerical simulations obtained from a group of regional climate models at high resolution (25 km) from the European project ENSEMBLES (http://www.ensembles-eu.org/), first bias-corrected for temperature and precipitation (Dosio and Paruolo, 2011; Dosio et al., 2012). This work aims to improve the impact projections obtained in Pérez-López et al. (2012). For this purpose, the variation of chill hours between the 2nd half of the 20th century and the 1st half of the 21st century at the study locations was recalculated, considering 1) a feedback in the dates over which the chilling hours are calculated, to take into account the shift of phenological dates, and 2) substitution of the original ENSEMBLES climate data set used in Pérez-López et al. (2012) by the bias-corrected data set. Calculations for the 2nd half of the 20th century will be used to evaluate the quality of the new data set of projections. Acknowledgements This research has been funded by project PEII10-0248-5680 from Junta de Comunidades de Castilla-La Mancha, Spain. References De Melo-Abreu, J.P., Barranco, D., Cordeiro, A.M., Tous, J., Rogado, B.M., Villalobos, F.J., 2004. Modelling olive flowering date using chilling for dormancy release and thermal time. Agricultural and Forest Meteorology, 125: 117-127. Dosio, A. and Paruolo, P., 2011. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Evaluation on the present climate. Journal of Geophysical Research, 116, D16106, doi:10.1029/2011JD015934. Dosio, A., Paruolo, P. and Rojas, R., 2012. 
Bias correction of the ENSEMBLES high resolution climate change projections for use by impact models: Analysis of the climate change signal. Journal of Geophysical Research, 117, D17, doi:10.1029/2012JD017968. Herrera et al., 2012. Development and analysis of a 50-year high-resolution daily gridded precipitation dataset over Spain (Spain02). International Journal of Climatology, 32: 74-85, doi:10.1002/joc.2256. Pérez-López, D., Ruiz-Ramos, M., Sánchez-Sánchez, E., Centeno, A., Prieto-Egido, I., and López-de-la-Franca, N., 2012. Influence of climate change on the flowering of temperate fruit trees. Geophysical Research Abstracts Vol. 14, EGU2012-5774, EGU General Assembly 2012. Richardson, E.A., Seeley, S.D., Walker, D.R., 1974. A model for estimating the completion of rest for 'Redhaven' and 'Elberta' peach trees. HortScience, 9: 331-332. Shaltout, A.D., Unrath, C.R., 1983. Rest completion prediction model for 'Starkrimson Delicious' apples. J. Amer. Soc. Hort. Sci., 108: 957-961.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1715248V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1715248V"><span>Global operational hydrological forecasts through eWaterCycle</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>van de Giesen, Nick; Bierkens, Marc; Donchyts, Gennadii; Drost, Niels; Hut, Rolf; Sutanudjaja, Edwin</p> <p>2015-04-01</p> <p>The central goal of the eWaterCycle project (www.ewatercycle.org) is the development of an operational hyper-resolution hydrological global model. This model is able to produce 14-day ensemble forecasts based on a hydrological model and operational weather data (presently NOAA's Global Ensemble Forecast System). 
Special attention is paid to the prediction of situations in which water-related issues are relevant, such as floods, droughts, navigation, hydropower generation, and irrigation stress. Near-real-time satellite data will be assimilated into the hydrological simulations, a feature that will be presented for the first time at EGU 2015. First, we address challenges that are mainly computer-science oriented but have direct practical hydrological implications. An important feature in this is the use of existing standards and open-source software to the maximum extent possible. For example, we use the Community Surface Dynamics Modeling System (CSDMS) approach to coupling models, the Basic Model Interface (BMI). The hydrological model underlying the project is PCR-GLOBWB, built by Utrecht University. This is the engine behind the predictions and state estimations. Parts of PCR-GLOBWB have been re-engineered to facilitate running it in a High Performance Computing (HPC) environment, in parallel on multiple nodes, as well as to use BMI. Hydrological models are not very CPU intensive compared to, say, atmospheric models. They are, however, memory-hungry due to the localized processes and associated effective parameters. To accommodate this memory need, especially in an ensemble setting, a variation on the traditional Ensemble Kalman Filter was developed that needs much less on-chip memory. Due to the operational nature, the coupling of the hydrological model with hydraulic models is very important. The idea is not to run detailed hydraulic routing schemes over the complete globe, but to have on-demand simulations prepared off-line with respect to topography and parameterizations. This allows for very detailed simulations at hectare to meter scales, where and when this is needed. 
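For context, a single stochastic Ensemble Kalman Filter analysis step for a scalar observation can be written so that only the ensemble array itself is held in memory. This is a generic textbook sketch, not the eWaterCycle memory-lean variant; the state layout and values below are hypothetical.

```python
import numpy as np

def enkf_update(X, obs, obs_var, h_index, seed=0):
    """One stochastic EnKF analysis step for a scalar observation of the
    state component h_index. X has shape (n_members, n_state); only this
    ensemble array (no full covariance matrix) is stored."""
    rng = np.random.default_rng(seed)
    X = np.array(X, float)
    hx = X[:, h_index]                      # simulated observations
    var_hx = hx.var(ddof=1)
    # Cross-covariance of each state component with the observed one
    anomalies = X - X.mean(axis=0)
    cov_xy = anomalies.T @ (hx - hx.mean()) / (len(hx) - 1)
    gain = cov_xy / (var_hx + obs_var)      # Kalman gain, shape (n_state,)
    # Perturbed observations, one draw per member
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=len(hx))
    return X + np.outer(perturbed - hx, gain)

# Four members, two state components; observe component 0 as 2.0
Xa = enkf_update([[1.0, 0.5], [2.0, 1.0], [3.0, 1.5], [4.0, 2.0]],
                 obs=2.0, obs_var=0.25, h_index=0)
```

Because the gain is built from ensemble anomalies on the fly, memory scales with the ensemble, not with the squared state dimension.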
At EGU 2015, the operational global eWaterCycle model will be presented for the first time, including forecasts at high resolution, the innovative data assimilation approach, and on-demand coupling with hydraulic models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ClDy..tmp...65C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ClDy..tmp...65C"><span>Mean and extreme temperatures in a warming climate: EURO CORDEX and WRF regional climate high-resolution projections for Portugal</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Cardoso, Rita M.; Soares, Pedro M. M.; Lima, Daniela C. A.; Miranda, Pedro M. A.</p> <p>2018-02-01</p> <p>Large spatio-temporal temperature gradients are a common feature of Mediterranean climates. Portugal's complex topography and coastline enhance such features, and within a small region large temperature gradients with high interannual variability are detected. In this study, the EURO-CORDEX high-resolution regional climate simulations (0.11° and 0.44° resolutions) are used to investigate maximum and minimum temperature projections across the twenty-first century under RCP4.5 and RCP8.5. An additional WRF simulation at even higher resolution (9 km) for the RCP8.5 scenario is also examined. All simulations for the historical period (1971-2000) are evaluated against the available station observations, and the EURO-CORDEX model results are ranked in order to build multi-model ensembles. In the present climate, the models are able to reproduce the main topography- and coast-related temperature gradients. Although there are discernible differences between models, most present a cold bias. The multi-model ensembles improve the overall representation of temperature. 
The ensembles project a significant increase of the maximum and minimum temperatures in all seasons and scenarios. Maximum increases of 8 °C in summer and autumn and between 2 and 4 °C in winter and spring are projected under RCP8.5. The temperature distributions for all models show a significant increase in the upper tails of the PDFs. In RCP8.5, more than half of the extended summer (MJJAS) has maximum temperatures exceeding the historical 90th percentile and, on average, 60 tropical nights are projected for the end of the century, compared with only 7 tropical nights in the historical period. Conversely, cold days almost disappear. The yearly average number of heat waves increases seven- to ninefold by 2100, and the most frequent length rises from 5 to 22 days throughout the twenty-first century; 5% of the longest events will last for more than one month. The amplitude is overwhelmingly larger, reaching values not observed in the historical period. More than half of the heat waves will be stronger than the extreme heat wave of 2003 by the end of the century. The future heat waves will also cover larger areas: approximately 100 events in the 2071-2100 period (more than 3 per year) will cover the whole country. The RCP4.5 scenario shows, in general, smaller magnitudes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013NHESS..13.1135M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013NHESS..13.1135M"><span>High resolution climate projection of storm surge at the Venetian coast</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mel, R.; Sterl, A.; Lionello, P.</p> <p>2013-04-01</p> <p>Climate change impact on the storm surge regime is of great importance for the safety and maintenance of Venice. 
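Threshold-exceedance metrics such as the "days above the historical 90th percentile" used in the EURO-CORDEX projections above reduce to a percentile computation plus a count. A minimal sketch on synthetic data:

```python
import numpy as np

def days_above_historical_p90(future_tmax, historical_tmax):
    """Fraction of future days whose daily maximum temperature exceeds
    the historical 90th percentile. Data here are synthetic, not the
    EURO-CORDEX projections."""
    p90 = np.percentile(np.asarray(historical_tmax, float), 90)
    fut = np.asarray(future_tmax, float)
    return float(np.mean(fut > p90))

hist = list(range(20, 40))   # synthetic historical tmax, 20..39 C
frac = days_above_historical_p90([30, 36, 38, 41, 44, 29], hist)  # -> 0.5
```

Fixing the threshold from the historical period is what lets the metric express a distribution shift rather than just future variability.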
In this study, a future storm surge scenario is evaluated using new high-resolution sea level pressure and wind data recently produced by EC-Earth, an Earth System Model based on the operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts (ECMWF). The study considers an ensemble of six 5-yr-long simulations of the RCP4.5 scenario at 0.25° resolution and compares the 2094-2098 period to the 2004-2008 period. EC-Earth sea level pressure and surface wind fields are used as input for a shallow-water hydrodynamic model (HYPSE), which computes sea level and barotropic currents in the Adriatic Sea. Results show that a high-resolution climate model is needed for producing realistic values of storm surge statistics, and confirm previous studies in that they show little sensitivity of storm surge levels to climate change. However, some climate change signals are detected, such as increased persistence of high-pressure conditions, an increased frequency of windless hours, and a decreased number of moderate windstorms.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_9 --> <div id="page_10" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="181"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018SPIE10503E..2IS','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018SPIE10503E..2IS"><span>Automated high resolution full-field spatial coherence tomography for quantitative phase imaging of human red blood cells</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Singla, Neeru; Dubey, Kavita; Srivastava, Vishal; Ahmad, Azeem; Mehta, D. S.</p> <p>2018-02-01</p> <p>We developed an automated high-resolution full-field spatial coherence tomography (FF-SCT) microscope for quantitative phase imaging that is based on the spatial, rather than the temporal, coherence gating. Red and green laser light was used to obtain quantitative phase images of unstained human red blood cells (RBCs). This study uses morphological parameters of unstained RBC phase images to distinguish between normal and infected cells. We recorded a single interferogram with the FF-SCT microscope at each of the red and green wavelengths and averaged the two phase images to further reduce noise artifacts. 
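Classification from morphological features, as in the FF-SCT record above, can be illustrated with a toy majority-vote ensemble; the feature names, thresholds, and values below are hypothetical stand-ins, not the paper's trained model.

```python
import numpy as np

def majority_vote_ensemble(feature_matrix, thresholds):
    """Toy stand-in for a machine-learning ensemble classifier: each
    morphological feature casts a vote for 'infected' (1) when it exceeds
    its threshold, and the majority vote labels the cell."""
    X = np.asarray(feature_matrix, float)
    votes = (X > np.asarray(thresholds, float)).astype(int)
    return (votes.sum(axis=1) > X.shape[1] / 2).astype(int)

# Hypothetical columns: mean phase (rad), diameter (um), sphericity index
cells = [[1.2, 7.5, 0.9],   # plausibly normal
         [2.4, 9.8, 1.7]]   # plausibly abnormal
labels = majority_vote_ensemble(cells, thresholds=[2.0, 9.0, 1.5])
```

A trained ensemble (e.g. of decision trees) learns these per-feature splits from data instead of taking them as fixed thresholds.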
In order to distinguish anemia-infected cells from normal cells, different morphological features were extracted, and these features were used to train a machine-learning ensemble model to classify RBCs with high accuracy.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20140011364','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20140011364"><span>On the Lack of Stratospheric Dynamical Variability in Low-top Versions of the CMIP5 Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Charlton-Perez, Andrew J.; Baldwin, Mark P.; Birner, Thomas; Black, Robert X.; Butler, Amy H.; Calvo, Natalia; Davis, Nicholas A.; Gerber, Edwin P.; Gillett, Nathan; Hardiman, Steven</p> <p>2013-01-01</p> <p>We describe the main differences in simulations of stratospheric climate and variability by models within the fifth Coupled Model Intercomparison Project (CMIP5) that have a model top above the stratopause and relatively fine stratospheric vertical resolution (high-top), and those that have a model top below the stratopause (low-top). Although the simulation of mean stratospheric climate by the two model ensembles is similar, the low-top model ensemble has very weak stratospheric variability on daily and interannual time scales. 
The frequency of major sudden stratospheric warming events is strongly underestimated by the low-top models with less than half the frequency of events observed in the reanalysis data and high-top models. The lack of stratospheric variability in the low-top models affects their stratosphere-troposphere coupling, resulting in short-lived anomalies in the Northern Annular Mode, which do not produce long-lasting tropospheric impacts, as seen in observations. The lack of stratospheric variability, however, does not appear to have any impact on the ability of the low-top models to reproduce past stratospheric temperature trends. We find little improvement in the simulation of decadal variability for the high-top models compared to the low-top, which is likely related to the fact that neither ensemble produces a realistic dynamical response to volcanic eruptions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EaFut...5.1234Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EaFut...5.1234Z"><span>High-Resolution Dynamical Downscaling Ensemble Projections of Future Extreme Temperature Distributions for the United States</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.; Kotamarthi, V. Rao</p> <p>2017-12-01</p> <p>The aim of this study is to examine projections of extreme temperatures over the continental United States (CONUS) for the 21st century using an ensemble of high spatial resolution dynamically downscaled model simulations with different boundary conditions. 
The downscaling uses the Weather Research and Forecasting model at a spatial resolution of 12 km along with outputs from three different Coupled Model Intercomparison Project Phase 5 global climate models that provide boundary conditions under two different future greenhouse gas (GHG) concentration trajectories. The results from two decadal-length time slices (2045-2054 and 2085-2094) are compared with a historical decade (1995-2004). Probability density functions of daily maximum/minimum temperatures are analyzed over seven climatologically cohesive regions of the CONUS. The impacts of different boundary conditions as well as future GHG concentrations on extreme events such as heat waves and days with temperature higher than 95°F are also investigated. The results show that the intensity of extreme warm temperatures in future summers increases significantly, while the frequency of extreme cold temperatures in future winters decreases. The distribution of summer daily maximum temperature experiences a significant warm-side shift and increased variability, while the distribution of winter daily minimum temperature is projected to have a less significant warm-side shift with decreased variability. Under the "business-as-usual" scenario, 5-day heat waves are projected to occur at least 5-10 times per year in most of the CONUS, and the number of ≥95°F days will increase by 1-2 months by the end of the century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014GMDD....7..563M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014GMDD....7..563M"><span>High resolution global climate modelling; the UPSCALE project, a large simulation campaign</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mizielinski, M. S.; Roberts, M. J.; Vidale, P. 
L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.</p> <p>2014-01-01</p> <p>The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single-year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. 
This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014GMD.....7.1629M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014GMD.....7.1629M"><span>High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.</p> <p>2014-08-01</p> <p>The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. 
Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFMGC53B1203G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFMGC53B1203G"><span>Applying Multimodel Ensemble from Regional Climate Models for Improving Runoff Projections on Semiarid Regions of Spain</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Garcia Galiano, S. G.; Olmos, P.; Giraldo Osorio, J. D.</p> <p>2015-12-01</p> <p>In the Mediterranean area, significant changes on temperature and precipitation are expected throughout the century. These trends could exacerbate the existing conditions in regions already vulnerable to climatic variability, reducing the water availability. Improving knowledge about plausible impacts of climate change on water cycle processes at basin scale, is an important step for building adaptive capacity to the impacts in this region, where severe water shortages are expected for the next decades. 
RCM ensembles in combination with distributed hydrological models with few parameters constitute a valid and robust methodology to increase the reliability of climate and hydrological projections. To reach this objective, a novel methodology for building Regional Climate Model (RCM) ensembles of meteorological variables (rainfall and temperatures) was applied. RCM ensembles are justified for increasing the reliability of climate and hydrological projections. The evaluation of RCM goodness-of-fit to build the ensemble is based on empirical probability density functions (PDFs) extracted from both the RCM dataset and a high-resolution gridded observational dataset for the time period 1961-1990. The applied method considers the seasonal and annual variability of rainfall and temperatures. The RCM ensembles constitute the input to a distributed hydrological model at basin scale for assessing the runoff projections. The selected hydrological model has few parameters in order to reduce the uncertainties involved. The study basin is a headwater basin of the Segura River Basin, located in the southeast of Spain. The impacts on runoff and its trend from the observational dataset and climate projections were assessed. Relative to the control period 1961-1990, plausible significant decreases in runoff were identified for the time period 2021-2050.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20110008058','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20110008058"><span>Aerosol Observability and Predictability: From Research to Operations for Chemical Weather Forecasting. 
Lagrangian Displacement Ensembles for Aerosol Data Assimilation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>da Silva, Arlindo</p> <p>2010-01-01</p> <p>A challenge common to many constituent data assimilation applications is the fact that one observes a much smaller fraction of the phase space than one wishes to estimate. For example, remotely sensed estimates of column-average concentrations are available, while one is faced with the problem of estimating 3D concentrations for initializing a prognostic model. This problem is exacerbated in the case of aerosols because the observable Aerosol Optical Depth (AOD) is not only a column-integrated quantity, but it also sums over a large number of species (dust, sea-salt, carbonaceous and sulfate aerosols). An aerosol transport model, when driven by high-resolution, state-of-the-art analyses of meteorological fields and realistic emissions, can produce skillful forecasts even when no aerosol data are assimilated. The main task of aerosol data assimilation is to address the bias arising from inaccurate emissions, and the Lagrangian misplacement of plumes induced by errors in the driving meteorological fields. As long as one decouples the meteorological and aerosol assimilation as we do here, the classic baroclinic growth of error is no longer the main order of business. We will describe an aerosol data assimilation scheme in which the analysis update step is conducted in observation space, using an adaptive maximum-likelihood scheme for estimating background errors in AOD space. This scheme includes explicit sequential bias estimation as in Dee and da Silva. Unlike existing aerosol data assimilation schemes, we do not obtain analysis increments of the 3D concentrations by scaling the background profiles. Instead we explore the Lagrangian characteristics of the problem for generating local displacement ensembles. 
These high-resolution, state-dependent ensembles are then used to parameterize the background errors and generate 3D aerosol increments. The algorithm is computationally efficient, running at a resolution of 1/4 degree, globally. We will present the results of assimilating AOD retrievals from MODIS (on both the Aqua and Terra satellites), using observations from AERONET for validation. The impact on GEOS-5 aerosol forecasting will be fully documented.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1918291M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1918291M"><span>Evolution of precipitation extremes in two large ensembles of climate simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Martel, Jean-Luc; Mailhot, Alain; Talbot, Guillaume; Brissette, François; Ludwig, Ralf; Frigon, Anne; Leduc, Martin; Turcotte, Richard</p> <p>2017-04-01</p> <p>Recent studies project significant changes in the future distribution of precipitation extremes due to global warming. It is likely that extreme precipitation intensity will increase in a future climate and that extreme events will be more frequent. In this work, annual maximum daily precipitation series from the Canadian Earth System Model (CanESM2) 50-member large ensemble (spatial resolution of 2.8°x2.8°) and the Community Earth System Model (CESM1) 40-member large ensemble (spatial resolution of 1°x1°) are used to investigate extreme precipitation over the historical (1980-2010) and future (2070-2100) periods. The use of these ensembles results in respectively 1500 (30 years x 50 members) and 1200 (30 years x 40 members) simulated years over both the historical and future periods. These large datasets allow the computation of empirical daily extreme precipitation quantiles for large return periods. 
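The empirical return-level computation from pooled ensemble years can be sketched in Python. This is a minimal illustration; the Gumbel-distributed array is a hypothetical stand-in for the CanESM2/CESM1 annual maxima, not actual model output:

```python
import numpy as np

def empirical_return_level(annual_maxima, return_period_years):
    """Empirical return level from pooled ensemble annual maxima.

    A T-year event has annual exceedance probability 1/T, so we take
    the (1 - 1/T) quantile of the pooled annual-maximum sample.
    """
    pooled = np.ravel(annual_maxima)
    prob = 1.0 - 1.0 / return_period_years
    return np.quantile(pooled, prob)

# Hypothetical stand-in for a 50-member x 30-year ensemble (1500 values, mm/day).
rng = np.random.default_rng(0)
maxima = rng.gumbel(loc=40.0, scale=10.0, size=(50, 30))

level_100yr = empirical_return_level(maxima, 100)
```

Pooling members is what makes the 100-year empirical quantile estimable at all: a single 30-year run contains too few annual maxima to place the 0.99 quantile without a fitted distribution.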
Using the CanESM2 and CESM1 large ensembles, extreme daily precipitation with return periods ranging from 2 to 100 years is computed in the historical and future periods to assess the impact of climate change. Results indicate that daily precipitation extremes generally increase in the future over most land grid points and that these increases will also impact the 100-year extreme daily precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety. Estimated increases in precipitation associated with very extreme precipitation events (e.g. 100-year events) will drastically change the likelihood and extent of flooding in a future climate. These results, although interesting, need to be extended to sub-daily durations, relevant for urban flooding protection and urban infrastructure design (e.g. sewer networks, culverts). Models and simulations at finer spatial and temporal resolution are therefore needed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUFMIN33B1550A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUFMIN33B1550A"><span>Aggregation of Environmental Model Data for Decision Support</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Alpert, J. C.</p> <p>2013-12-01</p> <p>Weather forecasts and warnings must be prepared and then delivered so as to reach their intended audience in good time to enable effective decision-making. An effort to mitigate these difficulties was studied at a workshop, 'Sustaining National Meteorological Services - Strengthening WMO Regional and Global Centers', convened in June 2013 by the World Bank, WMO and the US National Weather Service (NWS). 
The skill and accuracy of atmospheric forecasts from deterministic models have increased, and there are now ensembles of such models that improve decisions to protect life, property and commerce. The NWS production of numerical weather prediction products results in model output from global and high-resolution regional ensemble forecasts. Ensembles are constructed by changing the initial conditions to make a 'cloud' of forecasts that attempt to span the space of possible atmospheric realizations, which can quantify not only the most likely forecast but also its uncertainty. This has led to an unprecedented increase in data production and information content from higher-resolution, multi-model output and secondary calculations. One difficulty is to obtain the needed subset of data required to estimate the probability of events and report the information. Calibration is also needed to reliably estimate the probability of events, along with honing of threshold adjustments to reduce false alarms for decision makers. To meet the future needs of the ever-broadening user community and address these issues on a national and international basis, the weather service implemented the NOAA Operational Model Archive and Distribution System (NOMADS). NOMADS provides real-time and retrospective format-independent access to climate, ocean and weather model data and delivers high-availability content services as part of NOAA's official real-time data dissemination at its new NCWCP web operations center. An important aspect of the server's abilities is to aggregate the matrix of model output, offering access to probability and calibration information for real-time decision making. The aggregation content server reports over ensemble component and forecast time, in addition to the other data dimensions of vertical layer and position for each variable. 
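The event-probability step that the server-side aggregation enables reduces to counting the fraction of ensemble members that exceed a threshold. A minimal sketch (the 21-member values and the 10 mm threshold are hypothetical, not NOMADS output):

```python
import numpy as np

def event_probability(members, threshold):
    """Fraction of ensemble members exceeding a weather-event threshold.

    members: array of shape (n_members, ...) of a forecast variable.
    Returns the gridpoint-wise empirical probability of the event.
    """
    members = np.asarray(members, dtype=float)
    return (members > threshold).mean(axis=0)

# Hypothetical 21-member precipitation forecast (mm) at 4 grid points.
forecast = np.array(
    [[0.0, 3.2, 12.5, 30.1]] * 7
    + [[0.5, 6.0, 15.0, 28.0]] * 7
    + [[0.0, 1.0, 9.0, 35.0]] * 7
)

p_heavy = event_probability(forecast, threshold=10.0)  # P(precip > 10 mm)
```

Threshold honing to reduce false alarms amounts to tuning the probability level at which a warning is issued against verification statistics, not changing this counting step itself.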
The unpacking, organization and reading of many binary packed files is accomplished most efficiently on the server, while weather-element event probability calculations, the thresholds for more accurate decision support, and display remain for the client. Our goal is to reduce uncertainty for variables of interest, e.g., variables of agricultural importance. The weather service operational GFS model ensemble and short-range ensemble forecasts can make skillful probability forecasts to alert users if and when their selected weather events will occur. How this framework operates, and how it can be implemented using existing NOMADS content services and applications, is described.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/989015','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/989015"><span>On the possible long-term fate of oil released in the deepwater horizon incident: estimated by ensembles of dye release simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Maltrud, Mathew E.; Peacock, Synte L.; Visbeck, Martin</p> <p>2010-08-01</p> <p>We have conducted an ensemble of 20 simulations using a high-resolution global ocean model in which dye was continuously injected at the site of the Deepwater Horizon drilling rig for two months. We then extended these simulations for another four months to track the dispersal of the dye in the model. We have also performed five simulations in which dye was continuously injected at the site of the spill for four months and then run out to one year from the initial spill date. 
The experiments can elucidate the time and space scales of dispersal of polluted waters and also give a quantitative estimate of dilution rate, ignoring any sink terms such as chemical or biological degradation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFM.A54E..02R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFM.A54E..02R"><span>Variable-Resolution Ensemble Climatology Modeling of Sierra Nevada Snowpack within the Community Earth System Model (CESM)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rhoades, A.; Ullrich, P. A.; Zarzycki, C. M.; Levy, M.; Taylor, M.</p> <p>2014-12-01</p> <p>Snowpack is crucial for the western USA, providing around 75% of the total fresh water supply (Cayan et al., 1996) and buffering against seasonal aridity impacts on agricultural, ecosystem, and urban water demands. The resilience of the California water system is largely dependent on natural stores provided by snowpack. This resilience has shown vulnerabilities due to anthropogenic global climate change. Historically, the northern Sierras showed a net decline of 50-75% in snow water equivalent (SWE) while the southern Sierras showed a net accumulation of 30% (Mote et al., 2005). Future trends of SWE highlight that western USA SWE may decline by 40-70% (Pierce and Cayan, 2013), snowfall may decrease by 25-40% (Pierce and Cayan, 2013), and more winter storms may tend towards rain rather than snow (Bales et al., 2006). The volatility of Sierran snowpack presents a need for scientific tools to help water managers and policy makers assess current and future trends. A burgeoning tool to analyze these trends comes in the form of variable-resolution global climate modeling (VRGCM). 
VRGCMs serve as a bridge between regional and global models and provide added resolution in areas of need, eliminate lateral boundary forcings, provide model runtime speed up, and utilize a common dynamical core, physics scheme and sub-grid scale parameterization package. A cubed-sphere variable-resolution grid with 25 km horizontal resolution over the western USA was developed for use in the Community Atmosphere Model (CAM) within the Community Earth System Model (CESM). A 25-year three-member ensemble climatology (1980-2005) is presented and major snowpack metrics such as SWE, snow depth, snow cover, and two-meter surface temperature are assessed. The ensemble simulation is also compared to observational, reanalysis, and WRF model datasets. The variable-resolution model provides a mechanism for reaching towards non-hydrostatic scales and simulations are currently being developed with refined nests of 12.5km resolution over California.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015JGRC..120.5134Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015JGRC..120.5134Y"><span>Ensemble assimilation of ARGO temperature profile, sea surface temperature, and altimetric satellite data into an eddy permitting primitive equation model of the North Atlantic Ocean</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yan, Y.; Barth, A.; Beckers, J. M.; Candille, G.; Brankart, J. M.; Brasseur, P.</p> <p>2015-07-01</p> <p>Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. 
Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histogram, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations, in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement of the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analyzed jointly. 
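The CRPS used for the probabilistic validation can be estimated directly from an ensemble sample via the standard kernel form, CRPS = E|X − y| − ½ E|X − X′|. A minimal sketch with hypothetical ensemble values (not the assimilation output described above):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS of an ensemble forecast for a scalar observation.

    Kernel estimator: CRPS = E|X - y| - 0.5 * E|X - X'|, where X and X'
    are independent draws from the ensemble. Lower is better.
    """
    x = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# A well-centred ensemble scores lower than an equally sharp but biased one.
obs = 20.0
good = crps_ensemble(np.array([19.0, 19.5, 20.0, 20.5, 21.0]), obs)
biased = crps_ensemble(np.array([24.0, 24.5, 25.0, 25.5, 26.0]), obs)
```

The second term penalizes needless spread, which is why CRPS rewards both reliability and resolution rather than only the ensemble-mean error.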
The consistency and complementarity between both validations are highlighted.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMOS53E..05P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMOS53E..05P"><span>Hydro and morphodynamic simulations for probabilistic estimates of munitions mobility</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Palmsten, M.; Penko, A.</p> <p>2017-12-01</p> <p>Probabilistic estimates of waves, currents, and sediment transport at underwater munitions remediation sites are necessary to constrain probabilistic predictions of munitions exposure, burial, and migration. To address this need, we produced ensemble simulations of hydrodynamic flow and morphologic change with Delft3D, a coupled system of wave, circulation, and sediment transport models. We have set up the Delft3D model simulations at the Army Corps of Engineers Field Research Facility (FRF) in Duck, NC, USA. The FRF is the prototype site for the near-field munitions mobility model, which integrates far-field and near-field field munitions mobility simulations. An extensive array of in-situ and remotely sensed oceanographic, bathymetric, and meteorological data are available at the FRF, as well as existing observations of munitions mobility for model testing. Here, we present results of ensemble Delft3D hydro- and morphodynamic simulations at Duck. A nested Delft3D simulation runs an outer grid that extends 12-km in the along-shore and 3.7-km in the cross-shore with 50-m resolution and a maximum depth of approximately 17-m. The inner nested grid extends 3.2-km in the along-shore and 1.2-km in the cross-shore with 5-m resolution and a maximum depth of approximately 11-m. 
The inner nested grid initial model bathymetry is defined as the most recent survey or remotely sensed estimate of water depth. Delft3D-WAVE and FLOW are driven with spectral wave measurements from a Waverider buoy in 17-m depth located on the offshore boundary of the outer grid. The spectral wave output and the water levels from the outer grid are used to define the boundary conditions for the inner nested high-resolution grid, in which the coupled Delft3D WAVE-FLOW-MORPHOLOGY model is run. The ensemble results are compared to the wave, current, and bathymetry observations collected at the FRF.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009AGUFM.A33A0242G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009AGUFM.A33A0242G"><span>The North American Regional Climate Change Assessment Program (NARCCAP): Status and results</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gutowski, W. J.</p> <p>2009-12-01</p> <p>NARCCAP is a multi-institutional program that is investigating systematically the uncertainties in regional scale simulations of contemporary climate and projections of future climate. NARCCAP is supported by multiple federal agencies. NARCCAP is producing an ensemble of high-resolution climate-change scenarios by nesting multiple RCMs in reanalyses and multiple atmosphere-ocean GCM simulations of contemporary and future-scenario climates. The RCM domains cover the contiguous U.S., northern Mexico, and most of Canada. The simulation suite also includes time-slice, high resolution GCMs that use sea-surface temperatures from parent atmosphere-ocean GCMs. The baseline resolution of the RCMs and time-slice GCMs is 50 km. 
Simulations use three sources of boundary conditions: National Centers for Environmental Prediction (NCEP)/Department of Energy (DOE) AMIP-II Reanalysis, GCMs simulating contemporary climate and GCMs using the A2 SRES emission scenario for the twenty-first century. Simulations cover 1979-2004 and 2038-2060, with the first 3 years discarded for spin-up. The resulting RCM and time-slice simulations offer opportunity for extensive analysis of RCM simulations as well as a basis for multiple high-resolution climate scenarios for climate change impacts assessments. Geophysical statisticians are developing measures of uncertainty from the ensemble. To enable very high-resolution simulations of specific regions, both RCM and high-resolution time-slice simulations are saving output needed for further downscaling. All output is publicly available to the climate analysis and the climate impacts assessment community, through an archiving and data-distribution plan. Some initial results show that the models closely reproduce ENSO-related precipitation variations in coastal California, where the correlation between the simulated and observed monthly time series exceeds 0.94 for all models. The strong El Nino events of 1982-83 and 1997-98 are well reproduced for the Pacific coastal region of the U.S. in all models. ENSO signals are less well reproduced in other regions. The models also reproduce extreme monthly precipitation well in coastal California and the Upper Midwest. Model performance tends to deteriorate from west to east across the domain, or roughly from the inflow boundary toward the outflow boundary. 
This deterioration with distance from the inflow boundary is ameliorated to some extent in models formulated such that large-scale information is included in the model solution, whether implemented by spectral nudging or by use of a perturbation form of the governing equations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMIN13B0061Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMIN13B0061Z"><span>An Ensemble Method with Integration of Feature Selection and Classifier Selection to Detect the Landslides</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zhongqin, G.; Chen, Y.</p> <p>2017-12-01</p> <p>Quickly identifying the spatial distribution of landslides automatically is essential for the prevention, mitigation and assessment of landslide hazards. It is still a challenging job owing to the complicated characteristics and vague boundaries of landslide areas in imagery. High-resolution remote sensing images have multiple scales, complex spatial distributions and abundant features; object-oriented image classification methods can make full use of this information and thus effectively detect landslides after a hazard has happened. In this research we present a new semi-supervised workflow, taking advantage of recent object-oriented image analysis and machine learning algorithms to quickly locate landslides of different origins in some areas of southwestern China. Besides a sequence of image segmentation, feature selection, object classification and error testing, this workflow integrates feature selection and classifier selection into an ensemble. 
The features this study utilized were normalized difference vegetation index (NDVI) change, textural features derived from gray-level co-occurrence matrices (GLCM), spectral features, and others. The improvements in this study show that the algorithm significantly removes redundant features and makes full use of the classifiers. All these improvements lead to higher accuracy in determining the shape of landslides in high-resolution remote sensing images, and in particular to greater flexibility across different kinds of landslides.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1429906-internal-variability-dynamically-downscaled-climate-over-north-america','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1429906-internal-variability-dynamically-downscaled-climate-over-north-america"><span>Internal variability of a dynamically downscaled climate over North America</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Wang, Jiali; Bessac, Julie; Kotamarthi, Rao</p> <p></p> <p>This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. 
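The spread-based measure of internal variability can be sketched in Python (the field dimensions and noise level below are hypothetical placeholders, not WRF output):

```python
import numpy as np

def internal_variability(ensemble):
    """Internal variability as the inter-member spread (standard deviation)
    at each time and grid point, averaged over the spatial domain.

    ensemble: array of shape (n_members, n_times, n_lat, n_lon).
    Returns a time series of domain-mean spread.
    """
    spread = np.std(ensemble, axis=0, ddof=1)  # spread across members
    return spread.mean(axis=(1, 2))            # average over the domain

# Hypothetical 16-member temperature ensemble: 12 months on a 4x4 grid (K).
rng = np.random.default_rng(1)
seasonal = 280.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 12))
base = seasonal[None, :, None, None]
ens = base + rng.normal(scale=1.5, size=(16, 12, 4, 4))

iv_series = internal_variability(ens)  # one spread value per month
```

Because every member shares the same seasonal forcing, the spread isolates the divergence that grows from perturbed initial states, which is exactly the quantity the study compares against inter-annual variability and projected changes.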
The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air-temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19950036305&hterms=dependency&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D20%26Ntt%3Ddependency','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19950036305&hterms=dependency&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D20%26Ntt%3Ddependency"><span>A statistical analysis of the dependency of closure assumptions in cumulus parameterization on the horizontal resolution</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Xu, Kuan-Man</p> <p>1994-01-01</p> <p>Simulated data from the UCLA cumulus ensemble model are used to investigate the quasi-universal validity of closure assumptions used in existing cumulus parameterizations. 
A closure assumption is quasi-universally valid if it is sensitive neither to convective cloud regimes nor to horizontal resolutions of large-scale/mesoscale models. The dependency of three types of closure assumptions, as classified by Arakawa and Chen, on the horizontal resolution is addressed in this study. Type I is the constraint on the coupling of the time tendencies of large-scale temperature and water vapor mixing ratio. Type II is the constraint on the coupling of cumulus heating and cumulus drying. Type III is a direct constraint on the intensity of a cumulus ensemble. The macroscopic behavior of simulated cumulus convection is first compared with the observed behavior in view of Type I and Type II closure assumptions using 'quick-look' and canonical correlation analyses. It is found that they are statistically similar to each other. The three types of closure assumptions are further examined with simulated data averaged over selected subdomain sizes ranging from 64 to 512 km. It is found that the dependency of Type I and Type II closure assumptions on the horizontal resolution is very weak and that Type III closure assumption is somewhat dependent upon the horizontal resolution. The influences of convective and mesoscale processes on the closure assumptions are also addressed by comparing the structures of canonical components with the corresponding vertical profiles in the convective and stratiform regions of cumulus ensembles analyzed directly from simulated data. 
The implication of these results for cumulus parameterization is discussed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ClDy...50.4539W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ClDy...50.4539W"><span>Internal variability of a dynamically downscaled climate over North America</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wang, Jiali; Bessac, Julie; Kotamarthi, Rao; Constantinescu, Emil; Drewniak, Beth</p> <p>2018-06-01</p> <p>This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research Forecasting model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied. Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. 
Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air temperature for the mid- and late twenty-first century. However, the IV is larger than the projected changes in precipitation for the mid- and late twenty-first century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMGC53A0865W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMGC53A0865W"><span>Tropical cyclones in a stabilized 1.5 and 2 degree warmer world.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wehner, M. F.; Stone, D. A.; Loring, B.; Krishnan, H.</p> <p>2017-12-01</p> <p>We present an ensemble of very high resolution global climate model simulations of a stabilized 1.5 °C and 2 °C warmer climate as envisioned by the Paris COP21 agreement. The resolution of this global climate model (25 km) permits simulated tropical cyclones up to Category Five on the Saffir-Simpson scale. Projected changes in tropical cyclones are significant: tropical cyclones in the two stabilization scenarios are less frequent but more intense than in simulations of the present climate.
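Classifying simulated storms on the Saffir-Simpson scale mentioned above can be sketched from 1-minute sustained winds using the standard NHC category thresholds in mph (the function name and sample winds are illustrative):

```python
def saffir_simpson_category(wind_mph):
    """Return the Saffir-Simpson category (1-5) from 1-min sustained wind
    in mph, or 0 for storms below hurricane strength."""
    thresholds = [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]
    for lower_bound, category in thresholds:
        if wind_mph >= lower_bound:
            return category
    return 0

print(saffir_simpson_category(160))  # 5
print(saffir_simpson_category(80))   # 1
```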
Output data from these simulations is freely available to all interested parties and should prove a useful resource to those interested in studying the impacts of stabilized global warming.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_10 --> <div id="page_11" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="201"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012AGUFMGC11D1036B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012AGUFMGC11D1036B"><span>Evaluating potentials for future generation off-shore wind-power outside Norway</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Benestad, R.
E.; Haugen, J.; Haakenstad, H.</p> <p>2012-12-01</p> <p>With today's critical need for renewable energy sources, it is natural to look towards wind power. Given Norway's long coastline, there is a large potential for offshore wind farms. Although offshore wind energy installations pose more challenges than wind farms on land, the offshore wind is generally stronger, and wind speeds persist longer in the power-generating classes. In planning offshore wind farms, the wind resources, the wind climatology and possible future changes must be evaluated. For this purpose, we use data from regional climate model runs performed in the European ENSEMBLES project (van der Linden and Mitchell, 2009). In spite of the increased reliability of RCMs in recent years, the simulations still suffer from systematic model errors, so the data have to be corrected before being used in wind resource analyses. To correct the RCM wind speeds, we use wind speeds from a Norwegian high-resolution wind and wave archive, NORA10 (Reistad et al., 2010), to perform quantile mapping (Themessl et al., 2012). The quantile mapping is calibrated individually for each ERA40-driven regional simulation from the ENSEMBLES project against NORA10, and the same calibration is then applied to the corresponding regional climate scenario. The calibration is done for each grid cell in the domain and for each day of the year, using a +/-15 day window centered on that day to build an empirical cumulative distribution function. The quantile mapping of the scenarios provides a new wind speed data set for the future that is more accurate than the raw ENSEMBLES scenarios. References: Reistad, M., Ø. Breivik, H. Haakenstad, O. J. Aarnes, B. R. Furevik and J.-R. Bidlot, 2010, A high-resolution hindcast of wind and waves for the North Sea, the Norwegian Sea and the Barents Sea. J. Geophys. Res., 116, doi:10.1029/2010JC006402.
Themessl, M. J., A. Gobiet and A. Leuprecht, 2012, Empirical-statistical downscaling and error correction of regional climate models and its impact on the climate change signal. Climatic Change, 112, 449-468, doi:10.1007/s10584-011-0224-4. Van der Linden, P. and J. F. B. Mitchell, 2009, ENSEMBLES: Climate Change and its Impacts: Summary and results from the ENSEMBLES project. Met Office Hadley Centre, FitzRoy Road, Exeter EX1 3PB, UK.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013JCli...26.7525B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013JCli...26.7525B"><span>Wave Extremes in the Northeast Atlantic from Ensemble Forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Breivik, Øyvind; Aarnes, Ole Johan; Bidlot, Jean-Raymond; Carrasco, Ana; Saetra, Øyvind</p> <p>2013-10-01</p> <p>A method for estimating return values from ensembles of forecasts at advanced lead times is presented. Return values of significant wave height in the North-East Atlantic, the Norwegian Sea and the North Sea are computed from archived +240-h forecasts of the ECMWF ensemble prediction system (EPS) from 1999 to 2009. We make three assumptions: first, each forecast is representative of a six-hour interval, so that collectively the data set is comparable to a 226-year period; second, the model climate matches the observed distribution, which we confirm by comparison with buoy data; third, the ensemble members are sufficiently uncorrelated to be considered independent realizations of the model climate. We find anomaly correlations of 0.20, but peak events (>P97) are entirely uncorrelated.
By comparing return values from individual members with return values of subsamples of the data set, we also find that the estimates follow the same distribution and appear unaffected by correlations in the ensemble. The annual mean and variance over the 11-year archive period exhibit no significant departures from stationarity compared with a recent reforecast, i.e., there is no spurious trend due to model upgrades. The EPS yields significantly higher return values than ERA-40 and ERA-Interim and is in good agreement with the high-resolution hindcast NORA10, except in the lee of unresolved islands, where the EPS overestimates, and in enclosed seas, where it is biased low. Confidence intervals are half the width of those found for ERA-Interim owing to the size of the data set.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14.2843C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14.2843C"><span>Regional Climate Models Downscaling in the Alpine Area with Multimodel SuperEnsemble</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Cane, D.; Barbarino, S.; Renier, L.; Ronchi, C.</p> <p>2012-04-01</p> <p>The climatic scenarios show a strong warming signal in the Alpine area already by the mid-21st century. The climate simulations, however, even when obtained with Regional Climate Models (RCMs), are affected by strong errors when compared with observations in the control period, owing to the models' difficulty in representing the complex orography of the Alps and to limitations in their physical parametrizations. In this work we use a selection of RCM runs from the ENSEMBLES project, carefully chosen to maximise the variety of driving Global Climate Models and of the RCMs themselves, run under the SRES A1B scenario.
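The Multimodel SuperEnsemble named in the title above combines models using weights fitted by regression against observations over a training period (in the spirit of Krishnamurti-style superensembles; this is a generic two-model sketch with invented data, not the authors' exact implementation):

```python
def superensemble_weights(models, obs):
    """Least-squares weights for a two-model SuperEnsemble.

    Minimizes ||obs' - w1*m1' - w2*m2'|| over training anomalies
    (primes denote departures from the training-period mean),
    solving the 2x2 normal equations directly.
    """
    def anomalies(x):
        mean = sum(x) / len(x)
        return [v - mean for v in x]

    m1, m2 = (anomalies(m) for m in models)
    o = anomalies(obs)
    a11 = sum(v * v for v in m1)
    a22 = sum(v * v for v in m2)
    a12 = sum(u * v for u, v in zip(m1, m2))
    b1 = sum(u * v for u, v in zip(m1, o))
    b2 = sum(u * v for u, v in zip(m2, o))
    det = a11 * a22 - a12 * a12
    return ((b1 * a22 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Toy training data: obs anomalies are 0.7*model1 + 0.3*model2 by construction,
# so the fitted weights should recover 0.7 and 0.3.
model1 = [0.0, 1.0, 2.0, 3.0, 4.0]
model2 = [1.0, 0.0, 2.0, 1.0, 3.0]
obs = [0.7 * a + 0.3 * b for a, b in zip(model1, model2)]
w1, w2 = superensemble_weights([model1, model2], obs)
print(round(w1, 6), round(w2, 6))  # 0.7 0.3
```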
The reference observations for the Greater Alpine Area are extracted from the European E-OBS dataset produced by the ENSEMBLES project, available at a resolution of 25 km. For the study area of Piemonte, daily temperature and precipitation observations (1957-present) were carefully gridded onto a 14-km grid over the Piemonte Region with an Optimal Interpolation technique. We applied the Multimodel SuperEnsemble technique to the temperature fields, reducing the large biases of the RCM temperature fields relative to observations in the control period. We also propose the first application to RCMs of a new probabilistic Multimodel SuperEnsemble Dressing technique for estimating precipitation fields, already applied successfully to weather forecast models, with a careful description of the precipitation probability density functions conditioned on the model outputs. This technique reduces the strong precipitation overestimation by RCMs over the Alpine chain and reproduces the monthly behaviour of observed precipitation in the control period far better than the direct model outputs.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20140017716','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20140017716"><span>Background Error Covariance Estimation using Information from a Single Model Trajectory with Application to Ocean Data Assimilation into the GEOS-5 Coupled Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Keppenne, Christian L.; Rienecker, Michele M.; Kovach, Robin M.; Vernieres, Guillaume; Koster, Randal D. (Editor)</p> <p>2014-01-01</p> <p>An attractive property of ensemble data assimilation methods is that they provide flow-dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables.
Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. 
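The FAST idea described above, sampling an ensemble from a moving window along a single model trajectory, can be sketched as follows; the scalar states and window length are invented for illustration:

```python
def fast_ensemble(trajectory, window):
    """Form an ensemble of states from a moving window along one trajectory.

    trajectory: list of model states (scalars here for simplicity) at
    successive times; the most recent `window` states act as ensemble members.
    """
    if len(trajectory) < window:
        raise ValueError("trajectory shorter than sampling window")
    return trajectory[-window:]

def anomaly_variance(ensemble):
    """Background error variance estimated from the ensemble anomalies."""
    mean = sum(ensemble) / len(ensemble)
    return sum((x - mean) ** 2 for x in ensemble) / (len(ensemble) - 1)

# A toy scalar trajectory; the last 4 states form the FAST ensemble.
traj = [20.0, 20.5, 21.0, 20.8, 21.4, 21.2]
ens = fast_ensemble(traj, window=4)
print(ens)                    # [21.0, 20.8, 21.4, 21.2]
print(anomaly_variance(ens))  # flow-dependent variance from one integration
```

The appeal, as the abstract notes, is that this covariance information comes from a single integration rather than from running many model trajectories.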
Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26618792','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26618792"><span>Structural Insights into the Calcium-Mediated Allosteric Transition in the C-Terminal Domain of Calmodulin from Nuclear Magnetic Resonance Measurements.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kukic, Predrag; Lundström, Patrik; Camilloni, Carlo; Evenäs, Johan; Akke, Mikael; Vendruscolo, Michele</p> <p>2016-01-12</p> <p>Calmodulin is a two-domain signaling protein that becomes activated upon binding cooperatively two pairs of calcium ions, leading to large-scale conformational changes that expose its binding site. Despite significant advances in understanding the structural biology of calmodulin functions, the mechanistic details of the conformational transition between closed and open states have remained unclear. To investigate this transition, we used a combination of molecular dynamics simulations and nuclear magnetic resonance (NMR) experiments on the Ca(2+)-saturated E140Q C-terminal domain variant. Using chemical shift restraints in replica-averaged metadynamics simulations, we obtained a high-resolution structural ensemble consisting of two conformational states and validated such an ensemble against three independent experimental data sets, namely, interproton nuclear Overhauser enhancements, (15)N order parameters, and chemical shift differences between the exchanging states.
Through a detailed analysis of this structural ensemble and of the corresponding statistical weights, we characterized a calcium-mediated conformational transition whereby the coordination of Ca(2+) by just one oxygen of the bidentate ligand E140 triggers a concerted movement of the two EF-hands that exposes the target binding site. This analysis provides atomistic insights into a possible Ca(2+)-mediated activation mechanism of calmodulin that cannot be achieved from static structures alone or from ensemble NMR measurements of the transition between conformations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018NatSD...580052B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018NatSD...580052B"><span>FLO1K, global maps of mean, maximum and minimum annual streamflow at 1 km resolution from 1960 through 2015</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Barbarossa, Valerio; Huijbregts, Mark A. J.; Beusen, Arthur H. W.; Beck, Hylke E.; King, Henry; Schipper, Aafke M.</p> <p>2018-03-01</p> <p>Streamflow data are highly relevant for a variety of socio-economic and ecological analyses and applications, but a high-resolution global streamflow dataset has so far been lacking. We created FLO1K, a consistent streamflow dataset at a resolution of 30 arc seconds (~1 km) with global coverage. FLO1K comprises mean, maximum and minimum annual flow for each year in the period 1960-2015, provided as spatially continuous gridded layers. We mapped streamflow by means of artificial neural network (ANN) regression. An ensemble of ANNs was fitted to monthly streamflow observations from 6600 monitoring stations worldwide; minimum and maximum annual flows thus represent the lowest and highest mean monthly flows for a given year.
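The FLO1K convention just stated (annual maximum/minimum flow as the highest/lowest mean monthly flow of the year) can be sketched as:

```python
def annual_flow_metrics(monthly_flows):
    """Mean, maximum and minimum annual flow from 12 mean monthly flows,
    following the convention that the annual max/min are the highest/lowest
    mean monthly flow of that year."""
    if len(monthly_flows) != 12:
        raise ValueError("expected 12 monthly values")
    return (sum(monthly_flows) / 12.0,
            max(monthly_flows),
            min(monthly_flows))

# Toy year of mean monthly discharge (m^3/s); values are invented.
months = [5.0, 6.0, 9.0, 14.0, 12.0, 8.0, 4.0, 3.0, 3.5, 6.5, 7.0, 6.0]
mean_q, max_q, min_q = annual_flow_metrics(months)
print(mean_q, max_q, min_q)  # 7.0 14.0 3.0
```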
As covariates we used upstream-catchment physiography (area, surface slope, elevation) and year-specific climatic variables (precipitation, temperature, potential evapotranspiration, aridity index and seasonality indices). Comparing the maps with independent data indicated good agreement (R2 values up to 91%). FLO1K delivers essential data for freshwater ecology and water resources analyses at a global scale, yet at high spatial resolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A53C2258P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A53C2258P"><span>North Atlantic Tropical Cyclones: historical simulations and future changes with the new high-resolution Arpege AGCM.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pilon, R.; Chauvin, F.; Palany, P.; Belmadani, A.</p> <p>2017-12-01</p> <p>A new version of the variable high-resolution Meteo-France Arpege atmospheric general circulation model (AGCM) has been developed for tropical cyclone (TC) studies, with a focus on the North Atlantic basin, where the model horizontal resolution is 15 km. Ensemble historical AMIP (Atmospheric Model Intercomparison Project)-type simulations (1965-2014) and future projections (2020-2080) under the IPCC (Intergovernmental Panel on Climate Change) representative concentration pathway (RCP) 8.5 scenario have been produced. A tracking algorithm for TC-like vortices is used to investigate TC activity and variability. TC frequency, genesis, geographical distribution and intensity are examined. Historical simulations are compared to best-track and reanalysis datasets. Model TC frequency is generally realistic but tends to be too high during the first decade of the historical simulations.
Biases appear to originate from both the tracking algorithm and the model climatology. Nevertheless, the model simulates intense TCs, corresponding to category 5 hurricanes, extremely well in the North Atlantic, where the grid resolution is highest. The interaction between developing TCs and vertical wind shear is shown to be a contributing factor in TC variability. Future changes in TC activity and properties are also discussed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1919576R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1919576R"><span>Challenges in the development of very high resolution Earth System Models for climate science</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rasch, Philip J.; Xie, Shaocheng; Ma, Po-Lun; Lin, Wuyin; Wan, Hui; Qian, Yun</p> <p>2017-04-01</p> <p>The authors represent the 20+ members of the ACME atmosphere development team. The US Department of Energy (DOE) has, like many other organizations around the world, identified the need for an Earth System Model capable of rapidly completing decade- to century-length simulations at very high (vertical and horizontal) resolution with good climate fidelity. Two years ago DOE initiated a multi-institution effort called ACME (Accelerated Climate Modeling for Energy) to meet this extraordinary challenge, targeting a model eventually capable of running at 10-25 km horizontal and 20-400 m vertical resolution through the troposphere on exascale computational platforms, at speeds sufficient to complete 5+ simulated years per day.
I will outline the challenges our team has encountered in developing the atmosphere component of this model, and the strategies we have been using for tuning and debugging a model that we can barely afford to run on today's computational platforms. These strategies include: 1) evaluation at lower resolutions; 2) ensembles of short simulations to explore parameter space and perform rough tuning and evaluation; 3) use of regionally refined versions of the model to probe high-resolution model behavior at lower expense; 4) use of "auto-tuning" methodologies for model tuning; and 5) brute-force long climate simulations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMNH31A1887K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMNH31A1887K"><span>Attribution of the 1995 and 2006 storm surge events in the southern Baltic Sea</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Klehmet, K.; Rockel, B.; von Storch, H.</p> <p>2016-12-01</p> <p>In November 1995 and 2006, the German Baltic Sea coast experienced severe storm surge conditions. Exceptional water level heights of about 1.8 m above mean sea level were measured at German tide gauges. Extreme event attribution poses unique challenges in trying to distinguish the role of anthropogenic influences, such as greenhouse gas emissions or land-use changes, from natural variability. This study, which is part of the EUCLEIA project (EUropean CLimate and weather Events: Interpretation and Attribution, www.eucleia.eu), aims to estimate how anthropogenic drivers have altered the probability of single extreme events such as the 1995 and 2006 storm surges.
We explore these aspects using two 7-member ensembles of the Hadley Centre Global Environmental Model version 3-A (HadGEM3-A), the atmosphere-only component of HadGEM3, provided by the Met Office Hadley Centre. The HadGEM3-A ensemble consists of two multi-decadal experiments covering 1960-2013: one with both anthropogenic and natural forcings, representing the actual climate, and a second with only natural forcings, representing the natural climate. These two 7-member ensembles, at about 60 km spatial resolution, are used as atmospheric forcing data to drive the regional ocean model TRIM-NP, which calculates water levels in the Baltic Sea at 12.8 km spatial resolution. Findings indicate some limitations of the regional model ensemble in reproducing the magnitude of extreme water levels. It is tested whether increased spatial resolution of the atmospheric forcing fields can improve the representation of extreme Baltic Sea water levels along the coast and thus add value to the attribution analysis.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009EGUGA..11..812W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009EGUGA..11..812W"><span>Evaluation of high intensity precipitation from 16 Regional climate models over a meso-scale catchment in the Midlands Regions of England</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wetterhall, F.; He, Y.; Cloke, H.; Pappenberger, F.; Freer, J.; Wilson, M.; McGregor, G.</p> <p>2009-04-01</p> <p>Local flooding events are often triggered by high-intensity rainfall events, and it is important that these can be correctly modelled by Regional Climate Models (RCMs) if the results are to be used in climate impact assessment.
In this study, daily precipitation from 16 RCMs was compared with observations over a meso-scale catchment in the Midlands Region of England. The RCM data were provided by the European research project ENSEMBLES and the precipitation observations by the UK Met Office. The RCMs were all driven by reanalysis data from the ERA40 dataset over the period 1961-2000. The ENSEMBLES data are on a spatial scale of 25 x 25 km; they were disaggregated onto a 5 x 5 km grid over the catchment and compared with observational data interpolated to the same resolution. The ENSEMBLES data generally underestimated mean precipitation, and underestimated the maximum and the persistence of high-intensity rainfall even more strongly. The inter-annual variability was not fully captured by the RCMs, and precipitation was systematically underestimated during the autumn months. The spatial pattern of the modelled precipitation was too smooth in comparison with the observed data, especially at the high altitudes in the western part of the catchment, where the heaviest precipitation usually occurs. The RCM outputs cannot reproduce the high-intensity precipitation events that must be captured to model extreme flood events adequately.
The results highlight the discrepancy between climate model output and the high-intensity precipitation input needed for hydrological impact modelling.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018AdSR...15...39L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018AdSR...15...39L"><span>Short-range solar radiation forecasts over Sweden</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Landelius, Tomas; Lindskog, Magnus; Körnich, Heiner; Andersson, Sandra</p> <p>2018-04-01</p> <p>In this article the performance of short-range solar radiation forecasts from the global deterministic and ensemble models of the European Centre for Medium-Range Weather Forecasts (ECMWF) is compared with that of an ensemble of the regional mesoscale model HARMONIE-AROME used by the national meteorological services in Sweden, Norway and Finland. Note, however, that only the control members and the ensemble means are included in the comparison. The models' resolutions differ considerably: 18 km for the ECMWF ensemble, 9 km for the ECMWF deterministic model, and 2.5 km for the HARMONIE-AROME ensemble. The models share the same radiation code. It turns out that they all systematically underestimate the Direct Normal Irradiance (DNI) under clear-sky conditions. Apart from this shortcoming, the HARMONIE-AROME ensemble shows the best agreement with the distribution of observed Global Horizontal Irradiance (GHI) and DNI values. During mid-day the HARMONIE-AROME ensemble mean performs best. The control member of the HARMONIE-AROME ensemble also scores better than the global deterministic ECMWF model. This is an interesting result, since mesoscale models have so far not shown good results when compared to the ECMWF models.
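Ensemble output like that described above supports simple probability statements; a minimal sketch of an exceedance probability taken across members, with invented irradiance values:

```python
def exceedance_probability(members, threshold):
    """Fraction of ensemble members at or above a threshold, used as a
    simple probability forecast."""
    return sum(1 for m in members if m >= threshold) / len(members)

# Toy ensemble of forecast global horizontal irradiance (W/m^2) at one hour.
ghi_members = [420.0, 510.0, 380.0, 605.0, 550.0, 470.0, 300.0, 520.0]
p = exceedance_probability(ghi_members, 500.0)
print(p)  # 0.5 -- half the members forecast at least 500 W/m^2
```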
Three days with clear, mixed and cloudy skies are used to illustrate the possible added value of a probabilistic forecast. It is shown that in these cases the mesoscale ensemble could provide decision support to a grid operator in the form of forecasts of both the amount of solar power and its probabilities.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010AGUFM.A23F..03A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010AGUFM.A23F..03A"><span>Ensemble Downscaling of Winter Seasonal Forecasts: The MRED Project</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Arritt, R. W.; Mred Team</p> <p>2010-12-01</p> <p>The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional project that is producing large ensembles of downscaled winter seasonal forecasts from coupled atmosphere-ocean seasonal prediction models. Eight regional climate models are each downscaling 15-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the new NASA seasonal forecast system based on the GEOS5 atmospheric model coupled with the MOM4 ocean model. This produces 240-member ensembles (8 regional models x 15 global ensemble members x 2 global models) for each winter season (December-April) of 1982-2003. Results to date show that the combined global-regional downscaled forecasts have the greatest skill for seasonal precipitation anomalies during strong El Niño events such as 1982-83 and 1997-98. Ensemble means of area-averaged seasonal precipitation for the regional models generally track the corresponding results for the global model, though there is considerable inter-model variability amongst the regional models.
For seasons and regions where area mean precipitation is accurately simulated, the regional models bring added value by extracting greater spatial detail from the global forecasts, mainly due to better resolution of terrain in the regional models. Our results also emphasize that an ensemble approach is essential to realizing the added value from the combined global-regional modeling system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1913580D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1913580D"><span>Verification of forecast ensembles in complex terrain including observation uncertainty</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dorninger, Manfred; Kloiber, Simon</p> <p>2017-04-01</p> <p>Traditionally, verification means to verify a forecast (ensemble) against the truth represented by observations. Observation errors are quite often neglected, on the argument that they are small compared to the forecast error. In this study, as part of the MesoVICT (Mesoscale Verification Inter-comparison over Complex Terrain) project, it will be shown that observation errors have to be taken into account for verification purposes. The observation uncertainty is estimated from the VERA (Vienna Enhanced Resolution Analysis) and represented via two analysis ensembles which are compared to the forecast ensemble. For the whole study, results from COSMO-LEPS provided by Arpae-SIMC Emilia-Romagna are used as the forecast ensemble. The time period covers the MesoVICT core case from 20-22 June 2007. In a first step, all ensembles are investigated concerning their distributions. Several tests were applied (Kolmogorov-Smirnov, Finkelstein-Schafer, chi-square, etc.), none of which identified an exact theoretical distribution.
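A distribution check of the kind just listed can be sketched with a hand-rolled one-sample Kolmogorov-Smirnov statistic against a fitted normal. The sample below is synthetic and merely stands in for ensemble member values; nothing here comes from the study's data:

```python
import math
import random

def normal_cdf(x, mu, sigma):
    """CDF of N(mu, sigma^2) via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def ks_statistic(sample, mu, sigma):
    """One-sample Kolmogorov-Smirnov distance to a fitted normal:
    max gap between the empirical CDF and the reference CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = normal_cdf(x, mu, sigma)
        d = max(d, abs(i / n - f), abs(f - (i - 1) / n))
    return d

# Synthetic stand-in for one ensemble's values at a station.
random.seed(1)
ens = [random.gauss(0.0, 1.0) for _ in range(200)]
mu = sum(ens) / len(ens)
sigma = (sum((v - mu) ** 2 for v in ens) / (len(ens) - 1)) ** 0.5
print(round(ks_statistic(ens, mu, sigma), 3))
```

A large distance relative to the usual critical values rejects the fitted distribution; the study reports that no tested parametric form fit exactly.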
The main focus is therefore on non-parametric statistics (e.g., kernel density estimation, boxplots) and on the deviation between "forced" normally distributed data and the kernel density estimates. In a next step, the observational deviations due to the analysis ensembles are analysed. In a first approach, scores are calculated multiple times, with each member of the analysis ensemble in turn regarded as the "true" observation. The results are presented as boxplots for the different scores and parameters. Additionally, bootstrapping is applied to the ensembles. These possible approaches to incorporating observational uncertainty into the computation of statistics will be discussed in the talk.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1413834-high-resolution-dynamical-downscaling-ensemble-projections-future-extreme-temperature-distributions-united-states','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1413834-high-resolution-dynamical-downscaling-ensemble-projections-future-extreme-temperature-distributions-united-states"><span>High-Resolution Dynamical Downscaling Ensemble Projections of Future Extreme Temperature Distributions for the United States</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.</p> <p></p> <p>The aim of this study is to examine projections of extreme temperatures over the continental United States (CONUS) for the 21st century using an ensemble of high spatial resolution dynamically downscaled model simulations with different boundary conditions.
The downscaling uses the Weather Research and Forecast model at a spatial resolution of 12 km along with outputs from three different Coupled Model Intercomparison Project Phase 5 global climate models that provide boundary conditions under two different future greenhouse gas (GHG) concentration trajectories. The results from two decadal-length time slices (2045–2054 and 2085–2094) are compared with a historical decade (1995–2004). Probability density functions of daily maximum/minimum temperatures are analyzed over seven climatologically cohesive regions of the CONUS. The impacts of different boundary conditions as well as future GHG concentrations on extreme events such as heat waves and days with temperature higher than 95°F are also investigated. The results show that the intensity of extreme warm temperature in future summer is significantly increased, while the frequency of extreme cold temperature in future winter decreases. The distribution of summer daily maximum temperature experiences a significant warm-side shift and increased variability, while the distribution of winter daily minimum temperature is projected to have a less significant warm-side shift with decreased variability.
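A warm-side shift with changed variability of the kind described above can be diagnosed by comparing sample quantiles between a historical and a future decade. A toy sketch with synthetic daily-maximum temperatures (the distributions are illustrative assumptions, not model output):

```python
import random

def quantile(values, q):
    """Linear-interpolation sample quantile (0 <= q <= 1)."""
    xs = sorted(values)
    pos = q * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (pos - lo)

random.seed(2)
# Hypothetical summer Tmax samples (degC): the "future" decade is shifted
# warm with a wider spread, mimicking the qualitative result above.
hist = [random.gauss(30.0, 3.0) for _ in range(1000)]
futr = [random.gauss(33.0, 4.0) for _ in range(1000)]

# With increased variability, the upper tail moves more than the median.
shift_p50 = quantile(futr, 0.50) - quantile(hist, 0.50)
shift_p95 = quantile(futr, 0.95) - quantile(hist, 0.95)
print(round(shift_p50, 2), round(shift_p95, 2))
```

Comparing several quantiles rather than only the mean is what distinguishes a pure shift from a shift plus a change in spread.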
Finally, using the "business-as-usual" scenario, 5-day heat waves are projected to occur at least 5–10 times per year in most of the CONUS and ≥ 95°F days will increase by 1–2 months by the end of the century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27539825','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27539825"><span>Using an ensemble of regional climate models to assess climate change impacts on water scarcity in European river basins.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Gampe, David; Nikulin, Grigory; Ludwig, Ralf</p> <p>2016-12-15</p> <p>Climate change will likely increase pressure on the water balances of Mediterranean basins due to decreasing precipitation and rising temperatures. To overcome the issue of data scarcity, the hydrologically relevant variables total runoff, surface evaporation, precipitation and air temperature are taken from climate model simulations. The ensemble applied in this study consists of 22 simulations, derived from different combinations of four General Circulation Models (GCMs) forcing different Regional Climate Models (RCMs) and two Representative Concentration Pathways (RCPs) at ~12 km horizontal resolution provided through the EURO-CORDEX initiative. Four river basins (Adige, Ebro, Evrotas and Sava) are selected and climate change signals for the future period 2035-2065 as compared to the reference period 1981-2010 are investigated. Decreased runoff and evaporation indicate increased water scarcity over the Ebro and the Evrotas, as well as the southern parts of the Adige and the Sava, resulting from a temperature increase of 1-3° and a precipitation decrease of up to 30%.
The most severe changes are projected for the summer months, indicating further pressure on river basins already at least partly characterized by flow intermittency. The widely used Falkenmark indicator is presented; it confirms this tendency and shows the necessity of spatially distributed analysis and high-resolution projections. Related uncertainties are addressed by means of a variance decomposition and model agreement to determine the robustness of the projections. The study highlights the importance of high-resolution climate projections and represents a feasible approach to assess climate impacts on water scarcity even in regions that suffer from data scarcity. Copyright © 2016. Published by Elsevier B.V.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1413834-high-resolution-dynamical-downscaling-ensemble-projections-future-extreme-temperature-distributions-united-states','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1413834-high-resolution-dynamical-downscaling-ensemble-projections-future-extreme-temperature-distributions-united-states"><span>High-Resolution Dynamical Downscaling Ensemble Projections of Future Extreme Temperature Distributions for the United States</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.; ...</p> <p>2017-11-20</p> <p>The aim of this study is to examine projections of extreme temperatures over the continental United States (CONUS) for the 21st century using an ensemble of high spatial resolution dynamically downscaled model simulations with different boundary conditions.
The downscaling uses the Weather Research and Forecast model at a spatial resolution of 12 km along with outputs from three different Coupled Model Intercomparison Project Phase 5 global climate models that provide boundary conditions under two different future greenhouse gas (GHG) concentration trajectories. The results from two decadal-length time slices (2045–2054 and 2085–2094) are compared with a historical decade (1995–2004). Probability density functions of daily maximum/minimum temperatures are analyzed over seven climatologically cohesive regions of the CONUS. The impacts of different boundary conditions as well as future GHG concentrations on extreme events such as heat waves and days with temperature higher than 95°F are also investigated. The results show that the intensity of extreme warm temperature in future summer is significantly increased, while the frequency of extreme cold temperature in future winter decreases. The distribution of summer daily maximum temperature experiences a significant warm-side shift and increased variability, while the distribution of winter daily minimum temperature is projected to have a less significant warm-side shift with decreased variability.
Finally, using the "business-as-usual" scenario, 5-day heat waves are projected to occur at least 5–10 times per year in most of the CONUS and ≥ 95°F days will increase by 1–2 months by the end of the century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29784920','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29784920"><span>Video-rate volumetric neuronal imaging using 3D targeted illumination.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Xiao, Sheng; Tseng, Hua-An; Gritton, Howard; Han, Xue; Mertz, Jerome</p> <p>2018-05-21</p> <p>Fast volumetric microscopy is required to monitor large-scale neural ensembles with high spatio-temporal resolution. Widefield fluorescence microscopy can image large 2D fields of view at high resolution and speed while remaining simple and cost-effective. A focal sweep add-on can further extend the capacity of widefield microscopy by enabling extended-depth-of-field (EDOF) imaging, but suffers from an inability to reject out-of-focus fluorescence background. Here, by using a digital micromirror device to target only in-focus sample features, we perform EDOF imaging with greatly enhanced contrast and signal-to-noise ratio, while reducing the light dosage delivered to the sample. Image quality is further improved by the application of a robust deconvolution algorithm.
We demonstrate the advantages of our technique for in vivo calcium imaging in the mouse brain.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMPA11B0219W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMPA11B0219W"><span>High-Resolution Hydrological Sub-Seasonal Forecasting for Water Resources Management Over Europe</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wood, E. F.; Wanders, N.; Pan, M.; Sheffield, J.; Samaniego, L. E.; Thober, S.; Kumar, R.; Prudhomme, C.; Houghton-Carr, H.</p> <p>2017-12-01</p> <p>For decision-making at the sub-seasonal and seasonal time scale, hydrological forecasts with a high temporal and spatial resolution are required by water managers. So far such forecasts have been unavailable due to 1) lack of availability of meteorological seasonal forecasts, 2) coarse temporal resolution of meteorological seasonal forecasts, requiring temporal downscaling, 3) lack of consistency between observations and seasonal forecasts, requiring bias-correction. The EDgE (End-to-end Demonstrator for improved decision making in the water sector in Europe) project commissioned by the ECMWF (C3S) created a unique dataset of hydrological seasonal forecasts derived from four global climate models (CanCM4, FLOR-B01, ECMF, LFPW) in combination with four global hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), resulting in 208 forecasts for any given day. The forecasts provide a daily temporal and 5-km spatial resolution, and are bias corrected against E-OBS meteorological observations. The forecasts are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs), created in collaboration with the end-user community of the EDgE project (e.g. 
the percentage of ensemble realizations above the 10th percentile of monthly river flow, or below the 90th). Results show skillful forecasts for discharge from 3 to 6 months (the latter for northern Europe due to snow); for soil moisture, up to three months, due to precipitation forecast skill and short initial-condition memory; and for groundwater, beyond 6 months (lowest skill in western Europe). The SCIIs are effective in communicating both forecast skill and uncertainty. Overall, the new system provides an unprecedented ensemble for seasonal forecasts with significant skill over Europe to support water management. The consistency in both the GCM forecasts and the LSM parameterization ensures a stable and reliable forecast framework and methodology, even if additional GCMs or LSMs are added in the future.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.5836T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.5836T"><span>New Aspects of Probabilistic Forecast Verification Using Information Theory</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tödter, Julian; Ahrens, Bodo</p> <p>2013-04-01</p> <p>This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are briefly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts.
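For an ensemble forecast of a scalar, the CRPS can be evaluated exactly from the members via the standard estimator CRPS = mean|x_i - y| - 1/2 mean|x_i - x_j|, without fitting any distribution. A minimal sketch (the member values are illustrative):

```python
def crps_ensemble(members, obs):
    """CRPS of an ensemble forecast for a scalar observation, using the
    identity CRPS = E|X - y| - 0.5 * E|X - X'| over ensemble members."""
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (m * m)
    return term1 - 0.5 * term2

# A sharp, well-centred ensemble scores lower (better) than a biased one.
print(crps_ensemble([0.9, 1.0, 1.1], 1.0) < crps_ensemble([1.9, 2.0, 2.1], 1.0))
```

Lower is better; a perfect deterministic forecast attains a CRPS of zero.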
Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_11 --> <div id="page_12" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="221"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUFMNG21A1453K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUFMNG21A1453K"><span>Mesoscale data assimilation for a local severe rainfall event with the NHM-LETKF system</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kunii, M.</p> <p>2013-12-01</p> <p>This study aims to improve forecasts of local severe weather events through data assimilation and ensemble forecasting approaches. Here, the local ensemble transform Kalman filter (LETKF) is implemented with the Japan Meteorological Agency's nonhydrostatic model (NHM). The newly developed NHM-LETKF contains an adaptive inflation scheme and a spatial covariance localization scheme with physical distance. One-way nested analysis, in which a finer-resolution LETKF is conducted using the outputs of an outer model, also becomes feasible. These new components should enhance the potential of the LETKF for convective-scale events. The NHM-LETKF is applied to a local severe rainfall event in Japan in 2012. Comparison of the root mean square errors between the model first guess and the analysis reveals that the system assimilates observations appropriately. Analysis ensemble spreads indicate a significant increase around the time torrential rainfall occurred, which would imply an increase in the uncertainty of environmental fields.
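Covariance inflation of the kind mentioned above is commonly multiplicative: member perturbations are rescaled about the ensemble mean to counteract spread underestimation. A toy sketch for a single state variable (member values and inflation factor are illustrative assumptions):

```python
import random

def ensemble_stats(members):
    """Mean and spread (sample standard deviation) of an ensemble."""
    n = len(members)
    mean = sum(members) / n
    spread = (sum((x - mean) ** 2 for x in members) / (n - 1)) ** 0.5
    return mean, spread

def inflate(members, rho):
    """Multiplicative inflation: scale perturbations about the mean by rho,
    which scales the spread by rho and the covariance by rho**2."""
    mean = sum(members) / len(members)
    return [mean + rho * (x - mean) for x in members]

random.seed(3)
ensemble = [random.gauss(5.0, 0.8) for _ in range(20)]   # hypothetical members
_, spread = ensemble_stats(ensemble)
_, spread_inflated = ensemble_stats(inflate(ensemble, 1.1))
print(round(spread_inflated / spread, 3))
```

An adaptive scheme would choose the factor rho from innovation statistics rather than fixing it at 1.1 as here.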
Forecasts initialized with LETKF analyses successfully capture the intense rainfall, suggesting that the system can work effectively for local severe weather. Investigation of probabilistic forecasts by ensemble forecasting indicates that this could become a reliable data source for decision making in the future. A one-way nested data assimilation scheme is also tested. The experimental results demonstrate that assimilation with a finer-resolution model provides an advantage in the quantitative precipitation forecasting of local severe weather conditions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010cosp...38.1822B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010cosp...38.1822B"><span>High-cadence observations of CME initiation and plasma dynamics in the corona with TESIS on board CORONAS-Photon</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bogachev, Sergey; Kuzin, Sergey; Zhitnik, I. A.; Bugaenko, O. I.; Goncharov, A. L.; Ignatyev, A. P.; Krutov, V. V.; Lomkova, V. M.; Mitrofanov, A. V.; Nasonkina, T. P.; Oparin, S. N.; Petzov, A. A.; Shestov, S. V.; Slemzin, V. A.; Soloviev, V. A.; Suhodrev, N. K.; Shergina, T. A.</p> <p></p> <p>TESIS is an ensemble of space instruments designed at the Lebedev Institute of the Russian Academy of Sciences for spectroscopic and imaging investigation of the Sun in the EUV and soft X-ray spectral ranges with high spatial, temporal and spectral resolution. Since January 2009, when TESIS was launched onboard the Coronas-Photon satellite, it has provided about 200 000 new images and spectra of the Sun, obtained during one of the deepest solar minima of the last century.
Because of the wide field of view (4 solar radii) and high sensitivity, TESIS provided high-quality data on the origin and dynamics of eruptive prominences and CMEs in the low and intermediate solar corona. TESIS is also the first EUV instrument to provide high-cadence observations of coronal bright points and solar spicules, with a temporal resolution of a few seconds. We present the first results of TESIS observations and discuss them from a scientific point of view.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMNG33A0187P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMNG33A0187P"><span>Applications of Machine Learning to Downscaling and Verification</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Prudden, R.</p> <p>2017-12-01</p> <p>Downscaling, sometimes known as super-resolution, means converting model data into a more detailed local forecast. It is a problem which could be highly amenable to machine learning approaches, provided that sufficient historical forecast data and observations are available. It is also closely linked to the subject of verification, since improving a forecast requires a way to measure that improvement.
This talk will describe some early work towards downscaling Met Office ensemble forecasts, and discuss how the output may be usefully evaluated.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA598087','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA598087"><span>Achieving Superior Tropical Cyclone Intensity Forecasts by Improving the Assimilation of High-Resolution Satellite Data into Mesoscale Prediction Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2013-09-30</p> <p>using polar orbit microwave and infrared sounder measurements from the Global Telecommunication System (GTS). The SDAT system was developed as a...WRF/GSI initial conditions and WRF boundary conditions. • WRF system to do short-range forecasts (6 hours) to provide the background fields for GSI...UCAR is related to a NASA GNSS proposal: “Improving Tropical Prediction and Analysis using COSMIC Radio Occultation Observations and an Ensemble Data</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..19.6626V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..19.6626V"><span>Upscaling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vandenbulcke, Luc; Barth, Alexander</p> <p>2017-04-01</p> <p>In the present European operational oceanography context, global and basin-scale models are run daily at different Monitoring and Forecasting Centers from the Copernicus Marine component (CMEMS). Regional forecasting centers, which run outside of CMEMS, then use these forecasts as initial conditions and/or boundary conditions for high-resolution or coastal forecasts. 
However, these improved simulations are lost to the basin-scale models (i.e. there is no feedback). Therefore, some potential improvements inside (and even outside) the areas covered by regional models are lost, and the risk of discrepancy between the basin-scale and regional models remains high. The objective of this study is to simulate two-way nesting by extracting pseudo-observations from the regional models and assimilating them in the basin-scale models. The proposed method is called "upscaling". An ensemble of 100 one-way nested NEMO models of the Mediterranean Sea (Med) (1/16°) and the North-Western Med (1/80°) is implemented to simulate the period 2014-2015. Each member has perturbed initial conditions, atmospheric forcing fields and river discharge data. The Med model uses climatological Rhone river data, while the nested model uses measured daily discharges. The error of the pseudo-observations can be estimated by analyzing the ensemble of nested models. The pseudo-observations are then assimilated in the parent model by means of an Ensemble Kalman Filter. The experiments show that the proposed method improves different processes in the Med model, such as the position of the Northern Current and its incursion (or not) into the Gulf of Lions, the cold water mass on the shelf, and the position of the Rhone river plume. Regarding areas where no operational regional models exist, (some variables of) the parent model can still be improved by relating some resolved parameters to statistical properties of a higher-resolution simulation.
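An Ensemble Kalman Filter analysis step of the kind invoked above can be sketched for a single, directly observed scalar state variable. This is a generic stochastic-EnKF toy, not the NEMO system's implementation; state values, the pseudo-observation, and error levels are all hypothetical:

```python
import random

def enkf_update(ensemble, y_obs, obs_err_sd, rng):
    """Stochastic EnKF analysis step for a directly observed scalar state.

    Kalman gain K = P / (P + R); each member assimilates a perturbed
    observation so the analysis ensemble keeps a consistent spread.
    """
    n = len(ensemble)
    mean = sum(ensemble) / n
    p = sum((x - mean) ** 2 for x in ensemble) / (n - 1)   # forecast variance
    r = obs_err_sd ** 2                                    # obs-error variance
    k = p / (p + r)                                        # Kalman gain
    return [x + k * (y_obs + rng.gauss(0.0, obs_err_sd) - x) for x in ensemble]

rng = random.Random(4)
prior = [rng.gauss(10.0, 2.0) for _ in range(50)]          # coarse-model ensemble
posterior = enkf_update(prior, 12.0, 0.5, rng)             # assimilate pseudo-obs

prior_mean = sum(prior) / len(prior)
post_mean = sum(posterior) / len(posterior)
# The analysis mean moves from the prior toward the accurate pseudo-observation.
print(round(prior_mean, 2), round(post_mean, 2))
```

With a vector state and an observation operator, the scalar gain becomes a matrix and localization limits spurious long-range covariances, as in the study's basin-scale setting.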
This is the topic of a complementary study also presented at the EGU 2017 (Barth et al).</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20120014478','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20120014478"><span>Comparing Physics Scheme Performance for a Lake Effect Snowfall Event in Northern Lower Michigan</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Molthan, Andrew; Arnott, Justin M.</p> <p>2012-01-01</p> <p>High resolution forecast models, such as those used to predict severe convective storms, can also be applied to predictions of lake effect snowfall. A high-resolution WRF forecast model is provided to support operations at NWS WFO Gaylord, Michigan, using a 12-km and 4-km nested configuration. This is comparable to the simulations performed by other NWS WFOs adjacent to the Great Lakes, including offices in the NWS Eastern Region that participate in regional ensemble efforts. Ensemble efforts require diversity in initial conditions and physics configurations to emulate the plausible range of events in order to ascertain the likelihood of different forecast scenarios. In addition to providing probabilistic guidance, individual members can be evaluated to determine whether they appear to be biased in some way, or to better understand how certain physics configurations may impact the resulting forecast. On January 20-21, 2011, a lake effect snow event occurred in Northern Lower Michigan, with cooperative observing and CoCoRaHS stations reporting new snow accumulations between 2 and 8 inches and liquid equivalents of 0.1-0.25 in. The event of January 21, 2011 was particularly well observed, with numerous surface reports available. It was also well represented by the WRF configuration operated at NWS Gaylord.
Given that the default configuration produced a reasonable prediction, it is used here to evaluate the impacts of other physics configurations on the resulting prediction of the primary lake effect band and resulting QPF. Emphasis here is on differences in planetary boundary layer and cloud microphysics parameterizations, given their likely role in determining the evolution of shallow convection and precipitation processes. Results from an ensemble of seven microphysics schemes and three planetary boundary layer schemes are presented to demonstrate variability in forecast evolution, with the results used in an attempt to improve forecasts during the 2011-2012 lake effect season.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28709206','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28709206"><span>Spectral partitioning in equitable graphs.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Barucca, Paolo</p> <p>2017-06-01</p> <p>Graph partitioning problems emerge in a wide variety of complex systems, ranging from biology to finance, but can be rigorously analyzed and solved only for a few graph ensembles. Here, an ensemble of equitable graphs, i.e., random graphs with a block-regular structure, is studied, for which analytical results can be obtained. In particular, the spectral density of this ensemble is computed exactly for a modular and bipartite structure. Kesten-McKay's law for random regular graphs is found analytically to apply also for modular and bipartite structures when blocks are homogeneous. An exact solution to graph partitioning for two equal-sized communities is proposed and verified numerically, and a conjecture on the absence of an efficient recovery detectability transition in equitable graphs is suggested.
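Spectral bipartitioning of the kind analyzed above is commonly done with the sign pattern of the Fiedler vector, the Laplacian eigenvector with the second-smallest eigenvalue. A dependency-free sketch on a toy graph of two cliques joined by one edge (a stand-in for a two-block structure, not the paper's equitable ensemble):

```python
def fiedler_partition(adj, iters=2000):
    """Two-way spectral partition: sign pattern of the Laplacian's Fiedler
    vector, found by power iteration on M = c*I - L with the constant
    eigenvector projected out each step."""
    n = len(adj)
    deg = [sum(row) for row in adj]
    c = 2.0 * max(deg)                                   # upper bound on L's spectrum
    v = [(-1.0) ** i * (i + 1) for i in range(n)]        # generic start vector
    for _ in range(iters):
        v = [x - sum(v) / n for x in v]                  # remove constant mode
        w = []
        for i in range(n):
            lv = deg[i] * v[i] - sum(adj[i][j] * v[j] for j in range(n))
            w.append(c * v[i] - lv)                      # apply M = c*I - L
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return [0 if x < 0 else 1 for x in v]

# Two 4-node cliques joined by a single edge between nodes 3 and 4.
A = [[0] * 8 for _ in range(8)]
for block in ([0, 1, 2, 3], [4, 5, 6, 7]):
    for i in block:
        for j in block:
            if i != j:
                A[i][j] = 1
A[3][4] = A[4][3] = 1
print(fiedler_partition(A))
```

On the complement of the constant vector, the dominant eigenvector of M is exactly the Fiedler vector, so the iteration converges to it and its signs split the two cliques.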
A final discussion summarizes results and outlines their relevance for the solution of graph partitioning problems in other graph ensembles, in particular for the study of detectability thresholds and resolution limits in stochastic block models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017PhRvE..95f2310B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017PhRvE..95f2310B"><span>Spectral partitioning in equitable graphs</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Barucca, Paolo</p> <p>2017-06-01</p> <p>Graph partitioning problems emerge in a wide variety of complex systems, ranging from biology to finance, but can be rigorously analyzed and solved only for a few graph ensembles. Here, an ensemble of equitable graphs, i.e., random graphs with a block-regular structure, is studied, for which analytical results can be obtained. In particular, the spectral density of this ensemble is computed exactly for a modular and bipartite structure. Kesten-McKay's law for random regular graphs is found analytically to apply also for modular and bipartite structures when blocks are homogeneous. An exact solution to graph partitioning for two equal-sized communities is proposed and verified numerically, and a conjecture on the absence of an efficient recovery detectability transition in equitable graphs is suggested. 
A final discussion summarizes results and outlines their relevance for the solution of graph partitioning problems in other graph ensembles, in particular for the study of detectability thresholds and resolution limits in stochastic block models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1379367-resolution-dependence-precipitation-statistical-fidelity-hindcast-simulations','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1379367-resolution-dependence-precipitation-statistical-fidelity-hindcast-simulations"><span>Resolution dependence of precipitation statistical fidelity in hindcast simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>O'Brien, Travis A.; Collins, William D.; Kashinath, Karthik; ...</p> <p>2016-06-19</p> <p>This article is a U.S. Government work and is in the public domain in the USA. Numerous studies have shown that atmospheric models with high horizontal resolution better represent the physics and statistics of precipitation in climate models. While it is abundantly clear from these studies that high resolution increases the rate of extreme precipitation, it is not clear whether these added extreme events are "realistic", i.e., whether they occur in simulations in response to the same forcings that drive similar events in reality. In order to understand whether increasing horizontal resolution results in improved model fidelity, a hindcast-based, multiresolution experimental design has been conceived and implemented: the InitiaLIzed-ensemble, Analyze, and Develop (ILIAD) framework. The ILIAD framework allows direct comparison between observed and simulated weather events across multiple resolutions and assessment of the degree to which increased resolution improves the fidelity of extremes.
Analysis of 5 years of daily 5-day hindcasts with the Community Earth System Model at horizontal resolutions of 220, 110, and 28 km shows that: (1) these hindcasts reproduce the resolution-dependent increase of extreme precipitation that has been identified in longer-duration simulations; (2) the correspondence between simulated and observed extreme precipitation improves as resolution increases; and (3) this increase in extremes and precipitation fidelity comes entirely from resolved-scale precipitation. Evidence is presented that this resolution-dependent increase in precipitation intensity can be explained by the theory of Rauscher et al., which states that precipitation intensifies at high resolution due to an interaction between the emergent scaling (spectral) properties of the wind field and the constraint of fluid continuity.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28292714','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28292714"><span>Combined Monte Carlo/torsion-angle molecular dynamics for ensemble modeling of proteins, nucleic acids and carbohydrates.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zhang, Weihong; Howell, Steven C; Wright, David W; Heindel, Andrew; Qiu, Xiangyun; Chen, Jianhan; Curtis, Joseph E</p> <p>2017-05-01</p> <p>We describe a general method to use Monte Carlo simulation followed by torsion-angle molecular dynamics simulations to create ensembles of structures to model a wide variety of soft-matter biological systems. Our particular emphasis is focused on modeling low-resolution small-angle scattering and reflectivity structural data. We provide examples of this method applied to HIV-1 Gag protein and derived fragment proteins, TraI protein, linear B-DNA, a nucleosome core particle, and a glycosylated monoclonal antibody. This procedure will enable a large community of researchers to model low-resolution experimental data with greater accuracy by using robust physics-based simulation and sampling methods, which are a significant improvement over traditional methods used to interpret such data.
Published by Elsevier Inc.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018TCry...12..247A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018TCry...12..247A"><span>Ensemble-based assimilation of fractional snow-covered area satellite retrievals to estimate the snow distribution at Arctic sites</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Aalstad, Kristoffer; Westermann, Sebastian; Vikhamar Schuler, Thomas; Boike, Julia; Bertino, Laurent</p> <p>2018-01-01</p> <p>With its high albedo, low thermal conductivity and large water storing capacity, snow strongly modulates the surface energy and water balance, which makes it a critical factor in mid- to high-latitude and mountain environments. However, estimating the snow water equivalent (SWE) from remote sensing remains challenging even at medium spatial resolutions of around 1 km. We present an ensemble-based data assimilation framework that estimates the peak subgrid SWE distribution (SSD) at the 1 km scale by assimilating fractional snow-covered area (fSCA) satellite retrievals in a simple snow model forced by downscaled reanalysis data. The basic idea is to relate the timing of the snow cover depletion (accessible from satellite products) to the peak SSD. Peak subgrid SWE is assumed to be lognormally distributed, which can be translated to a modeled time series of fSCA through the snow model. Assimilation of satellite-derived fSCA facilitates the estimation of the peak SSD, while taking into account uncertainties in both the model and the assimilated data sets.
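The relation described in this entry, from an assumed lognormal peak subgrid SWE distribution to a modeled fSCA time series, amounts to fSCA(t) = P(peak SWE > cumulative melt up to t). A minimal sketch with made-up mean SWE, coefficient of variation and melt forcing (none taken from the study):

```python
import math

def fsca_from_lognormal(mean_swe, cv, melt_series):
    """fSCA(t) = P(peak SWE > cumulative melt), with peak subgrid SWE
    assumed lognormally distributed (mean mean_swe, coeff. of variation cv)."""
    sigma2 = math.log(1.0 + cv ** 2)                 # lognormal parameters
    mu = math.log(mean_swe) - 0.5 * sigma2
    sigma = math.sqrt(sigma2)

    def lognorm_cdf(x):
        if x <= 0.0:
            return 0.0
        return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

    cum_melt, fsca = 0.0, []
    for melt in melt_series:                         # melt per step, m w.e.
        cum_melt += melt
        fsca.append(1.0 - lognorm_cdf(cum_melt))     # fraction still snow-covered
    return fsca

# Toy depletion curve: mean peak SWE 0.5 m w.e., CV 0.3, 0.02 m w.e./day melt.
curve = fsca_from_lognormal(0.5, 0.3, [0.02] * 60)
```

The resulting curve starts near full cover and depletes monotonically, which is the depletion-timing signal that the assimilation exploits.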
As an extension to previous studies, our method makes use of the novel (to snow data assimilation) ensemble smoother with multiple data assimilation (ES-MDA) scheme combined with analytical Gaussian anamorphosis to assimilate time series of Moderate Resolution Imaging Spectroradiometer (MODIS) and Sentinel-2 fSCA retrievals. The scheme is applied to Arctic sites near Ny-Ålesund (79° N, Svalbard, Norway) where field measurements of fSCA and SWE distributions are available. The method is able to successfully recover accurate estimates of peak SSD on most of the occasions considered. Through the ES-MDA assimilation, the root-mean-square error (RMSE) for the fSCA, peak mean SWE and peak subgrid coefficient of variation is improved by around 75, 60 and 20 %, respectively, when compared to the prior, yielding RMSEs of 0.01, 0.09 m water equivalent (w.e.) and 0.13, respectively. The ES-MDA either outperforms or at least nearly matches the performance of other ensemble-based batch smoother schemes with regard to various evaluation metrics. Given the modularity of the method, it could prove valuable for a range of satellite-era hydrometeorological reanalyses.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010EGUGA..12.7580P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010EGUGA..12.7580P"><span>Applying Ensemble Kalman Filter to Regional Ocean Circulation Model in the East Asian Marginal Sea</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pak, Gyun-Do; Kim, Young Ho; Chang, Kyung-Il</p> <p>2010-05-01</p> <p>We successfully apply the ensemble Kalman filter (EnKF) data assimilation scheme to the East Sea Regional Ocean Model (ESROM). The ESROM solves the three-dimensional ocean primitive equations with the hydrostatic and Boussinesq approximations.
The domain of ESROM fully covers the East Sea with grid intervals of approximately 0.1°. The ESROM has one inflow port, the Korea Strait, and two outflow ports, the Tsugaru and Soya straits. High resolution bathymetry of 1/60° (Choi et al., 2002) is adopted for the model topography. The ESROM is initialized using hydrographic data from World Ocean Atlas (WOA), and forced by monthly mean surface and open boundary conditions supplied from European Centre for Medium-Range Weather Forecasts data, WOA and so on. The EnKF system is composed of 16 ensemble members, and thousands of observations are assimilated at every assimilation step; its parallel implementation reduces the required memory and computational time more than 3-fold compared with the serial version. To prevent the collapse of ensembles due to rank deficiency, we employ various schemes such as localization and inflation of the background error covariance and perturbation of observations. Sea surface temperature from the Advanced Very High Resolution Radiometer and in-situ temperature profiles from various sources including Argo floats have been assimilated into the EnKF system. For cyclonic circulation in the northern East Sea and paths of the East Korean Warm Current and the Nearshore Branch, the EnKF system reproduces the mean surface circulation more realistically than the case without data assimilation. Simulated area-averaged vertical temperature profiles also agree well with the Generalized Digital Environmental Model data, which indicates that the EnKF system corrects the warming of subsurface temperature and the erosion of the permanent thermocline that are usually observed in numerical models without data assimilation. We also quantitatively validate the EnKF system by comparing its results with observed temperatures at 100 m for two years in the southwestern East Sea.
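The safeguards named above (covariance inflation, localization and perturbed observations) can be sketched in a toy stochastic EnKF analysis step. The state, ensemble size and Gaussian localization taper below are illustrative stand-ins, not the ESROM configuration:

```python
import math
import random

def enkf_update(ensemble, obs, obs_var, obs_index, inflation=1.05):
    """Stochastic EnKF analysis step for one scalar observation, with
    multiplicative inflation, a crude Gaussian localization taper (a
    stand-in for e.g. Gaspari-Cohn) and perturbed observations."""
    n, m = len(ensemble), len(ensemble[0])
    mean = [sum(e[k] for e in ensemble) / n for k in range(m)]
    # Inflate anomalies about the mean to counteract spread collapse.
    anoms = [[inflation * (e[k] - mean[k]) for k in range(m)] for e in ensemble]
    hx = [a[obs_index] for a in anoms]               # observed component
    var_hx = sum(x * x for x in hx) / (n - 1)
    cov = [sum(anoms[i][k] * hx[i] for i in range(n)) / (n - 1) for k in range(m)]
    gain = [c / (var_hx + obs_var) for c in cov]     # K = PH'/(HPH' + R)
    for k in range(m):                               # localization taper
        gain[k] *= math.exp(-0.5 * (k - obs_index) ** 2)
    updated = []
    for i in range(n):
        state = [mean[k] + anoms[i][k] for k in range(m)]
        y = obs + random.gauss(0.0, obs_var ** 0.5)  # perturbed observation
        updated.append([state[k] + gain[k] * (y - state[obs_index])
                        for k in range(m)])
    return updated

random.seed(0)
# 16 members, 3-component state around a wrong prior mean of 0; assimilate
# one observation of component 0 whose true value is 1.0.
prior = [[random.gauss(0.0, 1.0) for _ in range(3)] for _ in range(16)]
post = enkf_update(prior, obs=1.0, obs_var=0.04, obs_index=0)
```

After the update the observed component is pulled toward the observation and its ensemble spread shrinks, while unobserved components are adjusted through the (localized) ensemble covariances.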
We find that spatial and temporal correlations are higher and root-mean-square errors are lower in the EnKF system compared with systems without data assimilation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFM.H33H1646A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFM.H33H1646A"><span>Enhanced Assimilation of InSAR Displacement and Well Data for Groundwater Monitoring</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Abdullin, A.; Jonsson, S.</p> <p>2016-12-01</p> <p>Ground deformation related to aquifer exploitation can cause damage to buildings and infrastructure leading to major economic losses and sometimes even loss of human lives. Understanding reservoir behavior helps in assessing possible future ground movement and water depletion hazard of a region under study. We have developed an InSAR-based data assimilation framework for groundwater reservoirs that efficiently incorporates InSAR data for improved reservoir management and forecasts. InSAR displacement data are integrated with the groundwater modeling software MODFLOW using ensemble-based assimilation approaches. We have examined several Ensemble Methods for updating model parameters such as hydraulic conductivity and model variables like pressure head while simultaneously providing an estimate of the uncertainty. A realistic three-dimensional aquifer model was built to demonstrate the capability of the Ensemble Methods incorporating InSAR-derived displacement measurements. We find from these numerical tests that including both ground deformation and well water level data as observations improves the RMSE of the hydraulic conductivity estimate by up to 20% compared with using only one type of observation.
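One of the ensemble methods compared in frameworks like this one, the ensemble smoother with multiple data assimilation (ES-MDA), assimilates the same data several times with inflated observation error. A minimal sketch on a made-up scalar inverse problem (the forward model, truth and error variances are illustrative, not the study's aquifer model):

```python
import random

def es_mda(prior_ens, forward, obs, obs_var, n_assim=4):
    """Ensemble smoother with multiple data assimilation (ES-MDA): the same
    observation is assimilated n_assim times with the observation-error
    variance inflated by alpha = n_assim (so that sum(1/alpha) = 1)."""
    ens = list(prior_ens)
    n = len(ens)
    alpha = float(n_assim)
    for _ in range(n_assim):
        preds = [forward(m) for m in ens]            # run the forward model
        m_mean = sum(ens) / n
        d_mean = sum(preds) / n
        cov_md = sum((ens[i] - m_mean) * (preds[i] - d_mean)
                     for i in range(n)) / (n - 1)
        var_dd = sum((p - d_mean) ** 2 for p in preds) / (n - 1)
        gain = cov_md / (var_dd + alpha * obs_var)
        # Each member is pulled toward its own perturbed observation.
        ens = [ens[i] + gain * (obs + random.gauss(0.0, (alpha * obs_var) ** 0.5)
                                - preds[i]) for i in range(n)]
    return ens

random.seed(1)
# Toy linear inverse problem: estimate m from d = 2*m with truth m = 1.5,
# so the noise-free observation is d = 3.0.
prior = [random.gauss(0.0, 1.0) for _ in range(100)]
post = es_mda(prior, lambda m: 2.0 * m, obs=3.0, obs_var=0.01)
```

For this linear-Gaussian toy case the posterior ensemble concentrates near the truth; in the nonlinear groundwater setting the repeated small updates are what distinguish ES-MDA from a single ensemble-smoother step.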
The RMSE estimation of this property after the final time step is similar for Ensemble Kalman Filter (EnKF), Ensemble Smoother (ES) and ES with multiple data assimilation (ES-MDA) methods. The results suggest that the high spatial and temporal resolution subsidence observations from InSAR are very helpful for accurately quantifying hydraulic parameters. We have tested the framework on several different examples and have found good performance in improving aquifer properties estimation, which should prove useful for groundwater management. Our ongoing work focuses on assimilating real InSAR-derived time series and hydraulic head data for calibrating and predicting aquifer properties of basin-wide groundwater systems.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015BoLMe.155..301P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015BoLMe.155..301P"><span>An Observational Case Study of Persistent Fog and Comparison with an Ensemble Forecast Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Price, Jeremy; Porson, Aurore; Lock, Adrian</p> <p>2015-05-01</p> <p>We present a study of a persistent case of fog and use the observations to evaluate the UK Met Office ensemble model. The fog appeared to form initially in association with small patches of low-level stratus and spread rapidly across southern England during 11 December 2012, persisting for 24 h. The low visibility and occurrence of fog associated with the event was poorly forecast. Observations show that the surprisingly rapid spreading of the layer was due to a circulation at the fog edge, whereby cold cloudy air subsided into and mixed with warmer adjacent clear air. The resulting air was saturated, and hence the fog layer grew rapidly outwards from its edge. 
Measurements of fog-droplet deposition made overnight show that an average of 12 g m⁻² h⁻¹ was deposited but that the liquid water content remained almost constant, indicating that further liquid was condensing at a similar rate to the deposition, most likely due to the slow cooling. The circulation at the fog edge was also present during its dissipation, by which time the fog top had lowered by 150 m. During this period the continuing circulation at the fog edge, and increasing wind shear at fog top, acted to dissipate the fog by creating mixing with, by then, the drier adjacent and overlying air. Comparisons with a new, high-resolution Met Office ensemble model show that this type of case remains challenging to simulate. Most ensemble members successfully simulated the formation and persistence of low stratus cloud in the region, but produced too much cloud initially overnight, which created a warm bias. During the daytime, ensemble predictions that had produced fog lifted it into low stratus, whilst in reality the fog remained present all day. Various aspects of the model performance are discussed further.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AdWR..110..371S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AdWR..110..371S"><span>A multi-scale ensemble-based framework for forecasting compound coastal-riverine flooding: The Hackensack-Passaic watershed and Newark Bay</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Saleh, F.; Ramaswamy, V.; Wang, Y.; Georgas, N.; Blumberg, A.; Pullen, J.</p> <p>2017-12-01</p> <p>Estuarine regions can experience compound impacts from coastal storm surge and riverine flooding.
The challenges in forecasting flooding in such areas are multi-faceted due to uncertainties associated with meteorological drivers and interactions between hydrological and coastal processes. The objective of this work is to evaluate how uncertainties from meteorological predictions propagate through an ensemble-based flood prediction framework and translate into uncertainties in simulated inundation extents. A multi-scale framework, consisting of hydrologic, coastal and hydrodynamic models, was used to simulate two extreme flood events at the confluence of the Passaic and Hackensack rivers and Newark Bay. The events were Hurricane Irene (2011), a combination of inland flooding and coastal storm surge, and Hurricane Sandy (2012) where coastal storm surge was the dominant component. The hydrodynamic component of the framework was first forced with measured streamflow and ocean water level data to establish baseline inundation extents with the best available forcing data. The coastal and hydrologic models were then forced with meteorological predictions from 21 ensemble members of the Global Ensemble Forecast System (GEFS) to retrospectively represent potential future conditions up to 96 hours prior to the events. Inundation extents produced by the hydrodynamic model, forced with the 95th percentile of the ensemble-based coastal and hydrologic boundary conditions, were in good agreement with baseline conditions for both events. The USGS reanalysis of Hurricane Sandy inundation extents was encapsulated between the 50th and 95th percentile of the forecasted inundation extents, and that of Hurricane Irene was similar but with caveats associated with data availability and reliability. 
This work highlights the importance of accounting for meteorological uncertainty to represent a range of possible future inundation extents at high resolution (∼m).</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012HESSD...9.9425C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012HESSD...9.9425C"><span>Regional climate models downscaling in the Alpine area with Multimodel SuperEnsemble</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Cane, D.; Barbarino, S.; Renier, L. A.; Ronchi, C.</p> <p>2012-08-01</p> <p>Climate scenarios show a strong warming signal in the Alpine area already by the mid-21st century. The climate simulations, however, even when obtained with Regional Climate Models (RCMs), are affected by large errors when compared with observations, due to their difficulties in representing the complex orography of the Alps and limitations in their physical parametrizations. The aim of this work is therefore to reduce these model biases using a statistical post-processing technique, to obtain more reliable projections of climate change scenarios in the Alpine area. For our purposes we use a selection of RCM runs from the ENSEMBLES project, carefully chosen in order to maximise the variety of leading Global Climate Models and of the RCMs themselves, calculated on the SRES scenario A1B. The reference observations for the Greater Alpine Area are extracted from the European dataset E-OBS produced by the project ENSEMBLES with an available resolution of 25 km. For the study area of Piedmont, daily temperature and precipitation observations (1957-present) were carefully gridded on a 14-km grid over Piedmont Region with an Optimal Interpolation technique.
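The Multimodel SuperEnsemble named in this entry combines model anomalies with least-squares weights trained against observations over a control period (the Krishnamurti-style construction). A two-model sketch with synthetic data; the biases, trends and weights are illustrative, not the study's RCMs:

```python
def superensemble_weights(obs, models):
    """Least-squares weights for two model anomaly series, trained against
    observations over a control period (Krishnamurti-style superensemble)."""
    n = len(obs)
    o_mean = sum(obs) / n
    means = [sum(m) / n for m in models]
    a = [[m[t] - means[i] for t in range(n)] for i, m in enumerate(models)]
    o = [x - o_mean for x in obs]
    # Normal equations (A'A) w = A'o for the two predictors.
    s11 = sum(x * x for x in a[0])
    s22 = sum(x * x for x in a[1])
    s12 = sum(a[0][t] * a[1][t] for t in range(n))
    r1 = sum(a[0][t] * o[t] for t in range(n))
    r2 = sum(a[1][t] * o[t] for t in range(n))
    det = s11 * s22 - s12 * s12
    w = ((s22 * r1 - s12 * r2) / det, (s11 * r2 - s12 * r1) / det)
    return o_mean, means, w

def superensemble_predict(o_mean, means, w, values):
    # Forecast = observed climatology + weighted model anomalies.
    return o_mean + sum(wi * (v - mi) for wi, v, mi in zip(w, values, means))

# Synthetic control period: a trend as "truth", plus two biased model versions.
truth = [10.0 + 0.5 * t for t in range(20)]
model_a = [2.0 + 1.2 * truth[t] + 0.3 * ((-1) ** t) for t in range(20)]   # warm, amplified
model_b = [-1.0 + 0.8 * truth[t] - 0.2 * (t % 3 - 1) for t in range(20)]  # cold, damped
o_mean, means, w = superensemble_weights(truth, [model_a, model_b])
pred = [superensemble_predict(o_mean, means, w, (model_a[t], model_b[t]))
        for t in range(20)]
```

By construction the weighted combination cannot do worse over the training period than either bias-affected model on its own, which is the sense in which the technique reduces model biases.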
Hence, we applied the Multimodel SuperEnsemble technique to temperature fields, reducing the large biases of the RCM temperature fields relative to observations in the control period. We also propose the first application to RCMs of a new probabilistic Multimodel SuperEnsemble Dressing technique for estimating precipitation fields, previously applied successfully to weather forecast models, with a careful description of precipitation probability density functions conditioned on the model outputs. This technique reduces the strong precipitation overestimation by RCMs over the Alpine chain and reproduces the monthly behaviour of precipitation in the control period well.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3970899','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3970899"><span>Ensemble MD simulations restrained via crystallographic data: Accurate structure leads to accurate dynamics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Xue, Yi; Skrynnikov, Nikolai R</p> <p>2014-01-01</p> <p>Currently, the best existing molecular dynamics (MD) force fields cannot accurately reproduce the global free-energy minimum which realizes the experimental protein structure. As a result, long MD trajectories tend to drift away from the starting coordinates (e.g., crystallographic structures). To address this problem, we have devised a new simulation strategy aimed at protein crystals. An MD simulation of a protein crystal is essentially an ensemble simulation involving multiple protein molecules in a crystal unit cell (or a block of unit cells). To ensure that average protein coordinates remain correct during the simulation, we introduced crystallography-based restraints into the MD protocol.
Because these restraints are aimed at the ensemble-average structure, they have only minimal impact on conformational dynamics of the individual protein molecules. So long as the average structure remains reasonable, the proteins move in a native-like fashion as dictated by the original force field. To validate this approach, we have used the data from solid-state NMR spectroscopy, which is the orthogonal experimental technique uniquely sensitive to protein local dynamics. The new method has been tested on the well-established model protein, ubiquitin. The ensemble-restrained MD simulations produced lower crystallographic R factors than conventional simulations; they also led to more accurate predictions for crystallographic temperature factors, solid-state chemical shifts, and backbone order parameters. The predictions for 15N R1 relaxation rates are at least as accurate as those obtained from conventional simulations. Taken together, these results suggest that the presented trajectories may be among the most realistic protein MD simulations ever reported. In this context, the ensemble restraints based on high-resolution crystallographic data can be viewed as protein-specific empirical corrections to the standard force fields. PMID:24452989</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5036145','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5036145"><span>Hazardous thunderstorm intensification over Lake Victoria</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Thiery, Wim; Davin, Edouard L.; Seneviratne, Sonia I.; Bedka, Kristopher; Lhermitte, Stef; van Lipzig, Nicole P. 
M.</p> <p>2016-01-01</p> <p>Weather extremes have harmful impacts on communities around Lake Victoria, where thousands of fishermen die every year because of intense night-time thunderstorms. Yet how these thunderstorms will evolve in a future warmer climate is still unknown. Here we show that Lake Victoria is projected to be a hotspot of future extreme precipitation intensification by using new satellite-based observations, a high-resolution climate projection for the African Great Lakes and coarser-scale ensemble projections. Land precipitation on the previous day exerts a control on night-time occurrence of extremes on the lake by enhancing atmospheric convergence (74%) and moisture availability (26%). The future increase in extremes over Lake Victoria is about twice as large relative to surrounding land under a high-emission scenario, as only over-lake moisture advection is high enough to sustain Clausius–Clapeyron scaling. Our results highlight a major hazard associated with climate change over East Africa and underline the need for high-resolution projections to assess local climate change. 
PMID:27658848</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.3953Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.3953Y"><span>Comparison of different assimilation schemes in an operational assimilation system with Ensemble Kalman Filter</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre</p> <p>2016-04-01</p> <p>In this paper, four assimilation schemes, including an intermittent assimilation scheme (INT) and three incremental assimilation schemes (IAU 0, IAU 50 and IAU 100), are compared in the same assimilation experiments with a realistic eddy permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. The three IAU schemes differ from each other in the position of the increment update window that has the same size as the assimilation window. 0, 50 and 100 correspond to the degree of superposition of the increment update window on the current assimilation window. Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histograms, before the assimilation experiments. The relevance of each assimilation scheme is evaluated through analyses of thermohaline variables and the current velocities. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations.
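The rank-histogram diagnostic mentioned above can be sketched as follows: each observation is ranked among the sorted ensemble members, and a flat histogram over the possible ranks indicates a reliable (well-dispersed) ensemble. A toy example with synthetic draws (all numbers illustrative):

```python
import random

def rank_histogram(observations, ensembles):
    """For each observation, count ensemble members below it; with n
    members there are n+1 possible ranks, and a flat histogram over them
    indicates a reliable (well-dispersed) ensemble."""
    n = len(ensembles[0])
    counts = [0] * (n + 1)
    for obs, ens in zip(observations, ensembles):
        counts[sum(1 for member in ens if member < obs)] += 1
    return counts

random.seed(2)
# A perfectly reliable toy case: observation and members are exchangeable
# draws from the same distribution, so all ranks are equally likely.
cases, members = 3000, 9
obs, ens = [], []
for _ in range(cases):
    draws = [random.gauss(0.0, 1.0) for _ in range(members + 1)]
    obs.append(draws[0])
    ens.append(draws[1:])
hist = rank_histogram(obs, ens)
```

A U-shaped histogram would instead indicate an under-dispersive ensemble, and a dome-shaped one an over-dispersive ensemble.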
For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations, in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_12 --> <div id="page_13" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="241"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3524795','PMC'); return false;"
href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3524795"><span>Modelling dynamics in protein crystal structures by ensemble refinement</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Burnley, B Tom; Afonine, Pavel V; Adams, Paul D; Gros, Piet</p> <p>2012-01-01</p> <p>Single-structure models derived from X-ray data do not adequately account for the inherent, functionally important dynamics of protein molecules. We generated ensembles of structures by time-averaged refinement, where local molecular vibrations were sampled by molecular-dynamics (MD) simulation whilst global disorder was partitioned into an underlying overall translation–libration–screw (TLS) model. Modeling of 20 protein datasets at 1.1–3.1 Å resolution reduced cross-validated Rfree values by 0.3–4.9%, indicating that ensemble models fit the X-ray data better than single structures. The ensembles revealed that, while most proteins display a well-ordered core, some proteins exhibit a ‘molten core’ likely supporting functionally important dynamics in ligand binding, enzyme activity and protomer assembly. Order–disorder changes in HIV protease indicate a mechanism of entropy compensation for ordering the catalytic residues upon ligand binding by disordering specific core residues. Thus, ensemble refinement extracts dynamical details from the X-ray data that allow a more comprehensive understanding of structure–dynamics–function relationships. 
DOI: http://dx.doi.org/10.7554/eLife.00311.001 PMID:23251785</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26737994','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26737994"><span>Analysis of microvascular perfusion with multi-dimensional complete ensemble empirical mode decomposition with adaptive noise algorithm: Processing of laser speckle contrast images recorded in healthy subjects, at rest and during acetylcholine stimulation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Humeau-Heurtier, Anne; Marche, Pauline; Dubois, Severine; Mahe, Guillaume</p> <p>2015-01-01</p> <p>Laser speckle contrast imaging (LSCI) is a full-field imaging modality to monitor microvascular blood flow. It is able to give images with high temporal and spatial resolutions. However, when the skin is studied, the interpretation of the bidimensional data may be difficult. This is why an averaging of the perfusion values in regions of interest is often performed and the result is followed in time, reducing the data to monodimensional time series. In order to avoid such a procedure (which leads to a loss of spatial resolution), we propose to extract patterns from LSCI data and to compare these patterns for two physiological states in healthy subjects: at rest and at the peak of acetylcholine-induced perfusion. For this purpose, the recent multi-dimensional complete ensemble empirical mode decomposition with adaptive noise (MCEEMDAN) algorithm is applied to LSCI data. The results show that the intrinsic mode functions and residue given by MCEEMDAN show different patterns for the two physiological states. The images, as bidimensional data, can therefore be processed to reveal microvascular perfusion patterns hidden in the images themselves.
This work is therefore a feasibility study before analyzing data in patients with microvascular dysfunctions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..1413531W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..1413531W"><span>Modelling climate impact on floods under future emission scenarios using an ensemble of climate model projections</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wetterhall, F.; Cloke, H. L.; He, Y.; Freer, J.; Pappenberger, F.</p> <p>2012-04-01</p> <p>Evidence provided by modelled assessments of climate change impact on flooding is fundamental to water resource and flood risk decision making. Impact models usually rely on climate projections from Global and Regional Climate Models, and there is no doubt that these provide a useful assessment of future climate change. However, cascading ensembles of climate projections into impact models is not straightforward because of problems of coarse resolution in Global and Regional Climate Models (GCM/RCM) and the deficiencies in modelling high-intensity precipitation events. Thus decisions must be made on how to appropriately pre-process the meteorological variables from GCM/RCMs, such as selection of downscaling methods and application of Model Output Statistics (MOS). In this paper a grand ensemble of projections from several GCM/RCM are used to drive a hydrological model and analyse the resulting future flood projections for the Upper Severn, UK. The impact and implications of applying MOS techniques to precipitation as well as hydrological model parameter uncertainty is taken into account. 
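Model Output Statistics can take many forms; empirical quantile mapping is one common MOS choice for correcting precipitation distributions. The sketch below is purely illustrative (the abstract does not name the specific MOS technique used): it matches the modelled historical distribution to observations and applies the same mapping to future model output.

```python
import numpy as np

def quantile_map(obs, mod_hist, mod_fut, n_quantiles=100):
    """Empirical quantile mapping: correct model output so that its
    historical distribution matches observations, then apply the same
    correction to future model output."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    obs_q = np.quantile(obs, q)       # observed quantiles
    mod_q = np.quantile(mod_hist, q)  # modelled historical quantiles
    # For each future value, find its quantile in the historical model
    # distribution and replace it with the corresponding observed value.
    ranks = np.interp(mod_fut, mod_q, q)
    return np.interp(ranks, q, obs_q)
```

Values outside the historical range are clamped to the extreme quantiles by `np.interp`; real MOS implementations typically add an extrapolation rule for such tails.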
The resultant grand ensemble of future river discharge projections from the RCM/GCM-hydrological model chain is evaluated against a response surface technique combined with a perturbed physics experiment creating a probabilistic ensemble of climate model outputs. The ensemble distribution of results shows that the future risk of flooding in the Upper Severn increases compared with present conditions; however, the study highlights that the uncertainties are large and that strong assumptions were made in using Model Output Statistics to produce the estimates of future discharge. The importance of analysing results on a seasonal rather than just an annual basis is highlighted. The inability of the RCMs (and GCMs) to produce realistic precipitation patterns, even in present conditions, is a major caveat of local climate impact studies on flooding, and this should be a focus for future development.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/22465632-ensemble-type-numerical-uncertainty-information-from-single-model-integrations','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/22465632-ensemble-type-numerical-uncertainty-information-from-single-model-integrations"><span>Ensemble-type numerical uncertainty information from single model integrations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Rauser, Florian, E-mail: florian.rauser@mpimet.mpg.de; Marotzke, Jochem; Korn, Peter</p> <p>2015-07-01</p> <p>We suggest an algorithm that quantifies the discretization error of time-dependent physical quantities of interest (goals) for numerical models of geophysical fluid dynamics. The goal discretization error is estimated using a sum of weighted local discretization errors. 
The key feature of our algorithm is that these local discretization errors are interpreted as realizations of a random process. The random process is determined by the model and the flow state. From a class of local error random processes we select a suitable specific random process by integrating the model over a short time interval at different resolutions. The weights of the influences of the local discretization errors on the goal are modeled as goal sensitivities, which are calculated via automatic differentiation. The integration of the weighted realizations of local error random processes yields a posterior ensemble of goal approximations from a single run of the numerical model. From the posterior ensemble we derive the uncertainty information of the goal discretization error. This algorithm bypasses the requirement of detailed knowledge about the model's discretization to generate numerical error estimates. The algorithm is evaluated for the spherical shallow-water equations. For two standard test cases we successfully estimate the error of regional potential energy, track its evolution, and compare it to standard ensemble techniques. The posterior ensemble shares linear-error-growth properties with ensembles of multiple model integrations when comparably perturbed. 
The posterior ensemble numerical error estimates are of comparable size to those of a stochastic physics ensemble.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1711779C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1711779C"><span>Highlights of advances in the field of hydrometeorological research brought about by the DRIHM project</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Caumont, Olivier; Hally, Alan; Garrote, Luis; Richard, Évelyne; Weerts, Albrecht; Delogu, Fabio; Fiori, Elisabetta; Rebora, Nicola; Parodi, Antonio; Mihalović, Ana; Ivković, Marija; Dekić, Ljiljana; van Verseveld, Willem; Nuissier, Olivier; Ducrocq, Véronique; D'Agostino, Daniele; Galizia, Antonella; Danovaro, Emanuele; Clematis, Andrea</p> <p>2015-04-01</p> <p>The FP7 DRIHM (Distributed Research Infrastructure for Hydro-Meteorology, http://www.drihm.eu, 2011-2015) project intends to develop a prototype e-Science environment to facilitate the collaboration between meteorologists, hydrologists, and Earth science experts for accelerated scientific advances in Hydro-Meteorology Research (HMR). As the project comes to its end, this presentation will summarize the HMR results that have been obtained in the framework of DRIHM. The vision shaped and implemented in the framework of the DRIHM project enables the production and interpretation of numerous, complex compositions of hydrometeorological simulations of flood events from rainfall, either simulated or modelled, down to discharge. Each element of a composition is drawn from a set of various state-of-the-art models. 
Atmospheric simulations providing high-resolution rainfall forecasts involve different global and limited-area convection-resolving models, the former being used as boundary conditions for the latter. Some of these models can be run as ensembles, i.e. with perturbed boundary conditions, initial conditions and/or physics, thus sampling the probability density function of rainfall forecasts. In addition, a stochastic downscaling algorithm can be used to create high-resolution rainfall ensemble forecasts from deterministic lower-resolution forecasts. All these rainfall forecasts may be used as input to various rainfall-discharge hydrological models that compute the resulting stream flows for catchments of interest. In some hydrological simulations, physical parameters are perturbed to take into account model errors. As a result, six different kinds of rainfall data (either deterministic or probabilistic) can currently be compared with each other and combined with three different hydrological model engines running either in deterministic or probabilistic mode. HMR topics enabled or facilitated by such unprecedented sets of hydrometeorological forecasts include: physical process studies, intercomparison of models and ensembles, sensitivity studies of a particular component of the forecasting chain, and design of flash-flood early-warning systems. These benefits will be illustrated with the different key cases that have been under investigation in the course of the project. 
These are four catastrophic cases of flooding, namely the case of 4 November 2011 in Genoa, Italy, 6 November 2011 in Catalonia, Spain, 13-16 May 2014 in eastern Europe, and 9 October 2014, again in Genoa, Italy.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/12918958','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/12918958"><span>Programmable selectivity for GC with series-coupled columns using pulsed heating of the second column.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Whiting, Joshua; Sacks, Richard</p> <p>2003-05-15</p> <p>A series-coupled ensemble of a nonpolar dimethyl polysiloxane column and a polar trifluoropropylmethyl polysiloxane column with independent at-column heating is used to obtain pulsed heating of the second column. For mixture component bands that are separated by the first column but coelute from the column ensemble, a temperature pulse is initiated after the first of the two components has crossed the column junction point and is in the second column, while the other component is still in the first column. This accelerates the band for the first component. If the second column cools sufficiently prior to the second component band crossing the junction, the second band experiences less acceleration, and increased separation is observed for the corresponding peaks in the ensemble chromatogram. High-speed at-column heating is obtained by wrapping the fused-silica capillary column with resistance heater wire and sensor wire. Rapid heating for a temperature pulse is obtained with a short-duration linear heating ramp of 1000 degrees C/min. During a pulse, the second-column temperature increases by 20-100 degrees C in a few seconds. 
Using a cold gas environment, cooling to a quiescent temperature of 30 degrees C can be obtained in approximately 25 s. The effects of temperature pulse initiation time and amplitude on ensemble peak separation and resolution are described. A series of appropriately timed temperature pulses is used to separate three coeluting pairs of components in a 13-component mixture.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017IJAEO..59...79O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017IJAEO..59...79O"><span>Object-based habitat mapping using very high spatial resolution multispectral and hyperspectral imagery with LiDAR data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Onojeghuo, Alex Okiemute; Onojeghuo, Ajoke Ruth</p> <p>2017-07-01</p> <p>This study investigated the combined use of multispectral/hyperspectral imagery and LiDAR data for habitat mapping across parts of south Cumbria, North West England. The methodology adopted in this study integrated spectral information contained in pansharp QuickBird multispectral/AISA Eagle hyperspectral imagery and LiDAR-derived measures with object-based machine learning classifiers and ensemble analysis techniques. Using the LiDAR point cloud data, elevation models (such as the Digital Surface Model and Digital Terrain Model raster) and intensity features were extracted directly. The LiDAR-derived measures exploited in this study included Canopy Height Model, intensity and topographic information (i.e. mean, maximum and standard deviation). These three LiDAR measures were combined with spectral information contained in the pansharp QuickBird and Eagle MNF transformed imagery for image classification experiments. 
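The Canopy Height Model mentioned above is obtained by differencing the two LiDAR elevation models: CHM = DSM minus DTM. A minimal sketch (the clamping of small negative residuals, caused by interpolation noise, is an illustrative assumption):

```python
import numpy as np

def canopy_height_model(dsm, dtm, min_height=0.0):
    """CHM = DSM - DTM: subtract the bare-earth Digital Terrain Model
    from the Digital Surface Model; clamp small negative values that
    arise from interpolation noise between the two rasters."""
    chm = np.asarray(dsm, dtype=float) - np.asarray(dtm, dtype=float)
    return np.clip(chm, min_height, None)
```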
A fusion of pansharp QuickBird multispectral and Eagle MNF hyperspectral imagery with all LiDAR-derived measures generated the best classification accuracies: 89.8% with the Support Vector Machine and 92.6% with the Random Forest machine learning algorithm. The ensemble analysis of all three machine learning classifiers for the pansharp QuickBird and Eagle MNF fused data outputs did not significantly increase the overall classification accuracy. Results of the study demonstrate the potential of combining either very high spatial resolution multispectral or hyperspectral imagery with LiDAR data for habitat mapping.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.B31F2060E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.B31F2060E"><span>Vegetation Mapping in a Dryland Ecosystem Using Multi-temporal Sentinel-2 Imagery and Ensemble Learning</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Enterkine, J.; Spaete, L.; Glenn, N. F.; Gallagher, M.</p> <p>2017-12-01</p> <p>Remote sensing and mapping of dryland ecosystem vegetation is notably problematic due to the low canopy cover and fugacious growing seasons. Recent improvements in available satellite imagery and machine learning techniques have enabled enhanced approaches to mapping and monitoring vegetation across dryland ecosystems. The Sentinel-2 satellites (launched June 2015 and March 2017) of ESA's Copernicus Programme offer promising advances over existing multispectral satellite systems such as Landsat. Freely available Sentinel-2 imagery offers a five-day revisit frequency, thirteen spectral bands (in the visible, near infrared, and shortwave infrared), and high spatial resolution (from 10 m to 60 m). 
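Vegetation indices derived from such multispectral bands are simple band ratios. As a minimal illustration, NDVI from the near-infrared and red bands (the use of Sentinel-2 bands B8 and B4, both at 10 m, is an assumption for this sketch; the record does not list its specific indices):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from NIR (e.g. Sentinel-2
    band B8) and red (e.g. band B4) reflectances; eps guards against
    division by zero over dark pixels."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```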
Three narrow spectral bands located between the visible and the near infrared are designed to observe changes in photosynthesis. The high temporal, spatial, and spectral resolution of this imagery makes it ideal for monitoring vegetation in dryland ecosystems. In this study, we calculated a large number of vegetation and spectral indices from Sentinel-2 imagery spanning a growing season. These data were combined with robust field measurements of canopy cover at precise geolocations. We then used a Random Forests ensemble learning model to identify the most predictive variables for each landcover class, which were then used to impute landcover over the study area. The resulting vegetation map product will be used by land managers, and the mapping approaches will serve as a basis for future remote sensing projects using Sentinel-2 imagery and machine learning.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29847087','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29847087"><span>High-Resolution Large-Ensemble Nanoparticle Trapping with Multifunctional Thermoplasmonic Nanohole Metasurface.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ndukaife, Justus C; Xuan, Yi; Nnanna, Agbai George Agwu; Kildishev, Alexander V; Shalaev, Vladimir M; Wereley, Steven T; Boltasseva, Alexandra</p> <p>2018-06-07</p> <p>The intrinsic loss in a plasmonic metasurface is usually considered to be detrimental for device applications. Using plasmonic loss to our advantage, we introduce a thermoplasmonic metasurface that enables high-throughput large-ensemble nanoparticle assembly in a lab-on-a-chip platform. 
In our work, an array of subwavelength nanoholes in a metal film is used as a plasmonic metasurface that supports the excitation of localized surface plasmon and Bloch surface plasmon polariton waves upon optical illumination and provides a platform for molding both optical and thermal landscapes to achieve a tunable many-particle assembling process. The demonstrated many-particle trapping occurs against gravity in an inverted configuration where the light beam first passes through the nanoparticle suspension before illuminating the thermoplasmonic metasurface, a feat previously thought to be impossible. We also report an extraordinarily enhanced electrothermoplasmonic flow in the region of the thermoplasmonic nanohole metasurface, with larger transport velocities than in the unpatterned region. This thermoplasmonic metasurface could enable myriad applications in molecular analysis, quantum photonics, and self-assembly and creates a versatile platform for exploring nonequilibrium physics.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1915739W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1915739W"><span>SDCLIREF - A sub-daily gridded reference dataset</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wood, Raul R.; Willkofer, Florian; Schmid, Franz-Josef; Trentini, Fabian; Komischke, Holger; Ludwig, Ralf</p> <p>2017-04-01</p> <p>Climate change is expected to impact the intensity and frequency of hydrometeorological extreme events. In order to adequately capture and analyze extreme rainfall events, in particular when assessing flood and flash flood situations, data is required at high spatial and sub-daily resolution which is often not available in sufficient density and over extended time periods. 
The ClimEx project (Climate Change and Hydrological Extreme Events) addresses the alteration of hydrological extreme events under climate change conditions. In order to differentiate between a clear climate change signal and the limits of natural variability, unique Single-Model Regional Climate Model Ensembles (CRCM5 driven by CanESM2, RCP8.5) were created for a European and North-American domain, each comprising 50 members of 150 years (1951-2100). In combination with the CORDEX-Database, this newly created ClimEx-Ensemble is a one-of-a-kind model dataset to analyze changes of sub-daily extreme events. For the purpose of bias-correcting the regional climate model ensembles as well as for the baseline calibration and validation of hydrological catchment models, a new sub-daily (3h) high-resolution (500m) gridded reference dataset (SDCLIREF) was created for a domain covering the Upper Danube and Main watersheds (~100,000 km²). As the sub-daily observations lack a continuous time series for the reference period 1980-2010, the need for a suitable method to bridge the gap of the discontinuous time series arose. The Method of Fragments (Sharma and Srikanthan (2006); Westra et al. (2012)) was applied to transform daily observations to sub-daily rainfall events to extend the time series and densify the station network. Prior to applying the Method of Fragments and creating the gridded dataset using rigorous interpolation routines, data collection of observations, operated by several institutions in three countries (Germany, Austria, Switzerland), and the subsequent quality control of the observations were carried out. Among others, the quality control checked for steps, extensive dry seasons, temporal consistency and maximum hourly values. The resulting SDCLIREF dataset provides a robust precipitation reference for hydrometeorological applications at unprecedentedly high spatio-temporal resolution. References: Sharma, A.; Srikanthan, S. 
(2006): Continuous Rainfall Simulation: A Nonparametric Alternative. In: 30th Hydrology and Water Resources Symposium 4-7 December 2006, Launceston, Tasmania. Westra, S.; Mehrotra, R.; Sharma, A.; Srikanthan, R. (2012): Continuous rainfall simulation. 1. A regionalized subdaily disaggregation approach. In: Water Resour. Res. 48 (1). DOI: 10.1029/2011WR010489.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3831284','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3831284"><span>How to Deal with Low-Resolution Target Structures: Using SAR, Ensemble Docking, Hydropathic Analysis, and 3D-QSAR to Definitively Map the αβ-Tubulin Colchicine Site</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Da, Chenxiao; Mooberry, Susan L.; Gupton, John T.; Kellogg, Glen E.</p> <p>2013-01-01</p> <p>αβ-tubulin colchicine site inhibitors (CSIs) from four scaffolds that we previously tested for antiproliferative activity were modeled to better understand their effect on microtubules. Docking models, constructed by exploiting the SAR of a pyrrole subset and HINT scoring, guided ensemble docking of all 59 compounds. This conformation set and two variants having progressively less structure knowledge were subjected to CoMFA, CoMFA+HINT, and CoMSIA 3D-QSAR analyses. The CoMFA+HINT model (docked alignment) showed the best statistics: leave-one-out q² of 0.616, r² of 0.949 and r²pred (internal test set) of 0.755. An external (tested in other laboratories) collection of 24 CSIs from eight scaffolds was evaluated with the 3D-QSAR models, which correctly ranked their activity trends in 7/8 scaffolds for CoMFA+HINT (8/8 for CoMFA). 
The combination of SAR, ensemble docking, hydropathic analysis and 3D-QSAR provides an atomic-scale colchicine site model more consistent with a target structure resolution much higher than the ~3.6 Å available for αβ-tubulin. PMID:23961916</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24842035','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24842035"><span>Increasing horizontal resolution in numerical weather prediction and climate simulations: illusion or panacea?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Wedi, Nils P</p> <p>2014-06-28</p> <p>The steady path of doubling the global horizontal resolution approximately every 8 years in numerical weather prediction (NWP) at the European Centre for Medium Range Weather Forecasts may be substantially altered with emerging novel computing architectures. It coincides with the need to appropriately address and determine forecast uncertainty with increasing resolution, in particular, when convective-scale motions start to be resolved. Blunt increases in the model resolution will quickly become unaffordable and may not lead to improved NWP forecasts. Consequently, there is a need to accordingly adjust proven numerical techniques. An informed decision on the modelling strategy for harnessing exascale, massively parallel computing power thus also requires a deeper understanding of the sensitivity to uncertainty--for each part of the model--and ultimately a deeper understanding of multi-scale interactions in the atmosphere and their numerical realization in ultra-high-resolution NWP and climate simulations. This paper explores opportunities for substantial increases in the forecast efficiency by judicious adjustment of the formal accuracy or relative resolution in the spectral and physical space. 
One path is to reduce the formal accuracy by which the spectral transforms are computed. The other pathway explores the importance of the ratio used for the horizontal resolution in gridpoint space versus wavenumbers in spectral space. This is relevant for both high-resolution simulations as well as ensemble-based uncertainty estimation. © 2014 The Author(s) Published by the Royal Society. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..16.3633M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..16.3633M"><span>The UPSCALE project: a large simulation campaign</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mizielinski, Matthew; Roberts, Malcolm; Vidale, Pier Luigi; Schiemann, Reinhard; Demory, Marie-Estelle; Strachan, Jane</p> <p>2014-05-01</p> <p>The development of a traceable hierarchy of HadGEM3 global climate models, based upon the Met Office Unified Model, at resolutions from 135 km to 25 km, now allows the impact of resolution on the mean state, variability and extremes of climate to be studied in a robust fashion. In 2011 we successfully obtained a single-year grant of 144 million core hours of supercomputing time from the PRACE organization to run ensembles of 27 year atmosphere-only (HadGEM3-A GA3.0) climate simulations at 25km resolution, as used in present global weather forecasting, on HERMIT at HLRS. Through 2012 the UPSCALE project (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) ran over 650 years of simulation at resolutions of 25 km (N512), 60 km (N216) and 135 km (N96) to look at the value of high resolution climate models in the study of both present climate and a potential future climate scenario based on RCP8.5. 
Over 400 TB of data was produced using HERMIT, with additional simulations run on HECToR (UK supercomputer) and MONSooN (Met Office NERC Supercomputing Node). The data generated was transferred to the JASMIN super-data cluster, hosted by STFC CEDA in the UK, where analysis facilities are allowing rapid scientific exploitation of the data set. Many groups across the UK and Europe are already taking advantage of these facilities and we welcome approaches from other interested scientists. This presentation will briefly cover the following points: purpose and requirements of the UPSCALE project and facilities used; technical implementation and hurdles (model porting and optimisation, automation, numerical failures, data transfer); ensemble specification; and current analysis projects and access to the data set. A full description of UPSCALE and the data set generated has been submitted to Geoscientific Model Development, with overview information available from http://proj.badc.rl.ac.uk/upscale .</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A14F..06Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A14F..06Y"><span>Risk assessments of regional climate change over Europe: generation of probabilistic ensemble and uncertainty assessment for EURO-CORDEX</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yuan, J.; Kopp, R. E.</p> <p>2017-12-01</p> <p>Quantitative risk analysis of regional climate change is crucial for risk management and impact assessment of climate change. Two major challenges to assessing the risks of climate change are: (1) CMIP5 model runs, which drive EURO-CORDEX downscaling runs, do not cover the full range of uncertainty of future projections; and (2) climate models may underestimate the probability of tail risks (i.e. extreme events). 
To overcome these difficulties, this study offers a viable avenue in which a set of probabilistic climate ensembles is generated using the Surrogate/Model Mixed Ensemble (SMME) method. The probabilistic ensembles for temperature and precipitation are used to assess the range of uncertainty covered by five bias-corrected simulations from the high-resolution (0.11°) EURO-CORDEX database, which were selected by the PESETA (The Projection of Economic impacts of climate change in Sectors of the European Union based on bottom-up Analysis) III project. Results show that the distribution of the SMME ensemble is notably wider than both the distribution of the raw GCM ensemble and the spread of the five EURO-CORDEX runs under RCP8.5. Tail risks are well represented by the SMME ensemble. Both the SMME ensemble and the EURO-CORDEX projections are aggregated to administrative level, and are integrated into the impact functions of PESETA III to assess climate risks in Europe. To further evaluate the uncertainties introduced by the downscaling process, we compare the five runs from EURO-CORDEX with runs from the corresponding GCMs. Time series of regional means, spatial patterns, and climate indices are examined for the future climate (2080-2099) deviating from the present climate (1981-2010). The downscaling processes do not appear to be trend-preserving; e.g., the increase in regional mean temperature from EURO-CORDEX is slower than that from the corresponding GCM. The spatial pattern comparison reveals that the differences between each pair of GCM and EURO-CORDEX are small in winter. In summer, the temperatures of EURO-CORDEX are generally lower than those of GCMs, while the drying trends in precipitation of EURO-CORDEX are smaller than those of GCMs. Climate indices are significantly affected by the bias-correction and downscaling processes. 
Our study provides valuable information for selecting climate indices in different regions over Europe.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMGC54B..07M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMGC54B..07M"><span>Multi-model ensemble simulations of low flows in Europe under a 1.5, 2, and 3 degree global warming</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Marx, A.; Kumar, R.; Thober, S.; Zink, M.; Wanders, N.; Wood, E. F.; Pan, M.; Sheffield, J.; Samaniego, L. E.</p> <p>2017-12-01</p> <p>There is growing evidence that climate change will alter water availability in Europe. Here, we investigate how hydrological low flows are affected under different levels of future global warming (i.e., 1.5, 2 and 3 K). The analysis is based on a multi-model ensemble of 45 hydrological simulations spanning three RCPs (rcp2p6, rcp6p0, rcp8p5), five CMIP5 GCMs (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M) and three state-of-the-art hydrological models (HMs: mHM, Noah-MP, and PCR-GLOBWB). Model results are available at the unprecedented spatial resolution of 5 km across the pan-European domain, at daily temporal resolution. Low river flow is described as the percentile of daily streamflow that is exceeded 90% of the time. It is determined separately for each GCM/HM combination and warming scenario. The results show that the change signal amplifies with increasing warming levels. Low flows decrease in the Mediterranean, while they increase in the Alpine and Northern regions. In the Mediterranean, the level of warming amplifies the signal from -12% under 1.5 K to -35% under 3 K global warming largely due to the projected decreases in annual precipitation. 
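The low-flow index defined above, the daily streamflow exceeded 90% of the time, is simply the 10th percentile of the daily flow record. A minimal sketch:

```python
import numpy as np

def q90_low_flow(daily_flow):
    """Flow exceeded 90% of the time = 10th percentile of the
    daily streamflow series (NumPy's default linear interpolation)."""
    return np.percentile(np.asarray(daily_flow, dtype=float), 10)
```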
In the Alpine and Northern regions, in contrast, the signal is amplified from +22% (1.5 K) to +45% (3 K) because of the reduced snowmelt contribution. The changes in low flows are significant for regions with relatively large change signals and under higher levels of warming. Nevertheless, it is not possible to distinguish climate-induced differences in low flows between 1.5 and 2 K warming because of the large variability inherent in the multi-model ensemble. The GCMs dominate the contribution to the uncertainty in the Alpine and Northern regions, whereas in the Mediterranean the uncertainty contribution by the HMs is partly higher than that by the GCMs due to different representations of processes such as snow, soil moisture and evapotranspiration.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29843096','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29843096"><span>High-resolution Self-Organizing Maps for advanced visualization and dimension reduction.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Saraswati, Ayu; Nguyen, Van Tuc; Hagenbuchner, Markus; Tsoi, Ah Chung</p> <p>2018-05-04</p> <p>Kohonen's Self Organizing feature Map (SOM) provides an effective way to project high dimensional input features onto a low dimensional display space while preserving the topological relationships among the input features. Recent advances in algorithms that take advantage of modern computing hardware introduced the concept of high-resolution SOMs (HRSOMs). This paper investigates the capabilities and applicability of the HRSOM as a visualization tool for cluster analysis and its suitability to serve as a pre-processor in ensemble learning models. 
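The classical SOM training loop underlying HRSOMs can be sketched as follows (the grid size, iteration count and decay schedules below are illustrative assumptions, not the authors' settings):

```python
import numpy as np

def train_som(data, grid=(10, 10), iters=500, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal Kohonen SOM: for each sample, find the best-matching unit
    (BMU) and pull nearby map units toward the sample, with learning rate
    and neighbourhood radius decaying linearly over time."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # grid coordinates of every unit, used by the neighbourhood function
    yy, xx = np.mgrid[0:h, 0:w]
    for t in range(iters):
        x = data[rng.integers(len(data))]
        # BMU = unit whose weight vector is closest to the sample
        d = np.linalg.norm(weights - x, axis=2)
        by, bx = np.unravel_index(np.argmin(d), (h, w))
        frac = t / iters
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 1e-3
        # Gaussian neighbourhood around the BMU on the map grid
        g = np.exp(-((yy - by) ** 2 + (xx - bx) ** 2) / (2.0 * sigma ** 2))
        weights += lr * g[..., None] * (x - weights)
    return weights
```

An HRSOM uses the same update rule on a much denser grid, which is what makes hardware-accelerated implementations attractive.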
The evaluation is conducted on a number of established benchmarks and real-world learning problems, namely, the policeman benchmark, two web spam detection problems, a network intrusion detection problem, and a malware detection problem. It is found that the visualization resulting from an HRSOM provides new insights into these learning problems. It is furthermore shown empirically that broad benefits can be expected from the use of HRSOMs in both clustering and classification problems.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016ClDy...46..807J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016ClDy...46..807J"><span>The resolution sensitivity of the South Asian monsoon and Indo-Pacific in a global 0.35° AGCM</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Johnson, Stephanie J.; Levine, Richard C.; Turner, Andrew G.; Martin, Gill M.; Woolnough, Steven J.; Schiemann, Reinhard; Mizielinski, Matthew S.; Roberts, Malcolm J.; Vidale, Pier Luigi; Demory, Marie-Estelle; Strachan, Jane</p> <p>2016-02-01</p> <p>The South Asian monsoon is one of the most significant manifestations of the seasonal cycle. It directly impacts nearly one third of the world's population and also has substantial global influence. Using 27-year integrations of a high-resolution atmospheric general circulation model (Met Office Unified Model), we study changes in South Asian monsoon precipitation and circulation when horizontal resolution is increased from approximately 200 to 40 km at the equator (N96-N512, 1.9°-0.35°). The high resolution, integration length and ensemble size of the dataset make this the most extensive dataset used to evaluate the resolution sensitivity of the South Asian monsoon to date.
We find a consistent pattern of JJAS precipitation and circulation changes as resolution increases, which include a slight increase in precipitation over peninsular India, changes in Indian and Indochinese orographic rain bands, increasing wind speeds in the Somali Jet, increasing precipitation over the Maritime Continent islands and decreasing precipitation over the northern Maritime Continent seas. To diagnose which resolution-related processes cause these changes, we compare them to published sensitivity experiments that change regional orography and coastlines. Our analysis indicates that improved resolution of the East African Highlands results in the improved representation of the Somali Jet and further suggests that improved resolution of orography over Indochina and the Maritime Continent results in more precipitation over the Maritime Continent islands at the expense of reduced precipitation further north. We also evaluate the resolution sensitivity of monsoon depressions and lows, which contribute more precipitation over northeast India at higher resolution. 
We conclude that while increasing resolution at these scales does not solve the many monsoon biases that exist in GCMs, it has a number of small, beneficial impacts.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFM.A51P0317K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFM.A51P0317K"><span>An ensemble Kalman filter with a high-resolution atmosphere-ocean coupled model for tropical cyclone forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kunii, M.; Ito, K.; Wada, A.</p> <p>2015-12-01</p> <p>An ensemble Kalman filter (EnKF) using a regional mesoscale atmosphere-ocean coupled model was developed to represent the uncertainties of sea surface temperature (SST) in ensemble data assimilation strategies. The system was evaluated through data assimilation cycle experiments over a one-month period from July to August 2014, during which a tropical cyclone as well as severe rainfall events occurred. The results showed that the data assimilation cycle with the coupled model could reproduce SST distributions realistically even without updating SST and salinity during the data assimilation cycle. Therefore, atmospheric variables and radiation applied as a forcing to ocean models can control oceanic variables to some extent in the current data assimilation configuration. However, investigations of the forecast error covariance estimated in EnKF revealed that the correlation between atmospheric and oceanic variables could possibly lead to less flow-dependent error covariance for atmospheric variables owing to the difference in the time scales between atmospheric and oceanic variables. A verification of the analyses showed positive impacts of applying the ocean model to EnKF on precipitation forecasts. 
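The analysis step at the heart of an EnKF like the one described above can be illustrated with a generic stochastic-EnKF sketch (toy state vector and observation operator, not the coupled atmosphere-ocean system of the abstract):

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_var, seed=0):
    """Stochastic EnKF analysis step: pull each member towards perturbed
    observations using the ensemble-estimated covariance.
    `ensemble` is (n_members, n_state); H maps state to observation space."""
    rng = np.random.default_rng(seed)
    n_mem = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)              # state anomalies
    Y = X @ H.T                                       # obs-space anomalies
    Pyy = Y.T @ Y / (n_mem - 1) + obs_err_var * np.eye(len(obs))
    Pxy = X.T @ Y / (n_mem - 1)
    K = Pxy @ np.linalg.inv(Pyy)                      # Kalman gain
    # Perturbed observations give the correct analysis spread.
    obs_pert = obs + rng.normal(0, np.sqrt(obs_err_var), (n_mem, len(obs)))
    innovations = obs_pert - ensemble @ H.T
    return ensemble + innovations @ K.T

# Toy example: 3-variable state, observing only the first variable.
rng = np.random.default_rng(1)
prior = rng.normal(5.0, 2.0, (100, 3))
H = np.array([[1.0, 0.0, 0.0]])
analysis = enkf_update(prior, obs=np.array([0.0]), H=H, obs_err_var=0.5)
```

In the coupled system of the abstract, the state vector would concatenate atmospheric and oceanic variables, so the same gain computation produces the cross-component covariances discussed there.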
The use of EnKF with the coupled model system captured intensity changes of a tropical cyclone better than it did with an uncoupled atmosphere model, even though the impact on the track forecast was negligibly small.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014ACP....14.7837V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014ACP....14.7837V"><span>Time-lagged ensemble simulations of the dispersion of the Eyjafjallajökull plume over Europe with COSMO-ART</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vogel, H.; Förstner, J.; Vogel, B.; Hanisch, T.; Mühr, B.; Schättler, U.; Schad, T.</p> <p>2014-08-01</p> <p>An extended version of the German operational weather forecast model was used to simulate the ash dispersion during the eruption of the Eyjafjallajökull. As an operational forecast was launched every 6 hours, a time-lagged ensemble was obtained. Sensitivity runs show the ability of the model to simulate thin ash layers when an increased vertical resolution is used. Calibration of the model results with measured data allows for a quantitative forecast of the ash concentration. After this calibration an independent comparison of the simulated number concentration of 3 μm particles and observations at Hohenpeißenberg gives a correlation coefficient of 0.79. However, this agreement could only be reached after additional modifications of the emissions. Based on the time lagged ensemble the conditional probability of violation of a certain threshold is calculated. 
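The conditional threshold-exceedance probability derived from a time-lagged ensemble, as in the COSMO-ART study above, is simply the fraction of members above the threshold at each grid point; a sketch with synthetic concentration fields:

```python
import numpy as np

def exceedance_probability(member_fields, threshold):
    """Fraction of ensemble members exceeding `threshold` at each grid
    point; `member_fields` has shape (n_members, ny, nx)."""
    member_fields = np.asarray(member_fields)
    return (member_fields > threshold).mean(axis=0)

# Five time-lagged forecasts of an ash-concentration field (synthetic).
rng = np.random.default_rng(0)
members = rng.lognormal(mean=0.0, sigma=1.0, size=(5, 4, 4))
prob = exceedance_probability(members, threshold=2.0)
```

With only a handful of lagged members the probabilities are coarsely quantized (multiples of 1/5 here), which is one reason the abstract calls for improving the ensemble technique.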
By improving the ensemble technique used in our study, such probabilities could become valuable information for the forecasters advising the organizations responsible for the closing of the airspace.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..19.2367B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..19.2367B"><span>ESiWACE: A Center of Excellence for HPC applications to support cloud resolving earth system modelling</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Biercamp, Joachim; Adamidis, Panagiotis; Neumann, Philipp</p> <p>2017-04-01</p> <p>With the exa-scale era approaching, the length and time scales used for climate research on the one hand and numerical weather prediction on the other blend into each other. The Centre of Excellence in Simulation of Weather and Climate in Europe (ESiWACE) represents a European consortium comprising partners from climate, weather and HPC in their effort to address key scientific challenges that both communities have in common. A particular challenge is to reach global models with spatial resolutions that allow simulating convective clouds and small-scale ocean eddies. These simulations would produce better predictions of trends and provide much more fidelity in the representation of high-impact regional events. However, running such models in operational mode, i.e. with sufficient throughput in ensemble mode, will clearly require exa-scale computing and data handling capability. We will discuss the ESiWACE initiative and relate it to work in progress on high-resolution simulations in Europe. We present recent strong scalability measurements from ESiWACE to demonstrate current computability in weather and climate simulation.
A special focus in this particular talk is on the Icosahedral Nonhydrostatic (ICON) model, used for a comparison of high-resolution regional and global simulations with high-quality observation data. We demonstrate that close-to-optimal parallel efficiency can be achieved in strong-scaling global resolution experiments on Mistral/DKRZ, e.g. 94% for 5 km resolution simulations using 36k cores. Based on our scalability and high-resolution experiments, we deduce and extrapolate future capabilities for ICON that are expected for weather and climate research at exascale.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_13 --> <div id="page_14" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="261"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25844624','PUBMED'); return false;"
href="https://www.ncbi.nlm.nih.gov/pubmed/25844624"><span>Individual differences in ensemble perception reveal multiple, independent levels of ensemble representation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Haberman, Jason; Brady, Timothy F; Alvarez, George A</p> <p>2015-04-01</p> <p>Ensemble perception, including the ability to "see the average" from a group of items, operates in numerous feature domains (size, orientation, speed, facial expression, etc.). Although the ubiquity of ensemble representations is well established, the large-scale cognitive architecture of this process remains poorly defined. We address this using an individual differences approach. In a series of experiments, observers saw groups of objects and reported either a single item from the group or the average of the entire group. High-level ensemble representations (e.g., average facial expression) showed complete independence from low-level ensemble representations (e.g., average orientation). In contrast, low-level ensemble representations (e.g., orientation and color) were correlated with each other, but not with high-level ensemble representations (e.g., facial expression and person identity). These results suggest that there is not a single domain-general ensemble mechanism, and that the relationship among various ensemble representations depends on how proximal they are in representational space. 
</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4428879','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4428879"><span>Argumentation Based Joint Learning: A Novel Ensemble Learning Approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Xu, Junyi; Yao, Li; Li, Le</p> <p>2015-01-01</p> <p>Recently, ensemble learning methods have been widely used to improve classification performance in machine learning. In this paper, we present a novel ensemble learning method: argumentation based multi-agent joint learning (AMAJL), which integrates ideas from multi-agent argumentation, ensemble learning, and association rule mining. In AMAJL, argumentation technology is introduced as an ensemble strategy to integrate multiple base classifiers and generate a high-performance ensemble classifier. We design an argumentation framework named Arena as a communication platform for knowledge integration. Through argumentation based joint learning, high-quality individual knowledge can be extracted, and thus a refined global knowledge base can be generated and used independently for classification. We perform numerous experiments on multiple public datasets using AMAJL and other benchmark methods. The results demonstrate that our method can effectively extract high-quality knowledge for the ensemble classifier and improve the performance of classification.
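For context, the simplest way to combine base classifiers into an ensemble is plain majority voting; the sketch below shows that baseline mechanism (AMAJL's argumentation-based integration strategy is more elaborate):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label lists by majority vote.
    `predictions` is a list of equal-length label sequences."""
    combined = []
    for labels in zip(*predictions):
        # Most common label among the base classifiers for this sample.
        combined.append(Counter(labels).most_common(1)[0][0])
    return combined

# Three imperfect base classifiers; the ensemble corrects isolated errors.
clf_a = [1, 0, 1, 1, 0]
clf_b = [1, 0, 0, 1, 0]
clf_c = [1, 1, 1, 1, 0]
ensemble_pred = majority_vote([clf_a, clf_b, clf_c])
```

The vote repairs the single disagreements at positions 1 and 2, which is the basic effect any ensemble strategy, argumentation-based or otherwise, aims to amplify.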
PMID:25966359</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015JPRS..101...36D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015JPRS..101...36D"><span>Evaluating the utility of the medium-spatial resolution Landsat 8 multispectral sensor in quantifying aboveground biomass in uMgeni catchment, South Africa</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dube, Timothy; Mutanga, Onisimo</p> <p>2015-03-01</p> <p>Aboveground biomass estimation is critical in understanding forest contribution to regional carbon cycles. Despite the successful application of high spatial and spectral resolution sensors in aboveground biomass (AGB) estimation, there are challenges related to high acquisition costs, small area coverage, multicollinearity and limited availability. These challenges hamper the successful regional scale AGB quantification. The aim of this study was to assess the utility of the newly-launched medium-resolution multispectral Landsat 8 Operational Land Imager (OLI) dataset with a large swath width, in quantifying AGB in a forest plantation. We applied different sets of spectral analysis (test I: spectral bands; test II: spectral vegetation indices and test III: spectral bands + spectral vegetation indices) in testing the utility of Landsat 8 OLI using two non-parametric algorithms: stochastic gradient boosting and the random forest ensembles. The results of the study show that the medium-resolution multispectral Landsat 8 OLI dataset provides better AGB estimates for Eucalyptus dunii, Eucalyptus grandis and Pinus taeda especially when using the extracted spectral information together with the derived spectral vegetation indices. 
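The random forest ensembles used in the Landsat 8 study above rest on bagging: averaging many trees fit to bootstrap resamples. A numpy-only toy version with depth-1 stumps and a made-up NDVI-biomass relation (all variable names and numbers are illustrative, not the paper's data) conveys the idea:

```python
import numpy as np

def fit_stump(x, y):
    """Best single-split regression stump on a 1-D feature."""
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    best = (np.inf, xs[0], ys.mean(), ys.mean())
    for i in range(1, len(xs)):
        left, right = ys[:i], ys[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, xs[i], left.mean(), right.mean())
    _, thr, lo, hi = best
    return lambda q: np.where(q < thr, lo, hi)

def bagged_stumps(x, y, n_trees=25, seed=0):
    """Toy bagged ensemble: average of stumps fit on bootstrap samples.
    A stand-in for the random-forest idea, not a full implementation."""
    rng = np.random.default_rng(seed)
    stumps = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(x), len(x))  # bootstrap resample
        stumps.append(fit_stump(x[idx], y[idx]))
    return lambda q: np.mean([s(q) for s in stumps], axis=0)

# Synthetic "vegetation index vs biomass" relation, for illustration only.
rng = np.random.default_rng(1)
ndvi = rng.uniform(0, 1, 200)
agb = 100 * ndvi + rng.normal(0, 5, 200)   # tonnes/ha, made up
model = bagged_stumps(ndvi, agb)
pred = model(np.array([0.1, 0.9]))
```

A real random forest additionally grows deep trees and samples features at each split; in the study, the predictors would be the OLI bands and derived vegetation indices rather than a single NDVI value.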
We also noted that incorporating the optimal subset of the most important selected medium-resolution multispectral Landsat 8 OLI bands improved AGB accuracies. We compared medium-resolution multispectral Landsat 8 OLI AGB estimates with Landsat 7 ETM+ estimates, and the latter yielded lower estimation accuracies. Overall, this study demonstrates the potential and strength of applying the relatively affordable and readily available newly-launched medium-resolution Landsat 8 OLI dataset, with its large swath width (185 km), in precisely estimating AGB. This strength of the Landsat OLI dataset is crucial especially in sub-Saharan Africa, where high-resolution remote sensing data availability remains a challenge.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018PApGe.tmp...22J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018PApGe.tmp...22J"><span>Intraseasonal Variability of the Indian Monsoon as Simulated by a Global Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Joshi, Sneh; Kar, S. C.</p> <p>2018-01-01</p> <p>This study uses the global forecast system (GFS) model at T126 horizontal resolution to carry out seasonal simulations with prescribed sea-surface temperatures. The main objectives of the study are to evaluate the simulated Indian monsoon variability on intraseasonal timescales. The GFS model has been integrated for 29 monsoon seasons with 15-member ensembles forced with observed sea-surface temperatures (SSTs), and additional 16-member ensemble runs have been carried out using climatological SSTs. Northward propagation of intraseasonal rainfall anomalies over the Indian region from the model simulations has been examined.
It is found that the model is unable to simulate the observed moisture pattern when the active zone of convection is over central India. However, the model simulates the observed pattern of specific humidity during the life cycle of northward propagation on day -10 and day +10 relative to maximum convection over central India. The space-time spectral analysis of the simulated equatorial waves shows that the ensemble members have varying amounts of power in each band of wavenumbers and frequencies. However, variations among ensemble members are larger in the antisymmetric component of westward-moving waves, and the maximum difference in power among ensemble members is seen in the 8-20 day mode.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017JHyd..548..391C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017JHyd..548..391C"><span>Representing radar rainfall uncertainty with ensembles based on a time-variant geostatistical error modelling approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Cecinati, Francesca; Rico-Ramirez, Miguel Angel; Heuvelink, Gerard B. M.; Han, Dawei</p> <p>2017-05-01</p> <p>The application of radar quantitative precipitation estimation (QPE) to hydrology and water quality models can be preferred to interpolated rainfall point measurements because of the wide coverage that radars can provide, together with a good spatio-temporal resolution. Nonetheless, it is often limited by the proneness of radar QPE to a multitude of errors. Although radar errors have been widely studied and techniques have been developed to correct most of them, residual errors are still intrinsic in radar QPE.
An estimation of uncertainty of radar QPE and an assessment of uncertainty propagation in modelling applications is important to quantify the relative importance of the uncertainty associated to radar rainfall input in the overall modelling uncertainty. A suitable tool for this purpose is the generation of radar rainfall ensembles. An ensemble is the representation of the rainfall field and its uncertainty through a collection of possible alternative rainfall fields, produced according to the observed errors, their spatial characteristics, and their probability distribution. The errors are derived from a comparison between radar QPE and ground point measurements. The novelty of the proposed ensemble generator is that it is based on a geostatistical approach that assures a fast and robust generation of synthetic error fields, based on the time-variant characteristics of errors. The method is developed to meet the requirement of operational applications to large datasets. The method is applied to a case study in Northern England, using the UK Met Office NIMROD radar composites at 1 km resolution and at 1 h accumulation on an area of 180 km by 180 km. The errors are estimated using a network of 199 tipping bucket rain gauges from the Environment Agency. 183 of the rain gauges are used for the error modelling, while 16 are kept apart for validation. The validation is done by comparing the radar rainfall ensemble with the values recorded by the validation rain gauges. The validated ensemble is then tested on a hydrological case study, to show the advantage of probabilistic rainfall for uncertainty propagation. The ensemble spread only partially captures the mismatch between the modelled and the observed flow. 
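The core of such a radar-rainfall ensemble generator, stripped of the paper's time-variant geostatistical error model, is the perturbation of the QPE field with spatially correlated random errors; a crude sketch, with FFT-smoothed noise standing in for the geostatistical simulation and all parameters illustrative:

```python
import numpy as np

def correlated_error_field(shape, corr_len, sigma, rng):
    """Crude spatially correlated Gaussian field: white noise smoothed by
    a Gaussian kernel in Fourier space, rescaled to standard deviation
    `sigma`. A stand-in for a proper geostatistical error simulation."""
    noise = rng.normal(size=shape)
    ky = np.fft.fftfreq(shape[0])[:, None]
    kx = np.fft.fftfreq(shape[1])[None, :]
    kernel = np.exp(-2 * (np.pi * corr_len) ** 2 * (kx ** 2 + ky ** 2))
    field = np.fft.ifft2(np.fft.fft2(noise) * kernel).real
    return sigma * field / field.std()

def rainfall_ensemble(radar_qpe, n_members, corr_len=5.0, sigma=0.3, seed=0):
    """Perturb a radar QPE field with multiplicative correlated errors,
    producing an ensemble of plausible rainfall fields (all >= 0)."""
    rng = np.random.default_rng(seed)
    members = []
    for _ in range(n_members):
        err = correlated_error_field(radar_qpe.shape, corr_len, sigma, rng)
        members.append(radar_qpe * np.exp(err))
    return np.array(members)

# Synthetic 32 x 32 QPE field in mm/h, for illustration only.
radar = np.abs(np.random.default_rng(1).normal(2.0, 1.0, (32, 32)))
ens = rainfall_ensemble(radar, n_members=20)
```

In the paper the error statistics are instead estimated from radar-gauge comparisons and vary in time, but the ensemble is consumed the same way: each member is run through the hydrological model to propagate rainfall uncertainty.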
The residual uncertainty can be attributed to other sources of uncertainty, in particular to model structural uncertainty, parameter identification uncertainty, uncertainty in other inputs, and uncertainty in the observed flow.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.5942K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.5942K"><span>Visualization and Nowcasting for Aviation using online verified ensemble weather radar extrapolation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kaltenboeck, Rudolf; Kerschbaum, Markus; Hennermann, Karin; Mayer, Stefan</p> <p>2013-04-01</p> <p>Nowcasting of precipitation events, especially thunderstorm events or winter storms, has high impact on flight safety and efficiency for air traffic management. Future strategic planning by air traffic control will result in circumnavigation of potentially hazardous areas, reduction of load around efficiency hot spots by offering alternatives, increase of handling capacity, anticipation of avoidance manoeuvres and increase of awareness before dangerous areas are entered by aircraft. To facilitate this, rapid-update forecasts of the location, intensity, size, movement and development of local storms are necessary. Weather radar data deliver precipitation analyses of high temporal and spatial resolution close to real time by using clever scanning strategies. These data are the basis for generating rapid-update forecasts in a time frame of up to 2 hours and more for applications in aviation meteorological service provision, such as optimizing safety and economic impact in the context of sub-scale phenomena. By tracking radar echoes through correlation of successive weather radar images, movement vectors are calculated.
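Tracking radar echoes by correlation, as described above, amounts to finding the displacement that maximizes the cross-correlation between successive images; a minimal FFT-based sketch on synthetic fields (operational trackers add windowing, quality control and vector histories):

```python
import numpy as np

def displacement_by_correlation(field_t0, field_t1):
    """Shift (dy, dx) maximizing the circular cross-correlation between
    two successive radar images, computed via FFT."""
    f0 = field_t0 - field_t0.mean()
    f1 = field_t1 - field_t1.mean()
    xcorr = np.fft.ifft2(np.fft.fft2(f1) * np.conj(np.fft.fft2(f0))).real
    dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    ny, nx = xcorr.shape
    # Map FFT indices to signed shifts.
    if dy > ny // 2:
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return int(dy), int(dx)

# A synthetic rain cell moved by (3, -2) pixels between scans.
rng = np.random.default_rng(0)
base = rng.random((64, 64))
shifted = np.roll(np.roll(base, 3, axis=0), -2, axis=1)
motion = displacement_by_correlation(base, shifted)
```

Varying the match window size, time step and filtering, as the abstract describes, yields the set of extrapolation ensemble members.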
For every new radar image, a set of ensemble precipitation fields is collected by using different parameter sets such as pattern match size and time step, different filter methods, and an implementation of the history of tracking vectors with plausibility checks. This method accounts for the uncertainty in rain field displacement and for different scales in time and space. By manually validating a set of case studies, the best verification method and skill score are defined and implemented in an online verification scheme which calculates the optimized forecasts for different time steps and different areas by using different extrapolation ensemble members. To provide information about the quality and reliability of the extrapolation process, additional information on data quality (e.g. shielding in Alpine areas) is extrapolated and combined into an extrapolation quality index. Subsequently, the probability and quality information of the forecast ensemble is available, and flexible blending with a numerical prediction model for each subarea is possible.
Simultaneously with the automatic processing, the ensemble nowcasting product is visualized in a new, innovative way which combines the intensity, probability and quality information for different subareas in one forecast image.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1915066M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1915066M"><span>Can limited area NWP and/or RCM models improve on large scales inside their domain?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mesinger, Fedor; Veljovic, Katarina</p> <p>2017-04-01</p> <p>In a paper in press in Meteorology and Atmospheric Physics at the time this abstract is being written, Mesinger and Veljovic point out four requirements that need to be fulfilled by a limited area model (LAM), be it in an NWP or RCM environment, to improve on large scales inside its domain. First, the NWP/RCM model needs to be run on a relatively large domain. Note that domain size is quite inexpensive compared to resolution. Second, the NWP/RCM model should not use more forcing at its boundaries than required by the mathematics of the problem. That means prescribing lateral boundary conditions only at its outside boundary, with one less prognostic variable prescribed at the outflow than at the inflow parts of the boundary. Next, nudging towards the large scales of the driver model must not be used, as it would obviously be nudging in the wrong direction if the nested model can improve on large scales inside its domain. And finally, the NWP/RCM model must have features that enable the development of large scales improved compared to those of the driver model. This would typically include higher resolution, but does not have to.
Integrations showing improvements in large scales by LAM ensemble members are summarized in the mentioned paper in press. The ensemble members referred to are run using the Eta model and are driven by ECMWF 32-day ensemble members initialized at 0000 UTC 4 October 2012. The Eta model used is the so-called "upgraded Eta," or "sloping steps Eta," which is free of the Gallus-Klemp problem of weak flow in the lee of bell-shaped topography, a problem that seemed to many to suggest that the eta coordinate is ill suited for high-resolution models. The "sloping steps" in fact represent a simple version of the cut cell scheme. Accuracy in forecasting the position of jet stream winds, chosen to be those with speeds greater than 45 m/s at 250 hPa and expressed by Equitable Threat (or Gilbert) skill scores adjusted to unit bias (ETSa), was taken to show the skill at large scales. The average rms wind difference at 250 hPa compared to ECMWF analyses was used as another verification measure. With 21 members run, and with the driver global model and the nested Eta at about the same resolution during the first 10 days of the experiment, both verification measures generally demonstrate an advantage for the Eta, in particular during and after the time of a deep upper-tropospheric trough crossing the Rockies in the first 2-6 days of the experiment. Rerunning the Eta ensemble switched to use sigma (Eta/sigma) showed this advantage of the Eta to come to a considerable degree, but not entirely, from its use of the eta coordinate. Compared to cumulative scores of the ensembles run, this is demonstrated to an even greater degree by the number of "wins" of one model vs. another. Thus, at the 4.5-day time, when the trough had just about crossed the Rockies, all 21 Eta/eta members have better ETSa scores than their ECMWF driver members. Eta/sigma has 19 members improving upon ECMWF, but loses to Eta/eta by a score of as much as 20 to 1. ECMWF members do better on rms scores, losing to Eta/eta by 18 vs.
3, but winning over Eta/sigma by 12 to 9. Examples of the wind plots behind these results are shown, and additional reasons possibly helping or not helping the results summarized are discussed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.H53J1615A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.H53J1615A"><span>Downscaling SMAP Radiometer Soil Moisture over the CONUS using Soil-Climate Information and Ensemble Learning</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Abbaszadeh, P.; Moradkhani, H.</p> <p>2017-12-01</p> <p>Soil moisture contributes significantly towards the improvement of weather and climate forecasts and the understanding of terrestrial ecosystem processes. It is a key hydrologic variable in agricultural drought monitoring, flood modeling and irrigation management. While satellite retrievals can provide unprecedented information on soil moisture at the global scale, the products are generally at coarse spatial resolutions (25-50 km). This often hampers their use in regional or local studies, which normally require a finer resolution of the data set. This work presents a new framework based on an ensemble learning method that uses soil-climate information derived from remote-sensing and ground-based observations to downscale the level 3 daily composite version (L3_SM_P) of SMAP radiometer soil moisture over the Continental U.S. (CONUS) to 1 km spatial resolution. In the proposed method, a suite of remotely sensed and in situ data sets, in addition to soil texture information and topography data among others, were used. The downscaled product was validated against in situ soil moisture measurements collected from a limited number of core validation sites and several hundred sparse soil moisture networks throughout the CONUS.
The obtained results indicated the great potential of the proposed methodology to derive fine-resolution soil moisture information applicable to fine-resolution hydrologic modeling, data assimilation and other regional studies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A22F..05C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A22F..05C"><span>Consistency and Main Differences Between European Regional Climate Downscaling Intercomparison Results; From PRUDENCE and ENSEMBLES to CORDEX</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Christensen, J. H.; Larsen, M. A. D.; Christensen, O. B.; Drews, M.</p> <p>2017-12-01</p> <p>For more than 20 years, coordinated efforts to apply regional climate models to downscale GCM simulations for Europe have been pursued by an ever increasing group of scientists. This endeavor showed its first results during EU framework supported projects such as RACCS and MERCURE. Here, the foundation for today's advanced worldwide CORDEX approach was laid out by a core of six research teams, who conducted some of the first coordinated RCM simulations with the aim of assessing regional climate change for Europe. However, it was realized at this stage that model bias in GCMs as well as RCMs made this task very challenging. As an immediate outcome, the idea was conceived to make an even more coordinated effort by constructing a well-defined and structured set of common simulations; this led to the PRUDENCE project (2001-2004). Additional coordinated efforts involving ever increasing numbers of GCMs and RCMs followed in ENSEMBLES (2004-2009) and the ongoing Euro-CORDEX (officially commenced 2011) efforts.
Along with the overall coordination, simulations have increased their standard resolution from 50 km (PRUDENCE) to about 12 km (Euro-CORDEX), moved from time slice simulations (PRUDENCE) to transient experiments (ENSEMBLES and CORDEX), and expanded from one driving model and emission scenario (PRUDENCE) to several (Euro-CORDEX). So far, this wealth of simulations has been used to assess the potential impacts of future climate change in Europe, providing a baseline change as defined by a multi-model mean change with associated uncertainties calculated from the model spread in the ensemble. But how has the overall picture of state-of-the-art regional climate change projections changed over this period of almost two decades? Here we compare, across scenarios, model resolutions and model vintage, the results from PRUDENCE, ENSEMBLES and Euro-CORDEX. By appropriate scaling we identify robust findings about the projected future of European climate, expressed by temperature and precipitation changes, that confirm the basic findings of PRUDENCE.
For parameters such as snow cover and soil moisture availability we also identify major new results, which illustrate that model improvements and higher resolution offer new, physically grounded, robust information that could not have been identified twenty years ago with the approach taken at that time.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010EGUGA..12.9263C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010EGUGA..12.9263C"><span>Impact of climate change upon vector born diseases in Europe and Africa using ENSEMBLES Regional Climate Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Caminade, Cyril; Morse, Andy</p> <p>2010-05-01</p> <p>Climate variability is an important component in determining the incidence of a number of diseases with significant human/animal health and socioeconomic impacts. The most important diseases affecting health are vector-borne, such as malaria, Rift Valley Fever and tick-borne diseases, with over 3 billion people worldwide at risk. Malaria alone is responsible for at least one million deaths annually, with 80% of malaria deaths occurring in sub-Saharan Africa. The climate has a large impact upon the incidence of vector-borne diseases: directly via the development rates and survival of both the pathogen and the vector, and indirectly through changes in the environmental conditions. A large ensemble of regional climate model simulations has been produced within the ENSEMBLES project framework for both the European and African continents. This work will present recent progress in human and animal disease modelling, based on high resolution climate observations and regional climate simulations. 
Preliminary results will be given as an illustration, including the impact of climate change upon bluetongue (a disease affecting cattle) over Europe and upon malaria and Rift Valley Fever over Africa. Malaria scenarios based on RCM ensemble simulations have been produced for West Africa. These simulations have been carried out using the Liverpool Malaria Model. Future projections highlight that malaria incidence decreases at the northern edge of the Sahel and that the epidemic belt is shifted southward in autumn. This could lead to significant public health problems in the future, as Africa's population is expected to rise dramatically over the 21st century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMNG33A1851H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMNG33A1851H"><span>DART: A Community Facility Providing State-of-the-Art, Efficient Ensemble Data Assimilation for Large (Coupled) Geophysical Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hoar, T. J.; Anderson, J. L.; Collins, N.; Kershaw, H.; Hendricks, J.; Raeder, K.; Mizzi, A. P.; Barré, J.; Gaubert, B.; Madaus, L. E.; Aydogdu, A.; Raeder, J.; Arango, H.; Moore, A. M.; Edwards, C. A.; Curchitser, E. N.; Escudier, R.; Dussin, R.; Bitz, C. M.; Zhang, Y. F.; Shrestha, P.; Rosolem, R.; Rahman, M.</p> <p>2016-12-01</p> <p>Strongly-coupled ensemble data assimilation with multiple high-resolution model components requires massive state vectors which need to be efficiently stored and accessed throughout the assimilation process. Supercomputer architectures are tending towards increasing the number of cores per node but have the same or less memory per node. 
Recent advances in the Data Assimilation Research Testbed (DART), a freely-available community ensemble data assimilation facility that works with dozens of large geophysical models, have addressed the need to run with a smaller memory footprint on a higher node count by utilizing MPI-2 one-sided communication to do non-blocking asynchronous access of distributed data. DART runs efficiently on many computational platforms ranging from laptops through thousands of cores on the newest supercomputers. Benefits of the new DART implementation will be shown. In addition, overviews of the most recently supported models will be presented: CAM-CHEM, WRF-CHEM, CM1, OpenGGCM, FESOM, ROMS, CICE5, TerrSysMP (COSMO, CLM, ParFlow), JULES, and CABLE. DART provides a comprehensive suite of software, documentation, and tutorials that can be used for ensemble data assimilation research, operations, and education. Scientists and software engineers at NCAR are available to support DART users who want to use existing DART products or develop their own applications. 
Current DART users range from university professors teaching data assimilation, to individual graduate students working with simple models, through national laboratories and state agencies doing operational prediction with large state-of-the-art models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..1512094G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..1512094G"><span>Hydro-meteorological evaluation of downscaled global ensemble rainfall forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gaborit, Étienne; Anctil, François; Fortin, Vincent; Pelletier, Geneviève</p> <p>2013-04-01</p> <p>Ensemble rainfall forecasts are of high interest for decision making, as they provide an explicit and dynamic assessment of the uncertainty in the forecast (Ruiz et al. 2009). However, for hydrological forecasting, their low resolution currently limits their use to large watersheds (Maraun et al. 2010). In order to bridge this gap, various implementations of the statistic-stochastic multi-fractal downscaling technique presented by Perica and Foufoula-Georgiou (1996) were compared, bringing Environment Canada's global ensemble rainfall forecasts from a 100 by 70-km resolution down to 6 by 4-km, while increasing each pixel's rainfall variance and preserving its original mean. For comparison purposes, simpler methods were also implemented, such as bi-linear interpolation, which disaggregates global forecasts without modifying their variance. The downscaled meteorological products were evaluated using different scores and diagrams, from both meteorological and hydrological viewpoints. 
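The core property of the variance-enhancing disaggregation, preserving the coarse-cell mean while adding sub-grid variance, can be sketched in a few lines. This is a toy illustration under stated assumptions (the function name and the Gaussian multiplicative weights are invented for the example), not the actual multifractal scheme of Perica and Foufoula-Georgiou (1996):

```python
import random

def disaggregate_cell(coarse_value, n_fine=16, cv=0.5, seed=0):
    """Split one coarse rainfall value into n_fine sub-pixel values.

    Positive multiplicative weights are renormalised so that the
    sub-pixel mean equals the coarse value (mass conservation) while
    sub-grid variance is introduced; a toy stand-in for the actual
    multifractal cascade, for illustration only.
    """
    rng = random.Random(seed)
    # clipping keeps weights positive; cv controls the added spread
    weights = [max(1e-6, rng.gauss(1.0, cv)) for _ in range(n_fine)]
    mean_w = sum(weights) / n_fine
    return [coarse_value * w / mean_w for w in weights]

fine = disaggregate_cell(10.0, n_fine=24)
# the mean of `fine` is 10.0 by construction, but unlike bi-linear
# interpolation of a flat coarse cell, the 24 values are not identical
```

Bi-linear interpolation of a single flat coarse cell would instead return identical sub-pixel values with zero variance, which is exactly the contrast the evaluation below examines.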
The meteorological evaluation was conducted by comparing the forecasted rainfall depths against nine days of observed values taken from the Québec City rain gauge database. These nine days featured strong precipitation events during the summer of 2009. For the hydrologic evaluation, the hydrological models SWMM5 and (a modified version of) GR4J were implemented on a small 6 km2 urban catchment located in the Québec City region. Ensemble hydrologic forecasts with a time step of 3 hours were then performed over a 3-month period of the summer of 2010 using the original and downscaled ensemble rainfall forecasts. The most important conclusions of this work are that the overall quality of the forecasts was preserved during the disaggregation procedure and that the disaggregated products using this variance-enhancing method were of similar quality to bi-linear interpolation products. However, variance and dispersion of the different members were, of course, much improved for the variance-enhanced products, compared to the bi-linear interpolation, which is a decisive advantage. The disaggregation technique of Perica and Foufoula-Georgiou (1996) hence represents an interesting way of bridging the gap between the meteorological models' resolution and the high degree of spatial precision sometimes required by hydrological models in their precipitation representation. References Maraun, D., Wetterhall, F., Ireson, A. M., Chandler, R. E., Kendon, E. J., Widmann, M., Brienen, S., Rust, H. W., Sauter, T., Themeßl, M., Venema, V. K. C., Chun, K. P., Goodess, C. M., Jones, R. G., Onof, C., Vrac, M., and Thiele-Eich, I. 2010. Precipitation downscaling under climate change: recent developments to bridge the gap between dynamical models and the end user. Reviews of Geophysics, 48 (3): RG3003, [np]. Doi: 10.1029/2009RG000314. Perica, S., and Foufoula-Georgiou, E. 1996. Model for multiscale disaggregation of spatial rainfall based on coupling meteorological and scaling descriptions. 
Journal of Geophysical Research, 101(D21): 26347-26361. Ruiz, J., Saulo, C. and Kalnay, E. 2009. Comparison of Methods Used to Generate Probabilistic Quantitative Precipitation Forecasts over South America. Weather and Forecasting, 24: 319-336. DOI: 10.1175/2008WAF2007098.1 This work is distributed under the Creative Commons Attribution 3.0 Unported License together with an author copyright. This license does not conflict with the regulations of the Crown Copyright.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017PhDT.......164G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017PhDT.......164G"><span>Quantifying the Influence of Dynamics Across Scales on Regional Climate Uncertainty in Western North America</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Goldenson, Naomi L.</p> <p></p> <p>Uncertainties in climate projections at the regional scale are inevitably larger than those for global mean quantities. Here, focusing on western North American regional climate, several approaches are taken to quantifying uncertainties starting with the output of global climate model projections. Internal variance is found to be an important component of the projection uncertainty up and down the west coast. To quantify internal variance and other projection uncertainties in existing climate models, we evaluate different ensemble configurations. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter offers the advantage of also producing estimates of uncertainty due to model differences. 
We conclude that climate projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible. We then conduct a small single-model ensemble of simulations using the Model for Prediction Across Scales with physics from the Community Atmosphere Model Version 5 (MPAS-CAM5) and prescribed historical sea surface temperatures. In the global variable resolution domain, the finest resolution (at 30 km) is in our region of interest over western North America and upwind over the northeast Pacific. In the finer-scale region, extreme precipitation from atmospheric rivers (ARs) is connected to tendencies in seasonal snowpack in mountains of the Northwest United States and California. In most of the Cascade Mountains, winters with more AR days are associated with less snowpack, in contrast to the northern Rockies and California's Sierra Nevada. In snowpack observations and reanalysis of the atmospheric circulation, we find similar relationships between the frequency of AR events and winter season snowpack in the western United States. In spring, however, there is not a clear relationship between the number of AR days and seasonal mean snowpack across the model ensemble, so caution is urged in interpreting the historical record in the spring season. Finally, the representation of the El Niño-Southern Oscillation (ENSO)--an important source of interannual climate predictability in some regions--is explored in a large single-model ensemble using ensemble Empirical Orthogonal Functions (EOFs) to find modes of variance across the entire ensemble at once. The leading EOF is ENSO. The principal components (PCs) of the next three EOFs exhibit a lead-lag relationship with the ENSO signal captured in the first PC. The second PC, with most of its variance in the summer season, is the most strongly cross-correlated with the first. 
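The ensemble-EOF idea, stacking all members into one anomaly matrix and extracting the dominant shared mode, can be sketched with a power iteration. This is a hypothetical pure-Python illustration (the toy data, sizes, and function name are invented for the example), not the thesis's actual computation:

```python
import random

def leading_eof(anomalies, iters=200):
    """Power iteration for the first EOF of an (n_samples x n_space)
    anomaly matrix: the dominant eigenvector of the spatial covariance."""
    n_space = len(anomalies[0])
    v = [1.0] * n_space
    for _ in range(iters):
        # apply C = A^T A to v without forming C explicitly
        proj = [sum(row[j] * v[j] for j in range(n_space)) for row in anomalies]
        w = [sum(p * row[j] for p, row in zip(proj, anomalies))
             for j in range(n_space)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# toy "ensemble": members share one spatial mode plus small noise
rng = random.Random(1)
mode = [1.0, 0.5, -0.5, -1.0]
data = [[a * m + rng.gauss(0, 0.05) for m in mode]
        for a in (rng.gauss(0, 1) for _ in range(40))]
eof1 = leading_eof(data)  # recovers `mode` up to sign and normalisation
```

In practice an SVD of the stacked anomaly matrix would be used; the power iteration is only meant to show what "finding modes of variance across the entire ensemble at once" computes.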
This approach offers insight into how the model considered represents this important atmosphere-ocean interaction. Taken together these varied approaches quantify the implications of climate projections regionally, identify processes that make snowpack water resources vulnerable, and seek insight into how to better simulate the large-scale climate modes controlling regional variability.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AGUFM.B32A..06J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AGUFM.B32A..06J"><span>FLUXCOM - Overview and First Synthesis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Jung, M.; Ichii, K.; Tramontana, G.; Camps-Valls, G.; Schwalm, C. R.; Papale, D.; Reichstein, M.; Gans, F.; Weber, U.</p> <p>2015-12-01</p> <p>We present a community effort aiming at generating an ensemble of global gridded flux products by upscaling FLUXNET data using an array of different machine learning methods including regression/model tree ensembles, neural networks, and kernel machines. We produced products for gross primary production, terrestrial ecosystem respiration, net ecosystem exchange, latent heat, sensible heat, and net radiation for two experimental protocols: 1) at a high spatial and 8-daily temporal resolution (5 arc-minute) using only remote sensing based inputs for the MODIS era; 2) 30 year records of daily, 0.5 degree spatial resolution by incorporating meteorological driver data. Within each set-up, all machine learning methods were trained with the same input data for carbon and energy fluxes respectively. Sets of input driver variables were derived using an extensive formal variable selection exercise. 
The extrapolation performance of the approaches is assessed with a fully internally consistent cross-validation. We perform cross-consistency checks of the gridded flux products with independent data streams from atmospheric inversions (NEE), sun-induced fluorescence (GPP), catchment water balances (LE, H), satellite products (Rn), and process models. We analyze the uncertainties of the gridded flux products and provide, for example, a breakdown of the uncertainty of mean annual GPP originating from different machine learning methods, different climate input data sets, and different flux partitioning methods. The FLUXCOM archive will provide an unprecedented source of information for water, energy, and carbon cycle studies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.S43H2978L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.S43H2978L"><span>Sensitivity of a Bayesian atmospheric-transport inversion model to spatio-temporal sensor resolution applied to the 2006 North Korean nuclear test</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lundquist, K. A.; Jensen, D. D.; Lucas, D. D.</p> <p>2017-12-01</p> <p>Atmospheric source reconstruction allows for the probabilistic estimate of source characteristics of an atmospheric release using observations of the release. Performance of the inversion depends partially on the temporal frequency and spatial scale of the observations. The objective of this study is to quantify the sensitivity of the source reconstruction method to sparse spatial and temporal observations. To this end, simulations of atmospheric transport of noble gases are created for the 2006 nuclear test at the Punggye-ri nuclear test site. 
Synthetic observations are collected from the simulation, and are taken as "ground truth". Data denial techniques are used to progressively coarsen the temporal and spatial resolution of the synthetic observations, while the source reconstruction model seeks to recover the true input parameters from the synthetic observations. Reconstructed parameters considered here are source location, source timing and source quantity. Reconstruction is achieved by running an ensemble of thousands of dispersion model runs that sample from a uniform distribution of the input parameters. Machine learning is used to train a computationally-efficient surrogate model from the ensemble simulations. Monte Carlo sampling and Bayesian inversion are then used in conjunction with the surrogate model to quantify the posterior probability density functions of source input parameters. This research seeks to inform decision makers of the tradeoffs between more expensive, high frequency observations and less expensive, low frequency observations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20150002810&hterms=filters&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D40%26Ntt%3Dfilters','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20150002810&hterms=filters&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D40%26Ntt%3Dfilters"><span>Simultaneous Radar and Satellite Data Storm-Scale Assimilation Using an Ensemble Kalman Filter Approach for 24 May 2011</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Jones, Thomas A.; Stensrud, David; Wicker, Louis; Minnis, Patrick; Palikonda, Rabindra</p> <p>2015-01-01</p> <p>Assimilating high-resolution radar reflectivity and radial velocity into convection-permitting numerical weather prediction models has proven to be an important tool for improving 
forecast skill of convection. The use of satellite data for this application is much less well understood, having only recently received significant attention. Since both radar and satellite data provide independent information, combining these two sources of data in a robust manner potentially represents the future of high-resolution data assimilation. This research combines Geostationary Operational Environmental Satellite 13 (GOES-13) cloud water path (CWP) retrievals with Weather Surveillance Radar-1988 Doppler (WSR-88D) reflectivity and radial velocity to examine the impacts of assimilating each for a severe weather event occurring in Oklahoma on 24 May 2011. Data are assimilated into a 3-km model using an ensemble adjustment Kalman filter approach with 36 members over a 2-h assimilation window between 1800 and 2000 UTC. Forecasts are then generated for 90 min at 5-min intervals starting at 1930 and 2000 UTC. Results show that both satellite and radar data are able to initiate convection, but that assimilating both spins up a storm much faster. Assimilating CWP also performs well at suppressing spurious precipitation and cloud cover in the model as well as capturing the anvil characteristics of developed storms. Radar data are most effective at resolving the 3D characteristics of the core convection. 
Assimilating both satellite and radar data generally resulted in the best model analysis and most skillful forecast for this event.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27910450','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27910450"><span>First order reversal curves (FORC) analysis of individual magnetic nanostructures using micro-Hall magnetometry.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Pohlit, Merlin; Eibisch, Paul; Akbari, Maryam; Porrati, Fabrizio; Huth, Michael; Müller, Jens</p> <p>2016-11-01</p> <p>Alongside the development of artificially created magnetic nanostructures, micro-Hall magnetometry has proven to be a versatile tool to obtain high-resolution hysteresis loop data and access dynamical properties. Here we explore the application of First Order Reversal Curves (FORC)-a technique well-established in the field of paleomagnetism for studying grain-size and interaction effects in magnetic rocks-to individual and dipolar-coupled arrays of magnetic nanostructures using micro-Hall sensors. A proof-of-principle experiment performed on a macroscopic piece of a floppy disk as a reference sample well known in the literature demonstrates that the FORC diagrams obtained by magnetic stray field measurements using home-built magnetometers are in good agreement with magnetization data obtained by a commercial vibrating sample magnetometer. We discuss in detail the FORC diagrams and their interpretation of three different representative magnetic systems, prepared by the direct-write Focused Electron Beam Induced Deposition (FEBID) technique: (1) an isolated Co-nanoisland showing a simple square-shaped hysteresis loop, (2) a more complex CoFe-alloy nanoisland exhibiting a wasp-waist-type hysteresis, and (3) a cluster of interacting Co-nanoislands. 
Our findings reveal that the combination of FORC and micro-Hall magnetometry is a promising tool to investigate complex magnetization reversal processes within individual or small ensembles of nanomagnets grown by FEBID or other fabrication methods. The method provides sub-μm spatial resolution and extends FORC analysis, commonly used for studying macroscopic samples and rather large arrays, to studies of small ensembles of interacting nanoparticles with the high moment sensitivity inherent to micro-Hall magnetometry.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011OcSci...7..805B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011OcSci...7..805B"><span>Usefulness of high resolution coastal models for operational oil spill forecast: the "Full City" accident</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Broström, G.; Carrasco, A.; Hole, L. R.; Dick, S.; Janssen, F.; Mattsson, J.; Berger, S.</p> <p>2011-11-01</p> <p>Oil spill modeling is considered to be an important part of a decision support system (DeSS) for oil spill response and is useful for remedial action in case of accidents, as well as for designing the environmental monitoring system that is frequently set up after major accidents. Many accidents take place in coastal areas, implying that low resolution basin scale ocean models are of limited use for predicting the trajectories of an oil spill. In this study, we target the oil spill in connection with the "Full City" accident on the Norwegian south coast and compare operational simulations from three different oil spill models for the area. The result of the analysis is that all models do a satisfactory job. 
The "standard" operational model for the area is shown to have severe flaws, but by applying ocean forcing data of higher resolution (1.5 km resolution), the model system shows results that compare well with observations. The study also shows that an ensemble of results from the three different models is useful when predicting/analyzing oil spills in coastal areas.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011OcScD...8.1467B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011OcScD...8.1467B"><span>Usefulness of high resolution coastal models for operational oil spill forecast: the Full City accident</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Broström, G.; Carrasco, A.; Hole, L. R.; Dick, S.; Janssen, F.; Mattsson, J.; Berger, S.</p> <p>2011-06-01</p> <p>Oil spill modeling is considered to be an important decision support system (DeSS) useful for remedial action in case of accidents, as well as for designing the environmental monitoring system that is frequently set up after major accidents. Many accidents take place in coastal areas, implying that low resolution basin scale ocean models are of limited use for predicting the trajectories of an oil spill. In this study, we target the oil spill in connection with the Full City accident on the Norwegian south coast and compare three different oil spill models for the area. The result of the analysis is that all models do a satisfactory job. The "standard" operational model for the area is shown to have severe flaws, but when a higher resolution model (1.5 km resolution) for the area is included in the analysis, the model system shows results that compare well with observations. 
The study also shows that an ensemble using three different models is useful when predicting/analyzing oil spills in coastal areas.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1235087','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1235087"><span>Statistical Projections for Multi-resolution, Multi-dimensional Visual Data Exploration and Analysis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Hoa T. Nguyen; Stone, Daithi; E. Wes Bethel</p> <p>2016-01-01</p> <p>An ongoing challenge in visual exploration and analysis of large, multi-dimensional datasets is how to present useful, concise information to a user for some specific visualization tasks. Typical approaches to this problem have proposed either reduced-resolution versions of data, or projections of data, or both. These approaches still have limitations, such as high computational cost or susceptibility to error. In this work, we explore the use of a statistical metric as the basis for both projections and reduced-resolution versions of data, with a particular focus on preserving one key trait in data, namely variation. We use two different case studies to explore this idea, one that uses a synthetic dataset, and another that uses a large ensemble collection produced by an atmospheric modeling code to study long-term changes in global precipitation. 
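The contrast between averaging and a variation-preserving reduction can be shown with a toy block summary. This is an illustrative sketch only (the block statistic and names are assumptions, not the paper's actual metric):

```python
import statistics

def reduce_blocks(values, block):
    """Reduced-resolution summary that keeps variation: each block of a
    1-D field is stored as (mean, population stdev) rather than the
    mean alone, so sub-block variation survives the reduction."""
    return [(statistics.mean(values[i:i + block]),
             statistics.pstdev(values[i:i + block]))
            for i in range(0, len(values), block)]

field = [0.0, 0.0, 10.0, 10.0, 5.0, 5.0, 5.0, 5.0]
summary = reduce_blocks(field, 4)
# both blocks average to 5.0; only the stored spread (5.0 vs 0.0)
# distinguishes the highly variable block from the flat one
```

A mean-only reduction would map both blocks to the same value, which is exactly the loss of the variation signal the study is concerned with.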
The primary findings of our work are that, in terms of preserving the variation signal inherent in data, a statistical measure more faithfully preserves this key characteristic across both multi-dimensional projections and multi-resolution representations than a methodology based upon averaging.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_12");'>12</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li class="active"><span>14</span></li> <li><a href="#" onclick='return showDiv("page_15");'>15</a></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_14 --> <div id="page_15" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_13");'>13</a></li> <li><a href="#" onclick='return showDiv("page_14");'>14</a></li> <li class="active"><span>15</span></li> <li><a href="#" onclick='return showDiv("page_16");'>16</a></li> <li><a href="#" onclick='return showDiv("page_17");'>17</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="281"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.H11O..04S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.H11O..04S"><span>High-resolution multimodel projections of soil moisture drought in Europe under 1.5, 2 and 3 degree global warming</span></a></p> <p><a target="_blank" rel="noopener noreferrer" 
href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Samaniego, L. E.; Kumar, R.; Zink, M.; Pan, M.; Wanders, N.; Marx, A.; Sheffield, J.; Wood, E. F.; Thober, S.</p> <p>2017-12-01</p> <p>Droughts are creeping hydro-meteorological events that may bring societies and natural systems to their limits by inducing significant environmental changes and large socio-economic losses. Little is known about the effects of various degrees of warming (i.e., 1.5, 2 and 3 K) and their respective uncertainties on extreme characteristics such as drought duration and area under drought in general, and in Europe in particular. In this study we investigate the evolution of drought characteristics under three levels of warming using an unprecedented high-resolution multi-model hydrologic ensemble over the Pan-EU domain at a scale of 5x5 km2 from 1950 until 2100. This multi-model ensemble comprises four hydrologic models (HMs: mHM, Noah-MP, PCR-GLOBWB, VIC) which are forced by five CMIP-5 Global Climate Models (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, NorESM1-M) under three RCP scenarios 2.6, 6.0, and 8.5. This results in a 60-member ensemble. The contributions of GCM/HM uncertainties were analyzed based on a sequential sampling algorithm proposed by Samaniego et al. 2016. This study is carried out within the EDgE project funded by the Copernicus Climate Change Service (edge.climate.copernicus.eu) and the HOKLIM project funded by the German Ministry of Education (BMBF)(www.ufz.de/hoklim). The changes under three levels of warming indicate a significant increase (more than 10%) in the number of droughts and the area under drought with respect to 30-year climatological means obtained with E-OBS observations. Furthermore, we found that: 1) the number of drought events exhibits significant regional changes. 
The largest changes are observed in the Mediterranean, where the frequency of droughts increases from 25% under 1.5 K to 33% under 2 K, and to more than 50% under 3 K of warming. Minor changes are seen in central Europe and the British Isles. 2) The GCM/HM uncertainties also show marked regional differences, with GCM uncertainty appearing to be larger everywhere. The uncertainties of the HMs are, however, similar to those of the GCMs in the Iberian Peninsula, due to different representations of evapotranspiration and soil moisture dynamics. 3) Despite the large uncertainty in the full ensemble, significant positive trends, which intensify with increased global warming, have been observed in all drought characteristics.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1612412A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1612412A"><span>Numerical Error Estimation with UQ</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ackmann, Jan; Korn, Peter; Marotzke, Jochem</p> <p>2014-05-01</p> <p>Ocean models are still in need of means to quantify model errors, which are inevitably made when running numerical experiments. The total model error can formally be decomposed into two parts, the formulation error and the discretization error. The formulation error arises from the continuous formulation of the model not fully describing the studied physical process. The discretization error arises from having to solve a discretized model instead of the continuously formulated model. Our work on error estimation is concerned with the discretization error. 
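The distinction can be seen in a toy computation: a first-order finite-difference scheme carries a discretization error that shrinks with the step size. This is a generic illustration of discretization error, not the Goal Error Ensemble method itself:

```python
import math

def fwd_diff(f, x, h):
    """First-order forward difference: a discretized derivative."""
    return (f(x + h) - f(x)) / h

# discretization error = |discrete answer - exact answer|; the
# "formulation" (the function being differentiated) is exact here
errors = {h: abs(fwd_diff(math.sin, 1.0, h) - math.cos(1.0))
          for h in (0.1, 0.01, 0.001)}
# the error shrinks roughly linearly in h, as expected for a
# first-order scheme
```

In an ocean model the exact answer is unavailable, which is why the method described next replaces this direct comparison with a stochastic description of local model errors.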
Given a solution of a discretized model, our general problem statement is to find a way to quantify the uncertainties due to discretization in physical quantities of interest (diagnostics), which are frequently used in Geophysical Fluid Dynamics. The approach we use to tackle this problem is called the "Goal Error Ensemble method". The basic idea of the Goal Error Ensemble method is that errors in diagnostics can be translated into a weighted sum of local model errors, which makes it conceptually based on the Dual Weighted Residual method from Computational Fluid Dynamics. In contrast to the Dual Weighted Residual method, these local model errors are not considered deterministically but interpreted as local model uncertainty and described stochastically by a random process. The parameters for the random process are tuned with high-resolution near-initial model information. However, the original Goal Error Ensemble method, introduced in [1], was successfully evaluated only in the case of inviscid flows without lateral boundaries in a shallow-water framework and is hence only of limited use in a numerical ocean model. Our work consists of extending the method to bounded, viscous flows in a shallow-water framework. As our numerical model, we use the ICON-Shallow-Water model. In viscous flows our high-resolution information is dependent on the viscosity parameter, making our uncertainty measures viscosity-dependent. We will show that we can choose a sensible parameter by using the Reynolds number as a criterion. Another topic we will discuss is the choice of the underlying distribution of the random process. This is especially important in the presence of lateral boundaries. We will present resulting error estimates for different height- and velocity-based diagnostics applied to the Munk gyre experiment. References [1] F. RAUSER: Error Estimation in Geophysical Fluid Dynamics through Learning; PhD Thesis, IMPRS-ESM, Hamburg, 2010 [2] F. RAUSER, J. MAROTZKE, P. 
KORN: Ensemble-type numerical uncertainty quantification from single model integrations; SIAM/ASA Journal on Uncertainty Quantification, submitted</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.4171K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.4171K"><span>A New Multivariate Approach in Generating Ensemble Meteorological Forcings for Hydrological Forecasting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Khajehei, Sepideh; Moradkhani, Hamid</p> <p>2015-04-01</p> <p>Producing reliable and accurate hydrologic ensemble forecasts is subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reduce the total uncertainty in hydrological applications. Currently, Numerical Weather Prediction (NWP) models produce ensemble forecasts for various temporal ranges. It is well established that raw products from NWP models are biased in mean and spread. There is therefore a need for methods that can generate reliable ensemble forecasts for hydrological applications. One common technique is to apply statistical procedures to generate an ensemble forecast from NWP-generated single-value forecasts. The procedure is based on the bivariate probability distribution between the observation and the single-value precipitation forecast. However, the current method assumes that Gaussian distributions fit the marginal distributions of the observed and modeled climate variables. 
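The bivariate procedure just described can be sketched in a few lines; all of the data, parameter values, and function names below are illustrative assumptions, not the operational implementation. A bivariate Gaussian is fitted to forecast-observation pairs, and an ensemble is drawn from the conditional distribution of the observation given a new single-value forecast:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training pairs of single-value forecasts and observations
# (e.g. monthly precipitation after a normalizing transform).
fcst = rng.normal(5.0, 2.0, 500)
obs = 0.8 * fcst + rng.normal(0.0, 1.0, 500)

# Moments of the fitted bivariate Gaussian.
mu_f, mu_o = fcst.mean(), obs.mean()
sd_f, sd_o = fcst.std(), obs.std()
rho = np.corrcoef(fcst, obs)[0, 1]

def conditional_ensemble(f_new, n_members=50):
    """Sample an ensemble from p(obs | forecast) under the bivariate normal."""
    cond_mean = mu_o + rho * sd_o / sd_f * (f_new - mu_f)
    cond_sd = sd_o * np.sqrt(1.0 - rho**2)
    return rng.normal(cond_mean, cond_sd, n_members)

members = conditional_ensemble(6.0)
```

The Gaussian marginals in this sketch are exactly the assumption that the copula-based alternative in this abstract relaxes.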
Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions, which join univariate marginal distributions into a multivariate joint distribution, are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing. Copulas can model the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin of the Columbia River Basin in the USA, using monthly precipitation forecasts from the Climate Forecast System (CFS) at 0.5x0.5 deg. spatial resolution to reproduce the observations. The verification is conducted on a different period, and the procedure is compared with the Ensemble Pre-Processor approach currently used by the National Weather Service River Forecast Centers in the USA.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3845530','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3845530"><span>Transient regional climate change: analysis of the summer climate response in a high-resolution, century-scale, ensemble experiment over the continental United States</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Diffenbaugh, Noah S.; Ashfaq, Moetasim; Scherer, Martin</p> <p>2013-01-01</p> <p>Integrating the potential for climate change impacts into policy and planning decisions requires quantification of the emergence of sub-regional climate changes that could occur in response to transient changes in global radiative forcing. 
Here we report results from a high-resolution, century-scale, ensemble simulation of climate in the United States, forced by atmospheric constituent concentrations from the Special Report on Emissions Scenarios (SRES) A1B scenario. We find that 21st century summer warming permanently emerges beyond the baseline decadal-scale variability prior to 2020 over most areas of the continental U.S. Permanent emergence beyond the baseline annual-scale variability shows much greater spatial heterogeneity, with emergence occurring prior to 2030 over areas of the southwestern U.S., but not prior to the end of the 21st century over much of the south-central and southeastern U.S. The pattern of emergence of robust summer warming contrasts with the pattern of summer warming magnitude, which is greatest over the central U.S. and smallest over the western U.S. In addition to stronger warming, the central U.S. also exhibits stronger coupling of changes in surface air temperature, precipitation, and moisture and energy fluxes, along with changes in atmospheric circulation towards increased anticyclonic anomalies in the mid-troposphere and a poleward shift in the mid-latitude jet aloft. However, as a fraction of the baseline variability, the transient warming over the central U.S. is smaller than the warming over the southwestern or northeastern U.S., delaying the emergence of the warming signal over the central U.S. Our comparisons with observations and the Coupled Model Intercomparison Project Phase 3 (CMIP3) ensemble of global climate model experiments suggest that near-term global warming is likely to cause robust sub-regional-scale warming over areas that exhibit relatively little baseline variability. In contrast, where there is greater variability in the baseline climate dynamics, there can be greater variability in the response to elevated greenhouse forcing, decreasing the robustness of the transient warming signal. 
PMID:24307747</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=ensemble&pg=3&id=EJ1048255','ERIC'); return false;" href="https://eric.ed.gov/?q=ensemble&pg=3&id=EJ1048255"><span>Collaborative Composing in High School String Chamber Music Ensembles</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Hopkins, Michael T.</p> <p>2015-01-01</p> <p>The purpose of this study was to examine collaborative composing in high school string chamber music ensembles. Research questions included the following: (a) How do high school string instrumentalists in chamber music ensembles use verbal and musical forms of communication to collaboratively compose a piece of music? (b) How do selected variables…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009ClDy...33..723S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009ClDy...33..723S"><span>Ability of an ensemble of regional climate models to reproduce weather regimes over Europe-Atlantic during the period 1961-2000</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sanchez-Gomez, Emilia; Somot, S.; Déqué, M.</p> <p>2009-10-01</p> <p>One of the main concerns in regional climate modeling is to what extent limited-area regional climate models (RCMs) reproduce the large-scale atmospheric conditions of their driving general circulation model (GCM). In this work we investigate the ability of a multi-model ensemble of regional climate simulations to reproduce the large-scale weather regimes of the driving conditions. 
The ensemble consists of a set of 13 RCMs on a European domain, driven at their lateral boundaries by the ERA40 reanalysis for the time period 1961-2000. Two sets of experiments have been completed with horizontal resolutions of 50 and 25 km, respectively. The spectral nudging technique has been applied to one of the models within the ensemble. The RCMs reproduce the behavior of the weather regimes in terms of composite pattern, mean frequency of occurrence and persistence reasonably well. The models also simulate the long-term trends and the inter-annual variability of the frequency of occurrence well. However, there is a non-negligible spread among the models, which is stronger in summer than in winter. This spread has two causes: (1) we are dealing with different models, and (2) each RCM produces its own internal variability. As far as the day-to-day weather regime history is concerned, the ensemble shows large discrepancies. At the daily time scale, the model spread also has a seasonal dependence, again stronger in summer than in winter. Results also show that the spectral nudging technique improves the model performance in reproducing the large-scale circulation of the driving field. In addition, the impact of increasing the number of grid points has been addressed by comparing the 25 and 50 km experiments. We show that the horizontal resolution does not significantly affect the model performance for the large-scale circulation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010AGUFMGC51A0736C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010AGUFMGC51A0736C"><span>Simulation of an ensemble of future climate time series with an hourly weather generator</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Caporali, E.; Fatichi, S.; Ivanov, V. 
Y.; Kim, J.</p> <p>2010-12-01</p> <p>There is evidence that climate change is occurring in many regions of the world. Climate change predictions at the local scale and at fine temporal resolution are thus needed for hydrological, ecological, geomorphological, and agricultural applications that can provide thematic insights into the corresponding impacts. Numerous downscaling techniques have been proposed to bridge the gap between the spatial scales adopted in General Circulation Models (GCMs) and regional analyses. Nevertheless, the time and spatial resolutions obtained as well as the type of meteorological variables may not be sufficient for detailed studies of climate change effects at the local scales. In this context, this study presents a stochastic downscaling technique that makes use of an hourly weather generator to simulate time series of predicted future climate. Using a Bayesian approach, the downscaling procedure derives distributions of factors of change for several climate statistics from a multi-model ensemble of GCMs. Factors of change are sampled from their distributions using a Monte Carlo technique to fully account for the probabilistic information obtained with the Bayesian multi-model ensemble. Factors of change are subsequently applied to the statistics derived from observations to re-evaluate the parameters of the weather generator. The weather generator can reproduce a wide set of climate variables and statistics over a range of temporal scales, from extremes to low-frequency inter-annual variability. The final result of such a procedure is the generation of an ensemble of hourly time series of meteorological variables that can be considered representative of future climate, as inferred from GCMs. The generated ensemble of scenarios also accounts for the uncertainty derived from multiple GCMs used in downscaling. 
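The factors-of-change sampling described above can be sketched as follows, with hypothetical factor distributions and an assumed observed statistic standing in for the Bayesian multi-model step:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical factors of change for mean monthly precipitation: one
# (mean, spread) pair per GCM, standing in for the distributions inferred
# in the Bayesian multi-model step.
gcm_factors = [(1.10, 0.05), (0.95, 0.08), (1.02, 0.04)]

obs_mean_precip = 80.0  # observed statistic, mm/month (assumed)

def sample_future_statistic(n_samples=1000):
    """Monte Carlo over the ensemble: pick a GCM, draw its factor of change,
    and rescale the observed statistic to get a target for the generator."""
    out = np.empty(n_samples)
    for i in range(n_samples):
        mu, sd = gcm_factors[rng.integers(len(gcm_factors))]
        out[i] = obs_mean_precip * rng.normal(mu, sd)
    return out

future = sample_future_statistic()
```

Each sampled value would then re-parameterize the weather generator, so that the resulting hourly ensemble inherits the inter-GCM spread.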
Applications of the procedure in reproducing present and future climates are presented for different locations world-wide: Tucson (AZ), Detroit (MI), and Firenze (Italy). The stochastic downscaling is carried out with eight GCMs from the CMIP3 multi-model dataset (IPCC AR4, A1B scenario).</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20160003588&hterms=risk+climate&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Drisk%2Bclimate','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20160003588&hterms=risk+climate&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Drisk%2Bclimate"><span>Development and Evaluation of High-Resolution Climate Simulations Over the Mountainous Northeastern United States</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Winter, Jonathan M.; Beckage, Brian; Bucini, Gabriela; Horton, Radley M.; Clemins, Patrick J.</p> <p>2016-01-01</p> <p>The mountain regions of the northeastern United States are a critical socioeconomic resource for Vermont, New York State, New Hampshire, Maine, and southern Quebec. While global climate models (GCMs) are important tools for climate change risk assessment at regional scales, even the increased spatial resolution of statistically downscaled GCMs (commonly approximately 1/8 deg) is not sufficient for hydrologic, ecologic, and land-use modeling of small watersheds within the mountainous Northeast. To address this limitation, an ensemble of topographically downscaled, high-resolution (30"), daily 2-m maximum air temperature; 2-m minimum air temperature; and precipitation simulations is developed for the mountainous Northeast by applying an additional level of downscaling to intermediately downscaled (1/8 deg) data using high-resolution topography and station observations. 
First, observed relationships between 2-m air temperature and elevation and between precipitation and elevation are derived. Then, these relationships are combined with spatial interpolation to enhance the resolution of intermediately downscaled GCM simulations. The resulting topographically downscaled dataset is analyzed for its ability to reproduce station observations. Topographic downscaling adds value to intermediately downscaled maximum and minimum 2-m air temperature at high-elevation stations and moderately improves domain-averaged maximum and minimum 2-m air temperature. Topographic downscaling also improves mean precipitation but not daily probability distributions of precipitation. Overall, the utility of topographic downscaling is dependent on the initial bias of the intermediately downscaled product and the magnitude of the elevation adjustment. As the initial bias or elevation adjustment increases, more value is added to the topographically downscaled product.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..17.2959M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..17.2959M"><span>Decision Support on the Sediments Flushing of Aimorés Dam Using Medium-Range Ensemble Forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mainardi Fan, Fernando; Schwanenberg, Dirk; Collischonn, Walter; Assis dos Reis, Alberto; Alvarado Montero, Rodolfo; Alencar Siqueira, Vinicius</p> <p>2015-04-01</p> <p>In the present study we investigate the use of medium-range streamflow forecasts in the Doce River basin (Brazil), at the reservoir of Aimorés Hydro Power Plant (HPP). During daily operations this reservoir acts as a "trap" for the sediments that originate from the upstream basin of the Doce River. 
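The elevation-based adjustment described in the topographic-downscaling abstract above reduces, in its simplest form, to a lapse-rate correction. A minimal sketch, with an assumed constant lapse rate in place of the station-derived relationships:

```python
LAPSE_RATE = -6.5e-3  # degC per metre; assumed here, in practice fitted to stations

def topographic_downscale(t_coarse, z_coarse, z_fine):
    """Adjust interpolated 2-m air temperature from the smoothed coarse-grid
    elevation to the true high-resolution elevation."""
    return t_coarse + LAPSE_RATE * (z_fine - z_coarse)

# A valley point whose true elevation (350 m) sits below the smoothed
# coarse-grid elevation (900 m) warms under the adjustment.
t_adjusted = topographic_downscale(t_coarse=10.0, z_coarse=900.0, z_fine=350.0)
```

This is why the value added grows with the magnitude of the elevation adjustment: the larger the gap between coarse-grid and true elevation, the larger the correction.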
This motivates a cleaning process called "pass through" to periodically remove the sediments from the reservoir. The "pass through" or "sediments flushing" process consists of lowering the reservoir's water level to a certain flushing level when a given reservoir inflow threshold is forecast. Then, the water in the approaching inflow is used to flush the sediments from the reservoir through the spillway and to recover the original reservoir storage. To be triggered, the sediments flushing operation requires an inflow larger than 3000 m³/s within a forecast horizon of 7 days. This lead time of 7 days is far beyond the basin's concentration time (around 2 days), meaning that forecasts for the pass-through procedure depend heavily on Numerical Weather Prediction (NWP) models that generate Quantitative Precipitation Forecasts (QPF). This dependency creates a high degree of uncertainty for the operator. To support decision making at Aimorés HPP, we developed a fully operational hydrological forecasting system for the basin. The system is capable of generating ensemble streamflow forecast scenarios when driven by QPF data from meteorological Ensemble Prediction Systems (EPS). This approach allows accounting for uncertainties in the NWP at the decision-making level. This system is starting to be used operationally by CEMIG and is the one shown in the present study, including a hindcasting analysis to assess the performance of the system for the specific flushing problem. The QPF data used in the hindcasting study were derived from the TIGGE (THORPEX Interactive Grand Global Ensemble) database. Among all EPS available on TIGGE, three were selected: ECMWF, GEFS, and CPTEC. As a deterministic reference forecast, we adopt the high resolution ECMWF forecast for comparison. The experiment consisted of running retrospective forecasts for a full five-year period. 
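The threshold-based trigger described here can be emulated with a small sketch: synthetic 7-day-ahead ensemble inflow scenarios are compared against the 3000 m³/s threshold using a lower ensemble percentile, and hit and false-alarm rates are tallied. All data and parameter choices below are illustrative, not CEMIG's operational settings:

```python
import numpy as np

rng = np.random.default_rng(7)

THRESHOLD = 3000.0  # m3/s inflow that triggers the flushing operation

def trigger(ensemble_inflows, percentile=25):
    """Raise the alarm when a lower percentile of the 7-day-ahead ensemble
    inflow scenarios already exceeds the threshold (a cautious trigger)."""
    return np.percentile(ensemble_inflows, percentile) > THRESHOLD

# Hypothetical hindcast: 100 forecast dates x 50 ensemble members (m3/s).
forecasts = rng.normal(2500.0, 800.0, size=(100, 50))
# Synthetic "observed" outcomes, correlated with the forecast means.
observed_exceeds = forecasts.mean(axis=1) + rng.normal(0.0, 300.0, 100) > THRESHOLD

alarms = np.array([trigger(f) for f in forecasts])
hits = float(alarms[observed_exceeds].mean()) if observed_exceeds.any() else 0.0
false_alarms = float(alarms[~observed_exceeds].mean()) if (~observed_exceeds).any() else 0.0
```

Sweeping the percentile over the hindcast in this way traces out exactly the kind of true-alarm/false-alarm trade-off that ROC analysis summarizes.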
To verify the proposed objectives of the study, we use different metrics to evaluate the forecast: ROC curves, exceedance diagrams, and the Forecast Convergence Score (FCS). These metrics clarify the benefits of the hydrological ensemble prediction system as a decision-making tool for the HPP operation. The ROC scores indicate that the use of the lower percentiles of the ensemble scenarios yields a true alarm rate of around 0.5 to 0.8 (depending on the model and on the percentile) at a lead time of seven days, while the false alarm rate is between 0 and 0.3. Those rates were better than the ones resulting from the deterministic reference forecast. Exceedance diagrams and forecast convergence scores indicate that the ensemble scenarios provide an early signal about the threshold crossing. Furthermore, the ensemble forecasts are more consistent between two subsequent forecasts in comparison to the deterministic forecast. The assessment results also give CEMIG more confidence in carrying out and communicating the flushing operation with the stakeholders involved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1817703W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1817703W"><span>Detailed climate-change projections for urban land-use change and green-house gas increases for Belgium with COSMO-CLM coupled to TERRA_URB</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wouters, Hendrik; Vanden Broucke, Sam; van Lipzig, Nicole; Demuzere, Matthias</p> <p>2016-04-01</p> <p>Recent research clearly shows that climate modelling at high resolution - which resolves deep convection, the detailed orography and land-use including urbanization - leads to better modelling performance with respect to temperatures, the boundary layer, 
clouds and precipitation. The increasing computational power enables the climate research community to address climate-change projections with higher accuracy and much more detail. In the framework of the CORDEX.be project, which aims at coherent high-resolution micro-ensemble projections for Belgium employing different GCMs and RCMs, the KU Leuven contributes by means of the downscaling of EC-EARTH global climate model projections (provided by the Royal Meteorological Institute of the Netherlands) to the Belgian domain. The downscaling is obtained with regional climate simulations at 12.5 km resolution over Europe (CORDEX-EU domain) and at 2.8 km resolution over Belgium (CORDEX.be domain) using COSMO-CLM coupled to the urban land-surface parametrization TERRA_URB. This is done for the present-day (1975-2005) and future (2040-2070 and 2070-2100) periods. In these high-resolution runs, both GHG changes (in accordance with RCP8.5) and urban land-use changes (in accordance with a business-as-usual urban expansion scenario) are taken into account. Based on these simulations, it is shown how climate-change statistics are modified when going from coarse-resolution modelling to high-resolution modelling. The climate-change statistics of particular interest are the changes in the number of extreme precipitation events and extreme heat waves in cities. The robustness of the signal change between the coarse and high resolution is further investigated, along with whether a (statistical) translation is possible. The different simulations also allow addressing the relative impact of, and synergy between, the urban expansion and increased GHG on the climate-change statistics. 
It is also investigated for which climate-change statistics the urban heat island and urban expansion are relevant, and to what extent the urban expansion can be included in the coarse-to-high-resolution translation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A14C..07P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A14C..07P"><span>Air-Sea Interaction Processes in Low and High-Resolution Coupled Climate Model Simulations for the Southeast Pacific</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Porto da Silveira, I.; Zuidema, P.; Kirtman, B. P.</p> <p>2017-12-01</p> <p>The rugged topography of the Andes Cordillera along with strong coastal upwelling, strong sea surface temperature (SST) gradients and extensive but geometrically-thin stratocumulus decks turns the Southeast Pacific (SEP) into a challenge for numerical modeling. In this study, hindcast simulations using the Community Climate System Model (CCSM4) at two resolutions were analyzed to examine the importance of resolution alone, with the parameterizations otherwise left unchanged. The hindcasts were initialized on January 1 with the real-time oceanic and atmospheric reanalysis (CFSR) from 1982 to 2003, forming a 10-member ensemble. The two resolutions are (0.1° oceanic and 0.5° atmospheric) and (1.125° oceanic and 0.9° atmospheric). The SST error growth in the first six days of integration (fast errors) and that resulting from model drift (saturated errors) are assessed and compared to evaluate the model processes responsible for the SST error growth. For the high-resolution simulation, SST fast errors are positive (+0.3°C) near the continental borders and negative offshore (-0.1°C). 
Both are associated with a decrease in cloud cover, a weakening of the prevailing southwesterly winds and a reduction of latent heat flux. The saturated errors possess a similar spatial pattern, but are larger and are more spatially concentrated. This suggests that the processes driving the errors already become established within the first week, in contrast to the low-resolution simulations. These, instead, manifest too-warm SSTs related to too-weak upwelling, driven by too-strong winds and Ekman pumping. Nevertheless, the ocean surface tends to be cooler in the low-resolution simulation than in the high-resolution one due to higher cloud cover. Throughout the integration, saturated SST errors become positive and could reach values up to +4°C. These are accompanied by upwelling damping and a decrease in cloud cover. High- and low-resolution models presented notable differences in how SST error variability drove atmospheric changes, especially because the high-resolution model is sensitive to upwelling regions. This allows the model to resolve cloud heights and establish different radiative feedbacks.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/19206638','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/19206638"><span>Band-filling of solution-synthesized CdS nanowires.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Puthussery, James; Lan, Aidong; Kosel, Thomas H; Kuno, Masaru</p> <p>2008-02-01</p> <p>The band edge optical characterization of solution-synthesized CdS nanowires (NWs) is described. Investigated wires are made through a solution-liquid-solid approach that entails the use of low-melting bimetallic catalyst particles to seed NW growth. Resulting diameters are approximately 14 nm, and lengths exceed 1 μm. 
Ensemble diameter distributions are approximately 13%, with corresponding intrawire diameter variations of approximately 5%. High-resolution transmission electron micrographs show that the wires are highly crystalline and have the wurtzite structure with growth along at least two directions: [0001] and [1010]. Band edge emission is observed with estimated quantum yields between approximately 0.05% and 1%. Complementary photoluminescence excitation spectra show structure consistent with the linear absorption. Carrier cooling dynamics are subsequently examined through ensemble lifetime and transient differential absorption measurements. The former reveals unexpectedly long band edge decays that extend beyond tens of nanoseconds. The latter indicates rapid intraband carrier cooling on time scales of 300-400 fs. Subsequent recovery at the band edge contains significant Auger contributions at high intensities which are usurped by other, possibly surface-related, carrier relaxation pathways at lower intensities. Furthermore, an unusual intensity-dependent transient broadening is seen, connected with these long decays. The effect likely stems from band-filling on the basis of an analysis of observed spectral shifts and line widths.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.5043R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.5043R"><span>MiKlip-PRODEF: Probabilistic Decadal Forecast for Central and Western Europe</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Reyers, Mark; Haas, Rabea; Ludwig, Patrick; Pinto, Joaquim</p> <p>2013-04-01</p> <p>The demand for skilful climate predictions on time-scales of several years to decades has increased in recent years, in particular for economic, societal and political reasons. 
Within the BMBF MiKlip consortium, a decadal prediction system on the global to local scale is currently being developed. The subproject PRODEF is part of the MiKlip-Module C, which aims at the regionalisation of decadal predictability for Central and Western Europe. In PRODEF, a combined statistical-dynamical downscaling (SDD) and a probabilistic forecast tool are developed and applied to the new Earth system model of the Max-Planck Institute Hamburg (MPI-ESM), which is part of the CMIP5 experiment. Focus is given to the decadal predictability of windstorms, related wind gusts, and wind energy potentials. SDD combines the benefits of both high-resolution dynamical downscaling and purely statistical downscaling of GCM output. Hence, the SDD approach is used to obtain a very large ensemble of highly resolved decadal forecasts. With respect to the focal points of PRODEF, a clustering of temporally evolving atmospheric fields, a circulation weather type (CWT) analysis, and an analysis of storm damage indices are applied to the full ensemble of the decadal hindcast experiments of the MPI-ESM in its lower resolution (MPI-ESM-LR). The ensemble consists of up to ten realisations per yearly initialised decadal hindcast experiment for the period 1960-2010 (altogether 287 realisations). Representatives of CWTs / clusters and single storm episodes are dynamically downscaled with the regional climate model COSMO-CLM with a horizontal resolution of 0.22°. For each model grid point, the distributions of the local climate parameters (e.g. surface wind gusts) are determined for different periods (e.g. each decade) by recombining dynamically downscaled episodes weighted with the respective weather type frequencies. The applicability of the SDD approach is illustrated with examples of decadal forecasts of the MPI-ESM. 
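The recombination step of the SDD approach (weighting downscaled episodes by weather-type frequencies) can be sketched as follows; the weather types, gust distributions, and frequencies below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical gust samples (m/s) from dynamically downscaled representative
# episodes, one pool per circulation weather type (values invented).
gusts_by_cwt = {
    "westerly": rng.gamma(9.0, 2.0, 2000),      # stormy regime, stronger gusts
    "anticyclonic": rng.gamma(4.0, 2.0, 2000),  # calm regime
}

def recombine(freqs, n_samples=5000):
    """Build a local gust distribution for a period by resampling each weather
    type's downscaled gusts in proportion to that type's frequency."""
    types = list(freqs)
    probs = np.array([freqs[t] for t in types], dtype=float)
    probs /= probs.sum()
    picks = rng.choice(len(types), size=n_samples, p=probs)
    return np.array([rng.choice(gusts_by_cwt[types[k]]) for k in picks])

# A decade with more westerly days shifts the gust distribution upward.
stormy_decade = recombine({"westerly": 0.6, "anticyclonic": 0.4})
calm_decade = recombine({"westerly": 0.3, "anticyclonic": 0.7})
```

Because only a handful of representative episodes are dynamically downscaled, whole decades can be characterized simply by re-weighting, which is what makes the very large SDD ensemble affordable.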
We are able to perform a bias correction of the frequencies of large-scale weather types and to quantify the uncertainties of decadal predictability on large and local scales arising from different initial conditions. Further, probability density functions of local parameters such as wind gusts for different periods and decades derived from the SDD approach are compared to observations and reanalysis data. Skill scores are used to quantify the decadal predictability for different lead times and to analyse whether the SDD approach shows systematic errors for some regions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4457290','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4457290"><span>Integrative, Dynamic Structural Biology at Atomic Resolution—It’s About Time</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>van den Bedem, Henry; Fraser, James S.</p> <p>2015-01-01</p> <p>Biomolecules adopt a dynamic ensemble of conformations, each with the potential to interact with binding partners or perform the chemical reactions required for a multitude of cellular functions. Recent advances in X-ray crystallography, Nuclear Magnetic Resonance (NMR) spectroscopy, and other techniques are helping us realize the dream of seeing—in atomic detail—how different parts of biomolecules exchange between functional sub-states using concerted motions. Integrative structural biology has advanced our understanding of the formation of large macromolecular complexes and how their components interact in assemblies by leveraging data from many low-resolution methods. 
Here, we review the growing opportunities for integrative, dynamic structural biology at the atomic scale, contending there is increasing synergistic potential between X-ray crystallography, NMR, and computer simulations to reveal a structural basis for protein conformational dynamics at high resolution. PMID:25825836</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMIN21D0065T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMIN21D0065T"><span>The NASA Reanalysis Ensemble Service - Advanced Capabilities for Integrated Reanalysis Access and Intercomparison</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.</p> <p>2017-12-01</p> <p>NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) A full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations. The operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) A cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables. This near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) A WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. 
The Reanalysis Ensemble Service includes the following: - New API that supports full temporal, spatial, and grid-based resolution services with sample queries - A Docker-ready RES application to deploy across platforms - Extended capabilities that enable single- and multiple-reanalysis area average, vertical average, re-gridding, standard deviation, and ensemble averages - Convenient, one-stop shopping for commonly used data products from multiple reanalyses including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly) - Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55, and NOAA/ESRL 20CR… - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management - Supporting analytic services for NASA GMAO Forward Processing datasets - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization) - The ability to compute and visualize multiple reanalyses for ease of intercomparison - Automated tools to retrieve and prepare data collections for analytic processing</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012AdSpR..49...64B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012AdSpR..49...64B"><span>Detection of sub-kilometer craters in high resolution planetary images using shape and texture features</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bandeira, Lourenço; Ding, Wei; Stepinski, Tomasz F.</p> <p>2012-01-01</p> <p>Counting craters is a paramount tool of planetary analysis because it provides relative dating of planetary surfaces.
Dating surfaces with high spatial resolution requires counting a very large number of small, sub-kilometer size craters. Exhaustive manual surveys of such craters over extensive regions are impractical, sparking interest in designing crater detection algorithms (CDAs). As part of our effort to design a CDA that is robust and practical for planetary research, we propose a crater detection approach that utilizes both shape and texture features to efficiently identify sub-kilometer craters in high resolution panchromatic images. First, a mathematical morphology-based shape analysis is used to identify regions in an image that may contain craters; only those regions - crater candidates - are the subject of further processing. Second, image texture features in combination with the boosting ensemble supervised learning algorithm are used to accurately classify previously identified candidates into craters and non-craters. The design of the proposed CDA is described and its performance is evaluated using a high resolution image of Mars for which sub-kilometer craters have been manually identified. The overall detection rate of the proposed CDA is 81%, the branching factor is 0.14, and the overall quality factor is 72%. This performance is a significant improvement over the previous CDA based exclusively on the shape features.
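The three performance figures quoted above follow the standard crater-detection definitions: detection rate D = TP/(TP+FN), branching factor B = FP/TP, and quality factor Q = TP/(TP+FP+FN). A minimal sketch, with hypothetical true/false positive counts chosen only to be roughly consistent with the reported values:

```python
def cda_metrics(tp, fp, fn):
    """Standard CDA evaluation metrics from true-positive (tp),
    false-positive (fp), and false-negative (fn) crater counts."""
    detection_rate = 100.0 * tp / (tp + fn)     # D: fraction of real craters found
    branching_factor = fp / tp                  # B: false alarms per detection
    quality_factor = 100.0 * tp / (tp + fp + fn)  # Q: overall agreement
    return detection_rate, branching_factor, quality_factor

# Illustrative counts (invented) approximating D = 81%, B = 0.14, Q = 72%
d, b, q = cda_metrics(tp=405, fp=57, fn=95)
```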
The combination of performance level and computational efficiency offered by this CDA makes it attractive for practical application.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27806613','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27806613"><span>Transitional hemodynamics in intracranial aneurysms - Comparative velocity investigations with high resolution lattice Boltzmann simulations, normal resolution ANSYS simulations, and MR imaging.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Jain, Kartik; Jiang, Jingfeng; Strother, Charles; Mardal, Kent-André</p> <p>2016-11-01</p> <p>Blood flow in intracranial aneurysms has, until recently, been considered to be disturbed but still laminar. Recent high resolution computational studies have demonstrated, however, that in some situations the flow may exhibit high frequency fluctuations that resemble weakly turbulent or transitional flow. Due to the numerous simplifying assumptions required in computational fluid dynamics (CFD) studies, the occurrence of these events in vivo remains unsettled. The detection of these fluctuations in aneurysmal blood flow (i.e., hemodynamics) by CFD poses additional challenges, as such phenomena cannot be captured in clinical data acquisition with magnetic resonance (MR) due to inadequate temporal and spatial resolutions. The authors' purpose was to address this issue by comparing results from highly resolved simulations, conventional resolution laminar simulations, and MR measurements, identifying the differences and their causes. Two aneurysms in the basilar artery, one with disturbed yet laminar flow and the other with transitional flow, were chosen.
One set of highly resolved direct numerical simulations was conducted using the lattice Boltzmann method (LBM), and another, with adequate resolution under the laminar flow assumption, was conducted using the commercially available ANSYS Fluent solver. The velocity fields obtained from the simulation results were qualitatively and statistically compared against each other and against the MR acquisition. Results from LBM, ANSYS Fluent, and MR agree well qualitatively and quantitatively for one of the aneurysms with laminar flow, in which fluctuations were <80 Hz. The comparisons for the second aneurysm, with fluctuations above ∼600 Hz, showed marked differences between LBM, ANSYS Fluent, and magnetic resonance imaging. After ensemble averaging and down-sampling to coarser space and time scales, these differences became minimal. A combination of MR-derived data and CFD can be helpful in estimating the hemodynamic environment of intracranial aneurysms. Adequately resolved CFD would suffice for a gross assessment of hemodynamics, potentially in a clinical setting, and highly resolved CFD could be helpful in a detailed and retrospective understanding of the physiological mechanisms.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017HESS...21.4681B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017HESS...21.4681B"><span>A national-scale seasonal hydrological forecast system: development and evaluation over Britain</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bell, Victoria A.; Davies, Helen N.; Kay, Alison L.; Brookshaw, Anca; Scaife, Adam A.</p> <p>2017-09-01</p> <p>Skilful winter seasonal predictions for the North Atlantic circulation and northern Europe have now been demonstrated, and the potential for seasonal hydrological forecasting in the UK is being explored.
One of the techniques being used combines seasonal rainfall forecasts provided by operational weather forecast systems with hydrological modelling tools to provide estimates of seasonal mean river flows up to a few months ahead. The work presented here shows how spatial information contained in a distributed hydrological model typically requiring high-resolution (daily or better) rainfall data can be used to provide an initial condition for a much simpler forecast model tailored to use low-resolution monthly rainfall forecasts. Rainfall forecasts (<q>hindcasts</q>) from the GloSea5 model (1996 to 2009) are used to provide the first assessment of skill in these national-scale flow forecasts. The skill in the combined modelling system is assessed for different seasons and regions of Britain, and compared to what might be achieved using other approaches such as use of an ensemble of historical rainfall in a hydrological model, or a simple flow persistence forecast. The analysis indicates that only limited forecast skill is achievable for Spring and Summer seasonal hydrological forecasts; however, Autumn and Winter flows can be reasonably well forecast using (ensemble mean) rainfall forecasts based on either GloSea5 forecasts or historical rainfall (the preferred type of forecast depends on the region). Flow forecasts using ensemble mean GloSea5 rainfall perform most consistently well across Britain, and provide the most skilful forecasts overall at the 3-month lead time. Much of the skill (64 %) in the 1-month ahead seasonal flow forecasts can be attributed to the hydrological initial condition (particularly in regions with a significant groundwater contribution to flows), whereas for the 3-month ahead lead time, GloSea5 forecasts account for ˜ 70 % of the forecast skill (mostly in areas of high rainfall to the north and west) and only 30 % of the skill arises from hydrological memory (typically groundwater-dominated areas). 
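The comparison above of GloSea5-driven flow forecasts against a simple flow-persistence benchmark can be sketched with a mean-square-error skill score. All data and names below are synthetic and illustrative, not from the GloSea5 hindcasts:

```python
import numpy as np

rng = np.random.default_rng(1)
obs = 10.0 + rng.normal(0.0, 2.0, 40)        # observed seasonal-mean flows
forecast = obs + rng.normal(0.0, 1.0, 40)    # a skilful forecast (small error)
persistence = np.roll(obs, 1)                # previous season persisted forward
persistence[0] = obs[0]                      # no earlier season for the first value

def mse_skill(fcst, ref, obs):
    """MSE skill score: 1 is a perfect forecast, <= 0 means no improvement
    over the reference (here, persistence)."""
    return 1.0 - np.mean((fcst - obs) ** 2) / np.mean((ref - obs) ** 2)

ss = mse_skill(forecast, persistence, obs)
```

The same score with an ensemble-mean rainfall-driven forecast in place of `forecast`, or historical-rainfall ensembles in place of `persistence`, reproduces the kind of benchmark comparison described in the abstract.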
Given the high spatial heterogeneity in typical patterns of UK rainfall and evaporation, future development of skilful spatially distributed seasonal forecasts could lead to substantial improvements in seasonal flow forecast capability, potentially benefitting practitioners interested in predicting hydrological extremes, not only in the UK but also across Europe.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..1412962J','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..1412962J"><span>Using HPC within an operational forecasting configuration</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Jagers, H. R. A.; Genseberger, M.; van den Broek, M. A. F. H.</p> <p>2012-04-01</p> <p>Various natural disasters are caused by high-intensity events, for example: extreme rainfall can in a short time cause major damage in river catchments, storms can cause havoc in coastal areas. To assist emergency response teams in operational decisions, it's important to have reliable information and predictions as soon as possible. This starts before the event by providing early warnings about imminent risks and estimated probabilities of possible scenarios. In the context of various applications worldwide, Deltares has developed an open and highly configurable forecasting and early warning system: Delft-FEWS. Finding the right balance between simulation time (and hence prediction lead time) and simulation accuracy and detail is challenging. Model resolution may be crucial to capture certain critical physical processes. Uncertainty in forcing conditions may require running large ensembles of models; data assimilation techniques may require additional ensembles and repeated simulations. The computational demand is steadily increasing and data streams become bigger. 
Using HPC resources is a logical step; in different settings Delft-FEWS has been configured to take advantage of distributed computational resources available to improve and accelerate the forecasting process (e.g. Montanari et al, 2006). We will illustrate the system by means of a couple of practical applications including the real-time dynamic forecasting of wind driven waves, flow of water, and wave overtopping at dikes of Lake IJssel and neighboring lakes in the center of The Netherlands. Montanari et al., 2006. Development of an ensemble flood forecasting system for the Po river basin, First MAP D-PHASE Scientific Meeting, 6-8 November 2006, Vienna, Austria.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A52G..04C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A52G..04C"><span>Creating Weather System Ensembles Through Synergistic Process Modeling and Machine Learning</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chen, B.; Posselt, D. J.; Nguyen, H.; Wu, L.; Su, H.; Braverman, A. J.</p> <p>2017-12-01</p> <p>Earth's weather and climate are sensitive to a variety of control factors (e.g., initial state, forcing functions, etc). Characterizing the response of the atmosphere to a change in initial conditions or model forcing is critical for weather forecasting (ensemble prediction) and climate change assessment. Input - response relationships can be quantified by generating an ensemble of multiple (100s to 1000s) realistic realizations of weather and climate states. Atmospheric numerical models generate simulated data through discretized numerical approximation of the partial differential equations (PDEs) governing the underlying physics. 
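The discretized PDE approximation mentioned above rests on finite-difference stencils: a 3x3 neighbourhood of grid values is exactly the information centered differences need for first and second spatial derivatives. A minimal numpy sketch (the grid spacing and test function are invented):

```python
import numpy as np

def stencil_derivatives(patch, h=1.0):
    """Centered first and second differences at the middle of a 3x3 patch,
    assuming a uniform grid spacing h in both directions."""
    c = patch[1, 1]
    dx = (patch[1, 2] - patch[1, 0]) / (2 * h)       # df/dx
    dy = (patch[2, 1] - patch[0, 1]) / (2 * h)       # df/dy
    d2x = (patch[1, 2] - 2 * c + patch[1, 0]) / h**2  # d2f/dx2
    d2y = (patch[2, 1] - 2 * c + patch[0, 1]) / h**2  # d2f/dy2
    return dx, dy, d2x, d2y

# For f(x, y) = x**2 on integer coordinates: df/dx = 2x and d2f/dx2 = 2
x = np.arange(3.0)
patch = (x[None, :] ** 2).repeat(3, axis=0)  # rows identical, varies along x
dx, dy, d2x, d2y = stencil_derivatives(patch)
```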
However, the computational expense of running high resolution atmospheric state models makes generation of more than a few simulations infeasible. Here, we discuss an experiment wherein we approximate the numerical PDE solver within the Weather Research and Forecasting (WRF) Model using neural networks trained on a subset of model run outputs. Once trained, these neural nets can produce a large number of realizations of weather states from a small number of deterministic simulations, at speeds that are orders of magnitude faster than the underlying PDE solver. Our neural network architecture is inspired by the governing partial differential equations. These equations are location-invariant and consist of first and second derivatives. As such, we use a 3x3 lon-lat grid of atmospheric profiles as the predictor in the neural net to provide the network the information necessary to compute the first and second derivatives. Results indicate that the neural network algorithm can approximate the PDE outputs with a high degree of accuracy (less than 1% error), and that this error increases as a function of the prediction time lag.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_15 --> <div id="page_16" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="301"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016NuPhB.910..842B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016NuPhB.910..842B"><span>I = 1 and I = 2 π-π scattering phase shifts from Nf = 2 + 1 lattice QCD</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bulava, John; Fahy, Brendan; Hörz, Ben; Juge, Keisuke J.; Morningstar, Colin; Wong, Chik Him</p> <p>2016-09-01</p> <p>The I = 1 p-wave and I = 2 s-wave elastic π-π scattering amplitudes are calculated from a first-principles lattice QCD simulation using a single ensemble of gauge field configurations with Nf = 2 + 1 dynamical flavors of anisotropic clover-improved Wilson fermions. This ensemble has a large spatial volume V = (3.7 fm)^3, pion mass mπ = 230 MeV, and spatial lattice spacing as = 0.11 fm. Calculation of the necessary temporal correlation matrices is efficiently performed using the stochastic LapH method, while the large volume enables an improved energy resolution compared to previous work. For this single ensemble we obtain mρ/mπ = 3.350 (24), gρππ = 5.99 (26), and a clear signal for the I = 2 s-wave.
The success of the stochastic LapH method in this proof-of-principle large-volume calculation paves the way for quantitative study of the lattice spacing effects and quark mass dependence of scattering amplitudes using state-of-the-art ensembles.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20100017230','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20100017230"><span>Forecasting Lightning Threat Using WRF Proxy Fields</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>McCaul, E. W., Jr.</p> <p>2010-01-01</p> <p>Objectives: Given that high-resolution WRF forecasts can capture the character of convective outbreaks, we seek to: 1. Create WRF forecasts of LTG threat (1-24 h), based on 2 proxy fields from explicitly simulated convection: - graupel flux near -15 C (captures LTG time variability) - vertically integrated ice (captures LTG threat area). 2. Calibrate each threat to yield accurate quantitative peak flash rate densities. 3. Also evaluate threats for areal coverage, time variability. 4. Blend threats to optimize results. 5. Examine sensitivity to model mesh, microphysics. Methods: 1. Use high-resolution 2-km WRF simulations to prognose convection for a diverse series of selected case studies. 2. Evaluate graupel fluxes; vertically integrated ice (VII). 3. Calibrate WRF LTG proxies using peak total LTG flash rate densities from NALMA; relationships look linear, with the regression line passing through the origin. 4. Truncate low threat values to make threat areal coverage match NALMA flash extent density obs. 5. Blend proxies to achieve optimal performance. 6.
Study CAPS 4-km ensembles to evaluate sensitivities.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017APS..DPPJ11010Q','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017APS..DPPJ11010Q"><span>High-Resolution Measurement of the Turbulent Frequency-Wavenumber Power Spectrum in a Laboratory Magnetosphere</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Qian, T. M.; Mauel, M. E.</p> <p>2017-10-01</p> <p>In a laboratory magnetosphere, plasma is confined by a strong dipole magnet, where interchange and entropy mode turbulence can be studied and controlled in near steady-state conditions. Whole-plasma imaging shows turbulence dominated by long wavelength modes having chaotic amplitudes and phases. Here, we report, for the first time, a high-resolution measurement of the frequency-wavenumber power spectrum by applying the method of Capon to simultaneous multi-point measurement of electrostatic entropy modes using an array of floating potential probes. Unlike previously reported measurements in which ensemble correlation between two probes detected only the dominant wavenumber, Capon's ``maximum likelihood method'' uses all available probes to produce a frequency-wavenumber spectrum, showing the existence of modes propagating in both electron and ion magnetic drift directions. We also discuss the wider application of this technique to laboratory and magnetospheric plasmas with simultaneous multi-point measurements.
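Capon's maximum likelihood estimate described above evaluates P(k) = 1 / (e(k)^H R^-1 e(k)), where R is the cross-spectral matrix of the probe signals at one frequency and e(k) the steering vector over probe positions. A hedged sketch for a 1-D probe array; the probe geometry, wave parameters, and noise level are all invented:

```python
import numpy as np

rng = np.random.default_rng(2)
n_probes, n_snap = 8, 64
positions = np.arange(n_probes) * 1.0   # equally spaced probes (arbitrary units)
k_true = 0.7                            # true wavenumber (rad / unit)

# Complex snapshots at one frequency: a single propagating mode plus noise
phases = rng.uniform(0, 2 * np.pi, n_snap)
snaps = (np.exp(1j * (k_true * positions[:, None] + phases[None, :]))
         + 0.1 * (rng.normal(size=(n_probes, n_snap))
                  + 1j * rng.normal(size=(n_probes, n_snap))))

R = snaps @ snaps.conj().T / n_snap     # cross-spectral (covariance) matrix
R += 1e-3 * np.eye(n_probes)            # diagonal loading for stability
R_inv = np.linalg.inv(R)

# Capon spectrum over a wavenumber grid
k_grid = np.linspace(-np.pi, np.pi, 721)
power = np.empty_like(k_grid)
for i, k in enumerate(k_grid):
    e = np.exp(1j * k * positions)      # steering vector for wavenumber k
    power[i] = 1.0 / np.real(e.conj() @ R_inv @ e)

k_peak = k_grid[np.argmax(power)]       # should sit near k_true
```

Unlike a two-probe cross-correlation, which yields only the dominant wavenumber, scanning `k_grid` resolves multiple simultaneous modes, including counter-propagating ones.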
Supported by NSF-DOE Partnership in Plasma Science Grant DE-FG02-00ER54585.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1911237M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1911237M"><span>Improving medium-range ensemble streamflow forecasts through statistical post-processing</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey</p> <p>2017-04-01</p> <p>Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. 
The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011TellA..63..531K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011TellA..63..531K"><span>Verification and intercomparison of mesoscale ensemble prediction systems in the Beijing 2008 Olympics Research and Development Project</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui</p> <p>2011-05-01</p> <p>During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to objectively verify these experiments and to clarify the problems associated with the current MEP systems through the same experiences. Verification was performed using the MEP outputs interpolated into a common verification domain with a horizontal resolution of 15 km. 
For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean reduced the forecast errors compared with the individual control forecasts in the verification against the analysis fields. However, each system exhibited individual characteristics according to the MEP method. Some participants used physical perturbation methods. The significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that it is necessary to pay careful attention to physical perturbations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27487904','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27487904"><span>Resolution recovery for Compton camera using origin ensemble algorithm.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Andreyev, A; Celler, A; Ozsahin, I; Sitek, A</p> <p>2016-08-01</p> <p>Compton cameras (CCs) use electronic collimation to reconstruct the images of activity distribution. Although this approach can greatly improve imaging efficiency, due to complex geometry of the CC principle, image reconstruction with the standard iterative algorithms, such as ordered subset expectation maximization (OSEM), can be very time-consuming, even more so if resolution recovery (RR) is implemented. We have previously shown that the origin ensemble (OE) algorithm can be used for the reconstruction of the CC data. Here we propose a method of extending our OE algorithm to include RR. To validate the proposed algorithm we used Monte Carlo simulations of a CC composed of multiple layers of pixelated CZT detectors and designed for imaging small animals.
A series of CC acquisitions of small hot spheres and the Derenzo phantom placed in air were simulated. Images obtained from (a) the exact data, (b) blurred data reconstructed without resolution recovery, and (c) blurred data reconstructed with resolution recovery were compared. Furthermore, the reconstructed contrast-to-background ratios were investigated using the phantom with nine spheres placed in a hot background. Our simulations demonstrate that the proposed method allows for the recovery of the resolution loss that is due to imperfect accuracy of event detection. Additionally, tests of camera sensitivity corresponding to different detector configurations demonstrate that the proposed CC design has sensitivity comparable to PET. When the same number of events were considered, the computation time per iteration increased only by a factor of 2 when OE reconstruction with the resolution recovery correction was performed relative to the original OE algorithm. We estimate that the addition of resolution recovery to the OSEM would increase reconstruction times by 2-3 orders of magnitude per iteration. The results of our tests demonstrate the improvement of image resolution provided by the OE reconstructions with resolution recovery. The quality of images and their contrast are similar to those obtained from the OE reconstructions from scans simulated with perfect energy and spatial resolutions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A51K..07Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A51K..07Z"><span>Finding Snowmageddon: Detecting and quantifying northeastern U.S. snowstorms in a multi-decadal global climate ensemble</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zarzycki, C. 
M.</p> <p>2017-12-01</p> <p>The northeastern coast of the United States is particularly vulnerable to impacts from extratropical cyclones during winter months, which produce heavy precipitation, high winds, and coastal flooding. These impacts are amplified by the proximity of major population centers to common storm tracks and include risks to health and welfare, massive transportation disruption, lost spending productivity, power outages, and structural damage. Historically, understanding regional snowfall in climate models has generally centered around seasonal mean climatologies even though major impacts typically occur at the scales of hours to days. To quantify discrete snowstorms at the event level, we describe a new objective detection algorithm for gridded data based on the Regional Snowfall Index (RSI) produced by NOAA's National Centers for Environmental Information. The algorithm uses 6-hourly precipitation to collocate storm-integrated snowfall with population density to produce a distribution of snowstorms with societally relevant impacts. The algorithm is tested on the Community Earth System Model (CESM) Large Ensemble Project (LENS) data. Present-day distributions of snowfall events are well replicated within the ensemble. We discuss classification sensitivities to assumptions made in determining precipitation phase and snow water equivalent.
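The collocation idea above, scoring each storm by where its snow falls relative to people in the spirit of the RSI, might be sketched as follows. The grids, thresholds, and function name are invented for illustration and are not the published algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
snowfall = rng.gamma(2.0, 5.0, size=(20, 20))        # storm-total snowfall (cm)
population = rng.lognormal(3.0, 1.0, size=(20, 20))  # persons per grid cell

def impact_score(snow_cm, pop, snow_threshold_cm=10.0):
    """Snow-weighted population affected: sum snowfall x population over
    cells where storm-total snowfall exceeds an (assumed) impact threshold."""
    mask = snow_cm >= snow_threshold_cm
    return float(np.sum(snow_cm[mask] * pop[mask]))

score = impact_score(snowfall, population)
```

Ranking candidate storms by such a score, rather than by area-mean snowfall alone, is what lets the detection pick out events with societally relevant impacts.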
We also explore projected reductions in mid-century and end-of-century snowstorms due to changes in snowfall rates and precipitation phase, as well as highlight potential improvements in storm representation from refined horizontal resolution in model simulations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20180002172','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20180002172"><span>The GMAO Hybrid Ensemble-Variational Atmospheric Data Assimilation System: Version 2.0</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Todling, Ricardo; El Akkraoui, Amal</p> <p>2018-01-01</p> <p>This document describes the implementation and usage of the Goddard Earth Observing System (GEOS) Hybrid Ensemble-Variational Atmospheric Data Assimilation System (Hybrid EVADAS). Its aim is to provide comprehensive guidance to users of GEOS ADAS interested in experimenting with its hybrid functionalities. The document is also aimed at providing a short summary of the state-of-science in this release of the hybrid system. As explained here, the ensemble data assimilation system (EnADAS) mechanism added to GEOS ADAS to enable hybrid data assimilation applications has been introduced to the pre-existing machinery of GEOS in the most non-intrusive possible way. Only very minor changes have been made to the original scripts controlling GEOS ADAS with the objective of facilitating its usage by both researchers and the GMAO's near-real-time Forward Processing applications. 
In a hybrid scenario, two data assimilation systems run concurrently in a two-way feedback mode such that: the ensemble provides the background ensemble perturbations required by the deterministic (typically high-resolution) hybrid analysis of the ADAS; and the deterministic ADAS provides analysis information for recentering of the EnADAS analyses, as well as information necessary to ensure that observation bias correction procedures are consistent between the deterministic ADAS and the EnADAS. The non-intrusive approach to introducing hybrid capability to GEOS ADAS means, in particular, that previously existing features continue to be available. Thus, not only is this upgraded version of GEOS ADAS capable of supporting new applications such as Hybrid 3D-Var, 3D-EnVar, 4D-EnVar and Hybrid 4D-EnVar, it remains possible to use GEOS ADAS in its traditional 3D-Var mode, which has been used in both MERRA and MERRA-2. Furthermore, as described in this document, GEOS ADAS also supports a configuration for exercising a purely ensemble-based assimilation strategy which can be fully decoupled from its variational component. We should point out that Release 1.0 of this document was made available to GMAO in mid-2013, when we introduced the Hybrid 3D-Var capability to GEOS ADAS. That initial version of the documentation included a considerably different state-of-science introductory section but much of the same detailed description of the mechanisms of GEOS EnADAS. We are glad to report that a few of the desirable future work items listed in Release 1.0 have now been added to the present version of GEOS EnADAS. 
These include the ability to exercise an Ensemble Prediction System that uses the ensemble analyses of GEOS EnADAS and (a very early, but functional version of) a tool to support Ensemble Forecast Sensitivity and Observation Impact applications.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14.3746K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14.3746K"><span>The Predictability of Dry-Season Precipitation in Tropical West Africa</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Knippertz, P.; Davis, J.; Fink, A. H.</p> <p>2012-04-01</p> <p>Precipitation during the boreal winter dry season in tropical West Africa is rare but occasionally connected to high impacts on the local population. Previous work has shown that these events are usually connected to a trough over northwestern Africa, an extensive cloud plume on its eastern side, unusual precipitation at the northern and western fringes of the Sahara, and reduced surface pressure over the southern Sahara and Sahel, which allows an inflow of moist southerlies from the Gulf of Guinea to feed the unusual dry-season rainfall. These results also suggest that the extratropical influence enhances the predictability of these events on the synoptic timescale. Here we further investigate this question for the 11 dry seasons (November-March) 1998/99-2008/09 using rainfall estimates from TRMM (Tropical Rainfall Measuring Mission) and GPCP (Global Precipitation Climatology Project), and operational ensemble predictions from the European Centre for Medium-Range Weather Forecasts (ECMWF). All fields are averaged over the study area 7.5-15°N, 10°W-10°E, which spans most of southern West Africa. 
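Ensemble forecasts of area-averaged precipitation like these are typically verified probabilistically against the satellite estimates, for example with the Brier score for a threshold-exceedance event. A minimal sketch (array layout and threshold are illustrative assumptions):

```python
import numpy as np

def brier_score(ensemble_forecasts, observations, threshold):
    """Brier score for the event 'precipitation exceeds threshold'.
    ensemble_forecasts: (n_cases, n_members); observations: (n_cases,)."""
    # Forecast probability = fraction of members above the threshold
    prob = (ensemble_forecasts > threshold).mean(axis=1)
    # Binary observed outcome for the same event
    event = (observations > threshold).astype(float)
    # Mean squared difference between probability and outcome (0 is perfect)
    return float(np.mean((prob - event) ** 2))
```

The same member-exceedance probabilities also feed reliability diagrams and ROC curves of the kind discussed in this record.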
For each 0000 UTC analysis time, the daily precipitation estimates are accumulated to pentads and compared with 120-hour predictions starting at the same time. Compared to TRMM, the ensemble mean shows a weak positive bias, whereas there is a substantial negative bias with regard to GPCP. Temporal correlations reach a high value of 0.8 for both datasets, showing similar synoptic variability despite the differences in total amount. Standard probabilistic evaluation methods such as relative operating characteristic (ROC) diagrams indicate remarkably good reliability, resolution and skill, particularly for lower precipitation thresholds. Not surprisingly, forecasts cluster at low probabilities for higher thresholds, but the reliability and ROC score are still reasonably high. The results show that global ensemble prediction systems are capable of predicting dry-season rainfall events in southern West Africa well, at least on regional spatial and synoptic time scales. These results should encourage West African weather services to capitalize more on the valuable information provided by ensemble prediction systems during the dry season.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AdSR...14...77B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AdSR...14...77B"><span>Sensitivity of sea-level forecasting to the horizontal resolution and sea surface forcing for different configurations of an oceanographic model of the Adriatic Sea</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bressan, Lidia; Valentini, Andrea; Paccagnella, Tiziana; Montani, Andrea; Marsigli, Chiara; Stefania Tesini, Maria</p> <p>2017-04-01</p> <p>At the Hydro-meteo-climate service of the Regional environmental agency of Emilia-Romagna, Italy (Arpae-SIMC), the oceanographic numerical model 
AdriaROMS is used in the operational forecasting suite to compute sea level, temperature, salinity and 3-D current fields of the Adriatic Sea (northern Mediterranean Sea). In order to evaluate the performance of the sea-level forecast and to study different configurations of the ROMS model, two marine storms that occurred on the Emilia-Romagna coast during the winter of 2015-2016 are investigated. The main focus of this study is to analyse the sensitivity of the model to the horizontal resolution and to the meteorological forcing. To this end, the model is run with two different configurations and with two horizontal grids at 1 and 2 km resolution. To study the influence of the meteorological forcing, the two storms have been reproduced by running ROMS in ensemble mode, forced by the 16 members of the COSMO-LEPS meteorological ensemble system. Possible optimizations of the model set-up are deduced from the comparison of the different run outputs.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016GMD.....9.4185H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016GMD.....9.4185H"><span>High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; Senior, Catherine A.; Bellucci, Alessio; Bao, Qing; Chang, Ping; Corti, Susanna; Fučkar, Neven S.; Guemas, Virginie; von Hardenberg, Jost; Hazeleger, Wilco; Kodama, Chihiro; Koenigk, Torben; Leung, L. 
Ruby; Lu, Jian; Luo, Jing-Jia; Mao, Jiafu; Mizielinski, Matthew S.; Mizuta, Ryo; Nobre, Paulo; Satoh, Masaki; Scoccimarro, Enrico; Semmler, Tido; Small, Justin; von Storch, Jin-Song</p> <p>2016-11-01</p> <p>Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs). Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. 
A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950-2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, "what are the origins and consequences of systematic model biases?", but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20120015888','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20120015888"><span>A Robust Multi-Scale Modeling System for the Study of Cloud and Precipitation Processes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Tao, Wei-Kuo</p> <p>2012-01-01</p> <p>During the past decade, numerical weather and global non-hydrostatic models have started using more complex microphysical schemes originally developed for high-resolution cloud-resolving models (CRMs) with 1-2 km or less horizontal resolutions. These microphysical schemes affect the dynamics through the release of latent heat (buoyancy loading and pressure gradient), the radiation through cloud coverage (vertical distribution of cloud species), and surface processes through rainfall (both amount and intensity). 
Recently, several major improvements of ice microphysical processes (or schemes) have been developed for the cloud-resolving Goddard Cumulus Ensemble (GCE) model and the regional-scale Weather Research and Forecasting (WRF) model. These improvements include an improved 3-ICE (cloud ice, snow and graupel) scheme (Lang et al. 2010); a 4-ICE (cloud ice, snow, graupel and hail) scheme; a spectral bin microphysics scheme; and two different two-moment microphysics schemes. The performance of these schemes has been evaluated by using observational data from TRMM and other major field campaigns. In this talk, we will present the high-resolution (1 km) GCE and WRF model simulations and compare the simulated model results with observations from recent field campaigns [i.e., midlatitude continental spring season (MC3E; 2010), high latitude cold-season (C3VP, 2007; GCPEx, 2012), and tropical oceanic (TWP-ICE, 2006)].</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1715281G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1715281G"><span>Quantifying the effect of Tmax extreme events on local adaptation to climate change of maize crop in Andalusia for the 21st century</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gabaldon, Clara; Lorite, Ignacio J.; Ines Minguez, M.; Lizaso, Jon; Dosio, Alessandro; Sanchez, Enrique; Ruiz-Ramos, Margarita</p> <p>2015-04-01</p> <p>Extreme events of Tmax can threaten maize production in Andalusia (Ruiz-Ramos et al., 2011). The objective of this work is to attempt a quantification of the effects of Tmax extreme events on the previously identified (Gabaldón et al., 2013) local adaptation strategies to climate change of irrigated maize crop in Andalusia for the first half of the 21st century. 
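The quantification rests on a heat stress index of the kind described in this study (after Teixeira et al., 2013): damage intensity ramps linearly from 0.0 at a critical temperature to 1.0 at a limit temperature. A minimal sketch, where the default temperatures are illustrative assumptions rather than the study's calibrated values:

```python
def heat_stress_index(tmax, t_crit=35.0, t_limit=45.0):
    """Linear heat-stress ramp: 0.0 at the critical temperature,
    1.0 at the limit temperature (default temperatures are illustrative)."""
    if tmax <= t_crit:
        return 0.0
    if tmax >= t_limit:
        return 1.0
    return (tmax - t_crit) / (t_limit - t_crit)
```

Combining this index with the attainable yield for each phenological stage gives the normalized production damage index used to estimate yield losses.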
This study is focused on five Andalusia locations. The local adaptation strategies identified consisted of combinations of changes in sowing dates and choice of cultivar (Gabaldón et al., 2013). The modified cultivar features were the duration of the phenological phases and the grain filling rate. The phenological and yield simulations with the adaptive changes were obtained from a modelling chain: current simulated climate and future climate scenarios (2013-2050) were taken from a group of regional climate models at high resolution (25 km) from the European Project ENSEMBLES (http://www.ensembles-eu.org/). After bias correcting these data for temperature and precipitation (Dosio and Paruolo, 2011; Dosio et al., 2012), crop simulations were generated by the CERES-maize model (Jones and Kiniry, 1986) under the DSSAT platform, previously calibrated and validated. Quantification of the effects of extreme Tmax on maize yield was computed for different phenological stages following Teixeira et al. (2013). A heat stress index was computed; this index assumes that yield-damage intensity due to heat stress increases linearly from 0.0 at a critical temperature to a maximum of 1.0 at a limit temperature. The decrease of crop yield is then computed by a normalized production damage index which combines attainable yield and the heat stress index for each location. Selection of the most suitable adaptation strategy will be reviewed and discussed in light of the quantified effect on crop yield of the projected change in Tmax extreme events. This study will contribute to the MACSUR knowledge Hub within the Joint Programming Initiative on Agriculture, Food Security and Climate Change (FACCE-JPI) of the EU and is financed by the MULCLIVAR project (CGL2012-38923-C02-02) and the IFAPA project AGR6126 from the Junta de Andalucía, Spain. References Dosio A. and Paruolo P., 2011. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Evaluation on the present climate. 
Journal of Geophysical Research, Vol. 116, D16106, doi:10.1029/2011JD015934. Dosio A., Paruolo P. and Rojas R., 2012. Bias correction of the ENSEMBLES high-resolution climate change projections for use by impact models: Analysis of the climate change signal. Journal of Geophysical Research, Vol. 117, D17, doi:10.1029/2012JD017968. Gabaldón C, Lorite IJ, Mínguez MI, Dosio A, Sánchez-Sánchez E and Ruiz-Ramos M, 2013. Evaluation of local adaptation strategies to climate change of maize crop in Andalusia for the first half of the 21st century. Geophysical Research Abstracts, Vol. 15, EGU2013-13625, 2013. EGU General Assembly 2013, April 2013, Vienna, Austria. Jones C.A. and J.R. Kiniry. 1986. CERES-Maize: A simulation model of maize growth and development. Texas A&M Univ. Press, College Station. Ruiz-Ramos M., E. Sanchez, C. Gallardo, and M.I. Minguez. 2011. Impacts of projected maximum temperature extremes for C21 by an ensemble of regional climate models on cereal cropping systems in the Iberian Peninsula. Natural Hazards and Earth System Sciences 11: 3275-3291. Teixeira EI, Fischer G, van Velthuizen H, Walter C, Ewert F. Global hotspots of heat stress on agricultural crops due to climate change. Agric For Meteorol. 
2013;170(15):206-215.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/22416042-canonical-ensemble-state-averaged-complete-active-space-self-consistent-field-sa-casscf-strategy-problems-more-diabatic-than-adiabatic-states-charge-bond-resonance-monomethine-cyanines','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/22416042-canonical-ensemble-state-averaged-complete-active-space-self-consistent-field-sa-casscf-strategy-problems-more-diabatic-than-adiabatic-states-charge-bond-resonance-monomethine-cyanines"><span>Canonical-ensemble state-averaged complete active space self-consistent field (SA-CASSCF) strategy for problems with more diabatic than adiabatic states: Charge-bond resonance in monomethine cyanines</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Olsen, Seth, E-mail: seth.olsen@uq.edu.au</p> <p>2015-01-28</p> <p>This paper reviews basic results from a theory of the a priori classical probabilities (weights) in state-averaged complete active space self-consistent field (SA-CASSCF) models. It addresses how the classical probabilities limit the invariance of the self-consistency condition to transformations of the complete active space configuration interaction (CAS-CI) problem. Such transformations are of interest for choosing representations of the SA-CASSCF solution that are diabatic with respect to some interaction. I achieve the known result that a SA-CASSCF can be self-consistently transformed only within degenerate subspaces of the CAS-CI ensemble density matrix. For uniformly distributed (“microcanonical”) SA-CASSCF ensembles, self-consistency is invariant to any unitary CAS-CI transformation that acts locally on the ensemble support. Most SA-CASSCF applications in current literature are microcanonical. 
A problem with microcanonical SA-CASSCF models for problems with “more diabatic than adiabatic” states is described. The problem is that not all diabatic energies and couplings are self-consistently resolvable. A canonical-ensemble SA-CASSCF strategy is proposed to solve the problem. For canonical-ensemble SA-CASSCF, the equilibrated ensemble is a Boltzmann density matrix parametrized by its own CAS-CI Hamiltonian and a Lagrange multiplier acting as an inverse “temperature,” unrelated to the physical temperature. Like the convergence criterion for microcanonical-ensemble SA-CASSCF, the equilibration condition for canonical-ensemble SA-CASSCF is invariant to transformations that act locally on the ensemble CAS-CI density matrix. The advantage of a canonical-ensemble description is that more adiabatic states can be included in the support of the ensemble without running into convergence problems. The constraint on the dimensionality of the problem is relieved by the introduction of an energy constraint. The method is illustrated with a complete active space valence-bond (CASVB) analysis of the charge/bond resonance electronic structure of a monomethine cyanine: Michler’s hydrol blue. The diabatic CASVB representation is shown to vary weakly for “temperatures” corresponding to visible photon energies. Canonical-ensemble SA-CASSCF enables the resolution of energies and couplings for all covalent and ionic CASVB structures contributing to the SA-CASSCF ensemble. The CASVB solution describes resonance of charge- and bond-localized electronic structures interacting via bridge resonance superexchange. 
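The canonical-ensemble weighting described above can be written compactly (a sketch of the standard Boltzmann form; here $\hat{H}$ denotes the ensemble CAS-CI Hamiltonian, $E_k$ its eigenvalues, and $\beta$ the formal inverse "temperature," which is unrelated to physical temperature):

```latex
\hat{\rho}(\beta) \;=\; \frac{e^{-\beta \hat{H}}}{\operatorname{Tr}\!\left[e^{-\beta \hat{H}}\right]},
\qquad
w_k(\beta) \;=\; \frac{e^{-\beta E_k}}{\sum_j e^{-\beta E_j}} .
```

In the microcanonical limit $\beta \to 0$ the weights $w_k$ become uniform, recovering the conventional equal-weight SA-CASSCF ensemble.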
The resonance couplings can be separated into channels associated with either covalent charge delocalization or chemical bonding interactions, with the latter significantly stronger than the former.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25637978','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25637978"><span>Canonical-ensemble state-averaged complete active space self-consistent field (SA-CASSCF) strategy for problems with more diabatic than adiabatic states: charge-bond resonance in monomethine cyanines.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Olsen, Seth</p> <p>2015-01-28</p> <p>This paper reviews basic results from a theory of the a priori classical probabilities (weights) in state-averaged complete active space self-consistent field (SA-CASSCF) models. It addresses how the classical probabilities limit the invariance of the self-consistency condition to transformations of the complete active space configuration interaction (CAS-CI) problem. Such transformations are of interest for choosing representations of the SA-CASSCF solution that are diabatic with respect to some interaction. I achieve the known result that a SA-CASSCF can be self-consistently transformed only within degenerate subspaces of the CAS-CI ensemble density matrix. For uniformly distributed ("microcanonical") SA-CASSCF ensembles, self-consistency is invariant to any unitary CAS-CI transformation that acts locally on the ensemble support. Most SA-CASSCF applications in current literature are microcanonical. A problem with microcanonical SA-CASSCF models for problems with "more diabatic than adiabatic" states is described. The problem is that not all diabatic energies and couplings are self-consistently resolvable. 
A canonical-ensemble SA-CASSCF strategy is proposed to solve the problem. For canonical-ensemble SA-CASSCF, the equilibrated ensemble is a Boltzmann density matrix parametrized by its own CAS-CI Hamiltonian and a Lagrange multiplier acting as an inverse "temperature," unrelated to the physical temperature. Like the convergence criterion for microcanonical-ensemble SA-CASSCF, the equilibration condition for canonical-ensemble SA-CASSCF is invariant to transformations that act locally on the ensemble CAS-CI density matrix. The advantage of a canonical-ensemble description is that more adiabatic states can be included in the support of the ensemble without running into convergence problems. The constraint on the dimensionality of the problem is relieved by the introduction of an energy constraint. The method is illustrated with a complete active space valence-bond (CASVB) analysis of the charge/bond resonance electronic structure of a monomethine cyanine: Michler's hydrol blue. The diabatic CASVB representation is shown to vary weakly for "temperatures" corresponding to visible photon energies. Canonical-ensemble SA-CASSCF enables the resolution of energies and couplings for all covalent and ionic CASVB structures contributing to the SA-CASSCF ensemble. The CASVB solution describes resonance of charge- and bond-localized electronic structures interacting via bridge resonance superexchange. 
The resonance couplings can be separated into channels associated with either covalent charge delocalization or chemical bonding interactions, with the latter significantly stronger than the former.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2007AGUFM.A12D..01K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2007AGUFM.A12D..01K"><span>High Resolution Modeling of Hurricanes in a Climate Context</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Knutson, T. R.</p> <p>2007-12-01</p> <p>Modeling of tropical cyclone activity in a climate context initially focused on simulation of relatively weak tropical storm-like disturbances as resolved by coarse-grid (200 km) global models. As computing power has increased, multi-year simulations with global models of 20-30 km grid spacing have become feasible. Increased resolution has also allowed for the simulation of storms of increasing intensity, and some global models generate storms of hurricane strength, depending on their resolution and other factors, although detailed hurricane structure is not simulated realistically. Results from some recent high-resolution global model studies are reviewed. An alternative for hurricane simulation is regional downscaling. An early approach was to embed an operational (GFDL) hurricane prediction model within a global model solution, either for 5-day case studies of particular model storm cases, or for "idealized experiments" where an initial vortex is inserted into idealized environments derived from global model statistics. Using this approach, hurricanes up to category five intensity can be simulated, owing to the model's relatively high resolution (9 km grid) and refined physics. 
Variants on this approach have been used to provide modeling support for theoretical predictions that greenhouse warming will increase the maximum intensities of hurricanes. These modeling studies also simulate increased hurricane rainfall rates in a warmer climate. The studies do not address hurricane frequency issues, and vertical shear is neglected in the idealized studies. A recent development is the use of regional model dynamical downscaling for extended (e.g., season-length) integrations of hurricane activity. In a study for the Atlantic basin, a non-hydrostatic model with a grid spacing of 18 km is run without convective parameterization, but with internal spectral nudging toward observed large-scale (basin wavenumbers 0-2) atmospheric conditions from reanalyses. Using this approach, our model reproduces the observed increase in Atlantic hurricane activity (numbers, Accumulated Cyclone Energy (ACE), Power Dissipation Index (PDI), etc.) over the period 1980-2006 fairly realistically, and also simulates ENSO-related interannual variations in hurricane counts. Annual simulated hurricane counts from a two-member ensemble correlate with observed counts at r=0.86. However, the model does not simulate hurricanes as intense as those observed, with minimum central pressures of 937 hPa (category 4) and maximum surface winds of 47 m/s (category 2) being the most intense simulated so far in these experiments. To explore possible impacts of future climate warming on Atlantic hurricane activity, we are re-running the 1980-2006 seasons, keeping the interannual to multidecadal variations unchanged, but altering the August-October mean climate according to changes simulated by an 18-member ensemble of AR4 climate models (years 2080-2099, A1B emission scenario). The warmer climate state features higher Atlantic SSTs, and also increased vertical wind shear across the Caribbean (Vecchi and Soden, GRL 2007). 
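The spectral nudging described above — relaxing the model only toward the largest zonal scales (wavenumbers 0-2) of the reanalysis while leaving smaller scales free — can be sketched in one dimension as follows. This is an illustrative toy, not the actual model code; the field is assumed periodic and `alpha` is an assumed nudging strength:

```python
import numpy as np

def spectral_nudge(model_field, target_field, alpha=0.1, max_wavenumber=2):
    """Relax model_field toward target_field only at wavenumbers
    <= max_wavenumber (1-D periodic field; alpha sets the nudging strength)."""
    # Transform the model-target difference and keep only the large scales
    diff_hat = np.fft.rfft(target_field - model_field)
    diff_hat[max_wavenumber + 1:] = 0.0
    large_scale_diff = np.fft.irfft(diff_hat, n=model_field.size)
    # Nudge the model toward the target at those scales only
    return model_field + alpha * large_scale_diff
```

Applied every time step with a small `alpha`, this constrains the basin-scale circulation while the high-resolution model remains free to develop its own storms.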
A key assumption of this approach is that the 18-model ensemble-mean climate change is the best available projection of future climate change in the Atlantic. Some of the 18 global models show little increase in wind shear, or even a decrease, and thus there will be considerable uncertainty associated with the hurricane frequency results, which will require further exploration. Results from our simulations will be presented at the meeting.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..16.5638B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..16.5638B"><span>A High-resolution Reanalysis for the European CORDEX Region</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Bentzien, Sabrina; Bollmeyer, Christoph; Crewell, Susanne; Friederichs, Petra; Hense, Andreas; Keller, Jan; Keune, Jessica; Kneifel, Stefan; Ohlwein, Christian; Pscheidt, Ieda; Redl, Stephanie; Steinke, Sandra</p> <p>2014-05-01</p> <p>Within the Hans-Ertel-Centre for Weather Research (HErZ), the climate monitoring branch concentrates efforts on the assessment and analysis of regional climate in Germany and Europe. In joint cooperation with DWD (German Meteorological Service), a high-resolution reanalysis system based on the COSMO model has been developed. Reanalyses gain more and more importance as a source of meteorological information for many purposes and applications. Several global reanalysis projects (e.g., ERA, MERRA, CFSR, JRA) produce and verify these data sets to provide time series that are as long as possible combined with high data quality. 
Due to a spatial resolution of only 50-70 km and 3-hourly temporal output, they are not suitable for small-scale problems (e.g., regional climate assessment, meso-scale NWP verification, input for subsequent models such as river runoff simulations). The implementation of regional reanalyses based on a limited-area model along with a data assimilation scheme makes it possible to generate reanalysis data sets with high spatio-temporal resolution. The work presented here focuses on the regional reanalysis for Europe with a domain matching the CORDEX-EURO-11 specifications, albeit at a higher spatial resolution, i.e., 0.055° (6 km) instead of 0.11° (12 km). The COSMO reanalysis system comprises the assimilation of observational data using the existing nudging scheme of COSMO and is complemented by a special soil moisture analysis and boundary conditions given by ERA-Interim data. The reanalysis data set currently covers 6 years (2007-2012). The evaluation of the reanalyses is done using independent observations with special emphasis on precipitation and high-impact weather situations. The development and evaluation of the COSMO-based reanalysis for the CORDEX-Euro domain can be seen as a preparation for joint European activities on the development of an ensemble system of regional reanalyses for Europe.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMIN13A1649T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMIN13A1649T"><span>Extending Climate Analytics as a Service to the Earth System Grid Federation Progress Report on the Reanalysis Ensemble Service</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. 
H.</p> <p>2016-12-01</p> <p>We are extending climate analytics-as-a-service, including: (1) A high-performance Virtual Real-Time Analytics Testbed supporting six major reanalysis data sets using advanced technologies like the Cloudera Impala-based SQL and Hadoop-based MapReduce analytics over native NetCDF files. (2) A Reanalysis Ensemble Service (RES) that offers a basic set of commonly used operations over the reanalysis collections that are accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib. (3) An Open Geospatial Consortium (OGC) WPS-compliant Web service interface to CDSLib to accommodate ESGF's Web service endpoints. This presentation will report on the overall progress of this effort, with special attention to recent enhancements that have been made to the Reanalysis Ensemble Service, including the following: - A CDSlib Python library that supports full temporal, spatial, and grid-based resolution services - A new reanalysis collections reference model to enable operator design and implementation - An enhanced library of sample queries to demonstrate and develop use case scenarios - Extended operators that enable single- and multiple-reanalysis area average, vertical average, re-gridding, and trend, climatology, and anomaly computations - Full support for the MERRA-2 reanalysis and the initial integration of two additional reanalyses - A prototype Jupyter notebook-based distribution mechanism that combines CDSlib documentation with interactive use case scenarios and personalized project management - Prototyped uncertainty quantification services that combine ensemble products with comparative observational products - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations over the data and extractions of trends, climatologies, and anomalies - The ability to compute and visualize multiple reanalysis 
intercomparisons</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1168923-short-ensembles-efficient-method-discerning-climate-relevant-sensitivities-atmospheric-general-circulation-models','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1168923-short-ensembles-efficient-method-discerning-climate-relevant-sensitivities-atmospheric-general-circulation-models"><span>Short ensembles: An Efficient Method for Discerning Climate-relevant Sensitivities in Atmospheric General Circulation Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Wan, Hui; Rasch, Philip J.; Zhang, Kai</p> <p>2014-09-08</p> <p>This paper explores the feasibility of an experimentation strategy for investigating sensitivities in fast components of atmospheric general circulation models. The basic idea is to replace the traditional serial-in-time long-term climate integrations by representative ensembles of shorter simulations. The key advantage of the proposed method lies in its efficiency: since fewer days of simulation are needed, the computational cost is less, and because individual realizations are independent and can be integrated simultaneously, the new dimension of parallelism can dramatically reduce the turnaround time in benchmark tests, sensitivity studies, and model tuning exercises. The strategy is not appropriate for exploring sensitivity of all model features, but it is very effective in many situations. Two examples are presented using the Community Atmosphere Model version 5. The first example demonstrates that the method is capable of characterizing the model cloud and precipitation sensitivity to time step length.
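The short-ensemble idea above, replacing one long serial integration with many independent short runs, can be sketched with a toy calculation. Everything here (the stand-in model, its built-in sensitivity of 0.5 per parameter unit, the noise level) is hypothetical; the only assumption carried over from the paper is that short realizations are independent and can run in parallel.

```python
import random
import statistics

def simulate_short_run(param, days, rng):
    """Hypothetical stand-in for one short model integration: returns a
    time-mean diagnostic whose expectation depends linearly on `param`.
    Noise mimics internal variability and shrinks with averaging length."""
    return 0.5 * param + rng.gauss(0.0, 1.0) / days ** 0.5

def ensemble_sensitivity(param_a, param_b, n_members, days, seed=0):
    """Estimate the diagnostic's sensitivity to a parameter change from two
    ensembles of independent short runs (parallelizable in practice)."""
    rng = random.Random(seed)
    runs_a = [simulate_short_run(param_a, days, rng) for _ in range(n_members)]
    runs_b = [simulate_short_run(param_b, days, rng) for _ in range(n_members)]
    diff = statistics.mean(runs_b) - statistics.mean(runs_a)
    # Standard error of the difference shrinks as 1/sqrt(n_members).
    se = (statistics.variance(runs_a) / n_members
          + statistics.variance(runs_b) / n_members) ** 0.5
    return diff, se

diff, se = ensemble_sensitivity(1.0, 2.0, n_members=200, days=10)
```

With 200 ten-day members the estimated sensitivity lands near the built-in value of 0.5, and the reported standard error indicates how many members a tuning exercise would actually need.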
A nudging technique is also applied to an additional set of simulations to help understand the contribution of physics-dynamics interaction to the detected time step sensitivity. In the second example, multiple empirical parameters related to cloud microphysics and aerosol lifecycle are perturbed simultaneously in order to explore which parameters have the largest impact on the simulated global mean top-of-atmosphere radiation balance. Results show that in both examples, short ensembles are able to correctly reproduce the main signals of model sensitivities revealed by traditional long-term climate simulations for fast processes in the climate system. The efficiency of the ensemble method makes it particularly useful for the development of high-resolution, costly and complex climate models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015ISPAn.II4..119I','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015ISPAn.II4..119I"><span>Statistical Methods in AI: Rare Event Learning Using Associative Rules and Higher-Order Statistics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Iyer, V.; Shetty, S.; Iyengar, S. S.</p> <p>2015-07-01</p> <p>Rare event learning has not been actively researched lately due to the unavailability of algorithms that deal with big samples. The research addresses spatio-temporal streams from multi-resolution sensors to find actionable items from a perspective of real-time algorithms. This computing framework is independent of the number of input samples, application domain, labelled or label-less streams. A sampling overlap algorithm such as Brooks-Iyengar is used for dealing with noisy sensor streams. We extend the existing noise pre-processing algorithms using Data-Cleaning trees.
Pre-processing using an ensemble of trees with bagging and multi-target regression showed robustness to random noise and missing data. As spatio-temporal streams are highly statistically correlated, we prove that temporal-window-based sampling from sensor data streams converges after n samples using Hoeffding bounds, which can be used for fast prediction of new samples in real time. The Data-Cleaning tree model uses a nonparametric node-splitting technique that can be learned iteratively and scales linearly in memory consumption for any size of input stream. The improved task-based ensemble extraction is compared with non-linear computation models using various SVM kernels for speed and accuracy. We show using empirical datasets that the explicit rule learning computation is linear in time and depends only on the number of leaves present in the tree ensemble. The use of unpruned trees (t) in our proposed ensemble always yields a minimum number (m) of leaves, keeping pre-processing computation to n × t log m, compared to N<sup>2</sup> for the Gram matrix.
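The Hoeffding-bound argument above has a simple quantitative form: for i.i.d. samples confined to an interval of width R, P(|x̄ − μ| ≥ ε) ≤ 2·exp(−2nε²/R²), so a temporal window of n ≥ R²·ln(2/δ)/(2ε²) samples suffices for ε-accuracy with confidence 1 − δ. A minimal sketch; the ε and δ values are illustrative, not from the paper.

```python
import math

def hoeffding_sample_size(eps, delta, value_range=1.0):
    """Smallest n such that the empirical mean of n i.i.d. samples bounded in
    an interval of width `value_range` lies within `eps` of the true mean
    with probability at least 1 - delta (Hoeffding's inequality)."""
    return math.ceil(value_range ** 2 * math.log(2.0 / delta) / (2.0 * eps ** 2))

# Window size for a bounded sensor stream: 5% error at 95% confidence.
n = hoeffding_sample_size(eps=0.05, delta=0.05)  # 738 samples
```

Note the bound is distribution-free: it only uses boundedness, which is why it applies to arbitrary labelled or label-less streams.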
We also show that the task-based feature induction yields higher Quality of Data (QoD) in the feature space compared to kernel methods using the Gram matrix.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_16 --> <div id="page_17" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="321"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ESD.....9..459K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ESD.....9..459K"><span>European climate change at global mean temperature increases of 1.5 and 2 °C above pre-industrial conditions as simulated by the EURO-CORDEX regional climate models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kjellström, Erik; Nikulin, Grigory; Strandberg, Gustav;
Bøssing Christensen, Ole; Jacob, Daniela; Keuler, Klaus; Lenderink, Geert; van Meijgaard, Erik; Schär, Christoph; Somot, Samuel; Sørland, Silje Lund; Teichmann, Claas; Vautard, Robert</p> <p>2018-05-01</p> <p>We investigate European regional climate change for time periods when the global mean temperature has increased by 1.5 and 2 °C compared to pre-industrial conditions. Results are based on regional downscaling of transient climate change simulations for the 21st century with global climate models (GCMs) from the fifth-phase Coupled Model Intercomparison Project (CMIP5). We use an ensemble of EURO-CORDEX high-resolution regional climate model (RCM) simulations undertaken at a computational grid of 12.5 km horizontal resolution covering Europe. The ensemble consists of a range of RCMs that have been used for downscaling different GCMs under the RCP8.5 forcing scenario. The results indicate considerable near-surface warming already at the lower 1.5 °C of warming. Regional warming exceeds that of the global mean in most parts of Europe, being the strongest in the northernmost parts of Europe in winter and in the southernmost parts of Europe together with parts of Scandinavia in summer. Changes in precipitation, which are less robust than the ones in temperature, include increases in the north and decreases in the south with a borderline that migrates from a northerly position in summer to a southerly one in winter. Some of these changes are already seen at 1.5 °C of warming but are larger and more robust at 2 °C. Changes in near-surface wind speed are associated with a large spread among individual ensemble members at both warming levels. Relatively large areas over the North Atlantic and some parts of the continent show decreasing wind speed while some ocean areas in the far north show increasing wind speed. 
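The 1.5 and 2 °C warming-level periods used above are commonly identified as the first window in which a multi-decadal running mean of global mean temperature exceeds the threshold relative to pre-industrial. A minimal sketch, assuming a centered 30-year window and a synthetic, linearly warming anomaly series (both illustrative):

```python
def crossing_year(years, gmst_anom, threshold, window=30):
    """First central year whose `window`-year running mean of the global mean
    temperature anomaly (vs. pre-industrial) reaches `threshold`."""
    half = window // 2
    for i in range(half, len(years) - half):
        if sum(gmst_anom[i - half:i + half]) / window >= threshold:
            return years[i]
    return None  # threshold never reached in this series

# Synthetic GCM-like anomaly: steady 0.02 K/yr warming from 1900.
years = list(range(1900, 2101))
anom = [0.02 * (y - 1900) for y in years]
y15 = crossing_year(years, anom, 1.5)
y20 = crossing_year(years, anom, 2.0)
```

In practice each GCM of the ensemble yields its own crossing year, and the RCM fields are then composited over the corresponding windows.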
The changes in temperature, precipitation and wind speed are shown to be modified by changes in mean sea level pressure, indicating a strong relationship with the large-scale circulation and its internal variability on decade-long timescales. By comparing to a larger ensemble of CMIP5 GCMs we find that the RCMs can alter the results, leading either to attenuation or amplification of the climate change signal in the underlying GCMs. We find that the RCMs tend to produce less warming and more precipitation (or less drying) in many areas in both winter and summer.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015APS..MARG50001W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015APS..MARG50001W"><span>Human movement stochastic variability leads to diagnostic biomarkers In Autism Spectrum Disorders (ASD)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wu, Di; Torres, Elizabeth B.; Jose, Jorge V.</p> <p>2015-03-01</p> <p>ASD is a spectrum of neurodevelopmental disorders. The high heterogeneity of the symptoms associated with the disorder impedes efficient diagnoses based on human observations. Recent advances with high-resolution MEMS wearable sensors enable accurate movement measurements that may escape the naked eye. This calls for objective metrics to extract physiologically relevant information from the rapidly accumulating data. In this talk we'll discuss the statistical analysis of movement data continuously collected with high-resolution sensors at 240 Hz. We calculated statistical properties of speed fluctuations within the millisecond time range that closely correlate with the subjects' cognitive abilities.
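One standard statistic for such speed-fluctuation series is a normalized two-point cross-correlation averaged over an ensemble of recordings. A minimal pure-Python sketch (non-negative lags only; the shifted sinusoid test signals are illustrative, not movement data):

```python
import math

def cross_correlation(x, y, lag):
    """Normalized two-point cross-correlation <x(t) y(t + lag)>, lag >= 0."""
    n = len(x) - lag
    mx, my = sum(x) / len(x), sum(y) / len(y)
    sx = math.sqrt(sum((v - mx) ** 2 for v in x) / len(x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y) / len(y))
    s = sum((x[t] - mx) * (y[t + lag] - my) for t in range(n))
    return s / (n * sx * sy)

def ensemble_average_xcorr(trials, lag):
    """Average the cross-correlation over an ensemble of (x, y) recordings."""
    return sum(cross_correlation(x, y, lag) for x, y in trials) / len(trials)

# Two identical sinusoids shifted by 3 samples correlate maximally at lag 3.
x = [math.sin(0.1 * t) for t in range(500)]
y = [math.sin(0.1 * (t - 3)) for t in range(500)]
r = ensemble_average_xcorr([(x, y)], lag=3)
```

Scanning the lag and averaging over many recordings gives the synchronicity measure; the periodicity comes from the power spectrum of the same fluctuations.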
We computed the periodicity and synchronicity of the speed fluctuations from their power spectrum and ensemble-averaged two-point cross-correlation function. We built a two-parameter phase space from the temporal statistical analyses of the nearest neighbor fluctuations that provided a quantitative biomarker for ASD and adult normal subjects and further classified ASD severity. We also found age-related developmental statistical signatures and potential ASD parental links in our movement dynamical studies. Our results may have direct clinical applications.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUOS.A14B2544T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUOS.A14B2544T"><span>Assessing Australian Rainfall Projections in Two Model Resolutions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Taschetto, A.; Haarsma, R. D.; Sen Gupta, A.</p> <p>2016-02-01</p> <p>Australian climate is projected to change with increases in greenhouse gases. The IPCC reports an increase in extreme daily rainfall across the country. At the same time, mean rainfall over southeast Australia is projected to reduce during austral winter, but to increase during austral summer, mainly associated with changes in the surrounding oceans. Climate models agree better on the future reduction of average rainfall over the southern regions of Australia compared to the increase in extreme rainfall events. One of the reasons for this disagreement may be related to climate model limitations in simulating the observed mechanisms associated with the mid-latitude weather systems, in particular due to coarse model resolutions.
In this study we investigate how changes in sea surface temperature (SST) affect Australian mean and extreme rainfall under global warming, using a suite of numerical experiments at two model resolutions: about 126km (T159) and 25km (T799). The numerical experiments are performed with the earth system model EC-EARTH. Two 6-member ensembles are produced for the present day conditions and a future scenario. The present day ensemble is forced with the observed daily SST from the NOAA National Climatic Data Center from 2002 to 2006. The future scenario simulation is integrated from 2094 to 2098 using the present day SST field added onto the future SST change created from a 17-member ensemble based on the RCP4.5 scenario. Preliminary results show an increase in extreme rainfall events over Tasmania associated with enhanced convection driven by the Tasman Sea warming. We will further discuss how the projected changes in SST will impact the southern mid-latitude weather systems that ultimately affect Australian rainfall.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ESSD...10..815H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ESSD...10..815H"><span>The WASCAL high-resolution regional climate simulation ensemble for West Africa: concept, dissemination and assessment</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Heinzeller, Dominikus; Dieng, Diarra; Smiatek, Gerhard; Olusegun, Christiana; Klein, Cornelia; Hamann, Ilse; Salack, Seyni; Bliefernicht, Jan; Kunstmann, Harald</p> <p>2018-04-01</p> <p>Climate change and constant population growth pose severe challenges to 21st century rural Africa. 
Within the framework of the West African Science Service Center on Climate Change and Adapted Land Use (WASCAL), an ensemble of high-resolution regional climate change scenarios for the greater West African region is provided to support the development of effective adaptation and mitigation measures. This contribution presents the overall concept of the WASCAL regional climate simulations, as well as detailed information on the experimental design, and provides information on the format and dissemination of the available data. All data are made available to the public at the CERA long-term archive of the German Climate Computing Center (DKRZ) with a subset available at the PANGAEA Data Publisher for Earth & Environmental Science portal (<a href="https://doi.pangaea.de/10.1594/PANGAEA.880512" target="_blank">https://doi.pangaea.de/10.1594/PANGAEA.880512</a>). A brief assessment of the data is presented to provide guidance for future users. Regional climate projections are generated at high (12 km) and intermediate (60 km) resolution using the Weather Research and Forecasting Model (WRF). The simulations cover the validation period 1980-2010 and the two future periods 2020-2050 and 2070-2100. A brief comparison to observations and two climate change scenarios from the Coordinated Regional Downscaling Experiment (CORDEX) initiative is presented to provide guidance on the data set to future users and to assess their climate change signal. Under the RCP4.5 (Representative Concentration Pathway 4.5) scenario, the results suggest an increase in temperature by 1.5 °C at the coast of Guinea and by up to 3 °C in the northern Sahel by the end of the 21st century, in line with existing climate projections for the region. They also project an increase in precipitation by up to 300 mm per year along the coast of Guinea, by up to 150 mm per year in the Soudano region adjacent in the north and almost no change in precipitation in the Sahel.
This stands in contrast to existing regional climate projections, which predict increasingly drier conditions. The high spatial and temporal resolution of the data, the extensive list of output variables, the large computational domain and the long time periods covered make this data set a unique resource for follow-up analyses and impact modelling studies over the greater West African region. The comprehensive documentation and standardisation of the data facilitate and encourage their use within and outside of the WASCAL community.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017PEPS....4...13S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017PEPS....4...13S"><span>Outcomes and challenges of global high-resolution non-hydrostatic atmospheric simulations using the K computer</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Satoh, Masaki; Tomita, Hirofumi; Yashiro, Hisashi; Kajikawa, Yoshiyuki; Miyamoto, Yoshiaki; Yamaura, Tsuyoshi; Miyakawa, Tomoki; Nakano, Masuo; Kodama, Chihiro; Noda, Akira T.; Nasuno, Tomoe; Yamada, Yohei; Fukutomi, Yoshiki</p> <p>2017-12-01</p> <p>This article reviews the major outcomes of a 5-year (2011-2016) project using the K computer to perform global numerical atmospheric simulations based on the non-hydrostatic icosahedral atmospheric model (NICAM). The K computer was made available to the public in September 2012 and was used as a primary resource for Japan's Strategic Programs for Innovative Research (SPIRE), an initiative to investigate five strategic research areas; the NICAM project fell under the research area of climate and weather simulation sciences.
Combining NICAM with high-performance computing has created new opportunities in three areas of research: (1) higher resolution global simulations that produce more realistic representations of convective systems, (2) multi-member ensemble simulations that are able to perform extended-range forecasts 10-30 days in advance, and (3) multi-decadal simulations for climatology and variability. Before the K computer era, NICAM was used to demonstrate realistic simulations of intra-seasonal oscillations including the Madden-Julian oscillation (MJO), though merely as a case-study approach. Thanks to the big leap in computational performance of the K computer, we could greatly increase the number of MJO events simulated, in addition to extending the integration time and increasing the horizontal resolution. We conclude that the high-resolution global non-hydrostatic model, as used in this five-year project, improves the ability to forecast intra-seasonal oscillations and associated tropical cyclogenesis compared with that of the relatively coarser operational models currently in use.
The impacts of the sub-kilometer resolution simulation and the multi-decadal simulations using NICAM are also reviewed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017WRR....53.2149M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017WRR....53.2149M"><span>A parametric approach for simultaneous bias correction and high-resolution downscaling of climate model rainfall</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto; Marrocu, Marino</p> <p>2017-03-01</p> <p>Distribution mapping has been identified as the most efficient approach to bias-correct climate model rainfall, while reproducing its statistics at spatial and temporal resolutions suitable to run hydrologic models. Yet its implementation based on empirical distributions derived from control samples (referred to as nonparametric distribution mapping) makes the method's performance sensitive to sample length variations, the presence of outliers, the spatial resolution of climate model results, and may lead to biases, especially in extreme rainfall estimation. To address these shortcomings, we propose a methodology for simultaneous bias correction and high-resolution downscaling of climate model rainfall products that uses: (a) a two-component theoretical distribution model (i.e., a generalized Pareto (GP) model for rainfall intensities above a specified threshold u*, and an exponential model for lower rainrates), and (b) proper interpolation of the corresponding distribution parameters on a user-defined high-resolution grid, using kriging for uncertain data. 
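The two-component model pairs an exponential body below the threshold u* with a generalized Pareto tail above it; its CDF and inverse CDF are the building blocks of parametric quantile mapping (corrected = F_obs⁻¹(F_mod(x))). A minimal sketch; all parameter values are illustrative, not fitted:

```python
import math

def hybrid_cdf(x, u_star, lam, zeta, xi, sigma):
    """CDF of a two-component model: exponential (rate lam) below threshold
    u_star, generalized Pareto (shape xi, scale sigma) above it.
    zeta is the probability of exceeding u_star."""
    if x <= u_star:
        # Exponential part, rescaled so that F(u_star) = 1 - zeta.
        return (1.0 - zeta) * (1.0 - math.exp(-lam * x)) / (1.0 - math.exp(-lam * u_star))
    y = x - u_star
    if abs(xi) < 1e-12:
        tail = math.exp(-y / sigma)          # GP degenerates to exponential
    else:
        tail = max(0.0, 1.0 + xi * y / sigma) ** (-1.0 / xi)
    return (1.0 - zeta) + zeta * (1.0 - tail)

def hybrid_quantile(p, u_star, lam, zeta, xi, sigma):
    """Inverse CDF; the building block of parametric quantile mapping."""
    if p <= 1.0 - zeta:
        q = p / (1.0 - zeta) * (1.0 - math.exp(-lam * u_star))
        return -math.log(1.0 - q) / lam
    tail = (1.0 - p) / zeta
    if abs(xi) < 1e-12:
        return u_star - sigma * math.log(tail)
    return u_star + sigma / xi * (tail ** (-xi) - 1.0)

params = dict(u_star=20.0, lam=0.15, zeta=0.05, xi=0.1, sigma=8.0)
x99 = hybrid_quantile(0.99, **params)  # a 1-in-100 daily rainfall quantile
```

In the full method the kriged parameter fields would supply (λ, ζ, ξ, σ, u*) at each high-resolution grid point; here they are constants.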
We assess the performance of the suggested parametric approach relative to the nonparametric one, using daily raingauge measurements from a dense network in the island of Sardinia (Italy), and rainfall data from four GCM/RCM model chains of the ENSEMBLES project. The obtained results shed light on the competitive advantages of the parametric approach, which is proved more accurate and considerably less sensitive to the characteristics of the calibration period, independent of the GCM/RCM combination used. This is especially the case for extreme rainfall estimation, where the GP assumption allows for more accurate and robust estimates, also beyond the range of the available data.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=ensemble&id=EJ1144713','ERIC'); return false;" href="https://eric.ed.gov/?q=ensemble&id=EJ1144713"><span>Just Ask Me: Convergent Validity of Self-Reported Measures of Music Participation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Elpus, Kenneth</p> <p>2017-01-01</p> <p>The purpose of this study was to determine the convergent validity of self-reported and objective measures of school music ensemble participation. 
Self-reported survey responses to a question about high school music ensemble participation and administrative data in the form of high school transcript-indicated ensemble enrollments were compared…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1544316','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1544316"><span>Toward an Accurate Theoretical Framework for Describing Ensembles for Proteins under Strongly Denaturing Conditions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Tran, Hoang T.; Pappu, Rohit V.</p> <p>2006-01-01</p> <p>Our focus is on an appropriate theoretical framework for describing highly denatured proteins. In high concentrations of denaturants, proteins behave like polymers in a good solvent and ensembles for denatured proteins can be modeled by ignoring all interactions except excluded volume (EV) effects. To assay conformational preferences of highly denatured proteins, we quantify a variety of properties for EV-limit ensembles of 23 two-state proteins. We find that modeled denatured proteins can be best described as follows. Average shapes are consistent with prolate ellipsoids. Ensembles are characterized by large correlated fluctuations. Sequence-specific conformational preferences are restricted to local length scales that span five to nine residues. Beyond local length scales, chain properties follow well-defined power laws that are expected for generic polymers in the EV limit. The average available volume is filled inefficiently, and cavities of all sizes are found within the interiors of denatured proteins. All properties characterized from simulated ensembles match predictions from rigorous field theories. 
We use our results to resolve between conflicting proposals for structure in ensembles for highly denatured states. PMID:16766618</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/21785142','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/21785142"><span>Ensembl BioMarts: a hub for data retrieval across taxonomic space.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kinsella, Rhoda J; Kähäri, Andreas; Haider, Syed; Zamora, Jorge; Proctor, Glenn; Spudich, Giulietta; Almeida-King, Jeff; Staines, Daniel; Derwent, Paul; Kerhornou, Arnaud; Kersey, Paul; Flicek, Paul</p> <p>2011-01-01</p> <p>For a number of years the BioMart data warehousing system has proven to be a valuable resource for scientists seeking a fast and versatile means of accessing the growing volume of genomic data provided by the Ensembl project. The launch of the Ensembl Genomes project in 2009 complemented the Ensembl project by utilizing the same visualization, interactive and programming tools to provide users with a means for accessing genome data from a further five domains: protists, bacteria, metazoa, plants and fungi. The Ensembl and Ensembl Genomes BioMarts provide a point of access to the high-quality gene annotation, variation data, functional and regulatory annotation and evolutionary relationships from genomes spanning the taxonomic space. This article aims to give a comprehensive overview of the Ensembl and Ensembl Genomes BioMarts as well as some useful examples and a description of current data content and future objectives. 
Database URLs: http://www.ensembl.org/biomart/martview/; http://metazoa.ensembl.org/biomart/martview/; http://plants.ensembl.org/biomart/martview/; http://protists.ensembl.org/biomart/martview/; http://fungi.ensembl.org/biomart/martview/; http://bacteria.ensembl.org/biomart/martview/.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2006JAtS...63.1895K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2006JAtS...63.1895K"><span>A Mass-Flux Scheme View of a High-Resolution Simulation of a Transition from Shallow to Deep Cumulus Convection.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kuang, Zhiming; Bretherton, Christopher S.</p> <p>2006-07-01</p> <p>In this paper, an idealized, high-resolution simulation of a gradually forced transition from shallow, nonprecipitating to deep, precipitating cumulus convection is described; how the cloud and transport statistics evolve as the convection deepens is explored; and the collected statistics are used to evaluate assumptions in current cumulus schemes. The statistical analysis methodologies that are used do not require tracing the history of individual clouds or air parcels; instead they rely on probing the ensemble characteristics of cumulus convection in the large model dataset. They appear to be an attractive way for analyzing outputs from cloud-resolving numerical experiments. 
Throughout the simulation, it is found that 1) the initial thermodynamic properties of the updrafts at the cloud base have rather tight distributions; 2) contrary to the assumption made in many cumulus schemes, nearly undiluted air parcels are too infrequent to be relevant to any stage of the simulated convection; and 3) a simple model with a spectrum of entraining plumes appears to reproduce most features of the cloudy updrafts, but significantly overpredicts the mass flux as the updrafts approach their levels of zero buoyancy. A buoyancy-sorting model was suggested as a potential remedy. The organized circulations of cold pools seem to create clouds with larger-sized bases and may correspondingly contribute to their smaller lateral entrainment rates. Our results do not support a mass-flux closure based solely on convective available potential energy (CAPE), and are in general agreement with a convective inhibition (CIN)-based closure. The general similarity in the ensemble characteristics of shallow and deep convection and the continuous evolution of the thermodynamic structure during the transition provide justification for developing a single unified cumulus parameterization that encompasses both shallow and deep convection.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ApJ...844..136E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ApJ...844..136E"><span>Joint Bayesian Estimation of Quasar Continua and the Lyα Forest Flux Probability Distribution Function</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Eilers, Anna-Christina; Hennawi, Joseph F.; Lee, Khee-Gan</p> <p>2017-08-01</p> <p>We present a new Bayesian algorithm making use of Markov Chain Monte Carlo sampling that allows us to simultaneously estimate the
unknown continuum level of each quasar in an ensemble of high-resolution spectra, as well as their common probability distribution function (PDF) for the transmitted Lyα forest flux. This fully automated PDF-regulated continuum fitting method models the unknown quasar continuum with a linear principal component analysis (PCA) basis, with the PCA coefficients treated as nuisance parameters. The method allows one to estimate parameters governing the thermal state of the intergalactic medium (IGM), such as the slope of the temperature-density relation γ − 1, while marginalizing out continuum uncertainties in a fully Bayesian way. Using realistic mock quasar spectra created from a simplified semi-numerical model of the IGM, we show that this method recovers the underlying quasar continua to a precision of ≃ 7 % and ≃ 10 % at z = 3 and z = 5, respectively. Given the number of principal component spectra, this is comparable to the underlying accuracy of the PCA model itself. Most importantly, we show that we can achieve a nearly unbiased estimate of the slope γ − 1 of the IGM temperature-density relation with a precision of ± 8.6 % at z = 3 and ± 6.1 % at z = 5, for an ensemble of ten mock high-resolution quasar spectra.
Applying this method to real quasar spectra and comparing to a more realistic IGM model from hydrodynamical simulations would enable precise measurements of the thermal and cosmological parameters governing the IGM, albeit with somewhat larger uncertainties, given the increased flexibility of the model.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017JGRC..122.8813W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017JGRC..122.8813W"><span>Impact of the Mesoscale Dynamics on Ocean Deep Convection: The 2012-2013 Case Study in the Northwestern Mediterranean Sea</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Waldman, Robin; Herrmann, Marine; Somot, Samuel; Arsouze, Thomas; Benshila, Rachid; Bosse, Anthony; Chanut, Jerome; Giordani, Herve; Sevault, Florence; Testor, Pierre</p> <p>2017-11-01</p> <p>Winter 2012-2013 was a particularly intense and well-observed Dense Water Formation (DWF) event in the Northwestern Mediterranean Sea. In this study, we investigate the impact of the mesoscale dynamics on DWF. We perform two perturbed initial state simulation ensembles from summer 2012 to 2013, respectively, mesoscale-permitting and mesoscale-resolving, with the AGRIF refinement tool in the Mediterranean configuration NEMOMED12. The mean impact of the mesoscale on DWF occurs mainly through the high-resolution physics and not the high-resolution bathymetry. This impact is shown to be modest: the mesoscale does not modify the chronology of the deep convective winter nor the volume of dense waters formed. It however impacts the location of the mixed patch by reducing its extent to the west of the North Balearic Front and by increasing it along the Northern Current, in better agreement with observations. 
The maximum mixed patch volume is significantly reduced from 5.7 ± 0.2 to 4.2 ± 0.6 × 10¹³ m³. Finally, the spring restratification volume is more realistic and enhanced from 1.4 ± 0.2 to 1.8 ± 0.2 × 10¹³ m³ by the mesoscale. We also address the mesoscale impact on the ocean intrinsic variability by performing perturbed initial state ensemble simulations. The mesoscale enhances the intrinsic variability of the deep convection geography, with most of the mixed patch area impacted by intrinsic variability. The DWF volume has a low intrinsic variability but it is increased by 2-3 times with the mesoscale. We relate it to a dramatic increase of the Gulf of Lions eddy kinetic energy from 5.0 ± 0.6 to 17.3 ± 1.5 cm²/s², in remarkable agreement with observations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016RScI...87k3907P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016RScI...87k3907P"><span>First order reversal curves (FORC) analysis of individual magnetic nanostructures using micro-Hall magnetometry</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pohlit, Merlin; Eibisch, Paul; Akbari, Maryam; Porrati, Fabrizio; Huth, Michael; Müller, Jens</p> <p>2016-11-01</p> <p>Alongside the development of artificially created magnetic nanostructures, micro-Hall magnetometry has proven to be a versatile tool to obtain high-resolution hysteresis loop data and access dynamical properties. Here we explore the application of First Order Reversal Curves (FORC)—a technique well-established in the field of paleomagnetism for studying grain-size and interaction effects in magnetic rocks—to individual and dipolar-coupled arrays of magnetic nanostructures using micro-Hall sensors.
A proof-of-principle experiment performed on a macroscopic piece of a floppy disk, a reference sample well known in the literature, demonstrates that the FORC diagrams obtained from magnetic stray field measurements with home-built magnetometers are in good agreement with magnetization data obtained by a commercial vibrating sample magnetometer. We discuss in detail the FORC diagrams and their interpretation for three representative magnetic systems prepared by the direct-write Focused Electron Beam Induced Deposition (FEBID) technique: (1) an isolated Co-nanoisland showing a simple square-shaped hysteresis loop, (2) a more complex CoFe-alloy nanoisland exhibiting a wasp-waist-type hysteresis, and (3) a cluster of interacting Co-nanoislands. Our findings reveal that the combination of FORC and micro-Hall magnetometry is a promising tool to investigate complex magnetization reversal processes within individual or small ensembles of nanomagnets grown by FEBID or other fabrication methods.
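To make the FORC construction concrete: a family of first-order reversal curves M(H, H_r) is measured by saturating the sample, descending to a reversal field H_r, and sweeping the field back up; the FORC distribution is the mixed second derivative ρ = -½ ∂²M/(∂H ∂H_r). The sketch below runs this recipe on a Preisach-style ensemble of square hysterons whose switching-field distributions are invented for illustration (nothing here models the actual FEBID nanostructures):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ensemble of square hysterons: each switches up at field `a`
# and down at field `b`, with b <= a (distributions invented for the toy).
n = 20000
a = rng.normal(0.5, 0.1, n)                  # up-switching fields
b = a - np.abs(rng.normal(0.4, 0.05, n))     # down-switching fields

h_r = np.linspace(-1.0, 1.0, 81)             # reversal fields H_r
h = np.linspace(-1.0, 1.0, 81)               # applied fields H

# First-order reversal curves M(H, H_r): saturate positively, descend to
# H_r (hysterons with b > H_r flip down), then sweep H back up.
M = np.empty((h.size, h_r.size))
for j, hr in enumerate(h_r):
    up = (h[:, None] >= a[None, :]) | (hr >= b)[None, :]
    M[:, j] = 2.0 * up.mean(axis=1) - 1.0

# FORC distribution rho = -1/2 * d2M/(dH dH_r), by finite differences;
# for this model rho recovers the joint density of switching fields (a, b).
rho = -0.5 * np.gradient(np.gradient(M, h, axis=0), h_r, axis=1)

i, j = np.unravel_index(np.argmax(rho), rho.shape)
print(f"rho peaks near H = {h[i]:.2f}, H_r = {h_r[j]:.2f}")
```

The peak of ρ lands near the ensemble's mean up- and down-switching fields, which is exactly how FORC diagrams are read to separate coercivity from interaction-field distributions.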
The method provides sub-μm spatial resolution and bridges the gap between FORC analysis, commonly used for studying macroscopic samples and rather large arrays, and studies of small ensembles of interacting nanoparticles with the high moment sensitivity inherent to micro-Hall magnetometry.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1326511-high-resolution-ensemble-projections-near-term-regional-climate-over-continental-united-states','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1326511-high-resolution-ensemble-projections-near-term-regional-climate-over-continental-united-states"><span>High-resolution ensemble projections of near-term regional climate over the continental United States</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Ashfaq, Moetasim; Rastogi, Deeksha; Mei, Rui; ...</p> <p>2016-09-01</p> <p>We present high-resolution near-term ensemble projections of hydro-climatic changes over the contiguous U.S. using a regional climate model (RegCM4) that dynamically downscales 11 Global Climate Models from the 5th phase of the Coupled Model Inter-comparison Project at 18 km horizontal grid spacing. All model integrations span 41 years in the historical period (1965 – 2005) and 41 years in the near-term future period (2010 – 2050) under Representative Concentration Pathway 8.5 and cover a domain that includes the contiguous U.S. and parts of Canada and Mexico. Should emissions continue to rise, surface temperatures in every region within the U.S. will reach a new climate norm well before the mid-21st century regardless of the magnitudes of regional warming. Significant warming will likely intensify the regional hydrological cycle through the acceleration of the historical trends in cold, warm and wet extremes.
The future temperature response will be partly regulated by changes in snow hydrology over the regions that historically receive a major portion of cold season precipitation in the form of snow. Our results indicate the existence of Clausius-Clapeyron scaling at regional scales, where each degree centigrade of rise in surface temperature will lead to a 7.4% increase in precipitation from extremes. More importantly, both winter (snow) and summer (liquid) extremes are projected to increase across the U.S. These changes in precipitation characteristics will be driven by a shift towards shorter and wetter seasons. Altogether, projected changes in the regional hydro-climate can have substantial impacts on the natural and human systems across the U.S.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018NatSD...580057S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018NatSD...580057S"><span>Ensemble of European regional climate simulations for the winter of 2013 and 2014 from HadAM3P-RM3P</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Schaller, Nathalie; Sparrow, Sarah N.; Massey, Neil R.; Bowery, Andy; Miller, Jonathan; Wilson, Simon; Wallom, David C. H.; Otto, Friederike E. L.</p> <p>2018-04-01</p> <p>Large data sets used to study the impact of anthropogenic climate change on the 2013/14 floods in the UK are provided. The data consist of perturbed initial conditions simulations using the Weather@Home regional climate modelling framework. Two different base conditions, Actual, including atmospheric conditions (anthropogenic greenhouse gases and human induced aerosols) as at present, and Natural, with these forcings all removed, are available. The data set is made up of 13 different ensembles (2 actual and 11 natural) with each having more than 7500 members.
The data is available as NetCDF V3 files representing monthly data within the period of interest (1st Dec 2013 to 15th February 2014) for both a specified European region at a 50 km horizontal resolution and globally at N96 resolution. The data is stored within the UK Natural and Environmental Research Council Centre for Environmental Data Analysis repository.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1544146','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=1544146"><span>Relation between native ensembles and experimental structures of proteins</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Best, Robert B.; Lindorff-Larsen, Kresten; DePristo, Mark A.; Vendruscolo, Michele</p> <p>2006-01-01</p> <p>Different experimental structures of the same protein or of proteins with high sequence similarity contain many small variations. Here we construct ensembles of “high-sequence similarity Protein Data Bank” (HSP) structures and consider the extent to which such ensembles represent the structural heterogeneity of the native state in solution. We find that different NMR measurements probing structure and dynamics of given proteins in solution, including order parameters, scalar couplings, and residual dipolar couplings, are remarkably well reproduced by their respective high-sequence similarity Protein Data Bank ensembles; moreover, we show that the effects of uncertainties in structure determination are insufficient to explain the results. 
These results highlight the importance of accounting for native-state protein dynamics in making comparisons with ensemble-averaged experimental data and suggest that even a modest number of structures of a protein determined under different conditions, or with small variations in sequence, capture a representative subset of the true native-state ensemble. PMID:16829580</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1325752','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1325752"><span>Interactive Correlation Analysis and Visualization of Climate Data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Ma, Kwan-Liu</p> <p></p> <p>The relationship between our ability to analyze and extract insights from visualization of climate model output and the capability of the available resources to make those visualizations has reached a crisis point. The large volume of data currently produced by climate models is overwhelming the current, decades-old visualization workflow. The traditional methods for visualizing climate output also have not kept pace with changes in the types of grids used, the number of variables involved, and the number of different simulations performed with a climate model or the feature-richness of high-resolution simulations. This project has developed new and faster methods for visualization in order to get the most knowledge out of the new generation of high-resolution climate models.
While traditional climate images will continue to be useful, there is a need for new approaches to visualization and analysis of climate data if we are to gain all the insights available in ultra-large data sets produced by high-resolution model output and ensemble integrations of climate models such as those produced for the Coupled Model Intercomparison Project. Towards that end, we have developed new visualization techniques for performing correlation analysis. We have also introduced highly scalable, parallel rendering methods for visualizing large-scale 3D data. This project was done jointly with climate scientists and visualization researchers at Argonne National Laboratory and NCAR.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..19.3429M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..19.3429M"><span>Is 30-second update fast enough for convection-resolving data assimilation?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Miyoshi, Takemasa; Ruiz, Juan; Lien, Guo-Yuan; Teramura, Toshiki; Kondo, Keiichi; Maejima, Yasumitsu; Honda, Takumi; Otsuka, Shigenori</p> <p>2017-04-01</p> <p>For local severe weather forecasting at 100-m resolution with 30-minute lead time, we have been working on the "Big Data Assimilation" (BDA) effort for a super-rapid 30-second cycle of an ensemble Kalman filter. We have presented two papers with the concept and case studies (Miyoshi et al. 2016, BAMS; Proceedings of the IEEE). We focus on the non-Gaussian PDF in this study. We were hoping that we could assume the Gaussian error distribution in 30-second forecasts before strong nonlinear dynamics distort the error distribution for rapidly-changing convective storms.
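The concern that nonlinear dynamics quickly distort an initially Gaussian ensemble can be illustrated with a toy calculation. Here a simple quadratic map stands in for the convective forecast model (this is an invented illustration, not the BDA system): a perfectly Gaussian 1000-member ensemble develops clear skewness after only a few nonlinear advances:

```python
import numpy as np

rng = np.random.default_rng(2)

def skewness(x):
    # Sample skewness: third central moment over variance^(3/2).
    x = x - x.mean()
    return (x**3).mean() / (x**2).mean() ** 1.5

# A 1000-member ensemble that starts perfectly Gaussian...
ens = rng.normal(0.0, 0.1, 1000)

# ...advanced with a simple nonlinear "model" (quadratic map standing in
# for convective dynamics).
for _ in range(3):
    ens = ens + 0.5 * ens**2

print(f"skewness after nonlinear advance: {skewness(ens):.2f}")
```

A Gaussian ensemble has skewness zero; the positive skewness that emerges here is the one-dimensional analogue of the non-Gaussian forecast PDFs the abstract reports.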
However, using 1000 ensemble members, the reduced-resolution version of the BDA system at 1-km grid spacing with 30-second updates showed that highly non-Gaussian PDFs are ubiquitous. Although our results so far with multiple case studies were quite successful, this casts doubt on our Gaussian assumption even if the data assimilation interval is short compared with the system's chaotic time scale. We therefore ask whether the 30-second update is fast enough for convection-resolving data assimilation under the Gaussian assumption. To answer this question, we aim to gain combined knowledge from BDA case studies, 1000-member experiments, 30-second breeding experiments, and toy-model experiments with dense and frequent observations. In this presentation, we will show the most up-to-date results of the BDA research and will discuss whether the 30-second update is fast enough for convective-scale data assimilation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22255683','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22255683"><span>Decoding ensemble activity from neurophysiological recordings in the temporal cortex.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kreiman, Gabriel</p> <p>2011-01-01</p> <p>We study subjects with pharmacologically intractable epilepsy who undergo semi-chronic implantation of electrodes for clinical purposes. We record physiological activity from tens to more than one hundred electrodes implanted in different parts of neocortex. These recordings provide higher spatial and temporal resolution than non-invasive measures of human brain activity. Here we discuss our efforts to develop hardware and algorithms to interact with the human brain by decoding ensemble activity in single trials.
We focus our discussion on decoding visual information during a variety of visual object recognition tasks but the same technologies and algorithms can also be directly applied to other cognitive phenomena.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1810065M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1810065M"><span>TRMM- and GPM-based precipitation analysis and modelling in the Tropical Andes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Manz, Bastian; Buytaert, Wouter; Zulkafli, Zed; Onof, Christian</p> <p>2016-04-01</p> <p>Despite wide-spread applications of satellite-based precipitation products (SPPs) throughout the TRMM-era, the scarcity of ground-based in-situ data (high density gauge networks, rainfall radar) in many hydro-meteorologically important regions, such as tropical mountain environments, has limited our ability to evaluate both SPPs and individual satellite-based sensors as well as accurately model or merge rainfall at high spatial resolutions, particularly with respect to extremes. This has restricted both the understanding of sensor behaviour and performance controls in such regions as well as the accuracy of precipitation estimates and respective hydrological applications ranging from water resources management to early warning systems. Here we report on our recent research into precipitation analysis and modelling using various TRMM and GPM products (2A25, 3B42 and IMERG) in the tropical Andes. In an initial study, 78 high-frequency (10-min) recording gauges in Colombia and Ecuador are used to generate a ground-based validation dataset for evaluation of instantaneous TRMM Precipitation Radar (TPR) overpasses from the 2A25 product. 
Detection ability, precipitation time-series, empirical distributions and statistical moments are evaluated with respect to regional climatological differences, seasonal behaviour, rainfall types and detection thresholds. Results confirmed previous findings from extra-tropical regions of over-estimation of low rainfall intensities and under-estimation of the highest 10% of rainfall intensities by the TPR. However, in spite of evident regionalised performance differences as a function of local climatological regimes, the TPR provides an accurate estimate of climatological annual and seasonal rainfall means. On this basis, high-resolution (5 km) climatological maps are derived for the entire tropical Andes. The second objective of this work is to improve the local precipitation estimation accuracy and representation of spatial patterns of extreme rainfall probabilities over the region. For this purpose, an ensemble of high-resolution rainfall fields is generated by stochastic simulation using space-time averaged, coarse-scale (daily, 0.25°) satellite-based rainfall inputs (TRMM 3B42/ -RT) and the high-resolution climatological information derived from the TPR as spatial disaggregation proxies. For evaluation and merging, gridded ground-based rainfall fields are generated from gauge data using sequential simulation. Satellite and ground-based ensembles are subsequently merged using an inverse error weighting scheme. The model was tested over a case study in the Colombian Andes with optional coarse-scale bias correction prior to disaggregation and merging. The resulting outputs were assessed in the context of Generalized Extreme Value theory and showed improved estimation of extreme rainfall probabilities compared to the original TMPA inputs. 
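The inverse error weighting scheme used to merge the satellite and gauge-based ensembles can be sketched in a few lines. The fields and error variances below are invented for illustration (they are not the study's data, and the variances are assumed known, whereas in practice they must be estimated, e.g. against held-out gauges):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 50x50 "true" rainfall field (mm/day) plus two noisy estimates of it:
# a satellite field with large errors and a gauge-based field with small ones.
truth = rng.gamma(2.0, 5.0, (50, 50))
sat = truth + rng.normal(0.0, 4.0, truth.shape)
gauge = truth + rng.normal(0.0, 1.0, truth.shape)

# Inverse-error-variance weights: each field contributes in proportion
# to 1 / (its error variance).
w_sat, w_gauge = 1.0 / 4.0**2, 1.0 / 1.0**2
merged = (w_sat * sat + w_gauge * gauge) / (w_sat + w_gauge)

def rmse(field):
    return float(np.sqrt(((field - truth) ** 2).mean()))

print(f"RMSE satellite={rmse(sat):.2f} gauge={rmse(gauge):.2f} "
      f"merged={rmse(merged):.2f}")
```

When the error variances are correct, the merged field's error variance is 1/(w_sat + w_gauge), which is never larger than that of the better input; this is why merging helps even when one source dominates.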
Initial findings using GPM-IMERG inputs are also presented.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_17 --> <div id="page_18" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="341"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.5523K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.5523K"><span>Application of Hydrometeorological Information for Short-term and Long-term Water Resources Management over Ungauged Basin in Korea</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kim, Ji-in; Ryu, Kyongsik; Suh, Ae-sook</p> <p>2016-04-01</p> <p>In 2014, three major governmental organizations, the Korea Meteorological Administration (KMA), K-water, and the Korea
Rural Community Corporation, established the Hydrometeorological Cooperation Center (HCC) to achieve more effective water management for scarcely gauged river basins, where data are uncertain or inconsistent. To support optimal drought and flood control over ungauged rivers, the HCC aims to interconnect weather observations, forecasting information, and hydrological models over sparse regions of the Korean peninsula with limited observation sites. In this study, a long-term ensemble forecasting model, the Global Seasonal forecast system version 5 (GloSea5), a high-resolution seasonal forecast system provided by KMA, was used to produce a drought outlook. GloSea5 ensemble predictions provide drought information for 1 and 3 months ahead in terms of drought indices including the Standardized Precipitation Index (SPI3) and the Palmer Drought Severity Index (PDSI). Also, Global Precipitation Measurement and Global Climate Observation Measurement - Water1 satellite data products are used to estimate rainfall and soil moisture contents over the ungauged region.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5657493','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5657493"><span>Ensemble cryo-EM elucidates the mechanism of translation fidelity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Loveland, Anna B.; Demo, Gabriel; Grigorieff, Nikolaus; Korostelev, Andrei A.</p> <p>2017-01-01</p> <p>Faithful gene translation depends on accurate decoding, whose structural mechanism remains a matter of debate. Ribosomes decode mRNA codons by selecting cognate aminoacyl-tRNAs delivered by EF-Tu.
We present high-resolution structural ensembles of ribosomes with cognate or near-cognate aminoacyl-tRNAs delivered by EF-Tu. Both cognate and near-cognate tRNA anticodons explore the A site of an open 30S subunit, while inactive EF-Tu is separated from the 50S subunit. A transient conformation of decoding-center nucleotide G530 stabilizes the cognate codon-anticodon helix, initiating step-wise “latching” of the decoding center. The resulting 30S domain closure docks EF-Tu at the sarcin-ricin loop of the 50S subunit, activating EF-Tu for GTP hydrolysis and ensuing aminoacyl-tRNA accommodation. By contrast, near-cognate complexes fail to induce the G530 latch, thus favoring open 30S pre-accommodation intermediates with inactive EF-Tu. This work unveils long-sought structural differences between the pre-accommodation of cognate and near-cognate tRNA that elucidate the mechanism of accurate decoding. PMID:28538735</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1337889-bandgap-inhomogeneity-pbse-quantum-dot-ensemble-from-two-dimensional-spectroscopy-comparison-size-inhomogeneity-from-electron-microscopy','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1337889-bandgap-inhomogeneity-pbse-quantum-dot-ensemble-from-two-dimensional-spectroscopy-comparison-size-inhomogeneity-from-electron-microscopy"><span>Bandgap Inhomogeneity of a PbSe Quantum Dot Ensemble from Two-Dimensional Spectroscopy and Comparison to Size Inhomogeneity from Electron Microscopy</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Park, Samuel D.; Baranov, Dmitry; Ryu, Jisu; ...</p> <p>2017-01-03</p> <p>Femtosecond two-dimensional Fourier transform spectroscopy is used to determine the static bandgap inhomogeneity of a colloidal quantum dot ensemble. 
The excited states of quantum dots absorb light, so their absorptive two-dimensional (2D) spectra will typically have positive and negative peaks. We show that the absorption bandgap inhomogeneity is robustly determined by the slope of the nodal line separating positive and negative peaks in the 2D spectrum around the bandgap transition; this nodal line slope is independent of excited state parameters not known from the absorption and emission spectra. The absorption bandgap inhomogeneity is compared to a size and shape distribution determined by electron microscopy. The electron microscopy images are analyzed using new 2D histograms that correlate major and minor image projections to reveal elongated nanocrystals, a conclusion supported by grazing incidence small-angle X-ray scattering and high-resolution transmission electron microscopy. Lastly, the absorption bandgap inhomogeneity quantitatively agrees with the bandgap variations calculated from the size and shape distribution, placing upper bounds on any surface contributions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2007AGUFM.A53D1441Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2007AGUFM.A53D1441Z"><span>Use of High-Resolution Satellite Observations to Evaluate Cloud and Precipitation Statistics from Cloud-Resolving Model Simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zhou, Y.; Tao, W.; Hou, A.
Y.; Zeng, X.; Shie, C.</p> <p>2007-12-01</p> <p>The cloud and precipitation statistics simulated by the 3D Goddard Cumulus Ensemble (GCE) model for different environmental conditions, i.e., the South China Sea Monsoon Experiment (SCSMEX), CRYSTAL-FACE, and KWAJEX, are compared with Tropical Rainfall Measuring Mission (TRMM) TMI and PR rainfall measurements as well as cloud observations from the Clouds and the Earth's Radiant Energy System (CERES) and the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments. It is found that the GCE is capable of simulating major convective system development and reproducing total surface rainfall amount as compared with rainfall estimated from the soundings. The model presents large discrepancies in the rain spectrum and vertical hydrometeor profiles. The discrepancy in the precipitation field is also consistent with the cloud and radiation observations. The study will focus on the contributions of large-scale forcing and microphysics to the simulated model-observation discrepancies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29500344','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29500344"><span>Three-dimensional localization of nanoscale battery reactions using soft X-ray tomography.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Yu, Young-Sang; Farmand, Maryam; Kim, Chunjoong; Liu, Yijin; Grey, Clare P; Strobridge, Fiona C; Tyliszczak, Tolek; Celestre, Rich; Denes, Peter; Joseph, John; Krishnan, Harinarayan; Maia, Filipe R N C; Kilcoyne, A L David; Marchesini, Stefano; Leite, Talita Perciano Costa; Warwick, Tony; Padmore, Howard; Cabana, Jordi; Shapiro, David A</p> <p>2018-03-02</p> <p>Battery function is determined by the efficiency and reversibility of the electrochemical phase transformations at solid electrodes.
The microscopic tools available to study the chemical states of matter with the required spatial resolution and chemical specificity are intrinsically limited when studying complex architectures by their reliance on two-dimensional projections of thick material. Here, we report the development of soft X-ray ptychographic tomography, which resolves chemical states in three dimensions at 11 nm spatial resolution. We study an ensemble of nano-plates of lithium iron phosphate extracted from a battery electrode at 50% state of charge. Using a set of nanoscale tomograms, we quantify the electrochemical state and resolve phase boundaries throughout the volume of individual nanoparticles. These observations reveal multiple reaction points, intra-particle heterogeneity, and size effects that highlight the importance of multi-dimensional analytical tools in providing novel insight to the design of the next generation of high-performance devices.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20180000715','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20180000715"><span>GEOS S2S-2_1: GMAO's New High Resolution Seasonal Prediction System</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Molod, Andrea; Akella, Santha; Andrews, Lauren; Barahona, Donifan; Borovikov, Anna; Chang, Yehui; Cullather, Richard; Hackert, Eric; Kovach, Robin; Koster, Randal; et al.</p> <p>2017-01-01</p> <p>A new version of the modeling and analysis system used to produce sub-seasonal to seasonal forecasts has just been released by the NASA Goddard Global Modeling and Assimilation Office. The new version runs at higher atmospheric resolution (approximately 1/2 degree globally), contains a substantially improved model description of the cryosphere, and includes additional interactive earth system model components (aerosol model). In addition, the ocean data assimilation system has been replaced with a Local Ensemble Transform Kalman Filter. Here we will describe the new system, along with the plans for the future (GEOS S2S-3_0), which will include a higher resolution ocean model and more interactive earth system model components (interactive vegetation, biomass burning from fires). We will also present results from a free-running coupled simulation with the new system and results from a series of retrospective seasonal forecasts. Results from retrospective forecasts show significant improvements in surface temperatures over much of the northern hemisphere and a much improved prediction of sea ice extent in both hemispheres. The precipitation forecast skill is comparable to previous S2S systems, and the only trade-off is an increased double ITCZ, which is expected as we go to higher atmospheric resolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A14C..06M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A14C..06M"><span>GEOS S2S-2_1: The GMAO new high resolution Seasonal Prediction System</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Molod, A.; Vikhliaev, Y. V.; Hackert, E. C.; Kovach, R. M.; Zhao, B.; Cullather, R.
I.; Marshak, J.; Borovikov, A.; Li, Z.; Barahona, D.; Andrews, L. C.; Chang, Y.; Schubert, S. D.; Koster, R. D.; Suarez, M.; Akella, S.</p> <p>2017-12-01</p> <p>A new version of the modeling and analysis system used to produce subseasonal to seasonal forecasts has just been released by the NASA/Goddard Global Modeling and Assimilation Office. The new version runs at higher atmospheric resolution (approximately 1/2 degree globally), contains a substantially improved model description of the cryosphere, and includes additional interactive earth system model components (aerosol model). In addition, the ocean data assimilation system has been replaced with a Local Ensemble Transform Kalman Filter. Here we will describe the new system, along with the plans for the future (GEOS S2S-3_0), which will include a higher resolution ocean model and more interactive earth system model components (interactive vegetation, biomass burning from fires). We will also present results from a free-running coupled simulation with the new system and results from a series of retrospective seasonal forecasts. Results from retrospective forecasts show significant improvements in surface temperatures over much of the northern hemisphere and a much improved prediction of sea ice extent in both hemispheres. The precipitation forecast skill is comparable to previous S2S systems, and the only tradeoff is an increased "double ITCZ", which is expected as we go to higher atmospheric resolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFM.A23C3251H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFM.A23C3251H"><span>DART: Tools and Support for Ensemble Data Assimilation Research, Operations, and Education</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hoar, T. J.; Anderson, J.
L.; Collins, N.; Raeder, K.; Kershaw, H.; Romine, G. S.; Mizzi, A. P.; Chatterjee, A.; Karspeck, A. R.; Zarzycki, C. M.; Ha, S. Y.; Barre, J.; Gaubert, B.</p> <p>2014-12-01</p> <p>The Data Assimilation Research Testbed (DART) is a community facility for ensemble data assimilation developed and supported by the National Center for Atmospheric Research. DART provides a comprehensive suite of software, documentation, examples and tutorials that can be used for ensemble data assimilation research, operations, and education. Scientists and software engineers from the Data Assimilation Research Section at NCAR are available to actively support DART users who want to use existing DART products or develop their own new applications. Current DART users range from university professors teaching data assimilation, to individual graduate students working with simple models, through national laboratories doing operational prediction with large state-of-the-art models. DART runs efficiently on many computational platforms ranging from laptops through thousands of cores on the newest supercomputers. This poster focuses on several recent research activities using DART with geophysical models. First, DART is being used with the Community Atmosphere Model Spectral Element (CAM-SE) and Model for Prediction Across Scales (MPAS) global atmospheric models that support locally enhanced grid resolution. Initial results from ensemble assimilation with both models are presented. DART is also being used to produce ensemble analyses of atmospheric tracers, in particular CO, in both the global CAM-Chem model and the regional Weather Research and Forecast with chemistry (WRF-Chem) model by assimilating observations from the Measurements of Pollution in the Troposphere (MOPITT) and Infrared Atmospheric Sounding Interferometer (IASI) instruments. Results from ensemble analyses in both models are presented. 
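At the core of all these DART applications is the ensemble Kalman filter analysis step. The sketch below is a generic stochastic (perturbed-observation) EnKF update in NumPy, not DART's actual code; the ensemble size, state dimension, and observation operator are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, H, y, obs_var):
    """Stochastic (perturbed-observation) EnKF analysis step.

    ensemble : (n_members, n_state) array of forecast states
    H        : (n_obs, n_state) linear observation operator
    y        : (n_obs,) observation vector
    obs_var  : scalar observation-error variance
    """
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)            # forecast anomalies
    HX = X @ H.T                                    # anomalies in observation space
    P_hh = HX.T @ HX / (n - 1) + obs_var * np.eye(H.shape[0])
    P_xh = X.T @ HX / (n - 1)                       # state/observation covariance
    K = P_xh @ np.linalg.inv(P_hh)                  # Kalman gain
    # Perturb the observations so the analysis spread is statistically consistent.
    y_pert = y + rng.normal(0.0, obs_var ** 0.5, size=(n, H.shape[0]))
    return ensemble + (y_pert - ensemble @ H.T) @ K.T

# Toy example: 40 members, 3-variable state, observing only the first variable.
ens = rng.normal(0.0, 1.0, size=(40, 3))
H = np.array([[1.0, 0.0, 0.0]])
analysis = enkf_update(ens, H, np.array([0.5]), obs_var=0.1)
```

After the update, the observed variable's ensemble mean is pulled toward the observation and its spread shrinks, which is the behavior any ensemble filter variant shares.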
An interface between DART and the Community Atmosphere Biosphere Land Exchange (CABLE) model has been completed and ensemble land surface analyses with DART/CABLE will be discussed. Finally, an update on ensemble analyses in the fully coupled Community Earth System Model (CESM) is presented. The poster includes instructions on how to get started using DART for research or educational applications.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010pcms.confE..23V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010pcms.confE..23V"><span>Comparison of three different methods of perturbing the potential vorticity field in mesoscale forecasts of Mediterranean heavy precipitation events: PV-gradient, PV-adjoint and PV-satellite</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vich, M.; Romero, R.; Richard, E.; Arbogast, P.; Maynard, K.</p> <p>2010-09-01</p> <p>Heavy precipitation events occur regularly in the western Mediterranean region. These events often have a high impact on society due to economic and personal losses. Improving the mesoscale numerical forecasts of these events can help prevent or minimize their impact on society. In previous studies, two ensemble prediction systems (EPSs) based on perturbing the model initial and boundary conditions were developed and tested for a collection of high-impact MEDEX cyclonic episodes. These EPSs perturb the initial and boundary potential vorticity (PV) field through a PV inversion algorithm. This technique ensures modifications of all the meteorological fields without compromising the mass-wind balance.
One EPS introduces the perturbations along the zones of the three-dimensional PV structure presenting the locally most intense values and gradients of the field (a semi-objective choice, PV-gradient), while the other perturbs the PV field over the sensitivity zones calculated with the MM5 adjoint model (an objective method, PV-adjoint). The PV perturbations are set from a PV error climatology (PVEC) that characterizes typical PV errors in the ECMWF forecasts, both in intensity and displacement. The intensity and displacement of each PV perturbation are chosen randomly, while its location is given by the perturbation zones defined in each ensemble generation method. Encouraged by the good results obtained by these two EPSs that perturb the PV field, a new approach based on a manual perturbation of the PV field has been tested and compared with the previous results. This technique uses the satellite water vapor (WV) observations to guide the correction of initial PV structures. The correction of the PV field intends to improve the match between the PV distribution and the WV image, taking advantage of the relation between dark and bright features of WV images and PV anomalies, under some assumptions. Afterwards, the PV inversion algorithm is applied to run a forecast with the corresponding perturbed initial state (PV-satellite). The non-hydrostatic MM5 mesoscale model has been used to run all forecasts. The simulations are performed for a two-day period with a 22.5 km resolution domain (Domain 1 in http://mm5forecasts.uib.es) nested in the ECMWF large-scale forecast fields. The MEDEX cyclone of 10 June 2000, also known as the Montserrat Case, is a suitable testbed to compare the performance of each ensemble and the PV-satellite method.
This case is characterized by an Atlantic upper-level trough and a low-level cold front, which generated a stationary mesoscale cyclone over the Spanish Mediterranean coast, advecting warm and moist air toward Catalonia from the Mediterranean Sea. The resulting mesoscale convective system produced 6-h accumulated rainfall amounts of 180 mm, with material losses estimated by the media to exceed 65 million euros. The performance of both ensemble forecasting systems and the PV-satellite technique for our case study is evaluated through the verification of the rainfall field. Since the EPSs are probabilistic forecasts and the PV-satellite forecast is deterministic, their comparison is done using the individual ensemble members. Therefore the verification procedure uses deterministic scores, like the ROC curve, the Taylor diagram or the Q-Q plot. These scores cover the different quality attributes of the forecast such as reliability, resolution, uncertainty and sharpness. The results show that the PV-satellite technique performance lies within the performance range obtained by both ensembles; it is even better than the non-perturbed ensemble member. Thus, perturbing randomly using the PV error climatology and introducing the perturbations in the zones given by each EPS captures the mismatch between PV and WV fields better than manual perturbations made by an expert forecaster, at least for this case study.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017JPhD...50X5104Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017JPhD...50X5104Z"><span>Dependence of high density nitrogen-vacancy center ensemble coherence on electron irradiation doses and annealing time</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zhang, C.; Yuan, H.; Zhang, N.; Xu, L. X.; Li, B.; Cheng, G.
D.; Wang, Y.; Gui, Q.; Fang, J. C.</p> <p>2017-12-01</p> <p>Negatively charged nitrogen-vacancy (NV-) center ensembles in diamond have proved to have great potential for use in highly sensitive, small-package solid-state quantum sensors. One way to improve sensitivity is to produce a high-density NV- center ensemble on a large scale with a long coherence lifetime. In this work, the NV- center ensemble is prepared in type-Ib diamond using high-energy electron irradiation and annealing, and the transverse relaxation time of the ensemble, T<sub>2</sub>, was systematically investigated as a function of the irradiation electron dose and annealing time. Dynamical decoupling sequences were used to characterize T<sub>2</sub>. To overcome the problem of low signal-to-noise ratio in the T<sub>2</sub> measurement, a coupled strip-line waveguide was used to synchronously manipulate NV- centers along three directions to improve fluorescence signal contrast. Finally, NV- center ensembles with a high concentration of roughly 10<sup>15</sup> mm<sup>-3</sup> were manipulated within a ~10 µs coherence time. By applying a multi-coupled strip-line waveguide to improve the effective volume of the diamond, a sub-femtotesla sensitivity for AC field magnetometry can be achieved.
The long-coherence, high-density, large-scale NV- center ensemble in diamond means that room-temperature, micro-sized solid-state quantum sensors with ultra-high sensitivity can be further developed in the near future.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AdSR...14..227L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AdSR...14..227L"><span>Wind power application research on the fusion of the determination and ensemble prediction</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lan, Shi; Lina, Xu; Yuzhu, Hao</p> <p>2017-07-01</p> <p>The fused product of wind speed for the wind farm is designed through the use of wind speed products of ensemble prediction from the European Centre for Medium-Range Weather Forecasts (ECMWF) and professional numerical model products on wind power based on Mesoscale Model 5 (MM5) and Beijing Rapid Update Cycle (BJ-RUC), which are suitable for short-term wind power forecasting and electric dispatch. The single-valued forecast is formed by calculating the different ensemble statistics of the Bayesian probabilistic forecasting representing the uncertainty of the ECMWF ensemble prediction. An autoregressive integrated moving average (ARIMA) model is used to improve the time resolution of the single-valued forecast, and, based on Bayesian model averaging (BMA) and the deterministic numerical model prediction, the optimal wind speed forecasting curve and the confidence interval are provided. The results show that the fused forecast clearly improves accuracy relative to the existing numerical forecasting products.
Compared with the 0-24 h existing deterministic forecast in the validation period, the mean absolute error (MAE) is decreased by 24.3 % and the correlation coefficient (R) is increased by 12.5 %. In comparison with the ECMWF ensemble forecast, the MAE is reduced by 11.7 %, and R is increased by 14.5 %. Additionally, the MAE did not increase with forecast lead time.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017OcMod.113..171K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017OcMod.113..171K"><span>Oceanic ensemble forecasting in the Gulf of Mexico: An application to the case of the Deep Water Horizon oil spill</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Khade, Vikram; Kurian, Jaison; Chang, Ping; Szunyogh, Istvan; Thyng, Kristen; Montuoro, Raffaele</p> <p>2017-05-01</p> <p>This paper demonstrates the potential of ocean ensemble forecasting in the Gulf of Mexico (GoM). The Bred Vector (BV) technique with a one-week rescaling frequency is implemented on a 9 km resolution version of the Regional Ocean Modelling System (ROMS). Numerical experiments are carried out by using the HYCOM analysis products to define the initial conditions and the lateral boundary conditions. The growth rates of the forecast uncertainty are estimated to be about 10% of initial amplitude per week. By carrying out ensemble forecast experiments with and without perturbed surface forcing, it is demonstrated that in the coastal regions accounting for uncertainties in the atmospheric forcing is more important than accounting for uncertainties in the ocean initial conditions. In the Loop Current region, the initial condition uncertainties are the dominant source of the forecast uncertainty.
The root-mean-square error of the Lagrangian track forecasts at the 15-day forecast lead time can be reduced by about 10-50 km using the ensemble mean Eulerian forecast of the oceanic flow for the computation of the tracks, instead of the single-initial-condition Eulerian forecast.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20010111480','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20010111480"><span>Design and Implementation of a Parallel Multivariate Ensemble Kalman Filter for the Poseidon Ocean General Circulation Model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Keppenne, Christian L.; Rienecker, Michele M.; Koblinsky, Chester (Technical Monitor)</p> <p>2001-01-01</p> <p>A multivariate ensemble Kalman filter (MvEnKF) has been implemented on a massively parallel computer architecture for the Poseidon ocean circulation model and tested with a Pacific Basin model configuration. There are about two million prognostic state-vector variables. Parallelism for the data assimilation step is achieved by regionalization of the background-error covariances that are calculated from the phase-space distribution of the ensemble. Each processing element (PE) collects elements of a matrix measurement functional from nearby PEs. To avoid the introduction of spurious long-range covariances associated with finite ensemble sizes, the background-error covariances are given compact support by means of a Hadamard (element by element) product with a three-dimensional canonical correlation function. The methodology and the MvEnKF configuration are discussed. It is shown that the regionalization of the background covariances has a negligible impact on the quality of the analyses.
The parallel algorithm is very efficient for large numbers of observations but does not scale well beyond 100 PEs at the current model resolution. On a platform with distributed memory, memory rather than speed is the limiting factor.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A13K..03H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A13K..03H"><span>To Which Extent can Aerosols Affect Alpine Mixed-Phase Clouds?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Henneberg, O.; Lohmann, U.</p> <p>2017-12-01</p> <p>Aerosol-cloud interactions are a major source of uncertainty in regional climate and changing weather patterns. Such uncertainties are due to the multiple processes that can be triggered by aerosols, especially in mixed-phase clouds. Mixed-phase clouds most likely result in precipitation due to the formation of ice crystals, which can grow to precipitation size. Ice nucleating particles (INPs) determine how fast these clouds glaciate and form precipitation. The potential for INPs to transform supercooled liquid clouds into precipitating clouds depends on the available humidity and supercooled liquid. Those conditions are determined by dynamics. Moderately high updraft velocities result in persistent mixed-phase clouds in the Swiss Alps [1], which provide an ideal testbed to investigate the effect of aerosols on precipitation in mixed-phase clouds. To address the effect of aerosols in orographic winter clouds under different dynamic conditions, we run a number of real-case ensembles with the regional climate model COSMO at a horizontal resolution of 1.1 km. Simulations with different INP concentrations within the range observed at the GAW research station Jungfraujoch in the Swiss Alps are conducted and repeated within the ensemble.
Microphysical processes are described with a two-moment scheme. Enhanced INP concentrations increase the precipitation rate of a single precipitation event by up to 20%. Other precipitation events of similar strength are less affected by the INP concentration. The effect of CCNs is negligible for precipitation from orographic winter clouds in our case study. There is evidence that INPs change precipitation rate and location more effectively in stronger dynamic regimes, owing to the enhanced potential to transform supercooled liquid to ice. The classification of the ensemble members according to their dynamics will quantify the interaction of aerosol effects and dynamics. Reference [1] Lohmann et al., 2016: Persistence of orographic mixed-phase clouds, GRL</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFM.A24A..07B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFM.A24A..07B"><span>Development of a multi-ensemble Prediction Model for China</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Brasseur, G. P.; Bouarar, I.; Petersen, A. K.</p> <p>2016-12-01</p> <p>As part of the EU-sponsored Panda and MarcoPolo Projects, a multi-model prediction system including 7 models has been developed. Most regional models use global air quality predictions provided by the Copernicus Atmosphere Monitoring Service and downscale the forecast at relatively high spatial resolution in eastern China. The paper will describe the forecast system and show examples of forecasts produced for several Chinese urban areas and displayed on a website developed by the Dutch meteorological service.
A discussion on the accuracy of the predictions based on a detailed validation process using surface measurements from the Chinese monitoring network will be presented.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017APS..MAR.A4003M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017APS..MAR.A4003M"><span>Entropic Elasticity in the Giant Muscle Protein Titin</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Morgan, Ian; Saleh, Omar</p> <p></p> <p>Intrinsically disordered proteins (IDPs) are a large and functionally important class of proteins that lack a fixed three-dimensional structure. Instead, they adopt a conformational ensemble of states which facilitates their biological function as molecular linkers, springs, and switches. Due to their conformational flexibility, it can be difficult to study IDPs using typical experimental methods. To overcome this challenge, we use a high-resolution single-molecule magnetic stretching technique to quantify IDP flexibility. We apply this technique to the giant muscle protein titin, measuring its elastic response at low forces. 
We present results demonstrating that titin's native elastic response derives from the combined entropic elasticity of its ordered and disordered domains.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017OcMod.120..120H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017OcMod.120..120H"><span>Will high-resolution global ocean models benefit coupled predictions on short-range to climate timescales?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hewitt, Helene T.; Bell, Michael J.; Chassignet, Eric P.; Czaja, Arnaud; Ferreira, David; Griffies, Stephen M.; Hyder, Pat; McClean, Julie L.; New, Adrian L.; Roberts, Malcolm J.</p> <p>2017-12-01</p> <p>As the importance of the ocean in the weather and climate system is increasingly recognised, operational systems are now moving towards coupled prediction not only for seasonal to climate timescales but also for short-range forecasts. A three-way tension exists between the allocation of computing resources to refine model resolution, the expansion of model complexity/capability, and the increase of ensemble size. Here we review evidence for the benefits of increased ocean resolution in global coupled models, where the ocean component explicitly represents transient mesoscale eddies and narrow boundary currents. We consider lessons learned from forced ocean/sea-ice simulations; from studies concerning the SST resolution required to impact atmospheric simulations; and from coupled predictions. Impacts of the mesoscale ocean in western boundary current regions on the large-scale atmospheric state have been identified. Understanding of air-sea feedback in western boundary currents is modifying our view of the dynamics in these key regions. 
It remains unclear whether variability associated with open ocean mesoscale eddies is equally important to the large-scale atmospheric state. We include a discussion of what processes can presently be parameterized in coupled models with coarse-resolution, non-eddying ocean models, and where parameterizations may fall short. We discuss the benefits of resolution and identify gaps in the current literature that leave important questions unanswered.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2011AGUFM.H41C1049F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2011AGUFM.H41C1049F"><span>Developing a regional retrospective ensemble precipitation dataset for watershed hydrology modeling, Idaho, USA</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Flores, A. N.; Smith, K.; LaPorte, P.</p> <p>2011-12-01</p> <p>Applications like flood forecasting, military trafficability assessment, and slope stability analysis necessitate the use of models capable of resolving hydrologic states and fluxes at spatial scales of hillslopes (e.g., 10s to 100s m). These models typically require precipitation forcings at spatial scales of kilometers or better and time intervals of hours. Yet in especially rugged terrain that typifies much of the Western US and throughout much of the developing world, precipitation data at these spatiotemporal resolutions are difficult to come by. Ground-based weather radars have significant problems in high-relief settings and are sparsely located, leaving significant gaps in coverage and high uncertainties. Precipitation gages provide accurate data at points but are very sparsely located and their placement is often not representative, yielding significant coverage gaps in a spatial and physiographic sense.
Numerical weather prediction efforts have made precipitation data, including critically important information on precipitation phase, available globally and in near real-time. However, these datasets present watershed modelers with two problems: (1) spatial scales of many of these datasets are tens of kilometers or coarser, (2) numerical weather models used to generate these datasets include a land surface parameterization that in some circumstances can significantly affect precipitation predictions. We report on the development of a regional precipitation dataset for Idaho that leverages: (1) a dataset derived from a numerical weather prediction model, (2) gages within Idaho that report hourly precipitation data, and (3) a long-term precipitation climatology dataset. Hourly precipitation estimates from the Modern Era Retrospective-analysis for Research and Applications (MERRA) are stochastically downscaled using a hybrid orographic and statistical model from their native resolution (1/2 x 2/3 degrees) to a resolution of approximately 1 km. Downscaled precipitation realizations are conditioned on hourly observations from reporting gages and then conditioned again on the Parameter-elevation Regressions on Independent Slopes Model (PRISM) at the monthly timescale to reflect orographic precipitation trends common to watersheds of the Western US. 
While this methodology potentially introduces cross-pollination of errors due to the re-use of precipitation gage data, it nevertheless achieves an ensemble-based precipitation estimate and appropriate measures of uncertainty at a spatiotemporal resolution appropriate for watershed modeling.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.C41C1235L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.C41C1235L"><span>Sensitivity of an Antarctic Ice Sheet Model to Sub-Ice-Shelf Melting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Lipscomb, W. H.; Leguy, G.; Urban, N. M.; Berdahl, M.</p> <p>2017-12-01</p> <p>Theory and observations suggest that marine-based sectors of the Antarctic ice sheet could retreat rapidly under ocean warming and increased melting beneath ice shelves. Numerical models of marine ice sheets vary widely in sensitivity, depending on grid resolution and the parameterization of key processes (e.g., calving and hydrofracture). Here we study the sensitivity of the Antarctic ice sheet to ocean warming and sub-shelf melting in standalone simulations of the Community Ice Sheet Model (CISM). Melt rates either are prescribed based on observations and high-resolution ocean model output, or are derived from a plume model forced by idealized ocean temperature profiles. In CISM, we vary the model resolution (between 1 and 8 km), Stokes approximation (shallow-shelf, depth-integrated higher-order, or 3D higher-order) and calving scheme to create an ensemble of plausible responses to sub-shelf melting. 
This work supports a broader goal of building statistical and reduced models that can translate large-scale Earth-system model projections to changes in Antarctic ocean temperatures and ice sheet discharge, thus better quantifying uncertainty in Antarctic-sourced sea-level rise.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://files.eric.ed.gov/fulltext/EJ1102256.pdf','ERIC'); return false;" href="http://files.eric.ed.gov/fulltext/EJ1102256.pdf"><span>The Effects of Classical Guitar Ensembles on Student Self-Perceptions and Acquisition of Music Skills</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Kramer, John R.</p> <p>2012-01-01</p> <p>Classical guitar ensembles are increasing in the United States as popular alternatives to band, choir, and orchestra. Classical guitar ensembles are offered at many middle and high schools as fine arts electives as one of the only options for classical guitarists to participate in ensembles. 
The purpose of this study was to explore the development…</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_18 --> <div id="page_19" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="361"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015AdWR...86..273H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015AdWR...86..273H"><span>Reduction of Used Memory Ensemble Kalman Filtering (RumEnKF): A data assimilation scheme for memory intensive, high performance computing</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hut, Rolf; Amisigo, Barnabas A.; Steele-Dunne, Susan; van de Giesen, Nick</p> <p>2015-12-01</p> <p>Reduction of Used Memory Ensemble Kalman Filtering (RumEnKF) is introduced as a
variant on the Ensemble Kalman Filter (EnKF). RumEnKF differs from EnKF in that it does not store the entire ensemble, but rather only saves the first two moments of the ensemble distribution. In this way, the number of ensemble members that can be calculated depends less on available memory and more on available computing power (CPU). RumEnKF is developed to make optimal use of current-generation supercomputer architectures, where the number of available floating point operations (flops) increases more rapidly than the available memory and where inter-node communication can quickly become a bottleneck. RumEnKF reduces the used memory compared to the EnKF when the number of ensemble members is greater than half the number of state variables. In this paper, three simple models are used (auto-regressive, low-dimensional Lorenz and high-dimensional Lorenz) to show that RumEnKF performs similarly to the EnKF. Furthermore, it is also shown that increasing the ensemble size has a similar impact on the estimation error from the three algorithms.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ClDy..tmp..785K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ClDy..tmp..785K"><span>Intercomparison of model response and internal variability across climate model ensembles</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kumar, Devashish; Ganguly, Auroop R.</p> <p>2017-10-01</p> <p>Characterization of climate uncertainty at regional scales over near-term planning horizons (0-30 years) is crucial for climate adaptation. Climate internal variability (CIV) dominates climate uncertainty over decadal prediction horizons at stakeholders' scales (regional to local).
In the literature, CIV has been characterized indirectly using projections of climate change from multi-model ensembles (MME) instead of directly using projections from multiple initial condition ensembles (MICE), primarily because an adequate number of initial-condition (IC) runs was not available for any climate model. Nevertheless, the recent availability of a significant number of IC runs from one climate model allows CIV to be characterized directly from climate model projections for the first time, and a sensitivity analysis to be performed to study the dominance of CIV compared to model response variability (MRV). Here, we measure relative agreement (a dimensionless number with values ranging between 0 and 1, inclusive; a high value indicates less variability and vice versa) among MME and MICE and find that CIV is lower than MRV for all projection time horizons and spatial resolutions for precipitation and temperature. However, CIV exhibits greater dominance over MRV for seasonal and annual mean precipitation at higher latitudes where signals of climate change are expected to emerge sooner. Furthermore, precipitation exhibits large uncertainties and a rapid decline in relative agreement from global to continental, regional, or local scales for MICE compared to MME. The fractional contribution of uncertainty due to CIV is invariant for precipitation and decreases for temperature as lead time progresses towards the end of the century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..19.2809W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..19.2809W"><span>How does mesoscale impact deep convection?
Answers from ensemble Northwestern Mediterranean Sea simulations.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Waldman, Robin; Herrmann, Marine; Somot, Samuel; Arsouze, Thomas; Benshila, Rachid; Bosse, Anthony; Chanut, Jérôme; Giordani, Hervé; Pennel, Romain; Sevault, Florence; Testor, Pierre</p> <p>2017-04-01</p> <p>Ocean deep convection is a major process of interaction between the surface and deep ocean. The Gulf of Lions is a well-documented deep convection area in the Mediterranean Sea, and mesoscale dynamics is a known factor impacting this phenomenon. However, previous modelling studies do not allow the robustness of this impact to be assessed with respect to the physical configuration and ocean intrinsic variability. In this study, the impact of mesoscale dynamics on ocean deep convection in the Gulf of Lions is investigated using a multi-resolution ensemble simulation of the northwestern Mediterranean Sea. The eddy-permitting Mediterranean model NEMOMED12 (6 km resolution) is compared to its eddy-resolving counterpart with the 2-way grid refinement AGRIF in the northwestern Mediterranean (2 km resolution). We focus on the well-documented 2012-2013 period and on the multidecadal timescale (1979-2013). The impact of mesoscale dynamics on deep convection is addressed in terms of its mean and variability, its impact on deep water transformations and on associated dynamical structures. Results are interpreted by diagnosing the regional mean and eddy circulation and using buoyancy budgets. We find a mean inhibition of deep convection by mesoscale dynamics, with large interannual variability.
It is associated with a large impact on mean and transient circulation and a large air-sea flux feedback.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1917358B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1917358B"><span>High resolution statistical downscaling of the EUROSIP seasonal prediction. Application for southeastern Romania</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Busuioc, Aristita; Dumitrescu, Alexandru; Dumitrache, Rodica; Iriza, Amalia</p> <p>2017-04-01</p> <p>Seasonal climate forecasts in Europe are currently issued at the European Centre for Medium-Range Weather Forecasts (ECMWF) in the form of multi-model ensemble predictions available within the "EUROSIP" system. Different statistical techniques to calibrate, downscale and combine the EUROSIP direct model output are used to optimize the quality of the final probabilistic forecasts. In this study, a statistical downscaling model (SDM) based on canonical correlation analysis (CCA) is used to downscale the EUROSIP seasonal forecast to a spatial resolution of 1 km x 1 km over the Movila farm, located in southeastern Romania. This application is achieved in the framework of the H2020 MOSES project (http://www.moses-project.eu). Monthly standardized values of three climate variables (maximum/minimum temperature-Tmax/Tmin, total precipitation-Prec) are used together as the predictand, while combinations of various large-scale predictors (sea level pressure, temperature at 850 hPa and geopotential height at 500 hPa) are tested, chosen for their availability as outputs of the seasonal EUROSIP probabilistic forecasts. The predictors are taken from the ECMWF system considering 15 members of the ensemble, for which hindcasts from 1991 to the present are available.
The model was calibrated over the period 1991-2014 and predictions for the summers of 2015 and 2016 were made. The calibration was carried out for the ensemble average as well as for each ensemble member. The model was developed for each lead time: one month anticipation for June, two months anticipation for July and three months anticipation for August. The main conclusions from these preliminary results are: best predictions (in terms of the anomaly sign) for Tmax (July-2 months anticipation, August-3 months anticipation) for both years (2015, 2016); for Tmin, good predictions only for August (3 months anticipation) for both years; for precipitation, good predictions for July (2 months anticipation) in 2015 and August (3 months anticipation) in 2016; failed predictions for June (1-month anticipation) for all parameters. To see whether the results obtained for the 2015 and 2016 summers agree with the general ECMWF model performance in forecasting the three predictors used in the CCA SDM calibration, the mean bias and root mean square error (RMSE) over the entire period were computed in each grid point, for each ensemble member and for the ensemble average. These confirm the results above, showing the highest ECMWF performance in forecasting the three predictors for three months anticipation (August) and the lowest performance for one month anticipation (June). The added value of the CCA SDM in forecasting local Tmax/Tmin and total precipitation was compared to the ECMWF performance using the nearest grid point method. Comparisons were performed for the 1991-2014 period, taking into account the forecast made in May for July. An important improvement was found for the CCA SDM predictions in terms of the RMSE (computed against observations) for Tmax/Tmin, and a smaller one for precipitation.
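The verification described above (mean bias and RMSE per ensemble member and for the ensemble average) reduces to a few lines of NumPy; this is a generic sketch with synthetic data, not the study's actual setup:

```python
import numpy as np

def bias_and_rmse(forecast, observed):
    """Mean bias and root mean square error of a forecast against observations."""
    err = np.asarray(forecast) - np.asarray(observed)
    return err.mean(), np.sqrt((err ** 2).mean())

# Synthetic hindcast: 15 ensemble members, 100 verification times,
# each member biased by +0.5 with unit-variance random error.
rng = np.random.default_rng(1)
obs = rng.normal(size=100)
members = obs + rng.normal(0.5, 1.0, size=(15, 100))

member_scores = [bias_and_rmse(m, obs) for m in members]       # per member
ens_bias, ens_rmse = bias_and_rmse(members.mean(axis=0), obs)  # ensemble average
```

The ensemble average typically verifies with a lower RMSE than any single member, since the members' independent errors partly cancel; the bias, being common to all members, survives averaging.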
The tests are in progress for the other summer months (June, July).</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1339833-high-resolution-model-intercomparison-project-highresmip-nbsp-v1-cmip6','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1339833-high-resolution-model-intercomparison-project-highresmip-nbsp-v1-cmip6"><span>High Resolution Model Intercomparison Project (HighResMIP v1.0) for CMIP6</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Haarsma, Reindert J.; Roberts, Malcolm J.; Vidale, Pier Luigi; ...</p> <p>2016-11-22</p> <p>Robust projections and predictions of climate variability and change, particularly at regional scales, rely on the driving processes being represented with fidelity in model simulations. The role of enhanced horizontal resolution in improved process representation in all components of the climate system is of growing interest, particularly as some recent simulations suggest both the possibility of significant changes in large-scale aspects of circulation as well as improvements in small-scale processes and extremes. However, such high-resolution global simulations at climate timescales, with resolutions of at least 50 km in the atmosphere and 0.25° in the ocean, have been performed at relatively few research centres and generally without overall coordination, primarily due to their computational cost. Assessing the robustness of the response of simulated climate to model resolution requires a large multi-model ensemble using a coordinated set of experiments. The Coupled Model Intercomparison Project 6 (CMIP6) is the ideal framework within which to conduct such a study, due to the strong link to models being developed for the CMIP DECK experiments and other model intercomparison projects (MIPs).
Increases in high-performance computing (HPC) resources, as well as the revised experimental design for CMIP6, now enable a detailed investigation of the impact of increased resolution up to synoptic weather scales on the simulated mean climate and its variability. The High Resolution Model Intercomparison Project (HighResMIP) presented in this paper applies, for the first time, a multi-model approach to the systematic investigation of the impact of horizontal resolution. A coordinated set of experiments has been designed to assess both a standard and an enhanced horizontal-resolution simulation in the atmosphere and ocean. The set of HighResMIP experiments is divided into three tiers consisting of atmosphere-only and coupled runs and spanning the period 1950–2050, with the possibility of extending to 2100, together with some additional targeted experiments. This paper describes the experimental set-up of HighResMIP, the analysis plan, the connection with the other CMIP6 endorsed MIPs, as well as the DECK and CMIP6 historical simulations. HighResMIP thereby focuses on one of the CMIP6 broad questions, “what are the origins and consequences of systematic model biases?”, but we also discuss how it addresses the World Climate Research Program (WCRP) grand challenges.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=555740','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=555740"><span>High-Temperature unfolding of a trp-Cage mini-protein: a molecular dynamics simulation study</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Seshasayee, Aswin Sai Narain</p> <p>2005-01-01</p> <p>Background Trp cage is a recently constructed fast-folding miniprotein.
It consists of a short helix, a 3₁₀ helix and a C-terminal poly-proline that packs against a Trp in the alpha helix. It is known to fold within 4 ns. Results High-temperature unfolding molecular dynamics simulations of the Trp cage miniprotein have been carried out in explicit water using the OPLS-AA force-field incorporated in the program GROMACS. The radius of gyration (Rg) and Root Mean Square Deviation (RMSD) have been used as order parameters to follow the unfolding process. Distributions of Rg were used to identify ensembles. Conclusion Three ensembles could be identified. While the native-state ensemble shows an Rg distribution that is slightly skewed, the second ensemble, which is presumably the Transition State Ensemble (TSE), shows an excellent Gaussian fit. The denatured ensemble shows large fluctuations, but a Gaussian curve could be fitted. This means that the unfolding process is two-state. Representative structures from each of these ensembles are presented here. PMID:15760474</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20070032972&hterms=ensemble&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D60%26Ntt%3Densemble','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20070032972&hterms=ensemble&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D60%26Ntt%3Densemble"><span>An Ensemble-Based Smoother with Retrospectively Updated Weights for Highly Nonlinear Systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Chin, T. M.; Turmon, M. J.; Jewell, J. B.; Ghil, M.</p> <p>2006-01-01</p> <p>Monte Carlo computational methods have been introduced into data assimilation for nonlinear systems in order to alleviate the computational burden of updating and propagating the full probability distribution.
By propagating an ensemble of representative states, algorithms like the ensemble Kalman filter (EnKF) and the resampled particle filter (RPF) rely on the existing modeling infrastructure to approximate the distribution based on the evolution of this ensemble. This work presents an ensemble-based smoother that is applicable to Monte Carlo filtering schemes like the EnKF and RPF. At the minor cost of retrospectively updating a set of weights for ensemble members, this smoother has demonstrated superior capabilities in state tracking for two highly nonlinear problems: the double-well potential and trivariate Lorenz systems. The algorithm does not require retrospective adaptation of the ensemble members themselves, and it is thus suited to a streaming operational mode. The accuracy of the proposed backward-update scheme in estimating non-Gaussian distributions is evaluated by comparison to the more accurate estimates provided by a Markov chain Monte Carlo algorithm.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010EGUGA..12.1940C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010EGUGA..12.1940C"><span>A Wind Forecasting System for Energy Application</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Courtney, Jennifer; Lynch, Peter; Sweeney, Conor</p> <p>2010-05-01</p> <p>Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to get the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied.
First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system. BMA is a promising technique that will offer calibrated probabilistic wind forecasts which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is provided with a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification is carried out using observed wind data from operational wind farms.
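The BMA post-processing step described above learns one weight per ensemble member from a training period. A simplified sketch with a fixed Gaussian kernel width (operational BMA also estimates the variance and bias-corrects each member; all names and numbers here are illustrative):

```python
import numpy as np
from scipy.stats import norm

def bma_weights(forecasts, obs, sigma=1.0, n_iter=50):
    """Simplified Bayesian Model Averaging: learn one weight per ensemble
    member by EM over a training period. The Gaussian kernel width is held
    fixed here for brevity."""
    n_members, _ = forecasts.shape
    w = np.full(n_members, 1.0 / n_members)
    for _ in range(n_iter):
        # E-step: each member's responsibility for each observation
        lik = w[:, None] * norm.pdf(obs, loc=forecasts, scale=sigma)
        z = lik / lik.sum(axis=0)
        # M-step: new weights are the average responsibilities
        w = z.mean(axis=1)
    return w

# Toy 30-day training period: member 0 tracks the obs, member 1 is badly biased
rng = np.random.default_rng(3)
obs = rng.normal(10.0, 2.0, 30)
forecasts = np.stack([obs + rng.normal(0.0, 0.5, 30),         # skilful member
                      obs + 5.0 + rng.normal(0.0, 0.5, 30)])  # biased member
w = bma_weights(forecasts, obs)
```

The calibrated predictive distribution is then the weight-averaged mixture of the member kernels, which is what turns a raw ensemble into a probability forecast.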
The forecasts are then compared to existing forecasts produced by ECMWF and Met Éireann using skill scores. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings by using the forecast data generated by this research.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/23541400','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/23541400"><span>Downscaled climate change projections with uncertainty assessment over India using a high resolution multi-model approach.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kumar, Pankaj; Wiltshire, Andrew; Mathison, Camilla; Asharaf, Shakeel; Ahrens, Bodo; Lucas-Picher, Philippe; Christensen, Jens H; Gobiet, Andreas; Saeed, Fahad; Hagemann, Stefan; Jacob, Daniela</p> <p>2013-12-01</p> <p>This study presents the possible regional climate change over South Asia, with a focus on India, as simulated by three very high resolution regional climate models (RCMs). One of the most striking results is a robust increase in monsoon precipitation by the end of the 21st century, but with regional differences in strength. First, the ability of the RCMs to simulate the monsoon climate is analyzed. For this purpose all three RCMs are forced with ECMWF reanalysis data for the period 1989-2008 at a horizontal resolution of ~25 km. The results are compared against independent observations.
In order to simulate future climate, the models are driven by lateral boundary conditions from two global climate models (GCMs: ECHAM5-MPIOM and HadCM3) using the SRES A1B scenario, except for one RCM, which only used data from one GCM. The results are presented for the full transient simulation period 1970-2099 and also for several time slices. The analysis concentrates on precipitation and temperature over land. All models show a clear signal of gradual, widespread warming throughout the 21st century. The ensemble-mean warming over India is 1.5°C by 2050, whereas it is 3.9°C at the end of the century with respect to 1970-1999. The pattern of projected precipitation changes shows considerable spatial variability, with an increase in precipitation over peninsular India and coastal areas and either no change or a decrease further inland. From the analysis of a larger ensemble of global climate models using the A1B scenario, widespread warming (~3.2°C) and an overall increase (~8.5%) in mean monsoon precipitation by the end of the 21st century are very likely. The influence of the driving GCM on the projected precipitation change simulated with each RCM is as strong as the variability among the RCMs driven with a single GCM. Copyright © 2013 Elsevier B.V.
All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.8950W','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.8950W"><span>The value of the North American Multi Model Ensemble phase 2 for sub-seasonal hydrological forecasting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Wanders, Niko; Wood, Eric</p> <p>2016-04-01</p> <p>Sub-seasonal to seasonal weather and hydrological forecasts have the potential to provide vital information for a variety of water-related decision makers. For example, seasonal forecasts of drought risk can enable farmers to make adaptive choices on crop varieties, labour usage, and technology investments. Seasonal and sub-seasonal predictions can increase preparedness for hydrological extremes that regularly occur in all regions of the world with large impacts on society. We investigated the skill of six seasonal forecast models from the NMME-2 ensemble coupled to two global hydrological models (VIC and PCRGLOBWB) for the period 1982-2012. The 31 years of NMME-2 hindcast data are used in combination with an ensemble mean and an ESP forecast to forecast important hydrological variables (e.g. soil moisture, groundwater storage, snow, reservoir levels and river discharge). By using two global hydrological models we are able to quantify both the uncertainty in the meteorological input and the uncertainty created by the different hydrological models. We show that the NMME-2 forecast outperforms the ESP forecasts in terms of anomaly correlation and Brier skill score for all forecasted hydrological variables, with low uncertainty in the performance amongst the hydrological models.
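The Brier skill score used above to compare NMME-2 against ESP has a compact definition; a generic sketch (the threshold event and all numbers are invented for illustration):

```python
import numpy as np

def brier_score(prob_forecast, event_occurred):
    """Mean squared error of probability forecasts for a binary event (lower is better)."""
    p = np.asarray(prob_forecast, dtype=float)
    o = np.asarray(event_occurred, dtype=float)
    return np.mean((p - o) ** 2)

def brier_skill_score(bs_forecast, bs_reference):
    """BSS > 0 means the forecast beats the reference (e.g. ESP or climatology)."""
    return 1.0 - bs_forecast / bs_reference

# Invented example: forecast probabilities of 'flow above flood threshold'
events = np.array([1, 0, 0, 1, 0, 1, 0, 0])
p_model = np.array([0.8, 0.2, 0.1, 0.7, 0.3, 0.9, 0.2, 0.1])  # sharp forecast
p_clim = np.full(8, 3 / 8)                                    # climatological base rate

bss = brier_skill_score(brier_score(p_model, events), brier_score(p_clim, events))
```

In the study's comparison the reference forecast would be the ESP ensemble rather than climatology, but the skill-score arithmetic is identical.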
However, the continuous ranked probability score (CRPS) of the NMME-2 ensemble is inferior to that of the ESP due to a large spread between the individual ensemble members. We use a cost analysis to show that the damage caused by floods and droughts in large-scale rivers can globally be reduced by 48% (for leads of 1-2 months) to 20% (for leads between 6-9 months) when precautions are taken based on the NMME-2 ensemble instead of an ESP forecast. In collaboration with our local partner in West Africa (AGRHYMET), we looked at the performance of the sub-seasonal forecasts for crop planting dates and the high flow season in West Africa. We show that the uncertainty in the optimal planting date is reduced from 30 days to 12 days (2.5 month lead) and that the uncertainty in the onset of the high flow season is reduced from 45 days to 20 days (3-4 months lead). Additionally, we show that snow accumulation and melt onset in the Northern hemisphere can be forecasted with an uncertainty of 10 days (2.5 months lead). Both the overall skill and the skill found in these last two examples indicate that the new NMME-2 forecast dataset is valuable for sub-seasonal forecast applications. The high temporal resolution (daily), long leads (one year) and large hindcast archive enable new sub-seasonal forecasting applications to be explored. We show that the NMME-2 has a large potential for sub-seasonal hydrological forecasting and other potential hydrological applications (e.g.
reservoir management), which could benefit from these new forecasts.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4552622','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4552622"><span>Quantitative Connection Between Ensemble Thermodynamics and Single-Molecule Kinetics: A Case Study Using Cryo-EM and smFRET Investigations of the Ribosome</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Frank, Joachim; Gonzalez, Ruben L.</p> <p>2015-01-01</p> <p>At equilibrium, thermodynamic and kinetic information can be extracted from biomolecular energy landscapes by many techniques. However, while static, ensemble techniques yield thermodynamic data, often only dynamic, single-molecule techniques can yield the kinetic data that describes transition-state energy barriers. Here we present a generalized framework based upon dwell-time distributions that can be used to connect such static, ensemble techniques with dynamic, single-molecule techniques, and thus characterize energy landscapes to greater resolutions. We demonstrate the utility of this framework by applying it to cryogenic electron microscopy and single-molecule fluorescence resonance energy transfer studies of the bacterial ribosomal pretranslocation complex. Among other benefits, application of this framework to these data explains why two transient, intermediate conformations of the pretranslocation complex, which are observed in a cryogenic electron microscopy study, may not be observed in several single-molecule fluorescence resonance energy transfer studies. 
PMID:25785884</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1439549-solar-time-based-analog-ensemble-method-regional-solar-power-forecasting','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1439549-solar-time-based-analog-ensemble-method-regional-solar-power-forecasting"><span>A Solar Time-Based Analog Ensemble Method for Regional Solar Power Forecasting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Hodge, Brian S; Zhang, Xinmin; Li, Yuan</p> <p></p> <p>This paper presents a new analog ensemble method for day-ahead regional photovoltaic (PV) power forecasting with hourly resolution. Utilizing open weather forecasts and power measurement data, the prediction method searches a set of historical data for periods with similar meteorological conditions (temperature and irradiance) and astronomical dates (solar time and Earth declination angle). Further, clustering and blending strategies are applied to improve its accuracy in regional PV forecasting. The robustness of the proposed method is demonstrated with three different numerical weather prediction models, the North American Mesoscale Forecast System, the Global Forecast System, and the Short-Range Ensemble Forecast, for both region-level and single-site-level PV forecasts. Using real measured data, the new forecasting approach is applied to the load zone in Southeastern Massachusetts as a case study.
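The analog-ensemble idea in this record — matching forecast weather against history and using the matched hours' measured power as the predictive ensemble — can be sketched as a nearest-neighbour search; all data and names below are synthetic stand-ins, not the paper's method in detail:

```python
import numpy as np

def analog_forecast(target_weather, hist_weather, hist_power, n_analogs=10):
    """Analog-ensemble sketch: find past hours whose weather most resembles
    the target hour's forecast weather, and use the PV power actually
    measured during those hours as the predictive ensemble."""
    mu, sd = hist_weather.mean(axis=0), hist_weather.std(axis=0)
    z_hist = (hist_weather - mu) / sd               # standardize the features
    z_target = (target_weather - mu) / sd
    dist = np.linalg.norm(z_hist - z_target, axis=1)
    analogs = np.argsort(dist)[:n_analogs]
    return hist_power[analogs].mean(), hist_power[analogs]

# Toy history: power roughly proportional to irradiance, temperature irrelevant
rng = np.random.default_rng(5)
irradiance = rng.uniform(0, 1000, 500)              # W/m^2
temperature = rng.uniform(0, 35, 500)               # deg C
power = 0.08 * irradiance + rng.normal(0, 5, 500)   # kW, idealized plant
history = np.column_stack([irradiance, temperature])

point, ensemble = analog_forecast(np.array([800.0, 25.0]), history, power)
```

The spread of the returned analog ensemble doubles as an uncertainty estimate, which is one reason the analog approach is attractive for probabilistic PV forecasting.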
The normalized root mean square error (NRMSE) has been reduced by 13.80%-61.21% when compared with three tested baselines.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17384062','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17384062"><span>Molecular dynamics simulations using temperature-enhanced essential dynamics replica exchange.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kubitzki, Marcus B; de Groot, Bert L</p> <p>2007-06-15</p> <p>Today's standard molecular dynamics simulations of moderately sized biomolecular systems at full atomic resolution are typically limited to the nanosecond timescale and therefore suffer from limited conformational sampling. Efficient ensemble-preserving algorithms like replica exchange (REX) may alleviate this problem somewhat but are still computationally prohibitive due to the large number of degrees of freedom involved. Aiming at increased sampling efficiency, we present a novel simulation method combining the ideas of essential dynamics and REX. Unlike standard REX, in each replica only a selection of essential collective modes of a subsystem of interest (essential subspace) is coupled to a higher temperature, with the remainder of the system staying at a reference temperature, T₀. This selective excitation, along with the replica framework, permits efficient approximate ensemble-preserving conformational sampling and allows much larger temperature differences between replicas, thereby considerably enhancing sampling efficiency.
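Replica exchange, which the TEE-REX method above builds on, swaps configurations between temperatures with a Metropolis criterion. A sketch of the standard REX acceptance rule in reduced units (it does not capture the essential-subspace coupling that distinguishes TEE-REX):

```python
import math

def swap_probability(energy_i, energy_j, temp_i, temp_j, k_b=1.0):
    """Metropolis acceptance probability for exchanging configurations
    between replicas i and j (standard replica exchange, reduced units)."""
    beta_i, beta_j = 1.0 / (k_b * temp_i), 1.0 / (k_b * temp_j)
    # Detailed balance for the swapped pair gives exp[(beta_i - beta_j)(E_i - E_j)]
    return min(1.0, math.exp((beta_i - beta_j) * (energy_i - energy_j)))

# A higher-energy configuration at the colder replica is always swapped up...
p_up = swap_probability(energy_i=-90.0, energy_j=-100.0, temp_i=300.0, temp_j=400.0)
# ...while the reverse swap is accepted only most of the time
p_down = swap_probability(energy_i=-100.0, energy_j=-90.0, temp_i=300.0, temp_j=400.0)
```

Because acceptance decays with the temperature gap, standard REX needs many closely spaced replicas; restricting the heating to a few essential modes, as TEE-REX does, is what permits the much larger temperature differences the abstract mentions.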
Ensemble properties and sampling performance of the method are discussed using dialanine and guanylin test systems, with multi-microsecond molecular dynamics simulations of these test systems serving as references.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25785884','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25785884"><span>Quantitative Connection between Ensemble Thermodynamics and Single-Molecule Kinetics: A Case Study Using Cryogenic Electron Microscopy and Single-Molecule Fluorescence Resonance Energy Transfer Investigations of the Ribosome.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Thompson, Colin D Kinz; Sharma, Ajeet K; Frank, Joachim; Gonzalez, Ruben L; Chowdhury, Debashish</p> <p>2015-08-27</p> <p>At equilibrium, thermodynamic and kinetic information can be extracted from biomolecular energy landscapes by many techniques. However, while static, ensemble techniques yield thermodynamic data, often only dynamic, single-molecule techniques can yield the kinetic data that describe transition-state energy barriers. Here we present a generalized framework based upon dwell-time distributions that can be used to connect such static, ensemble techniques with dynamic, single-molecule techniques, and thus characterize energy landscapes to greater resolutions. We demonstrate the utility of this framework by applying it to cryogenic electron microscopy (cryo-EM) and single-molecule fluorescence resonance energy transfer (smFRET) studies of the bacterial ribosomal pre-translocation complex. 
Among other benefits, application of this framework to these data explains why two transient, intermediate conformations of the pre-translocation complex, which are observed in a cryo-EM study, may not be observed in several smFRET studies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3645981','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3645981"><span>Understanding the Structural Ensembles of a Highly Extended Disordered Protein†</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Daughdrill, Gary W.; Kashtanov, Stepan; Stancik, Amber; Hill, Shannon E.; Helms, Gregory; Muschol, Martin</p> <p>2013-01-01</p> <p>Developing a comprehensive description of the equilibrium structural ensembles for intrinsically disordered proteins (IDPs) is essential to understanding their function. The p53 transactivation domain (p53TAD) is an IDP that interacts with multiple protein partners and contains numerous phosphorylation sites. Multiple techniques were used to investigate the equilibrium structural ensemble of p53TAD in its native and chemically unfolded states. The results from these experiments show that the native state of p53TAD has dimensions similar to a classical random coil while the chemically unfolded state is more extended. To investigate the molecular properties responsible for this behavior, a novel algorithm that generates diverse and unbiased structural ensembles of IDPs was developed. This algorithm was used to generate a large pool of plausible p53TAD structures that were reweighted to identify a subset of structures with the best fit to small angle X-ray scattering data. 
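The reweighting step described above — adjusting weights over a pool of candidate structures until the ensemble-averaged curve matches the SAXS data — can be sketched with non-negative least squares; the curves below are synthetic stand-ins, not real scattering profiles:

```python
import numpy as np
from scipy.optimize import nnls

# Rows: a scattering curve computed from each candidate structure in the pool
# (synthetic stand-ins; real curves would come from a tool such as CRYSOL).
rng = np.random.default_rng(11)
n_structures, n_q = 40, 60
pool_curves = rng.uniform(0.5, 1.5, size=(n_structures, n_q))

# The "experimental" curve is secretly a mixture of three pool members
true_w = np.zeros(n_structures)
true_w[[3, 17, 25]] = [0.5, 0.3, 0.2]
experimental = true_w @ pool_curves

# Non-negative least squares recovers ensemble weights whose weighted-average
# curve best reproduces the data; most structures end up with zero weight.
weights, residual = nnls(pool_curves.T, experimental)
weights /= weights.sum()
```

The non-negativity constraint is what makes the result interpretable as population fractions of the structural ensemble.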
High-weight structures in the native-state ensemble show features that are localized to protein binding sites and regions with high proline content. The features localized to the protein binding sites are mostly eliminated in the chemically unfolded ensemble, while the regions with high proline content remain relatively unaffected. Data from NMR experiments support these results, showing that residues from the protein binding sites experience larger environmental changes upon unfolding by urea than regions with high proline content. This behavior is consistent with the urea-induced exposure of nonpolar and aromatic side-chains in the protein binding sites that are partially excluded from solvent in the native state ensemble. PMID:21979461</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19990087333&hterms=behavior+modification&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Dbehavior%2Bmodification','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19990087333&hterms=behavior+modification&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Dbehavior%2Bmodification"><span>Behavior of Filters and Smoothers for Strongly Nonlinear Dynamics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Zhu, Yanqui; Cohn, Stephen E.; Todling, Ricardo</p> <p>1999-01-01</p> <p>The Kalman filter is the optimal filter in the presence of known Gaussian error statistics and linear dynamics. Filter extension to nonlinear dynamics is nontrivial in the sense of appropriately representing high-order moments of the statistics. Monte Carlo, ensemble-based methods have been advocated as the methodology for representing high-order moments without any questionable closure assumptions.
Investigation along these lines has been conducted for highly idealized dynamics such as the strongly nonlinear Lorenz model as well as more realistic models of the ocean and atmosphere. A few relevant issues in this context are related to the necessary number of ensemble members to properly represent the error statistics and the necessary modifications in the usual filter situations to allow for a correct update of the ensemble members. The ensemble technique has also been applied to the problem of smoothing, for which similar questions apply. Ensemble smoother examples, however, seem to be quite puzzling in that the resulting state estimates are worse than those of their filter analogues. In this study, we use concepts in probability theory to revisit the ensemble methodology for filtering and smoothing in data assimilation. We use the Lorenz model to test and compare the behavior of a variety of implementations of ensemble filters. We also implement ensemble smoothers that are able to perform better than their filter counterparts. A discussion of the applicability of these techniques to large data assimilation problems will be given at the time of the conference.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018OSJ...tmp...10C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018OSJ...tmp...10C"><span>An OSSE Study for Deep Argo Array using the GFDL Ensemble Coupled Data Assimilation System</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chang, You-Soon; Zhang, Shaoqing; Rosati, Anthony; Vecchi, Gabriel A.; Yang, Xiaosong</p> <p>2018-03-01</p> <p>An observing system simulation experiment (OSSE) using an ensemble coupled data assimilation system was designed to investigate the impact of deep ocean Argo profile assimilation in a biased numerical climate system.
Based on the modern Argo observational array and an artificial extension to full depth, "observations" drawn from one coupled general circulation model (CM2.0) were assimilated into another model (CM2.1). Our results showed that coupled data assimilation with simultaneous atmospheric and oceanic constraints plays a significant role in preventing deep ocean drift. However, the extension of the Argo array to full depth did not significantly improve the quality of the oceanic climate estimation within the bias magnitude in the twin experiment. Even in the "identical" twin experiment, with the deep Argo array drawn from the same model (CM2.1) as the assimilation model, no significant changes were shown in the deep ocean, such as in the Atlantic meridional overturning circulation and the Antarctic bottom water cell. The small ensemble spread and corresponding weak constraints by the deep Argo profiles with medium spatial and temporal resolution may explain why the deep Argo profiles did not improve the deep ocean features in the assimilation system.
Additional studies using different assimilation methods with improved spatial and temporal resolution of the deep Argo array are necessary in order to more thoroughly understand the impact of the deep Argo array on the assimilation system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.8611H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.8611H"><span>Post-processing of global model output to forecast point rainfall</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hewson, Tim; Pillosu, Fatima</p> <p>2016-04-01</p> <p>ECMWF (the European Centre for Medium range Weather Forecasts) has recently embarked upon a new project to post-process gridbox rainfall forecasts from its ensemble prediction system, to provide probabilistic forecasts of point rainfall. The new post-processing strategy relies on understanding how different rainfall generation mechanisms lead to different degrees of sub-grid variability in rainfall totals. We use a number of simple global model parameters, such as the convective rainfall fraction, to anticipate the sub-grid variability, and then post-process each ensemble forecast into a pdf (probability density function) for a point-rainfall total. The final forecast will comprise the sum of the different pdfs from all ensemble members. The post-processing is essentially a re-calibration exercise, which needs only rainfall totals from standard global reporting stations (and forecasts) to train it. High density observations are not needed. This presentation will describe results from the initial 'proof of concept' study, which has been remarkably successful. Reference will also be made to other useful outcomes of the work, such as gaining insights into systematic model biases in different synoptic settings. 
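The post-processing idea described above, mapping each member's gridbox total to a pdf for the point-rainfall total, with sub-grid variability anticipated from parameters such as the convective fraction, can be sketched as follows. The gamma-shaped sub-grid distribution and every parameter value here are illustrative assumptions, not ECMWF's calibrated scheme.

```python
import numpy as np

def member_point_pdf(grid_total, conv_frac, x):
    """Discretised pdf of the point-rainfall total for one ensemble member.
    Assumption: sub-grid variability grows with the convective fraction,
    and a gamma shape is used purely for illustration."""
    cv = 0.3 + 1.2 * conv_frac            # assumed coefficient of variation
    shape = 1.0 / cv**2
    scale = grid_total / shape            # keeps the mean at grid_total
    pdf = x**(shape - 1.0) * np.exp(-x / scale)
    return pdf / pdf.sum()                # normalise on the discrete grid

x = np.linspace(0.01, 60.0, 600)          # candidate point totals (mm)
# hypothetical members: (gridbox total in mm, convective rainfall fraction)
members = [(8.0, 0.9), (6.0, 0.2), (12.0, 0.7)]
combined = np.mean([member_point_pdf(t, f, x) for t, f in members], axis=0)
p_exceed_20mm = combined[x > 20.0].sum()  # chance of a point total above 20 mm
```

Averaging the member pdfs is the normalised version of the abstract's "sum of the different pdfs"; exceedance probabilities read off the combined pdf are what would feed flash-flood early warnings.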
The special case of orographic rainfall will also be discussed. Work ongoing this year will also be described. This involves further investigations of which model parameters can provide predictive skill, and will then move on to development of an operational system for predicting point rainfall across the globe. The main practical benefit of this system will be a greatly improved capacity to predict extreme point rainfall, and thereby provide early warnings, for the whole world, of flash flood potential for lead times that extend beyond day 5. This will be incorporated into the suite of products output by GLOFAS (the GLObal Flood Awareness System) which is hosted at ECMWF. As such, this work offers a very cost-effective approach to satisfying user needs right around the world. This field has hitherto relied on using very expensive high-resolution ensembles; by their very nature these can only run over small regions, and only for lead times up to about 2 days.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20110013410','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20110013410"><span>The Ensemble Canon</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Mittman, David S.</p> <p>2011-01-01</p> <p>Ensemble is an open architecture for the development, integration, and deployment of mission operations software. Fundamentally, it is an adaptation of the Eclipse Rich Client Platform (RCP), a widespread, stable, and supported framework for component-based application development. By capitalizing on the maturity and availability of the Eclipse RCP, Ensemble offers a low-risk, politically neutral path towards a tighter integration of operations tools. The Ensemble project is a highly successful, ongoing collaboration among NASA Centers.
Since 2004, the Ensemble project has supported the development of mission operations software for NASA's Exploration Systems, Science, and Space Operations Directorates.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1613434Y','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1613434Y"><span>A variational ensemble scheme for noisy image data assimilation</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Yang, Yin; Robinson, Cordelia; Heitz, Dominique; Mémin, Etienne</p> <p>2014-05-01</p> <p>Data assimilation techniques aim at recovering the trajectory of a system's state variables, denoted X, over time from partially observed noisy measurements of the system, denoted Y. These procedures, which couple dynamics and noisy measurements of the system, indeed fulfill a twofold objective. On one hand, they provide a denoising - or reconstruction - procedure for the data through a given model framework, and on the other hand, they provide estimation procedures for unknown parameters of the dynamics. A standard variational data assimilation problem can be formulated as the minimization of the following objective function with respect to the initial discrepancy, η, from the background initial guess: J(η(x)) = (1/2)‖Xb(x) − X(t0, x)‖²_B + (1/2)∫_{t0}^{tf} ‖H(X(t, x)) − Y(t, x)‖²_R dt, (1) where the observation operator H links the state variable and the measurements. The cost function can be interpreted as the log-likelihood function associated with the a posteriori distribution of the state given the past history of measurements and the background. In this work, we aim at studying ensemble-based optimal control strategies for data assimilation. Such a formulation nicely combines the ingredients of ensemble Kalman filters and variational data assimilation (4DVar).
It is also formulated as the minimization of the objective function (1), but, similarly to ensemble filters, it introduces into its objective function an empirical ensemble-based background-error covariance defined as: B ≡ ⟨(Xb − ⟨Xb⟩)(Xb − ⟨Xb⟩)^T⟩. (2) Thus, it works in an off-line smoothing mode rather than on the fly like sequential filters. The resulting ensemble variational data assimilation technique corresponds to a relatively new family of methods [1,2,3]. It presents two main advantages: first, it no longer requires constructing the adjoint of the dynamics' tangent linear operator, which considerably simplifies the method's implementation, and second, it enables the handling of a flow-dependent background-error covariance matrix that can be consistently adjusted to the background error. These advantages come, however, at the cost of a reduced-rank modeling of the solution space. The B matrix is at most of rank N − 1 (N being the size of the ensemble), which is considerably lower than the dimension of the state space. This rank deficiency may introduce spurious correlation errors, which particularly impact the quality of results on a high-resolution computing grid. The common strategy for suppressing these distant correlations in ensemble Kalman techniques is through localization procedures. In this paper we present key theoretical properties associated with different choices of methods involved in this setup and experimentally compare the performance of several variations of an ensemble technique of interest against an incremental 4DVar method. The comparisons were carried out on a shallow-water model, with both synthetic data and real observations. We particularly addressed the potential pitfalls and advantages of the different methods. The results indicate an advantage in favor of the ensemble technique, both in quality and in computational cost, when dealing with incomplete observations.
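The empirical covariance of equation (2) is straightforward to form from the ensemble members, and the rank bound just mentioned (at most N − 1) can be checked numerically; this numpy sketch uses an arbitrary synthetic ensemble purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 200, 12                           # state dimension >> ensemble size
Xb = rng.normal(size=(n, N))             # background ensemble, members as columns
A = Xb - Xb.mean(axis=1, keepdims=True)  # anomalies about the ensemble mean
B = A @ A.T / (N - 1)                    # empirical covariance of eq. (2)

# the anomaly columns sum to zero, so B has rank at most N - 1,
# far below the state dimension n: the rank deficiency discussed above
rank_B = np.linalg.matrix_rank(B)
```

Localization (tapering B toward zero at large separations) is the standard remedy for the spurious long-range correlations this low-rank structure produces.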
We highlight, as a premise of using ensemble variational assimilation, that the initial perturbation used to build the initial ensemble has to fit the physics of the observed phenomenon. We also apply the method to a stochastic shallow-water model which incorporates an uncertainty expression for the subgrid stress tensor related to the ensemble spread. References: [1] A. C. Lorenc, The potential of the ensemble Kalman filter for NWP - a comparison with 4D-Var, Quart. J. Roy. Meteor. Soc., Vol. 129, pp. 3183-3203, 2003. [2] C. Liu, Q. Xiao, and B. Wang, An Ensemble-Based Four-Dimensional Variational Data Assimilation Scheme. Part I: Technical Formulation and Preliminary Test, Mon. Wea. Rev., Vol. 136(9), pp. 3363-3373, 2008. [3] M. Buehner, Ensemble-derived stationary and flow-dependent background-error covariances: Evaluation in a quasi-operational NWP setting, Quart. J. Roy. Meteor. Soc., Vol. 131(607), pp. 1013-1043, April 2005.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_19 --> <div id="page_20" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="381"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..19.3395M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..19.3395M"><span>"Big Data Assimilation" for 30-second-update 100-m-mesh Numerical Weather Prediction</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Miyoshi, Takemasa; Lien, Guo-Yuan; Kunii, Masaru; Ruiz, Juan; Maejima, Yasumitsu; Otsuka, Shigenori; Kondo, Keiichi; Seko, Hiromu; Satoh, Shinsuke; Ushio, Tomoo; Bessho, Kotaro; Kamide, Kazumi; Tomita, Hirofumi; Nishizawa, Seiya; Yamaura, Tsuyoshi; Ishikawa, Yutaka</p> <p>2017-04-01</p> <p>A typical lifetime of a single cumulonimbus is within an hour, and radar observations often show rapid changes in only a 5-minute period. For precise prediction of such rapidly-changing local severe storms, we have developed what we call a "Big Data Assimilation" (BDA) system that performs 30-second-update data assimilation cycles at 100-m grid spacing. The concept follows that of NOAA's Warn-on-Forecast (WoF), in which rapidly-updated high-resolution NWP will play a central role in issuing severe-storm warnings even only minutes in advance. The 100-m resolution and 30-second update frequency are a leap beyond typical recent research settings, made possible by the fortunate combination of Japan's most advanced supercomputing and sensing technologies: the 10-petaflops K computer and the Phased Array Weather Radar (PAWR). The X-band PAWR is capable of a dense three-dimensional volume scan at 100-m range resolution with 100 elevation angles and 300 azimuth angles, up to 60-km range, within 30 seconds.
The PAWR data show temporally-smooth evolution of convective rainstorms. This gives us hope that we may assume a Gaussian error distribution in 30-second forecasts, before strong nonlinear dynamics distort the error distribution for rapidly-changing convective storms. With this in mind, we apply the Local Ensemble Transform Kalman Filter (LETKF), which considers flow-dependent error covariance explicitly under the Gaussian-error assumption. The flow-dependence would be particularly important in rapidly-changing convective weather. Using a 100-member ensemble at 100-m resolution, we have tested the Big Data Assimilation system in real-world cases of sudden local rainstorms, and obtained promising results. However, the real-time application is a big challenge, and a cycle currently takes 10 minutes. We explore approaches to accelerating the computations, such as using single-precision arrays in the model computation and developing efficient I/O middleware for passing the large data between model and data assimilation as quickly as possible.
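The per-gridpoint core of the LETKF mentioned above can be sketched in ensemble space; this is a generic, unlocalised ensemble transform update with assumed dimensions and observation operator, purely illustrative and not the BDA system's implementation.

```python
import numpy as np

def sym_sqrt(M):
    """Symmetric matrix square root via eigendecomposition."""
    vals, vecs = np.linalg.eigh(M)
    return vecs @ np.diag(np.sqrt(np.maximum(vals, 0.0))) @ vecs.T

def etkf_update(Xb, y, H, r_var):
    """Ensemble-space analysis, the per-gridpoint core of the LETKF:
    every matrix inverted lives in the small k-dimensional ensemble space."""
    n, k = Xb.shape
    xb = Xb.mean(axis=1, keepdims=True)
    A = Xb - xb                                        # state anomalies
    Yb = H @ Xb                                        # observed ensemble
    S = (Yb - Yb.mean(axis=1, keepdims=True)) / np.sqrt(r_var)
    d = (y[:, None] - Yb.mean(axis=1, keepdims=True)) / np.sqrt(r_var)
    Pa = np.linalg.inv((k - 1) * np.eye(k) + S.T @ S)  # analysis cov, ensemble space
    w_mean = Pa @ S.T @ d                              # weights for the analysis mean
    W = sym_sqrt((k - 1) * Pa)                         # weights for the perturbations
    return xb + A @ (w_mean + W)                       # analysis ensemble

rng = np.random.default_rng(2)
Xb = 3.0 + rng.normal(size=(5, 20))                    # 20-member toy background
H = np.zeros((2, 5)); H[0, 0] = 1.0; H[1, 3] = 1.0     # observe two components
Xa = etkf_update(Xb, np.array([3.5, 2.5]), H, r_var=0.25)
```

In the full LETKF this update runs independently at each grid point on the observations within a localization radius, which is what makes the 100-m, 30-second problem parallelizable.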
In this presentation, we will present the most up-to-date progress of our Big Data Assimilation research.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27789526','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27789526"><span>An integrated 3-Dimensional Genome Modeling Engine for data-driven simulation of spatial genome organization.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Szałaj, Przemysław; Tang, Zhonghui; Michalski, Paul; Pietal, Michal J; Luo, Oscar J; Sadowski, Michał; Li, Xingwang; Radew, Kamen; Ruan, Yijun; Plewczynski, Dariusz</p> <p>2016-12-01</p> <p>ChIA-PET is a high-throughput mapping technology that reveals long-range chromatin interactions and provides insights into the basic principles of spatial genome organization and gene regulation mediated by specific protein factors. Recently, we showed that a single ChIA-PET experiment provides information at all genomic scales of interest, from the high-resolution locations of binding sites and enriched chromatin interactions mediated by specific protein factors, to the low resolution of nonenriched interactions that reflect topological neighborhoods of higher-order chromosome folding. This multilevel nature of ChIA-PET data offers an opportunity to use multiscale 3D models to study structural-functional relationships at multiple length scales, but doing so requires a structural modeling platform. Here, we report the development of 3D-GNOME (3-Dimensional Genome Modeling Engine), a complete computational pipeline for 3D simulation using ChIA-PET data. 3D-GNOME consists of three integrated components: a graph-distance-based heat map normalization tool, a 3D modeling platform, and an interactive 3D visualization tool. 
Using ChIA-PET and Hi-C data derived from human B-lymphocytes, we demonstrate the effectiveness of 3D-GNOME in building 3D genome models at multiple levels, including the entire genome, individual chromosomes, and specific segments at megabase (Mb) and kilobase (kb) resolutions, as single average and ensemble structures. Further incorporating CTCF-motif orientation and high-resolution looping patterns into the 3D simulation lent additional reliability to the biologically plausible topological structures. © 2016 Szałaj et al.; Published by Cold Spring Harbor Laboratory Press.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFM.A23D2394H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFM.A23D2394H"><span>Combining super-ensembles and statistical emulation to improve a regional climate and vegetation model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hawkins, L. R.; Rupp, D. E.; Li, S.; Sarah, S.; McNeall, D. J.; Mote, P.; Betts, R. A.; Wallom, D.</p> <p>2017-12-01</p> <p>Changing regional patterns of surface temperature, precipitation, and humidity may cause ecosystem-scale changes in vegetation, altering the distribution of trees, shrubs, and grasses. A changing vegetation distribution, in turn, alters the albedo, latent heat flux, and carbon exchanged with the atmosphere with resulting feedbacks onto the regional climate. However, a wide range of earth-system processes that affect the carbon, energy, and hydrologic cycles occur at sub-grid scales in climate models and must be parameterized. The appropriate parameter values in such parameterizations are often poorly constrained, leading to uncertainty in predictions of how the ecosystem will respond to changes in forcing.
To better understand the sensitivity of regional climate to parameter selection and to improve regional climate and vegetation simulations, we used a large perturbed-physics ensemble and a suite of statistical emulators. We dynamically downscaled a super-ensemble (multiple parameter sets and multiple initial conditions) of global climate simulations using the 25-km resolution regional climate model HadRM3p with the land-surface scheme MOSES2 and dynamic vegetation module TRIFFID. We simultaneously perturbed land surface parameters relating to the exchange of carbon, water, and energy between the land surface and atmosphere in a large super-ensemble of regional climate simulations over the western US. Statistical emulation was used as a computationally cost-effective tool to explore uncertainties in these interactions. Regions of parameter space that did not satisfy observational constraints were eliminated, and an ensemble of parameter sets that reduces regional biases and spans a range of plausible interactions among Earth-system processes was selected.
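A statistical emulator of this kind can be sketched with a small numpy Gaussian-process regression over the perturbed parameters; the two-dimensional parameter space, the stand-in model output, and the observational constraint below are all hypothetical stand-ins for the climate-model quantities described above.

```python
import numpy as np

def rbf_kernel(X1, X2, length=0.3, var=1.0):
    """Squared-exponential covariance between parameter settings."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_emulate(X_train, y_train, X_new, noise=1e-4):
    """GP posterior mean/variance: a cheap surrogate for the simulator."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_new, X_train)
    mean = Ks @ np.linalg.solve(K, y_train)
    var = 1.0 + noise - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var

rng = np.random.default_rng(3)
X_train = rng.uniform(size=(40, 2))    # hypothetical 2-D parameter space in [0,1]^2
y_train = np.sin(3.0 * X_train[:, 0]) + 0.5 * X_train[:, 1]  # stand-in model output

X_cand = rng.uniform(size=(2000, 2))   # dense candidate parameter settings
mean, var = gp_emulate(X_train, y_train, X_cand)
obs, tol = 0.8, 0.2                    # hypothetical observational constraint
keep = np.abs(mean - obs) < tol + 3.0 * np.sqrt(np.maximum(var, 0.0))
plausible = X_cand[keep]               # parameter settings not ruled out
```

The last two lines mirror the elimination step in the abstract: candidate parameter sets whose emulated output is inconsistent with the observational constraint, allowing for emulator uncertainty, are discarded.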
This study demonstrated that by combining super-ensemble simulations with statistical emulation, simulations of regional climate could be improved while simultaneously accounting for a range of plausible land-atmosphere feedback strengths.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/15020771-changes-seasonal-extreme-hydrologic-conditions-georgia-basin-puget-sound-ensemble-regional-climate-simulation-mid-century','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/15020771-changes-seasonal-extreme-hydrologic-conditions-georgia-basin-puget-sound-ensemble-regional-climate-simulation-mid-century"><span>Changes in Seasonal and Extreme Hydrologic Conditions of the Georgia Basin/Puget Sound in an Ensemble Regional Climate Simulation for the Mid-Century</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Leung, Lai R.; Qian, Yun</p> <p></p> <p>This study examines an ensemble of climate change projections simulated by a global climate model (GCM) and downscaled with a regional climate model (RCM) to 40 km spatial resolution for western North America. One control and three ensemble future climate simulations were produced by the GCM following a business-as-usual scenario for greenhouse gas and aerosol emissions from 1995 to 2100. The RCM was used to downscale the GCM control simulation (1995-2015) and each ensemble future GCM climate (2040-2060) simulation. Analyses of the regional climate simulations for the Georgia Basin/Puget Sound showed a warming of 1.5-2 °C and statistically insignificant changes in precipitation by the mid-century. Climate change has large impacts on snowpack (about 50% reduction) but relatively smaller impacts on the total runoff for the basin as a whole.
However, climate change can strongly affect small watersheds such as those located in the transient snow zone, causing a higher likelihood of winter flooding as a higher percentage of precipitation falls in the form of rain rather than snow, and reduced streamflow in early summer. In addition, there are large changes in the monthly total runoff above the upper 1% threshold (or flood volume) from October through May, and the December flood volume of the future climate is 60% above the maximum monthly flood volume of the control climate. Uncertainty of the climate change projections, as characterized by the spread among the ensemble future climate simulations, is relatively small for the basin mean snowpack and runoff, but increases in smaller watersheds, especially in the transient snow zone, and associated with extreme events. This emphasizes the importance of characterizing uncertainty through ensemble simulations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ESSD....9..389S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ESSD....9..389S"><span>A global water resources ensemble of hydrological models: the eartH2Observe Tier-1 dataset</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Schellekens, Jaap; Dutra, Emanuel; Martínez-de la Torre, Alberto; Balsamo, Gianpaolo; van Dijk, Albert; Sperna Weiland, Frederiek; Minvielle, Marie; Calvet, Jean-Christophe; Decharme, Bertrand; Eisner, Stephanie; Fink, Gabriel; Flörke, Martina; Peßenteiner, Stefanie; van Beek, Rens; Polcher, Jan; Beck, Hylke; Orth, René; Calton, Ben; Burke, Sophia; Dorigo, Wouter; Weedon, Graham P.</p> <p>2017-07-01</p> <p>The dataset presented here consists of an ensemble of 10 global hydrological and land surface models for the period 1979-2012 using a reanalysis-based meteorological
forcing dataset (0.5° resolution). The current dataset represents the state of the art in global hydrological modelling and serves as a benchmark for further improvements in the coming years. A signal-to-noise ratio analysis revealed low inter-model agreement over (i) snow-dominated regions and (ii) tropical rainforest and monsoon areas. The large uncertainty of precipitation in the tropics is not reflected in the ensemble runoff. Verification of the results against benchmark datasets for evapotranspiration, snow cover, snow water equivalent, soil moisture anomaly and total water storage anomaly using the tools from The International Land Model Benchmarking Project (ILAMB) showed overall useful model performance, while the ensemble mean generally outperformed the single model estimates. The results also show that there is currently no single best model for all variables and that model performance is spatially variable. In our unconstrained model runs the ensemble mean of total runoff into the ocean was 46 268 km3 yr-1 (334 kg m-2 yr-1), while the ensemble mean of total evaporation was 537 kg m-2 yr-1. All data are made available openly through a Water Cycle Integrator portal (WCI, wci.earth2observe.eu), and via direct HTTP and FTP download. The portal follows Open Geospatial Consortium protocols such as OPeNDAP, WCS and WMS.
The DOI for the data is <a href="https://doi.org/10.5281/zenodo.167070" target="_blank">https://doi.org/10.5281/zenodo.167070</a>.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/27967122','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/27967122"><span>Sensitivity of small myosin II ensembles from different isoforms to mechanical load and ATP concentration.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Erdmann, Thorsten; Bartelheimer, Kathrin; Schwarz, Ulrich S</p> <p>2016-11-01</p> <p>Based on a detailed crossbridge model for individual myosin II motors, we systematically study the influence of mechanical load and adenosine triphosphate (ATP) concentration on small myosin II ensembles made from different isoforms. For skeletal and smooth muscle myosin II, which are often used in actomyosin gels that reconstitute cell contractility, fast forward movement is restricted to a small region of phase space with low mechanical load and high ATP concentration, which is also characterized by frequent ensemble detachment. At high load, these ensembles are stalled or move backwards, but forward motion can be restored by decreasing ATP concentration. In contrast, small ensembles of nonmuscle myosin II isoforms, which are found in the cytoskeleton of nonmuscle cells, are hardly affected by ATP concentration due to the slow kinetics of the bound states.
For all isoforms, the thermodynamic efficiency of ensemble movement increases with decreasing ATP concentration, but this effect is weaker for the nonmuscle myosin II isoforms.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ClDy...50..863Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ClDy...50..863Z"><span>Evaluations of high-resolution dynamically downscaled ensembles over the contiguous United States</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.; Kotamarthi, V. Rao</p> <p>2018-02-01</p> <p>This study uses the Weather Research and Forecasting (WRF) model to evaluate the performance of six dynamically downscaled decadal historical simulations with 12-km resolution for a large domain (7200 × 6180 km) that covers most of North America. The initial and boundary conditions are from three global climate models (GCMs) and one reanalysis dataset. The GCMs employed in this study are the Geophysical Fluid Dynamics Laboratory Earth System Model with Generalized Ocean Layer Dynamics component, the Community Climate System Model, version 4, and the Hadley Centre Global Environment Model, version 2-Earth System. The reanalysis data is from the National Centers for Environmental Prediction-U.S. Department of Energy Reanalysis II. We analyze the effects of bias-correcting the lateral boundary conditions and the effects of spectral nudging. We evaluate the model performance for seven surface variables and four upper atmospheric variables based on their climatology and extremes for seven subregions across the United States. The results indicate that simulation performance depends on both the location and the feature/variable being tested.
We find that the use of bias correction and/or nudging is beneficial in many situations, but employing them when running the RCM does not always improve agreement with the reference data. The ensemble mean and median perform better for the climatology, but are significantly biased for the extremes, showing much larger differences from the reference data than individual GCM-driven simulations. This study provides a comprehensive evaluation of these historical model runs in order to inform decisions when making future projections.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017HESS...21.2881B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017HESS...21.2881B"><span>Global evaluation of runoff from 10 state-of-the-art hydrological models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Beck, Hylke E.; van Dijk, Albert I. J. M.; de Roo, Ad; Dutra, Emanuel; Fink, Gabriel; Orth, Rene; Schellekens, Jaap</p> <p>2017-06-01</p> <p>Observed streamflow data from 966 medium-sized catchments (1000-5000 km2) around the globe were used to comprehensively evaluate the daily runoff estimates (1979-2012) of six global hydrological models (GHMs) and four land surface models (LSMs) produced as part of tier-1 of the eartH2Observe project. The models were all driven by the WATCH Forcing Data ERA-Interim (WFDEI) meteorological dataset, but used different datasets for non-meteorological inputs and were run at various spatial and temporal resolutions, although all data were re-sampled to a common 0.5° spatial and daily temporal resolution. For the evaluation, we used a broad range of performance metrics related to important aspects of the hydrograph.
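The abstract does not list its exact metrics, but two standard hydrograph performance measures, the Nash-Sutcliffe efficiency and a modified Kling-Gupta efficiency, can be sketched as follows; the daily runoff values are invented purely for illustration.

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is perfect; 0 is no better than the obs mean."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(sim, obs):
    """Modified Kling-Gupta efficiency: correlation, bias and variability terms."""
    r = np.corrcoef(sim, obs)[0, 1]
    beta = sim.mean() / obs.mean()                                # bias ratio
    gamma = (sim.std() / sim.mean()) / (obs.std() / obs.mean())   # CV ratio
    return 1.0 - np.sqrt((r - 1.0) ** 2 + (beta - 1.0) ** 2 + (gamma - 1.0) ** 2)

# invented daily runoff (mm) over one small flood event
obs = np.array([3.0, 4.0, 9.0, 14.0, 8.0, 5.0, 4.0, 3.5])
sim = np.array([2.5, 4.5, 8.0, 12.0, 9.0, 5.5, 4.0, 3.0])
scores = {'NSE': nse(sim, obs), 'KGE': kge(sim, obs)}
```

Because the two metrics weight peak errors, bias, and variability differently, evaluating a model against several such scores captures "important aspects of the hydrograph" that any single score would miss.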
We found pronounced inter-model performance differences, underscoring the importance of hydrological model uncertainty in addition to climate input uncertainty, for example in studies assessing the hydrological impacts of climate change. The uncalibrated GHMs were found to perform, on average, better than the uncalibrated LSMs in snow-dominated regions, while the ensemble mean was found to perform only slightly worse than the best (calibrated) model. The inclusion of less-accurate models did not appreciably degrade the ensemble performance. Overall, we argue that more effort should be devoted to calibrating and regionalizing the parameters of macro-scale models. We further found that, despite adjustments using gauge observations, the WFDEI precipitation data still contain substantial biases that propagate into the simulated runoff. The early bias in the spring snowmelt peak exhibited by most models is probably primarily due to the widespread precipitation underestimation at high northern latitudes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.emc.ncep.noaa.gov/gmb/STATS/html/model_changes.html','SCIGOVWS'); return false;" href="http://www.emc.ncep.noaa.gov/gmb/STATS/html/model_changes.html"><span>MODEL CHANGES SINCE 1991</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.science.gov/aboutsearch.html">Science.gov Websites</a></p> <p></p> <p></p> <p>Observations and analysis: effects of balloon drift in time and space included. Forecast and post processing: improved orography. Observations and analysis: higher-resolution sea ice mask. 
12/04/07 12Z: Use of Unified <em>Post</em> Processor in GFS 12/04/07 12Z: GFS Ensemble (NAEFS/TIGGE) UPGRADE</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016PhRvA..93f3855G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016PhRvA..93f3855G"><span>Coherently coupling distinct spin ensembles through a high-Tc superconducting resonator</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ghirri, A.; Bonizzoni, C.; Troiani, F.; Buccheri, N.; Beverina, L.; Cassinese, A.; Affronte, M.</p> <p>2016-06-01</p> <p>The problem of coupling multiple spin ensembles through cavity photons is revisited by using (3,5-dichloro-4-pyridyl)bis(2,4,6-trichlorophenyl)methyl (PyBTM) organic radicals and a high-Tc superconducting coplanar resonator. An exceptionally strong coupling is obtained and up to three spin ensembles are simultaneously coupled. The ensembles are made physically distinguishable by chemically varying the g factor and by exploiting the inhomogeneities of the applied magnetic field. The coherent mixing of the spin and field modes is demonstrated by the observed multiple anticrossings, along with the simulations performed within the input-output formalism, and quantified by suitable entropic measures.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMGC53A0868R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMGC53A0868R"><span>Greenland in Warm (1.5 °C) and Warmer (RCP 8.5) Worlds: The Influence of the Paris Agreement on Ice Sheet Surface Melting</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Reusch, D.
B.</p> <p>2017-12-01</p> <p>Melting on the surface of the Greenland ice sheet has been changing dramatically as global air temperatures have increased in recent decades, including melt extent often exceeding the 1981-2010 median through much of the melt season and the onset of intermittent melt moving to earlier in the year. To evaluate potential future change, we investigate surface melting characteristics under both "low" (limited to 1.5 °C) and "high" (RCP 8.5) warming scenarios including analysis of differences in scenario outcomes. Climatologies of melt-relevant variables are developed from two publicly available ensembles of CESM1-CAM5-BGC GCM runs: the 30-member Large Ensemble (CESM LE; Kay et al. 2015) for historical calibration and the RCP 8.5 scenario and the 11-member Low Warming ensemble (CESM LW; Sanderson et al. 2017) for the 1.5 °C scenario. For higher spatial resolution (15 km) and improved polar-centric model physics, we also apply the regional forecast model Polar WRF to decadal subsets (1996-2005; 2071-80) using GCM data archived at sub-daily resolution for boundary conditions. Models were skill-tested against ERA-Interim Reanalysis (ERAI) and AWS observations. For example, CESM LE tends to overpredict both maximum (above-freezing) and minimum daily average surface temperatures compared to observations from the GC-Net Swiss Camp AWS. Ensembles of members differing only by initial conditions allow us to also estimate intramodel uncertainty. Historical (1981-2000) CESM LE spatially averaged July temperatures are 2 +/- 0.2 °C cooler than ERAI while local anomalies in individual members reach up to +/- 2 °C. As expected, Greenland does not escape future (2081-2100) warming (and expectations of more widespread surface melting) even in the LW scenario, but positive changes versus ERAI are mostly coastal (2-3 °C) with the interior showing only minor change (+/- 1 °C). 
In contrast, under RCP 8.5, the entire ice sheet has warmed by 2-6 °C, or a median increase of 5 °C versus LW. Adjusting for the CESM cold bias versus ERAI pushes these values even closer to more frequent melting conditions. We combine these measures of model skill and intramodel variability to develop improved uncertainty estimates for our projections of future surface melting, based on calibrations of models to passive microwave observations of melting.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20180000538','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20180000538"><span>High Resolution Atmospheric Inversion of Urban CO2 Emissions During the Dormant Season of the Indianapolis Flux Experiment (INFLUX)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Lauvaux, Thomas; Miles, Natasha L.; Deng, Aijun; Richardson, Scott J.; Cambaliza, Maria O.; Davis, Kenneth J.; Gaudet, Brian; Gurney, Kevin R.; Huang, Jianhua; O'Keefe, Darragh</p> <p>2016-01-01</p> <p>Urban emissions of greenhouse gases (GHG) represent more than 70% of the global fossil fuel GHG emissions.
Unless mitigation strategies are successfully implemented, the increase in urban GHG emissions is almost inevitable as large metropolitan areas are projected to grow twice as fast as the world population in the coming 15 years. Monitoring these emissions becomes a critical need as their contribution to the global carbon budget increases rapidly. In this study, we developed the first comprehensive monitoring system of CO2 emissions at high resolution using a dense network of CO2 atmospheric measurements over the city of Indianapolis. The inversion system was evaluated over an 8-month period and showed an increase compared to the Hestia CO2 emission estimate, a state-of-the-art building-level emission product, with a 20% increase in the total emissions over the area (from 4.5 to 5.7 Metric Megatons of Carbon +/- 0.23 Metric Megatons of Carbon). However, several key parameters of the inversion system need to be addressed to carefully characterize the spatial distribution of the emissions and the aggregated total emissions. We found that spatial structures in prior emission errors, mostly undetermined, significantly affect the spatial pattern in the inverse solution, as well as the carbon budget over the urban area. Several other parameters of the inversion were sufficiently constrained by additional observations such as the characterization of the GHG boundary inflow and the introduction of hourly transport model errors estimated from the meteorological assimilation system. Finally, we estimated the uncertainties associated with remaining systematic errors and undetermined parameters using an ensemble of inversions. The total CO2 emissions for the Indianapolis urban area based on the ensemble mean and quartiles are 5.26 - 5.91 Metric Megatons of Carbon, i.e., a statistically significant difference compared to the prior total emissions of 4.1 to 4.5 Metric Megatons of Carbon.
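Summarizing an ensemble of inversions by its mean and quartiles, as done above, can be sketched in a few lines; the emission totals below are synthetic stand-ins, not INFLUX results:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical total-emission estimates (Mt C) from an ensemble of inversions,
# each member run with a different choice of the undetermined parameters
totals = rng.normal(loc=5.6, scale=0.4, size=50)

# Ensemble summary: mean plus interquartile range
q1, med, q3 = np.percentile(totals, [25, 50, 75])
print(f"ensemble mean: {totals.mean():.2f} Mt C")
print(f"interquartile range: {q1:.2f} - {q3:.2f} Mt C")

# A prior range lying entirely below Q1 suggests a significant difference
prior_hi = 4.5  # assumed upper bound of a prior estimate, for illustration
print("prior upper bound below Q1:", prior_hi < q1)
```

Reporting quartiles rather than a single standard deviation avoids assuming the ensemble spread is Gaussian.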
We therefore conclude that atmospheric inversions are potentially able to constrain the carbon budget of the city, assuming sufficient data to measure the inflow of GHG over the city, but additional information on prior emissions and their associated error structures is required if we are to determine the spatial structures of urban emissions at high resolution.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20120011274','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20120011274"><span>On the Influence of North Pacific Sea Surface Temperature on the Arctic Winter Climate</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Hurwitz, Margaret M.; Newman, P. A.; Garfinkel, C. I.</p> <p>2012-01-01</p> <p>Differences between two ensembles of Goddard Earth Observing System Chemistry-Climate Model simulations isolate the impact of North Pacific sea surface temperatures (SSTs) on the Arctic winter climate. One ensemble of extended winter season forecasts is forced by unusually high SSTs in the North Pacific, while in the second ensemble SSTs in the North Pacific are unusually low. High-minus-Low differences are consistent with a weakened Western Pacific atmospheric teleconnection pattern, and in particular, a weakening of the Aleutian low. This relative change in tropospheric circulation inhibits planetary wave propagation into the stratosphere, in turn reducing polar stratospheric temperature in mid- and late winter. The number of winters with sudden stratospheric warmings is approximately tripled in the Low ensemble as compared with the High ensemble.
Enhanced North Pacific SSTs, and thus a more stable and persistent Arctic vortex, lead to a relative decrease in lower stratospheric ozone in late winter, affecting the April clear-sky UV index at Northern Hemisphere mid-latitudes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/22416233-forces-stress-second-order-mller-plesset-perturbation-theory-condensed-phase-systems-within-resolution-identity-gaussian-plane-waves-approach','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/22416233-forces-stress-second-order-mller-plesset-perturbation-theory-condensed-phase-systems-within-resolution-identity-gaussian-plane-waves-approach"><span>Forces and stress in second order Møller-Plesset perturbation theory for condensed phase systems within the resolution-of-identity Gaussian and plane waves approach</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Del Ben, Mauro, E-mail: mauro.delben@chem.uzh.ch; Hutter, Jürg, E-mail: hutter@chem.uzh.ch; VandeVondele, Joost, E-mail: Joost.VandeVondele@mat.ethz.ch</p> <p></p> <p>The forces acting on the atoms as well as the stress tensor are crucial ingredients for calculating the structural and dynamical properties of systems in the condensed phase. Here, these derivatives of the total energy are evaluated for the second-order Møller-Plesset perturbation energy (MP2) in the framework of the resolution of identity Gaussian and plane waves method, in a way that is fully consistent with how the total energy is computed. This consistency is non-trivial, given the different ways employed to compute Coulomb, exchange, and canonical four center integrals, and allows, for example, for energy conserving dynamics in various ensembles. Based on this formalism, a massively parallel algorithm has been developed for finite and extended systems.
The designed parallel algorithm displays, with respect to the system size, cubic, quartic, and quintic requirements, respectively, for the memory, communication, and computation. All these requirements are reduced with an increasing number of processes, and the measured performance shows excellent parallel scalability and efficiency up to thousands of nodes. Additionally, the computationally more demanding quintic-scaling steps can be accelerated by employing graphics processing units (GPUs), showing, for large systems, a gain of almost a factor of two compared to the standard central-processing-unit-only case. In this way, the evaluation of the derivatives of the RI-MP2 energy can be performed within a few minutes for systems containing hundreds of atoms and thousands of basis functions. With good time to solution, the implementation thus opens the possibility of performing molecular dynamics (MD) simulations in various ensembles (microcanonical ensemble and isobaric-isothermal ensemble) at the MP2 level of theory.
Geometry optimization, full cell relaxation, and energy conserving MD simulations have been performed for a variety of molecular crystals including NH{sub 3}, CO{sub 2}, formic acid, and benzene.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1411037-internal-variability-dynamically-downscaled-climate-over-north-america','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1411037-internal-variability-dynamically-downscaled-climate-over-north-america"><span>Internal variability of a dynamically downscaled climate over North America</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Wang, Jiali; Bessac, Julie; Kotamarthi, Rao</p> <p></p> <p>This study investigates the internal variability (IV) of a regional climate model, and considers the impacts of horizontal resolution and spectral nudging on the IV. A 16-member simulation ensemble was conducted using the Weather Research and Forecasting (WRF) model for three model configurations. Ensemble members included simulations at spatial resolutions of 50 km and 12 km without spectral nudging and simulations at a spatial resolution of 12 km with spectral nudging. All the simulations were generated over the same domain, which covered much of North America. The degree of IV was measured as the spread between the individual members of the ensemble during the integration period. The IV of the 12 km simulation with spectral nudging was also compared with a future climate change simulation projected by the same model configuration. The variables investigated focus on precipitation and near-surface air temperature. While the IVs show a clear annual cycle with larger values in summer and smaller values in winter, the seasonal IV is smaller for a 50-km spatial resolution than for a 12-km resolution when nudging is not applied.
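Measuring IV as the spread between ensemble members, as described above, amounts to taking the inter-member standard deviation per time step and grid cell. A minimal sketch on a random synthetic ensemble (not WRF output; the dimensions are assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical ensemble: 16 members x 12 months x (ny, nx) grid of temperature (K)
members, months, ny, nx = 16, 12, 20, 30
temp = rng.normal(loc=280.0, scale=2.0, size=(members, months, ny, nx))

# Internal variability: spread (sample std dev) across members, per month and cell
iv = temp.std(axis=0, ddof=1)       # shape (12, 20, 30)
iv_cycle = iv.mean(axis=(1, 2))     # domain-mean IV for each month (annual cycle)

print("IV field shape:", iv.shape)
print("annual-cycle IV (K):", np.round(iv_cycle, 2))
```

Because the members differ only in initial conditions, this spread isolates chaotic internal variability rather than forcing or parameter differences.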
Applying a nudging technique to the 12-km simulation reduces the IV by a factor of two, and produces smaller IV than the simulation at 50 km without nudging. Applying a nudging technique also changes the geographic distributions of IV in all examined variables. The IV is much smaller than the inter-annual variability at seasonal scales for regionally averaged temperature and precipitation. The IV is also smaller than the projected changes in air temperature for the mid- and late 21st century. However, the IV is larger than the projected changes in precipitation for the mid- and late 21st century.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMGC13C1209E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMGC13C1209E"><span>Future Climate Change Impact Assessment of River Flows at Two Watersheds of Peninsular Malaysia</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ercan, A.; Ishida, K.; Kavvas, M. L.; Chen, Z. R.; Jang, S.; Amin, M. Z. M.; Shaaban, A. J.</p> <p>2016-12-01</p> <p>Impacts of climate change on river flows under future climate conditions were assessed over the Muda and Dungun watersheds of Peninsular Malaysia by means of a coupled regional climate model and a physically-based hydrology model utilizing an ensemble of 15 different future climate realizations. Coarse-resolution GCMs' future projections covering a wide range of emission scenarios were dynamically downscaled to 6 km resolution over the study area.
Hydrologic simulations of the two selected watersheds were carried out at hillslope-scale and at hourly increments.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMGC23A1040E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMGC23A1040E"><span>Climate Change Impact Assessment of Hydro-Climate in Southern Peninsular Malaysia</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ercan, A.; Ishida, K.; Kavvas, M. L.; Chen, Z. R.; Jang, S.; Amin, M. Z. M.; Shaaban, A. J.</p> <p>2017-12-01</p> <p>Impacts of climate change on the hydroclimate of the coastal region in the south of Peninsular Malaysia in the 21st century were assessed by means of a regional climate model utilizing an ensemble of 15 different future climate realizations. Coarse-resolution Global Climate Models' future projections covering four emission scenarios based on Coupled Model Intercomparison Project phase 3 (CMIP3) datasets were dynamically downscaled to 6 km resolution over the study area. The analyses were made in terms of rainfall, air temperature, evapotranspiration, and soil water storage.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017GPC...148...96R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017GPC...148...96R"><span>Ensemble climate projections of mean and extreme rainfall over Vietnam</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Raghavan, S. V.; Vu, M. T.; Liong, S. Y.</p> <p>2017-01-01</p> <p>A systematic ensemble high-resolution climate modelling study over Vietnam has been performed using the PRECIS model developed by the Hadley Centre in the UK.
A 5-member subset of the 17-member Perturbed Physics Ensembles (PPE) of the Quantifying Uncertainty in Model Predictions (QUMP) project was simulated and analyzed. The PRECIS model simulations were conducted at a horizontal resolution of 25 km for the baseline period 1961-1990 and a future climate period 2061-2090 under scenario A1B. The results of model simulations show that the model was able to reproduce the mean state of climate over Vietnam when compared to observations. The annual cycles and seasonal averages of precipitation over different sub-regions of Vietnam also demonstrate the model's ability to reproduce the observed peak and magnitude of monthly rainfall. The climate extremes of precipitation were also fairly well captured. Projections of future climate show both increases and decreases in the mean climate over different regions of Vietnam. The analyses of future extreme rainfall using the STARDEX precipitation indices show an increase in 90th percentile precipitation (P90p) over the northern provinces (15-25%) and central highlands (5-10%) and over southern Vietnam (up to 5%). The total number of wet days (Prcp) indicates a decrease of about 5-10% all over Vietnam. Consequently, an increase in wet-day rainfall intensity (SDII) is likely, implying that the projected rainfall would be more severe and intense, with the potential to cause flooding in some regions. Risks due to extreme drought also exist in other regions where the number of wet days decreases. In addition, the maximum 5-day consecutive rainfall (R5d) increases by 20-25% over northern Vietnam but decreases by a similar amount over central and southern Vietnam.
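Indices of the kind reported above (a P90p-style heavy-rain threshold, SDII, R5d) can be computed from a daily rainfall series along the following lines; the series and the 1 mm wet-day threshold are illustrative assumptions, not the STARDEX implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical daily rainfall series (mm/day), roughly ten years of days
rain = rng.gamma(shape=0.4, scale=12.0, size=3600)

WET = 1.0  # wet-day threshold (mm/day), a common convention

wet = rain[rain >= WET]
p90p = np.percentile(wet, 90)   # heavy-rainfall threshold (P90p-style index)
sdii = wet.mean()               # simple daily intensity index (SDII): mean wet-day rain
# R5d: maximum rainfall total over any 5 consecutive days
r5d = max(rain[i:i + 5].sum() for i in range(len(rain) - 4))

print(f"P90p ~ {p90p:.1f} mm/day, SDII ~ {sdii:.1f} mm/day, R5d ~ {r5d:.1f} mm")
```

Projected changes in such indices are then simply the index computed on the future series minus (or divided by) the baseline value.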
These results have strong implications for the management of water resources, agriculture, biodiversity and the economy, and serve as useful findings for policy makers to consider within a wider range of climate uncertainties.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.7768S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.7768S"><span>Coupled lagged ensemble weather- and river runoff prediction in complex Alpine terrain</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Smiatek, Gerhard; Kunstmann, Harald; Werhahn, Johannes</p> <p>2013-04-01</p> <p>It is still a challenge to predict the fast-reacting streamflow response to precipitation in Alpine terrain. Civil protection measures require flood predictions with 24-48 h lead time. This holds particularly true for the Ammer River region, which was affected by century floods in 1999, 2003 and 2005. Since 2005, a coupled NWP/hydrology model system has been operated to simulate and predict Ammer River discharges. The Ammer River catchment is located in the Bavarian Ammergau Alps and alpine forelands, Germany. With elevations reaching 2185 m and annual mean precipitation between 1100 and 2000 mm, it represents a very demanding test ground for a river runoff prediction system. The one-way coupled system utilizes a lagged ensemble prediction system (EPS) that combines recent and previous NWP forecasts. The major components of the system are the MM5 NWP model, run at 3.5 km resolution and initialized twice a day, the hydrology model WaSiM-ETH, run at 100 m resolution, and the Perl Object Environment (POE), which implements the networking and system operation.
Results obtained in the years 2005-2012 reveal that river runoff simulations already show high correlation with observed runoff (NSC in the range 0.53-0.95) in retrospective runs driven by monitored meteorological data, but suffer from errors in the quantitative precipitation forecast (QPF) of the employed numerical weather prediction model. We evaluate the NWP model accuracy, especially the precipitation intensity, frequency and location, and focus on the performance gain from bias adjustment procedures. We show how the enhanced QPF data help to reduce the uncertainty in the discharge prediction. In addition to the HND (Hochwassernachrichtendienst Bayern) observations, TERENO long-term observatory hydrometeorological data have been available since 2011. They are used to evaluate the NWP performance and to set up a bias correction procedure based on ensemble postprocessing applying Bayesian model averaging (BMA). We first briefly present the technical setup of the operational coupled lagged NWP/hydrology model system and then focus on the evaluation of the NWP model, the BMA-enhanced QPF and its application within the Ammer simulation system in the period 2011-2012.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFM.H41E0866E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFM.H41E0866E"><span>A Novel approach for monitoring cyanobacterial blooms using an ensemble based system from MODIS imagery downscaled to 250 metres spatial resolution</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>El Alem, A.; Chokmani, K.; Laurion, I.; El-Adlouni, S.
E.</p> <p>2014-12-01</p> <p>Because inland freshwaters are sensitive to harmful algal bloom (HAB) development and standard monitoring programs have limited coverage, remote sensing data have become increasingly used for monitoring HAB extent. Usually, HAB monitoring using remote sensing data is based on empirical and semi-empirical models. Development of such models requires a great number of continuous in situ measurements to reach an acceptable accuracy. However, ministries and water management organizations often use two thresholds, established by the World Health Organization, to determine water quality. Consequently, the available data are ordinal («semi-qualitative») and mostly unexploited. Use of such databases with remote sensing data and statistical classification algorithms can produce hazard management maps linked to the presence of cyanobacteria. Unlike standard classification algorithms, which are generally unstable, classifiers based on ensemble systems are more general and stable. In the present study, an ensemble-based classifier was developed and compared to a standard classification method called CART (Classification and Regression Tree) in a context of HAB monitoring in freshwaters using MODIS images downscaled to 250 m spatial resolution and ordinal in situ data. Calibration and validation data on cyanobacteria densities were collected by the Ministère du Développement durable, de l'Environnement et de la Lutte contre les changements climatiques on 22 water bodies between 2000 and 2010. These data comprise three density classes: waters poorly (< 20,000 cells mL-1), moderately (20,000 - 100,000 cells mL-1), and highly (> 100,000 cells mL-1) loaded with cyanobacteria. Results highlighted that inland waters exhibit different spectral responses, allowing them to be classified into the three above classes for water quality monitoring.
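The Kappa-index used to compare such classifiers is Cohen's kappa, which corrects raw accuracy for chance agreement. A minimal sketch on hypothetical labels for the three density classes (not the study's data):

```python
import numpy as np

def cohen_kappa(y_true, y_pred, n_classes):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    cm = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1                              # confusion matrix: rows true, cols predicted
    n = cm.sum()
    p_o = np.trace(cm) / n                         # observed agreement
    p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical labels for the three classes (0 = low, 1 = moderate, 2 = high density)
y_true = np.array([0, 0, 1, 1, 2, 2, 0, 1, 2, 2])
y_pred = np.array([0, 0, 1, 2, 2, 2, 0, 1, 2, 1])
print(round(cohen_kappa(y_true, y_pred, 3), 2))
```

Kappa of 1 means perfect agreement and 0 means no better than chance, which is why it is preferred over raw accuracy for imbalanced class problems like bloom detection.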
On the other hand, although the accuracy (Kappa index = 0.86) of the proposed approach is slightly lower than that of the CART algorithm (Kappa index = 0.87), its robustness is higher, with a standard deviation of 0.05 versus 0.06, specifically when applied to MODIS images. A new accurate, robust, and quick approach is thus proposed for daily near-real-time monitoring of HAB in southern Quebec freshwaters.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_20 --> <div id="page_21" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="401"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ClDy...48..745F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ClDy...48..745F"><span>Variability of hydrological extreme events in East Asia and their dynamical control: a comparison between observations and two
high-resolution global climate models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Freychet, N.; Duchez, A.; Wu, C.-H.; Chen, C.-A.; Hsu, H.-H.; Hirschi, J.; Forryan, A.; Sinha, B.; New, A. L.; Graham, T.; Andrews, M. B.; Tu, C.-Y.; Lin, S.-J.</p> <p>2017-02-01</p> <p>This work investigates the variability of extreme weather events (drought spells, DS15, and daily heavy rainfall, PR99) over East Asia. It particularly focuses on the large-scale atmospheric circulation associated with high occurrence levels of these extreme events. Two observational datasets (APHRODITE and PERSIANN) are compared with two high-resolution global climate models (HiRAM and HadGEM3-GC2) and an ensemble of other lower-resolution climate models from CMIP5. We first evaluate the performance of the high-resolution models. They both exhibit good skill in reproducing extreme events, especially when compared with CMIP5 results. Significant differences exist between the two observational datasets, highlighting the difficulty of obtaining a clear estimate of extreme events. The link between the variability of the extremes and the large-scale circulation is investigated, on monthly and interannual timescales, using composite and correlation analyses. Both extreme indices, DS15 and PR99, are significantly linked to the low-level wind intensity over East Asia, i.e. the monsoon circulation. It is also found that DS15 events are strongly linked to the surface temperature over the Siberian region and to the land-sea pressure contrast, while PR99 events are linked to the sea surface temperature anomalies over the West North Pacific. These results illustrate the importance of the monsoon circulation on extremes over East Asia.
The dependence on surface temperature over the continent and on sea surface temperature raises the question of the extent to which they could affect the occurrence of extremes over tropical regions in future projections.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1816808N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1816808N"><span>A framework for probabilistic pluvial flood nowcasting for urban areas</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ntegeka, Victor; Murla, Damian; Wang, Lipen; Foresti, Loris; Reyniers, Maarten; Delobbe, Laurent; Van Herk, Kristine; Van Ootegem, Luc; Willems, Patrick</p> <p>2016-04-01</p> <p>Pluvial flood nowcasting is gaining ground not least because of the advancements in rainfall forecasting schemes. Short-term forecasts and applications have benefited from the availability of such forecasts with high resolution in space (~1km) and time (~5min). In this regard, it is vital to evaluate the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System (STEPS), which was originally co-developed by the UK Met Office and the Australian Bureau of Meteorology. The scheme was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) a sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS-BE forecasts are provided at high resolution (1km/5min) with 20 ensemble members and a lead time of up to 2 hours, using a composite of 4 C-band radars as input.
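An ensemble nowcast like the 20-member product described above is typically turned into a probabilistic field by computing, per grid cell, the fraction of members exceeding an intensity threshold. A sketch on synthetic fields (the grid size and the 10 mm/h threshold are assumptions, not STEPS-BE settings):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical 20-member rainfall nowcast for one lead time over a small grid (mm/h)
members, ny, nx = 20, 10, 10
nowcast = rng.gamma(shape=0.5, scale=8.0, size=(members, ny, nx))

THRESH = 10.0  # assumed intensity above which sewer flooding becomes plausible

# Probabilistic product: per-cell fraction of ensemble members above the threshold
p_exceed = (nowcast > THRESH).mean(axis=0)

print("exceedance probability, domain max:", p_exceed.max())
print("cells with P > 0.5:", int((p_exceed > 0.5).sum()))
```

Feeding each member (rather than the ensemble mean) through the sewer and inundation models, then aggregating, preserves the spread that the exceedance map summarizes.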
Forecast verification was performed over the cities of Leuven and Ghent, and biases were found to be small. The hydraulic model consists of the 1D sewer network and an innovative 'nested' 2D surface model to model 2D urban surface inundations at high resolution. The surface components are categorized into three groups and each group is modelled using triangular meshes at different resolutions; these include streets (3.75 - 15 m2), high flood hazard areas (12.5 - 50 m2) and low flood hazard areas (75 - 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires given to people in the region who were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based on spatial interpolation of flood inundation. The method has been implemented and tested for the villages Oostakker and Sint-Amandsberg, which are part of the larger city of Ghent, Belgium. After each of the above-mentioned components was evaluated, they were combined and tested for recent historical flood events. The rainfall nowcasting, hydraulic sewer and 2D inundation modelling and socio-economic flood risk results each could be partly evaluated: the rainfall nowcasting results based on radar data and rain gauges; the hydraulic sewer model results based on water level and discharge data at pumping stations; the 2D inundation modelling results based on limited data on some recent flood locations and inundation depths; the results for the socio-economic flood consequences of the most extreme events based on claims in the database of the national disaster agency.
Different methods for visualization of the probabilistic inundation results are proposed and tested.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=musicality&pg=7&id=EJ231418','ERIC'); return false;" href="https://eric.ed.gov/?q=musicality&pg=7&id=EJ231418"><span>Relationships among Ensemble Participation, Private Instruction, and Aural Skill Development.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>May, William V.; Elliott, Charles A.</p> <p>1980-01-01</p> <p>This study sought to determine the relationships that exist among junior high school students' participation in school performing ensembles, those skills measured by the Gaston Test of Musicality, and the number of years of private study on the piano or on ensemble instruments. (Author/SJL)</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22759391','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22759391"><span>Minimalist ensemble algorithms for genome-wide protein localization prediction.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Lin, Jhih-Rong; Mondal, Ananda Mohan; Liu, Rong; Hu, Jianjun</p> <p>2012-07-03</p> <p>Computational prediction of protein subcellular localization can greatly help to elucidate its functions. Despite the existence of dozens of protein localization prediction algorithms, the prediction accuracy and coverage are still low. Several ensemble algorithms have been proposed to improve the prediction performance, which usually include as many as 10 or more individual localization algorithms. 
However, their performance is still limited by running complexity and redundancy among the individual prediction algorithms. This paper proposes a novel method for the rational design of minimalist ensemble algorithms for practical genome-wide protein subcellular localization prediction. The algorithm is based on combining a feature-selection-based filter with a logistic regression classifier. Using a novel concept of contribution scores, we analyzed issues of algorithm redundancy, consensus mistakes, and algorithm complementarity in designing ensemble algorithms. We applied the proposed minimalist logistic regression (LR) ensemble algorithm to two genome-wide datasets of Yeast and Human and compared its performance with current ensemble algorithms. Experimental results showed that the minimalist ensemble algorithm can achieve high prediction accuracy with only 1/3 to 1/2 of the individual predictors used by current ensemble algorithms, which greatly reduces computational complexity and running time. It was found that high-performance ensemble algorithms are usually composed of predictors that together cover most of the available features. Compared to the best individual predictor, our ensemble algorithm improved the prediction accuracy from an AUC score of 0.558 to 0.707 for the Yeast dataset and from 0.628 to 0.646 for the Human dataset. Compared with popular weighted-voting-based ensemble algorithms, our classifier-based ensemble algorithms achieved much better performance without suffering from the inclusion of too many individual predictors. We proposed a method for the rational design of minimalist ensemble algorithms using feature selection and classifiers. The proposed minimalist ensemble algorithm based on logistic regression can achieve equal or better prediction performance while using only one-third to one-half of the individual predictors required by other ensemble algorithms. 
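The combiner at the heart of such a classifier-based ensemble can be sketched as a plain logistic regression over base-predictor outputs. The base-predictor votes, toy labels, and training loop below are illustrative assumptions, not the paper's implementation:

```python
# Toy sketch: a logistic regression combiner over the 0/1 outputs of a few
# base predictors, fitted by plain stochastic gradient descent on log-loss.
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit weights (last entry is the bias) of a logistic regression."""
    d = len(X[0])
    w = [0.0] * (d + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + w[-1]
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - yi                      # gradient of log-loss w.r.t. z
            for j in range(d):
                w[j] -= lr * g * xi[j]
            w[-1] -= lr * g
    return w

def predict(w, xi):
    z = sum(wj * xj for wj, xj in zip(w[:-1], xi)) + w[-1]
    return 1 if z > 0 else 0

# Hypothetical votes of three base localization predictors on 8 proteins:
# predictor 1 is informative, predictors 2-3 are uninformative noise.
X = [[1, 0, 1], [1, 1, 0], [1, 0, 0], [1, 1, 1],
     [0, 1, 0], [0, 0, 1], [0, 1, 1], [0, 0, 0]]
y = [1, 1, 1, 1, 0, 0, 0, 0]
w = train_logistic(X, y)
accuracy = sum(predict(w, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

The learned weights play the role of the paper's contribution scores in miniature: an uninformative or redundant predictor receives a weight near zero and could be dropped from the ensemble.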
The results also suggested that meta-predictors that take advantage of a variety of features by combining individual predictors tend to achieve the best performance. The LR ensemble server and related benchmark datasets are available at http://mleg.cse.sc.edu/LRensemble/cgi-bin/predict.cgi.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19990089293&hterms=behavior+modification&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Dbehavior%2Bmodification','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19990089293&hterms=behavior+modification&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D10%26Ntt%3Dbehavior%2Bmodification"><span>The Behavior of Filters and Smoothers for Strongly Nonlinear Dynamics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Zhu, Yanqiu; Cohn, Stephen E.; Todling, Ricardo</p> <p>1999-01-01</p> <p>The Kalman filter is the optimal filter in the presence of known Gaussian error statistics and linear dynamics. Filter extension to nonlinear dynamics is nontrivial in the sense of appropriately representing high-order moments of the statistics. Monte Carlo (ensemble-based) methods have been advocated as the methodology for representing high-order moments without any questionable closure assumptions (e.g., Miller 1994). Investigation along these lines has been conducted for highly idealized dynamics such as the strongly nonlinear Lorenz (1963) model as well as more realistic models of the oceans (Evensen and van Leeuwen 1996) and atmosphere (Houtekamer and Mitchell 1998). A few relevant issues in this context are the number of ensemble members necessary to properly represent the error statistics, and the modifications to the usual filter equations necessary to correctly update the ensemble members (Burgers 1998). The ensemble technique has also been applied to the problem of smoothing, for which similar questions apply. Ensemble smoother examples, however, seem quite puzzling in that the state estimates are worse than those of their filter analogues (Evensen 1997). 
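The ensemble filter update at issue here can be sketched, in the perturbed-observation form of Burgers (1998), for a scalar state that is observed directly. Observation perturbations are passed in explicitly so the toy example is deterministic; in practice they are drawn from the observation-error distribution:

```python
# Minimal sketch of a stochastic ensemble Kalman filter analysis step for a
# scalar state with a direct observation (H = 1), after Burgers et al. 1998.

def enkf_update(ensemble, obs, obs_var, perturbations):
    """Update each forecast member toward a (perturbed) observation."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    # sample variance of the forecast ensemble
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_var)          # Kalman gain for H = 1
    return [x + gain * (obs + e - x)
            for x, e in zip(ensemble, perturbations)]

forecast = [1.0, 2.0, 3.0]                # sample variance = 1.0
analysis = enkf_update(forecast, obs=4.0, obs_var=1.0,
                       perturbations=[0.0, 0.0, 0.0])
# gain = 0.5, so analysis = [2.5, 3.0, 3.5]
```

With zero perturbations every member is pulled halfway toward the observation; the random perturbations are exactly the modification Burgers showed is needed for the updated ensemble spread to match the Kalman analysis covariance.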
In this study, we use concepts in probability theory to revisit the ensemble methodology for filtering and smoothing in data assimilation. We use the Lorenz (1963) model to test and compare the behavior of a variety of implementations of ensemble filters. We also implement ensemble smoothers that are able to perform better than their filter counterparts. A discussion of the feasibility of applying these techniques to large data assimilation problems will be given at the time of the conference.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28830107','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28830107"><span>epiDMS: Data Management and Analytics for Decision-Making From Epidemic Spread Simulation Ensembles.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Liu, Sicong; Poccia, Silvestro; Candan, K Selçuk; Chowell, Gerardo; Sapino, Maria Luisa</p> <p>2016-12-01</p> <p>Carefully calibrated large-scale computational models of epidemic spread represent a powerful tool to support the decision-making process during epidemic emergencies. Epidemic models are being increasingly used for generating forecasts of the spatial-temporal progression of epidemics at different spatial scales and for assessing the likely impact of different intervention strategies. However, the management and analysis of simulation ensembles stemming from large-scale computational models pose challenges, particularly when dealing with multiple interdependent parameters, spanning multiple layers and geospatial frames, affected by complex dynamic processes operating at different resolutions. 
We describe and illustrate with examples a novel epidemic simulation data management system, epiDMS, that was developed to address the challenges that arise from the need to generate, search, visualize, and analyze, in a scalable manner, large volumes of epidemic simulation ensembles and observations during the progression of an epidemic. epiDMS is a publicly available system that facilitates management and analysis of large epidemic simulation ensembles. epiDMS aims to fill an important hole in decision-making during healthcare emergencies by enabling critical services with significant economic and health impact.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25998277','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25998277"><span>Tracking individual membrane proteins and their biochemistry: The power of direct observation.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Barden, Adam O; Goler, Adam S; Humphreys, Sara C; Tabatabaei, Samaneh; Lochner, Martin; Ruepp, Marc-David; Jack, Thomas; Simonin, Jonathan; Thompson, Andrew J; Jones, Jeffrey P; Brozik, James A</p> <p>2015-11-01</p> <p>The advent of single molecule fluorescence microscopy has allowed experimental molecular biophysics and biochemistry to transcend traditional ensemble measurements, where the behavior of individual proteins could not be precisely sampled. 
The recent explosion in popularity of new super-resolution and super-localization techniques, coupled with technical advances in optical designs and fast, highly sensitive cameras with single-photon sensitivity and millisecond time resolution, has made it possible to track key motions, reactions, and interactions of individual proteins with high temporal resolution and spatial resolution well beyond the diffraction limit. Within the purview of membrane proteins and ligand gated ion channels (LGICs), these outstanding advances in single molecule microscopy allow for the direct observation of discrete biochemical states and their fluctuation dynamics. Such observations are fundamentally important for understanding molecular-level mechanisms governing these systems. Examples reviewed here include the effects of allostery on the stoichiometry of ligand binding in the presence of fluorescent ligands; the observation of subdomain partitioning of membrane proteins due to microenvironment effects; and the use of single particle tracking experiments to elucidate characteristics of membrane protein diffusion and the direct measurement of thermodynamic properties, which govern the free energy landscape of protein dimerization. The review of such characteristic topics represents a snapshot of efforts to push the boundaries of fluorescence microscopy of membrane proteins to the absolute limit. This article is part of the Special Issue entitled 'Fluorescent Tools in Neuropharmacology'.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4553757','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4553757"><span>Concrete ensemble Kalman filters with rigorous catastrophic filter divergence</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Kelly, David; Majda, Andrew J.; Tong, Xin T.</p> <p>2015-01-01</p> <p>The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature. 
PMID:26261335</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMGC24D..03S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMGC24D..03S"><span>Making decisions based on an imperfect ensemble of climate simulators: strategies and future directions</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Sanderson, B. M.</p> <p>2017-12-01</p> <p>The CMIP ensembles represent the most comprehensive source of information available to decision-makers for climate adaptation, yet it is clear that there are fundamental limitations in our ability to treat the ensemble as an unbiased sample of possible future climate trajectories. There is considerable evidence that models are not independent, and increasing complexity and resolution combined with computational constraints prevent a thorough exploration of parametric uncertainty or internal variability. Although more data than ever are available for calibration, the optimization of each model is influenced by institutional priorities, historical precedent and available resources. The resulting ensemble thus represents a miscellany of climate simulators that defies traditional statistical interpretation. Models are in some cases interdependent, but are sufficiently complex that the degree of interdependency is conditional on the application. Configurations have been updated using available observations to some degree, but not in a consistent or easily identifiable fashion. This means that the ensemble cannot be viewed as a true posterior distribution updated by available data, nor can observational data alone be used to assess individual model likelihood. 
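One common response to such model interdependence is to downweight near-duplicate simulators while rewarding skill. The sketch below is a generic skill-and-independence weighting in the spirit of such schemes; the distances and radii are hypothetical, and this is not the actual weighting code used for any assessment:

```python
# Toy sketch: weight ensemble members by skill (distance to observations)
# and independence (distance to other members). All numbers are invented.
import math

def model_weights(d_obs, d_model, radius_q, radius_u):
    """Normalised weights from obs distances d_obs[i] and
    inter-model distances d_model[i][j]."""
    n = len(d_obs)
    w = []
    for i in range(n):
        skill = math.exp(-(d_obs[i] / radius_q) ** 2)
        # similarity to every other model reduces the independence weight
        similar = sum(math.exp(-(d_model[i][j] / radius_u) ** 2)
                      for j in range(n) if j != i)
        w.append(skill / (1.0 + similar))
    total = sum(w)
    return [wi / total for wi in w]

d_obs = [0.5, 1.5, 0.5]              # models 0 and 2 fit observations best...
d_model = [[0.0, 2.0, 0.1],          # ...but are near-duplicates of each other
           [2.0, 0.0, 2.0],
           [0.1, 2.0, 0.0]]
weights = model_weights(d_obs, d_model, radius_q=1.0, radius_u=1.0)
```

The duplicate pair shares its skill credit (models 0 and 2 get identical, individually reduced weights), which is precisely the behaviour that naive model-democracy averaging lacks.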
We assess recent literature for combining projections from an imperfect ensemble of climate simulators. Beginning with our published methodology for addressing model interdependency and skill in the weighting scheme for the 4th US National Climate Assessment, we consider strategies for incorporating process-based constraints on future response, perturbed parameter experiments and multi-model output into an integrated framework. We focus on a number of guiding questions: Is the traditional framework of confidence in projections inferred from model agreement leading to biased or misleading conclusions? Can the benefits of upweighting skillful models be reconciled with the increased risk of truth lying outside the weighted ensemble distribution? If CMIP is an ensemble of partially informed best-guesses, can we infer anything about the parent distribution of all possible models of the climate system (and if not, are we implicitly under-representing the risk of a climate catastrophe outside of the envelope of CMIP simulations)?</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015PhRvX...5a1001A','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015PhRvX...5a1001A"><span>Atomic-Scale Nuclear Spin Imaging Using Quantum-Assisted Sensors in Diamond</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ajoy, A.; Bissbort, U.; Lukin, M. D.; Walsworth, R. L.; Cappellaro, P.</p> <p>2015-01-01</p> <p>Nuclear spin imaging at the atomic level is essential for the understanding of fundamental biological phenomena and for applications such as drug discovery. The advent of novel nanoscale sensors promises to achieve the long-standing goal of single-protein, high spatial-resolution structure determination under ambient conditions. 
In particular, quantum sensors based on the spin-dependent photoluminescence of nitrogen-vacancy (NV) centers in diamond have recently been used to detect nanoscale ensembles of external nuclear spins. While NV sensitivity is approaching single-spin levels, extracting relevant information from a very complex structure is a further challenge since it requires not only the ability to sense the magnetic field of an isolated nuclear spin but also to achieve atomic-scale spatial resolution. Here, we propose a method that, by exploiting the coupling of the NV center to an intrinsic quantum memory associated with the nitrogen nuclear spin, can reach a tenfold improvement in spatial resolution, down to atomic scales. The spatial resolution enhancement is achieved through coherent control of the sensor spin, which creates a dynamic frequency filter selecting only a few nuclear spins at a time. We propose and analyze a protocol that would allow not only sensing individual spins in a complex biomolecule, but also unraveling couplings among them, thus elucidating local characteristics of the molecule structure.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29394278','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29394278"><span>Resolution of ranking hierarchies in directed networks.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Letizia, Elisa; Barucca, Paolo; Lillo, Fabrizio</p> <p>2018-01-01</p> <p>Identifying hierarchies and rankings of nodes in directed graphs is fundamental in many applications such as social network analysis, biology, economics, and finance. A recently proposed method identifies the hierarchy by finding the ordered partition of nodes which minimises a score function, termed agony. 
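For intuition, an agony-style score can be computed directly from node ranks by penalising every edge that fails to go strictly upward in rank. The penalty max(0, r(u) - r(v) + 1) used below is a common convention and an assumption here, not necessarily the paper's exact formulation:

```python
# Toy sketch of an agony score over a directed graph: an edge (u, v) is
# "forward" if rank[u] < rank[v]; otherwise it is penalised in proportion
# to how badly it violates the hierarchy.

def agony(edges, rank):
    """Total penalty of hierarchy-violating edges for a given ranking."""
    return sum(max(0, rank[u] - rank[v] + 1) for u, v in edges)

rank = {"a": 0, "b": 1, "c": 2}
edges_ok = [("a", "b"), ("b", "c"), ("a", "c")]   # all respect the hierarchy
# agony(edges_ok, rank) -> 0; a single back-edge ("c", "a") would cost 3
```

Minimising this score over all orderings of the nodes recovers the hierarchy; the resolution limit discussed in the record concerns when the minimiser fails to separate small, weakly ordered classes.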
This function penalises the links violating the hierarchy in a way that depends on the strength of the violation. To investigate the resolution of ranking hierarchies, we introduce an ensemble of random graphs, the Ranked Stochastic Block Model. We find that agony may fail to identify hierarchies when the structure is not strong enough and the size of the classes is small with respect to the whole network. We analytically characterise the resolution threshold, and we show that an iterated version of agony can partly overcome this resolution limit.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1336178-vertically-grown-nanowire-crystals-dibenzotetrathienocoronene-dbttc-large-area-graphene','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1336178-vertically-grown-nanowire-crystals-dibenzotetrathienocoronene-dbttc-large-area-graphene"><span>Vertically grown nanowire crystals of dibenzotetrathienocoronene (DBTTC) on large-area graphene</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Kim, B.; Chiu, C. -Y.; Kang, S. J.; ...</p> <p>2016-06-01</p> <p>Here we demonstrate controlled growth of vertical organic crystal nanowires on single-layer graphene. Using scanning electron microscopy (SEM), high-resolution transmission electron microscopy (TEM), and grazing-incidence X-ray diffraction (GIXD), we probe the microstructure and morphology of dibenzotetrathienocoronene (DBTTC) nanowires epitaxially grown on graphene. The investigation is performed at both the ensemble and single-nanowire level, and as a function of growth parameters, providing insight into and control over the formation mechanism. 
Finally, the size, density and height of the nanowires can be tuned via growth conditions, opening new avenues for tailoring three-dimensional (3-D) nanostructured architectures for organic electronics with improved functional performance.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/22093433-photon-number-discrimination-without-photon-counter-its-application-reconstructing-non-gaussian-states','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/22093433-photon-number-discrimination-without-photon-counter-its-application-reconstructing-non-gaussian-states"><span>Photon-number discrimination without a photon counter and its application to reconstructing non-Gaussian states</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Chrzanowski, H. M.; Bernu, J.; Sparkes, B. M.</p> <p>2011-11-15</p> <p>The nonlinearity of a conditional photon-counting measurement can be used to "de-Gaussify" a Gaussian state of light. Here we present and experimentally demonstrate a technique for photon-number resolution using only homodyne detection. We then apply this technique to inform a conditional measurement, unambiguously reconstructing the statistics of the non-Gaussian one- and two-photon-subtracted squeezed vacuum states. 
Although our photon-number measurement relies on ensemble averages and cannot be used to prepare non-Gaussian states of light, its high efficiency, photon-number-resolving capability, and compatibility with the telecommunications band make it well suited to quantum-information tasks that rely on mean values of measurement outcomes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/14630222','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/14630222"><span>Intracellular applications of fluorescence correlation spectroscopy: prospects for neuroscience.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kim, Sally A; Schwille, Petra</p> <p>2003-10-01</p> <p>Based on time-averaging fluctuation analysis of small fluorescent molecular ensembles in equilibrium, fluorescence correlation spectroscopy has recently been applied to investigate processes in the intracellular milieu. The exquisite sensitivity of fluorescence correlation spectroscopy provides access to a multitude of measurement parameters (rates of diffusion, local concentration, states of aggregation and molecular interactions) in real time with fast temporal and high spatial resolution. 
The introduction of dual-color cross-correlation, imaging, two-photon excitation, and coincidence analysis coupled with fluorescence correlation spectroscopy has expanded the utility of the technique to encompass a wide range of promising applications in living cells that may provide unprecedented insight into understanding the molecular mechanisms of intracellular neurobiological processes.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EL....11520012G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EL....11520012G"><span>Single-shot imaging of trapped Fermi gas</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gajda, Mariusz; Mostowski, Jan; Sowiński, Tomasz; Załuska-Kotur, Magdalena</p> <p>2016-07-01</p> <p>Recently developed techniques allow for simultaneous measurements of the positions of all ultra-cold atoms in a trap with high resolution. Each such single-shot experiment detects one element of the quantum ensemble formed by the cloud of atoms. Repeated single-shot measurements can be used to determine all correlations between particle positions as opposed to standard measurements that determine particle density or two-particle correlations only. In this paper we discuss the possible outcomes of such single-shot measurements in the case of cloud of ultra-cold noninteracting Fermi atoms. 
We show that the Pauli exclusion principle alone leads to correlations between particle positions that originate from unexpected spatial structures formed by the atoms.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=ensemble&pg=4&id=EJ1009350','ERIC'); return false;" href="https://eric.ed.gov/?q=ensemble&pg=4&id=EJ1009350"><span>Programming in the Zone: Repertoire Selection for the Large Ensemble</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Hopkins, Michael</p> <p>2013-01-01</p> <p>One of the great challenges ensemble directors face is selecting high-quality repertoire that matches the musical and technical levels of their ensembles. Thoughtful repertoire selection can lead to increased student motivation as well as greater enthusiasm for the music program from parents, administrators, teachers, and community members. Common…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/18345086','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/18345086"><span>Lidar inelastic multiple-scattering parameters of cirrus particle ensembles determined with geometrical-optics crystal phase functions.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Reichardt, J; Hess, M; Macke, A</p> <p>2000-04-20</p> <p>Multiple-scattering correction factors for cirrus particle extinction coefficients measured with Raman and high spectral resolution lidars are calculated with a radiative-transfer model. Cirrus particle-ensemble phase functions are computed from single-crystal phase functions derived in a geometrical-optics approximation. Seven crystal types are considered. 
In cirrus clouds with height-independent particle extinction coefficients the general pattern of the multiple-scattering parameters has a steep onset at cloud base with values of 0.5-0.7 followed by a gradual and monotonic decrease to 0.1-0.2 at cloud top. The larger the scattering particles are, the more gradual is the rate of decrease. Multiple-scattering parameters of complex crystals and of imperfect hexagonal columns and plates can be well approximated by those of projected-area equivalent ice spheres, whereas perfect hexagonal crystals show values as much as 70% higher than those of spheres. The dependencies of the multiple-scattering parameters on cirrus particle spectrum, base height, and geometric depth, and on the lidar parameters (laser wavelength and receiver field of view), are discussed, and a set of multiple-scattering parameter profiles for the correction of extinction measurements in homogeneous cirrus is provided.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_21 --> <div id="page_22" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="421"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013JGRD..118.9804L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013JGRD..118.9804L"><span>A multi-resolution ensemble study of a tropical urban environment and its interactions with the background regional atmosphere</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Li, Xian-Xiang; Koh, Tieh-Yong; Entekhabi, Dara; Roth, Matthias; Panda, Jagabandhu; Norford, Leslie K.</p> <p>2013-09-01</p> <p>This study employed the Weather Research and Forecasting model with a single-layer urban canopy model to investigate the urban environment of a tropical city, Singapore. The coupled model was evaluated against available observational data from a sensor network and flux tower. The effects of land use type and anthropogenic heat (AH) on the thermal and wind environment were investigated with a series of sensitivity tests using an ensemble approach for low-advection, high convective available potential energy, intermonsoon season cases. The diurnal cycle and spatial pattern of urban heat island (UHI) intensity and planetary boundary layer height were investigated. The mean UHI intensity peaked in the early morning at 2.2°C, reaching 2.4°C in industrial areas. Sea and land breezes developed during daytime and nighttime, respectively, with the former much stronger than the latter. The model predicted that sea breezes from different coastlines of the Malay Peninsula meet and converge, inducing strong updrafts. 
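The UHI intensity discussed above is a simple derived quantity: the urban-minus-rural near-surface temperature difference, tracked over the diurnal cycle. A minimal sketch with made-up temperature curves (the station curves and numbers below are illustrative, not the study's data):

```python
# Hedged sketch: diurnal urban heat island (UHI) intensity as the
# urban-minus-rural 2 m temperature difference, averaged by hour of day.
# The two temperature cycles are synthetic, for illustration only.
import numpy as np

hours = np.arange(24)
# Fabricated diurnal temperature cycles (deg C) for a rural and urban site
t_rural = 26.0 + 3.0 * np.sin((hours - 9) * np.pi / 12)
t_urban = t_rural + 1.0 + 1.2 * np.exp(-(((hours - 6) % 24) ** 2) / 18.0)

uhi = t_urban - t_rural                      # UHI intensity per hour
peak_hour = int(hours[np.argmax(uhi)])
print(f"peak UHI intensity {uhi.max():.1f} deg C at hour {peak_hour:02d}")
```

The same hour-of-day averaging applies directly to model or station time series once they are grouped by local hour.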
AH was found to play a role in all the processes studied, while the effect of different land use types was most pronounced during nighttime, and least visible near noon.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1389988-evaluating-lossy-data-compression-climate-simulation-data-within-large-ensemble','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1389988-evaluating-lossy-data-compression-climate-simulation-data-within-large-ensemble"><span>Evaluating lossy data compression on climate simulation data within a large ensemble</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Baker, Allison H.; Hammerling, Dorit M.; Mickelson, Sheri A.; ...</p> <p>2016-12-07</p> <p>High-resolution Earth system model simulations generate enormous data volumes, and retaining the data from these simulations often strains institutional storage resources. Further, these exceedingly large storage requirements negatively impact science objectives, for example, by forcing reductions in data output frequency, simulation length, or ensemble size. To lessen data volumes from the Community Earth System Model (CESM), we advocate the use of lossy data compression techniques. While lossy data compression does not exactly preserve the original data (as lossless compression does), lossy techniques have an advantage in terms of smaller storage requirements. To preserve the integrity of the scientific simulation data, the effects of lossy data compression on the original data should, at a minimum, not be statistically distinguishable from the natural variability of the climate system, and previous preliminary work with data from CESM has shown this goal to be attainable. 
However, to ultimately convince climate scientists that it is acceptable to use lossy data compression, we provide climate scientists with access to publicly available climate data that have undergone lossy data compression. In particular, we report on the results of a lossy data compression experiment with output from the CESM Large Ensemble (CESM-LE) Community Project, in which we challenge climate scientists to examine features of the data relevant to their interests, and attempt to identify which of the ensemble members have been compressed and reconstructed. We find that while detecting distinguishing features is certainly possible, the compression effects noticeable in these features are often unimportant or disappear in post-processing analyses. In addition, we perform several analyses that directly compare the original data to the reconstructed data to investigate the preservation, or lack thereof, of specific features critical to climate science. Overall, we conclude that applying lossy data compression to climate simulation data is both advantageous in terms of data reduction and generally acceptable in terms of effects on scientific results.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016GMD.....9.4381B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016GMD.....9.4381B"><span>Evaluating lossy data compression on climate simulation data within a large ensemble</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Baker, Allison H.; Hammerling, Dorit M.; Mickelson, Sheri A.; Xu, Haiying; Stolpe, Martin B.; Naveau, Phillipe; Sanderson, Ben; Ebert-Uphoff, Imme; Samarasinghe, Savini; De Simone, Francesco; Carbone, Francesco; Gencarelli, Christian N.; Dennis, John M.; Kay, Jennifer E.; Lindstrom, Peter</p> <p>2016-12-01</p> <p>High-resolution Earth 
system model simulations generate enormous data volumes, and retaining the data from these simulations often strains institutional storage resources. Further, these exceedingly large storage requirements negatively impact science objectives, for example, by forcing reductions in data output frequency, simulation length, or ensemble size. To lessen data volumes from the Community Earth System Model (CESM), we advocate the use of lossy data compression techniques. While lossy data compression does not exactly preserve the original data (as lossless compression does), lossy techniques have an advantage in terms of smaller storage requirements. To preserve the integrity of the scientific simulation data, the effects of lossy data compression on the original data should, at a minimum, not be statistically distinguishable from the natural variability of the climate system, and previous preliminary work with data from CESM has shown this goal to be attainable. However, to ultimately convince climate scientists that it is acceptable to use lossy data compression, we provide climate scientists with access to publicly available climate data that have undergone lossy data compression. In particular, we report on the results of a lossy data compression experiment with output from the CESM Large Ensemble (CESM-LE) Community Project, in which we challenge climate scientists to examine features of the data relevant to their interests, and attempt to identify which of the ensemble members have been compressed and reconstructed. We find that while detecting distinguishing features is certainly possible, the compression effects noticeable in these features are often unimportant or disappear in post-processing analyses. In addition, we perform several analyses that directly compare the original data to the reconstructed data to investigate the preservation, or lack thereof, of specific features critical to climate science. 
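The acceptability criterion described above, that compression effects should not be statistically distinguishable from natural climate variability, can be illustrated with a toy check. The quantization scheme and numbers below are assumptions for illustration, not the compression algorithms actually evaluated for CESM-LE:

```python
# Hedged sketch: apply a simple lossy quantization to a synthetic ensemble
# field, then compare the worst-case compression error against the natural
# inter-member spread of the ensemble. All values are synthetic.
import numpy as np

rng = np.random.default_rng(0)
# 40 ensemble members x 64 grid points of a temperature-like field (K)
ensemble = rng.normal(loc=288.0, scale=1.5, size=(40, 64))

def quantize(x, step=0.1):
    """Lossy compression stand-in: round to a fixed absolute precision."""
    return np.round(x / step) * step

reconstructed = quantize(ensemble)
max_err = np.abs(reconstructed - ensemble).max()   # bounded by step / 2
natural_sd = ensemble.std(axis=0).mean()           # spread across members

print(f"max compression error: {max_err:.3f} K")
print(f"mean inter-member spread: {natural_sd:.3f} K")
```

Here the worst-case error (0.05 K) sits far below the inter-member spread (~1.5 K), which is the spirit of the indistinguishability requirement.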
Overall, we conclude that applying lossy data compression to climate simulation data is both advantageous in terms of data reduction and generally acceptable in terms of effects on scientific results.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1389988','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1389988"><span>Evaluating lossy data compression on climate simulation data within a large ensemble</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Baker, Allison H.; Hammerling, Dorit M.; Mickelson, Sheri A.</p> <p></p> <p>High-resolution Earth system model simulations generate enormous data volumes, and retaining the data from these simulations often strains institutional storage resources. Further, these exceedingly large storage requirements negatively impact science objectives, for example, by forcing reductions in data output frequency, simulation length, or ensemble size. To lessen data volumes from the Community Earth System Model (CESM), we advocate the use of lossy data compression techniques. While lossy data compression does not exactly preserve the original data (as lossless compression does), lossy techniques have an advantage in terms of smaller storage requirements. To preserve the integrity of the scientific simulation data, the effects of lossy data compression on the original data should, at a minimum, not be statistically distinguishable from the natural variability of the climate system, and previous preliminary work with data from CESM has shown this goal to be attainable. However, to ultimately convince climate scientists that it is acceptable to use lossy data compression, we provide climate scientists with access to publicly available climate data that have undergone lossy data compression. 
In particular, we report on the results of a lossy data compression experiment with output from the CESM Large Ensemble (CESM-LE) Community Project, in which we challenge climate scientists to examine features of the data relevant to their interests, and attempt to identify which of the ensemble members have been compressed and reconstructed. We find that while detecting distinguishing features is certainly possible, the compression effects noticeable in these features are often unimportant or disappear in post-processing analyses. In addition, we perform several analyses that directly compare the original data to the reconstructed data to investigate the preservation, or lack thereof, of specific features critical to climate science. Overall, we conclude that applying lossy data compression to climate simulation data is both advantageous in terms of data reduction and generally acceptable in terms of effects on scientific results.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMPP22A..02R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMPP22A..02R"><span>Impacts of weather versus climate and driver uncertainty on multi-centennial ecosystem model simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rollinson, C.; Simkins, J.; Fer, I.; Desai, A. R.; Dietze, M.</p> <p>2017-12-01</p> <p>Simulations of ecosystem dynamics and comparisons with empirical data require accurate, continuous, and often sub-daily meteorology records that are spatially aligned to the scale of the empirical data. A wealth of meteorology data for the past, present, and future is available through site-specific observations, modern reanalysis products, and gridded GCM simulations. 
However, these products are mismatched in spatial and temporal resolution, often with both different means and seasonal patterns. We have designed and implemented a two-step meteorological downscaling and ensemble generation method that combines multiple meteorology data products through debiasing and temporal downscaling protocols. Our methodology is designed to preserve the covariance among seven meteorological variables for use as drivers in ecosystem model simulations: temperature, precipitation, short- and longwave radiation, surface pressure, humidity, and wind. Furthermore, our method propagates uncertainty through the downscaling process and results in ensembles of meteorology that can be compared to paleoclimate reconstructions and used to analyze the effects of both high- and low-frequency climate anomalies on ecosystem dynamics. Using a multiple linear regression approach, we have combined hourly, 0.125-degree gridded data from the NLDAS (1980-present) with CRUNCEP (1901-2010) and CMIP5 historical (1850-2005), past millennium (850-1849), and future (1950-2100) GCM simulations. This has resulted in an ensemble of continuous, hourly-resolved meteorology from the paleo era into the future with variability in weather events as well as low-frequency climatic changes. We investigate the influence of extreme sub-daily weather phenomena versus long-term climatic changes in an ensemble of ecosystem models that range in atmospheric and biological complexity. Through data assimilation with paleoclimate reconstructions of past climate, we can improve data-model comparisons using observations of vegetation change from the past 1200 years. 
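The debiasing step described above can be sketched as a regression fitted over the products' overlap period and then applied to the full coarse record. The linear form and the synthetic series below are assumptions for illustration; the actual method is a multiple regression that preserves covariance across seven variables:

```python
# Hedged sketch of regression-based debiasing: fit a linear map from a
# biased "coarse" product to a "fine" reference over an overlap window,
# then apply it to the whole coarse record. Series are synthetic.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1000)                                   # days
fine = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)
coarse = 0.8 * fine + 6.0 + rng.normal(0, 0.5, t.size)   # biased product

# Fit coarse -> fine over the first 500 days ("overlap period"), then debias.
a, b = np.polyfit(coarse[:500], fine[:500], deg=1)
debiased = a * coarse + b

bias_before = np.mean(coarse - fine)
bias_after = np.mean(debiased - fine)
print(f"mean bias before: {bias_before:+.2f}, after: {bias_after:+.2f}")
```

Resampling the regression residuals would then supply the ensemble spread that carries downscaling uncertainty forward, in the spirit of the two-step method.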
Accounting for driver uncertainty in model evaluation can help determine the relative influence of structural versus parameterization errors in ecosystem models.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AGUFMNG34A..05R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AGUFMNG34A..05R"><span>Dynamics Under Location Uncertainty: Model Derivation, Modified Transport and Uncertainty Quantification</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Resseguier, V.; Memin, E.; Chapron, B.; Fox-Kemper, B.</p> <p>2017-12-01</p> <p>In order to better observe and predict geophysical flows, ensemble-based data assimilation methods are of high importance. In such methods, an ensemble of random realizations represents the variety of the simulated flow's likely behaviors. For this purpose, randomness needs to be introduced in a suitable way and physically-based stochastic subgrid parametrizations are promising paths. This talk will propose a new kind of such a parametrization referred to as modeling under location uncertainty. The fluid velocity is decomposed into a resolved large-scale component and an aliased small-scale one. The first component is possibly random but time-correlated whereas the second is white-in-time but spatially-correlated and possibly inhomogeneous and anisotropic. With such a velocity, the material derivative of any - possibly active - tracer is modified. Three new terms appear: a correction of the large-scale advection, a multiplicative noise and a possibly heterogeneous and anisotropic diffusion. This parameterization naturally ensures attractive properties such as energy conservation for each realization. 
Additionally, this stochastic material derivative and the associated Reynolds' transport theorem offer a systematic method to derive stochastic models. In particular, we will discuss the consequences of the Quasi-Geostrophic assumptions in our framework. Depending on the turbulence amount, different models with different physical behaviors are obtained. Under strong turbulence assumptions, a simplified diagnosis of frontolysis and frontogenesis at the surface of the ocean is possible in this framework. A Surface Quasi-Geostrophic (SQG) model with a weaker noise influence has also been simulated. A single realization better represents small scales than a deterministic SQG model at the same resolution. Moreover, an ensemble accurately predicts extreme events, bifurcations as well as the amplitudes and the positions of the simulation errors. Figure 1 highlights this last result and compares it to the strong error underestimation of an ensemble simulated from the deterministic dynamic with random initial conditions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1813026O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1813026O"><span>Global system for hydrological monitoring and forecasting in real time at high resolution</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ortiz, Enrique; De Michele, Carlo; Todini, Ezio; Cifres, Enrique</p> <p>2016-04-01</p> <p>This project, presented at EGU 2016, was born of solidarity and the need to dignify the most disadvantaged people living in the poorest countries (in Africa, South America and Asia), which are continually exposed to changes in the hydrologic cycle, suffering events of large floods and/or long periods of drought. 
It is also a special year, this 2016 Year of Mercy, in which we must engage with the most disadvantaged of our Planet (Gaia), making available to them what we do professionally and scientifically. The project, called "Global system for hydrological monitoring and forecasting in real time at high resolution", is non-profit and aims to provide global, high-resolution (1 km2) hydrological monitoring and forecasting, in real time and continuously, by coupling weather forecasts from global circulation models, such as GFS-0.25° (deterministic and ensemble runs), to force a computationally efficient, physically based distributed hydrological model, namely the latest extended version of the TOPKAPI model, TOPKAPI-eXtended. Finally, the MCP approach for the proper use of ensembles in Predictive Uncertainty assessment, essentially based on a multiple regression in the Normal space, can be easily extended to use ensembles to represent the local (in time) smaller or larger conditional predictive uncertainty, as a function of the ensemble spread. In this way, each prediction in time accounts for both the predictive uncertainty of the ensemble mean and that of the ensemble spread. To perform continuous hydrological modeling with the TOPKAPI-X model and allow a hot start of the hydrological status of watersheds, the system assimilates rainfall and temperature products derived from remote sensing, such as NASA's TRMM 3B42RT product. The system will be integrated into a Decision Support System (DSS) platform based on geographical data. The DSS is a web application (for PC, tablet or mobile phone): it needs no installation (all you need is a web browser and an internet connection) and no updates (all upgrades are deployed on the remote server); the DSS is a classical client-server application. The client side will be an HTML5/CSS3 application that runs in any common browser. 
The server side consists of: a web server (Apache); a map server (GeoServer); a geographical relational database management system (PostgreSQL+PostGIS); and tools based on the GDAL libraries. A customized web page will be implemented to publish all hydrometeorological information and forecast runs (free) for all users in the world. In this first presentation of the project, all scientific/technical people, universities and research centers (public or private) who want to collaborate are invited, opening a brainstorming to improve the system. References: • Liu Z. and Todini E., (2002). Towards a comprehensive physically based rainfall-runoff model. Hydrology and Earth System Sciences (HESS), 6(5):859-881, 2002. • Thielen, J., Bartholmes, J., Ramos, M.-H., and de Roo, A., (2009): The European Flood Alert System - Part 1: Concept and development, Hydrol. Earth Syst. Sci., 13, 125-140, 2009. • Coccia C., Mazzetti C., Ortiz E., Todini E., (2010) - A different soil conceptualization for the TOPKAPI model application within the DMIP 2. American Geophysical Union. Fall Meeting, San Francisco H21H-07, 2010. • Pappenberger, F., Cloke, H. L., Balsamo, G., Ngo-Duc, T., and Oki, T., (2010) Global runoff routing with the hydrological component of the ECMWF NWP system, Int. J. Climatol., 30, 2155-2174, 2010. • Coccia, G. and Todini, E., (2011). Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274, 2011. • Wu, H., Adler, R. F., Hong, Y., Tian, Y., and Policelli, F., (2012): Evaluation of Global Flood Detection Using Satellite-Based Rainfall and a Hydrologic Model, J. Hydrometeorol., 13, 1268-1284, 2012. • Smith M. et al., (2013). The Distributed Model Intercomparison Project - Phase 2: Experiment Design and Summary Results of the Western Basin Experiments, Journal of Hydrology 507, 300-329, 2013. • Pontificiae Academiae Scientiarvm (2014). 
Proceedings of the Joint Workshop on 2-6 May 2014: Sustainable Humanity Sustainable Nature Our Responsibility. Pontificiae Academiae Scientiarvm Extra Series 41. Vatican City. 2014 • Encyclical letter CARITAS IN VERITATE of the supreme pontiff Benedict XVI to the bishops, priests and deacons, men and women religious the lay faithful and all people of good will on integral human development in charity and truth. Vatican City . 2009. • Encyclical letter LAUDATO SI' of the holy father Francis on care for our common home. Vatican City. 2015</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018JPRS..140..133Z','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018JPRS..140..133Z"><span>A hybrid MLP-CNN classifier for very fine resolution remotely sensed image classification</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Zhang, Ce; Pan, Xin; Li, Huapeng; Gardiner, Andy; Sargent, Isabel; Hare, Jonathon; Atkinson, Peter M.</p> <p>2018-06-01</p> <p>The contextual-based convolutional neural network (CNN) with deep architecture and pixel-based multilayer perceptron (MLP) with shallow structure are well-recognized neural network algorithms, representing the state-of-the-art deep learning method and the classical non-parametric machine learning approach, respectively. The two algorithms, which have very different behaviours, were integrated in a concise and effective way using a rule-based decision fusion approach for the classification of very fine spatial resolution (VFSR) remotely sensed imagery. The decision fusion rules, designed primarily based on the classification confidence of the CNN, reflect the generally complementary patterns of the individual classifiers. 
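The decision-fusion rule sketched in the abstract above — prefer the CNN where its classification confidence is high, otherwise fall back to the MLP — might look as follows; the threshold and probabilities are illustrative stand-ins, not the paper's exact rules:

```python
# Hedged sketch of confidence-driven decision fusion between two classifiers:
# take the CNN label where its softmax confidence clears a threshold,
# otherwise use the MLP label. Probabilities below are made up.
import numpy as np

def fuse(cnn_probs, mlp_probs, tau=0.8):
    """Per-pixel fusion: CNN label if max CNN probability >= tau, else MLP label."""
    cnn_labels = cnn_probs.argmax(axis=-1)
    mlp_labels = mlp_probs.argmax(axis=-1)
    confident = cnn_probs.max(axis=-1) >= tau
    return np.where(confident, cnn_labels, mlp_labels)

cnn = np.array([[0.9, 0.1], [0.55, 0.45]])   # two pixels, two classes
mlp = np.array([[0.3, 0.7], [0.2, 0.8]])
fused = fuse(cnn, mlp)
print(fused)   # pixel 1 from the confident CNN (class 0), pixel 2 from the MLP (class 1)
```

The same rule extends to full label maps by broadcasting over the spatial axes.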
In consequence, the proposed ensemble classifier MLP-CNN harvests the complementary results acquired from the CNN based on deep spatial feature representation and from the MLP based on spectral discrimination. Meanwhile, limitations of the CNN arising from its convolutional filters, such as uncertainty in object boundary partition and loss of useful fine spatial resolution detail, were compensated for. The effectiveness of the ensemble MLP-CNN classifier was tested in both urban and rural areas using aerial photography together with an additional satellite sensor dataset. The MLP-CNN classifier achieved promising performance, consistently outperforming the pixel-based MLP, the spectral and textural-based MLP, and the contextual-based CNN in terms of classification accuracy. This research paves the way to effectively addressing the complicated problem of VFSR image classification.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16371468','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16371468"><span>Determination of an ensemble of structures representing the intermediate state of the bacterial immunity protein Im7.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Gsponer, Joerg; Hopearuoho, Harri; Whittaker, Sara B-M; Spence, Graham R; Moore, Geoffrey R; Paci, Emanuele; Radford, Sheena E; Vendruscolo, Michele</p> <p>2006-01-03</p> <p>We present a detailed structural characterization of the intermediate state populated during the folding and unfolding of the bacterial immunity protein Im7. We achieve this result by incorporating a variety of experimental data available for this species in molecular dynamics simulations. First, we define the structure of the exchange-competent intermediate state of Im7 by using equilibrium hydrogen-exchange protection factors. 
Second, we use this ensemble to predict Phi-values and compare the results with the experimentally determined Phi-values of the kinetic refolding intermediate. Third, we predict chemical-shift measurements and compare them with the measured chemical shifts of a mutational variant of Im7 for which the kinetic folding intermediate is the most stable state populated at equilibrium. Remarkably, we found that the properties of the latter two species are predicted with high accuracy from the exchange-competent intermediate that we determined, suggesting that these three states are characterized by a similar architecture in which helices I, II, and IV are aligned in a native-like, but reorganized, manner. Furthermore, the structural ensemble that we obtained enabled us to rationalize the results of tryptophan fluorescence experiments in the WT protein and a series of mutational variants. The results show that the integration of diverse sets of experimental data at relatively low structural resolution is a powerful approach that can provide insights into the structural organization of this conformationally heterogeneous three-helix intermediate with unprecedented detail and highlight the importance of both native and non-native interactions in stabilizing its structure.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ClDy..tmp..811L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ClDy..tmp..811L"><span>Australian snowpack in the NARCliM ensemble: evaluation, bias correction and future projections</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Luca, Alejandro Di; Evans, Jason P.; Ji, Fei</p> <p>2017-10-01</p> <p>In this study we evaluate the ability of an ensemble of high-resolution Regional Climate Model simulations to represent snow cover characteristics over 
the Australian Alps and go on to assess future projections of snowpack characteristics. Our results show that the ensemble presents a cold temperature bias and overestimates total precipitation, leading to a general overestimation of the snow cover as compared with MODIS satellite data. We then produce a new set of snowpack characteristics by running a temperature-based snow melt/accumulation model forced by bias-corrected temperature and precipitation fields. While some positive snow cover biases remain, the bias-corrected (BC) dataset shows large improvements in the simulation of total amounts, seasonality and spatial distribution of the snow cover compared with MODIS products. Both the raw and BC datasets are then used to assess future changes in the snowpack characteristics. Both datasets show robust increases in near-surface temperatures and decreases in snowfall that lead to a substantial reduction of the snowpack over the Australian Alps. The snowpack decreases by about 15 and 60% by 2030 and 2070, respectively. While the BC data introduce large differences in the simulation of the present-climate snowpack, in relative terms future changes appear to be similar to those obtained using the raw data. Future temperature projections show a clear dependence on elevation through the snow-albedo feedback effect that affects snowpack projections. 
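Temperature-based snow melt/accumulation models of the kind mentioned above are typically degree-day schemes. A minimal sketch, with an assumed melt factor and snow/rain threshold rather than the study's calibrated values:

```python
# Hedged sketch of a temperature-index (degree-day) snowpack model:
# precipitation accumulates as snow below a temperature threshold, and the
# pack melts at a fixed rate per degree-day above it. Parameters are assumed.
def snowpack_series(temp_c, precip_mm, t_thresh=0.0, ddf=3.0):
    """Daily snow water equivalent (mm): accumulate when T <= t_thresh,
    melt at ddf mm per degree-day when T > t_thresh."""
    swe, out = 0.0, []
    for t, p in zip(temp_c, precip_mm):
        if t <= t_thresh:
            swe += p                                       # snowfall accumulates
        else:
            swe = max(0.0, swe - ddf * (t - t_thresh))     # degree-day melt
        out.append(swe)
    return out

temps = [-3, -1, 0, 2, 5, 1, -2]      # daily mean temperature (deg C)
precip = [10, 5, 8, 0, 0, 0, 4]       # daily precipitation (mm)
print(snowpack_series(temps, precip))
```

Bias-corrected temperature and precipitation series would simply replace the raw inputs to such a model, which is why the correction propagates directly into the simulated snow cover.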
Uncertainties in future projections of the snowpack are large in both datasets and are mainly dominated by the choice of the lateral boundary conditions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMNG41B1738F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMNG41B1738F"><span>Development of the NHM-LETKF regional reanalysis system assimilating conventional observations only</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Fukui, S.; Iwasaki, T.; Saito, K. K.; Seko, H.; Kunii, M.</p> <p>2016-12-01</p> <p>The information about long-term high-resolution atmospheric fields is very useful for studying meso-scale responses to climate change or analyzing extreme events. We are developing an NHM-LETKF (the local ensemble transform Kalman filter with the nonhydrostatic model of the Japan Meteorological Agency (JMA)) regional reanalysis system assimilating only conventional observations that are available over about 60 years, such as surface observations at observatories and upper-air observations with radiosondes. The domain covers Japan and its surroundings. Before the long-term reanalysis is performed, an experiment using the system was conducted over August 2014 in order to identify the effectiveness and problems of the regional reanalysis system. In this study, we investigated the six-hour accumulated precipitation obtained by integration from the analysis fields. The reproduced precipitation was compared with the JMA's Radar/Rain-gauge Analyzed Precipitation data over the Japanese islands and the precipitation of JRA-55, which is used as lateral boundary conditions. The comparisons reveal an underestimation of the precipitation in the regional reanalysis. The underestimation is improved by extending the forecast time. 
In the regional reanalysis system, the analysis fields are derived from the ensemble mean, in which components that conflict among ensemble members are filtered out. It is therefore important to tune the inflation factor and the lateral boundary perturbations so that the analysis fields are not smoothed excessively, and to allow more time for the fields to spin up. In the extended run the underestimation still remains, which implies that it is attributable to the forecast model itself as well as to the analysis scheme.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26213518','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26213518"><span>Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M</p> <p>2014-06-01</p> <p>Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind "noise," which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical "downscaling" of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling.
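The downscaling idea, as described further in the abstract (smooth observations with an 8 h filter to stand in for model output, then re-inject small-scale variability), can be roughly sketched as below; the plain Gaussian noise is a deliberately crude placeholder for noise with the observed spectral characteristics:

```python
import numpy as np

rng = np.random.default_rng(42)

def downscale_ensemble(model_series, residual_std, n_members=20):
    """Build an ensemble by re-injecting small-scale 'noise' into a smooth,
    large-scale series. Gaussian noise is a crude stand-in for noise with
    the observed spectral characteristics described in the abstract."""
    noise = rng.normal(0.0, residual_std, size=(n_members, len(model_series)))
    return model_series + noise

# Mimic 'solar wind model output' by smoothing hourly observations
# with an 8 h running mean, as in the abstract:
hours = np.arange(240)
obs = 400 + 50 * np.sin(2 * np.pi * hours / 120) + rng.normal(0, 20, hours.size)
smooth = np.convolve(obs, np.ones(8) / 8.0, mode="same")   # ~8 h filter
residual_std = np.std(obs - smooth)                        # small-scale spread
ensemble = downscale_ensemble(smooth, residual_std)
print(ensemble.shape)  # (20, 240)
```

Each of the 20 members would then drive a separate magnetospheric model run, yielding a distribution of forecasts rather than a single deterministic one.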
To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key points: solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; and the magnetosphere responds nonlinearly to small-scale solar wind fluctuations.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ThApC.tmp..195C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ThApC.tmp..195C"><span>Exploring the future change space for fire weather in southeast Australia</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Clarke, Hamish; Evans, Jason P.</p> <p>2018-05-01</p> <p>High-resolution projections of climate change impacts on fire weather conditions in southeast Australia out to 2080 are presented. Fire weather is represented by the McArthur Forest Fire Danger Index (FFDI), calculated from an objectively designed regional climate model ensemble. Changes in annual cumulative FFDI vary widely, from −337 (−21%) to +657 (+24%) in coastal areas and −237 (−12%) to +1143 (+26%) in inland areas. 
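The abstract does not restate how FFDI is computed; for context, a widely used approximation of McArthur's Mark 5 forest fire danger meter (Noble et al., 1980) is sketched below. Treat the exact coefficients as an assumption of this sketch rather than the authors' implementation:

```python
import math

def ffdi(temp_c, rel_hum_pct, wind_kmh, drought_factor):
    """McArthur Forest Fire Danger Index via the Noble et al. (1980)
    approximation (assumed here; the abstract does not give its formula)."""
    return 2.0 * math.exp(-0.450
                          + 0.987 * math.log(drought_factor)
                          - 0.0345 * rel_hum_pct
                          + 0.0338 * temp_c
                          + 0.0234 * wind_kmh)

# A hot, dry, windy day with a high drought factor lands in the
# 'severe' (50-74) band of the FFDI scale:
print(ffdi(temp_c=38.0, rel_hum_pct=15.0, wind_kmh=40.0, drought_factor=10.0))
```

Because FFDI combines temperature, humidity, wind and drought multiplicatively through the exponential, bias-correcting each input independently can distort their joint distribution, which is the inter-variable issue the abstract raises.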
A similar spread is projected in extreme FFDI values. In coastal regions, the number of prescribed burning days is projected to change by −11 to +10 in autumn and −10 to +3 in spring. Across the ensemble, the most significant increases in fire weather and decreases in prescribed burn windows are projected to take place in spring. Partial bias correction of FFDI leads to similar projections but with a greater spread, particularly in extreme values. The partially bias-corrected FFDI performs similarly to uncorrected FFDI when compared against the observed annual cumulative FFDI (ensemble root mean square error spans 540 to 1583 for uncorrected output and 695 to 1398 for corrected) but is generally worse for FFDI values above 50. This emphasizes the need to consider inter-variable relationships when bias-correcting complex phenomena such as fire weather. There is considerable uncertainty in the future trajectory of fire weather in southeast Australia, including the potential for fewer prescribed burning days and substantially greater fire danger in spring. Selecting climate models on the basis of multiple criteria can lead to more informative projections and allow an explicit exploration of uncertainty.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFM.A51I0201I','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFM.A51I0201I"><span>Probabilistic regional climate projection in Japan using a regression model with CMIP5 multi-model ensemble experiments</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ishizaki, N. N.; Dairaku, K.; Ueno, G.</p> <p>2016-12-01</p> <p>We have developed a statistical downscaling method for estimating probabilistic climate projections using multiple CMIP5 general circulation models (GCMs). 
A regression model was established so that the combination of GCM weights reflects the characteristics of the observed variability at each grid point. Cross validations were conducted to select GCMs and to evaluate the regression model while avoiding multicollinearity. Using a spatially high-resolution observation dataset, we produced statistically downscaled probabilistic climate projections with 20-km horizontal grid spacing. Root mean squared errors for monthly mean surface air temperature and precipitation estimated by the regression method were smaller than those derived from a simple ensemble mean of GCMs and from a cumulative-distribution-function-based bias correction method. Projected changes in mean temperature and precipitation were broadly similar to those of the simple ensemble mean of GCMs. Mean precipitation was generally projected to increase, in association with increased temperature and the consequent increase in atmospheric moisture content. Weakening of the winter monsoon may contribute to precipitation decreases in some areas. A temperature increase in excess of 4 K is expected in most areas of Japan by the end of the 21st century under the RCP8.5 scenario. The estimated probability of monthly precipitation exceeding 300 mm would increase on the Pacific side during summer and on the Japan Sea side during winter. 
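The per-grid-point regression that weights GCM members against observations can be sketched with ordinary least squares (a minimal stand-in; the authors' scheme additionally selects GCMs by cross validation, which is omitted here):

```python
import numpy as np

def fit_gcm_weights(gcm_series, obs_series):
    """Fit an intercept plus one weight per GCM so that the weighted
    combination of GCM time series best matches observations at a single
    grid point (ordinary least squares; illustrative sketch)."""
    X = np.column_stack([np.ones(len(obs_series)), *gcm_series])
    coef, *_ = np.linalg.lstsq(X, obs_series, rcond=None)
    return coef  # [intercept, w_gcm1, w_gcm2, ...]

rng = np.random.default_rng(0)
truth = rng.normal(15.0, 3.0, 120)                   # 'observed' monthly series
gcms = [truth + rng.normal(0, 1.0, 120),             # three biased, noisy GCMs
        0.8 * truth + 2.0 + rng.normal(0, 1.0, 120),
        truth - 1.5 + rng.normal(0, 1.0, 120)]
coef = fit_gcm_weights(gcms, truth)
pred = np.column_stack([np.ones(120), *gcms]) @ coef
rmse = float(np.sqrt(np.mean((pred - truth) ** 2)))
print(rmse)  # well below the ~1.0 error of any single GCM
```

Repeating the fit at every grid point yields a field of weights; applying those weights to projected GCM output, together with the fitted residual spread, is what turns the deterministic ensemble into a probabilistic projection.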
These statistically based probabilistic climate projections are expected to provide useful information for impact studies and risk assessments.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008SPIE.6915E..0UV','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008SPIE.6915E..0UV"><span>A consensus embedding approach for segmentation of high resolution in vivo prostate magnetic resonance imagery</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Viswanath, Satish; Rosen, Mark; Madabhushi, Anant</p> <p>2008-03-01</p> <p>Current techniques for localization of prostatic adenocarcinoma (CaP) via blinded trans-rectal ultrasound biopsy are associated with a high false negative detection rate. While high resolution endorectal in vivo Magnetic Resonance (MR) prostate imaging has been shown to have improved contrast and resolution for CaP detection over ultrasound, similarity in intensity characteristics between benign and cancerous regions on MR images contributes to a high false positive detection rate. In this paper, we present a novel unsupervised segmentation method that employs manifold learning via consensus schemes for detection of cancerous regions from high resolution 1.5 Tesla (T) endorectal in vivo prostate MRI. A significant contribution of this paper is a method to combine multiple weak, lower-dimensional representations of high dimensional feature data in a way analogous to classifier ensemble schemes, and hence create a stable and accurate reduced dimensional representation. After correcting for MR image intensity artifacts, such as bias field inhomogeneity and intensity non-standardness, our algorithm extracts over 350 3D texture features at every spatial location in the MR scene at multiple scales and orientations. 
Non-linear dimensionality reduction schemes such as Locally Linear Embedding (LLE) and Graph Embedding (GE) are employed to create multiple low dimensional data representations of this high dimensional texture feature space. Our novel consensus embedding method is used to average object adjacencies from within the multiple low dimensional projections so that class relationships are preserved. Unsupervised consensus clustering is then used to partition the objects in this consensus embedding space into distinct classes. Quantitative evaluation on 18 1.5 T prostate MR datasets against corresponding histology obtained from the multi-site ACRIN trials shows a sensitivity of 92.65% and a specificity of 82.06%, which suggests that our method can successfully detect suspicious regions in the prostate.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1427169-three-dimensional-localization-nanoscale-battery-reactions-using-soft-ray-tomography','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1427169-three-dimensional-localization-nanoscale-battery-reactions-using-soft-ray-tomography"><span>Three-dimensional localization of nanoscale battery reactions using soft X-ray tomography</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Yu, Young-Sang; Farmand, Maryam; Kim, Chunjoong; ...</p> <p>2018-03-02</p> <p>Battery function is determined by the efficiency and reversibility of the electrochemical phase transformations at solid electrodes. The microscopic tools available to study the chemical states of matter with the required spatial resolution and chemical specificity are intrinsically limited when studying complex architectures by their reliance on two-dimensional projections of thick material. 
Here in this paper, we report the development of soft X-ray ptychographic tomography, which resolves chemical states in three dimensions at 11 nm spatial resolution. We study an ensemble of nano-plates of lithium iron phosphate extracted from a battery electrode at 50% state of charge. Using a set of nanoscale tomograms, we quantify the electrochemical state and resolve phase boundaries throughout the volume of individual nanoparticles. These observations reveal multiple reaction points, intra-particle heterogeneity, and size effects that highlight the importance of multi-dimensional analytical tools in providing novel insight to the design of the next generation of high-performance devices.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28521132','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28521132"><span>Linking Neurons to Network Function and Behavior by Two-Photon Holographic Optogenetics and Volumetric Imaging.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Dal Maschio, Marco; Donovan, Joseph C; Helmbrecht, Thomas O; Baier, Herwig</p> <p>2017-05-17</p> <p>We introduce a flexible method for high-resolution interrogation of circuit function, which combines simultaneous 3D two-photon stimulation of multiple targeted neurons, volumetric functional imaging, and quantitative behavioral tracking. This integrated approach was applied to dissect how an ensemble of premotor neurons in the larval zebrafish brain drives a basic motor program, the bending of the tail. We developed an iterative photostimulation strategy to identify minimal subsets of channelrhodopsin (ChR2)-expressing neurons that are sufficient to initiate tail movements. At the same time, the induced network activity was recorded by multiplane GCaMP6 imaging across the brain. From this dataset, we computationally identified activity patterns associated with distinct components of the elicited behavior and characterized the contributions of individual neurons. 
Using photoactivatable GFP (paGFP), we extended our protocol to visualize single functionally identified neurons and reconstruct their morphologies. Together, this toolkit enables linking behavior to circuit activity with unprecedented resolution. Copyright © 2017 Elsevier Inc. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1612605D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1612605D"><span>Atmospheric rivers in a hierarchy of high resolution global climate models: results from the UPSCALE simulation campaign</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Demory, Marie-Estelle; Vidale, Pier-Luigi; Schiemann, Reinhard; Roberts, Malcolm; Mizielinski, Matthew</p> <p>2014-05-01</p> <p>A traceable hierarchy of global climate models (based on the Met Office Unified Model, GA3 formulation), with mesh sizes ranging from 130 km to 25 km, has been developed in order to study the impact of improved representation of small-scale processes on the mean climate, its variability and extremes. Five-member ensembles of atmosphere-only integrations were completed at these resolutions, each 27 years in length, using both present-day forcing and a future climate scenario. These integrations, collectively known as the "UPSCALE campaign", were completed using time provided by the European PRACE project on supercomputer HERMIT (HLRS Stuttgart). A wide variety of processes are being studied to assess these integrations, in particular with regard to the role of resolution. It has been shown that the relatively coarse resolution of atmospheric general circulation models (AGCMs) limits their ability to represent moisture transport from ocean to land. 
Understanding of the processes underlying this improvement with higher resolution remains insufficient. Atmospheric Rivers (ARs) are an important mechanism of moisture transport onto land in mid-latitude eddies and have been shown by Lavers et al. (2012) to be involved in creating the moisture supply that sustains extreme precipitation events. We investigated the ability of a state-of-the-art climate model to represent the location, frequency and 3D structure of atmospheric rivers affecting Western Europe, with a focus on the UK. We show that the climatology of atmospheric rivers, in particular their frequency, is underrepresented in the GCM at standard resolution and that this is slightly improved at high resolution (25 km): our results are in better agreement with reanalysis data, even if sizable biases remain. The three-dimensional structure of the atmospheric rivers is also more credibly represented at high resolution. We will also discuss aspects of the relationship between the improved simulation of the current climate and projected changes in a future climate with much larger atmospheric moisture availability. 
In particular, we aim to quantify the relative roles of atmospheric transport and increased precipitation rates in the higher quantiles.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014EGUGA..1613372V','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014EGUGA..1613372V"><span>Ensemble catchment hydrological modelling for climate change impact analysis</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Vansteenkiste, Thomas; Ntegeka, Victor; Willems, Patrick</p> <p>2014-05-01</p> <p>It is vital to investigate how the hydrological model structure affects the climate change impact, given that future conditions are likely to fall outside the range for which the models were calibrated or validated. Thus, an ensemble modelling approach involving a diversity of model structures, such as different spatial resolutions and process descriptions, is crucial. The ensemble modelling approach was applied to a set of models: from the lumped conceptual models NAM, PDM and VHM, through the intermediately detailed, distributed model WetSpa, to the highly detailed and fully distributed model MIKE-SHE. Explicit focus was given to the high and low flow extremes. All models were calibrated for sub flows and quick flows derived from rainfall and potential evapotranspiration (ETo) time series. In general, all models were able to produce reliable estimates of the flow regimes under the current climate for extreme peak and low flows. An intercomparison of the low and high flow changes under changed climatic conditions was made using climate scenarios tailored for extremes. Tailoring was important for two reasons. First, since the use of many scenarios was not feasible, it was necessary to construct a few scenarios that would reasonably represent the range of extreme impacts. 
Second, the scenarios are more informative because changes in high and low flows can be easily traced to changes in ETo and rainfall; the tailored scenarios are constructed using seasonal changes defined at different levels of magnitude (high, mean and low) for rainfall and ETo. After simulating these climate scenarios in the five hydrological models, close agreement was found among the models, which predicted a similar range of peak flow changes. For the low flows, however, the differences in the impact ranges projected by the different hydrological models were larger, particularly for the drier scenarios. This suggests that the hydrological model structure is more critical in low flow predictions than in high flow conditions; hence, the simulation of the slow flow component requires further attention. It is concluded that a multi-model ensemble approach, in which different plausible model structures are applied, is extremely useful: it improves the reliability of climate change impact results and allows decision making to be based on an uncertainty assessment that includes model-structure-related uncertainties.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_22 --> <div id="page_23" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="441"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4109431','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4109431"><span>Conductor gestures influence evaluations of ensemble performance</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Morrison, Steven J.; Price, Harry E.; Smedley, Eric M.; Meals, Cory D.</p> <p>2014-01-01</p> <p>Previous research has found that 
listener evaluations of ensemble performances vary depending on the expressivity of the conductor’s gestures, even when performances are otherwise identical. It was the purpose of the present study to test whether this effect of visual information was evident in the evaluation of specific aspects of ensemble performance: articulation and dynamics. We constructed a set of 32 music performances that combined auditory and visual information and were designed to feature a high degree of contrast along one of two target characteristics: articulation and dynamics. We paired each of four music excerpts recorded by a chamber ensemble in both a high- and low-contrast condition with video of four conductors demonstrating high- and low-contrast gesture specifically appropriate to either articulation or dynamics. Using one of two equivalent test forms, college music majors and non-majors (N = 285) viewed sixteen 30 s performances and evaluated the quality of the ensemble’s articulation, dynamics, technique, and tempo along with overall expressivity. Results showed significantly higher evaluations for performances featuring high rather than low conducting expressivity regardless of the ensemble’s performance quality. Evaluations for both articulation and dynamics were strongly and positively correlated with evaluations of overall ensemble expressivity. 
PMID:25104944</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=poor+AND+performances&pg=3&id=EJ928166','ERIC'); return false;" href="https://eric.ed.gov/?q=poor+AND+performances&pg=3&id=EJ928166"><span>The Effect of Ensemble Performance Quality on the Evaluation of Conducting Expressivity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Silvey, Brian A.</p> <p>2011-01-01</p> <p>This study was designed to examine whether the presence of excellent or poor ensemble performances would influence the ratings assigned by ensemble members to conductors who demonstrated highly expressive conducting. Two conductors were videotaped conducting one of two excerpts from an arrangement of Frank Ticheli's "Loch Lomond." These videos…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013EGUGA..15.8220L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013EGUGA..15.8220L"><span>Propagation of radar rainfall uncertainty in urban flood simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Liguori, Sara; Rico-Ramirez, Miguel</p> <p>2013-04-01</p> <p>This work discusses the implementation of a novel probabilistic system designed to improve ensemble sewer flow predictions for the drainage network of a small urban area in the North of England. The probabilistic system has been developed to model the uncertainty associated with radar rainfall estimates and propagate it through radar-based ensemble sewer flow predictions. 
The assessment of this system aims to outline the benefits of addressing the uncertainty associated with radar rainfall estimates in a probabilistic framework, to be potentially implemented in the real-time management of the sewer network in the study area. Radar rainfall estimates are affected by uncertainty due to various factors [1-3], and quality control and correction techniques have been developed in order to improve their accuracy. However, the hydrological use of radar rainfall estimates and forecasts remains challenging. A significant effort has been devoted by the international research community to the assessment of uncertainty propagation through probabilistic hydro-meteorological forecast systems [4-5], and various approaches have been implemented for the purpose of characterizing the uncertainty in radar rainfall estimates and forecasts [6-11]. A radar-based ensemble stochastic approach, similar to the one implemented for use in the Southern Alps by the REAL system [6], has been developed for the purpose of this work. An ensemble generator has been calibrated on the basis of the spatial-temporal characteristics of the residual error in radar estimates, assessed with reference to rainfall records from around 200 rain gauges available for the year 2007, previously post-processed and corrected by the UK Met Office [12-13]. Each ensemble member is determined by summing a perturbation field to the unperturbed radar rainfall field. The perturbations are generated by imposing the radar error's spatial and temporal correlation structure on purely stochastic fields. A hydrodynamic sewer network model implemented in the Infoworks software was used to model the rainfall-runoff process in the urban area. The software calculates the flow through the sewer conduits of the urban model using rainfall as the primary input. The sewer network is covered by 25 radar pixels with a spatial resolution of 1 km². 
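The ensemble-generation step described above (perturbations with an imposed spatial correlation structure added to the unperturbed radar field) can be sketched as follows; the Fourier-space Gaussian filter is a simple stand-in for the calibrated radar-error covariance model used by REAL-style generators:

```python
import numpy as np

rng = np.random.default_rng(7)

def correlated_noise(shape, length_scale):
    """White noise smoothed in Fourier space to impose a spatial
    correlation structure (a stand-in for the calibrated error model)."""
    white = rng.standard_normal(shape)
    ky = 2 * np.pi * np.fft.fftfreq(shape[0])[:, None]
    kx = 2 * np.pi * np.fft.fftfreq(shape[1])[None, :]
    filt = np.exp(-0.5 * length_scale**2 * (ky**2 + kx**2))
    field = np.fft.ifft2(np.fft.fft2(white) * filt).real
    return field / field.std()                      # normalize to unit variance

def rainfall_ensemble(radar_field, error_std, n_members=10, length_scale=2.0):
    """Each member = unperturbed radar field + correlated perturbation."""
    return np.stack([radar_field
                     + error_std * correlated_noise(radar_field.shape, length_scale)
                     for _ in range(n_members)])

radar = np.maximum(0.0, rng.normal(2.0, 1.0, (5, 5)))   # toy 5x5 field (mm/h)
members = rainfall_ensemble(radar, error_std=0.5)
print(members.shape)  # (10, 5, 5)
```

Each member is then run through the sewer network model, and the spread of the resulting flow hydrographs expresses how radar rainfall uncertainty propagates into the flow predictions.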
The majority of the sewer system is combined, carrying both urban rainfall runoff and domestic and trade waste water [11]. The urban model was configured to receive the probabilistic radar rainfall fields. The results showed that the radar rainfall ensembles provide additional information about the uncertainty in the radar rainfall measurements that can be propagated in urban flood modelling. The peaks of the measured flow hydrographs are often bounded within the uncertainty area produced by using the radar rainfall ensembles. This is in fact one of the benefits of using radar rainfall ensembles in urban flood modelling. More work needs to be done in improving the urban models, but this is outside the scope of this research. The rainfall uncertainty cannot explain the whole uncertainty shown in the flow simulations, and additional sources of uncertainty will come from the structure of the urban models as well as the large number of parameters required by these models. Acknowledgements: The authors would like to acknowledge the BADC, the UK Met Office and the UK Environment Agency for providing the various data sets. We also thank Yorkshire Water Services Ltd for providing the urban model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1. References: [1] Browning KA, 1978. Meteorological applications of radar. Reports on Progress in Physics 41: 761. doi:10.1088/0034-4885/41/5/003 [2] Rico-Ramirez MA, Cluckie ID, Shepherd G, Pallot A, 2007. A high-resolution radar experiment on the island of Jersey. Meteorological Applications 14: 117-129. [3] Villarini G, Krajewski WF, 2010. Review of the different sources of uncertainty in single polarization radar-based estimates of rainfall. Surveys in Geophysics 31: 107-129. [4] Rossa A, Liechti K, Zappa M, Bruen M, Germann U, Haase G, Keil C, Krahe P, 2011. 
The COST 731 Action: A review on uncertainty propagation in advanced hydrometeorological forecast systems. Atmospheric Research 100: 150-167. [5] Rossa A, Bruen M, Germann U, Haase G, Keil C, Krahe P, Zappa M, 2010. Overview and Main Results on the interdisciplinary effort in flood forecasting COST 731-Propagation of Uncertainty in Advanced Meteo-Hydrological Forecast Systems. Proceedings of Sixth European Conference on Radar in Meteorology and Hydrology ERAD 2010. [6] Germann U, Berenguer M, Sempere-Torres D, Zappa M, 2009. REAL - ensemble radar precipitation estimation for hydrology in a mountainous region. Quarterly Journal of the Royal Meteorological Society 135: 445-456. [8] Bowler NEH, Pierce CE, Seed AW, 2006. STEPS: a probabilistic precipitation forecasting scheme which merges an extrapolation nowcast with downscaled NWP. Quarterly Journal of the Royal Meteorological Society 132: 2127-2155. [9] Zappa M, Rotach MW, Arpagaus M, Dorninger M, Hegg C, Montani A, Ranzi R, Ament F, Germann U, Grossi G et al., 2008. MAP D-PHASE: real-time demonstration of hydrological ensemble prediction systems. Atmospheric Science Letters 9: 80-87. [10] Liguori S, Rico-Ramirez MA. Quantitative assessment of short-term rainfall forecasts from radar nowcasts and MM5 forecasts. Hydrological Processes, accepted article. doi:10.1002/hyp.8415 [11] Liguori S, Rico-Ramirez MA, Schellart ANA, Saul AJ, 2012. Using probabilistic radar rainfall nowcasts and NWP forecasts for flow prediction in urban catchments. Atmospheric Research 103: 80-95. [12] Harrison DL, Driscoll SJ, Kitchen M, 2000. Improving precipitation estimates from weather radar using quality control and correction techniques. Meteorological Applications 7: 135-144. [13] Harrison DL, Scovell RW, Kitchen M, 2009. High-resolution precipitation estimates for hydrological uses. 
Proceedings of the Institution of Civil Engineers - Water Management 162: 125-135.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017JGRD..12210773R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017JGRD..12210773R"><span>Projections of Future Precipitation Extremes Over Europe: A Multimodel Assessment of Climate Simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Rajczak, Jan; Schär, Christoph</p> <p>2017-10-01</p> <p>Projections of precipitation and its extremes over the European continent are analyzed in an extensive multimodel ensemble of 12 and 50 km resolution EURO-CORDEX Regional Climate Models (RCMs) forced by RCP2.6, RCP4.5, and RCP8.5 (Representative Concentration Pathway) aerosol and greenhouse gas emission scenarios. A systematic intercomparison with ENSEMBLES RCMs is carried out, such that in total information is provided for an unprecedentedly large data set of 100 RCM simulations. An evaluation finds very reasonable skill for the EURO-CORDEX models in simulating temporal and geographical variations of (mean and heavy) precipitation at both horizontal resolutions. Heavy and extreme precipitation events are projected to intensify across most of Europe throughout the whole year. All considered models agree on a distinct intensification of extremes by often more than +20% in winter and fall and over central and northern Europe. A reduction of rainy days and mean precipitation in summer is simulated by a large majority of models in the Mediterranean area, but intermodel spread between the simulations is large. In central Europe and France during summer, models project decreases in precipitation but more intense heavy and extreme rainfalls. 
Comparison to previous RCM projections from ENSEMBLES reveals consistency but slight differences in summer, where reductions in southern European precipitation are not as pronounced as previously projected. The projected changes of the European hydrological cycle may have substantial impact on environmental and anthropogenic systems. In particular, the simulations indicate a rising probability of summertime drought in southern Europe and more frequent and intense heavy rainfall across all of Europe.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26356316','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26356316"><span>An empirical study of ensemble-based semi-supervised learning approaches for imbalanced splice site datasets.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Stanescu, Ana; Caragea, Doina</p> <p>2015-01-01</p> <p>Recent biochemical advances have led to inexpensive, time-efficient production of massive volumes of raw genomic data. Traditional machine learning approaches to genome annotation typically rely on large amounts of labeled data. The process of labeling data can be expensive, as it requires domain knowledge and expert involvement. Semi-supervised learning approaches that can make use of unlabeled data, in addition to small amounts of labeled data, can help reduce the costs associated with labeling. In this context, we focus on the problem of predicting splice sites in a genome using semi-supervised learning approaches. This is a challenging problem, due to the highly imbalanced distribution of the data, i.e., small number of splice sites as compared to the number of non-splice sites. To address this challenge, we propose to use ensembles of semi-supervised classifiers, specifically self-training and co-training classifiers. 
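As a concrete, deliberately simplified illustration of the self-training idea described above, the sketch below grows the labeled set by pseudo-labeling an equal number of the most confident unlabeled instances per class each round, which is one way to keep the labeled pool balanced on imbalanced data. The nearest-centroid base learner and all parameter names here are illustrative stand-ins, not the classifiers used in the study:

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Minimal base classifier: one centroid per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def nearest_centroid_predict(centroids, X):
    """Return predicted labels and a confidence (negative distance)."""
    classes = sorted(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in classes])
    return np.array(classes)[d.argmin(axis=0)], -d.min(axis=0)

def balanced_self_training(X_lab, y_lab, X_unlab, n_iter=5, per_class=10):
    """Each iteration moves an equal number of the most confident
    pseudo-labeled instances per class from the unlabeled pool into the
    labeled set, keeping the labeled set balanced."""
    X_lab, y_lab, pool = X_lab.copy(), y_lab.copy(), X_unlab.copy()
    for _ in range(n_iter):
        if len(pool) == 0:
            break
        centroids = nearest_centroid_fit(X_lab, y_lab)
        pred, conf = nearest_centroid_predict(centroids, pool)
        keep = []
        for c in np.unique(y_lab):
            idx = np.where(pred == c)[0]
            # take the `per_class` most confident pseudo-labels of this class
            keep.append(idx[np.argsort(conf[idx])[::-1]][:per_class])
        keep = np.concatenate(keep)
        X_lab = np.vstack([X_lab, pool[keep]])
        y_lab = np.concatenate([y_lab, pred[keep]])
        pool = np.delete(pool, keep, axis=0)
    return nearest_centroid_fit(X_lab, y_lab)
```

The per-class quota is the simplest form of the dynamic balancing the abstract refers to; a co-training variant would run two such learners on different feature views and exchange pseudo-labels.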
Our experiments on five highly imbalanced splice site datasets, with positive to negative ratios of 1-to-99, showed that the ensemble-based semi-supervised approaches represent a good choice, even when the amount of labeled data consists of less than 1% of all training data. In particular, we found that ensembles of co-training and self-training classifiers that dynamically balance the set of labeled instances during the semi-supervised iterations show improvements over the corresponding supervised ensemble baselines. In the presence of limited amounts of labeled data, ensemble-based semi-supervised approaches can successfully leverage the unlabeled data to enhance supervised ensembles learned from highly imbalanced data distributions. Given that such distributions are common for many biological sequence classification problems, our work can be seen as a stepping stone towards more sophisticated ensemble-based approaches to biological sequence annotation in a semi-supervised framework.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14..528L','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14..528L"><span>Probabilistic forecasts based on radar rainfall uncertainty</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Liguori, S.; Rico-Ramirez, M. A.</p> <p>2012-04-01</p> <p>The potential advantages resulting from integrating weather radar rainfall estimates in hydro-meteorological forecasting systems are limited by the inherent uncertainty affecting radar rainfall measurements, which is due to various sources of error [1-3]. The improvement of quality control and correction techniques is recognized to play a role in the future improvement of radar-based flow predictions. However, the knowledge of the uncertainty affecting radar rainfall data can also be effectively used to build a hydro-meteorological forecasting system in a probabilistic framework. This work discusses the results of the implementation of a novel probabilistic forecasting system developed to improve ensemble predictions over a small urban area located in the North of England. An ensemble of radar rainfall fields can be determined as the sum of a deterministic component and a perturbation field, the latter being informed by the knowledge of the spatial-temporal characteristics of the radar error assessed with reference to rain gauge measurements. This approach is similar to the REAL system [4] developed for use in the Southern Alps. The radar uncertainty estimate can then be propagated with a nowcasting model, used to extrapolate an ensemble of radar rainfall forecasts, which can ultimately drive hydrological ensemble predictions. 
A radar ensemble generator has been calibrated using radar rainfall data made available by the UK Met Office after applying post-processing and correction algorithms [5-6]. One-hour rainfall accumulations from 235 rain gauges recorded for the year 2007 provided the reference to determine the radar error. Statistics describing the spatial characteristics of the error (i.e. mean and covariance) have been computed off-line at the gauge locations, along with the parameters describing the temporal correlation of the error. A system has then been set up to impose the space-time error properties on stochastic perturbations, generated in real time at the gauge locations and then interpolated back onto the radar domain, to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used to propagate the uncertainty and assess the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and ultimately sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial and temporal resolution is required of the forecasts for use in urban areas [9-11], the probabilistic nowcasts are produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes the radar error into account to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. 
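The ensemble-generation step described above (a deterministic field plus stochastic perturbations carrying prescribed space-time error statistics) can be sketched roughly as follows. The exponential covariance model, the lognormal perturbation form, and every parameter value here are illustrative assumptions, not the calibrated statistics of the actual system:

```python
import numpy as np

def exp_covariance(coords, sill, corr_len):
    """Exponential spatial covariance between locations (distances in km)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return sill * np.exp(-d / corr_len)

def radar_ensemble(det_field, coords, n_members=20, rho=0.8,
                   sill=0.25, corr_len=10.0, seed=1):
    """det_field: (n_steps, n_points) deterministic radar rainfall (mm/h).
    Returns (n_members, n_steps, n_points) members built by multiplying the
    deterministic field by lognormal perturbations that are spatially
    correlated (via a Cholesky factor) and AR(1)-correlated in time."""
    rng = np.random.default_rng(seed)
    n_steps, n_pts = det_field.shape
    cov = exp_covariance(coords, sill, corr_len) + 1e-9 * np.eye(n_pts)
    L = np.linalg.cholesky(cov)
    members = np.empty((n_members, n_steps, n_pts))
    for m in range(n_members):
        eps = L @ rng.standard_normal(n_pts)
        for t in range(n_steps):
            if t > 0:  # AR(1) update keeps the marginal variance equal to `sill`
                eps = rho * eps + np.sqrt(1.0 - rho**2) * (L @ rng.standard_normal(n_pts))
            # subtract sill/2 so the lognormal factor has unit mean
            members[m, t] = det_field[t] * np.exp(eps - sill / 2.0)
    return members
```

In the real system the error statistics are estimated at gauge locations and interpolated onto the radar grid; here the covariance is evaluated directly at the target points for brevity.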
It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=20070031775&hterms=Database+uses&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D70%26Ntt%3DDatabase%2Buses','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=20070031775&hterms=Database+uses&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D70%26Ntt%3DDatabase%2Buses"><span>New Software for Ensemble Creation in the Spitzer-Space-Telescope Operations Database</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Laher, Russ; Rector, John</p> <p>2004-01-01</p> <p>Some of the computer pipelines used to process digital astronomical images from NASA's Spitzer Space Telescope require multiple input images, in order to generate high-level science and calibration products. The images are grouped into ensembles according to well documented ensemble-creation rules by making explicit associations in the operations Informix database at the Spitzer Science Center (SSC). The advantage of this approach is that a simple database query can retrieve the required ensemble of pipeline input images. New and improved software for ensemble creation has been developed. 
The new software is much faster than the existing software because it uses pre-compiled database stored procedures written in Informix SPL (Stored Procedure Language). The new software is also more flexible because the ensemble-creation rules are now stored in and read from newly defined database tables. This table-driven approach was implemented so that ensemble rules can be inserted, updated, or deleted without modifying software.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AIPC.1900d0003D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AIPC.1900d0003D"><span>Regge trajectories and Hagedorn behavior: Hadronic realizations of dynamical dark matter</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Dienes, Keith R.; Huang, Fei; Su, Shufang; Thomas, Brooks</p> <p>2017-11-01</p> <p>Dynamical Dark Matter (DDM) is an alternative framework for dark-matter physics in which the dark sector comprises a vast ensemble of particle species whose Standard-Model decay widths are balanced against their cosmological abundances. In this talk, we study the properties of a hitherto-unexplored class of DDM ensembles in which the ensemble constituents are the "hadronic" resonances associated with the confining phase of a strongly-coupled dark sector. Such ensembles exhibit masses lying along Regge trajectories and Hagedorn-like densities of states that grow exponentially with mass. We investigate the applicable constraints on such dark-"hadronic" DDM ensembles and find that these constraints permit a broad range of mass and confinement scales for these ensembles. We also find that the distribution of the total present-day abundance across the ensemble is highly correlated with the values of these scales. This talk reports on research originally presented in Ref. 
[1].</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17496035','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17496035"><span>Computational prediction of atomic structures of helical membrane proteins aided by EM maps.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Kovacs, Julio A; Yeager, Mark; Abagyan, Ruben</p> <p>2007-09-15</p> <p>Integral membrane proteins pose a major challenge for protein-structure prediction because only approximately 100 high-resolution structures are available currently, thereby impeding the development of rules or empirical potentials to predict the packing of transmembrane alpha-helices. However, when an intermediate-resolution electron microscopy (EM) map is available, it can be used to provide restraints which, in combination with a suitable computational protocol, make structure prediction feasible. In this work we present such a protocol, which proceeds in three stages: 1), generation of an ensemble of alpha-helices by flexible fitting into each of the density rods in the low-resolution EM map, spanning a range of rotational angles around the main helical axes and translational shifts along the density rods; 2), fast optimization of side chains and scoring of the resulting conformations; and 3), refinement of the lowest-scoring conformations with internal coordinate mechanics, by optimizing the van der Waals, electrostatics, hydrogen bonding, torsional, and solvation energy contributions. In addition, our method implements a penalty term through a so-called tethering map, derived from the EM map, which restrains the positions of the alpha-helices. 
The protocol was validated on three test cases: GpA, KcsA, and MscL.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015SPIE.9420E..0MD','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015SPIE.9420E..0MD"><span>Performance assessment of automated tissue characterization for prostate H and E stained histopathology</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>DiFranco, Matthew D.; Reynolds, Hayley M.; Mitchell, Catherine; Williams, Scott; Allan, Prue; Haworth, Annette</p> <p>2015-03-01</p> <p>Reliable automated prostate tumor detection and characterization in whole-mount histology images is sought in many applications, including post-resection tumor staging and as ground-truth data for multi-parametric MRI interpretation. In this study, an ensemble-based supervised classification algorithm for high-resolution histology images was trained on tile-based image features including histogram and gray-level co-occurrence statistics. The algorithm was assessed using different combinations of H and E prostate slides from two separate medical centers and at two different magnifications (400x and 200x), with the aim of applying tumor classification models to new data. Slides from both datasets were annotated by expert pathologists in order to identify homogeneous cancerous and non-cancerous tissue regions of interest, which were then categorized as (1) low-grade tumor (LG-PCa), including Gleason 3 and high-grade prostatic intraepithelial neoplasia (HG-PIN), (2) high-grade tumor (HG-PCa), including various Gleason 4 and 5 patterns, or (3) non-cancerous, including benign stroma and benign prostatic hyperplasia (BPH). 
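Tile-based texture features of the kind mentioned above (histogram and gray-level co-occurrence statistics) can be computed along these lines. This minimal sketch uses a single horizontal offset and a handful of standard GLCM statistics; the study's exact feature set, quantization, and offsets are not specified here, so all choices below are illustrative:

```python
import numpy as np

def glcm_features(tile, levels=8):
    """Horizontal-offset gray-level co-occurrence matrix plus a few standard
    texture statistics for one image tile. `tile` is a 2-D uint8 grayscale array."""
    # quantize intensities to `levels` gray levels
    q = (tile.astype(np.float64) / 256.0 * levels).astype(int).clip(0, levels - 1)
    # count horizontally adjacent gray-level pairs
    a, b = q[:, :-1].ravel(), q[:, 1:].ravel()
    m = np.zeros((levels, levels))
    np.add.at(m, (a, b), 1.0)
    m /= m.sum()  # normalize counts to co-occurrence probabilities
    i, j = np.indices((levels, levels))
    return {
        "contrast":    ((i - j) ** 2 * m).sum(),
        "homogeneity": (m / (1.0 + np.abs(i - j))).sum(),
        "energy":      (m ** 2).sum(),
        "mean":        tile.mean(),  # simple histogram statistic
    }
```

Per-tile feature dictionaries of this form would then be assembled into the training matrix for the supervised classifier.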
Classification models for both LG-PCa and HG-PCa were separately trained using a support vector machine (SVM) approach, and per-tile tumor prediction maps were generated from the resulting ensembles. Results showed high sensitivity for predicting HG-PCa with an AUC up to 0.822 using training data from both medical centres, while LG-PCa showed a lower AUC of 0.763 with the same training data. Visual inspection of cancer probability heatmaps from 9 patients showed that 17/19 tumors were detected, and HG-PCa generally produced fewer false positives than LG-PCa.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16907499','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16907499"><span>Hybrid quantum processors: molecular ensembles as quantum memory for solid state circuits.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Rabl, P; DeMille, D; Doyle, J M; Lukin, M D; Schoelkopf, R J; Zoller, P</p> <p>2006-07-21</p> <p>We investigate a hybrid quantum circuit where ensembles of cold polar molecules serve as long-lived quantum memories and optical interfaces for solid state quantum processors. The quantum memory realized by collective spin states (ensemble qubit) is coupled to a high-Q stripline cavity via microwave Raman processes. We show that, for convenient trap-surface distances of a few μm, strong coupling between the cavity and ensemble qubit can be achieved. We discuss basic quantum information protocols, including a swap from the cavity photon bus to the molecular quantum memory, and a deterministic two-qubit gate. 
Finally, we investigate coherence properties of molecular ensemble quantum bits.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20150000155','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20150000155"><span>Assessing the Impact of Pre-gpm Microwave Precipitation Observations in the Goddard WRF Ensemble Data Assimilation System</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Chambon, Philippe; Zhang, Sara Q.; Hou, Arthur Y.; Zupanski, Milija; Cheung, Samson</p> <p>2013-01-01</p> <p>The forthcoming Global Precipitation Measurement (GPM) Mission will provide next generation precipitation observations from a constellation of satellites. Since precipitation by nature has large variability and low predictability at cloud-resolving scales, the impact of precipitation data on the skills of mesoscale numerical weather prediction (NWP) is largely affected by the characterization of background and observation errors and the representation of nonlinear cloud/precipitation physics in an NWP data assimilation system. We present a data impact study on the assimilation of precipitation-affected microwave (MW) radiances from a pre-GPM satellite constellation using the Goddard WRF Ensemble Data Assimilation System (Goddard WRF-EDAS). A series of assimilation experiments are carried out in a Weather Research Forecast (WRF) model domain of 9 km resolution in western Europe. Sensitivities to observation error specifications, background error covariance estimated from ensemble forecasts with different ensemble sizes, and MW channel selections are examined through single-observation assimilation experiments. An empirical bias correction for precipitation-affected MW radiances is developed based on the statistics of radiance innovations in rainy areas. 
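A bias correction of the flavor described above, which removes the per-channel mean observation-minus-simulation innovation over rainy pixels before assimilation, might be sketched as follows. Array shapes and names are illustrative assumptions, not the Goddard WRF-EDAS interfaces:

```python
import numpy as np

def rainy_bias_correction(obs_tb, sim_tb, rain_mask):
    """obs_tb, sim_tb: (n_channels, n_pixels) brightness temperatures (K);
    rain_mask: boolean (n_pixels,) flagging precipitation-affected pixels.
    The per-channel mean innovation over rainy pixels is treated as an
    empirical bias and removed from the observations."""
    innov = obs_tb - sim_tb
    bias = innov[:, rain_mask].mean(axis=1, keepdims=True)
    return obs_tb - bias, bias.ravel()
```

In practice the bias statistics would be accumulated over many cycles rather than a single scene, and stratified by channel and surface type.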
The data impact is assessed by full data assimilation cycling experiments for a storm event that occurred in France in September 2010. Results show that the assimilation of MW precipitation observations from a satellite constellation mimicking GPM has a positive impact on the accumulated rain forecasts verified with surface radar rain estimates. The case-study on a convective storm also reveals that the accuracy of ensemble-based background error covariance is limited by sampling errors and model errors such as precipitation displacement and unresolved convective scale instability.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1914104P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1914104P"><span>Atmospheric river influence on the intensification of extreme hydrologic events over the Western United States under climate change scenarios</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pagán, Brianna; Ashfaq, Moetasim; Nayak, Munir; Rastogi, Deeksha; Margulis, Steven; Pal, Jeremy</p> <p>2017-04-01</p> <p>The Western United States shares limited snowmelt driven water supplies amongst millions of people, a multi-billion dollar agriculture industry and fragile ecosystems. The climatology of the region is highly variable, characterized by the frequent occurrences of both flood and drought conditions that cause increasingly challenging water management issues. Although variable year to year, up to half of California's total precipitation can be linked to atmospheric rivers (ARs). Most notably, ARs have been connected to nearly every major historic flood in the region, establishing its critical role to water supply. 
Numerous prior studies have considered potential climate change impacts over the Western United States and have generally concluded that warmer temperatures will reduce snowpack and shift runoff timing, causing reductions to water supply. Here we examine the role of ARs as one mechanism for explaining projected increases in flood and drought frequency and intensity under climate change scenarios, vital information for water resource managers. A hierarchical modeling framework to downscale 11 coupled global climate models from CMIP5 is used to form an ensemble of high-resolution dynamically downscaled regional climate model (via RegCM4) simulations at 18-km and hydrological (via VIC) simulations at a 4-km resolution for baseline (1965-2005) and future (2010-2050) periods under RCP 8.5. Each ensemble member's ability to capture observational AR climatology over the baseline period is evaluated. Baseline to future period changes to AR size, duration, seasonal timing, trajectory, magnitude and frequency are presented. These changes to the characterizations of ARs in the region are used to determine if any links exist to changes in snowpack volume, runoff timing, and the occurrence of daily and annual cumulative extreme precipitation and runoff events. Shifts in extreme AR frequency and magnitude are expected to increase flood risks, which without adequate multi-year reservoir storage solutions could further strain water supply resources.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2010AAS...21535403E','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2010AAS...21535403E"><span>Dense cores of GMAs in M51</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Egusa, Fumi; Koda, J.; Scoville, N. 
Z.</p> <p>2010-01-01</p> <p>We present sensitive and high angular resolution CO(1-0) data obtained by CARMA observations toward the nearby grand-design spiral galaxy M51. From the data, Giant Molecular Associations (GMAs) in a spiral arm are found to be resolved into a few small clumps with masses of 10^6 Msun and sizes of 40 pc. As the densities of these clumps are estimated to be larger than 300 cm^-3, we regard them as dense cores of GMAs. If GMAs were just a confusion of Giant Molecular Clouds (GMCs) whose typical mass and size are almost the same as those of the detected clumps, we should have detected tens or more per GMA given the sensitivity of our data. However, only one or two cores are found in each GMA, indicating that GMAs are not ensembles of GMCs but are discrete smooth structures. This result is consistent with the conclusion by Koda et al. (2009), who worked on lower resolution CO data of M51. In addition, we have found that these cores are located downstream of the spiral arm. This suggests that the core formation of GMAs and their evolution are triggered by the spiral structure, or density waves. 
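The quoted clump properties can be checked with a short back-of-the-envelope calculation: a uniform sphere of 10^6 Msun and 40 pc diameter has a mean H2 number density of roughly 400 cm^-3, consistent with the abstract's "larger than 300 cm^-3" estimate. The mean mass per H2 molecule (including a helium correction) is an assumption here:

```python
import math

M_SUN_G = 1.989e33      # solar mass [g]
PC_CM = 3.086e18        # parsec [cm]
MU_G = 2.8 * 1.67e-24   # assumed mean mass per H2 molecule incl. helium [g]

def mean_density_cm3(mass_msun, diameter_pc):
    """Mean H2 number density of a uniform spherical clump [cm^-3]."""
    r = 0.5 * diameter_pc * PC_CM
    volume = 4.0 / 3.0 * math.pi * r**3
    return mass_msun * M_SUN_G / (MU_G * volume)

n_h2 = mean_density_cm3(1e6, 40.0)  # clump mass and size quoted in the abstract
```

Treating the 40 pc "size" as a radius instead of a diameter would lower the estimate by a factor of 8, so the quoted lower bound is only consistent with the diameter reading.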
Our high resolution data reveal the inner structure of GMAs and its relationships to the global structure for the first time in grand-design spiral galaxies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2015EGUGA..1713173O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2015EGUGA..1713173O"><span>Wind-Farm Forecasting Using the HARMONIE Weather Forecast Model and Bayes Model Averaging for Bias Removal.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>O'Brien, Enda; McKinstry, Alastair; Ralph, Adam</p> <p>2015-04-01</p> <p>Building on previous work presented at EGU 2013 (http://www.sciencedirect.com/science/article/pii/S1876610213016068 ), more results are available now from a different wind-farm in complex terrain in southwest Ireland. The basic approach is to interpolate wind-speed forecasts from an operational weather forecast model (i.e., HARMONIE in the case of Ireland) to the precise location of each wind-turbine, and then use Bayes Model Averaging (BMA; with statistical information collected from a prior training-period of e.g., 25 days) to remove systematic biases. Bias-corrected wind-speed forecasts (and associated power-generation forecasts) are then provided twice daily (at 5am and 5pm) out to 30 hours, with each forecast validation fed back to BMA for future learning. 30-hr forecasts from the operational Met Éireann HARMONIE model at 2.5km resolution have been validated against turbine SCADA observations since Jan. 2014. An extra high-resolution (0.5km grid-spacing) HARMONIE configuration has been run since Nov. 2014 as an extra member of the forecast "ensemble". A new version of HARMONIE with extra filters designed to stabilize high-resolution configurations has been run since Jan. 2015. 
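A simplified stand-in for the training-period calibration described above is sketched below: a least-squares linear correction per ensemble member fitted over the training window, combined with inverse-MSE member weights as a crude proxy for the EM-fitted BMA weights. All function names and choices here are illustrative, not the operational scheme:

```python
import numpy as np

def train_bias_correction(fcst, obs):
    """Least-squares linear correction obs ≈ a + b*fcst from a training window."""
    A = np.vstack([np.ones_like(fcst), fcst]).T
    (a, b), *_ = np.linalg.lstsq(A, obs, rcond=None)
    return a, b

def bma_weights(member_errors):
    """Inverse-MSE weights over ensemble members (a simple stand-in for the
    EM-fitted BMA weights), normalized to sum to one."""
    inv = 1.0 / np.maximum(member_errors, 1e-12)
    return inv / inv.sum()

def combined_forecast(member_fcsts, corrections, weights):
    """Bias-correct each member, then form the weighted ensemble forecast."""
    corrected = np.array([a + b * f for (a, b), f in zip(corrections, member_fcsts)])
    return weights @ corrected
```

Full BMA would additionally fit a predictive variance per member and return a probability distribution rather than a point forecast; the weighting above captures only the mean.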
Measures of forecast skill and forecast errors will be provided, and the contributions made by the various physical and computational enhancements to HARMONIE will be quantified.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/16342274','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/16342274"><span>Evaluating the quality of NMR structures by local density of protons.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ban, Yih-En Andrew; Rudolph, Johannes; Zhou, Pei; Edelsbrunner, Herbert</p> <p>2006-03-01</p> <p>Evaluating the quality of experimentally determined protein structural models is an essential step toward identifying potential errors and guiding further structural refinement. Herein, we report the use of proton local density as a sensitive measure to assess the quality of nuclear magnetic resonance (NMR) structures. Using 256 high-resolution crystal structures with protons added and optimized, we show that the local density of different proton types display distinct distributions. These distributions can be characterized by statistical moments and are used to establish local density Z-scores for evaluating both global and local packing for individual protons. Analysis of 546 crystal structures at various resolutions shows that the local density Z-scores increase as the structural resolution decreases and correlate well with the ClashScore (Word et al. J Mol Biol 1999;285(4):1711-1733) generated by all atom contact analysis. Local density Z-scores for NMR structures exhibit a significantly wider range of values than for X-ray structures and demonstrate a combination of potentially problematic inflation and compression. Water-refined NMR structures show improved packing quality. 
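The Z-score construction described above reduces, for each proton type, to standardizing a proton's local density against reference moments; in the paper those moments come from 256 high-resolution crystal structures, whereas the reference dictionary below is a made-up stand-in:

```python
import numpy as np

def local_density_zscores(densities, proton_types, reference_stats):
    """Z-score of each proton's local density against the mean/std of its
    proton-type distribution. reference_stats maps type -> (mean, std),
    derived in the paper from crystal-structure statistics."""
    z = np.empty(len(densities))
    for i, (d, t) in enumerate(zip(densities, proton_types)):
        mu, sigma = reference_stats[t]
        z[i] = (d - mu) / sigma
    return z
```

Large positive Z-scores would flag over-packed (compressed) regions and large negative ones under-packed (inflated) regions, mirroring the global/local quality checks in the abstract.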
Our analysis of a high-quality structural ensemble of ubiquitin refined against order parameters shows proton density distributions that correlate nearly perfectly with our standards derived from crystal structures, further validating our approach. We present an automated analysis and visualization tool for proton packing to evaluate the quality of NMR structures.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25695255','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25695255"><span>Using high-resolution future climate scenarios to forecast Bromus tectorum invasion in Rocky Mountain National Park.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>West, Amanda M; Kumar, Sunil; Wakie, Tewodros; Brown, Cynthia S; Stohlgren, Thomas J; Laituri, Melinda; Bromberg, Jim</p> <p>2015-01-01</p> <p>National Parks are hallmarks of ecosystem preservation in the United States. The introduction of alien invasive plant species threatens protection of these areas. Bromus tectorum L. (commonly called downy brome or cheatgrass), which is found in Rocky Mountain National Park (hereafter, the Park), Colorado, USA, has been implicated in early spring competition with native grasses, decreased soil nitrogen, altered nutrient and hydrologic regimes, and increased fire intensity. We estimated the potential distribution of B. tectorum in the Park based on occurrence records (n = 211), current and future climate, and distance to roads and trails. An ensemble of six future climate scenarios indicated the habitable area of B. tectorum may increase from approximately 5.5% currently to 20.4% of the Park by the year 2050. Using ordination methods we evaluated the climatic space occupied by B. tectorum in the Park and how this space may shift given future climate change. 
Modeling climate change at a small extent (1,076 km2) and at a fine spatial resolution (90 m) is a novel approach in species distribution modeling, and may provide inference for microclimates not captured in coarse-scale models. Maps from our models serve as high-resolution hypotheses that can be improved over time by land managers to set priorities for surveys and removal of invasive species such as B. tectorum.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017APS..MARR13007G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017APS..MARR13007G"><span>Coherently coupling distinct spin ensembles through a high critical temperature superconducting resonator</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Ghirri, Alberto; Bonizzoni, Claudio; Troiani, Filippo; Affronte, Marco</p> <p></p> <p>The problem of coupling remote ensembles of two-level systems through cavity photons is revisited by using molecular spin centers and a high critical temperature superconducting coplanar resonator. By using PyBTM organic radicals, we achieved the strong coupling regime with values of the cooperativity reaching 4300 at 2 K. We show that up to three distinct spin ensembles are simultaneously coupled through the resonator mode. The ensembles are made physically distinguishable by chemically varying the g-factor and by exploiting the inhomogeneities of the applied magnetic field.
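The multiple-anticrossing picture described above can be illustrated with a minimal coupled-oscillator model. This is a single-excitation, Tavis-Cummings-type sketch with made-up numbers, not the authors' input-output simulation: one resonator mode linearly coupled to several spin ensembles, with polariton frequencies given by the eigenvalues of a small symmetric matrix.

```python
import numpy as np

def polariton_frequencies(omega_c, spin_freqs, couplings):
    """Eigenfrequencies of one resonator mode linearly coupled to several
    spin ensembles (single excitation, rotating-wave approximation)."""
    n = len(spin_freqs)
    h = np.zeros((n + 1, n + 1))
    h[0, 0] = omega_c                      # bare resonator frequency
    for i, (w, g) in enumerate(zip(spin_freqs, couplings), start=1):
        h[0, i] = h[i, 0] = g              # collective coupling strength
        h[i, i] = w                        # ensemble transition frequency
    return np.sort(np.linalg.eigvalsh(h))

# One ensemble tuned onto resonance: the two polaritons split by 2g
freqs = polariton_frequencies(7.0, [7.0], [0.05])     # GHz, illustrative
splitting = freqs[-1] - freqs[0]
# Three distinguishable ensembles (different g-factors/fields) -> 4 modes
freqs3 = polariton_frequencies(7.0, [6.9, 7.0, 7.1], [0.05, 0.05, 0.05])
```

On resonance the splitting equals twice the collective coupling, and each additional detuned ensemble adds one more branch to the transmission spectrum, which is the anticrossing pattern the abstract describes.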
The coherent mixing of the spin and field modes is demonstrated by the observed multiple anticrossings, along with simulations performed within the input-output formalism, and quantified by suitable entropic measures.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_23 --> <div id="page_24" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="461"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28683393','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28683393"><span>Flavivirus structural heterogeneity: implications for cell entry.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Rey, Félix A; Stiasny, Karin; Heinz, Franz X</p> <p>2017-06-01</p> <p>The explosive spread of Zika virus is the most
recent example of the threat posed to human health by flaviviruses. High-resolution structures are available for several of these arthropod-borne viruses, revealing alternative icosahedral organizations of immature and mature virions. Incomplete proteolytic maturation, however, results in a cloud of highly heterogeneous mosaic particles. This heterogeneity is further expanded by the dynamic behavior of the viral envelope glycoproteins. The ensemble of heterogeneous and dynamic infectious particles circulating in infected hosts offers a range of alternative possible receptor interaction sites at their surfaces, potentially contributing to the broad flavivirus host-range and variation in tissue tropism. The potential synergy between heterogeneous particles in the circulating cloud thus provides an additional dimension for understanding the unanticipated properties of Zika virus in its recent outbreaks. Copyright © 2017 Elsevier B.V. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017JESS..126...57C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017JESS..126...57C"><span>The sensitivity to the microphysical schemes on the skill of forecasting the track and intensity of tropical cyclones using WRF-ARW model</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Choudhury, Devanil; Das, Someshwar</p> <p>2017-06-01</p> <p>The Advanced Research WRF (ARW) model is used to simulate Very Severe Cyclonic Storms (VSCS) Hudhud (7-13 October, 2014), Phailin (8-14 October, 2013) and Lehar (24-29 November, 2013) to investigate the sensitivity of track and intensity forecast skill to microphysical schemes for high-resolution (9 and 3 km) 120-hr model integrations.
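Track forecast skill of the kind evaluated above is conventionally quantified as the great-circle separation between forecast and best-track storm positions at matched lead times. The sketch below is generic, not the verification code used in the study, and the positions are invented.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) between two latitude/longitude points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def mean_track_error(forecast, best_track):
    """Mean position error (km) over matched forecast lead times."""
    errors = [haversine_km(*f, *o) for f, o in zip(forecast, best_track)]
    return sum(errors) / len(errors)

# Invented 24/48/72-hr positions (lat, lon) for a Bay of Bengal storm
forecast = [(13.0, 88.0), (14.5, 86.5), (16.2, 84.9)]
best_track = [(13.1, 88.1), (14.8, 86.8), (16.0, 85.5)]
err = mean_track_error(forecast, best_track)
```

Comparing this error across microphysics schemes at each lead time is what underlies statements such as "the Goddard scheme provided the highest skill on the track."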
At cloud-resolving grid scales (<5 km), cloud microphysics plays an important role. The performance of the Goddard, Thompson, LIN and NSSL schemes is evaluated and compared with observations and a CONTROL forecast. This study investigates the sensitivity of track and intensity to microphysics with explicitly resolved convection. It shows that the Goddard one-moment bulk liquid-ice microphysical scheme provided the highest skill on the track, whereas for intensity both the Thompson and Goddard schemes perform better. The Thompson scheme shows the highest skill in intensity at 48, 96 and 120 hr, whereas at 24 and 72 hr the Goddard scheme does. It is known that higher-resolution domains produce better cyclone intensity and structure, and it is desirable to resolve convection with sufficiently high resolution and explicit cloud physics. This study suggests that the Goddard cumulus ensemble microphysical scheme is suitable for high-resolution ARW simulations of tropical cyclone track and intensity over the Bay of Bengal. Although the present study is based on only three cyclones, it could be useful for planning real-time predictions using the ARW modelling system.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20110004881','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20110004881"><span>The Super Tuesday Outbreak: Forecast Sensitivities to Single-Moment Microphysics Schemes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Molthan, Andrew L.; Case, Jonathan L.; Dembek, Scott R.; Jedlovec, Gary J.; Lapenta, William M.</p> <p>2008-01-01</p> <p>Forecast precipitation and radar characteristics are used by operational centers to guide the issuance of advisory products.
As operational numerical weather prediction is performed at increasingly fine spatial resolution, convective precipitation traditionally represented by sub-grid scale parameterization schemes is now being determined explicitly through single- or multi-moment bulk water microphysics routines. Gains in forecasting skill are expected through improved simulation of clouds and their microphysical processes. High resolution model grids and advanced parameterizations are now available through steady increases in computer resources. As with any parameterization, their reliability must be measured through performance metrics, with errors noted and targeted for improvement. Furthermore, the use of these schemes within an operational framework requires an understanding of limitations and an estimate of biases so that forecasters and model development teams can be aware of potential errors. The National Severe Storms Laboratory (NSSL) Spring Experiments have produced daily, high resolution forecasts used to evaluate forecast skill among an ensemble with varied physical parameterizations and data assimilation techniques. In this research, high resolution forecasts of the 5-6 February 2008 Super Tuesday Outbreak are replicated using the NSSL configuration in order to evaluate two components of simulated convection on a large domain: the sensitivity of quantitative precipitation forecasts to assumptions within a single-moment bulk water microphysics scheme, and whether these schemes accurately depict the reflectivity characteristics of well-simulated, organized, cold-frontal convection. As radar returns are sensitive to the amount of hydrometeor mass and the distribution of mass among variably sized targets, radar comparisons may guide potential improvements to a single-moment scheme.
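The link between hydrometeor mass distribution and radar returns noted above can be made concrete for a single-moment scheme, which typically assumes an exponential drop size distribution N(D) = N0·exp(-λD). The sixth-moment integral then gives the reflectivity factor Z = 720·N0/λ^7. The numbers below are illustrative only, not taken from the study.

```python
import math

def reflectivity_dbz(n0, lam):
    """Reflectivity factor for an exponential drop size distribution
    N(D) = n0*exp(-lam*D): Z = integral of N(D)*D^6 dD = 720*n0/lam^7
    (Gamma(7) = 720).  Units: n0 in mm^-1 m^-3, lam in mm^-1, Z in mm^6 m^-3."""
    z = 720.0 * n0 / lam ** 7
    return 10.0 * math.log10(z)  # convert to dBZ

# Marshall-Palmer-like intercept (illustrative): a smaller slope parameter
# shifts mass into larger drops, which raises the sixth-moment reflectivity
n0 = 8.0e3                       # mm^-1 m^-3
dbz_small_drops = reflectivity_dbz(n0, 3.0)
dbz_large_drops = reflectivity_dbz(n0, 1.5)
```

Because Z weights the distribution by D^6, small shifts of mass toward larger particles change simulated reflectivity strongly, which is why radar comparisons are a sensitive diagnostic of a scheme's size-distribution assumptions.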
In addition, object-based verification metrics are evaluated for their utility in gauging model performance and QPF variability.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1992ExFl...13..386S','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1992ExFl...13..386S"><span>Experimental criteria for the determination of fractal parameters of premixed turbulent flames</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Shepherd, I. G.; Cheng, Robert K.; Talbot, L.</p> <p>1992-10-01</p> <p>The influence of spatial resolution, digitization noise, the number of records used for averaging, and the method of analysis on the determination of the fractal parameters of a high Damköhler number, methane/air, premixed, turbulent stagnation-point flame is investigated in this paper. The flow exit velocity was 5 m/s and the turbulent Reynolds number was 70 based on an integral scale of 3 mm and a turbulent intensity of 7%. The light source was a copper vapor laser that delivered 20 ns, 5 mJ pulses at 4 kHz, and the tomographic cross-sections of the flame were recorded by a high speed movie camera. The spatial resolution of the images is 155 × 121 μm/pixel with a field of view of 50 × 65 mm. The stepping caliper technique for obtaining the fractal parameters is found to give the clearest indication of the cutoffs and the effects of noise. It is necessary to ensemble average the results from more than 25 statistically independent images to sufficiently reduce the scatter in the fractal parameters. The effects of reduced spatial resolution on fractal plots are estimated by artificial degradation of the resolution of the digitized flame boundaries.
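The stepping-caliper technique referred to above can be sketched as follows. This is a generic illustration on a synthetic Koch curve (whose fractal dimension is known to be log 4 / log 3 ≈ 1.26), not the authors' analysis code: walk a divider of fixed opening r along the digitized boundary, count the steps N(r), and estimate the dimension from the slope of log N versus log r.

```python
import math

def caliper_steps(points, r):
    """Walk a divider of opening r along a digitized boundary, counting steps."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    steps, anchor, i = 0, points[0], 0
    while i < len(points) - 1:
        j = i + 1
        while j < len(points) and dist(anchor, points[j]) < r:
            j += 1                       # advance until one opening away
        if j == len(points):
            break                        # drop the final partial step
        anchor, i = points[j], j
        steps += 1
    return steps

def koch(points, depth):
    """Refine a polyline into a Koch curve (dimension log4/log3 ~ 1.26)."""
    if depth == 0:
        return points
    out = []
    for (x1, y1), (x2, y2) in zip(points, points[1:]):
        dx, dy = (x2 - x1) / 3.0, (y2 - y1) / 3.0
        mx, my = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        apex = (mx - dy * math.sqrt(3) / 2.0, my + dx * math.sqrt(3) / 2.0)
        out += [(x1, y1), (x1 + dx, y1 + dy), apex, (x1 + 2 * dx, y1 + 2 * dy)]
    out.append(points[-1])
    return koch(out, depth - 1)

curve = koch([(0.0, 0.0), (1.0, 0.0)], 5)
rulers = [0.01, 0.02, 0.04, 0.08]
counts = [caliper_steps(curve, r) for r in rulers]
# N(r) ~ r^(-D): estimate D from the endpoints of the log-log relation
D = math.log(counts[0] / counts[-1]) / math.log(rulers[-1] / rulers[0])
```

The inner and outer cutoffs mentioned in the abstract appear in such plots as departures from the straight log-log line once r approaches the pixel scale or the largest flame wrinkles.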
The effect of pixel resolution, an apparent increase in flame length below the inner scale rolloff, appears in the fractal plots when the measurement scale is less than approximately twice the pixel resolution. Although a clearer determination of fractal parameters is obtained by local averaging of the flame boundaries, which removes digitization noise, at low spatial resolution this technique can reduce the fractal dimension. The degree of fractal isotropy of the flame surface can have a significant effect on the estimation of the flame surface area and hence burning rate from two-dimensional images. To estimate this isotropy a determination of the outer cutoff is required and three-dimensional measurements are probably also necessary.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2829965','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=2829965"><span>Single-Molecule and Superresolution Imaging in Live Bacteria Cells</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Biteen, Julie S.; Moerner, W.E.</p> <p>2010-01-01</p> <p>Single-molecule imaging enables biophysical measurements devoid of ensemble averaging, gives enhanced spatial resolution beyond the diffraction limit, and permits superresolution reconstructions. Here, single-molecule and superresolution imaging are applied to the study of proteins in live Caulobacter crescentus cells to illustrate the power of these methods in bacterial imaging.
Based on these techniques, the diffusion coefficient and dynamics of the histidine protein kinase PleC, the localization behavior of the polar protein PopZ, and the treadmilling behavior and protein superstructure of the structural protein MreB are investigated with sub-40-nm spatial resolution, all in live cells. PMID:20300204</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016ExA....41..351C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016ExA....41..351C"><span>High precision radial velocities with GIANO spectra</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Carleo, I.; Sanna, N.; Gratton, R.; Benatti, S.; Bonavita, M.; Oliva, E.; Origlia, L.; Desidera, S.; Claudi, R.; Sissa, E.</p> <p>2016-06-01</p> <p>Radial velocities (RV) measured from near-infrared (NIR) spectra are a potentially excellent tool to search for extrasolar planets around cool or active stars. High resolution infrared (IR) spectrographs now available are reaching the high precision of visible instruments, with a constant improvement over time. GIANO is an infrared echelle spectrograph at the Telescopio Nazionale Galileo (TNG) and it is a powerful tool to provide high resolution spectra for accurate RV measurements of exoplanets and for chemical and dynamical studies of stellar or extragalactic objects. No other high spectral resolution IR instrument has GIANO's capability to cover the entire NIR wavelength range (0.95-2.45 μm) in a single exposure. In this paper we describe the ensemble of procedures that we have developed to measure high precision RVs on GIANO spectra acquired during the Science Verification (SV) run, using the telluric lines as wavelength reference. We used the Cross Correlation Function (CCF) method to determine the velocity for both the star and the telluric lines. 
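The CCF method mentioned above can be sketched generically. The spectrum and mask below are synthetic, not the GIANO pipeline: a binary line mask is shifted over a grid of trial Doppler velocities and the spectrum flux falling inside the mask windows is summed; for absorption lines, the CCF minimum tracks the radial velocity.

```python
import numpy as np

C_KMS = 299792.458  # speed of light, km/s

def ccf(wave, flux, mask_centers, velocities, mask_width_kms=2.0):
    """Sum spectrum flux inside binary mask windows for each trial velocity."""
    profile = []
    for v in velocities:
        shifted = mask_centers * (1.0 + v / C_KMS)   # Doppler-shift the mask
        total = 0.0
        for c in shifted:
            half = c * mask_width_kms / C_KMS        # window half-width (A)
            total += flux[(wave > c - half) & (wave < c + half)].sum()
        profile.append(total)
    return np.array(profile)

# Synthetic near-IR spectrum: three absorption lines shifted by +5 km/s
wave = np.linspace(15000.0, 15100.0, 20000)          # angstroms
lines = np.array([15010.0, 15040.0, 15070.0])        # rest-frame mask
true_rv = 5.0                                        # km/s
flux = np.ones_like(wave)
for c in lines * (1.0 + true_rv / C_KMS):
    flux -= 0.5 * np.exp(-0.5 * ((wave - c) / 0.05) ** 2)

velocities = np.arange(-20.0, 20.5, 0.5)
profile = ccf(wave, flux, lines, velocities)
rv_est = velocities[np.argmin(profile)]  # absorption: the CCF dip tracks RV
```

In the study described above the same correlation is run twice, once with a stellar mask and once with a telluric mask, so that the telluric velocity provides the wavelength reference against which the stellar velocity is measured.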
For this purpose, we constructed two suitable digital masks that include about 2000 stellar lines, and a similar number of telluric lines. The method is applied to various targets with different spectral types, from K2V to M8 stars. We reached different precisions mainly depending on the H-magnitudes: for H ˜ 5 we obtain an rms scatter of ˜ 10 m s-1, while for H ˜ 9 the standard deviation increases to ˜ 50-80 m s-1. The corresponding theoretical error expectations are ˜ 4 m s-1 and 30 m s-1, respectively. Finally, we provide the RVs measured with our procedure for the targets observed during GIANO Science Verification.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008AGUFMGC53A0700P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008AGUFMGC53A0700P"><span>Northern African and Indian Precipitation at the end of the 21st Century: An Integrated Application of Regional and Global Climate Models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Patricola, C. M.; Cook, K. H.</p> <p>2008-12-01</p> <p>As greenhouse warming continues there is growing concern about the future climate of both Africa, which is highlighted by the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC AR4) as exceptionally vulnerable to climate change, and India. Precipitation projections from the AOGCMs of the IPCC AR4 are relatively consistent over India, but not over northern Africa. Inconsistencies can be related to the model's inability to capture climate processes correctly, deficiencies in physical parameterizations, different SST projections, or horizontal atmospheric resolution that is too coarse to realistically represent the tight gradients over West Africa and complex topography of East Africa and India.
Treatment of the land surface in a model may also be an issue over West Africa and India where land-surface/atmosphere interactions are very important. Here a method for simulating future climate is developed and applied using a high-resolution regional model in conjunction with output from a suite of AOGCMs, drawing on the advantages of both the regional and global modeling approaches. Integration by the regional model allows for finer horizontal resolution and regionally appropriate selection of parameterizations and land-surface model. AOGCM output is used to provide SST projections and lateral boundary conditions to constrain the regional model. The control simulation corresponds to 1981-2000, and eight future simulations representing 2081-2100 are conducted, each constrained by a different AOGCM and forced by CO2 concentrations from the SRES A2 emissions scenario. After model spin-up, May through October remain for investigation. Analysis is focused on climate change parameters important for impacts on agriculture and water resource management, and is presented in a format compatible with the IPCC reports. Precipitation projections simulated by the regional model are quite consistent, with 75% or more of ensemble members agreeing on the sign of the anomaly over vast regions of Africa and India. Over West Africa, where the regional model provides the greatest improvement over the AOGCMs in consistency of ensemble members, precipitation at the end of the century is generally projected to increase during May and decrease in June and July. Wetter conditions are simulated during August through October, with the exception of drying close to the Guinean Coast in August. In late summer, high rainfall rates are simulated more frequently in the future, indicating the possibility for increases in flooding events. The regional model's projections over India are in stark contrast to the AOGCMs', producing intense and generally widespread drying in August and September.
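The ensemble-consistency measure used above, the fraction of members agreeing on the sign of the projected anomaly at each grid point, can be sketched with made-up numbers:

```python
import numpy as np

def sign_agreement(anomalies):
    """Fraction of ensemble members agreeing with the majority sign of the
    projected anomaly at each grid point; anomalies has shape (member, y, x)."""
    n = anomalies.shape[0]
    pos = (anomalies > 0).sum(axis=0)
    return np.maximum(pos, n - pos) / n

# Invented precipitation anomalies from 8 members on a tiny 2x2 grid
anoms = np.array([
    [[1.0,  1.0], [ 1.0, -1.0]],
    [[2.0,  1.0], [-1.0, -2.0]],
    [[1.0, -1.0], [ 1.0, -1.0]],
    [[3.0,  2.0], [-2.0, -1.0]],
    [[1.0,  1.0], [ 2.0, -3.0]],
    [[2.0, -1.0], [-1.0, -1.0]],
    [[1.0,  2.0], [ 1.0, -2.0]],
    [[1.0,  1.0], [-1.0,  1.0]],
])
agree = sign_agreement(anoms)
robust = agree >= 0.75   # mask regions where >=75% of members agree
```

Thresholding the agreement field at 0.75 reproduces the "75% or more of ensemble members agree on the sign" criterion used to identify robust regions.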
The method developed here is promising but still young, and further developments are envisioned, including the addition of ocean, vegetation, and dust models. Ensembles that employ other regional models, sets of parameterizations, and emissions scenarios should also be explored.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018ClDy...50.4455B','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018ClDy...50.4455B"><span>Near-surface wind variability over the broader Adriatic region: insights from an ensemble of regional climate models</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Belušić, Andreina; Prtenjak, Maja Telišman; Güttler, Ivan; Ban, Nikolina; Leutwyler, David; Schär, Christoph</p> <p>2018-06-01</p> <p>Over the past few decades the horizontal resolution of regional climate models (RCMs) has steadily increased, leading to a better representation of small-scale topographic features and more details in simulating dynamical aspects, especially in coastal regions and over complex terrain. Due to its complex terrain, the broader Adriatic region represents a major challenge to state-of-the-art RCMs in simulating local wind systems realistically. The objective of this study is to identify the added value in near-surface wind due to the refined grid spacing of RCMs. For this purpose, we use a multi-model ensemble composed of CORDEX regional climate simulations at 0.11° and 0.44° grid spacing, forced by the ERA-Interim reanalysis, a COSMO convection-parameterizing simulation at 0.11° and a COSMO convection-resolving simulation at 0.02° grid spacing. Surface station observations from this region and satellite QuikSCAT data over the Adriatic Sea have been compared against daily output obtained from the available simulations.
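Comparing simulated and observed wind frequency distributions, as done in evaluations like the one above, is often summarized by a histogram-overlap measure such as the Perkins skill score. The Weibull-like toy wind samples below are invented, not data from the study.

```python
import numpy as np

def perkins_skill_score(obs, sim, bins):
    """Overlap of two normalized frequency distributions: the sum over bins
    of the smaller of the two probabilities (1 = identical, 0 = disjoint)."""
    h_obs, _ = np.histogram(obs, bins=bins)
    h_sim, _ = np.histogram(sim, bins=bins)
    return np.minimum(h_obs / h_obs.sum(), h_sim / h_sim.sum()).sum()

# Invented daily wind speeds (m/s): the "simulation" runs slightly too windy
rng = np.random.default_rng(2)
obs = rng.weibull(2.0, 1000) * 6.0
sim = rng.weibull(2.0, 1000) * 7.0
score = perkins_skill_score(obs, sim, np.arange(0.0, 30.0, 1.0))
```

A systematic overestimation of wind magnitude, of the kind reported for the Bora flow, shows up directly as a reduced overlap score even when the day-to-day correlation is good.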
Both day-to-day wind and its frequency distribution are examined. The results indicate that the 0.44° RCMs rarely outperform ERA-Interim reanalysis, while the performance of the high-resolution simulations surpasses that of ERA-Interim. We also find that refining the grid spacing to a few km is needed to properly capture the small-scale wind systems. Finally, we show that the simulations frequently yield the correct direction of local wind regimes, such as the Bora flow, but overestimate the associated wind magnitude. In addition, spectral analysis shows good agreement between measurements and simulations, indicating the correct temporal variability of the wind speed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1426757-evaluations-high-resolution-dynamically-downscaled-ensembles-over-contiguous-united-states-climate-dynamics','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1426757-evaluations-high-resolution-dynamically-downscaled-ensembles-over-contiguous-united-states-climate-dynamics"><span>Evaluations of high-resolution dynamically downscaled ensembles over the contiguous United States Climate Dynamics</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Zobel, Zachary; Wang, Jiali; Wuebbles, Donald J.</p> <p></p> <p>This study uses the Weather Research and Forecasting (WRF) model to evaluate the performance of six dynamically downscaled decadal historical simulations with 12-km resolution for a large domain (7200 x 6180 km) that covers most of North America. The initial and boundary conditions are from three global climate models (GCMs) and one reanalysis dataset.
The GCMs employed in this study are the Geophysical Fluid Dynamics Laboratory Earth System Model with Generalized Ocean Layer Dynamics component, the Community Climate System Model, version 4, and the Hadley Centre Global Environment Model, version 2-Earth System. The reanalysis data are from the National Centers for Environmental Prediction-U.S. Department of Energy Reanalysis II. We analyze the effects of bias correcting the lateral boundary conditions and the effects of spectral nudging. We evaluate the model performance for seven surface variables and four upper atmospheric variables based on their climatology and extremes for seven subregions across the United States. The results indicate that the simulation's performance depends on both location and the features/variables being tested. We find that the use of bias correction and/or nudging is beneficial in many situations, but employing these when running the RCM is not always an improvement when compared to the reference data. The use of an ensemble mean and median leads to a better performance in measuring the climatology, while it is significantly biased for the extremes, showing much larger differences from the reference data than individual GCM-driven simulations. This study provides a comprehensive evaluation of these historical model runs in order to make informed decisions when making future projections.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/pages/biblio/1188220-racoro-continental-boundary-layer-cloud-investigations-part-case-study-development-ensemble-large-scale-forcings','SCIGOV-DOEP'); return false;" href="https://www.osti.gov/pages/biblio/1188220-racoro-continental-boundary-layer-cloud-investigations-part-case-study-development-ensemble-large-scale-forcings"><span>RACORO continental boundary layer cloud investigations. 
Part I: Case study development and ensemble large-scale forcings</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/pages">DOE PAGES</a></p> <p>Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; ...</p> <p>2015-06-19</p> <p>Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60-hour case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in-situ measurements from the RACORO field campaign and remote-sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, κ, are derived from observations to be ~0.10, which are lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing datasets are derived from the ARM variational analysis, ECMWF forecasts, and a multi-scale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in 'trial' large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited at representing details of cloud onset, and tight gradients and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. 
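Fitting measured aerosol number size distributions with lognormal functions, as described above, can be sketched with a simple moment fit in ln(D) space. The data below are synthetic and the functions are illustrative, not the RACORO processing code.

```python
import math
import numpy as np

def lognormal_dndlnd(d, n_tot, d_g, sigma_g):
    """Lognormal number size distribution dN/dlnD."""
    s = math.log(sigma_g)
    return (n_tot / (math.sqrt(2.0 * math.pi) * s)
            * np.exp(-0.5 * ((np.log(d) - math.log(d_g)) / s) ** 2))

def fit_lognormal(d, dndlnd):
    """Moment fit in ln(D); assumes dN/dlnD sampled on a log-even grid."""
    dln = math.log(d[1] / d[0])                  # constant ln-spacing
    w = dndlnd / dndlnd.sum()
    mu = float((w * np.log(d)).sum())            # mean of ln(D)
    var = float((w * (np.log(d) - mu) ** 2).sum())
    return dndlnd.sum() * dln, math.exp(mu), math.exp(math.sqrt(var))

# Synthetic "measured" distribution: N = 1000, D_g = 100 nm, sigma_g = 1.6
d = np.logspace(1.0, 3.0, 60)                    # 10-1000 nm
measured = lognormal_dndlnd(d, 1000.0, 100.0, 1.6)
n_tot, d_g, sigma_g = fit_lognormal(d, measured)
```

The appeal of this representation is exactly the conciseness the abstract mentions: an entire measured spectrum collapses to three numbers (total number, geometric mean diameter, geometric standard deviation) that a model can ingest.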
The cases developed are available to the general modeling community for studying continental boundary clouds.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.5576N','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.5576N"><span>A two-model hydrologic ensemble prediction of hydrograph: case study from the upper Nysa Klodzka river basin (SW Poland)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Niedzielski, Tomasz; Mizinski, Bartlomiej</p> <p>2016-04-01</p> <p>The HydroProg system was developed in the framework of research project no. 2011/01/D/ST10/04171 of the National Science Centre of Poland and is steadily producing multimodel ensemble predictions of hydrograph in real time. 
Although there are six ensemble members available at present, the longest record of predictions and their statistics is available for two data-based models (uni- and multivariate autoregressive models). Thus, we consider 3-hour predictions of water levels, with lead times ranging from 15 to 180 minutes, computed every 15 minutes since August 2013 for the Nysa Klodzka basin (SW Poland) using the two approaches and their two-model ensemble. Since the launch of the HydroProg system there have been 12 high flow episodes, and the objective of this work is to present the performance of the two-model ensemble in the process of forecasting these events. For the sake of brevity, we limit our investigation to a single gauge located on the Nysa Klodzka river in the town of Klodzko, which is centrally located in the studied basin. We identified certain regular scenarios of how the models perform in predicting the high flows in Klodzko. At the initial phase of the high flow, well before the rising limb of the hydrograph, the two-model ensemble is found to provide the most skilful prognoses of water levels. However, while forecasting the rising limb of the hydrograph, either the two-model solution or the vector autoregressive model offers the best predictive performance. In addition, it is hypothesized that along with the development of the rising limb phase, the vector autoregression becomes the most skilful approach amongst the scrutinized ones.
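A two-model ensemble of data-based forecasts can be sketched in miniature as follows; the AR(1) fit and the equal-weight persistence member are illustrative stand-ins, not the actual HydroProg models or weighting:

```python
import numpy as np

def fit_ar1(series):
    """Least-squares AR(1) fit: x[t] ~ a * x[t-1] + b."""
    a, b = np.polyfit(series[:-1], series[1:], 1)
    return a, b

def forecast_ar1(series, a, b, steps):
    """Iterate the fitted AR(1) model 'steps' intervals ahead."""
    x, out = series[-1], []
    for _ in range(steps):
        x = a * x + b
        out.append(x)
    return np.array(out)

# Toy water-level record at 15-minute sampling (hypothetical values).
rng = np.random.default_rng(1)
levels = 120.0 + np.cumsum(rng.normal(0.0, 0.5, 500))

a, b = fit_ar1(levels)
ar_pred = forecast_ar1(levels, a, b, steps=12)   # 15 to 180 min ahead
persistence = np.full(12, levels[-1])            # stand-in second member
ensemble = 0.5 * (ar_pred + persistence)         # equal-weight two-model mean
```

The point of the exercise mirrors the abstract's finding: the combined forecast need not beat both members at every phase of an event, so member skill should be assessed phase by phase.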
Our simple two-model exercise confirms that multimodel hydrologic ensemble predictions cannot be treated as universal solutions suitable for forecasting the entire high flow event, but their superior performance may hold only for certain phases of a high flow.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24672402','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24672402"><span>Constructing better classifier ensemble based on weighted accuracy and diversity measure.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Zeng, Xiaodong; Wong, Derek F; Chao, Lidia S</p> <p>2014-01-01</p> <p>A weighted accuracy and diversity (WAD) method is presented, a novel measure used to evaluate the quality of the classifier ensemble, assisting in the ensemble selection task. The proposed measure is motivated by a commonly accepted hypothesis; that is, a robust classifier ensemble should not only be accurate but also different from every other member. In fact, accuracy and diversity are mutual restraint factors; that is, an ensemble with high accuracy may have low diversity, and an overly diverse ensemble may negatively affect accuracy. This study proposes a method to find the balance between accuracy and diversity that enhances the predictive ability of an ensemble for unknown data. The quality assessment for an ensemble is performed such that the final score is achieved by computing the harmonic mean of accuracy and diversity, where two weight parameters are used to balance them. The measure is compared to two representative measures, Kappa-Error and GenDiv, and two threshold measures that consider only accuracy or diversity, with two heuristic search algorithms, genetic algorithm, and forward hill-climbing algorithm, in ensemble selection tasks performed on 15 UCI benchmark datasets. 
The empirical results demonstrate that the WAD measure is superior to others in most cases.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3925515','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3925515"><span>Constructing Better Classifier Ensemble Based on Weighted Accuracy and Diversity Measure</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Chao, Lidia S.</p> <p>2014-01-01</p> <p>A weighted accuracy and diversity (WAD) method is presented, a novel measure used to evaluate the quality of the classifier ensemble, assisting in the ensemble selection task. The proposed measure is motivated by a commonly accepted hypothesis; that is, a robust classifier ensemble should not only be accurate but also different from every other member. In fact, accuracy and diversity are mutual restraint factors; that is, an ensemble with high accuracy may have low diversity, and an overly diverse ensemble may negatively affect accuracy. This study proposes a method to find the balance between accuracy and diversity that enhances the predictive ability of an ensemble for unknown data. The quality assessment for an ensemble is performed such that the final score is achieved by computing the harmonic mean of accuracy and diversity, where two weight parameters are used to balance them. The measure is compared to two representative measures, Kappa-Error and GenDiv, and two threshold measures that consider only accuracy or diversity, with two heuristic search algorithms, genetic algorithm, and forward hill-climbing algorithm, in ensemble selection tasks performed on 15 UCI benchmark datasets. The empirical results demonstrate that the WAD measure is superior to others in most cases. 
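The score described above, a harmonic mean of ensemble accuracy and diversity balanced by two weight parameters, can be sketched as follows; the weighted harmonic-mean form is an assumption based on the abstract's wording, and the candidate accuracy/diversity values are hypothetical:

```python
def wad_score(accuracy, diversity, w_acc=0.5, w_div=0.5):
    """Weighted harmonic mean of ensemble accuracy and diversity.

    Equal weights reduce to the ordinary harmonic mean; shifting the
    weights trades accuracy against diversity, as described in the text.
    """
    if accuracy == 0.0 or diversity == 0.0:
        return 0.0
    return (w_acc + w_div) / (w_acc / accuracy + w_div / diversity)

# Hypothetical candidate ensembles: (accuracy, diversity) pairs.
candidates = {"A": (0.90, 0.20), "B": (0.80, 0.45), "C": (0.60, 0.60)}
scores = {k: wad_score(acc, div) for k, (acc, div) in candidates.items()}
best = max(scores, key=scores.get)
```

Because the harmonic mean punishes imbalance, the very accurate but homogeneous candidate "A" scores below the balanced candidate, which is exactly the selection behavior the measure is designed to produce.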
PMID:24672402</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFMGC13H0761G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFMGC13H0761G"><span>Persistent Cold Air Outbreaks over North America Under Climate Warming</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gao, Y.; Leung, L. R.; Lu, J.</p> <p>2014-12-01</p> <p>This study evaluates the change of cold air outbreaks (CAO) over North America using the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model ensemble of global climate simulations as well as regional high-resolution climate simulations. In the future, while a robust decrease in CAO duration dominates over most of North America, the decrease over the northwestern U.S. was found to have a much smaller magnitude than in the surrounding regions. We found a statistically significant increase in sea level pressure over the Gulf of Alaska, leading to the advection of cold air to the northwestern U.S. By shifting the probability distribution of present temperatures towards future warmer conditions, we identified that changes in the large-scale circulation contribute about 50% of the enhanced sea level pressure. Using the high-resolution regional climate model results, we found that increases in existing snowpack could potentially trigger an increase in CAO in the near future over the southwestern U.S. and the Rocky Mountains through surface albedo effects.
By the end of this century, the top 5 most extreme historical CAO events may still occur, and wind chill warnings will continue to have societal impacts over North America, in particular over the northwestern United States.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2001AGUFM.A41B0059C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2001AGUFM.A41B0059C"><span>Application of ensemble back trajectory and factor analysis methods to aerosol data from Fort Meade, MD: Implications for sources</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Chen, L. A.; Doddridge, B. G.; Dickerson, R. R.</p> <p>2001-12-01</p> <p>As the primary field experiment for the Maryland Aerosol Research and CHaracterization (MARCH-Atlantic) study, chemically speciated PM2.5 has been sampled at Fort Meade (FME, 39.10° N 76.74° W) since July 1999. FME is suburban, located in the middle of the bustling Baltimore-Washington corridor, which is generally downwind of the highly industrialized Midwest. Due to this unique sampling location, the PM2.5 observed at FME is expected to arise from both local and regional sources, with relative contributions varying temporally. This variation, believed to be largely controlled by the meteorology, influences day-to-day or seasonal profiles of PM2.5 mass concentration and chemical composition. Air parcel back trajectories, which describe the path of air parcels traveling backward in time from the site (receptor), reflect changes in the synoptic meteorological conditions. In this paper, an ensemble back trajectory method is employed to study the meteorology associated with each high/low PM2.5 episode in different seasons.
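The core of an ensemble back-trajectory analysis is accumulating the time parcels spend in each grid cell before arriving at the receptor. A minimal residence-time sketch on a 1° x 1° grid, using toy trajectories rather than MARCH-Atlantic data:

```python
import numpy as np

# Toy hourly back-trajectory positions for an ensemble of 20 parcels
# arriving at the receptor over 72 hours (hypothetical random walks).
rng = np.random.default_rng(2)
receptor_lat, receptor_lon = 39.10, -76.74       # Fort Meade, MD
lats = receptor_lat + np.cumsum(rng.normal(0.1, 0.3, (20, 72)), axis=1)
lons = receptor_lon + np.cumsum(rng.normal(-0.2, 0.4, (20, 72)), axis=1)

# Residence time per 1-degree cell over the eastern US: each hourly
# trajectory point contributes one hour to the cell it falls in.
lat_edges = np.arange(25, 51)                    # 25 one-degree latitude bins
lon_edges = np.arange(-95, -64)                  # 30 one-degree longitude bins
residence_h, _, _ = np.histogram2d(lats.ravel(), lons.ravel(),
                                   bins=[lat_edges, lon_edges])
```

Cells with large accumulated hours are the areas most likely to dominate production of the PM2.5 components observed at the receptor; the study additionally resolves a 500 m vertical coordinate, which would simply add a third binning dimension.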
For every sampling day, the residence time of air parcels within the eastern US at a 1° x 1° x 500 m geographic resolution can be estimated in order to resolve areas likely dominating the production of various PM2.5 components. Local sources are found to be more dominant in winter than in summer. "Factor analysis" is based on a mass balance approach, providing useful insights into air pollution data. Here, a newly developed factor analysis model (UNMIX) is used to extract source profiles and contributions from the speciated PM2.5 data. Combining the model results with the ensemble back trajectory method improves the understanding of the source regions and helps partition the contributions from local or more distant areas. http://www.meto.umd.edu/~bruce/MARCH-Atl.html</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1006006-denaturant-dependent-conformational-changes-beta-trefoil-protein-global-residue-specific-aspects-equilibrium-denaturation-process','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1006006-denaturant-dependent-conformational-changes-beta-trefoil-protein-global-residue-specific-aspects-equilibrium-denaturation-process"><span>Denaturant-Dependent Conformational Changes in a β-Trefoil Protein: Global and Residue-Specific Aspects of an Equilibrium Denaturation Process</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Latypov, Ramil F.; Liu, Dingjiang; Jacob, Jaby</p> <p>2010-01-12</p> <p>Conformational properties of the folded and unfolded ensembles of human interleukin-1 receptor antagonist (IL-1ra) are strongly denaturant-dependent as evidenced by high-resolution two-dimensional nuclear magnetic resonance (NMR), limited proteolysis, and small-angle X-ray scattering (SAXS).
The folded ensemble was characterized in detail in the presence of different urea concentrations by 1H-15N HSQC NMR. The β-trefoil fold characteristic of native IL-1ra was preserved until the unfolding transition region beginning at 4 M urea. At the same time, a subset of native resonances disappeared gradually starting at low denaturant concentrations, indicating noncooperative changes in the folded state. Additional evidence of structural perturbations came from the chemical shift analysis, nonuniform and bell-shaped peak intensity profiles, and limited proteolysis. In particular, the following nearby regions of the tertiary structure became progressively destabilized with increasing urea concentrations: the β-hairpin interface of trefoils 1 and 2 and the H2a-H2 helical region. These regions underwent small-scale perturbations within the native baseline region in the absence of populated molten globule-like states. Similar regions were affected by elevated temperatures known to induce irreversible aggregation of IL-1ra. Further evidence of structural transitions invoking near-native conformations came from an optical spectroscopy analysis of its single-tryptophan variant W17A. The increase in the radius of gyration was associated with a single equilibrium unfolding transition in the case of two different denaturants, urea and guanidine hydrochloride (GuHCl). However, the compactness of urea- and GuHCl-unfolded molecules was comparable only at high denaturant concentrations and deviated under less denaturing conditions.
Our results identified the role of conformational flexibility in IL-1ra aggregation and shed light on the nature of structural transitions within the folded ensembles of other β-trefoil proteins, such as IL-1β and hFGF-1.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012EGUGA..14.7286T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012EGUGA..14.7286T"><span>Generalization of information-based concepts in forecast verification</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tödter, J.; Ahrens, B.</p> <p>2012-04-01</p> <p>This work deals with information-theoretical methods in probabilistic forecast verification. Recent findings concerning the Ignorance Score are briefly reviewed, then the generalization to continuous forecasts is shown. For ensemble forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are the prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up the natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS.
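For a binary event, the relationship stated above can be made concrete: IGN is the mean negative log-likelihood assigned to the observed outcomes, and BS is its second-order approximation. A minimal sketch with hypothetical forecast probabilities:

```python
import numpy as np

def brier_score(p, o):
    """Mean squared difference between forecast probability and outcome."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return np.mean((p - o) ** 2)

def ignorance_score(p, o):
    """Mean negative log2-likelihood of the observed binary outcomes."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return -np.mean(o * np.log2(p) + (1 - o) * np.log2(1 - p))

# Hypothetical probability forecasts and binary observations.
forecasts = [0.9, 0.7, 0.2, 0.5]
outcomes = [1, 1, 0, 1]
bs = brier_score(forecasts, outcomes)
ign = ignorance_score(forecasts, outcomes)
```

Unlike the bounded BS, IGN diverges as a confident forecast (p near 0 or 1) is contradicted by the outcome, which is why the two scores can rank forecast systems differently.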
The applicability and usefulness of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components (reliability, resolution, and uncertainty) for ensemble-generated forecasts. This is also directly applicable to the more traditional CRPS.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5088496','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5088496"><span>ClustENM: ENM-Based Sampling of Essential Conformational Space at Full Atomic Resolution</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Kurkcuoglu, Zeynep; Bahar, Ivet; Doruker, Pemra</p> <p>2016-01-01</p> <p>Accurate sampling of conformational space and, in particular, the transitions between functional substates has been a challenge in molecular dynamics (MD) simulations of large biomolecular systems. We developed an Elastic Network Model (ENM)-based computational method, ClustENM, for sampling large conformational changes of biomolecules with various sizes and oligomerization states. ClustENM is an iterative method that combines ENM with energy minimization and clustering steps. It is an unbiased technique, which requires only an initial structure as input, and no information about the target conformation. To test the performance of ClustENM, we applied it to six biomolecular systems: adenylate kinase (AK), calmodulin, p38 MAP kinase, HIV-1 reverse transcriptase (RT), triosephosphate isomerase (TIM), and the 70S ribosomal complex. The generated ensembles of conformers determined at atomic resolution show good agreement with experimental data (979 structures resolved by X-ray and/or NMR) and encompass the subspaces covered in independent MD simulations for TIM, p38, and RT.
ClustENM emerges as a computationally efficient tool for characterizing the conformational space of large systems at atomic detail, in addition to generating a representative ensemble of conformers that can be advantageously used in simulating substrate/ligand-binding events. PMID:27494296</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_24 --> <div id="page_25" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="481"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1815261P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1815261P"><span>Impact of climate change on Precipitation and temperature under the RCP 8.5 and A1B scenarios in an Alpine Catchment (Alto-Genil Basin, southeast Spain).
A comparison of statistical downscaling methods</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pulido-Velazquez, David; Juan Collados-Lara, Antonio; Pardo-Iguzquiza, Eulogio; Jimeno-Saez, Patricia; Fernandez-Chacon, Francisca</p> <p>2016-04-01</p> <p>In order to design adaptive strategies to global change, we need to assess the future impact of climate change on water resources, which depends on the precipitation and temperature series in the systems. The objective of this work is to generate future climate series in the "Alto Genil" Basin (southeast Spain) for the period 2071-2100 by perturbing the historical series using different statistical methods. To this end, we use information from regional climate model (RCM) simulations available in two European projects: CORDEX (2013), with a spatial resolution of 12.5 km, and ENSEMBLES (2009), with a spatial resolution of 25 km. The historical climate series used for the period 1971-2000 have been obtained from the Spain02 project (2012), which has the same spatial resolution as the CORDEX project (both use the EURO-CORDEX grid). Two emission scenarios have been considered: the Representative Concentration Pathways (RCP) 8.5 emissions scenario, which is the most unfavorable scenario considered in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and the A1B emission scenario of the Fourth Assessment Report (AR4). We use the RCM simulations to create an ensemble of predictions, weighting their information according to their ability to reproduce the main statistics of the historical climatology. A multi-objective analysis has been performed to identify which models are better in terms of goodness of fit to the cited statistics of the historical series. The ensembles of the CORDEX and ENSEMBLES projects have finally been created with nine and four models, respectively.
These ensemble series have been used to assess the anomalies in mean and standard deviation (differences between the control and future RCM series). A "delta-change" method (Pulido-Velazquez et al., 2011) has been applied to define future series by modifying the historical climate series in accordance with the cited anomalies in mean and standard deviation. A comparison between results for the A1B and RCP8.5 scenarios has been performed. The reductions obtained for the mean rainfall with respect to the historical values are 24.2 % and 24.4 % respectively, and the increments in temperature are 46.3 % and 31.2 % respectively. A sensitivity analysis of the results to the statistical downscaling techniques employed has been performed. The following techniques have been explored: the perturbation ("delta-change") method; the regression method (a regression function relating the RCM output to the historical information is used to generate future climate series for the fixed period); quantile mapping (which seeks a transformation function relating the modeled variable to the observed variable so that the transformed variable matches the statistical distribution of the observations); and stochastic weather generators (SWG), which can be uni-site or multi-site (the latter accounting for the spatial correlation of climatic series). A comparative analysis of these techniques has been performed, identifying the advantages and disadvantages of each of them. Acknowledgments: This research has been partially supported by the GESINHIMPADAPT project (CGL2013-48424-C2-2-R) with Spanish MINECO funds.
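A delta-change perturbation using anomalies in both mean and standard deviation can be sketched as follows; the additive-mean, multiplicative-variance form is a common variant and is assumed here, not necessarily the exact formulation of Pulido-Velazquez et al. (2011), and the series are hypothetical:

```python
import numpy as np

def delta_change(historical, control, future):
    """Perturb a historical series so that its mean shifts by the
    control-to-future mean anomaly and its standard deviation scales
    by the control-to-future ratio of standard deviations."""
    historical = np.asarray(historical, float)
    mean_shift = np.mean(future) - np.mean(control)
    std_ratio = np.std(future) / np.std(control)
    centered = historical - historical.mean()
    return centered * std_ratio + historical.mean() + mean_shift

# Hypothetical monthly temperature series (deg C), not Alto Genil data.
rng = np.random.default_rng(3)
hist = rng.normal(12.0, 4.0, 360)    # observed climatology
ctrl = rng.normal(12.2, 4.1, 360)    # RCM control run
fut = rng.normal(15.0, 4.8, 360)     # RCM future run

future_series = delta_change(hist, ctrl, fut)
```

By construction the perturbed series inherits the day-to-day sequencing of the observations while reproducing the RCM-diagnosed changes in mean and variability, which is the appeal of the method and also its main limitation (no change in temporal structure).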
We would also like to thank Spain02, ENSEMBLES and CORDEX projects for the data provided for this study.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24667482','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24667482"><span>NIMEFI: gene regulatory network inference using multiple ensemble feature importance algorithms.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ruyssinck, Joeri; Huynh-Thu, Vân Anh; Geurts, Pierre; Dhaene, Tom; Demeester, Piet; Saeys, Yvan</p> <p>2014-01-01</p> <p>One of the long-standing open challenges in computational systems biology is the topology inference of gene regulatory networks from high-throughput omics data. Recently, two community-wide efforts, DREAM4 and DREAM5, have been established to benchmark network inference techniques using gene expression measurements. In these challenges the overall top performer was the GENIE3 algorithm. This method decomposes the network inference task into separate regression problems for each gene in the network in which the expression values of a particular target gene are predicted using all other genes as possible predictors. Next, using tree-based ensemble methods, an importance measure for each predictor gene is calculated with respect to the target gene and a high feature importance is considered as putative evidence of a regulatory link existing between both genes. The contribution of this work is twofold. First, we generalize the regression decomposition strategy of GENIE3 to other feature importance methods. We compare the performance of support vector regression, the elastic net, random forest regression, symbolic regression and their ensemble variants in this setting to the original GENIE3 algorithm. 
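The GENIE3-style regression decomposition described above (predict each target gene from all others, then score each predictor by its feature importance) can be sketched with an off-the-shelf random forest; the toy expression matrix and scikit-learn settings are illustrative assumptions, not GENIE3's actual configuration:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def infer_network(expr, n_trees=50, seed=0):
    """One regression problem per target gene; scores[i, j] is the
    importance of gene i for predicting gene j (putative edge i -> j)."""
    n_genes = expr.shape[1]
    scores = np.zeros((n_genes, n_genes))
    for target in range(n_genes):
        predictors = np.delete(np.arange(n_genes), target)
        rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
        rf.fit(expr[:, predictors], expr[:, target])
        scores[predictors, target] = rf.feature_importances_
    return scores

# Toy data: 100 "samples" of 4 genes, with gene 1 driven by gene 0
# (hypothetical, purely for illustration).
rng = np.random.default_rng(4)
expr = rng.normal(size=(100, 4))
expr[:, 1] = 2.0 * expr[:, 0] + rng.normal(0.0, 0.1, 100)

scores = infer_network(expr)
```

NIMEFI's extension would repeat this with several importance methods on subsampled data and rank-average the resulting score matrices before thresholding edges.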
To create the ensemble variants, we propose a subsampling approach which allows us to cast any feature selection algorithm that produces a feature ranking into an ensemble feature importance algorithm. We demonstrate that the ensemble setting is key to the network inference task, as only ensemble variants achieve top performance. As a second contribution, we explore the effect of using rankwise averaged predictions of multiple ensemble algorithms as opposed to only one. We name this approach NIMEFI (Network Inference using Multiple Ensemble Feature Importance algorithms) and show that this approach outperforms all individual methods in general, although on a specific network a single method can perform better. An implementation of NIMEFI has been made publicly available.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29325871','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29325871"><span>Use of ultraviolet-fluorescence-based simulation in evaluation of personal protective equipment worn for first assessment and care of a patient with suspected high-consequence infectious disease.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Hall, S; Poller, B; Bailey, C; Gregory, S; Clark, R; Roberts, P; Tunbridge, A; Poran, V; Evans, C; Crook, B</p> <p>2018-06-01</p> <p>Variations currently exist across the UK in the choice of personal protective equipment (PPE) used by healthcare workers when caring for patients with suspected high-consequence infectious diseases (HCIDs). To test the protection afforded to healthcare workers by current PPE ensembles during assessment of a suspected HCID case, and to provide an evidence base to justify proposal of a unified PPE ensemble for healthcare workers across the UK.
One 'basic level' (enhanced precautions) PPE ensemble and five 'suspected case' PPE ensembles were evaluated in volunteer trials using 'Violet', an ultraviolet-fluorescence-based simulation exercise to visualize exposure/contamination events. Contamination was photographed and mapped. There were 147 post-simulation and 31 post-doffing contamination events, from a maximum of 980, when evaluating the basic level of PPE. Therefore, this PPE ensemble did not afford adequate protection, primarily due to direct contamination of exposed areas of the skin. For the five suspected case ensembles, 1584 post-simulation contamination events were recorded, from a maximum of 5110. Twelve post-doffing contamination events were also observed (face, two events; neck, one event; forearm, one event; lower legs, eight events). All suspected case PPE ensembles either had post-doffing contamination events or other significant disadvantages to their use. This identified the need to design a unified PPE ensemble and doffing procedure, incorporating the most protective PPE considered for each body area. This work has been presented to, and reviewed by, key stakeholders to decide on a proposed unified ensemble, subject to further evaluation. Crown Copyright © 2018. Published by Elsevier Ltd.
All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..18.4655C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..18.4655C"><span>Ensemble reconstruction of severe low flow events in France since 1871</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; Devers, Alexandre; Graff, Benjamin</p> <p>2016-04-01</p> <p>This work presents a study of severe low flow events that occurred from 1871 onwards for a large number of near-natural catchments in France. It aims at assessing and comparing their characteristics to improve our knowledge on historical events and to provide a selection of benchmark events for climate change adaptation purposes. The historical depth of streamflow observations is generally limited to the last 50 years and therefore offers too small a sample of severe low flow events to properly explore the long-term evolution of their characteristics and associated impacts. In order to overcome this limit, this work takes advantage of a 140-year ensemble hydrometeorological dataset over France based on: (1) a probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France (Caillouet et al., 2015), and (2) a continuous hydrological modelling that uses the high-resolution meteorological reconstructions as forcings over the whole period. This dataset provides an ensemble of 25 equally plausible daily streamflow time series for a reference network of stations in France over the whole 1871-2012 period. Severe low flow events are identified based on a combination of a fixed threshold and a daily variable threshold. Each event is characterized by its deficit, duration and timing by applying the Sequent Peak Algorithm. 
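Low-flow event extraction against a threshold, with each event summarized by its timing, duration and deficit, can be sketched as follows; a single fixed threshold is used here for simplicity, whereas the study combines a fixed threshold with a daily variable one before applying the Sequent Peak Algorithm:

```python
import numpy as np

def low_flow_events(flow, threshold):
    """Contiguous runs below the threshold, each summarized as a tuple
    (start index, duration in days, deficit volume below the threshold)."""
    below = np.asarray(flow) < threshold
    events, start = [], None
    for i, b in enumerate(below):
        if b and start is None:
            start = i                                  # event begins
        elif not b and start is not None:
            deficit = float(np.sum(threshold - flow[start:i]))
            events.append((start, i - start, deficit)) # event ends
            start = None
    if start is not None:                              # event runs to end
        deficit = float(np.sum(threshold - flow[start:]))
        events.append((start, len(flow) - start, deficit))
    return events

# Toy daily streamflow series (hypothetical, m^3/s).
flow = np.array([5.0, 4.0, 2.0, 1.0, 3.0, 6.0, 2.5, 2.0, 4.5])
events = low_flow_events(flow, threshold=3.0)
```

Applied to each of the 25 ensemble members, such an extraction yields a distribution of deficit and duration per historical episode, which is how unrecorded events can be characterized probabilistically.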
The procedure is applied to the 25 simulated time series as well as to the observed time series in order to compare observed and simulated events over the recent period, and to characterize in a probabilistic way unrecorded historical events. The ensemble nature of the reconstruction raises specific issues: properly defining events across ensemble simulations, and adequately comparing the simulated characteristics with the observed ones. This study highlights not only the outstanding 1921 and 1940s events but also older, lesser-known ones that occurred during the last decade of the 19th century. For the first time, severe low flow events are characterized in a homogeneous way over 140 years on a large set of near-natural French catchments, allowing for detailed analyses of the effect of climate variability and anthropogenic climate change on low flow hydrology. Caillouet, L., Vidal, J.-P., Sauquet, E., and Graff, B. (2015) Probabilistic precipitation and temperature downscaling of the Twentieth Century Reanalysis over France, Clim. 
Past Discuss., 11, 4425-4482, doi:10.5194/cpd-11-4425-2015</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4756621','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=4756621"><span>Ensembl regulation resources</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Zerbino, Daniel R.; Johnson, Nathan; Juetteman, Thomas; Sheppard, Dan; Wilder, Steven P.; Lavidas, Ilias; Nuhn, Michael; Perry, Emily; Raffaillac-Desfosses, Quentin; Sobral, Daniel; Keefe, Damian; Gräf, Stefan; Ahmed, Ikhlak; Kinsella, Rhoda; Pritchard, Bethan; Brent, Simon; Amode, Ridwan; Parker, Anne; Trevanion, Steven; Birney, Ewan; Dunham, Ian; Flicek, Paul</p> <p>2016-01-01</p> <p>New experimental techniques in epigenomics allow researchers to assay a diversity of highly dynamic features such as histone marks, DNA modifications or chromatin structure. The study of their fluctuations should provide insights into gene expression regulation, cell differentiation and disease. The Ensembl project collects and maintains the Ensembl regulation data resources on epigenetic marks, transcription factor binding and DNA methylation for human and mouse, as well as microarray probe mappings and annotations for a variety of chordate genomes. From this data, we produce a functional annotation of the regulatory elements along the human and mouse genomes with plans to expand to other species as data becomes available. Starting from well-studied cell lines, we will progressively expand our library of measurements to a greater variety of samples. Ensembl’s regulation resources provide a central and easy-to-query repository for reference epigenomes. 
As with all Ensembl data, it is freely available at http://www.ensembl.org, from the Perl and REST APIs and from the public Ensembl MySQL database server at ensembldb.ensembl.org. Database URL: http://www.ensembl.org PMID:26888907</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3164293','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=3164293"><span>A pH-dependent conformational ensemble mediates proton transport through the influenza A/M2 protein†</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Polishchuk, Alexei L.; Lear, James D.; Ma, Chunlong; Lamb, Robert A.; Pinto, Lawrence H.; DeGrado, William F.</p> <p>2010-01-01</p> <p>The influenza A M2 protein exhibits inwardly rectifying, pH-activated proton transport that saturates at low pH. A comparison of high-resolution structures of the transmembrane domain at high and low pH suggests that pH-dependent conformational changes may facilitate proton conduction by alternately changing the accessibility of the N-terminal and C-terminal regions of the channel as a proton transits through the transmembrane domain. Here, we show that M2 functionally reconstituted in liposomes populates at least three different conformational states over a physiologically relevant pH range, with transition midpoints that are consistent with previously reported His37 pKas. We then develop and test two similar, quantitative mechanistic models of proton transport, where protonation shifts the equilibrium between structural states having different proton affinities and solvent accessibilities. 
The models account well for a collection of experimental data sets over a wide range of pHs and voltages and require only a small number of adjustable parameters to accurately describe the data. While the kinetic models do not require any specific conformation for the protein, they nevertheless are consistent with a large body of structural information based on high-resolution NMR and crystallographic structures, optical spectroscopy, and MD calculations. PMID:20968306</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5333200','PMC'); return false;" href="https://www.pubmedcentral.nih.gov/articlerender.fcgi?tool=pmcentrez&artid=5333200"><span>Ensembles of NLP Tools for Data Element Extraction from Clinical Notes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pmc">PubMed Central</a></p> <p>Kuo, Tsung-Ting; Rao, Pallavi; Maehara, Cleo; Doan, Son; Chaparro, Juan D.; Day, Michele E.; Farcas, Claudiu; Ohno-Machado, Lucila; Hsu, Chun-Nan</p> <p>2016-01-01</p> <p>Natural Language Processing (NLP) is essential for concept extraction from narrative text in electronic health records (EHR). To extract numerous and diverse concepts, such as data elements (i.e., important concepts related to a certain medical condition), a plausible solution is to combine various NLP tools into an ensemble to improve extraction performance. However, it is unclear to what extent ensembles of popular NLP tools improve the extraction of numerous and diverse concepts. Therefore, we built an NLP ensemble pipeline to synergize the strength of popular NLP tools using seven ensemble methods, and to quantify the improvement in performance achieved by ensembles in the extraction of data elements for three very different cohorts. 
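A simple instance of the ensemble methods described here is majority voting over the concepts each tool extracts: a concept is kept only if enough tools agree on it. A hedged sketch (the set-of-`(text, label)` representation is an assumption, not the pipeline's actual data model):

```python
from collections import Counter

def vote_ensemble(tool_outputs, min_votes=2):
    """Keep a concept if at least `min_votes` of the NLP tools extracted it.

    `tool_outputs` is one collection of extracted concepts per tool; each
    concept is represented here as a (text, label) tuple.
    """
    # Deduplicate within each tool so a tool can vote at most once per concept.
    counts = Counter(c for output in tool_outputs for c in set(output))
    return {c for c, n in counts.items() if n >= min_votes}
```

With three tools where only "diabetes" is found twice, the default threshold of two votes keeps just that concept; lowering `min_votes` to 1 reduces the ensemble to a plain union.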
Evaluation results show that the pipeline can improve the performance of NLP tools, but there is high variability depending on the cohort. PMID:28269947</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2008AGUFMOS52B..02R','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2008AGUFMOS52B..02R"><span>Downscaling an Eddy-Resolving Global Model for the Continental Shelf off South Eastern Australia</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Roughan, M.; Baird, M.; MacDonald, H.; Oke, P.</p> <p>2008-12-01</p> <p>The Australian Bluelink collaboration between CSIRO, the Bureau of Meteorology and the Royal Australian Navy has made available to the research community the output of BODAS (Bluelink ocean data assimilation system), an ensemble optimal interpolation reanalysis system with ~10 km resolution around Australia. Within the Bluelink project, BODAS fields are assimilated into a dynamic ocean model of the same resolution to produce BRAN (BlueLink ReANalysis, a hindcast of water properties around Australia from 1992 to 2004). In this study, BODAS hydrographic fields are assimilated into a ~ 3 km resolution Princeton Ocean Model (POM) configuration of the coastal ocean off SE Australia. Experiments were undertaken to establish the optimal strength and duration of the assimilation of BODAS fields into the 3 km resolution POM configuration for the purpose of producing hindcasts of ocean state. It is shown that the resultant downscaling of Bluelink products is better able to reproduce coastal features, particularly velocities and hydrography over the continental shelf off south eastern Australia. The BODAS-POM modelling system is used to provide a high-resolution simulation of the East Australian Current over the period 1992 to 2004. 
One of the applications that we will present is an investigation of the seasonal and inter-annual variability in the dispersion of passive particles in the East Australian Current. The practical outcome is an estimate of the connectivity of estuaries along the coast of southeast Australia, which is relevant for the dispersion of marine pests.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2009AGUFM.H43F1096K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2009AGUFM.H43F1096K"><span>Many-objective Groundwater Monitoring Network Design Using Bias-Aware Ensemble Kalman Filtering and Evolutionary Optimization</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Kollat, J. B.; Reed, P. M.</p> <p>2009-12-01</p> <p>This study contributes the ASSIST (Adaptive Strategies for Sampling in Space and Time) framework for improving long-term groundwater monitoring decisions across space and time while accounting for the influences of systematic model errors (or predictive bias). The ASSIST framework combines contaminant flow-and-transport modeling, bias-aware ensemble Kalman filtering (EnKF) and many-objective evolutionary optimization. Our goal in this work is to provide decision makers with a fuller understanding of the information tradeoffs they must confront when performing long-term groundwater monitoring network design. Our many-objective analysis considers up to 6 design objectives simultaneously and consequently synthesizes prior monitoring network design methodologies into a single, flexible framework. This study demonstrates the ASSIST framework using a tracer study conducted within a physical aquifer transport experimental tank located at the University of Vermont. 
The tank tracer experiment was extensively sampled to provide high resolution estimates of tracer plume behavior. The simulation component of the ASSIST framework consists of stochastic ensemble flow-and-transport predictions using ParFlow coupled with the Lagrangian SLIM transport model. The ParFlow and SLIM ensemble predictions are conditioned with tracer observations using a bias-aware EnKF. The EnKF allows decision makers to enhance plume transport predictions in space and time in the presence of uncertain and biased model predictions by conditioning them on uncertain measurement data. In this initial demonstration, the position and frequency of sampling were optimized to: (i) minimize monitoring cost, (ii) maximize information provided to the EnKF, (iii) minimize failure to detect the tracer, (iv) maximize the detection of tracer flux, (v) minimize error in quantifying tracer mass, and (vi) minimize error in quantifying the moment of the tracer plume. The results demonstrate that the many-objective problem formulation provides a tremendous amount of information for decision makers. Specifically our many-objective analysis highlights the limitations and potentially negative design consequences of traditional single and two-objective problem formulations. These consequences become apparent through visual exploration of high-dimensional tradeoffs and the identification of regions with interesting compromise solutions. 
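The core of an EnKF conditioning step of the kind described can be sketched as follows. This is a generic stochastic EnKF with perturbed observations, not the bias-aware variant used in ASSIST, and the linear observation operator `H` is an assumption:

```python
import numpy as np

def enkf_update(ensemble, H, y, obs_var, rng):
    """Stochastic EnKF analysis step with perturbed observations.

    ensemble: (n_state, n_members) forecast ensemble
    H:        (n_obs, n_state) linear observation operator (assumption)
    y:        (n_obs,) observation vector; obs_var: scalar obs-error variance
    """
    n_mem = ensemble.shape[1]
    X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
    HX = H @ ensemble
    HA = HX - HX.mean(axis=1, keepdims=True)              # obs-space anomalies
    R = obs_var * np.eye(len(y))
    PHt = X @ HA.T / (n_mem - 1)                          # sample cov P H^T
    S = HA @ HA.T / (n_mem - 1) + R                       # innovation covariance
    K = np.linalg.solve(S, PHt.T).T                       # K = P H^T S^{-1}
    # Perturb the observation for each member, then update.
    Y = y[:, None] + rng.normal(0.0, np.sqrt(obs_var), size=(len(y), n_mem))
    return ensemble + K @ (Y - HX)
```

For a single directly observed state, the analysis ensemble mean moves toward the observation and the spread shrinks, as expected from the Kalman gain.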
The prediction characteristics of these compromise designs are explored in detail, as well as their implications for subsequent design decisions in both space and time.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018EL....12158002C','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018EL....12158002C"><span>Cross-sectional fluctuation scaling in the high-frequency illiquidity of Chinese stocks</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Cai, Qing; Gao, Xing-Lu; Zhou, Wei-Xing; Stanley, H. Eugene</p> <p>2018-03-01</p> <p>Taylor's law of temporal and ensemble fluctuation scaling has been ubiquitously observed in diverse complex systems including financial markets. Stock illiquidity is an important nonadditive financial quantity, which is found to comply with Taylor's temporal fluctuation scaling law. In this paper, we perform the cross-sectional analysis of the 1 min high-frequency illiquidity time series of Chinese stocks and unveil the presence of Taylor's law of ensemble fluctuation scaling. The estimated daily Taylor scaling exponent fluctuates around 1.442. We find that Taylor's scaling exponents of stock illiquidity do not relate to the ensemble mean and ensemble variety of returns. 
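Taylor's fluctuation scaling law posits Var ∝ Mean^α across the ensemble, so the scaling exponent is conventionally estimated by a regression on log-log axes. An illustrative sketch (not the authors' estimation code):

```python
import numpy as np

def taylor_exponent(means, variances):
    """Estimate the Taylor scaling exponent alpha in Var = c * Mean**alpha
    by ordinary least squares in log-log space."""
    x = np.log(np.asarray(means, dtype=float))
    y = np.log(np.asarray(variances, dtype=float))
    alpha, _logc = np.polyfit(x, y, 1)   # slope of the log-log fit
    return alpha
```

On synthetic data generated with a known exponent of 1.442 (the value reported for the illiquidity series), the fit recovers it exactly.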
Our analysis uncovers a new scaling law of financial markets and might stimulate further investigations for a better understanding of financial markets' dynamics.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29284916','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29284916"><span>Comparison of Basic and Ensemble Data Mining Methods in Predicting 5-Year Survival of Colorectal Cancer Patients.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Pourhoseingholi, Mohamad Amin; Kheirian, Sedigheh; Zali, Mohammad Reza</p> <p>2017-12-01</p> <p>Colorectal cancer (CRC) is one of the most common malignancies and cause of cancer mortality worldwide. Given the importance of predicting the survival of CRC patients and the growing use of data mining methods, this study aims to compare the performance of models for predicting 5-year survival of CRC patients using variety of basic and ensemble data mining methods. The CRC dataset from The Shahid Beheshti University of Medical Sciences Research Center for Gastroenterology and Liver Diseases were used for prediction and comparative study of the base and ensemble data mining techniques. Feature selection methods were used to select predictor attributes for classification. The WEKA toolkit and MedCalc software were respectively utilized for creating and comparing the models. The obtained results showed that the predictive performance of developed models was altogether high (all greater than 90%). Overall, the performance of ensemble models was higher than that of basic classifiers and the best result achieved by ensemble voting model in terms of area under the ROC curve (AUC= 0.96). 
Comparison of the models' AUCs showed that the ensemble voting method significantly outperformed all models except Random Forest (RF) and Bayesian Network (BN), considering their overlapping 95% confidence intervals. This result may indicate high predictive power of these two methods along with ensemble voting for predicting 5-year survival of CRC patients.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=JAZZ&pg=6&id=EJ711241','ERIC'); return false;" href="https://eric.ed.gov/?q=JAZZ&pg=6&id=EJ711241"><span>Gender and Participation in High School and College Instrumental Jazz Ensembles</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>McKeage, Kathleen M.</p> <p>2004-01-01</p> <p>This study is an examination of the relationship between gender and participation in high school and college instrumental jazz ensembles. Student demographic and attitudinal information was collected using the researcher-designed Instrumental Jazz Participation Survey (IJPS). Undergraduate college band students (N = 628) representing 15 programs…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUFMIN13D..01K','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUFMIN13D..01K"><span>Next-Generation Climate Modeling Science Challenges for Simulation, Workflow and Analysis Systems</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Koch, D. M.; Anantharaj, V. G.; Bader, D. C.; Krishnan, H.; Leung, L. R.; Ringler, T.; Taylor, M.; Wehner, M. F.; Williams, D. 
N.</p> <p>2016-12-01</p> <p>We will present two examples of current and future high-resolution climate-modeling research that are challenging existing simulation run-time I/O, model-data movement, storage and publishing, and analysis. In each case, we will consider lessons learned as current workflow systems are broken by these large-data science challenges, as well as strategies to repair or rebuild the systems. First we consider the science and workflow challenges to be posed by the CMIP6 multi-model HighResMIP, involving around a dozen modeling groups performing quarter-degree simulations, in 3-member ensembles for 100 years, with high-frequency (1-6 hourly) diagnostics, which is expected to generate over 4PB of data. An example of science derived from these experiments will be to study how resolution affects the ability of models to capture extreme-events such as hurricanes or atmospheric rivers. Expected methods to transfer (using parallel Globus) and analyze (using parallel "TECA" software tools) HighResMIP data for such feature-tracking by the DOE CASCADE project will be presented. A second example will be from the Accelerated Climate Modeling for Energy (ACME) project, which is currently addressing challenges involving multiple century-scale coupled high resolution (quarter-degree) climate simulations on DOE Leadership Class computers. ACME is anticipating production of over 5PB of data during the next 2 years of simulations, in order to investigate the drivers of water cycle changes, sea-level-rise, and carbon cycle evolution. The ACME workflow, from simulation to data transfer, storage, analysis and publication will be presented. 
Current and planned methods to accelerate the workflow, including implementing run-time diagnostics, and implementing server-side analysis to avoid moving large datasets will be presented.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018EP%26S...70...74O','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018EP%26S...70...74O"><span>Data assimilation experiment of precipitable water vapor observed by a hyper-dense GNSS receiver network using a nested NHM-LETKF system</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Oigawa, Masanori; Tsuda, Toshitaka; Seko, Hiromu; Shoji, Yoshinori; Realini, Eugenio</p> <p>2018-05-01</p> <p>We studied the assimilation of high-resolution precipitable water vapor (PWV) data derived from a hyper-dense global navigation satellite system network around Uji city, Kyoto, Japan, which had a mean inter-station distance of about 1.7 km. We focused on a heavy rainfall event that occurred on August 13-14, 2012, around Uji city. We employed a local ensemble transform Kalman filter as the data assimilation method. The inhomogeneity of the observed PWV increased on a scale of less than 10 km in advance of the actual rainfall detected by the rain gauge. Zenith wet delay data observed by the Uji network showed that the characteristic length scale of water vapor distribution during the rainfall ranged from 1.9 to 3.5 km. It is suggested that the assimilation of PWV data with high horizontal resolution (a few km) improves the forecast accuracy. We conducted the assimilation experiment of high-resolution PWV data, using both small horizontal localization radii and a conventional horizontal localization radius. We repeated the sensitivity experiment, changing the mean horizontal spacing of the PWV data from 1.7 to 8.0 km. 
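Horizontal localization in an LETKF down-weights observations with distance from the analysis point, which is what the localization radii above control. A toy example of such a weight, using a Gaussian taper with a hard cutoff as a simple stand-in for the Gaspari-Cohn function common in LETKF implementations (the cutoff at twice the radius is an assumption):

```python
import math

def localization_weight(distance_km, radius_km):
    """Distance-based observation localization weight: a Gaussian taper
    with a hard cutoff at 2 * radius (a simple stand-in for the
    Gaspari-Cohn function used in many LETKF implementations)."""
    if distance_km >= 2.0 * radius_km:
        return 0.0                      # observation ignored beyond the cutoff
    return math.exp(-0.5 * (distance_km / radius_km) ** 2)
```

Shrinking `radius_km` makes the filter rely only on nearby PWV observations, which is the effect the small-localization experiments exploit.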
When the horizontal spacing of assimilated PWV data was decreased from 8.0 to 3.5 km, the accuracy of the simulated hourly rainfall amount worsened in the experiment that used the conventional localization radius for the assimilation of PWV. In contrast, the accuracy of hourly rainfall amounts improved when we applied small horizontal localization radii. In the experiment that used the small horizontal localization radii, the accuracy of the hourly rainfall amount was most improved when the horizontal resolution of the assimilated PWV data was 3.5 km. The optimum spatial resolution of PWV data was related to the characteristic length scale of water vapor variability.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016EGUGA..1810119D','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016EGUGA..1810119D"><span>Flash flood warnings for ungauged basins based on high-resolution precipitation forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Demargne, Julie; Javelle, Pierre; Organde, Didier; de Saint Aubin, Céline; Janet, Bruno</p> <p>2016-04-01</p> <p>Early detection of flash floods, which are typically triggered by severe rainfall events, is still challenging due to large meteorological and hydrologic uncertainties at the spatial and temporal scales of interest. Also the rapid rising of waters necessarily limits the lead time of warnings to alert communities and activate effective emergency procedures. To better anticipate such events and mitigate their impacts, the French national service in charge of flood forecasting (SCHAPI) is implementing a national flash flood warning system for small-to-medium (up to 1000 km²) ungauged basins based on a discharge-threshold flood warning method called AIGA (Javelle et al. 2014). 
The current deterministic AIGA system has been run in real time in the South of France since 2005 and has been tested in the RHYTMME project (rhytmme.irstea.fr/). It ingests the operational radar-gauge QPE grids from Météo-France to run a simplified hourly distributed hydrologic model at a 1-km² resolution every 15 minutes. This produces real-time peak discharge estimates along the river network, which are subsequently compared to regionalized flood frequency estimates to provide warnings according to the AIGA-estimated return period of the ongoing event. The calibration and regionalization of the hydrologic model have recently been enhanced for implementing the national flash flood warning system for the entire French territory by 2016. To further extend the effective warning lead time, the flash flood warning system is being enhanced to ingest Météo-France's AROME-NWC high-resolution precipitation nowcasts. The AROME-NWC system combines the most recent available observations with forecasts from the nowcasting version of the AROME convection-permitting model (Auger et al. 2015). AROME-NWC pre-operational deterministic precipitation forecasts, produced every hour at a 2.5-km resolution for a 6-hr forecast horizon, were provided for 3 significant rain events in September and November 2014 and ingested as time-lagged ensembles. The time-lagged approach is a practical way of accounting for the atmospheric forecast uncertainty when no extensive forecast archive is available for statistical modelling. The evaluation on 185 basins in the South of France showed significant improvements in terms of flash flood event detection and effective warning lead-time, compared to warnings from the current AIGA setup (without any future precipitation). Various verification metrics (e.g., Relative Mean Error, Brier Skill Score) show the skill of ensemble precipitation and flow forecasts compared to single-valued persistency benchmarks. 
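The Brier Skill Score cited among the verification metrics compares a probabilistic forecast's Brier score against a reference forecast such as persistence. A minimal sketch (illustrative; not the operational verification code):

```python
def brier_score(probs, outcomes):
    """Mean squared error of probabilistic forecasts of a binary event
    (outcomes coded as 0 or 1)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(probs, ref_probs, outcomes):
    """BSS = 1 - BS_forecast / BS_reference; positive values mean the
    forecast beats the reference (e.g. a persistence benchmark)."""
    return 1.0 - brier_score(probs, outcomes) / brier_score(ref_probs, outcomes)
```

Against a no-skill reference that always issues 0.5, sharp and well-calibrated probabilities earn a BSS close to 1.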
Planned enhancements include integrating additional probabilistic NWP products (e.g., AROME precipitation ensembles on longer forecast horizon), accounting for and reducing hydrologic uncertainties from the model parameters and initial conditions via data assimilation, and developing a comprehensive observational and post-event damage database to determine decision-relevant warning thresholds for flood magnitude and probability. Javelle, P., Demargne, J., Defrance, D., Arnaud, P., 2014. Evaluating flash flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, doi: 10.1080/02626667.2014.923970 Auger, L., Dupont, O., Hagelin, S., Brousseau, P., Brovelli, P., 2015. AROME-NWC: a new nowcasting tool based on an operational mesoscale forecasting system. Quarterly Journal of the Royal Meteorological Society, 141: 1603-1611, doi: 10.1002/qj.2463</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017ACP....1713103H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017ACP....1713103H"><span>Ensemble prediction of air quality using the WRF/CMAQ model system for health effect studies in China</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hu, Jianlin; Li, Xun; Huang, Lin; Ying, Qi; Zhang, Qiang; Zhao, Bin; Wang, Shuxiao; Zhang, Hongliang</p> <p>2017-11-01</p> <p>Accurate exposure estimates are required for health effect analyses of severe air pollution in China. Chemical transport models (CTMs) are widely used to provide spatial distribution, chemical composition, particle size fractions, and source origins of air pollutants. The accuracy of air quality predictions in China is greatly affected by the uncertainties of emission inventories. 
The Community Multiscale Air Quality (CMAQ) model with meteorological inputs from the Weather Research and Forecasting (WRF) model were used in this study to simulate air pollutants in China in 2013. Four simulations were conducted with four different anthropogenic emission inventories, including the Multi-resolution Emission Inventory for China (MEIC), the Emission Inventory for China by School of Environment at Tsinghua University (SOE), the Emissions Database for Global Atmospheric Research (EDGAR), and the Regional Emission inventory in Asia version 2 (REAS2). Model performance of each simulation was evaluated against available observation data from 422 sites in 60 cities across China. Model predictions of O3 and PM2.5 generally meet the model performance criteria, but performance differences exist in different regions, for different pollutants, and among inventories. Ensemble predictions were calculated by linearly combining the results from different inventories to minimize the sum of the squared errors between the ensemble results and the observations in all cities. The ensemble concentrations show improved agreement with observations in most cities. The mean fractional bias (MFB) and mean fractional errors (MFEs) of the ensemble annual PM2.5 in the 60 cities are -0.11 and 0.24, respectively, which are better than the MFB (-0.25 to -0.16) and MFE (0.26-0.31) of individual simulations. The ensemble annual daily maximum 1 h O3 (O3-1h) concentrations are also improved, with mean normalized bias (MNB) of 0.03 and mean normalized errors (MNE) of 0.14, compared to MNB of 0.06-0.19 and MNE of 0.16-0.22 of the individual predictions. The ensemble predictions agree better with observations with daily, monthly, and annual averaging times in all regions of China for both PM2.5 and O3-1h. 
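Linearly combining member predictions to minimize the sum of squared errors against observations, as described above, is an ordinary least-squares problem. A sketch under stated assumptions (unconstrained least squares; any constraints the study imposed on the weights are not reproduced here):

```python
import numpy as np

def ensemble_weights(predictions, observations):
    """Weights for linearly combining member predictions that minimize the
    sum of squared errors against observations.

    predictions:  (n_points, n_members) array, one column per member model
    observations: (n_points,) array of observed values
    """
    w, *_ = np.linalg.lstsq(predictions, observations, rcond=None)
    return w

def ensemble_predict(predictions, w):
    """Apply the fitted weights to member predictions."""
    return predictions @ w
```

When the observations are an exact linear blend of the members, the fitted weights recover that blend, and in general the weighted ensemble cannot have a larger squared error than any single member over the fitting points.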
The study demonstrates that ensemble predictions from combining predictions from individual emission inventories can improve the accuracy of predicted temporal and spatial distributions of air pollutants. This study is the first ensemble model study in China using multiple emission inventories, and the results are publicly available for future health effect studies.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2018PhRvA..97a2312M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2018PhRvA..97a2312M"><span>Generating maximally-path-entangled number states in two spin ensembles coupled to a superconducting flux qubit</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Maleki, Yusef; Zheltikov, Aleksei M.</p> <p>2018-01-01</p> <p>An ensemble of nitrogen-vacancy (NV) centers coupled to a circuit QED device is shown to enable an efficient, high-fidelity generation of high-N00N states. Instead of first creating entanglement and then increasing the number of entangled particles N , our source of high-N00N states first prepares a high-N Fock state in one of the NV ensembles and then entangles it to the rest of the system. With such a strategy, high-N N00N states can be generated in just a few operational steps with an extraordinary fidelity. 
Once prepared, such a state can be stored over a longer period of time due to the remarkably long coherence time of NV centers.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017AdWR..109...58T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017AdWR..109...58T"><span>Efficient multi-scenario Model Predictive Control for water resources management with ensemble streamflow forecasts</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Tian, Xin; Negenborn, Rudy R.; van Overloop, Peter-Jules; María Maestre, José; Sadowska, Anna; van de Giesen, Nick</p> <p>2017-11-01</p> <p>Model Predictive Control (MPC) is one of the most advanced real-time control techniques that has been widely applied to Water Resources Management (WRM). MPC can manage the water system in a holistic manner and has a flexible structure to incorporate specific elements, such as setpoints and constraints. Therefore, MPC has shown its versatile performance in many branches of WRM. Nonetheless, with the in-depth understanding of stochastic hydrology in recent studies, MPC also faces the challenge of how to cope with hydrological uncertainty in its decision-making process. A possible way to embed the uncertainty is to generate an Ensemble Forecast (EF) of hydrological variables, rather than a deterministic one. The combination of MPC and EF results in a more comprehensive approach: Multi-scenario MPC (MS-MPC). In this study, we will first assess the model performance of MS-MPC, considering an ensemble streamflow forecast. Noticeably, the computational inefficiency may be a critical obstacle that hinders applicability of MS-MPC. In fact, with more scenarios taken into account, the computational burden of solving an optimization problem in MS-MPC accordingly increases. 
To deal with this challenge, we propose the Adaptive Control Resolution (ACR) approach, a computationally efficient scheme that reduces the number of control variables in MS-MPC. In brief, the ACR approach uses mixed-resolution control time steps, from fine steps in the near future to coarser steps in the distant future. The ACR-MPC approach is tested on a real-world case study: an integrated flood control and navigation problem in the North Sea Canal of the Netherlands. The approach reduces computation time by at least 18% in our case study, while the model performance of ACR-MPC remains close to that of conventional MPC.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2016AGUOSEC24B1085H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2016AGUOSEC24B1085H"><span>Influence of Gridded Standoff Measurement Resolution on Numerical Bathymetric Inversion</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Hesser, T.; Farthing, M. W.; Brodie, K.</p> <p>2016-02-01</p> <p>The bathymetry from the surf zone to the shoreline changes frequently and actively as wave energy interacts with the seafloor. Methodologies to measure bathymetry range from point-source in-situ instruments, vessel-mounted single-beam or multi-beam sonar surveys, and airborne bathymetric lidar to inversion techniques based on standoff measurements of wave processes from video or radar imagery. Each type of measurement has unique sources of error, spatial and temporal resolution, and availability. Numerical bathymetry estimation frameworks can use these disparate data types in combination with model-based inversion techniques to produce a "best estimate of bathymetry" at a given time.
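The mixed-resolution horizon behind the Adaptive Control Resolution idea described above can be sketched in a few lines: fine time steps near the present, coarser blocks further out, so fewer decision variables cover the same horizon. The block sizes and horizon below are illustrative, not the study's actual settings:

```python
import numpy as np

# Sketch of a mixed-resolution control horizon in the spirit of ACR.
def acr_blocks(horizon, fine_steps, coarse_factor):
    """Return per-variable block lengths covering `horizon` steps:
    unit-length blocks for the near future, coarser blocks afterwards."""
    blocks = [1] * fine_steps
    remaining = horizon - fine_steps
    while remaining > 0:
        step = min(coarse_factor, remaining)
        blocks.append(step)
        remaining -= step
    return blocks

def expand(controls, blocks):
    """Expand one value per block back to a full-resolution schedule."""
    return np.repeat(controls, blocks)

blocks = acr_blocks(horizon=24, fine_steps=6, coarse_factor=3)
print(len(blocks))                  # decision variables: 12 instead of 24
full = expand(np.arange(len(blocks), dtype=float), blocks)
print(full.shape)                   # (24,)
```

Halving the number of decision variables in this way shrinks the optimization problem solved at every MPC step, which is where the reported computation-time savings come from.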
Understanding how the sources of error and the varying spatial and temporal resolution of each data type affect the end result is critical for determining best practices and, in turn, increasing the accuracy of bathymetry estimation techniques. In this work, we take an initial step toward a complete framework for estimating nearshore bathymetry by focusing on gridded standoff measurements and in-situ point observations in model-based inversion at the U.S. Army Corps of Engineers Field Research Facility in Duck, NC. The standoff measurement methods return wave parameters computed from the direct measurements using linear wave theory. These gridded datasets can have temporal and spatial resolutions that do not match the desired model parameters and can therefore reduce the accuracy of these methods. Specifically, we investigate the effect of numerical resolution on the accuracy of an Ensemble Kalman Filter bathymetric inversion technique in relation to the spatial and temporal resolution of the gridded standoff measurements.
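The core of an Ensemble Kalman Filter inversion of the kind mentioned above is the analysis step, in which an ensemble of depth estimates is nudged toward a noisy observation. The following is a minimal, generic perturbed-observation EnKF update with made-up numbers, not the cited system:

```python
import numpy as np

# Minimal EnKF analysis step: an ensemble of depth estimates at a few
# grid points is updated with one noisy observation (values illustrative).
rng = np.random.default_rng(2)
n_ens, n_grid = 20, 4
ensemble = 3.0 + 0.5 * rng.standard_normal((n_ens, n_grid))  # depths [m]

H = np.zeros((1, n_grid)); H[0, 1] = 1.0     # observe grid point 1 only
obs, obs_std = np.array([2.5]), 0.05

# Sample covariance of the forecast ensemble.
anomalies = ensemble - ensemble.mean(axis=0)
P = anomalies.T @ anomalies / (n_ens - 1)

# Kalman gain, then a perturbed-observation update for each member.
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + obs_std**2)
perturbed = obs + obs_std * rng.standard_normal((n_ens, 1))
ensemble += (perturbed - ensemble @ H.T) @ K.T

# The analysis mean at the observed point moves toward the observation,
# and the covariance in P spreads the correction to unobserved points.
print(round(float(ensemble[:, 1].mean()), 2))
```

Coarsening the gridded standoff observations changes H and the effective observation error, which is exactly the resolution sensitivity the study examines.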
The accuracies of the bathymetric estimates are compared with both high-resolution Real Time Kinematic (RTK) single-beam surveys and alternative direct in-situ measurements using sonic altimeters.</p> </li> </ol> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_25 --> </div><!-- container --> </body> </html>