Insights into the deterministic skill of air quality ensembles ...
Simulations from chemical weather models are subject to uncertainties in the input data (e.g. emission inventory, initial and boundary conditions) as well as to those intrinsic to the model (e.g. physical parameterizations, chemical mechanism). Multi-model ensembles can improve the forecast skill, provided that certain mathematical conditions are fulfilled. In this work, four ensemble methods were applied to two different datasets, and their performance was compared for ozone (O3), nitrogen dioxide (NO2) and particulate matter (PM10). Apart from the unconditional ensemble average, the other three methods rely on assigning optimum weights to members or on constraining the ensemble to those members that meet certain conditions in the time or frequency domain. The two datasets were created for the first and second phases of the Air Quality Model Evaluation International Initiative (AQMEII). The methods are evaluated against ground-level observations collected from the EMEP (European Monitoring and Evaluation Programme) and AirBase databases. The goal of the study is to quantify the extent to which predictable signals can be extracted from an ensemble with skill superior to that of the single models and the ensemble mean. Verification statistics show that the deterministic models simulate O3 better than NO2 and PM10, which is linked to the different levels of complexity of the represented processes. The unconditional ensemble mean achieves higher skill compared to each stati…
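The "optimum weights" idea mentioned above can be sketched in a few lines: fit member weights against observations over a training period, then apply them to new forecasts. The least-squares weighting, the synthetic data, and all variable names below are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def weighted_ensemble(members, obs_train, members_train):
    """Combine model members using least-squares weights fitted on a
    training period (a simple stand-in for the 'optimum weights'
    approach described in the abstract)."""
    # Fit weights minimizing squared error against training observations.
    w, *_ = np.linalg.lstsq(members_train, obs_train, rcond=None)
    return members @ w

# Toy example: 3 models, 100 training times, 10 forecast times.
rng = np.random.default_rng(0)
truth_train = rng.normal(40, 10, 100)          # e.g. ozone in ppb
members_train = np.column_stack([truth_train + rng.normal(b, 5, 100)
                                 for b in (2.0, -3.0, 1.0)])
truth_new = rng.normal(40, 10, 10)
members_new = np.column_stack([truth_new + rng.normal(b, 5, 10)
                               for b in (2.0, -3.0, 1.0)])

blend = weighted_ensemble(members_new, truth_train, members_train)
plain_mean = members_new.mean(axis=1)          # unconditional ensemble mean
```

With biased members, such a blend typically outperforms the plain ensemble mean on the training metric, though weights fitted on short records can overfit.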
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Jie; Draxl, Caroline; Hopson, Thomas
Numerical weather prediction (NWP) models have been widely used for wind resource assessment. Model runs with higher spatial resolution are generally more accurate, yet extremely computationally expensive. An alternative approach is to use data generated by a low-resolution NWP model in conjunction with statistical methods. In order to analyze the accuracy and computational efficiency of different types of NWP-based wind resource assessment methods, this paper performs a comparison of three deterministic and probabilistic NWP-based wind resource assessment methodologies: (i) a coarse-resolution (0.5 degrees x 0.67 degrees) global reanalysis data set, the Modern-Era Retrospective Analysis for Research and Applications (MERRA); (ii) an analog ensemble methodology based on MERRA, which provides both deterministic and probabilistic predictions; and (iii) a fine-resolution (2-km) NWP data set, the Wind Integration National Dataset (WIND) Toolkit, based on the Weather Research and Forecasting model. Results show that: (i) as expected, the analog ensemble and the WIND Toolkit perform significantly better than MERRA, confirming their ability to downscale coarse estimates; (ii) the analog ensemble provides the best estimate of the multi-year wind distribution at seven of the nine sites, while the WIND Toolkit is the best at one site; (iii) the WIND Toolkit is more accurate in estimating the distribution of hourly wind speed differences, which characterizes the wind variability, at five of the available sites, with the analog ensemble being best at the remaining four locations; and (iv) the analog ensemble computational cost is negligible, whereas the WIND Toolkit requires large computational resources. Future efforts could focus on combining the analog ensemble with intermediate-resolution (e.g., 10-15 km) NWP estimates to considerably reduce the computational burden while providing accurate deterministic estimates and reliable probabilistic assessments.
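The analog-ensemble approach in (ii) can be illustrated compactly: for each new coarse-model forecast, the most similar past forecasts are located and their verifying observations form the ensemble. The similarity metric, the analog count, and the synthetic data below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def analog_ensemble(forecast, hist_forecasts, hist_obs, n_analogs=20):
    """For a new coarse-model forecast, find the most similar past
    forecasts and return their verifying observations as an ensemble
    (a minimal sketch of the analog-ensemble idea)."""
    dist = np.abs(hist_forecasts - forecast)
    idx = np.argsort(dist)[:n_analogs]
    return hist_obs[idx]

rng = np.random.default_rng(1)
hist_fc = rng.uniform(0, 15, 5000)             # past wind-speed forecasts (m/s)
hist_ob = hist_fc + rng.normal(0, 1.5, 5000)   # matching observations

ens = analog_ensemble(7.0, hist_fc, hist_ob)
det = ens.mean()                           # deterministic estimate
p10, p90 = np.percentile(ens, [10, 90])    # probabilistic spread
```

The method is cheap because it only searches an archive; its quality depends entirely on how well the archive samples the local forecast-observation relationship.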
EnsembleGraph: Interactive Visual Analysis of Spatial-Temporal Behavior for Ensemble Simulation Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shu, Qingya; Guo, Hanqi; Che, Limei
We present a novel visualization framework, EnsembleGraph, for analyzing ensemble simulation data, in order to help scientists understand behavior similarities between ensemble members over space and time. A graph-based representation is used to visualize individual spatiotemporal regions with similar behaviors, which are extracted by hierarchical clustering algorithms. A user interface with multiple linked views is provided, which enables users to explore, locate, and compare regions that have similar behaviors between ensemble members, and then to investigate and analyze the selected regions in detail. The driving application of this paper is the study of regional emission influences on tropospheric ozone, which is based on ensemble simulations conducted with different anthropogenic emission absences using the MOZART-4 (Model for Ozone and Related chemical Tracers, version 4) model. We demonstrate the effectiveness of our method by visualizing the MOZART-4 ensemble simulation data and evaluating the relative regional emission influences on tropospheric ozone concentrations. Positive feedback from domain experts and two case studies demonstrate the efficiency of our method.
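The region-extraction step (hierarchical clustering of time series by behavior similarity) can be sketched as follows; the synthetic two-behavior dataset and the Ward linkage are illustrative choices, not details from the paper.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy stand-in for gridded time series: 50 grid cells x 30 time steps,
# built from two distinct temporal behaviors plus noise (illustrative
# data, not MOZART-4 output).
rng = np.random.default_rng(2)
t = np.linspace(0, 2 * np.pi, 30)
group_a = np.sin(t) + rng.normal(0, 0.1, (25, 30))
group_b = np.cos(t) + rng.normal(0, 0.1, (25, 30))
series = np.vstack([group_a, group_b])

# Agglomerative clustering of the time series, as in the region
# extraction step described above; cut the dendrogram at 2 clusters.
Z = linkage(series, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

Each resulting cluster would become one node of the graph representation, with edges tracking how cluster membership evolves over time.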
NASA Astrophysics Data System (ADS)
Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.
2013-10-01
This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel radar-based ensemble forecasting chains for flash-flood early warning are investigated in three catchments in the southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first radar-based ensemble forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system that predicts orographic rainfall for the following eight hours. The second ensemble forecasting system evaluated is REAL-C2, in which the numerical weather prediction model COSMO-2 is initialised with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for the 1389 h between June 2007 and December 2010 for which NORA forecasts were issued, owing to the presence of orographic forcing. A clear preference was found for the ensemble approach. Discharge forecasts perform better when forced by NORA and REAL-C2 rather than by deterministic weather radar data. Moreover, using an ensemble of initial conditions at forecast initialisation, as in REAL-C2, significantly improved the forecast skill. These forecasts also perform better than forecasts forced by ensemble rainfall forecasts (NORA) initialised from a single initial condition of the hydrological model. Thus the best results were obtained with the REAL-C2 forecasting chain. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.
Deterministic Mean-Field Ensemble Kalman Filtering
Law, Kody J. H.; Tembine, Hamidou; Tempone, Raul
2016-05-03
The proof of convergence of the standard ensemble Kalman filter (EnKF) from Le Gland, Monbet, and Tran [Large sample asymptotics for the ensemble Kalman filter, in The Oxford Handbook of Nonlinear Filtering, Oxford University Press, Oxford, UK, 2011, pp. 598--631] is extended to non-Gaussian state-space models. In this paper, a density-based deterministic approximation of the mean-field limit EnKF (DMFEnKF) is proposed, consisting of a PDE solver and a quadrature rule. Given a certain minimal order of convergence κ between the two, this extends to the deterministic filter approximation, which is therefore asymptotically superior to standard EnKF for dimension d
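For context, the standard stochastic EnKF analysis step whose mean-field limit the paper studies looks like this in minimal form (perturbed-observation variant; this is a textbook sketch, not the DMFEnKF itself, and the dimensions and noise levels are illustrative).

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """One stochastic EnKF analysis step with perturbed observations.
    X: state ensemble (d x n); y: observation; H: observation operator;
    R: observation error covariance."""
    n = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)      # state anomalies
    Y = H @ X
    S = Y - Y.mean(axis=1, keepdims=True)      # observation-space anomalies
    # Sample covariances from the ensemble.
    Pxy = A @ S.T / (n - 1)
    Pyy = S @ S.T / (n - 1) + R
    K = Pxy @ np.linalg.inv(Pyy)               # Kalman gain
    # Perturb the observation independently for each member.
    y_pert = y[:, None] + rng.multivariate_normal(
        np.zeros(len(y)), R, size=n).T
    return X + K @ (y_pert - Y)

rng = np.random.default_rng(3)
X = rng.normal(0.0, 2.0, (2, 200))     # 200 members, 2-d state
H = np.array([[1.0, 0.0]])             # observe first component only
R = np.array([[0.5]])
Xa = enkf_analysis(X, np.array([1.0]), H, R, rng)
```

The analysis pulls the observed component toward the observation and contracts its spread; the mean-field/deterministic variant replaces these sample statistics with a density evolved by a PDE solver and a quadrature rule.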
Huisman, J.A.; Breuer, L.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.; Willems, P.
2009-01-01
An ensemble of 10 hydrological models was applied to the same set of land use change scenarios. There was general agreement about the direction of changes in the mean annual discharge and the 90% discharge percentile predicted by the ensemble members, although a considerable range in the magnitude of the predictions for the scenarios and catchments under consideration was evident. Differences in the magnitude of the increase were attributed to the different mean annual actual evapotranspiration rates for each land use type. The ensemble of model runs was further analyzed with deterministic and probabilistic ensemble methods. The deterministic ensemble method, based on a trimmed mean, resulted in a single, somewhat more reliable scenario prediction. The probabilistic reliability ensemble averaging (REA) method allowed a quantification of the model structure uncertainty in the scenario predictions. It was concluded that the use of a model ensemble greatly increases confidence in the reliability of the model predictions. © 2008 Elsevier Ltd.
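The deterministic trimmed-mean combination described above is simple to reproduce; the discharge-change numbers below are invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy import stats

# Ten hydrological models' predicted change in mean annual discharge (%)
# for one scenario (illustrative numbers).
changes = np.array([8.1, 9.4, 7.7, 10.2, 8.8, 9.1, 25.0, 8.5, 9.9, 7.9])

# Deterministic ensemble estimate: trim the most extreme 10% from each
# tail before averaging, so a single outlying model cannot dominate.
robust = stats.trim_mean(changes, proportiontocut=0.1)
plain = changes.mean()
```

Here the outlying model (25.0%) inflates the plain mean but is discarded by the trim, which is exactly the robustness argument for a trimmed-mean ensemble prediction.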
NASA Astrophysics Data System (ADS)
Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles
2017-06-01
A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones, and many conclude that relying on ensemble rather than deterministic forecasts leads to better decisions in the context of flood mitigation. Hence, ensemble forecasts are believed to possess greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early flood warning systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the level of risk aversion of the decision maker. This new framework allows for the full exploitation of the information related to a forecast's uncertainty, making it especially suited to the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty, and consequently many different means of building an ensemble forecasting system for future streamflow. One possibility is to dress deterministic forecasts using the statistics of past forecast errors. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is the use of ensemble meteorological forecasts for precipitation and temperature, which are then provided as inputs to one or more hydrological models.
In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in southern central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecasts' quality (relative to the corresponding record of observations) and in terms of economic value, using the new proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
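The CARA utility is U(x) = -exp(-alpha*x)/alpha, and the value of an ensemble to a risk-averse decision maker can be summarized by its certainty equivalent: the guaranteed amount valued equally to the risky outcome distribution. A minimal sketch, with an assumed risk-aversion coefficient and synthetic payoff ensembles (none of these numbers come from the paper):

```python
import numpy as np

def cara_utility(x, alpha):
    """Constant Absolute Risk Aversion utility; alpha > 0 is the
    decision maker's risk-aversion coefficient."""
    return -np.exp(-alpha * x) / alpha

def certainty_equivalent(outcomes, alpha):
    """Guaranteed amount the decision maker values equally to the
    risky ensemble of monetary outcomes (inverts the mean utility)."""
    eu = cara_utility(outcomes, alpha).mean()
    return -np.log(-alpha * eu) / alpha

# Two hypothetical payoff ensembles with the same mean: one sharp,
# one wide (illustrative numbers).
rng = np.random.default_rng(4)
sharp = rng.normal(100.0, 5.0, 10000)
wide = rng.normal(100.0, 30.0, 10000)
ce_sharp = certainty_equivalent(sharp, alpha=0.01)
ce_wide = certainty_equivalent(wide, alpha=0.01)
```

A sharper ensemble with the same mean has a higher certainty equivalent, which is one way this framework rewards reliable, well-calibrated forecasts rather than just accurate means.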
NASA Astrophysics Data System (ADS)
Liu, A.-Peng; Cheng, Liu-Yong; Guo, Qi; Zhang, Shou
2018-02-01
We first propose a scheme for a controlled phase-flip gate between a flying photon qubit and the collective spin wave (magnon) of an atomic ensemble, assisted by double-sided cavity quantum systems. We then propose a deterministic controlled-NOT gate on magnon qubits with parity-check building blocks. Both gates can be accomplished with 100% success probability in principle. An atomic ensemble is employed so that light-matter coupling is remarkably improved by collective enhancement. We assess the performance of the gates, and the results show that they can be faithfully implemented with current experimental techniques.
Towards an improved ensemble precipitation forecast: A probabilistic post-processing approach
NASA Astrophysics Data System (ADS)
Khajehei, Sepideh; Moradkhani, Hamid
2017-03-01
Recently, ensemble post-processing (EPP) has become a commonly used approach for reducing the uncertainty in forcing data and hence in hydrologic simulation. The procedure was introduced to build ensemble precipitation forecasts based on the statistical relationship between observations and forecasts. More specifically, the approach relies on a transfer function, developed from a bivariate joint distribution between the observations and the simulations in the historical period, which is used to post-process the forecast. In this study, we propose a Bayesian EPP approach based on copula functions (COP-EPP) to improve the reliability of ensemble precipitation forecasts. Evaluation of the copula-based method is carried out by comparing the performance of the generated ensemble precipitation with the outputs from an existing procedure, i.e. the mixed-type meta-Gaussian distribution. Monthly precipitation from the Climate Forecast System Reanalysis (CFS) and gridded observations from the Parameter-Elevation Relationships on Independent Slopes Model (PRISM) are employed to generate the post-processed ensemble precipitation. Deterministic and probabilistic verification frameworks are utilized to evaluate the outputs from the proposed technique. The distribution of seasonal precipitation for the ensemble generated with the copula-based technique is compared to the observations and raw forecasts for three sub-basins located in the western United States. Results show that both techniques are successful in producing reliable and unbiased ensemble forecasts; however, COP-EPP demonstrates considerable improvement in both deterministic and probabilistic verification, in particular in characterizing extreme events in wet seasons.
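One minimal way to realize a copula-based transfer function is a Gaussian copula on empirical margins: transform historical (forecast, observation) pairs to normal scores, fit their correlation, and sample the conditional distribution given a new forecast. Everything below (the data, the empirical margins, the single-parameter copula) is an illustrative simplification of the COP-EPP idea, not the paper's actual formulation.

```python
import numpy as np
from scipy import stats

def gaussian_copula_postprocess(fcst, hist_fcst, hist_obs,
                                n_samples=50, rng=None):
    """Sample an ensemble of observation-like values from a Gaussian
    copula fitted to historical (forecast, obs) pairs; margins are
    handled empirically rather than with fitted distributions."""
    rng = rng or np.random.default_rng()

    def to_z(x, ref):
        # Empirical-rank transform to standard-normal scores.
        p = (np.searchsorted(np.sort(ref), x) + 0.5) / (len(ref) + 1)
        return stats.norm.ppf(p)

    rho = np.corrcoef(to_z(hist_fcst, hist_fcst),
                      to_z(hist_obs, hist_obs))[0, 1]
    zf = to_z(np.atleast_1d(fcst), hist_fcst)
    # Conditional Gaussian: z_obs | z_fcst ~ N(rho*z_fcst, 1 - rho^2).
    z = rho * zf + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_samples)
    # Back-transform through the empirical observation quantiles.
    return np.quantile(hist_obs, stats.norm.cdf(z))

rng = np.random.default_rng(5)
hist_f = stats.gamma(2, scale=20).rvs(2000, random_state=rng)  # raw forecasts
hist_o = hist_f * rng.lognormal(0, 0.3, 2000)                  # paired observations
ens = gaussian_copula_postprocess(60.0, hist_f, hist_o, rng=rng)
```

The sampled ensemble inherits the observed marginal distribution (so it is unbiased by construction) while its spread reflects how tightly forecasts and observations co-vary in rank space.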
NASA Astrophysics Data System (ADS)
Yan, Y.; Barth, A.; Beckers, J. M.; Candille, G.; Brankart, J. M.; Brasseur, P.
2015-07-01
Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histogram, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent and semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement from the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analyzed jointly. The consistency and complementarity between both validations are highlighted.
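The CRPS used for the probabilistic validation has a convenient kernel form for a finite ensemble, CRPS = E|X - y| - 0.5 E|X - X'|; a sketch with synthetic ensembles (the estimator is standard, the data are illustrative):

```python
import numpy as np

def crps_empirical(ens, obs):
    """CRPS of an ensemble forecast against one observation, using the
    kernel estimator  E|X - y| - 0.5 E|X - X'|."""
    ens = np.asarray(ens, dtype=float)
    term1 = np.mean(np.abs(ens - obs))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return term1 - term2

rng = np.random.default_rng(6)
obs = 0.0
good = rng.normal(0.0, 1.0, 60)     # ensemble centred on the truth
biased = rng.normal(3.0, 1.0, 60)   # same spread, displaced mean
```

For a single-member (deterministic) forecast the second term vanishes and the CRPS reduces to the absolute error, which makes the score directly comparable across deterministic and ensemble systems.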
Probabilistic Predictions of PM2.5 Using a Novel Ensemble Design for the NAQFC
NASA Astrophysics Data System (ADS)
Kumar, R.; Lee, J. A.; Delle Monache, L.; Alessandrini, S.; Lee, P.
2017-12-01
Poor air quality (AQ) in the U.S. is estimated to cause about 60,000 premature deaths, with costs of $100B-$150B annually. To reduce such losses, the National AQ Forecasting Capability (NAQFC) at the National Oceanic and Atmospheric Administration (NOAA) produces forecasts of ozone, particulate matter smaller than 2.5 μm in diameter (PM2.5), and other pollutants, so that advance notice and warnings can be issued to help individuals and communities limit exposure and reduce air pollution-caused health problems. The current NAQFC, based on the U.S. Environmental Protection Agency Community Multi-scale AQ (CMAQ) modeling system, provides only deterministic AQ forecasts and does not quantify the uncertainty associated with the predictions, which could be large due to the chaotic nature of the atmosphere and nonlinearity in atmospheric chemistry. This project aims to take the NAQFC a step further in the direction of probabilistic AQ prediction by exploring and quantifying the potential value of ensemble predictions of PM2.5, perturbing three key aspects of PM2.5 modeling: the meteorology, the emissions, and the CMAQ secondary organic aerosol formulation. This presentation focuses on the impact of meteorological variability, which is represented by three members of NOAA's Short-Range Ensemble Forecast (SREF) system that were down-selected by hierarchical cluster analysis. These three SREF members provide the physics configurations and initial/boundary conditions for the Weather Research and Forecasting (WRF) model runs that generate the output variables, missing from operational SREF output, required to drive CMAQ. We conducted WRF runs for January, April, July, and October 2016 to capture seasonal changes in meteorology. Estimated emissions of trace gases and aerosols via the Sparse Matrix Operator Kernel (SMOKE) system were developed using the WRF output. WRF and SMOKE output drive a 3-member CMAQ mini-ensemble of once-daily, 48-h PM2.5 forecasts for the same four months.
The CMAQ mini-ensemble is evaluated against both observations and the current operational deterministic NAQFC products, and analyzed to assess the impact of meteorological biases on PM2.5 variability. Quantification of the PM2.5 prediction uncertainty will prove a key factor to support cost-effective decision-making while protecting public health.
Adaptive correction of ensemble forecasts
NASA Astrophysics Data System (ADS)
Pelosi, Anna; Battista Chirico, Giovanni; Van den Bergh, Joris; Vannitsem, Stephane
2017-04-01
Forecasts from numerical weather prediction (NWP) models often suffer from both systematic and non-systematic errors. These are present in both deterministic and ensemble forecasts, and originate from various sources such as model error and subgrid variability. Statistical post-processing techniques can partly remove such errors, which is particularly important when NWP outputs concerning surface weather variables are employed for site-specific applications. Many different post-processing techniques have been developed. For deterministic forecasts, adaptive methods such as the Kalman filter are often used, which sequentially post-process the forecasts by continuously updating the correction parameters as new ground observations become available. These methods are especially valuable when long training data sets do not exist. For ensemble forecasts, well-known techniques are ensemble model output statistics (EMOS) and so-called "member-by-member" approaches (MBM). Here, we introduce a new adaptive post-processing technique for ensemble predictions. The proposed method is a sequential Kalman filtering technique that fully exploits the information content of the ensemble. One correction equation is retrieved and applied to all members; the parameters of the regression equation, however, are estimated by exploiting the second-order statistics of the forecast ensemble. We compare our new method with two other techniques: a simple method that makes use of a running bias correction of the ensemble mean, and an MBM post-processing approach that rescales the ensemble mean and spread based on minimization of the Continuous Ranked Probability Score (CRPS). We perform a verification study for the region of Campania in southern Italy.
We use two years (2014-2015) of daily meteorological observations of 2-meter temperature and 10-meter wind speed from 18 ground-based automatic weather stations distributed across the region, comparing them with the corresponding COSMO-LEPS ensemble forecasts. Deterministic verification scores (e.g., mean absolute error, bias) and probabilistic scores (e.g., CRPS) are used to evaluate the post-processing techniques. We conclude that the new adaptive method outperforms the simpler running bias-correction. The proposed adaptive method often outperforms the MBM method in removing bias. The MBM method has the advantage of correcting the ensemble spread, although it needs more training data.
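The running bias-correction baseline can be written as a scalar Kalman filter that tracks the ensemble-mean bias sequentially as observations arrive; the noise variances and the synthetic temperature series below are illustrative assumptions, not the study's configuration.

```python
import numpy as np

def kalman_bias_update(bias, p, fcst_mean, obs, r=1.0, q=0.05):
    """One step of a scalar Kalman filter tracking the slowly varying
    bias of the ensemble mean; r and q are illustrative observation-
    and process-noise variances."""
    p = p + q                          # predict: bias persists, uncertainty grows
    k = p / (p + r)                    # Kalman gain
    innov = (fcst_mean - obs) - bias   # observed bias minus current estimate
    bias = bias + k * innov            # update the bias estimate
    p = (1 - k) * p                    # update its uncertainty
    return bias, p

# Synthetic 2-metre temperature: the model runs 1.5 K warm on average.
rng = np.random.default_rng(7)
truth = 15 + 5 * np.sin(np.linspace(0, 6, 300))
fcst = truth + 1.5 + rng.normal(0, 0.8, 300)

bias, p = 0.0, 1.0
for f, o in zip(fcst, truth):
    bias, p = kalman_bias_update(bias, p, f, o)
```

Subtracting the current bias estimate from each incoming ensemble mean yields the adaptively corrected forecast, with no long training archive required.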
NASA Astrophysics Data System (ADS)
Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre
2015-04-01
Sea surface height, sea surface temperature and temperature profiles at depth collected between January and December 2005 are assimilated into a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histogram, before the assimilation experiments. An incremental analysis update scheme is applied in order to reduce spurious oscillations due to the model state correction. The results of the assimilation are assessed according to both deterministic and probabilistic metrics, with observations used in the assimilation experiments as well as independent observations, which goes further than most previous studies and constitutes one of the original points of this paper. Regarding the deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. Regarding the probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system. The improvement from the assimilation is demonstrated using these validation metrics. Finally, the deterministic validation and the probabilistic validation are analysed jointly. The consistency and complementarity between both validations are highlighted. Highly reliable situations, in which the RMS error and the CRPS give the same information, are identified for the first time in this paper.
Enhancing Flood Prediction Reliability Using Bayesian Model Averaging
NASA Astrophysics Data System (ADS)
Liu, Z.; Merwade, V.
2017-12-01
Uncertainty analysis is an indispensable part of modeling the hydrology and hydrodynamics of non-idealized environmental systems. Compared to relying on the prediction from one model simulation, using an ensemble of predictions that considers uncertainty from different sources is more reliable. In this study, Bayesian model averaging (BMA) is applied to the Black River watershed in Arkansas and Missouri by combining multi-model simulations to obtain reliable deterministic water stage and probabilistic inundation extent predictions. The simulation ensemble is generated from 81 LISFLOOD-FP subgrid model configurations that include uncertainty from channel shape, channel width, channel roughness and discharge. Model simulation outputs are trained with observed water stage data during one flood event, and BMA prediction ability is validated for another flood event. Results from this study indicate that BMA does not always outperform all members in the ensemble, but it provides relatively robust deterministic flood stage predictions across the basin. Station-based BMA (BMA_S) water stage prediction performs better than global BMA (BMA_G) prediction, which is in turn superior to the ensemble mean prediction. Additionally, the high-probability flood inundation extent (probability greater than 60%) in the BMA_G probabilistic map is more accurate than the probabilistic flood inundation extent based on equal weights.
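BMA turns the multi-model ensemble into a weighted mixture of predictive densities. In practice the weights and spread are trained (typically by EM) on past events; fixed illustrative values are used below just to show how one mixture yields both a deterministic prediction and an exceedance probability. All numbers are hypothetical.

```python
import numpy as np

def bma_predictive(member_fcsts, weights, sigma, x):
    """BMA predictive density: a weighted mixture of Gaussians centred
    on the member forecasts, evaluated on the grid x."""
    comps = [w * np.exp(-0.5 * ((x - m) / sigma) ** 2)
             / (sigma * np.sqrt(2 * np.pi))
             for w, m in zip(weights, member_fcsts)]
    return np.sum(comps, axis=0)

x = np.linspace(0, 10, 1001)        # water stage grid (m)
dx = x[1] - x[0]
members = [4.2, 4.8, 5.1]           # three model configurations (illustrative)
weights = [0.5, 0.3, 0.2]           # BMA weights (would come from training)

pdf = bma_predictive(members, weights, sigma=0.4, x=x)
mean_stage = np.sum(x * pdf) * dx                 # deterministic BMA prediction
prob_flood = np.sum(pdf[x >= 5.0]) * dx           # P(stage >= 5 m)
```

The deterministic BMA prediction is the mixture mean (a weighted average of the members), while probabilistic products such as inundation probability maps come from integrating the same mixture over a threshold.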
Post-processing method for wind speed ensemble forecast using wind speed and direction
NASA Astrophysics Data System (ADS)
Sofie Eide, Siri; Bjørnar Bremnes, John; Steinsland, Ingelin
2017-04-01
Statistical methods are widely applied to enhance the quality of both deterministic and ensemble NWP forecasts. In many situations, such as wind speed forecasting, most of the predictive information is contained in a single variable of the NWP models. However, in statistical calibration of deterministic forecasts it is often seen that including more variables can further improve forecast skill. For ensembles this is rarely taken advantage of, mainly because it is generally not straightforward to include multiple variables. In this study, it is demonstrated how multiple variables can be included in Bayesian model averaging (BMA) by using a flexible regression method for estimating the conditional means. The method is applied to wind speed forecasting at 204 Norwegian stations based on wind speed and direction forecasts from the ECMWF ensemble system. At about 85% of the sites, the ensemble forecasts were improved in terms of CRPS by adding wind direction as a predictor compared to using wind speed only. On average the improvements were about 5%, mainly for moderate to strong wind situations. For weak wind speeds, adding wind direction had a more or less neutral impact.
NASA Astrophysics Data System (ADS)
Verkade, J. S.; Brown, J. D.; Davids, F.; Reggiani, P.; Weerts, A. H.
2017-12-01
Two statistical post-processing approaches for estimation of predictive hydrological uncertainty are compared: (i) 'dressing' of a deterministic forecast by adding a single, combined estimate of both hydrological and meteorological uncertainty and (ii) 'dressing' of an ensemble streamflow forecast by adding an estimate of hydrological uncertainty to each individual streamflow ensemble member. Both approaches aim to produce an estimate of the 'total uncertainty' that captures both the meteorological and hydrological uncertainties. They differ in the degree to which they make use of statistical post-processing techniques. In the 'lumped' approach, both sources of uncertainty are lumped by post-processing deterministic forecasts using their verifying observations. In the 'source-specific' approach, the meteorological uncertainties are estimated by an ensemble of weather forecasts. These ensemble members are routed through a hydrological model and a realization of the probability distribution of hydrological uncertainties (only) is then added to each ensemble member to arrive at an estimate of the total uncertainty. The techniques are applied to one location in the Meuse basin and three locations in the Rhine basin. Resulting forecasts are assessed for their reliability and sharpness, as well as compared in terms of multiple verification scores including the relative mean error, Brier Skill Score, Mean Continuous Ranked Probability Skill Score, Relative Operating Characteristic Score and Relative Economic Value. The dressed deterministic forecasts are generally more reliable than the dressed ensemble forecasts, but the latter are sharper. On balance, however, they show similar quality across a range of verification metrics, with the dressed ensembles coming out slightly better. Some additional analyses are suggested. 
Notably, these include statistical post-processing of the meteorological forecasts in order to increase their reliability, thus increasing the reliability of the streamflow forecasts produced with ensemble meteorological forcings.
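The "lumped" dressing approach in (i) reduces, in its simplest form, to resampling archived forecast errors around the deterministic forecast; the error archive and the values below are synthetic, and refinements (multiplicative errors, lead-time-dependent dressing) are omitted.

```python
import numpy as np

def dress_forecast(det_fcst, past_errors, rng, n=50):
    """'Dress' a deterministic streamflow forecast into an ensemble by
    resampling archived forecast errors around it."""
    return det_fcst + rng.choice(past_errors, size=n, replace=True)

rng = np.random.default_rng(8)
past_errors = rng.normal(0, 12, 500)      # archived (forecast - obs) in m3/s
ensemble = dress_forecast(250.0, past_errors, rng)
```

Because the dressing kernel is estimated from deterministic forecasts and their verifying observations, it lumps meteorological and hydrological uncertainty into a single error distribution, which is exactly the contrast with the source-specific approach in (ii).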
NASA Astrophysics Data System (ADS)
Lemaire, V. E. P.; Colette, A.; Menut, L.
2015-10-01
Because of its sensitivity to unfavorable weather patterns, air pollution is sensitive to climate change, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, the computing cost of such a method requires optimizing ensemble exploration techniques. By using a training dataset of deterministic projections of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed simple statistical models that can be used to predict air pollutant concentrations. The evolution of the key climate variables driving either particulate or gaseous pollution allows conclusions on the robustness of the climate impact on air quality. The climate benefit for PM2.5 was confirmed: -0.96 (±0.18), -1.00 (±0.37) and -1.16 (±0.23) μg m-3 for Eastern Europe, Mid Europe and Northern Italy, respectively. For the Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy regions, a climate penalty on ozone was identified: 10.11 (±3.22), 8.23 (±2.06), 9.23 (±1.13), 6.41 (±2.14) and 7.43 (±2.02) μg m-3, respectively. This technique also allows selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections.
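The simple statistical models mentioned above amount to regressing pollutant concentrations on a few key meteorological drivers across the training ensemble, then applying the fit to a projected climate signal. The drivers, coefficients, and data below are invented for illustration and are not the paper's fitted models.

```python
import numpy as np

# Illustrative stand-in: seasonal-mean ozone regressed on two
# meteorological drivers (temperature, shortwave radiation) over 40
# training "model years".
rng = np.random.default_rng(9)
temp = rng.normal(24, 2, 40)     # seasonal-mean temperature (degC)
rad = rng.normal(220, 25, 40)    # shortwave radiation (W m-2)
ozone = (40 + 2.0 * (temp - 24) + 0.05 * (rad - 220)
         + rng.normal(0, 1, 40))

# Fit the simple statistical proxy model by least squares.
X = np.column_stack([np.ones_like(temp), temp, rad])
beta, *_ = np.linalg.lstsq(X, ozone, rcond=None)

# Apply it to a hypothetical +2 degC, +10 W m-2 climate signal.
future = beta @ np.array([1.0, 26.0, 230.0])
```

Once fitted, such a proxy can be evaluated on every member of a large climate ensemble at negligible cost, which is what makes the member-selection step affordable.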
NASA Astrophysics Data System (ADS)
Keane, Richard J.; Plant, Robert S.; Tennant, Warren J.
2016-05-01
The Plant-Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme with a simple stochastic scheme only, from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant-Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant-Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.
NASA Technical Reports Server (NTRS)
Bowman, Kevin W.; Shindell, Drew Todd; Worden, H. M.; Lamarque, J. F.; Young, P. J.; Stevenson, D. S.; Qu, Z.; delaTorre, M.; Bergmann, D.; Cameron-Smith, P. J.;
2013-01-01
We use simultaneous observations of tropospheric ozone and outgoing longwave radiation (OLR) sensitivity to tropospheric ozone from the Tropospheric Emission Spectrometer (TES) to evaluate model tropospheric ozone and its effect on OLR simulated by a suite of chemistry-climate models that participated in the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). The ensemble mean of the ACCMIP models shows a persistent but modest tropospheric ozone low bias (5-20 ppb) in the Southern Hemisphere (SH) and a modest high bias (5-10 ppb) in the Northern Hemisphere (NH) relative to TES ozone for 2005-2010. These ozone biases have a significant impact on the OLR. Using TES instantaneous radiative kernels (IRK), we show that the ACCMIP ensemble-mean tropospheric ozone low bias leads to an OLR high bias of up to 120 mW/sq. m locally, but zonally compensating errors reduce the global OLR high bias to 39 ± 41 mW/sq. m relative to TES data. We show that there is a correlation (R² = 0.59) between the magnitude of the ACCMIP OLR bias and the deviation of the ACCMIP preindustrial to present-day (1750-2010) ozone radiative forcing (RF) from the ensemble ozone RF mean. However, this correlation is driven primarily by models whose absolute OLR bias from tropospheric ozone exceeds 100 mW/sq. m. Removing these models leads to a mean ozone radiative forcing of 394 ± 42 mW/sq. m. The mean is about the same as, and the standard deviation is about 30% lower than, an ensemble ozone RF of 384 ± 60 mW/sq. m derived from 14 of the 16 ACCMIP models reported in a companion ACCMIP study. These results point towards a profitable direction of combining satellite observations and chemistry-climate model simulations to reduce uncertainty in ozone radiative forcing.
Randomized central limit theorems: A unified theory.
Eliazar, Iddo; Klafter, Joseph
2010-08-01
The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles' aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles' extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic: all ensemble components are scaled by a common deterministic scale. However, there are "random environment" settings in which the underlying scaling schemes are stochastic, scaling the ensemble components by different random scales. Examples of such settings include Holtsmark's law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs), in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes, and present "randomized counterparts" to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
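The deterministic scaling scheme described above (one common factor applied to every ensemble component) can be illustrated with a minimal simulation; the parameter choices below are ours:

```python
import random
import statistics

def standardized_batch_means(n_batches=2000, batch_size=400, seed=1):
    """Apply the deterministic CLT scaling (mean - mu) / (sigma / sqrt(n))
    to batches of iid Uniform(0, 1) draws; the resulting values should be
    approximately standard normal."""
    rng = random.Random(seed)
    mu = 0.5
    sigma = (1.0 / 12.0) ** 0.5  # standard deviation of Uniform(0, 1)
    out = []
    for _ in range(n_batches):
        m = statistics.fmean(rng.random() for _ in range(batch_size))
        out.append((m - mu) / (sigma / batch_size ** 0.5))
    return out

z = standardized_batch_means()
frac_within_one = sum(abs(v) <= 1.0 for v in z) / len(z)
# For a standard normal, roughly 68% of draws fall within one standard deviation.
```

A randomized counterpart, in the paper's sense, would replace the common factor `sigma / sqrt(n)` by a random scale drawn per component.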
NASA Astrophysics Data System (ADS)
Soltanzadeh, I.; Azadi, M.; Vakili, G. A.
2011-07-01
Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS), and for HRM the initial and boundary conditions come from the analysis of the Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days, using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histograms and attributes diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast.
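The calibration step can be sketched as a small EM fit of Gaussian BMA weights. This is a simplified sketch of the general Raftery-style formulation, not the study's exact setup: it omits per-member bias correction, uses a single common spread, and all names are ours.

```python
import math

def _normpdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bma_fit(forecasts, obs, iters=200):
    """EM estimation of Gaussian BMA weights and a common spread sigma.
    forecasts: one list of past forecasts per member; obs: verifying data."""
    k, t = len(forecasts), len(obs)
    w = [1.0 / k] * k
    # initialize sigma with the RMSE of the equal-weight ensemble mean
    sigma = max(1e-6, math.sqrt(
        sum((sum(f[j] for f in forecasts) / k - obs[j]) ** 2 for j in range(t)) / t))
    for _ in range(iters):
        # E-step: responsibility of member i for observation j
        z = [[w[i] * _normpdf(obs[j], forecasts[i][j], sigma) for j in range(t)]
             for i in range(k)]
        for j in range(t):
            s = sum(z[i][j] for i in range(k))
            for i in range(k):
                z[i][j] /= s
        # M-step: weights are mean responsibilities; sigma is refitted
        w = [sum(zi) / t for zi in z]
        sigma = max(1e-6, math.sqrt(
            sum(z[i][j] * (obs[j] - forecasts[i][j]) ** 2
                for i in range(k) for j in range(t)) / t))
    return w, sigma

def bma_mean(forecasts, w):
    """Deterministic-style BMA forecast: the weighted ensemble mean."""
    return [sum(wi * f[j] for wi, f in zip(w, forecasts))
            for j in range(len(forecasts[0]))]
```

In a toy training sample where one member tracks the observations and another is essentially noise, the EM fit concentrates nearly all weight on the skillful member.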
Stochastic Watershed Models for Risk Based Decision Making
NASA Astrophysics Data System (ADS)
Vogel, R. M.
2017-12-01
Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors, to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
NASA Astrophysics Data System (ADS)
Yang, Xiu-Qun; Yang, Dejian; Xie, Qian; Zhang, Yaocun; Ren, Xuejuan; Tang, Youmin
2017-04-01
Based on historical forecasts of three quasi-operational multi-model ensemble (MME) systems, this study assesses the superiority of a coupled MME over the contributing single-model ensembles (SMEs) and over an uncoupled atmospheric MME in predicting the Western North Pacific-East Asian summer monsoon variability. The probabilistic and deterministic forecast skills are measured by the Brier skill score (BSS) and anomaly correlation (AC), respectively. A forecast-format dependent MME superiority over SMEs is found. The probabilistic forecast skill of the MME is always significantly better than that of each SME, while the deterministic forecast skill of the MME can be lower than that of some SMEs. The MME superiority arises from both the model diversity and the ensemble size increase in the tropics, and primarily from the ensemble size increase in the subtropics. The BSS is composed of reliability and resolution, two attributes characterizing probabilistic forecast skill. The probabilistic skill increase of the MME is dominated by the dramatic improvement in reliability, while resolution is not always improved, similar to AC. A monotonic resolution-AC relationship is further found and qualitatively explained, whereas little relationship can be identified between reliability and AC. It is argued that the MME's success in improving the reliability arises from an effective reduction of the overconfidence in forecast distributions. Moreover, the seasonal predictions with the coupled MME are shown to be more skillful than those with the uncoupled atmospheric MME forced by persisting sea surface temperature (SST) anomalies, since the coupled MME better predicts the SST anomaly evolution in three key regions.
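The reliability and resolution attributes of the BSS mentioned above come from the standard Murphy partition of the Brier score, BS = REL - RES + UNC. A minimal sketch follows; the bin count and names are ours, and the identity is exact only when forecasts within a bin share one probability value.

```python
from collections import defaultdict

def brier_decomposition(probs, outcomes, n_bins=10):
    """Murphy decomposition of the Brier score for binary events:
    BS = reliability - resolution + uncertainty.
    probs: forecast probabilities in [0, 1]; outcomes: 0/1 observations."""
    n = len(probs)
    base = sum(outcomes) / n  # climatological base rate
    bins = defaultdict(list)
    for p, o in zip(probs, outcomes):
        bins[min(int(p * n_bins), n_bins - 1)].append((p, o))
    rel = res = 0.0
    for members in bins.values():
        nk = len(members)
        pk = sum(p for p, _ in members) / nk  # mean forecast in bin
        ok = sum(o for _, o in members) / nk  # observed frequency in bin
        rel += nk * (pk - ok) ** 2
        res += nk * (ok - base) ** 2
    return rel / n, res / n, base * (1.0 - base)
```

Low reliability (good calibration) and high resolution both raise the BSS; the uncertainty term depends only on the observations.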
Extended Range Prediction of Indian Summer Monsoon: Current status
NASA Astrophysics Data System (ADS)
Sahai, A. K.; Abhilash, S.; Borah, N.; Joseph, S.; Chattopadhyay, R.; S, S.; Rajeevan, M.; Mandal, R.; Dey, A.
2014-12-01
The main focus of this study is to develop a forecast consensus for the extended range prediction (ERP) of monsoon intraseasonal oscillations using a suite of different variants of the Climate Forecast System (CFS) model. In this CFS-based Grand MME prediction system (CGMME), the ensemble members are generated by perturbing the initial conditions and using different configurations of CFSv2, in order to address the role of different physical mechanisms known to control error growth in ERP on the 15-20-day time scale. The final formulation of CGMME is based on 21 ensemble members of the standalone Global Forecast System (GFS) forced with bias-corrected forecast SST from CFS, 11 low-resolution CFS T126 members and 11 high-resolution CFS T382 members. Thus, we develop the multi-model consensus forecast for the ERP of the Indian summer monsoon (ISM) using a suite of different variants of the CFS model. This coordinated international effort has led to the development of specific tailor-made regional forecast products over the Indian region. The skill of deterministic and probabilistic categorical rainfall forecasts, as well as verification of large-scale low-frequency monsoon intraseasonal oscillations, has been assessed using hindcasts from 2001-2012 for the monsoon season, in which all models are initialized every five days from 16 May to 28 September. The skill of the deterministic forecast from CGMME is better than that of the best participating single-model ensemble configuration (SME). The CGMME approach is believed to quantify the uncertainty in both initial conditions and model formulation. The main improvement is attained in the probabilistic forecast, owing to an increase in the ensemble spread, thereby reducing the error due to over-confident ensembles in a single model configuration. For the probabilistic forecast, three tercile ranges are determined by a ranking method based on the percentage of ensemble members from all the participating models that fall into those three categories.
CGMME further adds value to both deterministic and probabilistic forecasts compared to the raw SMEs; this better skill probably flows from the larger spread and the improved spread-error relationship. The CGMME system is currently capable of generating ER predictions in real time and has successfully delivered experimental operational ER forecasts of the ISM for the last few years.
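The tercile-counting step for the probabilistic forecast can be sketched as follows; the category boundaries here are hypothetical climatological terciles, and the pooled member values are illustrative.

```python
def tercile_probabilities(members, lower, upper):
    """Probability of below-, near- and above-normal rainfall as the
    fraction of pooled multi-model ensemble members falling in each
    climatological tercile category."""
    n = len(members)
    below = sum(m < lower for m in members) / n
    above = sum(m > upper for m in members) / n
    return below, 1.0 - below - above, above
```

Pooling members across all participating models before counting is what turns the single-model ensembles into a grand-ensemble probability forecast.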
Short-range solar radiation forecasts over Sweden
NASA Astrophysics Data System (ADS)
Landelius, Tomas; Lindskog, Magnus; Körnich, Heiner; Andersson, Sandra
2018-04-01
In this article the performance of short-range solar radiation forecasts by the global deterministic and ensemble models from the European Centre for Medium-Range Weather Forecasts (ECMWF) is compared with an ensemble of the regional mesoscale model HARMONIE-AROME used by the national meteorological services in Sweden, Norway and Finland. Note, however, that only the control members and the ensemble means are included in the comparison. The models' resolutions differ considerably: 18 km for the ECMWF ensemble, 9 km for the ECMWF deterministic model, and 2.5 km for the HARMONIE-AROME ensemble. The models share the same radiation code. It turns out that they all systematically underestimate the Direct Normal Irradiance (DNI) for clear-sky conditions. Apart from this shortcoming, the HARMONIE-AROME ensemble model shows the best agreement with the distribution of observed Global Horizontal Irradiance (GHI) and DNI values. During mid-day the HARMONIE-AROME ensemble mean performs best. The control member of the HARMONIE-AROME ensemble also scores better than the global deterministic ECMWF model. This is an interesting result, since mesoscale models have so far not shown good results when compared to the ECMWF models. Three days with clear, mixed and cloudy skies are used to illustrate the possible added value of a probabilistic forecast. It is shown that in these cases the mesoscale ensemble could provide decision support to a grid operator in terms of forecasts of both the amount of solar power and its probabilities.
Wind power application research on the fusion of deterministic and ensemble prediction
NASA Astrophysics Data System (ADS)
Lan, Shi; Lina, Xu; Yuzhu, Hao
2017-07-01
The fused wind speed product for the wind farm is designed using wind speed products of ensemble prediction from the European Centre for Medium-Range Weather Forecasts (ECMWF) and professional numerical model products for wind power based on Mesoscale Model 5 (MM5) and the Beijing Rapid Update Cycle (BJ-RUC), which are suitable for short-term wind power forecasting and electric dispatch. The single-valued forecast is formed by calculating different ensemble statistics of the Bayesian probabilistic forecast representing the uncertainty of the ECMWF ensemble prediction. An autoregressive integrated moving average (ARIMA) model is used to improve the time resolution of the single-valued forecast, and, based on Bayesian model averaging (BMA) and the deterministic numerical model prediction, the optimal wind speed forecasting curve and its confidence interval are provided. The results show that the fused forecast clearly improves accuracy relative to the existing numerical forecasting products. Compared with the existing 0-24 h deterministic forecast in the validation period, the mean absolute error (MAE) is decreased by 24.3 % and the correlation coefficient (R) is increased by 12.5 %. In comparison with the ECMWF ensemble forecast, the MAE is reduced by 11.7 % and R is increased by 14.5 %. Additionally, the MAE did not grow with increasing forecast lead time.
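As a deliberately simple stand-in for the fusion idea (not the paper's actual BMA formulation; the weighting rule and names are ours), one can blend a deterministic and an ensemble-derived forecast with weights inversely proportional to their recent mean absolute errors:

```python
def fuse_forecasts(f_det, f_ens, mae_det, mae_ens):
    """Skill-weighted blend of two forecast series: the member with the
    smaller recent MAE receives the larger weight."""
    w_det = (1.0 / mae_det) / (1.0 / mae_det + 1.0 / mae_ens)
    return [w_det * a + (1.0 - w_det) * b for a, b in zip(f_det, f_ens)]
```

BMA generalizes this: the weights become posterior model probabilities estimated over a training window, and the blend is a full predictive distribution rather than a single curve.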
A simple new filter for nonlinear high-dimensional data assimilation
NASA Astrophysics Data System (ADS)
Tödter, Julian; Kirchgessner, Paul; Ahrens, Bodo
2015-04-01
The ensemble Kalman filter (EnKF) and its deterministic variants, mostly square root filters such as the ensemble transform Kalman filter (ETKF), represent a popular alternative to variational data assimilation schemes and are applied in a wide range of operational and research activities. Their forecast step employs an ensemble integration that fully respects the nonlinear nature of the analyzed system. In the analysis step, they implicitly assume the prior state and observation errors to be Gaussian. Consequently, in nonlinear systems, the analysis mean and covariance are biased, and these filters remain suboptimal. In contrast, the fully nonlinear, non-Gaussian particle filter (PF) only relies on Bayes' theorem, which guarantees an exact asymptotic behavior, but because of the so-called curse of dimensionality it is exposed to weight collapse. This work shows how to obtain a new analysis ensemble whose mean and covariance exactly match the Bayesian estimates. This is achieved by a deterministic matrix square root transformation of the forecast ensemble, and subsequently a suitable random rotation that significantly contributes to filter stability while preserving the required second-order statistics. The forecast step remains as in the ETKF. The proposed algorithm, which is fairly easy to implement and computationally efficient, is referred to as the nonlinear ensemble transform filter (NETF). The properties and performance of the proposed algorithm are investigated via a set of Lorenz experiments. They indicate that such a filter formulation can increase the analysis quality, even for relatively small ensemble sizes, compared to other ensemble filters in nonlinear, non-Gaussian scenarios. Furthermore, localization enhances the potential applicability of this PF-inspired scheme in larger-dimensional systems. Finally, the novel algorithm is coupled to a large-scale ocean general circulation model. 
The NETF is stable, behaves reasonably, and shows good performance with a realistic ensemble size. The results confirm that, in principle, it can be applied successfully, and as simply as the ETKF, in high-dimensional problems without further modifications of the algorithm, even though it is only based on the particle weights. This proves that the suggested method constitutes a useful filter for nonlinear, high-dimensional data assimilation, and is able to overcome the curse of dimensionality even in deterministic systems.
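Our reading of the core analysis step described above (Bayesian particle weights followed by a deterministic matrix square-root transform) can be sketched with NumPy. This is a simplified sketch: the random mean-preserving rotation and any localization of the full scheme are omitted, and all names are ours.

```python
import numpy as np

def netf_analysis(X, y, H, R):
    """Nonlinear ensemble transform sketch: Gaussian-likelihood particle
    weights, then a matrix square-root transform so the new ensemble
    reproduces the Bayesian mean and covariance.
    X: state ensemble (n x m), y: observations, H: observation operator,
    R: observation error covariance."""
    n, m = X.shape
    d = y[:, None] - H @ X                        # innovation of each member
    q = np.einsum('ij,ik,kj->j', d, np.linalg.inv(R), d)
    w = np.exp(-0.5 * (q - q.min()))              # unnormalized Bayes weights
    w /= w.sum()
    xa = X @ w                                    # Bayesian (weighted) mean
    Xp = X - X.mean(axis=1, keepdims=True)        # forecast perturbations
    A = np.diag(w) - np.outer(w, w)               # A @ ones = 0: mean preserved
    vals, vecs = np.linalg.eigh(A)                # symmetric square root of A
    W = vecs @ np.diag(np.sqrt(np.clip(vals, 0.0, None))) @ vecs.T
    Xa = xa[:, None] + np.sqrt(m) * Xp @ W        # transformed analysis ensemble
    return Xa, w
```

With the 1/m covariance convention, the transformed ensemble's sample mean and covariance match the weighted (Bayesian) estimates exactly, which is the second-order property the filter is built on.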
NASA Astrophysics Data System (ADS)
Yan, Yajing; Barth, Alexander; Beckers, Jean-Marie; Candille, Guillem; Brankart, Jean-Michel; Brasseur, Pierre
2016-04-01
In this paper, four assimilation schemes, including an intermittent assimilation scheme (INT) and three incremental assimilation schemes (IAU 0, IAU 50 and IAU 100), are compared in the same assimilation experiments with a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. The three IAU schemes differ from each other in the position of the increment update window, which has the same size as the assimilation window; 0, 50 and 100 correspond to the degree of superposition of the increment update window on the current assimilation window. Sea surface height, sea surface temperature, and temperature profiles at depth collected between January and December 2005 are assimilated. Sixty ensemble members are generated by adding realistic noise to the forcing parameters related to the temperature. The ensemble is diagnosed and validated by comparison between the ensemble spread and the model/observation difference, as well as by rank histogram, before the assimilation experiments. The relevance of each assimilation scheme is evaluated through analyses of thermohaline variables and the current velocities. The results of the assimilation are assessed according to both deterministic and probabilistic metrics with independent/semi-independent observations. For deterministic validation, the ensemble means, together with the ensemble spreads, are compared to the observations in order to diagnose the ensemble distribution properties in a deterministic way. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system according to reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centered random variable (RCRV) score in order to investigate the reliability properties of the ensemble forecast system.
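The probabilistic metric used here, the CRPS, has a standard sample estimator for a single forecast-observation pair (a sketch; the study's actual implementation may differ):

```python
def crps_ensemble(members, obs):
    """Sample CRPS: mean |x_i - y| - 0.5 * mean |x_i - x_j|.
    Generalizes the absolute error: for a one-member 'ensemble' it
    reduces to the deterministic absolute error. Lower is better."""
    m = len(members)
    t1 = sum(abs(x - obs) for x in members) / m
    t2 = sum(abs(a - b) for a in members for b in members) / (2.0 * m * m)
    return t1 - t2
```

The second term rewards ensemble sharpness, which is why the CRPS can separate reliability from resolution when averaged and decomposed over many cases.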
NASA Technical Reports Server (NTRS)
McPeters, R.D.; Oltmans, Samuel J.
2000-01-01
NASA is creating a long term satellite ozone time series by combining data from multiple instruments: Nimbus 7 Total Ozone Mapping Spectrometer (TOMS) (1978 - 1993), Meteor 3 TOMS (1991 - 1994), Earth Probe TOMS (1996 - present), Nimbus 7 SBUV (1978 - 1990), NOAA-9 Solar Backscatter UV Spectrometer (SBUV/2) (1984 - 1997), NOAA-11 SBUV/2 (1989 - 1994), and NOAA-14 SBUV/2 (1995 - present). The stability of individual data sets and possible instrument-to-instrument differences are best checked by comparison with ground-based measurements. We have examined the time dependence of the calibrations of these instruments by comparing satellite derived ozone with that measured by the world primary standard Dobson spectrometer No. 83. This instrument has been maintained since 1962 as a standard for total ozone to an uncertainty of plus or minus 0.5%. Measurements of AD pair ozone made with instrument No. 83 at Mauna Loa observatory most summers since 1979 were compared with coincident TOMS and SBUV(/2) ozone measurements. The comparison shows that the various instruments were stable relative to instrument No. 83 to within about plus or minus 1%, but that there are instrument-to-instrument biases of as much as 3%. Earth Probe TOMS, for example, is 1% to 2% high relative to Nimbus 7 TOMS when the world standard instrument is used as a transfer standard. Similar results are seen when comparisons are made with an ensemble of 41 Dobson stations throughout the world, demonstrating that the ensemble as a whole is stable despite the fact that many instruments within the ensemble have clear calibration changes.
Predictability of short-range forecasting: a multimodel approach
NASA Astrophysics Data System (ADS)
García-Moya, Jose-Antonio; Callado, Alfons; Escribà, Pau; Santos, Carlos; Santos-Muñoz, Daniel; Simarro, Juan
2011-05-01
Numerical weather prediction (NWP) models (including mesoscale ones) have limitations when it comes to dealing with severe weather events, because extreme weather is highly unpredictable even in the short range. A probabilistic forecast based on an ensemble of slightly different model runs may help to address this issue. Among other ensemble techniques, multimodel ensemble prediction systems (EPSs) are proving useful for adding probabilistic value to mesoscale deterministic models. A Multimodel Short Range Ensemble Prediction System (SREPS) focused on forecasting the weather up to 72 h has been developed at the Spanish Meteorological Service (AEMET). The system uses five different limited area models (LAMs), namely HIRLAM (HIRLAM Consortium), HRM (DWD), the UM (UKMO), MM5 (PSU/NCAR) and COSMO (COSMO Consortium). These models run with initial and boundary conditions provided by five different global deterministic models, namely IFS (ECMWF), UM (UKMO), GME (DWD), GFS (NCEP) and CMC (MSC). AEMET-SREPS (AE) validation of the large-scale flow, using ECMWF analyses, shows a consistent and slightly underdispersive system. For surface parameters, the system shows high skill in forecasting binary events. 24-h precipitation probabilistic forecasts are verified using an up-scaling grid of observations from European high-resolution precipitation networks, and compared with the ECMWF-EPS (EC).
Cluster-based analysis of multi-model climate ensembles
NASA Astrophysics Data System (ADS)
Hyde, Richard; Hossaini, Ryan; Leeson, Amber A.
2018-06-01
Clustering, the automated grouping of similar data, can provide powerful and unique insight into large and complex data sets, in a fast and computationally efficient manner. While clustering has been used in a variety of fields (from medical image processing to economics), its application within atmospheric science has been fairly limited to date, and the potential benefits of applying advanced clustering techniques to climate data (both model output and observations) have yet to be fully realised. In this paper, we explore the specific application of clustering to a multi-model climate ensemble. We hypothesise that clustering techniques can provide (a) a flexible, data-driven method of testing model-observation agreement and (b) a mechanism with which to identify model development priorities. We focus our analysis on chemistry-climate model (CCM) output of tropospheric ozone, an important greenhouse gas, from the recent Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). Tropospheric column ozone from the ACCMIP ensemble was clustered using the Data Density based Clustering (DDC) algorithm. We find that a multi-model mean (MMM) calculated using members of the most-populous cluster identified at each location offers a reduction of up to ˜ 20 % in the global absolute mean bias between the MMM and an observed satellite-based tropospheric ozone climatology, with respect to a simple, all-model MMM. On a spatial basis, the bias is reduced at ˜ 62 % of all locations, with the largest bias reductions occurring in the Northern Hemisphere, where ozone concentrations are relatively large. However, the bias is unchanged at 9 % of all locations and increases at 29 %, particularly in the Southern Hemisphere. The latter demonstrates that although cluster-based subsampling acts to remove outlier model data, such data may in fact be closer to observed values in some locations.
We further demonstrate that clustering can provide a viable and useful framework in which to assess and visualise model spread, offering insight into geographical areas of agreement among models and a measure of diversity across an ensemble. Finally, we discuss caveats of the clustering techniques and note that while we have focused on tropospheric ozone, the principles underlying the cluster-based MMMs are applicable to other prognostic variables from climate models.
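At a single grid point, the cluster-based multi-model mean can be sketched as below. The gap-based one-dimensional grouping is a deliberately simple stand-in for the DDC algorithm used in the paper, and the threshold is arbitrary.

```python
def cluster_mmm(values, gap=1.0):
    """Multi-model mean from the most-populous cluster of model values at
    one grid point: sort the values and split into clusters wherever
    consecutive values differ by more than `gap`."""
    vs = sorted(values)
    clusters, cur = [], [vs[0]]
    for v in vs[1:]:
        if v - cur[-1] > gap:
            clusters.append(cur)
            cur = [v]
        else:
            cur.append(v)
    clusters.append(cur)
    best = max(clusters, key=len)
    return sum(best) / len(best)
```

For model values [0.9, 1.0, 1.1, 5.0] the most-populous-cluster mean is 1.0, whereas the simple all-model mean is 2.0, illustrating how an outlier member is excluded from the subsampled MMM.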
Multi-parametric variational data assimilation for hydrological forecasting
NASA Astrophysics Data System (ADS)
Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.
2017-12-01
Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and, consequently, to support better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach aims to represent meteorological uncertainty but neglects the uncertainty of the hydrological model as well as of its initial conditions. Complementary approaches use probabilistic data assimilation techniques to derive a variety of initial states, or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of the River Main in Germany. We use different parametric pools, each of them with five parameter sets, to assimilate streamflow data as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on the lead-time performance of perfect forecasts (i.e. observed data as forcing variables) as well as of deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% in CRPS performance and approximately 20% in Brier skill scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of rank histogram and produces a narrower ensemble spread.
A comparison of breeding and ensemble transform vectors for global ensemble generation
NASA Astrophysics Data System (ADS)
Deng, Guo; Tian, Hua; Li, Xiaoli; Chen, Jing; Gong, Jiandong; Jiao, Meiyan
2012-02-01
To compare the initial perturbation techniques using breeding vectors and ensemble transform vectors, three ensemble prediction systems using both initial perturbation methods but with different ensemble member sizes, based on the spectral model T213/L31, are constructed at the National Meteorological Center, China Meteorological Administration (NMC/CMA). A series of ensemble verification scores, such as forecast skill of the ensemble mean, ensemble resolution, and ensemble reliability, are introduced to identify the most important attributes of ensemble forecast systems. The results indicate that the ensemble transform technique is superior to the breeding vector method in light of the anomaly correlation coefficient (ACC), a deterministic attribute of the ensemble mean; the root-mean-square error (RMSE) and spread, which are probabilistic attributes; and the continuous ranked probability score (CRPS) and its decomposition. The advantage of the ensemble transform approach is attributed to the orthogonality among its ensemble perturbations as well as its consistency with the data assimilation system. Therefore, this study may serve as a reference for the configuration of the best ensemble prediction system to be used in operation.
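The ACC used here as the deterministic attribute of the ensemble mean is the Pearson correlation between forecast and verifying anomalies, i.e. departures from climatology. A minimal sketch, with hypothetical inputs:

```python
def anomaly_correlation(forecast, analysis, climatology):
    """Anomaly correlation coefficient: Pearson correlation between
    forecast and analysis anomalies (departures from climatology)."""
    fa = [f - c for f, c in zip(forecast, climatology)]
    va = [a - c for a, c in zip(analysis, climatology)]
    n = len(fa)
    mf, mv = sum(fa) / n, sum(va) / n
    num = sum((x - mf) * (y - mv) for x, y in zip(fa, va))
    den = (sum((x - mf) ** 2 for x in fa) *
           sum((y - mv) ** 2 for y in va)) ** 0.5
    return num / den
```

Removing climatology first prevents the seasonal cycle from inflating the score, which is why the ACC, rather than a raw correlation, is the standard deterministic skill measure.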
NASA Astrophysics Data System (ADS)
Keeble, James; Brown, Hannah; Abraham, N. Luke; Harris, Neil R. P.; Pyle, John A.
2018-06-01
Total column ozone values from an ensemble of UM-UKCA model simulations are examined to investigate different definitions of progress on the road to ozone recovery. The impacts of modelled internal atmospheric variability are accounted for by applying a multiple linear regression model to modelled total column ozone values, and ozone trend analysis is performed on the resulting ozone residuals. Three definitions of recovery are investigated: (i) a slowed rate of decline and the date of minimum column ozone, (ii) the identification of significant positive trends and (iii) a return to historic values. A return to past thresholds is the last state to be achieved. Minimum column ozone values, averaged from 60° S to 60° N, occur between 1990 and 1995 for each ensemble member, driven in part by the solar minimum conditions during the 1990s. When natural cycles are accounted for, identification of the year of minimum ozone in the resulting ozone residuals is uncertain, with minimum values for each ensemble member occurring at different times between 1992 and 2000. As a result of this large variability, identification of the date of minimum ozone constitutes a poor measure of ozone recovery. Trends for the 2000-2017 period are positive at most latitudes and are statistically significant in the mid-latitudes in both hemispheres when natural cycles are accounted for. This significance results largely from the large sample size of the multi-member ensemble. Significant trends cannot be identified by 2017 at the highest latitudes, due to the large interannual variability in the data, nor in the tropics, due to the small trend magnitude, although it is projected that significant trends may be identified in these regions soon thereafter. 
While significant positive trends in total column ozone could be identified at all latitudes by ˜ 2030, column ozone values which are lower than the 1980 annual mean can occur in the mid-latitudes until ˜ 2050, and in the tropics and high latitudes deep into the second half of the 21st century.
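The trend-identification step can be sketched as an ordinary least-squares slope with its t-statistic. This is a simplified stand-in: the study first removes natural cycles with a multiple linear regression and works on the residuals, and trend studies normally also correct for autocorrelation, neither of which this sketch does.

```python
def trend_with_tstat(series):
    """Least-squares linear trend of an annual series and the
    t-statistic of the slope; |t| greater than roughly 2 is a crude
    threshold for a significant trend."""
    n = len(series)
    xm = (n - 1) / 2.0
    ym = sum(series) / n
    sxx = sum((i - xm) ** 2 for i in range(n))
    sxy = sum((i - xm) * (y - ym) for i, y in enumerate(series))
    slope = sxy / sxx
    resid = [y - (ym + slope * (i - xm)) for i, y in enumerate(series)]
    se = (sum(r * r for r in resid) / (n - 2) / sxx) ** 0.5
    return slope, (slope / se if se > 0 else float('inf'))
```

A small trend magnitude (as in the tropics) or large interannual variability (as at high latitudes) both shrink the t-statistic, which matches the abstract's explanation of where significance cannot yet be established.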
Quantum Ensemble Classification: A Sampling-Based Learning Control Approach.
Chen, Chunlin; Dong, Daoyi; Qi, Bo; Petersen, Ian R; Rabitz, Herschel
2017-06-01
Quantum ensemble classification (QEC) has significant applications in discrimination of atoms (or molecules), separation of isotopes, and quantum information extraction. However, quantum mechanics forbids deterministic discrimination among nonorthogonal states. The classification of inhomogeneous quantum ensembles is very challenging, since there exist variations in the parameters characterizing the members within different classes. In this paper, we recast QEC as a supervised quantum learning problem. A systematic classification methodology is presented by using a sampling-based learning control (SLC) approach for quantum discrimination. The classification task is accomplished via simultaneously steering members belonging to different classes to their corresponding target states (e.g., mutually orthogonal states). First, a new discrimination method is proposed for two similar quantum systems. Then, an SLC method is presented for QEC. Numerical results demonstrate the effectiveness of the proposed approach for the binary classification of two-level quantum ensembles and the multiclass classification of multilevel quantum ensembles.
NASA Astrophysics Data System (ADS)
Lemaire, Vincent E. P.; Colette, Augustin; Menut, Laurent
2016-03-01
Because of its sensitivity to unfavorable weather patterns, air pollution is sensitive to climate change so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, the computing cost of such methods requires optimizing ensemble exploration techniques. By using a training data set from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for eight regions in Europe and developed statistical models that could be used to predict air pollutant concentrations. The evolution of the key climate variables driving either particulate or gaseous pollution allows selecting the members of the EuroCordex ensemble of regional climate projections that should be used in priority for future air quality projections (CanESM2/RCA4; CNRM-CM5-LR/RCA4 and CSIRO-Mk3-6-0/RCA4 and MPI-ESM-LR/CCLM following the EuroCordex terminology). After having tested the validity of the statistical model in predictive mode, we can provide ranges of uncertainty attributed to the spread of the regional climate projection ensemble by the end of the century (2071-2100) for the RCP8.5. In the three regions where the statistical model of the impact of climate change on PM2.5 offers satisfactory performances, we find a climate benefit (a decrease of PM2.5 concentrations under future climate) of -1.08 (±0.21), -1.03 (±0.32), -0.83 (±0.14) µg m-3, for respectively Eastern Europe, Mid-Europe and Northern Italy. In the British-Irish Isles, Scandinavia, France, the Iberian Peninsula and the Mediterranean, the statistical model is not considered skillful enough to draw any conclusion for PM2.5. 
In Eastern Europe, France, the Iberian Peninsula, Mid-Europe and Northern Italy, the statistical model of the impact of climate change on ozone was considered satisfactory, and it confirms a climate penalty on ozone of 10.51 (±3.06), 11.70 (±3.63), 11.53 (±1.55), 9.86 (±4.41) and 4.82 (±1.79) µg m-3, respectively. In the British-Irish Isles, Scandinavia and the Mediterranean, the skill of the statistical model was not considered robust enough to draw any conclusion for ozone pollution.
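A statistical model of this kind can be sketched as a simple regression of a pollutant concentration on its meteorological drivers, fitted on a training set and then applied to a perturbed-climate scenario. The sketch below uses invented drivers (temperature, boundary-layer height), invented coefficients and noise levels, and a uniform +3 K warming; none of these values come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "training" data: daily mean temperature (K) and boundary-layer
# height (m) as hypothetical meteorological drivers of summer ozone (ug/m3).
n = 500
temp = rng.normal(293.0, 5.0, n)
blh = rng.normal(1200.0, 300.0, n)
ozone = 60.0 + 2.5 * (temp - 293.0) - 0.01 * (blh - 1200.0) + rng.normal(0.0, 5.0, n)

# Fit a multiple linear regression by ordinary least squares.
X = np.column_stack([np.ones(n), temp, blh])
coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)

# Apply the trained model to a climate scenario with +3 K uniform warming.
ozone_future = coef[0] + coef[1] * (temp + 3.0) + coef[2] * blh
climate_penalty = ozone_future.mean() - ozone.mean()
print(round(climate_penalty, 2))  # close to 3 K times the fitted sensitivity
```

With an intercept in the regression, the mean shift equals exactly 3 times the fitted temperature coefficient, which is the kind of scenario-level signal the paper extracts from the EuroCordex members.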
Deterministically Entangling Two Remote Atomic Ensembles via Light-Atom Mixed Entanglement Swapping
Liu, Yanhong; Yan, Zhihui; Jia, Xiaojun; Xie, Changde
2016-01-01
Entanglement of two distant macroscopic objects is a key element for implementing large-scale quantum networks consisting of quantum channels and quantum nodes. Entanglement swapping can entangle two spatially separated quantum systems without direct interaction. Here we propose a scheme for deterministically entangling two remote atomic ensembles via continuous-variable entanglement swapping between two independent quantum systems involving light and atoms. Each of the two stationary atomic ensembles, placed at two remote nodes of a quantum network, is first prepared in a mixed entangled state of light and atoms. Then, entanglement swapping is unconditionally implemented between the two prepared quantum systems by means of balanced homodyne detection of light and feedback of the measured results. Finally, the established entanglement between the two macroscopic atomic ensembles is verified by the inseparability criterion of correlation variances between the two anti-Stokes optical beams coming from the two atomic ensembles. PMID:27165122
NASA Astrophysics Data System (ADS)
Federico, S.; Avolio, E.; Bellecci, C.; Colacino, M.; Walko, R. L.
2006-03-01
This paper reports preliminary results for a Limited area model Ensemble Prediction System (LEPS), based on RAMS (Regional Atmospheric Modelling System), for eight case studies of moderate to intense precipitation over Calabria, the southernmost tip of the Italian peninsula. LEPS aims to transfer the benefits of a probabilistic forecast from global to regional scales in countries where local orographic forcing is a key driver of convection. To accomplish this task, and to limit computational time in an operational implementation of LEPS, we perform a cluster analysis of ECMWF-EPS runs. Starting from the 51 members that form the ECMWF-EPS, we generate five clusters. For each cluster a representative member is selected and used to provide initial and dynamic boundary conditions to RAMS, whose integrations generate LEPS. RAMS runs have 12-km horizontal resolution. To analyze the impact of enhanced horizontal resolution on quantitative precipitation forecasts, LEPS forecasts are compared to a full Brute Force (BF) ensemble. This ensemble is also based on RAMS, has 36-km horizontal resolution, and is generated by 51 members, each nested in an ECMWF-EPS member. LEPS and BF results are compared subjectively and by objective scores. The subjective analysis is based on precipitation and probability maps of the case studies, whereas the objective analysis uses deterministic and probabilistic scores. Scores and maps are calculated by comparing ensemble precipitation forecasts against reports from the Calabria regional raingauge network. Results show that LEPS provided better rainfall predictions than BF for all selected case studies. This strongly suggests that, for these Calabrian cases, enhanced horizontal resolution matters more than ensemble population. To further explore the impact of local physiographic features on QPF (Quantitative Precipitation Forecasting), LEPS results are also compared with a 6-km horizontal resolution deterministic forecast.
Due to local and mesoscale forcing, the high-resolution forecast (Hi-Res) performs better than the ensemble mean for rainfall thresholds larger than 10 mm, but it tends to overestimate precipitation for lower amounts. This yields more false alarms, which have a detrimental effect on objective scores at lower thresholds. To exploit the advantages of a probabilistic forecast over a deterministic one, the relation between the ECMWF-EPS 700 hPa geopotential height spread and LEPS performance is analyzed. Results are promising, even if additional studies are required.
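The cluster-then-downscale step can be sketched with a minimal k-means on synthetic ensemble members: cluster the global members, then pick the member nearest each centroid as the representative that would feed the limited-area runs. The member count matches the abstract (51), but the field size, the three synthetic regimes (the paper uses five clusters) and the initialisation scheme are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the 51 ECMWF-EPS members: each row is a flattened field
# (e.g. a geopotential field on a small grid); three well-separated regimes.
members = np.concatenate([
    rng.normal(0.0, 1.0, (17, 20)),
    rng.normal(6.0, 1.0, (17, 20)),
    rng.normal(-6.0, 1.0, (17, 20)),
])

def cluster(data, k, iters=30):
    """Minimal k-means with farthest-point initialisation."""
    idx = [0]
    for _ in range(k - 1):
        d = np.linalg.norm(data[:, None] - data[idx][None], axis=2).min(axis=1)
        idx.append(int(d.argmax()))
    centroids = data[idx].copy()
    for _ in range(iters):
        labels = np.linalg.norm(data[:, None] - centroids[None], axis=2).argmin(axis=1)
        for j in range(k):
            centroids[j] = data[labels == j].mean(axis=0)
    return labels, centroids

labels, centroids = cluster(members, k=3)

# Representative member of each cluster: the member nearest its centroid; it
# would supply initial and boundary conditions for one limited-area run.
reps = [int(np.linalg.norm(members - c, axis=1).argmin()) for c in centroids]
print(len(set(reps)))  # one distinct representative per cluster
```

Running five limited-area integrations from such representatives, instead of 51, is what keeps the computational cost of LEPS manageable.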
NASA Technical Reports Server (NTRS)
Young, P. J.; Archibald, A. T.; Bowman, K. W.; Lamarque, J.-F.; Naik, V.; Stevenson, D. S.; Tilmes, S.; Voulgarakis, A.; Wild, O.; Bergmann, D.;
2013-01-01
Present-day tropospheric ozone and its changes between 1850 and 2100 are considered, analysing 15 global models that participated in the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). The ensemble mean compares well against present-day observations. The seasonal cycle correlates well, except at some locations in the tropical upper troposphere. Most (75%) of the models lie within the range of global-mean tropospheric ozone column estimates from satellite data, but there is a suggestion of a high bias in the Northern Hemisphere and a low bias in the Southern Hemisphere, which could indicate deficiencies in the ozone precursor emissions. Compared to the present-day ensemble mean tropospheric ozone burden of 337 ± 23 Tg, the ensemble mean burden for the 1850 time slice is approx. 30% lower. Future changes were modelled using emissions and climate projections from four Representative Concentration Pathways (RCPs). Compared to 2000, the relative changes in the ensemble mean tropospheric ozone burden in 2030 (2100) for the different RCPs are: -4% (-16%) for RCP2.6, 2% (-7%) for RCP4.5, 1% (-9%) for RCP6.0, and 7% (18%) for RCP8.5. Model agreement on the magnitude of the change is greatest for larger changes. Reductions in most precursor emissions are common across the RCPs and drive ozone decreases in all but RCP8.5, where doubled methane and a 40-150% greater stratospheric influx (estimated from a subset of models) increase ozone. While models with a high ozone burden for the present day also have high ozone burdens for the other time slices, no model consistently predicts large or small ozone changes; i.e. the magnitudes of the burdens and burden changes do not appear to be simply related, and the models are sensitive to emissions and climate changes in different ways.
Spatial patterns of ozone changes are well correlated across most models, but are notably different for models without time-evolving stratospheric ozone concentrations. A unified approach to ozone budget specifications and a rigorous investigation of the factors that drive tropospheric ozone are recommended to help future studies attribute ozone changes and inter-model differences more clearly.
NASA Astrophysics Data System (ADS)
Dixon, Kenneth
A lightning data assimilation technique is developed for use with observations from the World Wide Lightning Location Network (WWLLN). The technique nudges the water vapor mixing ratio toward saturation within 10 km of a lightning observation. This technique is applied to deterministic forecasts of convective events on 29 June 2012, 17 November 2013, and 19 April 2011 as well as an ensemble forecast of the 29 June 2012 event using the Weather Research and Forecasting (WRF) model. Lightning data are assimilated over the first 3 hours of the forecasts, and the subsequent impact on forecast quality is evaluated. The nudged deterministic simulations for all events produce composite reflectivity fields that are closer to observations. For the ensemble forecasts of the 29 June 2012 event, the improvement in forecast quality from lightning assimilation is more subtle than for the deterministic forecasts, suggesting that the lightning assimilation may improve ensemble convective forecasts where conventional observations (e.g., aircraft, surface, radiosonde, satellite) are less dense or unavailable.
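The nudging step described above can be sketched on a toy 2-D grid: build a mask of points within 10 km of any lightning stroke and relax the water-vapour mixing ratio toward saturation there. Only the 10-km radius comes from the abstract; the grid spacing, field values, nudging weight and stroke locations are invented.

```python
import numpy as np

# Toy 2-D model grid (dx = 5 km) with a water-vapour mixing ratio field (g/kg)
# and its saturation value; both are hypothetical stand-ins for WRF fields.
nx, ny, dx = 40, 40, 5.0
qv = np.full((nx, ny), 8.0)
qv_sat = np.full((nx, ny), 12.0)

# Hypothetical WWLLN strokes as (x, y) positions in km.
strokes = [(50.0, 50.0), (120.0, 80.0)]

x = np.arange(nx) * dx
y = np.arange(ny) * dx
X, Y = np.meshgrid(x, y, indexing="ij")

# Nudge qv toward saturation within 10 km of any observed stroke.
alpha = 0.5   # nudging weight per assimilation step (assumed, not from the paper)
mask = np.zeros((nx, ny), dtype=bool)
for sx, sy in strokes:
    mask |= np.hypot(X - sx, Y - sy) <= 10.0

qv[mask] += alpha * (qv_sat[mask] - qv[mask])
print(qv.max(), qv.min())  # prints 10.0 8.0: nudged points move halfway to saturation
```

Moistening the column where lightning was observed encourages the model to initiate convection in the right places, which is why the nudged simulations produce reflectivity closer to observations.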
The effects of greenhouse gases on the Antarctic ozone hole in the past, present, and future
NASA Astrophysics Data System (ADS)
Newman, P. A.; Li, F.; Lait, L. R.; Oman, L.
2017-12-01
The Antarctic ozone hole is primarily caused by human-produced ozone-depleting substances (ODSs) such as chlorine-containing chlorofluorocarbons (CFCs) and bromine-containing halons. The large springtime ozone depletion relies on the very cold conditions of the Antarctic lower stratosphere and the general containment of air by the polar night jet over Antarctica. Here we use the Goddard Earth Observing System Chemistry-Climate Model (GEOSCCM), a coupled ocean-atmosphere-chemistry model, to explore the impact of increasing greenhouse gases (GHGs). Model simulations covering the 1960-2010 period are shown for: 1) a control ensemble with observed levels of ODSs and GHGs, 2) an ensemble with fixed 1960 GHG concentrations, and 3) an ensemble with fixed 1960 ODS levels. We examine a similar set of simulations (control, 2005 fixed GHG levels, and 2005 fixed ODS levels) with a new version of GEOSCCM over the period 2005-2100. These future simulations show that the decrease of ODSs leads to similar ozone recovery for both the control run and the fixed-GHG scenario, in spite of GHG-forced changes to stratospheric ozone levels. The simulations demonstrate that GHG levels will have major impacts on the stratosphere by 2100, but only small impacts on the Antarctic ozone hole.
A Semi-empirical Model of the Stratosphere in the Climate System
NASA Astrophysics Data System (ADS)
Sodergren, A. H.; Bodeker, G. E.; Kremser, S.; Meinshausen, M.; McDonald, A.
2014-12-01
Chemistry-climate models (CCMs) currently used to project changes in Antarctic ozone are extremely computationally demanding. CCM projections are uncertain due to lack of knowledge of future emissions of greenhouse gases (GHGs) and ozone-depleting substances (ODSs), as well as parameterizations within the CCMs that have weakly constrained tuning parameters. While projections should be based on an ensemble of simulations, this is not currently possible due to the complexity of the CCMs. An inexpensive but realistic approach to simulating changes in stratospheric ozone, and its coupling to the climate system, is needed as a complement to CCMs. A simple climate model (SCM) can be used as a fast emulator of complex atmosphere-ocean climate models. If such an SCM includes a representation of stratospheric ozone, the evolution of the global ozone layer can be simulated for a wide range of GHG and ODS emissions scenarios. MAGICC is an SCM used in previous IPCC reports. In the current version of the MAGICC SCM, stratospheric ozone changes depend only on equivalent effective stratospheric chlorine (EESC). In this work, MAGICC is extended to include an interactive stratospheric ozone layer using a semi-empirical model of ozone responses to CO2 and EESC, with changes in ozone affecting the radiative forcing in the SCM. To demonstrate the ability of our new, extended SCM to generate projections of global changes in ozone, tuning parameters from 19 coupled atmosphere-ocean general circulation models (AOGCMs) and 10 carbon cycle models (to create an ensemble of 190 simulations) have been used to generate probability density functions of the dates of return of stratospheric column ozone to 1960 and 1980 levels for different latitudes.
Deterministic and storable single-photon source based on a quantum memory.
Chen, Shuai; Chen, Yu-Ao; Strassel, Thorsten; Yuan, Zhen-Sheng; Zhao, Bo; Schmiedmayer, Jörg; Pan, Jian-Wei
2006-10-27
A single-photon source is realized with a cold atomic ensemble (87Rb atoms). A single excitation, written in an atomic quantum memory by Raman scattering of a laser pulse, is retrieved deterministically as a single photon at a predetermined time. It is shown that the production rate of single photons can be enhanced considerably by a feedback circuit while the single-photon quality is conserved. Such a single-photon source is well suited for future large-scale realization of quantum communication and linear optical quantum computation.
The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments
NASA Astrophysics Data System (ADS)
Chen, Fajing; Jiao, Meiyan; Chen, Jing
2013-04-01
Public weather services are trending toward providing users with probabilistic weather forecasts in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to make the best use of available forecast information. The Bayesian Processor of Forecasts (BPF), a statistical method for probabilistic forecasting, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. Bayes' theorem for two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature.
These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
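The Bayesian update at the heart of the BPF is easiest to see in the normal-linear special case mentioned in the abstract (the meta-Gaussian BPF adds a normal quantile transform so the same machinery applies to non-Gaussian variables). Below is a minimal sketch of the conjugate update; the prior and likelihood parameters are invented for illustration.

```python
def normal_linear_bpf(x, m, s2, a, b, r2):
    """Posterior of predictand W given forecast x under the normal-linear BPF:
    climatological prior W ~ N(m, s2); likelihood X | W=w ~ N(a*w + b, r2).
    Returns the posterior mean and variance (standard conjugate update)."""
    prec = 1.0 / s2 + a * a / r2        # posterior precision
    mean = (m / s2 + a * (x - b) / r2) / prec
    return mean, 1.0 / prec

# Hypothetical climatological prior for surface temperature (deg C) and a
# forecast system with unit slope, +1 bias and error variance 4.
mean, var = normal_linear_bpf(x=6.0, m=2.0, s2=9.0, a=1.0, b=1.0, r2=4.0)
print(round(mean, 3), round(var, 3))  # prints 4.077 2.769
```

The posterior variance (2.769) is smaller than both the climatological variance (9) and the forecast error variance (4): this shrinkage is exactly how the BPF quantifies the uncertainty left in a deterministic forecast.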
NASA Astrophysics Data System (ADS)
Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.
2018-04-01
A long-term ensemble forecasting methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead-time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble rather than the ESP-based ensemble. We have found that, for the 35-year period beginning with the filling of the reservoir in 1982, both continuous and binary model-based ensemble forecasts (issued in deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of water regime characteristics beyond the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, are more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
NASA Astrophysics Data System (ADS)
Bennett, J.; David, R. E.; Wang, Q.; Li, M.; Shrestha, D. L.
2016-12-01
Flood forecasting in Australia has historically relied on deterministic forecasting models run only when floods are imminent, with considerable forecaster input and interpretation. These now co-exist with a continually available 7-day streamflow forecasting service (also deterministic) aimed at operational water management applications such as environmental flow releases. The 7-day service is not optimised for flood prediction. We describe progress on developing a system for ensemble streamflow forecasting that is suitable for both flood prediction and water management applications. Precipitation uncertainty is handled through post-processing of Numerical Weather Prediction (NWP) output with a Bayesian rainfall post-processor (RPP). The RPP corrects biases, downscales NWP output, and produces reliable ensemble spread. Ensemble precipitation forecasts are used to force a semi-distributed conceptual rainfall-runoff model. Uncertainty in precipitation forecasts is insufficient to reliably describe streamflow forecast uncertainty, particularly at shorter lead times. We therefore characterise hydrological prediction uncertainty separately with a 4-stage error model. The error model relies on data transformation to ensure residuals are homoscedastic and symmetrically distributed. To ensure streamflow forecasts are accurate and reliable, the residuals are modelled using a mixture-Gaussian distribution with distinct parameters for the rising and falling limbs of the forecast hydrograph. In a case study of the Murray River in south-eastern Australia, we show that ensemble predictions of floods generally have lower errors than deterministic forecasting methods. We also discuss some of the challenges in operationalising short-term ensemble streamflow forecasts in Australia, including meeting the need for accurate predictions across all flow ranges and comparing forecasts generated by event and continuous hydrological models.
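The limb-dependent residual idea can be sketched as follows. For brevity this uses a log transform and a single Gaussian per limb rather than the paper's 4-stage transformation and mixture-Gaussian model, and every parameter value is invented; the point is only that residuals are added in a variance-stabilised space with different spread on rising and falling limbs.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical deterministic hydrograph (m3/s) over 10 time steps.
flow = np.array([50., 80., 140., 220., 300., 260., 190., 130., 90., 70.])

# Rising limb where flow increases, falling limb otherwise.
rising = np.diff(flow, prepend=flow[0]) > 0

# Transform to stabilise variance (a plain log here; the paper's exact
# transformation is not specified in the abstract).
z = np.log(flow)

# Distinct residual spread per limb (assumed values, for illustration):
# larger uncertainty on the fast-changing rising limb.
sigma = np.where(rising, 0.10, 0.05)

# Sample a 100-member ensemble by adding limb-dependent Gaussian noise in
# transformed space and back-transforming.
ensemble = np.exp(z + sigma * rng.standard_normal((100, flow.size)))
spread = ensemble.std(axis=0)
print(spread[3] > spread[-1])  # rising-limb spread exceeds recession spread
```

Back-transforming the perturbed values keeps the ensemble positive and makes its spread grow with flow magnitude, which is the behaviour a homoscedastic residual model in transformed space is meant to deliver.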
Wave ensemble forecast system for tropical cyclones in the Australian region
NASA Astrophysics Data System (ADS)
Zieger, Stefan; Greenslade, Diana; Kepert, Jeffrey D.
2018-05-01
Forecasting of waves under extreme conditions such as tropical cyclones is vitally important for many offshore industries, but many challenges remain. For Northwest Western Australia (NW WA), wave forecasts issued by the Australian Bureau of Meteorology have previously been limited to products from deterministic operational wave models forced by deterministic atmospheric models. The wave models are run over global (resolution 1/4°) and regional (resolution 1/10°) domains with forecast ranges of +7 and +3 days, respectively. Because of this relatively coarse resolution (both in the wave models and in the forcing fields), the accuracy of these products is limited under tropical cyclone conditions. Given this limited accuracy, a new ensemble-based wave forecasting system for the NW WA region has been developed. To achieve this, a new dedicated 8-km resolution grid was nested in the global wave model. Over this grid, the wave model is forced with winds from a bias-corrected European Centre for Medium-Range Weather Forecasts atmospheric ensemble that comprises 51 members, to take into account the uncertainties in location, intensity and structure of a tropical cyclone system. A unique technique is used to select restart files for each wave ensemble member. The system is designed to operate in real time during the cyclone season, providing +10-day forecasts. This paper describes the wave forecast components of this system and presents the verification metrics and skill for specific events.
Operational value of ensemble streamflow forecasts for hydropower production: A Canadian case study
NASA Astrophysics Data System (ADS)
Boucher, Marie-Amélie; Tremblay, Denis; Luc, Perreault; François, Anctil
2010-05-01
Ensemble and probabilistic forecasts have many advantages over deterministic ones, both in meteorology and hydrology (e.g. Krzysztofowicz, 2001). Mainly, they inform the user about the uncertainty attached to the forecast. It has been argued that such additional information could lead to improved decision making (e.g. Wilks and Hamill, 1995; Mylne, 2002; Roulin, 2007), but very few studies concentrate on operational situations involving the use of such forecasts. In addition, many authors have demonstrated that ensemble forecasts outperform deterministic forecasts (e.g. Jaun et al., 2008; Velazquez et al., 2009; Laio and Tamea, 2007). However, such performance is mostly assessed with numerical scoring rules, which compare the forecasts to the observations, and seldom in terms of management gains. The proposed case study adopts an operational point of view, on the basis that a novel forecasting system has value only if it leads to increased monetary and societal gains (e.g. Murphy, 1994; Laio and Tamea, 2007). More specifically, Environment Canada operational ensemble precipitation forecasts are used to drive the HYDROTEL distributed hydrological model (Fortin et al., 1995), calibrated on the Gatineau watershed located in Québec, Canada. The resulting hydrological ensemble forecasts are then fed into Hydro-Québec's SOHO stochastic management optimization tool, which automatically searches for optimal operating decisions for all reservoirs and hydropower plants on the basin. The timeline of the study is the fall season of 2003. This period is especially relevant because of heavy precipitation that nearly caused a major spill and forced the preventive evacuation of part of the population located near one of the dams.
We show that the use of the ensemble forecasts would have reduced the occurrence of spills and flooding, which is of particular importance for dams located in populous areas, and increased hydropower production. The ensemble precipitation forecasts extend from March 1st, 2002 to December 31st, 2003. They were obtained using two atmospheric models, SEF (8 members plus the control deterministic forecast) and GEM (8 members). The corresponding deterministic precipitation forecast issued by the SEF model is also used within HYDROTEL in order to compare ensemble streamflow forecasts with their deterministic counterparts. Although this study does not incorporate all the sources of uncertainty, precipitation is certainly the most important input for hydrological modeling and conveys a great portion of the total uncertainty.
References:
Fortin, J.P., Moussa, R., Bocquillon, C. and Villeneuve, J.P. 1995: HYDROTEL, un modèle hydrologique distribué pouvant bénéficier des données fournies par la télédétection et les systèmes d'information géographique, Revue des Sciences de l'Eau, 8(1), 94-124.
Jaun, S., Ahrens, B., Walser, A., Ewen, T. and Schaer, C. 2008: A probabilistic view on the August 2005 floods in the upper Rhine catchment, Natural Hazards and Earth System Sciences, 8(2), 281-291.
Krzysztofowicz, R. 2001: The case for probabilistic forecasting in hydrology, Journal of Hydrology, 249, 2-9.
Laio, F. and Tamea, S. 2007: Verification tools for probabilistic forecasts of continuous hydrological variables, Hydrology and Earth System Sciences, 11, 1267-1277.
Murphy, A.H. 1994: Assessing the economic value of weather forecasts: An overview of methods, results and issues, Meteorological Applications, 1, 69-73.
Mylne, K.R. 2002: Decision-making from probability forecasts based on forecast value, Meteorological Applications, 9, 307-315.
Roulin, E. 2007: Skill and relative economic value of medium-range hydrological ensemble predictions, Hydrology and Earth System Sciences, 11, 725-737.
Velazquez, J.-A., Petit, T., Lavoie, A., Boucher, M.-A., Turcotte, R., Fortin, V. and Anctil, F. 2009: An evaluation of the Canadian global meteorological ensemble prediction system for short-term hydrological forecasting, Hydrology and Earth System Sciences, 13(11), 2221-2231.
Wilks, D.S. and Hamill, T.M. 1995: Potential economic value of ensemble-based surface weather forecasts, Monthly Weather Review, 123(12), 3565-3575.
Ocean state and uncertainty forecasts using HYCOM with the Local Ensemble Transform Kalman Filter (LETKF)
NASA Astrophysics Data System (ADS)
Wei, Mozheng; Hogan, Pat; Rowley, Clark; Smedstad, Ole-Martin; Wallcraft, Alan; Penny, Steve
2017-04-01
An ensemble forecast system based on the US Navy's operational HYCOM, using Local Ensemble Transform Kalman Filter (LETKF) technology, has been developed for ocean state and uncertainty forecasts. One advantage is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates the operational observations using an ensemble method. The background covariance during this assimilation process is supplied by the ensemble, which avoids the difficulty of developing tangent linear and adjoint models for 4D-Var from HYCOM's complicated hybrid isopycnal vertical coordinate. Another advantage is that the ensemble system provides a valuable uncertainty estimate corresponding to every state forecast from HYCOM. Uncertainty forecasts have proven critical for helping downstream users and managers make more scientifically sound decisions in the numerical prediction community. In addition, the ensemble mean is generally more accurate and skilful than a single traditional deterministic forecast at the same resolution. We will introduce the ensemble system design and setup, present some results from a 30-member ensemble experiment, and discuss scientific, technical and computational issues and challenges, such as covariance localization, inflation, model-related uncertainties and sensitivity to the ensemble size.
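The analysis step at the core of the LETKF can be sketched, without localization, as a plain ensemble transform Kalman filter update on a toy state. The dimensions, error statistics and direct-observation operator below are invented; the formulation follows the standard ensemble-space equations (weights for the mean plus a symmetric square-root transform for the perturbations).

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setup: a 5-variable state, directly observed at its first 3 components.
n, k, p = 5, 20, 3
H = np.eye(n)[:p]                     # observation operator
R = 0.05 * np.eye(p)                  # observation-error covariance

truth = np.linspace(0.0, 4.0, n)
Xb = (truth + 2.0)[:, None] + rng.standard_normal((n, k))  # biased background
y = H @ truth + np.sqrt(0.05) * rng.standard_normal(p)     # observations

# ETKF analysis in ensemble space (the core of the LETKF, minus localization).
xb = Xb.mean(axis=1)
Xp = Xb - xb[:, None]                 # background perturbations
Yp = H @ Xp                           # their image in observation space
C = Yp.T @ np.linalg.inv(R)
Pa = np.linalg.inv((k - 1) * np.eye(k) + C @ Yp)   # analysis cov, ens space
wa = Pa @ C @ (y - H @ xb)            # weights for the mean update
evals, evecs = np.linalg.eigh((k - 1) * Pa)
Wa = evecs @ np.diag(np.sqrt(evals)) @ evecs.T     # symmetric square root
Xa = xb[:, None] + Xp @ (wa[:, None] + Wa)         # analysis ensemble

err_b = np.linalg.norm(xb - truth)
err_a = np.linalg.norm(Xa.mean(axis=1) - truth)
print(err_a < err_b)  # observations pull the biased background toward truth
```

Everything is computed in the k-dimensional ensemble space, so no tangent linear or adjoint model is needed, which is precisely the advantage the abstract highlights over 4D-Var for HYCOM's hybrid vertical coordinate.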
Multimodel Ensemble Methods for Prediction of Wake-Vortex Transport and Decay
NASA Technical Reports Server (NTRS)
Korner, Stephan; Ahmad, Nashat N.; Holzapfel, Frank; VanValkenburg, Randal L.
2017-01-01
Several multimodel ensemble methods are selected and further developed to improve the deterministic and probabilistic prediction skills of individual wake-vortex transport and decay models. The different multimodel ensemble methods are introduced, and their suitability for wake applications is demonstrated. The selected methods include direct ensemble averaging, Bayesian model averaging, and Monte Carlo simulation. The different methodologies are evaluated employing data from wake-vortex field measurement campaigns conducted in the United States and Germany.
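The contrast between direct ensemble averaging and weighted combination can be sketched on synthetic data. The inverse-MSE weights below are a simple stand-in for the full Bayesian model averaging fit (which would estimate weights and member variances with an EM algorithm); the three "models", their error levels and the training sample are all invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical training sample: an observed wake-vortex quantity and
# predictions from three models with different bias/noise levels.
obs = rng.normal(10.0, 2.0, 200)
preds = np.stack([
    obs + rng.normal(0.0, 0.5, 200),   # good model
    obs + rng.normal(0.0, 1.0, 200),   # fair model
    obs + rng.normal(1.0, 2.0, 200),   # biased, noisy model
])

# Direct ensemble averaging: equal weights for every member.
direct = preds.mean(axis=0)

# Skill-weighted averaging: weights proportional to inverse training MSE.
mse = ((preds - obs) ** 2).mean(axis=1)
w = (1.0 / mse) / (1.0 / mse).sum()
weighted = w @ preds

rmse = lambda f: np.sqrt(((f - obs) ** 2).mean())
print(rmse(weighted) < rmse(direct))  # weighting toward skilful members helps
```

Down-weighting the biased, noisy member is what lifts the weighted combination above the plain average; a full BMA fit would additionally yield a predictive distribution, i.e. the probabilistic skill the abstract mentions.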
Hydro-economic assessment of hydrological forecasting systems
NASA Astrophysics Data System (ADS)
Boucher, M.-A.; Tremblay, D.; Delorme, L.; Perreault, L.; Anctil, F.
2012-01-01
An increasing number of publications show that ensemble hydrological forecasts exhibit good performance when compared to observed streamflow. Many studies also conclude that ensemble forecasts perform better than deterministic ones. This investigation takes one step further by not only comparing ensemble and deterministic forecasts to observed values, but by employing the forecasts in a stochastic decision-making assistance tool for hydroelectricity production, during a flood event on the Gatineau River in Canada. This allows the comparison of different types of forecasts according to their value in terms of energy, spillage and storage in a reservoir. The motivation is to adopt the point of view of an end-user, here a hydroelectricity producer. We show that ensemble forecasts exhibit excellent performance when compared to observations and are also satisfactory when used in operations management for electricity production. Further improvement in productivity can be reached through the use of a simple post-processing method.
An application of ensemble/multi model approach for wind power production forecast.
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Decimi, G.; Hagedorn, R.; Sperati, S.
2010-09-01
Wind power forecasts for the 3-day-ahead period are becoming increasingly important for reducing the problems of grid integration and energy price trading caused by growing wind power penetration. The accuracy of this forecast is therefore one of the most important requirements for a successful application. The wind power forecast is based on mesoscale meteorological models that provide the 3-day-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic errors caused, for instance, by an inaccurate representation of surface roughness or topography in the meteorological models. The corrected wind data are then passed through the wind farm power curve to obtain the power forecast. These computations require historical time series of measured wind data (from an anemometer located in the wind farm or on the nacelle) and power data in order to perform the statistical analysis on the past. For this purpose a Neural Network (NN) is trained on the past data and then applied in the forecast task. Because anemometer measurements are not always available in a wind farm, a different approach has also been adopted: the NN is trained to link the forecasted meteorological data directly to the power data. The normalized RMSE forecast error is lower in most cases with this second approach. We have examined two wind farms, one located in Denmark on flat terrain and one located in a mountainous area in the south of Italy (Sicily). In both cases we compare the performance of a prediction based on meteorological data from a single model with that obtained by using two or more models (RAMS, ECMWF deterministic, LAMI, HIRLAM). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error by at least 1% compared to the single-model approach. Moreover, the use of a deterministic global model (e.g. the ECMWF deterministic model) seems to reach a level of accuracy similar to that of the mesoscale models (LAMI and RAMS). Finally, we have focused on the possibility of using the ensemble model (ECMWF) to estimate the accuracy of the hourly, three-day-ahead power forecast. Contingency diagrams relating the RMSE of the deterministic power forecast to the spread of the ensemble wind forecast members have been produced. From this first analysis it seems that ensemble spread could be used as an indicator of forecast accuracy, at least for the first day ahead: low spread often corresponds to low forecast error. For longer forecast horizons the correlation between RMSE and ensemble spread decreases, becoming too low to be used for this purpose.
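The spread-skill diagnosis in the last paragraph can be sketched by correlating ensemble spread with the magnitude of the deterministic error on synthetic, well-calibrated data (all values invented): when spread genuinely modulates the error distribution, the two correlate positively.

```python
import numpy as np

rng = np.random.default_rng(5)

# A month of hourly forecasts where the per-hour ensemble spread drives the
# actual error magnitude, mimicking a well-calibrated spread-skill relation.
hours = 720
spread = rng.uniform(0.5, 3.0, hours)                # ensemble wind spread
error = np.abs(spread * rng.standard_normal(hours))  # deterministic |error|

r = np.corrcoef(spread, error)[0, 1]
print(r > 0.3)  # low spread tends to accompany low forecast error
```

A real contingency diagram would bin forecasts by spread and tabulate the RMSE in each bin; the positive correlation here is the property that makes spread usable as a day-1 accuracy indicator, and its decay at longer horizons is what limits the approach.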
NASA Astrophysics Data System (ADS)
Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter
2016-04-01
Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than those of thermal power plants, so there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to risks of power production deficits during droughts and to safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques is receiving growing attention, and a number of studies have shown their benefits. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region of Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, as well as deterministic and probabilistic forecasts with 50 ensemble members from the ECMWF, are used to force the MGB-IPH hydrological model and generate streamflow forecasts over a period of 2 years. The online optimization relies on deterministic and multi-stage stochastic versions of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days.
In comparison, the use of actual forecasts with shorter lead times of up to 15 days demonstrates the benefit attainable under operational conditions. It appears that stochastic optimization combined with ensemble forecasts leads to a significantly higher level of flood protection without compromising the HPP's energy production.
NASA Astrophysics Data System (ADS)
Allen, Douglas R.; Hoppel, Karl W.; Kuhl, David D.
2018-03-01
Extraction of wind and temperature information from stratospheric ozone assimilation is examined within the context of the Navy Global Environmental Model (NAVGEM) hybrid 4-D variational assimilation (4D-Var) data assimilation (DA) system. Ozone can improve the wind and temperature through two different DA mechanisms: (1) through the flow-of-the-day
ensemble background error covariance that is blended together with the static background error covariance and (2) via the ozone continuity equation in the tangent linear model and adjoint used for minimizing the cost function. All experiments assimilate actual conventional data in order to maintain a similar realistic troposphere. In the stratosphere, the experiments assimilate simulated ozone and/or radiance observations in various combinations. The simulated observations are constructed for a case study based on a 16-day cycling truth experiment (TE), which is an analysis with no stratospheric observations. The impact of ozone on the analysis is evaluated by comparing the experiments to the TE for the last 6 days, allowing for a 10-day spin-up. Ozone assimilation benefits the wind and temperature when data are of sufficient quality and frequency. For example, assimilation of perfect (no applied error) global hourly ozone data constrains the stratospheric wind and temperature to within ˜ 2 m s-1 and ˜ 1 K. This demonstrates that there is dynamical information in the ozone distribution that can potentially be used to improve the stratosphere. This is particularly important for the tropics, where radiance observations have difficulty constraining wind due to breakdown of geostrophic balance. Global ozone assimilation provides the largest benefit when the hybrid blending coefficient is an intermediate value (0.5 was used in this study), rather than 0.0 (no ensemble background error covariance) or 1.0 (no static background error covariance), which is consistent with other hybrid DA studies. When perfect global ozone is assimilated in addition to radiance observations, wind and temperature error decreases of up to ˜ 3 m s-1 and ˜ 1 K occur in the tropical upper stratosphere. 
Assimilation of noisy global ozone (2 % errors applied) results in error reductions of ˜ 1 m s-1 and ˜ 0.5 K in the tropics and slightly increased temperature errors in the Northern Hemisphere polar region. Reduction of the ozone sampling frequency also reduces the benefit of ozone throughout the stratosphere, with noisy polar-orbiting data having only minor impacts on wind and temperature when assimilated with radiances. An examination of ensemble cross-correlations between ozone and other variables shows that a single ozone observation behaves like a potential vorticity (PV) charge, or a monopole of PV, with rotation about a vertical axis and a vertically oriented temperature dipole. Further understanding of this relationship may help in designing observation systems that would optimize the impact of ozone on the dynamics.
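The hybrid blending described in the abstract above can be sketched numerically as a weighted combination of a static and an ensemble-derived covariance. The snippet below is an illustrative sketch only, not the NAVGEM implementation; the function name, the toy dimensions and the Gaussian ensemble are assumptions.

```python
import numpy as np

def hybrid_covariance(B_static, ensemble, beta=0.5):
    """Blend a static background error covariance with the
    flow-of-the-day ensemble covariance.

    beta = 0.0 -> purely static, beta = 1.0 -> purely ensemble;
    the study reports the largest benefit at an intermediate value.
    """
    anomalies = ensemble - ensemble.mean(axis=0)               # (N, n)
    B_ens = anomalies.T @ anomalies / (ensemble.shape[0] - 1)  # sample covariance
    return (1.0 - beta) * B_static + beta * B_ens

# toy example: 3-variable state, 20-member ensemble
rng = np.random.default_rng(0)
B_static = np.eye(3)
ens = rng.normal(size=(20, 3))
B = hybrid_covariance(B_static, ens, beta=0.5)
```

Setting beta to 0.0 or 1.0 recovers the two degenerate cases the abstract compares against.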
Creating a Satellite-Based Record of Tropospheric Ozone
NASA Technical Reports Server (NTRS)
Oetjen, Hilke; Payne, Vivienne H.; Kulawik, Susan S.; Eldering, Annmarie; Worden, John; Edwards, David P.; Francis, Gene L.; Worden, Helen M.
2013-01-01
The TES retrieval algorithm has been applied to IASI radiances. We compare the retrieved ozone profiles with ozone sonde profiles for mid-latitudes for the year 2008. We find a positive bias in the IASI ozone profiles in the UTLS region of up to 22 %. The spatial coverage of the IASI instrument allows sampling of effectively the same air mass with several IASI scenes simultaneously. Comparisons of the root-mean-square of an ensemble of IASI profiles to theoretical errors indicate that the measurement noise and the interference of temperature and water vapour on the retrieval together mostly explain the empirically derived random errors. The total degrees of freedom for signal of the retrieval for ozone are 3.1 +/- 0.2 and the tropospheric degrees of freedom are 1.0 +/- 0.2 for the described cases. IASI ozone profiles agree within the error bars with coincident ozone profiles derived from a TES stare sequence for the ozone sonde station at Bratt's Lake (50.2 deg N, 104.7 deg W).
Evaluation of the Plant-Craig stochastic convection scheme in an ensemble forecasting system
NASA Astrophysics Data System (ADS)
Keane, R. J.; Plant, R. S.; Tennant, W. J.
2015-12-01
The Plant-Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed in comparison with the standard convection scheme with a simple stochastic element only, from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered, over the month of July 2009. Deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant-Craig parameterization is found to improve probabilistic forecast measures, particularly the results for lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant-Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements found are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.
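Probabilistic precipitation forecasts of this kind are commonly scored with the Brier score for threshold exceedance. The toy example below (3 invented members and cases, rather than the 24-member MOGREPS-R configuration) sketches the computation.

```python
import numpy as np

def brier_score(p_forecast, occurred):
    """Brier score: mean squared difference between the forecast
    probability of an event and its observed occurrence (0 or 1)."""
    p = np.asarray(p_forecast, dtype=float)
    o = np.asarray(occurred, dtype=float)
    return np.mean((p - o) ** 2)

# forecast probability = fraction of ensemble members exceeding
# the precipitation threshold (toy 3-member, 3-case example, mm)
members = np.array([[0.2, 5.1, 0.0],
                    [1.4, 3.8, 0.1],
                    [0.0, 6.0, 0.3]])
threshold = 1.0
p = (members > threshold).mean(axis=0)   # exceedance probability per case
observed = np.array([0, 1, 0])           # did the event occur?
bs = brier_score(p, observed)            # lower is better
```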
Hong-Ou-Mandel Interference between Two Deterministic Collective Excitations in an Atomic Ensemble
NASA Astrophysics Data System (ADS)
Li, Jun; Zhou, Ming-Ti; Jing, Bo; Wang, Xu-Jie; Yang, Sheng-Jun; Jiang, Xiao; Mølmer, Klaus; Bao, Xiao-Hui; Pan, Jian-Wei
2016-10-01
We demonstrate deterministic generation of two distinct collective excitations in one atomic ensemble, and we realize the Hong-Ou-Mandel interference between them. Using Rydberg blockade we create single collective excitations in two different Zeeman levels, and we use stimulated Raman transitions to perform a beam-splitter operation between the excited atomic modes. By converting the atomic excitations into photons, the two-excitation interference is measured by photon coincidence detection with a visibility of 0.89(6). The Hong-Ou-Mandel interference witnesses an entangled NOON state of the collective atomic excitations, and we demonstrate its two times enhanced sensitivity to a magnetic field compared with a single excitation. Our work implements a minimal instance of boson sampling and paves the way for further multimode and multiexcitation studies with collective excitations of atomic ensembles.
The GMAO Hybrid Ensemble-Variational Atmospheric Data Assimilation System: Version 2.0
NASA Technical Reports Server (NTRS)
Todling, Ricardo; El Akkraoui, Amal
2018-01-01
This document describes the implementation and usage of the Goddard Earth Observing System (GEOS) Hybrid Ensemble-Variational Atmospheric Data Assimilation System (Hybrid EVADAS). Its aim is to provide comprehensive guidance to users of GEOS ADAS interested in experimenting with its hybrid functionalities. The document also aims to provide a short summary of the state of the science in this release of the hybrid system. As explained here, the ensemble data assimilation system (EnADAS) mechanism added to GEOS ADAS to enable hybrid data assimilation applications has been introduced to the pre-existing machinery of GEOS in the least intrusive way possible. Only very minor changes have been made to the original scripts controlling GEOS ADAS, with the objective of facilitating its usage by both researchers and the GMAO's near-real-time Forward Processing applications. In a hybrid scenario, two data assimilation systems run concurrently in a two-way feedback mode such that the ensemble provides the background ensemble perturbations required by the deterministic (typically high-resolution) hybrid analysis of the ADAS, and the deterministic ADAS provides analysis information for recentering the EnADAS analyses as well as the information necessary to ensure that observation bias correction procedures are consistent between the deterministic ADAS and the EnADAS. The non-intrusive approach to introducing the hybrid capability to GEOS ADAS means, in particular, that previously existing features continue to be available. Thus, not only is this upgraded version of GEOS ADAS capable of supporting new applications such as Hybrid 3D-Var, 3D-EnVar, 4D-EnVar and Hybrid 4D-EnVar; it remains possible to use GEOS ADAS in its traditional 3D-Var mode, which was used in both MERRA and MERRA-2. Furthermore, as described in this document, GEOS ADAS also supports a configuration for exercising a purely ensemble-based assimilation strategy, which can be fully decoupled from its variational component.
We should point out that Release 1.0 of this document was made available to GMAO in mid-2013, when we introduced the Hybrid 3D-Var capability to GEOS ADAS. That initial version of the documentation included a considerably different state-of-science introductory section but much of the same detailed description of the mechanisms of GEOS EnADAS. We are glad to report that several of the desirable future works listed in Release 1.0 have now been added to the present version of GEOS EnADAS. These include the ability to exercise an Ensemble Prediction System that uses the ensemble analyses of GEOS EnADAS and (a very early, but functional, version of) a tool to support Ensemble Forecast Sensitivity and Observation Impact applications.
Stratospheric ozone levels and their role for the dynamic response to volcanic eruptions
NASA Astrophysics Data System (ADS)
Muthers, Stefan; Anet, Julien G.; Raible, Christoph C.; Brönnimann, Stefan; Arfeuille, Florian; Peter, Tom; Rozanov, Eugene; Shapiro, Alexander; Beer, Juerg; Steinhilber, Friedhelm; Brugnara, Yuri; Schmutz, Werner
2013-04-01
The role of different background ozone climatologies in the dynamic response to tropical volcanic eruptions is analyzed using an ensemble of simulations with the atmosphere-chemistry-ocean model SOCOL/MPIOM. In this sensitivity study, a single tropical eruption of Tambora size is applied to ensembles with either pre-industrial or present-day ozone concentrations. The analysis focuses on the characteristics of the Northern European winter warming pattern following the eruption, which has been identified after several eruptions in observations and in proxy data. The sensitivity study reveals a higher probability of a large and significant winter warming pattern with pre-industrial ozone levels when the dynamic response of the chemistry to the eruption is disabled in the model. The positive temperature anomaly is driven by a positive NAO-like pressure pattern that leads to the advection of warm Atlantic air towards Northern Europe. With present-day concentrations, winter warmings are also found in some ensemble members, but the overall probability is strongly reduced. It is shown that with pre-industrial ozone concentrations the coupling between positive anomalies of the polar vortex and the zonal wind in the troposphere is more effective, which could explain the higher likelihood of positive NAO-like pressure patterns and positive temperature anomalies in Northern Europe.
Multi-model ensembles for assessment of flood losses and associated uncertainty
NASA Astrophysics Data System (ADS)
Figueiredo, Rui; Schröter, Kai; Weiss-Motz, Alexander; Martina, Mario L. V.; Kreibich, Heidi
2018-05-01
Flood loss modelling is a crucial part of risk assessments. However, it is subject to large uncertainty that is often neglected. Most models available in the literature are deterministic, providing only single point estimates of flood loss, and large disparities tend to exist among them. Adopting any one such model in a risk assessment context is likely to lead to inaccurate loss estimates and sub-optimal decision-making. In this paper, we propose the use of multi-model ensembles to address these issues. This approach, which has been applied successfully in other scientific fields, is based on the combination of different model outputs with the aim of improving the skill and usefulness of predictions. We first propose a model rating framework to support ensemble construction, based on a probability tree of model properties, which establishes relative degrees of belief between candidate models. Using 20 flood loss models in two test cases, we then construct numerous multi-model ensembles, based both on the rating framework and on a stochastic method, differing in terms of participating members, ensemble size and model weights. We evaluate the performance of ensemble means, as well as their probabilistic skill and reliability. Our results demonstrate that well-designed multi-model ensembles represent a pragmatic approach to consistently obtain more accurate flood loss estimates and reliable probability distributions of model uncertainty.
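A weighted multi-model combination of the kind described can be sketched in a few lines. The weights and loss values below are invented for illustration; in the paper, the rating framework supplies the relative degrees of belief.

```python
import numpy as np

def weighted_ensemble_mean(predictions, weights):
    """Combine loss estimates from several deterministic flood loss
    models into one weighted ensemble mean per event.

    predictions : (n_models, n_events) array of model outputs
    weights     : relative degrees of belief (need not sum to 1)
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                        # normalize the weights
    return w @ np.asarray(predictions)     # weighted average per event

# three hypothetical loss models over four flood events (kEUR)
preds = [[10.0, 22.0, 5.0, 40.0],
         [12.0, 18.0, 6.0, 35.0],
         [ 8.0, 25.0, 4.0, 45.0]]
combined = weighted_ensemble_mean(preds, [2, 1, 1])
```

Keeping the member spread around this mean, rather than the mean alone, is what yields the probability distribution of model uncertainty the abstract refers to.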
The role of ensemble post-processing for modeling the ensemble tail
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert; Vannitsem, Stéphane
2016-04-01
Over the past decades, the numerical weather prediction community has witnessed a paradigm shift from deterministic to probabilistic forecasting and state estimation (Buizza and Leutbecher, 2015; Buizza et al., 2008), in an attempt to quantify the uncertainties associated with initial-condition and model errors. An important benefit of a probabilistic framework is the improved prediction of extreme events. However, one may ask to what extent such model estimates contain information on the occurrence probability of extreme events and how this information can be optimally extracted. Different approaches have been proposed and applied to real-world systems which, based on extreme value theory, allow the estimation of extreme-event probabilities conditional on forecasts and state estimates (Ferro, 2007; Friederichs, 2010). Using ensemble predictions generated with a model of low dimensionality, a thorough investigation is presented quantifying the change in predictability of extreme events associated with ensemble post-processing and other influencing factors, including the finite ensemble size, lead time, model assumptions and the use of different covariates (ensemble mean, maximum, spread, ...) for modeling the tail distribution. Tail modeling is performed by deriving extreme-quantile estimates using a peak-over-threshold representation (generalized Pareto distribution) or quantile regression. Common ensemble post-processing methods aim to improve mostly the ensemble mean and spread of a raw forecast (Van Schaeybroeck and Vannitsem, 2015). Conditional tail modeling, on the other hand, is a post-processing in itself, focusing on the tails only. It is therefore unclear how applying ensemble post-processing prior to conditional tail modeling impacts the skill of extreme-event predictions. This work investigates this question in detail.
References:
Buizza, Leutbecher, and Isaksen, 2008: Potential use of an ensemble of analyses in the ECMWF Ensemble Prediction System. Q. J. R. Meteorol. Soc. 134: 2051-2066.
Buizza and Leutbecher, 2015: The forecast skill horizon. Q. J. R. Meteorol. Soc. 141: 3366-3382.
Ferro, 2007: A probability model for verifying deterministic forecasts of extreme events. Weather and Forecasting 22(5): 1089-1100.
Friederichs, 2010: Statistical downscaling of extreme precipitation events using extreme value theory. Extremes 13: 109-132.
Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q. J. R. Meteorol. Soc. 141: 807-818.
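The peak-over-threshold tail modeling mentioned in the abstract above can be sketched with a simple method-of-moments fit of the generalized Pareto distribution (GPD). The estimator choice, the synthetic Gumbel data and all names below are assumptions made for brevity, not necessarily the method used in the study.

```python
import numpy as np

def gpd_fit_mom(excesses):
    """Method-of-moments fit of the GPD to threshold excesses;
    returns (shape xi, scale sigma)."""
    m, v = excesses.mean(), excesses.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

def gpd_quantile(p, xi, sigma):
    """Quantile of the GPD at probability p (0 < p < 1)."""
    return sigma / xi * ((1.0 - p) ** (-xi) - 1.0)

# peaks-over-threshold on a synthetic forecast variable
rng = np.random.default_rng(1)
data = rng.gumbel(loc=20.0, scale=5.0, size=5000)
u = np.quantile(data, 0.95)              # high threshold
exc = data[data > u] - u                 # threshold excesses
xi, sigma = gpd_fit_mom(exc)

# return level exceeded with probability p per sample:
# P(X > x) = zeta * (1 - F_GPD(x - u))
p = 1e-3
zeta = exc.size / data.size              # threshold exceedance rate
z_p = u + gpd_quantile(1.0 - p / zeta, xi, sigma)
```

In practice a maximum-likelihood fit and a covariate-dependent scale (conditioning on the ensemble mean or spread) would replace the unconditional fit shown here.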
Insights into the deterministic skill of air quality ensembles from the analysis of AQMEII data
A regional air quality forecasting system over Europe: the MACC-II daily ensemble production
NASA Astrophysics Data System (ADS)
Marécal, V.; Peuch, V.-H.; Andersson, C.; Andersson, S.; Arteta, J.; Beekmann, M.; Benedictow, A.; Bergström, R.; Bessagnet, B.; Cansado, A.; Chéroux, F.; Colette, A.; Coman, A.; Curier, R. L.; Denier van der Gon, H. A. C.; Drouin, A.; Elbern, H.; Emili, E.; Engelen, R. J.; Eskes, H. J.; Foret, G.; Friese, E.; Gauss, M.; Giannaros, C.; Guth, J.; Joly, M.; Jaumouillé, E.; Josse, B.; Kadygrov, N.; Kaiser, J. W.; Krajsek, K.; Kuenen, J.; Kumar, U.; Liora, N.; Lopez, E.; Malherbe, L.; Martinez, I.; Melas, D.; Meleux, F.; Menut, L.; Moinat, P.; Morales, T.; Parmentier, J.; Piacentini, A.; Plu, M.; Poupkou, A.; Queguiner, S.; Robertson, L.; Rouïl, L.; Schaap, M.; Segers, A.; Sofiev, M.; Thomas, M.; Timmermans, R.; Valdebenito, Á.; van Velthoven, P.; van Versendaal, R.; Vira, J.; Ung, A.
2015-03-01
This paper describes the pre-operational analysis and forecasting system developed during the MACC (Monitoring Atmospheric Composition and Climate) project and continued in the MACC-II (Monitoring Atmospheric Composition and Climate: Interim Implementation) European project to provide air quality services for the European continent. The paper gives an overall picture of its status at the end of MACC-II (summer 2014). This system is based on seven state-of-the-art models developed and run in Europe (CHIMERE, EMEP, EURAD-IM, LOTOS-EUROS, MATCH, MOCAGE and SILAM). These models are used to calculate multi-model ensemble products. The MACC-II system provides daily 96 h forecasts with hourly outputs of 10 chemical species/aerosols (O3, NO2, SO2, CO, PM10, PM2.5, NO, NH3, total NMVOCs and PAN + PAN precursors) over 8 vertical levels from the surface to 5 km height. The hourly analysis at the surface is done a posteriori for the past day using a selection of representative air quality data from European monitoring stations. The performance of the system is assessed daily, weekly and every 3 months (seasonally) through statistical indicators calculated using the available representative air quality data from European monitoring stations. Results for a case study show the ability of the median ensemble to forecast regional ozone pollution events. The time period of this case study is also used to illustrate that the median ensemble generally outperforms each of the individual models and that it remains robust even if two of the seven models are missing. The seasonal performance of the individual models and of the multi-model ensemble has been monitored since September 2009 for ozone, NO2 and PM10 and shows an overall improvement over time. The changes in the skill of the ensemble over the past two summers for ozone and the past two winters for PM10 are discussed in the paper.
While the evolution of the ozone scores is not significant, there are improvements for PM10 over the past two winters that can be at least partly attributed to new developments on aerosols in the seven individual models. Nevertheless, the year-to-year changes in the model and ensemble skills are also linked to the variability of the meteorological conditions and of the set of observations used to calculate the statistical indicators. In parallel, a scientific analysis of the results of the seven models and of the ensemble is carried out over the Mediterranean area because of the specificity of its meteorology and emissions. The system is robust in terms of production availability. Major efforts have been made in MACC-II towards the operationalisation of all its components. Foreseen developments and research for improving its performance are discussed in the conclusion.
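The robustness of a median ensemble to missing members, noted in the abstract above, is easy to illustrate. The forecast values below are invented; NaNs stand in for models missing on a given production day.

```python
import numpy as np

def median_ensemble(forecasts):
    """Grid-point-wise median of the available model forecasts;
    NaNs mark models that are missing on a given day."""
    return np.nanmedian(forecasts, axis=0)

# hypothetical surface-O3 forecasts (ppb) from 7 models at 4 sites
f = np.array([
    [60., 55., 70., 80.],
    [58., 50., 72., 85.],
    [65., 52., 68., 78.],
    [62., 54., 75., 90.],
    [59., 53., 71., 82.],
    [61., 51., 69., 79.],
    [63., 56., 73., 83.],
])
full = median_ensemble(f)

f2 = f.copy()
f2[:2] = np.nan                 # drop two of the seven models
reduced = median_ensemble(f2)   # median changes only slightly
```

Unlike the mean, the median is insensitive to a single outlying member, which is part of why it is the preferred combination in this production chain.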
Ozone Source Apportionment in CMAQ
Ozone source attribution has been used to support various policy purposes, including interstate transport (Cross-State Air Pollution Rule) by the U.S. EPA and ozone nonattainment area designations by state agencies. Common scientific applications include tracking intercontinental transport of ozone and ozone precursors and delineating anthropogenic and non-anthropogenic contributions to ozone in North America. As of the public release due in September 2013, CMAQ's Integrated Source Apportionment Method (ISAM) attributes PM EC/OC, sulfate, nitrate, ammonium, ozone and its precursors NOx and VOC to sectors/regions of users' interest. Although the peroxide-to-nitric-acid production ratio has been the most common indicator to distinguish NOx-limited ozone production from VOC-limited production, additional indicators are implemented, allowing for an ensemble decision based on a total of 9 available indicator ratios. Moreover, an alternative approach to ozone attribution, based on the idea of chemical sensitivity in a linearized system that has formed the basis of chemical treatment in forward DDM/backward adjoint tools, has been implemented in CMAQ. This method does not require categorization into either ozone regime. In this study, ISAM will simulate 2010 North America ozone using all of the above gas-phase attribution methods. The results are to be compared with zero-out differences for those sectors in the host model runs. In addition, ozone contribution wil
Assimilation of IASI partial tropospheric columns with an Ensemble Kalman Filter over Europe
NASA Astrophysics Data System (ADS)
Coman, A.; Foret, G.; Beekmann, M.; Eremenko, M.; Dufour, G.; Gaubert, B.; Ung, A.; Schmechtig, C.; Flaud, J.-M.; Bergametti, G.
2011-09-01
Partial lower tropospheric ozone columns provided by the IASI (Infrared Atmospheric Sounding Interferometer) instrument have been assimilated into a chemistry-transport model at continental scale (CHIMERE) using an Ensemble Kalman Filter (EnKF). Analyses are made for the month of July 2007 over the European domain. Launched in 2006 aboard the MetOp-A satellite, IASI shows high sensitivity for ozone in the free troposphere and low sensitivity at the ground; it is therefore important to evaluate whether assimilation of these observations can improve free tropospheric ozone, and possibly surface ozone. The analyses are validated against independent ozone observations from sondes, MOZAIC1 aircraft and ground-based stations (AIRBASE, the European air quality database) and compared with the free run of CHIMERE. These comparisons show a decrease in error of 6 parts per billion (ppb) in the free troposphere over the Frankfurt area, and also a reduction of the root mean square error (respectively bias) at the surface of 19% (33%) for more than 90% of the existing ground stations. This provides evidence of the potential of data assimilation of tropospheric IASI columns to better describe the tropospheric ozone distribution, including surface ozone, despite the lower sensitivity. The changes in concentration resulting from the observational constraints were quantified, and several geophysical explanations for the findings of this study were drawn. The corrections were most pronounced over Italy and the Mediterranean region: on average we noted a reduction of 8-9 ppb in the free troposphere with respect to the free run, and still a reduction of 5.5 ppb at the ground, likely due to a longer residence time of air masses in this region, associated with the general circulation pattern (i.e. dominant westerly circulation) and with persistent anticyclonic conditions over the Mediterranean basin.
This is an important geophysical result, since the ozone burden is large over this area, with impact on the radiative balance and air quality. 1 Measurements of OZone, water vapour, carbon monoxide and nitrogen oxides by in-service AIrbus airCraft ( http://mozaic.aero.obs-mip.fr/web/)
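A minimal sketch of a stochastic (perturbed-observation) EnKF analysis step of the kind used in such assimilation follows. This is the textbook update, not the CHIMERE system itself; the partial-column observation operator, all dimensions and the toy ozone profile are assumptions.

```python
import numpy as np

def enkf_update(X, y, H, R, rng):
    """Stochastic EnKF analysis step with perturbed observations.

    X : (n, N) ensemble of model states (columns are members)
    y : (m,)   observation vector (e.g. partial O3 columns)
    H : (m, n) linear observation operator
    R : (m, m) observation error covariance
    """
    n, N = X.shape
    Xp = X - X.mean(axis=1, keepdims=True)   # ensemble anomalies
    HXp = H @ Xp
    Pyy = HXp @ HXp.T / (N - 1) + R          # innovation covariance
    Pxy = Xp @ HXp.T / (N - 1)               # state-obs cross covariance
    K = Pxy @ np.linalg.inv(Pyy)             # Kalman gain
    # perturb observations so the analysis spread is statistically correct
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(0)
N, n = 40, 6
truth = np.linspace(30.0, 60.0, n)                   # toy O3 profile (ppb)
X = truth[:, None] + rng.normal(0, 5, size=(n, N))   # prior ensemble
H = np.zeros((2, n))
H[0, :3] = H[1, 3:] = 1 / 3.0                        # two partial columns
R = 4.0 * np.eye(2)
y = H @ truth + rng.normal(0, 2.0, size=2)           # simulated retrievals
Xa = enkf_update(X, y, H, R, rng)
```

The companion record below uses a square root variant (EnSRF), which avoids the observation perturbations by updating the anomalies deterministically.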
Assimilation of IASI partial tropospheric columns with an Ensemble Kalman Filter over Europe
NASA Astrophysics Data System (ADS)
Coman, A.; Foret, G.; Beekmann, M.; Eremenko, M.; Dufour, G.; Gaubert, B.; Ung, A.; Schmechtig, C.; Flaud, J.-M.; Bergametti, G.
2012-03-01
Partial lower tropospheric ozone columns provided by the IASI (Infrared Atmospheric Sounding Interferometer) instrument have been assimilated into a chemistry-transport model at continental scale (CHIMERE) using an Ensemble Square Root Kalman Filter (EnSRF). Analyses are made for the month of July 2007 over the European domain. Launched in 2006, aboard the MetOp-A satellite, IASI shows high sensitivity for ozone in the free troposphere and low sensitivity at the ground; therefore it is important to evaluate if assimilation of these observations can improve free tropospheric ozone, and possibly surface ozone. The analyses are validated against independent ozone observations from sondes, MOZAIC1 aircraft and ground based stations (AIRBASE - the European Air quality dataBase) and compared with respect to the free run of CHIMERE. These comparisons show a decrease in error of 6 parts-per-billion (ppb) in the free troposphere over the Frankfurt area, and also a reduction of the root mean square error (respectively bias) at the surface of 19% (33%) for more than 90% of existing ground stations. This provides evidence of the potential of data assimilation of tropospheric IASI columns to better describe the tropospheric ozone distribution, including surface ozone, despite the lower sensitivity. The changes in concentration resulting from the observational constraints were quantified and several geophysical explanations for the findings of this study were drawn. The corrections were most pronounced over Italy and the Mediterranean region, we noted an average reduction of 8-9 ppb in the free troposphere with respect to the free run, and still a reduction of 5.5 ppb at ground, likely due to a longer residence time of air masses in this part associated to the general circulation pattern (i.e. dominant western circulation) and to persistent anticyclonic conditions over the Mediterranean basin. 
This is an important geophysical result, since the ozone burden is large over this area, with impact on the radiative balance and air quality. 1 Measurements of OZone, water vapour, carbon monoxide and nitrogen oxides by in-service AIrbus airCraft (http://mozaic.aero.obs-mip.fr/web/).
Deterministic generation of multiparticle entanglement by quantum Zeno dynamics.
Barontini, Giovanni; Hohmann, Leander; Haas, Florian; Estève, Jérôme; Reichel, Jakob
2015-09-18
Multiparticle entangled quantum states, a key resource in quantum-enhanced metrology and computing, are usually generated by coherent operations exclusively. However, unusual forms of quantum dynamics can be obtained when environment coupling is used as part of the state generation. In this work, we used quantum Zeno dynamics (QZD), based on nondestructive measurement with an optical microcavity, to deterministically generate different multiparticle entangled states in an ensemble of 36 qubit atoms in less than 5 microseconds. We characterized the resulting states by performing quantum tomography, yielding a time-resolved account of the entanglement generation. In addition, we studied the dependence of quantum states on measurement strength and quantified the depth of entanglement. Our results show that QZD is a versatile tool for fast and deterministic entanglement generation in quantum engineering applications. Copyright © 2015, American Association for the Advancement of Science.
An application of ensemble/multi model approach for wind power production forecasting
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Pinson, P.; Hagedorn, R.; Decimi, G.; Sperati, S.
2011-02-01
Wind power forecasts for the 3-day-ahead period are becoming ever more useful and important for reducing the problems of grid integration and energy price trading caused by the increasing wind power penetration. It is therefore clear that the accuracy of this forecast is one of the most important requirements for a successful application. The wind power forecast applied in this study is based on meteorological models that provide the 3-day-ahead wind data. A Model Output Statistics correction is then performed to reduce systematic errors caused, for instance, by a wrong representation of surface roughness or topography in the meteorological models. For this purpose, a Neural Network (NN) has been trained to link directly the forecasted meteorological data and the power data. One wind farm located in a mountain area in the south of Italy (Sicily) has been examined. First, we compare the performance of a prediction based on meteorological data coming from a single model with that obtained by a combination of models (RAMS, ECMWF deterministic, LAMI). It is shown that the multi-model approach reduces the day-ahead normalized RMSE forecast error (normalized by nominal power) by at least 1% compared to the single-model approach. Finally, we have focused on the possibility of using the ensemble prediction system (EPS by ECMWF) to estimate the hourly, three-day-ahead power forecast accuracy. Contingency diagrams between the RMSE of the deterministic power forecast and the spread of the ensemble members' wind forecasts have been produced. From this first analysis it appears that the ensemble spread could be used as an indicator of the forecast accuracy, at least for the first three days ahead.
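The spread-skill check described at the end of the abstract can be sketched by binning cases on ensemble spread and computing the deterministic RMSE within each bin. The data below are synthetic, constructed so that the error scales with the spread; a real application would use the EPS wind spread and the observed power errors.

```python
import numpy as np

# bin forecasts by the ensemble spread of wind speed, then compare the
# RMSE of the deterministic power forecast within each spread bin
rng = np.random.default_rng(3)
n = 1000
spread = rng.gamma(2.0, 1.0, size=n)        # ensemble wind-speed spread
error = rng.normal(0.0, spread)             # toy: error grows with spread

# tercile bins of spread
bins = np.quantile(spread, [0.0, 0.33, 0.66, 1.0])
idx = np.digitize(spread, bins[1:-1])       # bin index 0, 1 or 2
rmse_per_bin = [np.sqrt(np.mean(error[idx == k] ** 2)) for k in range(3)]
# larger-spread bins should show larger RMSE if spread indicates skill
```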
NASA Astrophysics Data System (ADS)
Owens, Mathew J.; Riley, Pete
2017-11-01
Long lead-time space-weather forecasting requires accurate prediction of the near-Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near-Sun solar wind and magnetic field conditions provide the inner boundary condition to three-dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics-based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near-Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near-Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near-Sun solar wind speed at a range of latitudes about the sub-Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun-Earth line. Propagating these conditions to Earth by a three-dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one-dimensional "upwind" scheme is used. The variance in the resulting near-Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996-2016, the upwind ensemble is found to provide a more "actionable" forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large).
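A one-dimensional upwind propagation of this kind can be sketched as follows. This is an assumed form (constants, grid sizes and the boundary speed profile are invented): in a frame corotating with the Sun, the steady solar wind speed approximately satisfies dv/dr = (Ω/v) dv/dφ, which an explicit upwind step can march outward from the coronal model boundary, one such integration per ensemble member.

```python
import numpy as np

OMEGA = 2 * np.pi / (25.38 * 86400)   # solar rotation rate [rad/s]
AU = 1.496e11                         # astronomical unit [m]

def propagate(v_inner, r_inner, r_outer, nr):
    """March an inner-boundary speed profile v(phi) out to r_outer
    with an explicit upwind scheme (neglecting stream interactions
    beyond the kinematic corotation term)."""
    nphi = v_inner.size
    dphi = 2 * np.pi / nphi
    dr = (r_outer - r_inner) / nr
    v = v_inner.copy()
    for _ in range(nr):
        dvdphi = (np.roll(v, -1) - v) / dphi   # upwind difference
        v = v + dr * OMEGA / v * dvdphi
    return v

# toy inner boundary: a fast stream embedded in slow wind [m/s]
phi = np.linspace(0, 2 * np.pi, 128, endpoint=False)
v0 = 400e3 + 300e3 * (np.cos(phi) > 0.5)
v1 = propagate(v0, 30 * 6.96e8, 1 * AU, 2000)   # 30 Rs out to 1 AU
```

Running this for each of the N = 576 latitude-sampled boundary profiles is what makes the ensemble computationally cheap compared with a 3-D MHD run per member.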
Super Ensemble-based Aviation Turbulence Guidance (SEATG) for Air Traffic Management (ATM)
NASA Astrophysics Data System (ADS)
Kim, Jung-Hoon; Chan, William; Sridhar, Banavar; Sharman, Robert
2014-05-01
Super Ensemble-based Aviation Turbulence Guidance (SEATG), an ensemble of ten turbulence metrics computed from time-lagged ensemble members of weather forecast data, is developed using the Weather Research and Forecasting (WRF) model and in-situ eddy dissipation rate (EDR) observations from instruments on commercial aircraft over the contiguous United States. SEATG is a sequence of five procedures: weather modeling, calculating turbulence metrics, mapping to the EDR scale, evaluating metrics, and producing the final SEATG forecast. This follows a methodology similar to the operational Graphical Turbulence Guidance (GTG), with three major improvements. First, SEATG uses a higher-resolution (3-km) WRF model to capture cloud-resolving-scale phenomena. Second, SEATG computes turbulence metrics for multiple forecasts that are combined at the same valid time, resulting in a time-lagged ensemble of multiple turbulence metrics. Third, SEATG provides both deterministic and probabilistic turbulence forecasts to take into account weather uncertainties and user demands. The SEATG forecasts match well with observed radar reflectivity along a surface front, as well as with convectively induced turbulence outside the clouds, on 7-8 Sep 2012, and the overall skill of the deterministic SEATG against the observed EDR data during this period is superior to that of any single turbulence metric. Finally, probabilistic SEATG is used as an example application of turbulence forecasting for air-traffic management: a simple Wind-Optimal Route (WOR) passing through the potential areas of probabilistic SEATG and a Lateral Turbulence Avoidance Route (LTAR) taking the SEATG into account are calculated at z = 35,000 ft (z = 12 km) from Los Angeles to John F. Kennedy international airports.
As a result, the WOR takes a total of 239 minutes, including 16 minutes within SEATG areas of 40% moderate-turbulence potential, while the LTAR takes a total of 252 minutes, meaning roughly 5% more fuel would be consumed to avoid the moderate SEATG regions entirely.
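The core idea of combining time-lagged forecasts at a common valid time into deterministic and probabilistic products can be sketched as follows; the grid size, number of lagged cycles, EDR threshold, and gamma-distributed toy fields are assumptions for illustration only:

```python
import numpy as np

# Hypothetical EDR-scaled turbulence fields from successive model cycles,
# all valid at the same time (e.g. lead times of 1 h, 2 h, 3 h, ...).
rng = np.random.default_rng(0)
members = rng.gamma(2.0, 0.1, size=(6, 50, 50))  # 6 lagged runs, 50x50 grid

det = members.mean(axis=0)                   # deterministic: ensemble mean
prob_mod = (members > 0.3).mean(axis=0)      # probabilistic: P(EDR > threshold)
```

The deterministic field is simply the member mean at each grid point, while the probabilistic field is the fraction of members exceeding a turbulence threshold, mirroring how a time-lagged ensemble serves both kinds of user at no extra model cost.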
Rixen, M.; Ferreira-Coelho, E.; Signell, R.
2008-01-01
Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models is discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
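A hyper-ensemble in the sense above forms a local linear combination of model predictions, with weights fitted against past observations. The synthetic data and plain least-squares fit below are a minimal sketch of that idea, not the NURC implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Columns: hypothetical ocean-current, wind, and wave-model surface velocities
# collocated with past drifter observations.
X = rng.normal(size=(n, 3))
true_w = np.array([0.6, 0.3, 0.1])
y = X @ true_w + 0.05 * rng.normal(size=n)   # observed drifter velocity

# Fit local combination weights by least squares, then forecast.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
forecast = X @ w
```

Because the weights are fitted locally to recent observations, the combination can both reweight and bias-correct the individual models, which is where the skill gain over any single deterministic model comes from.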
Hybrid quantum processors: molecular ensembles as quantum memory for solid state circuits.
Rabl, P; DeMille, D; Doyle, J M; Lukin, M D; Schoelkopf, R J; Zoller, P
2006-07-21
We investigate a hybrid quantum circuit where ensembles of cold polar molecules serve as long-lived quantum memories and optical interfaces for solid state quantum processors. The quantum memory realized by collective spin states (ensemble qubit) is coupled to a high-Q stripline cavity via microwave Raman processes. We show that, for convenient trap-surface distances of a few micrometers, strong coupling between the cavity and ensemble qubit can be achieved. We discuss basic quantum information protocols, including a swap from the cavity photon bus to the molecular quantum memory, and a deterministic two qubit gate. Finally, we investigate coherence properties of molecular ensemble quantum bits.
Ozone Sensitivity to Varying Greenhouse Gases and Ozone-Depleting Substances in CCMI-1 Simulations
NASA Technical Reports Server (NTRS)
Morgenstern, Olaf; Stone, Kane A.; Schofield, Robyn; Akiyoshi, Hideharu; Yamashita, Yousuke; Kinnison, Douglas E.; Garcia, Rolando R.; Sudo, Kengo; Plummer, David A.; Scinocca, John; Oman, Luke D.; Manyin, Michael E.; Zeng, Guang; Rozanov, Eugene; Stenke, Andrea; Revell, Laura E.; Pitari, Giovanni; Mancini, Eva; Di Genova, Glauco; Visioni, Daniele; Dhomse, Sandip S.; Chipperfield, Martyn P.
2018-01-01
Ozone fields simulated for the first phase of the Chemistry-Climate Model Initiative (CCMI-1) will be used as forcing data in the 6th Coupled Model Intercomparison Project. Here we assess, using reference and sensitivity simulations produced for CCMI-1, the suitability of CCMI-1 model results for this process, investigating the degree of consistency amongst models regarding their responses to variations in individual forcings. We consider the influences of methane, nitrous oxide, a combination of chlorinated or brominated ozone-depleting substances, and a combination of carbon dioxide and other greenhouse gases. We find varying degrees of consistency in the models' responses in ozone to these individual forcings, including some considerable disagreement. In particular, the response of total-column ozone to these forcings is less consistent across the multi-model ensemble than profile comparisons. We analyse how stratospheric age of air, a commonly used diagnostic of stratospheric transport, responds to the forcings. For this diagnostic we find some salient differences in model behaviour, which may explain some of the findings for ozone. The findings imply that the ozone fields derived from CCMI-1 are subject to considerable uncertainties regarding the impacts of these anthropogenic forcings. We offer some thoughts on how to best approach the problem of generating a consensus ozone database from a multi-model ensemble such as CCMI-1.
A multiphysical ensemble system of numerical snow modelling
NASA Astrophysics Data System (ADS)
Lafaysse, Matthieu; Cluzet, Bertrand; Dumont, Marie; Lejeune, Yves; Vionnet, Vincent; Morin, Samuel
2017-05-01
Physically based multilayer snowpack models suffer from various modelling errors. To represent these errors, we built the new multiphysical ensemble system ESCROC (Ensemble System Crocus) by implementing new representations of different physical processes in the deterministic coupled multilayer ground/snowpack model SURFEX/ISBA/Crocus. This ensemble was driven and evaluated at Col de Porte (1325 m a.s.l., French Alps) over 18 years with a high-quality meteorological and snow data set. A total of 7776 simulations were evaluated separately, accounting for the uncertainties of the evaluation data. The ability of the ensemble to capture the uncertainty associated with modelling errors is assessed for snow depth, snow water equivalent, bulk density, albedo and surface temperature. Different sub-ensembles of the ESCROC system were studied with probabilistic tools to compare their performance. Results show that optimal members of the ESCROC system are able to explain more than half of the total simulation errors. Integrating members with biases exceeding the range corresponding to observational uncertainty is necessary to obtain an optimal dispersion, though this may also be a consequence of meteorological forcing uncertainties not being accounted for. The ESCROC system paves the way for the integration of numerical snow-modelling errors in ensemble forecasting and ensemble assimilation systems, in support of avalanche hazard forecasting and other snowpack-modelling applications.
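One standard probabilistic tool for comparing sub-ensembles such as those above is the sample-based continuous ranked probability score (CRPS); a generic implementation is sketched below (this is a textbook estimator, not the ESCROC evaluation code):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'|, lower is better.
    Rewards ensembles that are both sharp and centred on the observation."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

# A sharp, well-centred ensemble scores better than a broad one.
sharp = crps_ensemble([0.9, 1.0, 1.1], obs=1.0)
broad = crps_ensemble([0.0, 1.0, 2.0], obs=1.0)
```

For a point forecast the CRPS reduces to the absolute error, which is what makes it a convenient bridge between deterministic and ensemble verification.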
Effect of Recent Sea Surface Temperature Trends on the Arctic Stratospheric Vortex
NASA Technical Reports Server (NTRS)
Garfinkel, Chaim I.; Oman, Luke; Hurwitz, Margaret
2015-01-01
The springtime Arctic polar vortex has cooled significantly over the satellite era, with consequences for ozone concentrations in the springtime transition season. The causes of this cooling trend are deduced by using comprehensive chemistry-climate model experiments. Approximately half of the satellite era early springtime cooling trend in the Arctic lower stratosphere was caused by changing sea surface temperatures (SSTs). An ensemble of experiments forced only by changing SSTs is compared to an ensemble of experiments in which both the observed SSTs and chemically- and radiatively-active trace species are changing. By comparing the two ensembles, it is shown that warming of Indian Ocean, North Pacific, and North Atlantic SSTs, and cooling of the tropical Pacific, have strongly contributed to recent polar stratospheric cooling in late winter and early spring, and to a weak polar stratospheric warming in early winter. When concentrations of ozone-depleting substances and greenhouse gases are fixed, polar ozone concentrations show a small but robust decline due to changing SSTs. Ozone changes are magnified in the presence of changing gas concentrations. The stratospheric changes can be understood by examining the tropospheric height and heat flux anomalies generated by the anomalous SSTs. Finally, recent SST changes have contributed to a decrease in the frequency of late winter stratospheric sudden warmings.
Ensemble Bayesian forecasting system Part I: Theory and algorithms
NASA Astrophysics Data System (ADS)
Herr, Henry D.; Krzysztofowicz, Roman
2015-05-01
The ensemble Bayesian forecasting system (EBFS), whose theory was published in 2001, is developed for the purpose of quantifying the total uncertainty about a discrete-time, continuous-state, non-stationary stochastic process such as a time series of stages, discharges, or volumes at a river gauge. The EBFS is built of three components: an input ensemble forecaster (IEF), which simulates the uncertainty associated with random inputs; a deterministic hydrologic model (of any complexity), which simulates physical processes within a river basin; and a hydrologic uncertainty processor (HUP), which simulates the hydrologic uncertainty (an aggregate of all uncertainties except input). It works as a Monte Carlo simulator: an ensemble of time series of inputs (e.g., precipitation amounts) generated by the IEF is transformed deterministically through a hydrologic model into an ensemble of time series of outputs, which is next transformed stochastically by the HUP into an ensemble of time series of predictands (e.g., river stages). Previous research indicated that in order to attain an acceptable sampling error, the ensemble size must be on the order of hundreds (for probabilistic river stage forecasts and probabilistic flood forecasts) or even thousands (for probabilistic stage transition forecasts). The computing time needed to run the hydrologic model this many times renders the straightforward simulations operationally infeasible. This motivates the development of the ensemble Bayesian forecasting system with randomization (EBFSR), which takes full advantage of the analytic meta-Gaussian HUP and generates multiple ensemble members after each run of the hydrologic model; this auxiliary randomization reduces the required size of the meteorological input ensemble and makes it operationally feasible to generate a Bayesian ensemble forecast of large size. 
Such a forecast quantifies the total uncertainty, is well calibrated against the prior (climatic) distribution of predictand, possesses a Bayesian coherence property, constitutes a random sample of the predictand, and has an acceptable sampling error-which makes it suitable for rational decision making under uncertainty.
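The three-component EBFS chain described above (input ensemble → deterministic hydrologic model → hydrologic uncertainty processor) can be sketched as a Monte Carlo pipeline. The toy model and the additive-Gaussian stand-in for the meta-Gaussian HUP are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

def hydrologic_model(precip):
    """Stand-in deterministic model: river stage responds nonlinearly
    to precipitation input (purely illustrative)."""
    return 1.5 * np.sqrt(precip)

# 1. Input ensemble forecaster (IEF): uncertain precipitation amounts.
precip_ens = rng.gamma(shape=2.0, scale=10.0, size=500)
# 2. Deterministic transform through the hydrologic model.
stage_model = hydrologic_model(precip_ens)
# 3. Hydrologic uncertainty processor (HUP): here a simple additive
#    Gaussian error, standing in for the analytic meta-Gaussian HUP.
stage_pred = stage_model + rng.normal(0.0, 0.3, size=stage_model.size)
```

The resulting `stage_pred` ensemble carries both input uncertainty (through the spread of `precip_ens`) and hydrologic uncertainty (through the HUP step), which is exactly the total-uncertainty decomposition the EBFS formalises.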
Comparison of TOMS, SBUV & SBUV/2 Version 8 Total Column Ozone Data with Data from Groundstations
NASA Technical Reports Server (NTRS)
Labow, G. J.; McPeters, R. D.; Bhartia, P. K.
2004-01-01
The Nimbus-7 and Earth Probe Total Ozone Mapping Spectrometer (TOMS) data as well as SBUV and SBUV/2 data have been reprocessed with a new retrieval algorithm (Version 8) and an updated calibration procedure. An overview will be presented systematically comparing ozone values to an ensemble of Brewer and Dobson spectrophotometers. The comparisons were made as a function of latitude, solar zenith angle, reflectivity and total ozone. Results show that the accuracy of the TOMS retrieval has been improved when aerosols are present in the atmosphere, when snow/ice and sea glint are present, and when ozone in the northern hemisphere is extremely low. TOMS overpass data are derived from the single TOMS best-match measurement, almost always located within one degree of the ground station and usually made within an hour of local noon. The Version 8 Earth Probe TOMS ozone values have decreased by an average of about 1% due to a much better understanding of the calibration of the instrument. N-7 SBUV as well as the series of NOAA SBUV/2 column ozone values have also been processed with the Version 8 algorithm and have been compared to values from an ensemble of groundstations. Results show that the SBUV column ozone values agree well with the groundstations and the datasets are useful for trend studies.
Dynamic principle for ensemble control tools.
Samoletov, A; Vasiev, B
2017-11-28
Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
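As a concrete example of a stochastic thermostat that preserves the canonical measure, the well-known BAOAB splitting of Langevin dynamics for a harmonic oscillator samples ⟨x²⟩ = kT/k. This is the standard Langevin scheme, shown for context, not the authors' proposed principle; all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
kT, gamma, m, k = 1.0, 1.0, 1.0, 1.0   # units chosen for simplicity
dt, n_steps = 0.01, 200_000

x, p = 0.0, 0.0
xs = np.empty(n_steps)
c1 = np.exp(-gamma * dt)               # OU decay over one step
c2 = np.sqrt(kT * m * (1.0 - c1 ** 2)) # matching thermal noise amplitude
for i in range(n_steps):
    p += 0.5 * dt * (-k * x)           # B: half momentum kick
    x += 0.5 * dt * p / m              # A: half position drift
    p = c1 * p + c2 * rng.normal()     # O: exact Ornstein-Uhlenbeck step
    x += 0.5 * dt * p / m              # A: half position drift
    p += 0.5 * dt * (-k * x)           # B: half momentum kick
    xs[i] = x

var_x = xs[n_steps // 10:].var()       # discard burn-in; expect kT/k = 1
```

The O-step is the stochastic element; setting `c2` from the fluctuation-dissipation relation is what makes the canonical distribution invariant, which is the property any valid thermostat scheme must share.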
Stochastic inference with spiking neurons in the high-conductance state
NASA Astrophysics Data System (ADS)
Petrovici, Mihai A.; Bill, Johannes; Bytschok, Ilja; Schemmel, Johannes; Meier, Karlheinz
2016-10-01
The highly variable dynamics of neocortical circuits observed in vivo have been hypothesized to represent a signature of ongoing stochastic inference but stand in apparent contrast to the deterministic response of neurons measured in vitro. Based on a propagation of the membrane autocorrelation across spike bursts, we provide an analytical derivation of the neural activation function that holds for a large parameter space, including the high-conductance state. On this basis, we show how an ensemble of leaky integrate-and-fire neurons with conductance-based synapses embedded in a spiking environment can attain the correct firing statistics for sampling from a well-defined target distribution. For recurrent networks, we examine convergence toward stationarity in computer simulations and demonstrate sample-based Bayesian inference in a mixed graphical model. This points to a new computational role of high-conductance states and establishes a rigorous link between deterministic neuron models and functional stochastic dynamics on the network level.
On the accuracy of aerosol photoacoustic spectrometer calibrations using absorption by ozone
NASA Astrophysics Data System (ADS)
Davies, Nicholas W.; Cotterell, Michael I.; Fox, Cathryn; Szpek, Kate; Haywood, Jim M.; Langridge, Justin M.
2018-04-01
In recent years, photoacoustic spectroscopy has emerged as an invaluable tool for the accurate measurement of light absorption by atmospheric aerosol. Photoacoustic instruments require calibration, which can be achieved by measuring the photoacoustic signal generated by known quantities of gaseous ozone. Recent work has questioned the validity of this approach at short visible wavelengths (404 nm), indicating systematic calibration errors of the order of a factor of 2. We revisit this result and test the validity of the ozone calibration method using a suite of multipass photoacoustic cells operating at wavelengths 405, 514 and 658 nm. Using aerosolised nigrosin with mobility-selected diameters in the range 250-425 nm, we demonstrate excellent agreement between measured and modelled ensemble absorption cross sections at all wavelengths, thus demonstrating the validity of the ozone-based calibration method for aerosol photoacoustic spectroscopy at visible wavelengths.
Dominating Scale-Free Networks Using Generalized Probabilistic Methods
Molnár, F.; Derzsy, N.; Czabarka, É.; Székely, L.; Szymanski, B. K.; Korniss, G.
2014-01-01
We study ensemble-based graph-theoretical methods aiming to approximate the size of the minimum dominating set (MDS) in scale-free networks. We analyze both analytical upper bounds of dominating sets and numerical realizations for applications. We propose two novel probabilistic dominating set selection strategies that are applicable to heterogeneous networks. One of them obtains the smallest probabilistic dominating set and also outperforms the deterministic degree-ranked method. We show that a degree-dependent probabilistic selection method becomes optimal in its deterministic limit. In addition, we also find the precise limit where selecting high-degree nodes exclusively becomes inefficient for network domination. We validate our results on several real-world networks, and provide highly accurate analytical estimates for our methods. PMID:25200937
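A degree-dependent probabilistic dominating-set selection of the kind studied above can be sketched as a two-pass procedure: probabilistic inclusion followed by a repair pass that guarantees domination. The inclusion-probability form, exponent `alpha`, and the toy graph are illustrative assumptions:

```python
import random

def probabilistic_dominating_set(adj, alpha=1.0, seed=0):
    """adj: dict node -> set of neighbours. Include each node with a
    probability increasing in its degree, then patch undominated nodes."""
    rng = random.Random(seed)
    max_deg = max(len(nb) for nb in adj.values())
    dom = set()
    for v, nbrs in adj.items():
        if rng.random() < (len(nbrs) / max_deg) ** alpha:
            dom.add(v)
    # Repair pass: every node must be in dom or adjacent to a dom node.
    for v, nbrs in adj.items():
        if v not in dom and dom.isdisjoint(nbrs):
            dom.add(v)
    return dom

# Toy graph: a star (hub 0) with a short path attached.
adj = {0: {1, 2, 3}, 1: {0}, 2: {0}, 3: {0, 4}, 4: {3}}
ds = probabilistic_dominating_set(adj)
```

Biasing inclusion toward high-degree nodes tends to cover many neighbours per selected node, while the repair pass keeps the output a valid dominating set regardless of the random draw.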
Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.
Krishnamurthy, V; Krishnamurthy, E V
1999-03-01
A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, the computations are interpreted as the outcome arising out of interaction of elements in an object space. The interactions can create new elements (or the same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among the subsets of elements, so that the elements evolve toward an equilibrium, unstable or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles a Gibbsian ensemble that corresponds to a distribution of points in the space of positions and momenta (called phase space). It permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point. The evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, this paradigm can handle deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic. Also, it can handle a probabilistic mode of computation if we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database-like query processing and transactions.
Assessment of SWE data assimilation for ensemble streamflow predictions
NASA Astrophysics Data System (ADS)
Franz, Kristie J.; Hogue, Terri S.; Barik, Muhammad; He, Minxue
2014-11-01
An assessment of data assimilation (DA) for Ensemble Streamflow Prediction (ESP) using seasonal water supply hindcasting in the North Fork of the American River Basin (NFARB) and the National Weather Service (NWS) hydrologic forecast models is undertaken. Two parameter sets, one from the California Nevada River Forecast Center (RFC) and one from the Differential Evolution Adaptive Metropolis (DREAM) algorithm, are tested. For each parameter set, hindcasts are generated using initial conditions derived with and without the inclusion of a DA scheme that integrates snow water equivalent (SWE) observations. The DREAM-DA scenario uses an Integrated Uncertainty and Ensemble-based data Assimilation (ICEA) framework that also considers model and parameter uncertainty. Hindcasts are evaluated using deterministic and probabilistic forecast verification metrics. In general, the impact of DA on the skill of the seasonal water supply predictions is mixed. For deterministic (ensemble mean) predictions, the Percent Bias (PBias) is improved with integration of the DA. DREAM-DA and the RFC-DA have the lowest biases and the RFC-DA has the lowest Root Mean Squared Error (RMSE). However, the RFC and DREAM-DA have similar RMSE scores. For the probabilistic predictions, the RFC and DREAM have the highest Continuous Ranked Probability Skill Scores (CRPSS) and the RFC has the best discrimination for low flows. Reliability results are similar between the non-DA and DA tests and the DREAM and DREAM-DA have better reliability than the RFC and RFC-DA for forecast dates February 1 and later. Despite producing improved streamflow simulations in previous studies, the hindcast analysis suggests that the DA method tested may not result in obvious improvements in streamflow forecasts. We advocate that integration of hindcasting and probabilistic metrics provides more rigorous insight on model performance for forecasting applications, such as in this study.
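Two of the deterministic verification metrics used above, percent bias (PBias) and root mean squared error (RMSE), can be computed as follows. These are the standard hydrologic formulas (with the common convention that positive PBias means over-prediction), not the authors' code:

```python
import numpy as np

def pbias(sim, obs):
    """Percent bias: 100 * sum(sim - obs) / sum(obs)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 100.0 * (sim - obs).sum() / obs.sum()

def rmse(sim, obs):
    """Root mean squared error of simulations against observations."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return np.sqrt(((sim - obs) ** 2).mean())

# Toy seasonal water-supply hindcast (ensemble means) vs. observations.
obs = np.array([10.0, 20.0, 30.0, 40.0])
sim = np.array([12.0, 18.0, 33.0, 41.0])
```

PBias isolates systematic over- or under-prediction, while RMSE penalises scatter as well, which is why a DA scheme can improve one without improving the other, as seen in the mixed results above.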
NASA Astrophysics Data System (ADS)
Resseguier, V.; Memin, E.; Chapron, B.; Fox-Kemper, B.
2017-12-01
In order to better observe and predict geophysical flows, ensemble-based data assimilation methods are of high importance. In such methods, an ensemble of random realizations represents the variety of the simulated flow's likely behaviors. For this purpose, randomness needs to be introduced in a suitable way and physically-based stochastic subgrid parametrizations are promising paths. This talk will propose a new kind of such a parametrization referred to as modeling under location uncertainty. The fluid velocity is decomposed into a resolved large-scale component and an aliased small-scale one. The first component is possibly random but time-correlated whereas the second is white-in-time but spatially-correlated and possibly inhomogeneous and anisotropic. With such a velocity, the material derivative of any - possibly active - tracer is modified. Three new terms appear: a correction of the large-scale advection, a multiplicative noise and a possibly heterogeneous and anisotropic diffusion. This parameterization naturally ensures attractive properties such as energy conservation for each realization. Additionally, this stochastic material derivative and the associated Reynolds' transport theorem offer a systematic method to derive stochastic models. In particular, we will discuss the consequences of the Quasi-Geostrophic assumptions in our framework. Depending on the turbulence amount, different models with different physical behaviors are obtained. Under strong turbulence assumptions, a simplified diagnosis of frontolysis and frontogenesis at the surface of the ocean is possible in this framework. A Surface Quasi-Geostrophic (SQG) model with a weaker noise influence has also been simulated. A single realization better represents small scales than a deterministic SQG model at the same resolution. Moreover, an ensemble accurately predicts extreme events, bifurcations as well as the amplitudes and the positions of the simulation errors. 
Figure 1 highlights this last result and compares it to the strong error underestimation of an ensemble simulated from the deterministic dynamic with random initial conditions.
Impact of inherent meteorology uncertainty on air quality ...
It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is important to understand how uncertainties in these inputs affect the simulated concentrations. Ensembles are one method to explore how uncertainty in meteorology affects air pollution concentrations. Most studies explore this uncertainty by running different meteorological models or the same model with different physics options and in some cases combinations of different meteorological and air quality models. While these have been shown to be useful techniques in some cases, we present a technique that leverages the initial condition perturbations of a weather forecast ensemble, namely, the Short-Range Ensemble Forecast system to drive the four-dimensional data assimilation in the Weather Research and Forecasting (WRF)-Community Multiscale Air Quality (CMAQ) model with a key focus being the response of ozone chemistry and transport. Results confirm that a sizable spread in WRF solutions, including common weather variables of temperature, wind, boundary layer depth, clouds, and radiation, can cause a relatively large range of ozone-mixing ratios. Pollutant transport can be altered by hundreds of kilometers over several days. Ozone-mixing ratios of the ensemble can vary as much as 10–20 ppb
A mesoscale hybrid data assimilation system based on the JMA nonhydrostatic model
NASA Astrophysics Data System (ADS)
Ito, K.; Kunii, M.; Kawabata, T. T.; Saito, K. K.; Duc, L. L.
2015-12-01
This work evaluates the potential of a hybrid ensemble Kalman filter and four-dimensional variational (4D-Var) data assimilation system for predicting severe weather events from a deterministic point of view. This hybrid system is an adjoint-based 4D-Var system using a background error covariance matrix constructed from a mixture of the so-called NMC method and perturbations from a local ensemble transform Kalman filter data assimilation system, both of which are based on the Japan Meteorological Agency nonhydrostatic model. To construct the background error covariance matrix, we investigated two types of schemes. One is a spatial localization scheme and the other is a neighboring-ensemble approach, which regards the result at a horizontally shifted point in each ensemble member as that obtained from a different realization of the ensemble simulation. Assimilation of a pseudo single observation located to the north of a tropical cyclone (TC) yielded an analysis increment of wind and temperature physically consistent with what is expected for a mature TC in both hybrid systems, whereas the analysis increment in a 4D-Var system using a static background error covariance distorted the structure of the mature TC. Real-data assimilation experiments applied to 4 TCs and 3 local heavy rainfall events showed that the hybrid systems and EnKF provided better initial conditions than the NMC-based 4D-Var, both for predicting the intensity and track of the TCs and for the location and amount of the local heavy rainfall.
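The hybrid background error covariance described above blends a static (NMC-type) matrix with a flow-dependent estimate built from ensemble perturbations. A minimal sketch, with toy dimensions and a hypothetical blending weight `beta`:

```python
import numpy as np

rng = np.random.default_rng(4)
n, n_ens = 5, 20
B_static = np.eye(n)                        # toy climatological (NMC-style) B
X = rng.normal(size=(n, n_ens))             # toy ensemble of states
Xp = X - X.mean(axis=1, keepdims=True)      # ensemble perturbations
B_ens = Xp @ Xp.T / (n_ens - 1)             # flow-dependent sample covariance

beta = 0.5                                   # blending weight (tunable)
B_hybrid = beta * B_static + (1.0 - beta) * B_ens
```

The blend stays symmetric positive semi-definite because both ingredients are, and it tempers the rank deficiency and sampling noise of the small-ensemble covariance with the full-rank static term, which is the motivation for hybrid systems like the one evaluated here.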
Bassen, David M; Vilkhovoy, Michael; Minot, Mason; Butcher, Jonathan T; Varner, Jeffrey D
2017-01-25
Ensemble modeling is a promising approach for obtaining robust predictions and coarse-grained population behavior in deterministic mathematical models. Ensemble approaches address model uncertainty by using parameter or model families instead of single best-fit parameters or fixed model structures. Parameter ensembles can be selected based upon simulation error, along with other criteria such as diversity or steady-state performance. Simulations using parameter ensembles can estimate confidence intervals on model variables, and robustly constrain model predictions, despite having many poorly constrained parameters. In this software note, we present a multiobjective-based technique to estimate parameter or model ensembles, the Pareto Optimal Ensemble Technique in the Julia programming language (JuPOETs). JuPOETs integrates simulated annealing with Pareto optimality to estimate ensembles on or near the optimal tradeoff surface between competing training objectives. We demonstrate JuPOETs on a suite of multiobjective problems, including test functions with parameter bounds and system constraints, as well as on the identification of a proof-of-concept biochemical model with four conflicting training objectives. JuPOETs identified optimal or near-optimal solutions approximately six-fold faster than a corresponding implementation in Octave for the suite of test functions. For the proof-of-concept biochemical model, JuPOETs produced an ensemble of parameters that captured the mean of the conflicting training data sets, while simultaneously estimating parameter sets that performed well on each of the individual objective functions. JuPOETs is a promising approach for the estimation of parameter and model ensembles using multiobjective optimization, and can be adapted to solve many problem types, including mixed binary and continuous variable types, bilevel optimization problems and constrained problems, without altering the base algorithm.
JuPOETs is open source, available under an MIT license, and can be installed using the Julia package manager from the JuPOETs GitHub repository.
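The core JuPOETs idea, ranking simulated-annealing proposals by Pareto dominance against an archive of non-dominated parameter sets, can be sketched as follows. This is an illustrative Python toy (the function names and the rank-based acceptance rule are our own simplification), not the Julia package itself:

```python
import math, random

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_rank(candidate, archive):
    """Number of archive members that dominate the candidate (rank 0 = non-dominated)."""
    return sum(1 for m in archive if dominates(m, candidate))

def sa_pareto(objectives, init, step, n_iter=2000, t0=1.0, cooling=0.995):
    """Toy multiobjective simulated annealing: accept moves by Pareto rank,
    keeping an archive of non-dominated parameter sets (the 'ensemble')."""
    x = list(init)
    fx = [f(x) for f in objectives]
    archive, params = [fx], [list(x)]
    t = t0
    for _ in range(n_iter):
        y = [xi + random.gauss(0, step) for xi in x]
        fy = [f(y) for f in objectives]
        # rank-based acceptance: a lower Pareto rank is better
        dr = pareto_rank(fy, archive) - pareto_rank(fx, archive)
        if dr <= 0 or random.random() < math.exp(-dr / t):
            x, fx = y, fy
            if pareto_rank(fy, archive) == 0:
                # drop archive members the new point dominates, then add it
                keep = [i for i, m in enumerate(archive) if not dominates(fy, m)]
                archive = [archive[i] for i in keep] + [fy]
                params = [params[i] for i in keep] + [list(y)]
        t *= cooling
    return params, archive
```

With two conflicting objectives such as `(x-1)**2` and `(x+1)**2`, the archive accumulates parameter sets spread along the tradeoff between them.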
NASA Astrophysics Data System (ADS)
Khaki, M.; Hoteit, I.; Kuhn, M.; Awange, J.; Forootan, E.; van Dijk, A. I. J. M.; Schumacher, M.; Pattiaratchi, C.
2017-09-01
The time-variable terrestrial water storage (TWS) products from the Gravity Recovery And Climate Experiment (GRACE) have been increasingly used in recent years to improve the simulation of hydrological models by applying data assimilation techniques. In this study, for the first time, we assess the performance of the most popular sequential data assimilation techniques for integrating GRACE TWS into the World-Wide Water Resources Assessment (W3RA) model. We implement and test stochastic and deterministic ensemble-based Kalman filters (EnKF), as well as particle filters (PF) using two different resampling approaches, multinomial resampling and systematic resampling. These choices provide various opportunities for weighting observations and model simulations during the assimilation and also for accounting for error distributions. In particular, the deterministic EnKF is tested to avoid perturbing observations before assimilation (as is done in an ordinary EnKF). Gaussian-based random updates in the EnKF approaches likely do not fully represent the statistical properties of the model simulations and TWS observations. Therefore, the fully non-Gaussian PF is also applied to estimate more realistic updates. Monthly GRACE TWS are assimilated into W3RA covering all of Australia. To evaluate the filters' performance and analyze their impact on model simulations, their estimates are validated against independent in-situ measurements. Our results indicate that all implemented filters improve the estimation of water storage simulations of W3RA. The best results are obtained using two versions of the deterministic EnKF, i.e. the Square Root Analysis (SQRA) scheme and the Ensemble Square Root Filter (EnSRF), which reduce the model's groundwater estimation errors by 34% and 31%, respectively, compared to a model run without assimilation. Applying the PF with systematic resampling decreases the model estimation error by 23%.
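The deterministic update tested here (EnSRF-style, with no observation perturbation) can be sketched for a single scalar observation. This is a generic textbook form using the Whitaker-Hamill square-root factor, not the W3RA implementation:

```python
import numpy as np

def ensrf_update(X, y, H, r):
    """Serial deterministic EnKF (EnSRF-style) update for one scalar observation.
    X: (n_state, n_ens) ensemble; y: observed value; H: (n_state,) linear
    observation operator; r: observation-error variance.
    Observations are NOT perturbed, unlike the stochastic EnKF."""
    n = X.shape[1]
    xm = X.mean(axis=1)
    Xp = X - xm[:, None]                  # ensemble perturbations
    hx = H @ X                            # observed ensemble
    hxm = hx.mean()
    hxp = hx - hxm
    s = hxp @ hxp / (n - 1) + r           # innovation variance HPH' + R
    K = (Xp @ hxp) / (n - 1) / s          # Kalman gain, shape (n_state,)
    alpha = 1.0 / (1.0 + np.sqrt(r / s))  # square-root factor for perturbations
    xm_a = xm + K * (y - hxm)             # update the mean with the full gain
    Xp_a = Xp - alpha * np.outer(K, hxp)  # update perturbations with reduced gain
    return xm_a[:, None] + Xp_a
```

For a scalar observation this reproduces the exact Kalman analysis variance from the sample covariance, without the sampling noise introduced by perturbed observations.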
NASA Astrophysics Data System (ADS)
Oh, Seok-Geun; Suh, Myoung-Seok
2017-07-01
The projection skills of five ensemble methods were analyzed according to simulation skill, training period, and ensemble members, using 198 sets of pseudo-simulation data (PSD) produced by random number generation assuming the simulated temperature of regional climate models. The PSD sets were classified into 18 categories according to the relative magnitude of bias, variance ratio, and correlation coefficient, where each category had 11 sets (including 1 truth set) with 50 samples. The ensemble methods used were as follows: equal weighted averaging without bias correction (EWA_NBC), EWA with bias correction (EWA_WBC), weighted ensemble averaging based on root mean square errors and correlation (WEA_RAC), WEA based on the Taylor score (WEA_Tay), and multivariate linear regression (Mul_Reg). The projection skills of the ensemble methods generally improved compared with the best member for each category. However, their projection skills were significantly affected by the simulation skills of the ensemble members. The weighted ensemble methods showed better projection skills than the non-weighted methods, in particular for the PSD categories having systematic biases and various correlation coefficients. The EWA_NBC showed considerably lower projection skills than the other methods, in particular for the PSD categories with systematic biases. Although Mul_Reg showed relatively good skill, it was strongly sensitive to the PSD categories, training periods, and number of members. On the other hand, WEA_Tay and WEA_RAC showed relatively superior skill in both accuracy and reliability for all the sensitivity experiments. This indicates that WEA_Tay and WEA_RAC are applicable even for simulation data with systematic biases, a short training period, and a small number of ensemble members.
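A hedged sketch of what a Taylor-score-weighted ensemble average in the spirit of WEA_Tay might look like; the exact score formula and bias correction used in the study may differ:

```python
import numpy as np

def taylor_score(sim, obs, r0=1.0):
    """One common form of the Taylor (2001) skill score:
    S = 4(1+R) / [(sigma_hat + 1/sigma_hat)^2 (1+R0)],
    where R is the correlation and sigma_hat the ratio of std devs."""
    r = np.corrcoef(sim, obs)[0, 1]
    sr = np.std(sim) / np.std(obs)
    return 4.0 * (1.0 + r) / ((sr + 1.0 / sr) ** 2 * (1.0 + r0))

def weighted_ensemble(members, obs):
    """WEA_Tay-like combination: bias-correct each member, weight it by its
    Taylor score over a training period, then average."""
    corrected = [m - m.mean() + obs.mean() for m in members]  # simple bias correction
    w = np.array([taylor_score(m, obs) for m in corrected])
    w = w / w.sum()
    return sum(wi * m for wi, m in zip(w, corrected)), w
```

A skillful member receives a larger weight, so the weighted mean degrades far less than an equal-weight average when one member carries a systematic bias.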
Evaluation of TIGGE Ensemble Forecasts of Precipitation in Distinct Climate Regions in Iran
NASA Astrophysics Data System (ADS)
Aminyavari, Saleh; Saghafian, Bahram; Delavar, Majid
2018-04-01
The application of numerical weather prediction (NWP) products is increasing dramatically. Existing reports indicate that ensemble predictions have better skill than deterministic forecasts. In this study, numerical ensemble precipitation forecasts in the TIGGE database were evaluated using deterministic, dichotomous (yes/no), and probabilistic techniques over Iran for the period 2008-16. Thirteen rain gauges spread over eight homogeneous precipitation regimes were selected for evaluation. The Inverse Distance Weighting and Kriging methods were adopted for interpolation of the prediction values, downscaled to the stations at lead times of one to three days. To enhance the forecast quality, NWP values were post-processed via Bayesian Model Averaging. The results showed that ECMWF had better scores than other products. However, products of all centers underestimated precipitation in high precipitation regions while overestimating precipitation in other regions. This points to a systematic bias in forecasts and demands application of bias correction techniques. Based on dichotomous evaluation, NCEP did better at most stations, although all centers overpredicted the number of precipitation events. Compared to those of ECMWF and NCEP, UKMO yielded higher scores in mountainous regions, but performed poorly at other selected stations. Furthermore, the evaluations showed that all centers had better skill in wet than in dry seasons. The quality of post-processed predictions was better than those of the raw predictions. In conclusion, the accuracy of the NWP predictions made by the selected centers could be classified as medium over Iran, while post-processing of predictions is recommended to improve the quality.
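The dichotomous (yes/no) evaluation mentioned above is typically built from a contingency table; a minimal sketch (the threshold and the choice of scores are illustrative, not necessarily the paper's exact setup):

```python
import numpy as np

def dichotomous_scores(fcst, obs, threshold=0.1):
    """Contingency-table scores for yes/no precipitation forecasts:
    POD (probability of detection), FAR (false alarm ratio), and
    frequency bias (>1 means events are overpredicted)."""
    f = np.asarray(fcst) >= threshold
    o = np.asarray(obs) >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    pod = hits / (hits + misses)
    far = false_alarms / (hits + false_alarms)
    bias = (hits + false_alarms) / (hits + misses)
    return pod, far, bias
```

A frequency bias above 1, as reported for all centers here, means the model forecasts more precipitation events than are observed.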
NASA Astrophysics Data System (ADS)
Bai, Kaixu; Chang, Ni-Bin; Shi, Runhe; Yu, Huijia; Gao, Wei
2017-07-01
A four-step adaptive ozone trend estimation scheme is proposed by integrating multivariate linear regression (MLR) and ensemble empirical mode decomposition (EEMD) to analyze the long-term variability of total column ozone from a set of four observational and reanalysis total ozone data sets, including the rarely explored ERA-Interim total ozone reanalysis, from 1979 to 2009. Consistency among the four data sets was first assessed, indicating a mean relative difference of 1% and root-mean-square error around 2% on average, with respect to collocated ground-based total ozone observations. Nevertheless, large drifts with significant spatiotemporal inhomogeneity were diagnosed in ERA-Interim after 1995. To emphasize long-term trends, natural ozone variations associated with the solar cycle, quasi-biennial oscillation, volcanic aerosols, and El Niño-Southern Oscillation were modeled with MLR and then removed from each total ozone record, respectively, before performing EEMD analyses. The resulting rates of change estimated from the proposed scheme captured the long-term ozone variability well, with an inflection time of 2000 clearly detected. The positive rates of change after 2000 suggest that the ozone layer seems to be on a healing path, but the results are still inadequate to conclude an actual recovery of the ozone layer, and more observational evidence is needed. Further investigations suggest that biases embedded in total ozone records may significantly impact ozone trend estimations by resulting in large uncertainty or even negative rates of change after 2000.
Randomized central limit theorems: A unified theory
NASA Astrophysics Data System (ADS)
Eliazar, Iddo; Klafter, Joseph
2010-08-01
The central limit theorems (CLTs) characterize the macroscopic statistical behavior of large ensembles of independent and identically distributed random variables. The CLTs assert that the universal probability laws governing ensembles’ aggregate statistics are either Gaussian or Lévy, and that the universal probability laws governing ensembles’ extreme statistics are Fréchet, Weibull, or Gumbel. The scaling schemes underlying the CLTs are deterministic—scaling all ensemble components by a common deterministic scale. However, there are “random environment” settings in which the underlying scaling schemes are stochastic—scaling the ensemble components by different random scales. Examples of such settings include Holtsmark’s law for gravitational fields and the Stretched Exponential law for relaxation times. In this paper we establish a unified theory of randomized central limit theorems (RCLTs)—in which the deterministic CLT scaling schemes are replaced with stochastic scaling schemes—and present “randomized counterparts” to the classic CLTs. The RCLT scaling schemes are shown to be governed by Poisson processes with power-law statistics, and the RCLTs are shown to universally yield the Lévy, Fréchet, and Weibull probability laws.
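Schematically, in our own notation rather than the paper's, the contrast between the deterministic and randomized scaling schemes is:

```latex
% Classical CLTs: one deterministic scale a_n (and centering b_n) for the whole ensemble
S_n = \frac{1}{a_n}\left(\sum_{k=1}^{n} X_k - b_n\right), \qquad
M_n = \frac{1}{a_n}\max_{1\le k\le n} X_k .

% Randomized CLTs: each component gets its own random scale A_k,
% generated by a Poisson process with power-law statistics
S = \sum_{k} \frac{X_k}{A_k}, \qquad
M = \sup_{k} \frac{X_k}{A_k}.
```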
Simultaneous calibration of ensemble river flow predictions over an entire range of lead times
NASA Astrophysics Data System (ADS)
Hemri, S.; Fundel, F.; Zappa, M.
2013-10-01
Probabilistic estimates of future water levels and river discharge are usually simulated with hydrologic models using ensemble weather forecasts as main inputs. As hydrologic models are imperfect and the meteorological ensembles tend to be biased and underdispersed, the ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, in order to achieve both reliable and sharp predictions statistical postprocessing is required. In this work Bayesian model averaging (BMA) is applied to statistically postprocess ensemble runoff raw forecasts for a catchment in Switzerland, at lead times ranging from 1 to 240 h. The raw forecasts have been obtained using deterministic and ensemble forcing meteorological models with different forecast lead time ranges. First, BMA is applied based on mixtures of univariate normal distributions, subject to the assumption of independence between distinct lead times. Then, the independence assumption is relaxed in order to estimate multivariate runoff forecasts over the entire range of lead times simultaneously, based on a BMA version that uses multivariate normal distributions. Since river runoff is a highly skewed variable, Box-Cox transformations are applied in order to achieve approximate normality. Both univariate and multivariate BMA approaches are able to generate well calibrated probabilistic forecasts that are considerably sharper than climatological forecasts. Additionally, multivariate BMA provides a promising approach for incorporating temporal dependencies into the postprocessed forecasts. Its major advantage against univariate BMA is an increase in reliability when the forecast system is changing due to model availability.
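A minimal univariate BMA sketch in the spirit described (a mixture of normals centred on the member forecasts, with a common variance fitted by EM, and a Box-Cox helper for skewed runoff); the parameter choices are illustrative:

```python
import numpy as np

def boxcox(x, lam=0.2):
    """Box-Cox transform toward normality for skewed runoff (lam assumed here)."""
    return (x ** lam - 1.0) / lam if lam != 0 else np.log(x)

def bma_em(F, y, n_iter=200):
    """Fit BMA weights and a common variance for a mixture of normals centred
    on each (transformed) member forecast, via a basic EM iteration.
    F: (n_times, n_members) member forecasts; y: (n_times,) observations."""
    n, m = F.shape
    w = np.full(m, 1.0 / m)
    s2 = np.var(y - F.mean(axis=1))
    for _ in range(n_iter):
        # E-step: responsibility of member k for observation t
        dens = np.exp(-0.5 * (y[:, None] - F) ** 2 / s2) / np.sqrt(2 * np.pi * s2)
        z = w * dens
        z /= z.sum(axis=1, keepdims=True)
        # M-step: update weights and the common variance
        w = z.mean(axis=0)
        s2 = np.sum(z * (y[:, None] - F) ** 2) / n
    return w, s2
```

Members that track the observations receive larger weights, so the fitted mixture is both calibrated and sharper than a climatological spread.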
The real-time forecasts of ozone (O3) from seven air quality forecast models (AQFMs) are statistically evaluated against observations collected during July and August of 2004 (53 days) through the Aerometric Information Retrieval Now (AIRNow) network at roughly 340 mon...
Impacts of Considering Climate Variability on Investment Decisions in Ethiopia
NASA Astrophysics Data System (ADS)
Strzepek, K.; Block, P.; Rosegrant, M.; Diao, X.
2005-12-01
In Ethiopia, climate extremes, inducing droughts or floods, are not unusual. Monitoring the effects of these extremes, and climate variability in general, is critical for economic prediction and assessment of the country's future welfare. The focus of this study involves adding climate variability to a deterministic, mean climate-driven agro-economic model, in an attempt to understand its effects and degree of influence on general economic prediction indicators for Ethiopia. Four simulations are examined, including a baseline simulation and three investment strategies: simulations of irrigation investment, roads investment, and a combination investment of both irrigation and roads. The deterministic model is transformed into a stochastic model by dynamically adding year-to-year climate variability through climate-yield factors. Nine sets of actual, historic, variable climate data are individually assembled and implemented into the 12-year stochastic model simulation, producing an ensemble of economic prediction indicators. This ensemble allows for a probabilistic approach to planning and policy making, allowing decision makers to consider risk. The economic indicators from the deterministic and stochastic approaches, including rates of return to investments, are significantly different. The predictions of the deterministic model appreciably overestimate the future welfare of Ethiopia; the predictions of the stochastic model, utilizing actual climate data, tend to give a better semblance of what may be expected. Inclusion of climate variability is vital for proper analysis of the predictor values from this agro-economic model.
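The transformation described, injecting historic climate traces into a deterministic growth model through climate-yield factors, can be sketched with toy numbers (all values illustrative, not from the study):

```python
import numpy as np

def stochastic_gdp_paths(base_growth, yield_factors, n_years=12):
    """Toy version of turning a deterministic agro-economic projection into an
    ensemble: each historic climate trace scales agricultural growth through
    year-to-year climate-yield factors (a factor of 1 reproduces the
    deterministic, mean-climate run)."""
    paths = []
    for trace in yield_factors:            # one path per historic climate trace
        gdp, path = 100.0, []
        for t in range(n_years):
            gdp *= 1.0 + base_growth * trace[t % len(trace)]
            path.append(gdp)
        paths.append(path)
    return np.array(paths)
```

The resulting ensemble of indicator paths supports the probabilistic, risk-aware planning the abstract describes, instead of a single deterministic trajectory.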
Parameter estimation for stiff deterministic dynamical systems via ensemble Kalman filter
NASA Astrophysics Data System (ADS)
Arnold, Andrea; Calvetti, Daniela; Somersalo, Erkki
2014-10-01
A commonly encountered problem in numerous areas of applications is to estimate the unknown coefficients of a dynamical system from direct or indirect observations at discrete times of some of the components of the state vector. A related problem is to estimate unobserved components of the state. An egregious example of such a problem is provided by metabolic models, in which the numerous model parameters and the concentrations of the metabolites in tissue are to be estimated from concentration data in the blood. A popular method for addressing similar questions in stochastic and turbulent dynamics is the ensemble Kalman filter (EnKF), a particle-based filtering method that generalizes classical Kalman filtering. In this work, we adapt the EnKF algorithm for deterministic systems in which the numerical approximation error is interpreted as a stochastic drift with variance based on classical error estimates of numerical integrators. This approach, which is particularly suitable for stiff systems where the stiffness may depend on the parameters, allows us to effectively exploit the parallel nature of particle methods. Moreover, we demonstrate how spatial prior information about the state vector, which helps the stability of the computed solution, can be incorporated into the filter. The viability of the approach is shown by computed examples, including a metabolic system modeling an ischemic episode in skeletal muscle, with a high number of unknown parameters.
Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation
NASA Astrophysics Data System (ADS)
Zhao, T.; Cai, X.; Yang, D.
2010-12-01
Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecast as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. 
Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF.
Figure: Schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.
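The MMFE view of forecast evolution, a fixed future quantity whose forecast is revised by independent zero-mean increments so that uncertainty resolves as lead time shrinks, can be sketched as:

```python
import numpy as np

def mmfe_forecast_paths(final_value_var, update_vars, n_paths=5000, seed=0):
    """Martingale Model of Forecast Evolution (MMFE) sketch: the forecast of a
    fixed future flow is updated each period by an independent zero-mean
    increment; the update variances sum to the initial forecast-error
    variance, so uncertainty resolves as lead time decreases."""
    rng = np.random.default_rng(seed)
    assert np.isclose(sum(update_vars), final_value_var)
    f = np.zeros(n_paths)                 # initial forecast (in anomaly form)
    history = [f.copy()]
    for v in update_vars:
        f = f + rng.normal(0.0, np.sqrt(v), n_paths)  # martingale update
        history.append(f.copy())
    return np.array(history)              # history[-1] is the realized flow
```

The variance of the remaining error (realized flow minus current forecast) shrinks monotonically as updates arrive, which is the property exploited when comparing DSF, DPSF, and ESF.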
NASA Astrophysics Data System (ADS)
Lemaire, Vincent; Colette, Augustin; Menut, Laurent
2016-04-01
Because of its sensitivity to weather patterns, climate change will have an impact on air pollution, so that, in the future, a climate penalty could jeopardize the expected efficiency of air pollution mitigation measures. A common method to assess the impact of climate on air quality consists in implementing chemistry-transport models forced by climate projections. However, at present, such impact assessments lack multi-model ensemble approaches to address uncertainties because of the substantial computing cost. Therefore, as a preliminary step towards exploring large climate ensembles with air quality models, we developed an ensemble exploration technique in order to point out the climate models that should be investigated in priority. Using a training dataset from a deterministic projection of climate and air quality over Europe, we identified the main meteorological drivers of air quality for 8 regions in Europe and developed statistical models that could be used to estimate future air pollutant concentrations. Applying this statistical model to the whole EuroCordex ensemble of climate projections, we find a climate penalty for six subregions out of eight (Eastern Europe, France, Iberian Peninsula, Mid Europe and Northern Italy). In contrast, a climate benefit for PM2.5 was identified for three regions (Eastern Europe, Mid Europe and Northern Italy). However, the uncertainty of this statistical model limits the confidence we can attribute to the associated quantitative projections. The technique nevertheless allows selecting a subset of relevant regional climate model members that should be used in priority for future deterministic projections, to propose an adequate coverage of uncertainties. We are thereby proposing a smart ensemble exploration strategy that can also be used for other impact studies beyond air quality.
Wave ensemble forecast in the Western Mediterranean Sea, application to an early warning system.
NASA Astrophysics Data System (ADS)
Pallares, Elena; Hernandez, Hector; Moré, Jordi; Espino, Manuel; Sairouni, Abdel
2015-04-01
The Western Mediterranean Sea is a highly heterogeneous and variable area, as reflected in the wind field, the current field, and the waves, mainly in the first kilometers offshore. As a result of this variability, wave forecasting in these regions is quite difficult, usually with accuracy problems during energetic storm events. Moreover, it is in these areas that most economic activities take place, including fisheries, sailing, tourism, coastal management and offshore renewable energy platforms. In order to introduce an indicator of the probability of occurrence of the different sea states and give more detailed forecast information to end users, an ensemble wave forecast system is considered. Ensemble prediction systems have been used in meteorological forecasting for decades: to deal with the uncertainties in the initial conditions and in the model parametrizations, which may introduce errors into the forecast, a set of perturbed meteorological simulations is considered as possible future scenarios and compared with the deterministic forecast. In the present work, the SWAN wave model (v41.01) has been implemented for the Western Mediterranean Sea, forced with wind fields produced by the deterministic Global Forecast System (GFS) and the Global Ensemble Forecast System (GEFS). The wind fields include a deterministic forecast (also named the control), between 11 and 21 ensemble members, and derived members obtained from the ensemble, such as the mean of all members. Four buoys located in the study area, moored in coastal waters, have been used to validate the results. The outputs include all the time series, with a forecast horizon of 8 days and represented in spaghetti diagrams, the spread of the system, and the probability at different thresholds.
The main goal of this exercise is to determine the degree of uncertainty of the wave forecast, which is most meaningful between the 5th and the 8th day of the prediction. The information obtained is then included in an early warning system, designed in the framework of the European project iCoast (ECHO/SUB/2013/661009) with the aim of setting alarms in coastal areas depending on the wave conditions, sea level, flooding and run-up at the coast.
Deterministic chaos in entangled eigenstates
NASA Astrophysics Data System (ADS)
Schlegel, K. G.; Förster, S.
2008-05-01
We investigate the problem of deterministic chaos in connection with entangled states using the Bohmian formulation of quantum mechanics. We show, for a two-particle system in a harmonic oscillator potential, that in a case of entanglement and three energy eigenvalues the maximum Lyapunov parameters of a representative ensemble of trajectories develop, for large times, into a narrow positive distribution, which indicates nearly complete chaotic dynamics. We also briefly present results from two time-dependent systems, the anisotropic and the Rabi oscillator.
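The maximum Lyapunov parameter of a trajectory can be estimated with the standard two-trajectory (Benettin-style) renormalization procedure; the sketch below uses a classical chaotic map as a stand-in, not the Bohmian system of the paper:

```python
import numpy as np

def max_lyapunov(step, x0, eps=1e-8, n_steps=500):
    """Benettin-style estimate of the maximum Lyapunov exponent: evolve a
    reference and a slightly perturbed trajectory, renormalizing the
    separation back to eps after every step and averaging the log stretch."""
    x = np.array(x0, dtype=float)
    d0 = eps
    y = x + np.array([d0] + [0.0] * (len(x) - 1))
    s = 0.0
    for _ in range(n_steps):
        x = step(x)
        y = step(y)
        d = np.linalg.norm(y - x)
        s += np.log(d / d0)
        y = x + (y - x) * (d0 / d)       # renormalize the perturbation
    return s / n_steps                   # per-step exponent
```

For the logistic map at r = 4 the estimate approaches ln 2 (positive, chaotic), while at r = 2.5 the orbit converges to a fixed point and the exponent is negative.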
Verifying and Postprocessing the Ensemble Spread-Error Relationship
NASA Astrophysics Data System (ADS)
Hopson, Tom; Knievel, Jason; Liu, Yubao; Roux, Gregory; Wu, Wanli
2013-04-01
With the increased utilization of ensemble forecasts in weather and hydrologic applications, there is a need to verify their benefit over less expensive deterministic forecasts. One such potential benefit of ensemble systems is their capacity to forecast their own forecast error through the ensemble spread-error relationship. The paper begins by revisiting the limitations of the Pearson correlation alone in assessing this relationship. Next, we introduce two new metrics to consider in assessing the utility of an ensemble's varying dispersion. We argue there are two aspects of an ensemble's dispersion that should be assessed. First, and perhaps more fundamentally: is there enough variability in the ensemble's dispersion to justify the maintenance of an expensive ensemble prediction system (EPS), irrespective of whether the EPS is well-calibrated or not? To diagnose this, the factor that controls the theoretical upper limit of the spread-error correlation can be useful. Secondly, does the variable dispersion of an ensemble relate to a variable expectation of forecast error? Representing the spread-error correlation in relation to its theoretical limit can provide a simple diagnostic of this attribute. A context for these concepts is provided by assessing two operational ensembles: 30-member western US temperature forecasts for the U.S. Army Test and Evaluation Command and 51-member Brahmaputra River flow forecasts of the Climate Forecast and Applications Project for Bangladesh. Both of these systems utilize a postprocessing technique based on quantile regression (QR) under a step-wise forward selection framework, leading to ensemble forecasts with both good reliability and sharpness. In addition, the methodology utilizes the ensemble's ability to self-diagnose forecast instability to produce calibrated forecasts with informative skill-spread relationships.
We will describe both ensemble systems briefly, review the steps used to calibrate the ensemble forecast, and present verification statistics using error-spread metrics, along with figures from operational ensemble forecasts before and after calibration.
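The basic spread-error correlation that the paper argues is insufficient on its own can be computed as follows (a sketch; the binning and theoretical-limit normalization discussed above are omitted):

```python
import numpy as np

def spread_error_correlation(ens, obs):
    """Pearson correlation between ensemble spread (std across members) and
    the absolute error of the ensemble mean. As the abstract argues, this
    number alone is not a sufficient diagnostic of the spread-error
    relationship, but it is the usual starting point."""
    spread = ens.std(axis=1, ddof=1)            # per-forecast dispersion
    err = np.abs(ens.mean(axis=1) - obs)        # per-forecast error magnitude
    return np.corrcoef(spread, err)[0, 1]
```

For a well-calibrated heteroscedastic ensemble (spread genuinely tracking error variance), the correlation is clearly positive but bounded well below 1, which is why the theoretical upper limit matters.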
NASA Astrophysics Data System (ADS)
Miyazaki, Kazuyuki; Bowman, Kevin
2017-07-01
The Atmospheric Chemistry Climate Model Intercomparison Project (ACCMIP) ensemble ozone simulations for the present day from the 2000 decade simulation results are evaluated by a state-of-the-art multi-constituent atmospheric chemical reanalysis that ingests multiple satellite data including the Tropospheric Emission Spectrometer (TES), the Microwave Limb Sounder (MLS), the Ozone Monitoring Instrument (OMI), and the Measurement of Pollution in the Troposphere (MOPITT) for 2005-2009. Validation of the chemical reanalysis against global ozonesondes shows good agreement throughout the free troposphere and lower stratosphere for both seasonal and year-to-year variations, with an annual mean bias of less than 0.9 ppb in the middle and upper troposphere at the tropics and mid-latitudes. The reanalysis provides comprehensive spatiotemporal evaluation of chemistry-model performance that complements direct ozonesonde comparisons, which are shown to suffer from significant sampling bias. The reanalysis reveals that the ACCMIP ensemble mean overestimates ozone in the northern extratropics by 6-11 ppb while underestimating by up to 18 ppb in the southern tropics over the Atlantic in the lower troposphere. Most models underestimate the spatial variability of the annual mean lower tropospheric concentrations in the extratropics of both hemispheres by up to 70 %. The ensemble mean also overestimates the seasonal amplitude by 25-70 % in the northern extratropics and overestimates the inter-hemispheric gradient by about 30 % in the lower and middle troposphere. Part of the discrepancies can be attributed to comparing the 5-year reanalysis with the decadal model simulations. However, these differences are less evident with the current sonde network. To estimate ozonesonde sampling biases, we computed model bias separately for global coverage and the ozonesonde network.
The ozonesonde sampling bias in the evaluated model bias for the seasonal mean concentration relative to global coverage is 40-50 % over the western Pacific and east Indian Ocean and reaches 110 % over the equatorial Americas and up to 80 % for the global tropics. In contrast, the ozonesonde sampling bias is typically smaller than 30 % for the Arctic regions in the lower and middle troposphere. These systematic biases have implications for ozone radiative forcing and the response of chemistry to climate that can be further quantified as the satellite observational record extends to multiple decades.
AQA - Air Quality model for Austria - Evaluation and Developments
NASA Astrophysics Data System (ADS)
Hirtl, M.; Krüger, B. C.; Baumann-Stanzer, K.; Skomorowski, P.
2009-04-01
The regional weather forecast model ALADIN of the Central Institute for Meteorology and Geodynamics (ZAMG) is used in combination with the chemical transport model CAMx (www.camx.com) to conduct forecasts of gaseous and particulate air pollution over Europe. The forecasts, which are done in cooperation with the University of Natural Resources and Applied Life Sciences in Vienna (BOKU), have been supported by the regional governments since 2005, with the main interest in the prediction of tropospheric ozone. The daily ozone forecasts are evaluated for the summer of 2008 against the observations of about 150 air quality stations in Austria. In 2008 the emission model SMOKE was integrated into the modelling system to calculate the biogenic emissions. The anthropogenic emissions are based on the newest EMEP data set as well as on regional inventories for the core domain. The performance of SMOKE is shown for a summer period in 2007. In the frame of the COST action 728, "Enhancing mesoscale meteorological modelling capabilities for air pollution and dispersion applications", multi-model ensembles are used to conduct an international model evaluation. The model calculations of meteorological and concentration fields are compared to measurements on the ensemble platform at the Joint Research Centre (JRC) in Ispra. The results for 2 episodes in 2006 show the performance of the different models as well as of the model ensemble.
Minimization for conditional simulation: Relationship to optimal transport
NASA Astrophysics Data System (ADS)
Oliver, Dean S.
2014-05-01
In this paper, we consider the problem of generating independent samples from a conditional distribution when independent samples from the prior distribution are available. Although there are exact methods for sampling from the posterior (e.g. Markov chain Monte Carlo or acceptance/rejection), these methods tend to be computationally demanding when evaluation of the likelihood function is expensive, as it is for most geoscience applications. As an alternative, in this paper we discuss deterministic mappings of variables distributed according to the prior to variables distributed according to the posterior. Although any deterministic mappings might be equally useful, we will focus our discussion on a class of algorithms that obtain implicit mappings by minimization of a cost function that includes measures of data mismatch and model variable mismatch. Algorithms of this type include quasi-linear estimation, randomized maximum likelihood, perturbed observation ensemble Kalman filter, and ensemble of perturbed analyses (4D-Var). When the prior pdf is Gaussian and the observation operators are linear, we show that these minimization-based simulation methods solve an optimal transport problem with a nonstandard cost function. When the observation operators are nonlinear, however, the mapping of variables from the prior to the posterior obtained from those methods is only approximate. Errors arise from neglect of the Jacobian determinant of the transformation and from the possibility of discontinuous mappings.
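For the linear-Gaussian case discussed, the randomized-maximum-likelihood mapping has a closed form; a sketch (a zero prior mean is assumed for brevity):

```python
import numpy as np

def rml_sample(G, d, C, R, rng):
    """Randomized maximum likelihood for a linear-Gaussian problem: draw from
    the prior, perturb the data, then solve the (here closed-form)
    minimization. For linear G this maps prior samples exactly to posterior
    samples, as the paper's optimal-transport discussion requires."""
    n = C.shape[0]
    m_pr = rng.multivariate_normal(np.zeros(n), C)   # prior draw
    d_pert = rng.multivariate_normal(d, R)           # perturbed observation
    K = C @ G.T @ np.linalg.inv(G @ C @ G.T + R)     # gain from the normal equations
    return m_pr + K @ (d_pert - G @ m_pr)
```

In one dimension with unit prior and observation variances and d = 2, the posterior is N(1, 0.5), and repeated RML draws reproduce exactly those moments.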
Establishing and storing of deterministic quantum entanglement among three distant atomic ensembles.
Yan, Zhihui; Wu, Liang; Jia, Xiaojun; Liu, Yanhong; Deng, Ruijie; Li, Shujing; Wang, Hai; Xie, Changde; Peng, Kunchi
2017-09-28
It is crucial for the physical realization of quantum information networks to first establish entanglement among multiple space-separated quantum memories and then, at a user-controlled moment, to transfer the stored entanglement to quantum channels for distribution and conveyance of information. Here we present an experimental demonstration on generation, storage, and transfer of deterministic quantum entanglement among three spatially separated atomic ensembles. The off-line prepared multipartite entanglement of optical modes is mapped into three distant atomic ensembles to establish entanglement of atomic spin waves via electromagnetically induced transparency light-matter interaction. Then the stored atomic entanglement is transferred into a tripartite quadrature entangled state of light, which is space-separated and can be dynamically allocated to three quantum channels for conveying quantum information. The existence of entanglement among three released optical modes verifies that the system has the capacity to preserve multipartite entanglement. The presented protocol can be directly extended to larger quantum networks with more nodes.
Continuous-variable encoding is a promising approach for quantum information and communication networks. Here, the authors show how to map entanglement from three spatial optical modes to three separated atomic samples via electromagnetically induced transparency, releasing it later on demand.
NASA Astrophysics Data System (ADS)
Perera, Kushan C.; Western, Andrew W.; Robertson, David E.; George, Biju; Nawarathna, Bandara
2016-06-01
Irrigation demands fluctuate in response to weather variations and a range of irrigation management decisions, which creates challenges for water supply system operators. This paper develops a method for real-time ensemble forecasting of irrigation demand and applies it to irrigation command areas of various sizes for lead times of 1 to 5 days. The ensemble forecasts are based on a deterministic time series model coupled with ensemble representations of the various inputs to that model. Forecast inputs include past flow, precipitation, and potential evapotranspiration. These inputs are derived from flow observations from a modernized irrigation delivery system, short-term weather forecasts from numerical weather prediction models, and observed weather data from automatic weather stations. The predictive performance for the ensemble spread of irrigation demand was quantified using rank histograms, the mean continuous rank probability score (CRPS), the mean CRPS reliability and the temporal mean of the ensemble root mean squared error (MRMSE). The mean forecast was evaluated using root mean squared error (RMSE), Nash-Sutcliffe model efficiency (NSE) and bias. The NSE values for evaluation periods ranged between 0.96 (1 day lead time, whole study area) and 0.42 (5 days lead time, smallest command area). Rank histograms and comparison of MRMSE, mean CRPS, mean CRPS reliability and RMSE indicated that the ensemble spread is generally a reliable representation of the forecast uncertainty for short lead times but underestimates the uncertainty for long lead times.
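The deterministic scores (RMSE, NSE) and the rank histogram used above can be computed directly from an ensemble and its verifying observations. A minimal sketch on synthetic data (the demand series and ensemble sizes are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-day-ahead demands: a 50-member ensemble vs. 300 observations.
obs = 10.0 + rng.normal(0.0, 1.0, 300)
ens = obs[:, None] + rng.normal(0.0, 1.0, (300, 50))  # members scatter around truth

mean_fc = ens.mean(axis=1)

rmse = np.sqrt(np.mean((mean_fc - obs) ** 2))
nse = 1.0 - np.sum((mean_fc - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Rank histogram: rank of the observation within each sorted ensemble.
ranks = np.sum(ens < obs[:, None], axis=1)  # ranks 0..50
hist = np.bincount(ranks, minlength=51)

print(f"RMSE={rmse:.2f}  NSE={nse:.2f}")
# A reliable ensemble gives a roughly flat rank histogram; a U-shape
# indicates underdispersion, as reported here for long lead times.
```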
NASA Astrophysics Data System (ADS)
Debry, Edouard; Mallet, Vivien; Garaud, Damien; Malherbe, Laure; Bessagnet, Bertrand; Rouïl, Laurence
2010-05-01
Prev'Air is the French operational system for air pollution forecasting. It is developed and maintained by INERIS with financial support from the French Ministry for Environment. On a daily basis it delivers forecasts up to three days ahead for ozone, nitrogen dioxide and particles over France and Europe. Maps of concentration peaks and daily averages are freely available to the general public; more accurate data can be provided to customers and modelers. Prev'Air forecasts are based on the chemical transport model CHIMERE. French authorities increasingly rely on this platform to alert the general public in case of high pollution events and to assess the efficiency of regulation measures when such events occur. For example, the road speed limit may be reduced in given areas when the ozone level exceeds a regulatory threshold. These operational applications require INERIS to assess the quality of its forecasts and to inform end users of the confidence level. Indeed, modeled concentrations always remain an approximation of the true concentrations because of the high uncertainty in input data, such as meteorological fields and emissions, because of incomplete or inaccurate representation of physical processes, and because of deficiencies in numerical integration [1]. In this communication we present the uncertainty analysis of the CHIMERE model conducted in the framework of an INERIS research project aiming, on the one hand, to assess the uncertainty of several deterministic models and, on the other hand, to propose relevant indicators describing air quality forecasts and their uncertainty. Several methods exist to assess the uncertainty of a model. Under given assumptions the model may be differentiated into an adjoint model which directly provides the sensitivity of concentrations to given parameters. So far, however, Monte Carlo methods seem to be the most widely used [2,3], as they are relatively easy to implement.
In this framework a probability density function (PDF) is associated with each input parameter, according to its assumed uncertainty. The combined PDFs are then propagated through the model by means of several simulations with randomly perturbed input parameters. One may then obtain an approximation of the PDF of modeled concentrations, provided the Monte Carlo process has reasonably converged. The uncertainty analysis with CHIMERE was carried out with a Monte Carlo method over the French domain for two periods: 13 days during January 2009, with a focus on particles, and 28 days during August 2009, with a focus on ozone. The results show that for the summer period and 500 simulations, the time- and space-averaged standard deviation for ozone is 16 µg/m3, to be compared with an averaged concentration of 89 µg/m3. It is noteworthy that the space-averaged standard deviation for ozone is relatively constant over time (the standard deviation of the time series itself is 1.6 µg/m3). The spatial variation of the ozone standard deviation seems to indicate that emissions have a significant impact, followed by western boundary conditions. Monte Carlo simulations are then post-processed by both ensemble [4] and Bayesian [5] methods in order to assess the quality of the uncertainty estimation. (1) Rao, K.S. Uncertainty Analysis in Atmospheric Dispersion Modeling, Pure and Applied Geophysics, 2005, 162, 1893-1917. (2) Beekmann, M. and Derognat, C. Monte Carlo uncertainty analysis of a regional-scale transport chemistry model constrained by measurements from the Atmospheric Pollution Over the Paris Area (ESQUIF) campaign, Journal of Geophysical Research, 2003, 108, 8559-8576. (3) Hanna, S.R. and Lu, Z. and Frey, H.C. and Wheeler, N. and Vukovich, J. and Arunachalam, S. and Fernau, M. and Hansen, D.A.
Uncertainties in predicted ozone concentrations due to input uncertainties for the UAM-V photochemical grid model applied to the July 1995 OTAG domain, Atmospheric Environment, 2001, 35, 891-903. (4) Mallet, V., and B. Sportisse (2006), Uncertainty in a chemistry-transport model due to physical parameterizations and numerical approximations: An ensemble approach applied to ozone modeling, J. Geophys. Res., 111, D01302, doi:10.1029/2005JD006149. (5) Romanowicz, R. and Higson, H. and Teasdale, I. Bayesian uncertainty estimation methodology applied to air pollution modelling, Environmetrics, 2000, 11, 351-371.
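The Monte Carlo propagation described above reduces to: draw perturbed inputs from their assumed PDFs, run the model once per draw, and take statistics of the outputs. A sketch with a cheap surrogate in place of CHIMERE (the response function and the input PDFs are invented for illustration; a real study runs the full model):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical surrogate for a chemistry-transport model: ozone as a
# function of an emission factor E and a boundary-condition scaling B.
def ozone_model(E, B):
    return 60.0 + 25.0 * np.log(E) + 10.0 * B

n = 500  # number of Monte Carlo simulations, as in the summer-period study
E = rng.lognormal(mean=0.0, sigma=0.3, size=n)  # assumed emission uncertainty
B = rng.normal(1.0, 0.1, size=n)                # assumed boundary-condition PDF

o3 = ozone_model(E, B)
print(f"mean O3 = {o3.mean():.1f}, std = {o3.std():.1f} ug/m3")
# The sample std is the uncertainty estimate; convergence should be
# checked, e.g. by tracking the std as n grows.
```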
Path planning in uncertain flow fields using ensemble method
NASA Astrophysics Data System (ADS)
Wang, Tong; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.
2016-10-01
An ensemble-based approach is developed to conduct optimal path planning in unsteady ocean currents under uncertainty. We focus our attention on two-dimensional steady and unsteady uncertain flows, and adopt a sampling methodology that is well suited to operational forecasts, where an ensemble of deterministic predictions is used to model and quantify uncertainty. In an operational setting, much about dynamics, topography, and forcing of the ocean environment is uncertain. To address this uncertainty, the flow field is parametrized using a finite number of independent canonical random variables with known densities, and the ensemble is generated by sampling these variables. For each of the resulting realizations of the uncertain current field, we predict the path that minimizes the travel time by solving a boundary value problem (BVP), based on the Pontryagin maximum principle. A family of backward-in-time trajectories starting at the end position is used to generate suitable initial values for the BVP solver. This allows us to examine and analyze the performance of the sampling strategy and to develop insight into extensions dealing with general circulation ocean models. In particular, the ensemble method enables us to perform a statistical analysis of travel times and consequently develop a path planning approach that accounts for these statistics. The proposed methodology is tested for a number of scenarios. We first validate our algorithms by reproducing simple canonical solutions, and then demonstrate our approach in more complex flow fields, including idealized, steady and unsteady double-gyre flows.
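The statistical analysis of travel times described above can be illustrated in a deliberately trivial setting: a one-dimensional crossing where the uncertain current is parametrized by a single random variable, so the "optimal path" is the straight line and no BVP solve is needed. All numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Minimal sketch: crossing of length L at boat speed v, with an uncertain
# along-track current u sampled from one canonical random variable.
L, v = 100.0, 5.0               # km, km/h (illustrative values)
u = rng.normal(0.0, 1.0, 1000)  # ensemble of current realizations

# In the full method each realization requires solving a boundary value
# problem for the time-optimal path; here that path is trivially straight.
T = L / (v + u)                 # travel time for each realization

print(f"mean travel time {T.mean():.1f} h, 5-95% range "
      f"[{np.percentile(T, 5):.1f}, {np.percentile(T, 95):.1f}] h")
```

The travel-time percentiles are the kind of statistic the paper uses for risk-aware path planning.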
Shallow cumuli ensemble statistics for development of a stochastic parameterization
NASA Astrophysics Data System (ADS)
Sakradzija, Mirjana; Seifert, Axel; Heus, Thijs
2014-05-01
According to a conventional deterministic approach to the parameterization of moist convection in numerical atmospheric models, a given large-scale forcing produces a unique response from the unresolved convective processes. This representation leaves out the small-scale variability of convection: as is known from empirical studies of deep and shallow convective cloud ensembles, there is a whole distribution of sub-grid states corresponding to a given large-scale forcing. Moreover, this distribution gets broader with increasing model resolution. This behavior is also consistent with our theoretical understanding of a coarse-grained nonlinear system. We propose an approach to represent the variability of the unresolved shallow-convective states, including the dependence of the spread and shape of the sub-grid state distribution on the model horizontal resolution. Starting from the Gibbs canonical ensemble theory, Craig and Cohen (2006) developed a theory for the fluctuations in a deep convective ensemble. The micro-states of a deep convective cloud ensemble are characterized by the cloud-base mass flux, which, according to the theory, is exponentially distributed (Boltzmann distribution). Following their work, we study the shallow cumulus ensemble statistics and the distribution of the cloud-base mass flux. We employ a Large-Eddy Simulation model (LES) and a cloud tracking algorithm, followed by a conditional sampling of clouds at the cloud-base level, to retrieve information about the individual cloud life cycles and the cloud ensemble as a whole. In the case of a shallow cumulus cloud ensemble, the distribution of micro-states is a generalized exponential distribution. Based on the empirical and theoretical findings, a stochastic model has been developed to simulate the shallow convective cloud ensemble and to test the convective ensemble theory.
The stochastic model simulates a compound random process, with the number of convective elements drawn from a Poisson distribution and cloud properties sub-sampled from a generalized ensemble distribution. We study the role of the different cloud subtypes in a shallow convective ensemble and how the diverse cloud properties and cloud lifetimes affect the system macro-state. To what extent does the cloud-base mass flux distribution deviate from the simple Boltzmann distribution, and how does this affect the results from the stochastic model? Is the memory provided by the finite lifetime of individual clouds important for the ensemble statistics? We also test for the minimal information given as input to the stochastic model that is able to reproduce the ensemble mean statistics and the variability in a convective ensemble. An important property of the resulting distribution of the sub-grid convective states is its scale-adaptivity: the smaller the grid size, the broader the compound distribution of the sub-grid states.
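The compound random process above can be sketched directly: a Poisson number of clouds per grid box, each with an exponentially (Boltzmann) distributed cloud-base mass flux, summed to give the grid-box total. Parameter values are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(4)

# Stochastic shallow-cumulus sketch: cloud number is Poisson, per-cloud
# cloud-base mass flux is exponential; the grid-box total is their sum.
mean_n = 100.0  # expected cloud number (scales with grid-box area)
mean_m = 0.05   # mean per-cloud mass flux (illustrative units)

def gridbox_mass_flux(samples):
    n = rng.poisson(mean_n, samples)
    return np.array([rng.exponential(mean_m, k).sum() for k in n])

M = gridbox_mass_flux(5000)
# Scale-adaptivity: the relative spread (coefficient of variation) grows
# as the expected cloud number, i.e. the grid-box area, shrinks.
print(f"mean {M.mean():.2f}, cv {M.std() / M.mean():.2f}")
```

For a compound Poisson-exponential process the variance is analytic (lambda times the second moment of the per-cloud flux), which gives a direct check on the sampler.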
NASA Astrophysics Data System (ADS)
Li, Tao; Deng, Fu-Guo
2015-10-01
The quantum repeater is one of the important building blocks of long-distance quantum communication networks. Previous quantum repeaters based on atomic ensembles and linear optical elements can only be performed with a maximal success probability of 1/2 during the entanglement creation and entanglement swapping procedures. Meanwhile, polarization noise during the entanglement distribution process is harmful to the entangled channel created. Here we introduce a general interface between a polarized photon and an atomic ensemble trapped in a single-sided optical cavity, with which we propose a high-efficiency quantum repeater protocol in which robust entanglement distribution is accomplished through stable spatial-temporal entanglement; it can in principle create deterministic entanglement between neighboring atomic ensembles in a heralded way as a result of cavity quantum electrodynamics. Meanwhile, a simplified parity-check gate allows the entanglement swapping to be completed with unit efficiency, rather than 1/2 with linear optics. We detail the performance of our protocol with current experimental parameters and show its robustness to the imperfections, i.e., detuning and coupling variation, involved in the reflection process. These features make it a useful building block for long-distance quantum communication.
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Lionello, Piero
2014-05-01
Advantages of an ensemble prediction forecast (EPF) technique used for sea level (SL) prediction at the Northern Adriatic coast are investigated. The aim is to explore whether the EPF is more accurate than the traditional deterministic forecast (DF) and to assess the value of the added information, mainly on forecast uncertainty. Improving the SL forecast for the city of Venice is of paramount importance for the management and maintenance of this historical city and for operating the movable barriers that are presently being built for its protection. The operational practice is simulated for three months, from 1 October to 31 December 2010. The EPF is based on the HYPSE model, a standard single-layer nonlinear shallow-water model whose equations are derived from the depth-averaged momentum equations and which predicts the SL. A description of the model is available in the scientific literature. Forcings for HYPSE are provided by three different sets of 3-hourly ECMWF 10-m wind and MSLP fields: the high-resolution meteorological forecast (which is used for the deterministic SL forecast, DF), the control run forecast (CRF, which differs from the DF only in the lower resolution of its meteorological fields) and the 50 ensemble members of the ECMWF EPS (which are used for the SL EPS). The resolution of the DF fields is T1279, and that of both the CRF and ECMWF EPS fields is T639. The 10-m wind and MSLP fields have been downloaded at 0.125 degrees (DF) and 0.25 degrees (CRF and EPS) and linearly interpolated to the HYPSE grid (which is the same for all simulations). The version of HYPSE used in the SL EPS uses a rectangular mesh grid of variable size, with the minimum grid step (0.03 degrees) in the northern part of the Adriatic Sea, from which the grid step increases by a factor of 1.01 in both latitude and longitude (in practice, resolution varies from 3.3 to 7 km).
Results are analyzed considering the EPS spread, the RMS error of the simulations and the Brier Skill Score, and are compared to observations at tide gauges distributed along the Croatian and Italian coasts of the Adriatic Sea. It is shown that the ensemble spread is indeed a reliable indicator of the uncertainty of the storm surge prediction. Further, results show how the uncertainty depends on the predicted sea level and how it increases with the forecast time range. The accuracy of the ensemble mean forecast is actually higher than that of the deterministic forecast, even though the latter is driven by higher-resolution meteorological forcings.
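The Brier (Skill) Score used above verifies probabilistic forecasts of a binary event, e.g. "sea level exceeds a threshold", with the forecast probability taken as the fraction of ensemble members above it. A sketch on synthetic data (series length, ensemble size and threshold are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)

obs = rng.normal(0.0, 0.3, 500)                       # verifying sea levels (m)
ens = obs[:, None] + rng.normal(0.0, 0.1, (500, 50))  # 50-member ensemble
threshold = 0.3                                       # exceedance event (m)

p = np.mean(ens > threshold, axis=1)   # forecast probabilities
o = (obs > threshold).astype(float)    # binary outcomes
bs = np.mean((p - o) ** 2)             # Brier score (lower is better)

# Reference forecast: the climatological base rate
p_clim = o.mean()
bs_ref = np.mean((p_clim - o) ** 2)
bss = 1.0 - bs / bs_ref                # BSS > 0 means skill over climatology
print(f"BS={bs:.3f}  BSS={bss:.2f}")
```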
NASA Astrophysics Data System (ADS)
Tito Arandia Martinez, Fabian
2014-05-01
Adequate uncertainty assessment is an important issue in hydrological modelling. An important issue for hydropower producers is to obtain ensemble forecasts which truly grasp the uncertainty linked to upcoming streamflows. If properly assessed, this uncertainty can lead to optimal reservoir management and energy production (e.g. [1]). The meteorological inputs to the hydrological model account for an important part of the total uncertainty in streamflow forecasting. Since the creation of the THORPEX initiative and the TIGGE database, meteorological ensemble forecasts from nine agencies throughout the world have been made available. This allows for hydrological ensemble forecasts based on multiple meteorological ensemble forecasts. Consequently, both the uncertainty linked to the architecture of the meteorological model and the uncertainty linked to the initial condition of the atmosphere can be accounted for. The main objective of this work is to show that a weighted combination of meteorological ensemble forecasts based on different atmospheric models can lead to improved hydrological ensemble forecasts, for horizons from one to ten days. This experiment is performed for the Baskatong watershed, a head subcatchment of the Gatineau watershed in the province of Quebec, Canada. The Baskatong watershed is of great importance for hydropower production, as it comprises the main reservoir for the Gatineau watershed, on which there are six hydropower plants managed by Hydro-Québec. Since the 1970s, Hydro-Québec has used pseudo-ensemble forecasts based on deterministic meteorological forecasts to which variability derived from past forecasting errors is added. We use a combination of meteorological ensemble forecasts from different models (precipitation and temperature) as the main inputs for the hydrological model HSAMI ([2]).
The meteorological ensembles from eight of the nine agencies available through TIGGE are weighted according to their individual performance and combined to form a grand ensemble. Results show that the hydrological forecasts derived from the grand ensemble perform better than the pseudo-ensemble forecasts currently used operationally at Hydro-Québec. References: [1] M. Verbunt, A. Walser, J. Gurtz et al., "Probabilistic flood forecasting with a limited-area ensemble prediction system: Selected case studies," Journal of Hydrometeorology, vol. 8, no. 4, pp. 897-909, Aug. 2007. [2] N. Evora, Valorisation des prévisions météorologiques d'ensemble, Institut de recherche d'Hydro-Québec, 2005. [3] V. Fortin, Le modèle météo-apport HSAMI: historique, théorie et application, Institut de recherche d'Hydro-Québec, 2000.
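One plausible way to weight agency ensembles "according to their individual performance", as above, is to make each weight proportional to the inverse of a skill metric; the operational scheme may differ, and all data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(6)

# Three hypothetical agency ensembles for one precipitation series.
obs = rng.gamma(2.0, 2.0, 200)
agencies = {
    "A": obs[:, None] + rng.normal(0.0, 1.0, (200, 20)),
    "B": obs[:, None] + rng.normal(0.5, 2.0, (200, 20)),  # biased, wide
    "C": obs[:, None] + rng.normal(0.0, 1.5, (200, 20)),
}

def score(ens):
    # Mean absolute error of the ensemble mean, a simple skill proxy;
    # a CRPS-based score would be a natural alternative.
    return np.mean(np.abs(ens.mean(axis=1) - obs))

# Weights proportional to inverse error, normalized to sum to one.
inv = {k: 1.0 / score(e) for k, e in agencies.items()}
total = sum(inv.values())
weights = {k: v / total for k, v in inv.items()}
print(weights)  # the most skilful agency receives the largest weight
```

Members are then drawn into the grand ensemble in proportion to these weights.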
NASA Astrophysics Data System (ADS)
Lim, S.; Park, S. K.; Zupanski, M.
2015-09-01
Ozone (O3) plays an important role in chemical reactions and is usually incorporated in chemical data assimilation (DA). In tropical cyclones (TCs), O3 usually shows a lower concentration inside the eyewall and an elevated concentration around the eye, impacting meteorological as well as chemical variables. To identify the impact of O3 observations on TC structure, including meteorological and chemical information, we developed a coupled meteorology-chemistry DA system by employing the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) and an ensemble-based DA algorithm, the maximum likelihood ensemble filter (MLEF). For a TC case that occurred over East Asia, Typhoon Nabi (2005), our results indicate that the ensemble forecast is reasonable, accompanied by larger background state uncertainty over the TC and also over eastern China. Similarly, the assimilation of O3 observations impacts meteorological and chemical variables near the TC and over eastern China. The strongest impact on air quality in the lower troposphere was over China, likely due to pollution advection. In the vicinity of the TC, however, the strongest adjustment of chemical variables was at higher levels. The impact on meteorological variables was similar both over China and near the TC. The analysis results are verified using several measures that include the cost function, the root mean square (RMS) error with respect to observations, and the degrees of freedom for signal (DFS). All measures indicate a positive impact of DA on the analysis: the cost function and RMS error decreased by 16.9% and 8.87%, respectively. In particular, the DFS indicates a strong positive impact of observations in the TC area, with a weaker maximum over northeastern China.
Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics
NASA Astrophysics Data System (ADS)
Lazarus, S. M.; Holman, B. P.; Splitt, M. E.
2017-12-01
A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An extensive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east-central Florida stations, the method is applied to one year of 24-h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both the deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates the post-processed forecasts are calibrated. Two downscaling case studies are presented, a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.
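EMOS fits a predictive Gaussian whose mean and variance are affine functions of the ensemble mean and ensemble variance. The standard fit minimizes the CRPS (Gneiting et al.); the sketch below uses a lighter two-stage least-squares stand-in on synthetic univariate wind speeds, with all numbers invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic winds: the raw ensemble is biased and underdispersive,
# a common situation that EMOS corrects.
truth = 8.0 + rng.normal(0.0, 2.0, 1000)
ens = truth[:, None] + 1.0 + rng.normal(0.0, 0.5, (1000, 20))  # +1 m/s bias

xbar, s2 = ens.mean(axis=1), ens.var(axis=1)

# EMOS predictive Gaussian: N(a + b*xbar, c + d*s2).
A = np.column_stack([np.ones_like(xbar), xbar])
a, b = np.linalg.lstsq(A, truth, rcond=None)[0]
resid2 = (truth - (a + b * xbar)) ** 2
c, d = np.linalg.lstsq(np.column_stack([np.ones_like(s2), s2]), resid2,
                       rcond=None)[0]

mu = a + b * xbar               # calibrated predictive mean
var = np.maximum(c + d * s2, 1e-6)  # predictive variance, kept positive
print(f"bias before {np.mean(xbar - truth):+.2f}, "
      f"after {np.mean(mu - truth):+.2f}")
```

In the gridded method above, the fitted (mu, var) at stations are then spread across the grid using the flow-dependent relationships from the WRF winds.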
NASA Astrophysics Data System (ADS)
Cervone, G.; Clemente-Harding, L.; Alessandrini, S.; Delle Monache, L.
2016-12-01
A methodology based on Artificial Neural Networks (ANN) and an Analog Ensemble (AnEn) is presented to generate 72-hour deterministic and probabilistic forecasts of power generated by photovoltaic (PV) power plants, using input from a numerical weather prediction model and computed astronomical variables. ANN and AnEn are used individually and in combination to generate forecasts for three solar power plants located in Italy. The computational scalability of the proposed solution is tested using synthetic data simulating 4,450 PV power stations. The NCAR Yellowstone supercomputer is employed to test the parallel implementation of the proposed solution, ranging from 1 node (32 cores) to 4,450 nodes (141,140 cores). Results show that the combined AnEn + ANN solution yields the best results, and that the proposed solution is well suited for massive-scale computation.
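The core of the analog ensemble is simple: for a new NWP forecast, find the k most similar past forecasts in predictor space and use their verifying observations as ensemble members. A minimal sketch on toy data (the predictors, sizes and noise model are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(8)

# Past NWP predictors (e.g. normalized irradiance, temperature) and the
# PV power that verified each past forecast (toy relationship).
hist_fc = rng.uniform(0.0, 1.0, (2000, 2))
hist_obs = hist_fc[:, 0] + 0.1 * rng.normal(size=2000)

def analog_ensemble(new_fc, k=20):
    """Return the k verifying observations of the closest past forecasts."""
    dist = np.linalg.norm(hist_fc - new_fc, axis=1)  # predictor-space distance
    idx = np.argsort(dist)[:k]
    return hist_obs[idx]

members = analog_ensemble(np.array([0.6, 0.4]))
print(f"deterministic: {members.mean():.2f}, 90% interval "
      f"[{np.percentile(members, 5):.2f}, {np.percentile(members, 95):.2f}]")
```

The member mean gives the deterministic forecast and the member spread the probabilistic one; operational AnEn implementations weight each predictor in the distance metric.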
NASA Astrophysics Data System (ADS)
Silva, Raquel A.; West, J. Jason; Lamarque, Jean-François; Shindell, Drew T.; Collins, William J.; Dalsoren, Stig; Faluvegi, Greg; Folberth, Gerd; Horowitz, Larry W.; Nagashima, Tatsuya; Naik, Vaishali; Rumbold, Steven T.; Sudo, Kengo; Takemura, Toshihiko; Bergmann, Daniel; Cameron-Smith, Philip; Cionni, Irene; Doherty, Ruth M.; Eyring, Veronika; Josse, Beatrice; MacKenzie, Ian A.; Plummer, David; Righi, Mattia; Stevenson, David S.; Strode, Sarah; Szopa, Sophie; Zeng, Guang
2016-08-01
Ambient air pollution from ground-level ozone and fine particulate matter (PM2.5) is associated with premature mortality. Future concentrations of these air pollutants will be driven by natural and anthropogenic emissions and by climate change. Using anthropogenic and biomass burning emissions projected in the four Representative Concentration Pathway scenarios (RCPs), the ACCMIP ensemble of chemistry-climate models simulated future concentrations of ozone and PM2.5 at selected decades between 2000 and 2100. We use output from the ACCMIP ensemble, together with projections of future population and baseline mortality rates, to quantify the human premature mortality impacts of future ambient air pollution. Future air-pollution-related premature mortality in 2030, 2050 and 2100 is estimated for each scenario and for each model using a health impact function based on changes in concentrations of ozone and PM2.5 relative to 2000 and projected future population and baseline mortality rates. Additionally, the global mortality burden of ozone and PM2.5 in 2000 and each future period is estimated relative to 1850 concentrations, using present-day and future population and baseline mortality rates. The change in future ozone concentrations relative to 2000 is associated with excess global premature mortality in some scenarios/periods, particularly in RCP8.5 in 2100 (316 thousand deaths year-1), likely driven by the large increase in methane emissions and by the net effect of climate change projected in this scenario, but it leads to considerable avoided premature mortality for the three other RCPs. However, the global mortality burden of ozone markedly increases from 382 000 (121 000 to 728 000) deaths year-1 in 2000 to between 1.09 and 2.36 million deaths year-1 in 2100, across RCPs, mostly due to the effect of increases in population and baseline mortality rates. 
PM2.5 concentrations decrease relative to 2000 in all scenarios, due to projected reductions in emissions, and are associated with avoided premature mortality, particularly in 2100: between -2.39 and -1.31 million deaths year-1 for the four RCPs. The global mortality burden of PM2.5 is estimated to decrease from 1.70 (1.30 to 2.10) million deaths year-1 in 2000 to between 0.95 and 1.55 million deaths year-1 in 2100 for the four RCPs due to the combined effect of decreases in PM2.5 concentrations and changes in population and baseline mortality rates. Trends in future air-pollution-related mortality vary regionally across scenarios, reflecting assumptions for economic growth and air pollution control specific to each RCP and region. Mortality estimates differ among chemistry-climate models due to differences in simulated pollutant concentrations, which is the greatest contributor to overall mortality uncertainty for most cases assessed here, supporting the use of model ensembles to characterize uncertainty. Increases in exposed population and baseline mortality rates of respiratory diseases magnify the impact on premature mortality of changes in future air pollutant concentrations and explain why the future global mortality burden of air pollution can exceed the current burden, even where air pollutant concentrations decrease.
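Health impact functions of the kind used above are commonly log-linear in the concentration change: excess deaths = y0 * (1 - exp(-beta * dC)) * exposed population. A sketch with purely illustrative inputs (not the study's values):

```python
import numpy as np

def premature_mortality(y0, beta, d_conc, pop):
    """Log-linear health impact function: excess deaths per year."""
    return y0 * (1.0 - np.exp(-beta * d_conc)) * pop

y0 = 0.005                 # baseline mortality rate (deaths/person/year)
beta = np.log(1.06) / 10   # from an assumed RR of 1.06 per 10 ug/m3 PM2.5
pop = 1e7                  # exposed population

deaths = premature_mortality(y0, beta, d_conc=5.0, pop=pop)
print(f"{deaths:.0f} excess deaths/year")
```

The abstract's key point falls out of the formula: even when d_conc decreases, growth in pop and y0 can raise the total burden.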
Regional crop yield forecasting: a probabilistic approach
NASA Astrophysics Data System (ADS)
de Wit, A.; van Diepen, K.; Boogaard, H.
2009-04-01
Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
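The dispersion-based uncertainty bounds described above can be sketched in a few lines: percentiles of an ensemble of aggregated yield simulations give the forecast interval, and the empirical fraction of members below a critical level gives a risk estimate for decision making. The numbers below are invented for illustration and are not from the study.

```python
import numpy as np

# Hypothetical ensemble of aggregated regional yield simulations (t/ha),
# one value per ensemble member; the numbers are purely illustrative.
ensemble_yields = np.array([5.8, 6.1, 5.4, 6.3, 5.9, 6.0, 5.6, 6.2, 5.7, 6.1])

# Central estimate and dispersion-based uncertainty bounds (80% interval).
median = np.percentile(ensemble_yields, 50)
lower, upper = np.percentile(ensemble_yields, [10, 90])

# Risk measure: probability that the regional yield falls below a
# user-chosen critical level.
critical_yield = 5.5
p_below = (ensemble_yields < critical_yield).mean()
```

The same three quantities (central estimate, interval, exceedance probability) are what a probabilistic yield forecasting system would report in place of a single deterministic number.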
NASA Astrophysics Data System (ADS)
Christensen, Hannah; Moroz, Irene; Palmer, Tim
2015-04-01
Forecast verification is important across scientific disciplines as it provides a framework for evaluating the performance of a forecasting system. In the atmospheric sciences, probabilistic skill scores are often used for verification as they provide a way of unambiguously ranking the performance of different probabilistic forecasts. In order to be useful, a skill score must be proper -- it must encourage honesty in the forecaster, and reward forecasts which are reliable and which have good resolution. A new score, the Error-spread Score (ES), is proposed which is particularly suitable for evaluation of ensemble forecasts. It is formulated with respect to the moments of the forecast. The ES is confirmed to be a proper score, and is therefore sensitive to both resolution and reliability. The ES is tested on forecasts made using the Lorenz '96 system, and found to be useful for summarising the skill of the forecasts. The European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system (EPS) is evaluated using the ES. Its performance is compared to a perfect statistical probabilistic forecast -- the ECMWF high resolution deterministic forecast dressed with the observed error distribution. This generates a forecast that is perfectly reliable if considered over all time, but which does not vary from day to day with the predictability of the atmospheric flow. The ES distinguishes between the dynamically reliable EPS forecasts and the statically reliable dressed deterministic forecasts. Other skill scores are tested and found to be comparatively insensitive to this desirable forecast quality. The ES is used to evaluate seasonal range ensemble forecasts made with the ECMWF System 4. The ensemble forecasts are found to be skilful when compared with climatological or persistence forecasts, though this skill is dependent on region and time of year.
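A minimal sketch of a moment-based error-spread calculation, per the description above, might look as follows. The specific form ES = (s^2 - e^2 - e*s*g)^2, with s the ensemble spread, e the ensemble-mean error and g the ensemble skewness, and its sign conventions are assumptions here rather than a verbatim transcription of the paper.

```python
import numpy as np

def error_spread_score(members, obs):
    """Error-spread Score for one forecast-verification pair.

    Assumes the moment-based form ES = (s^2 - e^2 - e*s*g)^2, where s is
    the ensemble standard deviation, e the ensemble-mean error and g the
    ensemble skewness (sign conventions are an assumption of this sketch).
    """
    members = np.asarray(members, dtype=float)
    m = members.mean()
    s = members.std()                       # ensemble spread (population std)
    e = obs - m                             # ensemble-mean error
    g = ((members - m) ** 3).mean() / s**3 if s > 0 else 0.0
    return (s**2 - e**2 - e * s * g) ** 2

# A symmetric ensemble whose spread matches its error scores zero:
# members [1, 3] give mean 2, s = 1, g = 0; obs = 3 gives e = 1, so ES = 0.
```

Averaging this quantity over many forecast-verification pairs would then summarise the joint reliability and resolution of the ensemble system.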
The Development of Storm Surge Ensemble Prediction System and Case Study of Typhoon Meranti in 2016
NASA Astrophysics Data System (ADS)
Tsai, Y. L.; Wu, T. R.; Terng, C. T.; Chu, C. H.
2017-12-01
Taiwan, located in a zone of potentially severe storm generation, is under the threat of storm surge and associated inundation. Ensemble prediction can help forecasters characterize storm surge under uncertainty in storm track and intensity, and it can also support deterministic forecasting. In this study, the kernel of the ensemble prediction system is COMCOT-SURGE (COrnell Multi-grid COupled Tsunami Model - Storm Surge). COMCOT-SURGE solves the nonlinear shallow water equations in the open ocean and coastal regions with a nested-grid scheme and adopts a wet-dry-cell treatment to calculate the potential inundation area. To account for tide-surge interaction, the global TPXO 7.1 tide model provides the tidal boundary conditions. After a series of validations and case studies, COMCOT-SURGE has become an official operational system of the Central Weather Bureau (CWB) in Taiwan. Here, the strongest typhoon of 2016, Typhoon Meranti, is chosen as a case study. We adopt twenty ensemble members from the CWB WRF Ensemble Prediction System (CWB WEPS), whose members differ in microphysics, boundary-layer, cumulus, and surface parameterizations. Box-and-whisker results show that the maximum observed storm surge fell between the first and third quartiles at more than 70% of gauge locations, e.g. Toucheng, Chengkung, and Jiangjyun. In conclusion, ensemble prediction can effectively help forecasters predict storm surge, especially under uncertainty in storm track and intensity.
NASA Astrophysics Data System (ADS)
Xu, Jianhui; Shu, Hong
2014-09-01
This study assesses the analysis performance of assimilating Moderate Resolution Imaging Spectroradiometer (MODIS)-based albedo and snow cover fraction (SCF) separately or jointly into the physically based Common Land Model (CoLM). A direct insertion (DI) method is proposed to assimilate the black-sky and white-sky albedos into the CoLM. The MODIS-based albedo is calculated from the MODIS bidirectional reflectance distribution function (BRDF) model parameters product (MCD43B1) and the solar zenith angle estimated in the CoLM at each time step. Meanwhile, the MODIS SCF (MOD10A1) is assimilated into the CoLM using the deterministic ensemble Kalman filter (DEnKF) method. A new DEnKF-albedo assimilation scheme integrating the DI and DEnKF schemes is proposed. Our assimilation results are validated against in situ snow depth observations from November 2008 to March 2009 at five sites in the Altay region of China. The experimental results show that all three data assimilation schemes can improve snow depth simulations. Overall, however, the DEnKF-albedo assimilation shows the best analysis performance, as it significantly reduces the bias and root-mean-square error (RMSE) during the snow accumulation and ablation periods at all sites except the Fuyun site. The SCF assimilation via DEnKF produces better results than the albedo assimilation via DI, implying that the albedo assimilation, which only indirectly updates the snow depth state variable, is less efficient than the direct SCF assimilation. For the Fuyun site, the DEnKF-albedo scheme tends to overestimate the snow depth accumulation, with the maximum bias and RMSE values, because of the large positive innovation (observation minus forecast).
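The DEnKF analysis step used above can be sketched generically: the ensemble mean is updated with the full Kalman gain, while the anomalies are updated with half the gain, which deflates the spread without perturbed observations. This is a textbook-style sketch following Sakov and Oke's formulation as commonly stated, not the CoLM implementation.

```python
import numpy as np

def denkf_update(X, y, H, R):
    """One DEnKF analysis step (sketch): mean updated with the full Kalman
    gain K, anomalies updated with K/2.
    X: (n_state, n_ens) ensemble; y: observation vector;
    H: linear observation operator; R: observation error covariance.
    """
    n_ens = X.shape[1]
    xm = X.mean(axis=1, keepdims=True)
    A = X - xm                              # forecast anomalies
    HA = H @ A
    Pf_Ht = A @ HA.T / (n_ens - 1)          # cross covariance P^f H^T
    S = HA @ HA.T / (n_ens - 1) + R         # innovation covariance
    K = Pf_Ht @ np.linalg.inv(S)            # Kalman gain
    xm_a = xm + K @ (y.reshape(-1, 1) - H @ xm)   # analysis mean
    A_a = A - 0.5 * K @ HA                        # half-gain anomaly update
    return xm_a + A_a
```

For a scalar state with forecast variance 4 and unit observation error, the analysis mean lands 4/5 of the way toward the observation and the ensemble spread shrinks, as the test below checks.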
Impacts of Atmosphere-Ocean Coupling on Southern Hemisphere Climate Change
NASA Technical Reports Server (NTRS)
Li, Feng; Newman, Paul; Pawson, Steven
2013-01-01
Climate in the Southern Hemisphere (SH) has undergone significant changes in recent decades. These changes are closely linked to the shift of the Southern Annular Mode (SAM) towards its positive polarity, which is driven primarily by Antarctic ozone depletion. There is growing evidence that Antarctic ozone depletion has significant impacts on Southern Ocean circulation change. However, it is poorly understood whether and how ocean feedback might impact the SAM and climate change in the SH atmosphere. This outstanding science question is investigated using the Goddard Earth Observing System Coupled Atmosphere-Ocean-Chemistry Climate Model (GEOS-AOCCM). We perform ensemble simulations of the recent past (1960-2010) with and without the interactive ocean. For simulations without the interactive ocean, we use sea surface temperatures and sea ice concentrations produced by the interactive ocean simulations. The differences between these two ensemble simulations quantify the effects of atmosphere-ocean coupling. We will investigate the impacts of atmosphere-ocean coupling on stratospheric processes such as Antarctic ozone depletion and Antarctic polar vortex breakup. We will address whether ocean feedback affects Rossby wave generation in the troposphere and wave propagation into the stratosphere. Another focus of this study is to assess how ocean feedback might affect the tropospheric SAM response to Antarctic ozone depletion.
Developing Stochastic Models as Inputs for High-Frequency Ground Motion Simulations
NASA Astrophysics Data System (ADS)
Savran, William Harvey
High-frequency (10 Hz) deterministic ground motion simulations are challenged by our limited understanding of the small-scale structure of the Earth's crust and of the rupture process during an earthquake. We will likely never obtain deterministic models that accurately describe these processes down to the meter-scale lengths required for broadband wave propagation. Instead, we can attempt to explain the behavior in a statistical sense, using stochastic models defined by correlations observed in the natural Earth and by physics-based simulations of the earthquake rupture process. Toward this goal, we develop stochastic models addressing both primary considerations for deterministic ground motion simulations: the description of material properties in the crust, and broadband earthquake source descriptions. Using borehole sonic log data recorded in the Los Angeles basin, we estimate the spatial correlation structure of the small-scale fluctuations in P-wave velocity by determining the best-fitting parameters of a von Karman correlation function. We find that Hurst exponents, nu, between 0.0 and 0.2, vertical correlation lengths, az, of 15-150 m, and standard deviations, sigma, of about 5% characterize the variability in the borehole data. Using these parameters, we generated a stochastic model of velocity and density perturbations, combined it with leading seismic velocity models, and performed a validation exercise using heterogeneous media for the 2008 Chino Hills, CA, earthquake. We find that models of velocity and density perturbations can have significant effects on the wavefield at frequencies as low as 0.3 Hz, with ensemble median values of various ground motion metrics varying by up to +/-50% at certain stations compared to those computed solely from the CVM. Finally, we develop a kinematic rupture generator based on dynamic rupture simulations on geometrically complex faults.
We analyze 100 dynamic rupture simulations on strike-slip faults ranging from Mw 6.4-7.2. We find that our dynamic simulations follow empirical scaling relationships for inter-plate strike-slip events and provide source spectra comparable with an omega^-2 model. Our rupture generator reproduces GMPE medians and intra-event standard deviations of spectral accelerations for an ensemble of 10 Hz fully deterministic ground motion simulations, as compared with NGA-West2 GMPE relationships up to 0.2 seconds.
Application of Wavelet Filters in an Evaluation of Photochemical Model Performance
Air quality model evaluation can be enhanced with time-scale specific comparisons of outputs and observations. For example, high-frequency (hours to one day) time scale information in observed ozone is not well captured by deterministic models and its incorporation into model pe...
On the Identification of Ozone Recovery
NASA Astrophysics Data System (ADS)
Stone, Kane A.; Solomon, Susan; Kinnison, Douglas E.
2018-05-01
As ozone depleting substances decline, stratospheric ozone is displaying signs of healing in the Antarctic lower stratosphere. Here we focus on higher altitudes and the global stratosphere. Two key processes that can influence ozone recovery are evaluated: dynamical variability and solar proton events (SPEs). A nine-member ensemble of free-running simulations indicates that dynamical variability dominates the relatively small ozone recovery signal over 1998-2016 in the subpolar lower stratosphere, particularly near the tropical tropopause. The absence of observed recovery there to date is therefore not unexpected. For the upper stratosphere, high latitudes (50-80°N/S) during autumn and winter show the largest recovery. Large halogen-induced odd oxygen loss there provides a fingerprint of seasonal sensitivity to chlorine trends. However, we show that SPEs also have a profound effect on ozone trends within this region since 2000. Thus, accounting for SPEs is important for detection of recovery in the upper stratosphere.
NASA Astrophysics Data System (ADS)
Simon, E.; Bertino, L.; Samuelsen, A.
2011-12-01
Combined state-parameter estimation in ocean biogeochemical models with ensemble-based Kalman filters is a challenging task due to the non-linearity of the models, the positiveness constraints that apply to the variables and parameters, and the resulting non-Gaussian distributions of the variables. Furthermore, these models are sensitive to numerous parameters that are poorly known. Previous work [1] demonstrated that Gaussian anamorphosis extensions of ensemble-based Kalman filters are relevant tools for combined state-parameter estimation in such a non-Gaussian framework. In this study, we focus on the estimation of the grazing preference parameters of zooplankton species. These parameters are introduced to model the diet of zooplankton species among phytoplankton species and detritus. They are positive values and their sum is equal to one. Because the sum-to-one constraint cannot be handled by ensemble-based Kalman filters, a reformulation of the parameterization is proposed. We investigate two types of changes of variables for the estimation of sum-to-one constrained parameters. The first is based on Gelman [2] and leads to the estimation of normally distributed parameters. The second is based on the representation of the unit sphere in spherical coordinates and leads to the estimation of parameters with bounded distributions (triangular or uniform). These formulations are illustrated and discussed in the framework of twin experiments realized in the 1D coupled model GOTM-NORWECOM with Gaussian anamorphosis extensions of the deterministic ensemble Kalman filter (DEnKF). [1] Simon E., Bertino L.: Gaussian anamorphosis extension of the DEnKF for combined state and parameter estimation: application to a 1D ocean ecosystem model. Journal of Marine Systems, 2011. doi:10.1016/j.jmarsys.2011.07.007 [2] Gelman A.: Method of Moments Using Monte Carlo Simulation. Journal of Computational and Graphical Statistics, 4, 1, 36-54, 1995.
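The second change of variables mentioned above, via the unit sphere, can be illustrated with a short sketch: squared spherical coordinates are positive and sum to one by construction, so unconstrained angles can stand in for the constrained grazing preferences inside the filter. This illustrates the idea only; it is not the authors' code.

```python
import numpy as np

def angles_to_simplex(thetas):
    """Map k unconstrained angles to a (k+1)-vector of positive parameters
    summing to one, via squared spherical coordinates on the unit sphere:
    p1 = cos^2(t1), p2 = sin^2(t1) cos^2(t2), ..., p_{k+1} = prod sin^2(ti).
    Sketch of the sum-to-one reparameterization idea, not the paper's code.
    """
    thetas = np.asarray(thetas, dtype=float)
    p = []
    sin_prod = 1.0
    for t in thetas:
        p.append((sin_prod * np.cos(t)) ** 2)
        sin_prod *= np.sin(t)
    p.append(sin_prod ** 2)
    return np.array(p)
```

Because the constraint holds identically for any angles, the ensemble filter can update the angles freely while the implied diet fractions remain a valid probability vector.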
Calibration of limited-area ensemble precipitation forecasts for hydrological predictions
NASA Astrophysics Data System (ADS)
Diomede, Tommaso; Marsigli, Chiara; Montani, Andrea; Nerozzi, Fabrizio; Paccagnella, Tiziana
2015-04-01
The main objective of this study is to investigate the impact of calibration on limited-area ensemble precipitation forecasts used to drive discharge predictions up to 5 days in advance. A 30-year reforecast dataset based on the Consortium for Small Scale Modeling Limited-Area Ensemble Prediction System (COSMO-LEPS) was used for testing the calibration strategy. Three calibration techniques were applied: quantile-to-quantile mapping, linear regression, and analogs. The performance of these methodologies was evaluated in terms of statistical scores for the precipitation forecasts operationally provided by COSMO-LEPS in the years 2003-2007 over Germany, Switzerland, and the Emilia-Romagna region (northern Italy). The analog-based method appeared preferable because of its capability to correct position errors and spread deficiencies. A suitable spatial domain for the analog search can help to handle model spatial errors as systematic errors. However, the performance of the analog-based method may degrade when only a limited training dataset is available; a sensitivity test on the length of the training dataset over which to perform the analog search was therefore carried out. The quantile-to-quantile mapping and linear regression methods were less effective, mainly because the forecast-analysis relation was not strong for the available training dataset. A comparison between calibration based on the deterministic reforecast and calibration based on the full operational ensemble used as training dataset was also considered, with the aim of evaluating whether reforecasts are really worthwhile for calibration, given their considerable computational cost. The verification of the calibration process was then performed by coupling ensemble precipitation forecasts with a distributed rainfall-runoff model.
This test was carried out for a medium-sized catchment located in Emilia-Romagna, showing a beneficial impact of the analog-based method on the reduction of missed events for discharge predictions.
NASA Astrophysics Data System (ADS)
Ravazzani, Giovanni; Amengual, Arnau; Ceppi, Alessandro; Romero, Romualdo; Homar, Victor; Mancini, Marco
2015-04-01
Analysis of forecasting strategies that can provide a tangible basis for flood early warning procedures and mitigation measures over the Western Mediterranean region is one of the fundamental motivations of the European HyMeX programme. Here, we examine a set of hydro-meteorological episodes that affected the Milano urban area, for which the city's complex flood protection system did not fully cope with the flash floods that occurred. Indeed, flood damages have increased exponentially in the area during the last 60 years due to industrial and urban development. Improving the Milano flood control system thus requires a synergy between structural and non-structural approaches. The flood forecasting system tested in this work comprises the Flash-flood Event-based Spatially distributed rainfall-runoff Transformation, including Water Balance (FEST-WB) and the Weather Research and Forecasting (WRF) models, combined to provide a hydrological ensemble prediction system (HEPS). Deterministic and probabilistic quantitative precipitation forecasts (QPFs) have been provided by the WRF model in a set of 48-hour experiments. The HEPS has been generated by combining different physical parameterizations (i.e. cloud microphysics, moist convection and boundary-layer schemes) of the WRF model in order to better encompass the atmospheric processes leading to high precipitation amounts. This allowed us to test the value of a probabilistic versus a deterministic framework for driving Quantitative Discharge Forecasts (QDFs). Results highlight (i) the benefits of using a high-resolution HEPS in conveying uncertainties for this complex orographic area and (ii) a better simulation of most of the extreme precipitation events, potentially enabling valuable probabilistic QDFs. Hence, the HEPS copes with the significant deficiencies found in the deterministic QPFs.
These shortcomings prevent correct forecasting of the location and timing of high precipitation rates and total amounts at the catchment scale, heavily impacting the deterministic QDFs. In contrast, early warnings would have been possible within a HEPS context for the Milano area, proving the suitability of such a system for civil protection purposes.
NASA Astrophysics Data System (ADS)
Qi, Wei; Liu, Junguo; Yang, Hong; Sweetapple, Chris
2018-03-01
Global precipitation products are very important datasets for flow simulations, especially in poorly gauged regions. Uncertainties arising from precipitation products, hydrological models and their combinations vary with time and data magnitude, and undermine their application to flow simulations. However, previous studies have not quantified these uncertainties individually and explicitly. This study developed an ensemble-based dynamic Bayesian averaging approach (e-Bay) for deterministic discharge simulations using multiple global precipitation products and hydrological models. In this approach, the joint probability of precipitation products and hydrological models being correct is quantified based on uncertainties in maximum and mean estimation, posterior probability is quantified as a function of the magnitude and timing of discharges, and the law of total probability is applied to calculate expected discharges. Six global fine-resolution precipitation products and two hydrological models of different complexities are included in an illustrative application. e-Bay can effectively quantify uncertainties and therefore generates better deterministic discharges than traditional approaches (weighted-average methods with equal or varying weights, and a maximum likelihood approach). The mean Nash-Sutcliffe Efficiency values of e-Bay are up to 0.97 and 0.85 in the training and validation periods respectively, at least 0.06 and 0.13 higher than the traditional approaches. In addition, with increased training data, the assessment criteria of e-Bay fluctuate less than those of the traditional approaches, and its performance remains superior. The proposed e-Bay approach bridges the gap between global precipitation products and their pragmatic application to discharge simulations, and is beneficial to water resources management in ungauged or poorly gauged regions across the world.
NASA Astrophysics Data System (ADS)
Žabkar, Rahela; Koračin, Darko; Rakovec, Jože
2013-10-01
A high ozone (O3) concentration episode during a heat wave event in the Northeastern Mediterranean was investigated using the WRF/Chem model. To understand the major model uncertainties and errors, as well as the impacts of model inputs on model accuracy, an ensemble modelling experiment was conducted. The 51-member ensemble was designed by varying model physics parameterization options (PBL schemes with different surface-layer and land-surface modules, and radiation schemes); chemical initial and boundary conditions; anthropogenic and biogenic emission inputs; and model domain setup and resolution. The main impacts of the geographical and emission characteristics of three distinct regions (suburban Mediterranean, continental urban, and continental rural) on model accuracy and O3 predictions were investigated. In spite of the large ensemble size, the model generally failed to simulate the extremes; however, as expected from probabilistic forecasting, the ensemble spread improved results for the extremes compared with the reference run. Noticeable nighttime overestimations at the Mediterranean and some urban and rural sites can be explained by too-strong simulated winds, which reduce the impact of dry deposition and O3 titration in the near-surface layers during the night. Another possible explanation could be inaccuracies in the chemical mechanisms, which are also suggested by the model's insensitivity to variations in nitrogen oxide (NOx) and volatile organic compound (VOC) emissions. Major factors in the underestimation of daytime O3 maxima at the Mediterranean and some rural sites include overestimation of PBL depths, a lack of information on forest fires, too-strong surface winds, and possible inaccuracies in biogenic emissions. This numerical experiment with ensemble runs also provided guidance on an optimum model setup and input data.
NASA Technical Reports Server (NTRS)
Silva, Raquel A.; West, J. Jason; Lamarque, Jean-Francois; Shindell, Drew T.; Collins, William J.; Dalsoren, Stig; Faluvegi, Greg; Folberth, Gerd; Horowitz, Larry W.; Nagashima, Tatsuya;
2016-01-01
Ambient air pollution from ground-level ozone and fine particulate matter (PM2.5) is associated with premature mortality. Future concentrations of these air pollutants will be driven by natural and anthropogenic emissions and by climate change. Using anthropogenic and biomass burning emissions projected in the four Representative Concentration Pathway scenarios (RCPs), the ACCMIP ensemble of chemistry-climate models simulated future concentrations of ozone and PM2.5 at selected decades between 2000 and 2100. We use output from the ACCMIP ensemble, together with projections of future population and baseline mortality rates, to quantify the human premature mortality impacts of future ambient air pollution. Future air-pollution-related premature mortality in 2030, 2050 and 2100 is estimated for each scenario and for each model using a health impact function based on changes in concentrations of ozone and PM2.5 relative to 2000 and projected future population and baseline mortality rates. Additionally, the global mortality burden of ozone and PM2.5 in 2000 and each future period is estimated relative to 1850 concentrations, using present-day and future population and baseline mortality rates. The change in future ozone concentrations relative to 2000 is associated with excess global premature mortality in some scenarios/periods, particularly in RCP8.5 in 2100 (316 thousand deaths per year), likely driven by the large increase in methane emissions and by the net effect of climate change projected in this scenario, but it leads to considerable avoided premature mortality for the three other RCPs. However, the global mortality burden of ozone markedly increases from 382,000 (121,000 to 728,000) deaths per year in 2000 to between 1.09 and 2.36 million deaths per year in 2100, across RCPs, mostly due to the effect of increases in population and baseline mortality rates.
PM2.5 concentrations decrease relative to 2000 in all scenarios, due to projected reductions in emissions, and are associated with avoided premature mortality, particularly in 2100: between -2.39 and -1.31 million deaths per year for the four RCPs. The global mortality burden of PM2.5 is estimated to decrease from 1.70 (1.30 to 2.10) million deaths per year in 2000 to between 0.95 and 1.55 million deaths per year in 2100 for the four RCPs, due to the combined effect of decreases in PM2.5 concentrations and changes in population and baseline mortality rates. Trends in future air-pollution-related mortality vary regionally across scenarios, reflecting assumptions for economic growth and air pollution control specific to each RCP and region. Mortality estimates differ among chemistry-climate models due to differences in simulated pollutant concentrations, which is the greatest contributor to overall mortality uncertainty for most cases assessed here, supporting the use of model ensembles to characterize uncertainty. Increases in exposed population and baseline mortality rates of respiratory diseases magnify the impact on premature mortality of changes in future air pollutant concentrations and explain why the future global mortality burden of air pollution can exceed the current burden, even where air pollutant concentrations decrease.
NASA Astrophysics Data System (ADS)
Rowley, C. D.; Hogan, P. J.; Martin, P.; Thoppil, P.; Wei, M.
2017-12-01
An extended range ensemble forecast system is being developed in the US Navy Earth System Prediction Capability (ESPC), and a global ocean ensemble generation capability to represent uncertainty in the ocean initial conditions has been developed. At extended forecast times, the uncertainty due to the model error overtakes the initial condition as the primary source of forecast uncertainty. Recently, stochastic parameterization or stochastic forcing techniques have been applied to represent the model error in research and operational atmospheric, ocean, and coupled ensemble forecasts. A simple stochastic forcing technique has been developed for application to US Navy high resolution regional and global ocean models, for use in ocean-only and coupled atmosphere-ocean-ice-wave ensemble forecast systems. Perturbation forcing is added to the tendency equations for state variables, with the forcing defined by random 3- or 4-dimensional fields with horizontal, vertical, and temporal correlations specified to characterize different possible kinds of error. Here, we demonstrate the stochastic forcing in regional and global ensemble forecasts with varying perturbation amplitudes and length and time scales, and assess the change in ensemble skill measured by a range of deterministic and probabilistic metrics.
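A generic version of such correlated stochastic forcing can be sketched by smoothing white noise with a Gaussian kernel for horizontal correlation and evolving it as an AR(1) process for temporal correlation. The function below is an illustrative sketch with invented names, scales and normalization; it is not the Navy ESPC implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def stochastic_forcing(shape, n_steps, length_scale, tau, amplitude, seed=0):
    """Generate a sequence of 2-D perturbation fields with prescribed
    horizontal and temporal correlations: white noise is smoothed with a
    Gaussian kernel (length_scale, in grid points) and evolved as an AR(1)
    process with decorrelation time tau (in steps). Generic sketch only.
    """
    rng = np.random.default_rng(seed)
    alpha = np.exp(-1.0 / tau)               # AR(1) coefficient
    fields = np.empty((n_steps,) + shape)
    f = gaussian_filter(rng.standard_normal(shape), length_scale)
    for k in range(n_steps):
        noise = gaussian_filter(rng.standard_normal(shape), length_scale)
        f = alpha * f + np.sqrt(1.0 - alpha**2) * noise   # correlated in time
        fields[k] = amplitude * f / max(f.std(), 1e-12)   # fix the amplitude
    return fields
```

Fields like these would then be added to the tendency equations of the model state variables, with length_scale, tau and amplitude tuned to the kind of model error being represented.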
NASA Astrophysics Data System (ADS)
Seko, Hiromu; Kunii, Masaru; Yokota, Sho; Tsuyuki, Tadashi; Miyoshi, Takemasa
2015-12-01
Experiments simulating intense vortices associated with tornadoes that occurred on 6 May 2012 on the Kanto Plain, Japan, were performed with a nested local ensemble transform Kalman filter (LETKF) system. Intense vortices were reproduced by downscale experiments with a 12-member ensemble in which the initial conditions were obtained from the nested LETKF system analyses. The downscale experiments successfully generated intense vortices in three regions similar to the observed vortices, whereas only one tornado was reproduced by a deterministic forecast. The intense vorticity of the strongest tornado, which was observed in the southernmost region, was successfully reproduced by 10 of the 12 ensemble members. An examination of the results of the ensemble downscale experiments showed that the duration of intense vorticities tended to be longer when the vertical shear of the horizontal wind was larger and the lower airflow was more humid. Overall, the study results show that ensemble forecasts have the following merits: (1) probabilistic forecasts of the outbreak of intense vortices associated with tornadoes are possible; (2) the miss rate of outbreaks should decrease; and (3) environmental factors favoring outbreaks can be obtained by comparing the multiple possible scenarios of the ensemble forecasts.
NASA Astrophysics Data System (ADS)
Zheng, Fei; Zhu, Jiang
2017-04-01
How to design a reliable ensemble prediction strategy with considering the major uncertainties of a forecasting system is a crucial issue for performing an ensemble forecast. In this study, a new stochastic perturbation technique is developed to improve the prediction skills of El Niño-Southern Oscillation (ENSO) through using an intermediate coupled model. We first estimate and analyze the model uncertainties from the ensemble Kalman filter analysis results through assimilating the observed sea surface temperatures. Then, based on the pre-analyzed properties of model errors, we develop a zero-mean stochastic model-error model to characterize the model uncertainties mainly induced by the missed physical processes of the original model (e.g., stochastic atmospheric forcing, extra-tropical effects, Indian Ocean Dipole). Finally, we perturb each member of an ensemble forecast at each step by the developed stochastic model-error model during the 12-month forecasting process, and add the zero-mean perturbations into the physical fields to mimic the presence of missing processes and high-frequency stochastic noises. The impacts of stochastic model-error perturbations on ENSO deterministic predictions are examined by performing two sets of 21-yr hindcast experiments, which are initialized from the same initial conditions and differentiated by whether they consider the stochastic perturbations. The comparison results show that the stochastic perturbations have a significant effect on improving the ensemble-mean prediction skills during the entire 12-month forecasting process. This improvement occurs mainly because the nonlinear terms in the model can form a positive ensemble-mean from a series of zero-mean perturbations, which reduces the forecasting biases and then corrects the forecast through this nonlinear heating mechanism.
Preserving the Boltzmann ensemble in replica-exchange molecular dynamics.
Cooke, Ben; Schmidler, Scott C
2008-10-28
We consider the convergence behavior of replica-exchange molecular dynamics (REMD) [Sugita and Okamoto, Chem. Phys. Lett. 314, 141 (1999)] based on properties of the numerical integrators in the underlying isothermal molecular dynamics (MD) simulations. We show that a variety of deterministic algorithms favored by molecular dynamics practitioners for constant-temperature simulation of biomolecules fail either to be measure invariant or irreducible, and are therefore not ergodic. We then show that REMD using these algorithms also fails to be ergodic. As a result, the entire configuration space may not be explored even in an infinitely long simulation, and the simulation may not converge to the desired equilibrium Boltzmann ensemble. Moreover, our analysis shows that for initial configurations with unfavorable energy, it may be impossible for the system to reach a region surrounding the minimum energy configuration. We demonstrate these failures of REMD algorithms for three small systems: a Gaussian distribution (simple harmonic oscillator dynamics), a bimodal mixture of Gaussians distribution, and the alanine dipeptide. Examination of the resulting phase plots and equilibrium configuration densities indicates significant errors in the ensemble generated by REMD simulation. We describe a simple modification to address these failures based on a stochastic hybrid Monte Carlo correction, and prove that this is ergodic.
NASA Astrophysics Data System (ADS)
Shen, L.; Mickley, L. J.; Gilleland, E.
2016-04-01
We develop a statistical model using extreme value theory to estimate the 2000-2050 changes in ozone episodes across the United States. We model the relationships between daily maximum temperature (Tmax) and maximum daily 8 h average (MDA8) ozone in May-September over 2003-2012 using a Point Process (PP) model. At ~20% of the sites, a marked decrease in the ozone-temperature slope occurs at high temperatures, defined as ozone suppression. The PP model sometimes fails to capture ozone-Tmax relationships, so we refit the ozone-Tmax slope using logistic regression and a generalized Pareto distribution model. We then apply the resulting hybrid extreme value theory model to projections of Tmax from an ensemble of downscaled climate models. Assuming constant anthropogenic emissions at the present level, we find an average increase of 2.3 d a⁻¹ in ozone episodes (>75 ppbv) across the United States by the 2050s, with a change of +3 to 9 d a⁻¹ at many sites.
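The peaks-over-threshold step behind the generalized Pareto component can be sketched as follows; the synthetic ozone series, threshold choice, and episode level below are illustrative placeholders, not the study's fitted values.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# synthetic daily MDA8 ozone series (ppbv); values above a high threshold
# are treated as the extreme "episodes"
ozone = rng.gamma(shape=9.0, scale=5.0, size=3000)
threshold = np.quantile(ozone, 0.95)
excesses = ozone[ozone > threshold] - threshold

# fit a generalized Pareto distribution to the threshold excesses
# (peaks-over-threshold; location fixed at zero)
shape, loc, scale = genpareto.fit(excesses, floc=0.0)

# unconditional probability that ozone exceeds an episode level, e.g. 75 ppbv:
# P(exceed threshold) * P(excess > level - threshold | exceed threshold)
level = 75.0
p_exceed = (excesses.size / ozone.size) * genpareto.sf(level - threshold,
                                                       shape, loc=0.0, scale=scale)
print(round(p_exceed, 4))
```

In the study's hybrid model this tail fit is combined with a logistic-regression slope; the sketch shows only the tail-probability machinery.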
NASA Astrophysics Data System (ADS)
Velázquez, Juan Alberto; Anctil, François; Ramos, Maria-Helena; Perrin, Charles
2010-05-01
An ensemble forecasting system seeks to assess and communicate the uncertainty of hydrological predictions by proposing, at each time step, an ensemble of forecasts from which one can estimate the probability distribution of the predictand (the probabilistic forecast), in contrast with a single estimate of the flow, for which no distribution is obtainable (the deterministic forecast). In recent years, efforts towards the development of probabilistic hydrological prediction systems were made with the adoption of ensembles of numerical weather predictions (NWPs). The additional information provided by the different available Ensemble Prediction Systems (EPS) was evaluated in a hydrological context in various case studies (see the review by Cloke and Pappenberger, 2009). For example, the European ECMWF-EPS was explored in case studies by Roulin et al. (2005), Bartholmes et al. (2005), Jaun et al. (2008), and Renner et al. (2009). The Canadian EC-EPS was also evaluated by Velázquez et al. (2009). Most of these case studies investigate the ensemble predictions of a given hydrological model, set up over a limited number of catchments. Uncertainty from weather predictions is assessed through the use of meteorological ensembles. However, uncertainty from the tested hydrological model and the statistical robustness of the forecasting system when coping with different hydro-meteorological conditions are less frequently evaluated. The aim of this study is to evaluate and compare the performance and reliability of 18 lumped hydrological models applied to a large number of catchments in an operational ensemble forecasting context. Some of these models were evaluated in a previous study (Perrin et al., 2001) for their ability to simulate streamflow. Results demonstrated that very simple models can achieve a level of performance almost as high as (and sometimes higher than) models with more parameters.
In the present study, we focus on the ability of the hydrological models to provide reliable probabilistic forecasts of streamflow, based on ensemble weather predictions. The models were therefore adapted to run in forecasting mode, i.e., to update initial conditions according to the last observed discharge at the time of the forecast, and to cope with ensemble weather scenarios. All models are lumped, i.e., the hydrological behavior is integrated over the spatial scale of the catchment, and run at a daily time step. The complexity of the tested models varies between 3 and 13 parameters. The models are tested on 29 French catchments. Daily streamflow time series extend over 17 months, from March 2005 to July 2006. Catchment areas range between 1470 km² and 9390 km², and represent a variety of hydrological and meteorological conditions. The 12 UTC 10-day ECMWF rainfall ensemble (51 members) was used, which led to daily streamflow forecasts for a 9-day lead time. To assess the performance and reliability of the hydrological ensemble predictions, we computed the Continuous Ranked Probability Score (CRPS) (Matheson and Winkler, 1976), as well as the reliability diagram (e.g. Wilks, 1995) and the rank histogram (Talagrand et al., 1999). Since the ECMWF deterministic forecasts are also available, the performance of the hydrological forecasting systems was further evaluated by comparing the deterministic score (MAE) with the probabilistic score (CRPS). The results obtained for the 18 hydrological models and the 29 studied catchments are discussed in the perspective of improving the operational use of ensemble forecasting in hydrology. References Bartholmes, J. and Todini, E.: Coupling meteorological and hydrological models for flood forecasting, Hydrol. Earth Syst. Sci., 9, 333-346, 2005. Cloke, H. and Pappenberger, F.: Ensemble Flood Forecasting: A Review, Journal of Hydrology, 375 (3-4), 613-626, 2009.
Jaun, S., Ahrens, B., Walser, A., Ewen, T., and Schär, C.: A probabilistic view on the August 2005 floods in the upper Rhine catchment, Nat. Hazards Earth Syst. Sci., 8, 281-291, 2008. Matheson, J. E. and Winkler, R. L.: Scoring rules for continuous probability distributions, Manage Sci., 22, 1087-1096, 1976. Perrin, C., Michel C. and Andréassian,V. Does a large number of parameters enhance model performance? Comparative assessment of common catchment model structures on 429 catchments, J. Hydrol., 242, 275-301, 2001. Renner, M., Werner, M. G. F., Rademacher, S., and Sprokkereef, E.: Verification of ensemble flow forecast for the River Rhine, J. Hydrol., 376, 463-475, 2009. Roulin, E. and Vannitsem, S.: Skill of medium-range hydrological ensemble predictions, J. Hydrometeorol., 6, 729-744, 2005. Talagrand, O., Vautard, R., and Strauss, B.: Evaluation of the probabilistic prediction systems, in: Proceedings, ECMWF Workshop on Predictability, Shinfield Park, Reading, Berkshire, ECMWF, 1-25, 1999. Velázquez, J.A., Petit, T., Lavoie, A., Boucher M.-A., Turcotte R., Fortin V., and Anctil, F. : An evaluation of the Canadian global meteorological ensemble prediction system for short-term hydrological forecasting, Hydrol. Earth Syst. Sci., 13, 2221-2231, 2009. Wilks, D. S.: Statistical Methods in the Atmospheric Sciences, Academic Press, San Diego, CA, 465 pp., 1995.
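The MAE-versus-CRPS comparison described in the abstract above can be illustrated with the standard sample-based CRPS estimator; the toy ensemble and observation are invented for demonstration.

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'| over ensemble members X."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2

obs = 10.0
ens = np.array([9.0, 10.0, 11.0])   # a toy 3-member streamflow forecast
print(crps_ensemble(ens, obs))      # CRPS of the ensemble forecast
print(abs(ens.mean() - obs))        # MAE of the deterministic (ensemble-mean) forecast
```

For a one-member (deterministic) forecast the second term vanishes and the CRPS reduces to the absolute error, which is what makes the MAE/CRPS comparison in the abstract consistent.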
Diagnostics of sources of tropospheric ozone using data assimilation during the KORUS-AQ campaign
NASA Astrophysics Data System (ADS)
Gaubert, B.; Emmons, L. K.; Miyazaki, K.; Buchholz, R. R.; Tang, W.; Arellano, A. F., Jr.; Tilmes, S.; Barré, J.; Worden, H. M.; Raeder, K.; Anderson, J. L.; Edwards, D. P.
2017-12-01
Atmospheric oxidative capacity plays a crucial role in the fate of greenhouse gases and air pollutants as well as in the formation of secondary pollutants such as tropospheric ozone. The attribution of sources of tropospheric ozone is a difficult task because of biases in input parameters and forcings such as emissions and meteorology in addition to errors in chemical schemes. We assimilate satellite remote sensing observations of ozone precursors such as carbon monoxide (CO) and nitrogen dioxide (NO2) in the global coupled chemistry-transport model: Community Atmosphere Model with Chemistry (CAM-Chem). The assimilation is completed using an Ensemble Adjustment Kalman Filter (EAKF) in the Data Assimilation Research Testbed (DART) framework which allows estimates of unobserved parameters and potential constraints on secondary pollutants and emissions. The ensemble will be constructed using perturbations in chemical kinetics, different emission fields and by assimilating meteorological observations to fully assess uncertainties in the chemical fields of targeted species. We present a set of tools such as emission tags (CO and propane), combined with diagnostic analysis of chemical regimes and perturbation of emissions ratios to estimate a regional budget of primary and secondary pollutants in East Asia and their sensitivity to data assimilation. This study benefits from the large set of aircraft and ozonesonde in-situ observations from the Korea-United States Air Quality (KORUS-AQ) campaign that occurred in South Korea in May-June 2016.
NASA Astrophysics Data System (ADS)
Pan, Yujie; Xue, Ming; Zhu, Kefeng; Wang, Mingjun
2018-05-01
A dual-resolution (DR) version of a regional ensemble Kalman filter (EnKF)-3D ensemble variational (3DEnVar) coupled hybrid data assimilation system is implemented as a prototype for the operational Rapid Refresh forecasting system. The DR 3DEnVar system combines a high-resolution (HR) deterministic background forecast with lower-resolution (LR) EnKF ensemble perturbations used for flow-dependent background error covariance to produce an HR analysis. The computational cost is substantially reduced by running the ensemble forecasts and EnKF analyses at LR. The DR 3DEnVar system is tested with 3-h cycles over a 9-day period using a 40-km/~13-km grid spacing combination. The HR forecasts from the DR hybrid analyses are compared with forecasts launched from HR Gridpoint Statistical Interpolation (GSI) 3D variational (3DVar) analyses, and from single LR hybrid analyses interpolated to the HR grid. With the DR 3DEnVar system, a 90% weight for the ensemble covariance yields the lowest forecast errors, and the DR hybrid system clearly outperforms the HR GSI 3DVar. Humidity and wind forecasts are also better than those launched from interpolated LR hybrid analyses, but the temperature forecasts are slightly worse. The humidity forecasts are improved most. For precipitation forecasts, the DR 3DEnVar always outperforms the HR GSI 3DVar. It also outperforms the LR 3DEnVar, except for the initial forecast period and lower thresholds.
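A hedged sketch of the hybrid covariance weighting at the core of a 3DEnVar system, with toy dimensions and an identity matrix standing in for the static variational covariance (operational systems apply this blend implicitly through control-variable transforms rather than explicit matrices):

```python
import numpy as np

rng = np.random.default_rng(1)
n, ne = 4, 30                        # state size and ensemble size (toy values)
members = rng.normal(size=(ne, n))   # stand-in for the LR EnKF ensemble

# flow-dependent covariance from ensemble perturbations about the mean
perts = members - members.mean(axis=0)
P_ens = perts.T @ perts / (ne - 1)

B_static = np.eye(n)                 # stand-in for the static 3DVar covariance
w = 0.9                              # 90% weight on the ensemble covariance, as above
B_hybrid = (1.0 - w) * B_static + w * P_ens
print(B_hybrid.shape)
```

The convex combination keeps `B_hybrid` symmetric positive definite as long as the static part is, even when the ensemble part is rank deficient.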
Decision Support on the Sediments Flushing of Aimorés Dam Using Medium-Range Ensemble Forecasts
NASA Astrophysics Data System (ADS)
Mainardi Fan, Fernando; Schwanenberg, Dirk; Collischonn, Walter; Assis dos Reis, Alberto; Alvarado Montero, Rodolfo; Alencar Siqueira, Vinicius
2015-04-01
In the present study we investigate the use of medium-range streamflow forecasts in the Doce River basin (Brazil), at the reservoir of the Aimorés Hydro Power Plant (HPP). During daily operations this reservoir acts as a "trap" for the sediments that originate from the upstream Doce River basin. This motivates a cleaning process called "pass through" to periodically remove the sediments from the reservoir. The "pass through" or "sediment flushing" process consists of lowering the reservoir's water level to a certain flushing level when a given reservoir inflow threshold is forecasted. The water in the approaching inflow is then used to flush the sediments from the reservoir through the spillway and to recover the original reservoir storage. To be triggered, the sediment flushing operation requires an inflow larger than 3000 m³/s within a forecast horizon of 7 days. This lead time of 7 days is far beyond the basin's time of concentration (around 2 days), meaning that the forecasts for the pass-through procedure depend heavily on Numerical Weather Prediction (NWP) models that generate Quantitative Precipitation Forecasts (QPF). This dependency creates an environment with a high amount of uncertainty for the operator. To support decision making at Aimorés HPP we developed a fully operational hydrological forecasting system for the basin. The system is capable of generating ensemble streamflow forecast scenarios when driven by QPF data from meteorological Ensemble Prediction Systems (EPS). This approach allows accounting for uncertainties in the NWP at the decision-making level. The system is entering operational use at CEMIG and is the one presented in this study, including a hindcasting analysis to assess its performance for the specific flushing problem. The QPF data used in the hindcasting study were derived from the TIGGE (THORPEX Interactive Grand Global Ensemble) database.
Among all the EPS available on TIGGE, three were selected: ECMWF, GEFS, and CPTEC. As a deterministic reference, we adopt the high-resolution ECMWF forecast for comparison. The experiment consisted of running retrospective forecasts over a full five-year period. To address the objectives of the study, we evaluate the forecasts with several metrics: ROC curves, exceedance diagrams, and the Forecast Convergence Score (FCS). These metrics made it possible to assess the benefits of the hydrological ensemble prediction system as a decision-making tool for the HPP operation. The ROC scores indicate that using the lower percentiles of the ensemble scenarios yields a hit (true alarm) rate of around 0.5 to 0.8 (depending on the model and on the percentile) at the seven-day lead time, while the false alarm rate lies between 0 and 0.3. These rates were better than those of the deterministic reference forecast. Exceedance diagrams and Forecast Convergence Scores indicate that the ensemble scenarios provide an early signal of the threshold crossing. Furthermore, the ensemble forecasts are more consistent between two subsequent issues than the deterministic forecast. The assessment results also lend more credibility to CEMIG in carrying out the flushing operation and communicating it to the stakeholders involved.
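The ROC analysis above reduces, for each chosen percentile, to a contingency table of warnings against observed threshold crossings; in this sketch the 3000 m³/s trigger is taken from the text, but the forecast and observation values are invented.

```python
import numpy as np

def roc_point(ens_fcsts, obs, threshold, percentile):
    """One ROC point: warn whenever the chosen ensemble percentile crosses the threshold."""
    fcst = np.percentile(ens_fcsts, percentile, axis=1)  # one value per forecast case
    warn = fcst >= threshold
    event = obs >= threshold
    hits = np.sum(warn & event)
    misses = np.sum(~warn & event)
    false_alarms = np.sum(warn & ~event)
    correct_negatives = np.sum(~warn & ~event)
    hit_rate = hits / max(hits + misses, 1)
    false_alarm_rate = false_alarms / max(false_alarms + correct_negatives, 1)
    return hit_rate, false_alarm_rate

# four invented forecast cases with five members each (m³/s)
ens = np.array([[3100.0] * 5, [2900.0] * 5, [3300.0] * 5, [500.0] * 5])
obs = np.array([3500.0, 2000.0, 3200.0, 1000.0])
hr, far = roc_point(ens, obs, threshold=3000.0, percentile=25)
print(hr, far)
```

Sweeping the percentile from low to high traces out the full ROC curve, which is how "using the lower percentiles" trades a higher hit rate against more false alarms.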
NASA Astrophysics Data System (ADS)
Hess, P.; Kinnison, D.; Tang, Q.
2015-03-01
Despite the need to understand the impact of changes in emissions and climate on tropospheric ozone, the attribution of tropospheric interannual ozone variability to specific processes has proven difficult. Here, we analyze the stratospheric contribution to tropospheric ozone variability and trends from 1953 to 2005 in the Northern Hemisphere (NH) mid-latitudes using four ensemble simulations of the free running (FR) Whole Atmosphere Community Climate Model (WACCM). The simulations are externally forced with observed time-varying (1) sea-surface temperatures (SSTs), (2) greenhouse gases (GHGs), (3) ozone depleting substances (ODS), (4) quasi-biennial oscillation (QBO), (5) solar variability (SV) and (6) stratospheric sulfate surface area density (SAD). A detailed representation of stratospheric chemistry is simulated, including the ozone loss due to volcanic eruptions and polar stratospheric clouds. In the troposphere, ozone production is represented by CH4-NOx smog chemistry, where surface chemical emissions remain interannually constant. Despite the simplicity of its tropospheric chemistry, at many NH measurement locations, the interannual ozone variability in the FR WACCM simulations is significantly correlated with the measured interannual variability. This suggests the importance of the external forcing applied in these simulations in driving interannual ozone variability. The variability and trend in the simulated 1953-2005 tropospheric ozone from 30 to 90° N at background surface measurement sites, 500 hPa measurement sites and in the area average are largely explained on interannual timescales by changes in the 30-90° N area averaged flux of ozone across the 100 hPa surface and changes in tropospheric methane concentrations. 
The average sensitivity of tropospheric ozone to methane (percent change in ozone to a percent change in methane) from 30 to 90° N is 0.17 at 500 hPa and 0.21 at the surface; the average sensitivity of tropospheric ozone to the 100 hPa ozone flux (percent change in ozone to a percent change in the ozone flux) from 30 to 90° N is 0.19 at 500 hPa and 0.11 at the surface. The 30-90° N simulated downward residual velocity at 100 hPa increased by 15% between 1953 and 2005. However, the impact of this on the 30-90° N 100 hPa ozone flux is modulated by the long-term changes in stratospheric ozone. The ozone flux decreases from 1965 to 1990 due to stratospheric ozone depletion, but increases again by approximately 7% from 1990 to 2005. The first empirical orthogonal function of interannual ozone variability explains from 40% (at the surface) to over 80% (at 150 hPa) of the simulated ozone interannual variability from 30 to 90° N. This identified mode of ozone variability shows strong stratosphere-troposphere coupling, demonstrating the importance of the stratosphere in an attribution of tropospheric ozone variability. The simulations, with no change in emissions, capture almost 50% of the measured ozone change during the 1990s at a variety of locations. This suggests that a large portion of the measured change is not due to changes in emissions, but can be traced to changes in large-scale modes of ozone variability. This emphasizes the difficulty in the attribution of ozone changes, and the importance of natural variability in understanding the trends and variability of ozone. We find little relation between the El Niño-Southern Oscillation (ENSO) index and large-scale tropospheric ozone variability over the long-term record.
Tropical pacing of Antarctic sea ice increase
NASA Astrophysics Data System (ADS)
Schneider, D. P.
2015-12-01
One reason why coupled climate model simulations generally do not reproduce the observed increase in Antarctic sea ice extent may be that their internally generated climate variability does not sync with the observed phases of phenomena like the Pacific Decadal Oscillation (PDO) and ENSO. For example, it is unlikely for a free-running coupled model simulation to capture the shift of the PDO from its positive to negative phase during 1998, and the subsequent ~15 year duration of the negative PDO phase. In previously presented work based on atmospheric models forced by observed tropical SSTs and stratospheric ozone, we demonstrated that tropical variability is key to explaining the wind trends over the Southern Ocean during the past ~35 years, particularly in the Ross, Amundsen and Bellingshausen Seas, the regions of the largest trends in sea ice extent and ice season duration. Here, we extend this idea to coupled model simulations with the Community Earth System Model (CESM) in which the evolution of SST anomalies in the central and eastern tropical Pacific is constrained to match the observations. This ensemble of 10 "tropical pacemaker" simulations shows a more realistic evolution of Antarctic sea ice anomalies than does its unconstrained counterpart, the CESM Large Ensemble (both sets of runs include stratospheric ozone depletion and other time-dependent radiative forcings). In particular, the pacemaker runs show that increased sea ice in the eastern Ross Sea is associated with a deeper Amundsen Sea Low (ASL) and stronger westerlies over the south Pacific. These circulation patterns in turn are linked with the negative phase of the PDO, characterized by negative SST anomalies in the central and eastern Pacific. 
The timing of tropical decadal variability with respect to ozone depletion further suggests a strong role for tropical variability in the recent acceleration of the Antarctic sea ice trend, as ozone depletion had stabilized by the late 1990s, prior to the most recent major shift in tropical climate. In the pacemaker runs, the positive sea ice trend in the eastern Ross Sea is stronger during the most recent period (~2000-2014) than during the period of rapid ozone depletion (~1980-1996).
Fractional dynamics using an ensemble of classical trajectories
NASA Astrophysics Data System (ADS)
Sun, Zhaopeng; Dong, Hao; Zheng, Yujun
2018-01-01
A trajectory-based formulation of fractional dynamics is presented in which the trajectories are generated deterministically. Within this theoretical framework, we derive a new class of estimators in terms of the confluent hypergeometric function (₁F₁) to represent the Riesz fractional derivative. Using this method, simulations of free and confined Lévy flights are in excellent agreement with exact numerical and analytical results. In addition, barrier crossing in a bistable potential driven by Lévy noise of index α is investigated. In phase space, the behavior of the trajectories reveals the features of Lévy flight from a clearer perspective.
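For context, free Lévy flights of index α are conventionally simulated with stochastic trajectory ensembles, e.g. via the Chambers-Mallows-Stuck sampler for symmetric α-stable increments; this is the standard Monte Carlo sketch against which such results are checked, not the paper's deterministic trajectory scheme.

```python
import numpy as np

def levy_stable_steps(alpha, size, rng):
    """Symmetric alpha-stable increments via the Chambers-Mallows-Stuck method
    (valid for 0 < alpha <= 2, alpha != 1)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # unit-mean exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

rng = np.random.default_rng(3)
alpha = 1.5
steps = levy_stable_steps(alpha, (500, 1000), rng)  # 500 trajectories, 1000 steps
paths = steps.cumsum(axis=1)                        # free Lévy flights
print(paths.shape)
```

For α = 2 the increments reduce to Gaussian steps (ordinary Brownian motion); for α < 2 the heavy-tailed steps produce the characteristic long jumps of a Lévy flight.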
The effect of future outdoor air pollution on human health and the contribution of climate change
NASA Astrophysics Data System (ADS)
Silva, R.; West, J. J.; Lamarque, J.; Shindell, D.; Collins, W.; Dalsoren, S. B.; Faluvegi, G. S.; Folberth, G.; Horowitz, L. W.; Nagashima, T.; Naik, V.; Rumbold, S.; Skeie, R.; Sudo, K.; Takemura, T.; Bergmann, D. J.; Cameron-Smith, P. J.; Cionni, I.; Doherty, R. M.; Eyring, V.; Josse, B.; MacKenzie, I. A.; Plummer, D.; Righi, M.; Stevenson, D. S.; Strode, S. A.; Szopa, S.; Zeng, G.
2013-12-01
At present, exposure to outdoor air pollution from ozone and fine particulate matter (PM2.5) causes over 2 million deaths per year, due to respiratory and cardiovascular diseases and lung cancer. Future ambient concentrations of ozone and PM2.5 will be affected by both air pollutant emissions and climate change. Here we estimate the potential impact of future outdoor air pollution on premature human mortality, and isolate the contribution of future climate change through its effect on air quality. We use modeled present-day (2000) and future global ozone and PM2.5 concentrations from simulations with an ensemble of chemistry-climate models from the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP). Future air pollution was modeled for global greenhouse gas and air pollutant emissions in the four IPCC AR5 Representative Concentration Pathway (RCP) scenarios, for 2030, 2050 and 2100. All model outputs are regridded to a common 0.5°x0.5° horizontal resolution. Future premature mortality is estimated for each RCP scenario and year based on changes in concentrations of ozone and PM2.5 relative to 2000. Using a health impact function, changes in concentrations for each RCP scenario are combined with future population and cause-specific baseline mortality rates as projected by a single independent scenario in which the global incidence of cardiopulmonary diseases is expected to increase. The effect of climate change is isolated by considering the difference between air pollutant concentrations from simulations with 2000 emissions and a future-year climate and simulations with 2000 emissions and the 2000 climate. Uncertainties in the results reflect the uncertainty in the concentration-response function and that associated with variability among models. Few previous studies have quantified the effects of future climate change on global human health via changes in air quality, and this is the first such study to use an ensemble of global models.
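Health impact functions of the kind mentioned above are conventionally log-linear in the concentration change; the coefficient, baseline rate, and population below are hypothetical placeholders, not ACCMIP values.

```python
import math

def excess_mortality(beta, d_conc, baseline_rate, population):
    """Log-linear health impact function:
    delta_mortality = y0 * (1 - exp(-beta * delta_X)) * Pop,
    where AF = 1 - exp(-beta * delta_X) is the attributable fraction."""
    attributable_fraction = 1.0 - math.exp(-beta * d_conc)
    return baseline_rate * attributable_fraction * population

# hypothetical inputs: a concentration-response coefficient per ppb, a 5 ppb
# ozone increase relative to 2000, a cardiopulmonary baseline mortality rate,
# and an exposed population
deaths = excess_mortality(beta=0.004, d_conc=5.0,
                          baseline_rate=0.005, population=1_000_000)
print(round(deaths))  # → 99
```

In the study this calculation is repeated per grid cell, scenario, and cause of death, with the uncertainty in `beta` propagated into the mortality estimates.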
NASA Astrophysics Data System (ADS)
Jebri, B.; Khodri, M.; Gastineau, G.; Echevin, V.; Thiria, S.
2017-12-01
Upwelling is critical to the biological production, acidification, and deoxygenation of the ocean's major eastern boundary current ecosystems. A conceptual hypothesis suggests that the winds that favour coastal upwelling intensify with anthropogenic global warming due to an increased land-sea temperature contrast. We examine this hypothesis for the dynamics of the Peru-Chile upwelling using a set of four large ensembles of coupled ocean-atmosphere model simulations with the IPSL model covering the 1940-2014 period. In one large ensemble we prescribe the standard CMIP5 greenhouse gas (GHG) concentrations, anthropogenic aerosol, ozone and volcanic forcings, following the historical experiments through 2005 and RCP8.5 from 2006 to 2014, while the other ensembles consider the GHG, ozone and volcanic forcings separately. We find evidence for an intensification of upwelling-favourable winds, but little evidence that atmospheric pressure gradients respond to the increasing land-sea temperature difference. Our analyses reveal a poleward migration and intensification of the South Pacific Anticyclone near the poleward boundaries of the climatological Peruvian and Chilean upwelling zones. This contribution further investigates the physical mechanisms of the Peru-Chile upwelling intensification and the relative roles of natural and anthropogenic forcings.
NASA Astrophysics Data System (ADS)
Yan, Y.; Barth, A.; Beckers, J. M.; Brankart, J. M.; Brasseur, P.; Candille, G.
2017-07-01
In this paper, three incremental analysis update schemes (IAU 0, IAU 50 and IAU 100) are compared in the same assimilation experiments with a realistic eddy-permitting primitive equation model of the North Atlantic Ocean using the Ensemble Kalman Filter. The difference between the three IAU schemes lies in the position of the increment update window. The relevance of each IAU scheme is evaluated through analyses of both thermohaline and dynamical variables. The assimilation results are validated with both deterministic and probabilistic metrics against different sources of observations. For deterministic validation, the ensemble mean and the ensemble spread are compared to the observations. For probabilistic validation, the continuous ranked probability score (CRPS) is used to evaluate the ensemble forecast system in terms of reliability and resolution. The reliability is further decomposed into bias and dispersion by the reduced centred random variable (RCRV) score. The results show that (1) the IAU 50 scheme performs as well as the IAU 100 scheme; (2) the IAU 50/100 schemes outperform the IAU 0 scheme in error covariance propagation for thermohaline variables in relatively stable regions, while the IAU 0 scheme outperforms the IAU 50/100 schemes in the estimation of dynamical variables in dynamically active regions; and (3) with a sufficient number of observations and a good error specification, the impact of the IAU scheme is negligible. The differences between the IAU 0 scheme and the IAU 50/100 schemes are mainly due to the different model integration times and the different instabilities (density inversion, large vertical velocity, etc.) induced by the increment update. The longer model integration time of the IAU 50/100 schemes, especially the free model integration, on the one hand allows a better re-establishment of the equilibrium model state, and on the other hand smooths the strong gradients in dynamically active regions.
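The RCRV diagnostic used above has a simple closed form; in this sketch the synthetic "truth" and the perfectly calibrated ensemble statistics are invented, so the bias should come out near 0 and the dispersion near 1.

```python
import numpy as np

def rcrv(obs, ens_mean, ens_spread, obs_err=0.0):
    """Reduced centred random variable y = (o - m) / sqrt(s_obs^2 + s_ens^2).
    For a reliable ensemble, mean(y) ~ 0 (no bias) and std(y) ~ 1 (correct spread)."""
    y = (obs - ens_mean) / np.sqrt(obs_err**2 + ens_spread**2)
    return y.mean(), y.std(ddof=1)

rng = np.random.default_rng(7)
truth = rng.normal(0.0, 1.0, size=5000)   # synthetic verifying observations
bias, dispersion = rcrv(truth, np.zeros(5000), np.ones(5000))
print(bias, dispersion)
```

A dispersion above 1 would indicate an underdispersive (overconfident) ensemble, below 1 an overdispersive one, which is how the score separates spread errors from bias.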
NASA Astrophysics Data System (ADS)
Gagnon, Patrick; Rousseau, Alain N.; Charron, Dominique; Fortin, Vincent; Audet, René
2017-11-01
Several businesses and industries rely on rainfall forecasts to support their day-to-day operations. To deal with the uncertainty associated with rainfall forecasts, some meteorological organisations have developed products such as ensemble forecasts. However, due to the intensive computational requirements of ensemble forecasts, their spatial resolution remains coarse. For example, Environment and Climate Change Canada's (ECCC) Global Ensemble Prediction System (GEPS) data are freely available on a 1-degree grid (about 100 km), while those of the so-called High Resolution Deterministic Prediction System (HRDPS) are available on a 2.5-km grid (about 40 times finer). Potential users are then left with the option of using either a high-resolution rainfall forecast without uncertainty estimation or an ensemble with a spectrum of plausible rainfall values, but at a coarser spatial scale. The objective of this study was to evaluate the added value of coupling the Gibbs Sampling Disaggregation Model (GSDM) with ECCC products to provide accurate, precise and consistent rainfall estimates at a fine spatial resolution (10 km) within a forecast framework (6 h). For 30 6-h rainfall events occurring within a 40,000-km² area (Québec, Canada), results show that, using 100-km aggregated reference rainfall depths as input, statistics of the rainfall fields generated by GSDM were close to those of the 10-km reference field. However, in forecast mode, GSDM outcomes inherit the ECCC forecast biases, resulting in poor performance when GEPS data were used as input, mainly due to the inherent rainfall depth distribution of that product. Better performance was achieved when the Regional Deterministic Prediction System (RDPS), available on a 10-km grid and aggregated to 100 km, was used as input to GSDM. Nevertheless, most of the analyzed ensemble forecasts were weakly consistent. Some areas of improvement are identified herein.
Impacts of snow cover fraction data assimilation on modeled energy and moisture budgets
NASA Astrophysics Data System (ADS)
Arsenault, Kristi R.; Houser, Paul R.; De Lannoy, Gabriëlle J. M.; Dirmeyer, Paul A.
2013-07-01
Two data assimilation (DA) methods, a simple rule-based direct insertion (DI) approach and a one-dimensional ensemble Kalman filter (EnKF) method, are evaluated by assimilating snow cover fraction observations into the Community Land surface Model. The ensemble perturbation needed for the EnKF resulted in negative snowpack biases. Therefore, a correction is made to the ensemble bias using an approach that constrains the ensemble forecasts with a single unperturbed deterministic LSM run. This is shown to improve the final snow state analyses. The EnKF method produces slightly better results in higher elevation locations, whereas results indicate that the DI method has a performance advantage in lower elevation regions. In addition, the two DA methods are evaluated in terms of their overall impacts on the other land surface state variables (e.g., soil moisture) and fluxes (e.g., latent heat flux). The EnKF method is shown to have less impact overall than the DI method and causes less distortion of the hydrological budget. However, the land surface model adjusts more slowly to the smaller EnKF increments, which leads to smaller but slightly more persistent moisture budget errors than found with the DI updates. The DI method can remove almost instantly much of the modeled snowpack, but this also allows the model system to quickly revert to hydrological balance for nonsnowpack conditions.
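The ensemble bias correction described above, constraining the ensemble with a single unperturbed deterministic run, can be sketched as a recentering of the perturbed members; the snow water equivalent (SWE) values and bias magnitude below are hypothetical.

```python
import numpy as np

def recenter(ensemble, deterministic):
    """Shift the ensemble so its mean matches the unperturbed deterministic run,
    preserving the perturbation structure (spread) around it."""
    return ensemble - ensemble.mean(axis=0, keepdims=True) + deterministic

rng = np.random.default_rng(5)
swe_det = np.full(10, 50.0)                               # deterministic SWE (mm), invented
swe_ens = swe_det + rng.normal(-5.0, 3.0, size=(20, 10))  # biased-low perturbed ensemble
corrected = recenter(swe_ens, swe_det)
print(np.allclose(corrected.mean(axis=0), swe_det))  # → True
```

Only the ensemble mean is constrained; the member-to-member spread that the EnKF needs for its error covariance is left intact.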
Quantum entanglement at ambient conditions in a macroscopic solid-state spin ensemble.
Klimov, Paul V; Falk, Abram L; Christle, David J; Dobrovitski, Viatcheslav V; Awschalom, David D
2015-11-01
Entanglement is a key resource for quantum computers, quantum-communication networks, and high-precision sensors. Macroscopic spin ensembles have been historically important in the development of quantum algorithms for these prospective technologies and remain strong candidates for implementing them today. This strength derives from their long-lived quantum coherence, strong signal, and ability to couple collectively to external degrees of freedom. Nonetheless, preparing ensembles of genuinely entangled spin states has required high magnetic fields and cryogenic temperatures or photochemical reactions. We demonstrate that entanglement can be realized in solid-state spin ensembles at ambient conditions. We use hybrid registers comprising electron-nuclear spin pairs that are localized at color-center defects in a commercial SiC wafer. We optically initialize 10³ identical registers in a 40-μm³ volume (with [Formula: see text] fidelity) and deterministically prepare them into the maximally entangled Bell states (with 0.88 ± 0.07 fidelity). To verify entanglement, we develop a register-specific quantum-state tomography protocol. The entanglement of a macroscopic solid-state spin ensemble at ambient conditions represents an important step toward practical quantum technology.
NASA Astrophysics Data System (ADS)
Saleh, F.; Ramaswamy, V.; Georgas, N.; Blumberg, A. F.; Wang, Y.
2016-12-01
Advances in computational resources and modeling techniques are opening the path to effectively integrate existing complex models. In the context of flood prediction, recent extreme events have demonstrated the importance of integrating components of the hydrosystem to better represent the interactions amongst different physical processes and phenomena. As such, there is a pressing need to develop holistic and cross-disciplinary modeling frameworks that effectively integrate existing models and better represent the operative dynamics. This work presents a novel Hydrologic-Hydraulic-Hydrodynamic Ensemble (H3E) flood prediction framework that operationally integrates existing predictive models representing coastal (New York Harbor Observing and Prediction System, NYHOPS), hydrologic (US Army Corps of Engineers Hydrologic Modeling System, HEC-HMS) and hydraulic (2-dimensional River Analysis System, HEC-RAS) components. The state-of-the-art framework is forced with 125 ensemble meteorological inputs from numerical weather prediction models including the Global Ensemble Forecast System, the European Centre for Medium-Range Weather Forecasts (ECMWF), the Canadian Meteorological Centre (CMC), the Short Range Ensemble Forecast (SREF) and the North American Mesoscale Forecast System (NAM). The framework produces, within a 96-hour forecast horizon, on-the-fly Google Earth flood maps that provide critical information for decision makers and emergency preparedness managers. The utility of the framework was demonstrated by retrospectively forecasting an extreme flood event, Hurricane Sandy, in the Passaic and Hackensack watersheds (New Jersey, USA). Hurricane Sandy caused significant damage to a number of critical facilities in this area including the New Jersey Transit's main storage and maintenance facility.
The results of this work demonstrate that ensemble-based frameworks provide improved flood predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared to a deterministic forecast. The work offers perspectives for short-term flood forecasts, flood mitigation strategies and best management practices for climate change scenarios.
NASA Astrophysics Data System (ADS)
Hoppel, Karl; Bevilacqua, Richard; Canty, Timothy; Salawitch, Ross; Santee, Michelle
2005-10-01
The Polar Ozone and Aerosol Measurement (POAM III) instrument has provided 6 years (1998 to present) of Antarctic ozone profile measurements, which detail the annual formation of the ozone hole. During the period of ozone hole formation the measurement latitude follows the edge of the polar night and presents a unique challenge for comparing with model simulations. The formation of the ozone hole has been simulated by using a photochemical box model with an ensemble of trajectories, and the results were sampled at the measurement latitude for comparison with the measured ozone. The agreement is generally good but very sensitive to the model dynamics and less sensitive to changes in the model chemistry. In order to better isolate the chemical ozone loss the Match technique was applied to 5 years of data to directly calculate ozone photochemical loss rates. The measured loss rates are specific to the high solar zenith angle conditions of the POAM-Match trajectories and are found to increase slowly from July to early August and then increase rapidly until mid-September. The Match results are sensitive to the choice of meteorological analysis used for the trajectory calculations. The ECMWF trajectories yield the smallest, and perhaps most accurate, peak loss rates that can be reproduced by a photochemical model using standard JPL 2002 kinetics, assuming reactive bromine (BrOx) of 14 pptv based solely on contributions from CH3Br and halons, and without requiring ClOx to exceed the upper limit for available inorganic chlorine of 3.7 ppbv. Larger Match ozone loss rates are found for the late August and early September period if trajectories based on UKMO and NCEP analyses are employed. Such loss rates require higher values for ClO and/or BrO than can be simulated using JPL 2002 chemical kinetics and complete activation of chlorine. 
In these cases, the agreement between modeled and measured loss rates is significantly improved if the model employs larger ClOOCl cross sections (e.g., Burkholder et al., 1990) and BrOx of 24 pptv, which reflects significant contributions from very short-lived bromocarbons to the inorganic bromine budget.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madankan, R.; Pouget, S.; Singla, P., E-mail: psingla@buffalo.edu
Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes, for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However, initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system such as the windfield are stochastic. These uncertainties make forecasting plume motion difficult. As a result of these uncertainties, ash advisories based on a deterministic approach tend to be conservative, and many times over- or underestimate the extent of a plume. This paper presents an end-to-end framework for generating a probabilistic approach to ash plume forecasting. This framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply, to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition, to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence are computed.
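The quadrature-ensemble-to-polynomial-chaos idea can be sketched in one dimension. Standard Gauss-Hermite quadrature stands in here for the CUT point set, and the response function is invented; the sketch only shows how an expansion built from a small quadrature ensemble becomes a cheap surrogate:

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

# Toy response: downstream ash load as a function of a single
# standard-normal source parameter xi (function invented for the sketch)
def model(xi):
    return np.exp(0.3 * xi)

nodes, weights = He.hermegauss(12)        # quadrature for weight exp(-x^2/2)
weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the Gaussian measure

# Project the quadrature ensemble onto probabilists' Hermite polynomials
deg = 4
coeffs = np.array([
    np.sum(weights * model(nodes) * He.hermeval(nodes, np.eye(deg + 1)[k]))
    / factorial(k)                        # E[He_k(xi)^2] = k!
    for k in range(deg + 1)
])

# The expansion is cheap to sample in place of the full dispersion model
xi = np.random.default_rng(1).standard_normal(100_000)
surrogate = He.hermeval(xi, coeffs)
print(coeffs[0], surrogate.mean())
```

The zeroth coefficient is the predicted mean response (here analytically exp(0.045)), and the 100,000 surrogate samples cost a polynomial evaluation each instead of a dispersion-model run.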
Economic assessment of flood forecasts for a risk-averse decision-maker
NASA Astrophysics Data System (ADS)
Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier-Filion, Thomas-Charles
2017-04-01
A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. It has also been suggested in past studies that ensemble forecasts might possess a greater economic value than deterministic forecasts. However, the vast majority of recent hydro-economic literature is based on the cost-loss ratio framework, which is appealing for its simplicity and intuitiveness. One important drawback of the cost-loss ratio is that it implicitly assumes a risk-neutral decision maker. By definition, a risk-neutral individual is indifferent to forecasts' sharpness: as long as forecasts agree with observations on average, the risk-neutral individual is satisfied. A risk-averse individual, however, is sensitive to the level of precision (sharpness) of forecasts. This person is willing to pay to increase his or her certainty about future events. In fact, this is how insurance companies operate: the probability of seeing one's house burn down is relatively low, so the expected cost related to such an event is also low. However, people are willing to buy insurance to avoid the risk, however small, of losing everything. Similarly, in a context where people's safety and property are at stake, the typical decision maker is more risk-averse than risk-neutral. Consequently, the cost-loss ratio is not the most appropriate tool to assess the economic value of flood forecasts. This presentation describes a more realistic framework for assessing the economic value of such forecasts for flood mitigation purposes. Borrowing from economics, the Constant Absolute Risk Aversion (CARA) utility function is the central tool of this new framework. Utility functions allow explicitly accounting for the level of risk aversion of the decision maker and fully exploiting the information related to ensemble forecasts' uncertainty.
Three concurrent ensemble streamflow forecasting systems are compared in terms of quality (comparison with observed values) and in terms of their economic value. This assessment is performed for lead times of one to five days. The three systems are: (1) simple statistically dressed deterministic forecasts, (2) forecasts based on meteorological ensembles and (3) a variant of the latter that also includes an estimation of state variables uncertainty. The comparison takes place on the Montmorency River, a small flood-prone watershed in south central Quebec, Canada. The results show that forecast quality as assessed by well-known tools such as the Continuous Ranked Probability Score or the reliability diagram does not necessarily translate directly into economic value, especially if the decision maker is not risk-neutral. In addition, results show that the economic value of forecasts for a risk-averse decision maker is very much influenced by the most extreme members of ensemble forecasts (upper tail of the predictive distributions). This study provides a new basis for further improvement of our comprehension of the complex interactions between forecast uncertainty, risk aversion and decision-making.
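The core of the CARA framework is the certainty equivalent: the sure loss a risk-averse decision maker deems as bad as an uncertain one. A minimal sketch with invented flood-damage samples shows why sharpness matters to a risk-averse user even when the mean is unchanged:

```python
import numpy as np

def certainty_equivalent_loss(losses, a):
    """Certainty-equivalent loss under CARA utility u(w) = -exp(-a * w):
    CE = (1/a) * ln E[exp(a * L)]; CE - E[L] is the risk premium."""
    losses = np.asarray(losses, dtype=float)
    return np.log(np.mean(np.exp(a * losses))) / a

# Invented damage samples (k$) implied by two forecast ensembles that
# agree on the mean but differ in sharpness
rng = np.random.default_rng(2)
sharp = rng.normal(100.0, 5.0, 50_000)
wide = rng.normal(100.0, 25.0, 50_000)

a = 0.01  # coefficient of absolute risk aversion (illustrative value)
ce_sharp = certainty_equivalent_loss(sharp, a)
ce_wide = certainty_equivalent_loss(wide, a)
print(ce_sharp, ce_wide)
```

For a normal loss N(mu, sigma^2) the CE is mu + a*sigma^2/2, so the wide ensemble carries the larger risk premium; as a tends to 0 (risk neutrality) both ensembles are valued identically, which is exactly the blind spot of the cost-loss ratio described above.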
NASA Technical Reports Server (NTRS)
Hathaway, Michael D.
1986-01-01
Measurements of the unsteady velocity field within the stator row of a transonic axial-flow fan were acquired using a laser anemometer. Measurements were obtained on axisymmetric surfaces located at 10 and 50 percent span from the shroud, with the fan operating at maximum efficiency at design speed. The ensemble-average and variance of the measured velocities are used to identify rotor-wake-generated (deterministic) unsteadiness and turbulence, respectively. Correlations of both deterministic and turbulent velocity fluctuations provide information on the characteristics of unsteady interactions within the stator row. These correlations are derived from the Navier-Stokes equation in a manner similar to deriving the Reynolds stress terms, whereby various averaging operators are used to average the aperiodic, deterministic, and turbulent velocity fluctuations which are known to be present in multistage turbomachines. The correlations of deterministic and turbulent velocity fluctuations throughout the axial fan stator row are presented. In particular, amplification and attenuation of both types of unsteadiness are shown to occur within the stator blade passage.
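The ensemble-average/variance separation described above amounts to a triple decomposition of the velocity signal: time mean, rotor-locked (deterministic) fluctuation recovered by phase-lock averaging, and a turbulent residual. A minimal sketch on a synthetic anemometer record with invented amplitudes:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic record: mean flow + rotor-locked wake fluctuation + turbulence
n_rev, n_per_rev = 400, 64
t = np.arange(n_rev * n_per_rev)
u_mean, amp, sigma_t = 200.0, 8.0, 3.0
u = (u_mean
     + amp * np.sin(2 * np.pi * t / n_per_rev)      # rotor-periodic part
     + rng.normal(0.0, sigma_t, t.size))            # turbulence

# Phase-lock (ensemble) average over revolutions isolates the
# deterministic unsteadiness; the residual is the turbulent part
u_phase = u.reshape(n_rev, n_per_rev).mean(axis=0)  # ensemble average
u_det = u_phase - u_phase.mean()                    # deterministic fluctuation
u_turb = u.reshape(n_rev, n_per_rev) - u_phase      # turbulent fluctuation

var_det = np.mean(u_det**2)    # deterministic "stress" <u~ u~>
var_turb = np.mean(u_turb**2)  # turbulent "stress" <u' u'>
print(var_det, var_turb)
```

The recovered deterministic variance approaches amp^2/2 and the turbulent variance approaches sigma_t^2, mirroring how the laser-anemometer ensemble average and variance separate rotor-wake-generated unsteadiness from turbulence.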
NASA Astrophysics Data System (ADS)
Lopez, Ana; Fung, Fai; New, Mark; Watts, Glenn; Weston, Alan; Wilby, Robert L.
2009-08-01
The majority of climate change impacts and adaptation studies so far have been based on at most a few deterministic realizations of future climate, usually representing different emissions scenarios. Large ensembles of climate models are increasingly available either as ensembles of opportunity or perturbed physics ensembles, providing a wealth of additional data that is potentially useful for improving adaptation strategies to climate change. Because of the novelty of this ensemble information, there is little previous experience of practical applications or of the added value of this information for impacts and adaptation decision making. This paper evaluates the value of perturbed physics ensembles of climate models for understanding and planning public water supply under climate change. We deliberately select water resource models that are already used by water supply companies and regulators on the assumption that uptake of information from large ensembles of climate models will be more likely if it does not involve significant investment in new modeling tools and methods. We illustrate the methods with a case study on the Wimbleball water resource zone in the southwest of England. This zone is sufficiently simple to demonstrate the utility of the approach but with enough complexity to allow a variety of different decisions to be made. Our research shows that the additional information contained in the climate model ensemble provides a better understanding of the possible ranges of future conditions, compared to the use of single-model scenarios. Furthermore, with careful presentation, decision makers will find the results from large ensembles of models more accessible and be able to more easily compare the merits of different management options and the timing of different adaptation measures. The overhead in additional time and expertise for carrying out the impacts analysis will be justified by the increased quality of the decision-making process.
We remark that even though we have focused our study on a water resource system in the United Kingdom, our conclusions about the added value of climate model ensembles in guiding adaptation decisions can be generalized to other sectors and geographical regions.
Comparison of Ensemble Mean and Deterministic Forecasts for Long-Range Airlift Fuel Planning
2014-03-27
NASA Astrophysics Data System (ADS)
Lim, S.; Park, S. K.; Zupanski, M.
2015-04-01
Since the air quality forecast is related to both chemistry and meteorology, a coupled atmosphere-chemistry data assimilation (DA) system is essential to air quality forecasting. Ozone (O3) plays an important role in chemical reactions and is usually assimilated in chemical DA. In tropical cyclones (TCs), O3 usually shows a lower concentration inside the eyewall and an elevated concentration around the eye, impacting atmospheric as well as chemical variables. To identify the impact of O3 observations on TC structure, including atmospheric and chemical information, we employed the Weather Research and Forecasting model coupled with Chemistry (WRF-Chem) with an ensemble-based DA algorithm - the maximum likelihood ensemble filter (MLEF). For a TC case that occurred over East Asia, our results indicate that the ensemble forecast is reasonable, accompanied by larger background state uncertainty over the TC, and also over eastern China. Similarly, the assimilation of O3 observations impacts atmospheric and chemical variables near the TC and over eastern China. The strongest impact on air quality in the lower troposphere was over China, likely due to pollution advection. In the vicinity of the TC, however, the strongest impact on chemical variables was at higher levels. The impact on atmospheric variables was similar both over China and near the TC. The analysis results are validated using several measures that include the cost function, root-mean-squared error with respect to observations, and degrees of freedom for signal (DFS). All measures indicate a positive impact of DA on the analysis - the cost function and root-mean-squared error decreased by 16.9% and 8.87%, respectively. In particular, the DFS indicates a strong positive impact of observations in the TC area, with a weaker maximum over northeast China.
Stochastic Plume Simulations for the Fukushima Accident and the Deepwater Horizon Oil Spill
NASA Astrophysics Data System (ADS)
Coelho, E.; Peggion, G.; Rowley, C.; Hogan, P.
2012-04-01
The Fukushima Dai-ichi power plant suffered damage leading to radioactive contamination of coastal waters. Major issues in characterizing the extent of the affected waters were a poor knowledge of the radiation released to the coastal waters and the rather complex coastal dynamics of the region, not deterministically captured by the available prediction systems. Equivalently, during the Gulf of Mexico Deepwater Horizon oil platform accident in April 2010, significant amounts of oil and gas were released from the ocean floor. For this case, issues in mapping and predicting the extent of the affected waters in real-time were a poor knowledge of the actual amounts of oil reaching the surface and the fact that coastal dynamics over the region were not deterministically captured by the available prediction systems. To assess the ocean regions and times that were most likely affected by these accidents while capturing the above sources of uncertainty, ensembles of the Navy Coastal Ocean Model (NCOM) were configured over the two regions (NE Japan and Northern Gulf of Mexico). For the Fukushima case tracers were released on each ensemble member; their locations at each instant provided reference positions of water volumes where the signature of water released from the plant could be found. For the Deepwater Horizon oil spill case each ensemble member was coupled with a diffusion-advection solution to estimate possible scenarios of oil concentrations using perturbed estimates of the released amounts as the source terms at the surface. Stochastic plumes were then defined using a Risk Assessment Code (RAC) analysis that associates a number from 1 to 5 to each grid point, determined by the likelihood of having a tracer particle within short ranges (for the Fukushima case), hence defining the high risk areas and those recommended for monitoring.
For the Oil Spill case the RAC codes were determined by the likelihood of reaching oil concentrations as defined in the Bonn Agreement Oil Appearance Code. The likelihoods were taken in both cases from probability distribution functions derived from the ensemble runs. Results were compared with a control-deterministic solution and checked against available reports to assess their skill in capturing the actual observed plumes and other in-situ data, as well as their relevance for planning surveys and reconnaissance flights for both cases.
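The ensemble-to-RAC mapping described above reduces to binning per-grid-point likelihoods into five classes. A minimal sketch; the class limits below are invented, since the abstract does not give the actual RAC thresholds:

```python
import numpy as np

def rac_codes(member_hits, thresholds=(0.05, 0.25, 0.5, 0.75)):
    """Map ensemble likelihoods to a 1-5 Risk Assessment Code.

    member_hits: boolean array (n_members, n_points), True where a member
    places tracer (or above-threshold oil concentration) at a grid point.
    The class limits are invented for the sketch.
    """
    prob = member_hits.mean(axis=0)  # likelihood at each grid point
    return 1 + np.searchsorted(thresholds, prob, side="right")

# Synthetic ensemble: hit probability ramps from 0 to 1 across grid points
rng = np.random.default_rng(4)
hits = rng.random((100, 500)) < np.linspace(0.0, 1.0, 500)
codes = rac_codes(hits)
print(codes.min(), codes.max())
```

Grid points where few members place tracer get code 1, points flagged by nearly every member get code 5, giving the high-risk areas recommended for monitoring.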
Ozone time scale decomposition and trend assessment from surface observations
NASA Astrophysics Data System (ADS)
Boleti, Eirini; Hueglin, Christoph; Takahama, Satoshi
2017-04-01
Emissions of ozone precursors have been regulated in Europe since around 1990, with control measures primarily targeting industries and traffic. In order to understand how these measures have affected air quality, it is now important to investigate concentrations of tropospheric ozone in different types of environments, based on their NOx burden, and in different geographic regions. In this study, we analyze high quality data sets for Switzerland (NABEL network) and the whole of Europe (AirBase) for the last 25 years to calculate long-term trends of ozone concentrations. A sophisticated time scale decomposition method, called Ensemble Empirical Mode Decomposition (EEMD) (Huang, 1998; Wu, 2009), is used for decomposition of the different time scales of the variation of ozone, namely the long-term trend, seasonal and short-term variability. This allows subtraction of the seasonal pattern of ozone from the observations and estimation of long-term changes of ozone concentrations with lower uncertainty ranges compared to typical methodologies. We observe that, despite the implementation of regulations, for most of the measurement sites ozone daily mean values increased until around the mid-2000s. Afterwards, we observe a decline or a leveling off in the concentrations; likely a delayed effect of limits on ozone precursor emissions. On the other hand, peak ozone concentrations have been decreasing for almost all regions. The evolution of the trend exhibits some differences between the different types of measurement sites. In addition, ozone is known to be strongly affected by meteorology. In the applied approach, some of the meteorological effects are captured by the seasonal signal and are already removed in the de-seasonalized ozone time series.
To adjust for the influence of meteorology on the higher frequency ozone variation, a statistical approach based on Generalized Additive Models (GAM) (Hastie, 1990; Wood, 2006), which corrects for meteorological effects, has been developed in order to a) investigate whether trends are masked by meteorological variability and b) understand which part of the observed trends is meteorology driven. By correlating the short-term variation of ozone, as obtained from the EEMD, with the corresponding short-term variation of relevant meteorological parameters, we subtract the variation of ozone concentrations that is related to the meteorological effects explained by the GAM. We find that this higher frequency meteorological correction further reduces the uncertainty in trend estimation by a small factor. In addition, the seasonal variability of ozone as obtained from the EEMD has been studied in more detail for possible changes in its behavior. A shortening of the seasonal cycle was observed, i.e. a reduction of the annual maximum and an increase of the annual minimum concentration, while the occurrence of the maximum is shifted to earlier times in the year. In summary, we present a sophisticated and consistent approach for detecting and categorizing trends and meteorological influences on ozone concentrations in long-term measurements across Europe.
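The meteorological-adjustment step (regress the short-term ozone component on meteorological covariates, subtract the explained part) can be sketched minimally. Ordinary least squares stands in for the full GAM with smooth terms, and all numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic short-term ozone variation (as EEMD would return it): a part
# driven by temperature anomalies plus unexplained variability
n = 2000
temp_anom = rng.normal(0.0, 4.0, n)                    # deg C
o3_short = 1.5 * temp_anom + rng.normal(0.0, 5.0, n)   # ppb

# Fit the meteorological component and subtract it
X = np.column_stack([np.ones(n), temp_anom])
beta, *_ = np.linalg.lstsq(X, o3_short, rcond=None)
o3_adjusted = o3_short - X @ beta

print(beta[1], o3_short.var(), o3_adjusted.var())
```

The adjusted series retains only the variability not explained by the covariate, which is what shrinks the uncertainty of the subsequent trend estimate.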
NASA Astrophysics Data System (ADS)
Huang, Ling; Luo, Yali
2017-08-01
Based on The Observing System Research and Predictability Experiment Interactive Grand Global Ensemble (TIGGE) data set, this study evaluates the ability of global ensemble prediction systems (EPSs) from the European Centre for Medium-Range Weather Forecasts (ECMWF), U.S. National Centers for Environmental Prediction, Japan Meteorological Agency (JMA), Korean Meteorological Administration, and China Meteorological Administration (CMA) to predict presummer rainy season (April-June) precipitation in south China. Evaluation of 5 day forecasts in three seasons (2013-2015) demonstrates the higher skill of probability matching forecasts compared to simple ensemble mean forecasts and shows that the deterministic forecast is a close second. The EPSs overestimate light-to-heavy rainfall (0.1 to 30 mm/12 h) and underestimate heavier rainfall (>30 mm/12 h), with JMA being the worst. By analyzing the synoptic situations predicted by the identified more skillful (ECMWF) and less skillful (JMA and CMA) EPSs and the ensemble sensitivity for four representative cases of torrential rainfall, the transport of warm-moist air into south China by the low-level southwesterly flow, upstream of the torrential rainfall regions, is found to be a key synoptic factor that controls the quantitative precipitation forecast. The results also suggest that prediction of locally produced torrential rainfall is more challenging than prediction of more extensively distributed torrential rainfall. A slight improvement in the performance is obtained by shortening the forecast lead time from 30-36 h to 18-24 h to 6-12 h for the cases with large-scale forcing, but not for the locally produced cases.
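Probability matching, the more skillful product above, combines the spatial placement of the ensemble mean with the amplitude distribution of the pooled ensemble. A minimal sketch on synthetic rainfall fields (the gamma parameters are invented):

```python
import numpy as np

def probability_match(ens):
    """Probability-matched mean: the spatial pattern of the ensemble mean
    combined with the amplitude distribution of the pooled ensemble."""
    n_mem, n_pts = ens.shape
    mean = ens.mean(axis=0)
    pooled = np.sort(ens.ravel())
    # one representative value per grid point from the pooled distribution
    rep = pooled[n_mem // 2 :: n_mem][:n_pts]
    out = np.empty(n_pts)
    out[np.argsort(mean)] = rep   # assign by rank of the ensemble mean
    return out

rng = np.random.default_rng(6)
ens = rng.gamma(0.6, 8.0, size=(20, 1000))  # synthetic 12-h rain totals, mm
pm = probability_match(ens)
print(ens.mean(), pm.mean())
```

Simple ensemble averaging smooths heavy totals away; probability matching restores the heavier amplitudes of the member distribution while keeping them where the ensemble mean places the rain, which is consistent with the over/underestimation pattern the evaluation reports for raw products.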
NASA Astrophysics Data System (ADS)
Fillion, Anthony; Bocquet, Marc; Gratton, Serge
2018-04-01
The analysis in nonlinear variational data assimilation is the solution of a non-quadratic minimization. Thus, the analysis efficiency relies on its ability to locate a global minimum of the cost function. If this minimization uses a Gauss-Newton (GN) method, it is critical for the starting point to be in the attraction basin of a global minimum. Otherwise the method may converge to a local extremum, which degrades the analysis. With chaotic models, the number of local extrema often increases with the temporal extent of the data assimilation window, making the former condition harder to satisfy. This is unfortunate because the assimilation performance also increases with this temporal extent. However, a quasi-static (QS) minimization may overcome these local extrema. It accomplishes this by gradually injecting the observations in the cost function. This method was introduced by Pires et al. (1996) in a 4D-Var context. We generalize this approach to four-dimensional strong-constraint nonlinear ensemble variational (EnVar) methods, which are based on both a nonlinear variational analysis and the propagation of dynamical error statistics via an ensemble. This forces one to consider the cost function minimizations in the broader context of cycled data assimilation algorithms. We adapt this QS approach to the iterative ensemble Kalman smoother (IEnKS), an exemplar of nonlinear deterministic four-dimensional EnVar methods. Using low-order models, we quantify the positive impact of the QS approach on the IEnKS, especially for long data assimilation windows. We also examine the computational cost of QS implementations and suggest cheaper algorithms.
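The quasi-static idea (inject observations gradually, restarting each Gauss-Newton solve from the previous minimizer) can be sketched on a toy scalar analysis. The nonlinear observation operators, frequencies and truth below are all invented:

```python
import numpy as np

# Toy analysis: observations y_i = h_i(x_true) with h_i(x) = sin(a_i * x)
a = np.array([0.3, 0.5, 0.7, 0.9, 1.1])
x_true = 2.0
y = np.sin(a * x_true)

def gauss_newton(x, a_sub, y_sub, n_iter=25):
    for _ in range(n_iter):
        r = y_sub - np.sin(a_sub * x)   # residuals
        J = -a_sub * np.cos(a_sub * x)  # Jacobian of the residuals
        x = x - (J @ r) / (J @ J)       # Gauss-Newton step
    return x

# Quasi-static minimization: add observations one at a time, starting
# each Gauss-Newton solve from the previous stage's minimizer
x = 0.5  # first guess
for i in range(1, a.size + 1):
    x = gauss_newton(x, a[:i], y[:i])
print(x)
```

Each stage's minimizer stays inside the attraction basin of the global minimum of the slightly enlarged cost function, which is the mechanism that lets the IEnKS cope with the extra local minima of long assimilation windows.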
The GEOS Chemistry Climate Model: Implications of Climate Feedbacks on Ozone Depletion and Recovery
NASA Technical Reports Server (NTRS)
Stolarski, Richard S.; Pawson, Steven; Douglass, Anne R.; Newman, Paul A.; Kawa, S. Randy; Nielsen, J. Eric; Rodriquez, Jose; Strahan, Susan; Oman, Luke; Waugh, Darryn
2008-01-01
The Goddard Earth Observing System Chemistry Climate Model (GEOS CCM) has been developed by combining the atmospheric chemistry and transport modules developed over the years at Goddard and the GEOS general circulation model, also developed at Goddard. The first version of the model was used in the CCMVal intercomparison exercises that contributed to the 2006 WMO/UNEP Ozone Assessment. The second version incorporates the updated version of the GCM (GEOS 5) and will be used for the next round of CCMVal evaluations and the 2010 Ozone Assessment. The third version, now under development, incorporates the combined stratosphere and troposphere chemistry package developed under the Global Modeling Initiative (GMI). We will show comparisons to past observations indicating that the model reproduces the ozone trends over the past 30 years. We will also show the basic temperature, composition, and dynamical structure of the simulations. We will further show projections into the future. We will show results from an ensemble of transient and time-slice simulations, including simulations with fixed 1960 chlorine, simulations with a best guess scenario (A1), and simulations with extremely high chlorine loadings. We will discuss planned extensions of the model to include emission-based boundary conditions for both anthropogenic and biogenic compounds.
Shukla, Shraddhanand; Roberts, Jason B.; Hoell, Andrew; Funk, Chris; Robertson, Franklin R.; Kirtman, Benjamin
2016-01-01
The skill of North American Multimodel Ensemble (NMME) seasonal forecasts in East Africa (EA), which encompasses one of the most food and water insecure areas of the world, is evaluated using deterministic, categorical, and probabilistic evaluation methods. The skill is estimated for all three primary growing seasons: March–May (MAM), July–September (JAS), and October–December (OND). It is found that the precipitation forecast skill in this region is generally limited and statistically significant over only a small part of the domain. In the case of the MAM (JAS) [OND] season, it exceeds the skill of climatological forecasts in parts of equatorial EA (Northern Ethiopia) [equatorial EA] for up to 2 (5) [5] months of lead time. Temperature forecast skill is generally much higher than precipitation forecast skill (in terms of deterministic and probabilistic skill scores) and statistically significant over a majority of the region. Over the region as a whole, temperature forecasts also exhibit greater reliability than the precipitation forecasts. The NMME ensemble forecasts are found to be more skillful and reliable than the forecast from any individual model. The results also demonstrate that for some seasons (e.g. JAS), the predictability of precipitation signals varies and is higher during certain climate events (e.g. ENSO). Finally, potential room for improvement in forecast skill is identified in some models by comparing homogeneous predictability in individual NMME models with their respective forecast skill.
Nidheesh, N; Abdul Nazeer, K A; Ameer, P M
2017-12-01
Clustering algorithms with steps involving randomness usually give different results on different executions for the same dataset. This non-deterministic nature of algorithms such as the K-Means clustering algorithm limits their applicability in areas such as cancer subtype prediction using gene expression data. It is hard to sensibly compare the results of such algorithms with those of other algorithms. The non-deterministic nature of K-Means is due to its random selection of data points as initial centroids. We propose an improved, density-based version of K-Means, which involves a novel and systematic method for selecting initial centroids. The key idea of the algorithm is to select data points which belong to dense regions and which are adequately separated in feature space as the initial centroids. We compared the proposed algorithm to a set of eleven widely used single clustering algorithms and a prominent ensemble clustering algorithm which is being used for cancer data classification, based on performance across ten cancer gene expression datasets. The proposed algorithm showed better overall performance than the others. There is a pressing need in the biomedical domain for simple, easy-to-use and more accurate machine learning tools for cancer subtype prediction. The proposed algorithm is simple, easy-to-use and gives stable results. Moreover, it provides comparatively better predictions of cancer subtypes from gene expression data.
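The key idea (deterministic centroids taken from dense, mutually well-separated regions) can be sketched as follows. The density estimate and the density-times-separation selection rule are one plausible instantiation of the idea, not the paper's exact method:

```python
import numpy as np

def density_init(X, k):
    """Deterministic initial centroids from dense, well-separated points
    (a sketch of the idea, not the paper's exact rule)."""
    # local density: inverse mean distance to the 10 nearest neighbours
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    density = 1.0 / np.sort(d, axis=1)[:, 1:11].mean(axis=1)
    centroids = [int(np.argmax(density))]
    for _ in range(k - 1):
        # next centroid: dense point far from those already chosen
        sep = d[:, centroids].min(axis=1)
        centroids.append(int(np.argmax(density * sep)))
    return X[centroids]

def kmeans(X, k, n_iter=50):
    C = density_init(X, k)                # no randomness anywhere
    for _ in range(n_iter):
        labels = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2).argmin(axis=1)
        C = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    labels = np.linalg.norm(X[:, None, :] - C[None, :, :], axis=2).argmin(axis=1)
    return labels, C

rng = np.random.default_rng(7)
blobs = np.vstack([rng.normal(c, 0.3, (60, 2)) for c in ((0, 0), (4, 0), (2, 4))])
labels, C = kmeans(blobs, 3)
print(C)
```

Because the initialization is deterministic, repeated runs on the same dataset give identical partitions, which is precisely the property needed for comparable cancer-subtype predictions.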
Consistency between the global and regional modeling components of CAMS over Europe.
NASA Astrophysics Data System (ADS)
Katragkou, Eleni; Akritidis, Dimitrios; Kontos, Serafim; Zanis, Prodromos; Melas, Dimitrios; Engelen, Richard; Plu, Matthieu; Eskes, Henk
2017-04-01
The Copernicus Atmosphere Monitoring Service (CAMS) is a component of the European Earth Observation programme Copernicus. CAMS consists of two major forecast and analysis systems: i) the CAMS global near-real-time service, based on the ECMWF Integrated Forecast System (C-IFS), which provides daily analyses and forecasts of reactive trace gases, greenhouse gases and aerosol concentrations; ii) a regional ensemble (ENS) for European air quality, compiled and disseminated by Météo-France, which consists of seven ensemble members. The boundaries for the regional ensemble members are extracted from the global CAMS forecast product. This work reports on the consistency between the global and regional modeling components of CAMS, and the impact of global CAMS boundary conditions on regional forecasts. The current analysis includes ozone (O3), carbon monoxide (CO) and aerosol (PM10/PM2.5) forecasts. The comparison indicates an overall good agreement between the global C-IFS and the regional ENS patterns for O3 and CO, especially above 250 m altitude, indicating that the global boundary conditions are efficiently incorporated in the regional ensemble simulations. As expected, differences are found within the PBL, with lower C-IFS O3 and higher C-IFS CO concentrations over continental Europe with respect to ENS.
NASA Astrophysics Data System (ADS)
Meißner, Dennis; Klein, Bastian; Ionita, Monica; Hemri, Stephan; Rademacher, Silke
2017-04-01
Inland waterway transport (IWT) is an important commercial sector that is significantly vulnerable to hydrological impacts. River ice and floods limit the availability of the waterway network and may cause considerable damage to waterway infrastructure. Low flows significantly reduce IWT's operating efficiency, typically several months a year, owing to the close correlation between (low) water levels, i.e. water depths, and (high) transport costs. Therefore "navigation-related" hydrological forecasts focussing on the specific requirements of water-bound transport (relevant forecast locations, target parameters, skill characteristics etc.) play a major role in mitigating IWT's vulnerability to hydro-meteorological impacts. In light of continuing transport growth within the European Union, hydrological forecasts for the waterways are essential to encourage fuller use of the free capacity IWT still offers. An overview of the current operational and pre-operational forecasting systems for the German waterways predicting water levels, discharges and river ice thickness on various time-scales will be presented. While short-term (deterministic) forecasts have a long tradition in navigation-related forecasting, (probabilistic) forecasting services offering extended lead-times are not yet well-established and are still subject to current research and development activities (e.g. within the EU projects EUPORIAS and IMPREX). The focus is on improving technical aspects as well as on exploring adequate ways of disseminating and communicating probabilistic forecast information.
For the German stretch of the River Rhine, one of the most frequented inland waterways worldwide, the existing deterministic forecast scheme has been extended by ensemble forecasts combined with statistical post-processing modules applying EMOS (Ensemble Model Output Statistics) and ECC (Ensemble Copula Coupling), in order to generate water level predictions up to 10 days ahead and to estimate their predictive uncertainty properly. Additionally, for the key locations on the international waterways Rhine, Elbe and Danube, three competing forecast approaches are currently tested in a pre-operational set-up in order to generate monthly to seasonal (up to 3 months) forecasts: (1) the well-known Ensemble Streamflow Prediction approach (an ensemble based on historical meteorology), (2) coupling hydrological models with post-processed outputs from ECMWF's general circulation model (System 4), and (3) a purely statistical approach based on the stable relationship (teleconnection) of global or regional oceanic, climate and hydrological data with river flows. The current, still pre-operational results reveal valuable predictability of water levels and streamflow at monthly up to seasonal time-scales along the larger rivers used as waterways in Germany. Finally, insight will be given into the technical set-up of the aforementioned forecasting systems operated at the Federal Institute of Hydrology, which are based on a Delft-FEWS application, focussing on the step-wise extension of the former system by integrating new components in order to meet the growing needs of the customers and to improve and extend the forecast portfolio for waterway users.
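The EMOS step mentioned above has a simple core: the predictive distribution is a Gaussian whose mean and variance are affine functions of the ensemble mean and variance. The sketch below, on synthetic data, is a hedged illustration only: operational EMOS fits the coefficients by minimizing the CRPS, whereas here they are fit by plain least squares for brevity.

```python
import numpy as np

# Hedged sketch of EMOS-style post-processing: predictive distribution
# N(a + b*xbar, c + d*s2), with xbar and s2 the ensemble mean and
# variance. Here a, b come from ordinary least squares and c, d from a
# crude method-of-moments step; real EMOS minimizes the CRPS instead.
# All data below are synthetic.
rng = np.random.default_rng(0)
truth = rng.normal(3.0, 1.0, size=500)                   # "observed" water levels
ens = truth[:, None] + rng.normal(0.5, 0.8, (500, 20))   # biased raw ensemble

xbar, s2 = ens.mean(axis=1), ens.var(axis=1, ddof=1)
A = np.column_stack([np.ones_like(xbar), xbar])
(a, b), *_ = np.linalg.lstsq(A, truth, rcond=None)       # bias-correct the mean
resid2 = (truth - (a + b * xbar)) ** 2
B = np.column_stack([np.ones_like(s2), s2])
(c, d), *_ = np.linalg.lstsq(B, resid2, rcond=None)      # spread correction

mu = a + b * xbar                                        # calibrated mean
sigma2 = np.clip(c + d * s2, 1e-6, None)                 # calibrated variance
print(round(float(np.mean(mu - truth)), 3))              # bias ~ 0 after correction
```

In a second step, ECC would reorder samples drawn from such calibrated margins according to the rank structure of the raw ensemble, restoring the ensemble's spatio-temporal dependence.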
SAGE (version 5.96) Ozone Trends in the Lower Stratosphere
NASA Technical Reports Server (NTRS)
Cunnold, D. M.; Wang, H. J.; Thomason, L. W.; Zawodny, J. M.; Logan, J. A.; Megretkaia, I. A.
2002-01-01
Ozone retrievals from Stratospheric Aerosol and Gas Experiment (SAGE) II version 5.96 (v5.96) below approx. 25 km altitude are discussed. This version of the algorithm includes improved constraints on the wavelength dependence of aerosol extinctions based on the ensemble of aerosol size distribution measurements. This results in a reduction of SAGE ozone errors in the 2 years after the Mount Pinatubo eruption. However, SAGE ozone concentrations are still approx. 10% larger than ozonesonde and Halogen Occultation Experiment (HALOE) measurements below 20 km altitude under nonvolcanic conditions (and by more than this in the tropics). The analysis by Steele and Turco suggests that the SAGE ozone overpredictions are in the wrong direction to be explained by aerosol extinction extrapolation errors. Moreover, preliminary SAGE II v6.0a retrievals suggest that they are partially accounted for by geometric difficulties at low altitudes in v5.96 and prior retrievals. SAGE ozone trends for the 1979-1996 and 1984-1996 periods are calculated and compared, and the sources of trend errors are discussed. These calculations are made after filtering out ozone data during periods of high, local aerosol extinctions. In the lower stratosphere, below approx. 28 km altitude, there is shown to be excellent agreement in the altitudinal structure of ozone decreases at 45 deg N between SAGE and ozonesondes, with the largest decrease in both between 1979 and 1996 having occurred below 20 km altitude, amounting to 0.9 +/- 0.7%/yr (2 sigma) at 16 km altitude. However, in contrast to the fairly steady decreases at 45 deg N, both SAGE measurements and Lauder ozonesondes show ozone increases at 45 deg S over the period from the mid-1980s to 1996 of 0.2 +/- 0.5%/yr (2 sigma) from 15 to 20 km altitude. The SAGE data suggest that this increase is a wintertime phenomenon which occurs in the 15-20 km height range. Changes in dynamics are suggested as the most likely cause of this increase.
These hemispheric differences in ozone trends are supported by ozone column measurements by the Total Ozone Mapping Spectrometer (TOMS).
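The per-year trend figures quoted above reduce, at their core, to a least-squares slope with a 2-sigma uncertainty. A minimal sketch on synthetic data (real ozone trend analyses also include seasonal, QBO and solar terms, which are omitted here):

```python
import numpy as np

# Illustrative trend fit (synthetic data): least-squares linear trend in
# percent per year relative to the mean level, with a 2-sigma slope
# uncertainty, of the kind quoted in the SAGE/ozonesonde comparison.
rng = np.random.default_rng(1)
years = np.arange(1979, 1997, dtype=float)
ozone = 100.0 * (1 - 0.009 * (years - years[0])) + rng.normal(0, 1.0, years.size)

A = np.column_stack([np.ones_like(years), years - years.mean()])
coef, res, *_ = np.linalg.lstsq(A, ozone, rcond=None)
n = years.size
sigma2 = res[0] / (n - 2)                        # residual variance
var_slope = sigma2 / np.sum(A[:, 1] ** 2)        # variance of the slope
trend = 100.0 * coef[1] / coef[0]                # %/yr relative to mean level
two_sigma = 100.0 * 2 * np.sqrt(var_slope) / coef[0]
print(f"{trend:.2f} +/- {two_sigma:.2f} %/yr (2 sigma)")
```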
NASA Astrophysics Data System (ADS)
Cifelli, R.; Mahoney, K. M.; Webb, R. S.; McCormick, B.
2017-12-01
To ensure structural and operational safety of dams and other water management infrastructure, water resources managers and engineers require information about the potential for heavy precipitation. The methods and data used to estimate extreme rainfall amounts for managing risk are based on 40-year-old science and in need of improvement. The need to evaluate new approaches based on the best science available has led the states of Colorado and New Mexico to engage a body of scientists and engineers in an innovative "ensemble approach" to updating extreme precipitation estimates. NOAA is at the forefront of one of three technical approaches that make up the "ensemble study"; the three approaches are conducted concurrently and in collaboration with each other. One approach is the conventional deterministic, "storm-based" method, another is a risk-based regional precipitation frequency estimation tool, and the third is an experimental approach utilizing NOAA's state-of-the-art High Resolution Rapid Refresh (HRRR) physically-based dynamical weather prediction model. The goal of the overall project is to use the individual strengths of these different methods to define an updated and broadly acceptable state of the practice for evaluation and design of dam spillways. This talk will highlight the NOAA research and NOAA's role in the overarching goal to better understand and characterize extreme precipitation estimation uncertainty. The research led by NOAA explores a novel high-resolution dataset and post-processing techniques using a super-ensemble of hourly forecasts from the HRRR model. We also investigate how this rich dataset may be combined with statistical methods to optimally cast the data in probabilistic frameworks. NOAA expertise in the physical processes that drive extreme precipitation is also employed to develop careful testing and improved understanding of the limitations of older estimation methods and assumptions.
The process of decision making in the midst of uncertainty is a major part of this study. We will speak to how the ensemble approaches may be used in concert with one another to manage risk and enhance resiliency under uncertainty. Finally, the presentation will also address the implications of including climate change in future extreme precipitation estimation studies.
NASA Technical Reports Server (NTRS)
Garner, Gregory G.; Thompson, Anne M.
2013-01-01
An ensemble statistical post-processor (ESP) is developed for the National Air Quality Forecast Capability (NAQFC) to address the unique challenges of forecasting surface ozone in Baltimore, MD. Air quality and meteorological data were collected from the eight monitors that constitute the Baltimore forecast region. These data were used to build the ESP using a moving-block bootstrap, regression tree models, and extreme-value theory. The ESP was evaluated using a 10-fold cross-validation to avoid evaluation with the same data used in the development process. Results indicate that the ESP is conditionally biased, likely due to slight overfitting while training the regression tree models. When viewed from the perspective of a decision-maker, the ESP provides a wealth of additional information previously not available through the NAQFC alone. The user is provided the freedom to tailor the forecast to the decision at hand by using decision-specific probability thresholds that define a forecast for an ozone exceedance. Taking advantage of the ESP, the user not only receives an increase in value over the NAQFC, but also receives value for
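The decision-specific probability thresholds described in the abstract map naturally onto the classical cost-loss decision rule. A hedged sketch, with illustrative numbers that are not from the study:

```python
# Hedged sketch of the decision idea behind the ESP: given a predictive
# probability p of an ozone exceedance, a user with mitigation cost C and
# avoidable loss L acts whenever p > C/L. The numbers are illustrative.
def should_act(p_exceed: float, cost: float, loss: float) -> bool:
    """Cost-loss decision rule: act iff expected avoided loss exceeds cost."""
    return p_exceed > cost / loss

# A cautious user (cheap mitigation, large loss) acts at low probabilities;
# a tolerant user waits for higher forecast confidence.
print(should_act(0.3, cost=1.0, loss=10.0))   # True: 0.3 > 0.1
print(should_act(0.3, cost=6.0, loss=10.0))   # False: 0.3 < 0.6
```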
On the Influence of North Pacific Sea Surface Temperature on the Arctic Winter Climate
NASA Technical Reports Server (NTRS)
Hurwitz, Margaret M.; Newman, P. A.; Garfinkel, C. I.
2012-01-01
Differences between two ensembles of Goddard Earth Observing System Chemistry-Climate Model simulations isolate the impact of North Pacific sea surface temperatures (SSTs) on the Arctic winter climate. One ensemble of extended winter season forecasts is forced by unusually high SSTs in the North Pacific, while in the second ensemble SSTs in the North Pacific are unusually low. High-minus-Low differences are consistent with a weakened Western Pacific atmospheric teleconnection pattern and, in particular, a weakening of the Aleutian low. This relative change in tropospheric circulation inhibits planetary wave propagation into the stratosphere, in turn reducing polar stratospheric temperature in mid- and late winter. The number of winters with sudden stratospheric warmings is approximately tripled in the Low ensemble as compared with the High ensemble. Enhanced North Pacific SSTs, and thus a more stable and persistent Arctic vortex, lead to a relative decrease in lower stratospheric ozone in late winter, affecting the April clear-sky UV index at Northern Hemisphere mid-latitudes.
Dual Roles for Spike Signaling in Cortical Neural Populations
Ballard, Dana H.; Jehee, Janneke F. M.
2011-01-01
A prominent feature of signaling in cortical neurons is that of randomness in the action potential. The output of a typical pyramidal cell can be well fit with a Poisson model, and variations in the Poisson rate repeatedly have been shown to be correlated with stimuli. However, while the rate provides a very useful characterization of neural spike data, it may not be the most fundamental description of the signaling code. Recent data showing γ-frequency-range multi-cell action potential correlations, together with spike-timing-dependent plasticity, are spurring a re-examination of the classical model, since precise timing codes imply that the generation of spikes is essentially deterministic. Could the observed Poisson randomness and timing determinism reflect two separate modes of communication, or do they somehow derive from a single process? We investigate in a timing-based model whether the apparent incompatibility between these probabilistic and deterministic observations may be resolved by examining how spikes could be used in the underlying neural circuits. The crucial component of this model draws on dual roles for spike signaling. In learning receptive fields from ensembles of inputs, spikes need to behave probabilistically, whereas for fast signaling of individual stimuli, the spikes need to behave deterministically. Our simulations show that this combination is possible if deterministic signals using γ latency coding are probabilistically routed through different members of a cortical cell population at different times. This model exhibits standard features characteristic of Poisson models such as orientation tuning and exponential interval histograms. In addition, it makes testable predictions that follow from the γ latency coding. PMID:21687798
Quantifying Uncertainty in Projections of Stratospheric Ozone Over the 21st Century
NASA Technical Reports Server (NTRS)
Charlton-Perez, A. J.; Hawkins, E.; Eyring, V.; Cionni, I.; Bodeker, G. E.; Kinnison, D. E.; Akiyoshi, H.; Frith, S. M.; Garcia, R.; Gettelman, A.;
2010-01-01
Future stratospheric ozone concentrations will be determined both by changes in the concentration of ozone depleting substances (ODSs) and by changes in stratospheric and tropospheric climate, including those caused by changes in anthropogenic greenhouse gases (GHGs). Since future economic development pathways and resultant emissions of GHGs are uncertain, anthropogenic climate change could be a significant source of uncertainty for future projections of stratospheric ozone. In this pilot study, using an ensemble of opportunity of chemistry-climate model (CCM) simulations, the contribution of scenario uncertainty from different plausible emissions pathways for ODSs and GHGs to future ozone projections is quantified relative to the contribution from model uncertainty and internal variability of the chemistry-climate system. For both the global, annual mean ozone concentration and for ozone in specific geographical regions, differences between CCMs are the dominant source of uncertainty for the first two-thirds of the 21st century, up to and after the time when ozone concentrations return to 1980 values. In the last third of the 21st century, dependent upon the set of greenhouse gas scenarios used, scenario uncertainty can be the dominant contributor. This result suggests that investment in chemistry-climate modelling is likely to continue to refine projections of stratospheric ozone and estimates of the return of stratospheric ozone concentrations to pre-1980 levels.
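The partition of projection uncertainty described above can be sketched numerically. In the spirit of such decompositions, model uncertainty at each time is the variance across models of the scenario-mean projections, and scenario uncertainty is the variance across scenarios of the model-mean projections. All data below are synthetic:

```python
import numpy as np

# Rough sketch of an uncertainty partition for an array of projections
# proj[model, scenario, year]: model spread vs. scenario spread per year.
# The projections are synthetic, loosely mimicking an ozone recovery
# curve whose scenarios diverge only later in the century.
rng = np.random.default_rng(2)
n_models, n_scen, n_years = 6, 3, 90
base = np.linspace(-5.0, 3.0, n_years)                       # recovery curve
model_off = rng.normal(0, 1.0, (n_models, 1, 1))             # inter-model offsets
scen_off = np.array([0.0, 0.5, 1.0])[None, :, None] * np.linspace(0, 1, n_years)
proj = base + model_off + scen_off + rng.normal(0, 0.2, (n_models, n_scen, n_years))

model_unc = proj.mean(axis=1).var(axis=0)    # variance across models, per year
scen_unc = proj.mean(axis=0).var(axis=0)     # variance across scenarios, per year
print(model_unc[0] > scen_unc[0])            # early in the record: model-dominated
```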
Quasi-most unstable modes: a window to 'À la carte' ensemble diversity?
NASA Astrophysics Data System (ADS)
Homar Santaner, Victor; Stensrud, David J.
2010-05-01
The atmospheric scientific community is nowadays facing the ambitious challenge of providing useful forecasts of atmospheric events that produce high societal impact. The low level of social resilience to false alarms creates tremendous pressure on forecasting offices to issue accurate, timely and reliable warnings. Currently, no operational numerical forecasting system is able to respond to the societal demand for high-resolution (in time and space) predictions in the 12-72h time span. The main reasons for such deficiencies are the lack of adequate observations and the high non-linearity of the numerical models that are currently used. The whole weather forecasting problem is intrinsically probabilistic and current methods aim at coping with the various sources of uncertainty and the error propagation throughout the forecasting system. This probabilistic perspective is often created by generating ensembles of deterministic predictions that are aimed at sampling the most important sources of uncertainty in the forecasting system. The ensemble generation/sampling strategy is a crucial aspect of their performance and various methods have been proposed. Although global forecasting offices have been using ensembles of perturbed initial conditions for medium-range operational forecasts since 1994, no consensus exists regarding the optimum sampling strategy for high-resolution short-range ensemble forecasts. Bred vectors, however, have been hypothesized to better capture the growing modes in the highly nonlinear mesoscale dynamics of severe episodes than singular vectors or observation perturbations. Yet even this technique is not able to produce enough diversity in the ensembles to accurately and routinely predict extreme phenomena such as severe weather. Thus, we propose a new method to generate ensembles of initial condition perturbations that is based on the breeding technique.
Given a standard bred mode, a set of customized perturbations is derived with specified amplitudes and horizontal scales. This allows the ensemble to excite growing modes across a wider range of scales. Results show that this approach produces significantly more spread in the ensemble prediction than standard bred modes alone. Several examples that illustrate the benefits from this approach for severe weather forecasts will be provided.
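The breeding technique underlying the proposal can be illustrated on a toy chaotic system. The sketch below runs a standard breeding cycle on Lorenz-63 (not the authors' mesoscale setup): a control and a perturbed integration are advanced together, and their difference is periodically rescaled to a fixed amplitude so that it aligns with the growing modes.

```python
import numpy as np

# Minimal breeding-cycle sketch on the Lorenz-63 system, illustrative of
# the bred-vector idea only. Amplitudes and cycle lengths are arbitrary.
def lorenz_step(x, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    dx = np.array([s * (x[1] - x[0]),
                   x[0] * (r - x[2]) - x[1],
                   x[0] * x[1] - b * x[2]])
    return x + dt * dx  # forward Euler is adequate for a sketch

rng = np.random.default_rng(3)
ctrl = np.array([1.0, 1.0, 20.0])
amp = 1e-3                                        # fixed breeding amplitude
pert = ctrl + amp * rng.normal(size=3)
growths = []
for _ in range(50):                               # 50 breeding cycles
    for _ in range(40):                           # integrate both runs
        ctrl, pert = lorenz_step(ctrl), lorenz_step(pert)
    bv = pert - ctrl                              # raw bred vector
    growths.append(np.linalg.norm(bv) / amp)
    pert = ctrl + amp * bv / np.linalg.norm(bv)   # rescale to fixed amplitude

gm = float(np.exp(np.mean(np.log(growths))))      # mean growth factor per cycle
print(gm > 1.0)                                   # bred vectors grow on average
```

The customized perturbations proposed above would, in this picture, take such a bred vector and rescale it to several specified amplitudes and horizontal scales rather than a single fixed one.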
Preservation of physical properties with Ensemble-type Kalman Filter Algorithms
NASA Astrophysics Data System (ADS)
Janjic, T.
2017-12-01
We show the behavior of the localized Ensemble Kalman filter (EnKF) with respect to preservation of positivity and conservation of mass, energy and enstrophy in toy models that conserve these properties. In order to preserve physical properties in the analysis as well as to deal with the non-Gaussianity in an EnKF framework, Janjic et al. 2014 proposed the use of physically based constraints in the analysis step to constrain the solution. In particular, constraints were used to ensure that the ensemble members and the ensemble mean conserve mass and remain nonnegative through measurement updates. In that study, mass and positivity were both preserved by formulating the filter update as a set of quadratic programming problems that incorporate nonnegativity constraints. Simple numerical experiments indicated that this approach can have a significant positive impact on the posterior ensemble distribution, giving results that were more physically plausible both for individual ensemble members and for the ensemble mean. Moreover, in experiments designed to mimic the most important characteristics of convective motion, it is shown that the mass-conservation- and positivity-constrained rain significantly suppresses the noise seen in localized EnKF results. This is highly desirable in order to avoid spurious storms appearing in the forecast starting from this initial condition (Lange and Craig 2014). In addition, the root mean square error is reduced for all fields and the total mass of rain is correctly simulated. Similarly, the enstrophy, divergence and energy spectra can also be strongly affected by the localization radius, thinning interval and inflation, and depend on the variable that is observed (Zeng and Janjic, 2016). We constructed an ensemble data assimilation algorithm that conserves mass, total energy and enstrophy (Zeng et al., 2017).
With 2D shallow water model experiments, it is found that the conservation of enstrophy within the data assimilation effectively avoids the spurious energy cascade of the rotational part and thereby successfully suppresses the noise generated by the data assimilation algorithm. The 14-day deterministic and ensemble free forecasts, starting from an initial condition enforced by both total energy and enstrophy constraints, produce the best prediction.
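The positivity- and mass-constrained update described above can be illustrated by its simplest ingredient: projecting a raw EnKF analysis onto the feasible set {x >= 0, sum(x) = M}. For this Euclidean projection the quadratic program has a water-filling solution, sketched here on a synthetic 1-D "rain" field:

```python
import numpy as np

# Hedged sketch of the constrained update (in the spirit of Janjic et al.
# 2014): project an EnKF member analysis onto nonnegative, mass-conserving
# states. The Euclidean projection onto {x >= 0, sum(x) = M} is
# x_i = max(v_i - tau, 0), with tau found here by bisection.
def project_mass_positive(v, M, iters=100):
    lo, hi = v.min() - M, v.max()               # bracket for tau
    for _ in range(iters):
        tau = 0.5 * (lo + hi)
        s = np.clip(v - tau, 0, None).sum()     # mass at this tau
        lo, hi = (lo, tau) if s < M else (tau, hi)
    return np.clip(v - tau, 0, None)

x_enkf = np.array([0.8, -0.3, 1.5, 0.0, 2.0])   # raw update: negative rain!
x_con = project_mass_positive(x_enkf, M=4.0)    # positive and mass-conserving
print(np.all(x_con >= 0), round(float(x_con.sum()), 6))
```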
Chance, destiny, and the inner workings of ClpXP.
Russell, Rick; Matouschek, Andreas
2014-07-31
AAA+ proteases are responsible for protein degradation in all branches of life. Using single-molecule and ensemble assays, Cordova et al. investigate how the bacterial protease ClpXP steps through a substrate's polypeptide chain and construct a quantitative kinetic model that recapitulates the interplay between stochastic and deterministic behaviors of ClpXP.
Impact of Ozone Radiative Feedbacks on Global Weather Forecasting
NASA Astrophysics Data System (ADS)
Ivanova, I.; de Grandpré, J.; Rochon, Y. J.; Sitwell, M.
2017-12-01
A coupled Chemical Data Assimilation system for ozone is being developed at Environment and Climate Change Canada (ECCC) with the goals of improving the forecasting of the UV index and the forecasting of air quality with the Global Environmental Multi-scale (GEM) Model for Air quality and Chemistry (MACH). Furthermore, this system provides an opportunity to evaluate the benefit of ozone assimilation for improving weather forecasting with the ECCC Global Deterministic Prediction System (GDPS) for Numerical Weather Prediction (NWP). The present UV index forecasting system uses a statistical approach for evaluating the impact of ozone in clear-sky and cloudy conditions, and the use of real-time ozone analyses and ozone forecasts is highly desirable. Improving air quality forecasting with GEM-MACH further necessitates the development of an integrated dynamical-chemical assimilation system. Upon its completion, real-time ozone analyses and ozone forecasts will also be available for piloting the regional air quality system, and for the computation of ozone heating rates, in replacement of the monthly mean ozone distribution currently used in the GDPS. Experiments with ozone radiative feedbacks were run with the GDPS at 25 km resolution and 84 levels with a lid at 0.1 hPa, and were initialized with ozone analyses that assimilated total ozone columns from the OMI, OMPS, and GOME satellite instruments. The results show that the use of prognostic ozone for the computation of the heating/cooling rates has a significant impact on the temperature distribution throughout the stratosphere and upper troposphere. The impact of ozone assimilation is especially significant in the tropopause region, where ozone heating in the infrared wavelengths is important and the ozone lifetime is relatively long.
The implementation of the ozone radiative feedback in the GDPS requires addressing various issues related to model biases (temperature and humidity) and biases in the equilibrium state (ozone mixing ratio, air temperature and overhead column ozone) used for the calculation of the linearized photochemical production and loss of ozone. Furthermore, the radiative budget in the tropopause region is strongly affected by water vapor cooling, whose impact requires further evaluation for use in chemically coupled operational NWP systems.
NASA Astrophysics Data System (ADS)
Polvani, Lorenzo M.; Abalos, Marta; Garcia, Rolando; Kinnison, Doug; Randel, William J.
2018-01-01
It is well established that increasing greenhouse gases, notably CO2, will cause an acceleration of the stratospheric Brewer-Dobson circulation (BDC) by the end of this century. We here present compelling new evidence that ozone depleting substances are also key drivers of BDC trends. We do so by analyzing and contrasting small ensembles of "single-forcing" integrations with a stratosphere-resolving atmospheric model with interactive chemistry, coupled to fully interactive ocean, land, and sea ice components. First, confirming previous work, we show that increasing concentrations of ozone depleting substances have contributed a large fraction of the BDC trends in the late twentieth century. Second, we show that the phasing out of ozone depleting substances in coming decades, as a consequence of the Montreal Protocol, will cause a considerable reduction in BDC trends until the ozone hole is completely healed, toward the end of the 21st century.
Quantum entanglement at ambient conditions in a macroscopic solid-state spin ensemble
Klimov, Paul V.; Falk, Abram L.; Christle, David J.; Dobrovitski, Viatcheslav V.; Awschalom, David D.
2015-01-01
Entanglement is a key resource for quantum computers, quantum-communication networks, and high-precision sensors. Macroscopic spin ensembles have been historically important in the development of quantum algorithms for these prospective technologies and remain strong candidates for implementing them today. This strength derives from their long-lived quantum coherence, strong signal, and ability to couple collectively to external degrees of freedom. Nonetheless, preparing ensembles of genuinely entangled spin states has required high magnetic fields and cryogenic temperatures or photochemical reactions. We demonstrate that entanglement can be realized in solid-state spin ensembles at ambient conditions. We use hybrid registers comprising electron-nuclear spin pairs that are localized at color-center defects in a commercial SiC wafer. We optically initialize 10^3 identical registers in a 40-μm^3 volume (with 0.95 (+0.05/-0.07) fidelity) and deterministically prepare them into the maximally entangled Bell states (with 0.88 ± 0.07 fidelity). To verify entanglement, we develop a register-specific quantum-state tomography protocol. The entanglement of a macroscopic solid-state spin ensemble at ambient conditions represents an important step toward practical quantum technology. PMID:26702444
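The reported Bell-state fidelity has a compact meaning that is easy to check numerically: F = <Phi|rho|Phi>, and F > 0.5 witnesses entanglement with the target Bell state. The density matrix below (a Bell state mixed with white noise) is illustrative only, not the registers' measured state:

```python
import numpy as np

# Numerical sketch of a Bell-state fidelity: F = <Phi+|rho|Phi+> for
# rho = p|Phi+><Phi+| + (1-p) I/4, a Bell state mixed with white noise.
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)                  # |Phi+> = (|00> + |11>)/sqrt(2)
p = 0.88                                          # Bell-state weight (illustrative)
rho = p * np.outer(phi, phi) + (1 - p) * np.eye(4) / 4

F = float(phi @ rho @ phi)                        # fidelity with |Phi+>
print(round(F, 3))                                # 0.88 + 0.12/4 = 0.91
```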
Using ensemble rainfall predictions in a countrywide flood forecasting model in Scotland
NASA Astrophysics Data System (ADS)
Cranston, M. D.; Maxey, R.; Tavendale, A. C. W.; Buchanan, P.
2012-04-01
Improving flood predictions for all sources of flooding is at the centre of flood risk management policy in Scotland. With the introduction of the Flood Risk Management (Scotland) Act providing a new statutory basis for SEPA's flood warning responsibilities, the pressures on delivering hydrological science developments in support of this legislation have increased. Specifically, flood forecasting capabilities need to develop in support of the need to reduce the impact of flooding through the provision of actively disseminated, reliable and timely flood warnings. Flood forecasting in Scotland has developed significantly in recent years (Cranston and Tavendale, 2012). The development of hydrological models to predict flooding at a catchment scale has relied upon the application of rainfall-runoff models utilising raingauge, radar and quantitative precipitation forecasts at short lead times (less than 6 hours). Single or deterministic forecasts based on highly uncertain rainfall predictions have led to the greatest operational difficulties when communicating flood risk with emergency responders; therefore the emergence of probability-based estimates offers the greatest opportunity for managing uncertain predictions. This paper presents the operational application of a physical-conceptual distributed hydrological model on a countrywide basis across Scotland. Developed by CEH Wallingford for SEPA in 2011, Grid-to-Grid (G2G) principally runs in deterministic mode and employs radar and raingauge estimates of rainfall together with weather model predictions to produce forecast river flows, as gridded time-series at a resolution of 1 km and for up to 5 days ahead (Cranston, et al., 2012). However, the G2G model is now being run operationally using ensemble predictions of rainfall from the MOGREPS-R system to provide probabilistic flood forecasts.
By presenting a range of flood predictions on a national scale through this approach, hydrologists are now able to consider an objective measure of the likelihood of flooding impacts to help with risk based emergency communication.
NASA Astrophysics Data System (ADS)
Grossi, Giovanna; Caronna, Paolo; Ranzi, Roberto
2014-05-01
Within the framework of risk communication, the goal of an early warning system is to support the interaction between technicians and authorities (and subsequently the population) as a prevention measure. The methodology proposed in the KULTURisk FP7 project aimed to build a closer collaboration between these actors, in the perspective of promoting pro-active actions to mitigate the effects of flood hazards. The transnational (Slovenia/Italy) Soča/Isonzo case study focused on this concept of cooperation between stakeholders and hydrological forecasters. The DIMOSHONG_VIP hydrological model was calibrated for the Vipava/Vipacco River (650 km2), a tributary of the Soča/Isonzo River, on the basis of flood events that occurred between 1998 and 2012. The European Centre for Medium-Range Weather Forecasts (ECMWF) provided the past meteorological forecasts, both deterministic (1 forecast) and probabilistic (51 ensemble members). The resolution of the ECMWF grid is currently about 15 km (Deterministic-DET) and 30 km (Ensemble Prediction System-EPS). A verification was conducted to validate the flood-forecast outputs of the DIMOSHONG_VIP+ECMWF early warning system. Basic descriptive statistics, like event probability, probability of a forecast occurrence and frequency bias, were determined. Some performance measures were calculated, such as hit rate (probability of detection) and false alarm rate (probability of false detection). Relative Operating Characteristic (ROC) curves were generated both for deterministic and probabilistic forecasts. These analyses showed a good performance of the early warning system, given the small size of the sample. Particular attention was paid to the design of flood-forecasting output charts, involving and inquiring stakeholders (Alto Adriatico River Basin Authority), hydrology specialists in the field, and common people. Graph types for both forecasted precipitation and discharge were set.
Three different risk thresholds were identified ("attention", "pre-alarm" or "alert", "alarm"), with an "icon-style" representation, suitable for communication to civil protection stakeholders or the public. Aiming at showing probabilistic representations in a "user-friendly" way, we opted for the visualization of the single deterministic forecasted hydrograph together with the 5%, 25%, 50%, 75% and 95% percentile bands of the Hydrological Ensemble Prediction System (HEPS). HEPS is generally used for 3-5 day hydrological forecasts, where the error due to incorrect initial data is comparable to the error due to the lower resolution with respect to the deterministic forecast. In short-term forecasting (12-48 hours) the HEPS members show, as expected, a similar tendency; in this case, considering its higher resolution, the deterministic forecast is expected to be more effective. The plot of different forecasts in the same chart allows the use of model outputs from 4-5 days to a few hours before a potential flood event. This framework was built to help stakeholders, such as a mayor or a civil protection authority, in flood control and management operations, and was designed to be included in a wider decision support system.
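The verification measures named above (hit rate, false alarm rate, frequency bias) all derive from a 2x2 contingency table of forecast versus observed threshold exceedances. A minimal sketch with synthetic counts:

```python
# Sketch of the standard verification measures from a 2x2 contingency
# table of forecast/observed flood-threshold exceedances. Counts are
# synthetic, not from the Vipava/Vipacco study.
hits, misses, false_alarms, corr_neg = 18, 4, 6, 152

H = hits / (hits + misses)                       # hit rate (prob. of detection)
F = false_alarms / (false_alarms + corr_neg)     # false alarm rate
B = (hits + false_alarms) / (hits + misses)      # frequency bias
print(round(H, 3), round(F, 3), round(B, 3))     # 0.818 0.038 1.091
```

Sweeping a probability threshold over the 51-member ensemble and recomputing (H, F) at each threshold traces out the ROC curve mentioned in the abstract.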
Watershed scale response to climate change--Trout Lake Basin, Wisconsin
Walker, John F.; Hunt, Randall J.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Trout River Basin at Trout Lake in northern Wisconsin.
Watershed scale response to climate change--Clear Creek Basin, Iowa
Christiansen, Daniel E.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Clear Creek Basin, near Coralville, Iowa.
Watershed scale response to climate change--Feather River Basin, California
Koczot, Kathryn M.; Markstrom, Steven L.; Hay, Lauren E.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Feather River Basin, California.
Watershed scale response to climate change--South Fork Flathead River Basin, Montana
Chase, Katherine J.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the South Fork Flathead River Basin, Montana.
Watershed scale response to climate change--Cathance Stream Basin, Maine
Dudley, Robert W.; Hay, Lauren E.; Markstrom, Steven L.; Hodgkins, Glenn A.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Cathance Stream Basin, Maine.
Watershed scale response to climate change--Pomperaug River Watershed, Connecticut
Bjerklie, David M.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Pomperaug River Basin at Southbury, Connecticut.
Watershed scale response to climate change--Starkweather Coulee Basin, North Dakota
Vining, Kevin C.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Starkweather Coulee Basin near Webster, North Dakota.
Watershed scale response to climate change--Sagehen Creek Basin, California
Markstrom, Steven L.; Hay, Lauren E.; Regan, R. Steven
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Sagehen Creek Basin near Truckee, California.
Watershed scale response to climate change--Sprague River Basin, Oregon
Risley, John; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Sprague River Basin near Chiloquin, Oregon.
Watershed scale response to climate change--Black Earth Creek Basin, Wisconsin
Hunt, Randall J.; Walker, John F.; Westenbroek, Steven M.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Black Earth Creek Basin, Wisconsin.
Watershed scale response to climate change--East River Basin, Colorado
Battaglin, William A.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the East River Basin, Colorado.
Watershed scale response to climate change--Naches River Basin, Washington
Mastin, Mark C.; Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Naches River Basin below Tieton River in Washington.
Watershed scale response to climate change--Flint River Basin, Georgia
Hay, Lauren E.; Markstrom, Steven L.
2012-01-01
Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. Precipitation Runoff Modeling System is a deterministic, distributed parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios were used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Flint River Basin at Montezuma, Georgia.
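The scenario-ensemble construction described in these fact sheets (five General Circulation Models crossed with four emission scenarios, each driving a calibrated watershed model) can be sketched as follows. The GCM/scenario labels, the forcing perturbations, and the toy streamflow response are illustrative assumptions, not PRMS output.

```python
from itertools import product
from statistics import mean

# Hypothetical labels standing in for the five GCMs and four emission scenarios.
gcms = ["GCM-A", "GCM-B", "GCM-C", "GCM-D", "GCM-E"]
scenarios = ["SCEN-1", "SCEN-2", "SCEN-3", "SCEN-4"]

def toy_streamflow_change(dT, dP):
    """Toy stand-in for a calibrated watershed model: maps a (temperature change,
    precipitation change) forcing to a fractional change in mean annual streamflow."""
    return 1.8 * dP - 0.05 * dT  # wetter -> more flow; warmer -> more ET, less flow

# Illustrative forcing perturbations per (GCM, scenario) member.
forcing = {(g, s): (1.0 + 0.3 * i + 0.2 * j, 0.05 * (i - 2))
           for i, g in enumerate(gcms) for j, s in enumerate(scenarios)}

# The 5 x 4 = 20 members form the climate-change ensemble for one basin.
ensemble = {m: toy_streamflow_change(*forcing[m]) for m in product(gcms, scenarios)}
changes = list(ensemble.values())
print(f"{len(ensemble)} members; mean change {mean(changes):+.2f}, "
      f"range [{min(changes):+.2f}, {max(changes):+.2f}]")
```

The spread across members, rather than any single run, is what the fact sheets summarize as the sensitivity of each basin to climate change.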
NASA Astrophysics Data System (ADS)
Shedd, R.; Reed, S. M.; Porter, J. H.
2015-12-01
The National Weather Service (NWS) has been working for several years on the development of the Hydrologic Ensemble Forecast System (HEFS). The objective of HEFS is to provide ensemble river forecasts incorporating the best precipitation and temperature forcings at any specific time horizon. For the current implementation, this includes the Global Ensemble Forecast System (GEFS) and the Climate Forecast System (CFSv2). One of the core partners that has been working with the NWS since the beginning of the development phase of HEFS is the New York City Department of Environmental Protection (NYCDEP) which is responsible for the complex water supply system for New York City. The water supply system involves a network of reservoirs in both the Delaware and Hudson River basins. At the same time that the NWS was developing HEFS, NYCDEP was working on enhancing the operations of their water supply reservoirs through the development of a new Operations Support Tool (OST). OST is designed to guide reservoir system operations to ensure an adequate supply of high-quality drinking water for the city, as well as to meet secondary objectives for reaches downstream of the reservoirs assuming the primary water supply goals can be met. These secondary objectives include fisheries and ecosystem support, enhanced peak flow attenuation beyond that provided natively by the reservoirs, salt front management, and water supply for other cities. Since January 2014, the NWS Northeast and Middle Atlantic River Forecast Centers have provided daily one year forecasts from HEFS to NYCDEP. OST ingests these forecasts, couples them with near-real-time environmental and reservoir system data, and drives models of the water supply system. The input of ensemble forecasts results in an ensemble of model output, from which information on the range and likelihood of possible future system states can be extracted. 
This type of probabilistic information provides system managers with information not available from deterministic forecasts: it allows managers to better assess risk and provides greater context for decision-making than has been available in the past. HEFS has allowed NYCDEP water supply managers to make better decisions on reservoir operations than would have been possible using only deterministic forecasts.
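The coupling of ensemble forecasts to a water-supply model, as in OST, can be illustrated with a minimal sketch: an ensemble of inflow traces (synthetic here, standing in for HEFS members) is run through a toy reservoir mass balance, and the resulting output ensemble yields probabilities of future system states. The reservoir model, member count, and all numbers are illustrative assumptions.

```python
import random

def simulate_reservoir(inflows, storage0=100.0, demand=8.0, capacity=150.0):
    """Toy daily reservoir mass balance: storage is updated by inflow minus demand
    and clipped to [0, capacity]. Returns the storage trace."""
    s = storage0
    trace = []
    for q in inflows:
        s = min(max(s + q - demand, 0.0), capacity)
        trace.append(s)
    return trace

random.seed(1)
# Hypothetical 30-member ensemble of 90-day inflow forecasts (arbitrary units).
ensemble = [[max(0.0, random.gauss(7.5, 3.0)) for _ in range(90)] for _ in range(30)]

# Each member drives one model run; the outputs form an ensemble of system states.
final_storages = [simulate_reservoir(m)[-1] for m in ensemble]
p_low = sum(1 for s in final_storages if s < 50.0) / len(final_storages)
print(f"P(storage < 50 at day 90) = {p_low:.2f}")
```

Exceedance probabilities like `p_low` are the kind of risk information a deterministic forecast cannot provide.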
Synchrony and entrainment properties of robust circadian oscillators
Bagheri, Neda; Taylor, Stephanie R.; Meeker, Kirsten; Petzold, Linda R.; Doyle, Francis J.
2008-01-01
Systems theoretic tools (i.e. mathematical modelling, control, and feedback design) advance the understanding of robust performance in complex biological networks. We highlight phase entrainment as a key performance measure used to investigate dynamics of a single deterministic circadian oscillator for the purpose of generating insight into the behaviour of a population of (synchronized) oscillators. More specifically, the analysis of phase characteristics may facilitate the identification of appropriate coupling mechanisms for the ensemble of noisy (stochastic) circadian clocks. Phase also serves as a critical control objective to correct mismatch between the biological clock and its environment. Thus, we introduce methods of investigating synchrony and entrainment in both stochastic and deterministic frameworks, and as a property of a single oscillator or population of coupled oscillators. PMID:18426774
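As a hedged illustration of synchrony in a population of coupled oscillators, a generic Kuramoto phase model (not the specific circadian models of the paper) shows the phase-coherence order parameter growing once coupling exceeds the natural-frequency spread. All parameters are illustrative.

```python
import cmath
import math
import random

def kuramoto_step(phases, omegas, K, dt):
    """One explicit-Euler step of the Kuramoto model with uniform coupling K."""
    n = len(phases)
    new = []
    for th, w in zip(phases, omegas):
        coupling = K / n * sum(math.sin(tj - th) for tj in phases)
        new.append((th + (w + coupling) * dt) % (2 * math.pi))
    return new

def order_parameter(phases):
    """|r| = 1 means perfect phase synchrony, |r| ~ 0 means incoherence."""
    return abs(sum(cmath.exp(1j * th) for th in phases) / len(phases))

random.seed(0)
n = 50
phases = [random.uniform(0, 2 * math.pi) for _ in range(n)]
omegas = [random.gauss(2 * math.pi / 24, 0.05) for _ in range(n)]  # ~24 h clocks

r0 = order_parameter(phases)              # incoherent initial population
for _ in range(2000):
    phases = kuramoto_step(phases, omegas, K=0.5, dt=0.1)
r1 = order_parameter(phases)              # coupled population after 200 time units
```

With the chosen coupling well above the critical value implied by the frequency spread, the population locks and `r1` approaches 1.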
On the skill of various ensemble spread estimators for probabilistic short range wind forecasting
NASA Astrophysics Data System (ADS)
Kann, A.
2012-05-01
A variety of applications, ranging from civil protection associated with severe weather to economic interests, are heavily dependent on meteorological information. For example, precise planning of the energy supply with a high share of renewables requires detailed meteorological information at high temporal and spatial resolution. With respect to wind power, detailed analyses and forecasts of wind speed are of crucial interest for energy management. Although the applicability and current skill of state-of-the-art probabilistic short-range forecasts have increased during the last years, ensemble systems still show systematic deficiencies which limit their practical use. This paper presents methods to improve the ensemble skill of 10-m wind speed forecasts by combining deterministic information from a nowcasting system at very high horizontal resolution with uncertainty estimates from a limited area ensemble system. It is shown for a one-month validation period that a statistical post-processing procedure (a modified non-homogeneous Gaussian regression) adds further skill to the probabilistic forecasts, especially beyond the nowcasting range after +6 h.
A review of multimodel superensemble forecasting for weather, seasonal climate, and hurricanes
NASA Astrophysics Data System (ADS)
Krishnamurti, T. N.; Kumar, V.; Simon, A.; Bhardwaj, A.; Ghosh, T.; Ross, R.
2016-06-01
This review provides a summary of work in the area of ensemble forecasts for weather, climate, oceans, and hurricanes. This includes a combination of multiple forecast model results that does not dwell on the ensemble mean but uses a unique collective bias reduction procedure. A theoretical framework for this procedure is provided, utilizing a suite of models constructed from the well-known Lorenz low-order nonlinear system. A tutorial, including a walk-through table, illustrates the inner workings of the multimodel superensemble's principle. Systematic errors in a single deterministic model arise from a host of features, ranging from the model's initial state (data assimilation), resolution, representation of physics, dynamics, and ocean processes to local aspects of orography, water bodies, and details of the land surface. Models, in their diversity of representation of such features, end up leaving unique signatures of systematic errors. The multimodel superensemble utilizes as many as 10 million weights to take into account the bias errors arising from these diverse features of the multimodels. The design of a single deterministic forecast model that utilizes the information carried by this large volume of weights is provided here. This has led to a better understanding of error growth and of the collective bias reductions for several of the physical parameterizations within diverse models, such as cumulus convection, planetary boundary layer physics, and radiative transfer. A number of examples of the weather, seasonal climate, hurricane, and subsurface oceanic forecast skills of the member models, the ensemble mean, and the superensemble are provided.
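A minimal sketch of the superensemble idea: regression weights are fitted to member-forecast anomalies over a training period and compared against the plain ensemble mean. The training data, member biases, and scale errors are synthetic assumptions, and the operational scheme uses vastly more weights than this toy regression.

```python
import numpy as np

rng = np.random.default_rng(42)
T = 200  # training days
truth = 10 + 5 * np.sin(np.linspace(0, 8 * np.pi, T))

# Hypothetical member models, each with its own scale error and bias plus noise.
members = np.stack([truth * s + b + rng.normal(0, 1.0, T)
                    for s, b in [(1.1, 2.0), (0.9, -3.0), (1.0, 5.0)]], axis=1)

# Superensemble-style training: regress observed anomalies on member anomalies,
# so both the collective bias and the relative member skill are accounted for.
obs_mean = truth.mean()
mem_mean = members.mean(axis=0)
w, *_ = np.linalg.lstsq(members - mem_mean, truth - obs_mean, rcond=None)

super_fc = obs_mean + (members - mem_mean) @ w   # weighted, bias-corrected forecast
ens_mean = members.mean(axis=1)                  # plain unconditional ensemble mean

def rmse(f):
    return float(np.sqrt(np.mean((f - truth) ** 2)))
```

Because the plain mean inherits the members' collective bias while the regression removes it, `rmse(super_fc)` comes out below `rmse(ens_mean)` on this toy training set.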
NASA Astrophysics Data System (ADS)
Saleh, Firas; Ramaswamy, Venkatsundar; Georgas, Nickitas; Blumberg, Alan F.; Pullen, Julie
2016-07-01
This paper investigates the uncertainties in hourly streamflow ensemble forecasts for an extreme hydrological event using a hydrological model forced with short-range ensemble weather prediction models. A state-of-the-art, automated, short-term hydrologic prediction framework was implemented using GIS and a regional-scale hydrological model (HEC-HMS). The hydrologic framework was applied to the Hudson River basin (~36,000 km2) in the United States using gridded precipitation data from the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) and was validated against streamflow observations from the United States Geological Survey (USGS). Finally, 21 precipitation ensemble members of the latest Global Ensemble Forecast System (GEFS/R) were used to force HEC-HMS and generate a retrospective streamflow ensemble forecast for an extreme hydrological event, Hurricane Irene. The work shows that ensemble stream discharge forecasts provide improved predictions and useful information about associated uncertainties, thus improving the assessment of risks when compared with deterministic forecasts. The uncertainties in weather inputs may result in false warnings and missed river flooding events, reducing the potential to effectively mitigate flood damage. The findings demonstrate how errors in the ensemble median streamflow forecast and time of peak, as well as the ensemble spread (uncertainty), are reduced 48 h pre-event by utilizing the ensemble framework. The methodology and implications of this work benefit efforts of short-term streamflow forecasting at regional scales, notably regarding the peak timing of an extreme hydrologic event when combined with a flood threshold exceedance diagram. Although the modeling framework was implemented on the Hudson River basin, it is flexible and applicable in other parts of the world where atmospheric reanalysis products and streamflow data are available.
Trends in the predictive performance of raw ensemble weather forecasts
NASA Astrophysics Data System (ADS)
Hemri, Stephan; Scheuerer, Michael; Pappenberger, Florian; Bogner, Konrad; Haiden, Thomas
2015-04-01
Over the last two decades the paradigm in weather forecasting has shifted from being deterministic to probabilistic. Accordingly, numerical weather prediction (NWP) models have been run increasingly as ensemble forecasting systems. The goal of such ensemble forecasts is to approximate the forecast probability distribution by a finite sample of scenarios. Global ensemble forecast systems, like the European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble, are prone to probabilistic biases, and are therefore not reliable. They particularly tend to be underdispersive for surface weather parameters. Hence, statistical post-processing is required in order to obtain reliable and sharp forecasts. In this study we apply statistical post-processing to ensemble forecasts of near-surface temperature, 24-hour precipitation totals, and near-surface wind speed from the global ECMWF model. Our main objective is to evaluate the evolution of the difference in skill between the raw ensemble and the post-processed forecasts. The ECMWF ensemble is under continuous development, and hence its forecast skill improves over time. Parts of these improvements may be due to a reduction of probabilistic bias. Thus, we first hypothesize that the gain by post-processing decreases over time. Based on ECMWF forecasts from January 2002 to March 2014 and corresponding observations from globally distributed stations we generate post-processed forecasts by ensemble model output statistics (EMOS) for each station and variable. Parameter estimates are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over rolling training periods that consist of the n days preceding the initialization dates. Given the higher average skill in terms of CRPS of the post-processed forecasts for all three variables, we analyze the evolution of the difference in skill between raw ensemble and EMOS forecasts. 
The fact that the gap in skill remains almost constant over time, especially for near-surface wind speed, suggests that improvements to the atmospheric model have an effect quite different from what calibration by statistical post-processing is doing. That is, they are increasing potential skill. Thus this study indicates that (a) further model development is important even if one is just interested in point forecasts, and (b) statistical post-processing is important because it will keep adding skill in the foreseeable future.
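A hedged sketch of EMOS-style post-processing: a Gaussian predictive distribution N(a + b*(ensemble mean), c + d*(ensemble variance)) is fitted by minimizing the closed-form Gaussian CRPS. Here the mean coefficients come from ordinary least squares and the variance coefficients from a coarse grid search rather than full numerical optimization, and the synthetic raw ensemble is deliberately biased and underdispersive; everything numeric is an illustrative assumption.

```python
import math
import random

def crps_gaussian(mu, sigma, y):
    """Closed-form CRPS of a normal predictive distribution evaluated at observation y."""
    z = (y - mu) / sigma
    pdf = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return sigma * (z * (2 * cdf - 1) + 2 * pdf - 1 / math.sqrt(math.pi))

def crps_ensemble(members, y):
    """Sample CRPS of a raw ensemble: mean|x - y| - 0.5 * mean|x - x'|."""
    n = len(members)
    t1 = sum(abs(x - y) for x in members) / n
    t2 = sum(abs(a - b) for a in members for b in members) / (2 * n * n)
    return t1 - t2

random.seed(7)
cases = []  # (ensemble mean, ensemble variance, raw members, observation)
for _ in range(300):
    signal = random.gauss(15, 4)
    y = random.gauss(signal, 2.0)
    ens = [random.gauss(signal + 1.5, 0.5) for _ in range(10)]  # biased, underdispersive
    m = sum(ens) / len(ens)
    v = sum((x - m) ** 2 for x in ens) / len(ens)
    cases.append((m, v, ens, y))

# Mean part by ordinary least squares: mu = a + b * (ensemble mean).
mx = sum(m for m, *_ in cases) / len(cases)
my = sum(y for *_, y in cases) / len(cases)
b = (sum((m - mx) * (y - my) for m, _, _, y in cases)
     / sum((m - mx) ** 2 for m, *_ in cases))
a = my - b * mx

# Variance part by grid search: sigma^2 = c + d * (ensemble variance).
def mean_crps(c, d):
    return sum(crps_gaussian(a + b * m, math.sqrt(c + d * v), y)
               for m, v, _, y in cases) / len(cases)

best = min(((c / 4, d / 2) for c in range(1, 41) for d in range(0, 11)),
           key=lambda p: mean_crps(*p))
emos_crps = mean_crps(*best)
raw_crps = sum(crps_ensemble(ens, y) for _, _, ens, y in cases) / len(cases)
```

On this training set the post-processed CRPS beats the raw-ensemble CRPS, mirroring the skill gap the study measures.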
A Two-Timescale Response to Ozone Depletion: Importance of the Background State
NASA Astrophysics Data System (ADS)
Seviour, W.; Waugh, D.; Gnanadesikan, A.
2015-12-01
It has been recently suggested that the response of Southern Ocean sea-ice extent to stratospheric ozone depletion is time-dependent; that the ocean surface initially cools due to enhanced northward Ekman drift caused by a poleward shift in the eddy-driven jet, and then warms after some time due to upwelling of warm waters from below the mixed layer. It is therefore possible that ozone depletion could act to favor a short-term increase in sea-ice extent. However, many uncertainties remain in understanding this mechanism, with different models showing widely differing time-scales and magnitudes of the response. Here, we analyze an ensemble of coupled model simulations with a step-function ozone perturbation. The two-timescale response is present with an approximately 30 year initial cooling period. The response is further shown to be highly dependent upon the background ocean temperature and salinity stratification, which is influenced by both natural internal variability and the isopycnal eddy mixing parameterization. It is suggested that the majority of inter-model differences in the Southern Ocean response to ozone depletion is caused by differences in stratification.
Using Analog Ensemble to generate spatially downscaled probabilistic wind power forecasts
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Shahriari, M.; Cervone, G.
2017-12-01
We use the Analog Ensemble (AnEn) method to generate probabilistic 80-m wind power forecasts. We use data from the NCEP GFS (~28 km resolution) and NCEP NAM (12 km resolution). We use forecast data from NAM and GFS, and analysis data from NAM, which enables us to: 1) use a lower-resolution model to create higher-resolution forecasts, and 2) use a higher-resolution model to create higher-resolution forecasts. The former essentially increases computing speed and the latter increases forecast accuracy. An aggregated model of the former can be compared against the latter to measure the accuracy of the AnEn spatial downscaling. The AnEn works by taking a deterministic future forecast and comparing it with past forecasts. The model searches for the best-matching estimates within the past forecasts and selects the predictand values corresponding to these past forecasts as the ensemble prediction for the future forecast. Our study is based on predicting wind speed and air density at more than 13,000 grid points in the continental US. We run the AnEn model twice: 1) estimating 80-m wind speed by using predictor variables such as temperature, pressure, geopotential height, and the U-component and V-component of wind, and 2) estimating air density by using predictors such as temperature, pressure, and relative humidity. We use the air density values to correct the standard wind power curves for different values of air density. The standard deviation of the ensemble members (i.e. the ensemble spread) is used as a measure of how difficult wind power is to predict at different locations; the correlation coefficient between the ensemble spread and the forecast error determines the appropriateness of this measure. This measure is important for wind farm developers, as building wind farms in regions with higher predictability will reduce the real-time risks of operating in the electricity markets.
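The analog search described above can be sketched minimally: rank past forecasts by their (optionally weighted) distance to the current forecast in predictor space, and take the observations paired with the closest matches as the ensemble. The archive, the predictors, and the toy truth relation are synthetic assumptions; the operational AnEn uses a time-windowed metric over many predictors.

```python
import math
import random

def analog_ensemble(fcst, hist_fcsts, hist_obs, n_analogs=10, weights=None):
    """Return the observations paired with the n_analogs past forecasts closest
    to the current forecast `fcst` under a weighted Euclidean predictor distance."""
    weights = weights or [1.0] * len(fcst)
    def dist(h):
        return math.sqrt(sum(w * (a - b) ** 2 for w, a, b in zip(weights, fcst, h)))
    ranked = sorted(range(len(hist_fcsts)), key=lambda i: dist(hist_fcsts[i]))
    return [hist_obs[i] for i in ranked[:n_analogs]]

random.seed(3)
# Hypothetical archive: predictors = (model wind speed, temperature); obs = 80-m wind.
hist_fcsts, hist_obs = [], []
for _ in range(1000):
    w, t = random.uniform(0, 20), random.uniform(-5, 25)
    hist_fcsts.append((w, t))
    hist_obs.append(0.9 * w + random.gauss(0, 0.5))  # toy truth relation

# Current deterministic forecast: wind 12.0, temperature 10.0.
ens = analog_ensemble((12.0, 10.0), hist_fcsts, hist_obs, n_analogs=20)
ens_mean = sum(ens) / len(ens)
ens_spread = math.sqrt(sum((x - ens_mean) ** 2 for x in ens) / len(ens))
```

The spread of `ens` is the uncertainty estimate the study correlates with forecast error.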
Summer drought predictability over Europe: empirical versus dynamical forecasts
NASA Astrophysics Data System (ADS)
Turco, Marco; Ceglar, Andrej; Prodhomme, Chloé; Soret, Albert; Toreti, Andrea; Doblas-Reyes Francisco, J.
2017-08-01
Seasonal climate forecasts could be an important planning tool for farmers, government and insurance companies, leading to better and timely management of seasonal climate risks. However, seasonal climate forecasts are often under-used because potential users are not well aware of the capabilities and limitations of these products. This study aims at assessing the merits and caveats of a statistical empirical method, the ensemble streamflow prediction system (ESP, an ensemble based on reordering historical data), and an operational dynamical forecast system, the European Centre for Medium-Range Weather Forecasts System 4 (S4), in predicting summer drought in Europe. Droughts are defined using the Standardized Precipitation Evapotranspiration Index for the month of August integrated over 6 months. Both systems show useful and mostly comparable deterministic skill. We argue that this source of predictability is mostly attributable to the observed initial conditions. S4 shows higher skill only in its ability to probabilistically identify drought occurrence. Thus, currently, both approaches provide useful information, and ESP represents a computationally fast alternative to dynamical prediction applications for drought prediction.
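The ESP idea, an ensemble built by replaying historical sequences from the current initial state, can be sketched as follows. The precipitation archive, the -40 mm initial-condition anomaly, and the 20th-percentile drought threshold are all illustrative assumptions, not the SPEI-based definition used in the study.

```python
import random

random.seed(11)
# Hypothetical archive: 30 historical years of Mar-Aug monthly precipitation (mm).
archive = [[max(0.0, random.gauss(60, 25)) for _ in range(6)] for _ in range(30)]

# Climatological drought threshold: ~20th percentile of the 6-month totals.
totals = sorted(sum(year) for year in archive)
threshold = totals[len(totals) // 5]

# ESP: each historical sequence, replayed from the current initial state, is one
# equally likely member. A dry initial state (toy -40 mm anomaly carried into the
# totals) shifts the whole forecast distribution toward drought.
members = [sum(year) - 40.0 for year in archive]
p_drought = sum(1 for t in members if t <= threshold) / len(members)
print(f"ESP drought probability: {p_drought:.2f} (climatology ~0.23)")
```

With no initial-condition information the ESP probability would simply reproduce climatology; the dry anomaly raises it, which is exactly the source of skill the study attributes to observed initial conditions.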
NASA Astrophysics Data System (ADS)
Addor, N.; Jaun, S.; Fundel, F.; Zappa, M.
2012-04-01
The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on deterministic (COSMO-7) and probabilistic (COSMO-LEPS) atmospheric forecasts, which are used to force a semi-distributed hydrological model (PREVAH) coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are eventually communicated to the stakeholders involved in the Sihl discharge management. This fully operational setting provides a real framework with which we assessed the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a 31-month reforecast was produced for the Sihl catchment (336 km2). Several metrics support the conclusion that the forecasts provide a performance gain of up to 2 days of lead time for the catchment considered. Brier skill scores show that probabilistic hydrological forecasts outperform their deterministic counterparts for all the lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but makes them particularly dependent on correct precipitation forecasts. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited. We finally highlight challenges for making decisions on the basis of hydrological predictions, and discuss the need for a tool to be used in addition to forecasts to compare the different mitigation actions possible in the Sihl catchment.
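The Brier-skill comparison of probabilistic versus deterministic forecasts can be sketched with synthetic verification data; the event definition, the noise levels, and the 0.5 deterministic cutoff are illustrative assumptions.

```python
import random

random.seed(5)

def brier(probs, outcomes):
    """Brier score: mean squared difference between forecast probability and outcome."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Hypothetical verification set: event = "discharge exceeds a flood threshold".
events, p_ens, p_det = [], [], []
for _ in range(500):
    p_true = random.random() ** 2                 # skewed toward rare events
    o = 1 if random.random() < p_true else 0
    events.append(o)
    # Ensemble-based probability: sharp but noisy estimate of the true probability.
    p_ens.append(min(1.0, max(0.0, p_true + random.gauss(0, 0.1))))
    # Deterministic forecast collapsed to a yes/no probability of 1 or 0.
    p_det.append(1.0 if p_true > 0.5 else 0.0)

base = sum(events) / len(events)                  # climatological base rate
bs_ref = brier([base] * len(events), events)      # reference (climatology) score
bss_ens = 1 - brier(p_ens, events) / bs_ref
bss_det = 1 - brier(p_det, events) / bs_ref
```

Because the deterministic forecast cannot express intermediate probabilities, its Brier skill score falls below the ensemble's, the same pattern the reforecast evaluation reports.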
One-step generation of multipartite entanglement among nitrogen-vacancy center ensembles
Song, Wan-lu; Yin, Zhang-qi; Yang, Wan-li; Zhu, Xiao-bo; Zhou, Fei; Feng, Mang
2015-01-01
We describe a one-step, deterministic and scalable scheme for creating macroscopic arbitrary entangled coherent states (ECSs) of separate nitrogen-vacancy center ensembles (NVEs) that couple to a superconducting flux qubit. We discuss how to generate the entangled states between the flux qubit and two NVEs by resonant driving. The ECSs of the NVEs can then be obtained by projecting the flux qubit, and entanglement detection can be realized by transferring the quantum state from the NVEs to the flux qubit. Our numerical simulation shows that even under current experimental parameters the concurrence of the ECSs can approach unity. We emphasize that this method is straightforwardly extendable to the case of many NVEs. PMID:25583623
1988-12-09
[Garbled report excerpt; recoverable fragments: sections on measurement of second-order statistics, measurement of triple products, and uncertainty analysis; although the deterministic fluctuations u'2 were 25 times larger than the mean fluctuations u, there were no significant variations in the mean statistics; from the input signals, the three velocity components are calculated and individual phase ensembles are collected for the appropriate statistics.]
Deterministic matrices matching the compressed sensing phase transitions of Gaussian random matrices
Monajemi, Hatef; Jafarpour, Sina; Gavish, Matan; Donoho, David L.; Ambikasaran, Sivaram; Bacallado, Sergio; Bharadia, Dinesh; Chen, Yuxin; Choi, Young; Chowdhury, Mainak; Chowdhury, Soham; Damle, Anil; Fithian, Will; Goetz, Georges; Grosenick, Logan; Gross, Sam; Hills, Gage; Hornstein, Michael; Lakkam, Milinda; Lee, Jason; Li, Jian; Liu, Linxi; Sing-Long, Carlos; Marx, Mike; Mittal, Akshay; Monajemi, Hatef; No, Albert; Omrani, Reza; Pekelis, Leonid; Qin, Junjie; Raines, Kevin; Ryu, Ernest; Saxe, Andrew; Shi, Dai; Siilats, Keith; Strauss, David; Tang, Gary; Wang, Chaojun; Zhou, Zoey; Zhu, Zhen
2013-01-01
In compressed sensing, one takes n < N samples of an N-dimensional vector x0 using an n × N matrix A, obtaining undersampled measurements y = Ax0. For random matrices with independent standard Gaussian entries, it is known that, when x0 is k-sparse, there is a precisely determined phase transition: for a certain region in the (k/n, n/N)-phase diagram, convex optimization typically finds the sparsest solution, whereas outside that region, it typically fails. It has been shown empirically that the same property—with the same phase transition location—holds for a wide range of non-Gaussian random matrix ensembles. We report extensive experiments showing that the Gaussian phase transition also describes numerous deterministic matrices, including Spikes and Sines, Spikes and Noiselets, Paley Frames, Delsarte-Goethals Frames, Chirp Sensing Matrices, and Grassmannian Frames. Namely, for each of these deterministic matrices in turn, for a typical k-sparse object, we observe that convex optimization is successful over a region of the phase diagram that coincides with the region known for Gaussian random matrices. Our experiments considered coefficients constrained to a set X for four different sets X, and the results establish our finding for each of the four associated phase transitions. PMID:23277588
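A small numerical check in the spirit of these experiments: basis pursuit posed as a linear program recovers a k-sparse vector from Gaussian measurements at a (k/n, n/N) point well inside the Gaussian success region. The sizes and seed are arbitrary, and scipy's HiGHS LP solver stands in for the convex solvers used in the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, n, k = 80, 40, 5                          # signal dim, measurements, sparsity

# k-sparse ground truth x0 with Gaussian nonzeros on a random support.
x0 = np.zeros(N)
support = rng.choice(N, k, replace=False)
x0[support] = rng.normal(0, 1, k)

A = rng.normal(0, 1, (n, N)) / np.sqrt(n)    # Gaussian sensing matrix
y = A @ x0                                   # undersampled measurements

# Basis pursuit: min ||x||_1 subject to Ax = y, as an LP over x = u - v, u,v >= 0.
c = np.ones(2 * N)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * N), method="highs")
x_hat = res.x[:N] - res.x[N:]
err = np.linalg.norm(x_hat - x0) / np.linalg.norm(x0)
```

At this point of the phase diagram (k/n = 0.125, n/N = 0.5) the convex program recovers the sparse vector essentially exactly.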
NASA Astrophysics Data System (ADS)
Narayanan, Kiran; Samtaney, Ravi
2018-04-01
We obtain numerical solutions of the two-fluid fluctuating compressible Navier-Stokes (FCNS) equations, which consistently account for thermal fluctuations from meso- to macroscales, in order to study the effect of such fluctuations on the mixing behavior in the Richtmyer-Meshkov instability (RMI). The numerical method used was successfully verified in two stages: for the deterministic fluxes by comparison against an air-SF6 RMI experiment, and for the stochastic terms by comparison against direct simulation Monte Carlo results for He-Ar RMI. We present results from fluctuating hydrodynamic RMI simulations for three He-Ar systems having length scales with decreasing order of magnitude that span from macroscopic to mesoscopic, with different levels of thermal fluctuations characterized by a nondimensional Boltzmann number (Bo). For a multidimensional FCNS system on a regular Cartesian grid, when using a discretization of a space-time stochastic flux Z(x, t) of the form Z(x, t) → (1/√(h³Δt)) N(ih, nΔt) for spatial interval h, time interval Δt, and Gaussian noise N, h should be greater than h0, with h0 corresponding to a cell volume that contains a sufficient number of molecules of the fluid such that the fluctuations are physically meaningful and produce the right equilibrium spectrum. For the mesoscale RMI systems simulated, it was desirable to use a cell size smaller than this limit in order to resolve the viscous shock. This was achieved by using a modified regularization of the noise term via Z(x, t) → (1/√(max(h³, h0³)Δt)) N(ih, nΔt), with h0 = ξh ∀h
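The role of the cell-size floor h0 in the (reconstructed) noise discretization can be illustrated numerically: without regularization the sample variance of Z grows like 1/h³ as the grid is refined, while the modified form caps it at 1/h0³. This is a schematic sketch in nondimensional units, not the paper's solver; the function name and parameter values are invented for illustration.

```python
import numpy as np

def flux_samples(h, dt, h0, n, rng):
    """n samples of the discretized stochastic flux: white noise scaled
    by 1/sqrt(h^3 * dt) for a 3-D cell of side h, with h^3 replaced by
    max(h^3, h0^3) so the noise variance saturates once the cell is
    smaller than the mesoscopic limit h0 (the modified regularization)."""
    return rng.standard_normal(n) / np.sqrt(max(h**3, h0**3) * dt)

rng = np.random.default_rng(1)
h0, dt = 1.0, 1.0   # nondimensional units for illustration
for h in (4.0, 1.0, 0.25):
    v = flux_samples(h, dt, h0, 50000, rng).var()
    print(f"h = {h}: sample variance = {v:.3f}")
```

For h = 0.25 the variance matches the h = 1.0 (= h0) case instead of growing by a factor of 64, which is what keeps sub-h0 cells physically meaningful.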
Evaluation of the new EMAC-SWIFT chemistry climate model
NASA Astrophysics Data System (ADS)
Scheffler, Janice; Langematz, Ulrike; Wohltmann, Ingo; Rex, Markus
2016-04-01
It is well known that the representation of atmospheric ozone chemistry in weather and climate models is essential for a realistic simulation of the atmospheric state. Including atmospheric ozone chemistry in climate simulations is usually done by prescribing a climatological ozone field, by including a fast linear ozone scheme in the model, or by using a climate model with complex interactive chemistry. While prescribed climatological ozone fields are often not aligned with the modelled dynamics, a linear ozone scheme may not be applicable for a wide range of climatological conditions. Although interactive chemistry provides a realistic representation of atmospheric chemistry, such model simulations are computationally very expensive and hence not suitable for ensemble simulations or simulations with multiple climate change scenarios. A new approach to represent atmospheric chemistry in climate models which can cope with non-linearities in ozone chemistry and is applicable to a wide range of climatic states is the Semi-empirical Weighted Iterative Fit Technique (SWIFT), which is driven by reanalysis data and has been validated against observational satellite data and runs of a full Chemistry and Transport Model. SWIFT has recently been implemented into the ECHAM/MESSy (EMAC) chemistry climate model, which uses a modular approach to climate modelling where individual model components can be switched on and off. Here, we show first results of EMAC-SWIFT simulations and validate these against EMAC simulations using the complex interactive chemistry scheme MECCA, and against observations.
NASA Technical Reports Server (NTRS)
Li, Feng; Vikhliaev, Yury V.; Newman, Paul A.; Pawson, Steven; Perlwitz, Judith; Waugh, Darryn W.; Douglass, Anne R.
2016-01-01
Stratospheric ozone depletion plays a major role in driving climate change in the Southern Hemisphere. To date, many climate models prescribe the stratospheric ozone layer's evolution using monthly and zonally averaged ozone fields. However, the prescribed ozone underestimates Antarctic ozone depletion and lacks zonal asymmetries. In this study we investigate the impact of using interactive stratospheric chemistry instead of prescribed ozone on climate change simulations of the Antarctic and Southern Ocean. Two sets of 1960-2010 ensemble transient simulations are conducted with the coupled ocean version of the Goddard Earth Observing System Model, version 5: one with interactive stratospheric chemistry and the other with prescribed ozone derived from the same interactive simulations. The model's climatology is evaluated using observations and reanalysis. Comparison of the 1979-2010 climate trends between these two simulations reveals that interactive chemistry has important effects on climate change not only in the Antarctic stratosphere, troposphere, and surface, but also in the Southern Ocean and Antarctic sea ice. Interactive chemistry causes stronger Antarctic lower stratosphere cooling and circumpolar westerly acceleration during November-December-January. It enhances stratosphere-troposphere coupling and leads to significantly larger tropospheric and surface westerly changes. The significantly stronger surface wind stress trends cause larger increases of the Southern Ocean Meridional Overturning Circulation, leading to year-round stronger ocean warming near the surface and enhanced Antarctic sea ice decrease.
NASA Astrophysics Data System (ADS)
Tsai, Hsiao-Chung; Elsberry, Russell L.
2013-12-01
An opportunity exists to extend support to the decision-making processes of water resource management and hydrological operations by providing extended-range tropical cyclone (TC) formation and track forecasts in the western North Pacific from the 51-member ECMWF 32-day ensemble. A new objective verification technique demonstrates that the ECMWF ensemble can predict most of the formations and tracks of the TCs during July 2009 to December 2010, even for most of the tropical depressions. Due to the relatively large number of false-alarm TCs in the ECMWF ensemble forecasts that would cause problems for support of hydrological operations, characteristics of these false alarms are discussed. Special attention is given to the ability of the ECMWF ensemble to predict periods of no TCs in the Taiwan area, since water resource management decisions also depend on the absence of typhoon-related rainfall. A three-tier approach is proposed to provide support for hydrological operations via extended-range forecasts twice weekly on the 30-day timescale, twice daily on the 15-day timescale, and up to four times a day with a consensus of high-resolution deterministic models.
NASA Technical Reports Server (NTRS)
Li, Jing; Li, Xichen; Carlson, Barbara E.; Kahn, Ralph A.; Lacis, Andrew A.; Dubovik, Oleg; Nakajima, Teruyuki
2016-01-01
Various space-based sensors have been designed and corresponding algorithms developed to retrieve aerosol optical depth (AOD), the very basic aerosol optical property, yet considerable disagreement still exists across these different satellite data sets. Surface-based observations aim to provide ground truth for validating satellite data; hence, their deployment locations should preferably contain as much spatial information as possible, i.e., high spatial representativeness. Using a novel Ensemble Kalman Filter (EnKF)- based approach, we objectively evaluate the spatial representativeness of current Aerosol Robotic Network (AERONET) sites. Multisensor monthly mean AOD data sets from Moderate Resolution Imaging Spectroradiometer, Multiangle Imaging Spectroradiometer, Sea-viewing Wide Field-of-view Sensor, Ozone Monitoring Instrument, and Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar are combined into a 605-member ensemble, and AERONET data are considered as the observations to be assimilated into this ensemble using the EnKF. The assessment is made by comparing the analysis error variance (that has been constrained by ground-based measurements), with the background error variance (based on satellite data alone). Results show that the total uncertainty is reduced by approximately 27% on average and could reach above 50% over certain places. The uncertainty reduction pattern also has distinct seasonal patterns, corresponding to the spatial distribution of seasonally varying aerosol types, such as dust in the spring for Northern Hemisphere and biomass burning in the fall for Southern Hemisphere. Dust and biomass burning sites have the highest spatial representativeness, rural and oceanic sites can also represent moderate spatial information, whereas the representativeness of urban sites is relatively localized. 
A spatial score ranging from 1 to 3 is assigned to each AERONET site based on the uncertainty reduction, indicating its representativeness level.
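The EnKF variance-reduction diagnostic described above can be sketched for a single site with a scalar state: the satellite-based ensemble is the background, the AERONET value is the assimilated observation, and the drop from background to analysis variance measures how much information the site contributes. All numbers below (ensemble spread, observation error) are invented for illustration; only the 605-member ensemble size echoes the text.

```python
import numpy as np

rng = np.random.default_rng(2)
n_ens = 605                       # ensemble size, as in the multisensor AOD study
truth = 0.25                      # hypothetical "true" monthly AOD at one site
bg = truth + 0.08 * rng.standard_normal(n_ens)   # background (satellite) ensemble
obs, r = 0.26, 0.03**2            # AERONET observation and its error variance

# Stochastic EnKF update for a directly observed scalar state:
pb = bg.var(ddof=1)               # background error variance
gain = pb / (pb + r)              # Kalman gain
perturbed_obs = obs + np.sqrt(r) * rng.standard_normal(n_ens)
an = bg + gain * (perturbed_obs - bg)

reduction = 1 - an.var(ddof=1) / pb
print(f"variance reduced by {100 * reduction:.0f}%")
```

Repeating this update at every grid cell (with cross-covariances spreading the correction spatially) is what yields the uncertainty-reduction maps from which the spatial scores are derived.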
Probabilistic forecasts based on radar rainfall uncertainty
NASA Astrophysics Data System (ADS)
Liguori, S.; Rico-Ramirez, M. A.
2012-04-01
The potential advantages resulting from integrating weather radar rainfall estimates in hydro-meteorological forecasting systems are limited by the inherent uncertainty affecting radar rainfall measurements, which is due to various sources of error [1-3]. The improvement of quality control and correction techniques is recognized to play a role in the future improvement of radar-based flow predictions. However, the knowledge of the uncertainty affecting radar rainfall data can also be effectively used to build a hydro-meteorological forecasting system in a probabilistic framework. This work discusses the results of the implementation of a novel probabilistic forecasting system developed to improve ensemble predictions over a small urban area located in the North of England. An ensemble of radar rainfall fields can be determined as the sum of a deterministic component and a perturbation field, the latter being informed by the knowledge of the spatial-temporal characteristics of the radar error assessed with reference to rain-gauge measurements. This approach is similar to the REAL system [4] developed for use in the Southern Alps. The radar uncertainty estimate can then be propagated with a nowcasting model, used to extrapolate an ensemble of radar rainfall forecasts, which can ultimately drive hydrological ensemble predictions. A radar ensemble generator has been calibrated using radar rainfall data made available from the UK Met Office after applying post-processing and correction algorithms [5-6]. One-hour rainfall accumulations from 235 rain gauges recorded for the year 2007 have provided the reference to determine the radar error. Statistics describing the spatial characteristics of the error (i.e. mean and covariance) have been computed off-line at gauge locations, along with the parameters describing the error temporal correlation. 
A system has then been set up to impose the space-time error properties on stochastic perturbations, generated in real time at gauge locations and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used for the purpose of propagating the uncertainty and assessing the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and ultimately sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial/temporal resolution is required of the forecasts for their use in urban areas [9-11], the probabilistic nowcasts have been set up to be produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes into account the radar error to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.
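The core of such a radar ensemble generator (adding to the deterministic radar field perturbations that carry the error's estimated mean and spatial covariance) can be sketched with a Cholesky factorization of the gauge-derived covariance matrix. Gauge locations, error statistics, and the exponential covariance model below are hypothetical placeholders, and the temporal-correlation component is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical gauge locations (km) and radar-error statistics estimated
# against them: a mean bias plus an exponentially decaying spatial covariance.
x = np.linspace(0, 50, 12)                    # 12 gauge sites along a transect
mean_err = 0.1                                # mm/h mean radar error
sigma, length = 0.5, 15.0                     # error std dev and correlation length
dist = np.abs(x[:, None] - x[None, :])
cov = sigma**2 * np.exp(-dist / length)

# Impose the error's spatial statistics on white noise via a Cholesky factor:
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(x)))
radar_qpe = np.full(len(x), 2.0)              # deterministic radar estimate, mm/h
n_members = 50
perturbations = mean_err + (L @ rng.standard_normal((len(x), n_members))).T
ensemble = radar_qpe + perturbations          # (n_members, n_sites) rainfall fields

print(ensemble.shape)
```

Each row is one equally likely rainfall field; feeding every member through the nowcasting and sewer models turns the radar error statistics into a discharge forecast distribution.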
Ozone Depletion in the Arctic Lower Stratosphere; Timing and Impacts on the Polar Vortex.
NASA Astrophysics Data System (ADS)
Rae, Cameron; Pyle, John
2017-04-01
There is a strong link between ozone depletion in the Antarctic lower stratosphere and the strength/duration of the southern hemisphere polar vortex. Ozone depletion arising from enhanced levels of ozone-depleting substances (ODSs) in the lower stratosphere during the last few decades of the 20th century has been accompanied by a delay in the final warming date in the southern hemisphere. The delay in final warming is associated with anomalous tropospheric conditions. The relationship in the Arctic, however, is less clear, as the northern hemisphere experiences relatively less intense ozone destruction in the Arctic lower stratosphere and the polar vortex is generally less stable. This study investigates the impacts of imposed lower stratospheric ozone depletion on the evolution of the polar vortex, particularly in the late spring towards the end of its lifetime. A perpetual-year integration is compared with a series of near-identical seasonal integrations which differ only by an imposed artificial ozone depletion event, occurring a fixed number of days before the polar vortex final warming date each year. Any differences between the seasonal forecasts and the perpetual-year simulation are due to the timely occurrence of a strong ozone depletion event in the late-spring Arctic polar vortex. This ensemble of seasonal forecasts demonstrates the impacts that a strong ozone depletion event in the Arctic lower stratosphere will have on the evolution of the polar vortex, and highlights tropospheric impacts associated with this phenomenon.
More than ten state-of-the-art regional air quality models have been applied as part of the Air Quality Model Evaluation International Initiative (AQMEII). These models were run by twenty independent groups in Europe and North America. Standardised modelling outputs over a full y...
NASA Astrophysics Data System (ADS)
Seviour, W.; Waugh, D.; Gnanadesikan, A.
2016-02-01
It has been recently suggested that the response of Southern Ocean sea-ice extent to stratospheric ozone depletion is time-dependent; that the ocean surface initially cools due to enhanced northward Ekman drift caused by a poleward shift in the eddy-driven jet, and then warms after some time due to upwelling of warm waters from below the mixed layer. It is therefore possible that ozone depletion could act to favor a short-term increase in sea-ice extent. However, many uncertainties remain in understanding this mechanism, with different models showing widely differing time-scales and magnitudes of the response. Here, we analyze an ensemble of coupled model simulations with a step-function ozone perturbation. The two-timescale response is present with an approximately 30 year initial cooling period. The response is further shown to be highly dependent upon the background ocean temperature and salinity stratification, which is influenced by both natural internal variability and the isopycnal eddy mixing parameterization. It is suggested that the majority of inter-model differences in the Southern Ocean response to ozone depletion are caused by differences in stratification.
All-optical switch and transistor gated by one stored photon.
Chen, Wenlan; Beck, Kristin M; Bücker, Robert; Gullans, Michael; Lukin, Mikhail D; Tanji-Suzuki, Haruka; Vuletić, Vladan
2013-08-16
The realization of an all-optical transistor, in which one "gate" photon controls a "source" light beam, is a long-standing goal in optics. By stopping a light pulse in an atomic ensemble contained inside an optical resonator, we realized a device in which one stored gate photon controls the resonator transmission of subsequently applied source photons. A weak gate pulse induces a bimodal transmission distribution, corresponding to zero and one gate photons. One stored gate photon produces fivefold source attenuation and can be retrieved from the atomic ensemble after switching more than one source photon. Without retrieval, one stored gate photon can switch several hundred source photons. With improved storage and retrieval efficiency, our work may enable various new applications, including photonic quantum gates and deterministic multiphoton entanglement.
Flash-flood early warning using weather radar data: from nowcasting to forecasting
NASA Astrophysics Data System (ADS)
Liechti, Katharina; Panziera, Luca; Germann, Urs; Zappa, Massimiliano
2013-04-01
In our study we explore the limits of radar-based forecasting for hydrological runoff prediction. Two novel probabilistic radar-based forecasting chains for flash-flood early warning are investigated in three catchments in the Southern Swiss Alps and set in relation to deterministic discharge forecast for the same catchments. The first probabilistic radar-based forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second probabilistic forecasting system evaluated is REAL-C2, where the numerical weather prediction COSMO-2 is initialized with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for 1389 hours between June 2007 and December 2010 for which NORA forecasts were issued, due to the presence of orographic forcing. We found a clear preference for the probabilistic approach. Discharge forecasts perform better when forced by NORA rather than by a persistent radar QPE for lead times up to eight hours and for all discharge thresholds analysed. The best results were, however, obtained with the REAL-C2 forecasting chain, which was also remarkably skilful even with the highest thresholds. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic forcing.
Deterministic Squeezed States with Collective Measurements and Feedback.
Cox, Kevin C; Greve, Graham P; Weiner, Joshua M; Thompson, James K
2016-03-04
We demonstrate the creation of entangled, spin-squeezed states using a collective, or joint, measurement and real-time feedback. The pseudospin state of an ensemble of N=5×10^{4} laser-cooled ^{87}Rb atoms is deterministically driven to a specified population state with angular resolution that is a factor of 5.5(8) [7.4(6) dB] in variance below the standard quantum limit for unentangled atoms-comparable to the best enhancements using only unitary evolution. Without feedback, conditioning on the outcome of the joint premeasurement, we directly observe up to 59(8) times [17.7(6) dB] improvement in quantum phase variance relative to the standard quantum limit for N=4×10^{5} atoms. This is one of the largest reported entanglement enhancements to date in any system.
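The two ways each enhancement is quoted, as a variance factor and in decibels, are related by dB = 10 log10(factor), which can be checked directly:

```python
import math

# Squeezing enhancements: variance factor below the standard quantum limit,
# converted to decibels via dB = 10 * log10(factor).
print(round(10 * math.log10(5.5), 1))   # 7.4 dB, the feedback-steered result
print(round(10 * math.log10(59), 1))    # 17.7 dB, the conditional result
```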
NASA Astrophysics Data System (ADS)
Caumont, Olivier; Hally, Alan; Garrote, Luis; Richard, Évelyne; Weerts, Albrecht; Delogu, Fabio; Fiori, Elisabetta; Rebora, Nicola; Parodi, Antonio; Mihalović, Ana; Ivković, Marija; Dekić, Ljiljana; van Verseveld, Willem; Nuissier, Olivier; Ducrocq, Véronique; D'Agostino, Daniele; Galizia, Antonella; Danovaro, Emanuele; Clematis, Andrea
2015-04-01
The FP7 DRIHM (Distributed Research Infrastructure for Hydro-Meteorology, http://www.drihm.eu, 2011-2015) project intends to develop a prototype e-Science environment to facilitate the collaboration between meteorologists, hydrologists, and Earth science experts for accelerated scientific advances in Hydro-Meteorology Research (HMR). As the project comes to its end, this presentation will summarize the HMR results that have been obtained in the framework of DRIHM. The vision shaped and implemented in the framework of the DRIHM project enables the production and interpretation of numerous, complex compositions of hydrometeorological simulations of flood events from rainfall, either simulated or modelled, down to discharge. Each element of a composition is drawn from a set of various state-of-the-art models. Atmospheric simulations providing high-resolution rainfall forecasts involve different global and limited-area convection-resolving models, the former being used as boundary conditions for the latter. Some of these models can be run as ensembles, i.e. with perturbed boundary conditions, initial conditions and/or physics, thus sampling the probability density function of rainfall forecasts. In addition, a stochastic downscaling algorithm can be used to create high-resolution rainfall ensemble forecasts from deterministic lower-resolution forecasts. All these rainfall forecasts may be used as input to various rainfall-discharge hydrological models that compute the resulting stream flows for catchments of interest. In some hydrological simulations, physical parameters are perturbed to take into account model errors. As a result, six different kinds of rainfall data (either deterministic or probabilistic) can currently be compared with each other and combined with three different hydrological model engines running either in deterministic or probabilistic mode. 
HMR topics enabled or facilitated by such unprecedented sets of hydrometeorological forecasts include: physical process studies, intercomparison of models and ensembles, sensitivity studies of a particular component of the forecasting chain, and design of flash-flood early-warning systems. These benefits will be illustrated with the different key cases that have been under investigation in the course of the project. These are four catastrophic cases of flooding, namely the case of 4 November 2011 in Genoa, Italy, 6 November 2011 in Catalonia, Spain, 13-16 May 2014 in eastern Europe, and 9 October 2014, again in Genoa, Italy.
Short-term Temperature Prediction Using Adaptive Computing on Dynamic Scales
NASA Astrophysics Data System (ADS)
Hu, W.; Cervone, G.; Jha, S.; Balasubramanian, V.; Turilli, M.
2017-12-01
When predicting temperature, there are specific places and times when high-accuracy predictions are harder to achieve. For example, not all the sub-regions in the domain require the same amount of computing resources to generate an accurate prediction. Plateau areas might require fewer computing resources than mountainous areas because of the steeper gradient of temperature change in the latter. However, it is difficult to estimate beforehand the optimal allocation of computational resources because several parameters, in addition to orography, play a role in determining the accuracy of the forecasts. The allocation of resources to perform simulations can become a bottleneck because it requires human intervention to stop jobs or start new ones. The goal of this project is to design and develop a dynamic approach to generate short-term temperature predictions that automatically determines the required computing resources and the geographic scales of the predictions based on the spatial and temporal uncertainties. The predictions and the prediction quality metrics are computed using numerical weather prediction data and the Analog Ensemble (AnEn) technique, and the parallelization on high performance computing systems is accomplished using Ensemble Toolkit, one component of the RADICAL-Cybertools family of tools. RADICAL-Cybertools decouple the science needs from the computational capabilities by building an intermediate layer to run general ensemble patterns, regardless of the science. In this research, we show how the ensemble toolkit allows generating high-resolution temperature forecasts at different spatial and temporal resolutions. The AnEn algorithm is run using NAM analyses and forecasts for the continental United States for a period of 2 years. AnEn results show that temperature forecasts perform well according to different probabilistic and deterministic statistical tests.
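The analog ensemble idea referenced here (for a new deterministic forecast, find the k most similar historical forecasts and take their verifying observations as ensemble members) can be sketched as below. The predictors, training-set size, and bias structure are synthetic stand-ins, not NAM data.

```python
import numpy as np

def analog_ensemble(new_fcst, past_fcsts, past_obs, k=5):
    """Return the k observations whose matching historical forecasts are
    closest (Euclidean distance over the predictor vector) to new_fcst."""
    d = np.linalg.norm(past_fcsts - new_fcst, axis=1)
    return past_obs[np.argsort(d)[:k]]

rng = np.random.default_rng(4)
# Synthetic 2-year training set: forecast predictors (temperature K, wind m/s)
# and verifying observed temperature, with a constant +1 K forecast bias.
past_fcsts = np.column_stack([280 + 10 * rng.random(730), 5 * rng.random(730)])
past_obs = past_fcsts[:, 0] - 1.0 + 0.3 * rng.standard_normal(730)

members = analog_ensemble(np.array([285.0, 2.0]), past_fcsts, past_obs, k=10)
print(members.mean())   # the analog mean corrects the +1 K bias toward ~284 K
```

Because the members are past observations, the spread of the analog set doubles as a situation-dependent uncertainty estimate, which is what drives the adaptive resource allocation described above.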
Few-Photon Nonlinearity with an Atomic Ensemble in an Optical Cavity
NASA Astrophysics Data System (ADS)
Tanji, Haruka
2011-12-01
This thesis investigates the effect of the cavity vacuum field on the dispersive properties of an atomic ensemble in a strongly coupled high-finesse cavity. In particular, we demonstrate vacuum-induced transparency (VIT). The light absorption by the ensemble is suppressed by up to 40% in the presence of a cavity vacuum field. The sharp transparency peak is accompanied by the reduction in the group velocity of a light pulse, measured to be as low as 1800 m/s. This observation is a large step towards the realization of photon number-state filters, recently proposed by Nikoghosyan et al. Furthermore, we demonstrate few-photon optical nonlinearity, where the transparency is increased from 40% to 80% with ˜12 photons in the cavity mode. The result may be viewed as all-optical switching, where the transmission of photons in one mode may be controlled by 12 photons in another. These studies point to the possibility of nonlinear interaction between photons in different free-space modes, a scheme that circumvents cavity-coupling losses that plague cavity-based quantum information processing. Potential applications include advanced quantum devices such as photonic quantum gates, photon-number resolving detectors, and single-photon transistors. In the efforts leading up to these results, we investigate the collective enhancement of atomic coupling to a single mode of a low-finesse cavity. With the strong collective coupling, we obtain exquisite control of quantum states in the atom-photon coupled system. In this system, we demonstrate a heralded single-photon source with 84% conditional efficiency, a quantum bus for deterministic entanglement of two remote ensembles, and heralded polarization-state quantum memory with fidelity above 90%.
An Evaluation of the Predictability of Austral Summer Season Precipitation over South America.
NASA Astrophysics Data System (ADS)
Misra, Vasubandhu
2004-03-01
In this study, the predictability of austral summer seasonal precipitation over South America is investigated using a 12-yr set of 3.5-month-range (seasonal) and 17-yr-range (continuous multiannual) five-member ensemble integrations of the Center for Ocean Land Atmosphere Studies (COLA) atmospheric general circulation model (AGCM). These integrations were performed with prescribed observed sea surface temperature (SST); therefore, the skill attained represents an estimate of the upper bound of the skill achievable by the COLA AGCM with predicted SST. The seasonal runs outperform the multiannual model integrations in both deterministic and probabilistic skill. The simulation of the January-February-March (JFM) seasonal climatology of precipitation is vastly superior in the seasonal runs except over the Nordeste region, where the multiannual runs show a marginal improvement. The teleconnection of the ensemble mean JFM precipitation over tropical South America with global contemporaneous observed sea surface temperature in the seasonal runs conforms more closely to observations than in the multiannual runs. Both sets of runs clearly beat persistence in predicting the interannual precipitation anomalies over the Amazon River basin, Nordeste, South Atlantic convergence zone, and subtropical South America. However, both types of runs display poorer simulations over subtropical regions than over the tropical areas of South America. The examination of probabilistic skill of precipitation supports the conclusion from the deterministic skill analysis that the seasonal runs yield superior simulations to the multiannual-type runs.
Nonlinear consolidation in randomly heterogeneous highly compressible aquitards
NASA Astrophysics Data System (ADS)
Zapata-Norberto, Berenice; Morales-Casique, Eric; Herrera, Graciela S.
2018-05-01
Severe land subsidence due to groundwater extraction may occur in multiaquifer systems where highly compressible aquitards are present. The highly compressible nature of the aquitards leads to nonlinear consolidation where the groundwater flow parameters are stress-dependent. The case is further complicated by the heterogeneity of the hydrogeologic and geotechnical properties of the aquitards. The effect of realistic vertical heterogeneity of hydrogeologic and geotechnical parameters on the consolidation of highly compressible aquitards is investigated by means of one-dimensional Monte Carlo numerical simulations where the lower boundary represents the effect of an instant drop in hydraulic head due to groundwater pumping. Two thousand realizations are generated for each of the following parameters: hydraulic conductivity (K), compression index (Cc), void ratio (e) and m (an empirical parameter relating hydraulic conductivity and void ratio). The correlation structure, the mean and the variance for each parameter were obtained from a literature review about field studies in the lacustrine sediments of Mexico City. The results indicate that among the parameters considered, random K has the largest effect on the ensemble average behavior of the system when compared to a nonlinear consolidation model with deterministic initial parameters. The deterministic solution underestimates the ensemble average of total settlement when initial K is random. In addition, random K leads to the largest variance (and therefore largest uncertainty) of total settlement, groundwater flux and time to reach steady-state conditions.
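The qualitative finding that a deterministic run with "mean" parameters underestimates the ensemble-average settlement when K is random is a Jensen-inequality effect, which a toy Monte Carlo experiment can reproduce. The response function below is an invented convex placeholder, not the study's nonlinear consolidation model; only the 2000-realization count and the lognormal form of K echo the text.

```python
import numpy as np

rng = np.random.default_rng(5)

def settlement(K):
    """Toy nonlinear consolidation response (illustrative only): final
    settlement decreases with hydraulic conductivity, convexly."""
    return 1.0 / (1.0 + K / 1e-9)

# Lognormal K (m/s); 2000 realizations, as in the study.
K = rng.lognormal(mean=np.log(1e-9), sigma=1.0, size=2000)

ens_mean = settlement(K).mean()            # Monte Carlo ensemble average
deterministic = settlement(K.mean())       # single run with the mean K

print(ens_mean, deterministic)             # the deterministic run is smaller
```

Because the response is convex in K, averaging the outputs exceeds the output of the average input, mirroring the underestimation reported for the deterministic solution.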
NASA Astrophysics Data System (ADS)
Regonda, Satish Kumar; Seo, Dong-Jun; Lawrence, Bill; Brown, James D.; Demargne, Julie
2013-08-01
We present a statistical procedure for generating short-term ensemble streamflow forecasts from single-valued, or deterministic, streamflow forecasts produced operationally by the U.S. National Weather Service (NWS) River Forecast Centers (RFCs). The resulting ensemble streamflow forecast provides an estimate of the predictive uncertainty associated with the single-valued forecast to support risk-based decision making by the forecasters and by the users of the forecast products, such as emergency managers. Forced by single-valued quantitative precipitation and temperature forecasts (QPF, QTF), the single-valued streamflow forecasts are produced at a 6-h time step nominally out to 5 days into the future. The single-valued streamflow forecasts reflect various run-time modifications, or "manual data assimilation", applied by the human forecasters in an attempt to reduce error from various sources in the end-to-end forecast process. The proposed procedure generates ensemble traces of streamflow from a parsimonious approximation of the conditional multivariate probability distribution of future streamflow given the single-valued streamflow forecast, QPF, and the most recent streamflow observation. For parameter estimation and evaluation, we used a multiyear archive of the single-valued river stage forecasts produced operationally by the NWS Arkansas-Red River Basin River Forecast Center (ABRFC) in Tulsa, Oklahoma. As a by-product of parameter estimation, the procedure provides a categorical assessment of the effective lead time of the operational hydrologic forecasts for different QPF and forecast flow conditions. To evaluate the procedure, we carried out hindcasting experiments in dependent and cross-validation modes. The results indicate that the short-term streamflow ensemble hindcasts generated from the procedure are generally reliable within the effective lead time of the single-valued forecasts and capture their skill well.
For smaller basins, however, the effective lead time is significantly reduced by short basin memory and reduced skill in the single-valued QPF.
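The core idea, sampling ensemble members from a conditional distribution of future flow given the single-valued forecast, can be sketched in its simplest form. The bivariate-normal model and all moments below are invented for illustration; the NWS procedure uses a richer multivariate conditioning on QPF and recent observations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative bivariate-normal model (not the NWS parameterization): the
# transformed future flow Y and the single-valued forecast F are jointly
# Gaussian, with moments and correlation estimated from a hindcast archive.
mu_y, sd_y = 100.0, 30.0   # climatological mean/sd of flow (made-up units)
mu_f, sd_f = 100.0, 25.0   # mean/sd of the deterministic forecast
rho = 0.9                  # forecast-observation correlation (skill)

def ensemble_from_deterministic(f, n=50):
    """Draw n ensemble members from the conditional p(Y | F = f)."""
    cond_mean = mu_y + rho * (sd_y / sd_f) * (f - mu_f)
    cond_sd = sd_y * np.sqrt(1.0 - rho ** 2)
    return rng.normal(cond_mean, cond_sd, size=n)

members = ensemble_from_deterministic(f=140.0, n=5000)
print(members.mean(), members.std())  # centered near 143.2, spread near 13.1
```

The conditional spread sd_y * sqrt(1 - rho^2) shrinks as forecast skill rho grows, which is why the ensemble width is a direct statement of predictive uncertainty.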
Uncertainties in climate assessment for the case of aviation NOx
Holmes, Christopher D.; Tang, Qi; Prather, Michael J.
2011-01-01
Nitrogen oxides emitted from aircraft engines alter the chemistry of the atmosphere, perturbing the greenhouse gases methane (CH4) and ozone (O3). We quantify uncertainties in radiative forcing (RF) due to short-lived increases in O3, long-lived decreases in CH4 and O3, and their net effect, using the ensemble of published models and a factor decomposition of each forcing. The decomposition captures major features of the ensemble, and also shows which processes drive the total uncertainty in several climate metrics. Aviation-specific factors drive most of the uncertainty for the short-lived O3 and long-lived CH4 RFs, but a nonaviation factor dominates for long-lived O3. The model ensemble shows strong anticorrelation between the short-lived and long-lived RF perturbations (R2 = 0.87). Uncertainty in the net RF is highly sensitive to this correlation. We reproduce the correlation and ensemble spread in one model, showing that processes controlling the background tropospheric abundance of nitrogen oxides are likely responsible for the modeling uncertainty in climate impacts from aviation. PMID:21690364
NASA Astrophysics Data System (ADS)
Reynders, Edwin P. B.; Langley, Robin S.
2018-08-01
The hybrid deterministic-statistical energy analysis method has proven to be a versatile framework for modeling built-up vibro-acoustic systems. The stiff system components are modeled deterministically, e.g., using the finite element method, while the wave fields in the flexible components are modeled as diffuse. In the present paper, the hybrid method is extended such that the ensemble mean and variance can be computed not only for the harmonic system response but also for the band-averaged system response. This variance represents the uncertainty that is due to the assumption of a diffuse field in the flexible components of the hybrid system. The developments start with a cross-frequency generalization of the reciprocity relationship between the total energy in a diffuse field and the cross spectrum of the blocked reverberant loading at the boundaries of that field. By making extensive use of this generalization in a first-order perturbation analysis, explicit expressions are derived for the cross-frequency and band-averaged variance of the vibrational energies in the diffuse components, and for the cross-frequency and band-averaged variance of the cross spectrum of the vibro-acoustic field response of the deterministic components. These expressions are validated extensively against detailed Monte Carlo analyses of coupled plate systems, in which diffuse fields are simulated by randomly distributing small point masses across the flexible components, and good agreement is found.
NASA Technical Reports Server (NTRS)
Ziemke, Jerry R.; Chandra, Sushil; Bhartia, Pawan K.
2004-01-01
It is generally recognized that Stratospheric Aerosol and Gas Experiment (SAGE) stratospheric ozone data have become a standard long-record reference field for comparison with other stratospheric ozone measurements. This study demonstrates that stratospheric column ozone (SCO) derived from total ozone mapping spectrometer (TOMS) Cloud Slicing may be used to supplement SAGE data as a stand-alone long-record reference field in the tropics extending to middle and high latitudes over the Pacific. Comparisons of SAGE II version 6.2 SCO and TOMS version 8 Cloud Slicing SCO for 1984-2003 exhibit remarkable agreement in monthly ensemble means, to within 1-3 DU (1-1.5% of SCO), despite being independently calibrated measurements. An important component of our study is to incorporate these column ozone measurements to investigate long-term trends for the period 1979-2003. Our study includes Solar Backscatter Ultraviolet (SBUV) version 8 measurements of upper stratospheric column ozone (i.e., zero to 32 hPa column ozone) to characterize seasonal cycles and seasonal trends in this region, as well as the lower stratosphere and troposphere when combined with TOMS SCO and total column ozone. The trend analyses suggest that most ozone reduction in the atmosphere since 1979 in mid-to-high latitudes has occurred in the lower stratosphere below approx. 25 km. The delineation of upper and lower stratospheric column ozone indicates that trends in the upper stratosphere during the latter half of the 1979-2003 period have reduced to near zero globally, while trends in the lower stratosphere have become larger by approx. 5 DU per decade from the tropics extending to midlatitudes in both hemispheres. For tropospheric column ozone (TCO), the trend analyses suggest moderate increases over the 25-year time record in the extratropics of both hemispheres of around 4-6 DU (Northern Hemisphere) and 6-8 DU (Southern Hemisphere).
Watershed scale response to climate change--Yampa River Basin, Colorado
Hay, Lauren E.; Battaglin, William A.; Markstrom, Steven L.
2012-01-01
General Circulation Model simulations of future climate through 2099 project a wide range of possible scenarios. To determine the sensitivity and potential effect of long-term climate change on the freshwater resources of the United States, the U.S. Geological Survey Global Change study, "An integrated watershed scale response to global change in selected basins across the United States", was started in 2008. The long-term goal of this national study is to provide the foundation for hydrologically based climate change studies across the nation. Fourteen basins for which the Precipitation Runoff Modeling System has been calibrated and evaluated were selected as study sites. The Precipitation Runoff Modeling System is a deterministic, distributed-parameter watershed model developed to evaluate the effects of various combinations of precipitation, temperature, and land use on streamflow and general basin hydrology. Output from five General Circulation Model simulations and four emission scenarios was used to develop an ensemble of climate-change scenarios for each basin. These ensembles were simulated with the corresponding Precipitation Runoff Modeling System model. This fact sheet summarizes the hydrologic effect and sensitivity of the Precipitation Runoff Modeling System simulations to climate change for the Yampa River Basin at Steamboat Springs, Colorado.
Stratospheric Aerosol and Gas Experiments 1 and 2: Comparisons with ozonesondes
NASA Technical Reports Server (NTRS)
Veiga, Robert E.; Cunnold, Derek M.; Chu, William P.; McCormick, M. Patrick
1995-01-01
Ozone profiles measured by the Stratospheric Aerosol and Gas Experiments (SAGE) 1 and 2 are compared with ozonesonde profiles at 24 stations over the period extending from 1979 through 1991. Ozonesonde/satellite differences at 21 stations with SAGE 2 overpasses were computed down to 11.5 km in midlatitudes, to 15.5 km in the lower latitudes, and for nine stations with SAGE 1 overpasses down to 15.5 km. The set of individual satellite and ozonesonde profile comparisons most closely colocated in time and space shows mean absolute differences relative to the satellite measurement of 6 +/- 2% for SAGE 2 and 8 +/- 3% for SAGE 1. The ensemble of ozonesonde/satellite differences, when averaged over all altitudes, shows that for SAGE 2, 70% were less than 5%, whereas for SAGE 1, 50% were less than 5%. The best agreement occurred in the altitude region near the ozone density maximum where almost all the relative differences were less than 5%. Most of the statistically significant differences occurred below the ozone maximum down to the tropopause in the region of steepest ozone gradients and typically ranged between 0 and -20%. Correlations between ozone and aerosol extinction in the northern midlatitudes indicate that aerosols had no discernible impact on the ozonesonde/satellite differences and on the SAGE 2 ozone retrieval for the levels of extinction encountered in the lower stratosphere during 1984 to mid-1991.
Tao, Fulu; Rötter, Reimund P; Palosuo, Taru; Gregorio Hernández Díaz-Ambrona, Carlos; Mínguez, M Inés; Semenov, Mikhail A; Kersebaum, Kurt Christian; Nendel, Claas; Specka, Xenia; Hoffmann, Holger; Ewert, Frank; Dambreville, Anaelle; Martre, Pierre; Rodríguez, Lucía; Ruiz-Ramos, Margarita; Gaiser, Thomas; Höhn, Jukka G; Salo, Tapio; Ferrise, Roberto; Bindi, Marco; Cammarano, Davide; Schulman, Alan H
2018-03-01
Climate change impact assessments are plagued with uncertainties from many sources, such as climate projections or the inadequacies in structure and parameters of the impact model. Previous studies tried to account for the uncertainty from one or two of these. Here, we developed a triple-ensemble probabilistic assessment using seven crop models, multiple sets of model parameters and eight contrasting climate projections together to comprehensively account for uncertainties from these three important sources. We demonstrated the approach in assessing climate change impact on barley growth and yield at Jokioinen, Finland in the Boreal climatic zone and Lleida, Spain in the Mediterranean climatic zone, for the 2050s. We further quantified and compared the contribution of crop model structure, crop model parameters and climate projections to the total variance of ensemble output using Analysis of Variance (ANOVA). Based on the triple-ensemble probabilistic assessment, the median of simulated yield change was -4% and +16%, and the probability of decreasing yield was 63% and 31% in the 2050s, at Jokioinen and Lleida, respectively, relative to 1981-2010. The contribution of crop model structure to the total variance of ensemble output was larger than that from downscaled climate projections and model parameters. The relative contribution of crop model parameters and downscaled climate projections to the total variance of ensemble output varied greatly among the seven crop models and between the two sites. The contribution of downscaled climate projections was on average larger than that of crop model parameters. This information on the uncertainty from different sources can be quite useful for model users to decide where to put the most effort when preparing or choosing models or parameters for impact analyses. 
We concluded that the triple-ensemble probabilistic approach, which accounts for the uncertainties from multiple important sources, provides more comprehensive information for quantifying uncertainties in climate change impact assessments than conventional approaches that are deterministic or account for only one or two of the uncertainty sources. © 2017 John Wiley & Sons Ltd.
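The ANOVA partition described above, attributing ensemble variance to crop model structure, parameters, and climate projections, can be sketched on a synthetic full-factorial ensemble. The factor sizes mirror the study, but the yield responses below are invented, and only main effects (not interactions) are separated:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic full-factorial ensemble: 7 crop models x 3 parameter sets x 8
# climate projections (factor sizes mirror the study; the yield responses
# are invented for illustration).
n_m, n_p, n_c = 7, 3, 8
m_eff = np.linspace(-4, 4, n_m).reshape(n_m, 1, 1)   # model-structure effect
p_eff = np.linspace(-1, 1, n_p).reshape(1, n_p, 1)   # parameter effect
c_eff = np.linspace(-2, 2, n_c).reshape(1, 1, n_c)   # climate effect
y = m_eff + p_eff + c_eff + rng.normal(0, 0.3, size=(n_m, n_p, n_c))

total = y.var()
# ANOVA main effect of each factor: variance of its marginal means.
v_model = y.mean(axis=(1, 2)).var()
v_param = y.mean(axis=(0, 2)).var()
v_clim = y.mean(axis=(0, 1)).var()

for name, v in [("model", v_model), ("parameters", v_param),
                ("climate", v_clim)]:
    print(f"{name}: {v / total:.0%} of ensemble variance")
```

With the invented effect sizes above, model structure dominates the variance budget, echoing the ordering the study reports.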
NASA Astrophysics Data System (ADS)
Vich, M.; Romero, R.; Richard, E.; Arbogast, P.; Maynard, K.
2010-09-01
Heavy precipitation events occur regularly in the western Mediterranean region. These events often have a high impact on society due to economic and personal losses. Improving the mesoscale numerical forecasts of these events can help prevent or minimize their impact. In previous studies, two ensemble prediction systems (EPSs) based on perturbing the model initial and boundary conditions were developed and tested for a collection of high-impact MEDEX cyclonic episodes. These EPSs perturb the initial and boundary potential vorticity (PV) field through a PV inversion algorithm. This technique ensures modifications of all the meteorological fields without compromising the mass-wind balance. One EPS introduces the perturbations along the zones of the three-dimensional PV structure presenting the locally most intense values and gradients of the field (a semi-objective choice, PV-gradient), while the other perturbs the PV field over the sensitivity zones calculated with the MM5 adjoint model (an objective method, PV-adjoint). The PV perturbations are set from a PV error climatology (PVEC) that characterizes typical PV errors in the ECMWF forecasts, both in intensity and displacement. The intensity and displacement perturbation of the PV field is chosen randomly, while its location is given by the perturbation zones defined by each ensemble generation method. Encouraged by the good results obtained by these two EPSs that perturb the PV field, a new approach based on a manual perturbation of the PV field has been tested and compared with the previous results. This technique uses satellite water vapor (WV) observations to guide the correction of initial PV structures. The correction of the PV field aims to improve the match between the PV distribution and the WV image, taking advantage of the relation between dark and bright features of WV images and PV anomalies, under some assumptions.
Afterwards, the PV inversion algorithm is applied to run a forecast with the corresponding perturbed initial state (PV-satellite). The non-hydrostatic MM5 mesoscale model has been used to run all forecasts. The simulations are performed for a two-day period with a 22.5 km resolution domain (Domain 1 in http://mm5forecasts.uib.es) nested in the ECMWF large-scale forecast fields. The MEDEX cyclone of 10 June 2000, also known as the Montserrat Case, is a suitable testbed to compare the performance of each ensemble and the PV-satellite method. This case is characterized by an Atlantic upper-level trough and a low-level cold front which generated a stationary mesoscale cyclone over the Spanish Mediterranean coast, advecting warm and moist air toward Catalonia from the Mediterranean Sea. The resulting mesoscale convective system produced 6-h accumulated rainfall amounts of 180 mm, with material losses estimated by the media to exceed 65 million euros. The performance of both ensemble forecasting systems and of the PV-satellite technique for our case study is evaluated through verification of the rainfall field. Since the EPSs are probabilistic forecasts and the PV-satellite is deterministic, their comparison is done using the individual ensemble members. Therefore the verification procedure uses deterministic scores, such as the ROC curve, the Taylor diagram and the Q-Q plot. These scores cover the different quality attributes of the forecast, such as reliability, resolution, uncertainty and sharpness. The results show that the performance of the PV-satellite technique lies within the performance range obtained by both ensembles; it is even better than the non-perturbed ensemble member. Thus, perturbing randomly using the PV error climatology and introducing the perturbations in the zones given by each EPS captures the mismatch between PV and WV fields better than manual perturbations made by an expert forecaster, at least for this case study.
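The Taylor diagram used in the verification above condenses three statistics of a forecast-observation pair. A minimal sketch of those statistics, on invented rainfall data rather than the MM5 output and gauge observations of the study, is:

```python
import numpy as np

rng = np.random.default_rng(3)

# Made-up 6-h rainfall "observations" and one member's forecast (mm); the
# study's verification uses real gauge data and MM5 output.
obs = rng.gamma(2.0, 10.0, size=200)
fcst = 0.8 * obs + rng.normal(0.0, 5.0, size=200)

def taylor_stats(f, o):
    """The three quantities summarized by a Taylor diagram."""
    r = np.corrcoef(f, o)[0, 1]             # pattern correlation
    sigma_ratio = f.std() / o.std()         # normalized standard deviation
    crmse = np.sqrt((((f - f.mean()) - (o - o.mean())) ** 2).mean())
    return r, sigma_ratio, crmse

r, sigma_ratio, crmse = taylor_stats(fcst, obs)
# The diagram's geometry rests on this law-of-cosines identity:
residual = crmse ** 2 - (fcst.std() ** 2 + obs.std() ** 2
                         - 2.0 * fcst.std() * obs.std() * r)
print(r, sigma_ratio, residual)
```

The identity (residual numerically zero) is what lets a single point on the diagram encode correlation, variance ratio and centered RMS error simultaneously.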
Probabilistic Storm Surge Forecast For Venice
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Lionello, Piero
2013-04-01
This study describes an ensemble storm surge prediction procedure for the city of Venice, which is potentially very useful for its management and maintenance and for operating the movable barriers that are presently being built. The Ensemble Prediction System (EPS) is meant to complement the existing sea level (SL) forecast system by providing a probabilistic forecast and information on the uncertainty of the SL prediction. The procedure is applied to storm surge events in the period 2009-2010, producing for each of them an ensemble of 50 simulations. It is shown that the EPS slightly increases the accuracy of the SL prediction with respect to the deterministic forecast (DF) and is more reliable than it. Though the results are low-biased and the forecast uncertainty is underestimated, the probability distribution of maximum sea level produced by the EPS is acceptably realistic. The error of the EPS mean is shown to be correlated with the EPS spread. SL peaks correspond to maxima of uncertainty, and uncertainty increases linearly with the forecast range. The quasi-linear dynamics of the storm surges produces a modulation of the uncertainty after the SL peak, with a period corresponding to that of the main Adriatic seiche.
The ARPAL operational high resolution Poor Man's Ensemble, description and validation
NASA Astrophysics Data System (ADS)
Corazza, Matteo; Sacchetti, Davide; Antonelli, Marta; Drofa, Oxana
2018-05-01
The Meteo-Hydrological Functional Center for Civil Protection of the Environmental Protection Agency of the Liguria Region is responsible for issuing forecasts primarily aimed at Civil Protection needs. Several deterministic high resolution models, run every 6 or 12 h, are regularly used in the Center to elaborate weather forecasts at short to medium range. The Region is frequently affected by severe flash floods over its very small basins, characterized by steep orography close to the sea. In past years, these conditions led the Center to pay particular attention to the use and development of high resolution model chains for the explicit simulation of convective phenomena. For years, the availability of several models has been used by the forecasters for subjective analyses of the potential evolution of the atmosphere and of its uncertainty. More recently, an Interactive Poor Man's Ensemble has been developed, aimed at providing statistical ensemble variables to support the forecasters' evaluations. In this paper the structure of this system is described and its results are validated using the region's dense ground observational network.
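The statistical ensemble variables such a poor man's ensemble can offer forecasters reduce, at their simplest, to a few summary statistics over the deterministic members. A minimal sketch with invented precipitation values (not ARPAL data) is:

```python
import numpy as np

# Hypothetical 6-h precipitation forecasts (mm) from five deterministic
# model runs at one grid point -- the raw ingredients of a poor man's
# ensemble (values invented).
members = np.array([12.0, 30.0, 18.0, 45.0, 25.0])

ens_mean = members.mean()            # consensus forecast
ens_spread = members.std(ddof=1)     # uncertainty proxy across models
p_exceed = (members > 20.0).mean()   # agreement above a warning threshold

print(ens_mean, ens_spread, p_exceed)  # 26.0, ~12.6, 0.6
```

The exceedance fraction is the quantity of most direct Civil Protection use: it turns disagreement among deterministic chains into a rough probability of crossing an alert threshold.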
NASA Astrophysics Data System (ADS)
Jha, Sanjeev K.; Shrestha, Durga L.; Stadnyk, Tricia A.; Coulibaly, Paulin
2018-03-01
Flooding in Canada is often caused by heavy rainfall during the snowmelt period. Hydrologic forecast centers rely on precipitation forecasts obtained from numerical weather prediction (NWP) models to force hydrological models for streamflow forecasting. The uncertainties in raw quantitative precipitation forecasts (QPFs) are enhanced by physiography and orography effects over a diverse landscape, particularly in the western catchments of Canada. A Bayesian post-processing approach called rainfall post-processing (RPP), developed in Australia (Robertson et al., 2013; Shrestha et al., 2015), has been applied to assess its forecast performance in a Canadian catchment. Raw QPFs obtained from two sources, the Global Ensemble Forecast System (GEFS) Reforecast 2 project of the National Centers for Environmental Prediction and the Global Deterministic Prediction System (GDPS) of Environment and Climate Change Canada, are used in this study. The study period from January 2013 to December 2015 covers a major flood event in Calgary, Alberta, Canada. Post-processed results show that the RPP is able to remove the bias and reduce the errors of both GEFS and GDPS forecasts. Ensembles generated from the RPP reliably quantify the forecast uncertainty.
Spatio-Temporal Data Analysis at Scale Using Models Based on Gaussian Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stein, Michael
Gaussian processes are the most commonly used statistical model for spatial and spatio-temporal processes that vary continuously. They are broadly applicable in the physical sciences and engineering and are also frequently used to approximate the output of complex computer models, deterministic or stochastic. We undertook research related to theory, computation, and applications of Gaussian processes as well as some work on estimating extremes of distributions for which a Gaussian process assumption might be inappropriate. Our theoretical contributions include the development of new classes of spatial-temporal covariance functions with desirable properties and new results showing that certain covariance models lead to predictions with undesirable properties. To understand how Gaussian process models behave when applied to deterministic computer models, we derived what we believe to be the first significant results on the large sample properties of estimators of parameters of Gaussian processes when the actual process is a simple deterministic function. Finally, we investigated some theoretical issues related to maxima of observations with varying upper bounds and found that, depending on the circumstances, standard large sample results for maxima may or may not hold. Our computational innovations include methods for analyzing large spatial datasets when observations fall on a partially observed grid and methods for estimating parameters of a Gaussian process model from observations taken by a polar-orbiting satellite. In our application of Gaussian process models to deterministic computer experiments, we carried out some matrix computations that would have been infeasible using even extended precision arithmetic by focusing on special cases in which all elements of the matrices under study are rational and using exact arithmetic.
The applications we studied include total column ozone as measured from a polar-orbiting satellite, sea surface temperatures over the Pacific Ocean, and annual temperature extremes at a site in New York City. In each of these applications, our theoretical and computational innovations were directly motivated by the challenges posed by analyzing these and similar types of data.
The Re-Analysis of Ozone Profile Data from a 41-Year Series of SBUV Instruments
NASA Technical Reports Server (NTRS)
Kramarova, Natalya; Frith, Stacey; Bhartia, Pawan K.; McPeters, Richard; Labow, Gordon; Taylor, Steven; Fisher, Bradford
2012-01-01
In this study we present the validation of ozone profiles from a number of Solar Backscatter Ultraviolet (SBUV and SBUV/2) instruments that were recently reprocessed using an updated (Version 8.6) algorithm. The SBUV dataset provides the longest available record of global ozone profiles, spanning a 41-year period from 1970 to 2011 (except for a 5-year gap in the 1970s), and includes ozone profile records obtained from the Nimbus-4 BUV and Nimbus-7 SBUV instruments and a series of SBUV(/2) instruments launched on NOAA operational satellites (NOAA 09, 11, 14, 16, 17, 18, 19). Although modifications in instrument design were made in the evolution from the BUV instrument to the modern SBUV(/2) model, the basic principles of the measurement technique and retrieval algorithm remain the same. The long-term SBUV data record allows us to create a consistent, calibrated dataset of ozone profiles that can be used for climate studies and trend analyses. In particular, we focus on estimating the various sources of error in the SBUV profile ozone retrievals using independent observations and analysis of the algorithm itself. For the first time we include in the metadata a quantitative estimate of the smoothing error, defined as the error due to profile variability that the SBUV observing system cannot inherently measure. The magnitude of the smoothing error varies with altitude, latitude, season and solar zenith angle. Between 10 and 1 hPa the smoothing errors for the SBUV monthly zonal mean retrievals are of the order of 1%, but start to increase above and below this layer. The largest smoothing errors, as large as 15-20%, were detected in the troposphere. The SBUV averaging kernels, provided with the ozone profiles in Version 8.6, help to eliminate the smoothing effect when comparing the SBUV profiles with high vertical resolution measurements, and make it convenient to use the SBUV ozone profiles for data assimilation and model validation purposes.
The smoothing error can also be minimized by combining layers of data, and we will discuss recommendations for this approach as well. The SBUV ozone profiles have been intensively validated against satellite profile measurements obtained from the Microwave Limb Sounders (MLS) on board the UARS and Aura satellites, the Stratospheric Aerosol and Gas Experiment (SAGE) and the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS). Also, we compare coincident and collocated SBUV ozone retrievals with observations made by ground-based instruments, such as microwave spectrometers, lidars, Umkehr instruments and balloon-borne ozonesondes. Finally, we compare the SBUV ozone profiles with output from the NASA GSFC GEOS-CCM model. In the stratosphere between 25 and 1 hPa the mean biases and standard deviations are within 5% for monthly mean ozone profiles. Above and below this layer the vertical resolution of the SBUV algorithm decreases and the effects of vertical smoothing should be taken into account. Though the SBUV algorithm has a coarser vertical resolution in the lower stratosphere and troposphere, it is capable of precisely estimating the integrated ozone column between the surface and 25 hPa. The time series of the tropospheric-lower stratospheric ozone column derived from SBUV agrees within 5% with the corresponding values observed by an ensemble of ozonesonde stations in the Northern Hemisphere. Drifts of the ozone time series obtained from each SBUV(/2) instrument relative to ground-based and satellite measurements are evaluated, and some features of individual SBUV(/2) instruments are discussed. In addition to evaluating individual instruments against independent observations, we also focus on the instrument-to-instrument consistency of the series. Overall, Version 8.6 ozone profiles obtained from two different SBUV(/2) instruments agree within a couple of percent during overlap periods and vary consistently in time, with some exceptions.
Some of the noted discrepancies might be associated with ozone diurnal variations, since the difference in the local time of the observations for a pair of SBUV(/2) instruments could be several hours. Other issues include potential short-term drifts in the measurements as the instrument orbit drifts and measurements are obtained at high solar zenith angles (>85°). Based on the results of the validation, a consistent, calibrated dataset of SBUV ozone profiles has been created using internal calibration only.
Comparison of the economic impact of different wind power forecast systems for producers
NASA Astrophysics Data System (ADS)
Alessandrini, S.; Davò, F.; Sperati, S.; Benini, M.; Delle Monache, L.
2014-05-01
Deterministic forecasts of wind production for the next 72 h at a single wind farm or at the regional level are among the main end-user requirements. However, for optimal management of wind power production and distribution it is important to provide, together with a deterministic prediction, a probabilistic one. A deterministic forecast consists of a single value for each future time for the variable to be predicted, while a probabilistic forecast informs on the probabilities of potential future events. This means providing information about uncertainty (i.e., a forecast of the PDF of power) in addition to the commonly provided single-valued power prediction. A significant probabilistic application is the trading of energy in day-ahead electricity markets. It has been shown that, when trading future wind energy production, using probabilistic wind power predictions can lead to higher benefits than those obtained by using deterministic forecasts alone. In fact, by using probabilistic forecasting it is possible to solve economic model equations that optimize the revenue for the producer depending, for example, on the specific penalties for forecast errors valid in that market. In this work we have applied a probabilistic wind power forecast system based on the "analog ensemble" method for bidding wind energy in the day-ahead market for a wind farm located in Italy. The actual hourly income for the plant is computed considering the actual energy selling prices and penalties proportional to the imbalance, defined as the difference between the day-ahead offered energy and the actual production. The economic benefit of using a probabilistic approach for day-ahead energy bidding is evaluated, resulting in a 23% increase in annual income for the wind farm owner in the case of knowing "a priori" the future energy prices.
The uncertainty in price forecasting partly reduces the economic benefit gained by using a probabilistic energy forecast system.
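Why a probabilistic forecast can out-earn a single-valued one under imbalance penalties can be shown with a stylized settlement rule. All numbers and the symmetric penalty below are invented; the actual Italian market rules and the study's analog-ensemble PDF are more involved:

```python
import numpy as np

rng = np.random.default_rng(4)

# Stylized day-ahead settlement (numbers invented): energy is paid at
# `price` and any imbalance between the offer and the actual production
# is penalized per MWh.
price, penalty = 50.0, 20.0

def mean_income(bid, actual):
    return np.mean(price * actual - penalty * np.abs(bid - actual))

# Skewed production scenarios, standing in for an analog-ensemble PDF.
actual = rng.gamma(2.0, 5.0, size=10000)   # MWh

det_bid = actual.mean()       # single-valued ("deterministic") offer
prob_bid = np.median(actual)  # quantile offer derived from the full PDF

print(mean_income(det_bid, actual), mean_income(prob_bid, actual))
```

With a symmetric per-MWh penalty the income-maximizing offer is the median of the production distribution, which only a probabilistic forecast provides; asymmetric penalties shift the optimal offer to another quantile.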
The Stochastic Evolution of a Protocell: The Gillespie Algorithm in a Dynamically Varying Volume
Carletti, T.; Filisetti, A.
2012-01-01
We propose an improvement of the Gillespie algorithm that allows us to study the time evolution of an ensemble of chemical reactions occurring in a varying volume, whose growth is directly related to the amount of some specific molecules belonging to the reaction set. This allows us to study the stochastic evolution of a protocell whose volume increases because of the production of container molecules. Several protocell models are considered and compared with the deterministic models. PMID:22536297
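The key coupling, propensities that depend on a volume which itself follows the chemical state, can be sketched with a minimal stochastic simulation algorithm. The rate constants, the linear volume law and the single "container" species below are invented for illustration and are far simpler than the paper's protocell models:

```python
import numpy as np

rng = np.random.default_rng(5)

# Minimal Gillespie SSA in which the volume tracks the copy number of a
# "container" molecule C (all constants and the volume law are invented).
k_make = 10.0     # production rate per unit volume (0 -> C)
k_decay = 0.1     # first-order decay rate (C -> 0)
V0 = 1.0          # initial volume
n_per_V = 100.0   # C molecules needed to add one unit of volume

def ssa(t_end=50.0):
    t, nC = 0.0, 0
    while True:
        V = V0 + nC / n_per_V              # volume follows composition
        a = (k_make * V, k_decay * nC)     # propensities of the 2 reactions
        a_tot = a[0] + a[1]
        t += rng.exponential(1.0 / a_tot)  # waiting time to next event
        if t >= t_end:
            return nC, V
        nC += 1 if rng.random() * a_tot < a[0] else -1

nC, V = ssa()
print(nC, V)  # net production is ~10 per time unit, so nC is near 500
```

Because the production propensity is recomputed with the current volume at every event, growth of the container feeds back into the reaction rates, which is exactly the varying-volume effect the modified algorithm is designed to capture.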
NASA Astrophysics Data System (ADS)
Kim, Ok-Yeon; Kim, Hye-Mi; Lee, Myong-In; Min, Young-Mi
2017-01-01
This study aims at predicting the seasonal number of typhoons (TY) over the western North Pacific with an Asia-Pacific Climate Center (APCC) multi-model ensemble (MME)-based dynamical-statistical hybrid model. The hybrid model uses the statistical relationship between the number of TY during the typhoon season (July-October) and large-scale key predictors forecasted by the APCC MME for the same season. The cross-validation result from the MME hybrid model demonstrates high prediction skill, with a correlation of 0.67 between the hindcasts and observations for 1982-2008. Cross validation of the hybrid model with the individual models participating in the MME indicates that no single model consistently outperforms the others in predicting typhoon number. Although the forecast skill of the MME is not always the highest compared with that of each individual model, the MME presents rather higher average correlations and a smaller variance of correlations. Given the large set of ensemble members from multiple models, a relative operating characteristic score reveals an 82% (above-normal) and 78% (below-normal) improvement for the probabilistic prediction of the number of TY. This implies that there is an 82% (78%) probability that the forecasts can successfully discriminate above-normal (below-normal) years from other years. The forecast skill of the hybrid model for the past 7 years (2002-2008) is higher than that of the forecast from the Tropical Storm Risk consortium. Using the large set of ensemble members from multiple models, the APCC MME could provide useful deterministic and probabilistic seasonal typhoon forecasts to end-users, in particular the residents of tropical cyclone-prone areas in the Asia-Pacific region.
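The cross-validated hindcast correlation quoted above comes from a leave-one-out scheme: each year is predicted by a model refit without that year. A minimal sketch on invented data (a single synthetic predictor and typhoon counts, not the APCC MME predictors) is:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic 27-year record (mirroring 1982-2008): one large-scale predictor
# x and observed typhoon counts y. Data and the one-predictor linear model
# are invented stand-ins for the MME hybrid scheme.
n = 27
x = rng.normal(size=n)
y = 20.0 + 4.0 * x + rng.normal(0.0, 2.0, size=n)

# Leave-one-out cross validation: refit without year i, hindcast year i.
hindcast = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i
    slope, intercept = np.polyfit(x[keep], y[keep], 1)
    hindcast[i] = slope * x[i] + intercept

r = np.corrcoef(hindcast, y)[0, 1]
print(round(r, 2))
```

Because every hindcast year is excluded from its own fit, the resulting correlation is an honest estimate of out-of-sample skill rather than of goodness of fit.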
Modeling disease transmission near eradication: An equation free approach
NASA Astrophysics Data System (ADS)
Williams, Matthew O.; Proctor, Joshua L.; Kutz, J. Nathan
2015-01-01
Although disease transmission in the near-eradication regime is inherently stochastic, deterministic quantities such as the probability of eradication are of interest to policy makers and researchers. Rather than running large ensembles of discrete stochastic simulations over long time intervals to compute these deterministic quantities, we create a data-driven and deterministic "coarse" model for them using the Equation-Free (EF) framework. In lieu of deriving an explicit coarse model, the EF framework approximates any needed information, such as coarse time derivatives, by running short computational experiments. However, the choice of the coarse variables (i.e., the state of the coarse system) is critical if the resulting model is to be accurate. In this manuscript, we propose a set of coarse variables that result in an accurate model in the endemic and near-eradication regimes, and demonstrate this on a compartmental model representing the spread of poliomyelitis. When combined with adaptive time-stepping coarse projective integrators, this approach can yield a speedup of over a factor of two compared to direct simulation and, due to its lower dimensionality, could be beneficial when conducting systems-level tasks such as designing eradication or monitoring campaigns.
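The lift-run-restrict-leap cycle of coarse projective integration can be sketched in a minimal form. The micro-model below is a hypothetical noisy exponential decay rather than the paper's compartmental polio model, and the coarse variable is simply the ensemble mean:

```python
import random

def micro_step(x, dt, rng):
    # toy stochastic micro-model: noisy exponential decay dx = -x dt + small noise
    return x - x * dt + 0.01 * rng.gauss(0.0, dt ** 0.5)

def projective_step(x0, dt_micro, n_micro, dt_leap, n_reps=200, seed=0):
    """One Equation-Free coarse projective-integration step (sketch).

    Lift the coarse state to an ensemble, run a short micro burst,
    restrict via the ensemble mean, estimate the coarse time derivative
    from the last two restricted means, then leap forward in time.
    """
    rng = random.Random(seed)
    ens = [x0] * n_reps                        # lift: copies of the coarse state
    means = []
    for _ in range(n_micro + 1):
        means.append(sum(ens) / n_reps)        # restrict: ensemble mean
        ens = [micro_step(x, dt_micro, rng) for x in ens]
    dxdt = (means[-1] - means[-2]) / dt_micro  # coarse time-derivative estimate
    return means[-1] + dt_leap * dxdt          # projective (Euler) leap
```

The speedup comes from `dt_leap` being much larger than the micro burst `n_micro * dt_micro`; the leap is first-order, so it trades some accuracy for that jump.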
NASA Technical Reports Server (NTRS)
Li, Feng; Newman, Paul; Pawson, Steven; Waugh, Darryn
2014-01-01
Stratospheric ozone depletion has played a dominant role in driving Antarctic climate change in recent decades. In order to capture the stratospheric ozone forcing, many coupled atmosphere-ocean general circulation models (AOGCMs) prescribe the Antarctic ozone hole using monthly and zonally averaged ozone fields. However, the prescribed ozone hole has a high ozone bias and lacks zonal asymmetry. The impacts of these biases on model simulations, particularly on the Southern Ocean and Antarctic sea ice, are not well understood. The purpose of this study is to determine the effects of using interactive stratospheric chemistry instead of prescribed ozone on Antarctic and Southern Ocean climate change in an AOGCM. We compare two sets of ensemble simulations for the 1960-2010 period using different versions of the Goddard Earth Observing System 5 AOGCM: one with interactive stratospheric chemistry, and the other with prescribed monthly and zonally averaged ozone and 6 other stratospheric radiative species calculated from the interactive chemistry simulations. Consistent with previous studies using prescribed sea surface temperatures and sea ice concentrations, the interactive chemistry runs simulate a deeper Antarctic ozone hole and consistently larger changes in surface pressure and winds than the prescribed ozone runs. The use of a coupled atmosphere-ocean model in this study enables us to determine the impact of these surface changes on Southern Ocean circulation and Antarctic sea ice. The larger surface wind trends in the interactive chemistry case lead to larger Southern Ocean circulation trends, with stronger changes in northerly and westerly surface flow near the Antarctic continent and stronger upwelling near 60°S. The interactive chemistry runs also simulate a larger decrease in sea ice concentration.
Our results highlight the importance of using interactive chemistry in order to correctly capture the influences of stratospheric ozone depletion on climate change over Antarctica and the Southern Ocean.
Heat wave over India during summer 2015: an assessment of real time extended range forecast
NASA Astrophysics Data System (ADS)
Pattanaik, D. R.; Mohapatra, M.; Srivastava, A. K.; Kumar, Arun
2017-08-01
Hot winds are a marked feature of the summer season in India during late spring, preceding the climatological onset of the monsoon season in June. In some years the conditions become very severe, with the maximum temperature (T_max) exceeding 45 °C for many days over parts of north-western India, the eastern coastal states and the Indo-Gangetic plain. During the summer of 2015 (late May to early June), the eastern coastal states and the central and northwestern parts of India experienced severe heat wave conditions, leading to the loss of thousands of human lives under extreme high temperatures. Not only humans but also animals and birds were highly vulnerable to these extreme heat wave conditions. In this study, an attempt is made to assess the performance of the real-time extended range forecast (forecast up to 3 weeks) of this scorching T_max based on the latest version of NCEP's Climate Forecast System coupled model (CFSv2). The heat wave was most severe during the week from 22 to 28 May, and the subsequent week from 29 May to 4 June also witnessed high T_max over many parts of central India, including the eastern coastal states. The 8 ensemble members of the operational CFSv2 model are used once a week to prepare the weekly bias-corrected deterministic (ensemble mean) T_max forecast for 3 weeks, valid from Friday to Thursday, coinciding with the heat wave periods of 2015. Using the 8 ensemble members separately together with the corresponding CFSv2 hindcast climatology, probabilities of above- and below-normal T_max are also prepared for the same 3 weeks. The real-time deterministic and probabilistic forecasts did indicate the impending heat wave over many parts of India during late May and early June of 2015, associated with strong northwesterly winds over the main land mass of India delaying the sea breeze and leading to heat waves over the eastern coastal regions.
Thus, the capability of the coupled model to provide early warning of such deadly heat waves can be very useful to disaster managers in taking appropriate actions to minimize the loss of life and property due to such high T_max.
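Bias-corrected deterministic forecasts of this kind usually subtract the model's hindcast-climatology bias from the ensemble mean. A minimal sketch of that mean-bias removal (the exact CFSv2 procedure may differ):

```python
def bias_corrected_ensemble_mean(members, hindcast_clim, obs_clim):
    """Simple mean-bias correction of a deterministic ensemble-mean forecast.

    members       : raw forecasts (e.g. weekly T_max) from each ensemble member
    hindcast_clim : model climatology for this week from the hindcast set
    obs_clim      : observed climatology for the same week
    (A sketch of the standard approach, not the paper's exact method.)
    """
    ens_mean = sum(members) / len(members)
    bias = hindcast_clim - obs_clim        # systematic model bias for this week
    return ens_mean - bias                 # bias-corrected deterministic forecast
```

With per-member probabilities, the same hindcast climatology also supplies the above/below-normal terciles against which each member is counted.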
The integrated process rates (IPR) estimated by the Eta-CMAQ model at grid cells along the trajectory of the air mass transport path were analyzed to quantitatively investigate the relative importance of physical and chemical processes for O3 formation and evolution ov...
Scale-invariant Green-Kubo relation for time-averaged diffusivity
NASA Astrophysics Data System (ADS)
Meyer, Philipp; Barkai, Eli; Kantz, Holger
2017-12-01
In recent years it was shown both theoretically and experimentally that in certain systems exhibiting anomalous diffusion the time- and ensemble-averaged mean-squared displacements are remarkably different. The ensemble-averaged diffusivity is obtained from a scaling Green-Kubo relation, which connects the scale-invariant nonstationary velocity correlation function with the transport coefficient. Here we obtain the relation between the time-averaged diffusivity, usually recorded in single-particle tracking experiments, and the underlying scale-invariant velocity correlation function. The time-averaged mean-squared displacement is given by ⟨δ²⟩ ∼ 2 D_ν t^β Δ^(ν−β), where t is the total measurement time and Δ is the lag time. Here ν is the anomalous diffusion exponent obtained from ensemble-averaged measurements, ⟨x²⟩ ∼ t^ν, while β ≥ −1 marks the growth or decline of the kinetic energy, ⟨v²⟩ ∼ t^β. Thus, we establish a connection between exponents that can be read off the asymptotic properties of the velocity correlation function, and similarly for the transport constant D_ν. We demonstrate our results with nonstationary scale-invariant stochastic and deterministic models, thereby highlighting that systems with equivalent behavior in the ensemble average can differ strongly in their time average. If the averaged kinetic energy is finite, β = 0, the time scalings of ⟨δ²⟩ and ⟨x²⟩ are identical; however, the time-averaged transport coefficient D_ν is not identical to the corresponding ensemble-averaged diffusion constant.
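The time-averaged mean-squared displacement referred to above is the standard single-particle-tracking estimator, computed directly from one trajectory:

```python
def time_averaged_msd(x, lag):
    """Time-averaged squared displacement at lag Δ along one trajectory:
    TA-MSD(Δ) = (1/(T-Δ)) Σ_t (x[t+Δ] - x[t])², the standard
    single-particle-tracking estimator."""
    n = len(x) - lag
    return sum((x[t + lag] - x[t]) ** 2 for t in range(n)) / n
```

Sweeping `lag` at fixed trajectory length t and then sweeping t at fixed `lag` recovers the Δ^(ν−β) and t^β scalings discussed in the abstract.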
NASA Astrophysics Data System (ADS)
Escriba, P. A.; Callado, A.; Santos, D.; Santos, C.; Simarro, J.; García-Moya, J. A.
2009-09-01
At 00 UTC 24 January 2009 an explosive cyclogenesis that originated over the Atlantic Ocean reached its maximum intensity, with observed surface pressures lower than 970 hPa at its centre, located over the Bay of Biscay. During its path across southern France this low caused strong westerly and north-westerly winds over the Iberian Peninsula, exceeding 150 km/h in some places. These extreme winds left 10 casualties in Spain, 8 of them in Catalonia. The aim of this work is to show whether there is added value in the short-range prediction of the 24 January 2009 strong winds when using the Short Range Ensemble Prediction System (SREPS) of the Spanish Meteorological Agency (AEMET), with respect to the operational forecasting tools. This study emphasizes two aspects of probabilistic forecasting: the ability of a 3-day forecast to warn of an extreme wind event, and the ability to quantify the predictability of the event, thereby adding value to the deterministic forecast. Two types of probabilistic wind forecasts are produced, an uncalibrated one and one calibrated using Bayesian Model Averaging (BMA). AEMET runs SREPS experimentally twice a day (00 and 12 UTC). The system consists of 20 members constructed by integrating 5 limited-area models, COSMO (COSMO), HIRLAM (HIRLAM Consortium), HRM (DWD), MM5 (NOAA) and UM (UKMO), at 25 km horizontal resolution. Each model uses 4 different initial and boundary conditions, from the global models GFS (NCEP), GME (DWD), IFS (ECMWF) and UM. In this way a probabilistic forecast is obtained that takes into account initial-condition, boundary-condition and model errors. BMA is a statistical tool for combining predictive probability functions from different sources. The BMA predictive probability density function (PDF) is a weighted average of PDFs centred on the individual bias-corrected forecasts. The weights are equal to the posterior probabilities of the models generating the forecasts and reflect the skill of the ensemble members.
Here BMA is applied to provide probabilistic forecasts of wind speed. In this work several forecasts for different time ranges (H+72, H+48 and H+24) of 10-m wind speed over Catalonia are verified subjectively at one of the instants of maximum intensity, 12 UTC 24 January 2009. On the one hand, three probabilistic forecasts are compared: ECMWF EPS, non-calibrated SREPS and calibrated SREPS. On the other hand, the relationship between predictability and the skill of the deterministic forecast is studied by looking at HIRLAM 0.16 deterministic forecasts of the event. Verification focuses on the location and intensity of the 10-m wind speed, and 10-minute measurements from AEMET automatic ground stations are used as observations. The results indicate that SREPS is able to forecast, three days ahead, mean winds higher than 36 km/h and that it correctly localizes them with a significant probability of occurrence in the affected area. The probability is higher after BMA calibration of the ensemble. The fact that the probability of strong winds is high allows us to state that the predictability of the event is also high and, as a consequence, deterministic forecasts are more reliable. This is confirmed when verifying HIRLAM deterministic forecasts against observed values.
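The BMA predictive PDF described above is a weighted mixture of densities centred on the bias-corrected member forecasts. A minimal sketch with Gaussian kernels and a shared, pre-fitted spread (operationally, the weights and spread are estimated by maximum likelihood over a training period):

```python
import math

def bma_pdf(x, forecasts, weights, sigma):
    """BMA predictive density: a weighted average of Gaussian PDFs centred
    on the individual (bias-corrected) member forecasts. Using one shared
    spread sigma for all members is a simplification made here for brevity."""
    def gauss(mu):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
    return sum(w * gauss(f) for w, f in zip(weights, forecasts))
```

Because the weights are posterior model probabilities, they sum to one and the mixture is itself a proper density; thresholding it gives calibrated exceedance probabilities such as P(wind > 36 km/h).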
Stochastic Parametrisations and Regime Behaviour of Atmospheric Models
NASA Astrophysics Data System (ADS)
Arnold, Hannah; Moroz, Irene; Palmer, Tim
2013-04-01
The presence of regimes is a characteristic of non-linear, chaotic systems (Lorenz, 2006). In the atmosphere, regimes emerge as familiar circulation patterns such as the El Niño-Southern Oscillation (ENSO), the North Atlantic Oscillation (NAO) and Scandinavian blocking events. In recent years there has been much interest in the problem of identifying and studying atmospheric regimes (Solomon et al., 2007). In particular, how do these regimes respond to an external forcing such as anthropogenic greenhouse gas emissions? The importance of regimes in observed trends over the past 50-100 years indicates that in order to predict anthropogenic climate change, our climate models must be able to represent natural circulation regimes, their statistics and their variability accurately. It is well established that representing model uncertainty as well as initial condition uncertainty is important for reliable weather forecasts (Palmer, 2001). In particular, stochastic parametrisation schemes have been shown to improve the skill of weather forecast models (e.g. Berner et al., 2009; Frenkel et al., 2012; Palmer et al., 2009). It is possible that including stochastic physics as a representation of model uncertainty could also be beneficial in climate modelling, enabling the simulator to explore larger regions of the climate attractor, including other flow regimes. An alternative representation of model uncertainty is a perturbed parameter scheme, whereby physical parameters in subgrid parametrisation schemes are perturbed about their optimal values. Perturbing parameters gives greater control over the ensemble than multi-model or multi-parametrisation ensembles, and has been used as a representation of model uncertainty in climate prediction (Stainforth et al., 2005; Rougier et al., 2009). We investigate the effect of including representations of model uncertainty on the regime behaviour of a simulator.
A simple chaotic model of the atmosphere, the Lorenz '96 system, is used to study the predictability of regime changes (Lorenz 1996, 2006). Three types of models are considered: a deterministic parametrisation scheme, stochastic parametrisation schemes with additive or multiplicative noise, and a perturbed parameter ensemble. Each forecasting scheme was tested on its ability to reproduce the attractor of the full system, defined in a reduced space based on EOF decomposition. None of the forecast models accurately capture the less common regime, though a significant improvement is observed over the deterministic parametrisation when a temporally correlated stochastic parametrisation is used. The attractor for the perturbed parameter ensemble improves on that forecast by the deterministic or white additive schemes, showing a distinct peak in the attractor corresponding to the less common regime. However, the 40 constituent members of the perturbed parameter ensemble each differ greatly from the true attractor, with many only showing one dominant regime with very rare transitions. These results indicate that perturbed parameter ensembles must be carefully analysed as individual members may have very different characteristics to the ensemble mean and to the true system being modelled. On the other hand, the stochastic parametrisation schemes tested performed well, improving the simulated climate, and motivating the development of a stochastic earth-system simulator for use in climate prediction. J. Berner, G. J. Shutts, M. Leutbecher, and T. N. Palmer. A spectral stochastic kinetic energy backscatter scheme and its impact on flow dependent predictability in the ECMWF ensemble prediction system. J. Atmos. Sci., 66(3):603-626, 2009. Y. Frenkel, A. J. Majda, and B. Khouider. Using the stochastic multicloud model to improve tropical convective parametrisation: A paradigm example. J. Atmos. Sci., 69(3):1080-1105, 2012. E. N. Lorenz. Predictability: a problem partly solved. 
In Proceedings, Seminar on Predictability, 4-8 September 1995, volume 1, pages 1-18, Shinfield Park, Reading, 1996. ECMWF. E. N. Lorenz. Regimes in simple systems. J. Atmos. Sci., 63(8):2056-2073, 2006. T. N Palmer. A nonlinear dynamical perspective on model error: A proposal for non-local stochastic-dynamic parametrisation in weather and climate prediction models. Q. J. Roy. Meteor. Soc., 127(572):279-304, 2001. T. N. Palmer, R. Buizza, F. Doblas-Reyes, T. Jung, M. Leutbecher, G. J. Shutts, M. Steinheimer, and A. Weisheimer. Stochastic parametrization and model uncertainty. Technical Report 598, European Centre for Medium-Range Weather Forecasts, 2009. J. Rougier, D. M. H. Sexton, J. M. Murphy, and D. Stainforth. Analyzing the climate sensitivity of the HadSM3 climate model using ensembles from different but related experiments. J. Climate, 22:3540-3557, 2009. S. Solomon, D. Qin, M. Manning, Z. Chen, M. Marquis, K. B. Averyt, Tignor M., and H. L. Miller. Climate models and their evaluation. In Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report of the Intergovernmental Panel on Climate Change, Cambridge, United Kingdom and New York, NY, USA, 2007. Cambridge University Press. D. A Stainforth, T. Aina, C. Christensen, M. Collins, N. Faull, D. J. Frame, J. A. Kettleborough, S. Knight, A. Martin, J. M. Murphy, C. Piani, D. Sexton, L. A. Smith, R. A Spicer, A. J. Thorpe, and M. R Allen. Uncertainty in predictions of the climate response to rising levels of greenhouse gases. Nature, 433(7024):403-406, 2005.
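The Lorenz '96 experiments described above compare deterministic and stochastic subgrid schemes. A minimal sketch of the single-scale Lorenz '96 tendency with a deliberately simple linear deterministic parametrisation U(X) = p·X (hypothetical; the study's actual schemes may differ) and an optional additive-noise variant:

```python
import random

def l96_tendency(x, forcing, param, rng=None, noise_sd=0.0):
    """Lorenz '96 slow-variable tendency with a subgrid term:
    dX_k/dt = (X_{k+1} - X_{k-2}) X_{k-1} - X_k + F - U(X_k).
    U(X) = param * X is a simple deterministic parametrisation chosen
    for illustration; passing rng/noise_sd adds white additive noise
    (a stochastic parametrisation)."""
    n = len(x)
    out = []
    for k in range(n):
        adv = (x[(k + 1) % n] - x[(k - 2) % n]) * x[(k - 1) % n]  # cyclic advection
        eps = rng.gauss(0.0, noise_sd) if rng is not None else 0.0
        out.append(adv - x[k] + forcing - param * x[k] + eps)
    return out

def step_euler(x, dt, **kwargs):
    """One explicit Euler step (a real experiment would use RK4)."""
    return [xi + dt * dxi for xi, dxi in zip(x, l96_tendency(x, **kwargs))]
```

Long integrations of the deterministic versus noisy variants, projected onto leading EOFs, are what reveal the regime-occupation differences the abstract reports.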
NASA Astrophysics Data System (ADS)
Mozaffarilegha, Marjan; Esteki, Ali; Ahadi, Mohsen; Nazeri, Ahmadreza
The speech-evoked auditory brainstem response (sABR) shows how complex sounds such as speech and music are processed in the auditory system. The speech-ABR could be used to evaluate particular impairments and improvements in the auditory processing system. Many researchers have used linear approaches for characterizing different components of the sABR signal, whereas nonlinear techniques have been applied less commonly. The primary aim of the present study is to examine the underlying dynamics of normal sABR signals; the secondary goal is to evaluate whether chaotic features exist in this signal. We present a methodology for determining the various components of sABR signals by performing Ensemble Empirical Mode Decomposition (EEMD) to obtain the intrinsic mode functions (IMFs). Then, the composite multiscale entropy (CMSE), the largest Lyapunov exponent (LLE) and deterministic nonlinear prediction are computed for each extracted IMF. EEMD decomposes the sABR signal into five modes and a residue. The CMSE results for sABR signals obtained from 40 healthy people showed that the 1st and 2nd IMFs were similar to white noise, the 3rd IMF to a synthetic chaotic time series, and the 4th and 5th IMFs to a sine waveform. LLE analysis showed positive values for the 3rd IMF. Moreover, the 1st and 2nd IMFs showed overlap with surrogate data, while the 3rd, 4th and 5th IMFs showed no overlap with the corresponding surrogate data. The results indicate the presence of noisy, chaotic and deterministic components in the signal, corresponding respectively to the 1st and 2nd IMFs, the 3rd IMF, and the 4th and 5th IMFs. While these findings provide supportive evidence of the chaos conjecture for the 3rd IMF, they do not confirm any such claims. However, they provide a first step towards an understanding of the nonlinear behavior of auditory system dynamics at the brainstem level.
NASA Astrophysics Data System (ADS)
Cosgrove, B.; Gochis, D.; Clark, E. P.; Cui, Z.; Dugger, A. L.; Fall, G. M.; Feng, X.; Fresch, M. A.; Gourley, J. J.; Khan, S.; Kitzmiller, D.; Lee, H. S.; Liu, Y.; McCreight, J. L.; Newman, A. J.; Oubeidillah, A.; Pan, L.; Pham, C.; Salas, F.; Sampson, K. M.; Smith, M.; Sood, G.; Wood, A.; Yates, D. N.; Yu, W.; Zhang, Y.
2015-12-01
The National Weather Service (NWS) National Water Center (NWC) is collaborating with the NWS National Centers for Environmental Prediction (NCEP) and the National Center for Atmospheric Research (NCAR) to implement a first-of-its-kind operational instance of the Weather Research and Forecasting (WRF)-Hydro model over the Continental United States (CONUS) and contributing drainage areas on the NWS Weather and Climate Operational Supercomputing System (WCOSS) supercomputer. The system will provide seamless, high-resolution, continuously cycling forecasts of streamflow and other hydrologic outputs of value from both deterministic- and ensemble-type runs. WRF-Hydro will form the core of the NWC national water modeling strategy, supporting NWS hydrologic forecast operations along with the emergency response and water management efforts of partner agencies. Input and output from the system will be comprehensively verified via the NWC Water Resource Evaluation Service. Hydrologic events occur on a wide range of temporal scales, from fast-acting flash floods to long-term flow events impacting water supply. In order to capture this range of events, the initial operational WRF-Hydro configuration will feature 1) hourly analysis runs, 2) short- and medium-range deterministic forecasts out to two-day and ten-day horizons and 3) long-range ensemble forecasts out to 30 days. All three of these configurations are underpinned by a 1-km execution of the NoahMP land surface model, with channel routing taking place on 2.67 million NHDPlusV2 catchments covering the CONUS and contributing areas. Additionally, the short- and medium-range forecast runs will feature surface and sub-surface routing on a 250-m grid, while the hourly analyses will feature this same 250-m routing in addition to nudging-based assimilation of US Geological Survey (USGS) streamflow observations.
A limited number of major reservoirs will be configured within the model to begin to represent the first-order impacts of streamflow regulation.
NASA Astrophysics Data System (ADS)
Addor, N.; Jaun, S.; Zappa, M.
2011-01-01
The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on limited-area atmospheric forecasts provided by the deterministic model COSMO-7 and the probabilistic model COSMO-LEPS. These atmospheric forecasts are used to force a semi-distributed hydrological model (PREVAH), coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are eventually communicated to the stakeholders involved in the Sihl discharge management. This fully operational setting provides a real framework in which to compare the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a reforecast was made for the period June 2007 to December 2009 for the Sihl catchment (336 km2). Several metrics support the conclusion that the performance gain can amount to up to 2 days of lead time for the catchment considered. Brier skill scores show that COSMO-LEPS-based hydrological forecasts overall outperform their COSMO-7-based counterparts for all the lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but makes them particularly dependent on correct precipitation forecasts, as shown by comparisons with a reference run driven by observed meteorological parameters. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited. The two most intense events of the study period are investigated using a novel graphical representation of probability forecasts and are used to generate high-discharge scenarios.
They highlight challenges for making decisions on the basis of hydrological predictions, and indicate the need for a tool to be used in addition to forecasts to compare the different mitigation actions possible in the Sihl catchment.
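The Brier skill score used in evaluations like the one above compares the Brier score of the probabilistic forecasts against a reference, typically climatology:

```python
def brier_skill_score(prob_forecasts, outcomes, clim_freq):
    """Brier skill score of probabilistic event forecasts relative to a
    climatological reference: BSS = 1 - BS / BS_ref, where
    BS = mean (p - o)^2 over forecast probabilities p and binary outcomes o."""
    n = len(outcomes)
    bs = sum((p - o) ** 2 for p, o in zip(prob_forecasts, outcomes)) / n
    bs_ref = sum((clim_freq - o) ** 2 for o in outcomes) / n
    return 1.0 - bs / bs_ref
```

BSS = 1 is a perfect forecast, BSS = 0 matches climatology, and negative values are worse than climatology, which is how lead-time gains of a probabilistic chain over a deterministic one are quantified.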
NASA Astrophysics Data System (ADS)
Layer, Michael
Damaging wind events not associated with severe convective storms or tropical cyclones can occur over the Northeast U.S. during the cool season and can cause significant problems for transportation, infrastructure, and public safety. These non-convective wind events (NCWEs) are difficult for operational forecasters to predict in the NYC region, as revealed by relatively poor verification statistics in recent years. This study investigates the climatology of NCWEs occurring between 15 September and 15 May over 13 seasons, from 2000-2001 through 2012-2013. The events are broken down into three distinct types commonly observed in the region: pre-cold frontal (PRF), post-cold frontal (POF), and nor'easter/coastal storm (NEC) cases. Relationships between observed winds and atmospheric parameters such as the 900 hPa height gradient, 3-hour MSLP tendency, low-level wind profile, and stability are also studied. Overall, PRF and NEC events exhibit stronger height gradients, stronger low-level winds, and stronger low-level stability than POF events. Model verification is also conducted over the 2009-2014 period using the Short Range Ensemble Forecast system (SREF) from the National Centers for Environmental Prediction (NCEP). Both deterministic and probabilistic verification metrics are used to evaluate the performance of the ensemble during NCWEs. Although the SREF has better forecast skill than most of the deterministic SREF control members, it is rather poorly calibrated and exhibits significant overforecasting, i.e. a positive wind speed bias in the lower atmosphere.
Evaluation and intercomparison of air quality forecasts over Korea during the KORUS-AQ campaign
NASA Astrophysics Data System (ADS)
Lee, Seungun; Park, Rokjin J.; Kim, Soontae; Song, Chul H.; Kim, Cheol-Hee; Woo, Jung-Hun
2017-04-01
We evaluate and intercompare ozone and aerosol simulations over Korea during the KORUS-AQ campaign, which was conducted in May-June 2016. Four global and regional air quality models participated in the campaign and provided daily air quality forecasts over Korea to guide aircraft flight paths for detecting air pollution events over the Korean peninsula and its nearby oceans. We first evaluate model performance by comparing simulated and observed hourly surface ozone and PM2.5 concentrations at ground sites in Korea and find that the models successfully capture intermittent air pollution events and reproduce the daily variation of ozone and PM2.5 concentrations. However, significant underestimates of peak ozone concentrations in the afternoon are also found in most models. Among the chemical constituents of PM2.5, the models typically overestimate observed nitrate aerosol concentrations and underestimate organic aerosol concentrations, although the observed mass concentrations of PM2.5 are seemingly reproduced by the models. In particular, all models used the same anthropogenic emission inventory (KU-CREATE) for the daily air quality forecasts, yet they show considerable discrepancies for ozone and aerosols. Compared to individual model results, the ensemble mean of all models shows the best performance, with correlation coefficients of 0.73 for ozone and 0.57 for PM2.5. We investigate the factors contributing to the discrepancies, which will serve as guidance for improving the performance of air quality forecasts.
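The ensemble-mean skill reported above amounts to averaging the member forecasts at each time and correlating the result with observations. A from-scratch sketch (illustrative data, not the KORUS-AQ series):

```python
def ensemble_mean_skill(member_series, obs):
    """Pearson correlation of the multi-model ensemble mean with observations,
    computed by hand. Illustrates why the mean often beats individual members:
    averaging damps the uncorrelated part of each member's error."""
    n = len(obs)
    mean_fc = [sum(m[t] for m in member_series) / len(member_series) for t in range(n)]

    def corr(a, b):
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        sa = sum((u - ma) ** 2 for u in a) ** 0.5
        sb = sum((v - mb) ** 2 for v in b) ** 0.5
        return cov / (sa * sb)

    return corr(mean_fc, obs)
```

Computing `corr` for each member separately and for `mean_fc` reproduces the kind of comparison behind the 0.73 (ozone) and 0.57 (PM2.5) figures.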
Organization and scaling in water supply networks
NASA Astrophysics Data System (ADS)
Cheng, Likwan; Karney, Bryan W.
2017-12-01
Public water supply is one of society's most vital resources and most costly infrastructures. Traditional concepts of these networks capture their engineering identity as isolated, deterministic hydraulic units, but overlook their physics identity as related entities in a probabilistic, geographic ensemble, characterized by size organization and property scaling. Although discoveries of allometric scaling in natural supply networks (organisms and rivers) raised the prospect of similar findings in anthropogenic supplies, so far no such finding has been reported for public water or related civic resource supplies. Examining an empirical ensemble of large number and wide size range, we show that water supply networks possess self-organized size abundance and theory-explained allometric scaling in spatial, infrastructural, and resource- and emission-flow properties. These discoveries establish a scaling physics for water supply networks and may lead to novel applications in resource- and jurisdiction-scale water governance.
A model for cancer tissue heterogeneity.
Mohanty, Anwoy Kumar; Datta, Aniruddha; Venkatraj, Vijayanagaram
2014-03-01
An important problem in the study of cancer is the understanding of the heterogeneous nature of the cell population. The clonal evolution of the tumor cells results in the tumors being composed of multiple subpopulations. Each subpopulation reacts differently to any given therapy. This calls for the development of novel (regulatory network) models, which can accommodate heterogeneity in cancerous tissues. In this paper, we present a new approach to model heterogeneity in cancer. We model heterogeneity as an ensemble of deterministic Boolean networks based on prior pathway knowledge. We develop the model considering the use of qPCR data. By observing gene expressions when the tissue is subjected to various stimuli, the compositional breakup of the tissue under study can be determined. We demonstrate the viability of this approach by using our model on synthetic data, and real-world data collected from fibroblasts.
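The ensemble-of-Boolean-networks idea can be illustrated with a minimal sketch: each subpopulation is a deterministic Boolean network, and the tissue-level expression is the fraction-weighted average of the subpopulation responses (the rules and gene name below are hypothetical, not the paper's pathway model):

```python
def ensemble_expression(networks, weights, state, gene):
    """Tissue-level expected expression of `gene` one synchronous update
    ahead, for a tissue modelled as a weighted ensemble of deterministic
    Boolean networks. Each network is a dict mapping gene -> update rule
    (a function of the current state); weights are subpopulation fractions."""
    total = 0.0
    for net, w in zip(networks, weights):
        next_state = {g: rule(state) for g, rule in net.items()}  # synchronous update
        total += w * next_state[gene]
    return total
```

Observing such averaged expressions under different stimuli (different `state` vectors) is what lets the compositional breakup, i.e. the `weights`, be inferred from qPCR-style bulk measurements.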
Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting.
Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M
2014-06-01
Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind "noise," which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical "downscaling" of solar wind model results prior to their use as input to a magnetospheric model. As the magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key points: solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; the magnetosphere responds nonlinearly to small-scale solar wind fluctuations.
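The downscaling step described above (smooth large-scale signal plus re-injected small-scale noise) can be sketched as follows. Resampling from a pool of observed residuals is a simplification of the paper's matching of observed spectral characteristics and distribution functions:

```python
import random

def downscale_ensemble(model_series, residual_pool, n_members, seed=0):
    """Build an ensemble of downscaled solar wind series by adding
    small-scale noise, resampled from a pool of observed residuals,
    to a smooth model series. Each member is one stochastic realization;
    feeding all members through a magnetospheric model yields a
    probabilistic forecast."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        members.append([v + rng.choice(residual_pool) for v in model_series])
    return members
```

In practice the residual pool would come from subtracting the 8 h smoothed observations from the raw observations, so the added noise carries the observed small-scale statistics.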
NASA Astrophysics Data System (ADS)
Hristova-Veleva, S. M.; Chen, H.; Gopalakrishnan, S.; Haddad, Z. S.
2017-12-01
Tropical cyclones (TCs) are the product of complex multi-scale processes and interactions. The role of the environment has long been recognized. However, recent research has shown that convective-scale processes in the hurricane core might also play a crucial role in determining TCs intensity and size. Several studies have linked Rapid Intensification to the characteristics of the convective clouds (shallow versus deep), their organization (isolated versus wide-spread) and their location with respect to dynamical controls (the vertical shear, the radius of maximum wind). Yet a third set of controls signifies the interaction between the storm-scale and large-scale processes. Our goal is to use observations and models to advance the still-lacking understanding of these processes. Recently, hurricane models have improved significantly. However, deterministic forecasts have limitations due to the uncertainty in the representation of the physical processes and initial conditions. A crucial step forward is the use of high-resolution ensembles. We adopt the following approach: i) generate a high resolution ensemble forecast using HWRF; ii) produce synthetic data (e.g. brightness temperature) from the model fields for direct comparison to satellite observations; iii) develop metrics to allow us to sub-select the realistic members of the ensemble, based on objective measures of the similarity between observed and forecasted structures; iv) for these most-realistic members, determine the skill in forecasting TCs to provide"guidance on guidance"; v) use the members with the best predictive skill to untangle the complex multi-scale interactions. We will report on the first three goals of our research, using forecasts and observations of hurricane Edouard (2014), focusing on RI. 
We will focus on describing the metrics for the selection of the most appropriate ensemble members, based on applying low-wavenumber analysis (WNA; Hristova-Veleva et al., 2016) to the observed and forecasted 2D fields to develop objective criteria for consistency. We investigate the WNA representations of environmental moisture, precipitation structure and surface convergence. We will present the preliminary selection of the most skillful members and will outline our future goals: analyzing the multi-scale interactions using these members.
NASA Astrophysics Data System (ADS)
Pinson, Pierre
2016-04-01
The operational management of renewable energy generation in power systems and electricity markets requires forecasts in various forms, e.g., deterministic or probabilistic, continuous or categorical, depending upon the decision process at hand. Besides, such forecasts may also be necessary at various spatial and temporal scales, from high temporal resolutions (in the order of minutes) and very localized for an offshore wind farm, to coarser temporal resolutions (hours) and covering a whole country for day-ahead power scheduling problems. As of today, weather predictions are a common input to forecasting methodologies for renewable energy generation. Since for most decision processes, optimal decisions can only be made if accounting for forecast uncertainties, ensemble predictions and density forecasts are increasingly seen as the product of choice. After discussing some of the basic approaches to obtaining ensemble forecasts of renewable power generation, it will be argued that space-time trajectories of renewable power production may or may not necessitate post-processing of ensemble forecasts for relevant weather variables. Example approaches and test case applications will be covered, e.g., looking at the Horns Rev offshore wind farm in Denmark, or gridded forecasts for the whole of continental Europe. Finally, we will illustrate some of the limitations of current frameworks for forecast verification, which actually make it difficult to fully assess the quality of post-processing approaches for obtaining renewable energy predictions.
Application of the Bayesian Model Averaging method to an ensemble system for Poland
NASA Astrophysics Data System (ADS)
Guzikowski, Jakub; Czerwinska, Agnieszka
2014-05-01
The aim of the project is to evaluate methods for generating numerical ensemble weather predictions using meteorological data from the Weather Research & Forecasting (WRF) model and calibrating these data by means of a Bayesian Model Averaging (WRF BMA) approach. We construct high-resolution short-range ensemble forecasts using temperature data generated by nine WRF model configurations. The WRF models have 35 vertical levels and 2.5 km x 2.5 km horizontal resolution. The main point is that each ensemble member uses a different parameterization of the physical phenomena occurring in the boundary layer. To calibrate the ensemble forecast we use the Bayesian Model Averaging (BMA) approach. The BMA predictive probability density function (PDF) is a weighted average of the predictive PDFs associated with the individual ensemble members, with weights that reflect each member's relative skill. As a test case we chose a heat wave with convective weather conditions over Poland from 23 July to 1 August 2013. From 23 to 29 July 2013 temperatures oscillated around 30 °C at many meteorological stations and new temperature records were set. During this time an increase in hospitalizations for cardiovascular problems was registered. On 29 July 2013 an advection of moist tropical air masses over Poland caused a strong convective event with a mesoscale convective system (MCS). The MCS caused local flooding, damaged transport infrastructure, destroyed buildings and trees, caused injuries, and posed a direct threat to life. The meteorological data from the ensemble system are compared with data recorded at 74 weather stations located in Poland. We prepare a set of model-observation pairs. The data from the single ensemble members and the median of the WRF BMA system are then evaluated using the deterministic error statistics root mean square error (RMSE) and mean absolute error (MAE).
To evaluate the probabilistic data, the Brier Score (BS) and the Continuous Ranked Probability Score (CRPS) were used. Finally, a comparison between the BMA-calibrated data and the data from the individual ensemble members will be presented.
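The BMA mixture described above can be sketched as a weighted sum of Gaussian kernels centered on the member forecasts. The forecasts, weights and kernel width below are purely illustrative; in practice the weights and variance are fit by an EM algorithm over a training period:

```python
import math

def normal_pdf(x, mu, sigma):
    """Gaussian kernel used for each ensemble member."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bma_pdf(x, member_forecasts, weights, sigma):
    """BMA predictive PDF: weighted average of per-member kernels, with
    weights reflecting each member's relative skill."""
    return sum(w * normal_pdf(x, f, sigma) for f, w in zip(member_forecasts, weights))

# Hypothetical 2 m temperature forecasts (deg C) from nine members and
# illustrative BMA weights (assumed, not fitted here).
forecasts = [29.1, 30.4, 28.7, 31.0, 29.8, 30.2, 28.9, 30.8, 29.5]
weights = [0.05, 0.15, 0.05, 0.20, 0.10, 0.15, 0.05, 0.15, 0.10]
density = bma_pdf(30.0, forecasts, weights, sigma=1.2)
```

Verification scores such as the CRPS are then computed from this predictive PDF rather than from any single member.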
Ensemble Simulation of the Atmospheric Radionuclides Discharged by the Fukushima Nuclear Accident
NASA Astrophysics Data System (ADS)
Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru
2013-04-01
Enormous amounts of radionuclides were discharged into the atmosphere by the nuclear accident at the Fukushima Daiichi nuclear power plant (FDNPP) after the earthquake and tsunami on 11 March 2011. The radionuclides were dispersed from the power plant and deposited mainly over eastern Japan and the North Pacific Ocean. Many numerical simulations of the radionuclide dispersion and deposition have been attempted since the nuclear accident. However, none of them were able to perfectly simulate the distribution of dose rates observed after the accident over eastern Japan. This was partly due to errors in the wind vectors and precipitation used in the numerical simulations; unfortunately, these deterministic simulations could not deal with the probability distribution of the simulation results and errors. Therefore, an ensemble simulation of the atmospheric radionuclides was performed using an ensemble Kalman filter (EnKF) data assimilation system coupled with the Japan Meteorological Agency (JMA) non-hydrostatic mesoscale model (NHM); this mesoscale model has been used operationally for daily weather forecasts by JMA. Meteorological observations were provided to the EnKF data assimilation system from the JMA operational-weather-forecast dataset. Through this ensemble data assimilation, twenty members of the meteorological analysis over eastern Japan from 11 to 31 March 2011 were successfully obtained. Using these meteorological ensemble analysis members, radionuclide behavior in the atmosphere such as advection, convection, diffusion, dry deposition, and wet deposition was simulated. This ensemble simulation provided multiple realizations of the radionuclide dispersion and distribution. Because a large ensemble deviation indicates low accuracy of the numerical simulation, probabilistic information is obtainable from the ensemble simulation results.
For example, the uncertainty of precipitation triggered the uncertainty of wet deposition, and the uncertainty of wet deposition in turn triggered the uncertainty of atmospheric radionuclide amounts. The remaining radionuclides were then transported downwind; consequently, the uncertainty signal of the radionuclide amounts was propagated downwind. This signal propagation was seen in the ensemble simulation by tracking areas of large deviation in radionuclide concentration and deposition. These statistics can provide information useful for the probabilistic prediction of radionuclides.
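A minimal scalar EnKF analysis step (the perturbed-observations variant) may help fix ideas. The numbers are synthetic; the real system assimilates full meteorological fields through the JMA NHM, not a single scalar:

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_err, rng):
    """Scalar EnKF analysis step with perturbed observations: the Kalman
    gain is built from the ensemble variance, and each member is nudged
    toward its own perturbed copy of the observation."""
    var = statistics.variance(ensemble)
    gain = var / (var + obs_err ** 2)
    return [x + gain * (obs + rng.gauss(0, obs_err) - x) for x in ensemble]

rng = random.Random(0)
prior = [rng.gauss(5.0, 2.0) for _ in range(20)]   # e.g. wind speed members (m/s)
posterior = enkf_update(prior, obs=8.0, obs_err=0.5, rng=rng)
prior_spread = statistics.stdev(prior)
post_spread = statistics.stdev(posterior)
```

The analysis both pulls the ensemble mean toward the observation and contracts the spread, which is exactly the behavior that lets ensemble deviation serve as an accuracy proxy downstream.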
Intercomparison of the community multiscale air quality model and CALGRID using process analysis.
O'Neill, Susan M; Lamb, Brian K
2005-08-01
This study was designed to examine the similarities and differences between two advanced photochemical air quality modeling systems: EPA Models-3/CMAQ and CALGRID/CALMET. Both modeling systems were applied to an ozone episode that occurred along the I-5 urban corridor in western Washington and Oregon during July 11-14, 1996. Both models employed the same modeling domain and used the same detailed gridded emission inventory. The CMAQ model was run using both the CB-IV and RADM2 chemical mechanisms, while CALGRID was used with the SAPRC-97 chemical mechanism. Output from the Mesoscale Meteorological Model (MM5) employed with observational nudging was used in both models. The two modeling systems, representing three chemical mechanisms and two sets of meteorological inputs, were evaluated in terms of statistical performance measures for both 1- and 8-h average observed ozone concentrations. The results showed that the different versions of the systems were more similar than different, and all versions performed well in the Portland region and downwind of Seattle but performed poorly in the more rural region north of Seattle. Improving the meteorological input into the CALGRID/CALMET system with planetary boundary layer (PBL) parameters from the Models-3/CMAQ meteorology preprocessor (MCIP) improved the performance of the CALGRID/CALMET system. The 8-h ensemble case was often the best performer of all the cases, indicating that the models perform better over longer analysis periods. The 1-h ensemble case, derived from all runs, was not necessarily an improvement over the five individual cases, but the standard deviation about the mean provided a measure of overall modeling uncertainty. Process analysis was applied to examine the contribution of the individual processes to the species conservation equation. The process analysis results indicated that the two modeling systems arrive at similar solutions by very different means.
Transport rates are faster and exhibit greater fluctuations in the CMAQ cases than in the CALGRID cases, which leads to different placement of the urban ozone plumes. The CALGRID cases, which rely on the SAPRC-97 chemical mechanism, exhibited a greater diurnal production/loss cycle of ozone concentrations per hour compared to either the RADM2 or CB-IV chemical mechanisms in the CMAQ cases. These results demonstrate the need for specialized process field measurements to confirm whether we are modeling ozone with valid processes.
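The 1-h and 8-h statistical performance measures used above can be sketched as follows. The concentrations are invented, and mean normalized bias is one common choice among the measures such studies report:

```python
def rolling_mean(series, window):
    """Backward-looking rolling means, e.g. 8-h average ozone from hourly data."""
    return [sum(series[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(series))]

def mean_normalized_bias(model, obs):
    """A standard ozone performance measure: mean of (model - obs) / obs."""
    return sum((m - o) / o for m, o in zip(model, obs)) / len(obs)

obs_o3 = [30, 35, 42, 55, 68, 74, 70, 60, 48, 40]   # ppb, hourly (illustrative)
mod_o3 = [28, 38, 45, 50, 72, 70, 74, 58, 50, 38]
obs8 = rolling_mean(obs_o3, 8)
mod8 = rolling_mean(mod_o3, 8)
mnb_1h = mean_normalized_bias(mod_o3, obs_o3)
mnb_8h = mean_normalized_bias(mod8, obs8)
```

Averaging to 8 h smooths out hour-to-hour compensating errors, which is one reason the 8-h cases tend to verify better than the 1-h cases.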
NASA Astrophysics Data System (ADS)
Yatheendradas, S.; Vivoni, E.
2007-12-01
A common practice in distributed hydrological modeling is to assign soil hydraulic properties based on coarse textural datasets. For semiarid regions with poor soil information, the performance of a model can be severely constrained due to the high model sensitivity to near-surface soil characteristics. Neglecting the uncertainty in soil hydraulic properties, their spatial variation and their naturally-occurring horizonation can potentially affect the modeled hydrological response. In this study, we investigate such effects using the TIN-based Real-time Integrated Basin Simulator (tRIBS) applied to the mid-sized (100 km2) Sierra Los Locos watershed in northern Sonora, Mexico. The Sierra Los Locos basin is characterized by complex mountainous terrain leading to topographic organization of soil characteristics and ecosystem distributions. We focus on simulations during the 2004 North American Monsoon Experiment (NAME) when intensive soil moisture measurements and aircraft-based soil moisture retrievals are available in the basin. Our experiments focus on soil moisture comparisons at the point, topographic transect and basin scales using a range of different soil characterizations. We compare the distributed soil moisture estimates obtained using (1) a deterministic simulation based on soil texture from coarse soil maps, (2) a set of ensemble simulations that capture soil parameter uncertainty and their spatial distribution, and (3) a set of simulations that condition the ensemble on recent soil profile measurements. Uncertainties considered in near-surface soil characterization provide insights into their influence on the modeled uncertainty, into the value of soil profile observations, and into effective use of on-going field observations for constraining the soil moisture response uncertainty.
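The ensemble characterization in (2) amounts to sampling soil hydraulic parameters from assumed distributions and running the model once per draw. A sketch of the sampling step follows; the parameter names and ranges are hypothetical stand-ins, not the values used with tRIBS:

```python
import random

def sample_soil_params(n, rng):
    """Draw an ensemble of soil hydraulic parameter sets. Saturated
    conductivity is commonly treated as lognormal; the porosity range
    here is an illustrative assumption for a loamy texture class."""
    return [{"Ks_mm_hr": rng.lognormvariate(1.5, 0.8),
             "porosity": rng.uniform(0.35, 0.50)}
            for _ in range(n)]

rng = random.Random(11)
ensemble = sample_soil_params(50, rng)
```

Each parameter set would drive one hydrological simulation; conditioning on soil profile measurements, as in (3), narrows these distributions before sampling.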
Evaluating NMME Seasonal Forecast Skill for use in NASA SERVIR Hub Regions
NASA Technical Reports Server (NTRS)
Roberts, J. Brent; Roberts, Franklin R.
2013-01-01
The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The coupled forecasts have numerous potential applications, both national and international in scope. The NASA/USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in driving applications models in hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. A prerequisite for seasonal forecast use in application modeling (e.g. hydrology, agriculture) is bias correction and skill assessment. Efforts to address systematic biases and multi-model combination in support of NASA SERVIR impact modeling requirements will be highlighted. Specifically, quantile-quantile mapping for bias correction has been implemented for all archived NMME hindcasts. Both deterministic and probabilistic skill estimates for raw, bias-corrected, and multi-model ensemble forecasts as a function of forecast lead will be presented for temperature and precipitation. Complementing this statistical assessment will be case studies of significant events, for example, the ability of the NMME forecast suite to anticipate the 2010/2011 drought in the Horn of Africa and its relationship to evolving SST patterns.
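A minimal empirical quantile-quantile mapping, in the spirit of the bias correction described above, could look like this. The hindcast climatologies are invented, and operational implementations typically interpolate between quantiles rather than snapping to the nearest one:

```python
def quantile_map(value, model_climo, obs_climo):
    """Empirical quantile-quantile mapping: find the empirical quantile of a
    raw forecast within the model climatology, then return the observed
    value at that same quantile."""
    model_sorted = sorted(model_climo)
    obs_sorted = sorted(obs_climo)
    q = sum(1 for m in model_sorted if m <= value) / len(model_sorted)
    idx = min(int(q * len(obs_sorted)), len(obs_sorted) - 1)
    return obs_sorted[idx]

# Hypothetical hindcast climatologies: the model runs systematically wet
model_precip = [2.0, 3.5, 4.0, 5.5, 6.0, 7.5, 8.0, 9.5, 10.0, 12.0]
obs_precip = [1.0, 2.0, 2.5, 3.0, 4.0, 4.5, 5.0, 6.0, 7.0, 8.5]
corrected = quantile_map(7.5, model_precip, obs_precip)   # a new raw forecast
```

The correction removes the systematic wet bias while preserving the forecast's rank within its own climatology, which is why the method works per lead time and per season.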
Post-processing through linear regression
NASA Astrophysics Data System (ADS)
van Schaeybroeck, B.; Vannitsem, S.
2011-03-01
Various post-processing techniques are compared for both deterministic and ensemble forecasts, all based on linear regression between forecast data and observations. In order to evaluate the quality of the regression methods, three criteria are proposed, related to the effective correction of forecast error, the optimal variability of the corrected forecast and multicollinearity. The regression schemes under consideration include the ordinary least-square (OLS) method, a new time-dependent Tikhonov regularization (TDTR) method, the total least-square method, a new geometric-mean regression (GM), a recently introduced error-in-variables (EVMOS) method and, finally, a "best member" OLS method. The advantages and drawbacks of each method are clarified. These techniques are applied in the context of the Lorenz '63 system, whose model version is affected by both initial condition and model errors. For short forecast lead times, the number and choice of predictors play an important role. Contrary to the other techniques, GM degrades when the number of predictors increases. At intermediate lead times, linear regression is unable to provide corrections to the forecast and can sometimes degrade the performance (GM and the best member OLS with noise). At long lead times, the regression schemes (EVMOS, TDTR) that yield the correct variability and the largest correlation between ensemble error and spread should be preferred.
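A single-predictor OLS correction, the simplest of the regression schemes compared above, can be sketched with invented training pairs of raw forecasts and verifying observations:

```python
def ols_fit(x, y):
    """Ordinary least squares for one predictor: y ~ a + b*x."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical training pairs: raw forecasts vs verifying observations
fcst = [10.0, 12.0, 15.0, 11.0, 14.0, 13.0]
obs = [9.0, 11.5, 13.5, 10.0, 13.0, 12.0]
a, b = ols_fit(fcst, obs)
corrected = a + b * 16.0   # correct a new raw forecast of 16.0
```

Note that b < 1 here: OLS damps the forecast toward the training mean, which is exactly the variability deficiency at long lead times that motivates schemes such as EVMOS and TDTR.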
NASA Astrophysics Data System (ADS)
Martin, A.; Pascal, C.; Leconte, R.
2014-12-01
Stochastic Dynamic Programming (SDP) is known to be an effective technique for finding the optimal operating policy of hydropower systems. In order to improve the performance of SDP, this project evaluates the impact of re-updating the policy at every time step by using Ensemble Streamflow Prediction (ESP). We present a case study of the Kemano hydropower system on the Nechako River in British Columbia, Canada. Managed by Rio Tinto Alcan (RTA), this system is subject to large streamflow volumes in spring due to the substantial snowpack accumulated during the winter season. Therefore, the operating policy should not only maximize production but also minimize the risk of flooding. The hydrological behavior of the system is simulated with CEQUEAU, a distributed and deterministic hydrological model developed by the Institut national de la recherche scientifique - Eau, Terre et Environnement (INRS-ETE) in Quebec, Canada. At each decision time step, CEQUEAU is used to generate ESP scenarios based on historical meteorological sequences and the current state of the hydrological model. These scenarios are used in the SDP to optimize a new release policy for the following time steps. This routine is then repeated over the entire simulation period. Results are compared with those obtained by applying SDP to historical inflow scenarios.
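A toy backward SDP recursion over discrete storage states, taking the expectation over equally likely ESP inflow scenarios, might look like the following. The state grid, reward and spill penalty are illustrative assumptions, not RTA's actual objective:

```python
def sdp_policy(storages, releases, inflows, horizon, s_max):
    """Backward SDP over discrete storage states. The stage reward favors
    release (a proxy for production) and penalizes spill (flood risk)."""
    value = {s: 0.0 for s in storages}           # terminal value function
    policy = {}
    for t in reversed(range(horizon)):
        new_value = {}
        for s in storages:
            best_val, best_rel = float("-inf"), None
            for r in releases:
                if r > s:                        # cannot release more than stored
                    continue
                expected = 0.0
                for q in inflows:                # equally likely ESP scenarios
                    raw = s - r + q
                    spill = max(raw - s_max, 0.0)
                    # value of the nearest discrete next storage state
                    s_next = min(storages, key=lambda g: abs(g - min(raw, s_max)))
                    expected += (r - 5.0 * spill + value[s_next]) / len(inflows)
                if expected > best_val:
                    best_val, best_rel = expected, r
            new_value[s] = best_val
            policy[(t, s)] = best_rel
        value = new_value
    return policy

storages = [0, 10, 20, 30]                       # discretized storage volumes
releases = [0, 10, 20]                           # feasible releases per step
policy = sdp_policy(storages, releases, inflows=[5, 15, 25], horizon=3, s_max=30)
```

Re-updating with ESP means regenerating the inflow scenarios from the current hydrological state at every decision step and re-running this recursion, rather than optimizing once against the historical record.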
NASA Astrophysics Data System (ADS)
Hu, Lu; Jacob, Daniel J.; Liu, Xiong; Zhang, Yi; Zhang, Lin; Kim, Patrick S.; Sulprizio, Melissa P.; Yantosca, Robert M.
2017-10-01
The global budget of tropospheric ozone is governed by a complicated ensemble of coupled chemical and dynamical processes. Simulation of tropospheric ozone has been a major focus of the GEOS-Chem chemical transport model (CTM) over the past 20 years, and many developments over the years have affected the model representation of the ozone budget. Here we conduct a comprehensive evaluation of the standard version of GEOS-Chem (v10-01) with ozone observations from ozonesondes, the OMI satellite instrument, and MOZAIC-IAGOS commercial aircraft for 2012-2013. Global validation of the OMI 700-400 hPa data with ozonesondes shows that OMI maintained persistently high quality and no significant drift over the 2006-2013 period. GEOS-Chem shows no significant seasonal or latitudinal bias relative to OMI and strong correlations in all seasons on the 2° × 2.5° horizontal scale (r = 0.88-0.95), improving on previous model versions. The most pronounced model bias revealed by ozonesondes and MOZAIC-IAGOS is at high northern latitudes in winter-spring, where the model is 10-20 ppbv too low. This appears to be due to insufficient stratosphere-troposphere exchange (STE). Model updates to lightning NOx, Asian anthropogenic emissions, bromine chemistry, isoprene chemistry, and meteorological fields over the past decade have overall led to a gradual increase in the simulated global tropospheric ozone burden and more active ozone production and loss. From simulations with different versions of GEOS meteorological fields we find that tropospheric ozone in GEOS-Chem v10-01 has a global production rate of 4960-5530 Tg a-1, lifetime of 20.9-24.2 days, burden of 345-357 Tg, and STE of 325-492 Tg a-1. Change in the intensity of tropical deep convection between these different meteorological fields is a major factor driving differences in the ozone budget.
Application of multi-constituent satellite data assimilation for KORUS-AQ
NASA Astrophysics Data System (ADS)
Miyazaki, K.; Sekiya, T.; Fu, D.; Bowman, K. W.; Kulawik, S. S.; Walker, T.; Takigawa, M.; Ogochi, K.; Gaubert, B.; Barré, J.; Emmons, L. K.
2017-12-01
Comprehensive tropospheric maps of multi-constituent fields at 1.1 degree resolution, provided by an assimilation of measurements of O3, CO, NO2, and HNO3 from multiple satellite instruments (OMI, GOME-2, MOPITT, MLS, and AIRS) using an ensemble Kalman filter, are used to study variations in tropospheric composition over east Asia during KORUS-AQ. Assimilated model results for both direct ozone assimilation and assimilation of ozone precursors (NOx and CO) were compared to DC-8 aircraft observations, with significant improvements in model/aircraft comparisons for ozone (the negative model bias was reduced by up to 80%), CO (by up to 90%), OH (by up to 40%), and NOx seen in both approaches. Corrections made to the precursor emissions (i.e., surface NOx and CO emissions), especially over eastern and central China and over South Korea, were important in reducing the negative bias of O3 and CO over South Korea. Additional bias reductions were obtained from assimilation of multispectral retrievals of tropospheric ozone profiles from AIRS and OMI, especially for middle-tropospheric ozone. Improved agreement with ground-based measurements at remote sites over South Korea and western Japan suggests that the representation of long-range transport of polluted air is improved by data assimilation, as a result of the optimization of precursor emissions, mainly over China. The NOx emissions estimated to be higher by 60-90% over South Korea and by 20-40% over eastern China relative to bottom-up inventories suggest an important underestimation of anthropogenic sources in the emission inventories in these areas.
In the future, assimilating datasets from a new constellation of low Earth orbiting sounders (e.g., IASI, AIRS, CrIS, Sentinel-5p (TROPOMI), and Sentinel-5) and geostationary satellites (Sentinel-4, GEMS, and TEMPO) will provide more detailed knowledge of ozone and its precursors for east Asia and the entire globe. The data assimilation framework will also be used for chemical OSSE studies to evaluate and optimize the current and future observing systems.
Ensemble sea ice forecast for predicting compressive situations in the Baltic Sea
NASA Astrophysics Data System (ADS)
Lehtiranta, Jonni; Lensu, Mikko; Kokkonen, Iiro; Haapala, Jari
2017-04-01
Forecasting of sea ice hazards is important for winter shipping in the Baltic Sea. Current numerical models capture the ice thickness distribution and drift well, but compressive situations are often missing from forecast products. Their inclusion is requested by the shipping community, as compressing ice can stop ships for days and even damage them, posing a direct threat to ship operations. However, we have found that compression cannot be predicted well in a deterministic forecast, since it is a local and quickly changing phenomenon. It is also very sensitive to small changes in the wind speed and direction, the prevailing ice conditions, and the model parameters. Thus, a probabilistic ensemble simulation is needed to produce a meaningful compression forecast. An ensemble model setup was developed in the SafeWIN project for this purpose. It uses the HELMI multicategory ice model, which was amended to run simulations in parallel. The ensemble was built by perturbing the atmospheric forcing and the physical parameters of the ice pack. The model setup will provide probabilistic forecasts of compression in the Baltic Sea ice. Additionally, the setup provides insight into the uncertainties related to different model parameters and their impact on the model results. We have completed several hindcast simulations for the Baltic Sea for verification purposes. These results are shown to match compression reports gathered from ships. In addition, an ensemble forecast is in a pre-operational testing phase, and its first evaluation will be presented in this work.
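At its simplest, a probabilistic compression product reduces, at each grid point, to an exceedance frequency across the perturbed members. A minimal sketch with synthetic member values and a hypothetical stress threshold:

```python
import random

def compression_probability(members, threshold):
    """Fraction of ensemble members whose compressive stress exceeds a
    hazard threshold at one grid point."""
    return sum(1 for m in members if m > threshold) / len(members)

rng = random.Random(7)
# Hypothetical compressive stress values at one grid point from 50 members,
# each member having perturbed forcing and ice-pack parameters
members = [rng.gauss(12.0, 4.0) for _ in range(50)]
p = compression_probability(members, threshold=15.0)
```

Mapping p over the grid gives the kind of probabilistic compression field a ship's routing decision can actually use, unlike a single deterministic yes/no forecast.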
Deterministic composite nanophotonic lattices in large area for broadband applications
NASA Astrophysics Data System (ADS)
Xavier, Jolly; Probst, Jürgen; Becker, Christiane
2016-12-01
Exotic manipulation of the flow of photons in nanoengineered materials with an aperiodic distribution of nanostructures plays a key role in efficiency-enhanced broadband photonic and plasmonic technologies for spectrally tailorable integrated biosensing, nanostructured thin film solar cells, white light emitting diodes, novel plasmonic ensembles etc. Through a generic deterministic nanotechnological route here we show subwavelength-scale silicon (Si) nanostructures on nanoimprinted glass substrate in large area (4 cm2) with advanced functional features of aperiodic composite nanophotonic lattices. These nanophotonic aperiodic lattices have easily tailorable supercell tiles with well-defined and discrete lattice basis elements and they show rich Fourier spectra. The presented nanophotonic lattices are designed functionally akin to two-dimensional aperiodic composite lattices with unconventional flexibility, comprising periodic photonic crystals and/or in-plane photonic quasicrystals as pattern design subsystems. The fabricated composite lattice-structured Si nanostructures are comparatively analyzed with a range of nanophotonic structures with conventional lattice geometries of periodic, disordered random as well as in-plane quasicrystalline photonic lattices with comparable lattice parameters. As a proof of concept of compatibility with advanced bottom-up liquid phase crystallized (LPC) Si thin film fabrication, the experimental structural analysis is further extended to double-side-textured deterministic aperiodic lattice-structured 10 μm thick large area LPC Si film on nanoimprinted substrates.
NASA Astrophysics Data System (ADS)
Brown-Steiner, B.; Selin, N. E.; Prinn, R. G.; Monier, E.; Garcia-Menendez, F.; Tilmes, S.; Emmons, L. K.; Lamarque, J. F.; Cameron-Smith, P. J.
2017-12-01
We summarize two methods to aid in the identification of ozone signals from underlying spatially and temporally heterogeneous data, in order to help research communities avoid the sometimes burdensome computational costs of high-resolution, high-complexity models. The first method utilizes simplified chemical mechanisms (a Reduced Hydrocarbon Mechanism and a Superfast Mechanism) alongside a more complex mechanism (MOZART-4) within CESM CAM-Chem to extend the number of simulated meteorological years (or add additional members to an ensemble) for a given modeling problem. The Reduced Hydrocarbon mechanism is twice as fast, and the Superfast mechanism is three times faster, than the MOZART-4 mechanism. We show that the simplified chemical mechanisms are largely capable of simulating surface ozone across the globe as well as the more complex chemical mechanism does, and where they are not, a simple standardized-anomaly emulation approach can correct for their inadequacies. The second method uses strategic averaging over both temporal and spatial scales to filter out the highly heterogeneous noise that underlies ozone observations and simulations. This method allows for a selection of temporal and spatial averaging scales that match a particular signal strength (between 0.5 and 5 ppbv), and enables the identification of regions where an ozone signal can rise above the ozone noise over a given region and a given period of time. In conjunction, these two methods can be used to "scale down" chemical mechanism complexity and quantitatively determine spatial and temporal scales that could enable research communities to utilize simplified representations of atmospheric chemistry and thereby maximize their productivity and efficiency given computational constraints. While this framework is here applied to ozone data, it could also be applied to a broad range of geospatial data sets (observed or modeled) that have spatial and temporal coverage.
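The effect of temporal averaging on signal detectability can be sketched as follows: a fixed 2 ppbv "signal" buried in 5 ppbv day-to-day noise emerges once 30-day block averages shrink the noise by roughly a factor of sqrt(30). The magnitudes are illustrative, chosen inside the 0.5-5 ppbv range discussed above:

```python
import random
import statistics

def block_average(x, n):
    """Average consecutive non-overlapping blocks of length n
    (temporal aggregation, e.g. daily -> monthly)."""
    return [statistics.fmean(x[i:i + n]) for i in range(0, len(x) - n + 1, n)]

rng = random.Random(1)
signal = 2.0                                    # imposed ozone signal (ppbv)
series = [signal + rng.gauss(0, 5) for _ in range(400)]   # noisy daily values
daily_sd = statistics.stdev(series)             # signal hidden below the noise
monthly = block_average(series, 30)
monthly_sd = statistics.stdev(monthly)          # noise reduced ~ 1/sqrt(30)
```

The same logic applies to spatial aggregation: widening the averaging region trades resolution for a detectable signal-to-noise ratio, which is exactly the scale-selection trade-off the second method quantifies.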
NASA Astrophysics Data System (ADS)
van der Zwan, Rene
2013-04-01
The Rijnland water system is situated in the western part of the Netherlands, and is a low-lying area of which 90% is below sea level. The area covers 1,100 square kilometres, where 1.3 million people live, work, travel and enjoy leisure. The District Water Control Board of Rijnland is responsible for flood defence, water quantity and quality management. This includes design and maintenance of flood defence structures, control of regulating structures for adequate water level management, and waste water treatment. For water quantity management Rijnland uses, besides an online monitoring network for collecting water level and precipitation data, a real-time control decision support system. This decision support system consists of deterministic hydro-meteorological forecasts with a 24-hr forecast horizon, coupled with a control module that provides optimal operation schedules for the storage basin pumping stations. The uncertainty of the rainfall forecast is not propagated into the hydrological prediction. At this moment 65% of the pumping capacity of the storage basin pumping stations can be automatically controlled by the decision support system. Within 5 years, after renovation of two other pumping stations, the total capacity of 200 m3/s will be automatically controlled. In critical conditions there is a need for both a longer forecast horizon and a probabilistic forecast. Therefore ensemble precipitation forecasts of the ECMWF are already consulted off-line during dry spells, and Rijnland is running a pilot operational system providing 10-day water level ensemble forecasts. The use of EPS during dry spells and the findings of the pilot will be presented. Challenges and next steps towards on-line implementation of ensemble forecasts for risk-based operational management of the Rijnland water system will be discussed.
An important element in that discussion is the question of whether policy and decision makers, operators and citizens will accept this anticipatory water management, which includes temporarily lower storage basin levels and a reduction in extra investments for infrastructural measures.
Prediction of the Arctic Oscillation in Boreal Winter by Dynamical Seasonal Forecasting Systems
NASA Technical Reports Server (NTRS)
Kang, Daehyun; Lee, Myong-In; Im, Jungho; Kim, Daehyun; Kim, Hye-Mi; Kang, Hyun-Suk; Schubert, Siegfried D.; Arribas, Alberto; MacLachlan, Craig
2013-01-01
This study assesses the prediction skill of the boreal winter Arctic Oscillation (AO) in three state-of-the-art dynamical ensemble prediction systems (EPSs): the UKMO GloSea4, the NCEP CFSv2, and the NASA GEOS-5. Long-term reforecasts made with the EPSs are used to evaluate representations of the AO, and to examine skill scores for deterministic and probabilistic forecasts of the AO index. The reforecasts reproduce the observed changes in the large-scale patterns of Northern Hemisphere surface temperature, upper-level wind, and precipitation according to the AO phase. Results demonstrate that all EPSs have better prediction skill than the persistence prediction for lead times up to 3 months, suggesting great potential for skillful prediction of the AO and the associated climate anomalies on seasonal time scales. It is also found that the deterministic and probabilistic forecast skill of the AO in the recent period (1997-2010) is higher than that in the earlier period (1983-1996).
Eigenvalue density of cross-correlations in Sri Lankan financial market
NASA Astrophysics Data System (ADS)
Nilantha, K. G. D. R.; Ranasinghe; Malmini, P. K. C.
2007-05-01
We apply the universal properties of the Gaussian orthogonal ensemble (GOE) of random matrices, namely the spectral properties, eigenvalue distribution and eigenvalue spacings predicted by random matrix theory (RMT), to compare cross-correlation matrix estimators from emerging-market data. The daily stock prices of the Sri Lankan All Share Price Index and Milanka Price Index from August 2004 to March 2005 were analyzed. Most eigenvalues in the spectrum of the cross-correlation matrix of stock price changes agree with the universal predictions of RMT, and we find that the cross-correlation matrix satisfies the universal properties of the GOE of real symmetric random matrices. The eigenvalue distribution follows the RMT predictions in the bulk, but there are some deviations at the large eigenvalues. The nearest-neighbor and next-nearest-neighbor spacings of the eigenvalues were examined and found to follow the GOE universality. Applying RMT with deterministic correlations, we find that each eigenvalue arising from deterministic correlations is observed at values repelled from the bulk distribution.
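The RMT comparison described above can be sketched numerically: the eigenvalue spectrum of an empirical correlation matrix of N assets over T observations is compared against the Marchenko-Pastur bulk predicted for purely random returns. The sketch below uses synthetic i.i.d. returns rather than the Sri Lankan index data; all sizes and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 50, 500                          # number of stocks, number of daily returns
returns = rng.standard_normal((T, N))   # i.i.d. returns: a pure-noise benchmark

C = np.corrcoef(returns, rowvar=False)  # N x N cross-correlation matrix
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur bounds for the bulk eigenvalue density of a random
# correlation matrix with aspect ratio q = N/T
q = N / T
lam_min = (1 - np.sqrt(q)) ** 2
lam_max = (1 + np.sqrt(q)) ** 2

# Eigenvalues escaping [lam_min, lam_max] would indicate genuine correlations
outliers = eigvals[(eigvals < lam_min) | (eigvals > lam_max)]
print(f"bulk: [{lam_min:.2f}, {lam_max:.2f}], outliers: {outliers.size}")
```

For i.i.d. data nearly all eigenvalues fall inside the bulk (up to small finite-size excursions); for real market data a few large eigenvalues escape it, as the abstract reports.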
Hardware-efficient Bell state preparation using Quantum Zeno Dynamics in superconducting circuits
NASA Astrophysics Data System (ADS)
Flurin, Emmanuel; Blok, Machiel; Hacohen-Gourgy, Shay; Martin, Leigh S.; Livingston, William P.; Dove, Allison; Siddiqi, Irfan
By performing a continuous joint measurement on a two-qubit system, we restrict the qubit evolution to a chosen subspace of the total Hilbert space. This extension of the quantum Zeno effect, called Quantum Zeno Dynamics, has already been explored in various physical systems such as superconducting cavities, single Rydberg atoms, atomic ensembles and Bose-Einstein condensates. In this experiment, two superconducting qubits are strongly dispersively coupled to a high-Q cavity (χ >> κ), allowing the doubly excited state | 11 〉 to be selectively monitored. The Quantum Zeno Dynamics in the complementary subspace enables us to coherently prepare a Bell state. As opposed to dissipation engineering schemes, we emphasize that our protocol is deterministic, does not rely on direct coupling between qubits, and functions using only single-qubit controls and cavity readout. Such Quantum Zeno Dynamics can be generalized to larger Hilbert spaces, enabling the deterministic generation of many-body entangled states, and thus realizes a decoherence-free subspace allowing alternative noise-protection schemes.
Future Effects of Southern Hemisphere Stratospheric Zonal Asymmetries on Climate
NASA Astrophysics Data System (ADS)
Stone, K.; Solomon, S.; Kinnison, D. E.; Fyfe, J. C.
2017-12-01
Stratospheric zonal asymmetries in the Southern Hemisphere have been shown to have significant influences on both stratospheric and tropospheric dynamics and climate. Accurate representation of stratospheric ozone in particular is important for realistic simulation of the polar vortex strength and temperature trends. This is in turn important for the effect of stratospheric ozone changes on the troposphere, both through modulation of the Southern Annular Mode (SAM) and through more localized climate impacts. Here, we characterize the impact of future changes in Southern Hemisphere zonal asymmetry on tropospheric climate, including changes in future tropospheric temperature and precipitation. The separate impacts of increasing GHGs and ozone recovery on the zonally asymmetric influence on the surface are also investigated. For this purpose, we use a variety of models, including Chemistry Climate Model Initiative simulations from the Community Earth System Model, version 1, with the Whole Atmosphere Community Climate Model (CESM1(WACCM)) and the Australian Community Climate and Earth System Simulator-Chemistry Climate Model (ACCESS-CCM). These models have interactive chemistry and can therefore more accurately represent the zonally asymmetric nature of the stratosphere. The CESM1(WACCM) and ACCESS-CCM models are also compared to simulations from the Canadian CanESM2 model and the CESM Large Ensemble Project (LENS), which have prescribed ozone, to further investigate the importance of simulating stratospheric zonal asymmetry.
NASA Astrophysics Data System (ADS)
Ahmadalipour, Ali; Moradkhani, Hamid
2017-12-01
Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic-modeling-oriented analysis of gridded observation data uncertainties over the Pacific Northwest (PNW) and their implications for drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing dataset to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty of the observed datasets at the monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. This uncertainty results in large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations in the northern Rockies than in the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend of natural soil moisture as a result of precipitation decline, which implies a greater need for anthropogenic water storage and irrigation systems.
Verification of an ensemble prediction system for storm surge forecast in the Adriatic Sea
NASA Astrophysics Data System (ADS)
Mel, Riccardo; Lionello, Piero
2014-12-01
In the Adriatic Sea, storm surges present a significant threat to Venice and to the flat coastal areas of the northern coast of the basin. Sea level forecast is of paramount importance for the management of daily activities and for operating the movable barriers that are presently being built for the protection of the city. In this paper, an EPS (ensemble prediction system) for operational forecasting of storm surge in the northern Adriatic Sea is presented and applied to a 3-month-long period (October-December 2010). The sea level EPS is based on the HYPSE (hydrostatic Padua Sea elevation) model, which is a standard single-layer nonlinear shallow water model, whose forcings (mean sea level pressure and surface wind fields) are provided by the ensemble members of the ECMWF (European Center for Medium-Range Weather Forecasts) EPS. Results are verified against observations at five tide gauges located along the Croatian and Italian coasts of the Adriatic Sea. Forecast uncertainty increases with the predicted value of the storm surge and with the forecast lead time. The EMF (ensemble mean forecast) provided by the EPS has a rms (root mean square) error lower than the DF (deterministic forecast), especially for short (up to 3 days) lead times. Uncertainty for short lead times of the forecast and for small storm surges is mainly caused by uncertainty of the initial condition of the hydrodynamical model. Uncertainty for large lead times and large storm surges is mainly caused by uncertainty in the meteorological forcings. The EPS spread increases with the rms error of the forecast. For large lead times the EPS spread and the forecast error substantially coincide. However, the EPS spread in this study, which does not account for uncertainty in the initial condition, underestimates the error during the early part of the forecast and for small storm surge values. On the contrary, it overestimates the rms error for large surge values. 
The PF (probability forecast) of the EPS has clear skill in predicting the actual probability distribution of sea level, and it outperforms simple "dressed" PF methods. A probability estimate based on the single DF is shown to be inadequate. However, a PF obtained with a prescribed Gaussian distribution centered on the DF value performs very similarly to the EPS-based PF.
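The "dressed" probability forecast mentioned above, a Gaussian with prescribed spread centered on the deterministic forecast, can be sketched in a few lines. The forecast value, threshold and spread below are hypothetical numbers for illustration, not values from the study.

```python
import math

def dressed_exceedance_prob(df_value, threshold, sigma):
    """Probability that sea level exceeds `threshold`, obtained by dressing a
    single deterministic forecast `df_value` with a Gaussian error of standard
    deviation `sigma` (e.g. a climatological rms error)."""
    z = (threshold - df_value) / sigma
    # P(X > threshold) for X ~ N(df_value, sigma^2)
    return 0.5 * math.erfc(z / math.sqrt(2))

# A hypothetical 110 cm surge forecast, 10 cm error spread, 120 cm alarm level
p = dressed_exceedance_prob(110.0, 120.0, 10.0)
```

This turns one deterministic number into a full exceedance probability, which is the simple baseline the EPS-based PF is compared against.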
Total probabilities of ensemble runoff forecasts
NASA Astrophysics Data System (ADS)
Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2017-04-01
Ensemble forecasting has a long history in meteorological modelling, as an indication of the uncertainty of the forecasts. However, it is necessary to calibrate and post-process the ensembles as they often exhibit both bias and dispersion errors. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these approaches (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters varying in space and time while giving a spatially and temporally consistent output. However, their method is too computationally complex for our large number of stations, which makes it unsuitable for our purpose. Our post-processing method for the ensembles is developed in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu), where we are making forecasts for the whole of Europe based on observations from around 700 catchments. As the target is flood forecasting, we are also more interested in improving the forecast skill for high flows than in a good prediction of the entire flow regime. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different meteorological forecast providers to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to estimate the total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, but we add a spatial penalty in the calibration process to force a spatial correlation of the parameters.
The penalty takes distance, stream-connectivity and size of the catchment areas into account. This can in some cases have a slight negative impact on the calibration error, but avoids large differences between parameters of nearby locations, whether stream connected or not. The spatial calibration also makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecasts skills in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
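As a minimal sketch of the EMOS-style post-processing cited above (Gneiting et al., 2005), the snippet below fits a Gaussian predictive distribution N(a + b·mean, c + d·var) to a synthetic training set by minimum-CRPS estimation. It omits the spatial penalty and data transformations discussed in the abstract, and all data and starting values are made up for illustration.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def crps_normal(mu, sigma, y):
    """Closed-form CRPS of a N(mu, sigma^2) forecast for observation y."""
    z = (y - mu) / sigma
    return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

def fit_emos(ens, obs):
    """EMOS: predictive N(a + b*mean, c + d*var), with (a, b, c, d) chosen by
    minimum CRPS estimation over a training set of (ensemble, observation) pairs."""
    m, v = ens.mean(axis=1), ens.var(axis=1)

    def loss(p):
        a, b, c, d = p
        sigma = np.sqrt(np.maximum(c + d * v, 1e-6))  # keep variance positive
        return crps_normal(a + b * m, sigma, obs).mean()

    return minimize(loss, x0=[0.0, 1.0, 1.0, 0.1], method="Nelder-Mead").x

# Synthetic training data: a biased, underdispersive 8-member ensemble
rng = np.random.default_rng(1)
truth = rng.normal(0, 2, 400)
ens = truth[:, None] + 0.5 + rng.normal(0, 0.5, (400, 8))  # +0.5 bias, small spread
a, b, c, d = fit_emos(ens, truth)
```

The fitted a and b remove the bias while c and d inflate the spread, which is exactly the calibration the abstract describes applying per location.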
NASA Astrophysics Data System (ADS)
Chevalier, A.; Gheusi, F.; Delmas, R.; Ordóñez, C.; Sarrat, C.; Zbinden, R.; Thouret, V.; Athier, G.; Cousin, J.-M.
2007-08-01
The PAES (French acronym for synoptic-scale atmospheric pollution) network focuses on the chemical composition (ozone, CO, NOx/y and aerosols) of the lower troposphere (0-3000 m). Its high-altitude surface stations located in different mountainous areas in France complement the low-altitude rural MERA stations (the French contribution to the European programme EMEP, European Monitoring and Evaluation Programme). They are representative of pollution at the scale of the French territory because they are away from any major source of pollution. This study deals with ozone observations between 2001 and 2004 at 11 stations from PAES and MERA, in addition to 16 elevated stations located in mountainous areas of Switzerland, Germany, Austria, Italy and Spain. The set of stations covers a range of altitudes between 115 and 3550 m. The comparison of recent ozone mixing ratios with those of the last decade at Pic du Midi (2877 m), as well as trends calculated over 14-year data series at three high-altitude sites in the Alps (Jungfraujoch, Sonnblick and Zugspitze), reveals that ozone is still increasing but at a slower rate than in the 1980s and 1990s. The 2001-2004 mean levels of ozone from surface stations capture the ozone stratification revealed by climatological profiles from the airborne observation system MOZAIC (Measurement of OZone and water vapour by Airbus In-service airCraft) and from ozone soundings above Payerne (Switzerland). In particular, all data show a clear transition at about 1000-1200 m a.s.l. between a sharp gradient below (of the order of +30 ppb/km) and a gentler gradient (+3 ppb/km) above. The same altitude (1200 m) is also found to be a threshold regarding how well the ozone levels at the surface stations agree with the free-tropospheric reference (MOZAIC or soundings). Below it, the departure can be as large as 40%, but it drops to within 15% above; for stations above 2000 m, the departure is even less than 8%.
Ozone variability also reveals a clear transition between boundary-layer and free-tropospheric regimes around 1000 m a.s.l. Below this level, diurnal photochemistry accounts for about a third of the variability in summer; above it (and at all levels in winter) it accounts for less than 20%, with ozone variability mostly due to day-to-day changes linked to weather conditions or synoptic transport. In summary, the 1000-1200 m altitude range clearly emerges in our study as an upper limit below which specific surface effects dominate the ozone content. Monthly-mean ozone mixing ratios show at all levels a winter minimum and the classical broad maximum in spring and summer, which is actually the superposition of the tropospheric spring maximum (April-May) and regional pollution episodes linked to persistent anticyclonic conditions that may occur from June to September. To complement this classical result, it is shown that the summer maxima are associated with considerably more variability than the spring maximum. This ensemble of findings supports the relevance of mountain station networks such as PAES for the long-term observation of free-tropospheric ozone over Europe.
Ensemble downscaling in coupled solar wind-magnetosphere modeling for space weather forecasting
Owens, M J; Horbury, T S; Wicks, R T; McGregor, S L; Savani, N P; Xiong, M
2014-01-01
Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind "noise," which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical "downscaling" of solar wind model results prior to their use as input to a magnetospheric model. As the magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme. Key Points: solar wind models must be downscaled in order to drive magnetospheric models; ensemble downscaling is more effective than deterministic downscaling; the magnetosphere responds nonlinearly to small-scale solar wind fluctuations. PMID:26213518
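The downscaling procedure described above, smoothing observations to mimic model output and re-injecting small-scale structure as noise, can be sketched as follows. For simplicity the noise is resampled from the smoothing residuals rather than matched to the observed spectral characteristics, and the hourly "solar wind" series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic hourly solar wind speed (km/s): a slow random walk around 400 km/s
v = 400 + np.cumsum(rng.normal(0, 5, 2000))

# Proxy for solar wind model output: 8-h running-mean smoothing
kernel = np.ones(8) / 8
v_model = np.convolve(v, kernel, mode="same")

# Small-scale structure that the smoothed "model" cannot resolve
residuals = v - v_model

# Ensemble downscaling: add noise resampled from the observed residual
# distribution to each member, restoring realistic small-scale variability
n_members = 20
ensemble = np.array([
    v_model + rng.choice(residuals, size=v.size, replace=True)
    for _ in range(n_members)
])
```

Each member then has roughly the observed small-scale variance, and the member-to-member spread quantifies the uncertainty contributed by the unresolved fluctuations.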
Detection and Attribution of Temperature Trends in the Presence of Natural Variability
NASA Astrophysics Data System (ADS)
Wallace, J. M.
2014-12-01
The fingerprint of human-induced global warming stands out clearly above the noise in the time series of global-mean temperature, but not local temperature. At extratropical latitudes over land, the standard error of 50-year linear temperature trends at a fixed point is as large as the cumulative rise in global-mean temperature over the past century. Much of the sampling variability in local temperature trends is "dynamically induced", i.e., attributable to the fact that the seasonally varying mean circulation varies substantially from one year to the next and anomalous circulation patterns are generally accompanied by anomalous temperature patterns. In the presence of such large sampling variability it is virtually impossible to identify the spatial signature of greenhouse warming based on observational data, or to partition observed local temperature trends into natural and human-induced components. It follows that previous IPCC assessments, which have focused on the deterministic signature of human-induced climate change, are inherently limited in what they can tell us about the attribution of the past record of local temperature change, or about how much the temperature at a particular place is likely to rise in the next few decades in response to global warming. To obtain more informative assessments of regional and local climate variability and change it will be necessary to take a probabilistic approach. Just as the use of ensembles has contributed to more informative extended-range weather predictions, large ensembles of climate model simulations can provide a statistical context for interpreting observed climate change and for framing projections of future climate. For some purposes, statistics relating to the interannual variability in the historical record can serve as a surrogate for statistics relating to the diversity of climate change scenarios in large ensembles.
Bayesian quantitative precipitation forecasts in terms of quantiles
NASA Astrophysics Data System (ADS)
Bentzien, Sabrina; Friederichs, Petra
2014-05-01
Ensemble prediction systems (EPS) for numerical weather prediction on the mesoscale are developed in particular to obtain probabilistic guidance for high-impact weather. An EPS issues not a single deterministic future state of the atmosphere but a sample of possible future states. Ensemble postprocessing then translates such a sample of forecasts into probabilistic measures. This study focuses on probabilistic quantitative precipitation forecasts in terms of quantiles. Quantiles are particularly suitable to describe precipitation at various locations, since no assumption is required on the distribution of precipitation. The focus is on prediction during high-impact events, within the Volkswagen Stiftung funded project WEX-MOP (Mesoscale Weather Extremes - Theory, Spatial Modeling and Prediction). Quantile forecasts are derived from the raw ensemble and via quantile regression. The neighborhood method and time-lagging are effective tools to inexpensively increase the ensemble spread, which results in more reliable forecasts, especially for extreme precipitation events. Since an EPS provides a large number of potentially informative predictors, a variable selection is required in order to obtain a stable statistical model. A Bayesian formulation of quantile regression allows for inference about the selection of predictive covariates through the use of appropriate prior distributions. Moreover, the implementation of an additional process layer for the regression parameters accounts for spatial variations of the parameters. Bayesian quantile regression and its spatially adaptive extension are illustrated for the German-focused mesoscale weather prediction ensemble COSMO-DE-EPS, which has run (pre)operationally since December 2010 at the German Meteorological Service (DWD). Objective out-of-sample verification uses the quantile score (QS), a weighted absolute error between quantile forecasts and observations.
The QS is a proper scoring function and can be decomposed into reliability, resolution and uncertainty parts. A quantile reliability plot gives detailed insights into the predictive performance of the quantile forecasts.
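The quantile score used for verification is the pinball loss averaged over forecast cases; a minimal sketch (with made-up forecast and observation values) is:

```python
import numpy as np

def quantile_score(q_forecast, obs, tau):
    """Pinball (quantile) loss for quantile level tau: the proper score
    averaged over forecast cases in out-of-sample verification."""
    u = obs - q_forecast
    # Under-prediction is weighted by tau, over-prediction by (1 - tau)
    return np.mean(np.where(u >= 0, tau * u, (tau - 1) * u))

# Hypothetical 0.9-quantile precipitation forecasts (mm) vs observations
qf = np.array([10.0, 5.0, 0.5])
y = np.array([8.0, 7.0, 0.0])
qs = quantile_score(qf, y, 0.9)
```

Lower values are better, and because the loss is asymmetric, a well-calibrated tau-quantile forecast minimizes it in expectation.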
NASA Astrophysics Data System (ADS)
Ehsan, Muhammad Azhar; Tippett, Michael K.; Almazroui, Mansour; Ismail, Muhammad; Yousef, Ahmed; Kucharski, Fred; Omar, Mohamed; Hussein, Mahmoud; Alkhalaf, Abdulrahman A.
2017-05-01
Northern Hemisphere winter precipitation reforecasts from the European Centre for Medium-Range Weather Forecasts System-4 and six of the models in the North American Multi-Model Ensemble are evaluated, focusing on two regions (Region-A: 20°N-45°N, 10°E-65°E and Region-B: 20°N-55°N, 205°E-255°E) where winter precipitation is a dominant fraction of the annual total and where precipitation from mid-latitude storms is important. Predictability and skill (deterministic and probabilistic) are assessed for 1983-2013 using the multimodel composite (MME) of the seven prediction models. The MME climatological mean and variability over the two regions are comparable to observations, with some regional differences. The statistically significant decreasing trend observed in Region-B precipitation is captured well by the MME and most of the individual models. The El Niño-Southern Oscillation is a source of forecast skill, and the correlation coefficients between the Niño3.4 index and precipitation over Regions A and B are 0.46 and 0.35 respectively, both statistically significant at the 95% level. The MME reforecasts only weakly reproduce the observed teleconnection. Signal, noise and signal-to-noise ratio analysis shows that the signal variance over the two regions is very small compared to the noise variance, which tends to reduce the prediction skill. The MME ranked probability skill score is higher than that of the individual models, showing the advantage of a multimodel ensemble. Observed Region-A rainfall anomalies are strongly associated with the North Atlantic Oscillation, but none of the models reproduce this relation, which may explain the low skill over Region-A. The superior quality of the multimodel ensemble compared with individual models is mainly due to the larger ensemble size.
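The signal-to-noise decomposition referred to above can be sketched from a reforecast array of shape (years, members): the signal is the interannual variance of the ensemble mean, and the noise is the variance of members about their ensemble mean. The data below are synthetic, with a deliberately weak signal.

```python
import numpy as np

def signal_noise(reforecast):
    """reforecast: array (n_years, n_members) of seasonal anomalies.
    signal = variance of the ensemble mean across years (predictable part);
    noise  = variance of members about the ensemble mean (unpredictable part)."""
    ens_mean = reforecast.mean(axis=1)
    signal = ens_mean.var(ddof=1)
    noise = ((reforecast - ens_mean[:, None]) ** 2).mean()
    return signal, noise, signal / noise

rng = np.random.default_rng(4)
n_years, n_members = 31, 24
predictable = rng.normal(0, 0.3, n_years)   # weak ENSO-like signal, sd 0.3
members = predictable[:, None] + rng.normal(0, 1.0, (n_years, n_members))
s, n, ratio = signal_noise(members)
```

A ratio well below one, as here, is the situation the abstract describes: the noise variance dominates and limits the attainable prediction skill.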
NASA Astrophysics Data System (ADS)
Hosseiny, S. M. H.; Zarzar, C.; Gomez, M.; Siddique, R.; Smith, V.; Mejia, A.; Demir, I.
2016-12-01
The National Water Model (NWM) provides a platform for operationalizing nationwide flood inundation forecasting and mapping. The ability to model flood inundation on a national scale will provide invaluable information to decision makers and local emergency officials. Often, forecast products use deterministic model output to provide a visual representation of a single inundation scenario, which is subject to uncertainty from various sources. While this provides a straightforward representation of the potential inundation, the inherent uncertainty associated with the model output should be considered to optimize this tool for decision-making support. The goal of this study is to produce ensembles of future flood inundation conditions (i.e. extent, depth, and velocity) to spatially quantify and visually assess the uncertainties associated with the predicted flood inundation maps. The setting for this study is a highly urbanized watershed along Darby Creek in Pennsylvania. A forecasting framework coupling the NWM with multiple hydraulic models was developed to produce a suite of ensembles of future flood inundation predictions. Time-lagged ensembles from the NWM short-range forecasts were used to account for uncertainty associated with the hydrologic forecasts. The forecasts from the NWM were input to the iRIC and HEC-RAS two-dimensional software packages, from which water extent, depth, and flow velocity were output. Quantifying the agreement between output ensembles for each forecast grid provided the uncertainty metrics for predicted flood water inundation extent, depth, and flow velocity. For visualization, a series of flood maps that display flood extent, water depth, and flow velocity, along with the underlying uncertainty associated with each of the forecasted variables, were produced. The results from this study demonstrate the potential to incorporate and visualize model uncertainties in flood inundation maps in order to identify high flood risk zones.
NASA Astrophysics Data System (ADS)
Solvang Johansen, Stian; Steinsland, Ingelin; Engeland, Kolbjørn
2016-04-01
Running hydrological models with precipitation and temperature ensemble forcing to generate ensembles of streamflow is a commonly used method in operational hydrology. Evaluations of streamflow ensembles have however revealed that the ensembles are biased with respect to both mean and spread, so postprocessing of the ensembles is needed in order to improve forecast skill. The aims of this study are (i) to evaluate how postprocessing of streamflow ensembles works for Norwegian catchments within different hydrological regimes and (ii) to demonstrate how postprocessed streamflow ensembles are used operationally by a hydropower producer. These aims were achieved by postprocessing forecasted daily discharge at 10 lead times for 20 catchments in Norway, using EPS forcing from ECMWF applied to the semi-distributed HBV model with each catchment divided into 10 elevation zones. Statkraft Energi uses forecasts from these catchments for scheduling hydropower production. The catchments represent different hydrological regimes. Some catchments have stable winter conditions with winter low flow and a major flood event during spring or early summer caused by snow melt. Others have a more mixed snow-rain regime, often with a secondary flood season during autumn; in the coastal areas the streamflow is dominated by rain, and the main flood season is autumn and winter. For postprocessing, a Bayesian model averaging (BMA) model similar to that of Kleiber et al. (2011) is used. The model creates a predictive PDF that is a weighted average of PDFs centered on the individual bias-corrected forecasts. Here the weights are equal, since all ensemble members come from the same model and thus have the same probability. For modeling streamflow, the gamma distribution is chosen as the predictive PDF. The bias correction parameters and the PDF parameters are estimated using a 30-day sliding-window training period.
Preliminary results show that the improvement varies between catchments depending on where they are situated and on the hydrological regime. There is an improvement in CRPS for all catchments compared to the raw EPS ensembles, persisting up to lead times of 5-7 days. The postprocessing also improves the MAE of the median of the predictive PDF compared to the median of the raw EPS, but less so than the CRPS, typically up to lead times of 2-3 days. The streamflow ensembles are to some extent used operationally in Statkraft Energi (a hydropower company in Norway) for early warning, risk assessment and decision-making. Presently all forecasts used operationally for short-term scheduling are deterministic, but ensembles are inspected visually for expert assessment of risk in difficult situations where, e.g., there is a chance of overflow in a reservoir. However, there are plans to incorporate ensembles in the daily scheduling of hydropower production.
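CRPS comparisons such as those reported above are typically computed with the empirical ensemble form of the CRPS; a minimal sketch with made-up ensembles is:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS of one ensemble forecast against one observation:
    CRPS = E|X - y| - 0.5 * E|X - X'|  (Gneiting & Raftery energy form)."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

# A sharp, well-centred ensemble scores lower (better) than a biased one
good = crps_ensemble([9.0, 10.0, 11.0], 10.0)
bad = crps_ensemble([14.0, 15.0, 16.0], 10.0)
```

Averaging this score over all forecast days, before and after postprocessing, gives the catchment-level CRPS improvement the abstract refers to.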
Preliminary results of BTDF calibration of transmissive solar diffusers for remote sensing
NASA Astrophysics Data System (ADS)
Georgiev, Georgi T.; Butler, James J.; Thome, Kurt; Cooksey, Catherine; Ding, Leibo
2016-09-01
Satellite instruments operating in the reflected-solar wavelength region require accurate and precise determination of the optical properties of the diffusers used in their pre-flight and post-flight calibrations. The majority of recent and current space instruments use reflective diffusers. As a result, numerous Bidirectional Reflectance Distribution Function (BRDF) calibration comparisons have been conducted between the National Institute of Standards and Technology (NIST) and other industry and university-based metrology laboratories. However, based on literature searches and communications with NIST and other laboratories, no Bidirectional Transmittance Distribution Function (BTDF) measurement comparisons have been conducted between National Measurement Laboratories (NMLs) and other metrology laboratories. On the other hand, there is a growing interest in the use of transmissive diffusers in the calibration of satellite, airborne, and ground-based remote sensing instruments. Current remote sensing instruments employing transmissive diffusers include the Ozone Mapping and Profiler Suite (OMPS) Limb instrument on the Suomi-National Polar-orbiting Partnership (S-NPP) platform, the Geostationary Ocean Color Imager (GOCI) on the Korea Aerospace Research Institute's (KARI) Communication, Ocean, and Meteorological Satellite (COMS), the Ozone Monitoring Instrument (OMI) on NASA's Earth Observing System (EOS) Aura platform, the Tropospheric Emissions: Monitoring of Pollution (TEMPO) instrument, and the Geostationary Environmental Monitoring Spectrometer (GEMS). This ensemble of instruments requires validated BTDF measurements of their onboard transmissive diffusers from the ultraviolet through the near infrared. This paper presents the preliminary results of a BTDF comparison between the NASA Diffuser Calibration Laboratory (DCL) and NIST on quartz and thin Spectralon samples.
Effects of different representations of transport in the new EMAC-SWIFT chemistry climate model
NASA Astrophysics Data System (ADS)
Scheffler, Janice; Langematz, Ulrike; Wohltmann, Ingo; Kreyling, Daniel; Rex, Markus
2017-04-01
It is well known that the representation of atmospheric ozone chemistry in weather and climate models is essential for a realistic simulation of the atmospheric state. Interactively coupled chemistry climate models (CCMs) provide a means to realistically simulate the interaction between atmospheric chemistry and dynamics. The calculation of chemistry in CCMs, however, is computationally expensive, which renders complex chemistry models unsuitable for ensemble simulations or simulations with multiple climate change scenarios. In such simulations ozone is therefore usually prescribed as a climatological field or included via a fast linear ozone scheme. While prescribed climatological ozone fields are often not aligned with the modelled dynamics, a linear ozone scheme may not be applicable over a wide range of climatological conditions. An alternative approach to represent atmospheric chemistry in climate models, which can cope with non-linearities in ozone chemistry and is applicable to a wide range of climatic states, is the Semi-empirical Weighted Iterative Fit Technique (SWIFT), which is driven by reanalysis data and has been validated against observational satellite data and runs of a full chemistry and transport model. SWIFT has been implemented into the ECHAM/MESSy (EMAC) chemistry climate model, which uses a modular approach to climate modelling in which individual model components can be switched on and off. When using SWIFT in EMAC, there are several possibilities to represent the effect of transport inside the polar vortex: the semi-Lagrangian transport scheme of EMAC, and a transport parameterisation that is useful when using SWIFT in models without a transport scheme of their own. Here, we present results of equivalent simulations with different handling of transport, compare them with EMAC simulations with full interactive chemistry, and evaluate the results against observations.
NASA Astrophysics Data System (ADS)
Brochero, D.; Peña, J.; Anctil, F.; Boucher, M. A.; Nogales, J.; Reyes, N.
2016-12-01
The impacts of floods in Colombia during 2010 and 2011, as a result of ENSO in its cold phase (La Niña), marked a milestone in Colombian politics. In the La Mojana region the toll was around 100,000 homeless and 3 km2 of flooded crops. We model the upstream basin of La Mojana (3600 km2, with mean annual precipitation ranging from 1000 mm in the valleys to 4500 mm in the mountains). A forecasting system with at least three days of lead time was judged prudent. This basin receives streamflow that is highly regulated by multiple reservoirs, which we model with recurrent neural networks at lead times from 1 to 3 days. For hydrological modelling we use the GR4J, HBV and SIMHYD models; records of daily precipitation, temperature and streamflow; and 110 prediction scenarios of precipitation and temperature from Canada, the USA, Brazil and Europe extracted from the TIGGE database (MEPS). The calibration period runs from January 2004 to August 2011, with validation from September to December 2011 taking the MEPS as meteorological input. We analysed four alternatives for calibrating the 3-day Hydrological Ensemble Prediction System (HEPS): 1) only the GR4J model and observed values; 2) as 1, but including HBV and SIMHYD; 3) simultaneous multi-objective optimization of the three hydrological models, maximising reliability and minimising the CRPS, using observed and forecasted temperature and precipitation from the MEPS; and 4) as 3, but adding daily streamflow data assimilation. Results show that using multiple hydrological models is clearly advantageous, and that optimizing the hydrological models simultaneously in the probabilistic context is even more so. The evolution of the MAE on the reliability diagram (MAE-RD) is 43%, 27%, 17% and 15%, respectively, for the four alternatives.
Regarding the CRPS, the MAE results show that the probabilistic prediction improves on the deterministic estimate based on the daily mean HEPS scenario, although the improvement in reliability is not necessarily reflected in the CRPS for the four alternatives: 4.3, 3.06, 9.98 and 3.94, values accompanied by mean-scenario Nash-Sutcliffe efficiencies of 0.93, 0.96, 0.51 and 0.93, respectively. In conclusion, alternative 4 reaches a good compromise between deterministic and probabilistic performance (NS = 0.93 and MAE-RD = 15%).
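The CRPS that drives the calibration of alternatives 3 and 4 can be computed directly from the ensemble members; a minimal sketch of the standard sample-based estimator (not the authors' implementation), using invented values:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS for one forecast/observation pair:
    CRPS = E|X - y| - 0.5 * E|X - X'| over ensemble members X, X'."""
    m = np.asarray(members, dtype=float)
    term_err = np.mean(np.abs(m - obs))
    term_spread = 0.5 * np.mean(np.abs(m[:, None] - m[None, :]))
    return term_err - term_spread

# A perfect zero-spread ensemble scores 0; a uniformly biased one scores its bias.
perfect = crps_ensemble([2.0, 2.0, 2.0], 2.0)   # 0.0
biased = crps_ensemble([3.0, 3.0, 3.0], 2.0)    # 1.0
spread = crps_ensemble([1.0, 3.0], 2.0)         # 0.5
```

Lower is better; averaging this over the validation period gives the CRPS values quoted in the abstract.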
NASA Astrophysics Data System (ADS)
Addor, N.; Jaun, S.; Fundel, F.; Zappa, M.
2011-07-01
The Sihl River flows through Zurich, Switzerland's most populated city, for which it represents the largest flood threat. To anticipate extreme discharge events and provide decision support in case of flood risk, a hydrometeorological ensemble prediction system (HEPS) was launched operationally in 2008. This model chain relies on limited-area atmospheric forecasts provided by the deterministic model COSMO-7 and the probabilistic model COSMO-LEPS. These atmospheric forecasts are used to force a semi-distributed hydrological model (PREVAH), coupled to a hydraulic model (FLORIS). The resulting hydrological forecasts are eventually communicated to the stakeholders involved in the Sihl discharge management. This fully operational setting provides a real framework with which to compare the potential of deterministic and probabilistic discharge forecasts for flood mitigation. To study the suitability of HEPS for small-scale basins and to quantify the added value conveyed by the probability information, a reforecast was made for the period June 2007 to December 2009 for the Sihl catchment (336 km2). Several metrics support the conclusion that the performance gain can be up to 2 days of lead time for the catchment considered. Brier skill scores show that, overall, COSMO-LEPS-based hydrological forecasts outperform their COSMO-7-based counterparts for all the lead times and event intensities considered. The small size of the Sihl catchment does not prevent skillful discharge forecasts, but makes them particularly dependent on correct precipitation forecasts, as shown by comparisons with a reference run driven by observed meteorological parameters. Our evaluation stresses that the capacity of the model to provide confident and reliable mid-term probability forecasts for high discharges is limited.
The two most intense events of the study period are investigated utilising a novel graphical representation of probability forecasts, and are used to generate high discharge scenarios. They highlight the challenges of making decisions on the basis of hydrological predictions, and indicate the need for a tool to be used alongside the forecasts to compare the different mitigation actions possible in the Sihl catchment. No definitive conclusion on the model chain's capacity to forecast flooding events endangering the city of Zurich could be drawn because of the under-sampling of extreme events. Further research on the form of the reforecasts needed to draw inferences about floods associated with return periods of several decades or centuries is encouraged.
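The Brier skill score used above to compare the COSMO-LEPS- and COSMO-7-based forecasts reduces to a short computation; a hedged sketch with invented probabilities and outcomes for a "discharge exceeds threshold" event:

```python
import numpy as np

def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and 0/1 outcomes."""
    p = np.asarray(probs, dtype=float)
    o = np.asarray(outcomes, dtype=float)
    return float(np.mean((p - o) ** 2))

def brier_skill_score(bs, bs_ref):
    """BSS = 1 - BS/BS_ref; positive values indicate skill over the reference."""
    return 1.0 - bs / bs_ref

# Invented example: a probabilistic system vs a deterministic 0/1 reference
# that misses two of the four events.
bs_eps = brier_score([0.9, 0.1, 0.8, 0.2], [1, 0, 1, 0])   # 0.025
bs_det = brier_score([1.0, 0.0, 0.0, 1.0], [1, 0, 1, 0])   # 0.5
bss = brier_skill_score(bs_eps, bs_det)                    # 0.95
```

A BSS of 1 is a perfect forecast, 0 matches the reference, and negative values are worse than the reference.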
NASA Astrophysics Data System (ADS)
Jimenez-Guerrero, Pedro; Balzarini, Alessandra; Baró, Rocío; Curci, Gabriele; Forkel, Renate; Hirtl, Marcus; Honzak, Luka; Langer, Matthias; Pérez, Juan L.; Pirovano, Guido; San José, Roberto; Tuccella, Paolo; Werhahn, Johannes; Zabkar, Rahela
2014-05-01
The study of the response of atmospheric aerosol levels to a changing climate, and of how this affects the radiative budget of the Earth (direct, semi-direct and indirect effects), is an essential topic for building confidence in climate science, since these feedbacks involve the largest uncertainties nowadays. Air quality-climate interactions (AQCI) are, therefore, a key but uncertain contributor to the anthropogenic forcing that remains poorly understood. To build confidence in AQCI studies, regional-scale integrated meteorology-atmospheric chemistry models (i.e., models with on-line chemistry) that include detailed treatment of the aerosol life cycle and of aerosol impacts on radiation (direct effects) and clouds (indirect effects) are in demand. In this context, the main objective of this contribution is the study and definition of the uncertainties in the climate-chemistry-aerosol-cloud-radiation system associated with the direct radiative forcing and the indirect effect caused by aerosols over Europe, using an ensemble of fully-coupled meteorology-chemistry simulations with the WRF-Chem model run under the umbrella of the AQMEII Phase 2 international initiative. Simulations were performed for Europe for the entire year 2010. According to the common simulation strategy, the year was simulated as a sequence of 2-day time slices. For better comparability, the seven groups applied the same grid spacing of 23 km and shared common processing of initial and boundary conditions as well as anthropogenic and fire emissions. With the exception of a simulation with different cloud microphysics, identical physics options were chosen while the chemistry options were varied. Two model set-ups are considered here: one sub-ensemble of simulations not taking into account any aerosol feedbacks (the baseline case) and another sub-ensemble which differs from the former by the inclusion of the aerosol-radiation feedback.
The existing differences in meteorological variables (mainly 2-m temperature and precipitation) and air quality levels (mainly ozone and PM10) between the two sub-ensembles of WRF-Chem simulations have been characterized. In the case of ozone and PM10, an increase in solar radiation and temperature has generally resulted in enhanced photochemical activity and therefore a negative feedback (areas with low aerosol concentrations present more than 50 W m-2 higher global radiation under cloudy conditions). However, the simulated feedback effects between aerosol concentrations and meteorological variables, and on pollutant distributions, strongly depend on the model configuration and the meteorological situation. These results will help provide improved science-based foundations to better assess the impacts of climate variability, support the development of effective climate change policies and optimize private decision-making.
NASA Astrophysics Data System (ADS)
Liu, H.; Chan, C.; Huang, J.; Zhang, Y.; Choi, H.; Crawford, J. H.; Considine, D. B.; Zheng, X.; Oltmans, S. J.; Liu, S. C.; Zhang, L.; Liu, X.; Thouret, V.
2012-12-01
Tropospheric ozone concentrations and emissions of NOx have both increased significantly over China as a result of rapid industrialization during the past decade. These trends degrade local and regional air quality and have important effects on background tropospheric ozone and surface ozone over the downwind North Pacific and North America. In-situ observations of tropospheric ozone over China are therefore essential to testing and improving our understanding of the impact of Asian anthropogenic (versus natural) emissions and various chemical, physical, and dynamical processes on both regional and global tropospheric ozone. Despite their critical importance, in-situ observations of tropospheric ozone profiles have been sparse over most of the country. To investigate the ensemble of processes that control the distribution, variability, and sources of springtime tropospheric ozone over China and its surrounding regions, an intensive ozonesonde sounding campaign, called Transport of Air Pollutants and Tropospheric Ozone over China (TAPTO-China), was conducted at nine locations across China in the springs of 2004 (South China) and 2005 (North China). In this paper, we use a global 3-D model of tropospheric chemistry (GEOS-Chem) to examine the distribution and variability of tropospheric ozone over North China and to quantify its various sources, by analysis of intensive ozonesonde data obtained at four stations in North/Northwest China during the second phase of TAPTO-China (April-May 2005). These four stations are Xining (36.43N, 101.45E), Beijing (39.80N, 116.18E), Longfengshan (44.44N, 127.36E), and Aletai (47.73N, 88.08E). We drive GEOS-Chem with two sets of assimilated meteorological observations (GEOS-4 and GEOS-5) from the Goddard Earth Observing System (GEOS) of the NASA Global Modeling and Assimilation Office (GMAO), allowing us to examine the impacts of variability in meteorology.
We show that the observed tropospheric ozone mixing ratios exhibit strong spatio-temporal variability. The model generally simulates well the ozonesonde observations but tends to underestimate ozone in the upper troposphere over Beijing and Longfengshan. We find that Asian fossil fuel emissions, stratospheric ozone, African lightning NOx emissions, as well as intercontinental transport are the main contributors to tropospheric ozone over North China in spring. While the lower-tropospheric ozone is largely influenced by Asian fossil fuel emissions (except over Aletai, Northwest China), lightning NOx emissions have a larger impact on the upper-tropospheric ozone than Asian fossil fuel emissions (except over Longfengshan, Northeast China). Model simulations suggest that the European fossil fuel emissions contribute more to the lower-tropospheric ozone over Aletai than the Asian fossil fuel emissions. We will also show that tropospheric ozone measurements by Tropospheric Emission Spectrometer (TES) aboard the NASA EOS Aura satellite can be used to study tropospheric ozone variability at Xining.
NASA Astrophysics Data System (ADS)
Amengual, A.; Romero, R.; Vich, M.; Alonso, S.
2009-06-01
The improvement of short- and mid-range numerical runoff forecasts over the flood-prone Spanish Mediterranean area is a challenging issue. This work analyses four intense precipitation events which produced floods of different magnitude over the Llobregat river basin, a medium-sized catchment located in Catalonia, north-eastern Spain. One of them was a devastating flash flood, known as the "Montserrat" event, which produced 5 fatalities and material losses estimated at about 65 million euros. The characterization of the Llobregat basin's hydrological response to these floods is first assessed by using rain-gauge data and the Hydrologic Engineering Center's Hydrological Modeling System (HEC-HMS) runoff model. Secondly, the non-hydrostatic fifth-generation Pennsylvania State University/NCAR mesoscale model (MM5) is nested within the ECMWF large-scale forecast fields in a set of 54-h simulations to provide quantitative precipitation forecasts (QPFs) for each hydrometeorological episode. The hydrological model is forced with these QPFs to evaluate the reliability of the resulting discharge forecasts, while an ensemble prediction system (EPS) based on perturbed atmospheric initial and boundary conditions has been designed to test the value of a probabilistic strategy versus the previous deterministic approach. Specifically, a Potential Vorticity (PV) inversion technique has been used to perturb the MM5 initial and boundary states (i.e. the ECMWF forecast fields). For that purpose, a PV error climatology was first derived in order to introduce realistic PV perturbations in the EPS. Results show the benefits of using a probabilistic approach in those cases where the deterministic QPF presents significant deficiencies over the Llobregat river basin in terms of rainfall amounts, timing and localization. These deficiencies in the precipitation fields have a major impact on flood forecasts.
Our ensemble strategy has been found useful to reduce the biases at different hydrometric sections along the watershed. Therefore, in an operational context, the devised methodology could be useful to expand the lead times associated with the prediction of similar future floods, helping to alleviate their possible hazardous consequences.
NASA Astrophysics Data System (ADS)
Silva, R.; West, J.; Anenberg, S.; Lamarque, J.; Shindell, D. T.; Bergmann, D. J.; Berntsen, T.; Cameron-Smith, P. J.; Collins, B.; Ghan, S. J.; Josse, B.; Nagashima, T.; Naik, V.; Plummer, D.; Rodriguez, J. M.; Szopa, S.; Zeng, G.
2012-12-01
Climate change can adversely affect air quality, through changes in meteorology, atmospheric chemistry, and emissions. Future changes in air pollutant emissions will also profoundly influence air quality. These changes in air quality can affect human health, as exposure to ground-level ozone and fine particulate matter (PM2.5) has been associated with premature human mortality. Here we will quantify the global mortality impacts of past and future climate change, considering the effects of climate change on air quality isolated from emission changes. The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) has simulated the past and future surface concentrations of ozone and PM2.5 from each of several GCMs, for emissions from 1850 ("preindustrial") to 2000 ("present-day"), and for the IPCC AR5 Representative Concentration Pathways (RCPs) scenarios to 2100. We will use ozone and PM2.5 concentrations from simulations from five or more global models of atmospheric dynamics and chemistry, for a base year (present-day), pre-industrial conditions, and future scenarios, considering changes in climate and emissions. We will assess the mortality impacts of past climate change by using one simulation ensemble with present emissions and climate and one with present emissions but 1850 climate. We will similarly quantify the potential impacts of future climate change under the four RCP scenarios in 2030, 2050 and 2100. All model outputs will be regridded to the same resolution to estimate multi-model medians and range in each grid cell. Resulting premature deaths will be calculated using these medians along with epidemiologically-derived concentration-response functions, and present-day or future projections of population and baseline mortality rates, considering aging and transitioning disease rates over time. 
The spatial distributions of current and future global premature mortalities due to ozone and PM2.5 outdoor air pollution will be presented separately. These results will strengthen our understanding of the impacts of climate change today, and in future years considering different plausible scenarios.
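The mortality calculation described above typically combines a concentration-response function with population and baseline mortality data; a sketch assuming a log-linear relative-risk form (the beta value and all inputs below are illustrative placeholders, not ACCMIP results):

```python
import math

def premature_deaths(delta_conc, beta, baseline_mortality, population):
    """Attributable deaths under a log-linear concentration-response function:
    attributable fraction AF = 1 - exp(-beta * delta_concentration)."""
    af = 1.0 - math.exp(-beta * delta_conc)
    return af * baseline_mortality * population

# Illustrative inputs only: a 5 ug/m3 PM2.5 change, an assumed beta, a generic
# baseline mortality rate of 0.8% per year, and a population of one million.
deaths = premature_deaths(delta_conc=5.0, beta=0.005827,
                          baseline_mortality=0.008, population=1_000_000)
```

In a multi-model setting this would be evaluated per grid cell using the multi-model median concentration change, as the abstract describes.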
Status of Middle Atmosphere-Climate Models: Results SPARC-GRIPS
NASA Technical Reports Server (NTRS)
Pawson, Steven; Kodera, Kunihiko
2003-01-01
The middle atmosphere is an important component of the climate system, primarily because of the radiative forcing of ozone. Middle atmospheric ozone can change, over long times, because of changes in the abundance of anthropogenic pollutants which catalytically destroy it, and because of the temperature sensitivity of kinetic reaction rates. There is thus a complex interaction between ozone and climate, involving chemical and climatic mechanisms. One question of interest is how ozone will change over the next decades, as the "greenhouse-gas cooling" of the middle atmosphere increases but the concentrations of chlorine species decrease (because of policy changes). A major obstacle when addressing this question concerns the climate biases in current middle atmosphere-climate models, especially their ability to simulate the correct seasonal cycle at high latitudes, and the existence of temperature biases in the global mean. This paper will present a summary of recent results from the "GCM-Reality Intercomparison Project for SPARC" (GRIPS) initiative. A set of middle atmosphere-climate models has been compared, identifying common biases. Mechanisms for these biases are being studied in some detail, including off-line assessments of the radiation transfer codes and coordinated studies of the impacts of gravity wave drag due to sub-grid-scale processes. Results from the ensemble of models will be presented, along with numerical experiments undertaken with one or more models, designed to investigate the mechanisms at work in the atmosphere. The discussion will focus on dynamical and radiative mechanisms in the current climate, but implications for coupled ozone chemistry and the future climate will be assessed.
Probabilistic forecasting of extreme weather events based on extreme value theory
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert
2016-04-01
Extreme events in weather and climate, such as high wind gusts, heavy precipitation or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X,Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed). More specifically, two problems are addressed: (1) Given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis of Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X,Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008).
References: Coles, S. (2001) An Introduction to Statistical Modeling of Extreme Values. Springer-Verlag. Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter 117, 8-13. Ramos, A. and Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B 71, 219-241.
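Before fitting a parametric bivariate tail model, Problem 1 can be approximated empirically; a sketch of a crude conditional-exceedance estimate over paired forecast/observation samples (conditioned on the forecast exceeding a threshold rather than on X = x_0 exactly, and using invented data):

```python
import numpy as np

def cond_exceedance_prob(x, y, x_thr, y_thr):
    """Empirical Pr{Y > y_thr | X > x_thr} from paired samples of the
    deterministic forecast X and the observation Y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sel = x > x_thr
    if not sel.any():
        return float("nan")    # no forecasts above the threshold
    return float(np.mean(y[sel] > y_thr))

# Invented forecast/observation pairs (e.g. wind speeds in m/s).
x = np.array([1.0, 5.0, 6.0, 7.0, 2.0])
y = np.array([0.5, 4.0, 7.0, 8.0, 1.0])
p = cond_exceedance_prob(x, y, x_thr=4.0, y_thr=5.0)   # 2 of 3 cases -> 2/3
```

The empirical estimate degrades rapidly at extreme thresholds, which is precisely why the abstract fits an extreme-value model instead.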
Multi-scale dynamical behavior of spatially distributed systems: a deterministic point of view
NASA Astrophysics Data System (ADS)
Mangiarotti, S.; Le Jean, F.; Drapeau, L.; Huc, M.
2015-12-01
Physical and biophysical systems are spatially distributed systems. Their behavior can be observed or modelled spatially at various resolutions. In this work, a deterministic point of view is adopted to analyze multi-scale behavior, taking a set of ordinary differential equations (ODE) as the elementary part of the system. To perform the analyses, scenes of study are generated from ensembles of identical elementary ODE systems. Without any loss of generality, their dynamics is chosen chaotic in order to ensure sensitivity to initial conditions, that is, one fundamental property of the atmosphere under unstable conditions [1]. The Rössler system [2] is used for this purpose, for both its topological and algebraic simplicity [3,4]. Two cases are considered: the chaotic oscillators composing the scene of study are taken either independent, or in phase synchronization. Scale behaviors are analyzed considering the scene of study as aggregations (basically obtained by spatially averaging the signal) or as associations (obtained by concatenating the time series). The global modeling technique is used to perform the numerical analyses [5]. One important result of this work is that, under phase synchronization, a scene of aggregated dynamics can be approximated by the elementary system composing the scene, but with a modified parameterization [6]. This is shown based on numerical analyses, then demonstrated analytically and generalized to a larger class of ODE systems. Preliminary applications to cereal crops observed from satellite are also presented.
References: [1] Lorenz, Deterministic nonperiodic flow. J. Atmos. Sci. 20, 130-141 (1963). [2] Rössler, An equation for continuous chaos. Phys. Lett. A 57, 397-398 (1976). [3] Gouesbet & Letellier, Global vector-field reconstruction by using a multivariate polynomial L2 approximation on nets. Phys. Rev. E 49, 4955-4972 (1994). [4] Letellier, Roulin & Rössler, Inequivalent topologies of chaos in simple equations. Chaos, Solitons & Fractals 28, 337-360 (2006). [5] Mangiarotti, Coudret, Drapeau & Jarlan, Polynomial search and global modeling. Phys. Rev. E 86(4), 046205 (2012). [6] Mangiarotti, Modélisation globale et caractérisation topologique de dynamiques environnementales. Habilitation à Diriger des Recherches, Univ. Toulouse 3 (2014).
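The elementary building block of the scenes of study can be reproduced in a few lines; a sketch that integrates a small ensemble of Rössler oscillators with a simple explicit-Euler scheme (a cruder integrator than would be used in practice) and forms the aggregated signal by spatial averaging:

```python
import numpy as np

def rossler_step(state, dt, a=0.2, b=0.2, c=5.7):
    """One explicit-Euler step of the Rossler system:
    dx/dt = -y - z,  dy/dt = x + a*y,  dz/dt = b + z*(x - c)."""
    x, y, z = state
    return state + dt * np.array([-y - z, x + a * y, b + z * (x - c)])

def simulate(state0, n_steps, dt=0.01):
    """Integrate one oscillator, returning the trajectory as (n_steps, 3)."""
    traj = np.empty((n_steps, 3))
    s = np.array(state0, dtype=float)
    for i in range(n_steps):
        s = rossler_step(s, dt)
        traj[i] = s
    return traj

# Scene of study: identical oscillators with slightly perturbed initial states;
# the aggregated dynamics is the spatial average of the ensemble at each step.
rng = np.random.default_rng(0)
ensemble = [simulate(np.array([1.0, 1.0, 0.0]) + 0.01 * rng.standard_normal(3), 500)
            for _ in range(5)]
aggregated = np.mean(ensemble, axis=0)   # shape (500, 3)
```

With independent (unsynchronized) phases the averaging smears the signal, whereas under phase synchronization the aggregated trajectory retains the elementary system's structure, the paper's central result.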
Multi-model Estimates of Intercontinental Source-Receptor Relationships for Ozone Pollution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fiore, A M; Dentener, F J; Wild, O
2008-10-16
Understanding the surface O3 response over a 'receptor' region to emission changes over a foreign 'source' region is key to evaluating the potential gains from an international approach to abate ozone (O3) pollution. We apply an ensemble of 21 global and hemispheric chemical transport models to estimate the spatial average surface O3 response over East Asia (EA), Europe (EU), North America (NA) and South Asia (SA) to 20% decreases in anthropogenic emissions of the O3 precursors NOx, NMVOC and CO (individually and combined) from each of these regions. We find that the ensemble mean surface O3 concentrations in the base case (year 2001) simulation match available observations throughout the year over EU but overestimate them by >10 ppb during summer and early fall over the eastern U.S. and Japan. The sum of the O3 responses to NOx, CO and NMVOC decreases separately is approximately equal to that from a simultaneous reduction of all precursors. We define a continental-scale 'import sensitivity' as the ratio of the O3 response to the 20% reductions in foreign versus 'domestic' (i.e., over the source region itself) emissions. For example, the combined reduction of emissions from the 3 foreign regions produces an ensemble spatial mean decrease of 0.6 ppb over EU (0.4 ppb from NA), less than the 0.8 ppb from the reduction of EU emissions, leading to an import sensitivity ratio of 0.7. The ensemble mean surface O3 response to foreign emissions is largest in spring and late fall (0.7-0.9 ppb decrease in all regions from the combined precursor reductions in the 3 foreign regions), with import sensitivities ranging from 0.5 to 1.1 (responses to domestic emission reductions are 0.8-1.6 ppb).
High O3 values are much more sensitive to domestic emissions than to foreign emissions, as indicated by lower import sensitivities of 0.2 to 0.3 during July in EA, EU and NA, when O3 levels are typically highest, and by the weaker relative response of annual incidences of daily maximum 8-hour average O3 above 60 ppb to emission reductions in a foreign region (<10-20% of that to domestic) as compared to the annual mean response (up to 50% of that to domestic). Applying the ensemble annual mean results to changes in anthropogenic emissions from 1996 to 2002, we estimate a Northern Hemispheric increase in background surface O3 of about 0.1 ppb yr-1, at the low end of the 0.1-0.5 ppb yr-1 derived from observations. From an additional simulation in which global atmospheric methane was reduced, we infer that 20% reductions in anthropogenic methane emissions from a foreign source region would yield an O3 response in a receptor region that roughly equals that produced by combined 20% reductions of anthropogenic NOx, NMVOC and CO emissions from the foreign source region.
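The import sensitivity defined above is simply a ratio of responses; a minimal illustration using the EU numbers quoted in the abstract (0.6 ppb from the three foreign regions combined versus 0.8 ppb domestic, i.e. 0.75 before rounding to 0.7):

```python
def import_sensitivity(foreign_response_ppb, domestic_response_ppb):
    """Ratio of the spatial-mean O3 response to foreign emission reductions
    versus the response to the same fractional domestic reductions."""
    return foreign_response_ppb / domestic_response_ppb

# EU example from the text: 0.6 ppb (3 foreign regions) vs 0.8 ppb (domestic).
ratio_eu = import_sensitivity(0.6, 0.8)   # 0.75, quoted as ~0.7 after rounding
```

Values near or above 1 indicate that imported pollution matters as much as domestic emissions for the metric in question.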
A Wind Forecasting System for Energy Application
NASA Astrophysics Data System (ADS)
Courtney, Jennifer; Lynch, Peter; Sweeney, Conor
2010-05-01
Accurate forecasting of available energy is crucial for the efficient management and use of wind power in the national power grid. With energy output critically dependent upon wind strength, there is a need to reduce the errors associated with wind forecasting. The objective of this research is to get the best possible wind forecasts for the wind energy industry. To achieve this goal, three methods are being applied. First, a mesoscale numerical weather prediction (NWP) model called WRF (Weather Research and Forecasting) is being used to predict wind values over Ireland. Currently, a grid resolution of 10 km is used, and higher model resolutions are being evaluated to establish whether they are economically viable given the forecast skill improvement they produce. Second, the WRF model is being used in conjunction with ECMWF (European Centre for Medium-Range Weather Forecasts) ensemble forecasts to produce a probabilistic weather forecasting product. Due to the chaotic nature of the atmosphere, a single, deterministic weather forecast can only have limited skill. The ECMWF ensemble methods produce an ensemble of 51 global forecasts, twice a day, by perturbing the initial conditions of a 'control' forecast which is the best estimate of the initial state of the atmosphere. This method provides an indication of the reliability of the forecast and a quantitative basis for probabilistic forecasting. The limitation of ensemble forecasting lies in the fact that the perturbed model runs behave differently under different weather patterns, and each model run is equally likely to be closest to the observed weather situation. Models have biases, and involve assumptions about physical processes and forcing factors such as the underlying topography. Third, Bayesian Model Averaging (BMA) is being applied to the output from the ensemble forecasts in order to statistically post-process the results and achieve a better wind forecasting system.
BMA is a promising technique that will offer calibrated probabilistic wind forecasts, which will be invaluable in wind energy management. In brief, this method turns the ensemble forecasts into a calibrated predictive probability distribution. Each ensemble member is given a 'weight' determined by its relative predictive skill over a training period of around 30 days. Verification is carried out using observed wind data from operational wind farms. The forecasts are then compared, using skill scores, with existing forecasts produced by ECMWF and Met Éireann. We are developing decision-making models to show the benefits achieved using the data produced by our wind energy forecasting system. An energy trading model will be developed, based on the rules currently used by the Single Electricity Market Operator for energy trading in Ireland. This trading model will illustrate the potential for financial savings from using the forecast data generated by this research.
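The BMA weighting described here can be sketched as a Gaussian mixture centred on the ensemble members, with weights fitted by EM over a training period; the fixed common spread and the toy data below are simplifications, not the operational setup:

```python
import numpy as np

def bma_weights(forecasts, obs, sigma=1.0, n_iter=200):
    """Toy EM estimate of BMA weights for a Gaussian mixture centred on the
    ensemble members. forecasts: (n_times, n_members); obs: (n_times,).
    A fixed, common spread sigma is assumed for simplicity."""
    f = np.asarray(forecasts, dtype=float)
    y = np.asarray(obs, dtype=float)[:, None]
    n_members = f.shape[1]
    w = np.full(n_members, 1.0 / n_members)
    for _ in range(n_iter):
        dens = np.exp(-0.5 * ((y - f) / sigma) ** 2)   # unnormalised Gaussians
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)        # E-step: responsibilities
        w = resp.mean(axis=0)                          # M-step: weight update
    return w

# Member 0 tracks the observations over the training period; member 1 is far
# off, so BMA shifts nearly all the weight onto member 0.
fc = np.array([[10.0, 20.0], [11.0, 25.0], [9.0, 18.0]])
ob = np.array([10.2, 10.8, 9.1])
w = bma_weights(fc, ob)
```

Operational BMA (e.g. Raftery et al.'s formulation) also estimates the spread and a bias correction per member; this sketch keeps only the weight-estimation step.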
NASA Astrophysics Data System (ADS)
Zatarain-Salazar, J.; Reed, P. M.; Herman, J. D.; Giuliani, M.; Castelletti, A.
2014-12-01
Globally, reservoir operations provide fundamental services for water supply, energy generation, recreation, and ecosystems. The pressures of expanding populations, climate change, and increased energy demands are motivating significant investment in re-operationalizing existing reservoirs or defining operations for new ones. Recent work has highlighted the potential benefits of exploiting advances in many-objective optimization and direct policy search (DPS) to address these systems' multi-sector demand tradeoffs. This study contributes a comprehensive diagnostic assessment of the efficiency, effectiveness, reliability, and controllability of multi-objective evolutionary algorithms (MOEAs) when supporting DPS for the Conowingo dam in the Lower Susquehanna River Basin. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to the system's competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. Seven benchmark and state-of-the-art MOEAs are tested on deterministic and stochastic instances of the Susquehanna test case. In the deterministic formulation, the operating objectives are evaluated over the historical realization of the hydroclimatic variables (i.e., inflows and evaporation rates). In the stochastic formulation, the same objectives are instead evaluated over an ensemble of stochastic inflow and evaporation rate realizations. The algorithms are evaluated on their ability to support DPS in discovering reservoir operations that compose the tradeoffs among six multi-sector performance objectives with thirty-two decision variables. Our diagnostic results highlight that many-objective DPS is very challenging for modern MOEAs and that epsilon dominance is critical for attaining high levels of performance.
The epsilon-dominance algorithms (epsilon-MOEA, epsilon-NSGAII, and the auto-adaptive Borg MOEA) are statistically superior for the six-objective Susquehanna instance of this important class of problems. Additionally, shifting from deterministic history-based DPS to stochastic DPS significantly increases the difficulty of the problem.
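The epsilon-dominance mechanism credited above can be illustrated with a minimal sketch (not the authors' implementation): each objective vector is mapped to a coarse "box" at a user-chosen resolution epsilon, and dominance is tested between boxes rather than raw points, which enforces both convergence pressure and solution diversity.

```python
import numpy as np

def eps_box(point, eps):
    """Map an objective vector (minimization) to its epsilon-box index."""
    return np.floor(np.asarray(point) / eps).astype(int)

def eps_dominates(a, b, eps):
    """True if a epsilon-dominates b: a's box weakly dominates b's box
    in every objective and strictly in at least one (minimization)."""
    ba, bb = eps_box(a, eps), eps_box(b, eps)
    return bool(np.all(ba <= bb) and np.any(ba < bb))

# Toy 2-objective example with epsilon = 0.1 per objective
a = [0.12, 0.31]   # box (1, 3)
b = [0.19, 0.58]   # box (1, 5)
print(eps_dominates(a, b, 0.1))  # a's box dominates b's box -> True
```

Points that fall into the same box (e.g. `[0.12, 0.31]` and `[0.16, 0.34]`) do not epsilon-dominate each other; production implementations add a tie-breaking rule within a shared box.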
Performance of multi-physics ensembles in convective precipitation events over northeastern Spain
NASA Astrophysics Data System (ADS)
García-Ortega, E.; Lorenzana, J.; Merino, A.; Fernández-González, S.; López, L.; Sánchez, J. L.
2017-07-01
Convective precipitation with hail greatly affects southwestern Europe, causing major economic losses. The local character of this meteorological phenomenon is a serious obstacle to forecasting. Therefore, the development of reliable short-term forecasts constitutes an essential challenge to minimizing and managing risks. However, deterministic outcomes are affected by different uncertainty sources, such as physics parameterizations. This study examines the performance of different combinations of physics schemes of the Weather Research and Forecasting model in describing the spatial distribution of precipitation in convective environments with hail falls. Two 30-member multi-physics ensembles, with two and three domains of maximum resolution 9 and 3 km, respectively, were designed using various combinations of cumulus, microphysics, and radiation schemes. The experiment was evaluated for 10 convective precipitation days with hail over 2005-2010 in northeastern Spain. Different indexes were used to evaluate the ability of each ensemble member to capture the precipitation patterns, which were compared with observations from a rain-gauge network. A standardized metric was constructed to identify optimal performers. Results show interesting differences between the two ensembles. In two-domain simulations, the selection of cumulus parameterizations was crucial, with the Betts-Miller-Janjic scheme performing best. In contrast, the Kain-Fritsch cumulus scheme gave the poorest results, suggesting that it should not be used in the study area. Nevertheless, in three-domain simulations, the cumulus schemes used in coarser domains were not critical and the best results depended mainly on microphysics schemes. The best performance was shown by the Morrison, New Thompson, and Goddard microphysics schemes.
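The abstract does not give the exact form of the standardized metric; a common construction (assumed here, not necessarily the authors') z-scores each verification index across ensemble members, with every index oriented so that larger is better, and averages the z-scores into one comparable metric per member.

```python
import numpy as np

# Hypothetical verification scores: rows = ensemble members, columns = indexes
# (e.g. correlation, negated RMSE, threat score), oriented so larger is better.
scores = np.array([
    [0.62, -1.8, 0.41],
    [0.55, -2.1, 0.35],
    [0.71, -1.5, 0.48],
])

# Standardize each index across members (zero mean, unit variance), then
# average across indexes so indexes with different units are comparable.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
standardized_metric = z.mean(axis=1)
best_member = int(np.argmax(standardized_metric))
print(best_member)  # member 2 scores best on all three indexes -> 2
```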
Li, Ke; Zhang, Peng; Crittenden, John C; Guhathakurta, Subhrajit; Chen, Yongsheng; Fernando, Harindra; Sawhney, Anil; McCartney, Peter; Grimm, Nancy; Kahhat, Ramzy; Joshi, Himanshu; Konjevod, Goran; Choi, Yu-Jin; Fonseca, Ernesto; Allenby, Braden; Gerrity, Daniel; Torrens, Paul M
2007-07-15
To encourage sustainable development, engineers and scientists need to understand the interactions among social decision-making, development and redevelopment, land, energy, and material use, and their environmental impacts. In this study, a framework that connects these interactions was proposed to guide more sustainable urban planning and construction practices. Focusing on the rapidly urbanizing setting of Phoenix, Arizona, complexity models and deterministic models were assembled into a metamodel, called Sustainable Futures 2100, which was used to predict land use and development, quantify construction material demands, analyze life cycle environmental impacts, and simulate future ground-level ozone formation.
Jenouvrier, Stéphanie; Holland, Marika; Stroeve, Julienne; Barbraud, Christophe; Weimerskirch, Henri; Serreze, Mark; Caswell, Hal
2012-09-01
Sea ice conditions in the Antarctic affect the life cycle of the emperor penguin (Aptenodytes forsteri). We present a population projection for the emperor penguin population of Terre Adélie, Antarctica, by linking demographic models (stage-structured, seasonal, nonlinear, two-sex matrix population models) to sea ice forecasts from an ensemble of IPCC climate models. Based on maximum likelihood capture-mark-recapture analysis, we find that seasonal sea ice concentration anomalies (SICa) affect adult survival and breeding success. Demographic models show that both deterministic and stochastic population growth rates are maximized at intermediate values of annual SICa, because neither the complete absence of sea ice, nor heavy and persistent sea ice, would provide satisfactory conditions for the emperor penguin. We show that under some conditions the stochastic growth rate is positively affected by the variance in SICa. We identify an ensemble of five general circulation climate models whose output closely matches the historical record of sea ice concentration in Terre Adélie. The output of this ensemble is used to produce stochastic forecasts of SICa, which in turn drive the population model. Uncertainty is included by incorporating multiple climate models and by a parametric bootstrap procedure that includes parameter uncertainty due to both model selection and estimation error. The median of these simulations predicts a decline of the Terre Adélie emperor penguin population of 81% by the year 2100. We find a 43% chance of an even greater decline, of 90% or more. The uncertainty in population projections reflects large differences among climate models in their forecasts of future sea ice conditions. One such model predicts population increases over much of the century, but overall, the ensemble of models predicts that population declines are far more likely than population increases. We conclude that climate change is a significant risk for the emperor penguin.
Our analytical approach, in which demographic models are linked to IPCC climate models, is powerful and generally applicable to other species and systems. © 2012 Blackwell Publishing Ltd.
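The stochastic growth rate discussed above can be estimated by iterating projection matrices drawn at random from an environment-dependent ensemble and averaging the one-step log growth. The sketch below uses two hypothetical 2-stage matrices for "good" and "poor" sea-ice years; it is not the Terre Adélie parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_growth_rate(matrices, probs, n_steps=20000):
    """Estimate log(lambda_s) by iterating randomly drawn projection
    matrices and averaging the one-step log growth of total abundance."""
    n = matrices[0].shape[0]
    v = np.ones(n) / n
    logs = []
    for _ in range(n_steps):
        A = matrices[rng.choice(len(matrices), p=probs)]
        v = A @ v
        s = v.sum()
        logs.append(np.log(s))
        v /= s  # renormalize to avoid overflow/underflow
    return float(np.mean(logs))

# Hypothetical 2-stage matrices for favourable vs unfavourable years
good = np.array([[0.0, 1.6], [0.8, 0.9]])
poor = np.array([[0.0, 0.4], [0.5, 0.6]])
r = stochastic_growth_rate([good, poor], probs=[0.5, 0.5])
print(r)  # log stochastic growth rate; > 0 means long-run growth
```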
Improving medium-range and seasonal hydroclimate forecasts in the southeast USA
NASA Astrophysics Data System (ADS)
Tian, Di
Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stakeholders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) based on different approaches were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American Multi-Model Ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P), and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts combined with the climatology of Reanalysis 2 solar radiation (Rs) produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Both irrigation scheduling driven by the GEFS-based ETo forecasts and GEFS-based ETo forecast skill were generally positive up to one week throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, Tmin, Rs, and Wind.
The CFSv2 model could better predict ETo in cold seasons during El Niño Southern Oscillation (ENSO) events only when the forecast was initialized during an ENSO event. Downscaled P and T2M forecasts were produced by directly downscaling the NMME P and T2M output or indirectly using the NMME forecasts of Niño 3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill, which occurred in cold seasons. The bias-corrected NMME ensemble forecast skill did not outperform the best single model.
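The abstract does not state which ETo formulation was used (the input list of temperature, humidity, wind, and radiation suggests Penman-Monteith). As a simpler self-contained illustration, the Hargreaves-Samani equation estimates ETo from air temperature and extraterrestrial radiation alone:

```python
def eto_hargreaves(tmax_c, tmin_c, ra_mm_day):
    """Hargreaves-Samani reference evapotranspiration (mm/day).
    ra_mm_day is extraterrestrial radiation expressed in mm/day of
    equivalent evaporation (a tabulated function of latitude and date)."""
    tmean = (tmax_c + tmin_c) / 2.0
    return 0.0023 * (tmean + 17.8) * (tmax_c - tmin_c) ** 0.5 * ra_mm_day

# A warm summer day (hypothetical values; Ra ~ 16 mm/day equivalent)
print(round(eto_hargreaves(33.0, 22.0, 16.0), 2))  # -> 5.53
```

The same pattern applies when driving such a formula with GEFS analog forecasts of Tmax and Tmin: each ensemble member yields one ETo value, and their mean gives a deterministic forecast.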
Power flow prediction in vibrating systems via model reduction
NASA Astrophysics Data System (ADS)
Li, Xianhui
This dissertation focuses on power flow prediction in vibrating systems. Reduced order models (ROMs) are built based on rational Krylov model reduction, which preserves power flow information in the original systems over a specified frequency band. Stiffness and mass matrices of the ROMs are obtained by projecting the original system matrices onto the subspaces spanned by forced responses. A matrix-free algorithm is designed to construct ROMs directly from the power quantities at selected interpolation frequencies. Strategies for parallel implementation of the algorithm via the Message Passing Interface (MPI) are proposed. The quality of ROMs is iteratively refined according to an error estimate based on residual norms. Band capacity is proposed to provide an a priori estimate of the sizes of good-quality ROMs. Frequency averaging is recast as ensemble averaging, and a Cauchy distribution is used to simplify the computation. Besides model reduction for deterministic systems, details of constructing ROMs for parametric and nonparametric random systems are also presented. Case studies have been conducted on testbeds from the Harwell-Boeing collection. Input and coupling power flow are computed for the original systems and the ROMs. Good agreement is observed in all cases.
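The projection step described above can be sketched in a few lines: solve for forced responses at the interpolation frequencies, orthonormalize them, and project the system matrices onto that basis (Galerkin projection). The toy spring-mass data below is illustrative only; the dissertation's matrix-free algorithm avoids forming these projections explicitly.

```python
import numpy as np

def build_rom(K, M, F, freqs):
    """Galerkin ROM: project (K - w^2 M) x = F onto the span of the
    forced responses at the interpolation frequencies."""
    cols = [np.linalg.solve(K - w**2 * M, F) for w in freqs]
    V, _ = np.linalg.qr(np.column_stack(cols))  # orthonormal basis
    return V.T @ K @ V, V.T @ M @ V, V.T @ F, V

# Toy 6-DOF spring-mass chain (hypothetical, not from the dissertation)
n = 6
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
M = np.eye(n)
F = np.zeros(n)
F[0] = 1.0
Kr, Mr, Fr, V = build_rom(K, M, F, freqs=[0.3, 0.7, 1.1])

# The ROM reproduces the full response exactly at an interpolation frequency,
# because that response lies in the projection subspace.
w = 0.7
x_full = np.linalg.solve(K - w**2 * M, F)
x_rom = V @ np.linalg.solve(Kr - w**2 * Mr, Fr)
print(np.allclose(x_full, x_rom))  # True
```

Power quantities (e.g. input power from velocity and force at the drive point) can then be evaluated on the 3x3 reduced system instead of the full one.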
Finally making sense of the double-slit experiment.
Aharonov, Yakir; Cohen, Eliahu; Colombo, Fabrizio; Landsberger, Tomer; Sabadini, Irene; Struppa, Daniele C; Tollaksen, Jeff
2017-06-20
Feynman stated that the double-slit experiment "…has in it the heart of quantum mechanics. In reality, it contains the only mystery" and that "nobody can give you a deeper explanation of this phenomenon than I have given; that is, a description of it" [Feynman R, Leighton R, Sands M (1965) The Feynman Lectures on Physics ]. We rise to the challenge with an alternative to the wave function-centered interpretations: instead of a quantum wave passing through both slits, we have a localized particle with nonlocal interactions with the other slit. Key to this explanation is dynamical nonlocality, which naturally appears in the Heisenberg picture as nonlocal equations of motion. This insight led us to develop an approach to quantum mechanics which relies on pre- and postselection, weak measurements, and deterministic and modular variables. We consider those properties of a single particle that are deterministic to be primal. The Heisenberg picture allows us to specify the most complete enumeration of such deterministic properties, in contrast to the Schrödinger wave function, which remains an ensemble property. We exercise this approach by analyzing a version of the double-slit experiment augmented with postselection, showing that only it, and not the wave function approach, can be accommodated within a time-symmetric interpretation, where interference appears even when the particle is localized. Although the Heisenberg and Schrödinger pictures are equivalent formulations, nevertheless, the framework presented here has led to insights, intuitions, and experiments that were missed from the old perspective.
PRELIMINARY RESULTS OF BTDF CALIBRATION OF TRANSMISSIVE SOLAR DIFFUSERS FOR REMOTE SENSING.
Georgiev, Georgi T; Butler, James J; Thome, Kurt; Cooksey, Catherine; Ding, Leibo
2016-01-01
Satellite instruments operating in the reflected solar wavelength region require accurate and precise determination of the optical properties of their diffusers used in pre-flight and post-flight calibrations. The majority of recent and current space instruments use reflective diffusers. As a result, numerous Bidirectional Reflectance Distribution Function (BRDF) calibration comparisons have been conducted between the National Institute of Standards and Technology (NIST) and other industry and university-based metrology laboratories. However, based on literature searches and communications with NIST and other laboratories, no Bidirectional Transmittance Distribution Function (BTDF) measurement comparisons have been conducted between National Measurement Laboratories (NMLs) and other metrology laboratories. On the other hand, there is a growing interest in the use of transmissive diffusers in the calibration of satellite, airborne, and ground-based remote sensing instruments. Current remote sensing instruments employing transmissive diffusers include the Ozone Mapping and Profiler Suite (OMPS) Limb instrument on the Suomi-National Polar-orbiting Partnership (S-NPP) platform, the Geostationary Ocean Color Imager (GOCI) on the Korea Aerospace Research Institute's (KARI) Communication, Ocean, and Meteorological Satellite (COMS), the Ozone Monitoring Instrument (OMI) on NASA's Earth Observing System (EOS) Aura platform, the Tropospheric Emissions: Monitoring of Pollution (TEMPO) instrument, and the Geostationary Environmental Monitoring Spectrometer (GEMS). This ensemble of instruments requires validated BTDF measurements of their on-board transmissive diffusers from the ultraviolet through the near infrared. This paper presents the preliminary results of a BTDF comparison between the NASA Diffuser Calibration Laboratory (DCL) and NIST on quartz and thin Spectralon samples.
9+ Years of CALIPSO PSC Observations: An Evolving Climatology
NASA Technical Reports Server (NTRS)
Pitts, Michael C.; Poole, Lamont R.
2015-01-01
Polar stratospheric clouds (PSCs) play a crucial role in the springtime chemical depletion of ozone at high latitudes. PSC particles (primarily supercooled ternary solution, or STS, droplets) provide sites for heterogeneous chemical reactions that transform stable chlorine and bromine reservoir species into highly reactive ozone-destructive forms. Furthermore, large nitric acid trihydrate (NAT) PSC particles can irreversibly redistribute odd nitrogen through gravitational sedimentation (a process commonly known as denitrification), which prolongs the ozone depletion process by slowing the reformation of the stable chlorine reservoirs. Spaceborne observations from the CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization) lidar on the CALIPSO (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations) satellite are providing a rich new dataset for studying PSCs. CALIPSO is an excellent platform for studying polar processes, with CALIOP acquiring, on average, over 300,000 backscatter profiles daily at latitudes between 55° and 82° in both hemispheres. PSCs are detected in the CALIOP backscatter profiles using a successive horizontal averaging scheme that enables detection of strongly scattering PSCs (e.g., ice) at the finest possible spatial resolution (5 km), while enhancing the detection of very tenuous PSCs (e.g., low number density NAT) at larger spatial scales (up to 135 km). CALIOP PSCs are separated into composition classes (STS; liquid/NAT mixtures; and ice) based on the ensemble 532-nm scattering ratio (the ratio of total-to-molecular backscatter) and the 532-nm particulate depolarization ratio (which is sensitive to the presence of non-spherical particles, i.e., NAT and ice).
In this paper, we will provide an overview of the CALIOP PSC detection and composition classification algorithm and then examine the vertical and spatial distribution of PSCs in the Arctic and Antarctic on vortex-wide scales for entire PSC seasons over the more than nine-year data record from 2006-2015.
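The composition classification amounts to partitioning the (scattering ratio, depolarization ratio) space. A toy version of such a classifier is sketched below; the thresholds are purely illustrative placeholders, not the operational CALIOP values, which also vary with algorithm version.

```python
def classify_psc(scatt_ratio, depol_ratio, sr_ice=5.0, depol_min=0.05):
    """Toy PSC composition classifier in the (532-nm scattering ratio,
    particulate depolarization ratio) space. Thresholds are illustrative
    only and do not reproduce the operational CALIOP algorithm."""
    if depol_ratio < depol_min:
        return "STS"                 # spherical droplets: low depolarization
    if scatt_ratio >= sr_ice:
        return "ice"                 # strongly scattering and depolarizing
    return "liquid/NAT mixture"      # depolarizing but weakly scattering

print(classify_psc(1.3, 0.01))   # STS
print(classify_psc(8.0, 0.30))   # ice
print(classify_psc(1.8, 0.15))   # liquid/NAT mixture
```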
A universal quantum information processor for scalable quantum communication and networks
Yang, Xihua; Xue, Bolin; Zhang, Junxiang; Zhu, Shiyao
2014-01-01
Entanglement provides an essential resource for quantum computation, quantum communication, and quantum networks. How to conveniently and efficiently realize the generation, distribution, storage, retrieval, and control of multipartite entanglement is the basic requirement for realistic quantum information processing. Here, we present a theoretical proposal to efficiently and conveniently achieve a universal quantum information processor (QIP) via atomic coherence in an atomic ensemble. The atomic coherence, produced through electromagnetically induced transparency (EIT) in the Λ-type configuration, acts as the QIP and has the full functions of quantum beam splitter, quantum frequency converter, quantum entangler, and quantum repeater. By employing EIT-based nondegenerate four-wave mixing processes, the generation, exchange, distribution, and manipulation of light-light, atom-light, and atom-atom multipartite entanglement can be efficiently and flexibly achieved in a deterministic way with only coherent light fields. This method greatly facilitates the operations in quantum information processing, and holds promising applications in realistic scalable quantum communication and quantum networks.
ShinyGPAS: interactive genomic prediction accuracy simulator based on deterministic formulas.
Morota, Gota
2017-12-20
Deterministic formulas for the accuracy of genomic predictions highlight the relationships among prediction accuracy and potential factors influencing prediction accuracy prior to performing computationally intensive cross-validation. Visualizing such deterministic formulas in an interactive manner may lead to a better understanding of how genetic factors control prediction accuracy. The software to simulate deterministic formulas for genomic prediction accuracy was implemented in R and encapsulated as a web-based Shiny application. Shiny genomic prediction accuracy simulator (ShinyGPAS) simulates various deterministic formulas and delivers dynamic scatter plots of prediction accuracy versus genetic factors impacting prediction accuracy, while requiring only mouse navigation in a web browser. ShinyGPAS is available at: https://chikudaisei.shinyapps.io/shinygpas/ . ShinyGPAS is a shiny-based interactive genomic prediction accuracy simulator using deterministic formulas. It can be used for interactively exploring potential factors that influence prediction accuracy in genome-enabled prediction, simulating achievable prediction accuracy prior to genotyping individuals, or supporting in-class teaching. ShinyGPAS is open source software and it is hosted online as a freely available web-based resource with an intuitive graphical user interface.
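As an example of the deterministic formulas ShinyGPAS visualizes, the widely used Daetwyler et al. expectation links prediction accuracy to the training-set size N, the trait heritability h², and the number of independent chromosome segments Me:

```python
import math

def daetwyler_accuracy(n, h2, me):
    """Expected genomic prediction accuracy (Daetwyler et al. formula):
    r = sqrt(N * h^2 / (N * h^2 + Me))."""
    return math.sqrt(n * h2 / (n * h2 + me))

# Accuracy rises with training-set size (h^2 = 0.5, Me = 1000)
for n in (500, 2000, 10000):
    print(n, round(daetwyler_accuracy(n, 0.5, 1000), 3))
```

Plotting such a curve interactively over sliders for N, h², and Me is exactly the kind of exploration the Shiny application supports.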
Uncertainty and dispersion in air parcel trajectories near the tropical tropopause
NASA Astrophysics Data System (ADS)
Bergman, John; Jensen, Eric; Pfister, Leonhard; Bui, Thoapaul
2016-04-01
The Tropical Tropopause Layer (TTL) is important as the gateway to the stratosphere for chemical constituents produced at the Earth's surface. As such, understanding the processes that transport air through the upper tropical troposphere is important for a number of current scientific issues, such as the impact of stratospheric water vapor on the global radiative budget and the depletion of ozone by both anthropogenically- and naturally-produced halocarbons. Compared to the lower troposphere, transport in the TTL is relatively unaffected by turbulent motion. Consequently, Lagrangian particle models are thought to provide reasonable estimates of parcel pathways through the TTL. However, there are complications that make trajectory analyses difficult to interpret, uncertainty in the wind data used to drive these calculations and trajectory dispersion being among the most important. These issues are examined using ensembles of backward air parcel trajectories that are initially tightly grouped near the tropical tropopause, via three approaches: a Monte Carlo ensemble, in which different members use identical resolved wind fluctuations but different realizations of stochastic, multi-fractal simulations of unresolved winds; perturbed initial location ensembles, in which members use identical resolved wind fields but initial locations are displaced 2° in latitude and longitude; and a multi-model ensemble that uses identical initial conditions but different resolved wind fields and/or trajectory formulations. Comparisons among the approaches distinguish, to some degree, physical dispersion from that due to data uncertainty, and the impact of unresolved wind fluctuations from that of resolved variability.
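The Monte Carlo approach can be illustrated with a one-dimensional toy model: every member sees the same resolved wind, but each carries its own stochastic perturbation representing unresolved motion (a crude Gaussian stand-in for the multi-fractal simulations used in the study). Ensemble spread then grows like a random walk.

```python
import numpy as np

rng = np.random.default_rng(42)

def trajectories(n_members, n_steps, dt=1.0, sigma=0.5):
    """Monte Carlo parcel ensemble: identical resolved wind plus
    independent stochastic perturbations per member (toy 1-D model)."""
    x = np.zeros(n_members)                    # tight initial grouping
    for _ in range(n_steps):
        u_resolved = 1.0                       # uniform resolved wind
        u_unresolved = sigma * rng.standard_normal(n_members)
        x += (u_resolved + u_unresolved) * dt
    return x

x = trajectories(n_members=200, n_steps=100)
spread = x.std()
# Random-walk dispersion grows like sigma * sqrt(n_steps * dt) ~ 5 here
print(round(spread, 1))
```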
The state of the art of flood forecasting - Hydrological Ensemble Prediction Systems
NASA Astrophysics Data System (ADS)
Thielen-Del Pozo, J.; Pappenberger, F.; Salamon, P.; Bogner, K.; Burek, P.; de Roo, A.
2010-09-01
Flood forecasting systems form a key part of 'preparedness' strategies for disastrous floods and provide hydrological services, civil protection authorities, and the public with information on upcoming events. Provided the warning leadtime is sufficiently long, adequate preparatory actions can be taken to efficiently reduce the impacts of the flooding. Because of the specific characteristics of each catchment, varying data availability, and end-user demands, the design of the best flood forecasting system may differ from catchment to catchment. However, despite the differences in concept and data needs, there is one underlying issue that spans across all systems. There has been a growing awareness and acceptance that uncertainty is a fundamental issue of flood forecasting and needs to be dealt with at the different spatial and temporal scales as well as the different stages of the flood generating processes. Today, operational flood forecasting centres increasingly change from single deterministic forecasts to probabilistic forecasts with various representations of the different contributions of uncertainty. The move towards these so-called Hydrological Ensemble Prediction Systems (HEPS) in flood forecasting represents the state of the art in forecasting science, following on the success of the use of ensembles for weather forecasting (Buizza et al., 2005) and paralleling the move towards ensemble forecasting in other related disciplines such as climate change predictions. The use of HEPS has been internationally fostered by initiatives such as "The Hydrologic Ensemble Prediction Experiment" (HEPEX), created with the aim to investigate how best to produce, communicate, and use hydrologic ensemble forecasts in short-, medium-, and long-term prediction of hydrological processes.
The advantages of quantifying the different contributions of uncertainty, as well as the overall uncertainty, in obtaining reliable and useful flood forecasts, also for extreme events, have become evident. However, despite the demonstrated advantages, worldwide the incorporation of HEPS in operational flood forecasting is still limited. The applicability of HEPS for smaller river basins was tested in MAP D-Phase, an acronym for "Demonstration of Probabilistic Hydrological and Atmospheric Simulation of flood Events in the Alpine region", which was launched in 2005 as a Forecast Demonstration Project of the World Weather Research Programme of the WMO, and entered a pre-operational and still active testing phase in 2007. In Europe, a comparatively high number of EPS-driven systems for medium-to-large rivers exist. National flood forecasting centres of Sweden, Finland, and the Netherlands have already implemented HEPS in their operational forecasting chain, while in other countries, including France, Germany, the Czech Republic, and Hungary, hybrids or experimental chains have been installed. As an example of HEPS, the European Flood Alert System (EFAS) is presented. EFAS provides medium-range probabilistic flood forecasting information for large trans-national river basins. It incorporates multiple sets of weather forecasts, including different types of EPS and deterministic forecasts from different providers. EFAS products are evaluated and visualised as exceedance of critical levels only, both in the form of maps and time series. Different sources of uncertainty, and their impact on flood forecasting performance, have been tested offline for every grid cell but not yet incorporated operationally into the forecasting chain for computational reasons. However, at stations where real-time discharges are available, a hydrological uncertainty processor is being applied to estimate the total predictive uncertainty from the hydrological and input uncertainties.
Research on long-term EFAS results has shown the need for complementing statistical analysis with case studies for which examples will be shown.
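The exceedance-of-critical-levels product described for EFAS reduces, per grid cell or station, to the fraction of ensemble members whose forecast crosses a threshold. A minimal sketch with hypothetical discharge values:

```python
import numpy as np

def exceedance_probability(ensemble_discharge, critical_level):
    """Fraction of ensemble members whose forecast discharge exceeds
    a critical level (e.g. an alert threshold) at any lead time."""
    ensemble_discharge = np.asarray(ensemble_discharge, dtype=float)
    exceeds = (ensemble_discharge > critical_level).any(axis=1)
    return float(exceeds.mean())

# 5 members x 4 lead times of hypothetical discharge (m^3/s)
ens = [[300, 420, 510, 480],
       [290, 380, 450, 430],
       [310, 460, 560, 520],
       [280, 350, 400, 390],
       [305, 440, 530, 500]]
print(exceedance_probability(ens, critical_level=500.0))  # 3 of 5 -> 0.6
```

Mapping this probability against several alert thresholds per cell gives exactly the kind of map and time-series product the abstract describes.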
NASA Astrophysics Data System (ADS)
Yang, J.; Astitha, M.; Delle Monache, L.; Alessandrini, S.
2016-12-01
Accuracy of weather forecasts in the Northeast U.S. has become very important in recent years, given the serious and devastating effects of extreme weather events. Despite the use of evolved forecasting tools and techniques strengthened by increased super-computing resources, weather forecasting systems still have their limitations in predicting extreme events. In this study, we examine the combination of analog ensemble and Bayesian regression techniques to improve the prediction of storms that have impacted the NE U.S., mostly defined by the occurrence of high wind speeds (i.e. blizzards, winter storms, hurricanes, and thunderstorms). The wind speed, wind direction, and temperature predicted by two state-of-the-science atmospheric models (WRF and RAMS/ICLAMS) are combined using the mentioned techniques, exploring the various ways those variables influence the minimization of the prediction error (systematic and random). This study is focused on retrospective simulations of 146 storms that affected the NE U.S. in the period 2005-2016. In order to evaluate the techniques, a leave-one-out cross-validation procedure was implemented, using the remaining 145 storms as the training dataset. The analog ensemble method selects a set of past observations that correspond to the best analogs of the numerical weather prediction and provides a set of ensemble members from the selected observation dataset. The set of ensemble members can then be used in a deterministic or probabilistic way. In the Bayesian regression framework, optimal variances are estimated for the training partition by minimizing the root mean square error and are applied to the out-of-sample storm. The preliminary results indicate a significant improvement in the statistical metrics of 10-m wind speed for the 146 storms using both techniques (20-30% bias and error reduction in all observation-model pairs).
In this presentation, we discuss the various combinations of atmospheric predictors and techniques and illustrate how the long record of predicted storms is valuable in the improvement of wind speed prediction.
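The analog selection at the core of the method can be sketched as follows. The predictor set, the spread-normalized Euclidean distance, and the toy data are illustrative assumptions, not the authors' exact configuration:

```python
import numpy as np

def analog_ensemble(forecast, past_forecasts, past_obs, n_analogs=10):
    """Return the observations that verified the n_analogs past forecasts
    closest to the current forecast (Euclidean distance over predictors,
    each normalized by its historical spread)."""
    past_forecasts = np.asarray(past_forecasts, dtype=float)  # (n_past, n_pred)
    forecast = np.asarray(forecast, dtype=float)              # (n_pred,)
    sigma = past_forecasts.std(axis=0)
    sigma[sigma == 0] = 1.0                                   # guard constant predictors
    dist = np.sqrt((((past_forecasts - forecast) / sigma) ** 2).sum(axis=1))
    idx = np.argsort(dist)[:n_analogs]
    return np.asarray(past_obs, dtype=float)[idx]

# Toy archive: predictors are (wind speed, temperature); the observed wind
# closely tracks the forecast wind plus noise.
rng = np.random.default_rng(0)
past_fc = rng.normal([10.0, 280.0], [3.0, 5.0], size=(200, 2))
past_ob = past_fc[:, 0] + rng.normal(0.0, 0.5, 200)
members = analog_ensemble([12.0, 282.0], past_fc, past_ob, n_analogs=15)
deterministic = members.mean()   # deterministic use; the member spread is probabilistic
```

The member mean gives the deterministic forecast, while the members themselves provide the probabilistic product mentioned in the abstract.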
NASA Astrophysics Data System (ADS)
Mao, H.; McGlynn, D. F.; Wu, Z.; Sive, B. C.
2017-12-01
A time scale decomposition technique, the Ensemble Empirical Mode Decomposition (EEMD), has been employed to decompose the time scales in long-term ozone measurement data at 24 US National Park Service sites. Time scales of interest include the annual cycle, variability driven by large-scale climate oscillations, and the long-term trend. The implementation of policy regulations was found to have had a greater effect on sites nearest to urban regions. Ozone daily mean values increased until around the late 1990s, followed by decreasing trends during the ensuing decades for sites in the East, southern California, and northwestern Washington. Sites in the Midwest did not experience a reversal of trends from positive to negative until the mid- to late 2000s. The magnitude of the annual amplitude decreased for nine sites and increased for three sites. Stronger decreases in the annual amplitude occurred in the East, with more sites in the East experiencing decreases in annual amplitude than in the West. The date of annual ozone peaks and minima has changed for 12 sites in total, but those with a shift in peak date did not necessarily have a shift in the trough date. There appeared to be a link between peak dates occurring earlier and a decrease in the annual amplitude. This is likely related to a decrease in ozone titration due to NOx emission reductions. Furthermore, it was found that the 1998-1999 shift in the Pacific Decadal Oscillation (PDO) regime from positive to negative, by increasing the occurrence of La Niña-like conditions, had the effect of directing more polluted air masses from East Asia to higher latitudes over North America. This change in PDO regime was likely a main factor causing the increase in ozone concentrations on all time scales at the Alaskan site DENA-HQ.
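The noise-assisted averaging that distinguishes EEMD from plain EMD can be illustrated with a minimal sketch. Note that the moving average below is only a stand-in for the spline-envelope sifting of a real EMD implementation (e.g. the PyEMD package); the toy series and parameters are assumptions for illustration:

```python
import numpy as np

def smooth(x, window):
    """Centred moving average, standing in for one EMD residual extraction;
    real EEMD obtains intrinsic mode functions by spline-envelope sifting."""
    pad = window // 2
    xp = np.pad(x, pad, mode="edge")
    return np.convolve(xp, np.ones(window) / window, mode="valid")[: x.size]

def noise_assisted_trend(x, n_real=50, noise_std=0.5, window=365, seed=0):
    """The EEMD principle: decompose many white-noise-perturbed copies of the
    signal and average the results, so the added noise cancels in the mean
    while mode mixing is reduced."""
    rng = np.random.default_rng(seed)
    runs = [smooth(x + rng.normal(0.0, noise_std, x.size), window)
            for _ in range(n_real)]
    return np.mean(runs, axis=0)

# Toy ozone-like daily series: slow positive trend + annual cycle + noise
t = np.arange(3650)
series = 0.002 * t + 5.0 * np.sin(2 * np.pi * t / 365) \
         + np.random.default_rng(1).normal(0.0, 1.0, t.size)
trend = noise_assisted_trend(series)      # slow component, annual cycle averaged out
annual = series - trend                   # annual cycle plus noise remains
```

With a 365-day window the annual cycle averages out of the extracted trend, which is how the long-term component is separated from the seasonal one in the study.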
Evaluation of the North American Multi-Model Ensemble System for Monthly and Seasonal Prediction
NASA Astrophysics Data System (ADS)
Zhang, Q.
2014-12-01
Since August 2011, the real-time seasonal forecasts of the U.S. National Multi-Model Ensemble (NMME) have been made on the 8th of each month by the NCEP Climate Prediction Center (CPC). The participating models were NCEP/CFSv1&2, GFDL/CM2.2, NCAR/U.Miami/COLA/CCSM3, NASA/GEOS5, and IRI/ECHAM-a & ECHAM-f in the first year of the real-time NMME forecast. In the second year, two Canadian coupled models, CMC/CanCM3 and CM4, joined, while CFSv1 and the IRI models dropped out. The NMME team at CPC collects monthly means of three variables (precipitation, temperature at 2 m, and sea surface temperature) from each modeling center on a 1x1 degree global grid, removes systematic errors, forms the grand ensemble mean with equal weight for each model mean, and forms the probability forecast with equal weight for each member of each model. This provides the NMME forecast on a fixed schedule for the CPC operational seasonal and monthly outlooks. The basic verification metrics of NMME seasonal and monthly prediction are calculated as an evaluation of skill, covering both deterministic and probabilistic forecasts, for the 3-year real-time period (August 2011-July 2014) and the 30-year retrospective forecasts (1982-2011) of the individual models as well as the NMME ensemble. The motivation of this study is to provide skill benchmarks for future improvements of the NMME seasonal and monthly prediction system. We also want to establish whether the real-time and hindcast periods (used for bias correction in real time) are consistent. Phase I of the experimental project already supplies routine guidance to users of the NMME forecasts.
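The equal-weight combination described above can be sketched as follows. The single mean-bias correction against a common climatology and the toy data are simplifying assumptions, not the operational CPC code:

```python
import numpy as np

def nmme_combine(hindcasts, forecasts, obs_clim):
    """Equal-weight multi-model combination in the spirit of the NMME:
    remove each model's systematic error (estimated here as one mean bias
    from its hindcasts), average members within each model, average the
    model means with equal weight, and pool all corrected members with
    equal weight for the probability forecast."""
    model_means, pooled = [], []
    for hind, fcst in zip(hindcasts, forecasts):
        bias = np.asarray(hind, float).mean() - obs_clim   # systematic error
        corrected = np.asarray(fcst, float) - bias
        model_means.append(corrected.mean())
        pooled.append(corrected)
    grand_mean = float(np.mean(model_means))               # equal weight per model
    members = np.concatenate(pooled)
    p_above = float((members > obs_clim).mean())           # equal weight per member
    return grand_mean, p_above

# Two toy models: warm-biased (+1 K, 10 members) and cold-biased (-0.5 K, 6 members)
rng = np.random.default_rng(2)
obs_clim = 20.0
hindcasts = [rng.normal(21.0, 1.0, (10, 30)), rng.normal(19.5, 1.0, (6, 30))]
forecasts = [rng.normal(22.0, 1.0, 10), rng.normal(20.5, 1.0, 6)]
grand_mean, p_above = nmme_combine(hindcasts, forecasts, obs_clim)
```

After bias removal, both toy models point to roughly one degree above climatology, so the grand mean sits near 21 and the pooled members give a high above-normal probability.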
Ensemble Kalman filter inference of spatially-varying Manning's n coefficients in the coastal ocean
NASA Astrophysics Data System (ADS)
Siripatana, Adil; Mayo, Talea; Knio, Omar; Dawson, Clint; Maître, Olivier Le; Hoteit, Ibrahim
2018-07-01
Ensemble Kalman filtering (EnKF) is an established framework for large scale state estimation problems. EnKFs can also be used for state-parameter estimation, using the so-called "Joint-EnKF" approach. The idea is simply to augment the state vector with the parameters to be estimated and assign invariant dynamics for the time evolution of the parameters. In this contribution, we investigate the efficiency of the Joint-EnKF for estimating spatially-varying Manning's n coefficients used to define the bottom roughness in the Shallow Water Equations (SWEs) of a coastal ocean model. Observation System Simulation Experiments (OSSEs) are conducted using the ADvanced CIRCulation (ADCIRC) model, which solves a modified form of the Shallow Water Equations. A deterministic EnKF, the Singular Evolutive Interpolated Kalman (SEIK) filter, is used to estimate a vector of Manning's n coefficients defined at the model nodal points by assimilating synthetic water elevation data. It is found that with a reasonable ensemble size (O(10)), the filter's estimate converges to the reference Manning's field. To enhance performance, we have further reduced the dimension of the parameter search space through a Karhunen-Loève (KL) expansion. We have also iterated on the filter update step to better account for the nonlinearity of the parameter estimation problem. We study the sensitivity of the system to the ensemble size, localization scale, dimension of retained KL modes, and number of iterations. The performance of the proposed framework in terms of estimation accuracy suggests that a well-tuned Joint-EnKF provides a promising robust approach to infer spatially varying seabed roughness parameters in the context of coastal ocean modeling.
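The Joint-EnKF idea, augmenting the state with the parameters so that observations of the state correct the parameters through ensemble cross-covariances, can be sketched with a plain perturbed-observation EnKF. The paper uses the deterministic SEIK filter; the scalar twin experiment and the linear "model" below are assumptions for illustration:

```python
import numpy as np

def joint_enkf_update(state_ens, param_ens, H, y_obs, obs_std, rng):
    """One Joint-EnKF analysis step: stack parameters under the state
    (invariant dynamics), then apply a stochastic EnKF update so the
    state observations also nudge the parameters."""
    Z = np.vstack([state_ens, param_ens])            # augmented ensemble
    n_ens = Z.shape[1]
    A = Z - Z.mean(axis=1, keepdims=True)
    HZ = H @ state_ens                               # obs operator acts on state only
    HA = HZ - HZ.mean(axis=1, keepdims=True)
    P_hh = HA @ HA.T / (n_ens - 1) + obs_std**2 * np.eye(len(y_obs))
    P_zh = A @ HA.T / (n_ens - 1)                    # cross-covariance (state+param, obs)
    K = P_zh @ np.linalg.inv(P_hh)                   # Kalman gain
    Y = y_obs[:, None] + rng.normal(0.0, obs_std, (len(y_obs), n_ens))
    Z_a = Z + K @ (Y - HZ)                           # perturbed-observation update
    n_state = state_ens.shape[0]
    return Z_a[:n_state], Z_a[n_state:]

# Twin experiment: a "water elevation" that responds linearly to a
# Manning-like roughness parameter, observed with noise.
rng = np.random.default_rng(3)
n_ens, true_param = 40, 0.03
param_ens = rng.normal(0.05, 0.02, (1, n_ens))       # biased prior roughness
state_ens = 100.0 * param_ens + rng.normal(0.0, 0.1, (1, n_ens))
y_obs = np.array([100.0 * true_param])               # synthetic elevation datum
for _ in range(5):                                   # iterate assimilation cycles
    state_ens, param_ens = joint_enkf_update(state_ens, param_ens,
                                             np.eye(1), y_obs, 0.1, rng)
    # re-forecast: parameters persist, state responds to them
    state_ens = 100.0 * param_ens + rng.normal(0.0, 0.05, (1, n_ens))
estimate = float(param_ens.mean())
```

Even though only the state is observed, the cross-covariance pulls the parameter ensemble from its biased prior toward the reference value, which is the mechanism exploited in the OSSEs.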
NASA Astrophysics Data System (ADS)
Ament, F.; Weusthoff, T.; Arpagaus, M.; Rotach, M.
2009-04-01
The main aim of the WWRP Forecast Demonstration Project MAP D-PHASE is to demonstrate the performance of today's models in forecasting heavy precipitation and flood events in the Alpine region. To this end, an end-to-end, real-time forecasting system was installed and operated during the D-PHASE Operations Period from June to November 2007. This system includes 30 numerical weather prediction models (deterministic as well as ensemble systems) operated by weather services and research institutes, which issue alerts if predicted precipitation accumulations exceed critical thresholds. In addition to the real-time alerts, all relevant model fields of these simulations are stored in a central data archive. This comprehensive data set allows a detailed assessment of today's quantitative precipitation forecast (QPF) performance in the Alpine region. We will present results of QPF verifications against Swiss radar and rain gauge data both from a qualitative point of view, in terms of alerts, as well as from a quantitative perspective, in terms of precipitation rate. Various influencing factors like lead time, accumulation time, selection of warning thresholds, or bias corrections will be discussed. In addition to traditional verifications of area-average precipitation amounts, the performance of the models in predicting the correct precipitation statistics without requiring a point-to-point match will be described using modern fuzzy verification techniques. Both analyses reveal significant advantages of deep-convection-resolving models compared to coarser models with parameterized convection. An intercomparison of the model forecasts themselves reveals a remarkably high variability between different models, and makes it worthwhile to evaluate the potential of a multi-model ensemble. Various multi-model ensemble strategies will be tested by combining D-PHASE models into virtual ensemble systems.
Transient ensemble dynamics in time-independent galactic potentials
NASA Astrophysics Data System (ADS)
Mahon, M. Elaine; Abernathy, Robert A.; Bradley, Brendan O.; Kandrup, Henry E.
1995-07-01
This paper summarizes a numerical investigation of the short-time, possibly transient, behaviour of ensembles of stochastic orbits evolving in fixed non-integrable potentials, with the aim of deriving insights into the structure and evolution of galaxies. The simulations involved three different two-dimensional potentials, quite different in appearance. However, despite these differences, ensembles in all three potentials exhibit similar behaviour. This suggests that the conclusions inferred from the simulations are robust, relying only on basic topological properties, e.g., the existence of KAM tori and cantori. Generic ensembles of initial conditions, corresponding to stochastic orbits, exhibit a rapid coarse-grained approach towards a near-invariant distribution on a time-scale <
The Rise of Complexity in Flood Forecasting: Opportunities, Challenges and Tradeoffs
NASA Astrophysics Data System (ADS)
Wood, A. W.; Clark, M. P.; Nijssen, B.
2017-12-01
Operational flood forecasting is currently undergoing a major transformation. Most national flood forecasting services have relied for decades on lumped, highly calibrated conceptual hydrological models running on local office computing resources, providing deterministic streamflow predictions at gauged river locations that are important to stakeholders and emergency managers. A variety of recent technological advances now make it possible to run complex, high-to-hyper-resolution models for operational hydrologic prediction over large domains, and the US National Weather Service is now attempting to use hyper-resolution models to create new forecast services and products. Yet other "increased-complexity" forecasting strategies also exist that pursue different tradeoffs between model complexity (i.e., spatial resolution, physics) and streamflow forecast system objectives. There is currently a pressing need for a greater understanding in the hydrology community of the opportunities, challenges and tradeoffs associated with these different forecasting approaches, and for a greater participation by the hydrology community in evaluating, guiding and implementing these approaches. Intermediate-resolution forecast systems, for instance, use distributed land surface model (LSM) physics but retain the agility to deploy ensemble methods (including hydrologic data assimilation and hindcast-based post-processing). Fully coupled numerical weather prediction (NWP) systems, another example, use still coarser LSMs to produce ensemble streamflow predictions either at the model scale or after sub-grid scale runoff routing. Based on the direct experience of the authors and colleagues in research and operational forecasting, this presentation describes examples of different streamflow forecast paradigms, from the traditional to the recent hyper-resolution, to illustrate the range of choices facing forecast system developers.
We also discuss the degree to which the strengths and weaknesses of each strategy map onto the requirements for different types of forecasting services (e.g., flash flooding, river flooding, seasonal water supply prediction).
NASA Astrophysics Data System (ADS)
He, M.; Hogue, T. S.; Franz, K.; Margulis, S. A.; Vrugt, J. A.
2009-12-01
The National Weather Service (NWS), the agency responsible for short- and long-term streamflow predictions across the nation, primarily applies the SNOW17 model for operational forecasting of snow accumulation and melt. The SNOW17-forecasted snowmelt serves as an input to a rainfall-runoff model for streamflow forecasts in snow-dominated areas. The accuracy of streamflow predictions in these areas largely relies on the accuracy of snowmelt. However, no direct snowmelt measurements are available to validate the SNOW17 predictions. Instead, indirect measurements such as snow water equivalent (SWE) measurements or discharge are typically used to calibrate SNOW17 parameters. In addition, the forecast practice is inherently deterministic, lacking tools to systematically address forecasting uncertainties (e.g., uncertainties in parameters, forcing, SWE and discharge observations). The current research presents an Integrated Uncertainty analysis and Ensemble-based data Assimilation (IUEA) framework to improve predictions of snowmelt and discharge while simultaneously providing meaningful estimates of the associated uncertainty. The IUEA approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. The robustness and usefulness of the IUEA-SNOW17 framework are evaluated for snow-dominated watersheds in the northern Sierra Mountains, using the coupled IUEA-SNOW17 and an operational soil moisture accounting model (SAC-SMA). Preliminary results are promising and indicate successful performance of the coupled IUEA-SNOW17 framework. Implementation of the SNOW17 with the IUEA is straightforward and requires no major modification to the SNOW17 model structure. The IUEA-SNOW17 framework is intended to be modular and transferable and should assist the NWS in advancing the current forecasting system and reinforcing current operational forecasting skill.
Van Delden, Jay S
2003-07-15
A novel, interferometric, polarization-interrogating filter assembly and method for the simultaneous measurement of all four Stokes parameters across a partially polarized irradiance image in a no-moving-parts, instantaneous, highly sensitive manner is described. In the reported embodiment of the filter, two spatially varying linear retarders and a linear polarizer comprise an ortho-Babinet, polarization-interrogating (OBPI) filter. The OBPI filter uniquely encodes the incident ensemble of electromagnetic wave fronts comprising a partially polarized irradiance image in a controlled, deterministic, spatially varying manner to map the complete state of polarization across the image to local variations in a superposed interference pattern. Experimental interferograms are reported along with a numerical simulation of the method.
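The four Stokes parameters the filter measures can be written as ensemble averages over the incident wave fronts. A minimal numerical sketch follows; note that sign conventions (especially for S3) vary between texts, so the convention below is one common choice, not necessarily the one in the paper:

```python
import numpy as np

def stokes_from_fields(Ex, Ey):
    """Stokes parameters of a (possibly partially polarized) wave ensemble
    from complex field components, via ensemble averages <.>:
        S0 = <|Ex|^2 + |Ey|^2>   S1 = <|Ex|^2 - |Ey|^2>
        S2 =  2 Re<Ex Ey*>       S3 = -2 Im<Ex Ey*>
    For partially polarized light, sqrt(S1^2 + S2^2 + S3^2) / S0 < 1."""
    Ex, Ey = np.asarray(Ex), np.asarray(Ey)
    cross = np.mean(Ex * np.conj(Ey))
    S0 = float(np.mean(np.abs(Ex) ** 2 + np.abs(Ey) ** 2))
    S1 = float(np.mean(np.abs(Ex) ** 2 - np.abs(Ey) ** 2))
    return S0, S1, float(2 * cross.real), float(-2 * cross.imag)

# Fully x-polarized light -> (1, 1, 0, 0); circular -> (1, 0, 0, ±1)
S_lin = stokes_from_fields(np.array([1.0 + 0j]), np.array([0.0 + 0j]))
S_circ = stokes_from_fields(np.array([1.0 + 0j]) / np.sqrt(2),
                            np.array([0.0 + 1j]) / np.sqrt(2))
```

A partially polarized image pixel corresponds to averaging over many such field realizations, which is exactly what the interference pattern of the OBPI filter encodes locally.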
Experimental demonstration on the deterministic quantum key distribution based on entangled photons.
Chen, Hua; Zhou, Zhi-Yuan; Zangana, Alaa Jabbar Jumaah; Yin, Zhen-Qiang; Wu, Juan; Han, Yun-Guang; Wang, Shuang; Li, Hong-Wei; He, De-Yong; Tawfeeq, Shelan Khasro; Shi, Bao-Sen; Guo, Guang-Can; Chen, Wei; Han, Zheng-Fu
2016-02-10
As an important resource, entangled light sources have been used in developing quantum information technologies, such as quantum key distribution (QKD). There are few experiments implementing entanglement-based deterministic QKD protocols since the security of existing protocols may be compromised in lossy channels. In this work, we report on a loss-tolerant deterministic QKD experiment which follows a modified "Ping-Pong" (PP) protocol. The experimental results demonstrate for the first time that a secure deterministic QKD session can be fulfilled in a channel with an optical loss of 9 dB, based on a telecom-band entangled photon source. This exhibits a conceivable prospect of utilizing entangled light sources in real-life fiber-based quantum communications.
Attribution of declining Western U.S. Snowpack to human effects
Pierce, D.W.; Barnett, T.P.; Hidalgo, H.G.; Das, T.; Bonfils, Celine; Santer, B.D.; Bala, G.; Dettinger, M.D.; Cayan, D.R.; Mirin, A.; Wood, A.W.; Nozawa, T.
2008-01-01
Observations show snowpack has declined across much of the western United States over the period 1950-99. This reduction has important social and economic implications, as water retained in the snowpack from winter storms forms an important part of the hydrological cycle and water supply in the region. A formal model-based detection and attribution (D-A) study of these reductions is performed. The detection variable is the ratio of 1 April snow water equivalent (SWE) to water-year-to-date precipitation (P), chosen to reduce the effect of P variability on the results. Estimates of natural internal climate variability are obtained from 1600 years of two control simulations performed with fully coupled ocean-atmosphere climate models. Estimates of the SWE/P response to anthropogenic greenhouse gases, ozone, and some aerosols are taken from multiple-member ensembles of perturbation experiments run with two models. The D-A shows the observations and anthropogenically forced models have greater SWE/P reductions than can be explained by natural internal climate variability alone. Model-estimated effects of changes in solar and volcanic forcing likewise do not explain the SWE/P reductions. The mean model estimate is that about half of the SWE/P reductions observed in the west from 1950 to 1999 are the result of climate changes forced by anthropogenic greenhouse gases, ozone, and aerosols. © 2008 American Meteorological Society.
Thermostatted kinetic equations as models for complex systems in physics and life sciences.
Bianca, Carlo
2012-12-01
Statistical mechanics is a powerful method for understanding equilibrium thermodynamics. An equivalent theoretical framework for nonequilibrium systems has remained elusive. The thermodynamic forces driving the system away from equilibrium introduce energy that must be dissipated if nonequilibrium steady states are to be obtained. Historically, further terms were introduced, collectively called a thermostat, whose original application was to generate constant-temperature equilibrium ensembles. This review surveys kinetic models coupled with time-reversible deterministic thermostats for the modeling of large systems composed of both inert matter particles and living entities. The introduction of deterministic thermostats makes it possible to model the onset of the nonequilibrium stationary states that are typical of most real-world complex systems. The first part of the paper is focused on a general presentation of the main physical and mathematical definitions and tools: nonequilibrium phenomena, the Gauss least constraint principle, and Gaussian thermostats. The second part provides a review of a variety of thermostatted mathematical models in physics and life sciences, including the Kac, Boltzmann, and Jager-Segel models and the thermostatted (continuous and discrete) kinetic models for active particles. Applications refer to semiconductor devices, nanosciences, biological phenomena, vehicular traffic, social and economic systems, and crowd and swarm dynamics. Copyright © 2012 Elsevier B.V. All rights reserved.
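A Gaussian isokinetic thermostat, the canonical example of the deterministic thermostats discussed above, can be sketched as follows. The toy force field, step size, and step count are illustrative assumptions:

```python
import numpy as np

def gaussian_thermostat_step(x, p, force, dt):
    """One explicit step of a deterministic Gaussian isokinetic thermostat.
    Gauss's principle of least constraint gives the friction
        alpha = (F . p) / (p . p),
    which removes exactly the power injected by the force, so the kinetic
    energy (the kinetic 'temperature') is conserved along the continuous
    dynamics; the explicit discretization drifts only at O(dt^2) per step."""
    F = force(x)
    alpha = np.dot(F, p) / np.dot(p, p)
    p = p + dt * (F - alpha * p)
    x = x + dt * p                      # unit masses, semi-implicit Euler
    return x, p

rng = np.random.default_rng(4)
x, p = rng.normal(size=10), rng.normal(size=10)
E0 = 0.5 * np.dot(p, p)                 # initial kinetic energy
force = lambda q: -q - 0.1 * q ** 3     # toy anharmonic restoring field
for _ in range(1000):
    x, p = gaussian_thermostat_step(x, p, force, dt=1e-3)
E1 = 0.5 * np.dot(p, p)                 # stays (nearly) equal to E0
```

The constraint force is time-reversible (alpha changes sign with p), which is the property that distinguishes these thermostats from stochastic friction and underlies the nonequilibrium stationary states discussed in the review.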
A Total Ozone Dependent Ozone Profile Climatology Based on Ozone-Sondes and Aura MLS Data
NASA Astrophysics Data System (ADS)
Labow, G. J.; McPeters, R. D.; Ziemke, J. R.
2014-12-01
A new total ozone-based ozone profile climatology has been created for use in satellite and/or ground based ozone retrievals. This climatology was formed by combining data from the Microwave Limb Sounder (MLS) with data from balloon sondes and binned by latitude zone and total ozone. Because profile shape varies with total column ozone, this climatology better captures the ozone variations than the previously used seasonal climatologies, especially near the tropopause. This is significantly different from ozone climatologies used in the past, as there is no time component. The MLS instrument on Aura has excellent latitude coverage and measures ozone profiles daily from the upper troposphere to the lower mesosphere at ~3.5 km resolution. Almost a million individual MLS ozone measurements are merged with data from over 55,000 ozonesondes, which are then binned as a function of total ozone. The climatology consists of average ozone profiles as a function of total ozone for six 30 degree latitude bands covering altitudes from 0-75 km (in Z* pressure altitude coordinates). This new climatology better represents the profile shape as a function of total ozone than previous climatologies and shows some remarkable and somewhat unexpected correlations between total ozone and ozone in the lower altitudes, particularly in the lower and middle troposphere. These data can also be used to infer biases and errors in either the MLS retrievals or the ozonesondes.
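The construction behind such a climatology, averaging profiles within total-column bins rather than by month, can be sketched as follows. The bin edges, level count, and toy profile shapes are assumptions for illustration:

```python
import numpy as np

def build_climatology(profiles, total_ozone, bin_edges):
    """Average ozone profiles within total-column-ozone bins: the lookup key
    of the climatology is the column amount (per latitude band in the real
    product), with no time component."""
    profiles = np.asarray(profiles, float)        # (n_obs, n_levels)
    total_ozone = np.asarray(total_ozone, float)  # (n_obs,) in Dobson units
    idx = np.digitize(total_ozone, bin_edges) - 1
    clim = np.full((len(bin_edges) - 1, profiles.shape[1]), np.nan)
    for b in range(len(bin_edges) - 1):
        sel = idx == b
        if sel.any():
            clim[b] = profiles[sel].mean(axis=0)  # mean profile for this bin
    return clim

# Toy soundings: lower-altitude ozone scales with the column amount
rng = np.random.default_rng(5)
column = rng.uniform(220.0, 380.0, 500)
shape = np.linspace(1.5, 0.5, 15)                 # more weight at low levels
profiles = column[:, None] / 15.0 * shape + rng.normal(0.0, 1.0, (500, 15))
clim = build_climatology(profiles, column, bin_edges=np.arange(220, 381, 40))
```

A retrieval would then interpolate between the bin-mean profiles using its current total-ozone estimate, rather than looking up a month.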
Numerical Error Estimation with UQ
NASA Astrophysics Data System (ADS)
Ackmann, Jan; Korn, Peter; Marotzke, Jochem
2014-05-01
Ocean models are still in need of means to quantify model errors, which are inevitably made when running numerical experiments. The total model error can formally be decomposed into two parts, the formulation error and the discretization error. The formulation error arises from the continuous formulation of the model not fully describing the studied physical process. The discretization error arises from having to solve a discretized model instead of the continuously formulated model. Our work on error estimation is concerned with the discretization error. Given a solution of a discretized model, our general problem statement is to find a way to quantify the uncertainties due to discretization in physical quantities of interest (diagnostics), which are frequently used in Geophysical Fluid Dynamics. The approach we use to tackle this problem is called the "Goal Error Ensemble method". The basic idea of the Goal Error Ensemble method is that errors in diagnostics can be translated into a weighted sum of local model errors, which makes it conceptually based on the Dual Weighted Residual method from Computational Fluid Dynamics. In contrast to the Dual Weighted Residual method these local model errors are not considered deterministically but interpreted as local model uncertainty and described stochastically by a random process. The parameters for the random process are tuned with high-resolution near-initial model information. However, the original Goal Error Ensemble method, introduced in [1], was successfully evaluated only in the case of inviscid flows without lateral boundaries in a shallow-water framework and is hence only of limited use in a numerical ocean model. Our work consists in extending the method to bounded, viscous flows in a shallow-water framework. As our numerical model, we use the ICON-Shallow-Water model. In viscous flows our high-resolution information is dependent on the viscosity parameter, making our uncertainty measures viscosity-dependent. 
We will show that we can choose a sensible parameter by using the Reynolds number as a criterion. Another topic we will discuss is the choice of the underlying distribution of the random process. This is especially important in the presence of lateral boundaries. We will present resulting error estimates for different height- and velocity-based diagnostics applied to the Munk gyre experiment. References [1] F. RAUSER: Error Estimation in Geophysical Fluid Dynamics through Learning; PhD Thesis, IMPRS-ESM, Hamburg, 2010 [2] F. RAUSER, J. MAROTZKE, P. KORN: Ensemble-type numerical uncertainty quantification from single model integrations; SIAM/ASA Journal on Uncertainty Quantification, submitted
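The core of the Goal Error Ensemble idea, translating a diagnostic error into a weighted sum of stochastically interpreted local errors, can be sketched with a small Monte Carlo example. The weights stand in for the dual (adjoint) sensitivities of the Dual Weighted Residual method; the fixed local-error standard deviation is an illustrative assumption (in the paper it is tuned from high-resolution near-initial information):

```python
import numpy as np

def goal_error_samples(sensitivities, local_error_std, n_samples=4000, seed=6):
    """Sample the diagnostic error as a weighted sum of local model errors,
    each treated as a zero-mean random variable rather than a deterministic
    residual. Returns n_samples realizations of the diagnostic error."""
    rng = np.random.default_rng(seed)
    w = np.asarray(sensitivities, float)           # sensitivity per grid cell
    local = rng.normal(0.0, local_error_std, (n_samples, w.size))
    return local @ w                               # weighted sums

# 100 grid cells, equal sensitivity 0.01 to the diagnostic, local error std 0.5
samples = goal_error_samples(np.full(100, 0.01), local_error_std=0.5)
sigma_est = samples.std()
# Analytic value for independent cells: sqrt(100 * (0.01 * 0.5)**2) = 0.05
```

The spread of `samples` is the discretization-uncertainty estimate for the diagnostic, obtained here without running an ensemble of model integrations.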
NASA Astrophysics Data System (ADS)
Brochero, Darwin; Hajji, Islem; Pina, Jasson; Plana, Queralt; Sylvain, Jean-Daniel; Vergeynst, Jenna; Anctil, Francois
2015-04-01
Theories about generalization error with ensembles are mainly based on the diversity concept, which promotes resorting to many members of different properties to support mutually agreeable decisions. Kuncheva (2004) proposed the Multi Level Diversity Model (MLDM) to promote diversity in model ensembles, combining different data subsets, input subsets, models, and parameters, and including a combiner level in order to optimize the final ensemble. This work tests the hypothesis about the minimisation of the generalization error with ensembles of Neural Network (NN) structures. We used the MLDM to evaluate two different scenarios: (i) ensembles from a same NN architecture, and (ii) a super-ensemble built by a combination of sub-ensembles of many NN architectures. The time series used correspond to the 12 basins of the MOdel Parameter Estimation eXperiment (MOPEX) project that were used by Duan et al. (2006) and Vos (2013) as a benchmark. Six architectures are evaluated: FeedForward NN (FFNN) trained with the Levenberg-Marquardt algorithm (Hagan et al., 1996), FFNN trained with SCE (Duan et al., 1993), Recurrent NN trained with a complex method (Weins et al., 2008), Dynamic NARX NN (Leontaritis and Billings, 1985), Echo State Network (ESN), and leaky-integrator neuron (L-ESN) (Lukosevicius and Jaeger, 2009). Each architecture separately performs an Input Variable Selection (IVS) according to a forward stepwise selection (Anctil et al., 2009) using mean square error as the objective function. Post-processing by Predictor Stepwise Selection (PSS) of the super-ensemble has been done following the method proposed by Brochero et al. (2011). IVS results showed that the lagged stream flow, lagged precipitation, and Standardized Precipitation Index (SPI) (McKee et al., 1993) were the most relevant variables. They were selected among the first three variables in 66, 45, and 28 of the 72 scenarios, respectively.
A relationship between the aridity index (Arora, 2002) and NN performance showed that wet basins are more easily modelled than dry basins. The Nash-Sutcliffe (NS) Efficiency criterion was used to evaluate the performance of the models. Test results showed that in 9 of the 12 basins, the mean sub-ensemble performance was better than the one presented by Vos (2013). Furthermore, in 55 of 72 cases (6 NN structures x 12 basins) the mean sub-ensemble performance was better than the best individual performance, and in 10 basins the performance of the mean super-ensemble was better than the best individual super-ensemble member. As well, it was identified that members of ESN and L-ESN sub-ensembles have very similar and good performance values. Regarding the mean super-ensemble performance, we obtained an average gain in performance of 17%, and found that PSS preserves sub-ensemble members from different NN structures, indicating the pertinence of diversity in the super-ensemble. Moreover, it was demonstrated that around 100 predictors from the different structures are enough to optimize the super-ensemble. Although sub-ensembles of FFNN-SCE showed unstable performances, FFNN-SCE members were picked up several times in the final predictor selection. References Anctil, F., M. Filion, and J. Tournebize (2009). "A neural network experiment on the simulation of daily nitrate-nitrogen and suspended sediment fluxes from a small agricultural catchment". In: Ecol. Model. 220.6, pp. 879-887. Arora, V. K. (2002). "The use of the aridity index to assess climate change effect on annual runoff". In: J. Hydrol. 265.164, pp. 164-177. Brochero, D., F. Anctil, and C. Gagné (2011). "Simplifying a hydrological ensemble prediction system with a backward greedy selection of members Part 1: Optimization criteria". In: Hydrol. Earth Syst. Sci. 15.11, pp. 3307-3325. Duan, Q., J. Schaake, V. Andréassian, S. Franks, G. Goteti, H. Gupta, Y. Gusev, F. Habets, A. Hall, L. Hay, T. Hogue, M. Huang, G.
Leavesley, X. Liang, O. Nasonova, J. Noilhan, L. Oudin, S. Sorooshian, T. Wagener, and E. Wood (2006). "Model Parameter Estimation Experiment (MOPEX): An overview of science strategy and major results from the second and third workshops". In: J. Hydrol. 320.12, pp. 3-17. Duan, Q., V. Gupta, and S. Sorooshian (1993). "Shuffled complex evolution approach for effective and efficient global minimization". In: J. Optimiz. Theory App. 76.3, pp. 501-521. Hagan, M. T., H. B. Demuth, and M. Beale (1996). Neural network design. 1st ed. PWS Publishing Co., p. 730. Kuncheva, L. I. (2004). Combining Pattern Classifiers: Methods and Algorithms. Wiley-Interscience, p. 350. Leontaritis, I. and S. Billings (1985). "Input-output parametric models for non-linear systems Part I: deterministic non-linear systems". In: International Journal of Control 41.2, pp. 303-328. Lukosevicius, M. and H. Jaeger (2009). "Reservoir computing approaches to recurrent neural network training". In: Computer Science Review 3.3, pp. 127-149. McKee, T., N. Doesken, and J. Kleist (1993). The Relationship of Drought Frequency and Duration to Time Scales. In: Eighth Conference on Applied Climatology. Vos, N. J. de (2013). "Echo state networks as an alternative to traditional artificial neural networks in rainfall-runoff modelling". In: Hydrol. Earth Syst. Sci. 17.1, pp. 253-267. Weins, T., R. Burton, G. Schoenau, and D. Bitner (2008). Recursive Generalized Neural Networks (RGNN) for the Modeling of a Load Sensing Pump. In: ASME Joint Conference on Fluid Power, Transmission and Control.
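The forward stepwise input-variable selection used in the IVS step can be sketched as follows. A linear least-squares model stands in for the neural networks of the study, and the toy data (where only predictors 0 and 2 matter) are an assumption:

```python
import numpy as np

def forward_stepwise_ivs(X, y, max_inputs=3):
    """Greedy forward stepwise input-variable selection: at each round, add
    the remaining candidate predictor that most reduces the mean square
    error of a model fitted on the selected set."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    for _ in range(max_inputs):
        best_j, best_mse = None, np.inf
        for j in remaining:
            A = np.column_stack([X[:, selected + [j]], np.ones(n)])  # + intercept
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            mse = float(np.mean((y - A @ coef) ** 2))
            if mse < best_mse:
                best_j, best_mse = j, mse
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Toy case: the target depends on predictors 0 and 2 only
rng = np.random.default_rng(7)
X = rng.normal(size=(300, 6))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(0.0, 0.1, 300)
order = forward_stepwise_ivs(X, y, max_inputs=2)   # recovers [0, 2]
```

The same greedy loop with the objective reversed (drop the least useful member) gives the backward selection used for the PSS post-processing of the super-ensemble.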
Flash flood warnings for ungauged basins based on high-resolution precipitation forecasts
NASA Astrophysics Data System (ADS)
Demargne, Julie; Javelle, Pierre; Organde, Didier; de Saint Aubin, Céline; Janet, Bruno
2016-04-01
Early detection of flash floods, which are typically triggered by severe rainfall events, is still challenging due to large meteorological and hydrologic uncertainties at the spatial and temporal scales of interest. Also, the rapid rising of waters necessarily limits the lead time of warnings to alert communities and activate effective emergency procedures. To better anticipate such events and mitigate their impacts, the French national service in charge of flood forecasting (SCHAPI) is implementing a national flash flood warning system for small-to-medium (up to 1000 km²) ungauged basins based on a discharge-threshold flood warning method called AIGA (Javelle et al. 2014). The current deterministic AIGA system has been run in real time in the South of France since 2005 and has been tested in the RHYTMME project (rhytmme.irstea.fr/). It ingests the operational radar-gauge QPE grids from Météo-France to run a simplified hourly distributed hydrologic model at a 1-km² resolution every 15 minutes. This produces real-time peak discharge estimates along the river network, which are subsequently compared to regionalized flood frequency estimates to provide warnings according to the AIGA-estimated return period of the ongoing event. The calibration and regionalization of the hydrologic model have recently been enhanced for the implementation of the national flash flood warning system for the entire French territory by 2016. To further extend the effective warning lead time, the flash flood warning system is being enhanced to ingest Météo-France's AROME-NWC high-resolution precipitation nowcasts. The AROME-NWC system combines the most recent available observations with forecasts from the nowcasting version of the AROME convection-permitting model (Auger et al. 2015).
AROME-NWC pre-operational deterministic precipitation forecasts, produced every hour at a 2.5-km resolution for a 6-hr forecast horizon, were provided for 3 significant rain events in September and November 2014 and ingested as time-lagged ensembles. The time-lagged approach is a practical way of accounting for the atmospheric forecast uncertainty when no extensive forecast archive is available for statistical modelling. The evaluation on 185 basins in the South of France showed significant improvements in terms of flash flood event detection and effective warning lead time, compared to warnings from the current AIGA setup (without any future precipitation). Various verification metrics (e.g., Relative Mean Error, Brier Skill Score) show the skill of ensemble precipitation and flow forecasts compared to single-valued persistence benchmarks. Planned enhancements include integrating additional probabilistic NWP products (e.g., AROME precipitation ensembles on a longer forecast horizon), accounting for and reducing hydrologic uncertainties from the model parameters and initial conditions via data assimilation, and developing a comprehensive observational and post-event damage database to determine decision-relevant warning thresholds for flood magnitude and probability. Javelle, P., Demargne, J., Defrance, D., Arnaud, P., 2014. Evaluating flash flood warnings at ungauged locations using post-event surveys: a case study with the AIGA warning system. Hydrological Sciences Journal, doi: 10.1080/02626667.2014.923970 Auger, L., Dupont, O., Hagelin, S., Brousseau, P., Brovelli, P., 2015. AROME-NWC: a new nowcasting tool based on an operational mesoscale forecasting system. Quarterly Journal of the Royal Meteorological Society, 141: 1603-1611, doi: 10.1002/qj.2463
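The discharge-threshold warning logic of AIGA, and its probabilistic extension with a time-lagged ensemble, can be sketched as follows. The quantile values and member discharges are illustrative assumptions:

```python
import numpy as np

def aiga_warning_level(peak_discharge, quantiles):
    """Discharge-threshold warning in the spirit of AIGA: compare the
    model-estimated peak discharge with regionalized flood-frequency
    quantiles and return the largest return period (years) exceeded,
    or None if no threshold is exceeded."""
    level = None
    for return_period in sorted(quantiles):
        if peak_discharge >= quantiles[return_period]:
            level = return_period
    return level

# Regionalized flood-frequency quantiles for a hypothetical river cell (m3/s)
quantiles = {2: 15.0, 10: 28.0, 50: 45.0}
level = aiga_warning_level(30.0, quantiles)        # exceeds the 10-yr flood

# With a time-lagged ensemble, the fraction of members exceeding a
# threshold gives a simple exceedance probability for the warning
members = np.array([22.0, 31.0, 29.0, 35.0])       # peak discharge per member
p_exceed_10yr = float((members >= quantiles[10]).mean())
```

Running the deterministic comparison on each time-lagged member turns the single-valued warning into a probability of exceeding each return-period threshold, which is the probabilistic product evaluated in the study.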
Bei, Naifang; Li, Guohui; Meng, Zhiyong; Weng, Yonghui; Zavala, Miguel; Molina, L T
2014-11-15
The purpose of this study is to investigate the impact of using an ensemble Kalman filter (EnKF) on air quality simulations in the California-Mexico border region on two days (May 30 and June 4, 2010) during Cal-Mex 2010. The uncertainties in ozone (O3) and aerosol simulations in the border area due to the meteorological initial uncertainties were examined through ensemble simulations. The ensemble spread of surface O3 averaged over the coastal region was less than 10 ppb. The spreads in the nitrate and ammonium aerosols were substantial on both days, mostly caused by the large uncertainties in the surface temperature and humidity simulations. In general, the forecast initialized with the EnKF analysis (EnKF) improved the simulation of meteorological fields to some degree in the border region compared to the reference forecast initialized with NCEP analysis data (FCST) and the simulation with observation nudging (FDDA), which in turn led to more reasonable air quality simulations. The simulated surface O3 distributions by EnKF were consistently better than FCST and FDDA on both days. EnKF usually produced more reasonable simulations of nitrate and ammonium aerosols compared to the observations, but still had difficulties in improving the simulations of organic and sulfate aerosols. However, discrepancies between the EnKF simulations and the measurements were still considerably large, particularly for sulfate and organic aerosols, indicating that there is still ample room for improvement in the present data assimilation and/or the modeling systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-04
... Promulgation of Implementation Plans; North Carolina; Charlotte; Ozone 2002 Base Year Emissions Inventory... final action to approve the ozone 2002 base year emissions inventory portion of the state implementation... is part of the Charlotte-Gastonia-Rock Hill, North Carolina ozone attainment demonstration that was...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-24
... Promulgation of Implementation Plans; Georgia; Atlanta; Ozone 2002 Base Year Emissions Inventory AGENCY... approve the ozone 2002 base year emissions inventory, portion of the state implementation plan (SIP... Atlanta, Georgia (hereafter referred to as ``the Atlanta Area'' or ``Area''), ozone attainment...
Simulation of unsteady flows by the DSMC macroscopic chemistry method
NASA Astrophysics Data System (ADS)
Goldsworthy, Mark; Macrossan, Michael; Abdel-jawad, Madhat
2009-03-01
In the Direct Simulation Monte-Carlo (DSMC) method, a combination of statistical and deterministic procedures applied to a finite number of 'simulator' particles are used to model rarefied gas-kinetic processes. In the macroscopic chemistry method (MCM) for DSMC, chemical reactions are decoupled from the specific particle pairs selected for collisions. Information from all of the particles within a cell, not just those selected for collisions, is used to determine a reaction rate coefficient for that cell. Unlike collision-based methods, MCM can be used with any viscosity or non-reacting collision models and any non-reacting energy exchange models. It can be used to implement any reaction rate formulations, whether these be from experimental or theoretical studies. MCM has been previously validated for steady flow DSMC simulations. Here we show how MCM can be used to model chemical kinetics in DSMC simulations of unsteady flow. Results are compared with a collision-based chemistry procedure for two binary reactions in a 1-D unsteady shock-expansion tube simulation. Close agreement is demonstrated between the two methods for instantaneous, ensemble-averaged profiles of temperature, density and species mole fractions, as well as for the accumulated number of net reactions per cell.
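The cell-averaged idea behind MCM — using information from all particles in a cell to set a reaction rate, rather than only the pairs selected for collisions — can be sketched as follows. This is a toy illustration, not the authors' DSMC implementation; the Arrhenius rate form and every parameter value are assumptions chosen for the example.

```python
import numpy as np

KB = 1.380649e-23  # Boltzmann constant (J/K)

def cell_reactions(velocities, mass, n_density, dt, volume, A, Ea):
    """Toy version of the MCM idea: use ALL simulator particles in a cell
    to estimate a cell temperature, evaluate a rate coefficient for that
    cell, and return the expected number of reactions during dt.

    velocities: (N, 3) thermal velocities of the particles in the cell
    A, Ea: assumed Arrhenius pre-factor and activation energy
    """
    # Cell translational temperature from the mean kinetic energy
    v2 = np.mean(np.sum(velocities**2, axis=1))
    T = mass * v2 / (3.0 * KB)
    # Rate coefficient for this cell (Arrhenius form, an assumption here)
    k = A * np.exp(-Ea / (KB * T))
    # Expected reactions in the cell during dt (binary reaction, n^2 scaling)
    return k * n_density**2 * volume * dt, T

# Particles constructed so the cell temperature is exactly 300 K
m = 1e-26
v = np.sqrt(3 * KB * 300.0 / m)
vel = np.full((100, 3), v / np.sqrt(3))
n_rx, T = cell_reactions(vel, m, n_density=1e20, dt=1e-6,
                         volume=1e-9, A=1e-16, Ea=1e-20)
print(round(T, 6))
```

Because the temperature (and hence the rate) comes from the whole cell population, any non-reacting collision or energy-exchange model can be used underneath, which is the decoupling the abstract emphasizes.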
NASA Technical Reports Server (NTRS)
Myers, R. H.
1976-01-01
The depletion of ozone in the stratosphere is examined, and causes for the depletion are cited. Ground station and satellite measurements of ozone, which are taken on a worldwide basis, are discussed. Instruments used in ozone measurement are discussed, such as the Dobson spectrophotometer, which is credited with providing the longest and most extensive series of observations for ground based observation of stratospheric ozone. Other ground based instruments used to measure ozone are also discussed. The statistical differences of ground based measurements of ozone from these different instruments are compared to each other, and to satellite measurements. Mathematical methods (i.e., trend analysis or linear regression analysis) of analyzing the variability of ozone concentration with respect to time and latitude are described. Various time series models which can be employed in accounting for ozone concentration variability are examined.
A probabilistic drought forecasting framework: A combined dynamical and statistical approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh
In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework composed of dynamical and statistical modeling components. The novelty of this study is the use of data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. Next, the initial condition uncertainty is quantified through data assimilation and coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application in the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration, at least three months before the official state drought declaration.
NASA Astrophysics Data System (ADS)
Vlasenko, A. V.; Sizonenko, A. B.; Zhdanov, A. A.
2018-05-01
Discrete time series or mappings are proposed for describing the dynamics of a nonlinear system. The article considers the problem of forecasting the dynamics of the system from the time series generated by it. In particular, the commercial rate of drilling oil and gas wells can be considered as a series where each next value depends on the previous one. The main parameter here is the technical drilling speed. To eliminate measurement error and estimate the commercial speed of the object with good accuracy at the current, a future, or any elapsed time point, the use of the Kalman filter is suggested. For the transition from a deterministic model to a probabilistic one, the use of ensemble modeling is suggested. Ensemble systems can provide a wide range of visual output, which helps the user to evaluate the measure of confidence in the model. In particular, the availability of information on the estimated calendar duration of the construction of oil and gas wells will allow drilling companies to optimize production planning by rationalizing the approach to loading drilling rigs, which ultimately leads to maximization of profit and an increase of their competitiveness.
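A minimal sketch of the suggested Kalman filtering step, assuming a scalar random-walk model for the drilling speed; the noise variances and the measurement values below are illustrative, not taken from the study.

```python
def kalman_1d(measurements, q, r, x0, p0):
    """Scalar Kalman filter with a random-walk state model:
        x_k = x_{k-1} + w,  w ~ N(0, q)   (true commercial drilling speed)
        z_k = x_k + v,      v ~ N(0, r)   (noisy measurement)
    Returns the filtered state estimates.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the innovation z - x
        p = (1 - k) * p           # updated uncertainty
        estimates.append(x)
    return estimates

# Noisy commercial-speed readings around a true value of 10 m/h (made up)
zs = [9.8, 10.4, 9.9, 10.2, 10.1, 9.7, 10.3]
est = kalman_1d(zs, q=0.01, r=0.25, x0=zs[0], p0=1.0)
print(round(est[-1], 2))
```

With a small process variance `q` relative to the measurement variance `r`, the filter smooths the jitter while tracking slow drifts in the true speed, which is the stated goal of removing measurement error from the series.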
Long-range persistence in the global mean surface temperature and the global warming "time bomb"
NASA Astrophysics Data System (ADS)
Rypdal, M.; Rypdal, K.
2012-04-01
Detrended Fluctuation Analysis (DFA) and Maximum Likelihood Estimations (MLE) based on instrumental data over the last 160 years indicate that there is Long-Range Persistence (LRP) in Global Mean Surface Temperature (GMST) on time scales of months to decades. The persistence is much higher in sea surface temperature than in land temperatures. Power spectral analysis of multi-model, multi-ensemble runs of global climate models indicate further that this persistence may extend to centennial and maybe even millennial time-scales. We also support these conclusions by wavelet variogram analysis, DFA, and MLE of Northern hemisphere mean surface temperature reconstructions over the last two millennia. These analyses indicate that the GMST is a strongly persistent noise with Hurst exponent H>0.9 on time scales from decades up to at least 500 years. We show that such LRP can be very important for long-term climate prediction and for the establishment of a "time bomb" in the climate system due to a growing energy imbalance caused by the slow relaxation to radiative equilibrium under rising anthropogenic forcing. We do this by the construction of a multi-parameter dynamic-stochastic model for the GMST response to deterministic and stochastic forcing, where LRP is represented by a power-law response function. Reconstructed data for total forcing and GMST over the last millennium are used with this model to estimate trend coefficients and Hurst exponent for the GMST on multi-century time scale by means of MLE. Ensembles of solutions generated from the stochastic model also allow us to estimate confidence intervals for these estimates.
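A minimal sketch of the first-order DFA estimate referred to above. This is illustrative only: the abstract's analyses use instrumental and reconstructed temperature records, whereas the example below runs on synthetic white noise, for which the scaling exponent should come out near 0.5 (persistent records like the GMST give larger values).

```python
import numpy as np

def dfa(x, scales):
    """Detrended Fluctuation Analysis (order 1).
    Returns the fluctuation function F(s) for each window size s; the
    slope of log F(s) vs log s estimates the scaling exponent (equal to
    the Hurst exponent H for a stationary persistent noise).
    """
    y = np.cumsum(x - np.mean(x))               # integrated profile
    fs = []
    for s in scales:
        n_seg = len(y) // s
        rms = []
        for i in range(n_seg):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, 1)      # local linear trend
            rms.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fs.append(np.sqrt(np.mean(rms)))
    return np.array(fs)

rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                   # white noise: expect ~0.5
scales = np.array([8, 16, 32, 64, 128])
f = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(f), 1)[0]
print(round(alpha, 2))                          # close to 0.5 for white noise
```

An H > 0.9 record, as reported for the GMST, would instead show F(s) growing almost linearly with s over the fitted range.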
NASA Astrophysics Data System (ADS)
von Trentini, F.; Willkofer, F.; Wood, R. R.; Schmid, F. J.; Ludwig, R.
2017-12-01
The ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec) focuses on the effects of climate change on hydro-meteorological extreme events and their implications for water management in Bavaria and Québec. Therefore, a hydro-meteorological model chain is applied. It employs the high performance computing capacity of the Leibniz Supercomputing Centre facility SuperMUC to dynamically downscale 50 members of the global climate model CanESM2 over European and Eastern North American domains using the Canadian Regional Climate Model (RCM) CRCM5. Over Europe, the unique single-model ensemble is conjointly analyzed with the latest information provided through the CORDEX initiative, to better assess the influence of natural climate variability and climatic change on the dynamics of extreme events. Furthermore, these 50 members of a single RCM will enhance extreme value statistics (extreme return periods) by exploiting the available 1500 model years for the reference period from 1981 to 2010. The RCM output is then used to drive the process-based, fully distributed, and deterministic hydrological model WaSiM in high temporal (3h) and spatial (500m) resolution. WaSiM and the large ensemble are further used to derive a variety of hydro-meteorological patterns leading to severe flood events. A tool for virtual perfect prediction shall provide a combination of optimal lead time and management strategy to mitigate certain flood events following these patterns.
Ling, Qing-Hua; Song, Yu-Qing; Han, Fei; Yang, Dan; Huang, De-Shuang
2016-01-01
For ensemble learning, how to select and combine the candidate classifiers are two key issues which influence the performance of the ensemble system dramatically. The random vector functional link network (RVFL) without direct input-to-output links is a suitable base classifier for ensemble systems because of its fast learning speed, simple structure and good generalization performance. In this paper, to obtain a more compact ensemble system with improved convergence performance, an improved ensemble of RVFL based on attractive and repulsive particle swarm optimization (ARPSO) with a double optimization strategy is proposed. In the proposed method, ARPSO is applied to select and combine the candidate RVFL. When using ARPSO to select the optimal base RVFL, ARPSO considers both the convergence accuracy on the validation data and the diversity of the candidate ensemble system to build the RVFL ensembles. In the process of combining RVFL, the ensemble weights corresponding to the base RVFL are initialized by the minimum norm least-square method and then further optimized by ARPSO. Finally, a few redundant RVFLs are pruned, and thus a more compact ensemble of RVFL is obtained. Moreover, in this paper, theoretical analysis and justification on how to prune the base classifiers on classification problems is presented, and a simple and practically feasible strategy for pruning redundant base classifiers on both classification and regression problems is proposed. Since the double optimization is performed on the basis of the single optimization, the ensemble of RVFL built by the proposed method outperforms that built by some single optimization methods. Experimental results on function approximation and classification problems verify that the proposed method could improve its convergence accuracy as well as reduce the complexity of the ensemble system. PMID:27835638
On the predictability of outliers in ensemble forecasts
NASA Astrophysics Data System (ADS)
Siegert, S.; Bröcker, J.; Kantz, H.
2012-03-01
In numerical weather prediction, ensembles are used to retrieve probabilistic forecasts of future weather conditions. We consider events where the verification is smaller than the smallest, or larger than the largest ensemble member of a scalar ensemble forecast. These events are called outliers. In a statistically consistent K-member ensemble, outliers should occur with a base rate of 2/(K+1). In operational ensembles this base rate tends to be higher. We study the predictability of outlier events in terms of the Brier Skill Score and find that forecast probabilities can be calculated which are more skillful than the unconditional base rate. This is shown analytically for statistically consistent ensembles. Using logistic regression, forecast probabilities for outlier events in an operational ensemble are calculated. These probabilities exhibit positive skill which is quantitatively similar to the analytical results. Possible causes of these results as well as their consequences for ensemble interpretation are discussed.
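The 2/(K+1) base rate for a statistically consistent ensemble can be checked with a quick simulation. This is a sketch under the exchangeability assumption stated in the abstract (verification and members drawn from the same distribution); the ensemble size and trial count are illustrative.

```python
import numpy as np

def outlier_rate(K, n_trials, rng):
    """Empirical rate at which the verification falls outside a
    statistically consistent (exchangeable) K-member ensemble.
    Theory: verification and members are i.i.d., so the verification is
    the smallest or largest of K+1 values with probability 2/(K+1).
    """
    draws = rng.standard_normal((n_trials, K + 1))
    verif, members = draws[:, 0], draws[:, 1:]
    outlier = (verif < members.min(axis=1)) | (verif > members.max(axis=1))
    return outlier.mean()

rng = np.random.default_rng(42)
K = 9
rate = outlier_rate(K, 200_000, rng)
print(round(rate, 3), 2 / (K + 1))   # empirical rate vs. theoretical 0.2
```

An operational ensemble with a higher observed outlier rate than this benchmark is, in the abstract's terms, not statistically consistent, which is exactly what leaves room for skillful conditional outlier forecasts.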
The latitudinal distribution of ozone to 35 km altitude from ECC ozonesonde observations, 1982-1990
NASA Technical Reports Server (NTRS)
Komhyr, W. D.; Oltmans, S. J.; Lathrop, J. A.; Kerr, J. B.; Matthews, W. A.
1994-01-01
Electrochemical concentration cell (ECC) ozonesonde observations, made in recent years at ten stations whose locations range from the Arctic to Antarctica, have yielded a self-consistent ozone database from which mean seasonal and annual latitudinal ozone vertical distributions to 35 km have been derived. Ozone measurement uncertainties are estimated, and results are presented in the Bass-Paur (1985) ozone absorption coefficient scale adopted for use with Dobson ozone spectrophotometers on January 1, 1992. The data should be useful for comparison with model calculations of the global distribution of atmospheric ozone, for serving as a priori statistical information in deriving ozone vertical distributions from satellite and Umkehr observations, and for improving the satellite and Umkehr ozone inversion algorithms. Attention is drawn to similar results, based on a less comprehensive data set, published in Ozone in the Atmosphere, Proceedings of the 1988 Quadrennial Ozone Symposium, where errors in data tabulations occurred for three of the stations due to inadvertent transposition of ozone partial pressure and air temperature values.
Argumentation Based Joint Learning: A Novel Ensemble Learning Approach
Xu, Junyi; Yao, Li; Li, Le
2015-01-01
Recently, ensemble learning methods have been widely used to improve classification performance in machine learning. In this paper, we present a novel ensemble learning method: argumentation based multi-agent joint learning (AMAJL), which integrates ideas from multi-agent argumentation, ensemble learning, and association rule mining. In AMAJL, argumentation technology is introduced as an ensemble strategy to integrate multiple base classifiers and generate a high performance ensemble classifier. We design an argumentation framework named Arena as a communication platform for knowledge integration. Through argumentation based joint learning, high quality individual knowledge can be extracted, and thus a refined global knowledge base can be generated and used independently for classification. We perform numerous experiments on multiple public datasets using AMAJL and other benchmark methods. The results demonstrate that our method can effectively extract high quality knowledge for ensemble classifier and improve the performance of classification. PMID:25966359
Equation-free analysis of agent-based models and systematic parameter determination
NASA Astrophysics Data System (ADS)
Thomas, Spencer A.; Lloyd, David J. B.; Skeldon, Anne C.
2016-12-01
Agent-based models (ABMs) are increasingly used in social science, economics, mathematics, biology and computer science to describe time-dependent systems in circumstances where a description in terms of equations is difficult. Yet few tools are currently available for the systematic analysis of ABM behaviour. Numerical continuation and bifurcation analysis is a well-established tool for the study of deterministic systems. Recently, equation-free (EF) methods have been developed to extend numerical continuation techniques to systems where the dynamics are described at a microscopic scale and continuation of a macroscopic property of the system is considered. To date, the practical use of EF methods has been limited by: (1) the overhead of application-specific implementation; (2) the laborious configuration of problem-specific parameters; and (3) large ensemble sizes (potentially) leading to computationally restrictive run-times. In this paper we address these issues with our tool for the EF continuation of stochastic systems, which includes algorithms to systematically configure problem-specific parameters and enhance robustness to noise. Our tool is generic and can be applied to any 'black-box' simulator, and it determines the essential EF parameters prior to EF analysis. Robustness is significantly improved using our convergence-constraint with a corrector-repeat (C3R) method. This algorithm automatically detects outliers based on the dynamics of the underlying system, enabling both an order of magnitude reduction in ensemble size and continuation of systems at much higher levels of noise than classical approaches. We demonstrate our method with application to several ABM models, revealing parameter dependence, bifurcation and stability analysis of these complex systems, giving a deep understanding of the dynamical behaviour of the models in a way that is not otherwise easily obtainable.
In each case we demonstrate our systematic parameter determination stage for configuring the system-specific EF parameters.
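The equation-free workflow named above (lift a macro variable to an ensemble, run the micro simulator briefly, restrict back to the macro variable, then project forward) can be sketched on a toy stochastic system. The micro model, ensemble size, and step sizes here are assumptions for illustration, not the authors' tool.

```python
import numpy as np

def micro_step(states, dt, rng):
    """Microscopic 'black-box' simulator: an ensemble of noisy logistic
    growth trajectories (a stand-in for an ABM)."""
    drift = states * (1.0 - states)
    noise = 0.02 * np.sqrt(dt) * rng.standard_normal(states.shape)
    return states + dt * drift + noise

def coarse_projective_step(macro, n_micro, dt, dt_proj, ensemble, rng):
    """One equation-free step: lift the macro variable to an ensemble,
    run a few micro steps, restrict, then extrapolate (project)."""
    states = np.full(ensemble, macro) + 0.01 * rng.standard_normal(ensemble)  # lift
    traj = []
    for _ in range(n_micro):
        states = micro_step(states, dt, rng)
        traj.append(states.mean())                                  # restrict
    slope = (traj[-1] - traj[0]) / ((n_micro - 1) * dt)             # coarse derivative
    return traj[-1] + dt_proj * slope                               # project

rng = np.random.default_rng(1)
m = 0.1
for _ in range(30):
    m = coarse_projective_step(m, n_micro=5, dt=0.01, dt_proj=0.2,
                               ensemble=2000, rng=rng)
print(round(m, 2))   # approaches the stable fixed point at 1.0
```

The ensemble size controls how noisy the restricted macro trajectory (and hence the coarse derivative) is, which is why noise-robust variants such as the C3R method described in the abstract allow much smaller ensembles.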
NASA Astrophysics Data System (ADS)
Kruglova, Ekaterina; Kulikova, Irina; Khan, Valentina; Tischenko, Vladimir
2017-04-01
The subseasonal predictability of low-frequency modes and atmospheric circulation regimes is investigated using outputs from the global semi-Lagrangian (SL-AV) model of the Hydrometcentre of Russia and the Institute of Numerical Mathematics of the Russian Academy of Sciences. Teleconnection indices (AO, WA, EA, NAO, EU, WP, PNA) are used as quantitative characteristics of low-frequency variability to identify zonal and meridional flow regimes, with a focus on the distribution of high-impact weather patterns over Northern Eurasia. The predictability of weekly and monthly averaged indices is estimated by diagnostic verification of forecast and reanalysis data covering the hindcast period, using the quantitative criteria recommended by the WMO. Characteristics of the low-frequency variability are discussed. In particular, it is revealed that meridional flow regimes are reproduced by SL-AV better in summer than in winter. The model's deterministic forecast (ensemble mean) skill at week 1 (days 1-7) is noticeably better than that of climatological forecasts. The decrease in skill scores at week 2 (days 8-14) and week 3 (days 15-21) is explained by deficiencies in the modeling system and inaccurate initial conditions. A slight improvement in model skill was noticed at week 4 (days 22-28), when the state of the atmosphere is more strongly determined by the flow of energy from the outside. The reliability of forecasts of monthly (days 1-30) averaged indices is comparable to that at week 1 (days 1-7). Numerical experiments demonstrated that forecast accuracy can be improved (and thus the limit of practical predictability extended) by using a probabilistic approach based on ensemble forecasts. The quality of forecasts of circulation regimes such as blocking is shown to be higher than that of zonal flow.
Telegraph noise in Markovian master equation for electron transport through molecular junctions
NASA Astrophysics Data System (ADS)
Kosov, Daniel S.
2018-05-01
We present a theoretical approach to solve the Markovian master equation for quantum transport with stochastic telegraph noise. Considering probabilities as functionals of a random telegraph process, we use Novikov's functional method to convert the stochastic master equation to a set of deterministic differential equations. The equations are then solved in the Laplace space, and the expression for the probability vector averaged over the ensemble of realisations of the stochastic process is obtained. We apply the theory to study the manifestations of telegraph noise in the transport properties of molecular junctions. We consider the quantum electron transport in a resonant-level molecule as well as polaronic regime transport in a molecular junction with electron-vibration interaction.
Deterministic quantum splitter based on time-reversed Hong-Ou-Mandel interference
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Jun; Lee, Kim Fook; Kumar, Prem
2007-09-15
By utilizing a fiber-based indistinguishable photon-pair source in the 1.55 μm telecommunications band [J. Chen et al., Opt. Lett. 31, 2798 (2006)], we present the first, to the best of our knowledge, deterministic quantum splitter based on the principle of time-reversed Hong-Ou-Mandel quantum interference. The deterministically separated identical photons' indistinguishability is then verified by using a conventional Hong-Ou-Mandel quantum interference, which exhibits a near-unity dip visibility of (94±1)%, making this quantum splitter useful for various quantum information processing applications.
Ensemble-based docking: From hit discovery to metabolism and toxicity predictions
Evangelista, Wilfredo; Weir, Rebecca; Ellingson, Sally; ...
2016-07-29
The use of ensemble-based docking for the exploration of biochemical pathways and toxicity prediction of drug candidates is described. We describe the computational engineering work necessary to enable large ensemble docking campaigns on supercomputers. We show examples where ensemble-based docking has significantly increased the number and the diversity of validated drug candidates. Finally, we illustrate how ensemble-based docking can be extended beyond hit discovery and toward providing a structural basis for the prediction of metabolism and off-target binding relevant to pre-clinical and clinical trials.
A novel optical ozone sensor based on purely organic phosphor.
Lee, Dongwook; Jung, Jaehun; Bilby, David; Kwon, Min Sang; Yun, Jaesook; Kim, Jinsang
2015-02-11
An optical ozone sensor was developed based on the finding that a purely organic phosphor linearly loses its phosphorescence emission intensity in the presence of varying concentrations of ozone gas and ozonated water. Compared to conventional conductance-based inorganic sensors, our novel sensory film has many advantages, such as easy fabrication, low cost, and portability. NMR data confirmed that the phosphorescence drop is attributed to oxidation of the triplet-generating aldehyde core group of the phosphor. We observed a linear correlation between phosphorescence and ozone concentration, and the sensor can detect ozone concentrations of 0.1 ppm, the threshold concentration harmful to human tissue and respiratory organs. Like a litmus paper, this ozone sensor can be fabricated as a free-standing and disposable film.
Locally Weighted Ensemble Clustering.
Huang, Dong; Wang, Chang-Dong; Lai, Jian-Huang
2018-05-01
Due to its ability to combine multiple base clusterings into a probably better and more robust clustering, the ensemble clustering technique has been attracting increasing attention in recent years. Despite the significant success, one limitation of most of the existing ensemble clustering methods is that they generally treat all base clusterings equally regardless of their reliability, which makes them vulnerable to low-quality base clusterings. Although some efforts have been made to (globally) evaluate and weight the base clusterings, these methods tend to view each base clustering as an individual and neglect the local diversity of clusters inside the same base clustering. It remains an open problem how to evaluate the reliability of clusters and exploit the local diversity in the ensemble to enhance the consensus performance, especially in the case when there is no access to data features or specific assumptions on data distribution. To address this, in this paper, we propose a novel ensemble clustering approach based on ensemble-driven cluster uncertainty estimation and a local weighting strategy. In particular, the uncertainty of each cluster is estimated by considering the cluster labels in the entire ensemble via an entropic criterion. A novel ensemble-driven cluster validity measure is introduced, and a locally weighted co-association matrix is presented to serve as a summary for the ensemble of diverse clusters. With the local diversity in ensembles exploited, two novel consensus functions are further proposed. Extensive experiments on a variety of real-world datasets demonstrate the superiority of the proposed approach over the state-of-the-art.
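A minimal sketch of the entropic cluster-uncertainty and local weighting idea described above. This is an interpretation of the approach for illustration, not the authors' code: the exp(-u) weighting function and the toy data are assumptions.

```python
import numpy as np
from collections import Counter

def cluster_entropy(members, labels):
    """Entropy of how one cluster's objects are split by another base
    clustering (0 = the cluster is kept intact; larger = more disagreement)."""
    counts = np.array(list(Counter(labels[members]).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p + 1e-12))

def locally_weighted_coassociation(base_clusterings):
    """Toy locally weighted co-association matrix: each cluster votes for
    its object pairs with a weight that decays with the cluster's
    uncertainty across the rest of the ensemble."""
    base_clusterings = [np.asarray(c) for c in base_clusterings]
    n = len(base_clusterings[0])
    W = np.zeros((n, n))
    for i, ci in enumerate(base_clusterings):
        others = [c for j, c in enumerate(base_clusterings) if j != i]
        for label in np.unique(ci):
            members = np.where(ci == label)[0]
            u = np.mean([cluster_entropy(members, c) for c in others])
            w = np.exp(-u)                    # reliable clusters weigh more
            for a in members:
                for b in members:
                    W[a, b] += w
    return W / len(base_clusterings)

# Three base clusterings of 6 objects; the third mislabels one object
bases = [[0, 0, 0, 1, 1, 1],
         [0, 0, 0, 1, 1, 1],
         [0, 0, 1, 1, 1, 1]]
W = locally_weighted_coassociation(bases)
print(W[0, 1] > W[2, 3])   # pair (0,1) co-clusters more strongly than (2,3)
```

A consensus clustering can then be obtained by running any similarity-based clusterer on `W`, which is the role the locally weighted co-association matrix plays in the paper's consensus functions.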
Schlaier, Juergen R; Beer, Anton L; Faltermeier, Rupert; Fellner, Claudia; Steib, Kathrin; Lange, Max; Greenlee, Mark W; Brawanski, Alexander T; Anthofer, Judith M
2017-06-01
This study compared tractography approaches for identifying cerebellar-thalamic fiber bundles relevant to planning target sites for deep brain stimulation (DBS). In particular, probabilistic and deterministic tracking of the dentate-rubro-thalamic tract (DRTT) and differences between the spatial courses of the DRTT and the cerebello-thalamo-cortical (CTC) tract were compared. Six patients with movement disorders were examined by magnetic resonance imaging (MRI), including two sets of diffusion-weighted images (12 and 64 directions). Probabilistic and deterministic tractography was applied on each diffusion-weighted dataset to delineate the DRTT. Results were compared with regard to their sensitivity in revealing the DRTT and additional fiber tracts and processing time. Two sets of regions-of-interests (ROIs) guided deterministic tractography of the DRTT or the CTC, respectively. Tract distances to an atlas-based reference target were compared. Probabilistic fiber tracking with 64 orientations detected the DRTT in all twelve hemispheres. Deterministic tracking detected the DRTT in nine (12 directions) and in only two (64 directions) hemispheres. Probabilistic tracking was more sensitive in detecting additional fibers (e.g. ansa lenticularis and medial forebrain bundle) than deterministic tracking. Probabilistic tracking lasted substantially longer than deterministic. Deterministic tracking was more sensitive in detecting the CTC than the DRTT. CTC tracts were located adjacent but consistently more posterior to DRTT tracts. These results suggest that probabilistic tracking is more sensitive and robust in detecting the DRTT but harder to implement than deterministic approaches. Although sensitivity of deterministic tracking is higher for the CTC than the DRTT, targets for DBS based on these tracts likely differ. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Kleist, D. T.; Ide, K.; Mahajan, R.; Thomas, C.
2014-12-01
The use of hybrid error covariance models has become quite popular for numerical weather prediction (NWP). One such method for incorporating localized covariances from an ensemble within the variational framework utilizes an augmented control variable (EnVar), and has been implemented in the operational NCEP data assimilation system (GSI). By taking the existing 3D EnVar algorithm in GSI and allowing for four-dimensional ensemble perturbations, coupled with the 4DVAR infrastructure already in place, a 4D EnVar capability has been developed. The 4D EnVar algorithm has a few attractive qualities relative to 4DVAR, including the lack of need for tangent-linear and adjoint model as well as reduced computational cost. Preliminary results using real observations have been encouraging, showing forecast improvements nearly as large as were found in moving from 3DVAR to hybrid 3D EnVar. 4D EnVar is the method of choice for the next generation assimilation system for use with the operational NCEP global model, the global forecast system (GFS). The use of an outer-loop has long been the method of choice for 4DVar data assimilation to help address nonlinearity. An outer loop involves the re-running of the (deterministic) background forecast from the updated initial condition at the beginning of the assimilation window, and proceeding with another inner loop minimization. Within 4D EnVar, a similar procedure can be adopted since the solver evaluates a 4D analysis increment throughout the window, consistent with the valid times of the 4D ensemble perturbations. In this procedure, the ensemble perturbations are kept fixed and centered about the updated background state. This is analogous to the quasi-outer loop idea developed for the EnKF. Here, we present results for both toy model and real NWP systems demonstrating the impact from incorporating outer loops to address nonlinearity within the 4D EnVar context. 
The appropriate amplitudes for observation and background error covariances in subsequent outer loops will be explored. Lastly, variable transformations on the ensemble perturbations will be utilized to help address issues of non-Gaussianity. This may be particularly important for variables that clearly have non-Gaussian error characteristics such as water vapor and cloud condensate.
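A minimal numerical sketch of the outer-loop procedure described above, under toy assumptions (a quadratic observation operator, synthetic dimensions, and illustrative error statistics; none of this is the GSI implementation): the increment is sought as a linear combination of fixed ensemble perturbations, and each outer loop relinearizes about the updated state.

```python
import numpy as np

rng = np.random.default_rng(0)

def H(x):                       # toy nonlinear observation operator
    return x ** 2

n, m = 4, 20                    # state size, ensemble size (illustrative)
x_true = np.array([1.2, 0.8, 1.5, 1.0])
xb = x_true + 0.2 * rng.standard_normal(n)               # background
Xp = 0.3 * rng.standard_normal((n, m)) / np.sqrt(m - 1)  # ensemble perturbations
y = H(x_true) + 0.01 * rng.standard_normal(n)            # observations
R_inv = 1.0 / 0.01 ** 2                                  # obs-error precision

def inner_loop(xg):
    """One EnVar inner-loop minimization, linearized about the guess xg.
    The increment is a linear combination of the fixed perturbations Xp."""
    Hlin = 2.0 * xg                          # tangent-linear of H about xg
    A = Hlin[:, None] * Xp                   # perturbations mapped to obs space
    d = y - H(xg)                            # innovation
    lhs = np.eye(m) + R_inv * (A.T @ A)      # Hessian in weight space
    w = np.linalg.solve(lhs, R_inv * (A.T @ d))
    return xg + Xp @ w

x = xb
for _ in range(3):              # outer loops: re-center about the update
    x = inner_loop(x)           # perturbations Xp are kept fixed

print("obs misfit before/after:",
      np.sum((y - H(xb)) ** 2), np.sum((y - H(x)) ** 2))
```

Each relinearization is a Gauss-Newton-like step, which is why the outer loop helps with nonlinear observation operators.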
Simulation's Ensemble is Better Than Ensemble Simulation
NASA Astrophysics Data System (ADS)
Yan, X.
2017-12-01
Yan Xiaodong, State Key Laboratory of Earth Surface Processes and Resource Ecology (ESPRE), Beijing Normal University, 19 Xinjiekouwai Street, Haidian District, Beijing 100875, China. Email: yxd@bnu.edu.cn. A dynamical system is simulated from an initial state, but initial-state data carry great uncertainty, which leads to uncertainty in the simulation. Simulation from multiple possible initial states has therefore been used widely in atmospheric science and has indeed been shown to lower this uncertainty; because multiple simulation results are fused, the approach is termed a simulation's ensemble. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble when compared with community-based simulation (most ecosystem models). In this talk, we will address the advantages of individual-based simulation and even of ensembles thereof.
Structural Uncertainty in Antarctic sea ice simulations
NASA Astrophysics Data System (ADS)
Schneider, D. P.
2016-12-01
The inability of the vast majority of historical climate model simulations to reproduce the observed increase in Antarctic sea ice has motivated many studies about the quality of the observational record, the role of natural variability versus forced changes, and the possibility of missing or inadequate forcings in the models (such as freshwater discharge from thinning ice shelves or an inadequate magnitude of stratospheric ozone depletion). In this presentation I will highlight another source of uncertainty that has received comparatively little attention: structural uncertainty, that is, the systematic uncertainty in simulated sea ice trends that arises from model physics and mean-state biases. Using two large ensembles of experiments from the Community Earth System Model (CESM), I will show that the model is predisposed towards producing negative Antarctic sea ice trends during 1979-present, and that this outcome is not simply because the model's decadal variability is out of sync with that in nature. In the "Tropical Pacific Pacemaker" ensemble, in which observed tropical Pacific SST anomalies are prescribed, the model produces very realistic atmospheric circulation trends over the Southern Ocean, yet the sea ice trend is negative in every ensemble member. However, if the ensemble-mean trend (commonly interpreted as the forced response) is removed, some ensemble members show a sea ice increase that is very similar to the observed one. While this result does confirm the important role of natural variability, it also suggests a strong bias in the forced response. I will discuss the reasons for this systematic bias and explore possible remedies. This is an important problem to solve because projections of 21st-century changes in the Antarctic climate system (including ice sheet surface mass balance changes and related changes in the sea level budget) have a strong dependence on the mean state of, and changes in, the Antarctic sea ice cover. This problem is not unique to CESM but is pervasive across CMIP5-class models.
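The trend decomposition described above, member trend minus ensemble-mean trend as an estimate of the internal contribution, can be sketched with synthetic data (the forced decline and noise amplitudes below are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic "large ensemble" of sea ice extent time series, 1979-2015:
# a common forced decline plus independent internal variability per member
members, years = 40, 37
t = np.arange(years, dtype=float)
forced = -0.02 * t                                       # forced decline (assumed)
internal = np.cumsum(0.05 * rng.standard_normal((members, years)), axis=1)
ice = forced + internal                                  # member time series

trends = np.polyfit(t, ice.T, 1)[0]      # per-member linear trend
forced_trend = trends.mean()             # ensemble mean ~ forced response
residual = trends - forced_trend         # internal-variability trends

# Most raw trends are negative, but after removing the ensemble-mean
# (forced) trend, roughly half the residual trends are positive
print(f"negative raw trends: {(trends < 0).mean():.0%}, "
      f"positive residual trends: {(residual > 0).mean():.0%}")
```

This mirrors the diagnostic in the abstract: a member can show an apparent increase once the common forced signal is subtracted.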
Nonlinear mode decomposition: A noise-robust, adaptive decomposition method
NASA Astrophysics Data System (ADS)
Iatsenko, Dmytro; McClintock, Peter V. E.; Stefanovska, Aneta
2015-09-01
The signals emanating from complex systems are usually composed of a mixture of different oscillations which, for a reliable analysis, should be separated from each other and from the inevitable background of noise. Here we introduce an adaptive decomposition tool—nonlinear mode decomposition (NMD)—which decomposes a given signal into a set of physically meaningful oscillations for any waveform, simultaneously removing the noise. NMD is based on the powerful combination of time-frequency analysis techniques—which, together with the adaptive choice of their parameters, make it extremely noise robust—and surrogate data tests used to identify interdependent oscillations and to distinguish deterministic from random activity. We illustrate the application of NMD to both simulated and real signals and demonstrate its qualitative and quantitative superiority over other approaches, such as (ensemble) empirical mode decomposition, Karhunen-Loève expansion, and independent component analysis. We point out that NMD is likely to be applicable and useful in many different areas of research, such as geophysics, finance, and the life sciences. The necessary MATLAB codes for running NMD are freely available for download.
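The surrogate-data idea invoked above, distinguishing deterministic or nonlinear structure from linear noise, can be illustrated with the standard Fourier-phase surrogate test (a textbook building block, not NMD's full machinery; the test statistic and signal are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def phase_surrogate(x, rng):
    """Fourier-transform surrogate: same amplitude spectrum, random phases,
    so nonlinear structure in x is destroyed while its spectrum is kept."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, X.size)
    phases[0] = 0.0                                  # keep the mean real
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size)

def skewness(x):
    x = x - x.mean()
    return (x ** 3).mean() / (x ** 2).mean() ** 1.5

# Signal with genuine nonlinear structure: a static quadratic transform of noise
z = rng.standard_normal(4096)
signal = z + 0.5 * z ** 2

stat = abs(skewness(signal))
surr_stats = [abs(skewness(phase_surrogate(signal, rng))) for _ in range(99)]
# Rank-order test: if the signal were linear Gaussian noise, its statistic
# would be an ordinary member of the surrogate distribution
p = (1 + sum(s >= stat for s in surr_stats)) / 100
print("p =", p)   # small p => reject the linear-noise hypothesis
```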
Technical note: Combining quantile forecasts and predictive distributions of streamflows
NASA Astrophysics Data System (ADS)
Bogner, Konrad; Liechti, Katharina; Zappa, Massimiliano
2017-11-01
The enhanced availability of many different hydro-meteorological modelling and forecasting systems raises the issue of how to optimally combine this wealth of information. Especially the use of deterministic and probabilistic forecasts with sometimes widely divergent predicted future streamflow values makes it complicated for decision makers to sift out the relevant information. In this study, multiple sources of streamflow forecast information will be aggregated based on several different predictive distributions and quantile forecasts. For this combination, the Bayesian model averaging (BMA) approach, non-homogeneous Gaussian regression (NGR), also known as the ensemble model output statistics (EMOS) technique, and a novel method called Beta-transformed linear pooling (BLP) will be applied. With the help of the quantile score (QS) and the continuous ranked probability score (CRPS), the combination results for the Sihl River in Switzerland, with about 5 years of forecast data, will be compared, and the differences between the raw and optimally combined forecasts will be highlighted. The results demonstrate the importance of applying proper forecast combination methods for decision makers in the field of flood and water resource management.
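The two verification scores named above have compact empirical estimators; a plain-NumPy sketch (illustrative, not the study's code):

```python
import numpy as np

def crps_ensemble(members, obs):
    """Empirical CRPS for one forecast ensemble and one observation:
    CRPS = E|X - y| - 0.5 E|X - X'|  (lower is better)."""
    members = np.asarray(members, dtype=float)
    term1 = np.abs(members - obs).mean()
    term2 = np.abs(members[:, None] - members[None, :]).mean()
    return term1 - 0.5 * term2

def quantile_score(q_forecast, tau, obs):
    """Pinball loss for a tau-quantile forecast (lower is better)."""
    diff = obs - q_forecast
    return np.maximum(tau * diff, (tau - 1) * diff)

# A sharp, well-centred ensemble scores better than a biased one
obs = 3.0
sharp  = crps_ensemble([2.8, 3.0, 3.1, 3.2], obs)
biased = crps_ensemble([4.5, 4.8, 5.0, 5.2], obs)
print(sharp < biased)   # True
```

Averaging these scores over many forecast cases is how raw and combined forecasts are ranked.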
Watching cellular machinery in action, one molecule at a time.
Monachino, Enrico; Spenkelink, Lisanne M; van Oijen, Antoine M
2017-01-02
Single-molecule manipulation and imaging techniques have become important elements of the biologist's toolkit to gain mechanistic insights into cellular processes. By removing ensemble averaging, single-molecule methods provide unique access to the dynamic behavior of biomolecules. Recently, the use of these approaches has expanded to the study of complex multiprotein systems and has enabled detailed characterization of the behavior of individual molecules inside living cells. In this review, we provide an overview of the various force- and fluorescence-based single-molecule methods with applications both in vitro and in vivo, highlighting these advances by describing their applications in studies on cytoskeletal motors and DNA replication. We also discuss how single-molecule approaches have increased our understanding of the dynamic behavior of complex multiprotein systems. These methods have shown that the behavior of multicomponent protein complexes is highly stochastic and less linear and deterministic than previously thought. Further development of single-molecule tools will help to elucidate the molecular dynamics of these complex systems both inside the cell and in solutions with purified components. © 2017 Monachino et al.
Residue-level global and local ensemble-ensemble comparisons of protein domains.
Clark, Sarah A; Tronrud, Dale E; Karplus, P Andrew
2015-09-01
Many methods of protein structure generation such as NMR-based solution structure determination and template-based modeling do not produce a single model, but an ensemble of models consistent with the available information. Current strategies for comparing ensembles lose information because they use only a single representative structure. Here, we describe the ENSEMBLATOR and its novel strategy to directly compare two ensembles containing the same atoms to identify significant global and local backbone differences between them on per-atom and per-residue levels, respectively. The ENSEMBLATOR has four components: eePREP (ee for ensemble-ensemble), which selects atoms common to all models; eeCORE, which identifies atoms belonging to a cutoff-distance dependent common core; eeGLOBAL, which globally superimposes all models using the defined core atoms and calculates for each atom the two intraensemble variations, the interensemble variation, and the closest approach of members of the two ensembles; and eeLOCAL, which performs a local overlay of each dipeptide and, using a novel measure of local backbone similarity, reports the same four variations as eeGLOBAL. The combination of eeGLOBAL and eeLOCAL analyses identifies the most significant differences between ensembles. We illustrate the ENSEMBLATOR's capabilities by showing how using it to analyze NMR ensembles and to compare NMR ensembles with crystal structures provides novel insights compared to published studies. One of these studies leads us to suggest that a "consistency check" of NMR-derived ensembles may be a useful analysis step for NMR-based structure determinations in general. The ENSEMBLATOR 1.0 is available as a first generation tool to carry out ensemble-ensemble comparisons. © 2015 The Protein Society.
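The eeGLOBAL-style per-atom quantities can be sketched as follows, assuming both ensembles share the same atoms and a common superposition has already been applied (an illustrative reimplementation of the ideas, not the ENSEMBLATOR code):

```python
import numpy as np

def ee_global_stats(ens_a, ens_b):
    """Per-atom comparison of two superimposed ensembles.
    ens_a: (Ma, N, 3), ens_b: (Mb, N, 3) arrays of models x atoms x xyz."""
    intra_a = np.linalg.norm(ens_a - ens_a.mean(0), axis=2).mean(0)  # spread in A
    intra_b = np.linalg.norm(ens_b - ens_b.mean(0), axis=2).mean(0)  # spread in B
    inter = np.linalg.norm(ens_a.mean(0) - ens_b.mean(0), axis=1)    # mean-to-mean
    # closest approach: per atom, min distance over cross-ensemble model pairs
    cross = np.linalg.norm(ens_a[:, None] - ens_b[None, :], axis=3)  # (Ma, Mb, N)
    closest = cross.min(axis=(0, 1))
    return intra_a, intra_b, inter, closest

# Toy: ensemble B equals ensemble A shifted by 2 A along x for the last atom
rng = np.random.default_rng(2)
base = 0.1 * rng.standard_normal((5, 10, 3)) + np.arange(10)[None, :, None]
shift = np.zeros((10, 3)); shift[-1, 0] = 2.0
a, b = base, base + shift
_, _, inter, _ = ee_global_stats(a, b)
print(inter.argmax())  # the shifted atom stands out -> 9
```

Flagging atoms whose interensemble variation exceeds both intraensemble variations is the essence of calling a difference "significant" here.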
NASA Astrophysics Data System (ADS)
De Ridder, K.; Bertrand, C.; Casanova, G.; Lefebvre, W.
2012-09-01
Increasingly, mesoscale meteorological and climate models are used to predict urban weather and climate. Yet, large uncertainties remain regarding the values of some urban surface properties. In particular, information concerning urban values for thermal roughness length and thermal admittance is scarce. In this paper, we present a method to estimate values for thermal admittance in combination with an optimal scheme for thermal roughness length, based on METEOSAT-8/SEVIRI thermal infrared imagery in conjunction with a deterministic atmospheric model containing a simple urbanized land surface scheme. Given the spatial resolution of the SEVIRI sensor, the resulting parameter values are applicable at scales of the order of 5 km. As a case study we focused on the city of Paris for the day of 29 June 2006. Land surface temperature was calculated from SEVIRI thermal radiances using a new split-window algorithm specifically designed to handle urban conditions, as described in Appendix A, including a correction for anisotropy effects. Land surface temperature was also calculated in an ensemble of simulations carried out with the ARPS mesoscale atmospheric model, combining different thermal roughness length parameterizations with a range of thermal admittance values. Particular care was taken to spatially match the simulated land surface temperature with the SEVIRI field of view, using the so-called point spread function of the latter. Using Bayesian inference, the best agreement between simulated and observed land surface temperature was obtained for the Zilitinkevich (1970) and Brutsaert (1975) thermal roughness length parameterizations, the latter with the coefficients obtained by Kanda et al. (2007). The retrieved thermal admittance values associated with either thermal roughness parameterization were, respectively, 1843 ± 108 J m^-2 s^-1/2 K^-1 and 1926 ± 115 J m^-2 s^-1/2 K^-1.
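A grid-based Bayesian retrieval of a parameter such as thermal admittance can be sketched with a toy forward model standing in for the ARPS ensemble (the one-parameter model, forcing scale, and noise level are all assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

def lst_amplitude(mu):
    """Toy forward model: diurnal surface-temperature amplitude falls with
    thermal admittance mu (J m^-2 s^-1/2 K^-1). Stand-in for the model runs."""
    forcing = 500.0                    # net diurnal forcing scale (assumed)
    omega = 2 * np.pi / 86400.0        # diurnal angular frequency
    return forcing / (mu * np.sqrt(omega))

# Synthetic "observed" amplitude from a true admittance of 1900, plus noise
mu_true, sigma_obs = 1900.0, 0.5
obs = lst_amplitude(mu_true) + rng.normal(0, sigma_obs)

# Bayesian inference on a grid with a flat prior over plausible admittances
grid = np.linspace(1000.0, 3000.0, 2001)
loglik = -0.5 * ((obs - lst_amplitude(grid)) / sigma_obs) ** 2
post = np.exp(loglik - loglik.max())
post /= post.sum()
mu_hat = (grid * post).sum()                               # posterior mean
mu_std = np.sqrt(((grid - mu_hat) ** 2 * post).sum())      # posterior std
print(f"admittance = {mu_hat:.0f} +/- {mu_std:.0f}")
```

The posterior mean plus/minus one standard deviation mirrors the "value ± uncertainty" form in which the retrieved admittances are reported.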
NASA Astrophysics Data System (ADS)
Hardebol, N. J.; Maier, C.; Nick, H.; Geiger, S.; Bertotti, G.; Boro, H.
2015-12-01
A fracture network arrangement is quantified across an isolated carbonate platform from outcrop and aerial imagery to address its impact on fluid flow. The network is described in terms of fracture density, orientation, and length distribution parameters. Of particular interest is the role of fracture cross connections and abutments in the effective permeability. Hence, the flow simulations explicitly account for network topology by adopting a discrete-fracture-and-matrix description. The interior of the Latemar carbonate platform (Dolomites, Italy) is taken as an outcrop analogue for subsurface reservoirs of isolated carbonate build-ups that exhibit a fracture-dominated permeability. Novel here is our dual strategy of describing the fracture network with both deterministic and stochastic inputs for flow simulations. The fracture geometries are captured explicitly and form a multiscale data set through the integration of interpretations from outcrops, airborne imagery, and lidar. The deterministic network descriptions form the basis for descriptive rules that are diagnostic of the complex natural fracture arrangement. The fracture networks exhibit a variable degree of multitier hierarchies, with smaller-sized fractures abutting against larger fractures at both right and oblique angles. The influence of network topology on connectivity is quantified using discrete-fracture single-phase fluid flow simulations. The simulation results show that the effective permeability of the fracture and matrix ensemble can be 50 to 400 times higher than the matrix permeability of 1.0 · 10^-14 m^2. The permeability enhancement is strongly controlled by the connectivity of the fracture network. Therefore, the degree of intersecting and abutting fractures should be captured accurately from outcrops to be of value as an analogue.
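Network connectivity, the property highlighted above as controlling the permeability enhancement, can be quantified by clustering mutually intersecting fracture segments. The sketch below handles proper crossings only (touching abutments would need extra tolerance logic) and is illustrative, not the study's workflow:

```python
def segments_intersect(p1, p2, p3, p4):
    """True if 2-D segments p1-p2 and p3-p4 properly cross
    (endpoint-touching abutments are omitted for brevity)."""
    def orient(a, b, c):
        v = (b[0]-a[0])*(c[1]-a[1]) - (b[1]-a[1])*(c[0]-a[0])
        return (v > 1e-12) - (v < -1e-12)
    d1, d2 = orient(p3, p4, p1), orient(p3, p4, p2)
    d3, d4 = orient(p1, p2, p3), orient(p1, p2, p4)
    return d1 != d2 and d3 != d4

def cluster_fractures(segs):
    """Union-find over fracture segments; returns a cluster label per segment."""
    parent = list(range(len(segs)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]; i = parent[i]
        return i
    for i in range(len(segs)):
        for j in range(i + 1, len(segs)):
            if segments_intersect(*segs[i], *segs[j]):
                parent[find(i)] = find(j)
    return [find(i) for i in range(len(segs))]

# Three fractures: two crossing, one isolated -> two clusters
segs = [((0, 0), (2, 2)), ((0, 2), (2, 0)), ((5, 5), (6, 6))]
labels = cluster_fractures(segs)
print(labels[0] == labels[1], labels[0] != labels[2])  # True True
```

The size of the largest cluster relative to the domain is a simple proxy for whether the network percolates and can dominate the effective permeability.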
Long-term performance of passive materials for removal of ozone from indoor air.
Cros, C J; Morrison, G C; Siegel, J A; Corsi, R L
2012-02-01
The health effects associated with exposure to ozone range from respiratory irritation to increased mortality. In this paper, we explore the use of three green building materials and an activated carbon (AC) mat that remove ozone from indoor air. We studied the effects of long-term exposure of these materials to real environments on ozone removal capability and pre- and post-ozonation emissions. A field study was completed over a 6-month period, and laboratory testing was intermittently conducted on material samples retrieved from the field. The results show sustained ozone removal for all materials except recycled carpet, with the greatest ozone deposition velocity for the AC mat (2.5-3.8 m/h) and perlite-based ceiling tile (2.2-3.2 m/h). Carbonyl emission rates were low for AC across all field sites. Painted gypsum wallboard and perlite-based ceiling tile had similar overall emission rates over the 6-month period, while carpet had large initial emission rates of undesirable by-products that decayed rapidly but remained high compared with other materials. This study confirms that AC mats and perlite-based ceiling tile are viable surfaces for inclusion in buildings to remove ozone without generating undesirable by-products. PRACTICAL IMPLICATIONS: The use of passive removal materials for ozone control could decrease the need for, or even render unnecessary, active but energy-consuming control solutions. In buildings where ozone should be controlled (high outdoor ozone concentrations, sensitive populations), materials specifically designed or selected for removing ozone could be implemented, as long as ozone removal is not associated with large emissions of harmful by-products. We find that activated carbon mats and perlite-based ceiling tiles can provide substantial, long-lasting, ozone control. © 2011 John Wiley & Sons A/S.
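Deposition velocities like those reported above are commonly reduced from chamber decay experiments; a sketch under the usual first-order-loss assumption (the protocol and numbers here are illustrative, not the paper's):

```python
import numpy as np

def deposition_velocity(times_h, conc, volume_m3, area_m2, bg_loss_per_h=0.0):
    """Ozone deposition velocity (m/h) from a chamber decay experiment.
    Fits first-order decay C(t) = C0 exp(-lam t); vd = (lam - bg) * V / A.
    (Illustrative reduction scheme, not the paper's exact protocol.)"""
    lam = -np.polyfit(times_h, np.log(conc), 1)[0]   # decay rate, 1/h
    return (lam - bg_loss_per_h) * volume_m3 / area_m2

# Synthetic decay over a test sample: true vd = 3.0 m/h with A/V = 1 m^-1
t = np.linspace(0, 2, 20)                  # hours
c = 80.0 * np.exp(-3.0 * t)                # ppb, loss rate 3/h
print(round(deposition_velocity(t, c, volume_m3=1.0, area_m2=1.0), 2))  # 3.0
```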
Total probabilities of ensemble runoff forecasts
NASA Astrophysics Data System (ADS)
Olav Skøien, Jon; Bogner, Konrad; Salamon, Peter; Smith, Paul; Pappenberger, Florian
2016-04-01
Ensemble forecasting has long been used in meteorological modelling to indicate the uncertainty of the forecasts. However, as the ensembles often exhibit both bias and dispersion errors, it is necessary to calibrate and post-process them. Two of the most common methods for this are Bayesian Model Averaging (Raftery et al., 2005) and Ensemble Model Output Statistics (EMOS) (Gneiting et al., 2005). There are also methods for regionalizing these methods (Berrocal et al., 2007) and for incorporating the correlation between lead times (Hemri et al., 2013). Engeland and Steinsland (2014) developed a framework which can estimate post-processing parameters that differ in space and time but still give a spatially and temporally consistent output. However, their method is computationally demanding for our large number of stations, and cannot directly be regionalized in the way we would like, so we suggest a different path below. The target of our work is to create a mean forecast with uncertainty bounds for a large number of locations in the framework of the European Flood Awareness System (EFAS - http://www.efas.eu). We are therefore more interested in improving the forecast skill for high flows than the forecast skill at lower runoff levels. EFAS uses a combination of ensemble forecasts and deterministic forecasts from different forecasters to force a distributed hydrologic model and to compute runoff ensembles for each river pixel within the model domain. Instead of showing the mean and the variability of each forecast ensemble individually, we now post-process all model outputs to find a total probability, the post-processed mean and the uncertainty of all ensembles. The post-processing parameters are first calibrated for each calibration location, while assuring that they have some spatial correlation, by adding a spatial penalty in the calibration process.
This can in some cases have a slight negative impact on the calibration error, but makes it easier to interpolate the post-processing parameters to uncalibrated locations. We also look into different methods for handling the non-normal distributions of runoff data and the effect of different data transformations on forecast skill in general and for floods in particular. Berrocal, V. J., Raftery, A. E. and Gneiting, T.: Combining Spatial Statistical and Ensemble Information in Probabilistic Weather Forecasts, Mon. Weather Rev., 135(4), 1386-1402, doi:10.1175/MWR3341.1, 2007. Engeland, K. and Steinsland, I.: Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times, Water Resour. Res., 50(1), 182-197, doi:10.1002/2012WR012757, 2014. Gneiting, T., Raftery, A. E., Westveld, A. H. and Goldman, T.: Calibrated Probabilistic Forecasting Using Ensemble Model Output Statistics and Minimum CRPS Estimation, Mon. Weather Rev., 133(5), 1098-1118, doi:10.1175/MWR2904.1, 2005. Hemri, S., Fundel, F. and Zappa, M.: Simultaneous calibration of ensemble river flow predictions over an entire range of lead times, Water Resour. Res., 49(10), 6744-6755, doi:10.1002/wrcr.20542, 2013. Raftery, A. E., Gneiting, T., Balabdaoui, F. and Polakowski, M.: Using Bayesian Model Averaging to Calibrate Forecast Ensembles, Mon. Weather Rev., 133(5), 1155-1174, doi:10.1175/MWR2906.1, 2005.
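The spatially penalized calibration described above can be sketched with a deliberately simple per-station multiplicative bias factor in place of full EMOS (all names and numbers are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)

def calibrate_with_spatial_penalty(fcst, obs, neighbours, weight):
    """Per-station bias factors a_i minimizing
    sum_it (obs - a_i * fcst)^2 + weight * sum_(i~j) (a_i - a_j)^2.
    The quadratic penalty pulls neighbouring stations' parameters together,
    easing interpolation to uncalibrated sites (illustrative sketch)."""
    n = fcst.shape[0]
    A = np.zeros((n, n)); b = np.zeros(n)
    for i in range(n):                       # data-fit terms (normal equations)
        A[i, i] = (fcst[i] ** 2).sum()
        b[i] = (fcst[i] * obs[i]).sum()
    for i, j in neighbours:                  # spatial smoothness terms
        A[i, i] += weight; A[j, j] += weight
        A[i, j] -= weight; A[j, i] -= weight
    return np.linalg.solve(A, b)

# 3 stations on a line; true bias factors 1.2, 1.2, 0.7
truth = np.array([1.2, 1.2, 0.7])
fcst = rng.uniform(1, 10, size=(3, 200))
obs = truth[:, None] * fcst + 0.1 * rng.standard_normal((3, 200))
a0 = calibrate_with_spatial_penalty(fcst, obs, [(0, 1), (1, 2)], weight=0.0)
a9 = calibrate_with_spatial_penalty(fcst, obs, [(0, 1), (1, 2)], weight=1e4)
print(np.round(a0, 2), np.round(a9, 2))  # the penalty smooths the factors
```

With zero weight the fit is station-by-station least squares; a large weight trades a little calibration error for spatial smoothness, the compromise described in the abstract.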
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-24
... Promulgation of Implementation Plans; Georgia; Atlanta; Ozone 2002 Base Year Emissions Inventory AGENCY: Environmental Protection Agency (EPA). ACTION: Proposed rule. SUMMARY: EPA is proposing to approve the ozone... (hereafter referred to as ``the Atlanta Area'' or ``Area''), ozone attainment demonstration that was...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-18
... 1997 8-Hour Ozone Nonattainment Area; Ozone 2002 Base Year Emissions Inventory AGENCY: Environmental... ozone 2002 base year emissions inventory portion of the state implementation plan (SIP) revision.... The emissions inventory is included in the ozone attainment demonstration that was submitted for the...
Ensemble Streamflow Prediction in Korea: Past and Future 5 Years
NASA Astrophysics Data System (ADS)
Jeong, D.; Kim, Y.; Lee, J.
2005-05-01
The Ensemble Streamflow Prediction (ESP) approach was first introduced in 2000 by the Hydrology Research Group (HRG) at Seoul National University as an alternative probabilistic forecasting technique for improving the 'Water Supply Outlook' that is issued every month by the Ministry of Construction and Transportation in Korea. That study motivated the Korea Water Resources Corporation (KOWACO) to establish a seasonal probabilistic forecasting system for the 5 major river basins using the ESP approach. In cooperation with the HRG, KOWACO developed monthly optimal multi-reservoir operating systems for the Geum river basin in 2004, which coupled the ESP forecasts with an optimization model using sampling stochastic dynamic programming (SSDP). User interfaces for both ESP and SSDP were also designed to make the developed computer systems more practical. Projects extending ESP systems to the other 3 major river basins (i.e., the Nakdong, Han and Seomjin river basins) were also completed by the HRG and KOWACO at the end of December 2004. The ESP system has therefore become the most important mid- and long-term streamflow forecasting technique in Korea. Beyond these practical aspects, recent research experience with ESP has raised some questions about ways of improving its accuracy in Korea. Jeong and Kim (2002) performed an error analysis on its resulting probabilistic forecasts and found that the modeling error is dominant in the dry season, while the meteorological error is dominant in the flood season. To address the first issue, Kim et al. (2004) tested various combinations and/or combining techniques and showed that the ESP probabilistic accuracy could be improved considerably during the dry season when the hydrologic models were combined and/or corrected. In addition, an attempt was also made to improve the ESP accuracy for the flood season using climate forecast information.
This ongoing project handles three types of climate forecast information: (1) the Monthly Industrial Meteorology Information Magazine (MIMIM) of the Korea Meteorological Administration, (2) the Global Data Assimilation Prediction System (GDAPS), and (3) the US National Centers for Environmental Prediction (NCEP). Each of these forecasts is issued in a unique format: (1) MIMIM is a most-probable-event forecast, (2) GDAPS is a single series of deterministic forecasts, and (3) NCEP is an ensemble of deterministic forecasts. Other minor issues include how long the initial conditions influence the ESP accuracy, and how many ESP scenarios are needed to obtain the best accuracy. This presentation also addresses some future research that is needed for ESP in Korea.
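The ESP mechanics, one hydrologic simulation from the current initial state per historical meteorological trace, can be sketched with a toy bucket model (the model and all numbers are illustrative stand-ins for the operational systems described above):

```python
import numpy as np

rng = np.random.default_rng(5)

def bucket_model(storage, precip, k=0.1):
    """Minimal monthly bucket model: runoff = k * storage (illustrative)."""
    flows = []
    for p in precip:
        storage += p          # add this month's precipitation
        q = k * storage       # linear-reservoir release
        storage -= q
        flows.append(q)
    return np.array(flows)

# Historical monthly precipitation traces (e.g., 30 past years, 6-month window)
historical = rng.gamma(2.0, 50.0, size=(30, 6))

# ESP: the same current initial storage, one forecast per historical trace
initial_storage = 400.0
ensemble = np.array([bucket_model(initial_storage, tr) for tr in historical])

# Probabilistic outlook: e.g., 10th/50th/90th percentiles of month-3 flow
q10, q50, q90 = np.percentile(ensemble[:, 2], [10, 50, 90])
print(f"month-3 flow: P10={q10:.0f}  median={q50:.0f}  P90={q90:.0f}")
```

This structure also makes the error decomposition in the abstract intuitive: the shared initial state carries the modeling error, while the spread across traces reflects the meteorological error.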
NASA Technical Reports Server (NTRS)
Ott, Lesley E.; Pickering, Kenneth E.; Stenchikov, Georgiy L.; Huntrieser, Heidi; Schumann, Ulrich
2006-01-01
The July 21, 1998 thunderstorm observed during the European Lightning Nitrogen Oxides Project (EULINOX) was simulated using the three-dimensional Goddard Cumulus Ensemble (GCE) model. The simulation successfully reproduced a number of observed storm features, including the splitting of the original cell into a southern cell which developed supercell characteristics and a northern cell which became multicellular. Output from the GCE simulation was used to drive an offline cloud-scale chemical transport model which calculates tracer transport and includes a parameterization of lightning NO(x) production which uses observed flash rates as input. Estimates of lightning NO(x) production were deduced by assuming various values of production per intracloud and per cloud-to-ground flash and comparing the results with in-cloud aircraft observations. The assumption that both types of flashes produce 360 moles of NO per flash on average compared most favorably with column mass and probability distribution functions calculated from observations. This assumed production per flash corresponds to a global annual lightning NO(x) source of 7 Tg N per year. Chemical reactions were included in the model to evaluate the impact of lightning NO(x) on ozone. During the storm, the inclusion of lightning NO(x) in the model results in a small loss of ozone (on average less than 4 ppbv) at all model levels. Simulations of the chemical environment in the 24 hours following the storm show on average a small increase in the net production of ozone at most levels resulting from lightning NO(x), maximizing at approximately 5 ppbv per day at 5.5 km. Between 8 and 10.5 km, lightning NO(x) causes decreased net ozone production.
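The global extrapolation quoted above can be checked with back-of-envelope arithmetic, assuming a global mean flash rate of about 44 flashes per second (an OTD/LIS-era figure not stated in the abstract):

```python
# Back-of-envelope check of the global source implied by 360 mol NO per flash
MOLES_NO_PER_FLASH = 360.0       # from the EULINOX-constrained estimate
GLOBAL_FLASH_RATE = 44.0         # flashes per second (assumed OTD/LIS value)
N_MOLAR_MASS = 14.0              # g of N per mole of NO
SECONDS_PER_YEAR = 3.156e7

grams_n = MOLES_NO_PER_FLASH * N_MOLAR_MASS * GLOBAL_FLASH_RATE * SECONDS_PER_YEAR
print(f"{grams_n / 1e12:.1f} Tg N/yr")   # 7.0 Tg N/yr, matching the abstract
```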
NASA Technical Reports Server (NTRS)
Thompson, Anne M.; Miller, Sonya K.; Tilmes, Simone; Kollonige, Debra W.; Witte, Jacquelyn C.; Oltmans, Samuel J.; Johnson, Brian J.; Fujiwara, Masatomo; Schmidlin, F. J.; Coetzee, G. J. R.;
2012-01-01
We present a regional and seasonal climatology of SHADOZ ozone profiles in the troposphere and tropical tropopause layer (TTL) based on measurements taken during the first five years of Aura, 2005-2009, when new stations joined the network at Hanoi, Vietnam; Hilo, Hawaii; Alajuela/Heredia, Costa Rica; Cotonou, Benin. In all, 15 stations operated during that period. A west-to-east progression of decreasing convective influence and increasing pollution leads to distinct tropospheric ozone profiles in three regions: (1) western Pacific/eastern Indian Ocean; (2) equatorial Americas (San Cristobal, Alajuela, Paramaribo); (3) Atlantic and Africa. Comparisons in total ozone column from soundings, the Ozone Monitoring Instrument (OMI, on Aura, 2004-) satellite and ground-based instrumentation are presented. Most stations show better agreement with OMI than they did for EP/TOMS comparisons (1998-2004; Earth-Probe/Total Ozone Mapping Spectrometer), partly due to a revised above-burst ozone climatology. Possible station biases in the stratospheric segment of the ozone measurement noted in the first 7 years of SHADOZ ozone profiles are re-examined. High stratospheric bias observed during the TOMS period appears to persist at one station. Comparisons of SHADOZ tropospheric ozone and the daily Trajectory-enhanced Tropospheric Ozone Residual (TTOR) product (based on OMI/MLS) show that the satellite-derived column amount averages 25% low. Correlations between TTOR and the SHADOZ sondes are quite good (typical r2 = 0.5-0.8), however, which may account for why some published residual-based OMI products capture tropospheric interannual variability fairly realistically. On the other hand, no clear explanations emerge for why TTOR-sonde discrepancies vary over a wide range at most SHADOZ sites.
NASA Astrophysics Data System (ADS)
Maher, Nicola; Marotzke, Jochem
2017-04-01
Natural climate variability is found in observations, paleo-proxies, and climate models. Such climate variability can be intrinsic internal variability or externally forced, for example by changes in greenhouse gases or large volcanic eruptions. There are still open questions concerning how external forcing, both natural (e.g., volcanic eruptions and solar variability) and anthropogenic (e.g., greenhouse gases and ozone), may excite interannual modes of variability in the climate system. This project aims to address some of these questions, utilising the large ensemble of the MPI-ESM-LR climate model. In this study we investigate the statistics of four modes of interannual variability, namely the North Atlantic Oscillation (NAO), the Indian Ocean Dipole (IOD), the Southern Annular Mode (SAM) and the El Niño Southern Oscillation (ENSO). Using the 100-member ensemble of MPI-ESM-LR, the statistical properties of these modes (amplitude and standard deviation) can be assessed over time. Here we compare these properties in the pre-industrial control run, historical run and future scenarios (RCP4.5, RCP2.6) and present preliminary results.
Creating Weather System Ensembles Through Synergistic Process Modeling and Machine Learning
NASA Astrophysics Data System (ADS)
Chen, B.; Posselt, D. J.; Nguyen, H.; Wu, L.; Su, H.; Braverman, A. J.
2017-12-01
Earth's weather and climate are sensitive to a variety of control factors (e.g., initial state and forcing functions). Characterizing the response of the atmosphere to a change in initial conditions or model forcing is critical for weather forecasting (ensemble prediction) and climate change assessment. Input-response relationships can be quantified by generating an ensemble of multiple (100s to 1000s) realistic realizations of weather and climate states. Atmospheric numerical models generate simulated data through discretized numerical approximation of the partial differential equations (PDEs) governing the underlying physics. However, the computational expense of running high resolution atmospheric state models makes generation of more than a few simulations infeasible. Here, we discuss an experiment wherein we approximate the numerical PDE solver within the Weather Research and Forecasting (WRF) Model using neural networks trained on a subset of model run outputs. Once trained, these neural nets can produce large numbers of realizations of weather states from a small number of deterministic simulations, with speeds that are orders of magnitude faster than the underlying PDE solver. Our neural network architecture is inspired by the governing partial differential equations. These equations are location-invariant and consist of first and second derivatives. As such, we use a 3x3 lon-lat grid of atmospheric profiles as the predictor in the neural net, providing the network the information necessary to compute the first and second spatial derivatives. Results indicate that the neural network algorithm can approximate the PDE outputs with a high degree of accuracy (less than 1% error), and that this error increases as a function of the prediction time lag.
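The stencil-based surrogate described in this abstract can be sketched in plain Python. Everything below (the one-hidden-layer network, its random weights, and the toy 5x5 field) is a hypothetical illustration of the 3x3-neighbourhood idea, not the actual WRF emulator:

```python
import math
import random

random.seed(0)

def stencil(grid, i, j):
    """Flatten the 3x3 neighbourhood of cell (i, j) into a feature vector.
    The neighbourhood carries the information needed for first- and
    second-order spatial derivatives, mirroring the PDE terms."""
    return [grid[i + di][j + dj] for di in (-1, 0, 1) for dj in (-1, 0, 1)]

class TinySurrogate:
    """One-hidden-layer network mapping a 3x3 stencil to the cell's next state.
    Hypothetical stand-in for the neural-net PDE emulator in the abstract."""
    def __init__(self, n_in=9, n_hid=8):
        self.w1 = [[random.gauss(0, 0.3) for _ in range(n_in)]
                   for _ in range(n_hid)]
        self.w2 = [random.gauss(0, 0.3) for _ in range(n_hid)]

    def predict(self, x):
        h = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in self.w1]
        return sum(w * hi for w, hi in zip(self.w2, h))

# Advance every interior cell of a toy 2-D field by one "time step".
grid = [[math.sin(0.5 * i) * math.cos(0.5 * j) for j in range(5)]
        for i in range(5)]
net = TinySurrogate()
nxt = [[net.predict(stencil(grid, i, j)) for j in range(1, 4)]
       for i in range(1, 4)]
print(len(nxt), len(nxt[0]))  # 3 3 : interior cells only
```

Because the same network is applied at every grid point, the surrogate inherits the location-invariance of the governing equations; in practice such a network would be trained on WRF run outputs rather than initialized randomly.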
NASA Technical Reports Server (NTRS)
Young, Sun-Woo; Carmichael, Gregory R.
1994-01-01
Tropospheric ozone production and transport in mid-latitude eastern Asia is studied. Data analysis of surface-based ozone measurements in Japan and satellite-based tropospheric column measurements of the entire western Pacific Rim are combined with results from three-dimensional model simulations to investigate the diurnal, seasonal and long-term variations of ozone in this region. Surface ozone measurements from Japan show distinct seasonal variation with a spring peak and summer minimum. Satellite studies of the entire tropospheric column of ozone show high concentrations in both the spring and summer seasons. Finally, preliminary model simulation studies show good agreement with observed values.
ATLAS: Airborne Tunable Laser Absorption Spectrometer for stratospheric trace gas measurements
NASA Technical Reports Server (NTRS)
Loewenstein, Max; Podolske, James R.; Strahan, Susan E.
1990-01-01
The ATLAS instrument is an advanced technology diode laser based absorption spectrometer designed specifically for stratospheric tracer studies. This technique was used in the acquisition of N2O tracer data sets on the Airborne Antarctic Ozone Experiment and the Airborne Arctic Stratospheric Expedition. These data sets have proved valuable for comparison with atmospheric models, as well as in assisting in the interpretation of the entire ensemble of chemical and meteorological data acquired on these two field studies. The N2O dynamical tracer data set analysis yielded several findings concerning the polar atmosphere: the N2O/NO(y) correlation, which is used as a tool to study denitrification in the polar vortex; the N2O Southern Hemisphere morphology, showing subsidence in the winter polar vortex; and the value of the N2O measurements in the interpretation of ClO, O3, and NO(y) measurements and of the derived dynamical tracer, potential vorticity. Field studies also led to improved characterization of the instrument and to improved accuracy.
NASA Technical Reports Server (NTRS)
1985-01-01
Topics addressed include: assessment models; model predictions of ozone changes; ozone and temperature trends; trace gas effects on climate; kinetics and photochemical data base; spectroscopic data base (infrared to microwave); instrument intercomparisons and assessments; and monthly mean distribution of ozone and temperature.
NASA Astrophysics Data System (ADS)
Zhang, Shupeng; Yi, Xue; Zheng, Xiaogu; Chen, Zhuoqi; Dan, Bo; Zhang, Xuanze
2014-11-01
In this paper, a global carbon assimilation system (GCAS) is developed for optimizing the global land surface carbon flux at 1° resolution using multiple ecosystem models. In GCAS, three ecosystem models, Boreal Ecosystem Productivity Simulator, Carnegie-Ames-Stanford Approach, and Community Atmosphere Biosphere Land Exchange, produce the prior fluxes, and an atmospheric transport model, Model for OZone And Related chemical Tracers, is used to calculate atmospheric CO2 concentrations resulting from these prior fluxes. A local ensemble Kalman filter is developed to assimilate atmospheric CO2 data observed at 92 stations to optimize the carbon flux for six land regions, and the Bayesian model averaging method is implemented in GCAS to calculate the weighted average of the optimized fluxes based on individual ecosystem models. The weights for the models are found according to the closeness of their forecasted CO2 concentration to observation. Results of this study show that the model weights vary in time and space, allowing for an optimum utilization of different strengths of different ecosystem models. It is also demonstrated that spatial localization is an effective technique to avoid spurious optimization results for regions that are not well constrained by the atmospheric data. Based on the multimodel optimized flux from GCAS, we found that the average global terrestrial carbon sink over the 2002-2008 period is 2.97 ± 1.1 PgC yr-1, and the sinks are 0.88 ± 0.52, 0.27 ± 0.33, 0.67 ± 0.39, 0.90 ± 0.68, 0.21 ± 0.31, and 0.04 ± 0.08 PgC yr-1 for North America, South America, Africa, Eurasia, Tropical Asia, and Australia, respectively. This multimodel GCAS can be used to improve global carbon cycle estimation.
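The Bayesian model averaging step described above (weights set by the closeness of each model's forecast CO2 to observation) can be sketched as follows. The Gaussian likelihood form, the sigma value, and all numbers are illustrative assumptions, not the GCAS implementation:

```python
import math

def bma_weights(forecasts, obs, sigma=1.0):
    """Weight each model by the Gaussian likelihood of its forecast CO2
    against the observation; weights are normalised to sum to 1."""
    like = [math.exp(-0.5 * ((f - obs) / sigma) ** 2) for f in forecasts]
    total = sum(like)
    return [l / total for l in like]

def weighted_flux(fluxes, weights):
    """Weighted average of the per-model optimized fluxes."""
    return sum(f * w for f, w in zip(fluxes, weights))

# Three hypothetical ecosystem-model CO2 forecasts (ppm) at one station.
forecasts = [396.2, 398.9, 397.4]
obs = 397.0
w = bma_weights(forecasts, obs)
# The model whose forecast is closest to the observation (the third one
# here) receives the largest weight.
flux = weighted_flux([1.2, 0.8, 1.0], w)  # prior fluxes, arbitrary units
print([round(x, 3) for x in w], round(flux, 3))
```

In the full system the weights would be recomputed per region and per assimilation window, which is what lets them vary in time and space as the abstract reports.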
NASA Technical Reports Server (NTRS)
Newchurch, Mike; Johnson, Matthew S.; Huang, Guanyu; Kuang, Shi; Wang, Lihua; Chance, Kelly; Liu, Xiong
2016-01-01
Laminar ozone structure is a ubiquitous feature of tropospheric-ozone distributions resulting from dynamic and chemical atmospheric processes. Understanding the characteristics of these ozone laminae and the mechanisms responsible for producing them is important to outline the transport pathways of trace gases and to quantify the impact of different sources on tropospheric background ozone. In this study, we present a new method to detect ozone laminae in order to understand the climatological characteristics of their occurrence frequency in terms of thickness and altitude. We employ both ground-based and airborne ozone lidar measurements, together with other synergistic observations and modeling, to investigate the sources and mechanisms, such as biomass burning transport, stratospheric intrusion, lightning-generated NOx, and nocturnal low-level jets, that are responsible for depleted or enhanced tropospheric ozone layers. Spaceborne measurements of these laminae (e.g., OMI (Ozone Monitoring Instrument), TROPOMI (Tropospheric Monitoring Instrument), TEMPO (Tropospheric Emissions: Monitoring of Pollution)) will capture a greater horizontal extent, but at a lower vertical resolution, than balloon-borne or lidar measurements. Using integrated ground-based, airborne, and spaceborne observations in a modeling framework affords insight into both the vertical and horizontal evolution of these ubiquitous ozone laminae.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-23
... moderate 8-hour ozone nonattainment area? In 1997, EPA revised the health-based NAAQS for ozone, setting it... standard based on scientific evidence demonstrating that ozone causes adverse health effects at lower ozone... was set. EPA determined that the 1997 8-hour standard would be more protective of human health...
Using Ozone To Clean and Passivate Oxygen-Handling Hardware
NASA Technical Reports Server (NTRS)
Torrance, Paul; Biesinger, Paul
2009-01-01
A proposed method of cleaning, passivating, and verifying the cleanliness of oxygen-handling hardware would extend the established art of cleaning by use of ozone. As used here, "cleaning" signifies ridding all exposed surfaces of combustible (in particular, carbon-based) contaminants. The method calls for exposing the surfaces of the hardware to ozone while monitoring the ozone effluent for carbon dioxide. The ozone would passivate the hardware while oxidizing carbon-based residues, converting the carbon in them to carbon dioxide. The exposure to ozone would be continued until no more carbon dioxide was detected, signifying that cleaning and passivation were complete.
Structural Deterministic Safety Factors Selection Criteria and Verification
NASA Technical Reports Server (NTRS)
Verderaime, V.
1992-01-01
Though current deterministic safety factors are arbitrarily and unaccountably specified, their ratio is rooted in resistive and applied stress probability distributions. This study approached the deterministic method from a probabilistic concept, leading to a more systematic and coherent philosophy and criterion for designing more uniform and reliable high-performance structures. The deterministic method was noted to consist of three safety factors: a standard deviation multiplier of the applied stress distribution; a K-factor for the A- or B-basis material ultimate stress; and the conventional safety factor to ensure that the applied stress does not operate in the inelastic zone of metallic materials. The conventional safety factor is specifically defined as the ratio of ultimate-to-yield stresses. A deterministic safety index of the combined safety factors was derived, from which the corresponding reliability showed that the deterministic method is not reliability-sensitive. The bases for selecting safety factors are presented and verification requirements are discussed. The suggested deterministic approach is applicable to all NASA, DOD, and commercial high-performance structures under static stresses.
The characterization of an air pollution episode using satellite total ozone measurements
NASA Technical Reports Server (NTRS)
Fishman, Jack; Shipham, Mark C.; Vukovich, Fred M.; Cahoon, Donald R.
1987-01-01
A case study is presented which demonstrates that measurements of total ozone from a space-based platform can be used to study a widespread air pollution episode over the southeastern U.S. In particular, the synoptic-scale distribution of surface-level ozone obtained from an independent analysis of ground-based monitoring stations appears to be captured by the synoptic-scale distribution of total ozone, even though about 90 percent of the total ozone is in the stratosphere. Additional analyses of upper air meteorological data, other satellite imagery, and in situ aircraft measurements of ozone likewise support the fact that synoptic-scale variability of tropospheric ozone is primarily responsible for the observed variability in total ozone under certain conditions. The use of the type of analysis discussed in this study may provide an important technique for understanding the global budget of tropospheric ozone.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meadows, J.R.
The ozone-induced degradation rates of various purine bases, hydroxylated purine compounds, pyrimidine bases, and uric acid were compared. Of the compounds examined, uric acid was the one most readily degraded, while the parent compounds, purine and pyrimidine, were the ones most resistant to ozonation. When the breakdown of hydroxylated purines was studied, it was determined that the more OH substituents on the purine, the more readily it was degraded. Because of the preferential attack by ozone on uric acid in solutions containing a nucleic acid base plus uric acid, the presence of the uric acid had a sparing effect on the base. This effect was readily apparent for guanine, thymine, and uracil, which were the bases more labile to ozone. Two of the ozonation products of uric acid were identified as allantoin and urea. Ozonation of bovine and swine erythrocyte suspensions resulted in oxidation of oxyhemoglobin to methemoglobin, formation of thiobarbituric acid-reactive materials (a measure of lipid oxidation), and lysis of the red cells. Each of these changes was inhibited by the presence of uric acid in the solution during ozonation.
Genetic programming based ensemble system for microarray data classification.
Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To
2015-01-01
Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a genetic programming (GP) based new ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and they become more and more accurate in the evolutionary process. The feature selection technique and balanced subsampling technique are applied to increase the diversity in each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary class and six multiclass microarray datasets, and results show that the algorithm can achieve better results in most cases compared with some other ensemble systems. By using elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
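The three GPES combination operators named in this abstract (Min, Max, and Average over base-classifier outputs) can be sketched directly. The decision-tree class probabilities below are invented, and the renormalisation step is an assumption for illustration rather than the paper's exact scheme:

```python
# Combine base-classifier class probabilities with the three GPES-style
# operators: Min, Max, and Average.
OPERATORS = {
    "Min": lambda cols: [min(c) for c in cols],
    "Max": lambda cols: [max(c) for c in cols],
    "Average": lambda cols: [sum(c) / len(c) for c in cols],
}

def combine(probabilities, op):
    """probabilities: one [p_class0, p_class1, ...] list per base classifier.
    Applies the chosen operator per class, then renormalises to sum to 1."""
    cols = list(zip(*probabilities))   # per-class columns
    fused = OPERATORS[op](cols)
    total = sum(fused)
    return [p / total for p in fused]

def predict(probabilities, op="Average"):
    """Predicted class index under the chosen combination operator."""
    fused = combine(probabilities, op)
    return fused.index(max(fused))

# Three hypothetical decision-tree outputs for a 2-class problem; the
# operators can genuinely disagree on the same inputs:
trees = [[0.05, 0.95], [0.9, 0.1], [0.85, 0.15]]
print(predict(trees, "Average"), predict(trees, "Min"), predict(trees, "Max"))
# 0 1 1
```

In GPES each GP individual is a whole tree of such operator nodes over base classifiers, evolved for accuracy; the sketch shows only a single combination step.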
Minimalist ensemble algorithms for genome-wide protein localization prediction.
Lin, Jhih-Rong; Mondal, Ananda Mohan; Liu, Rong; Hu, Jianjun
2012-07-03
Computational prediction of protein subcellular localization can greatly help to elucidate its functions. Despite the existence of dozens of protein localization prediction algorithms, the prediction accuracy and coverage are still low. Several ensemble algorithms have been proposed to improve the prediction performance, which usually include as many as 10 or more individual localization algorithms. However, their performance is still limited by the running complexity and redundancy among individual prediction algorithms. This paper proposed a novel method for rational design of minimalist ensemble algorithms for practical genome-wide protein subcellular localization prediction. The algorithm is based on combining a feature selection based filter and a logistic regression classifier. Using a novel concept of contribution scores, we analyzed issues of algorithm redundancy, consensus mistakes, and algorithm complementarity in designing ensemble algorithms. We applied the proposed minimalist logistic regression (LR) ensemble algorithm to two genome-wide datasets of Yeast and Human and compared its performance with current ensemble algorithms. Experimental results showed that the minimalist ensemble algorithm can achieve high prediction accuracy with only 1/3 to 1/2 of individual predictors of current ensemble algorithms, which greatly reduces computational complexity and running time. It was found that the high performance ensemble algorithms are usually composed of the predictors that together cover most of available features. Compared to the best individual predictor, our ensemble algorithm improved the prediction accuracy from AUC score of 0.558 to 0.707 for the Yeast dataset and from 0.628 to 0.646 for the Human dataset. Compared with popular weighted voting based ensemble algorithms, our classifier-based ensemble algorithms achieved much better performance without suffering from inclusion of too many individual predictors. 
We proposed a method for rational design of minimalist ensemble algorithms using feature selection and classifiers. The proposed minimalist ensemble algorithm based on logistic regression can achieve equal or better prediction performance while using only half or one-third of individual predictors compared to other ensemble algorithms. The results also suggested that meta-predictors that take advantage of a variety of features by combining individual predictors tend to achieve the best performance. The LR ensemble server and related benchmark datasets are available at http://mleg.cse.sc.edu/LRensemble/cgi-bin/predict.cgi.
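The forward search used above to select the final committee can be sketched as a greedy loop: repeatedly add the individual predictor that most improves ensemble accuracy, and stop when no candidate helps. The score-averaging ensemble rule and the toy data are assumptions for illustration, not the published LR-ensemble design:

```python
def ensemble_predict(scores):
    """Average the selected predictors' per-sample scores; classify 1
    when the mean exceeds 0.5."""
    return [1 if sum(col) / len(col) > 0.5 else 0 for col in zip(*scores)]

def accuracy(labels, truth):
    return sum(p == t for p, t in zip(labels, truth)) / len(truth)

def forward_search(candidates, truth):
    """Greedily add the predictor that most improves committee accuracy;
    stop as soon as no remaining candidate improves it."""
    chosen, best, remaining = [], -1.0, list(range(len(candidates)))
    while remaining:
        acc, j = max(
            (accuracy(ensemble_predict(
                [candidates[i] for i in chosen] + [candidates[j]]), truth), j)
            for j in remaining
        )
        if acc <= best:
            break
        best = acc
        chosen.append(j)
        remaining.remove(j)
    return chosen, best

# Hypothetical per-sample scores from three individual predictors.
truth = [0, 1, 1, 0, 1, 0]
preds = [
    [0.4, 0.9, 0.3, 0.2, 0.8, 0.3],
    [0.3, 0.6, 0.8, 0.7, 0.4, 0.2],
    [0.8, 0.5, 0.9, 0.4, 0.9, 0.6],
]
committee, acc = forward_search(preds, truth)
print(committee, acc)  # [0, 1] 1.0 : a two-member committee suffices here
```

The early-stopping condition is what keeps the ensemble "minimalist": predictors that are redundant with the current committee never get added.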
Field-Testing for Ozone: Analyzing Air Quality in Your Hometown.
ERIC Educational Resources Information Center
Lee, Judy; DeRulle, Joyce
1995-01-01
Describes a project designed to teach students how to measure ground-level ozone and determine ozone concentrations. Enables students to research the effects of ozone exposure and discuss ways to clean up the problem. Includes an activity based on the oxidation capability of ozone. (JRH)
Ensemble-based docking: From hit discovery to metabolism and toxicity predictions.
Evangelista, Wilfredo; Weir, Rebecca L; Ellingson, Sally R; Harris, Jason B; Kapoor, Karan; Smith, Jeremy C; Baudry, Jerome
2016-10-15
This paper describes and illustrates the use of ensemble-based docking, i.e., using a collection of protein structures in docking calculations for hit discovery, the exploration of biochemical pathways and toxicity prediction of drug candidates. We describe the computational engineering work necessary to enable large ensemble docking campaigns on supercomputers. We show examples where ensemble-based docking has significantly increased the number and the diversity of validated drug candidates. Finally, we illustrate how ensemble-based docking can be extended beyond hit discovery and toward providing a structural basis for the prediction of metabolism and off-target binding relevant to pre-clinical and clinical trials. Copyright © 2016 Elsevier Ltd. All rights reserved.
Toward Global Real Time Hydrologic Modeling - An "Open" View From the Trenches
NASA Astrophysics Data System (ADS)
Nelson, J.
2015-12-01
Big Data has become a popular term to describe the exponential growth of data and of the related cyber infrastructure to process it, so that better analysis can be performed and lead to improved decision-making. How are we doing in the hydrologic sciences? As part of a significant collaborative effort that brought together scientists from public, private, and academic organizations, a new transformative hydrologic forecasting modeling infrastructure has been developed. How was it possible to go from deterministic hydrologic forecasts, largely driven through manual interactions, at 3600 stations to automated 15-day ensemble forecasts at 2.67 million stations? Earth observations of precipitation, temperature, moisture, and other atmospheric and land surface conditions form the foundation of global hydrologic forecasts, but this project demonstrates that the critical component needed to harness these resources can be summed up in one word: OPEN. Whether it is open data sources, open software solutions with open standards, or just being open to collaborations and building teams across institutions, disciplines, and international boundaries, time and time again through my involvement in the development of a high-resolution real time global hydrologic forecasting model I have discovered that in every aspect the sum has always been greater than the parts. While much has been accomplished, much more remains to be done; the most important lesson learned has been that the more open we remain and the more we work together, the greater our ability will be to use big data hydrologic modeling resources to solve the world's most vexing water related challenges. This presentation will demonstrate a transformational global real time hydrologic forecasting application based on downscaled ECMWF ensemble forecasts, RAPID routing, and Tethys Platform for cloud computing and visualization, with discussions of the human and cyber infrastructure connections that make it successful and of the needs moving forward.
Value of medium range weather forecasts in the improvement of seasonal hydrologic prediction skill
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shukla, Shraddhanand; Voisin, Nathalie; Lettenmaier, D. P.
2012-08-15
We investigated the contribution of medium range weather forecasts with lead times up to 14 days to seasonal hydrologic prediction skill over the Conterminous United States (CONUS). Three different Ensemble Streamflow Prediction (ESP)-based experiments were performed for the period 1980-2003 using the Variable Infiltration Capacity (VIC) hydrology model to generate forecasts of monthly runoff and soil moisture (SM) at lead-1 (first month of the forecast period) to lead-3. The first experiment (ESP) used a resampling from the retrospective period 1980-2003 and represented full climatological uncertainty for the entire forecast period. In the second and third experiments, the first 14 days of each ESP ensemble member were replaced by either observations (perfect 14-day forecast) or by a deterministic 14-day weather forecast. We used Spearman rank correlations of forecasts and observations as the forecast skill score. We estimated the potential and actual improvement in baseline skill as the difference between the skill of experiments 2 and 3 relative to ESP, respectively. We found that useful runoff and SM forecast skill at lead-1 to -3 months can be obtained by exploiting medium range weather forecast skill in conjunction with the skill derived by the knowledge of initial hydrologic conditions. Potential improvement in baseline skill by using medium range weather forecasts, for runoff (SM) forecasts generally varies from 0 to 0.8 (0 to 0.5) as measured by differences in correlations, with actual improvement generally from 0 to 0.8 of the potential improvement. With some exceptions, most of the improvement in runoff is for lead-1 forecasts, although some improvement in SM was achieved at lead-2.
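The skill score used above, the Spearman rank correlation between forecasts and observations, can be computed with stdlib Python only (it is the Pearson correlation of the ranks, with ties given average ranks). The runoff values below are invented for illustration:

```python
def ranks(values):
    """Ranks starting at 1, with tied values given their average rank."""
    order = sorted(range(len(values)), key=values.__getitem__)
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Forecast vs. observed monthly runoff (hypothetical values).
forecast = [1.0, 2.5, 2.0, 4.1, 3.3]
observed = [0.9, 2.2, 2.4, 4.0, 3.1]
print(round(spearman(forecast, observed), 3))  # 0.9
```

Because it depends only on ranks, this score rewards forecasts that order wet and dry months correctly even when their magnitudes are biased, which is why it suits skill comparison across experiments.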
Transient Macroscopic Chemistry in the DSMC Method
NASA Astrophysics Data System (ADS)
Goldsworthy, M. J.; Macrossan, M. N.; Abdel-Jawad, M.
2008-12-01
In the Direct Simulation Monte Carlo method, a combination of statistical and deterministic procedures applied to a finite number of 'simulator' particles is used to model rarefied gas-kinetic processes. Traditionally, chemical reactions are modelled using information from specific colliding particle pairs. In the Macroscopic Chemistry Method (MCM), the reactions are decoupled from the specific particle pairs selected for collisions. Information from all of the particles within a cell is used to determine a reaction rate coefficient for that cell. MCM has previously been applied to steady flow DSMC simulations. Here we show how MCM can be used to model chemical kinetics in DSMC simulations of unsteady flow. Results are compared with a collision-based chemistry procedure for two binary reactions in a 1-D unsteady shock-expansion tube simulation and during the unsteady development of 2-D flow through a cavity. For the shock tube simulation, close agreement is demonstrated between the two methods for instantaneous, ensemble-averaged profiles of temperature and species mole fractions. For the cavity flow, a high degree of thermal non-equilibrium is present and non-equilibrium reaction rate correction factors are employed in MCM. Very close agreement is demonstrated for ensemble-averaged mole fraction contours predicted by the particle and macroscopic methods at three different flow-times. A comparison of the accumulated number of net reactions per cell shows that both methods compute identical numbers of reaction events. For the 2-D flow, MCM required similar CPU and memory resources to the particle chemistry method. The Macroscopic Chemistry Method is applicable to any general DSMC code using any viscosity or non-reacting collision models and any non-reacting energy exchange models. MCM can be used to implement any reaction rate formulation, whether it be from experimental or theoretical studies.
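The core MCM idea, deriving a cell-level reaction rate from all particles in the cell rather than from individual collision pairs, might be sketched as follows. The Arrhenius constants, the particle fields, and the unit conventions are hypothetical simplifications, not the paper's formulation:

```python
import math
import random

random.seed(1)

def cell_reaction_events(particles, dt, volume, A=1e-10, Ta=5000.0):
    """MCM-style sketch: use all particles in the cell to form a cell
    temperature, evaluate an Arrhenius rate coefficient k(T), and convert
    the expected number of A+B reactions over dt into discrete events
    (integer part deterministically, remainder stochastically)."""
    n = len(particles)
    # Cell temperature from mean translational energy (constants folded in).
    mean_e = sum(p["energy"] for p in particles) / n
    temp = 2.0 * mean_e / 3.0
    k = A * math.exp(-Ta / temp)                 # rate coefficient k(T)
    n_a = sum(p["species"] == "A" for p in particles) / volume
    n_b = sum(p["species"] == "B" for p in particles) / volume
    expected = k * n_a * n_b * volume * dt       # expected reaction events
    events = int(expected)                       # deterministic part
    if random.random() < expected - events:      # stochastic remainder
        events += 1
    return events

# A toy cell of 200 particles with hypothetical species and energies.
cell = [{"species": random.choice("AB"), "energy": random.uniform(500, 1500)}
        for _ in range(200)]
print(cell_reaction_events(cell, dt=1e-6, volume=1e-9))
```

Selected reaction events would then be applied to randomly chosen particle pairs of the right species, which is how the method stays decoupled from the collision-pair selection itself.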
40 CFR 52.2332 - Control Strategy: Ozone.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 5 2014-07-01 2014-07-01 false Control Strategy: Ozone. 52.2332...: Ozone. Determinations—EPA is determining that, as of July 18, 1995, the Salt Lake and Davis Counties ozone nonattainment area has attained the ozone standard based on air quality monitoring data from 1992...
40 CFR 52.2332 - Control Strategy: Ozone.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 5 2012-07-01 2012-07-01 false Control Strategy: Ozone. 52.2332...: Ozone. Determinations—EPA is determining that, as of July 18, 1995, the Salt Lake and Davis Counties ozone nonattainment area has attained the ozone standard based on air quality monitoring data from 1992...
40 CFR 52.2332 - Control Strategy: Ozone.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 5 2013-07-01 2013-07-01 false Control Strategy: Ozone. 52.2332...: Ozone. Determinations—EPA is determining that, as of July 18, 1995, the Salt Lake and Davis Counties ozone nonattainment area has attained the ozone standard based on air quality monitoring data from 1992...
40 CFR 52.2332 - Control Strategy: Ozone.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 4 2010-07-01 2010-07-01 false Control Strategy: Ozone. 52.2332...: Ozone. Determinations—EPA is determining that, as of July 18, 1995, the Salt Lake and Davis Counties ozone nonattainment area has attained the ozone standard based on air quality monitoring data from 1992...
40 CFR 52.2332 - Control Strategy: Ozone.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 4 2011-07-01 2011-07-01 false Control Strategy: Ozone. 52.2332...: Ozone. Determinations—EPA is determining that, as of July 18, 1995, the Salt Lake and Davis Counties ozone nonattainment area has attained the ozone standard based on air quality monitoring data from 1992...
Global distribution of ozone for various seasons
NASA Technical Reports Server (NTRS)
Koprova, L. I.
1979-01-01
A technique which was used to obtain a catalog of the seasonal global distribution of ozone is presented. The technique is based on the simultaneous use of 1964-1975 data on the total ozone content from a worldwide network of ozonometric stations and on the vertical ozone profile from ozone sounding stations.
Future global mortality from changes in air pollution attributable to climate change
Silva, Raquel A.; West, J. Jason; Lamarque, Jean-François; ...
2017-07-31
Ground-level ozone and fine particulate matter (PM2.5) are associated with premature human mortality(1-4); their future concentrations depend on changes in emissions, which dominate the near-term(5), and on climate change(6,7). Previous global studies of the air-quality-related health effects of future climate change(8,9) used single atmospheric models. But, in related studies, mortality results differ among models(10-12). Here we use an ensemble of global chemistry-climate models(13) to show that premature mortality from changes in air pollution attributable to climate change, under the high greenhouse gas scenario RCP8.5 (ref. 14), is probably positive. We estimate 3,340 (-30,300 to 47,100) ozone-related deaths in 2030, relative to 2000 climate, and 43,600 (-195,000 to 237,000) in 2100 (14% of the increase in global ozone-related mortality). For PM2.5, we estimate 55,600 (-34,300 to 164,000) deaths in 2030 and 215,000 (-76,100 to 595,000) in 2100 (countering by 16% the global decrease in PM2.5-related mortality). Premature mortality attributable to climate change is estimated to be positive in all regions except Africa, and is greatest in India and East Asia. Finally, most individual models yield increased mortality from climate change, but some yield decreases, suggesting caution in interpreting results from a single model. Climate change mitigation is likely to reduce air-pollution-related mortality.
NASA Technical Reports Server (NTRS)
Hurwitz, M. M.; Newman, P. A.
2010-01-01
This study examines trends in Antarctic temperature and APSC, a temperature proxy for the area of polar stratospheric clouds, in an ensemble of Goddard Earth Observing System (GEOS) chemistry-climate model (CCM) simulations of the 21st century. A selection of greenhouse gas, ozone-depleting substance, and sea surface temperature scenarios is used to test the trend sensitivity to these parameters. One scenario is used to compare temperature trends in two versions of the GEOS CCM. An extended austral winter season is examined in detail. In May, June, and July, the expected future increase in CO2-related radiative cooling drives temperature trends in the Antarctic lower stratosphere. At 50 hPa, a 1.3 K cooling is expected between 2000 and 2100. Ozone levels increase, despite this robust cooling signal and the consequent increase in APSC, suggesting an enhancement of stratospheric transport in the future. In the lower stratosphere, the choice of climate change scenarios does not affect the magnitude of the early winter cooling. Midwinter temperature trends are generally small. In October, APSC trends have the same sign as the prescribed halogen trends. That is, there are negative APSC trends in "realistic future" simulations, where halogen loading decreases in accordance with the Montreal Protocol and CO2 continues to increase. In these simulations, the speed of ozone recovery is not influenced by either the choice of sea surface temperature and greenhouse gas scenarios or by the model version.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenthal, William Steven; Tartakovsky, Alex; Huang, Zhenyu
State and parameter estimation of power transmission networks is important for monitoring power grid operating conditions and analyzing transient stability. Wind power generation depends on fluctuating input power levels, which are correlated in time and contribute to uncertainty in turbine dynamical models. The ensemble Kalman filter (EnKF), a standard state estimation technique, uses a deterministic forecast and does not explicitly model time-correlated noise in parameters such as mechanical input power. However, this uncertainty affects the probability of fault-induced transient instability and increases prediction bias. The novel approach here is to model input power noise with time-correlated stochastic fluctuations and integrate them with the network dynamics during the forecast. While the EnKF has been used to calibrate constant parameters in turbine dynamical models, the calibration of a statistical model for a time-correlated parameter has not been investigated. In this study, twin experiments on a standard transmission network test case are used to validate our time-correlated noise model framework for state estimation of unsteady operating conditions and transient stability analysis, and a methodology is proposed for the inference of the mechanical input power time-correlation length parameter using time-series data from PMUs monitoring power dynamics at generator buses.
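The forecast step described in this abstract, integrating a time-correlated stochastic input alongside the dynamics inside each ensemble member, can be illustrated on a toy scalar system. Everything below (the linear dynamics, the AR(1) input model, all constants) is an illustrative assumption, not the paper's multi-machine network model, and this sketch filters only the state, not the correlation-length parameter.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy scalar analogue: a state driven by a time-correlated (AR(1))
# input parameter, filtered with a perturbed-observation EnKF.
n_steps, n_ens = 200, 100
phi, p_mean, sigma_p = 0.95, 1.0, 0.1   # AR(1) model for the input
a, sigma_x, sigma_obs = 0.9, 0.05, 0.3  # state dynamics and obs noise

# Truth run generating synthetic observations.
truth = np.empty(n_steps)
obs = np.empty(n_steps)
x_t, p_t = 0.0, p_mean
for k in range(n_steps):
    p_t = p_mean + phi * (p_t - p_mean) + rng.normal(0, sigma_p)
    x_t = a * x_t + p_t + rng.normal(0, sigma_x)
    truth[k] = x_t
    obs[k] = x_t + rng.normal(0, sigma_obs)

# EnKF: each member carries its own realisation of the stochastic input.
x = rng.normal(0.0, 1.0, n_ens)
p = rng.normal(p_mean, sigma_p, n_ens)
analysis = np.empty(n_steps)
for k in range(n_steps):
    # Forecast: integrate the time-correlated input with the dynamics.
    p = p_mean + phi * (p - p_mean) + rng.normal(0, sigma_p, n_ens)
    x = a * x + p + rng.normal(0, sigma_x, n_ens)
    # Analysis: scalar Kalman update from the ensemble variance.
    var_f = np.var(x, ddof=1)
    gain = var_f / (var_f + sigma_obs ** 2)
    x = x + gain * (obs[k] + rng.normal(0, sigma_obs, n_ens) - x)
    analysis[k] = x.mean()

def rmse(u, v):
    return float(np.sqrt(np.mean((u - v) ** 2)))
```

Because the stochastic input is propagated rather than frozen, the ensemble spread stays consistent with the forecast error, and the analysis mean tracks the truth more closely than the raw observations do.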
A data-driven multi-model methodology with deep feature selection for short-term wind forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Cong; Cui, Mingjian; Hodge, Bri-Mathias
With the growing wind penetration into the power system worldwide, improving wind power forecasting accuracy is becoming increasingly important to ensure continued economic and reliable power system operations. In this paper, a data-driven multi-model wind forecasting methodology is developed with a two-layer ensemble machine learning technique. The first layer is composed of multiple machine learning models that generate individual forecasts. A deep feature selection framework is developed to determine the most suitable inputs to the first-layer machine learning models. Then, a blending algorithm is applied in the second layer to create an ensemble of the forecasts produced by first-layer models and generate both deterministic and probabilistic forecasts. This two-layer model seeks to utilize the statistically different characteristics of each machine learning algorithm. A number of machine learning algorithms are selected and compared in both layers. The developed multi-model wind forecasting methodology is compared to several benchmarks. The effectiveness of the proposed methodology is evaluated to provide 1-hour-ahead wind speed forecasting at seven locations of the Surface Radiation network. Numerical results show that, compared to the single-algorithm models, the developed multi-model framework with the deep feature selection procedure improves forecasting accuracy by up to 30%.
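The two-layer idea can be sketched in miniature: base forecasters in the first layer, a blending weight learned in the second. The synthetic wind series, the persistence and climatology base models, and the plain least-squares blender below are all illustrative stand-ins for the paper's actual machine-learning stack.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly wind-speed series with a diurnal cycle plus noise.
t = np.arange(500)
wind = 8 + 2 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.5, t.size)

# Layer 1: two simple base forecasters for 1-hour-ahead wind speed.
persistence = wind[:-1]                          # forecast(t+1) = obs(t)
climatology = np.full(t.size - 1, wind.mean())   # forecast(t+1) = mean
target = wind[1:]

# Layer 2: blend the base forecasts with least-squares weights learned
# on a training period, then evaluate on a hold-out period.
X = np.column_stack([persistence, climatology])
fit, hold = slice(0, 400), slice(400, None)
w, *_ = np.linalg.lstsq(X[fit], target[fit], rcond=None)
blend = X @ w

def rmse(pred, ref):
    return float(np.sqrt(np.mean((pred - ref) ** 2)))
```

The blend inherits most of its skill from the better base model while the learned weights guard against any single forecaster's weaknesses, which is the point of the second-layer aggregation.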
How potentially predictable are midlatitude ocean currents?
Nonaka, Masami; Sasai, Yoshikazu; Sasaki, Hideharu; Taguchi, Bunmei; Nakamura, Hisashi
2016-01-01
Predictability of atmospheric variability is known to be limited owing to significant uncertainty that arises from intrinsic variability generated independently of external forcing and/or boundary conditions. Observed atmospheric variability is therefore regarded as just a single realization among different dynamical states that could occur. In contrast, subject to wind, thermal and fresh-water forcing at the surface, the ocean circulation has been considered to be rather deterministic under the prescribed atmospheric forcing, and it still remains unknown how uncertain the upper-ocean circulation variability is. This study evaluates how much uncertainty the oceanic interannual variability can potentially have, through multiple simulations with an eddy-resolving ocean general circulation model driven by the observed interannually-varying atmospheric forcing under slightly different conditions. These ensemble “hindcast” experiments have revealed substantial uncertainty due to intrinsic variability in the extratropical ocean circulation that limits potential predictability of its interannual variability, especially along the strong western boundary currents (WBCs) in mid-latitudes, including the Kuroshio and its eastward extension. The intrinsic variability also greatly limits potential predictability of meso-scale oceanic eddy activity. These findings suggest that multi-member ensemble simulations are essential for understanding and predicting variability in the WBCs, which are important for weather and climate variability and marine ecosystems. PMID:26831954
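A common way to quantify the potential predictability this abstract discusses is the ratio of forced ("signal") variance, shared across ensemble members, to the total variance including the intrinsic member-to-member spread. A toy sketch with synthetic data follows; the member counts and variances are illustrative assumptions, not values from the hindcast experiments.

```python
import numpy as np

rng = np.random.default_rng(4)

# "Forced" variability is shared by all members (same atmospheric
# forcing); "intrinsic" variability is generated independently in each.
n_members, n_years = 10, 50
forced = rng.normal(0.0, 1.0, n_years)                   # shared forcing
intrinsic = rng.normal(0.0, 1.0, (n_members, n_years))   # internal noise
ensemble = forced + intrinsic                            # (members, years)

# Variance of the ensemble mean isolates the forced signal (plus a
# residual intrinsic term of order 1/n_members); total variance per
# member includes both components.
signal_var = float(np.var(ensemble.mean(axis=0), ddof=1))
total_var = float(np.mean(np.var(ensemble, axis=1, ddof=1)))
ratio = signal_var / total_var
```

A ratio near 1 means the interannual variability is almost entirely forced (highly predictable); a ratio well below 1, as along the WBCs in the study, means intrinsic variability dominates.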
NASA Astrophysics Data System (ADS)
Li, Ming; Wang, Q. J.; Bennett, James C.; Robertson, David E.
2016-09-01
This study develops a new error modelling method for ensemble short-term and real-time streamflow forecasting, called error reduction and representation in stages (ERRIS). The novelty of ERRIS is that it does not rely on a single complex error model but runs a sequence of simple error models through four stages. At each stage, an error model attempts to incrementally improve over the previous stage. Stage 1 establishes parameters of a hydrological model and parameters of a transformation function for data normalization, Stage 2 applies a bias correction, Stage 3 applies autoregressive (AR) updating, and Stage 4 applies a Gaussian mixture distribution to represent model residuals. In a case study, we apply ERRIS for one-step-ahead forecasting at a range of catchments. The forecasts at the end of Stage 4 are shown to be much more accurate than at Stage 1 and to be highly reliable in representing forecast uncertainty. Specifically, the forecasts become more accurate by applying the AR updating at Stage 3, and more reliable in uncertainty spread by using a mixture of two Gaussian distributions to represent the residuals at Stage 4. ERRIS can be applied to any existing calibrated hydrological models, including those calibrated to deterministic (e.g. least-squares) objectives.
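The staged logic of ERRIS, in particular Stage 2 (bias correction) followed by Stage 3 (AR updating), can be sketched on synthetic data. The "hydrological model" below is a stand-in with a constant bias and AR(1) errors, and the stage logic is simplified to one-step-ahead correction; it mirrors the published method in spirit only, not in implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "observed" flow and a biased simulation with AR(1) errors.
n = 300
obs = 50 + 10 * np.sin(np.arange(n) / 20.0)
err = np.zeros(n)
for i in range(1, n):
    err[i] = 0.8 * err[i - 1] + rng.normal(0, 1.0)
sim = obs + 5.0 + err                       # constant bias + AR(1) error

# Stage-2 style: remove the mean bias estimated on a training period.
bias = float(np.mean(sim[:200] - obs[:200]))
sim_bc = sim - bias

# Stage-3 style: fit an AR(1) coefficient to training residuals, then
# correct each one-step-ahead forecast using the last observed error.
resid = sim_bc[:200] - obs[:200]
ar1 = float(np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2))

updated = np.empty(n - 200)
for i in range(200, n):
    last_err = sim_bc[i - 1] - obs[i - 1]   # error known at forecast time
    updated[i - 200] = sim_bc[i] - ar1 * last_err

def rmse(u, v):
    return float(np.sqrt(np.mean((u - v) ** 2)))
```

Because the errors are autocorrelated, subtracting the AR-predicted portion of the last error removes most of the persistent component, which is why the abstract reports the largest accuracy gain at the AR-updating stage.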
Solid-state ensemble of highly entangled photon sources at rubidium atomic transitions
NASA Astrophysics Data System (ADS)
Zopf, Michael; Keil, Robert; Chen, Yan; Höfer, Bianca; Zhang, Jiaxiang; Ding, Fei; Schmidt, Oliver G.
Semiconductor InAs/GaAs quantum dots grown by the Stranski-Krastanov method are among the leading candidates for the deterministic generation of polarization entangled photon pairs. Despite remarkable progress in the last twenty years, many challenges still remain for this material, such as the extremely low yield (< 1% quantum dots can emit entangled photons), the low degree of entanglement, and the large wavelength distribution. Here we show that, with an emerging family of GaAs/AlGaAs quantum dots grown by droplet etching and nanohole infilling, it is possible to obtain a large ensemble (close to 100%) of polarization-entangled photon emitters on a wafer without any post-growth tuning. Under pulsed resonant two-photon excitation, all measured quantum dots emit single pairs of entangled photons with ultra-high purity, high degree of entanglement (fidelity up to F=0.91, with a record high concurrence C=0.90), and ultra-narrow wavelength distribution at rubidium transitions. Therefore, a solid-state quantum repeater - among many other key enabling quantum photonic elements - can be practically implemented with this new material. Financially supported by BMBF Q.Com-H (16KIS0106) and the European Union Seventh Framework Programme (FP7/2007-2013) under Grant Agreement No. 601126 (HANAS).
MSEBAG: a dynamic classifier ensemble generation based on 'minimum-sufficient ensemble' and bagging
NASA Astrophysics Data System (ADS)
Chen, Lei; Kamel, Mohamed S.
2016-01-01
In this paper, we propose a dynamic classifier system, MSEBAG, which is characterised by searching for the 'minimum-sufficient ensemble' and bagging at the ensemble level. It adopts an 'over-generation and selection' strategy and aims to achieve a good bias-variance trade-off. In the training phase, MSEBAG first searches for the 'minimum-sufficient ensemble', which maximises the in-sample fitness with the minimal number of base classifiers. Then, starting from the 'minimum-sufficient ensemble', a backward stepwise algorithm is employed to generate a collection of ensembles. The objective is to create a collection of ensembles with a descending fitness on the data, as well as a descending complexity in the structure. MSEBAG dynamically selects the ensembles from the collection for the decision aggregation. The extended adaptive aggregation (EAA) approach, a bagging-style algorithm performed at the ensemble level, is employed for this task. EAA searches for the competent ensembles using a score function, which takes into consideration both the in-sample fitness and the confidence of the statistical inference, and averages the decisions of the selected ensembles to label the test pattern. The experimental results show that the proposed MSEBAG outperforms the benchmarks on average.
Rana, Gianfranco; Katerji, Nader; Mastrorilli, Marcello
2012-10-01
The present study describes an operational method, based on the Katerji et al. (Eur J Agron 33:218-230, 2010) model, for determining the daily evapotranspiration (ET) of soybean inside open top chambers (OTCs). It includes two functions, calculated day by day, that make it possible to account separately for the effects of air ozone concentrations and plant water stress. The latter function was calibrated as a function of the daily values of the actual water reserve in the soil. The input variables of the method are (a) the diurnal values of global radiation and temperature, usually measured routinely at a standard weather station; (b) the daily values of the accumulated AOT40 index (accumulated ozone over a threshold of 40 ppb during daylight hours, when global radiation exceeds 50 Wm(-2)) determined inside the OTC; and (c) the actual water reserve in the soil at the beginning of the trial. The collection of these input variables can be automated; thus, the proposed method could be applied routinely. The ability of the method to account for contrasting conditions of ozone air concentration and water stress was evaluated over three successive years, for 513 days, in ten crop growth cycles, excluding the days used to calibrate the method. Tests were carried out in several chambers each year and take into account the intra- and inter-year variability of ET measured inside the OTCs. On the daily scale, the slopes of the linear regression between the ET measured by the soil water balance and that calculated by the proposed method, under different water conditions, are 0.98 and 1.05 for the filtered and unfiltered (or enriched) OTCs, with root mean square errors (RMSE) of 0.77 and 1.07 mm, respectively. On the seasonal scale, the mean difference between measured and calculated ET is +5% and +11% for the filtered and unfiltered OTCs, respectively.
The ability of the proposed method to estimate the daily and seasonal ET inside the OTCs is therefore satisfactory following inter- and intra-annual tests. Finally, suggestions about the applications of the proposed method for other species, different from soybean, were also discussed.
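The AOT40 index used as an input above has a direct numerical translation: accumulate, over daylight hours (global radiation above 50 W m-2), the hourly ozone excess over 40 ppb. A minimal sketch follows; the function name and the synthetic hourly inputs are illustrative.

```python
import numpy as np

def aot40(ozone_ppb, global_rad_wm2, threshold=40.0, rad_min=50.0):
    """AOT40: accumulated ozone over 40 ppb during daylight hours,
    i.e. hours with global radiation above rad_min (W m-2).
    Inputs are hourly series; the result is in ppb h."""
    o3 = np.asarray(ozone_ppb, dtype=float)
    rad = np.asarray(global_rad_wm2, dtype=float)
    daylight = rad > rad_min
    # Only the excess over the threshold accumulates; hours at or
    # below 40 ppb contribute nothing.
    return float(np.sum(np.clip(o3[daylight] - threshold, 0.0, None)))

# Two daylight hours above the threshold (excesses 15 and 30 ppb), one
# daylight hour below it, and one dark hour ignored despite high ozone.
print(aot40([55, 70, 30, 90], [600, 500, 400, 10]))  # -> 45.0
```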
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Labute, M.; Chowdhary, K.; Debusschere, B.; Cameron-Smith, P. J.
2014-12-01
Simulating the atmospheric cycles of ozone, methane, and other radiatively important trace gases in global climate models is computationally demanding and requires hundreds of photochemical parameters with uncertain values. Quantitative analysis of the effects of these uncertainties on tracer distributions, radiative forcing, and other model responses is hindered by the "curse of dimensionality." We describe efforts to overcome this curse using ensemble simulations and advanced statistical methods. Uncertainties from 95 photochemical parameters in the trop-MOZART scheme were sampled using a Monte Carlo method and propagated through 10,000 simulations of the single column version of the Community Atmosphere Model (CAM). The variance of the ensemble was represented as a network with nodes and edges, and the topology and connections in the network were analyzed using lasso regression, Bayesian compressive sensing, and centrality measures from the field of social network theory. Despite the limited sample size for this high dimensional problem, our methods determined the key sources of variation and co-variation in the ensemble and identified important clusters in the network topology. Our results can be used to better understand the flow of photochemical uncertainty in simulations using CAM and other climate models. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and supported by the DOE Office of Science through the Scientific Discovery Through Advanced Computing (SciDAC).
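The ensemble-and-regression workflow can be miniaturized: sample uncertain parameters, run them through a model, and rank parameters by regression coefficients on the output. The "model" below is a trivially cheap linear surrogate with three truly influential parameters, and plain least squares stands in for the study's lasso, Bayesian compressive sensing, and network-centrality tools.

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo samples of the uncertain "photochemical parameters".
n_runs, n_params = 2000, 20
theta = rng.normal(0.0, 1.0, (n_runs, n_params))

# Cheap surrogate model: only three parameters actually matter.
true_w = np.zeros(n_params)
true_w[[2, 7, 11]] = [3.0, -2.0, 1.0]
response = theta @ true_w + rng.normal(0.0, 0.5, n_runs)

# Standardised regression coefficients as first-order sensitivities;
# the largest-magnitude coefficients mark the key sources of variance.
coef, *_ = np.linalg.lstsq(theta, response - response.mean(), rcond=None)
ranking = np.argsort(-np.abs(coef))   # most influential parameter first
```

With far more parameters than can be explored by brute force, this kind of regression-based attribution recovers the dominant uncertainty sources from a modest ensemble, which is the strategy the abstract describes at much larger scale.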
Storm Surge Simulation and Ensemble Forecast for Hurricane Irene (2011)
NASA Astrophysics Data System (ADS)
Lin, N.; Emanuel, K.
2012-12-01
Hurricane Irene, raking the U.S. East Coast during the period of 26-30 August 2011, caused widespread damage estimated at $15.8 billion and was responsible for 49 direct deaths (Avila and Cangialosi, 2011). Although the most severe impact in the northeastern U.S. was catastrophic inland flooding, with its unusually large size, Irene also generated high waves and storm surges and caused moderate to major coastal flooding. The most severe surge damage occurred between Oregon Inlet and Cape Hatteras in North Carolina (NC). Significant storm surge damage also occurred along southern Chesapeake Bay, and moderate and high surges were observed along the coast from New Jersey (NJ) northward. A storm surge of 0.9-1.8 m caused hundreds of millions of dollars in property damage in New York City (NYC) and Long Island, despite the fact that the storm made landfall to the west of NYC with peak winds of no more than tropical storm strength. Making three U.S. landfalls (in NC, NJ, and NY), Hurricane Irene provides a unique case for studying storm surge along the eastern U.S. coastline. We apply the hydrodynamic model ADCIRC (Luettich et al. 1992) to conduct surge simulations for Pamlico Sound, Chesapeake Bay, and NYC, using best track data and parametric wind and pressure models. The results agree well with tidal-gauge observations. Then we explore a new methodology for storm surge ensemble forecasting and apply it to Irene. This method applies a statistical/deterministic hurricane model (Emanuel et al. 2006) to generate large numbers of storm ensembles under the storm environment described by the 51 ECMWF ensemble members. The associated surge ensembles are then generated with the ADCIRC model. The numerical simulation is computationally efficient, making the method applicable to real-time storm surge ensemble forecasting. We report the results for NYC in this presentation. 
The ADCIRC simulation using the best track data generates a storm surge of 1.3 m and a storm tide of 2.1 m at the Battery, NYC, which agree well with the observed storm surge of 1.33 m and storm tide of 2.12 m, although the simulated surge arrives about 2 hours earlier than the observed. Based on the surge climatology estimated by Lin et al. (2012), Hurricane Irene's storm surge is approximately a 60-year event for NYC, but its storm tide, with the surge happening right at the high astronomical tide, is a 100-year event. Lin et al. (2012) also projected that such 100-year storm tide events might occur on average every 3-20 years by the end of the century, under the IPCC A1B emission scenario and a 1-m sea level rise. The ensemble forecasting, starting from two and one days (each with 1000 ensembles) before Irene's first landfall in NC, shows that Irene's actual storm surge at the Battery had a chance of about 9% and 10% to be exceeded, respectively. The largest surges among the two ensemble sets are 2.28 m and 2.05 m, respectively. If happening at the high tide, as with Hurricane Irene, the worst-case storm tides would be about 3-3.2 m, similar to the highest historical water level at the Battery due to a hurricane in 1821. Lin et al. (2012) estimated that such a storm tide of about 3.1 m had a return period of about 500 years under current climate conditions, but the return period might become 25-240 years by the end of the century, under the IPCC A1B emission scenario and a 1-m sea level rise.
Uncertainty estimation of long-range ensemble forecasts of snowmelt flood characteristics
NASA Astrophysics Data System (ADS)
Kuchment, L.
2012-04-01
Long-range forecasts of snowmelt flood characteristics with lead times of 2-3 months are of great importance for regulating flood runoff and mitigating flood damage on almost all large Russian rivers. At the same time, current forecasting techniques based on regression relationships between the runoff volume and indexes of river basin conditions can lead to serious forecast errors, resulting in large economic losses caused by wrong flood regulation. The forecast errors can be caused by complicated processes of soil freezing and soil moisture redistribution, a too-high rate of snowmelt, large liquid precipitation before snowmelt, or by large differences between meteorological conditions during the lead-time period and climatological ones. Analysis of economic losses has shown that the largest damages could, to a significant extent, be avoided if decision makers had an opportunity to take predictive uncertainty into account and could use more cautious strategies in runoff regulation. Development of a methodology for long-range ensemble forecasting of spring/summer floods based on distributed physically-based runoff generation models has created, in principle, a new basis for improving hydrological predictions as well as for estimating their uncertainty. This approach is illustrated by forecasting of the spring-summer floods in the Vyatka River and Seim River basins. The application of the physically based models of snowmelt runoff generation gives an essential improvement in the statistical estimates of the deterministic forecasts of flood volume in comparison with the forecasts obtained from the regression relationships. These models were also used for probabilistic forecasts, assigning meteorological inputs during lead-time periods from the available historical daily series and from series simulated using a weather generator and the Monte Carlo procedure.
The weather generator consists of stochastic models of daily temperature and precipitation. The performance of the probabilistic forecasts was estimated by ranked probability skill scores. The application of Monte Carlo simulations using the weather generator has given better results than using the historical meteorological series.
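The ranked probability skill score used to evaluate these probabilistic forecasts has a compact standard definition: the squared distance between cumulative forecast probabilities and the cumulative observed outcome, referenced against a baseline such as climatology. A sketch over three ordered categories follows; the example probabilities are invented.

```python
import numpy as np

def rps(probs, obs_cat):
    """Ranked probability score: squared distance between cumulative
    forecast probabilities and the cumulative observed outcome
    (0 is a perfect forecast)."""
    probs = np.asarray(probs, dtype=float)
    outcome = np.zeros_like(probs)
    outcome[obs_cat] = 1.0
    return float(np.sum((np.cumsum(probs) - np.cumsum(outcome)) ** 2))

def rpss(forecast_rps, reference_rps):
    """Skill relative to a reference forecast (1 = perfect, 0 = no
    improvement over the reference, negative = worse)."""
    return 1.0 - forecast_rps / reference_rps

# Three ordered flood-volume categories (below / near / above normal):
# a sharp forecast on the observed category vs a flat climatology.
f = rps([0.7, 0.2, 0.1], 0)
c = rps([1 / 3, 1 / 3, 1 / 3], 0)
skill = rpss(f, c)   # ~0.82
```

Because the score is computed on cumulative probabilities, it penalises probability placed in categories far from the observed one more heavily than probability in adjacent categories, which suits ordered quantities like flood volume.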
Project fires. Volume 2: Protective ensemble performance standards, phase 1B
NASA Astrophysics Data System (ADS)
Abeles, F. J.
1980-05-01
The design of the prototype protective ensemble was finalized. Prototype ensembles were fabricated and then subjected to a series of qualification tests which were based upon the protective ensemble performance standards (PEPS) requirements. Engineering drawings and purchase specifications were prepared for the new protective ensemble.
HLPI-Ensemble: Prediction of human lncRNA-protein interactions based on ensemble strategy.
Hu, Huan; Zhang, Li; Ai, Haixin; Zhang, Hui; Fan, Yetian; Zhao, Qi; Liu, Hongsheng
2018-03-27
LncRNA plays an important role in many biological processes and in disease progression by binding to related proteins. However, the experimental methods for studying lncRNA-protein interactions are time-consuming and expensive. Although there are a few models designed to predict the interactions of ncRNA-protein, they all have some common drawbacks that limit their predictive performance. In this study, we present a model called HLPI-Ensemble designed specifically for human lncRNA-protein interactions. HLPI-Ensemble adopts the ensemble strategy based on three mainstream machine learning algorithms of Support Vector Machines (SVM), Random Forests (RF) and Extreme Gradient Boosting (XGB) to generate HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble, respectively. The results of 10-fold cross-validation show that HLPI-SVM Ensemble, HLPI-RF Ensemble and HLPI-XGB Ensemble achieved AUCs of 0.95, 0.96 and 0.96, respectively, in the test dataset. Furthermore, we compared the performance of the HLPI-Ensemble models with the previous models through an external validation dataset. The results show that the false positives (FPs) of HLPI-Ensemble models are much lower than those of the previous models, and other evaluation indicators of HLPI-Ensemble models are also higher than those of the previous models. This further shows that HLPI-Ensemble models are superior in predicting human lncRNA-protein interaction compared with previous models. The HLPI-Ensemble is publicly available at: http://ccsipb.lnu.edu.cn/hlpiensemble/ .
Evaluating Downscaling Methods for Seasonal Climate Forecasts over East Africa
NASA Technical Reports Server (NTRS)
Roberts, J. Brent; Robertson, Franklin R.; Bosilovich, Michael; Lyon, Bradfield; Funk, Chris
2013-01-01
The U.S. National Multi-Model Ensemble seasonal forecasting system is providing hindcast and real-time data streams to be used in assessing and improving seasonal predictive capacity. The NASA / USAID SERVIR project, which leverages satellite and modeling-based resources for environmental decision making in developing nations, is focusing on the evaluation of NMME forecasts specifically for use in impact modeling within hub regions including East Africa, the Hindu Kush-Himalayan (HKH) region and Mesoamerica. One of the participating models in NMME is the NASA Goddard Earth Observing System (GEOS5). This work will present an intercomparison of downscaling methods using the GEOS5 seasonal forecasts of temperature and precipitation over East Africa. The current seasonal forecasting system provides monthly averaged forecast anomalies. These anomalies must be spatially downscaled and temporally disaggregated for use in application modeling (e.g. hydrology, agriculture). There are several available downscaling methodologies that can be implemented to accomplish this goal. Selected methods include both a non-homogeneous hidden Markov model and an analogue based approach. A particular emphasis will be placed on quantifying the ability of different methods to capture the intermittency of precipitation within both the short and long rain seasons. Further, the ability to capture spatial covariances will be assessed. Both probabilistic and deterministic skill measures will be evaluated over the hindcast period.
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood-susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.
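The OWA-plus-Monte-Carlo combination described in this abstract can be sketched in a few lines. The criteria count, order weights and Dirichlet weight prior below are hypothetical stand-ins, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(42)

def owa(values, order_weights):
    # Ordered weighted averaging: weights attach to ranks, not to criteria
    return np.sort(values)[::-1] @ order_weights

# Hypothetical susceptibility scores for 4 locations over 6 criteria (0-1 scale)
scores = rng.random((4, 6))

# Order weights skewed toward the largest ranked values encode a pessimistic
# (risk-averse) attitude toward flood susceptibility
order_weights = np.array([0.30, 0.25, 0.20, 0.12, 0.08, 0.05])

# Monte Carlo over uncertain criteria weights (Dirichlet stays on the simplex)
n_runs = 2000
results = np.empty((n_runs, scores.shape[0]))
for i in range(n_runs):
    w = rng.dirichlet(np.ones(6))            # sampled criteria weights
    weighted = scores * (w * 6)              # reweight criteria before ranking
    results[i] = [owa(row, order_weights) for row in weighted]

mean_susc = results.mean(axis=0)             # central susceptibility estimate
std_susc = results.std(axis=0)               # weight-induced uncertainty
```

Ranking locations by `mean_susc` while reporting `std_susc` gives exactly the kind of weight-sensitivity information the abstract's uncertainty analysis targets.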
Ground-Based Lidar for Atmospheric Boundary Layer Ozone Measurements
NASA Technical Reports Server (NTRS)
Kuang, Shi; Newchurch, Michael J.; Burris, John; Liu, Xiong
2013-01-01
Ground-based lidars are suitable for long-term ozone monitoring as a complement to satellite and ozonesonde measurements. However, current ground-based lidars are unable to consistently measure ozone below 500 m above ground level (AGL) due to both engineering issues and high retrieval sensitivity to various measurement errors. In this paper, we present our instrument design, retrieval techniques, and preliminary results that focus on high-temporal-resolution profiling of ozone within the atmospheric boundary layer (ABL) achieved by the addition of an inexpensive and compact mini-receiver to the previous system. For the first time, to the best of our knowledge, the lowest consistently achievable observation height has been extended down to 125 m AGL for a ground-based ozone lidar system. Both the analysis and preliminary measurements demonstrate that this lidar measures ozone with a precision generally better than 10% at a temporal resolution of 10 min and a vertical resolution from 150 m at the bottom of the ABL to 550 m at the top. A measurement example from summertime shows that inhomogeneous ozone aloft was affected by both surface emissions and the evolution of ABL structures.
NASA Technical Reports Server (NTRS)
Strode, Sarah A.; Douglass, Anne R.; Ziemke, Jerald R.; Manyin, Michael; Nielsen, J. Eric; Oman, Luke D.
2017-01-01
Satellite observations of in-cloud ozone concentrations from the Ozone Monitoring Instrument and Microwave Limb Sounder instruments show substantial differences from background ozone concentrations. We develop a method for comparing a free-running chemistry-climate model (CCM) to in-cloud and background ozone observations using a simple criterion based on cloud fraction to separate cloudy and clear-sky days. We demonstrate that the CCM simulates key features of the in-cloud versus background ozone differences and of the geographic distribution of in-cloud ozone. Since the agreement is not dependent on matching the meteorological conditions of a specific day, this is a promising method for diagnosing how accurately CCMs represent the relationships between ozone and clouds, including the lower ozone concentrations shown by in-cloud satellite observations. Since clouds are associated with convection as well as changes in chemistry, we diagnose the tendency of tropical ozone at 400 hPa due to chemistry, convection and turbulence, and large-scale dynamics. While convection acts to reduce ozone concentrations at 400 hPa throughout much of the tropics, it has the opposite effect over highly polluted regions of South and East Asia.
NASA Astrophysics Data System (ADS)
Wang, Wei; Ruiz, Isaac; Lee, Ilkeun; Zaera, Francisco; Ozkan, Mihrimah; Ozkan, Cengiz S.
2015-04-01
Optimization of the electrode/electrolyte double-layer interface is a key factor for improving electrode performance of aqueous electrolyte based supercapacitors (SCs). Here, we report the improved functionality of carbon materials via a non-invasive, high-throughput, and inexpensive UV generated ozone (UV-ozone) treatment. This process allows precise tuning of the graphene and carbon nanotube hybrid foam (GM) transitionally from ultrahydrophobic to hydrophilic within 60 s. The continuous tuning of surface energy can be controlled by simply varying the UV-ozone exposure time, while the ozone-oxidized carbon nanostructure maintains its integrity. Symmetric SCs based on the UV-ozone treated GM foam demonstrated enhanced rate performance. This technique can be readily applied to other CVD-grown carbonaceous materials by taking advantage of its ease of processing, low cost, scalability, and controllability. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr06795a
NASA Astrophysics Data System (ADS)
Ross, M. N.; Toohey, D.
2008-12-01
Emissions from solid and liquid propellant rocket engines reduce global stratospheric ozone levels. Currently ~ one kiloton of payloads are launched into earth orbit annually by the global space industry. Stratospheric ozone depletion from present day launches is a small fraction of the ~ 4% globally averaged ozone loss caused by halogen gases. Thus rocket engine emissions are currently considered a minor, if poorly understood, contributor to ozone depletion. Proposed space-based geoengineering projects designed to mitigate climate change would require order of magnitude increases in the amount of material launched into earth orbit. The increased launches would result in comparable increases in the global ozone depletion caused by rocket emissions. We estimate global ozone loss caused by three space-based geoengineering proposals to mitigate climate change: (1) mirrors, (2) sunshade, and (3) space-based solar power (SSP). The SSP concept does not directly engineer climate, but is touted as a mitigation strategy in that SSP would reduce CO2 emissions. We show that launching the mirrors or sunshade would cause global ozone loss between 2% and 20%. Ozone loss associated with an economically viable SSP system would be at least 0.4% and possibly as large as 3%. It is not clear which, if any, of these levels of ozone loss would be acceptable under the Montreal Protocol. The large uncertainties are mainly caused by a lack of data or validated models regarding liquid propellant rocket engine emissions. Our results offer four main conclusions. (1) The viability of space-based geoengineering schemes could well be undermined by the relatively large ozone depletion that would be caused by the required rocket launches. (2) Analysis of space- based geoengineering schemes should include the difficult tradeoff between the gain of long-term (~ decades) climate control and the loss of short-term (~ years) deep ozone loss. 
(3) The trade can be properly evaluated only if our understanding of the stratospheric impact of rocket emissions is significantly improved. (4) Such an improved understanding requires a concerted effort of research including new in situ measurements in a variety of rocket plumes and a multi-scale modeling program similar in scope to the effort required to address the climate and ozone impacts of aircraft emissions.
Local fluctuations of ozone from 16 km to 45 km deduced from in situ vertical ozone profile
NASA Technical Reports Server (NTRS)
Moreau, G.; Robert, C.
1994-01-01
A vertical ozone profile obtained by an in situ ozonesonde from 16 km to 45 km has allowed observation of local ozone concentration variations. These variations can be observed thanks to a fast measurement system based on a UV-absorption KrF excimer laser beam in a multipass cell. The ozone standard deviation from the mean is derived as a function of altitude. Ozone variations or fluctuations are correlated with the different dynamic zones of the stratosphere.
Effect of Pulse Width on Oxygen-fed Ozonizer
NASA Astrophysics Data System (ADS)
Okada, Sho; Wang, Douyan; Namihira, Takao; Katsuki, Sunao; Akiyama, Hidenori
Though general ozonizers based on silent discharge (barrier discharge) have been used to supply ozone in many industrial settings, problems remain, such as improving the ozone yield. In this work, ozone was generated by pulsed discharge in order to improve the characteristics of ozone generation. It is known that the pulse width strongly affects the energy efficiency of exhaust gas processing. In this paper, the effect of pulse duration on ozone generation by pulsed discharge in oxygen is reported.
DOAS-based total column ozone retrieval from Phaethon system
NASA Astrophysics Data System (ADS)
Gkertsi, F.; Bais, A. F.; Kouremeti, N.; Drosoglou, Th; Fountoulakis, I.; Fragkos, K.
2018-05-01
This study introduces the measurement of the total ozone column using Differential Optical Absorption Spectroscopy (DOAS) analysis of direct-sun spectra recorded by the Phaethon system. This methodology is based on the analysis of spectra relative to a reference spectrum that has been recorded by the same instrument. The slant column density of ozone associated with the reference spectrum is derived by Langley extrapolation. Total ozone data derived by Phaethon over two years in Thessaloniki are compared with those of a collocated, well-maintained and calibrated Brewer spectrophotometer. When the retrieval of total ozone is based on the absorption cross sections of Paur and Bass (1984) at 228 K, Phaethon shows an average overestimation of 1.85 ± 1.86%. Taking into account the effect of the day-to-day variability of stratospheric temperature on total ozone derived by both systems, the bias is reduced to 0.94 ± 1.26%. The sensitivity of the total ozone retrieval to changes in temperature is larger for Phaethon than for Brewer.
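The Langley extrapolation step mentioned in this abstract amounts to a linear regression of differential slant columns against the ozone air mass factor, extrapolated to zero air mass. The sketch below uses entirely synthetic numbers (the column values and noise level are illustrative, not Phaethon data):

```python
import numpy as np

# Synthetic Langley-plot data: dSCD_i = VCD * mu_i - SCD_ref, assuming a
# stable vertical column (VCD) over the measurement period
true_vcd = 8.1e18      # ozone vertical column, molecules / cm^2 (illustrative)
true_scd_ref = 1.6e19  # slant column in the reference spectrum (unknown a priori)
mu = np.linspace(1.2, 3.5, 30)                    # ozone air mass factors
rng = np.random.default_rng(0)
dscd = true_vcd * mu - true_scd_ref + rng.normal(0, 2e16, mu.size)

# Linear fit: slope ~ VCD, intercept ~ -SCD_ref (the Langley extrapolation)
slope, intercept = np.polyfit(mu, dscd, 1)
vcd_est = slope
scd_ref_est = -intercept
```

In practice the fit is restricted to periods when the assumption of a constant vertical column holds, since any real ozone variability leaks directly into the retrieved reference slant column.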
NASA Technical Reports Server (NTRS)
Chyba, Thomas; Zenker, Thomas
1998-01-01
The objective of this project is to develop a portable, eye-safe, ground-based ozone lidar instrument specialized for ozone differential absorption lidar (DIAL) measurements in the troposphere. This prototype instrument is intended to operate at remote field sites and to serve as the basic unit for monitoring projects requiring multi-instrument networks, such as that discussed in the science plan for the Global Tropospheric Ozone Project (GTOP). This instrument will be based at HU for student training in lidar technology as well as atmospheric ozone data analysis and interpretation. It will be also available for off-site measurement campaigns and will serve as a test bed for further instrument development. Later development beyond this grant to extend the scientific usefulness of the instrument may include incorporation of an aerosol channel and upgrading the laser to make stratospheric ozone measurements. Undergraduate and graduate students have been and will be active participants in this research effort.
Deterministic and efficient quantum cryptography based on Bell's theorem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Zengbing; Pan Jianwei; Physikalisches Institut, Universitaet Heidelberg, Philosophenweg 12, 69120 Heidelberg
2006-05-15
We propose a double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs with entanglement both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish one and only one perfect correlation, and thus deterministically create a key bit. Eavesdropping can be detected by violation of local realism. A variation of the protocol shows a higher security, similar to the six-state protocol, under individual attacks. Our scheme allows a robust implementation under the current technology.
NASA Astrophysics Data System (ADS)
García, Constantino A.; Otero, Abraham; Félix, Paulo; Presedo, Jesús; Márquez, David G.
2018-07-01
In the past few decades, it has been recognized that 1/f fluctuations are ubiquitous in nature. The most widely used mathematical models to capture the long-term memory properties of 1/f fluctuations have been stochastic fractal models. However, physical systems do not usually consist of just stochastic fractal dynamics, but they often also show some degree of deterministic behavior. The present paper proposes a model based on fractal stochastic and deterministic components that can provide a valuable basis for the study of complex systems with long-term correlations. The fractal stochastic component is assumed to be a fractional Brownian motion process and the deterministic component is assumed to be a band-limited signal. We also provide a method that, under the assumptions of this model, is able to characterize the fractal stochastic component and to provide an estimate of the deterministic components present in a given time series. The method is based on a Bayesian wavelet shrinkage procedure that exploits the self-similar properties of the fractal processes in the wavelet domain. This method has been validated over simulated signals and over real signals with economic and biological origin. Real examples illustrate how our model may be useful for exploring the deterministic-stochastic duality of complex systems, and uncovering interesting patterns present in time series.
NASA Technical Reports Server (NTRS)
Abeles, F. J.
1980-01-01
The design of the prototype protective ensemble was finalized. Prototype ensembles were fabricated and then subjected to a series of qualification tests which were based upon the Protective Ensemble Performance Standards (PEPS) requirements. Engineering drawings and purchase specifications were prepared for the new protective ensemble.
Forcing of stratospheric chemistry and dynamics during the Dalton Minimum
NASA Astrophysics Data System (ADS)
Anet, J. G.; Muthers, S.; Rozanov, E.; Raible, C. C.; Peter, T.; Stenke, A.; Shapiro, A. I.; Beer, J.; Steinhilber, F.; Brönnimann, S.; Arfeuille, F.; Brugnara, Y.; Schmutz, W.
2013-11-01
The response of atmospheric chemistry and dynamics to volcanic eruptions and to a decrease in solar activity during the Dalton Minimum is investigated with the fully coupled atmosphere-ocean chemistry general circulation model SOCOL-MPIOM (modeling tools for studies of SOlar Climate Ozone Links-Max Planck Institute Ocean Model) covering the time period 1780 to 1840 AD. We carried out several sensitivity ensemble experiments to separate the effects of (i) reduced solar ultra-violet (UV) irradiance, (ii) reduced solar visible and near infrared irradiance, (iii) enhanced galactic cosmic ray intensity as well as less intensive solar energetic proton events and auroral electron precipitation, and (iv) volcanic aerosols. The introduced changes of UV irradiance and volcanic aerosols significantly influence stratospheric dynamics in the early 19th century, whereas changes in the visible part of the spectrum and energetic particles have smaller effects. A reduction of UV irradiance by 15%, which represents the presently discussed highest estimate of UV irradiance change caused by solar activity changes, causes global ozone decrease below the stratopause reaching as much as 8% in the midlatitudes at 5 hPa and a significant stratospheric cooling of up to 2 °C in the mid-stratosphere and to 6 °C in the lower mesosphere. Changes in energetic particle precipitation lead only to minor changes in the yearly averaged temperature fields in the stratosphere. Volcanic aerosols heat the tropical lower stratosphere, allowing more water vapour to enter the tropical stratosphere, which, via HOx reactions, decreases upper stratospheric and mesospheric ozone by roughly 4%. Conversely, heterogeneous chemistry on aerosols reduces stratospheric NOx, leading to a 12% ozone increase in the tropics, whereas a decrease in ozone of up to 5% is found over Antarctica in boreal winter. 
The linear superposition of the different contributions is not equivalent to the response obtained in a simulation when all forcing factors are applied during the Dalton Minimum (DM) - this effect is especially well visible for NOx/NOy. Thus, this study also shows the non-linear behaviour of the coupled chemistry-climate system. Finally, we conclude that especially UV and volcanic eruptions dominate the changes in the ozone, temperature and dynamics while the NOx field is dominated by the energetic particle precipitation. Visible radiation changes have only very minor effects on both stratospheric dynamics and chemistry.
EXPERIMENTAL AND THEORETICAL EVALUATIONS OF OBSERVATIONAL-BASED TECHNIQUES
Observational Based Methods (OBMs) can be used by EPA and the States to develop reliable ozone controls approaches. OBMs use actual measured concentrations of ozone, its precursors, and other indicators to determine the most appropriate strategy for ozone control. The usual app...
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Rocco, Stefania Di; Staehelin, Johannes; Maeder, Joerg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; de Backer, Hugo; Koehler, Ulf; Krzyścin, Janusz; Vaníček, Karel
2011-11-01
We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear ‘fingerprints’ of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analysis for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that in addition to ODS, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
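A minimal peaks-over-threshold style illustration of EHO/ELO counting, in the spirit of this abstract, might look like the following. The record is entirely synthetic (a stand-in for a long-term European series), and the thresholds, trend and perturbation rate are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily total ozone record (Dobson units), 1970-2009, in which a
# growing number of episodic low-ozone days mimics ODS-driven depletion
years = np.repeat(np.arange(1970, 2010), 365)
ozone = rng.normal(330.0, 30.0, years.size)
episodic = rng.random(years.size) < 0.1          # ~10% of days can be perturbed
ozone -= episodic * (years - 1970) * 3.0         # lows deepen through time

# Thresholds from the earliest, least-perturbed decade: 5th / 95th percentiles
lo_thr, hi_thr = np.percentile(ozone[years < 1980], [5, 95])

# Count extreme-low (ELO) and extreme-high (EHO) events per decade
elo, eho = {}, {}
for start in range(1970, 2010, 10):
    decade = ozone[(years >= start) & (years < start + 10)]
    elo[start] = int((decade < lo_thr).sum())
    eho[start] = int((decade > hi_thr).sum())
```

With this construction the ELO count rises decade by decade while the mean barely moves at first, which is the abstract's central point: changes in the frequency of extremes can dominate an apparent trend before standard metrics show anything.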
NASA Astrophysics Data System (ADS)
Christensen, H. M.; Moroz, I.; Palmer, T.
2015-12-01
It is now acknowledged that representing model uncertainty in atmospheric simulators is essential for the production of reliable probabilistic ensemble forecasts, and a number of different techniques have been proposed for this purpose. Stochastic convection parameterization schemes use random numbers to represent the difference between a deterministic parameterization scheme and the true atmosphere, accounting for the unresolved subgrid-scale variability associated with convective clouds. An alternative approach varies the values of poorly constrained physical parameters in the model to represent the uncertainty in these parameters. This study presents new perturbed parameter schemes for use in the European Centre for Medium-Range Weather Forecasts (ECMWF) convection scheme. Two types of scheme are developed and implemented. Both schemes represent the joint uncertainty in four of the parameters in the convection parameterization scheme, which was estimated using the Ensemble Prediction and Parameter Estimation System (EPPES). The first scheme developed is a fixed perturbed parameter scheme, where the values of uncertain parameters are changed between ensemble members, but held constant over the duration of the forecast. The second is a stochastically varying perturbed parameter scheme. The performance of these schemes was compared to the ECMWF operational stochastic scheme, Stochastically Perturbed Parametrisation Tendencies (SPPT), and to a model which does not represent uncertainty in convection. The skill of probabilistic forecasts made using the different models was evaluated. While the perturbed parameter schemes improve on the stochastic parameterization in some regards, the SPPT scheme outperforms the perturbed parameter approaches when considering forecast variables that are particularly sensitive to convection. Overall, SPPT schemes are the most skilful representations of model uncertainty due to convection parameterization. Reference: H. M. Christensen, I. M. Moroz, and T. N. Palmer, 2015: Stochastic and Perturbed Parameter Representations of Model Uncertainty in Convection Parameterization. J. Atmos. Sci., 72, 2525-2544.
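The difference between the two perturbed parameter schemes can be illustrated with a toy tendency equation standing in for a convection parameterization. The model, parameter distribution and AR(1) autocorrelation below are hypothetical, not the EPPES-estimated values:

```python
import numpy as np

rng = np.random.default_rng(7)

def step(x, theta, dt=0.1):
    """Toy tendency standing in for a parameterized process."""
    return x + dt * (theta * x * (1.0 - x))

n_members, n_steps = 20, 100
theta_mean, theta_sd = 2.5, 0.4         # assumed parameter uncertainty

# Fixed perturbed parameters: one draw per member, held for the whole forecast
x_fixed = np.full(n_members, 0.1)
theta_fixed = rng.normal(theta_mean, theta_sd, n_members)
for _ in range(n_steps):
    x_fixed = step(x_fixed, theta_fixed)

# Stochastically varying parameters: AR(1) evolution about the mean each step
x_stoch = np.full(n_members, 0.1)
theta = rng.normal(theta_mean, theta_sd, n_members)
phi = 0.95                              # temporal autocorrelation of perturbations
for _ in range(n_steps):
    theta = theta_mean + phi * (theta - theta_mean) \
        + rng.normal(0, theta_sd * np.sqrt(1 - phi**2), n_members)
    x_stoch = step(x_stoch, theta)
```

The fixed scheme samples the parameter's climatological uncertainty once per member; the stochastic scheme additionally lets the "effective" parameter wander in time, which is closer in spirit to how SPPT perturbs tendencies during the forecast.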
NASA Astrophysics Data System (ADS)
Tian, Xin; Negenborn, Rudy R.; van Overloop, Peter-Jules; María Maestre, José; Sadowska, Anna; van de Giesen, Nick
2017-11-01
Model Predictive Control (MPC) is one of the most advanced real-time control techniques and has been widely applied to Water Resources Management (WRM). MPC can manage a water system in a holistic manner and has a flexible structure to incorporate specific elements, such as setpoints and constraints. Therefore, MPC has shown versatile performance in many branches of WRM. Nonetheless, with the in-depth understanding of stochastic hydrology in recent studies, MPC also faces the challenge of how to cope with hydrological uncertainty in its decision-making process. A possible way to embed the uncertainty is to generate an Ensemble Forecast (EF) of hydrological variables, rather than a deterministic one. The combination of MPC and EF results in a more comprehensive approach: Multi-scenario MPC (MS-MPC). In this study, we first assess the model performance of MS-MPC, considering an ensemble streamflow forecast. Notably, computational inefficiency may be a critical obstacle that hinders the applicability of MS-MPC. In fact, with more scenarios taken into account, the computational burden of solving an optimization problem in MS-MPC accordingly increases. To deal with this challenge, we propose the Adaptive Control Resolution (ACR) approach as a computationally efficient scheme to practically reduce the number of control variables in MS-MPC. In brief, the ACR approach uses a mixed-resolution control time step from the near future to the distant future. The ACR-MPC approach is tested on a real-world case study: an integrated flood control and navigation problem in the North Sea Canal of the Netherlands. Such an approach reduces the computation time by 18% or more in our case study. At the same time, the model performance of ACR-MPC remains close to that of conventional MPC.
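The control-variable savings from a mixed-resolution horizon are easy to quantify. The horizon length and segment split below are invented for illustration (the abstract does not state the actual ACR schedule):

```python
# Control variables over a 24 h horizon for a single actuator
uniform_dt_min = 15
horizon_min = 24 * 60
n_uniform = horizon_min // uniform_dt_min    # 96 control moves at 15 min steps

# ACR: fine resolution near now, coarser toward the distant future
# (assumed split: 15 min for 2 h, 1 h for 6 h, 4 h for the remaining 16 h)
acr_segments = [(2 * 60, 15), (6 * 60, 60), (16 * 60, 240)]  # (span_min, dt_min)
n_acr = sum(span // dt for span, dt in acr_segments)         # 8 + 6 + 4 = 18

reduction = 1 - n_acr / n_uniform
print(f"uniform: {n_uniform} variables, ACR: {n_acr}, reduction: {reduction:.0%}")
```

In MS-MPC the saving compounds: the optimization carries one set of control variables per scenario, so trimming the per-scenario count directly shrinks the ensemble-sized problem.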
NASA Astrophysics Data System (ADS)
Borah, Nabanita; Sukumarpillai, Abhilash; Sahai, Atul Kumar; Chattopadhyay, Rajib; Joseph, Susmitha; De, Soumyendu; Nath Goswami, Bhupendra; Kumar, Arun
2014-05-01
An ensemble prediction system (EPS) is devised for the extended range prediction (ERP) of monsoon intraseasonal oscillations (MISO) of Indian summer monsoon (ISM) using NCEP Climate Forecast System model version 2 at T126 horizontal resolution. The EPS is formulated by producing 11 member ensembles through the perturbation of atmospheric initial conditions. The hindcast experiments were conducted at every 5-day interval for 45 days lead time starting from 16th May to 28th September during 2001-2012. The general simulation of ISM characteristics and the ERP skill of the proposed EPS at pentad mean scale are evaluated in the present study. Though the EPS underestimates both the mean and variability of ISM rainfall, it simulates the northward propagation of MISO reasonably well. It is found that the signal-to-noise ratio becomes unity by about 18 days and the predictability error saturates by about 25 days. Though useful deterministic forecasts could be generated up to 2nd pentad lead, significant correlations are observed even up to 4th pentad lead. The skill in predicting large-scale MISO, which is assessed by comparing the predicted and observed MISO indices, is found to be ~17 days. It is noted that the prediction skill of actual rainfall is closely related to the prediction of amplitude of large-scale MISO as well as the initial conditions related to the different phases of MISO. Analysis of categorical prediction skills reveals that break phases are predicted most skillfully, followed by active and then normal phases. The categorical probability skill scores suggest that useful probabilistic forecasts could be generated even up to 4th pentad lead.
NASA Astrophysics Data System (ADS)
Borah, N.; Abhilash, S.; Sahai, A. K.; Chattopadhyay, R.; Joseph, S.; Sharmila, S.; de, S.; Goswami, B.; Kumar, A.
2013-12-01
An ensemble prediction system (EPS) is devised for the extended range prediction (ERP) of monsoon intraseasonal oscillations (MISOs) of Indian summer monsoon (ISM) using NCEP Climate Forecast System model version 2 at T126 horizontal resolution. The EPS is formulated by producing 11 member ensembles through the perturbation of atmospheric initial conditions. The hindcast experiments were conducted at every 5-day interval for 45 days lead time starting from 16th May to 28th September during 2001-2012. The general simulation of ISM characteristics and the ERP skill of the proposed EPS at pentad mean scale are evaluated in the present study. Though the EPS underestimates both the mean and variability of ISM rainfall, it simulates the northward propagation of MISO reasonably well. It is found that the signal-to-noise ratio becomes unity by about 18 days and the predictability error saturates by about 25 days. Though useful deterministic forecasts could be generated up to 2nd pentad lead, significant correlations are observed even up to 4th pentad lead. The skill in predicting large-scale MISO, which is assessed by comparing the predicted and observed MISO indices, is found to be ~17 days. It is noted that the prediction skill of actual rainfall is closely related to the prediction of amplitude of large-scale MISO as well as the initial conditions related to the different phases of MISO. Analysis of categorical prediction skills reveals that break phases are predicted most skillfully, followed by active and then normal phases. The categorical probability skill scores suggest that useful probabilistic forecasts could be generated even up to 4th pentad lead.
NASA Astrophysics Data System (ADS)
Abhilash, S.; Sahai, A. K.; Borah, N.; Chattopadhyay, R.; Joseph, S.; Sharmila, S.; De, S.; Goswami, B. N.; Kumar, Arun
2014-05-01
An ensemble prediction system (EPS) is devised for the extended range prediction (ERP) of monsoon intraseasonal oscillations (MISO) of Indian summer monsoon (ISM) using National Centers for Environmental Prediction Climate Forecast System model version 2 at T126 horizontal resolution. The EPS is formulated by generating 11 member ensembles through the perturbation of atmospheric initial conditions. The hindcast experiments were conducted at every 5-day interval for 45 days lead time starting from 16th May to 28th September during 2001-2012. The general simulation of ISM characteristics and the ERP skill of the proposed EPS at pentad mean scale are evaluated in the present study. Though the EPS underestimates both the mean and variability of ISM rainfall, it simulates the northward propagation of MISO reasonably well. It is found that the signal-to-noise ratio of the forecasted rainfall becomes unity by about 18 days. The potential predictability error of the forecasted rainfall saturates by about 25 days. Though useful deterministic forecasts could be generated up to 2nd pentad lead, significant correlations are found even up to 4th pentad lead. The skill in predicting large-scale MISO, which is assessed by comparing the predicted and observed MISO indices, is found to be ~17 days. It is noted that the prediction skill of actual rainfall is closely related to the prediction of large-scale MISO amplitude as well as the initial conditions related to the different phases of MISO. An analysis of categorical prediction skills reveals that break is more skillfully predicted, followed by active and then normal. The categorical probability skill scores suggest that useful probabilistic forecasts could be generated even up to 4th pentad lead.
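The signal-to-noise diagnostic underlying these predictability estimates (signal = variance of the ensemble mean, noise = average intra-ensemble spread) can be sketched on a synthetic hindcast set. The decay time scale and variance partition below are illustrative, not CFSv2 values:

```python
import numpy as np

rng = np.random.default_rng(3)
n_members, n_starts, n_leads = 11, 50, 45

# Toy hindcasts: a predictable signal whose amplitude decays with lead time,
# plus member-dependent noise that grows to keep total variance near unity
lead = np.arange(n_leads)
amp = 0.95 * np.exp(-lead / 18.0)                  # illustrative decay scale
signal = rng.normal(0, 1, (n_starts, 1, n_leads)) * amp
noise = rng.normal(0, 1, (n_starts, n_members, n_leads)) * np.sqrt(1 - amp**2)
fcst = signal + noise                              # (starts, members, leads)

ens_mean = fcst.mean(axis=1)
signal_var = ens_mean.var(axis=0)                  # variance of ensemble mean
noise_var = fcst.var(axis=1).mean(axis=0)          # mean intra-ensemble spread
snr = signal_var / noise_var                       # predictability diagnostic
```

The lead time at which `snr` crosses unity is the ensemble's own estimate of its deterministic predictability limit, the quantity the abstracts place at about 18 days.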
NASA Astrophysics Data System (ADS)
Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.
2016-01-01
The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e., the observation that large-scale rainfall structures are more persistent and predictable than small-scale convective cells. This paper presents the development, adaptation and verification of the STEPS system for Belgium (STEPS-BE). STEPS-BE provides in real-time 20-member ensemble precipitation nowcasts at 1 km and 5 min resolutions up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 75-90 % of the forecast errors.
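The core STEPS loop, deterministic Lagrangian extrapolation plus stochastic growth/decay perturbations, can be caricatured on a toy radar field. The advection vector, noise amplitude and exceedance threshold below are invented for illustration and omit STEPS's scale-dependent noise cascade:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy radar rain-rate field (mm/h): a single smooth precipitation cell
y, x = np.mgrid[0:64, 0:64]
field = 10.0 * np.exp(-((y - 20) ** 2 + (x - 20) ** 2) / 200.0)

n_members, n_steps = 20, 12          # 12 steps x 5 min = 1 h nowcast
dy, dx = 1, 2                        # assumed advection, pixels per step

ensemble = np.repeat(field[None], n_members, axis=0)
for _ in range(n_steps):
    # Deterministic part: Lagrangian extrapolation (advect every member)
    ensemble = np.roll(ensemble, shift=(dy, dx), axis=(1, 2))
    # Stochastic part: multiplicative growth/decay perturbations per member
    perturb = rng.normal(0.0, 0.1, ensemble.shape)
    ensemble = np.clip(ensemble * (1.0 + perturb), 0.0, None)

# Probabilistic product: chance of exceeding 0.5 mm/h at each pixel
prob_exceed = (ensemble > 0.5).mean(axis=0)
```

Maps like `prob_exceed` are what feed the reliability verification described above: a forecast probability of 0.7 is reliable if rain above the threshold is later observed on about 70% of such occasions.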
NASA Astrophysics Data System (ADS)
Foresti, L.; Reyniers, M.; Seed, A.; Delobbe, L.
2015-07-01
The Short-Term Ensemble Prediction System (STEPS) is implemented in real-time at the Royal Meteorological Institute (RMI) of Belgium. The main idea behind STEPS is to quantify the forecast uncertainty by adding stochastic perturbations to the deterministic Lagrangian extrapolation of radar images. The stochastic perturbations are designed to account for the unpredictable precipitation growth and decay processes and to reproduce the dynamic scaling of precipitation fields, i.e. the observation that large scale rainfall structures are more persistent and predictable than small scale convective cells. This paper presents the development, adaptation and verification of the system STEPS for Belgium (STEPS-BE). STEPS-BE provides in real-time 20 member ensemble precipitation nowcasts at 1 km and 5 min resolution up to 2 h lead time using a 4 C-band radar composite as input. In the context of the PLURISK project, STEPS forecasts were generated to be used as input in sewer system hydraulic models for nowcasting urban inundations in the cities of Ghent and Leuven. Comprehensive forecast verification was performed in order to detect systematic biases over the given urban areas and to analyze the reliability of probabilistic forecasts for a set of case studies in 2013 and 2014. The forecast biases over the cities of Leuven and Ghent were found to be small, which is encouraging for future integration of STEPS nowcasts into the hydraulic models. Probabilistic forecasts of exceeding 0.5 mm h-1 are reliable up to 60-90 min lead time, while the ones of exceeding 5.0 mm h-1 are only reliable up to 30 min. The STEPS ensembles are slightly under-dispersive and represent only 80-90 % of the forecast errors.
NASA Astrophysics Data System (ADS)
Feng, Y. Katherina; Robinson, Tyler D.; Fortney, Jonathan J.; Lupu, Roxana E.; Marley, Mark S.; Lewis, Nikole K.; Macintosh, Bruce; Line, Michael R.
2018-05-01
Space-based high-contrast imaging mission concepts for studying rocky exoplanets in reflected light are currently under community study. We develop an inverse modeling framework to estimate the science return of such missions given different instrument design considerations. By combining an exoplanet albedo model, instrument noise model, and ensemble Markov chain Monte Carlo sampler, we explore retrievals of atmospheric and planetary properties for Earth twins as a function of signal-to-noise ratio (S/N) and resolution (R). Our forward model includes Rayleigh-scattering, single-layer water clouds with patchy coverage, and pressure-dependent absorption due to water vapor, oxygen, and ozone. We simulate data at R = 70 and 140 from 0.4 to 1.0 μm with S/N = 5, 10, 15, and 20 at 550 nm (i.e., for HabEx/LUVOIR-type instruments). At these same S/Ns, we simulate data for WFIRST paired with a starshade, which includes two photometric points between 0.48 and 0.6 μm and R = 50 spectroscopy from 0.6 to 0.97 μm. Given our noise model for WFIRST-type detectors, we find that weak detections of water vapor, ozone, and oxygen can be achieved with observations with at least R = 70/S/N = 15 or R = 140/S/N = 10 for improved detections. Meaningful constraints are only achieved with R = 140/S/N = 20 data. The WFIRST data offer limited diagnostic information, needing at least S/N = 20 to weakly detect gases. Most scenarios place limits on planetary radius but cannot constrain surface gravity and, thus, planetary mass.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-20
... 1997 ozone NAAQS which revised the health-based NAAQS for ozone by setting the NAAQS at 0.08 parts per... evidence demonstrating that ozone causes adverse health effects at lower ozone concentrations and over... determined that the 1997 ozone NAAQS would be more protective of human health, especially for children and...
CPT-based probabilistic and deterministic assessment of in situ seismic soil liquefaction potential
Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Der Kiureghian, A.; Cetin, K.O.
2006-01-01
This paper presents a complete methodology for both probabilistic and deterministic assessment of seismic soil liquefaction triggering potential based on the cone penetration test (CPT). A comprehensive worldwide set of CPT-based liquefaction field case histories was compiled and back analyzed, and the data then used to develop probabilistic triggering correlations. Issues investigated in this study include improved normalization of CPT resistance measurements for the influence of effective overburden stress, and adjustment to CPT tip resistance for the potential influence of "thin" liquefiable layers. The effects of soil type and soil character (i.e., "fines" adjustment) for the new correlations are based on a combination of CPT tip and sleeve resistance. To quantify probability for performance-based engineering applications, Bayesian "regression" methods were used, and the uncertainties of all variables comprising both the seismic demand and the liquefaction resistance were estimated and included in the analysis. The resulting correlations were developed using a Bayesian framework and are presented in both probabilistic and deterministic formats. The results are compared to previous probabilistic and deterministic correlations. © 2006 ASCE.
NASA Astrophysics Data System (ADS)
Schranz, Franziska; Fernandez, Susana; Kämpfer, Niklaus; Palm, Mathias
2018-03-01
We present an analysis of the diurnal ozone cycle from 1 year of continuous ozone measurements from two ground-based microwave radiometers in the Arctic. The instruments GROMOS-C and OZORAM are located at the AWIPEV research base at Ny-Ålesund, Svalbard (79° N, 12° E), and gathered a comprehensive time series of middle-atmospheric ozone profiles with a high time resolution. An intercomparison was performed with EOS MLS and ozone sonde measurements and simulations with SD-WACCM. The measured data sets were used to study the photochemically induced diurnal cycle of ozone in the stratosphere and mesosphere. Throughout the year the insolation in the Arctic changes drastically from polar night to polar day. Accordingly, the seasonal variations in the diurnal ozone cycle are large. In the stratosphere we found a diurnal cycle throughout the entire period of polar day with the largest amplitude in April. In the mesosphere a diurnal cycle was detected in spring and fall. SD-WACCM has been proven to capture the diurnal cycle well and was therefore used to analyse the chemical reaction rates of ozone production and loss at equinox and summer solstice. Furthermore GROMOS-C proved capable of measuring the tertiary ozone layer above Ny-Ålesund in winter.
NASA Astrophysics Data System (ADS)
Hunziker, Jürg; Laloy, Eric; Linde, Niklas
2016-04-01
Deterministic inversion procedures can often explain field data, but they only deliver one final subsurface model that depends on the initial model and regularization constraints. This leads to poor insights about the uncertainties associated with the inferred model properties. In contrast, probabilistic inversions can provide an ensemble of model realizations that accurately span the range of possible models that honor the available calibration data and prior information allowing a quantitative description of model uncertainties. We reconsider the problem of inferring the dielectric permittivity (directly related to radar velocity) structure of the subsurface by inversion of first-arrival travel times from crosshole ground penetrating radar (GPR) measurements. We rely on the DREAM_(ZS) algorithm that is a state-of-the-art Markov chain Monte Carlo (MCMC) algorithm. Such algorithms need several orders of magnitude more forward simulations than deterministic algorithms and often become infeasible in high parameter dimensions. To enable high-resolution imaging with MCMC, we use a recently proposed dimensionality reduction approach that allows reproducing 2D multi-Gaussian fields with far fewer parameters than a classical grid discretization. We consider herein a dimensionality reduction from 5000 to 257 unknowns. The first 250 parameters correspond to a spectral representation of random and uncorrelated spatial fluctuations while the remaining seven geostatistical parameters are (1) the standard deviation of the data error, (2) the mean and (3) the variance of the relative electric permittivity, (4) the integral scale along the major axis of anisotropy, (5) the anisotropy angle, (6) the ratio of the integral scale along the minor axis of anisotropy to the integral scale along the major axis of anisotropy and (7) the shape parameter of the Matérn function. The latter essentially defines the type of covariance function (e.g., exponential, Whittle, Gaussian). 
We present an improved formulation of the dimensionality reduction, and numerically show how it reduces artifacts in the generated models and provides better posterior estimation of the subsurface geostatistical structure. We next show that the results of the method compare very favorably against previous deterministic and stochastic inversion results obtained at the South Oyster Bacterial Transport Site in Virginia, USA. The long-term goal of this work is to enable MCMC-based full waveform inversion of crosshole GPR data.
Dissipative production of a maximally entangled steady state of two quantum bits.
Lin, Y; Gaebler, J P; Reiter, F; Tan, T R; Bowler, R; Sørensen, A S; Leibfried, D; Wineland, D J
2013-12-19
Entangled states are a key resource in fundamental quantum physics, quantum cryptography and quantum computation. Introduction of controlled unitary processes--quantum gates--to a quantum system has so far been the most widely used method to create entanglement deterministically. These processes require high-fidelity state preparation and minimization of the decoherence that inevitably arises from coupling between the system and the environment, and imperfect control of the system parameters. Here we combine unitary processes with engineered dissipation to deterministically produce and stabilize an approximate Bell state of two trapped-ion quantum bits (qubits), independent of their initial states. Compared with previous studies that involved dissipative entanglement of atomic ensembles or the application of sequences of multiple time-dependent gates to trapped ions, we implement our combined process using trapped-ion qubits in a continuous time-independent fashion (analogous to optical pumping of atomic states). By continuously driving the system towards the steady state, entanglement is stabilized even in the presence of experimental noise and decoherence. Our demonstration of an entangled steady state of two qubits represents a step towards dissipative state engineering, dissipative quantum computation and dissipative phase transitions. Following this approach, engineered coupling to the environment may be applied to a broad range of experimental systems to achieve desired quantum dynamics or steady states. Indeed, concurrently with this work, an entangled steady state of two superconducting qubits was demonstrated using dissipation.
Forcing of stratospheric chemistry and dynamics during the Dalton Minimum
NASA Astrophysics Data System (ADS)
Anet, J. G.; Muthers, S.; Rozanov, E.; Raible, C. C.; Peter, T.; Stenke, A.; Shapiro, A. I.; Beer, J.; Steinhilber, F.; Brönnimann, S.; Arfeuille, F.; Brugnara, Y.; Schmutz, W.
2013-06-01
The response of atmospheric chemistry and climate to volcanic eruptions and a decrease in solar activity during the Dalton Minimum is investigated with the fully coupled atmosphere-ocean-chemistry general circulation model SOCOL-MPIOM covering the time period 1780 to 1840 AD. We carried out several sensitivity ensemble experiments to separate the effects of (i) reduced solar ultra-violet (UV) irradiance, (ii) reduced solar visible and near infrared irradiance, (iii) enhanced galactic cosmic ray intensity as well as less intensive solar energetic proton events and auroral electron precipitation, and (iv) volcanic aerosols. The introduced changes of UV irradiance and volcanic aerosols significantly influence stratospheric climate in the early 19th century, whereas changes in the visible part of the spectrum and energetic particles have smaller effects. A reduction of UV irradiance by 15% causes global ozone decrease below the stratopause reaching 8% in the midlatitudes at 5 hPa and a significant stratospheric cooling of up to 2 °C in the midstratosphere and to 6 °C in the lower mesosphere. Changes in energetic particle precipitation lead only to minor changes in the yearly averaged temperature fields in the stratosphere. Volcanic aerosols heat the tropical lower stratosphere allowing more water vapor to enter the tropical stratosphere, which, via HOx reactions, decreases upper stratospheric and mesospheric ozone by roughly 4%. Conversely, heterogeneous chemistry on aerosols reduces stratospheric NOx leading to a 12% ozone increase in the tropics, whereas a decrease in ozone of up to 5% is found over Antarctica in boreal winter. The linear superposition of the different contributions is not equivalent to the response obtained in a simulation when all forcing factors are applied during the DM - this effect is especially well visible for NOx/NOy. Thus, this study highlights the non-linear behavior of the coupled chemistry-climate system. 
Finally, we conclude that UV irradiance changes and volcanic eruptions dominate the changes in ozone, temperature and dynamics, while the NOx field is dominated by energetic particle precipitation (EPP). Changes in visible radiation have only very minor effects on both stratospheric dynamics and chemistry.
Dobson spectrophotometer ozone measurements during international ozone rocketsonde intercomparison
NASA Technical Reports Server (NTRS)
Parsons, C. L.
1980-01-01
Measurements of the total ozone content of the atmosphere, made with seven ground-based instruments at a site near Wallops Island, Virginia, are discussed in terms of serving as control values with which the rocketborne sensor data products can be compared. These products are profiles of O3 concentration with altitude. By integrating over the range of altitudes from the surface to the rocket apogee, and by appropriately estimating the residual ozone amount from apogee to the top of the atmosphere, a total ozone amount can be computed from the profiles that can be directly compared with the ground-based instrumentation results. Dobson spectrophotometers were used for two of the ground-based instruments. Preliminary data collected during the IORI from Dobson spectrophotometers 72 and 38 are presented. The agreement between the two and the variability of total ozone overburden through the experiment period are discussed.
ROCOZ-A (improved rocket launched ozone sensor) for middle atmosphere ozone measurements
NASA Technical Reports Server (NTRS)
Lee, H. S.; Parsons, C. L.
1987-01-01
An improved interference filter based ultraviolet photometer (ROCOZ-A) for measuring stratospheric ozone is discussed. The payload is launched aboard a Super-Loki to a typical apogee of 70 km. The instrument measures the solar ultraviolet irradiance as it descends on a parachute. The total cumulative ozone is then calculated based on the Beer-Lambert law. The cumulative ozone precision measured in this way is 2.0% to 2.5% over an altitude range of 20 and 55 km. Results of the intercomparison with the SBUV overpass data and ROCOZ-A data are also discussed.
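The Beer-Lambert step described above can be written down in a few lines. The following is an illustrative Python sketch, not the instrument's actual retrieval code: the function name, the single-wavelength treatment and the constant slant-path factor are simplifying assumptions.

```python
import math

def cumulative_ozone(i_apogee, i_z, sigma, sec_theta=1.0):
    """Beer-Lambert sketch: if I(z) = I_apogee * exp(-sigma * N * sec_theta),
    then the ozone column N between apogee and altitude z is
    N = ln(I_apogee / I(z)) / (sigma * sec_theta).
    sigma: ozone absorption cross-section at the filter wavelength
    (cm^2 per molecule); sec_theta: slant-path factor for the solar
    zenith angle."""
    return math.log(i_apogee / i_z) / (sigma * sec_theta)

# If the UV irradiance drops by a factor e during the descent, a
# cross-section of 1e-17 cm^2 implies a column of 1e17 molecules/cm^2.
column = cumulative_ozone(math.e, 1.0, sigma=1e-17)
```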
Skill of Global Raw and Postprocessed Ensemble Predictions of Rainfall over Northern Tropical Africa
NASA Astrophysics Data System (ADS)
Vogel, Peter; Knippertz, Peter; Fink, Andreas H.; Schlueter, Andreas; Gneiting, Tilmann
2018-04-01
Accumulated precipitation forecasts are of high socioeconomic importance for agriculturally dominated societies in northern tropical Africa. In this study, we analyze the performance of nine operational global ensemble prediction systems (EPSs) relative to climatology-based forecasts for 1- to 5-day accumulated precipitation, based on the monsoon seasons 2007-2014 for three regions within northern tropical Africa. To assess the full potential of raw ensemble forecasts across spatial scales, we apply state-of-the-art statistical postprocessing methods in the form of Bayesian Model Averaging (BMA) and Ensemble Model Output Statistics (EMOS), and verify against station and spatially aggregated, satellite-based gridded observations. Raw ensemble forecasts are uncalibrated, unreliable, and underperform relative to climatology, independently of region, accumulation time, monsoon season, and ensemble. Differences between raw ensemble and climatological forecasts are large, and partly stem from poor predictions for low precipitation amounts. BMA and EMOS postprocessed forecasts are calibrated, reliable, and strongly improve on the raw ensembles, but - somewhat disappointingly - typically do not outperform climatology. Most EPSs exhibit slight improvements over the period 2007-2014, but overall have little added value compared to climatology. We suspect that the parametrization of convection is a potential cause for the sobering lack of ensemble forecast skill in a region dominated by mesoscale convective systems.
Ground-based lidar for atmospheric boundary layer ozone measurements.
Kuang, Shi; Newchurch, Michael J; Burris, John; Liu, Xiong
2013-05-20
Ground-based lidars are suitable for long-term ozone monitoring as a complement to satellite and ozonesonde measurements. However, current ground-based lidars are unable to consistently measure ozone below 500 m above ground level (AGL) due to both engineering issues and high retrieval sensitivity to various measurement errors. In this paper, we present our instrument design, retrieval techniques, and preliminary results that focus on the high-temporal profiling of ozone within the atmospheric boundary layer (ABL) achieved by the addition of an inexpensive and compact mini-receiver to the previous system. For the first time, to the best of our knowledge, the lowest, consistently achievable observation height has been extended down to 125 m AGL for a ground-based ozone lidar system. Both the analysis and preliminary measurements demonstrate that this lidar measures ozone with a precision generally better than ±10% at a temporal resolution of 10 min and a vertical resolution from 150 m at the bottom of the ABL to 550 m at the top. A measurement example from summertime shows that inhomogeneous ozone aloft was affected by both surface emissions and the evolution of ABL structures.
A Two Time-scale response of the Southern Ocean to the Ozone Hole: Regional Responses and Mechanisms
NASA Astrophysics Data System (ADS)
Gnanadesikan, A.; Seviour, W.; Waugh, D.; Pradal, M. A. S.
2016-12-01
The impact of changing ozone on the climate of the Southern Ocean is evaluated using an ensemble of coupled climate models. By imposing a step change from 1860 to 2000 conditions we are able to estimate response functions associated with this change. Two time scales are found, an initial cooling centered in the Southwest Pacific followed by cooling in the Pacific sector and then warming in both sectors. The physical processes that drive this response are different across time periods and locations, as is the sign of the response itself. Initial cooling in the Pacific sector is not just driven by the increased winds pushing cold water northward, but also by a decrease in surface salinity reducing wintertime mixing and increased ice and clouds reflecting more shortwave radiation back to space. The decrease in salinity is primarily driven by a southward shift of precipitation associated with a shifting storm track, coupled with decreased evaporation associated with colder surface temperatures. A subsurface increase in heat associated with this reduction in mixing then upwells along the Antarctic coast, producing a subsequent warming. Similar changes in convective activity occur in the Weddell Sea but are offset in time.
NASA Astrophysics Data System (ADS)
Määttä, A.; Laine, M.; Tamminen, J.; Veefkind, J. P.
2013-09-01
We study uncertainty quantification in remote sensing of aerosols in the atmosphere with top of the atmosphere reflectance measurements from the nadir-viewing Ozone Monitoring Instrument (OMI). Focus is on the uncertainty in aerosol model selection of pre-calculated aerosol models and on the statistical modelling of the model inadequacies. The aim is to apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness (AOT) retrieval by propagating model selection and model error related uncertainties more realistically. We utilise Bayesian model selection and model averaging methods for the model selection problem and use Gaussian processes to model the smooth systematic discrepancies from the modelled to observed reflectance. The systematic model error is learned from an ensemble of operational retrievals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud-free, over-land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques. The method is demonstrated with four examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Saharan desert dust. The presented statistical methodology is general; it is not restricted to this particular satellite retrieval application.
Atmospheric transport of ozone between Southern and Eastern Asia.
Chakraborty, T; Beig, G; Dentener, F J; Wild, O
2015-08-01
This study describes the effect of pollution transport between East Asia and South Asia on tropospheric ozone (O3) using model results from the Task Force on Hemispheric Transport of Air Pollution (TF HTAP). Ensemble mean O3 concentrations are evaluated against satellite data and ground observations of surface O3 at four stations in India. Although modeled surface O3 concentrations are 10-20 ppb higher than those observed, the relative magnitude of the seasonal cycle of O3 is reproduced well. Using 20% reductions in regional anthropogenic emissions, we quantify the seasonal variations in pollution transport between East Asia and South Asia. While there is only a difference of 0.05 to 0.1 ppb in the magnitudes of the regional contributions from one region to the other, O3 from East Asian sources affects the most densely populated parts of South Asia while South Asian sources only partly affect the populated parts of East Asia. We show that emission changes over East Asia between 2000 and 2010 had a larger impact on populated parts of South Asia than vice versa. This study will help inform future decisions on emission control policy over these regions. Copyright © 2015 Elsevier B.V. All rights reserved.
Ozone Climatological Profiles for Version 8 TOMS and SBUV Retrievals
NASA Technical Reports Server (NTRS)
McPeters, R. D.; Logan, J. A.; Labow, G. J.
2003-01-01
A new altitude-dependent ozone climatology has been produced for use with the latest Total Ozone Mapping Spectrometer (TOMS) and Solar Backscatter Ultraviolet (SBUV) retrieval algorithms. The climatology consists of monthly average profiles for ten-degree latitude zones covering 0 to 60 km. The climatology was formed by combining data from SAGE II (1988-2000) and MLS (1991-1999) with data from balloon sondes (1988-2002). Ozone below about 20 km is based on balloon sondes, while ozone above 30 km is based on satellite measurements. The profiles join smoothly between 20 and 30 km. The ozone climatology in the southern hemisphere and tropics has been greatly enhanced in recent years by the addition of balloon sonde stations under the SHADOZ (Southern Hemisphere Additional Ozonesondes) program. A major source of error in the TOMS and SBUV retrieval of total column ozone comes from their reduced sensitivity to ozone in the lower troposphere. An accurate climatology for the retrieval a priori is important for reducing this error on average. The new climatology follows the seasonal behavior of tropospheric ozone and reflects its hemispheric asymmetry. Comparisons of TOMS version 8 ozone with ground stations show an improvement due in part to the new climatology.
Use of fluorescence spectroscopy to control ozone dosage in recirculating aquaculture systems.
Spiliotopoulou, Aikaterini; Martin, Richard; Pedersen, Lars-Flemming; Andersen, Henrik R
2017-03-15
The aim of this study was to investigate the potential of fluorescence spectroscopy as an ozone dosage determination tool in recirculating aquaculture systems (RASs), by studying the relationship between fluorescence intensities and the degradation of dissolved organic matter (DOM) by ozone, in order to optimise ozonation treatment. Water samples from six different Danish facilities (two rearing units from a commercial trout RAS, a commercial eel RAS, a pilot RAS and two marine water aquariums) were treated with different O3 dosages (1.0-20.0 mg/L ozone) in bench-scale experiments, after which the degradation of fluorescence intensity was determined. Ozonation kinetic experiments showed that RAS water contains fluorescent organic matter, which is easily oxidised by ozonation at relatively low concentrations (0-5 mg O3/L). Fluorescence spectroscopy has a high level of sensitivity and selectivity in relation to the associated fluorophores, and it is able to determine accurately the ozone demand of each system. The findings can potentially be used to design offline or online sensors based on the reduction by ozone of natural fluorescent dissolved organic matter in RAS. The suggested indirect determination of ozone delivered into water can potentially contribute to a safer and more adequate ozone-based treatment to improve water quality. Copyright © 2016 Elsevier Ltd. All rights reserved.
Deterministic multidimensional nonuniform gap sampling.
Worley, Bradley; Powers, Robert
2015-12-01
Born from empirical observations in nonuniformly sampled multidimensional NMR data relating to gaps between sampled points, the Poisson-gap sampling method has enjoyed widespread use in biomolecular NMR. While the majority of nonuniform sampling schemes are fully randomly drawn from probability densities that vary over a Nyquist grid, the Poisson-gap scheme employs constrained random deviates to minimize the gaps between sampled grid points. We describe a deterministic gap sampling method, based on the average behavior of Poisson-gap sampling, which performs comparably to its random counterpart with the additional benefit of completely deterministic behavior. We also introduce a general algorithm for multidimensional nonuniform sampling based on a gap equation, and apply it to yield a deterministic sampling scheme that combines burst-mode sampling features with those of Poisson-gap schemes. Finally, we derive a relationship between stochastic gap equations and the expectation value of their sampling probability densities. Copyright © 2015 Elsevier Inc. All rights reserved.
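A deterministic gap schedule of the kind described above can be sketched as follows. This is a hypothetical Python illustration: the sine weighting and the gap-scale formula are assumptions chosen to mimic the "average behavior" idea, not the published algorithm.

```python
import math

def deterministic_gap_schedule(grid_size, n_points):
    """Deterministic analogue of sine-weighted gap sampling: instead of
    drawing each gap from a Poisson distribution, step by the
    sine-modulated mean gap itself.
    Returns strictly increasing grid indices in [0, grid_size)."""
    lam = grid_size / n_points - 1.0   # mean extra gap so ~n_points fit
    points, pos = [], 0.0
    while pos < grid_size and len(points) < n_points:
        points.append(int(pos))
        frac = (pos + 0.5) / grid_size
        gap = lam * math.sin(math.pi / 2.0 * frac)  # assumed weighting
        pos += 1.0 + gap                            # gaps grow toward the end
    return points

schedule = deterministic_gap_schedule(64, 16)
```

Because every quantity is computed rather than drawn, repeated calls with the same arguments yield the identical schedule, which is the point of the deterministic variant.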
Nanni, Loris; Lumini, Alessandra
2009-01-01
The focus of this work is twofold: to propose a novel method for building an ensemble of classifiers for peptide classification based on substitution matrices, and to show the importance of selecting a proper set of parameters for the classifiers that build the ensemble of learning systems. The HIV-1 protease cleavage site prediction problem is studied here. Results obtained with a blind testing protocol are reported; the comparison with other state-of-the-art approaches based on ensembles of classifiers quantifies the performance improvement obtained by the systems proposed in this paper. Simulation based on experimentally determined protease cleavage data has demonstrated the success of these new ensemble algorithms. It is particularly interesting to note that, although the HIV-1 protease cleavage site prediction problem is considered linearly separable, the best performance is obtained using an ensemble of non-linear classifiers.
Sanchez-Martinez, M; Crehuet, R
2014-12-21
We present a method based on the maximum entropy principle that can re-weight an ensemble of protein structures based on data from residual dipolar couplings (RDCs). The RDCs of intrinsically disordered proteins (IDPs) provide information on the secondary structure elements present in an ensemble; however even two sets of RDCs are not enough to fully determine the distribution of conformations, and the force field used to generate the structures has a pervasive influence on the refined ensemble. Two physics-based coarse-grained force fields, Profasi and Campari, are able to predict the secondary structure elements present in an IDP, but even after including the RDC data, the re-weighted ensembles differ between both force fields. Thus the spread of IDP ensembles highlights the need for better force fields. We distribute our algorithm in an open-source Python code.
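The maximum-entropy re-weighting idea can be illustrated for a single averaged observable. The following is a minimal Python sketch under stated assumptions: maximum-entropy weights take the exponential form w_i ∝ exp(λx_i), and the Lagrange multiplier λ is found by bisection; real applications (such as the one above) constrain many RDCs simultaneously.

```python
import math

def maxent_reweight(calc_values, target, lam_lo=-50.0, lam_hi=50.0):
    """Maximum-entropy re-weighting of an ensemble against one averaged
    observable: weights w_i proportional to exp(lambda * x_i) are the
    minimally perturbed weights whose weighted average of calc_values
    equals `target`. lambda is located by bisection, using the fact that
    the weighted average is monotonically increasing in lambda."""
    def weighted_avg(lam):
        ws = [math.exp(lam * x) for x in calc_values]
        z = sum(ws)
        return sum(w * x for w, x in zip(ws, calc_values)) / z

    lo, hi = lam_lo, lam_hi
    for _ in range(200):                 # bisection on lambda
        mid = 0.5 * (lo + hi)
        if weighted_avg(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    ws = [math.exp(lam * x) for x in calc_values]
    z = sum(ws)
    return [w / z for w in ws]
```

With uniform prior weights the re-weighted ensemble is the "closest" (in relative entropy) distribution matching the data, which is why the force field used to generate the conformations still shapes the result.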
Batakliev, Todor; Georgiev, Vladimir; Anachkov, Metody; Rakovsky, Slavcho
2014-01-01
Catalytic ozone decomposition is of great significance because ozone is a toxic substance commonly found or generated in human environments (aircraft cabins, offices with photocopiers, laser printers, sterilizers). Considerable work on ozone decomposition has been reported in the literature. This review provides a comprehensive summary of that literature, concentrating on analysis of the physico-chemical properties, synthesis and catalytic decomposition of ozone. This is supplemented by a review of kinetics and catalyst characterization which ties together the previously reported results. Noble metals and oxides of transition metals have been found to be the most active substances for ozone decomposition. The high price of precious metals stimulated the use of metal oxide catalysts, particularly catalysts based on manganese oxide. The kinetics of ozone decomposition has been determined to be first order with respect to ozone. A mechanism of the reaction of catalytic ozone decomposition is discussed, based on detailed spectroscopic investigations of the catalytic surface, showing the existence of peroxide and superoxide surface intermediates. PMID:26109880
NASA Technical Reports Server (NTRS)
Stone, J. B.; Thompson, A. M.; Frolov, A. D.; Hudson, R. D.; Bhartia, P. K. (Technical Monitor)
2002-01-01
There are a number of published residual-type methods for deriving tropospheric ozone from TOMS (Total Ozone Mapping Spectrometer). The basic concept of these methods is that, within a zone of constant stratospheric ozone, the tropospheric ozone column can be computed by subtracting stratospheric ozone from the TOMS Level 2 total ozone column. We used the modified-residual method for retrieving tropospheric ozone during SAFARI-2000 and found disagreements with in-situ ozone data over Africa in September 2000. Using the newly developed TDOT (TOMS-Direct-Ozone-in-Troposphere) method, which uses TOMS radiances and a modified lookup table based on actual profiles during high ozone pollution periods, new maps were prepared and found to compare better to soundings over Lusaka, Zambia (15.5 S, 28 E), Nairobi and several African cities where MOZAIC aircraft operated in September 2000. The TDOT technique and comparisons are described in detail.
Quantification of Uncertainty in the Flood Frequency Analysis
NASA Astrophysics Data System (ADS)
Kasiapillai Sudalaimuthu, K.; He, J.; Swami, D.
2017-12-01
Flood frequency analysis (FFA) is usually carried out for the planning and design of water resources and hydraulic structures. Owing to variability in sample representation, selection of the distribution and estimation of the distribution parameters, the estimation of flood quantiles has always been uncertain. Hence, suitable approaches must be developed to quantify the uncertainty in the form of a prediction interval as an alternative to the deterministic approach. The framework developed in the present study to include uncertainty in FFA uses a multi-objective optimization approach to construct the prediction interval from an ensemble of flood quantiles. Through this approach, an optimal variability of distribution parameters is identified to carry out FFA. To demonstrate the proposed approach, annual maximum flow data from two gauge stations (Bow River at Calgary and Banff, Canada) are used. The major focus of the present study was to evaluate the changes in magnitude of flood quantiles due to the recent extreme flood event that occurred during the year 2013. In addition, the efficacy of the proposed method was further verified using standard bootstrap-based sampling approaches, and the proposed method was found to be reliable in modeling extreme floods compared to the bootstrap methods.
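The bootstrap baseline mentioned above can be sketched as follows. This is an illustrative Python sketch, not the study's exact procedure: the percentile-interval construction and the empirical quantile estimator are assumptions for the sketch.

```python
import random
import statistics

def bootstrap_quantile_interval(annual_max, prob, n_boot=2000, alpha=0.10, seed=0):
    """Percentile-bootstrap sketch for flood-quantile uncertainty:
    resample the annual-maximum series with replacement, estimate the
    quantile of non-exceedance probability `prob` (a multiple of 0.01)
    from each resample, and return the (alpha/2, 1 - alpha/2) interval
    of those estimates."""
    rng = random.Random(seed)
    n = len(annual_max)
    estimates = []
    for _ in range(n_boot):
        resample = [annual_max[rng.randrange(n)] for _ in range(n)]
        # empirical quantile of the resample (cut points at 1%, ..., 99%)
        estimates.append(statistics.quantiles(resample, n=100)[int(prob * 100) - 1])
    estimates.sort()
    lo = estimates[int(alpha / 2 * n_boot)]
    hi = estimates[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

A parametric variant would fit a distribution (e.g. Gumbel or GEV) to each resample instead of taking the empirical quantile; the interval construction is unchanged.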
A ray tracing model for leaf bidirectional scattering studies
NASA Technical Reports Server (NTRS)
Brakke, T. W.; Smith, J. A.
1987-01-01
A leaf is modeled as a deterministic two-dimensional structure consisting of a network of circular arcs designed to represent the internal morphology of major species. The path of an individual ray through the leaf is computed using geometric optics. At each intersection of the ray with an arc, the specular reflected and transmitted rays are calculated according to the Snell and Fresnel equations. Diffuse scattering is treated according to Lambert's law. Absorption is also permitted but requires a detailed knowledge of the spectral attenuation coefficients. An ensemble of initial rays is chosen for each incident direction, with the initial intersection points on the leaf surface selected randomly. The final equilibrium state after all interactions then yields the leaf bidirectional reflectance and transmittance distributions. The model also yields the internal two-dimensional light gradient profile of the leaf.
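The per-intersection optics can be sketched in a few lines. The following is a minimal Python sketch of the Snell and Fresnel equations for unpolarized light at a single interface; the function and parameter names are assumptions, not taken from the model described above.

```python
import math

def fresnel_unpolarized(theta_i_deg, n1, n2):
    """One ray-interface interaction: returns the unpolarized Fresnel
    reflectance and the refraction angle in degrees (None under total
    internal reflection). n1, n2 are the refractive indices on the
    incident and transmitted sides."""
    ti = math.radians(theta_i_deg)
    s = n1 / n2 * math.sin(ti)        # Snell's law: n1 sin(ti) = n2 sin(tt)
    if s >= 1.0:
        return 1.0, None              # total internal reflection
    tt = math.asin(s)
    # s- and p-polarized reflectances, averaged for unpolarized light
    rs = ((n1 * math.cos(ti) - n2 * math.cos(tt)) /
          (n1 * math.cos(ti) + n2 * math.cos(tt))) ** 2
    rp = ((n1 * math.cos(tt) - n2 * math.cos(ti)) /
          (n1 * math.cos(tt) + n2 * math.cos(ti))) ** 2
    return 0.5 * (rs + rp), math.degrees(tt)
```

At each arc intersection a ray tracer would split (or probabilistically choose between) the reflected and transmitted branches using this reflectance.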
Modelling the Ozone-Based Treatments for Inactivation of Microorganisms.
Brodowska, Agnieszka Joanna; Nowak, Agnieszka; Kondratiuk-Janyska, Alina; Piątkowski, Marcin; Śmigielski, Krzysztof
2017-10-09
The paper presents the development of a model for the ozone treatment, in a dynamic bed, of different microorganisms (Bacillus subtilis, B. cereus, B. pumilus, Escherichia coli, Pseudomonas fluorescens, Aspergillus niger, Eupenicillium cinnamopurpureum) on a heterogeneous matrix (juniper berries, cardamom seeds) treated with a range of ozone doses and contact times. Given the varying susceptibility of microorganisms to ozone, it was of great importance to establish, from a microbial model covering different strains, an ozone dose effective enough to preserve food products. For this purpose, we chose the Weibull model to describe the survival curves of the different microorganisms. Based on the modelled survival of the microorganisms after ozone treatment, we identified the strains least susceptible to ozone as the critical ones. Among the tested strains, those of the genus Bacillus were recognized as the most critical. In particular, B. subtilis and B. pumilus possessed the highest resistance to ozone treatment, as the time needed to reach the lowest survival level was the longest (up to 17.04 min and 16.89 min for B. pumilus on the juniper berry and cardamom seed matrices, respectively). Ozone treatment inactivated the microorganisms, achieving lower survival rates, at an ozone dose of 20.0 g O₃/m³ O₂ (flow rate of 0.4 L/min) and contact times of up to 20 min. The results also demonstrated a linear correlation between the parameters p and k of the Weibull distribution, providing an opportunity to calculate a fitted equation of the process.
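One common parameterisation of the Weibull survival curve in microbial inactivation studies is log10(N/N0) = -k·t^p, with scale k and shape p. The sketch below assumes that form to match the abstract's p and k; the exact equation and fitted values in the paper may differ.

```python
def weibull_log_survival(t, k, p):
    """Weibull inactivation model: log10(N/N0) = -k * t**p.

    t: treatment time; k: scale (rate) parameter; p: shape parameter
    (p < 1 gives a concave survival curve, p > 1 a shouldered one).
    """
    return -k * t ** p

def time_to_reduction(log10_reduction, k, p):
    """Treatment time needed for a target log10 reduction (inverse model)."""
    return (log10_reduction / k) ** (1.0 / p)
```

For example, with illustrative parameters k = 0.5 and p = 1, a 2-log reduction requires 4 minutes of treatment.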
Taking Control: Stealth Assessment of Deterministic Behaviors within a Game-Based System
ERIC Educational Resources Information Center
Snow, Erica L.; Likens, Aaron D.; Allen, Laura K.; McNamara, Danielle S.
2016-01-01
Game-based environments frequently afford students the opportunity to exert agency over their learning paths by making various choices within the environment. The combination of log data from these systems and dynamic methodologies may serve as a stealth means to assess how students behave (i.e., deterministic or random) within these learning…
Change in ozone trends at southern high latitudes
NASA Technical Reports Server (NTRS)
Yang, E.-S.; Cunnold, D. M.; Newchurch, M. J.; Salawitch, R. J.
2005-01-01
Long-term ozone variations at 60-70degS in spring are investigated using ground-based and satellite measurements. Strong positive correlation is shown between year-to-year variations of ozone and temperature in the Antarctic collar region in Septembers and Octobers. Based on this relationship, the effect of year-to-year variations in vortex dynamics has been filtered out. This process results in an ozone time series that shows increasing springtime ozone losses over the Antarctic until the mid-1990s. Since approximately 1997 the ozone losses have leveled off. The analysis confirms that this change is consistent across all instruments and is statistically significant at the 95% confidence level. This analysis quantifies the beginning of the recovery of the ozone hole, which is expected from the leveling off of stratospheric halogen loading due to the ban on CFCs and other halocarbons initiated by the Montreal Protocol.
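The filtering step described above, removing the part of the year-to-year ozone variability explained by collar-region temperature, can be sketched as an ordinary least-squares regression. The function and data below are illustrative stand-ins, not the study's actual analysis.

```python
import numpy as np

def remove_dynamical_signal(ozone, temperature):
    """Regress year-to-year ozone on temperature and return the residual
    time series (ozone with dynamically driven variability filtered out)
    together with the fitted regression slope."""
    A = np.column_stack([temperature, np.ones_like(temperature)])
    (slope, intercept), *_ = np.linalg.lstsq(A, ozone, rcond=None)
    residuals = ozone - (slope * temperature + intercept)
    return residuals, slope
```

The residual series is what remains after the vortex-dynamics signal is removed; trends in it can then be attributed to chemistry rather than year-to-year dynamical variability.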
NASA Astrophysics Data System (ADS)
Fares, S.; McKay, M.; Goldstein, A.
2008-12-01
Ecosystems remove ozone from the troposphere through both stomatal and non-stomatal deposition. The portion of ozone taken up through stomata has an oxidative effect causing damage. We used a multi-year dataset to assess the physiological controls over ozone deposition. Environmental parameters, CO2 and ozone fluxes were measured continuously from January 2001 to December 2006 above a ponderosa pine plantation near Blodgett Forest, Georgetown, California. We studied the dynamics of NEE (Net Ecosystem Exchange, -838 g C m-2 yr-1) and water evapotranspiration on an annual and daily basis. These processes are tightly coupled to stomatal aperture, which also controlled ozone fluxes. High ozone concentrations (~ 100 ppb) were observed during the spring-summer period, with correspondingly high ozone fluxes (~ 30 μmol m-2 h-1). During the summer season, a large portion of the total ozone flux was due to non-stomatal processes, and we propose that a plant physiological control, the release of BVOC (Biogenic Volatile Organic Compounds), is mainly responsible. We analyzed the correlations of common ozone exposure metrics based on accumulation of concentrations (AOT40 and SUM0) with ozone fluxes (total, stomatal and non-stomatal). Stomatal flux showed poorer correlation with ozone concentrations than non-stomatal flux during the summer and fall seasons, which largely corresponded to the growing period. We therefore suggest that AOT40 and SUM0 are poor predictors of ozone damage and that a physiologically based metric would be more effective.
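The two exposure metrics compared above are simple accumulations of hourly concentrations. The sketch below shows their basic form; note that the formal AOT40 definition restricts the sum to daylight hours in the growing season, a screening omitted here for brevity.

```python
def aot40(hourly_ppb):
    """AOT40: accumulated ozone exposure over a 40 ppb threshold, summing
    the hourly excesses (conc - 40) for hours with conc > 40 ppb.
    (Daylight/growing-season screening omitted in this sketch.)"""
    return sum(c - 40.0 for c in hourly_ppb if c > 40.0)

def sum0(hourly_ppb):
    """SUM0: the plain sum of all hourly ozone concentrations, i.e.
    accumulation with a zero threshold."""
    return sum(hourly_ppb)
```

Both metrics depend only on the concentration time series, which is why they can diverge from stomatal flux, the quantity actually tied to physiological damage.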
SVM and SVM Ensembles in Breast Cancer Prediction.
Huang, Min-Wei; Chen, Chih-Wen; Lin, Wei-Chao; Ke, Shih-Wen; Tsai, Chih-Fong
2017-01-01
Breast cancer is an all too common disease in women, making how to effectively predict it an active research problem. A number of statistical and machine learning techniques have been employed to develop various breast cancer prediction models. Among them, support vector machines (SVM) have been shown to outperform many related techniques. To construct the SVM classifier, it is first necessary to decide the kernel function, and different kernel functions can result in different prediction performance. However, there have been very few studies focused on examining the prediction performances of SVM based on different kernel functions. Moreover, it is unknown whether SVM classifier ensembles which have been proposed to improve the performance of single classifiers can outperform single SVM classifiers in terms of breast cancer prediction. Therefore, the aim of this paper is to fully assess the prediction performance of SVM and SVM ensembles over small and large scale breast cancer datasets. The classification accuracy, ROC, F-measure, and computational times of training SVM and SVM ensembles are compared. The experimental results show that linear kernel based SVM ensembles based on the bagging method and RBF kernel based SVM ensembles with the boosting method can be the better choices for a small scale dataset, where feature selection should be performed in the data pre-processing stage. For a large scale dataset, RBF kernel based SVM ensembles based on boosting perform better than the other classifiers.
Haque, Mohammad Nazmul; Noman, Nasimul; Berretta, Regina; Moscato, Pablo
2016-01-01
Classification of datasets with imbalanced sample distributions has always been a challenge. In general, a popular approach for enhancing classification performance is the construction of an ensemble of classifiers. However, the performance of an ensemble depends on the choice of constituent base classifiers. Therefore, we propose a genetic algorithm-based search method for finding the optimum combination from a pool of base classifiers to form a heterogeneous ensemble. The algorithm, called GA-EoC, utilises 10-fold cross-validation on training data for evaluating the quality of each candidate ensemble. To combine the base classifiers' decisions into the ensemble's output, we used the simple and widely used majority voting approach. The proposed algorithm, along with the random sub-sampling approach to balance the class distribution, has been used for classifying class-imbalanced datasets. Additionally, if a feature set was not available, we used the (α, β) - k Feature Set method to select a better subset of features for classification. We have tested GA-EoC with three benchmark datasets from the UCI Machine Learning repository, one Alzheimer's disease dataset and a subset of the PubFig database of Columbia University. In general, the performance of the proposed method on the chosen datasets is robust and better than that of the constituent base classifiers and many other well-known ensembles. Based on our empirical study, we claim that a genetic algorithm is a superior and reliable approach to heterogeneous ensemble construction, and we expect that the proposed GA-EoC would perform consistently in other cases.
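The majority-voting combiner used by GA-EoC is straightforward to sketch. The function below is a generic illustration (with a deterministic tie-break added, which the paper does not specify), not the GA-EoC implementation itself.

```python
from collections import Counter

def majority_vote(predictions):
    """Combine base-classifier outputs by simple majority voting.

    predictions: list of per-classifier label lists, one inner list per
    base classifier, each of the same length (one label per sample).
    Ties are broken deterministically in favour of the smallest label.
    """
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        votes = Counter(clf[i] for clf in predictions)
        # sort by descending vote count, then ascending label
        label = min(votes, key=lambda lbl: (-votes[lbl], lbl))
        combined.append(label)
    return combined
```

In GA-EoC the genetic algorithm searches over which base classifiers feed this combiner; the voting rule itself stays fixed.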
NASA Astrophysics Data System (ADS)
Krzycin, Janusz W.
2002-10-01
Decadal changes of ozone mini-hole event appearance over the Northern Hemisphere midlatitudes are examined based on daily total ozone data from seven stations having long records (four decades or more) of ozone observations. The various threshold methods for accepting and rejecting the ozone minima as mini-holes are examined. Mini-hole event activity is seen to be rather stable when averaged over a decadal time scale if the mini-holes are selected as large negative departures (exceeding 20%) relative to the moving long-term total ozone reference. The results are compared with a previous ozone mini-hole climatology derived from satellite data (TOMS measurements on board the Nimbus-7 satellite for the period 1978-93). A nonlinear statistical model (MARS), which takes into account various total ozone dynamical proxies (from NCEP-NCAR reanalysis), is used to study dynamical factors responsible for the ozone extremes over Arosa in the period 1950-99. The model explains as much as 95% of the total variance of the ozone extremes. The model-observation differences averaged over the decadal intervals are rather smooth throughout the whole period analysed. It is suggested that the short-term dynamical processes controlling the appearance of ozone extremes influenced the ozone field in a similar way before and after the onset of abrupt ozone depletion in the early 1980s. The analysis of the ozone profile and the tropopause pressure (from the ozonesondings over Hohenpeissenberg, 1966-99) during mini-hole events shows 60% ozone reduction in the lower stratosphere and an approximately 50 hPa upward shift of the thermal tropopause there.
NASA Astrophysics Data System (ADS)
Thompson, Anne M.; Witte, Jacquelyn C.; Sterling, Chance; Jordan, Allen; Johnson, Bryan J.; Oltmans, Samuel J.; Fujiwara, Masatomo; Vömel, Holger; Allaart, Marc; Piters, Ankie; Coetzee, Gert J. R.; Posny, Françoise; Corrales, Ernesto; Diaz, Jorge Andres; Félix, Christian; Komala, Ninong; Lai, Nga; Ahn Nguyen, H. T.; Maata, Matakite; Mani, Francis; Zainal, Zamuna; Ogino, Shin-ya; Paredes, Francisco; Penha, Tercio Luiz Bezerra; da Silva, Francisco Raimundo; Sallons-Mitro, Sukarni; Selkirk, Henry B.; Schmidlin, F. J.; Stübi, Rene; Thiongo, Kennedy
2017-12-01
The Southern Hemisphere ADditional OZonesonde (SHADOZ) network was assembled to validate a new generation of ozone-monitoring satellites and to better characterize the vertical structure of tropical ozone in the troposphere and stratosphere. Beginning with nine stations in 1998, more than 7,000 ozone and P-T-U profiles are available from 14 SHADOZ sites that have operated continuously for at least a decade. We analyze ozone profiles from the recently reprocessed SHADOZ data set that is based on adjustments for inconsistencies caused by varying ozonesonde instruments and operating techniques. First, sonde-derived total ozone column amounts are compared to the overpasses from the Earth Probe/Total Ozone Mapping Spectrometer, Ozone Monitoring Instrument, and Ozone Mapping and Profiler Suite satellites that cover 1998-2016. Second, characteristics of the stratospheric and tropospheric columns are examined along with ozone structure in the tropical tropopause layer (TTL). We find that (1) relative to our earlier evaluations of SHADOZ data, in 2003, 2007, and 2012, sonde-satellite total ozone column offsets at 12 stations are 2% or less, a significant improvement; (2) as in prior studies, the 10 tropical SHADOZ stations, defined as within ±19° latitude, display statistically uniform stratospheric column ozone, 229 ± 3.9 DU (Dobson units), and a tropospheric zonal wave-one pattern with a 14 DU mean amplitude; (3) the TTL ozone column, which is also zonally uniform, masks complex vertical structure, and this argues against using satellites for lower stratospheric ozone trends; and (4) reprocessing has led to more uniform stratospheric column amounts across sites and reduced bias in stratospheric profiles. As a consequence, the uncertainty in total column ozone now averages 5%.
The Solar System Large Planets' influence on a new Maunder Minimum
NASA Astrophysics Data System (ADS)
Yndestad, Harald; Solheim, Jan-Erik
2016-04-01
In the 1890s, G. Spörer and E. W. Maunder (1890) reported that solar activity stopped for a period of 70 years, from 1645 to 1715. Later reconstructions of solar activity confirmed the grand minima Maunder (1640-1720), Spörer (1390-1550) and Wolf (1270-1340), and the minima Oort (1010-1070) and Dalton (1785-1810), since the year 1000 A.D. (Usoskin et al. 2007). These minimum periods have been associated with reduced irradiation from the Sun and cold climate periods on Earth. The identification of three grand Maunder-type periods and two Dalton-type periods within a thousand years indicates that sooner or later a colder climate will return to Earth with a new Maunder- or Dalton-type period. The cause of these minimum periods is not well understood. An expected new Maunder-type period is based on the properties of solar variability: if the solar variability has a deterministic element, we can better estimate a new Maunder grand minimum, whereas a random solar variability can only explain the past. This investigation is based on the simple idea that if the solar variability has a deterministic property, it must have a deterministic source as a first cause. If this deterministic source is known, we can compute better estimates of the next expected Maunder grand minimum period. The study is based on a TSI ACRIM data series from 1700, a TSI ACRIM data series from 1000 A.D., a sunspot data series from 1611 and a solar barycenter orbit data series from 1000. The analysis method is based on wavelet spectrum analysis, to identify stationary periods, coincidence periods and their phase relations. The results show that the TSI variability and the sunspot variability have deterministic oscillations, controlled by the large planets Jupiter, Uranus and Neptune, as the first cause. A deterministic model of TSI variability and sunspot variability confirms the known minimum and grand minimum periods since 1000. 
From this deterministic model we may expect a new Maunder-type sunspot minimum period from about 2018 to 2055. The deterministic model of the TSI ACRIM data series from 1700 computes a new Maunder-type grand minimum period from 2015 to 2071. A model of the longer TSI ACRIM data series from 1000 computes a new Dalton- to Maunder-type minimum irradiation period from 2047 to 2068.
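The study extracts stationary periods with a wavelet spectrum; the idea of identifying a dominant oscillation in a solar time series can be illustrated more simply with a plain FFT periodogram. This is a minimal stand-in for the analysis, not the study's wavelet method, and the synthetic 11-year signal below is an assumption for the example.

```python
import numpy as np

def dominant_period(series, dt=1.0):
    """Return the period of the strongest spectral peak in a time series.

    The mean is removed, the one-sided power spectrum is computed with a
    real FFT, and the zero-frequency bin is skipped before locating the
    maximum. dt is the sampling interval (e.g. 1 year).
    """
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt)
    k = np.argmax(power[1:]) + 1      # skip the zero-frequency bin
    return 1.0 / freqs[k]
```

Applied to an annually sampled sinusoid with an 11-year (Schwabe-like) cycle, the function recovers the 11-year period.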
NASA Technical Reports Server (NTRS)
Hollandsworth, Stacey M.; Schoeberl, Mark R.; Morris, Gary A.; Long, Craig; Zhou, Shuntai; Miller, Alvin J.
1999-01-01
In this study we utilize potential vorticity - isentropic (PVI) coordinate transformations as a means of combining ozone data from different sources to construct daily, synthetic three-dimensional ozone fields. This methodology has been used successfully to reconstruct ozone maps in particular regions from aircraft data over the period of the aircraft campaign. We expand this method to create high-resolution daily global maps of profile ozone data, particularly in the lower stratosphere, where high-resolution ozone data are sparse. Ozone climatologies in PVI-space are constructed from satellite-based SAGE II and UARS/HALOE data, both of which-use solar occultation techniques to make high vertical resolution ozone profile measurements, but with low spatial resolution. A climatology from ground-based balloonsonde data is also created. The climatologies are used to establish the relationship between ozone and dynamical variability, which is defined by the potential vorticity (in the form of equivalent latitude) and potential temperature fields. Once a PVI climatology has been created from data taken by one or more instruments, high-resolution daily profile ozone field estimates are constructed based solely on the PVI fields, which are available on a daily basis from NCEP analysis. These profile ozone maps could be used for a variety of applications, including use in conjunction with total ozone maps to create a daily tropospheric ozone product, as input to forecast models, or as a tool for validating independent ozone measurements when correlative data are not available. This technique is limited to regions where the ozone is a long-term tracer and the flow is adiabatic. We evaluate the internal consistency of the technique by transforming the ozone back to physical space and comparing to the original profiles. Biases in the long-term average of the differences are used to identify regions where the technique is consistently introducing errors. 
Initial results show the technique is useful in the lower stratosphere at most latitudes throughout the year, and in the winter hemisphere in the middle stratosphere. The results are problematic in the summer-hemisphere middle stratosphere due to increased ozone photochemistry and weak PV gradients. Alternate techniques in these regions will be discussed. An additional limitation is the quality and resolution of the meteorological data.
NASA Astrophysics Data System (ADS)
Cassagnole, Manon; Ramos, Maria-Helena; Thirel, Guillaume; Gailhard, Joël; Garçon, Rémy
2017-04-01
The improvement of a forecasting system and the evaluation of the quality of its forecasts are recurrent steps in operational practice. However, the evaluation of forecast value or forecast usefulness for better decision-making is, to our knowledge, less frequent, even if it might be essential in many sectors such as hydropower and flood warning. In the hydropower sector, forecast value can be quantified by the economic gain obtained with the optimization of operations or reservoir management rules. Several hydropower operational systems use medium-range forecasts (up to 7-10 days ahead) and energy price predictions to optimize hydropower production. Hence, the operation of hydropower systems, including the management of water in reservoirs, is impacted by weather, climate and hydrologic variability as well as extreme events. In order to assess how the quality of hydrometeorological forecasts impacts operations, it is essential to first understand if and how operations and management rules are sensitive to input predictions of different quality. This study investigates how 7-day-ahead deterministic and ensemble streamflow forecasts of different quality might impact the economic gains of energy production. It is based on a research model developed by Irstea and EDF to investigate issues relevant to the links between quality and value of forecasts in the optimisation of energy production at the short range. Based on streamflow forecasts and pre-defined management constraints, the model defines the best hours (i.e., the hours with high energy prices) to produce electricity. To highlight the link between forecast quality and economic value, we built several synthetic ensemble forecasts based on observed streamflow time series. These inputs are generated in a controlled environment in order to obtain forecasts of different quality in terms of accuracy and reliability. These forecasts are used to assess the sensitivity of the decision model to forecast quality. 
Relationships between forecast quality and economic value are discussed. This work is part of the IMPREX project, a research project supported by the European Commission under the Horizon 2020 Framework programme, with grant No. 641811 (http://www.imprex.eu)
NASA Technical Reports Server (NTRS)
Bak, Juseon; Liu, X.; Wei, J.; Kim, J. H.; Chance, K.; Barnet, C.
2011-01-01
An advanced algorithm based on the optimal estimation technique has been developed to derive ozone profiles from GOME UV radiances, and it has been adapted to OMI UV radiances. The OMI vertical resolution is 7-11 km in the troposphere and 10-14 km in the stratosphere. Satellite ultraviolet measurements (GOME, OMI) contain little vertical information on small-scale ozone structure, especially in the upper troposphere (UT) and lower stratosphere (LS), where a sharp O3 gradient across the tropopause and large ozone variability are observed. Therefore, the retrievals depend greatly on a-priori knowledge in the UTLS.
An Ensemble-Based Smoother with Retrospectively Updated Weights for Highly Nonlinear Systems
NASA Technical Reports Server (NTRS)
Chin, T. M.; Turmon, M. J.; Jewell, J. B.; Ghil, M.
2006-01-01
Monte Carlo computational methods have been introduced into data assimilation for nonlinear systems in order to alleviate the computational burden of updating and propagating the full probability distribution. By propagating an ensemble of representative states, algorithms like the ensemble Kalman filter (EnKF) and the resampled particle filter (RPF) rely on the existing modeling infrastructure to approximate the distribution based on the evolution of this ensemble. This work presents an ensemble-based smoother that is applicable to Monte Carlo filtering schemes such as the EnKF and the RPF. At the minor cost of retrospectively updating a set of weights for ensemble members, this smoother has demonstrated superior capabilities in state tracking for two highly nonlinear problems: the double-well potential and trivariate Lorenz systems. The algorithm does not require retrospective adaptation of the ensemble members themselves, and it is thus suited to a streaming operational mode. The accuracy of the proposed backward-update scheme in estimating non-Gaussian distributions is evaluated by comparison to the more accurate estimates provided by a Markov chain Monte Carlo algorithm.
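The core idea, leaving member trajectories untouched and retrospectively reweighting them, can be sketched with a Gaussian-likelihood weight update. This is a minimal illustration of the weighting principle, not the paper's full smoother; the function names and the scalar-observation setting are assumptions.

```python
import numpy as np

def update_weights(weights, ensemble_obs, observation, obs_var):
    """Retrospective weight update for an ensemble-based smoother.

    Each member keeps its trajectory; only its weight changes, scaled by
    the Gaussian likelihood of the new observation given that member's
    predicted observable, then renormalised to sum to one.
    """
    likelihood = np.exp(-0.5 * (ensemble_obs - observation) ** 2 / obs_var)
    w = weights * likelihood
    return w / w.sum()

def smoothed_estimate(weights, past_states):
    """Smoothed estimate of a past state: the weighted ensemble mean over
    the stored member states at that earlier time."""
    return np.dot(weights, past_states)
```

Because only the weight vector is revised when new data arrive, past ensemble states never need to be re-propagated, which is what makes the scheme suitable for a streaming operational mode.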
40 CFR 52.1582 - Control strategy and regulations: Ozone.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 4 2014-07-01 2014-07-01 false Control strategy and regulations: Ozone... Control strategy and regulations: Ozone. (a) Subchapter 16 of the New Jersey Administrative Code, entitled... of the 1990 Clean Air Act. (d)(1) The base year ozone precursor emission inventory requirement of...
40 CFR 52.1582 - Control strategy and regulations: Ozone.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 4 2012-07-01 2012-07-01 false Control strategy and regulations: Ozone... Control strategy and regulations: Ozone. (a) Subchapter 16 of the New Jersey Administrative Code, entitled... of the 1990 Clean Air Act. (d)(1) The base year ozone precursor emission inventory requirement of...
40 CFR 52.1582 - Control strategy and regulations: Ozone.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 4 2011-07-01 2011-07-01 false Control strategy and regulations: Ozone... Control strategy and regulations: Ozone. (a) Subchapter 16 of the New Jersey Administrative Code, entitled... of the 1990 Clean Air Act. (d)(1) The base year ozone precursor emission inventory requirement of...
40 CFR 52.1582 - Control strategy and regulations: Ozone.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 4 2010-07-01 2010-07-01 false Control strategy and regulations: Ozone... Control strategy and regulations: Ozone. (a) Subchapter 16 of the New Jersey Administrative Code, entitled... of the 1990 Clean Air Act. (d)(1) The base year ozone precursor emission inventory requirement of...
40 CFR 52.1582 - Control strategy and regulations: Ozone.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 4 2013-07-01 2013-07-01 false Control strategy and regulations: Ozone... Control strategy and regulations: Ozone. (a) Subchapter 16 of the New Jersey Administrative Code, entitled... of the 1990 Clean Air Act. (d)(1) The base year ozone precursor emission inventory requirement of...
A Chemiluminescence Detector for Ozone Measurement.
ERIC Educational Resources Information Center
Carroll, H.; And Others
An ozone detector was built and evaluated for its applicability in smog chamber studies. The detection method is based on reaction of ozone with ethylene and measurement of resultant chemiluminescence. In the first phase of evaluation, the detector's response to ozone was studied as a function of several instrument parameters, and optimum…
2003-06-26
VANDENBERG AIR FORCE BASE, CALIF. - At Vandenberg Air Force Base, Calif., the Pegasus launch vehicle is moved toward its hangar. The Pegasus will carry the SciSat-1 spacecraft into a 400-mile-high polar orbit to investigate processes that control the distribution of ozone in the upper atmosphere. The data from the satellite will provide Canadian and international scientists with improved measurements relating to global ozone processes, and will help policymakers assess existing environmental policy and develop protective measures for improving the health of our atmosphere and preventing further ozone depletion. The mission is designed to last two years.
Understanding Southern Ocean SST Trends in Historical Simulations and Observations
NASA Astrophysics Data System (ADS)
Kostov, Yavor; Ferreira, David; Marshall, John; Armour, Kyle
2017-04-01
Historical simulations with CMIP5 global climate models do not reproduce the observed 1979-2014 Southern Ocean (SO) cooling, and most ensemble members predict gradual warming around Antarctica. In order to understand this discrepancy and the mechanisms behind the SO cooling, we analyze output from 19 CMIP5 models. For each ensemble member we estimate the characteristic responses of SO SST to step changes in greenhouse gas (GHG) forcing and in the seasonal indices of the Southern Annular Mode (SAM). Using these step-response functions and linear convolution theory, we reconstruct the original CMIP5 simulations of 1979-2014 SO SST trends. We recover the CMIP5 ensemble mean trend, capture the intermodel spread, and reproduce very well the behavior of individual models. We thus suggest that GHG forcing and the SAM are major drivers of the simulated 1979-2014 SO SST trends. Consistent with the seasonal signature of the Antarctic ozone hole, our results imply that the summer (DJF) and fall (MAM) SAM exert a particularly important effect on the SO SST. In some CMIP5 models the SO SST response to SAM partially counteracts the warming due to GHG forcing, while in other ensemble members the SAM-induced SO SST trends complement the warming effect of GHG forcing. The compensation between GHG and SAM-induced SO SST anomalies is model-dependent and is determined by multiple factors. Firstly, CMIP5 models have different characteristic SST step-response functions to SAM. Kostov et al. (2016) relate these differences to biases in the models' climatological SO temperature gradients. Secondly, many CMIP5 historical simulations underestimate the observed positive trends in the DJF and MAM seasonal SAM indices. We show that this affects the models' ability to reproduce the observed SO cooling. Last but not least, CMIP5 models differ in their SO SST step-response functions to GHG forcing. 
Understanding the diverse behavior of CMIP5 models helps shed light on the physical processes that drive SST trends in the real SO.
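The reconstruction step, convolving a model's step-response function with a forcing history, can be sketched in discrete time: the forcing is decomposed into increments, each of which launches a scaled, time-shifted copy of the step response. This is a generic illustration of the linear-convolution idea in the abstract, with synthetic arrays, not the study's data.

```python
import numpy as np

def reconstruct_response(step_response, forcing):
    """Reconstruct a response time series (e.g. SO SST) from a step-response
    function and a forcing history via linear convolution.

    step_response[k] is the response k steps after a unit step in forcing;
    each forcing increment dF[s] contributes dF[s] * step_response shifted
    to start at time s. Both inputs are 1-D arrays of equal length.
    """
    dF = np.diff(forcing, prepend=0.0)          # forcing increments
    n = len(forcing)
    out = np.zeros(n)
    for s in range(n):
        out[s:] += dF[s] * step_response[: n - s]
    return out
```

By linearity, responses to GHG forcing and to the seasonal SAM indices can be reconstructed separately with their own step-response functions and summed, which is how the compensation between the two drivers can be diagnosed.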
Efficient Algorithms for Handling Nondeterministic Automata
NASA Astrophysics Data System (ADS)
Vojnar, Tomáš
Finite (word, tree, or omega) automata play an important role in different areas of computer science, including, for instance, formal verification. Often, deterministic automata are used, for which traditional algorithms for important operations such as minimisation and inclusion checking are available. However, the use of deterministic automata implies a need to determinise nondeterministic automata that often arise during various computations, even when the computations start with deterministic automata. Unfortunately, determinisation is a very expensive step, since deterministic automata may be exponentially bigger than the original nondeterministic automata. That is why it appears advantageous to avoid determinisation and work directly with nondeterministic automata. This, however, brings a need to implement on nondeterministic automata the operations traditionally done on deterministic automata. In particular, this is the case for inclusion checking and minimisation (or rather reduction of the size of automata). In the talk, we review several recently proposed techniques for inclusion checking on nondeterministic finite word and tree automata as well as Büchi automata. These techniques are based on using so-called antichains, possibly combined with the use of suitable simulation relations (and, in the case of Büchi automata, the so-called Ramsey-based or rank-based approaches). Further, we discuss techniques for reducing the size of nondeterministic word and tree automata using quotienting based on the recently proposed notion of mediated equivalences. The talk is based on several common works with Parosh Aziz Abdulla, Ahmed Bouajjani, Yu-Fang Chen, Peter Habermehl, Lisa Kaati, Richard Mayr, Tayssir Touili, Lorenzo Clemente, Lukáš Holík, and Chih-Duo Hong.
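The antichain idea for inclusion checking on word automata can be sketched compactly: explore pairs of an A-state and a B-macrostate while pruning subsumed pairs, so B is never fully determinised. The sketch below is an illustrative toy for NFAs over finite words, with an assumed automaton encoding, not one of the optimised tools discussed in the talk.

```python
def nfa_inclusion(A, B, alphabet):
    """Antichain-based check of language inclusion L(A) ⊆ L(B) for NFAs.

    Each automaton is a triple (init, final, delta), where init and final
    are sets of states and delta[state][symbol] is a set of successors.
    The search explores pairs (p, S) of an A-state and a B-macrostate,
    discarding any pair subsumed by an already-visited (p, S') with
    S' ⊆ S; inclusion fails iff an accepting A-state is reachable with a
    non-accepting B-macrostate.
    """
    initA, finalA, dA = A
    initB, finalB, dB = B
    frontier = [(p, frozenset(initB)) for p in initA]
    antichain = []                      # minimal (p, S) pairs explored
    while frontier:
        p, S = frontier.pop()
        if p in finalA and not (S & finalB):
            return False                # counterexample word exists
        if any(q == p and T <= S for q, T in antichain):
            continue                    # subsumed by a smaller pair
        antichain = [(q, T) for q, T in antichain
                     if not (q == p and S <= T)]
        antichain.append((p, S))
        for a in alphabet:
            S2 = frozenset(s for t in S for s in dB.get(t, {}).get(a, ()))
            for p2 in dA.get(p, {}).get(a, ()):
                frontier.append((p2, S2))
    return True
```

On small examples this matches the subset-construction answer while visiting only subsumption-minimal macrostates, which is the source of the antichain method's practical speed-up.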
NASA Astrophysics Data System (ADS)
Clark, E.; Wood, A.; Nijssen, B.; Newman, A. J.; Mendoza, P. A.
2016-12-01
The System for Hydrometeorological Applications, Research and Prediction (SHARP), developed at the National Center for Atmospheric Research (NCAR), the University of Washington, the U.S. Army Corps of Engineers, and the U.S. Bureau of Reclamation, is a fully automated ensemble prediction system for short-term to seasonal applications. It incorporates uncertainty in initial hydrologic conditions (IHCs) and in hydrometeorological predictions. In this implementation, IHC uncertainty is estimated by propagating an ensemble of 100 plausible temperature and precipitation time series through the Sacramento/Snow-17 model. The forcing ensemble explicitly accounts for measurement and interpolation uncertainties in the development of gridded meteorological forcing time series. The resulting ensemble of derived IHCs exhibits a broad range of possible soil moisture and snow water equivalent (SWE) states. To select the IHCs that are most consistent with the observations, we employ a particle filter (PF) that weights IHC ensemble members based on observations of streamflow and SWE. These particles are then used to initialize ensemble precipitation and temperature forecasts downscaled from the Global Ensemble Forecast System (GEFS), generating a streamflow forecast ensemble. We test this method in two basins in the Pacific Northwest that are important for water resources management: 1) the Green River upstream of Howard Hanson Dam, and 2) the South Fork Flathead River upstream of Hungry Horse Dam. The first of these is characterized by mixed snow and rain, while the second is snow-dominated. The PF-based forecasts are compared to forecasts based on 1) a single IHC (corresponding to median streamflow) paired with the full GEFS ensemble, and 2) the full IHC ensemble, without filtering, paired with the full GEFS ensemble. 
In addition to assessing improvements in the spread of IHCs, we perform a hindcast experiment to evaluate the utility of PF-based data assimilation on streamflow forecasts at 1- to 7-day lead times.
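The particle-filter step described above, weighting IHC ensemble members by their fit to streamflow and SWE observations and then resampling, can be sketched as follows. This is a minimal illustration assuming independent Gaussian observation errors; the function names, the likelihood form, and the systematic-resampling choice are our assumptions, not details taken from SHARP:

```python
import numpy as np

def pf_weights(sim, obs, sigma):
    """Normalised particle weights from a Gaussian likelihood.

    sim:   (n_members, n_obs) simulated values per ensemble member
           (e.g. streamflow and SWE at observation times)
    obs:   (n_obs,) corresponding observations
    sigma: assumed observation-error standard deviation
    """
    loglik = -0.5 * np.sum(((sim - obs) / sigma) ** 2, axis=1)
    w = np.exp(loglik - loglik.max())  # subtract max for stability
    return w / w.sum()

def systematic_resample(members, w, rng):
    """Resample member indices proportionally to their weights."""
    n = len(w)
    positions = (rng.random() + np.arange(n)) / n
    return members[np.searchsorted(np.cumsum(w), positions)]
```

Members whose simulated states match the observations receive most of the weight, so the resampled set of particles concentrates on the IHCs most consistent with observed streamflow and SWE before the GEFS-driven forecast is launched.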
Annual and Seasonal Global Variation in Total Ozone and Layer-Mean Ozone, 1958-1987 (1991)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angell, J. K.; Korshover, J.; Planet, W. G.
For 1958 through 1987, this database presents total ozone variations and layer mean ozone variations expressed as percent deviations from the 1958 to 1977 mean. The total ozone variations were derived from mean monthly ozone values published in Ozone Data for the World by the Atmospheric Environment Service in cooperation with the World Meteorological Organization. The layer mean ozone variations were derived from ozonesonde and Umkehr observations. The data records include year, seasonal and annual total ozone variations, and seasonal and annual layer mean ozone variations. The total ozone data are for four regions (Soviet Union, Europe, North America, and Asia); five climatic zones (north and south polar, north and south temperate, and tropical); both hemispheres; and the world. Layer mean ozone data are for four climatic zones (north and south temperate and north and south polar) and for the stratosphere, troposphere, and tropopause layers. The data are in two files: seasonal and year-average total ozone (13.4 kB) and layer mean ozone variations (24.2 kB).
Improving ensemble decision tree performance using Adaboost and Bagging
NASA Astrophysics Data System (ADS)
Hasan, Md. Rajib; Siraj, Fadzilah; Sainin, Mohd Shamrie
2015-12-01
Ensemble classifier systems are considered among the most promising approaches for medical data classification, and the performance of a decision tree classifier can be increased by ensemble methods, which have proven better than single classifiers. However, in an ensemble setting, performance depends on the selection of a suitable base classifier. This research employed two prominent ensemble methods, namely AdaBoost and Bagging, with base classifiers such as Random Forest, Random Tree, J48, J48graft and Logistic Model Tree (LMT), each selected independently. The empirical study shows that performance varies when different base classifiers are selected, and overfitting was also observed in some cases. The evidence shows that ensemble decision tree classifiers using AdaBoost and Bagging improve classification performance on the selected medical data sets.
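The Bagging scheme the abstract refers to can be sketched as follows: train each base learner on a bootstrap resample of the training data and combine predictions by majority vote. This is an illustrative, dependency-free implementation with decision stumps as the base classifier, not the Weka-based setup (J48, LMT, etc.) used in the study:

```python
import random
from collections import Counter

def fit_stump(X, y):
    """Pick the (feature, threshold, sign) split with fewest training errors."""
    best = None
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for sign in (1, -1):
                pred = [1 if sign * (x[f] - t) > 0 else 0 for x in X]
                err = sum(p != yy for p, yy in zip(pred, y))
                if best is None or err < best[0]:
                    best = (err, f, t, sign)
    return best[1:]

def stump_predict(stump, x):
    f, t, sign = stump
    return 1 if sign * (x[f] - t) > 0 else 0

def bagging_fit(X, y, n_estimators=11, seed=0):
    """Bagging: fit each stump on a bootstrap resample of (X, y)."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_estimators):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        models.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return models

def bagging_predict(models, x):
    """Majority vote over the ensemble members."""
    votes = Counter(stump_predict(m, x) for m in models)
    return votes.most_common(1)[0][0]
```

AdaBoost differs in that the resampling is replaced by per-example weights that are increased on misclassified examples after each round, and the final vote is weighted by each learner's training error; the dependence on the base classifier that the study observes enters through `fit_stump` here.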
Oliveira, Roberta B; Pereira, Aledir S; Tavares, João Manuel R S
2017-10-01
The number of deaths worldwide due to melanoma has risen in recent times, in part because melanoma is the most aggressive type of skin cancer. Computational systems have been developed to assist dermatologists in early diagnosis of skin cancer, or even to monitor skin lesions. However, there still remains a challenge to improve classifiers for the diagnosis of such skin lesions. The main objective of this article is to evaluate different ensemble classification models based on input feature manipulation to diagnose skin lesions. Input feature manipulation processes are based on feature subset selections from shape properties, colour variation and texture analysis to generate diversity for the ensemble models. Three subset selection models are presented here: (1) a subset selection model based on specific feature groups, (2) a correlation-based subset selection model, and (3) a subset selection model based on feature selection algorithms. Each ensemble classification model is generated using an optimum-path forest classifier and integrated with a majority voting strategy. The proposed models were applied on a set of 1104 dermoscopic images using a cross-validation procedure. The best results were obtained by the first ensemble classification model that generates a feature subset ensemble based on specific feature groups. The skin lesion diagnosis computational system achieved 94.3% accuracy, 91.8% sensitivity and 96.7% specificity. The input feature manipulation process based on specific feature subsets generated the greatest diversity for the ensemble classification model with very promising results. Copyright © 2017 Elsevier B.V. All rights reserved.
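The first ensemble model above, one classifier per specific feature group (shape, colour, texture) combined by majority voting, can be sketched as follows. This is a simplified illustration: it substitutes a nearest-centroid classifier for the optimum-path forest classifier used in the article, and all names are our own:

```python
import math
from collections import Counter

def fit_centroids(X, y):
    """Nearest-centroid classifier: mean feature vector per class label."""
    cent = {}
    for label in set(y):
        rows = [x for x, yy in zip(X, y) if yy == label]
        cent[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return cent

def predict_centroid(cent, x):
    return min(cent, key=lambda c: math.dist(x, cent[c]))

def subset_ensemble_fit(X, y, groups):
    """Train one classifier per feature-index group
    (e.g. shape / colour / texture subsets of the full feature vector)."""
    return [(g, fit_centroids([[x[i] for i in g] for x in X], y))
            for g in groups]

def subset_ensemble_predict(models, x):
    """Majority vote across the group-specific classifiers."""
    votes = Counter(predict_centroid(c, [x[i] for i in g]) for g, c in models)
    return votes.most_common(1)[0][0]
```

Restricting each member to a different feature subset is what generates the diversity among ensemble members that the article identifies as the source of the improved results.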
Ensemble Classifier Strategy Based on Transient Feature Fusion in Electronic Nose
NASA Astrophysics Data System (ADS)
Bagheri, Mohammad Ali; Montazer, Gholam Ali
2011-09-01
In this paper, we test the performance of several ensembles of classifiers in which each base learner has been trained on a different type of extracted feature. Experimental results show the potential benefits introduced by the use of simple ensemble classification systems for the integration of different types of transient features.