Sample records for forecasting multivariate realized

  1. Forecasting stock market volatility: Do realized skewness and kurtosis help?

    NASA Astrophysics Data System (ADS)

    Mei, Dexiang; Liu, Jing; Ma, Feng; Chen, Wang

    2017-09-01

    In this study, we investigate the predictability of realized skewness (RSK) and realized kurtosis (RKU) for stock market volatility, which has not been addressed in existing studies. Out-of-sample results show that RSK, which can significantly improve forecast accuracy at mid- and long-term horizons, is more powerful than RKU in forecasting volatility, whereas both variables are of little use in short-term forecasting. Furthermore, we employ the realized kernel (RK) for the robustness analysis, and the conclusions are consistent with those based on the RV measures. Our results are of great importance for portfolio allocation and financial risk management.
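
    As a worked illustration (not the authors' code), a one-day-ahead HAR-type regression can be augmented with realized skewness and kurtosis roughly as follows; the arrays rv, rsk and rku are hypothetical daily series, and the lag structure follows the usual HAR convention of daily, weekly (5-day) and monthly (22-day) averages.

      import numpy as np

      def har_design(rv, extra=None):
          """Build HAR regressors (daily, weekly, monthly RV averages), optionally
          appending extra predictors such as realized skewness/kurtosis, together
          with the one-day-ahead target."""
          rows, target = [], []
          for t in range(21, len(rv) - 1):
              row = [1.0, rv[t], rv[t - 4:t + 1].mean(), rv[t - 21:t + 1].mean()]
              if extra is not None:
                  row.extend(series[t] for series in extra)
              rows.append(row)
              target.append(rv[t + 1])
          return np.array(rows), np.array(target)

      # hypothetical usage: rv, rsk, rku are 1-D numpy arrays of daily realized measures
      # X, y = har_design(rv, extra=[rsk, rku])
      # beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # OLS fit of the extended HAR model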

  2. A multivariate time series approach to modeling and forecasting demand in the emergency department.

    PubMed

    Jones, Spencer S; Evans, R Scott; Allen, Todd L; Thomas, Alun; Haug, Peter J; Welch, Shari J; Snow, Gregory L

    2009-02-01

    The goals of this investigation were to study the temporal relationships between the demands for key resources in the emergency department (ED) and the inpatient hospital, and to develop multivariate forecasting models. Hourly data were collected from three diverse hospitals for the year 2006. Descriptive analysis and model fitting were carried out using graphical and multivariate time series methods. Multivariate models were compared to a univariate benchmark model in terms of their ability to provide out-of-sample forecasts of ED census and the demands for diagnostic resources. Descriptive analyses revealed little temporal interaction between the demand for inpatient resources and the demand for ED resources at the facilities considered. Multivariate models provided more accurate forecasts of ED census and of the demands for diagnostic resources. Our results suggest that multivariate time series models can be used to reliably forecast ED patient census; however, forecasts of the demands for diagnostic resources were not sufficiently reliable to be useful in the clinical setting.
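
    A minimal sketch of the kind of multivariate time-series model the study compares against univariate benchmarks, using statsmodels' VARMAX; the hourly demand DataFrame and the order (2, 0) are illustrative assumptions, not the authors' specification.

      import pandas as pd
      from statsmodels.tsa.statespace.varmax import VARMAX

      def fit_and_forecast(demand_df, order=(2, 0), steps=24):
          """Jointly model several hourly demand series (e.g. ED census and
          diagnostic-resource demands) and forecast the next `steps` hours."""
          model = VARMAX(demand_df, order=order)
          result = model.fit(disp=False)
          return result.forecast(steps=steps)

      # hypothetical usage:
      # demand_df = pd.read_csv("ed_hourly.csv", index_col="timestamp", parse_dates=True)
      # print(fit_and_forecast(demand_df))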

  3. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    PubMed

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

    Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computers, called time-delay reservoirs, that are constructed out of the sampling of the solution of a time-delay differential equation, and show their good performance in forecasting the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type, as well as in predicting actual daily market realized volatilities computed with intraday quotes, using daily log-return series of moderate size as training input. We tackle some problems associated with the lack of task-universality of individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Simultaneous calibration of ensemble river flow predictions over an entire range of lead times

    NASA Astrophysics Data System (ADS)

    Hemri, S.; Fundel, F.; Zappa, M.

    2013-10-01

    Probabilistic estimates of future water levels and river discharge are usually simulated with hydrologic models using ensemble weather forecasts as main inputs. As hydrologic models are imperfect and the meteorological ensembles tend to be biased and underdispersed, the ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, in order to achieve both reliable and sharp predictions, statistical postprocessing is required. In this work Bayesian model averaging (BMA) is applied to statistically postprocess raw ensemble runoff forecasts for a catchment in Switzerland, at lead times ranging from 1 to 240 h. The raw forecasts have been obtained using deterministic and ensemble forcing meteorological models with different forecast lead time ranges. First, BMA is applied based on mixtures of univariate normal distributions, subject to the assumption of independence between distinct lead times. Then, the independence assumption is relaxed in order to estimate multivariate runoff forecasts over the entire range of lead times simultaneously, based on a BMA version that uses multivariate normal distributions. Since river runoff is a highly skewed variable, Box-Cox transformations are applied in order to achieve approximate normality. Both univariate and multivariate BMA approaches are able to generate well calibrated probabilistic forecasts that are considerably sharper than climatological forecasts. Additionally, multivariate BMA provides a promising approach for incorporating temporal dependencies into the postprocessed forecasts. Its major advantage over univariate BMA is an increase in reliability when the forecast system is changing due to model availability.
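
    The predictive density described here can be sketched as a weighted mixture of normals in Box-Cox space; the snippet below is a simplified illustration under the assumption that the BMA weights, the common spread sigma and the Box-Cox parameter lam have already been estimated (e.g. by EM), details the abstract does not spell out.

      import numpy as np
      from scipy import stats

      def bma_predictive_pdf(y_grid, members, weights, sigma, lam):
          """BMA predictive density of runoff on a (positive) grid of values:
          a weighted mixture of normals in Box-Cox space, each centred on one
          bias-corrected ensemble member; weights, sigma and lam assumed fitted."""
          z_grid = stats.boxcox(np.asarray(y_grid, dtype=float), lmbda=lam)
          z_members = stats.boxcox(np.asarray(members, dtype=float), lmbda=lam)
          dens = sum(w * stats.norm.pdf(z_grid, loc=m, scale=sigma)
                     for w, m in zip(weights, z_members))
          # Jacobian y**(lam - 1) maps the density back from Box-Cox to runoff space
          return dens * np.asarray(y_grid) ** (lam - 1.0)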

  5. Prediction of ENSO episodes using canonical correlation analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnston, A.G.; Ropelewski, C.F.

    Canonical correlation analysis (CCA) is explored as a multivariate linear statistical methodology with which to forecast fluctuations of the El Nino/Southern Oscillation (ENSO) in real time. CCA is capable of identifying critical sequences of predictor patterns that tend to evolve into subsequent patterns that can be used to form a forecast. The CCA model is used to forecast the 3-month mean sea surface temperature (SST) in several regions of the tropical Pacific and Indian oceans for projection times of 0 to 4 seasons beyond the immediately forthcoming season. The predictor variables, representing the climate situation in the four consecutive 3-month periods ending at the time of the forecast, are (1) quasi-global seasonal mean sea level pressure (SLP) and (2) SST in the predicted regions themselves. Forecast skill is estimated using cross-validation, and persistence is used as the primary skill control measure. Results indicate that a large region in the eastern equatorial Pacific (120°-170°W longitude) has the highest overall predictability, with excellent skill realized for winter forecasts made at the end of summer. CCA outperforms persistence in this region under most conditions, and does noticeably better with SST included as a predictor in addition to SLP. It is demonstrated that better forecast performance at the longer lead times would be obtained if some significantly earlier (i.e., up to 4 years) predictor data were included, because the ability to predict the lower-frequency ENSO phase changes would increase. The good performance of the current system at shorter lead times appears to be based largely on the ability to predict ENSO evolution for events already in progress. The forecasting of the eastern tropical Pacific SST using CCA is now done routinely on a monthly basis for 0-, 1-, and 2-season leads at the Climate Analysis Center.
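
    A stripped-down sketch of a CCA-based statistical forecast in the spirit of the record above, using scikit-learn; in practice the predictor fields would first be compressed (e.g. with EOFs) and skill assessed by cross-validation, and all array names here are hypothetical.

      import numpy as np
      from sklearn.cross_decomposition import CCA

      def cca_forecast(X_train, Y_train, X_new, n_modes=3):
          """Fit CCA on historical predictor/predictand pairs (rows = forecast cases)
          and project a new predictor state onto the canonical patterns to forecast
          the predictand (e.g. regional 3-month mean SST anomalies)."""
          cca = CCA(n_components=n_modes)
          cca.fit(X_train, Y_train)
          return cca.predict(X_new)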

  6. Experiments with a three-dimensional statistical objective analysis scheme using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, Wayman E.; Bloom, Stephen C.; Woollen, John S.; Nestler, Mark S.; Brin, Eugenia

    1987-01-01

    A three-dimensional (3D), multivariate, statistical objective analysis scheme (referred to as optimum interpolation or OI) has been developed for use in numerical weather prediction studies with the FGGE data. Some novel aspects of the present scheme include: (1) a multivariate surface analysis over the oceans, which employs an Ekman balance instead of the usual geostrophic relationship, to model the pressure-wind error cross correlations, and (2) the capability to use an error correlation function which is geographically dependent. A series of 4-day data assimilation experiments are conducted to examine the importance of some of the key features of the OI in terms of their effects on forecast skill, as well as to compare the forecast skill using the OI with that utilizing a successive correction method (SCM) of analysis developed earlier. For the three cases examined, the forecast skill is found to be rather insensitive to varying the error correlation function geographically. However, significant differences are noted between forecasts from a two-dimensional (2D) version of the OI and those from the 3D OI, with the 3D OI forecasts exhibiting better forecast skill. The 3D OI forecasts are also more accurate than those from the SCM initial conditions. The 3D OI with the multivariate oceanic surface analysis was found to produce forecasts which were slightly more accurate, on the average, than a univariate version.
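
    For reference, the heart of an optimum-interpolation (statistical objective analysis) scheme is the gain computation below; this is a generic textbook sketch, not the specific FGGE-era implementation, and the covariance matrices B and R are assumed given (in the multivariate scheme B carries the pressure-wind error cross-correlations).

      import numpy as np

      def oi_analysis(xb, y, H, B, R):
          """One optimum-interpolation analysis step: xb background state vector,
          y observation vector, H linear observation operator, B background and
          R observation error covariance matrices."""
          K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # weight (gain) matrix
          return xb + K @ (y - H @ xb)                   # analysis = background + weighted innovations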

  7. Using Meteorological Analogues for Reordering Postprocessed Precipitation Ensembles in Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Bellier, Joseph; Bontron, Guillaume; Zin, Isabella

    2017-12-01

    Meteorological ensemble forecasts are nowadays widely used as input of hydrological models for probabilistic streamflow forecasting. These forcings are frequently biased and have to be statistically postprocessed, using most of the time univariate techniques that apply independently to individual locations, lead times and weather variables. Postprocessed ensemble forecasts therefore need to be reordered so as to reconstruct suitable multivariate dependence structures. The Schaake shuffle and ensemble copula coupling are the two most popular methods for this purpose. This paper proposes two adaptations of them that make use of meteorological analogues for reconstructing spatiotemporal dependence structures of precipitation forecasts. Performances of the original and adapted techniques are compared through a multistep verification experiment using real forecasts from the European Centre for Medium-Range Weather Forecasts. This experiment evaluates not only multivariate precipitation forecasts but also the corresponding streamflow forecasts that derive from hydrological modeling. Results show that the relative performances of the different reordering methods vary depending on the verification step. In particular, the standard Schaake shuffle is found to perform poorly when evaluated on streamflow. This emphasizes the crucial role of the precipitation spatiotemporal dependence structure in hydrological ensemble forecasting.
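
    The reordering step shared by the Schaake shuffle and ensemble copula coupling can be sketched as follows; `template` would be the raw ensemble (ECC), historical observations, or, in the adaptations proposed here, trajectories taken from meteorological analogues. This is an illustrative sketch, not the authors' implementation.

      import numpy as np

      def reorder_like_template(samples, template):
          """Rearrange each margin's postprocessed samples (columns = locations,
          lead times or variables; rows = members) so that their rank order
          matches the rank order of the dependence template."""
          sorted_samples = np.sort(samples, axis=0)
          ranks = np.argsort(np.argsort(template, axis=0), axis=0)
          return np.take_along_axis(sorted_samples, ranks, axis=0)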

  8. Impact of multi-resolution analysis of artificial intelligence models inputs on multi-step ahead river flow forecasting

    NASA Astrophysics Data System (ADS)

    Badrzadeh, Honey; Sarukkalige, Ranjan; Jayawardena, A. W.

    2013-12-01

    Highlights: • The discrete wavelet transform was applied to decompose ANN and ANFIS inputs. • A novel WNF approach with subtractive clustering was applied for flow forecasting. • Forecasting was performed 1-5 steps ahead using multivariate inputs. • Forecasting accuracy for peak values and at longer lead times improved significantly.
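
    A minimal sketch of the input preprocessing described in the highlights, using PyWavelets; the wavelet family ('db4') and decomposition level are illustrative assumptions.

      import numpy as np
      import pywt

      def wavelet_features(flow_series, wavelet="db4", level=3):
          """Decompose a flow series with the discrete wavelet transform and return
          the approximation and detail coefficients, which can then serve as
          multi-resolution inputs to an ANN/ANFIS forecasting model."""
          return pywt.wavedec(np.asarray(flow_series, dtype=float), wavelet, level=level)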

  9. Multivariate postprocessing techniques for probabilistic hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2016-04-01

    Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References: Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein (2015), Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power generation, Applied Energy, 96, 12-20, DOI: 10.1016/j.apenergy.2011.11.004. Schefzik, R., T. L. Thorarinsdottir, and T. Gneiting (2013), Uncertainty quantification in complex simulation models using ensemble copula coupling, Statistical Science, 28, 616-640, DOI: 10.1214/13-STS443.
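
    A sketch of the Gaussian copula approach (GCA) mentioned above, under simplifying assumptions: the EMOS-calibrated marginal quantile functions (one per lead time) are taken as given, and the temporal correlation is estimated from normal-score-transformed training observations.

      import numpy as np
      from scipy import stats

      def gca_sample(marginal_ppfs, train_obs, n_samples=50):
          """Draw multivariate forecast trajectories: estimate the lead-time
          correlation of training observations (rows = cases, columns = lead times)
          after a normal-score transform, sample correlated Gaussians, and map them
          through the calibrated marginal quantile functions, one per lead time."""
          n_cases = train_obs.shape[0]
          u = (stats.rankdata(train_obs, axis=0) - 0.5) / n_cases
          corr = np.corrcoef(stats.norm.ppf(u), rowvar=False)
          z = stats.multivariate_normal.rvs(mean=np.zeros(len(corr)), cov=corr, size=n_samples)
          u_new = stats.norm.cdf(z)
          return np.column_stack([ppf(u_new[:, k]) for k, ppf in enumerate(marginal_ppfs)])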

  10. Ecological forecasting in the presence of abrupt regime shifts

    NASA Astrophysics Data System (ADS)

    Dippner, Joachim W.; Kröncke, Ingrid

    2015-10-01

    Regime shifts may cause an intrinsic decrease in the potential predictability of marine ecosystems. In such cases, forecasts of biological variables fail. To improve prediction of long-term variability in environmental variables, we constructed a multivariate climate index and applied it to forecast ecological time series. The concept is demonstrated herein using climate and macrozoobenthos data from the southern North Sea. Special emphasis is given to the influence of the choice of fitting-period length on forecast skill, especially in the presence of regime shifts. Our results indicate that the performance of multivariate predictors in biological forecasts is much better than that of single large-scale climate indices, especially in the presence of regime shifts. The approach used to develop the index is generally applicable to all geographical regions in the world and to all areas of marine biology, from the species level up to biodiversity. Such forecasts are of vital interest for practical aspects of the sustainable management of marine ecosystems and the conservation of ecosystem goods and services.

  11. Structural changes and out-of-sample prediction of realized range-based variance in the stock market

    NASA Astrophysics Data System (ADS)

    Gong, Xu; Lin, Boqiang

    2018-03-01

    This paper aims to examine the effects of structural changes on forecasting the realized range-based variance in the stock market. Considering structural changes in variance in the stock market, we develop the HAR-RRV-SC model on the basis of the HAR-RRV model. Subsequently, the HAR-RRV and HAR-RRV-SC models are used to forecast the realized range-based variance of the S&P 500 Index. We find that there are many structural changes in variance in the U.S. stock market, and the period after the financial crisis contains more structural change points than the period before the financial crisis. The out-of-sample results show that the HAR-RRV-SC model significantly outperforms the HAR-RRV model when they are employed to forecast the 1-day, 1-week, and 1-month realized range-based variances, which means that structural changes can improve out-of-sample prediction of realized range-based variance. The out-of-sample results remain robust across the alternative rolling fixed window, the alternative threshold value in the ICSS algorithm, and the alternative benchmark models. More importantly, we believe that considering structural changes can help improve the out-of-sample performance of most other existing HAR-RRV-type models in addition to those used in this paper.

  12. Analysis/forecast experiments with a multivariate statistical analysis scheme using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Nestler, M. S.

    1985-01-01

    A three-dimensional, multivariate, statistical analysis method, optimal interpolation (OI) is described for modeling meteorological data from widely dispersed sites. The model was developed to analyze FGGE data at the NASA-Goddard Laboratory of Atmospherics. The model features a multivariate surface analysis over the oceans, including maintenance of the Ekman balance and a geographically dependent correlation function. Preliminary comparisons are made between the OI model and similar schemes employed at the European Center for Medium Range Weather Forecasts and the National Meteorological Center. The OI scheme is used to provide input to a GCM, and model error correlations are calculated for forecasts of 500 mb vertical water mixing ratios and the wind profiles. Comparisons are made between the predictions and measured data. The model is shown to be as accurate as a successive corrections model out to 4.5 days.

  13. Which is the better forecasting model? A comparison between HAR-RV and multifractality volatility

    NASA Astrophysics Data System (ADS)

    Ma, Feng; Wei, Yu; Huang, Dengshi; Chen, Yixiang

    2014-07-01

    In this paper, taking the 5-min high-frequency data of the Shanghai Composite Index as an example, we compare the forecasting performance of HAR-RV, multifractal volatility, realized volatility, and realized bipower variation models, as well as their corresponding short-memory models, using a rolling-window forecasting method and the Model Confidence Set (MCS) test, which has been shown to be superior to the SPA test. The empirical results show that, for six loss functions, HAR-RV outperforms the other models. Moreover, to make the conclusions more precise and robust, we use the MCS test to compare the performance of the models in logarithmic form, and find that HAR-log(RV) performs better in predicting future volatility. Furthermore, by comparing the HAR-RV and HAR-log(RV) models, we conclude that, in terms of forecasting performance, HAR-log(RV) is the best among the models discussed in this paper.

  14. Theoretical Models for Aircraft Availability: Classical Approach to Identification of Trends, Seasonality, and System Constraints in the Development of Realized Models

    DTIC Science & Technology

    2004-03-01

    predicting future events (Heizer and Render, 1999). Forecasting techniques fall into two major categories, qualitative and quantitative methods... Globemaster III.” Excerpt from website: www.globalsecurity.org/military/systems/aircraft/c-17-history.htm. 2003. Heizer, Jay, and Barry Render... of the past data used to make the forecast (Heizer et al., 1999). Explanatory forecasting models assume that the variable being forecasted

  15. A Comparison of Conventional Linear Regression Methods and Neural Networks for Forecasting Educational Spending.

    ERIC Educational Resources Information Center

    Baker, Bruce D.; Richards, Craig E.

    1999-01-01

    Applies neural network methods for forecasting 1991-95 per-pupil expenditures in U.S. public elementary and secondary schools. Forecasting models included the National Center for Education Statistics' multivariate regression model and three neural architectures. Regarding prediction accuracy, neural network results were comparable or superior to…

  16. Probabilistic postprocessing models for flow forecasts for a system of catchments and several lead times

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn; Steinsland, Ingelin

    2014-05-01

    This study introduces a methodology for the construction of probabilistic inflow forecasts for multiple catchments and lead times, and investigates criteria for the evaluation of multivariate forecasts. A post-processing approach is used, and a Gaussian model is applied for transformed variables. The post-processing model has two main components, the mean model and the dependency model. The mean model is used to estimate the marginal distributions of forecasted inflow for each catchment and lead time, whereas the dependency model is used to estimate the full multivariate distribution of forecasts, i.e. co-variances between catchments and lead times. In operational situations, it is a straightforward task to use the models to sample inflow ensembles which inherit the dependencies between catchments and lead times. The methodology was tested and demonstrated in the river systems linked to the Ulla-Førre hydropower complex in southern Norway, where simultaneous probabilistic forecasts for five catchments and ten lead times were constructed. The methodology exhibits sufficient flexibility to utilize deterministic flow forecasts from a numerical hydrological model as well as statistical forecasts such as persistent forecasts and sliding-window climatology forecasts. It also deals with variation in the relative weights of these forecasts with both catchment and lead time. When evaluating predictive performance in original space using cross-validation, the case study found that it is important to include the persistent forecast for the initial lead times and the hydrological forecast for medium-term lead times. Sliding-window climatology forecasts become more important for the latest lead times. Furthermore, operationally important features in this case study, such as heteroscedasticity, lead-time-varying between-lead-time dependency and lead-time-varying between-catchment dependency, are captured. Two criteria were used for evaluating the added value of the dependency model. The first was the energy score (ES), a multi-dimensional generalization of the continuous ranked probability score (CRPS). ES was calculated for all lead times and catchments together, for each catchment across all lead times, and for each lead time across all catchments. The second criterion was to use the CRPS for forecasted inflows accumulated over several lead times and catchments. The results showed that ES was not very sensitive to a correct covariance structure, whereas the CRPS for accumulated flows was more suitable for evaluating the dependency model. This indicates that it is more appropriate to evaluate relevant univariate variables that depend on the dependency structure than to evaluate the multivariate forecast directly.
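
    The two verification criteria discussed here can be illustrated with the standard ensemble estimator of the energy score (which reduces to the CRPS in one dimension); this is a generic sketch, not the study's code.

      import numpy as np

      def energy_score(obs, ens):
          """Energy score of one multivariate forecast case: `obs` is the observed
          vector (e.g. catchments x lead times, flattened), `ens` an
          (n_members, dim) array of forecast members. Lower is better; for a
          single dimension this is the ensemble CRPS."""
          m = ens.shape[0]
          term1 = np.mean(np.linalg.norm(ens - obs, axis=1))
          term2 = np.sum(np.linalg.norm(ens[:, None, :] - ens[None, :, :], axis=2)) / (2.0 * m * m)
          return term1 - term2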

  17. Delivering bad news in emergency care medicine.

    PubMed

    Maynard, Douglas W

    2017-01-01

    Forecasting is a strategy for delivering bad news and is compared to two other strategies, stalling and being blunt. Forecasting provides some warning that bad news is forthcoming without keeping the recipient in a state of indefinite suspense (stalling) or conveying the news abruptly (being blunt). Forecasting appears to be more effective than stalling or being blunt in helping a recipient to "realize" the bad news because it involves the deliverer and recipient in a particular social relation. The deliverer of bad news initiates the telling by giving an advance indication of the bad news to come; this allows the recipient to calculate the news in advance of its final presentation, when the deliverer confirms what the recipient has been led to anticipate. Thus, realization of bad news emerges from intimate collaboration, whereas stalling and being blunt require recipients to apprehend the news in a social vacuum. Exacerbating disruption to recipients' everyday world, stalling and being blunt increase the probability of misapprehension (denying, blaming, taking the situation as a joke, etc.) and thereby inhibit rather than facilitate realization. Particular attention is paid to the "perspective display sequence", a particular forecasting strategy that enables both confirming the recipient's perspective and using that perspective to affirm the clinical news. An example from acute or emergency medicine is examined at the close of the paper.

  18. Forecasting of municipal solid waste quantity in a developing country using multivariate grey models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Intharathirat, Rotchana, E-mail: rotchana.in@gmail.com; Abdul Salam, P., E-mail: salam@ait.ac.th; Kumar, S., E-mail: kumar@ait.ac.th

    Highlights: • Grey model can be used to forecast MSW quantity accurately with limited data. • Prediction interval overcomes the uncertainty of MSW forecast effectively. • A multivariate model gives accuracy associated with factors affecting MSW quantity. • Population, urbanization, employment and household size play a role in MSW quantity. - Abstract: In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to carry out reliable estimates using existing models due to the limited data available in developing countries. This study aims to forecast the MSW collected in Thailand, with prediction intervals, over a long-term period by using an optimized multivariate grey model, which is a mathematical approach. For the multivariate models, the representative factors of the residential and commercial sectors affecting waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. MSW collected would increase by 1.40% per year, from 43,435–44,994 tonnes per day in 2013 to 55,177–56,735 tonnes per day in 2030. This model also illustrates that population density is the most important factor affecting MSW collected, followed by urbanization, employment proportion and household size, respectively. This means that the representative factors of the commercial sector may affect MSW collected more than those of the residential sector. Results can help decision makers to develop measures and policies for waste management over the long term.
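
    For orientation, the univariate GM(1,1) grey model underlying the multivariate GM(1,N)/GMC(1,N) variants used here can be sketched as below; this is a textbook formulation, not the optimized model of the study.

      import numpy as np

      def gm11_forecast(x, horizon=5):
          """Univariate GM(1,1) grey model: fit on the accumulated series, forecast
          `horizon` further points with the time-response function, then difference
          back to the original scale."""
          x = np.asarray(x, dtype=float)
          x1 = np.cumsum(x)                                 # accumulated generating operation
          z = 0.5 * (x1[1:] + x1[:-1])                      # background values
          B = np.column_stack([-z, np.ones_like(z)])
          a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]   # development coefficient and grey input
          k = np.arange(len(x) + horizon)
          x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a  # time-response function
          x_hat = np.concatenate([[x1_hat[0]], np.diff(x1_hat)])
          return x_hat[len(x):]                             # forecasts beyond the sample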

  19. A New Multivariate Approach in Generating Ensemble Meteorological Forcings for Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2015-04-01

    Producing reliable and accurate hydrologic ensemble forecasts is subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reduce the total uncertainty in hydrological applications. Currently, numerical weather prediction (NWP) models are developing ensemble forecasts for various temporal ranges. It is proven that raw products from NWP models are biased in mean and spread. Given the above, there is a need for methods that are able to generate reliable ensemble forecasts for hydrological applications. One of the common techniques is to apply statistical procedures in order to generate an ensemble forecast from NWP-generated single-value forecasts. The procedure is based on the bivariate probability distribution between the observation and the single-value precipitation forecast. However, one of the assumptions of the current method is fitting a Gaussian distribution to the marginal distributions of the observed and modeled climate variable. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions are the multivariate joint distributions of univariate marginal distributions and are presented as an alternative procedure for capturing the uncertainties related to meteorological forcing. Copulas are capable of modeling the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin in the Columbia River Basin in the USA using monthly precipitation forecasts from the Climate Forecast System (CFS) with 0.5° x 0.5° spatial resolution to reproduce the observations. The verification is conducted on a different period, and the superiority of the procedure over the Ensemble Pre-Processor approach currently used by National Weather Service River Forecast Centers in the USA is demonstrated.

  20. Intraday LeBaron effects

    PubMed Central

    Bianco, Simone; Corsi, Fulvio; Renò, Roberto

    2009-01-01

    We study the relation at the intraday level between serial correlation and volatility of the Standard and Poor (S&P) 500 stock index futures returns. At daily and weekly levels, serial correlation and volatility forecasts have been found to be negatively correlated (LeBaron effect). After finding a significant attenuation of the original effect over time, we show that a similar but more pronounced effect holds when using intraday measures, such as realized volatility and the variance ratio. We also test the impact of unexpected volatility, defined as the part of volatility which cannot be forecasted, on the presence of intraday serial correlation in the time series by employing a model for realized volatility based on the heterogeneous market hypothesis. We find that intraday serial correlation is negatively correlated with volatility forecasts, whereas it is positively correlated with unexpected volatility.

  1. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    NASA Astrophysics Data System (ADS)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.

  2. Forecasting of the electrical actuators condition using stator’s current signals

    NASA Astrophysics Data System (ADS)

    Kruglova, T. N.; Yaroshenko, I. V.; Rabotalov, N. N.; Melnikov, M. A.

    2017-02-01

    This article describes a forecasting method for electrical actuators realized through the combination of Fourier transformation and neural network techniques. The method allows finding the values of the diagnostic functions over successive operating cycles and the number of operating cycles remaining before the BLDC actuator fails. To forecast the condition of the actuator, we propose a hierarchical structure of the neural network aiming to reduce the training time of the neural network and improve estimation accuracy.

  3. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches mainly focus on maintaining the inter-dependency between multiple geographically related areas. They are applied to cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.

  4. Business Planning in the Light of Neuro-fuzzy and Predictive Forecasting

    NASA Astrophysics Data System (ADS)

    Chakrabarti, Prasun; Basu, Jayanta Kumar; Kim, Tai-Hoon

    In this paper we point out gain sensing based on forecasting techniques. We present an idea of neural-network-based gain forecasting. The sequence of gain patterns is also verified using statistical analysis of fuzzy value assignment. The paper also suggests realization of a stable gain condition using K-means clustering from data mining. A new concept of 3D-based gain sensing is pointed out. The paper also reveals what type of trend analysis can be observed for probabilistic gain prediction.

  5. Modeling returns volatility: Realized GARCH incorporating realized risk measure

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Ruan, Qingsong; Li, Jianfeng; Li, Ye

    2018-06-01

    This study applies realized GARCH models by introducing several risk measures of intraday returns into the measurement equation, to model the daily volatility of E-mini S&P 500 index futures returns. Besides using the conventional realized measures, realized volatility and realized kernel, as our benchmarks, we also use generalized realized risk measures: realized absolute deviation and two realized tail risk measures, realized value-at-risk and realized expected shortfall. The empirical results show that realized GARCH models using the generalized realized risk measures provide better volatility estimation in-sample and substantial improvement in volatility forecasting out-of-sample. In particular, realized expected shortfall performs best among all of the alternative realized measures. Our empirical results reveal that future volatility may be more attributable to present losses (risk measures). The results are robust to different sample estimation windows.
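
    For reference, a log-linear realized GARCH(1,1) with a realized measure x_t in the measurement equation is commonly written as below; the study replaces x_t with the alternative realized risk measures.

      r_t = \sqrt{h_t}\, z_t, \qquad z_t \sim \mathrm{i.i.d.}(0,1)
      \log h_t = \omega + \beta \log h_{t-1} + \gamma \log x_{t-1}
      \log x_t = \xi + \varphi \log h_t + \tau(z_t) + u_t, \qquad \tau(z) = \tau_1 z + \tau_2 (z^2 - 1)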

  6. Some Advances in Downscaling Probabilistic Climate Forecasts for Agricultural Decision Support

    NASA Astrophysics Data System (ADS)

    Han, E.; Ines, A.

    2015-12-01

    Seasonal climate forecasts, commonly provided in tercile-probabilities format (below-, near- and above-normal), need to be translated into more meaningful information for decision support of practitioners in agriculture. In this paper, we will present two novel approaches to temporally downscale probabilistic seasonal climate forecasts: one non-parametric and one parametric method. First, the non-parametric downscaling approach called FResampler1 uses the concept of 'conditional block sampling' of weather data to create daily weather realizations of a tercile-based seasonal climate forecast. FResampler1 randomly draws time series of daily weather parameters (e.g., rainfall, maximum and minimum temperature and solar radiation) from historical records, for the season of interest, from years that belong to a certain rainfall tercile category (e.g., below-, near- or above-normal). In this way, FResampler1 preserves the covariance between rainfall and other weather parameters, as if conditionally sampling maximum and minimum temperature and solar radiation according to whether a day is wet or dry. The second approach, called predictWTD, is a parametric method based on a conditional stochastic weather generator. The tercile-based seasonal climate forecast is converted into a theoretical forecast cumulative probability curve. Then the deviate at each percentile is converted into rainfall amount, frequency or intensity to downscale the 'full' distribution of probabilistic seasonal climate forecasts. Those seasonal deviates are then disaggregated on a monthly basis and used to constrain the downscaling of forecast realizations at different percentile values of the theoretical forecast curve. In addition to the theoretical basis of the approaches, we will discuss their sensitivity to the length of data and sample size. Their potential applications for managing climate-related risks in agriculture will also be shown through a couple of case studies based on actual seasonal climate forecasts for rice cropping in the Philippines and maize cropping in India and Kenya.

  7. The NLS-Based Nonlinear Grey Multivariate Model for Forecasting Pollutant Emissions in China.

    PubMed

    Pei, Ling-Ling; Li, Qin; Wang, Zheng-Xin

    2018-03-08

    The relationship between pollutant discharge and economic growth has been a major research focus in environmental economics. To accurately estimate the nonlinear change law of China's pollutant discharge with economic growth, this study establishes a transformed nonlinear grey multivariable (TNGM (1, N )) model based on the nonlinear least square (NLS) method. The Gauss-Seidel iterative algorithm was used to solve the parameters of the TNGM (1, N ) model based on the NLS basic principle. This algorithm improves the precision of the model by continuous iteration and constantly approximating the optimal regression coefficient of the nonlinear model. In our empirical analysis, the traditional grey multivariate model GM (1, N ) and the NLS-based TNGM (1, N ) models were respectively adopted to forecast and analyze the relationship among wastewater discharge per capita (WDPC), and per capita emissions of SO₂ and dust, alongside GDP per capita in China during the period 1996-2015. Results indicated that the NLS algorithm is able to effectively help the grey multivariable model identify the nonlinear relationship between pollutant discharge and economic growth. The results show that the NLS-based TNGM (1, N ) model presents greater precision when forecasting WDPC, SO₂ emissions and dust emissions per capita, compared to the traditional GM (1, N ) model; WDPC indicates a growing tendency aligned with the growth of GDP, while the per capita emissions of SO₂ and dust reduce accordingly.

  8. A New Approach in Generating Meteorological Forecasts for Ensemble Streamflow Forecasting using Multivariate Functions

    NASA Astrophysics Data System (ADS)

    Khajehei, S.; Madadgar, S.; Moradkhani, H.

    2014-12-01

    The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters and model structure. To reduce the total uncertainty in hydrological applications, one approach is to reduce the uncertainty in meteorological forcing by using statistical methods based on conditional probability density functions (pdfs). However, one of the requirements of current methods is to assume a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on copula functions to develop the conditional distribution of precipitation forecasts needed to drive a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach for capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions of univariate marginal distributions, which are capable of modeling the joint behavior of variables with any level of correlation and dependency. The method is applied to the monthly CPC forecast with 0.25° x 0.25° resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing the observed climatology during a ten-year verification period (2000-2010).

  9. Realized niche shift during a global biological invasion

    PubMed Central

    Tingley, Reid; Vallinoto, Marcelo; Sequeira, Fernando; Kearney, Michael R.

    2014-01-01

    Accurate forecasts of biological invasions are crucial for managing invasion risk but are hampered by niche shifts resulting from evolved environmental tolerances (fundamental niche shifts) or the presence of novel biotic and abiotic conditions in the invaded range (realized niche shifts). Distinguishing between these kinds of niche shifts is impossible with traditional, correlative approaches to invasion forecasts, which exclusively consider the realized niche. Here we overcome this challenge by combining a physiologically mechanistic model of the fundamental niche with correlative models based on the realized niche to study the global invasion of the cane toad Rhinella marina. We find strong evidence that the success of R. marina in Australia reflects a shift in the species’ realized niche, as opposed to evolutionary shifts in range-limiting traits. Our results demonstrate that R. marina does not fill its fundamental niche in its native South American range and that areas of niche unfilling coincide with the presence of a closely related species with which R. marina hybridizes. Conversely, in Australia, where coevolved taxa are absent, R. marina largely fills its fundamental niche in areas behind the invasion front. The general approach taken here of contrasting fundamental and realized niche models provides key insights into the role of biotic interactions in shaping range limits and can inform effective management strategies not only for invasive species but also for assisted colonization under climate change. PMID:24982155

  10. Asymmetric affective forecasting errors and their correlation with subjective well-being

    PubMed Central

    2018-01-01

    Aims Social scientists have postulated that the discrepancy between achievements and expectations affects individuals' subjective well-being. Still, little has been done to qualify and quantify such a psychological effect. Our empirical analysis assesses the consequences of positive and negative affective forecasting errors—the difference between realized and expected subjective well-being—on the subsequent level of subjective well-being. Data We use longitudinal data on a representative sample of 13,431 individuals from the German Socio-Economic Panel. In our sample, 52% of individuals are females, average age is 43 years, average years of education is 11.4 and 27% of our sample lives in East Germany. Subjective well-being (measured by self-reported life satisfaction) is assessed on a 0–10 discrete scale and its sample average is equal to 6.75 points. Methods We develop a simple theoretical framework to assess the consequences of positive and negative affective forecasting errors—the difference between realized and expected subjective well-being—on the subsequent level of subjective well-being, properly accounting for the endogenous adjustment of expectations to positive and negative affective forecasting errors, and use it to derive testable predictions. Given the theoretical framework, we estimate two panel-data equations, the first depicting the association between positive and negative affective forecasting errors and the successive level of subjective well-being and the second describing the correlation between subjective well-being expectations for the future and hedonic failures and successes. Our models control for individual fixed effects and a large battery of time-varying demographic characteristics, health and socio-economic status. Results and conclusions While surpassing expectations is uncorrelated with subjective well-being, failing to match expectations is negatively associated with subsequent realizations of subjective well-being. Expectations are positively (negatively) correlated to positive (negative) forecasting errors. We speculate that in the first case the positive adjustment in expectations is strong enough to cancel out the potential positive effects on subjective well-being of beaten expectations, while in the second case it is not, and individuals persistently bear the negative emotional consequences of not achieving expectations. PMID:29513685

  11. The NLS-Based Nonlinear Grey Multivariate Model for Forecasting Pollutant Emissions in China

    PubMed Central

    Pei, Ling-Ling; Li, Qin

    2018-01-01

    The relationship between pollutant discharge and economic growth has been a major research focus in environmental economics. To accurately estimate the nonlinear change law of China’s pollutant discharge with economic growth, this study establishes a transformed nonlinear grey multivariable (TNGM (1, N)) model based on the nonlinear least square (NLS) method. The Gauss–Seidel iterative algorithm was used to solve the parameters of the TNGM (1, N) model based on the NLS basic principle. This algorithm improves the precision of the model by continuous iteration and constantly approximating the optimal regression coefficient of the nonlinear model. In our empirical analysis, the traditional grey multivariate model GM (1, N) and the NLS-based TNGM (1, N) models were respectively adopted to forecast and analyze the relationship among wastewater discharge per capita (WDPC), and per capita emissions of SO2 and dust, alongside GDP per capita in China during the period 1996–2015. Results indicated that the NLS algorithm is able to effectively help the grey multivariable model identify the nonlinear relationship between pollutant discharge and economic growth. The results show that the NLS-based TNGM (1, N) model presents greater precision when forecasting WDPC, SO2 emissions and dust emissions per capita, compared to the traditional GM (1, N) model; WDPC indicates a growing tendency aligned with the growth of GDP, while the per capita emissions of SO2 and dust reduce accordingly. PMID:29517985

  12. Enhanced seasonal forecast skill following stratospheric sudden warmings

    NASA Astrophysics Data System (ADS)

    Sigmond, M.; Scinocca, J. F.; Kharin, V. V.; Shepherd, T. G.

    2013-02-01

    Advances in seasonal forecasting have brought widespread socio-economic benefits. However, seasonal forecast skill in the extratropics is relatively modest, prompting the seasonal forecasting community to search for additional sources of predictability. For over a decade it has been suggested that knowledge of the state of the stratosphere can act as a source of enhanced seasonal predictability; long-lived circulation anomalies in the lower stratosphere that follow stratospheric sudden warmings are associated with circulation anomalies in the troposphere that can last up to two months. Here, we show by performing retrospective ensemble model forecasts that such enhanced predictability can be realized in a dynamical seasonal forecast system with a good representation of the stratosphere. When initialized at the onset date of stratospheric sudden warmings, the model forecasts faithfully reproduce the observed mean tropospheric conditions in the months following the stratospheric sudden warmings. Compared with an equivalent set of forecasts that are not initialized during stratospheric sudden warmings, we document enhanced forecast skill for atmospheric circulation patterns, surface temperatures over northern Russia and eastern Canada and North Atlantic precipitation. We suggest that seasonal forecast systems initialized during stratospheric sudden warmings are likely to yield significantly greater forecast skill in some regions.

  13. A comparison of multivariate and univariate time series approaches to modelling and forecasting emergency department demand in Western Australia.

    PubMed

    Aboagye-Sarfo, Patrick; Mai, Qun; Sanfilippo, Frank M; Preen, David B; Stewart, Louise M; Fatovich, Daniel M

    2015-10-01

    To develop multivariate vector-ARMA (VARMA) forecast models for predicting emergency department (ED) demand in Western Australia (WA) and compare them to the benchmark univariate autoregressive moving average (ARMA) and Winters' models. Seven-year monthly WA state-wide public hospital ED presentation data from 2006/07 to 2012/13 were modelled. Graphical and VARMA modelling methods were used for descriptive analysis and model fitting. The VARMA models were compared to the benchmark univariate ARMA and Winters' models to determine their accuracy to predict ED demand. The best models were evaluated by using error correction methods for accuracy. Descriptive analysis of all the dependent variables showed an increasing pattern of ED use with seasonal trends over time. The VARMA models provided a more precise and accurate forecast with smaller confidence intervals and better measures of accuracy in predicting ED demand in WA than the ARMA and Winters' method. VARMA models are a reliable forecasting method to predict ED demand for strategic planning and resource allocation. While the ARMA models are a closely competing alternative, they under-estimated future ED demand. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Drought forecasting in Luanhe River basin involving climatic indices

    NASA Astrophysics Data System (ADS)

    Ren, Weinan; Wang, Yixuan; Li, Jianzhu; Feng, Ping; Smith, Ronald J.

    2017-11-01

    Drought is regarded as one of the most severe natural disasters globally. This is especially the case in Tianjin City, Northern China, where drought can affect economic development and people's livelihoods. Drought forecasting, the basis of drought management, is an important mitigation strategy. In this paper, we develop a probabilistic forecasting model, which forecasts transition probabilities from a current Standardized Precipitation Index (SPI) value to a future SPI class based on the conditional distribution of a multivariate normal distribution, so as to incorporate two large-scale climatic indices at the same time, and apply the forecasting model to 26 rain gauges in the Luanhe River basin in North China. The establishment of the model and the derivation of the SPI rest on the hypothesis that aggregated monthly precipitation is normally distributed. Pearson correlation and Shapiro-Wilk normality tests are used to select an appropriate SPI time scale and large-scale climatic indices. Findings indicated that longer-term aggregated monthly precipitation was, in general, more likely to be normally distributed, and that forecasting models should be applied to each gauge individually rather than to the whole basin. Taking Liying Gauge as an example, we illustrate the impact of the SPI time scale and lead time on transition probabilities. Then, the controlling climatic indices of every gauge are selected by the Pearson correlation test, and the multivariate normality of the SPI and corresponding climatic indices for the current month together with the SPI 1, 2, and 3 months later is demonstrated using the Shapiro-Wilk normality test. Subsequently, we illustrate the impact of large-scale oceanic-atmospheric circulation patterns on transition probabilities. Finally, we use a score method to evaluate and compare the performance of the three forecasting models and compare them with two traditional models which forecast transition probabilities from a current to a future SPI class. The results show that the three proposed models outperform the two traditional models and that incorporating large-scale climatic indices can improve forecasting accuracy.
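
    The transition probabilities described here rest on the conditional distribution of a multivariate normal; a generic sketch (with hypothetical index choices and variable names) is given below, where the future SPI given the current SPI and climate indices is Gaussian with the usual conditional mean and covariance.

      import numpy as np
      from scipy import stats

      def conditional_mvn(mu, cov, idx_obs, x_obs):
          """Condition a multivariate normal (mean mu, covariance cov) on observing
          the components `idx_obs` at values `x_obs`; returns the conditional mean
          and covariance of the remaining components (e.g. future SPI given current
          SPI and climate indices)."""
          mu, cov, x_obs = np.asarray(mu, float), np.asarray(cov, float), np.asarray(x_obs, float)
          idx_new = np.setdiff1d(np.arange(len(mu)), idx_obs)
          S11 = cov[np.ix_(idx_new, idx_new)]
          S12 = cov[np.ix_(idx_new, idx_obs)]
          S22 = cov[np.ix_(idx_obs, idx_obs)]
          gain = S12 @ np.linalg.inv(S22)
          mu_c = mu[idx_new] + gain @ (x_obs - mu[idx_obs])
          cov_c = S11 - gain @ S12.T
          return mu_c, cov_c

      # hypothetical usage, transition probability into the SPI class (-1, 0]:
      # mu_c, cov_c = conditional_mvn(mu, cov, idx_obs=[0, 1, 2], x_obs=current_values)
      # p = (stats.norm.cdf(0, mu_c[0], np.sqrt(cov_c[0, 0]))
      #      - stats.norm.cdf(-1, mu_c[0], np.sqrt(cov_c[0, 0])))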

  15. Forecasting the short-term passenger flow on high-speed railway with neural networks.

    PubMed

    Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing

    2014-01-01

    Short-term passenger flow forecasting is an important component of transportation systems. The forecasting result can be applied to support transportation system operation and management such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural network and origin-destination (OD) matrix estimation is developed to forecast the short-term passenger flow in high-speed railway system. There are three steps in the forecasting method. Firstly, the numbers of passengers who arrive at each station or depart from each station are obtained from historical passenger flow data, which are OD matrices in this paper. Secondly, short-term passenger flow forecasting of the numbers of passengers who arrive at each station or depart from each station based on neural network is realized. At last, the OD matrices in short-term time are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting the short-term passenger flow on high-speed railway.

  16. Multivariate Statistical Postprocessing of Ensemble Forecasts of Precipitation and Temperature over Four River Basins in California

    NASA Astrophysics Data System (ADS)

    Scheuerer, Michael; Hamill, Thomas M.; Whitin, Brett; He, Minxue; Henkel, Arthur

    2017-04-01

    Hydrological forecasts strongly rely on predictions of precipitation amounts and temperature as meteorological inputs to hydrological models. Ensemble weather predictions provide a number of different scenarios that reflect the uncertainty about these meteorological inputs, but are often biased and underdispersive, and therefore require statistical postprocessing. In hydrological applications it is crucial that spatial and temporal (i.e. between different forecast lead times) dependencies as well as dependence between the two weather variables are adequately represented by the recalibrated forecasts. We present a study with temperature and precipitation forecasts over four river basins in California that are postprocessed with a variant of the nonhomogeneous Gaussian regression method (Gneiting et al., 2005) and the censored, shifted gamma distribution approach (Scheuerer and Hamill, 2015), respectively. For modelling spatial, temporal and inter-variable dependence we propose a variant of the Schaake Shuffle (Clark et al., 2004) that uses spatio-temporal trajectories of observed temperature and precipitation as a dependence template, and chooses the historic dates in such a way that the divergence between the marginal distributions of these trajectories and the univariate forecast distributions is minimized. For the four river basins considered in our study, this new multivariate modelling technique consistently improves upon the Schaake Shuffle and yields reliable spatio-temporal forecast trajectories of temperature and precipitation that can be used to force hydrological forecast systems. References: Clark, M., Gangopadhyay, S., Hay, L., Rajagopalan, B., Wilby, R., 2004. The Schaake Shuffle: A method for reconstructing space-time variability in forecasted precipitation and temperature fields. Journal of Hydrometeorology, 5, pp.243-262. Gneiting, T., Raftery, A.E., Westveld, A.H., Goldman, T., 2005. Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Monthly Weather Review, 133, pp.1098-1118. Scheuerer, M., Hamill, T.M., 2015. Statistical postprocessing of ensemble precipitation forecasts by fitting censored, shifted gamma distributions. Monthly Weather Review, 143, pp.4578-4596. Scheuerer, M., Hamill, T.M., Whitin, B., He, M., and Henkel, A., 2016: A method for preferential selection of dates in the Schaake shuffle approach to constructing spatio-temporal forecast fields of temperature and precipitation. Water Resources Research, submitted.

  17. Forecasting of particulate matter time series using wavelet analysis and wavelet-ARMA/ARIMA model in Taiyuan, China.

    PubMed

    Zhang, Hong; Zhang, Sheng; Wang, Ping; Qin, Yuzhe; Wang, Huifeng

    2017-07-01

    Particulate matter with aerodynamic diameter below 10 μm (PM10) forecasting is difficult because of the uncertainties in describing the emission and meteorological fields. This paper proposed a wavelet-ARMA/ARIMA model to forecast the short-term series of the PM10 concentrations. It was evaluated by experiments using a 10-year data set of daily PM10 concentrations from 4 stations located in Taiyuan, China. The results indicated the following: (1) PM10 concentrations of Taiyuan had a decreasing trend during 2005 to 2012 but increased in 2013. PM10 concentrations had an obvious seasonal fluctuation related to coal-fired heating in winter and early spring. (2) Spatial differences among the four stations showed that the PM10 concentrations in industrial and heavily trafficked areas were higher than those in residential and suburb areas. (3) Wavelet analysis revealed that the trend variation and the changes of the PM10 concentration of Taiyuan were complicated. (4) The proposed wavelet-ARIMA model could be efficiently and successfully applied to the PM10 forecasting field. Compared with the traditional ARMA/ARIMA methods, this wavelet-ARMA/ARIMA method could effectively reduce the forecasting error, improve the prediction accuracy, and realize multiple-time-scale prediction. Wavelet analysis can filter noisy signals and identify the variation trend and the fluctuation of the PM10 time-series data. Wavelet decomposition and reconstruction reduce the nonstationarity of the PM10 time-series data, and thus improve the accuracy of the prediction.

  18. Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Mohamed Ismael, Hawa; Vandyck, George Kobina

    The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe puts it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput at the Doraleh Container Port in Djibouti by time series analysis. A selection of univariate forecasting models has been used, namely the Triple Exponential Smoothing Model, the Grey Model and the Linear Regression Model. By utilizing these three models and their combination, the forecast of container throughput through the Doraleh port was produced. The forecasting results of the three models and of the combination forecast were then compared, based on the commonly used evaluation criteria Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression Model was the best prediction method for forecasting container throughput, since its forecast error was the smallest. Based on the regression model, a ten (10) year forecast of container throughput at DCT has been made.
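
    The comparison logic described above can be sketched as follows: fit two candidate models, form an equal-weight combination forecast, and rank them with MAD and MAPE. The throughput numbers are synthetic, the grey model is omitted, and Holt's additive-trend exponential smoothing stands in for the triple (seasonal) variant because the synthetic annual data carry no seasonal component.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      def mad(actual, pred):
          return np.mean(np.abs(actual - pred))

      def mape(actual, pred):
          return 100 * np.mean(np.abs((actual - pred) / actual))

      # synthetic annual container throughput (TEU), growing with noise
      rng = np.random.default_rng(2)
      years = np.arange(2009, 2019)
      teu = 600_000 + 45_000 * (years - 2009) + rng.normal(0, 15_000, years.size)
      train_y, test_y = teu[:-3], teu[-3:]
      train_x, test_x = years[:-3], years[-3:]

      # model 1: linear regression on the year
      lr = LinearRegression().fit(train_x.reshape(-1, 1), train_y)
      f_lr = lr.predict(test_x.reshape(-1, 1))

      # model 2: exponential smoothing with an additive trend (no seasonality)
      es = ExponentialSmoothing(train_y, trend="add").fit()
      f_es = es.forecast(3)

      # equal-weight combination of the two forecasts
      f_comb = (f_lr + f_es) / 2

      for name, f in [("linear regression", f_lr), ("exp. smoothing", f_es), ("combination", f_comb)]:
          print(f"{name:18s} MAD={mad(test_y, f):10.0f}  MAPE={mape(test_y, f):5.2f}%")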

  19. Medium-term electric power demand forecasting based on economic-electricity transmission model

    NASA Astrophysics Data System (ADS)

    Li, Wenfeng; Bao, Fangmin; Bai, Hongkun; Liu, Wei; Liu, Yongmin; Mao, Yubin; Wang, Jiangbo; Liu, Junhui

    2018-06-01

    Electric demand forecasting is basic work for ensuring the safe operation of a power system. Based on the theories of experimental economics and econometrics, this paper introduces the Prognoz Platform 7.2 intelligent adaptive modelling platform and constructs an economic-electricity transmission model (EETM) that considers economic development scenarios and the dynamic adjustment of industrial structure to predict a region's annual electricity demand, thereby realizing accurate prediction of the whole society's electricity consumption. Firstly, based on the theories of experimental economics and econometrics, the paper identifies the economic indicator variables that most strongly drive the growth of electricity consumption and builds an annual regional macroeconomic forecast model that takes into account the dynamic adjustment of industrial structure. Secondly, it puts forward an economic-electricity directed conduction theory and constructs an economic-power transfer function to realize group forecasts of electricity consumption for the primary industry plus rural residential use, urban residential use, the secondary industry and the tertiary industry. By comparison with the actual economic and electricity values for Henan Province in 2016, the validity of the EETM model is demonstrated, and the electricity consumption of the whole province for 2017 to 2018 is predicted.

  20. Forecasting the realized volatility of the Chinese stock market: Do the G7 stock markets help?

    NASA Astrophysics Data System (ADS)

    Peng, Huan; Chen, Ruoxun; Mei, Dexiang; Diao, Xiaohua

    2018-07-01

    In this paper, we comprehensively investigate whether the G7 stock markets contain predictive information that can help in forecasting Chinese stock market volatility. Our out-of-sample empirical results indicate that the kitchen sink (HAR-RV-SK) model attains better performance than the benchmark model (HAR-RV) and other models, implying that the G7 stock markets can help in predicting the one-day volatility of the Chinese stock market. Moreover, the kitchen sink strategy can beat the strategy of simple combination forecasts. Finally, the G7 stock markets do contain useful information, which can increase the accuracy of volatility forecasts for the Chinese stock market.
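
    The benchmark HAR-RV regression mentioned above can be written down in a few lines: next-day realized volatility is regressed on its daily, weekly (5-day) and monthly (22-day) averages. The sketch below uses a synthetic RV series; extending the design matrix with lagged G7 realized volatilities would give the kitchen-sink variant discussed in the abstract.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      # synthetic, persistent, positive daily realized volatility series
      n = 1500
      rv = np.empty(n)
      rv[0] = 1.0
      for t in range(1, n):
          rv[t] = 0.1 + 0.9 * rv[t - 1] + 0.1 * rng.standard_normal()
      rv = np.abs(rv)

      def har_design(rv):
          """Daily, weekly (5-day) and monthly (22-day) RV averages as predictors
          of next-day RV."""
          y, xd, xw, xm = [], [], [], []
          for t in range(22, len(rv) - 1):
              y.append(rv[t + 1])
              xd.append(rv[t])
              xw.append(rv[t - 4:t + 1].mean())
              xm.append(rv[t - 21:t + 1].mean())
          X = sm.add_constant(np.column_stack([xd, xw, xm]))
          return np.array(y), X

      y, X = har_design(rv)
      fit = sm.OLS(y, X).fit()
      print(fit.params)          # const, beta_daily, beta_weekly, beta_monthly
      print("one-step-ahead forecast:", fit.predict(X[-1:]))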

  1. Post-processing of multi-model ensemble river discharge forecasts using censored EMOS

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2014-05-01

    When forecasting water levels and river discharge, ensemble weather forecasts are used as meteorological input to hydrologic process models. As hydrologic models are imperfect and the input ensembles tend to be biased and underdispersed, the output ensemble forecasts for river runoff typically are biased and underdispersed, too. Thus, statistical post-processing is required in order to achieve calibrated and sharp predictions. Standard post-processing methods such as Ensemble Model Output Statistics (EMOS) that have their origins in meteorological forecasting are now increasingly being used in hydrologic applications. Here we consider two sub-catchments of River Rhine, for which the forecasting system of the Federal Institute of Hydrology (BfG) uses runoff data that are censored below predefined thresholds. To address this methodological challenge, we develop a censored EMOS method that is tailored to such data. The censored EMOS forecast distribution can be understood as a mixture of a point mass at the censoring threshold and a continuous part based on a truncated normal distribution. Parameter estimates of the censored EMOS model are obtained by minimizing the Continuous Ranked Probability Score (CRPS) over the training dataset. Model fitting on Box-Cox transformed data allows us to take account of the positive skewness of river discharge distributions. In order to achieve realistic forecast scenarios over an entire range of lead-times, there is a need for multivariate extensions. To this end, we smooth the marginal parameter estimates over lead-times. In order to obtain realistic scenarios of discharge evolution over time, the marginal distributions have to be linked with each other. To this end, the multivariate dependence structure can either be adopted from the raw ensemble like in Ensemble Copula Coupling (ECC), or be estimated from observations in a training period. The censored EMOS model has been applied to multi-model ensemble forecasts issued on a daily basis over a period of three years. For the two catchments considered, this resulted in well calibrated and sharp forecast distributions over all lead-times from 1 to 114 h. Training observations tended to be better indicators for the dependence structure than the raw ensemble.
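
    As an illustration of the parameter-estimation step described above, the sketch below fits a plain (non-censored) Gaussian EMOS model by minimizing the mean CRPS over a synthetic training set; the censoring at a threshold, the Box-Cox transform and the multivariate coupling discussed in the abstract are all omitted, and the data are artificial.

      import numpy as np
      from scipy.stats import norm
      from scipy.optimize import minimize

      def crps_normal(mu, sigma, y):
          """Closed-form CRPS of a normal predictive distribution N(mu, sigma^2)."""
          z = (y - mu) / sigma
          return sigma * (z * (2 * norm.cdf(z) - 1) + 2 * norm.pdf(z) - 1 / np.sqrt(np.pi))

      rng = np.random.default_rng(4)
      n, m = 400, 11                      # training days, ensemble members
      truth = 50 + 10 * rng.standard_normal(n)                     # "observed" runoff
      ens = truth[:, None] + 5 + 4 * rng.standard_normal((n, m))   # biased, noisy ensemble
      ens_mean, ens_var = ens.mean(axis=1), ens.var(axis=1)

      def mean_crps(params):
          a, b, c, d = params
          mu = a + b * ens_mean                      # bias-corrected predictive mean
          sigma = np.sqrt(np.abs(c + d * ens_var))   # spread linked to ensemble variance
          return crps_normal(mu, sigma, truth).mean()

      res = minimize(mean_crps, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead")
      print("EMOS coefficients:", np.round(res.x, 3))
      print("training mean CRPS:", round(res.fun, 3))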

  2. Gridded Calibration of Ensemble Wind Vector Forecasts Using Ensemble Model Output Statistics

    NASA Astrophysics Data System (ADS)

    Lazarus, S. M.; Holman, B. P.; Splitt, M. E.

    2017-12-01

    A computationally efficient method is developed that performs gridded post-processing of ensemble wind vector forecasts. An expansive set of idealized WRF model simulations is generated to provide physically consistent high-resolution winds over a coastal domain characterized by an intricate land/water mask. Ensemble model output statistics (EMOS) is used to calibrate the ensemble wind vector forecasts at observation locations. The local EMOS predictive parameters (mean and variance) are then spread throughout the grid utilizing flow-dependent statistical relationships extracted from the downscaled WRF winds. Using data withdrawal and 28 east central Florida stations, the method is applied to one year of 24 h wind forecasts from the Global Ensemble Forecast System (GEFS). Compared to the raw GEFS, the approach improves both the deterministic and probabilistic forecast skill. Analysis of multivariate rank histograms indicates the post-processed forecasts are calibrated. Two downscaling case studies are presented, a quiescent easterly flow event and a frontal passage. Strengths and weaknesses of the approach are presented and discussed.

  3. Does the OVX matter for volatility forecasting? Evidence from the crude oil market

    NASA Astrophysics Data System (ADS)

    Lv, Wendai

    2018-02-01

    In this paper, I investigate whether the OVX and its truncated parts with a certain threshold can significantly help in forecasting the oil futures price volatility, based on the Heterogeneous Autoregressive model of Realized Volatility (HAR-RV). In-sample estimation results show that the OVX has a significantly positive impact on futures volatility. The impact of large OVX values on future volatility is slightly stronger than that of small ones. Moreover, the HARQ-RV model outperforms the HAR-RV in predicting the oil futures volatility. More importantly, the decomposed OVX is more powerful in forecasting the oil futures price volatility than the OVX itself.

  4. Forecasting the Short-Term Passenger Flow on High-Speed Railway with Neural Networks

    PubMed Central

    Xie, Mei-Quan; Li, Xia-Miao; Zhou, Wen-Liang; Fu, Yan-Bing

    2014-01-01

    Short-term passenger flow forecasting is an important component of transportation systems. The forecasting results can be applied to support transportation system operation and management, such as operation planning and revenue management. In this paper, a divide-and-conquer method based on neural networks and origin-destination (OD) matrix estimation is developed to forecast short-term passenger flow in a high-speed railway system. There are three steps in the forecasting method. Firstly, the numbers of passengers who arrive at or depart from each station are obtained from historical passenger flow data, which take the form of OD matrices in this paper. Secondly, the short-term numbers of passengers arriving at or departing from each station are forecast with a neural network. Finally, the short-term OD matrices are obtained with an OD matrix estimation method. The experimental results indicate that the proposed divide-and-conquer method performs well in forecasting short-term passenger flow on a high-speed railway. PMID:25544838

  5. Forecasting of municipal solid waste quantity in a developing country using multivariate grey models.

    PubMed

    Intharathirat, Rotchana; Abdul Salam, P; Kumar, S; Untong, Akarapong

    2015-05-01

    In order to plan, manage and use municipal solid waste (MSW) in a sustainable way, accurate forecasting of MSW generation and composition plays a key role. It is difficult to obtain reliable estimates using existing models because of the limited data available in developing countries. This study aims to forecast the MSW collected in Thailand, with prediction intervals, over a long-term period by using optimized multivariate grey models, a mathematical approach. For the multivariate models, the representative factors of the residential and commercial sectors affecting the waste collected are identified, classified and quantified based on the statistics and mathematics of grey system theory. Results show that GMC (1, 5), the grey model with convolution integral, is the most accurate, with the least error of 1.16% MAPE. The MSW collected would increase by 1.40% per year, from 43,435-44,994 tonnes per day in 2013 to 55,177-56,735 tonnes per day in 2030. This model also illustrates that population density is the most important factor affecting the MSW collected, followed by urbanization, proportion of employment and household size, respectively. This suggests that the representative factors of the commercial sector may affect the MSW collected more than those of the residential sector. The results can help decision makers to develop long-term waste management measures and policies. Copyright © 2015 Elsevier Ltd. All rights reserved.
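
    For readers unfamiliar with grey models, the sketch below implements GM(1,1), the simplest member of the family, on invented waste figures; the study's multivariate GMC(1,5) with convolution integral is considerably more involved and is not reproduced here.

      import numpy as np

      def gm11_forecast(x0, steps):
          """Fit the GM(1,1) grey model to a short positive series x0 and
          forecast `steps` values ahead."""
          x0 = np.asarray(x0, dtype=float)
          x1 = np.cumsum(x0)                           # accumulated generating operation
          z1 = 0.5 * (x1[1:] + x1[:-1])                # background values
          B = np.column_stack([-z1, np.ones_like(z1)])
          Y = x0[1:]
          a, b = np.linalg.lstsq(B, Y, rcond=None)[0]  # developing coefficient, grey input
          n = len(x0)
          k = np.arange(n + steps)
          x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
          x0_hat = np.diff(x1_hat, prepend=x1_hat[0])
          x0_hat[0] = x0[0]
          return x0_hat[n:]

      # hypothetical MSW collected (tonnes per day), one value per year
      msw = [39650, 40320, 41050, 41900, 42800, 43435]
      print(np.round(gm11_forecast(msw, steps=5)))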

  6. Development of an Operation Control System for Photovoltaics and Electric Storage Heaters for Houses Based on Information in Weather Forecasts

    NASA Astrophysics Data System (ADS)

    Obara, Shin'ya

    All-electric homes using electric storage heaters, which are safe and clean, are becoming widespread. However, a typical electric storage heater leads to unpleasant room temperatures and energy losses caused by excesses and shortfalls in heat radiation when climate conditions change greatly. Consequently, the operation of an electric storage heater introduced into an all-electric home, together with a storage-type electric water heater and photovoltaics, was planned using weather forecast information distributed over a communication line. The comfort evaluation (the difference between the room-temperature target and the room-temperature result), the purchased electric energy and the capacity of the photovoltaics were investigated when the proposed system was employed with this operation planning. As a result, comfortable heating operation was realized by using weather forecast data; furthermore, it is expected that the purchase cost of commercial power in the daytime can be reduced by introducing photovoltaics. Moreover, when the capacity of the photovoltaics was increased, the surplus power was stored in the electric storage heater, but no extremely unpleasant room temperatures were observed within the ranges investigated in this paper. By obtaining same-day weather information from an external forecast service over a communication line, an all-electric home heating system with low energy loss and comfortable temperatures is realizable.

  7. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China.

    PubMed

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and a vector autoregressive (VAR) model as a multivariate model. Empirical results suggest that SSA satisfactorily captures the evolving trend and provides reasonable results. The VAR model, which incorporates exogenous market-related variables on a monthly basis, can significantly improve the prediction accuracy. EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry.
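
    A minimal VAR sketch with statsmodels is shown below, fitted to synthetic monthly EV sales and a related market series; the SSA step is not reproduced, and the series names and lag order are illustrative assumptions.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(5)
      n = 72                                                  # six years of monthly data
      trend = np.linspace(1.0, 4.0, n)
      ev_sales = 10_000 * trend + rng.normal(0, 1_500, n)     # monthly EV sales
      charging = 500 * trend + rng.normal(0, 80, n)           # a related market series

      df = pd.DataFrame({"ev_sales": ev_sales, "charging_points": charging},
                        index=pd.date_range("2012-01-01", periods=n, freq="MS"))

      fit = VAR(df).fit(maxlags=6, ic="aic")                  # lag order chosen by AIC
      print(fit.summary())

      steps = 12
      fc = fit.forecast(df.values[-fit.k_ar:], steps=steps)
      fc_index = pd.date_range(df.index[-1] + pd.offsets.MonthBegin(), periods=steps, freq="MS")
      print(pd.DataFrame(fc, index=fc_index, columns=df.columns).round(0))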

  8. Forecasting electric vehicles sales with univariate and multivariate time series models: The case of China

    PubMed Central

    Zhang, Yong; Zhong, Miner; Geng, Nana; Jiang, Yunjian

    2017-01-01

    The market demand for electric vehicles (EVs) has increased in recent years. Suitable models are necessary to understand and forecast EV sales. This study presents singular spectrum analysis (SSA) as a univariate time-series model and a vector autoregressive (VAR) model as a multivariate model. Empirical results suggest that SSA satisfactorily captures the evolving trend and provides reasonable results. The VAR model, which incorporates exogenous market-related variables on a monthly basis, can significantly improve the prediction accuracy. EV sales in China, which are categorized into battery and plug-in EVs, are predicted in both the short term (up to December 2017) and the long term (up to 2020), providing statistical evidence of the growth of the Chinese EV industry. PMID:28459872

  9. Sign realized jump risk and the cross-section of stock returns: Evidence from China's stock market.

    PubMed

    Chao, Youcong; Liu, Xiaoqun; Guo, Shijun

    2017-01-01

    Using 5-minute high-frequency data from the Chinese stock market, we employ a non-parametric method to estimate Fama-French portfolio realized jumps and investigate whether the estimated positive, negative and sign realized jumps can forecast or explain the cross-sectional stock returns. The Fama-MacBeth regression results show not only that the realized jump components and the continuous volatility are compensated with a risk premium, but also that the negative jump risk, the positive jump risk and the sign jump risk can, to some extent, explain the returns of the stock portfolios. Therefore, close attention should be paid to both the downside tail risk and the upside tail risk.

  10. Leverage effect, economic policy uncertainty and realized volatility with regime switching

    NASA Astrophysics Data System (ADS)

    Duan, Yinying; Chen, Wang; Zeng, Qing; Liu, Zhicao

    2018-03-01

    In this study, we first investigate the impacts of the leverage effect and economic policy uncertainty (EPU) on future volatility in a regime-switching framework. Out-of-sample results show that the HAR-RV model including the leverage effect and economic policy uncertainty with regimes can achieve higher forecast accuracy than RV-type and GARCH-class models. Our robustness results further imply that these factors, in the framework of regime switching, can substantially improve the HAR-RV model's forecast performance.

  11. Forecasting volatility of SSEC in Chinese stock market using multifractal analysis

    NASA Astrophysics Data System (ADS)

    Wei, Yu; Wang, Peng

    2008-03-01

    In this paper, taking about 7 years of high-frequency data of the Shanghai Stock Exchange Composite Index (SSEC) as an example, we propose a daily volatility measure based on the multifractal spectrum of the high-frequency price variability within a trading day. An ARFIMA model is used to depict the dynamics of this multifractal volatility (MFV) measure. The one-day-ahead volatility forecasting performances of the MFV model and some other existing volatility models, such as the realized volatility model, the stochastic volatility model and GARCH, are evaluated by the superior predictive ability (SPA) test. The empirical results show that under several loss functions, the MFV model obtains the best forecasting accuracy.

  12. Error discrimination of an operational hydrological forecasting system at a national scale

    NASA Astrophysics Data System (ADS)

    Jordan, F.; Brauchli, T.

    2010-09-01

    The use of operational hydrological forecasting systems is recommended for hydropower production as well as flood management. However, the forecast uncertainties can be large and lead to bad decisions such as false alarms and inappropriate reservoir management of hydropower plants. In order to improve forecasting systems, it is important to discriminate between the different sources of uncertainty. To achieve this task, reanalyses of past predictions can be carried out to provide information about the structure of the overall uncertainty. In order to discriminate between uncertainty due to the numerical weather model and uncertainty due to the rainfall-runoff model, simulations assuming a perfect weather forecast must be run. This contribution presents a spatial analysis of the weather uncertainties and their influence on the river discharge predictions of several river basins where an operational forecasting system exists. The forecast is based on the RS 3.0 system [1], [2], which also runs the open Internet platform www.swissrivers.ch [3]. The uncertainty related to the hydrological model is compared to the uncertainty related to the weather prediction. A comparison between numerous weather prediction models [4] at different lead times is also presented. The results highlight significant potential for improvement in both forecasting components: the hydrological rainfall-runoff model and the numerical weather prediction models. The hydrological processes must be accurately represented during the model calibration procedure, while the weather prediction models suffer from a systematic spatial bias. REFERENCES [1] Garcia, J., Jordan, F., Dubois, J. & Boillat, J.-L. 2007. "Routing System II, Modélisation d'écoulements dans des systèmes hydrauliques", Communication LCH n° 32, Ed. Prof. A. Schleiss, Lausanne [2] Jordan, F. 2007. Modèle de prévision et de gestion des crues - optimisation des opérations des aménagements hydroélectriques à accumulation pour la réduction des débits de crue, thèse de doctorat n° 3711, Ecole Polytechnique Fédérale, Lausanne [3] Keller, R. 2009. "Le débit des rivières au peigne fin", Revue Technique Suisse, N°7/8 2009, Swiss engineering RTS, UTS SA, Lausanne, p. 11 [4] Kaufmann, P., Schubiger, F. & Binder, P. 2003. Precipitation forecasting by a mesoscale numerical weather prediction (NWP) model: eight years of experience, Hydrology and Earth System

  13. Forecasting malaria in a highly endemic country using environmental and clinical predictors.

    PubMed

    Zinszer, Kate; Kigozi, Ruth; Charland, Katia; Dorsey, Grant; Brewer, Timothy F; Brownstein, John S; Kamya, Moses R; Buckeridge, David L

    2015-06-18

    Malaria thrives in poor tropical and subtropical countries where local resources are limited. Accurate disease forecasts can provide public and clinical health services with the information needed to implement targeted approaches for malaria control that make effective use of limited resources. The objective of this study was to determine the relevance of environmental and clinical predictors of malaria across different settings in Uganda. Forecasting models were based on health facility data collected by the Uganda Malaria Surveillance Project and satellite-derived rainfall, temperature, and vegetation estimates from 2006 to 2013. Facility-specific forecasting models of confirmed malaria were developed using multivariate autoregressive integrated moving average models and produced weekly forecast horizons over a 52-week forecasting period. The model with the most accurate forecasts varied by site and by forecast horizon. Clinical predictors were retained in the models with the highest predictive power for all facility sites. The average error over the 52 forecasting horizons ranged from 26 to 128% whereas the cumulative burden forecast error ranged from 2 to 22%. Clinical data, such as drug treatment, could be used to improve the accuracy of malaria predictions in endemic settings when coupled with environmental predictors. Further exploration of malaria forecasting is necessary to improve its accuracy and value in practice, including examining other environmental and intervention predictors, including insecticide-treated nets.

  14. Forecasting the value-at-risk of Chinese stock market using the HARQ model and extreme value theory

    NASA Astrophysics Data System (ADS)

    Liu, Guangqiang; Wei, Yu; Chen, Yongfei; Yu, Jiang; Hu, Yang

    2018-06-01

    Using intraday data for the CSI300 index, this paper discusses value-at-risk (VaR) forecasting for the Chinese stock market from the perspective of high-frequency volatility models. First, we measure the realized volatility (RV) with 5-minute high-frequency returns of the CSI300 index and then model it with the newly introduced heterogeneous autoregressive quarticity (HARQ) model, which can handle the time-varying coefficients of the HAR model. Second, we forecast the out-of-sample VaR of the CSI300 index by combining the HARQ model and extreme value theory (EVT). Finally, using several popular backtesting methods, we compare the VaR forecasting accuracy of the HARQ model with that of other traditional HAR-type models, such as HAR, HAR-J, CHAR, and SHAR. The empirical results show that the novel HARQ model can beat the other HAR-type models in forecasting the VaR of the Chinese stock market at various risk levels.

  15. Sign realized jump risk and the cross-section of stock returns: Evidence from China's stock market

    PubMed Central

    Chao, Youcong; Liu, Xiaoqun; Guo, Shijun

    2017-01-01

    Using 5-minute high-frequency data from the Chinese stock market, we employ a non-parametric method to estimate Fama-French portfolio realized jumps and investigate whether the estimated positive, negative and sign realized jumps can forecast or explain the cross-sectional stock returns. The Fama-MacBeth regression results show not only that the realized jump components and the continuous volatility are compensated with a risk premium, but also that the negative jump risk, the positive jump risk and the sign jump risk can, to some extent, explain the returns of the stock portfolios. Therefore, close attention should be paid to both the downside tail risk and the upside tail risk. PMID:28771514

  16. The economic impact of longer range weather information on the production of peas in Wisconsin

    NASA Technical Reports Server (NTRS)

    Smith, K. R.; Torkelson, A. W.

    1972-01-01

    The extent of the benefits which will be realized in the pea industry as a result of improved long-range weather forecasts is outlined. Particular attention was given to planting and harvesting operations.

  17. Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Prive, N. C.; Errico, Ronald M.

    2015-01-01

    The spectra of analysis and forecast errors are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA GMAO). A global numerical weather prediction model, the Goddard Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low-wavenumber errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as in the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.

  18. Design of online monitoring and forecasting system for electrical equipment temperature of prefabricated substation based on WSN

    NASA Astrophysics Data System (ADS)

    Qi, Weiran; Miao, Hongxia; Miao, Xuejiao; Xiao, Xuanxuan; Yan, Kuo

    2016-10-01

    In order to ensure the safe and stable operation of prefabricated substations, a temperature sensing subsystem, a remote temperature monitoring and management subsystem, and a forecasting subsystem are designed in this paper. The wireless temperature sensing subsystem, which consists of temperature sensors and an MCU, sends the electrical equipment temperature to the remote monitoring center over a wireless sensor network (WSN). The remote monitoring center realizes remote monitoring and prediction through the monitoring and management subsystem and the forecasting subsystem. Real-time monitoring of power equipment temperature, a historical query database, user management, password settings, etc., are provided by the monitoring and management subsystem. In the temperature forecasting subsystem, the chaotic nature of the temperature data is first verified and the phase space is reconstructed. Then Support Vector Machine - Particle Swarm Optimization (SVM-PSO) is used to predict the temperature of the power equipment in prefabricated substations. The simulation results show that, compared with traditional methods, SVM-PSO achieves higher prediction accuracy.

  19. Improving Flood Forecasting in International River Basins

    NASA Astrophysics Data System (ADS)

    Hossain, Faisal; Katiyar, Nitin

    2006-01-01

    In flood-prone international river basins (IRBs), many riparian nations that are located close to a basin's outlet face a major problem in effectively forecasting flooding because they are unable to assimilate in situ rainfall data in real time across geopolitical boundaries. NASA's proposed Global Precipitation Measurement (GPM) mission, which is expected to begin in 2010, will comprise high-resolution passive microwave (PM) sensors (at resolution ~3-6 hours, 10 × 10 square kilometers) that may provide new opportunities to improve flood forecasting in these river basins. Research is now needed to realize the potential of GPM. With adequate research in the coming years, it may be possible to identify the specific IRBs that would benefit cost-effectively from a preprogrammed satellite-based forecasting system in anticipation of GPM. Acceleration of such a research initiative is worthwhile because it could reduce the risk of the cancellation of GPM [see Zielinski, 2005].

  20. Stock price forecasting for companies listed on Tehran stock exchange using multivariate adaptive regression splines model and semi-parametric splines technique

    NASA Astrophysics Data System (ADS)

    Rounaghi, Mohammad Mahdi; Abbaszadeh, Mohammad Reza; Arashi, Mohammad

    2015-11-01

    One of the most important topics of interest to investors is stock price changes. Investors whose goals are long term are sensitive to stock prices and their changes and react to them. In this regard, we used the multivariate adaptive regression splines (MARS) model and a semi-parametric splines technique for predicting stock prices in this study. The MARS model is an adaptive nonparametric regression method that is well suited to problems with high dimensions and several variables; the semi-parametric technique used here, smoothing splines, is likewise a nonparametric regression method. In this study, we used 40 variables (30 accounting variables and 10 economic variables) for predicting stock prices with the MARS model and with the semi-parametric splines technique. After investigating the models, we selected 4 accounting variables (book value per share, predicted earnings per share, P/E ratio and risk) as influential variables for predicting stock prices with the MARS model. After fitting the semi-parametric splines technique, only 4 accounting variables (dividends, net EPS, EPS forecast and P/E ratio) were selected as variables effective in forecasting stock prices.

  1. Short-term ensemble streamflow forecasting using operationally-produced single-valued streamflow forecasts - A Hydrologic Model Output Statistics (HMOS) approach

    NASA Astrophysics Data System (ADS)

    Regonda, Satish Kumar; Seo, Dong-Jun; Lawrence, Bill; Brown, James D.; Demargne, Julie

    2013-08-01

    We present a statistical procedure for generating short-term ensemble streamflow forecasts from single-valued, or deterministic, streamflow forecasts produced operationally by the U.S. National Weather Service (NWS) River Forecast Centers (RFCs). The resulting ensemble streamflow forecast provides an estimate of the predictive uncertainty associated with the single-valued forecast to support risk-based decision making by the forecasters and by the users of the forecast products, such as emergency managers. Forced by single-valued quantitative precipitation and temperature forecasts (QPF, QTF), the single-valued streamflow forecasts are produced at a 6-h time step nominally out to 5 days into the future. The single-valued streamflow forecasts reflect various run-time modifications, or "manual data assimilation", applied by the human forecasters in an attempt to reduce error from various sources in the end-to-end forecast process. The proposed procedure generates ensemble traces of streamflow from a parsimonious approximation of the conditional multivariate probability distribution of future streamflow given the single-valued streamflow forecast, QPF, and the most recent streamflow observation. For parameter estimation and evaluation, we used a multiyear archive of the single-valued river stage forecast produced operationally by the NWS Arkansas-Red River Basin River Forecast Center (ABRFC) in Tulsa, Oklahoma. As a by-product of parameter estimation, the procedure provides a categorical assessment of the effective lead time of the operational hydrologic forecasts for different QPF and forecast flow conditions. To evaluate the procedure, we carried out hindcasting experiments in dependent and cross-validation modes. The results indicate that the short-term streamflow ensemble hindcasts generated from the procedure are generally reliable within the effective lead time of the single-valued forecasts and well capture the skill of the single-valued forecasts. For smaller basins, however, the effective lead time is significantly reduced by short basin memory and reduced skill in the single-valued QPF.

  2. Recruiting From Within: Action-Oriented Research Solutions to Internal Student Recruitment in Collegiate Aviation Education

    DOT National Transportation Integrated Search

    1999-01-01

    Forecasts by the Federal Aviation Administration (FAA) and industry document renewed growth and demand for aviation employment. That need should be realized by increased enrollments on our aviation college campuses. Collegiate aviation education provi...

  3. Adjusting Wavelet-based Multiresolution Analysis Boundary Conditions for Robust Long-term Streamflow Forecasting Model

    NASA Astrophysics Data System (ADS)

    Maslova, I.; Ticlavilca, A. M.; McKee, M.

    2012-12-01

    There has been increased interest in wavelet-based streamflow forecasting models in recent years. Often overlooked in this approach are the circularity assumptions of the wavelet transform. We propose a novel technique for minimizing the effect of the wavelet decomposition boundary condition to produce long-term, up to 12 months ahead, forecasts of streamflow. A simulation study is performed to evaluate the effects of different wavelet boundary rules using synthetic and real streamflow data. A hybrid wavelet-multivariate relevance vector machine model is developed for forecasting the streamflow in real time for the Yellowstone River, Uinta Basin, Utah, USA. The inputs of the model utilize only the past monthly streamflow records. They are decomposed into components formulated in terms of wavelet multiresolution analysis. It is shown that the model accuracy can be increased by using the wavelet boundary rule introduced in this study. This long-term streamflow modeling and forecasting methodology would enable better decision-making and management of water availability risk.

  4. Multivariate time series modeling of short-term system scale irrigation demand

    NASA Astrophysics Data System (ADS)

    Perera, Kushan C.; Western, Andrew W.; George, Biju; Nawarathna, Bandara

    2015-12-01

    Travel time limits the ability of irrigation system operators to react to short-term irrigation demand fluctuations that result from variations in weather, including very hot periods and rainfall events, as well as the various other pressures and opportunities that farmers face. Short-term, system-wide irrigation demand forecasts can assist in system operation. Here we developed a multivariate time series (ARMAX) model to forecast irrigation demands in terms of aggregated service point flows (IDCGi, ASP) and off-take regulator flows (IDCGi, OTR) across 5 command areas, which included the area covered by four irrigation channels and the study area. These command-area-specific ARMAX models forecast daily IDCGi, ASP and IDCGi, OTR 1-5 days ahead using the real-time flow data recorded at the service points and the uppermost regulators and observed meteorological data collected from automatic weather stations. The model efficiency and predictive performance were quantified using the root mean squared error (RMSE), the Nash-Sutcliffe model efficiency coefficient (NSE), the anomaly correlation coefficient (ACC) and the mean square skill score (MSSS). During the evaluation period, the NSE for IDCGi, ASP and IDCGi, OTR across the 5 command areas ranged from 0.78 to 0.98. These models were capable of generating skillful forecasts (MSSS ⩾ 0.5 and ACC ⩾ 0.6) of IDCGi, ASP and IDCGi, OTR for all 5 lead days, and the IDCGi, ASP and IDCGi, OTR forecasts were better than using the long-term monthly mean irrigation demand. Overall, the predictive performance of these ARMAX time series models was higher than in almost all previous studies we are aware of. Further, the IDCGi, ASP and IDCGi, OTR forecasts have improved the operators' ability to react to near-future irrigation demand fluctuations, as the developed ARMAX time series models are self-adaptive and reflect short-term changes in irrigation demand with respect to the various pressures and opportunities that farmers face, such as changing water policy, the continued development of water markets, drought and changing technology.
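
    An ARMAX model of the kind used above can be sketched with statsmodels' SARIMAX class, which accepts exogenous regressors; the daily demand, temperature and rainfall series below are synthetic, and the model orders are illustrative.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.statespace.sarimax import SARIMAX

      rng = np.random.default_rng(6)
      n = 365
      idx = pd.date_range("2014-07-01", periods=n, freq="D")
      temp = 20 + 8 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
      rain = rng.gamma(0.3, 5.0, n)
      # synthetic daily irrigation demand driven by temperature and rainfall
      demand = 50 + 3 * temp - 2 * rain + rng.normal(0, 5, n)

      endog = pd.Series(demand, index=idx, name="demand")
      exog = pd.DataFrame({"temp": temp, "rain": rain}, index=idx)

      endog_train, exog_train = endog.iloc[:-5], exog.iloc[:-5]
      exog_test, actual = exog.iloc[-5:], endog.iloc[-5:]

      fit = SARIMAX(endog_train, exog=exog_train, order=(2, 0, 1)).fit(disp=False)

      # 1-5 day ahead forecasts; observed weather stands in for a weather forecast
      fc = fit.forecast(steps=5, exog=exog_test)
      rmse = np.sqrt(np.mean((actual.values - fc.values) ** 2))
      print(fc.round(1))
      print("RMSE over the 5 lead days:", round(rmse, 2))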

  5. WILD SALMON IN WESTERN NORTH AMERICA: FORECASTING THE MOST LIKELY STATUS IN 2100

    EPA Science Inventory

    The future of wild salmon in western North America (especially California, Oregon, Washington, Idaho, and southern British Columbia), as earnest, expensive, and socially disruptive as current recovery efforts are, does not appear likely to realize sustain biologically significan...

  6. Generating synthetic daily precipitation realizations for seasonal precipitation forecasts

    USDA-ARS?s Scientific Manuscript database

    Synthetic weather generation models that depend on statistics of past weather observations are often limited in their applications to issues that depend upon historical weather characteristics. Enhancing these models to take advantage of increasingly available and skillful seasonal climate outlook p...

  7. Integrated Forecast-Decision Systems For River Basin Planning and Management

    NASA Astrophysics Data System (ADS)

    Georgakakos, A. P.

    2005-12-01

    A central application of climatology, meteorology, and hydrology is the generation of reliable forecasts for water resources management. In principle, effective use of forecasts could improve water resources management by providing extra protection against floods, mitigating the adverse effects of droughts, generating more hydropower, facilitating recreational activities, and minimizing the impacts of extreme events on the environment and the ecosystems. In practice, however, realization of these benefits depends on three requisite elements. First is the skill and reliability of forecasts. Second is the existence of decision support methods/systems with the ability to properly utilize forecast information. And third is the capacity of the institutional infrastructure to incorporate the information provided by the decision support systems into the decision making processes. This presentation discusses several decision support systems (DSS) using ensemble forecasting that have been developed by the Georgia Water Resources Institute for river basin management. These DSS are currently operational in Africa, Europe, and the US and address integrated water resources and energy planning and management in river basins with multiple water uses, multiple relevant temporal and spatial scales, and multiple decision makers. The article discusses the methods used and advocates that the design, development, and implementation of effective forecast-decision support systems must bring together disciplines, people, and institutions necessary to address today's complex water resources challenges.

  8. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    PubMed

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.

  9. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis

    PubMed Central

    Mohammed, Emad A.; Naugler, Christopher

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand. PMID:28400996

  10. Tourism demand in the Algarve region: Evolution and forecast using SVARMA models

    NASA Astrophysics Data System (ADS)

    Lopes, Isabel Cristina; Soares, Filomena; Silva, Eliana Costa e.

    2017-06-01

    Tourism is one of the Portuguese economy's key sectors, and its relative weight has grown over recent years. The Algarve region is particularly focused on attracting foreign tourists and has built up over the years a large and diversified offer of hotel units. In this paper we present a multivariate time series approach to forecast the number of overnight stays in hotel units (hotels, guesthouses or hostels, and tourist apartments) in the Algarve. We fit a seasonal vector autoregressive moving average (SVARMA) model to monthly data between 2006 and 2016. The forecast values were compared with the actual values of overnight stays in the Algarve in 2016 and led to a MAPE of 15.1% and an RMSE of 53,847.28. The MAPE for the Hotel series was only 4.56%. These forecast values can be used by hotel managers to predict their occupancy and to determine the best pricing policy.

  11. High-Resolution Hydrological Sub-Seasonal Forecasting for Water Resources Management Over Europe

    NASA Astrophysics Data System (ADS)

    Wood, E. F.; Wanders, N.; Pan, M.; Sheffield, J.; Samaniego, L. E.; Thober, S.; Kumar, R.; Prudhomme, C.; Houghton-Carr, H.

    2017-12-01

    For decision-making at the sub-seasonal and seasonal time scales, hydrological forecasts with a high temporal and spatial resolution are required by water managers. So far such forecasts have been unavailable due to (1) the lack of availability of meteorological seasonal forecasts, (2) the coarse temporal resolution of meteorological seasonal forecasts, requiring temporal downscaling, and (3) the lack of consistency between observations and seasonal forecasts, requiring bias correction. The EDgE (End-to-end Demonstrator for improved decision making in the water sector in Europe) project commissioned by the ECMWF (C3S) created a unique dataset of hydrological seasonal forecasts derived from four global climate models (CanCM4, FLOR-B01, ECMF, LFPW) in combination with four global hydrological models (PCR-GLOBWB, VIC, mHM, Noah-MP), resulting in 208 forecasts for any given day. The forecasts have a daily temporal and 5-km spatial resolution and are bias-corrected against E-OBS meteorological observations. The forecasts are communicated to stakeholders via Sectoral Climate Impact Indicators (SCIIs), created in collaboration with the end-user community of the EDgE project (e.g. the percentage of ensemble realizations above the 10th percentile of monthly river flow, or below the 90th). Results show skillful forecasts for discharge from 3 months to 6 months ahead (the latter for northern Europe, due to snow); for soil moisture up to three months, limited by precipitation forecast skill and short initial-condition memory; and for groundwater beyond 6 months (with the lowest skill in western Europe). The SCIIs are effective in communicating both forecast skill and uncertainty. Overall the new system provides an unprecedented ensemble for seasonal forecasts with significant skill over Europe to support water management. The consistency in both the GCM forecasts and the LSM parameterization ensures a stable and reliable forecast framework and methodology, even if additional GCMs or LSMs are added in the future.
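
    The exceedance-type indicator quoted above is straightforward to compute once the ensemble and a climatological reference are available; the sketch below uses entirely synthetic flows and treats the 208-member ensemble size as given in the abstract.

      import numpy as np

      rng = np.random.default_rng(7)

      # synthetic monthly-mean river flow climatology (30 historical years, m3/s)
      climatology = rng.gamma(4.0, 25.0, size=30)
      p10, p90 = np.percentile(climatology, [10, 90])

      # synthetic multi-model ensemble forecast of the same monthly-mean flow
      # (208 members = 4 climate models x 4 hydrological models x ensemble runs)
      ensemble = rng.gamma(4.0, 27.0, size=208)

      scii_above_p10 = 100.0 * np.mean(ensemble > p10)   # % of members above the 10th percentile
      scii_below_p90 = 100.0 * np.mean(ensemble < p90)   # % of members below the 90th percentile

      print(f"climatological P10={p10:.1f}, P90={p90:.1f} m3/s")
      print(f"{scii_above_p10:.1f}% of realizations above P10, "
            f"{scii_below_p90:.1f}% below P90")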

  12. Exploring heterogeneous market hypothesis using realized volatility

    NASA Astrophysics Data System (ADS)

    Chin, Wen Cheong; Isa, Zaidi; Mohd Nor, Abu Hassan Shaari

    2013-04-01

    This study investigates the heterogeneous market hypothesis using high-frequency data. The cascaded heterogeneous trading activities with different time durations are modelled by the heterogeneous autoregressive framework. The empirical study indicates the presence of long-memory behaviour and predictability elements in the financial time series, which supports the heterogeneous market hypothesis. Besides the common sum-of-squares intraday realized volatility, we also advocate two power-variation realized volatilities for forecast evaluation and risk measurement, in order to overcome the possible abrupt jumps during the credit crisis. Finally, the empirical results are used in determining the market risk using the value-at-risk approach. The findings of this study have implications for informational market efficiency analysis, portfolio strategies and risk management.
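
    The two kinds of measure mentioned above can be illustrated on synthetic 5-minute returns: the standard sum-of-squares realized variance picks up an artificial jump, while realized bipower variation, a common power-variation measure of the kind advocated in the abstract, largely ignores it.

      import numpy as np

      rng = np.random.default_rng(8)

      # synthetic 5-minute intraday log-returns for one trading day (48 returns),
      # with an artificial price jump added to one interval
      r = rng.normal(0.0, 0.001, 48)
      r[20] += 0.01

      realized_variance = np.sum(r ** 2)                                        # sum-of-squares RV
      bipower_variation = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))  # jump-robust

      print("realized volatility       :", np.sqrt(realized_variance))
      print("bipower (power-variation) :", np.sqrt(bipower_variation))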

  13. Improving Global Forecast System of extreme precipitation events with regional statistical model: Application of quantile-based probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Shastri, Hiteshri; Ghosh, Subimal; Karmakar, Subhankar

    2017-02-01

    Forecasting of extreme precipitation events at a regional scale is of high importance due to their severe impacts on society. The impacts are stronger in urban regions due to high flood potential as well as high population density leading to high vulnerability. Although significant scientific improvements have taken place in the global models for weather forecasting, they are still not adequate at a regional scale (e.g., for an urban region), with high false alarms and low detection. There has been a need to improve the weather forecast skill at a local scale with probabilistic outcomes. Here we develop a methodology with quantile regression, where the reliably simulated variables from the Global Forecast System are used as predictors and different quantiles of rainfall are generated corresponding to that set of predictors. We apply this method to a flood-prone coastal city of India, Mumbai, which has experienced severe floods in recent years. We find significant improvements in the forecast, with high detection and skill scores. We apply the methodology to 10 ensemble members of the Global Ensemble Forecast System and find a reduction in the ensemble uncertainty of precipitation across realizations with respect to that of the original precipitation forecasts. We validate our model for the monsoon seasons of 2006 and 2007, which are independent of the training/calibration data set used in the study. We find promising results and emphasize the implementation of such data-driven methods for better probabilistic forecasts at an urban scale, primarily for early flood warning.
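
    A minimal quantile-regression sketch of the idea above is given here with statsmodels; the "humidity" and "vwind" predictors are synthetic stand-ins for reliably simulated GFS variables, and the quantile levels are illustrative.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(9)
      n = 1000
      # synthetic stand-ins for reliably simulated large-scale predictors
      humidity = rng.uniform(30, 100, n)
      vwind = rng.normal(0, 5, n)
      # synthetic daily rainfall with heavier tails when humidity is high
      rain = np.maximum(0, rng.gamma(1.2, 2.0, n) * (humidity / 60) + 0.3 * vwind)

      df = pd.DataFrame({"rain": rain, "humidity": humidity, "vwind": vwind})

      # fit separate quantile regressions for several quantiles of rainfall
      for q in (0.5, 0.9, 0.99):
          fit = smf.quantreg("rain ~ humidity + vwind", df).fit(q=q)
          new = pd.DataFrame({"humidity": [95.0], "vwind": [8.0]})
          print(f"q={q:4.2f}  predicted rainfall: {fit.predict(new).iloc[0]:6.2f} mm")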

  14. On the effect of model parameters on forecast objects

    NASA Astrophysics Data System (ADS)

    Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott

    2018-04-01

    Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
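
    Two of the ingredients listed above, Latin hypercube sampling of model parameters and a multivariate multiple regression of object features on those parameters, can be sketched in a few lines; the parameter bounds, the "object features" and their dependence on the parameters are invented for illustration, and the clustering and multiple-testing steps are omitted. The sketch assumes scipy 1.7+ for the qmc module.

      import numpy as np
      from scipy.stats import qmc
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(10)

      # (1) Latin hypercube sample of three hypothetical model parameters
      sampler = qmc.LatinHypercube(d=3, seed=10)
      unit = sampler.random(n=60)
      params = qmc.scale(unit, l_bounds=[0.1, 1.0, 0.0], u_bounds=[1.0, 5.0, 2.0])

      # stand-in for running the model and extracting object features: here the
      # "features" (object size, intensity) are a known function of the parameters
      # plus noise, purely for illustration
      size = 10 + 4 * params[:, 0] - 1.5 * params[:, 2] + rng.normal(0, 0.5, 60)
      intensity = 2 + 0.5 * params[:, 1] + rng.normal(0, 0.2, 60)
      features = np.column_stack([size, intensity])

      # (3) multivariate multiple regression of object features on the parameters
      reg = LinearRegression().fit(params, features)
      print("coefficient (sensitivity) matrix, shape (n_features, n_params):")
      print(np.round(reg.coef_, 2))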

  15. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs

    DOE PAGES

    Buitrago, Jaime; Asfour, Shihab

    2017-01-01

    Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open loop using actual load and weather data, and then the network is placed in closed loop to generate a forecast using the predicted load as the feedback input. Unlike the existing short-term load forecasting methods using ANNs, the proposed method uses its own output as an input in order to improve the accuracy, thus effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast on the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, and which can result in large savings by avoiding the commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecasting model.
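
    The open-loop training / closed-loop forecasting idea described above can be illustrated with a generic feed-forward network from scikit-learn standing in for the NARX architecture; the hourly load and temperature data are synthetic, and the lag length and network size are illustrative assumptions.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(11)
      n = 24 * 120                                     # 120 days of hourly data
      hour = np.arange(n) % 24
      temp = 15 + 8 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 1, n)
      load = 500 + 15 * temp + 50 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 10, n)

      lags = 24
      # open-loop training: inputs are the previous 24 observed loads plus exogenous
      # variables (temperature and hour at the target time); target is the next load
      X = np.column_stack([np.column_stack([load[i:n - lags + i] for i in range(lags)]),
                           temp[lags:], hour[lags:]])
      y = load[lags:]
      net = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0))
      net.fit(X[:-24], y[:-24])                        # hold out the last 24 hours

      # closed-loop forecasting: feed each predicted load back in as an input
      history = list(load[-48:-24])                    # last 24 observed loads before the test day
      forecast = []
      for h in range(24):
          x = np.r_[history[-lags:], temp[n - 24 + h], hour[n - 24 + h]]
          yhat = net.predict(x.reshape(1, -1))[0]
          forecast.append(yhat)
          history.append(yhat)

      mape = 100 * np.mean(np.abs((load[-24:] - np.array(forecast)) / load[-24:]))
      print(f"24 h closed-loop forecast MAPE: {mape:.2f}%")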

  16. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buitrago, Jaime; Asfour, Shihab

    Short-term load forecasting is crucial for the operations planning of an electrical grid. Forecasting the next 24 h of electrical load in a grid allows operators to plan and optimize their resources. The purpose of this study is to develop a more accurate short-term load forecasting method utilizing non-linear autoregressive artificial neural networks (ANN) with exogenous multi-variable input (NARX). The proposed implementation of the network is new: the neural network is trained in open loop using actual load and weather data, and then the network is placed in closed loop to generate a forecast using the predicted load as the feedback input. Unlike the existing short-term load forecasting methods using ANNs, the proposed method uses its own output as an input in order to improve the accuracy, thus effectively implementing a feedback loop for the load and making it less dependent on external data. Using the proposed framework, mean absolute percent errors in the forecast on the order of 1% have been achieved, which is a 30% improvement on the average error using feedforward ANNs, ARMAX and state space methods, and which can result in large savings by avoiding the commissioning of unnecessary power plants. Finally, the New England electrical load data are used to train and validate the forecasting model.

  17. Thermal habitat index of many northwest Atlantic temperate species stays neutral under warming projected for 2030 but changes radically by 2060.

    PubMed

    Shackell, Nancy L; Ricard, Daniel; Stortini, Christine

    2014-01-01

    Global-scale forecasts of range shifts in response to global warming have provided vital insight into predicted species redistribution. We build on that insight by examining whether local warming will affect habitat on spatiotemporal scales relevant to regional agencies. We used generalized additive models to quantify the realized habitat of 46 temperate/boreal marine species using 41+ years of survey data from 35°N-48°N in the Northwest Atlantic. We then estimated the change in a "realized thermal habitat index" under short-term (2030) and long-term (2060) warming scenarios. Under the 2030 scenario, ∼10% of species will lose realized thermal habitat at the national scale (USA and Canada), but planktivores are expected to lose significantly in both countries, which may result in indirect changes in their predators' distribution. In contrast, by 2060 in Canada, the realized habitat of 76% of species will change (55% will lose, 21% will gain), while in the USA, the realized habitat of 85% of species will change (65% will lose, 20% will gain). If all else were held constant, the ecosystem is projected to change radically based on thermal habitat alone. The magnitude of the 2060 warming projection (∼1.5-3°C) was observed in 2012, affirming that research is needed on the effects of extreme "weather" in addition to increasing mean temperature. Our approach can be used to aggregate at smaller spatial scales, where temperate/boreal species are hypothesized to have a greater loss at ∼40°N. The uncertainty associated with climate change forecasts is large, yet resource management agencies still have to address climate change. How? Since many fishery agencies do not plan beyond 5 years, a logical way forward is to incorporate a "realized thermal habitat index" into the stock assessment process. Over time, decisions would be influenced by the amount of suitable thermal habitat, in concert with gradual or extreme warming.

  18. Modeling inflation rates and exchange rates in Ghana: application of multivariate GARCH models.

    PubMed

    Nortey, Ezekiel Nn; Ngoh, Delali D; Doku-Amponsah, Kwabena; Ofori-Boateng, Kenneth

    2015-01-01

    This paper investigates the volatility and conditional relationships among inflation rates, exchange rates and interest rates, and constructs models using the multivariate GARCH DCC and BEKK specifications with Ghanaian data from January 1990 to December 2013. The study revealed that the cumulative depreciation of the cedi against the US dollar from 1990 to 2013 was 7,010.2% and the yearly weighted depreciation of the cedi against the US dollar for the period was 20.4%. There was evidence that a stable inflation rate does not imply that exchange rates and interest rates are expected to be stable. Rather, when the cedi performs well on the foreign exchange market, inflation rates and interest rates react positively and become stable in the long run. The BEKK model is robust for modelling and forecasting the volatility of inflation rates, exchange rates and interest rates. The DCC model is robust for modelling the conditional and unconditional correlations among inflation rates, exchange rates and interest rates. The BEKK model, which forecast high exchange rate volatility for the year 2014, is very robust for modelling the exchange rates in Ghana. The mean equation of the DCC model is also robust for forecasting inflation rates in Ghana.
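
    The DCC filtering recursion at the heart of the DCC model above can be sketched directly in numpy. This assumes the usual first stage, fitting univariate GARCH models to obtain standardized residuals, has already been done (the residuals here are synthetic), and uses illustrative rather than estimated DCC parameters.

      import numpy as np

      rng = np.random.default_rng(12)

      # synthetic standardized residuals for three series (inflation, exchange rate,
      # interest rate), as would come out of first-stage univariate GARCH fits
      T, k = 500, 3
      true_R = np.array([[1.0, 0.4, 0.2],
                         [0.4, 1.0, 0.5],
                         [0.2, 0.5, 1.0]])
      eps = rng.multivariate_normal(np.zeros(k), true_R, size=T)

      a, b = 0.05, 0.90                    # illustrative DCC parameters (a + b < 1)
      Qbar = np.cov(eps, rowvar=False)     # unconditional correlation target
      Q = Qbar.copy()
      R_t = np.empty((T, k, k))            # dynamic conditional correlation matrices

      for t in range(T):
          d = 1.0 / np.sqrt(np.diag(Q))
          R_t[t] = Q * np.outer(d, d)      # rescale Q_t to a correlation matrix
          e = eps[t][:, None]
          Q = (1 - a - b) * Qbar + a * (e @ e.T) + b * Q

      print("last conditional correlation matrix:")
      print(np.round(R_t[-1], 2))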

  19. A Gaussian Processes Technique for Short-term Load Forecasting with Considerations of Uncertainty

    NASA Astrophysics Data System (ADS)

    Ohmi, Masataro; Mori, Hiroyuki

In this paper, an efficient method based on Gaussian processes is proposed for short-term load forecasting. Short-term load forecasting plays a key role in smooth power system operation, including economic load dispatch and unit commitment. The deregulated and competitive power market has recently increased the degree of uncertainty, so better predictions are increasingly important for saving cost. In particular, power system operators need the upper and lower bounds of the predicted load to handle this uncertainty, in addition to more accurate point predictions. The proposed method is based on a Bayesian model in which the output is expressed as a distribution rather than a point. To realize the model efficiently, this paper uses Gaussian processes, which combine the Bayesian linear model with a kernel machine to obtain the distribution of the predicted value. The proposed method is successfully applied to real data for daily maximum load forecasting.
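
    The following sketch illustrates the general idea of Gaussian-process load forecasting with predictive bounds, using scikit-learn rather than the authors' own implementation; the features, kernel settings and synthetic load series are assumptions made for the example.

```python
# A minimal sketch (assumed setup, not the paper's implementation) of Gaussian
# process regression for daily maximum load, returning predictive bounds.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
# Illustrative features: [day-of-year, previous-day maximum load]; target: today's max load.
X = rng.uniform([0, 500], [365, 1500], size=(200, 2))
y = 0.8 * X[:, 1] + 50 * np.sin(2 * np.pi * X[:, 0] / 365) + rng.normal(0, 20, 200)

kernel = RBF(length_scale=[30.0, 100.0]) + WhiteKernel(noise_level=25.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

x_new = np.array([[180.0, 1100.0]])
mean, std = gpr.predict(x_new, return_std=True)
print(f"predicted load: {mean[0]:.1f}, 95% bounds: "
      f"[{mean[0] - 1.96 * std[0]:.1f}, {mean[0] + 1.96 * std[0]:.1f}]")
```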

  20. The NRL relocatable ocean/acoustic ensemble forecast system

    NASA Astrophysics Data System (ADS)

    Rowley, C.; Martin, P.; Cummings, J.; Jacobs, G.; Coelho, E.; Bishop, C.; Hong, X.; Peggion, G.; Fabre, J.

    2009-04-01

    A globally relocatable regional ocean nowcast/forecast system has been developed to support rapid implementation of new regional forecast domains. The system is in operational use at the Naval Oceanographic Office for a growing number of regional and coastal implementations. The new system is the basis for an ocean acoustic ensemble forecast and adaptive sampling capability. We present an overview of the forecast system and the ocean ensemble and adaptive sampling methods. The forecast system consists of core ocean data analysis and forecast modules, software for domain configuration, surface and boundary condition forcing processing, and job control, and global databases for ocean climatology, bathymetry, tides, and river locations and transports. The analysis component is the Navy Coupled Ocean Data Assimilation (NCODA) system, a 3D multivariate optimum interpolation system that produces simultaneous analyses of temperature, salinity, geopotential, and vector velocity using remotely-sensed SST, SSH, and sea ice concentration, plus in situ observations of temperature, salinity, and currents from ships, buoys, XBTs, CTDs, profiling floats, and autonomous gliders. The forecast component is the Navy Coastal Ocean Model (NCOM). The system supports one-way nesting and multiple assimilation methods. The ensemble system uses the ensemble transform technique with error variance estimates from the NCODA analysis to represent initial condition error. Perturbed surface forcing or an atmospheric ensemble is used to represent errors in surface forcing. The ensemble transform Kalman filter is used to assess the impact of adaptive observations on future analysis and forecast uncertainty for both ocean and acoustic properties.

  1. Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Libera, D.

    2017-12-01

Given the labor requirements, water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period, which makes calibrating and validating mechanistic models difficult. Further, physical model predictions inherently have bias (i.e., under- or over-estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study proposes a multivariate bias-correction technique and compares it to a common technique for improving the performance of the SWAT model in predicting daily streamflow and TN loads across the Southeast, based on split-sample validation. The proposed approach uses a dimension-reduction technique, canonical correlation analysis (CCA), which regresses the observed multivariate attributes on the SWAT-simulated values. The common approach is a regression-based technique that uses ordinary least squares to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation, while individual biases are simultaneously reduced. Additionally, canonical correlation analysis better preserves the observed joint likelihood of streamflow and loadings. These procedures were applied to three watersheds chosen from the Water Quality Network in the Southeast Region, specifically watersheds with sufficiently large drainage areas and numbers of observed data points. The performance of the two approaches is compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month-ahead forecasts of P & T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model to forecast loadings and streamflow at the monthly and seasonal timescales is also discussed.
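
    A minimal sketch of the CCA-based correction step is given below, using scikit-learn's CCA to map simulated streamflow and TN loads toward the observations. The synthetic data and the simple bias structure are assumptions; the actual study works with SWAT output and LOADEST-based loading estimates.

```python
# A minimal sketch (assumed variable names, not the study's code) of a CCA-based
# bias correction that maps simulated streamflow and TN loads toward the
# joint behaviour of the observations.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(2)
n = 365
obs = np.column_stack([rng.gamma(2.0, 10.0, n), rng.gamma(1.5, 3.0, n)])   # [flow, TN load]
sim = obs * [1.3, 0.7] + rng.normal(0, 2.0, (n, 2))                        # biased model output

cca = CCA(n_components=2).fit(sim, obs)        # learn the simulated-to-observed mapping
sim_corrected = cca.predict(sim)               # bias-corrected simulations

print("raw bias:      ", (sim - obs).mean(axis=0))
print("corrected bias:", (sim_corrected - obs).mean(axis=0))
```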

  2. ADVANCES IN THE APPLICATION OF REMOTE SENSING TO PLANT INCORPORATED PROTECTANT CROP MONITORING

    EPA Science Inventory

    Current forecasts call for significant increases to the plantings of transgenic corn in the United States for the 2007 growing season and beyond. Transgenic acreage approaching 80% of the total corn plantings could be realized by 2009. These conditions call for a new approach to ...

  3. Impact of number of realizations on the suitability of simulated weather data for hydrologic and environmental applications

    USDA-ARS?s Scientific Manuscript database

    Stochastic weather generators are widely used in hydrological, environmental, and agricultural applications to simulate and forecast weather time series. However, such stochastic processes usually produce random outputs hence the question on how representative the generated data are if obtained fro...

  4. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.

  5. Stochastic simulation of predictive space–time scenarios of wind speed using observations and physical model outputs

    DOE PAGES

    Bessac, Julie; Constantinescu, Emil; Anitescu, Mihai

    2018-03-01

We propose a statistical space-time model for predicting atmospheric wind speed based on deterministic numerical weather predictions and historical measurements. We consider a Gaussian multivariate space-time framework that combines multiple sources of past physical model outputs and measurements in order to produce a probabilistic wind speed forecast within the prediction window. We illustrate this strategy on wind speed forecasts during several months in 2012 for a region near the Great Lakes in the United States. The results show that the prediction is improved in the mean-squared sense relative to the numerical forecasts as well as in probabilistic scores. Moreover, the samples are shown to produce realistic wind scenarios based on sample spectra and space-time correlation structure.

  6. Estimating predictive hydrological uncertainty by dressing deterministic and ensemble forecasts; a comparison, with application to Meuse and Rhine

    NASA Astrophysics Data System (ADS)

    Verkade, J. S.; Brown, J. D.; Davids, F.; Reggiani, P.; Weerts, A. H.

    2017-12-01

    Two statistical post-processing approaches for estimation of predictive hydrological uncertainty are compared: (i) 'dressing' of a deterministic forecast by adding a single, combined estimate of both hydrological and meteorological uncertainty and (ii) 'dressing' of an ensemble streamflow forecast by adding an estimate of hydrological uncertainty to each individual streamflow ensemble member. Both approaches aim to produce an estimate of the 'total uncertainty' that captures both the meteorological and hydrological uncertainties. They differ in the degree to which they make use of statistical post-processing techniques. In the 'lumped' approach, both sources of uncertainty are lumped by post-processing deterministic forecasts using their verifying observations. In the 'source-specific' approach, the meteorological uncertainties are estimated by an ensemble of weather forecasts. These ensemble members are routed through a hydrological model and a realization of the probability distribution of hydrological uncertainties (only) is then added to each ensemble member to arrive at an estimate of the total uncertainty. The techniques are applied to one location in the Meuse basin and three locations in the Rhine basin. Resulting forecasts are assessed for their reliability and sharpness, as well as compared in terms of multiple verification scores including the relative mean error, Brier Skill Score, Mean Continuous Ranked Probability Skill Score, Relative Operating Characteristic Score and Relative Economic Value. The dressed deterministic forecasts are generally more reliable than the dressed ensemble forecasts, but the latter are sharper. On balance, however, they show similar quality across a range of verification metrics, with the dressed ensembles coming out slightly better. Some additional analyses are suggested. Notably, these include statistical post-processing of the meteorological forecasts in order to increase their reliability, thus increasing the reliability of the streamflow forecasts produced with ensemble meteorological forcings.
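
    The sketch below illustrates the source-specific "dressing" idea in its simplest form: a hydrological-uncertainty realization is applied to each meteorologically driven ensemble member. The lognormal error model and all numbers are assumptions for illustration, not the post-processor used in the paper.

```python
# A minimal sketch (assumed error model, not the paper's post-processor) of
# "dressing" an ensemble streamflow forecast: one hydrological-uncertainty
# realization is added to each meteorologically driven ensemble member.
import numpy as np

rng = np.random.default_rng(3)
ens_members = rng.normal(800.0, 120.0, size=50)      # streamflow from 50 NWP members (m3/s)

# Hydrological uncertainty represented here as a simple multiplicative lognormal
# error, nominally calibrated offline from past forecast-observation pairs (assumed values).
hydro_error = rng.lognormal(mean=0.0, sigma=0.15, size=ens_members.shape)
total_ensemble = ens_members * hydro_error

print("5th/50th/95th percentiles of total uncertainty:",
      np.percentile(total_ensemble, [5, 50, 95]).round(1))
```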

  7. Remote Multivariable Control Design Using a Competition Game

    ERIC Educational Resources Information Center

    Atanasijevic-Kunc, M.; Logar, V.; Karba, R.; Papic, M.; Kos, A.

    2011-01-01

    In this paper, some approaches to teaching multivariable control design are discussed, with special attention being devoted to a step-by-step transition to e-learning. The approach put into practice and presented here is developed through design projects, from which one is chosen as a competition game and is realized using the E-CHO system,…

  8. Insights on multivariate updates of physical and biogeochemical ocean variables using an Ensemble Kalman Filter and an idealized model of upwelling

    NASA Astrophysics Data System (ADS)

    Yu, Liuqian; Fennel, Katja; Bertino, Laurent; Gharamti, Mohamad El; Thompson, Keith R.

    2018-06-01

    Effective data assimilation methods for incorporating observations into marine biogeochemical models are required to improve hindcasts, nowcasts and forecasts of the ocean's biogeochemical state. Recent assimilation efforts have shown that updating model physics alone can degrade biogeochemical fields while only updating biogeochemical variables may not improve a model's predictive skill when the physical fields are inaccurate. Here we systematically investigate whether multivariate updates of physical and biogeochemical model states are superior to only updating either physical or biogeochemical variables. We conducted a series of twin experiments in an idealized ocean channel that experiences wind-driven upwelling. The forecast model was forced with biased wind stress and perturbed biogeochemical model parameters compared to the model run representing the "truth". Taking advantage of the multivariate nature of the deterministic Ensemble Kalman Filter (DEnKF), we assimilated different combinations of synthetic physical (sea surface height, sea surface temperature and temperature profiles) and biogeochemical (surface chlorophyll and nitrate profiles) observations. We show that when biogeochemical and physical properties are highly correlated (e.g., thermocline and nutricline), multivariate updates of both are essential for improving model skill and can be accomplished by assimilating either physical (e.g., temperature profiles) or biogeochemical (e.g., nutrient profiles) observations. In our idealized domain, the improvement is largely due to a better representation of nutrient upwelling, which results in a more accurate nutrient input into the euphotic zone. In contrast, assimilating surface chlorophyll improves the model state only slightly, because surface chlorophyll contains little information about the vertical density structure. We also show that a degradation of the correlation between observed subsurface temperature and nutrient fields, which has been an issue in several previous assimilation studies, can be reduced by multivariate updates of physical and biogeochemical fields.
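
    To make the multivariate update concrete, the sketch below implements a generic stochastic ensemble Kalman filter analysis step on a two-variable state (temperature, nitrate) in which only temperature is observed; the unobserved nitrate is nonetheless corrected through the ensemble covariance. The paper uses the deterministic EnKF (DEnKF), so this is only an illustration of the principle, with assumed numbers.

```python
# A minimal sketch of a stochastic ensemble Kalman filter analysis step for a
# joint physical-biogeochemical state vector (the paper uses the deterministic
# EnKF; this generic variant only illustrates the multivariate update).
import numpy as np

def enkf_update(ens, H, y, obs_err_std, rng):
    """ens: (n_state, n_members) forecast ensemble; H: (n_obs, n_state) observation
    operator; y: (n_obs,) observations; obs_err_std: observation error std."""
    n_obs, n_mem = H.shape[0], ens.shape[1]
    A = ens - ens.mean(axis=1, keepdims=True)              # ensemble anomalies
    HA = H @ A
    P_hh = HA @ HA.T / (n_mem - 1) + np.eye(n_obs) * obs_err_std**2
    K = (A @ HA.T / (n_mem - 1)) @ np.linalg.inv(P_hh)     # Kalman gain
    y_pert = y[:, None] + rng.normal(0, obs_err_std, (n_obs, n_mem))
    return ens + K @ (y_pert - H @ ens)                    # analysis ensemble

rng = np.random.default_rng(4)
# State = [temperature, nitrate]; only temperature is observed, yet nitrate is
# updated through the (assumed) negative temperature-nitrate ensemble covariance.
truth = np.array([12.0, 8.0])
ens = np.column_stack([truth + rng.multivariate_normal([0, 0], [[1.0, -0.8], [-0.8, 1.0]])
                       for _ in range(100)])
H = np.array([[1.0, 0.0]])
y = np.array([11.2])
analysis = enkf_update(ens, H, y, obs_err_std=0.3, rng=rng)
print("prior mean:", ens.mean(axis=1).round(2), "posterior mean:", analysis.mean(axis=1).round(2))
```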

  9. Forecasted economic change and the self-fulfilling prophecy in economic decision-making

    PubMed Central

    2017-01-01

    This study addresses the self-fulfilling prophecy effect, in the domain of economic decision-making. We present experimental data in support of the hypothesis that speculative forecasts of economic change can impact individuals’ economic decision behavior, prior to any realized changes. In a within-subjects experiment, participants (N = 40) played 180 trials in a Balloon Analogue Risk Talk (BART) in which they could make actual profit. Simple messages about possible (positive and negative) changes in outcome probabilities of future trials had significant effects on measures of risk taking (number of inflations) and actual profits in the game. These effects were enduring, even though no systematic changes in actual outcome probabilities took place following any of the messages. Risk taking also found to be reflected in reaction times revealing increasing reaction times with riskier decisions. Positive and negative economic forecasts affected reaction times slopes differently, with negative forecasts resulting in increased reaction time slopes as a function of risk. These findings suggest that forecasted positive or negative economic change can bias people’s mental model of the economy and reduce or stimulate risk taking. Possible implications for media-fulfilling prophecies in the domain of the economy are considered. PMID:28334031

  10. Towards smart energy systems: application of kernel machine regression for medium term electricity load forecasting.

    PubMed

    Alamaniotis, Miltiadis; Bargiotas, Dimitrios; Tsoukalas, Lefteri H

    2016-01-01

Integration of energy systems with information technologies has facilitated the realization of smart energy systems that utilize information to optimize system operation. To that end, accurate, ahead-of-time forecasting of load demand is crucial for optimizing energy system operation. In particular, load forecasting supports planning of system expansion and decision making for enhancing system safety and reliability. In this paper, the application of two types of kernel machines for medium term load forecasting (MTLF) is presented and their performance is recorded based on a set of historical electricity load demand data. The two kernel machine models, namely Gaussian process regression (GPR) and relevance vector regression (RVR), are used to make predictions of future load demand. Both models, i.e., GPR and RVR, are equipped with a Gaussian kernel and are tested on daily predictions for a 30-day-ahead horizon taken from the New England Area. Furthermore, their performance is compared to the ARMA(2,2) model with respect to mean absolute percentage error and the squared correlation coefficient. Results demonstrate the superiority of RVR over the other forecasting models in performing MTLF.
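
    The sketch below reproduces only the ARMA(2,2) benchmark and the MAPE comparison metric with statsmodels, on a synthetic load series; the GPR and RVR models and the New England data are not represented.

```python
# A minimal sketch (illustrative data, not the study's New England series) of the
# ARMA(2,2) benchmark and the MAPE metric used to compare against the kernel models.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
load = 1000 + 100 * np.sin(2 * np.pi * np.arange(400) / 7) + rng.normal(0, 30, 400)
train, test = load[:370], load[370:]

fit = ARIMA(train, order=(2, 0, 2)).fit()          # ARMA(2,2) with a constant
forecast = fit.forecast(steps=len(test))           # 30-day-ahead horizon

mape = np.mean(np.abs((test - forecast) / test)) * 100
print(f"ARMA(2,2) 30-day MAPE: {mape:.2f}%")
```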

  11. Forecasting the stochastic demand for inpatient care: the case of the Greek national health system.

    PubMed

    Boutsioli, Zoe

    2010-08-01

The aim of this study is to estimate the unexpected demand on Greek public hospitals. A multivariate model with four explanatory variables is used: the weekend effect, the duty effect, the summer holiday and the official holiday. Ordinary least squares is used to estimate the impact of these variables on the daily hospital emergency admissions series. The forecasted residuals of the hospital regressions for each year give the estimated stochastic demand. Daily emergency admissions decline during weekends, summer months and official holidays, and increase on duty hospital days. Stochastic hospital demand varies both among hospitals and over the five-year period under investigation, with variations among hospitals larger than variations over time. Hospital managers and health policy-makers can benefit from forecasts of the future flow of emergency patients, both at the managerial and the economic level. More advanced models including additional daily variables, such as weather forecasts, could provide more accurate estimates.
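
    A minimal sketch of the regression step is shown below: daily admissions are regressed on calendar-effect dummies with statsmodels OLS and the residuals are kept as a proxy for the stochastic demand. The dummy definitions and the synthetic admissions series are assumptions for illustration.

```python
# A minimal sketch (synthetic admissions data, assumed dummy definitions) of the
# OLS regression of daily emergency admissions on calendar-effect dummies.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(6)
dates = pd.date_range("2006-01-01", periods=365, freq="D")
df = pd.DataFrame({
    "weekend": (dates.dayofweek >= 5).astype(int),
    "duty_day": (np.arange(365) % 4 == 0).astype(int),      # assumed on-duty rota
    "summer": dates.month.isin([7, 8]).astype(int),
    "official_holiday": np.isin(dates.strftime("%m-%d"), ["01-01", "12-25"]).astype(int),
}, index=dates)
df["admissions"] = (120 - 15 * df["weekend"] + 25 * df["duty_day"]
                    - 10 * df["summer"] - 12 * df["official_holiday"]
                    + rng.normal(0, 8, 365))

model = sm.OLS(df["admissions"], sm.add_constant(df[["weekend", "duty_day", "summer",
                                                     "official_holiday"]])).fit()
print(model.params.round(2))
residuals = model.resid          # the forecasted residuals proxy the stochastic demand
```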

  12. Analysis/forecast experiments with a flow-dependent correlation function using FGGE data

    NASA Technical Reports Server (NTRS)

    Baker, W. E.; Bloom, S. C.; Carus, H.; Nestler, M. S.

    1986-01-01

The use of a flow-dependent correlation function to improve the accuracy of an optimum interpolation (OI) scheme is examined, and the development of the correlation function for the OI analysis scheme used for numerical weather prediction is described. The scheme uses a multivariate surface analysis over the oceans to model the pressure-wind error cross-correlation, and it can use an error correlation function that is flow- and geographically dependent. A series of four-day data assimilation experiments, conducted from January 5-9, 1979, was used to investigate the effect of the different features of the OI scheme (error correlation) on forecast skill for barotropic lows and highs. The skill of the OI was compared with that of a successive correction method (SCM) of analysis. The largest differences in the correlation statistics occurred in barotropic and baroclinic lows and highs, and the comparison reveals that the OI forecasts were more accurate than the SCM forecasts.

  13. Bayesian integration of sensor information and a multivariate dynamic linear model for prediction of dairy cow mastitis.

    PubMed

    Jensen, Dan B; Hogeveen, Henk; De Vries, Albert

    2016-09-01

    Rapid detection of dairy cow mastitis is important so corrective action can be taken as soon as possible. Automatically collected sensor data used to monitor the performance and the health state of the cow could be useful for rapid detection of mastitis while reducing the labor needs for monitoring. The state of the art in combining sensor data to predict clinical mastitis still does not perform well enough to be applied in practice. Our objective was to combine a multivariate dynamic linear model (DLM) with a naïve Bayesian classifier (NBC) in a novel method using sensor and nonsensor data to detect clinical cases of mastitis. We also evaluated reductions in the number of sensors for detecting mastitis. With the DLM, we co-modeled 7 sources of sensor data (milk yield, fat, protein, lactose, conductivity, blood, body weight) collected at each milking for individual cows to produce one-step-ahead forecasts for each sensor. The observations were subsequently categorized according to the errors of the forecasted values and the estimated forecast variance. The categorized sensor data were combined with other data pertaining to the cow (week in milk, parity, mastitis history, somatic cell count category, and season) using Bayes' theorem, which produced a combined probability of the cow having clinical mastitis. If this probability was above a set threshold, the cow was classified as mastitis positive. To illustrate the performance of our method, we used sensor data from 1,003,207 milkings from the University of Florida Dairy Unit collected from 2008 to 2014. Of these, 2,907 milkings were associated with recorded cases of clinical mastitis. Using the DLM/NBC method, we reached an area under the receiver operating characteristic curve of 0.89, with a specificity of 0.81 when the sensitivity was set at 0.80. Specificities with omissions of sensor data ranged from 0.58 to 0.81. These results are comparable to other studies, but differences in data quality, definitions of clinical mastitis, and time windows make comparisons across studies difficult. We found the DLM/NBC method to be a flexible method for combining multiple sensor and nonsensor data sources to predict clinical mastitis and accommodate missing observations. Further research is needed before practical implementation is possible. In particular, the performance of our method needs to be improved in the first 2 wk of lactation. The DLM method produces forecasts that are based on continuously estimated multivariate normal distributions, which makes forecasts and forecast errors easy to interpret, and new sensors can easily be added. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
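
    The sketch below illustrates only the naïve Bayes combination step described above, with made-up likelihood tables for two sensors and an assumed prior; the multivariate DLM that produces the forecast-error categories is not shown.

```python
# A minimal sketch (made-up likelihood tables) of the naïve Bayes combination step:
# each sensor's forecast error is categorized and the per-category likelihoods are
# multiplied with the prior to give a combined mastitis probability.
import numpy as np

prior_mastitis = 0.003                      # assumed prevalence per milking

# P(error category | mastitis) and P(error category | healthy) for two sensors,
# categories = ["small", "moderate", "large"] forecast error (illustrative values).
likelihoods = {
    "conductivity": {"mastitis": [0.2, 0.3, 0.5], "healthy": [0.7, 0.2, 0.1]},
    "milk_yield":   {"mastitis": [0.3, 0.4, 0.3], "healthy": [0.8, 0.15, 0.05]},
}
categories = {"small": 0, "moderate": 1, "large": 2}

def mastitis_probability(observed):
    """observed: dict sensor -> error category for the current milking."""
    p_m, p_h = prior_mastitis, 1.0 - prior_mastitis
    for sensor, cat in observed.items():
        i = categories[cat]
        p_m *= likelihoods[sensor]["mastitis"][i]
        p_h *= likelihoods[sensor]["healthy"][i]
    return p_m / (p_m + p_h)                # Bayes' theorem with naïve independence

p = mastitis_probability({"conductivity": "large", "milk_yield": "moderate"})
print(f"combined probability of clinical mastitis: {p:.4f}")
```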

  14. Impact of variational assimilation using multivariate background error covariances on the simulation of monsoon depressions over India

    NASA Astrophysics Data System (ADS)

    Dhanya, M.; Chandrasekar, A.

    2016-02-01

    The background error covariance structure influences a variational data assimilation system immensely. The simulation of a weather phenomenon like monsoon depression can hence be influenced by the background correlation information used in the analysis formulation. The Weather Research and Forecasting Model Data assimilation (WRFDA) system includes an option for formulating multivariate background correlations for its three-dimensional variational (3DVar) system (cv6 option). The impact of using such a formulation in the simulation of three monsoon depressions over India is investigated in this study. Analysis and forecast fields generated using this option are compared with those obtained using the default formulation for regional background error correlations (cv5) in WRFDA and with a base run without any assimilation. The model rainfall forecasts are compared with rainfall observations from the Tropical Rainfall Measurement Mission (TRMM) and the other model forecast fields are compared with a high-resolution analysis as well as with European Centre for Medium-Range Weather Forecasts (ECMWF) ERA-Interim reanalysis. The results of the study indicate that inclusion of additional correlation information in background error statistics has a moderate impact on the vertical profiles of relative humidity, moisture convergence, horizontal divergence and the temperature structure at the depression centre at the analysis time of the cv5/cv6 sensitivity experiments. Moderate improvements are seen in two of the three depressions investigated in this study. An improved thermodynamic and moisture structure at the initial time is expected to provide for improved rainfall simulation. The results of the study indicate that the skill scores of accumulated rainfall are somewhat better for the cv6 option as compared to the cv5 option for at least two of the three depression cases studied, especially at the higher threshold levels. Considering the importance of utilising improved flow-dependent correlation structures for efficient data assimilation, the need for more studies on the impact of background error covariances is obvious.

  15. The scientific challenges to forecasting and nowcasting the solar origins of space weather (Invited)

    NASA Astrophysics Data System (ADS)

    Schrijver, C. J.; Title, A. M.

    2013-12-01

    With the full-sphere continuous coverage of the Sun achieved by combining SDO and STEREO imagery comes the realization that solar activity is a manifestation of local processes that respond to long-range if not global influences. Numerical experiments provide insights into these couplings, as well as into the intricacies of destabilizations of field emerging into pre-existing configurations and evolving within the context of their dynamic surroundings. With these capabilities grows an understanding of the difficulties in forecasting of the solar origins of space weather: we need assimilative global non-potential field models, but our observational resources are too limited to meet that need.

  16. Global analysis of seasonal streamflow predictability using an ensemble prediction system and observations from 6192 small catchments worldwide

    NASA Astrophysics Data System (ADS)

    van Dijk, Albert I. J. M.; Peña-Arancibia, Jorge L.; Wood, Eric F.; Sheffield, Justin; Beck, Hylke E.

    2013-05-01

    Ideally, a seasonal streamflow forecasting system would ingest skilful climate forecasts and propagate these through calibrated hydrological models initialized with observed catchment conditions. At global scale, practical problems exist in each of these aspects. For the first time, we analyzed theoretical and actual skill in bimonthly streamflow forecasts from a global ensemble streamflow prediction (ESP) system. Forecasts were generated six times per year for 1979-2008 by an initialized hydrological model and an ensemble of 1° resolution daily climate estimates for the preceding 30 years. A post-ESP conditional sampling method was applied to 2.6% of forecasts, based on predictive relationships between precipitation and 1 of 21 climate indices prior to the forecast date. Theoretical skill was assessed against a reference run with historic forcing. Actual skill was assessed against streamflow records for 6192 small (<10,000 km2) catchments worldwide. The results show that initial catchment conditions provide the main source of skill. Post-ESP sampling enhanced skill in equatorial South America and Southeast Asia, particularly in terms of tercile probability skill, due to the persistence and influence of the El Niño Southern Oscillation. Actual skill was on average 54% of theoretical skill but considerably more for selected regions and times of year. The realized fraction of the theoretical skill probably depended primarily on the quality of precipitation estimates. Forecast skill could be predicted as the product of theoretical skill and historic model performance. Increases in seasonal forecast skill are likely to require improvement in the observation of precipitation and initial hydrological conditions.

  17. The Impact of Implementing a Demand Forecasting System into a Low-Income Country’s Supply Chain

    PubMed Central

    Mueller, Leslie E.; Haidari, Leila A.; Wateska, Angela R.; Phillips, Roslyn J.; Schmitz, Michelle M.; Connor, Diana L.; Norman, Bryan A.; Brown, Shawn T.; Welling, Joel S.; Lee, Bruce Y.

    2016-01-01

OBJECTIVE To evaluate the potential impact and value of applications (e.g., ordering levels, storage capacity, transportation capacity, distribution frequency) of data from demand forecasting systems implemented in a lower-income country's vaccine supply chain with different levels of population change to urban areas. MATERIALS AND METHODS Using our software, HERMES, we generated a detailed discrete event simulation model of Niger's entire vaccine supply chain, including every refrigerator, freezer, transport, personnel, vaccine, cost, and location. We represented the introduction of a demand forecasting system to adjust vaccine ordering that could be implemented with increasing delivery frequencies and/or additions of cold chain equipment (storage and/or transportation) across the supply chain during varying degrees of population movement. RESULTS Implementing a demand forecasting system with increased storage and transport frequency increased the number of successfully administered vaccine doses and lowered the logistics cost per dose by up to 34%. Implementing a demand forecasting system without storage/transport increases actually decreased vaccine availability in certain circumstances. DISCUSSION The potential maximum gains of a demand forecasting system may only be realized if the system is implemented so as to also augment supply chain cold storage and transportation. Implementation alone may have some impact but, in certain circumstances, may hurt delivery. Therefore, implementation of demand forecasting systems with additional storage and transport may be the better approach. Significant decreases in the logistics cost per dose with more administered vaccines support investment in these forecasting systems. CONCLUSION Demand forecasting systems have the potential to greatly improve vaccine demand fulfillment and decrease logistics cost per dose when implemented together with storage and transportation increases. Simulation modeling can demonstrate the potential health and economic benefits of supply chain improvements. PMID:27219341

  18. The impact of implementing a demand forecasting system into a low-income country's supply chain.

    PubMed

    Mueller, Leslie E; Haidari, Leila A; Wateska, Angela R; Phillips, Roslyn J; Schmitz, Michelle M; Connor, Diana L; Norman, Bryan A; Brown, Shawn T; Welling, Joel S; Lee, Bruce Y

    2016-07-12

To evaluate the potential impact and value of applications (e.g. adjusting ordering levels, storage capacity, transportation capacity, distribution frequency) of data from demand forecasting systems implemented in a lower-income country's vaccine supply chain with different levels of population change to urban areas. Using our software, HERMES, we generated a detailed discrete event simulation model of Niger's entire vaccine supply chain, including every refrigerator, freezer, transport, personnel, vaccine, cost, and location. We represented the introduction of a demand forecasting system to adjust vaccine ordering that could be implemented with increasing delivery frequencies and/or additions of cold chain equipment (storage and/or transportation) across the supply chain during varying degrees of population movement. Implementing a demand forecasting system with increased storage and transport frequency increased the number of successfully administered vaccine doses and lowered the logistics cost per dose by up to 34%. Implementing a demand forecasting system without storage/transport increases actually decreased vaccine availability in certain circumstances. The potential maximum gains of a demand forecasting system may only be realized if the system is implemented so as to also augment supply chain cold storage and transportation. Implementation alone may have some impact but, in certain circumstances, may hurt delivery. Therefore, implementation of demand forecasting systems with additional storage and transport may be the better approach. Significant decreases in the logistics cost per dose with more administered vaccines support investment in these forecasting systems. Demand forecasting systems have the potential to greatly improve vaccine demand fulfilment and decrease logistics cost per dose when implemented together with storage and transportation increases. Simulation modeling can demonstrate the potential health and economic benefits of supply chain improvements. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Potential use of multiple surveillance data in the forecast of hospital admissions

    PubMed Central

    Lau, Eric H.Y.; Ip, Dennis K.M.; Cowling, Benjamin J.

    2013-01-01

    Objective This paper describes the potential use of multiple influenza surveillance data to forecast hospital admissions for respiratory diseases. Introduction A sudden surge in hospital admissions in public hospital during influenza peak season has been a challenge to healthcare and manpower planning. In Hong Kong, the timing of influenza peak seasons are variable and early short-term indication of possible surge may facilitate preparedness which could be translated into strategies such as early discharge or reallocation of extra hospital beds. In this study we explore the potential use of multiple routinely collected syndromic data in the forecast of hospital admissions. Methods A multivariate dynamic linear time series model was fitted to multiple syndromic data including influenza-like illness (ILI) rates among networks of public and private general practitioners (GP), and school absenteeism rates, plus drop-in fever count data from designated flu clinics (DFC) that were created during the pandemic. The latent process derived from the model has been used as a measure of the influenza activity [1]. We compare the cross-correlations between estimated influenza level based on multiple surveillance data and GP ILI data, versus accident and emergency hospital admissions with principal diagnoses of respiratory diseases and pneumonia & influenza (P&I). Results The estimated influenza activity has higher cross-correlation with respiratory and P&I admissions (ρ=0.66 and 0.73 respectively) compared to that of GP ILI rates (Table 1). Cross correlations drop distinctly after lag 2 for both estimated influenza activity and GP ILI rates. Conclusions The use of a multivariate method to integrate information from multiple sources of influenza surveillance data may have the potential to improve forecasting of admission surge of respiratory diseases.
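
    The sketch below shows how lagged cross-correlations of the kind reported above can be computed; the synthetic influenza-activity and admissions series, and the two-week delay built into them, are assumptions for illustration.

```python
# A minimal sketch (synthetic weekly series) of lagged cross-correlations between
# an estimated influenza activity series and hospital respiratory admissions.
import numpy as np

def lagged_cross_correlation(x, y, max_lag=4):
    """Correlation of x at week t with y at week t+lag, for lag = 0..max_lag."""
    out = {}
    for lag in range(max_lag + 1):
        xs = x[:len(x) - lag] if lag else x
        ys = y[lag:]
        out[lag] = np.corrcoef(xs, ys)[0, 1]
    return out

rng = np.random.default_rng(10)
flu_activity = np.convolve(rng.gamma(2.0, 1.0, 160), np.ones(4) / 4, mode="same")
admissions = 200 + 80 * np.roll(flu_activity, 2) + rng.normal(0, 10, 160)  # 2-week delay

print(lagged_cross_correlation(flu_activity, admissions))
```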

  20. An Ensemble System Based on Hybrid EGARCH-ANN with Different Distributional Assumptions to Predict S&P 500 Intraday Volatility

    NASA Astrophysics Data System (ADS)

    Lahmiri, S.; Boukadoum, M.

    2015-10-01

Accurate forecasting of stock market volatility is an important issue in portfolio risk management. In this paper, an ensemble system for forecasting stock market volatility is presented. It is composed of three different models that hybridize the exponential generalized autoregressive conditional heteroscedasticity (EGARCH) process with an artificial neural network trained with the backpropagation algorithm (BPNN), forecasting stock market volatility under normal, Student-t, and generalized error distribution (GED) assumptions, respectively. The goal is to design an ensemble system in which each single hybrid model can capture normality, excess skewness, or excess kurtosis in the data, so that the models complement one another. The performance of each EGARCH-BPNN model and of the ensemble system is evaluated by the closeness of the volatility forecasts to realized volatility. Based on the mean absolute error and the mean squared error, the experimental results show that the proposed ensemble model, which captures normality, skewness, and kurtosis in the data, is more accurate than the individual EGARCH-BPNN models in forecasting S&P 500 intraday volatility at one- and five-minute horizons.
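
    The sketch below fits the three EGARCH components under normal, Student-t and GED innovations with the arch package on a simulated return series; the backpropagation neural-network stage of the hybrid models and the S&P 500 intraday data are not represented.

```python
# A minimal sketch (simulated returns, not S&P 500 intraday data) of fitting
# EGARCH(1,1,1) under normal, Student-t and GED innovations with the arch package;
# the neural-network stage of the hybrid models is not shown.
import numpy as np
from arch import arch_model

rng = np.random.default_rng(7)
returns = rng.standard_t(df=5, size=2000) * 0.5      # placeholder return series (%)

for dist in ("normal", "t", "ged"):
    res = arch_model(returns, vol="EGARCH", p=1, o=1, q=1, dist=dist).fit(disp="off")
    sigma_forecast = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
    print(f"{dist:>6}: next-step volatility forecast = {sigma_forecast:.3f}")
```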

  1. A probabilistic approach to the drag-based model

    NASA Astrophysics Data System (ADS)

    Napoletano, Gianluca; Forte, Roberta; Moro, Dario Del; Pietropaolo, Ermanno; Giovannelli, Luca; Berrilli, Francesco

    2018-02-01

The forecast of the time of arrival (ToA) of a coronal mass ejection (CME) at Earth is of critical importance for our high-technology society and for any future manned exploration of the Solar System. As critical as the forecast accuracy is the knowledge of its precision, i.e. the error associated with the estimate. We propose a statistical approach to the computation of the ToA using the drag-based model, introducing probability distributions, rather than exact values, as input parameters, thus allowing the evaluation of the uncertainty on the forecast. We test this approach using a set of CMEs whose transit times are known and obtain extremely promising results: the average of the absolute differences between measured and forecast arrival times is 9.1 h, and half of these residuals are within the estimated errors. These results suggest that this approach deserves further investigation. We are working on a real-time implementation which ingests the outputs of automated CME tracking algorithms as inputs, to create a database of events useful for further validation of the approach.
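
    A minimal sketch of the probabilistic drag-based idea is given below: the input parameters are sampled from assumed distributions, the drag equation of motion is integrated numerically, and the spread of arrival times at 1 AU provides the ToA uncertainty. The parameter distributions and starting distance are illustrative assumptions, not the values used by the authors.

```python
# A minimal sketch (assumed parameter distributions) of a probabilistic
# drag-based model: initial speed, drag parameter and solar-wind speed are
# sampled, the equation of motion dv/dt = -gamma (v - w)|v - w| is integrated,
# and the spread of arrival times at 1 AU gives the ToA uncertainty.
import numpy as np
from scipy.integrate import solve_ivp

AU_KM = 1.496e8
R0_KM = 20 * 6.957e5                   # assumed starting distance: 20 solar radii

def arrival_time_hours(v0, gamma, w):
    def rhs(t, y):                     # y = [r, v], km and km/s; gamma in 1/km
        r, v = y
        return [v, -gamma * (v - w) * abs(v - w)]
    hit = lambda t, y: y[0] - AU_KM    # event: heliocentric distance reaches 1 AU
    hit.terminal, hit.direction = True, 1
    sol = solve_ivp(rhs, (0, 2e6), [R0_KM, v0], events=hit, max_step=3600)
    return sol.t_events[0][0] / 3600.0

rng = np.random.default_rng(8)
n = 500
v0 = rng.normal(1000.0, 100.0, n)              # initial CME speed (km/s)
gamma = rng.normal(2e-7, 5e-8, n).clip(1e-8)   # drag parameter (1/km)
w = rng.normal(400.0, 50.0, n)                 # ambient solar-wind speed (km/s)

toa = np.array([arrival_time_hours(*p) for p in zip(v0, gamma, w)])
print(f"ToA at 1 AU: {toa.mean():.1f} h +/- {toa.std():.1f} h")
```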

  2. Dynamical Downscaling of Seasonal Climate Prediction over Nordeste Brazil with ECHAM3 and NCEP's Regional Spectral Models at IRI.

    NASA Astrophysics Data System (ADS)

    Nobre, Paulo; Moura, Antonio D.; Sun, Liqiang

    2001-12-01

This study presents an evaluation of a seasonal climate forecast made with the International Research Institute for Climate Prediction (IRI) dynamical forecast system (a regional model nested into a general circulation model) over northern South America for January-April 1999, encompassing the rainy season over Brazil's Nordeste. The one-way nesting is done in two tiers: first the NCEP Regional Spectral Model (RSM) runs on an 80-km grid mesh forced by the ECHAM3 atmospheric general circulation model (AGCM) outputs; then the RSM runs on a finer grid mesh (20 km) forced by the forecasts generated by the RSM-80. An ensemble of three realizations is produced. Lower boundary conditions over the oceans for both the ECHAM and RSM model runs are sea surface temperature forecasts over the tropical oceans, and soil moisture is initialized from ECHAM's inputs. The rainfall forecasts generated by the regional model are compared with those of the AGCM and with observations. It is shown that the regional model at 80-km resolution improves upon the AGCM rainfall forecast, reducing both the seasonal bias and the root-mean-square error. On the other hand, the RSM-20 forecasts presented larger errors, with spatial patterns that resemble those of the local topography. The better forecast of the position and width of the intertropical convergence zone (ITCZ) over the tropical Atlantic is one of the principal reasons for the better forecast scores of the RSM-80 relative to the AGCM. The regional model improved the spatial as well as the temporal details of the rainfall distribution and also presented the minimum spread among the ensemble members. The statistics of synoptic-scale weather variability on seasonal timescales were best forecast by the regional 80-km model over the Nordeste. The possibility of forecasting the frequency distribution of dry and wet spells within the rainy season is encouraging.

  3. Ensemble Downscaling of Winter Seasonal Forecasts: The MRED Project

    NASA Astrophysics Data System (ADS)

    Arritt, R. W.; Mred Team

    2010-12-01

    The Multi-Regional climate model Ensemble Downscaling (MRED) project is a multi-institutional project that is producing large ensembles of downscaled winter seasonal forecasts from coupled atmosphere-ocean seasonal prediction models. Eight regional climate models each are downscaling 15-member ensembles from the National Centers for Environmental Prediction (NCEP) Climate Forecast System (CFS) and the new NASA seasonal forecast system based on the GEOS5 atmospheric model coupled with the MOM4 ocean model. This produces 240-member ensembles, i.e., 8 regional models x 15 global ensemble members x 2 global models, for each winter season (December-April) of 1982-2003. Results to date show that combined global-regional downscaled forecasts have greatest skill for seasonal precipitation anomalies during strong El Niño events such as 1982-83 and 1997-98. Ensemble means of area-averaged seasonal precipitation for the regional models generally track the corresponding results for the global model, though there is considerable inter-model variability amongst the regional models. For seasons and regions where area mean precipitation is accurately simulated the regional models bring added value by extracting greater spatial detail from the global forecasts, mainly due to better resolution of terrain in the regional models. Our results also emphasize that an ensemble approach is essential to realizing the added value from the combined global-regional modeling system.

  4. Research and Application of an Air Quality Early Warning System Based on a Modified Least Squares Support Vector Machine and a Cloud Model.

    PubMed

    Wang, Jianzhou; Niu, Tong; Wang, Rui

    2017-03-02

The worsening atmospheric pollution increases the necessity of air quality early warning systems (EWSs). Although a massive amount of work on EWSs, in both theory and practice, has been conducted by numerous researchers, studies concerning the quantification of uncertain information and comprehensive evaluation are still lacking, which impedes further development in the area. In this paper, a comprehensive warning system is proposed, consisting of two indispensable modules: effective forecasting and scientific evaluation. For the forecasting module, a novel hybrid model combining data preprocessing and numerical optimization is developed to forecast air pollutant concentrations effectively. In particular, to further enhance the accuracy and robustness of the warning system, interval forecasting is implemented to quantify the uncertainties associated with the point forecasts, which provides significant risk signals for decision-makers. For the evaluation module, a cloud model based on probability and fuzzy set theory is developed to perform comprehensive evaluations of air quality, realizing the transformation between qualitative concepts and quantitative data. To verify the effectiveness and efficiency of the warning system, extensive simulations based on air pollutant data from Dalian, China, were carried out; they illustrate that the warning system is not only high-performing but also widely applicable.

  5. Research and Application of an Air Quality Early Warning System Based on a Modified Least Squares Support Vector Machine and a Cloud Model

    PubMed Central

    Wang, Jianzhou; Niu, Tong; Wang, Rui

    2017-01-01

The worsening atmospheric pollution increases the necessity of air quality early warning systems (EWSs). Although a massive amount of work on EWSs, in both theory and practice, has been conducted by numerous researchers, studies concerning the quantification of uncertain information and comprehensive evaluation are still lacking, which impedes further development in the area. In this paper, a comprehensive warning system is proposed, consisting of two indispensable modules: effective forecasting and scientific evaluation. For the forecasting module, a novel hybrid model combining data preprocessing and numerical optimization is developed to forecast air pollutant concentrations effectively. In particular, to further enhance the accuracy and robustness of the warning system, interval forecasting is implemented to quantify the uncertainties associated with the point forecasts, which provides significant risk signals for decision-makers. For the evaluation module, a cloud model based on probability and fuzzy set theory is developed to perform comprehensive evaluations of air quality, realizing the transformation between qualitative concepts and quantitative data. To verify the effectiveness and efficiency of the warning system, extensive simulations based on air pollutant data from Dalian, China, were carried out; they illustrate that the warning system is not only high-performing but also widely applicable. PMID:28257122

  6. EDgE multi-model hydro-meteorological seasonal hindcast experiments over Europe

    NASA Astrophysics Data System (ADS)

    Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Rakovec, Oldrich; Wood, Eric; Sheffield, Justin; Pan, Ming; Wanders, Niko; Prudhomme, Christel

    2017-04-01

Extreme hydrometeorological events (e.g., floods, droughts and heat waves) caused serious damage to society and infrastructure over Europe during the past decades. Developing a seamless and skillful operational seasonal forecasting system for these extreme events is therefore a key tool for short-term decision making at local and regional scales. The EDgE project, funded by the Copernicus programme (C3S), provides a unique opportunity to investigate the skill of a newly created large multi-model hydro-meteorological ensemble for predicting extreme events over the Pan-EU domain at a higher resolution of 5×5 km2. Two state-of-the-art seasonal prediction systems were chosen for this project: two models from the North American Multi-Model Ensemble (NMME) with 22 realizations, and two models provided by the ECMWF with 30 realizations. All models provide daily forcings (P, Ta, Tmin, Tmax) over the Pan-EU domain at 1° resolution. Downscaling has been carried out with the MTCLIM algorithm (Bohn et al. 2013) and external drift kriging using elevation as the drift to induce orographic effects. In this project, four high-resolution seamless hydrologic simulations with mHM (www.ufz.de/mhm), Noah-MP, VIC and PCR-GLOBWB have been completed for the common hindcast period 1993-2012, resulting in an ensemble size of 208 realizations. Key indicators focus on six terrestrial Essential Climate Variables (tECVs): river runoff, soil moisture, groundwater recharge, precipitation, potential evapotranspiration, and snow water equivalent. Impact indicators have been co-designed with stakeholders in Norway (hydro-power), the UK (water supply), and Spain (river basin authority) to provide improved information for decision making. The indicators encompass diverse information such as the occurrence of high and low streamflow percentiles (floods and hydrological droughts) and low percentiles of topsoil moisture (agricultural droughts), among others. Preliminary results evaluated at study sites in Norway, Spain, and the UK indicate that extreme events such as the 2003 European drought can be forecast consistently by all models at short lead times of one to two months. At a six-month lead time, the 208 model realizations show little skill in forecasting extreme events. The predictability of extreme events is not uniformly distributed across Europe; for example, Northern Europe exhibits higher predictability due to the persistence induced by cold processes (e.g., snow). In general, the major source of poor forecasting skill is the limited skill of the precipitation forecasts. References: http://climate.copernicus.eu/edge-end-end-demonstrator-improved-decision-making-water-sector-europe; Bohn, T. J., B. Livneh, J. W. Oyler, S. W. Running, B. Nijssen, and D. P. Lettenmaier, 2013: Global evaluation of MTCLIM and related algorithms for forcing of ecological and hydrological models, Agricultural and Forest Meteorology, 176, 38-49; Samaniego, L., R. Kumar, and S. Attinger, 2010: Multiscale parameter regionalization of a grid-based hydrologic model at the mesoscale, Water Resources Research, 46, W05523, doi:10.1029/2008WR007327; Thober, S., R. Kumar, J. Sheffield, J. Mai, D. Schaefer, and L. Samaniego, 2015: Seasonal soil moisture drought prediction over Europe using the North American Multi-Model Ensemble (NMME), J. Hydrometeor., 16, 2329-2344.

  7. Multivariate Cryptography Based on Clipped Hopfield Neural Network.

    PubMed

    Wang, Jia; Cheng, Lee-Ming; Su, Tong

    2018-02-01

Designing secure and efficient multivariate public key cryptosystems [multivariate cryptography (MVC)] to strengthen the security of RSA and ECC in conventional and quantum computational environments continues to be a challenging research topic. In this paper, we describe multivariate public key cryptosystems based on an extended Clipped Hopfield Neural Network (CHNN) and implement them using the MVC (CHNN-MVC) framework operated in space. The Diffie-Hellman key exchange algorithm is extended to the matrix field, which illustrates the feasibility of its new applications in both classical and post-quantum cryptography. The efficiency and security of the proposed public key cryptosystem, CHNN-MVC, are evaluated by simulation, and the underlying problem is found to be NP-hard. The proposed algorithm strengthens multivariate public key cryptosystems and lends itself to practical hardware realization.

  8. Thermal Habitat Index of Many Northwest Atlantic Temperate Species Stays Neutral under Warming Projected for 2030 but Changes Radically by 2060

    PubMed Central

    Shackell, Nancy L.; Ricard, Daniel; Stortini, Christine

    2014-01-01

    Global scale forecasts of range shifts in response to global warming have provided vital insight into predicted species redistribution. We build on that insight by examining whether local warming will affect habitat on spatiotemporal scales relevant to regional agencies. We used generalized additive models to quantify the realized habitat of 46 temperate/boreal marine species using 41+ years of survey data from 35°N–48°N in the Northwest Atlantic. We then estimated change in a “realized thermal habitat index” under short-term (2030) and long-term (2060) warming scenarios. Under the 2030 scenario, ∼10% of species will lose realized thermal habitat at the national scale (USA and Canada) but planktivores are expected to lose significantly in both countries which may result in indirect changes in their predators’ distribution. In contrast, by 2060 in Canada, the realized habitat of 76% of species will change (55% will lose, 21% will gain) while in the USA, the realized habitat of 85% of species will change (65% will lose, 20% will gain). If all else were held constant, the ecosystem is projected to change radically based on thermal habitat alone. The magnitude of the 2060 warming projection (∼1.5–3°C) was observed in 2012 affirming that research is needed on effects of extreme “weather” in addition to increasing mean temperature. Our approach can be used to aggregate at smaller spatial scales where temperate/boreal species are hypothesized to have a greater loss at ∼40°N. The uncertainty associated with climate change forecasts is large, yet resource management agencies still have to address climate change. How? Since many fishery agencies do not plan beyond 5 years, a logical way forward is to incorporate a “realized thermal habitat index” into the stock assessment process. Over time, decisions would be influenced by the amount of suitable thermal habitat, in concert with gradual or extreme warming. PMID:24599187

  9. Forecast of dengue incidence using temperature and rainfall.

    PubMed

    Hii, Yien Ling; Zhu, Huaiping; Ng, Nawi; Ng, Lee Ching; Rocklöv, Joacim

    2012-01-01

An accurate early warning system to predict impending epidemics enhances the effectiveness of preventive measures against dengue fever. The aim of this study was to develop and validate a forecasting model that could predict dengue cases and provide timely early warning in Singapore. We developed a time series Poisson multivariate regression model using weekly mean temperature and cumulative rainfall over the period 2000-2010. Weather data were modeled using piecewise linear spline functions. We analyzed various lag times between dengue and weather variables to identify the optimal dengue forecasting period. Autoregression, seasonality and trend were considered in the model. We validated the model by forecasting dengue cases for week 1 of 2011 up to week 16 of 2012 using weather data alone. Model selection and validation were based on Akaike's Information Criterion, the standardized root mean square error, and residual diagnostics. A receiver operating characteristic curve was used to analyze the sensitivity of the epidemic forecasts. The optimal period for dengue forecasting was 16 weeks. Our model forecasted correctly with errors of 0.3 and 0.32 of the standard deviation of reported cases during the model training and validation periods, respectively. It was sensitive enough to distinguish between outbreak and non-outbreak periods, with a sensitivity of 96% (CI = 93-98%) in 2004-2010 and 98% (CI = 95-100%) in 2011. The model predicted the 2011 outbreak accurately, with less than a 3% probability of a false alarm. We have developed a weather-based dengue forecasting model that allows warning 16 weeks in advance of dengue epidemics with high sensitivity and specificity. We demonstrate that models using temperature and rainfall can be simple, precise, and low-cost tools for dengue forecasting, which could be used to enhance decision making on the timing and scale of vector control operations and the utilization of limited resources.
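
    The sketch below shows a stripped-down version of such a model: a Poisson regression of weekly dengue counts on temperature and rainfall at a fixed 16-week lag, fitted with statsmodels on synthetic data. The spline terms, autoregression and seasonality of the published model are omitted, and all data are simulated.

```python
# A minimal sketch (synthetic weekly data, a single fixed lag) of a Poisson
# regression of dengue counts on lagged weekly mean temperature and cumulative
# rainfall; the published model's splines, autoregression and seasonality are omitted.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(9)
weeks = 520
temp = 27 + 2 * np.sin(2 * np.pi * np.arange(weeks) / 52) + rng.normal(0, 0.5, weeks)
rain = rng.gamma(2.0, 30.0, weeks)

lag = 16                                            # forecast horizon in weeks
mu = np.exp(0.5 + 0.15 * (temp[:-lag] - 27) + 0.002 * rain[:-lag])
cases = rng.poisson(mu * 20)                        # synthetic counts driven by weather

X = sm.add_constant(np.column_stack([temp[:-lag], rain[:-lag]]))
model = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
print(model.params.round(3))

future = sm.add_constant(np.column_stack([temp[-lag:], rain[-lag:]]), has_constant="add")
print("16-week-ahead expected cases:", model.predict(future).round(1)[:4])
```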

  10. Forecasting Twenty-First Century Information Technology Skills: A Delphi Study

    ERIC Educational Resources Information Center

    Young, Jackie A.

    2012-01-01

    As cities and regions seek to increase the stock of college educated citizens in order to compete in the twenty-first century knowledge-economy, colleges and universities are realizing increased enrollment. At the same time, much is being written about the skills needed by graduates for the new economy. These studies articulate skills in critical…

  11. CMB constraints on running non-Gaussianity

    NASA Astrophysics Data System (ADS)

    Oppizzi, F.; Liguori, M.; Renzi, A.; Arroja, F.; Bartolo, N.

    2018-05-01

We develop a complete set of tools for CMB forecasting, simulation and estimation of primordial running bispectra, arising from a variety of curvaton and single-field (DBI) models of Inflation. We validate our pipeline using mock CMB running non-Gaussianity realizations and test it on real data by obtaining experimental constraints on the fNL running spectral index, nNG, using WMAP 9-year data. Our final bounds (68% C.L.) read -0.6 < nNG < 1.4, -0.3 < nNG < 1.2, and -1.1 < nNG < …

  12. Technical Note: Initial assessment of a multi-method approach to spring-flood forecasting in Sweden

    NASA Astrophysics Data System (ADS)

    Olsson, J.; Uvo, C. B.; Foster, K.; Yang, W.

    2016-02-01

Hydropower is a major energy source in Sweden, and proper reservoir management prior to the spring-flood onset is crucial for optimal production. This requires accurate forecasts of the accumulated discharge in the spring-flood period (i.e. the spring-flood volume, SFV). Today's SFV forecasts are generated using a model-based climatological ensemble approach, where time series of precipitation and temperature from historical years are used to force a calibrated and initialized set-up of the HBV model. In this study, a number of new approaches to spring-flood forecasting that reflect the latest developments with respect to analysis and modelling on seasonal timescales are presented and evaluated. Three main approaches, represented by specific methods, are evaluated in SFV hindcasts for the Swedish river Vindelälven over a 10-year period with lead times between 0 and 4 months. In the first approach, historically analogue years with respect to the climate in the period preceding the spring flood are identified and used to compose a reduced ensemble. In the second, seasonal meteorological ensemble forecasts are used to drive the HBV model over the spring-flood period. In the third approach, statistical relationships between SFV and the large-scale atmospheric circulation are used to build forecast models. None of the new approaches consistently outperforms the climatological ensemble approach, but for early forecasts improvements of up to 25 % are found. This potential is reasonably well realized in a multi-method system, which over all forecast dates reduced the error in SFV by ~4 %.

  13. Evaluating the Effectiveness of DART® Buoy Networks Based on Forecast Accuracy

    NASA Astrophysics Data System (ADS)

    Percival, Donald B.; Denbo, Donald W.; Gica, Edison; Huang, Paul Y.; Mofjeld, Harold O.; Spillane, Michael C.; Titov, Vasily V.

    2018-04-01

A performance measure for a DART® tsunami buoy network has been developed. DART® buoys are used to detect tsunamis, but the full potential of the data they collect is realized through accurate forecasts of inundations caused by the tsunamis. The performance measure assesses how well the network achieves its full potential through a statistical analysis of simulated forecasts of wave amplitudes outside an impact site and a consideration of how much the forecasts are degraded in accuracy when one or more buoys are inoperative. The analysis uses simulated tsunami amplitude time series collected at each buoy from selected source segments in the Short-term Inundation Forecast for Tsunamis database and involves a set of 1000 forecasts for each buoy/segment pair at sites just offshore of selected impact communities. Random error-producing scatter in the time series is induced by uncertainties in the source location, addition of real oceanic noise, and imperfect tidal removal. Comparison with an error-free standard leads to root-mean-square errors (RMSEs) for DART® buoys located near a subduction zone. The RMSEs indicate which buoy provides the best forecast (lowest RMSE) for sections of the zone, under a warning-time constraint for the forecasts of 3 h. The analysis also shows how the forecasts are degraded (larger minimum RMSE among the remaining buoys) when one or more buoys become inoperative. The RMSEs provide a way to assess array augmentation or redesign such as moving buoys to more optimal locations. Examples are shown for buoys off the Aleutian Islands and off the West Coast of South America for impact sites at Hilo HI and along the US West Coast (Crescent City CA and Port San Luis CA, USA). A simple measure (coded green, yellow or red) of the current status of the network's ability to deliver accurate forecasts is proposed to flag the urgency of buoy repair.

  15. Evaluation of model-based seasonal streamflow and water allocation forecasts for the Elqui Valley, Chile

    NASA Astrophysics Data System (ADS)

    Delorit, Justin; Cristian Gonzalez Ortuya, Edmundo; Block, Paul

    2017-09-01

    In many semi-arid regions, multisectoral demands often stress available water supplies. Such is the case in the Elqui River valley of northern Chile, which draws on a limited-capacity reservoir to allocate 25 000 water rights. Delayed infrastructure investment forces water managers to address demand-based allocation strategies, particularly in dry years, which are realized through reductions in the volume associated with each water right. Skillful season-ahead streamflow forecasts have the potential to inform managers with an indication of future conditions to guide reservoir allocations. This work evaluates season-ahead statistical prediction models of October-January (growing season) streamflow at multiple lead times associated with manager and user decision points, and links predictions with a reservoir allocation tool. Skillful results (streamflow forecasts outperform climatology) are produced for short lead times (1 September: ranked probability skill score (RPSS) of 0.31, categorical hit skill score of 61 %). At longer lead times, climatological skill exceeds forecast skill due to fewer observations of precipitation. However, coupling the 1 September statistical forecast model with a sea surface temperature phase and strength statistical model allows for equally skillful categorical streamflow forecasts to be produced for a 1 May lead, triggered for 60 % of years (1950-2015), suggesting forecasts need not be strictly deterministic to be useful for water rights holders. An early (1 May) categorical indication of expected conditions is reinforced with a deterministic forecast (1 September) as more observations of local variables become available. The reservoir allocation model is skillful at the 1 September lead (categorical hit skill score of 53 %); skill improves to 79 % when categorical allocation prediction certainty exceeds 80 %. This result implies that allocation efficiency may improve when forecasts are integrated into reservoir decision frameworks. The methods applied here advance the understanding of the mechanisms and timing responsible for moisture transport to the Elqui Valley and provide a unique application of streamflow forecasting in the prediction of water right allocations.
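
    The ranked probability skill score reported above compares categorical forecasts against climatology. A minimal sketch of the computation, using made-up tercile probabilities rather than the study's forecasts, could be:

      # Minimal sketch: ranked probability skill score (RPSS) of tercile-category
      # streamflow forecasts against a climatological reference.
      import numpy as np

      def rps(prob, obs_cat):
          """Ranked probability score: prob over K ordered categories,
          obs_cat is the index of the observed category."""
          obs = np.zeros_like(prob)
          obs[obs_cat] = 1.0
          return np.sum((np.cumsum(prob) - np.cumsum(obs)) ** 2)

      # hypothetical verification set: forecast tercile probabilities and observed terciles
      fcst_probs = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3], [0.1, 0.3, 0.6], [0.5, 0.4, 0.1]])
      obs_cats = np.array([0, 1, 2, 1])
      clim = np.array([1 / 3, 1 / 3, 1 / 3])           # climatology assigns equal odds

      rps_fcst = np.mean([rps(p, o) for p, o in zip(fcst_probs, obs_cats)])
      rps_clim = np.mean([rps(clim, o) for o in obs_cats])
      rpss = 1.0 - rps_fcst / rps_clim
      print(f"RPSS = {rpss:.2f}")                      # > 0 means skill over climatology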

  16. Semi-nonparametric VaR forecasts for hedge funds during the recent crisis

    NASA Astrophysics Data System (ADS)

    Del Brio, Esther B.; Mora-Valencia, Andrés; Perote, Javier

    2014-05-01

    The need to provide accurate value-at-risk (VaR) forecasting measures has triggered an important literature in econophysics. Although these accurate VaR models and methodologies are particularly demanded for hedge fund managers, there exist few articles specifically devoted to implementing new techniques in hedge fund returns VaR forecasting. This article addresses these issues by comparing the performance of risk measures based on parametric distributions (the normal, Student’s t and skewed-t), semi-nonparametric (SNP) methodologies based on Gram-Charlier (GC) series and the extreme value theory (EVT) approach. Our results show that normal-, Student’s t- and skewed t-based methodologies fail to forecast hedge fund VaR, whilst SNP and EVT approaches succeed in doing so accurately. We extend these results to the multivariate framework by providing an explicit formula for the GC copula and its density that encompasses the Gaussian copula and accounts for non-linear dependences. We show that the VaR obtained by the meta GC accurately captures portfolio risk and outperforms regulatory VaR estimates obtained through the meta Gaussian and Student’s t distributions.
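
    A minimal sketch of the kind of comparison and backtest described above, using simulated returns and only the normal and Student-t estimators (the Gram-Charlier/SNP and EVT estimators are not reproduced here), could be:

      # Minimal sketch with simulated fat-tailed returns: one-day 99% VaR under
      # normal and Student-t assumptions, backtested by counting exceedances.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      returns = stats.t.rvs(df=4, scale=0.01, size=2000, random_state=rng)

      alpha = 0.01
      mu, sigma = returns.mean(), returns.std(ddof=1)
      var_normal = -(mu + sigma * stats.norm.ppf(alpha))

      df, loc, scale = stats.t.fit(returns)
      var_t = -stats.t.ppf(alpha, df, loc, scale)

      for name, var in [("normal", var_normal), ("Student-t", var_t)]:
          exceed = np.mean(returns < -var)             # share of days worse than -VaR
          print(f"{name:9s} VaR(99%) = {var:.4f}, exceedance rate = {exceed:.3%} (target 1%)")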

  17. Ensemble-sensitivity Analysis Based Observation Targeting for Mesoscale Convection Forecasts and Factors Influencing Observation-Impact Prediction

    NASA Astrophysics Data System (ADS)

    Hill, A.; Weiss, C.; Ancell, B. C.

    2017-12-01

    The basic premise of observation targeting is that additional observations, when gathered and assimilated with a numerical weather prediction (NWP) model, will produce a more accurate forecast related to a specific phenomenon. Ensemble-sensitivity analysis (ESA; Ancell and Hakim 2007; Torn and Hakim 2008) is a tool capable of accurately estimating the proper location of targeted observations in areas that have initial model uncertainty and large error growth, as well as predicting the reduction of forecast variance due to the assimilated observation. ESA relates an ensemble of NWP model forecasts, specifically an ensemble of scalar forecast metrics, linearly to earlier model states. A thorough investigation is presented to determine how different factors of the forecast process are impacting our ability to successfully target new observations for mesoscale convection forecasts. Our primary goals for this work are to determine: (1) If targeted observations hold more positive impact than non-targeted (i.e. randomly chosen) observations; (2) If there are lead-time constraints to targeting for convection; (3) How inflation, localization, and the assimilation filter influence impact prediction and realized results; (4) If there exist differences between targeted observations at the surface versus aloft; and (5) How physics errors and nonlinearity may augment observation impacts. Ten cases of dryline-initiated convection between 2011 and 2013 are simulated within a simplified OSSE framework and presented here. Ensemble simulations are produced from a cycling system that utilizes the Weather Research and Forecasting (WRF) model v3.8.1 within the Data Assimilation Research Testbed (DART). A "truth" (nature) simulation is produced by supplying a 3-km WRF run with GFS analyses and integrating the model forward 90 hours, from the beginning of ensemble initialization through the end of the forecast. Target locations for surface and radiosonde observations are computed 6, 12, and 18 hours into the forecast based on a chosen scalar forecast response metric (e.g., maximum reflectivity at convection initiation). A variety of experiments are designed to achieve the aforementioned goals and will be presented, along with their results, detailing the feasibility of targeting for mesoscale convection forecasts.
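
    The core ESA calculation regresses a scalar forecast metric on earlier ensemble state variables. A minimal sketch with synthetic ensemble data (not the WRF/DART experiments above) could be:

      # Minimal sketch of the ensemble-sensitivity calculation: the sensitivity of a
      # scalar forecast metric J (e.g., maximum reflectivity) to an earlier state
      # variable x at each grid point is estimated as cov(J, x) / var(x) across members.
      import numpy as np

      rng = np.random.default_rng(2)
      n_members, n_points = 50, 1000
      x0 = rng.normal(size=(n_members, n_points))      # hypothetical earlier-state field
      J = 2.0 * x0[:, 100] + rng.normal(scale=0.5, size=n_members)   # metric tied to point 100

      x_anom = x0 - x0.mean(axis=0)
      J_anom = J - J.mean()
      sensitivity = (J_anom @ x_anom) / ((n_members - 1) * x0.var(axis=0, ddof=1))

      # the largest |dJ/dx| marks where a targeted observation should reduce forecast variance most
      print("most sensitive grid point:", int(np.argmax(np.abs(sensitivity))))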

  18. A comparison of ensemble post-processing approaches that preserve correlation structures

    NASA Astrophysics Data System (ADS)

    Schefzik, Roman; Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2016-04-01

    Despite the fact that ensemble forecasts address the major sources of uncertainty, they exhibit biases and dispersion errors and are therefore known to benefit from calibration or statistical post-processing. For instance, the ensemble model output statistics (EMOS) method, also known as the non-homogeneous regression approach (Gneiting et al., 2005), is known to strongly improve forecast skill. EMOS is based on fitting and adjusting a parametric probability density function (PDF). However, EMOS and other common post-processing approaches apply to a single weather quantity at a single location for a single look-ahead time. They are therefore unable to take into account spatial, inter-variable and temporal dependence structures. Recently, many research efforts have been invested both in designing post-processing methods that resolve this drawback and in verification methods that enable the detection of dependence structures. New verification methods are applied to two classes of post-processing methods, both generating physically coherent ensembles. A first class uses the ensemble copula coupling (ECC) that starts from EMOS but adjusts the rank structure (Schefzik et al., 2013). The second class is a member-by-member post-processing (MBM) approach that maps each raw ensemble member to a corrected one (Van Schaeybroeck and Vannitsem, 2015). We compare variants of the EMOS-ECC and MBM classes and highlight a specific theoretical connection between them. All post-processing variants are applied in the context of the ensemble system of the European Centre for Medium-Range Weather Forecasts (ECMWF) and compared using multivariate verification tools including the energy score, the variogram score (Scheuerer and Hamill, 2015) and the band depth rank histogram (Thorarinsdottir et al., 2015). References: Gneiting, Raftery, Westveld, and Goldman, 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Wea. Rev., 133, 1098-1118. Scheuerer and Hamill, 2015: Variogram-based proper scoring rules for probabilistic forecasts of multivariate quantities. Mon. Wea. Rev., 143, 1321-1334. Schefzik, Thorarinsdottir, and Gneiting, 2013: Uncertainty quantification in complex simulation models using ensemble copula coupling. Statistical Science, 28, 616-640. Thorarinsdottir, Scheuerer, and Heinz, 2015: Assessing the calibration of high-dimensional ensemble forecasts using rank histograms. arXiv:1310.0236. Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q. J. R. Meteorol. Soc., 141, 807-818.
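
    The ECC step can be illustrated in a few lines. The sketch below uses a toy per-variable normal "calibrated" distribution in place of a fitted EMOS model and made-up ensemble data; it only demonstrates the reordering idea.

      # Minimal sketch of ensemble copula coupling (ECC): draw a calibrated sample per
      # variable and reorder it to match the rank structure of the raw ensemble, so
      # that spatial/inter-variable dependence is restored.
      import numpy as np

      rng = np.random.default_rng(3)
      raw = rng.normal(size=(20, 3))                   # 20 members, 3 variables (toy)
      raw[:, 1] += 0.8 * raw[:, 0]                     # induce some dependence

      calibrated = np.empty_like(raw)
      for j in range(raw.shape[1]):
          # toy calibrated marginal: bias-corrected, inflated normal fitted per variable
          mu, sigma = raw[:, j].mean() + 0.1, raw[:, j].std(ddof=1) * 1.2
          sample = np.sort(rng.normal(mu, sigma, raw.shape[0]))
          ranks = raw[:, j].argsort().argsort()        # rank of each raw member
          calibrated[:, j] = sample[ranks]             # ECC: copy the raw rank order

      print("raw corr:", np.corrcoef(raw[:, 0], raw[:, 1])[0, 1].round(2),
            "ECC corr:", np.corrcoef(calibrated[:, 0], calibrated[:, 1])[0, 1].round(2))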

  19. Drought forecasting in eastern Australia using multivariate adaptive regression spline, least square support vector machine and M5Tree model

    NASA Astrophysics Data System (ADS)

    Deo, Ravinesh C.; Kisi, Ozgur; Singh, Vijay P.

    2017-02-01

    Drought forecasting using standardized metrics of rainfall is a core task in hydrology and water resources management. The Standardized Precipitation Index (SPI) is a rainfall-based metric that caters for the different time-scales at which drought occurs and, due to its standardization, is well-suited for forecasting drought at different periods in climatically diverse regions. This study advances drought modelling using multivariate adaptive regression splines (MARS), least square support vector machine (LSSVM), and M5Tree models by forecasting SPI in eastern Australia. The MARS model incorporated rainfall as a mandatory predictor, with month (periodicity), Southern Oscillation Index, Pacific Decadal Oscillation Index and Indian Ocean Dipole, ENSO Modoki and Nino 3.0, 3.4 and 4.0 data added gradually. The performance was evaluated with root mean square error (RMSE), mean absolute error (MAE), and coefficient of determination (r2). The best MARS model required different input combinations, where rainfall, sea surface temperature and periodicity were used for all stations, but ENSO Modoki and Pacific Decadal Oscillation indices were not required for Bathurst, Collarenebri and Yamba, and the Southern Oscillation Index was not required for Collarenebri. Inclusion of periodicity increased the r2 value by 0.5-8.1% and reduced RMSE by 3.0-178.5%. Comparisons showed that MARS surpassed the other models for three out of five stations, with lower MAE by 15.0-73.9% and 7.3-42.2%, respectively. For the other stations, M5Tree was better than MARS/LSSVM, with lower MAE by 13.8-13.4% and 25.7-52.2%, respectively, and for Bathurst, LSSVM yielded more accurate results. For droughts identified by SPI ≤ - 0.5, accurate forecasts were attained by MARS/M5Tree for Bathurst, Yamba and Peak Hill, whereas for Collarenebri and Barraba, M5Tree was better than LSSVM/MARS. Seasonal analysis revealed disparate results where MARS/M5Tree was better than LSSVM. The results highlight the importance of periodicity in drought forecasting and also show that model accuracy varies with geographic/seasonal factors, owing to the complexity of drought and its relationship with the inputs and data attributes that can affect the evolution of drought events.
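
    The SPI itself is a standard transform and can be sketched briefly; the following uses synthetic rainfall and a gamma fit (the MARS/LSSVM/M5Tree forecast models are not reproduced):

      # Minimal sketch of a Standardized Precipitation Index (SPI) at a 3-month
      # timescale: aggregate rainfall, fit a gamma distribution, and map cumulative
      # probabilities to standard-normal quantiles.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      monthly_rain = rng.gamma(shape=2.0, scale=40.0, size=360)   # 30 years of made-up rainfall

      scale_months = 3
      agg = np.convolve(monthly_rain, np.ones(scale_months), mode="valid")  # rolling 3-month totals

      a, loc, b = stats.gamma.fit(agg, floc=0)         # fit gamma with location fixed at zero
      cdf = stats.gamma.cdf(agg, a, loc=loc, scale=b)
      spi = stats.norm.ppf(cdf)                        # equiprobability transform to N(0, 1)

      print("fraction of months with SPI <= -0.5 (drought):", np.mean(spi <= -0.5).round(2))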

  20. Performance of the Prognocean Plus system during the El Niño 2015/2016: predictions of sea level anomalies as tools for forecasting El Niño

    NASA Astrophysics Data System (ADS)

    Świerczyńska-Chlaściak, Małgorzata; Niedzielski, Tomasz; Miziński, Bartłomiej

    2017-04-01

    The aim of this paper is to present the performance of the Prognocean Plus system, which produces long-term predictions of sea level anomalies, during the El Niño 2015/2016. The main objective of this work is to identify ocean areas in which long-term forecasts of sea level anomalies during the El Niño 2015/2016 reveal considerable accuracy. At present, the system produces prognoses using four data-based models and their combinations: polynomial-harmonic model, autoregressive model, threshold autoregressive model and multivariate autoregressive model. The system offers weekly forecasts, with lead times up to 12 weeks. Several statistics that describe the efficiency of the available prediction models in the four seasons used for estimating the Oceanic Niño Index (ONI) are calculated. The accuracies/skills of the prediction models were computed at specific locations in the equatorial Pacific, namely the geometrically-determined central points of all Niño regions. For the said locations, we focused on the forecasts which targeted the local maximum of sea level driven by the El Niño 2015/2016. As a result, a series of "spaghetti" graphs (for each point, season and model) as well as plots presenting the prognostic performance of every model - for all lead times, seasons and locations - were created. It is found that the Prognocean Plus system has the potential to become a new solution which may enhance the diagnostic discussions on the El Niño development. The forecasts produced by the threshold autoregressive model, for lead times of 5-6 weeks and 9 weeks, within the Niño1+2 region for the November-to-January (NDJ) season anticipated the culmination of the El Niño 2015/2016. The longest forecasts (8-12 weeks) were found to be the most accurate in the phase of transition from El Niño to normal conditions (the multivariate autoregressive model, central point of Niño1+2 region, the December-to-February season). The study was conducted to verify the ability and usefulness of sea level anomaly forecasts in predicting phenomena that are controlled by ocean-atmosphere processes, such as the El Niño Southern Oscillation or the North Atlantic Oscillation. The results may support further investigations into long-term forecasting of the quantitative indices of these oscillations, solely based on prognoses of sea level change. In particular, comparing the accuracies of prognoses of the North Atlantic Oscillation index remains one of the tasks of the research project no. 2016/21/N/ST10/03231, financed by the National Science Center of Poland.

  1. Ecological Forecasting in Chesapeake Bay: Using a Mechanistic-Empirical Modelling Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, C. W.; Hood, Raleigh R.; Long, Wen

    The Chesapeake Bay Ecological Prediction System (CBEPS) automatically generates daily nowcasts and three-day forecasts of several environmental variables, such as sea-surface temperature and salinity, the concentrations of chlorophyll, nitrate, and dissolved oxygen, and the likelihood of encountering several noxious species, including harmful algal blooms and water-borne pathogens, for the purpose of monitoring the Bay's ecosystem. While the physical and biogeochemical variables are forecast mechanistically using the Regional Ocean Modeling System configured for the Chesapeake Bay, the species predictions are generated using a novel mechanistic empirical approach, whereby real-time output from the coupled physical biogeochemical model drives multivariate empirical habitat models of the target species. The predictions, in the form of digital images, are available via the World Wide Web to interested groups to guide recreational, management, and research activities. Though full validation of the integrated forecasts for all species is still a work in progress, we argue that the mechanistic–empirical approach can be used to generate a wide variety of short-term ecological forecasts, and that it can be applied in any marine system where sufficient data exist to develop empirical habitat models. This paper provides an overview of this system, its predictions, and the approach taken.

  2. Spatio-Temporal Change Modeling of Lulc: a Semantic Kriging Approach

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, S.; Ghosh, S. K.

    2015-07-01

    Spatio-temporal land-use/land-cover (LULC) change modeling is important to forecast the future LULC distribution, which may facilitate natural resource management, urban planning, etc. The spatio-temporal change in LULC trend often exhibits non-linear behavior, due to various dynamic factors, such as human intervention (e.g., urbanization), environmental factors, etc. Hence, proper forecasting of LULC distribution should involve the study and trend modeling of historical data. The existing literature has reported that meteorological attributes (e.g., NDVI, LST, MSI) are semantically related to the terrain. Being influenced by the terrestrial dynamics, the temporal changes of these attributes depend on the LULC properties. Hence, incorporating meteorological knowledge into the temporal prediction process may help in developing an accurate forecasting model. This work attempts to study the change in inter-annual LULC pattern and the distribution of different meteorological attributes of a region in Kolkata (a metropolitan city in India) during the years 2000-2010 and to forecast the future spread of LULC using the semantic kriging (SemK) approach. A new variant of time-series SemK is proposed, namely Rev-SemKts, to capture the multivariate semantic associations between different attributes. From empirical analysis, it may be observed that the augmentation of semantic knowledge in spatio-temporal modeling of meteorological attributes facilitates more precise forecasting of the LULC pattern.

  3. Forecasting asthma-related hospital admissions in London using negative binomial models.

    PubMed

    Soyiri, Ireneous N; Reidpath, Daniel D; Sarran, Christophe

    2013-05-01

    Health forecasting can improve health service provision and individual patient outcomes. Environmental factors are known to impact chronic respiratory conditions such as asthma, but little is known about the extent to which these factors can be used for forecasting. Using weather, air quality and hospital asthma admission data for London (2005-2006), two related negative binomial models were developed and compared with a naive seasonal model. In the first approach, predictive forecasting models were fitted with 7-day averages of each potential predictor, and then a subsequent multivariable model was constructed. In the second strategy, an exhaustive search for the best-fitting models across possible combinations of lags (0-14 days) of all the environmental effects on asthma admissions was conducted. Three models were considered: a base model (seasonal effects), contrasted with a 7-day average model and a selected lags model (weather and air quality effects). Season is the best predictor of asthma admissions. The 7-day average and seasonal models were trivial to implement. The selected lags model was computationally intensive, but of no real value over much more easily implemented models. Seasonal factors can predict daily hospital asthma admissions in London, and there is little evidence that additional weather and air quality information would add to forecast accuracy.
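
    A minimal sketch of a negative binomial count model with a seasonal term and a 7-day-average weather predictor, on simulated data (the variable names and coefficients below are illustrative, not the study's actual covariates), could be:

      # Minimal sketch: negative binomial GLM for daily admission counts with
      # seasonal harmonics and a 7-day mean temperature predictor.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      days = pd.date_range("2005-01-01", periods=730, freq="D")
      doy = days.dayofyear.to_numpy()
      temp7 = 10 + 8 * np.sin(2 * np.pi * doy / 365) + rng.normal(0, 1, len(days))
      mu = np.exp(2.0 + 0.4 * np.cos(2 * np.pi * doy / 365) - 0.02 * temp7)   # seasonal mean
      admissions = rng.negative_binomial(n=5, p=5 / (5 + mu))                 # overdispersed counts

      X = sm.add_constant(np.column_stack([np.sin(2 * np.pi * doy / 365),
                                           np.cos(2 * np.pi * doy / 365), temp7]))
      model = sm.GLM(admissions, X, family=sm.families.NegativeBinomial(alpha=0.2)).fit()
      print(model.params.round(3))                     # fitted coefficients
      print("forecast for last day:", model.predict(X[-1:]).round(2))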

  4. Cardiac catheterization laboratory inpatient forecast tool: a prospective evaluation

    PubMed Central

    Flanagan, Eleni; Siddiqui, Sauleh; Appelbaum, Jeff; Kasper, Edward K; Levin, Scott

    2016-01-01

    Objective To develop and prospectively evaluate a web-based tool that forecasts the daily bed need for admissions from the cardiac catheterization laboratory using routinely available clinical data within electronic medical records (EMRs). Methods The forecast model was derived using a 13-month retrospective cohort of 6384 catheterization patients. Predictor variables such as demographics, scheduled procedures, and clinical indicators mined from free-text notes were input to a multivariable logistic regression model that predicted the probability of inpatient admission. The model was embedded into a web-based application connected to the local EMR system and used to support bed management decisions. After implementation, the tool was prospectively evaluated for accuracy on a 13-month test cohort of 7029 catheterization patients. Results The forecast model predicted admission with an area under the receiver operating characteristic curve of 0.722. Daily aggregate forecasts were accurate to within one bed for 70.3% of days and within three beds for 97.5% of days during the prospective evaluation period. The web-based application housing the forecast model was used by cardiology providers in practice to estimate daily admissions from the catheterization laboratory. Discussion The forecast model identified older age, male gender, invasive procedures, coronary artery bypass grafts, and a history of congestive heart failure as qualities indicating a patient was at increased risk for admission. Diagnostic procedures and less acute clinical indicators decreased patients’ risk of admission. Despite the site-specific limitations of the model, these findings were supported by the literature. Conclusion Data-driven predictive analytics may be used to accurately forecast daily demand for inpatient beds for cardiac catheterization patients. Connecting these analytics to EMR data sources has the potential to provide advanced operational decision support. PMID:26342217
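
    The general idea (not the hospital's actual tool) is that a logistic model scores each scheduled patient with an admission probability, and the day's expected bed need is the sum of those probabilities; a minimal sketch with simulated predictors could be:

      # Minimal sketch: logistic admission-probability model and a daily bed
      # forecast obtained by summing the predicted probabilities.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(6)
      n = 5000
      age = rng.normal(65, 12, n)
      invasive = rng.integers(0, 2, n)                 # 1 = invasive procedure scheduled
      chf_history = rng.integers(0, 2, n)              # 1 = history of congestive heart failure
      logit = -6 + 0.05 * age + 1.2 * invasive + 0.8 * chf_history
      admitted = rng.random(n) < 1 / (1 + np.exp(-logit))

      X = np.column_stack([age, invasive, chf_history])
      clf = LogisticRegression(max_iter=1000).fit(X, admitted)

      todays_schedule = X[:18]                         # pretend 18 patients are on today's schedule
      p_admit = clf.predict_proba(todays_schedule)[:, 1]
      print(f"expected beds needed today: {p_admit.sum():.1f}")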

  5. NMME Monthly / Seasonal Forecasts for NASA SERVIR Applications Science

    NASA Astrophysics Data System (ADS)

    Robertson, F. R.; Roberts, J. B.

    2014-12-01

    This work details use of the North American Multi-Model Ensemble (NMME) experimental forecasts as drivers for Decision Support Systems (DSSs) in the NASA / USAID initiative, SERVIR (a Spanish acronym meaning "to serve"). SERVIR integrates satellite observations, ground-based data and forecast models to monitor and forecast environmental changes and to improve response to natural disasters. Through the use of DSSs whose "front ends" are physically based models, the SERVIR activity provides a natural testbed to determine the extent to which NMME monthly to seasonal projections enable scientists, educators, project managers and policy implementers in developing countries to better use probabilistic outlooks of seasonal hydrologic anomalies in assessing agricultural / food security impacts, water availability, and risk to societal infrastructure. The multi-model NMME framework provides a "best practices" approach to probabilistic forecasting. The NMME forecasts are generated at a resolution coarser than that required to support DSS models; downscaling in both space and time is necessary. The methodology adopted here applies model output statistics (MOS): we use NMME ensemble monthly projections of sea-surface temperature (SST) and precipitation from 30 years of hindcasts together with observations of precipitation and temperature for the target regions. Since raw model forecasts are well-known to have structural biases, a cross-validated multivariate regression methodology (CCA) is used to link the model projected states as predictors to the predictands of the target region. The target regions include a number of basins in East and South Africa as well as the Ganges / Brahmaputra / Meghna basin complex. The MOS approach addresses spatial downscaling. Temporal disaggregation of monthly seasonal forecasts is achieved through use of a tercile bootstrapping approach. We interpret the results of these studies, the levels of skill by several metrics, and key uncertainties.

  7. TaiWan Ionospheric Model (TWIM) prediction based on time series autoregressive analysis

    NASA Astrophysics Data System (ADS)

    Tsai, L. C.; Macalalad, Ernest P.; Liu, C. H.

    2014-10-01

    As described in a previous paper, a three-dimensional ionospheric electron density (Ne) model has been constructed from vertical Ne profiles retrieved from the FormoSat3/Constellation Observing System for Meteorology, Ionosphere, and Climate GPS radio occultation measurements and worldwide ionosonde foF2 and foE data and named the TaiWan Ionospheric Model (TWIM). The TWIM exhibits vertically fitted α-Chapman-type layers with distinct F2, F1, E, and D layers, and surface spherical harmonic approaches for the fitted layer parameters including peak density, peak density height, and scale height. To improve the TWIM into a real-time model, we have developed a time series autoregressive model to forecast short-term TWIM coefficients. The time series of TWIM coefficients are considered as realizations of stationary stochastic processes within a processing window of 30 days. The autocorrelation coefficients of these series are used to derive the autoregressive parameters and then to forecast the TWIM coefficients, based on the least squares method and the Lagrange multiplier technique. The forecast root-mean-square relative TWIM coefficient errors are generally <30% for 1-day predictions. The forecast TWIM foE and foF2 values are also compared and evaluated against worldwide ionosonde data.
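
    A generic illustration of fitting autoregressive parameters from sample autocorrelations over a 30-day window and predicting one step ahead is sketched below (Yule-Walker estimation on a made-up coefficient series, not the paper's exact least-squares / Lagrange-multiplier formulation):

      # Minimal sketch: AR(p) fit via Yule-Walker and a one-day-ahead forecast of a
      # single model coefficient over a 30-day processing window.
      import numpy as np

      def yule_walker(x, p):
          """AR(p) coefficients from the sample autocovariances of a demeaned series."""
          x = np.asarray(x, dtype=float) - np.mean(x)
          r = np.array([np.dot(x[:len(x) - k], x[k:]) / len(x) for k in range(p + 1)])
          R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])  # Toeplitz matrix
          return np.linalg.solve(R, r[1:])

      rng = np.random.default_rng(7)
      coef_series = np.cumsum(rng.normal(0, 0.1, 30)) + 5.0   # 30 days of a hypothetical coefficient

      p = 3
      phi = yule_walker(coef_series, p)
      mean = coef_series.mean()
      history = coef_series - mean
      forecast = mean + np.dot(phi, history[-1:-p - 1:-1])    # one-day-ahead prediction
      print(f"AR({p}) forecast for day 31: {forecast:.3f}")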

  8. A Comparison of Perturbed Initial Conditions and Multiphysics Ensembles in a Severe Weather Episode in Spain

    NASA Technical Reports Server (NTRS)

    Tapiador, Francisco; Tao, Wei-Kuo; Angelis, Carlos F.; Martinez, Miguel A.; Cecilia Marcos; Antonio Rodriguez; Hou, Arthur; Jong Shi, Jain

    2012-01-01

    Ensembles of numerical model forecasts are of interest to operational early warning forecasters as the spread of the ensemble provides an indication of the uncertainty of the alerts, and the mean value is deemed to outperform the forecasts of the individual models. This paper explores two ensembles on a severe weather episode in Spain, aiming to ascertain the relative usefulness of each one. One ensemble uses sensible choices of physical parameterizations (precipitation microphysics, land surface physics, and cumulus physics) while the other follows a perturbed initial conditions approach. The results show that, depending on the parameterizations, large differences can be expected in terms of storm location, spatial structure of the precipitation field, and rain intensity. It is also found that the spread of the perturbed initial conditions ensemble is smaller than the dispersion due to physical parameterizations. This confirms that in severe weather situations operational forecasts should address moist physics deficiencies to realize the full benefits of the ensemble approach, in addition to optimizing initial conditions. The results also provide insights into differences in simulations arising from ensembles of weather models using several combinations of different physical parameterizations.

  9. Impact of nowcasting on the production and processing of agricultural crops. [in the US

    NASA Technical Reports Server (NTRS)

    Dancer, W. S.; Tibbitts, T. W.

    1973-01-01

    The value of improved weather information and weather forecasting to farmers, growers, and agricultural processing industries in the United States was studied. The study was undertaken to identify the production and processing operations that could be improved with accurate and timely information on changing weather patterns. Estimates were then made of the potential savings that could be realized with accurate information about the prevailing weather and short-term forecasts for up to 12 hours. This weather information has been termed nowcasting. The growing, marketing, and processing operations of the twenty most valuable crops in the United States were studied to determine those operations that are sensitive to short-term weather forecasting. Agricultural extension specialists, research scientists, growers, and representatives of processing industries were consulted and interviewed. The value of the crops included in this survey and their production levels are given. The total value for crops surveyed exceeds 24 billion dollars and represents more than 92 percent of total U.S. crop value.

  10. Linking Science of Flood Forecasts to Humanitarian Actions for Improved Preparedness and Effective Response

    NASA Astrophysics Data System (ADS)

    Uprety, M.; Dugar, S.; Gautam, D.; Kanel, D.; Kshetri, M.; Kharbuja, R. G.; Acharya, S. H.

    2017-12-01

    Advances in flood forecasting have provided opportunities for humanitarian responders to employ a range of preparedness activities at different forecast time horizons. Yet, the science of prediction is less understood and realized across the humanitarian landscape, and often preparedness plans are based upon an average level of flood risk. Working under the remit of Forecast Based Financing (FbF), we present a pilot from Nepal on how available flood and weather forecast products are informing specific pre-emptive actions in the local preparedness and response plans, thereby supporting government stakeholders and humanitarian agencies to take early actions before an impending flood event. In Nepal, forecasting capabilities are limited but in a state of positive flux. Whilst local flood forecasts based upon rainfall-runoff models are yet to be operationalized, streamflow predictions from the Global Flood Awareness System (GLoFAS) can be utilized to plan and implement preparedness activities several days in advance. Likewise, 3-day rainfall forecasts from the Nepal Department of Hydrology and Meteorology (DHM) can further inform a specific set of early actions for potential flash floods due to heavy precipitation. Existing community-based early warning systems in the major river basins of Nepal are utilizing real-time monitoring of water levels and rainfall together with localised probabilistic flood forecasts, which has increased warning lead times from 2-3 hours to 7-8 hours. Based on these available forecast products, thresholds and trigger levels have been determined for different flood scenarios. Matching these trigger levels and assigning responsibilities to relevant actors for early actions, a set of standard operating procedures (SOPs) is being developed, broadly covering general preparedness activities and science-informed anticipatory actions for different forecast lead times, followed by the immediate response activities. These SOPs are currently being rolled out and tested by the Ministry of Home Affairs (MoHA) through its district emergency operation centres in West Nepal. Potential scale-up and successful implementation of this science-based approach would be instrumental in taking forward global commitments on disaster risk reduction, climate change adaptation and sustainable goals in Nepal.

  11. Forecasted range shifts of arid-land fishes in response to climate change

    USGS Publications Warehouse

    Whitney, James E.; Whittier, Joanna B.; Paukert, Craig P.; Olden, Julian D.; Strecker, Angela L.

    2017-01-01

    Climate change is poised to alter the distributional limits, center, and size of many species. Traits may influence different aspects of range shifts, with trophic generality facilitating shifts at the leading edge, and greater thermal tolerance limiting contractions at the trailing edge. The generality of relationships between traits and range shifts remains ambiguous, however, especially for imperiled fishes residing in xeric riverscapes. Our objectives were to quantify contemporary fish distributions in the Lower Colorado River Basin, forecast climate change by 2085 using two general circulation models, and quantify shifts in the limits, center, and size of fish elevational ranges according to fish traits. We examined relationships among traits and range shift metrics either singly using univariate linear modeling or combined with multivariate redundancy analysis. We found that trophic and dispersal traits were associated with shifts at the leading and trailing edges, respectively, although projected range shifts were largely unexplained by traits. As expected, piscivores and omnivores with broader diets shifted upslope most at the leading edge while more specialized invertivores exhibited minimal changes. Fishes that were more mobile shifted upslope most at the trailing edge, defying predictions. No traits explained changes in range center or size. Finally, current preference explained multivariate range shifts, as fishes with faster current preferences exhibited smaller multivariate changes. Although range shifts were largely unexplained by traits, more specialized invertivorous fishes with lower dispersal propensity or greater current preference may require the greatest conservation efforts because of their limited capacity to shift ranges under climate change.

  12. Extravehicular Activity Technology Development Status and Forecast

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Westheimer, David T.

    2011-01-01

    The goal of NASA's current EVA technology effort is to further develop technologies that will be used to demonstrate a robust EVA system that has application for a variety of future missions, including microgravity and surface EVA. Overall, the objectives will be to reduce system mass, reduce consumables and maintenance, increase EVA hardware robustness and life, increase crew member efficiency and autonomy, and enable rapid vehicle egress and ingress. Over the past several years, NASA realized a tremendous increase in EVA system development as part of the Exploration Technology Development Program and the Constellation Program. The demand for efficient and reliable EVA technologies, particularly regenerable technologies, was apparent under these former programs and will continue as future mission opportunities arise. The technological need for EVA in space has been realized over the last several decades by the Gemini, Apollo, Skylab, Space Shuttle, and the International Space Station (ISS) programs. EVAs were critical to the success of these programs. Now, with the ISS extension to 2028 in conjunction with a currently forecasted need of at least eight EVAs per year, the EVA hardware life and limited availability of the Extravehicular Mobility Units (EMUs) will eventually become a critical issue. The current EMU has successfully served EVA demands by performing critical operations to assemble the ISS and provide repairs of satellites such as the Hubble Space Telescope. However, as the life of ISS and the vision for future mission opportunities are realized, a new EVA systems capability will be needed, and the current architectures and technologies under development offer significant improvements over the current flight systems. In addition to ISS, potential mission applications include EVAs for missions to Near Earth Objects (NEO), Phobos, or future surface missions. Surface missions could include either exploration of the Moon or Mars. Providing an EVA capability for these types of missions enables in-space construction of complex vehicles or satellites and hands-on exploration of new parts of our solar system, and engages the public through the inspiration of knowing that humans are exploring places that they have never been before. This paper offers insight into what is currently being developed and what the potential opportunities are in the forecast.

  13. A global perspective of the limits of prediction skill based on the ECMWF ensemble

    NASA Astrophysics Data System (ADS)

    Zagar, Nedjeljka

    2016-04-01

    This talk presents a new model of global forecast error growth, applied to the forecast errors simulated by the ensemble prediction system (ENS) of the ECMWF. The proxy for forecast errors is the total spread of the ECMWF operational ensemble forecasts obtained by the decomposition of the wind and geopotential fields into normal-mode functions. In this way, the ensemble spread can be quantified separately for the balanced and inertio-gravity (IG) modes for every forecast range. Ensemble reliability is defined for the balanced and IG modes by comparing the ensemble spread with the control analysis at each scale. The results show that initial uncertainties in the ECMWF ENS are largest in the tropical large-scale modes and their spatial distribution is similar to the distribution of the short-range forecast errors. Initially, the ensemble spread grows most in the smallest scales and in the synoptic range of the IG modes, but the overall growth is dominated by the increase of spread in balanced modes in synoptic and planetary scales in the midlatitudes. During the forecasts, the distribution of spread in the balanced and IG modes grows towards the climatological spread distribution characteristic of the analyses. The ENS system is found to be somewhat under-dispersive, which is associated with the lack of tropical variability, primarily the Kelvin waves. The new model of forecast error growth has three fitting parameters to parameterize the initial fast growth and a slower exponential error growth later on. The asymptotic values of forecast errors are independent of the exponential growth rate. It is found that the errors due to unbalanced dynamics saturate in around 10 days, while the balanced and total errors saturate in 3 to 4 weeks. Reference: Žagar, N., R. Buizza, and J. Tribbia, 2015: A three-dimensional multivariate modal analysis of atmospheric predictability with application to the ECMWF ensemble. J. Atmos. Sci., 72, 4423-4444.
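
    A three-parameter saturating growth law can be fitted to spread-versus-lead-time data in a few lines. The sketch below uses a logistic-type curve and synthetic data purely for illustration; it is not the talk's exact parameterization.

      # Minimal sketch: fit E(t) = E_inf / (1 + ((E_inf - E0) / E0) * exp(-a t)) so
      # that E0 sets the initial error, a the growth rate, and E_inf the saturation level.
      import numpy as np
      from scipy.optimize import curve_fit

      def growth(t, e0, a, e_inf):
          return e_inf / (1.0 + ((e_inf - e0) / e0) * np.exp(-a * t))

      t = np.arange(0, 30, 0.5)                        # forecast range in days
      rng = np.random.default_rng(8)
      spread = growth(t, 0.05, 0.4, 1.0) * (1 + rng.normal(0, 0.03, t.size))  # synthetic spread data

      params, _ = curve_fit(growth, t, spread, p0=[0.1, 0.3, 1.2])
      e0, a, e_inf = params
      print(f"initial error {e0:.3f}, growth rate {a:.2f}/day, saturation level {e_inf:.2f}")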

  14. Forecasting Daily Volume and Acuity of Patients in the Emergency Department.

    PubMed

    Calegari, Rafael; Fogliatto, Flavio S; Lucini, Filipe R; Neyeloff, Jeruza; Kuchenbecker, Ricardo S; Schaan, Beatriz D

    2016-01-01

    This study aimed at analyzing the performance of four forecasting models in predicting the demand for medical care in terms of daily visits in an emergency department (ED) that handles high complexity cases, testing the influence of climatic and calendrical factors on demand behavior. We tested different mathematical models to forecast ED daily visits at Hospital de Clínicas de Porto Alegre (HCPA), which is a tertiary care teaching hospital located in Southern Brazil. Model accuracy was evaluated using mean absolute percentage error (MAPE), considering forecasting horizons of 1, 7, 14, 21, and 30 days. The demand time series was stratified according to patient classification using the Manchester Triage System's (MTS) criteria. Models tested were the simple seasonal exponential smoothing (SS), seasonal multiplicative Holt-Winters (SMHW), seasonal autoregressive integrated moving average (SARIMA), and multivariate autoregressive integrated moving average (MSARIMA). Performance of models varied according to patient classification, such that SS was the best choice when all types of patients were jointly considered, and SARIMA was the most accurate for modeling demands of very urgent (VU) and urgent (U) patients. The MSARIMA models taking into account climatic factors did not improve the performance of the SARIMA models, independent of patient classification.
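
    A minimal sketch of the simplest of the models compared above, seasonal exponential smoothing of daily visits with a weekly cycle, scored by MAPE on simulated data, could be:

      # Minimal sketch: seasonal exponential smoothing of daily ED visits and a
      # 7-day-ahead MAPE, using statsmodels.
      import numpy as np
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      rng = np.random.default_rng(9)
      days = np.arange(400)
      visits = 200 + 30 * np.sin(2 * np.pi * days / 7) + rng.normal(0, 10, days.size)

      train, test = visits[:-7], visits[-7:]
      model = ExponentialSmoothing(train, trend=None, seasonal="add", seasonal_periods=7).fit()
      forecast = model.forecast(7)

      mape = np.mean(np.abs((test - forecast) / test)) * 100
      print(f"7-day-ahead MAPE: {mape:.1f}%")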

  16. Impact of seasonal forecast use on agricultural income in a system with varying crop costs and returns: an empirically-grounded simulation

    NASA Astrophysics Data System (ADS)

    Gunda, T.; Bazuin, J. T.; Nay, J.; Yeung, K. L.

    2017-03-01

    Access to seasonal climate forecasts can benefit farmers by allowing them to make more informed decisions about their farming practices. However, it is unclear whether farmers realize these benefits when crop choices available to farmers have different and variable costs and returns; multiple countries have programs that incentivize production of certain crops while other crops are subject to market fluctuations. We hypothesize that the benefits of forecasts on farmer livelihoods will be moderated by the combined impact of differing crop economics and changing climate. Drawing upon methods and insights from both physical and social sciences, we develop a model of farmer decision-making to evaluate this hypothesis. The model dynamics are explored using empirical data from Sri Lanka; primary sources include survey and interview information as well as game-based experiments conducted with farmers in the field. Our simulations show that a farmer using seasonal forecasts has more diversified crop selections, which drive increases in average agricultural income. Increases in income are particularly notable under a drier climate scenario, when a farmer using seasonal forecasts is more likely to plant onions, a crop with higher possible returns. Our results indicate that, when water resources are scarce (i.e. drier climate scenario), farmer incomes could become stratified, potentially compounding existing disparities in farmers’ financial and technical abilities to use forecasts to inform their crop selections. This analysis highlights that while programs that promote production of certain crops may ensure food security in the short-term, the long-term implications of these dynamics need careful evaluation.

  17. Variance analysis of forecasted streamflow maxima in a wet temperate climate

    NASA Astrophysics Data System (ADS)

    Al Aamery, Nabil; Fox, James F.; Snyder, Mark; Chandramouli, Chandra V.

    2018-05-01

    Coupling global climate models, hydrologic models and extreme value analysis provides a method to forecast streamflow maxima; however, the elusive variance structure of the results hinders confidence in application. Directly correcting the bias of forecasts using the relative change between forecast and control simulations has been shown to marginalize hydrologic uncertainty, reduce model bias, and remove systematic variance when predicting mean monthly and mean annual streamflow, prompting our investigation for streamflow maxima. We assess the variance structure of streamflow maxima using realizations of emission scenario, global climate model type and project phase, downscaling methods, bias correction, extreme value methods, and hydrologic model inputs and parameterization. Results show that the relative change of streamflow maxima was not dependent on systematic variance from the annual maxima versus peak-over-threshold method applied, although we stress that researchers should strictly adhere to the rules of extreme value theory when applying the peak-over-threshold method. Regardless of which method is applied, extreme value model fitting does add variance to the projection, and the variance is an increasing function of the return period. Unlike the relative change of mean streamflow, results show that the variance of the maxima's relative change was dependent on all climate model factors tested as well as hydrologic model inputs and calibration. Ensemble projections forecast an increase of streamflow maxima for 2050 with pronounced forecast standard error, including an increase of +30(±21), +38(±34) and +51(±85)% for 2-, 20- and 100-year streamflow events for the wet temperate region studied. The variance of maxima projections was dominated by climate model factors and extreme value analyses.
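
    A minimal sketch of the relative-change calculation for a return level, with synthetic annual maxima and a GEV fit (in the spirit of the delta-change handling described above, not the study's full workflow), could be:

      # Minimal sketch: fit a GEV to control and forecast annual maxima and express
      # the change in the 20-year return level as a relative change.
      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(10)
      control_maxima = genextreme.rvs(c=-0.1, loc=100, scale=20, size=40, random_state=rng)
      forecast_maxima = genextreme.rvs(c=-0.1, loc=125, scale=25, size=40, random_state=rng)

      def return_level(maxima, T):
          """T-year return level from a GEV fit to annual maxima."""
          c, loc, scale = genextreme.fit(maxima)
          return genextreme.isf(1.0 / T, c, loc=loc, scale=scale)

      T = 20
      q_ctrl, q_fcst = return_level(control_maxima, T), return_level(forecast_maxima, T)
      print(f"relative change of the {T}-year event: {100 * (q_fcst - q_ctrl) / q_ctrl:+.0f}%")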

  18. Reconstructing multi-mode networks from multivariate time series

    NASA Astrophysics Data System (ADS)

    Gao, Zhong-Ke; Yang, Yu-Xuan; Dang, Wei-Dong; Cai, Qing; Wang, Zhen; Marwan, Norbert; Boccaletti, Stefano; Kurths, Jürgen

    2017-09-01

    Unveiling the dynamics hidden in multivariate time series is a task of the utmost importance in a broad variety of areas in physics. We here propose a method that leads to the construction of a novel functional network, a multi-mode weighted graph combined with an empirical mode decomposition, and to the realization of multi-information fusion of multivariate time series. The method is illustrated in a couple of successful applications (a multi-phase flow and an epileptic electro-encephalogram), which demonstrate its power in revealing the dynamical behaviors underlying the transitions between different flow patterns and in differentiating brain states of seizure and non-seizure.

  19. Economic indicators selection for crime rates forecasting using cooperative feature selection

    NASA Astrophysics Data System (ADS)

    Alwee, Razana; Shamsuddin, Siti Mariyam Hj; Salleh Sallehuddin, Roselina

    2013-04-01

    Feature selection in a multivariate forecasting model is very important to ensure that the model is accurate. The purpose of this study is to apply the Cooperative Feature Selection method for feature selection. The features are economic indicators that will be used in a crime rate forecasting model. The Cooperative Feature Selection combines grey relational analysis and an artificial neural network to establish a cooperative model that can rank and select the significant economic indicators. Grey relational analysis is used to select the best data series to represent each economic indicator and is also used to rank the economic indicators according to their importance to the crime rate. After that, the artificial neural network is used to select the significant economic indicators for forecasting the crime rates. In this study, we used the economic indicators of unemployment rate, consumer price index, gross domestic product and consumer sentiment index, as well as property crime and violent crime rates for the United States. A Levenberg-Marquardt neural network is used in this study. From our experiments, we found that the consumer price index is an important economic indicator that has a significant influence on the violent crime rate. For the property crime rate, the gross domestic product, unemployment rate and consumer price index are the influential economic indicators. The Cooperative Feature Selection is also found to produce smaller errors compared to multiple linear regression in forecasting property and violent crime rates.
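
    The grey relational analysis step can be sketched directly from its formula. The version below is a simplified single-reference variant with made-up indicator series; the subsequent neural-network selection stage is not reproduced.

      # Minimal sketch: rank candidate economic indicators by their grey relational
      # grade with respect to a crime-rate reference series.
      import numpy as np

      def grey_relational_grade(reference, series, rho=0.5):
          """Mean grey relational coefficient between a reference and a comparison
          series, both min-max normalised; rho is the distinguishing coefficient."""
          norm = lambda x: (x - x.min()) / (x.max() - x.min())
          delta = np.abs(norm(reference) - norm(series))
          return np.mean((delta.min() + rho * delta.max()) / (delta + rho * delta.max()))

      rng = np.random.default_rng(11)
      crime_rate = np.linspace(400, 300, 20) + rng.normal(0, 5, 20)
      indicators = {
          "consumer_price_index": np.linspace(180, 230, 20) + rng.normal(0, 3, 20),
          "unemployment_rate": rng.normal(6, 0.5, 20),
          "gdp": np.linspace(12, 17, 20) + rng.normal(0, 0.3, 20),
      }

      grades = {name: grey_relational_grade(crime_rate, x) for name, x in indicators.items()}
      for name, g in sorted(grades.items(), key=lambda kv: -kv[1]):
          print(f"{name:22s} grey relational grade = {g:.3f}")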

  20. Evaluation of probabilistic forecasts with the scoringRules package

    NASA Astrophysics Data System (ADS)

    Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian

    2017-04-01

    Over the last decades, probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way, in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F, y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In coherence with decision-theoretic principles, they allow the comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. For univariate variables, two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, ensemble weather forecasts take this form. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.
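
    The sample-based CRPS estimator for a simulation-described forecast distribution, the kind of quantity the (R-language) scoringRules package provides, can be written directly; a minimal Python sketch with a made-up ensemble is:

      # Minimal sketch of the sample-based CRPS estimator for an ensemble forecast:
      # CRPS = mean_i |x_i - y| - 0.5 * mean_{i,j} |x_i - x_j|.
      import numpy as np

      def crps_ensemble(members, obs):
          members = np.asarray(members, dtype=float)
          term1 = np.mean(np.abs(members - obs))
          term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
          return term1 - term2

      rng = np.random.default_rng(12)
      ensemble = rng.normal(21.0, 1.5, size=50)        # e.g. a 50-member temperature forecast
      print(f"CRPS against an observed 22.3 degC: {crps_ensemble(ensemble, 22.3):.3f}")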

  1. Numerical Methods in Atmospheric and Oceanic Modelling: The Andre J. Robert Memorial Volume

    NASA Astrophysics Data System (ADS)

    Rosmond, Tom

    Most people, including some in the scientific community, do not realize how much the weather forecasts they use to guide the activities of their daily lives depend on very complex mathematics and numerical methods that are the basis of modern numerical weather prediction (NWP). André Robert (1929-1993), to whom Numerical Methods in Atmospheric and Oceanic Modelling is dedicated, had a career that contributed greatly to the growth of NWP and the role that the atmospheric computer models of NWP play in our society. There are probably no NWP models running anywhere in the world today that do not use numerical methods introduced by Robert, and those of us who work with and use these models every day are indebted to him. The first two chapters of the volume are chronicles of Robert's life and career. The first is a 1987 interview by Harold Ritchie, one of Robert's many protégés and colleagues at the Canadian Atmospheric Environment Service. The interview traces Robert's life from his birth in New York to French Canadian parents, to his emigration to Quebec at an early age, his education and early employment, and his rise in stature as one of the preeminent research meteorologists of our time. An amusing anecdote he relates is his impression of weather forecasts while he was considering his first job as a meteorologist in the early 1950s. A newspaper of the time placed the weather forecast and daily horoscope side by side, and Robert regarded each as having a similar scientific basis. Thankfully, he soon realized there was a difference between the two, and his subsequent career certainly confirmed the distinction.

  2. Uncertainty analysis of an inflow forecasting model: extension of the UNEEC machine learning-based method

    NASA Astrophysics Data System (ADS)

    Pianosi, Francesca; Lal Shrestha, Durga; Solomatine, Dimitri

    2010-05-01

    This research presents an extension of the UNEEC (Uncertainty Estimation based on Local Errors and Clustering, Shrestha and Solomatine, 2006, 2008 & Solomatine and Shrestha, 2009) method in the direction of explicit inclusion of parameter uncertainty. The UNEEC method assumes that there is an optimal model and that the residuals of the model can be used to assess the uncertainty of the model prediction. It is assumed that all sources of uncertainty, including input, parameter and model structure uncertainty, are explicitly manifested in the model residuals. In this research, these assumptions are relaxed, and the UNEEC method is extended to consider parameter uncertainty as well (abbreviated as UNEEC-P). In UNEEC-P, first we use Monte Carlo (MC) sampling in parameter space to generate N model realizations (each of which is a time series), estimate the prediction quantiles based on the empirical distribution functions of the model residuals considering all the residual realizations, and only then apply the standard UNEEC method that encapsulates the uncertainty of a hydrologic model (expressed by quantiles of the error distribution) in a machine learning model (e.g., ANN). UNEEC-P is applied first to a linear regression model of synthetic data, and then to a real case study of forecasting inflow to Lake Lugano in northern Italy. The inflow forecasting model is a stochastic heteroscedastic model (Pianosi and Soncini-Sessa, 2009). The preliminary results show that the UNEEC-P method produces wider uncertainty bounds, which is consistent with the fact that the method considers also parameter uncertainty of the optimal model. In the future, the UNEEC method will be further extended to consider input and structure uncertainty, which will provide a more realistic estimation of model predictions.
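
    The Monte Carlo stage described above can be sketched on a toy linear model: sample parameters, pool the residuals of all realizations, and take empirical quantiles as prediction bounds (the machine-learning encapsulation of the quantiles, the second stage of UNEEC, is omitted).

      # Minimal sketch of the UNEEC-P parameter-sampling step on a toy linear model.
      import numpy as np

      rng = np.random.default_rng(13)
      x = np.linspace(0, 10, 200)
      y_obs = 2.0 * x + 1.0 + rng.normal(0, 1.0, x.size)      # synthetic observations

      n_real = 500
      slopes = rng.normal(2.0, 0.1, n_real)                   # Monte Carlo parameter samples
      intercepts = rng.normal(1.0, 0.3, n_real)

      predictions = slopes[:, None] * x + intercepts[:, None] # N model realizations
      residuals = (y_obs - predictions).ravel()               # pooled residual realizations

      lo, hi = np.quantile(residuals, [0.05, 0.95])
      best = 2.0 * x + 1.0                                    # "optimal model" prediction
      print(f"90% residual quantile offsets: [{lo:.2f}, {hi:.2f}]")
      print(f"90% band around the optimal model at x=5: [{(best + lo)[100]:.2f}, {(best + hi)[100]:.2f}]")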

  3. The Importance of Human Resource Planning in Industrial Enterprises

    NASA Astrophysics Data System (ADS)

    Koltnerová, Kristína; Chlpeková, Andrea; Samáková, Jana

    2012-12-01

    Human resource planning should be a key, generally used activity of human resource management in business practice, because it helps to make optimum use of the human resources in the enterprise and to avoid wasting them. Human resource planning makes it possible to forecast future manpower requirements and the number and type of employees the enterprise will require in the near future. In the long term, the success of any enterprise depends on whether the right people are in the right places at the right time, which is the essence of human resource planning. The aim of this contribution is to explain the importance of human resource planning and to outline the results of a questionnaire survey conducted in industrial enterprises.

  4. Determining effective forecast horizons for multi-purpose reservoirs with short- and long-term operating objectives

    NASA Astrophysics Data System (ADS)

    Luchner, Jakob; Anghileri, Daniela; Castelletti, Andrea

    2017-04-01

    Real-time control of multi-purpose reservoirs can benefit significantly from hydro-meteorological forecast products. Because of their reliability, the most widely used forecasts range on time scales from hours to a few days and are suitable for short-term operation targets such as flood control. In recent years, hydro-meteorological forecasts have become more accurate and reliable on longer time scales, which are more relevant to long-term reservoir operation targets such as water supply. While the forecast quality of such products has been studied extensively, the forecast value, i.e. the operational effectiveness of using forecasts to support water management, has received comparatively little attention. It is comparatively easy to identify the most effective forecasting information needed to design reservoir operation rules for flood control, but it is not straightforward to identify which forecast variable and lead time are needed to define effective hedging rules for operational targets with slow dynamics such as water supply. The task is even more complex when multiple targets, with diverse slow and fast dynamics, are considered at the same time. In these cases, the relative importance of different pieces of information, e.g. the magnitude and timing of the peak flow rate and the inflow accumulated over different time lags, may vary depending on the season or the hydrological conditions. In this work, we analyze the relationship between operational forecast value and streamflow forecast horizon for different multi-purpose reservoir trade-offs. We use the Information Selection and Assessment (ISA) framework to identify the most effective forecast variables and horizons for informing multi-objective reservoir operation over short- and long-term temporal scales. The ISA framework is an automatic iterative procedure to discriminate the information with the highest potential to improve multi-objective reservoir operating performance. Forecast variables and horizons are selected using a feature selection technique, which determines the combination of variables that is most informative, in a multivariate regression model, of the optimal reservoir releases obtained under perfect information at a fixed objective trade-off. The improved reservoir operation is evaluated against optimal reservoir operation conditioned upon perfect information on future disturbances and against basic reservoir operation using only the day of the year and the reservoir level. Different objective trade-offs are selected to analyze the resulting differences in improved reservoir operation and in the selected forecast variables and horizons. For comparison, the effective streamflow forecast horizon determined by the ISA framework is benchmarked against the performance obtained with a deterministic model predictive control (MPC) optimization scheme. Both the ISA framework and the MPC optimization scheme are applied to the real-world case study of Lake Como, Italy, using perfect streamflow forecast information. The principal operation targets for Lake Como are flood control and downstream water supply, which makes its operation a suitable case study. Results provide critical feedback to reservoir operators on the use of long-term streamflow forecasts and to the hydro-meteorological forecasting community with respect to the forecast horizon needed from reliable streamflow forecasts.
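    The feature-selection step can be illustrated with a greedy forward selection over candidate accumulated-inflow horizons in a linear regression against a (here synthetic) optimal-release series. This is a sketch of the idea only, not the ISA framework itself, and all data and the stopping rule are assumptions.

```python
# Greedy forward selection of forecast horizons (illustrative, not the ISA framework):
# candidate features are inflows accumulated over the next h days; the selected set is
# the one that best explains a target release series under cross-validation.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
T = 1000
inflow = rng.gamma(2.0, 50.0, T + 30)

# candidate features: inflow accumulated over the next h days, h = 1..30
horizons = list(range(1, 31))
X = np.column_stack([[inflow[t:t + h].sum() for t in range(T)] for h in horizons])
# synthetic "optimal release" target (assumed known from perfect-information optimization)
y = 0.7 * X[:, 6] + 0.3 * X[:, 0] + rng.normal(0, 20, T)

selected, remaining, best_score = [], list(range(len(horizons))), -np.inf
while remaining:
    scores = [(cross_val_score(LinearRegression(), X[:, selected + [j]], y, cv=5).mean(), j)
              for j in remaining]
    s_best, j_best = max(scores)
    if s_best <= best_score + 1e-4:      # stop when no meaningful improvement
        break
    best_score = s_best
    selected.append(j_best)
    remaining.remove(j_best)

print("selected horizons (days):", [horizons[j] for j in selected],
      "cv R^2:", round(best_score, 3))
```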

  5. Modelling world gold prices and USD foreign exchange relationship using multivariate GARCH model

    NASA Astrophysics Data System (ADS)

    Ping, Pung Yean; Ahmad, Maizah Hura Binti

    2014-12-01

    World gold price is a popular investment commodity. The series has often been modeled using univariate models. The objective of this paper is to show that there is a co-movement between the gold price and the USD foreign exchange rate. Using the effect of the USD foreign exchange rate on the gold price, a model that can be used to forecast future gold prices is developed. For this purpose, the current paper proposes a multivariate GARCH (bivariate GARCH) model. Using daily prices of both series from 01.01.2000 to 05.05.2014, a causal relation between the two series under study is found and a bivariate GARCH model is produced.
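    A simplified constant-conditional-correlation (CCC) bivariate GARCH sketch of this kind of co-movement model, assuming the third-party Python `arch` package is available; the return series are placeholders and this is not the authors' exact specification.

```python
# Simplified CCC bivariate GARCH sketch, assuming the `arch` package is available.
# Placeholder daily log-returns (scaled by 100); not the authors' model or data.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(2)
n = 2000
shock = rng.normal(0, 0.01, n)
gold = pd.Series(100 * shock)                                    # gold log-returns (%)
usd = pd.Series(100 * (-0.5 * shock + rng.normal(0, 0.008, n)))  # USD index log-returns (%)

# univariate GARCH(1,1) for each series
fits = {name: arch_model(r, mean="Constant", vol="GARCH", p=1, q=1).fit(disp="off")
        for name, r in {"gold": gold, "usd": usd}.items()}

# constant conditional correlation estimated from standardized residuals
z = np.column_stack([f.resid / f.conditional_volatility for f in fits.values()])
rho = np.corrcoef(z.T)[0, 1]

# one-step-ahead conditional variances and the implied covariance forecast
h_gold = fits["gold"].forecast(horizon=1).variance.iloc[-1, 0]
h_usd = fits["usd"].forecast(horizon=1).variance.iloc[-1, 0]
print(f"CCC rho = {rho:.3f}, next-day covariance = {rho * np.sqrt(h_gold * h_usd):.4f}")
```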

  6. Introducing an operational method to forecast long-term regional drought based on the application of artificial intelligence capabilities

    NASA Astrophysics Data System (ADS)

    Kousari, Mohammad Reza; Hosseini, Mitra Esmaeilzadeh; Ahani, Hossein; Hakimelahi, Hemila

    2017-01-01

    An effective drought forecast offers considerable advantages for the management of water resources used in agriculture, industry, and household consumption. To introduce such a model with simple data inputs, this study presents a regional drought forecast method for Fars Province, Iran, based on artificial intelligence capabilities (artificial neural networks) and the Standardized Precipitation Index (SPI in 3, 6, 9, 12, 18, and 24 monthly series). The precipitation data of 41 rain gauge stations were used to compute the SPI values. In addition, weather signals including the Multivariate ENSO Index (MEI), North Atlantic Oscillation (NAO), Southern Oscillation Index (SOI), NINO1+2, anomaly NINO1+2, NINO3, anomaly NINO3, NINO4, anomaly NINO4, NINO3.4, and anomaly NINO3.4 were used as predictor variables to forecast the SPI time series for the next 12 months. Frequent testing and validation steps were carried out to obtain the best artificial neural network (ANN) models. The forecasted values were mapped in the verification step and then compared with the observed maps for the same dates. Results showed considerable spatial and temporal relationships, even among the maps of different SPI time series. Also, the maps forecasted for the first 6 months showed an average agreement of 73 % with the observed ones. The most important finding and the strong point of this study was that, although the drought forecast for each station and time series was completely independent, the relationships between spatial and temporal predictions remained. This strength is mainly attributable to the frequent testing and validation steps used to select the best drought forecast models from the many ANN models produced. Finally, wherever precipitation data are available, the practical application of the presented method is possible.
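    A minimal sketch of the general workflow under synthetic data: an ANN is trained to map lagged SPI values and climate-signal indices to the SPI value 12 months ahead. Variable names, lags and network size are illustrative assumptions, not those of the Fars Province models.

```python
# Sketch: ANN mapping lagged SPI and climate indices (e.g., MEI, NAO) to SPI 12 months ahead.
# Synthetic placeholder data; not the study's stations or trained models.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(3)
months = 600
mei = np.sin(np.arange(months) / 24) + 0.3 * rng.normal(size=months)
nao = 0.5 * rng.normal(size=months)
spi3 = 0.6 * np.roll(mei, 12) - 0.3 * np.roll(nao, 6) + 0.4 * rng.normal(size=months)

lead = 12
rows, targets = [], []
for t in range(24, months - lead):
    rows.append([spi3[t], spi3[t - 3], spi3[t - 6], mei[t], nao[t]])  # predictors at time t
    targets.append(spi3[t + lead])                                    # SPI 12 months ahead
X, y = np.array(rows), np.array(targets)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, shuffle=False)
ann = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)
print("test R^2:", round(r2_score(y_te, ann.predict(X_te)), 3))
```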

  7. Short-term sea ice forecasts with the RASM-ESRL coupled model: A testbed for improving simulations of ocean-ice-atmosphere interactions in the marginal ice zone

    NASA Astrophysics Data System (ADS)

    Solomon, A.; Cox, C. J.; Hughes, M.; Intrieri, J. M.; Persson, O. P. G.

    2015-12-01

    The dramatic decrease of Arctic sea ice has led to a new Arctic sea-ice paradigm and to increased commercial activity in the Arctic Ocean. NOAA's mission to provide accurate and timely sea-ice forecasts, as explicitly outlined in the National Ocean Policy and the U.S. National Strategy for the Arctic Region, needs significant improvement across a range of time scales to improve safety for human activity. Unfortunately, the sea-ice evolution in the new Arctic involves the interaction of numerous physical processes in the atmosphere, ice, and ocean, some of which are not yet understood. These include atmospheric forcing of sea-ice movement through stress and stress deformation; atmospheric forcing of sea-ice melt and formation through energy fluxes; and ocean forcing of the atmosphere through new regions of seasonal heat release. Many of these interactions involve emerging complex processes that first need to be understood and then incorporated into forecast models in order to realize the goal of useful sea-ice forecasting. The underlying hypothesis for this study is that errors in simulations of "fast" atmospheric processes significantly impact the forecast of seasonal sea-ice retreat in summer and its advance in autumn in the marginal ice zone (MIZ). We therefore focus on short-term (0-20 day) ice-floe movement, the freeze-up and melt-back processes in the MIZ, and the role of storms in modulating stress and heat fluxes. This study uses a coupled ocean-atmosphere-sea ice forecast model as a testbed to investigate: whether ocean-sea ice-atmosphere coupling improves forecasts on subseasonal time scales; where systematic biases develop due to inadequate parameterizations (focusing on mixed-phase clouds and surface fluxes); how increased atmospheric resolution of synoptic features improves the forecasts; and how initialization of sea ice area and thickness and snow depth impacts the skill of the forecasts. Simulations are validated with measurements at pan-Arctic land sites, satellite data, and recent ocean field campaigns.

  8. Earthquake Forecasting Through Semi-periodicity Analysis of Labeled Point Processes

    NASA Astrophysics Data System (ADS)

    Quinteros Cartaya, C. B. M.; Nava Pichardo, F. A.; Glowacka, E.; Gomez-Trevino, E.

    2015-12-01

    Large earthquakes show semi-periodic behavior as a result of self-organized critical processes of stress accumulation and release in a seismogenic region. Thus, large earthquakes in a region constitute semi-periodic sequences with recurrence times varying slightly from strict periodicity. Nava et al. (2013) and Quinteros et al. (2013) realized that not all earthquakes in a given region need to belong to the same sequence, since there can be more than one process of stress accumulation and release in it; they also proposed a method to identify semi-periodic sequences through analytic Fourier analysis. This work presents improvements on the above-mentioned method: the influence of earthquake size on the spectral analysis and its importance in identifying semi-periodic events, which means that earthquake occurrence times are treated as a labeled point process; the estimation of appropriate upper-limit uncertainties to use in forecasts; and the use of Bayesian analysis to evaluate the forecast performance. The improved method is applied to specific regions: the southwestern coast of Mexico, the northeastern Japan Arc, the San Andreas Fault zone at Parkfield, and northeastern Venezuela.
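    The labeled-point-process idea can be made concrete with a magnitude-weighted spectral scan over trial recurrence periods. The following schematic is an assumption-laden illustration (synthetic catalogue, simple weighting), not the authors' estimator.

```python
# Schematic magnitude-weighted spectral scan of earthquake occurrence times,
# treating the catalogue as a labeled point process; not the published method.
import numpy as np

rng = np.random.default_rng(4)
period_true = 35.0                                       # years
t_events = np.cumsum(rng.normal(period_true, 4.0, 12))   # semi-periodic event times
mags = rng.uniform(6.5, 7.5, t_events.size)              # labels (magnitudes)
weights = 10 ** (mags - mags.min())                      # larger events weigh more

periods = np.linspace(10.0, 80.0, 1400)
power = np.empty_like(periods)
for i, P in enumerate(periods):
    phase = 2 * np.pi * t_events / P
    # normalized amplitude of the weighted sum of unit phasors at trial period P
    power[i] = np.abs(np.sum(weights * np.exp(1j * phase))) / weights.sum()

print("best-fitting recurrence period ~", round(periods[np.argmax(power)], 1), "years")
```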

  9. [Research on engine remaining useful life prediction based on oil spectrum analysis and particle filtering].

    PubMed

    Sun, Lei; Jia, Yun-xian; Cai, Li-ying; Lin, Guo-yu; Zhao, Jin-song

    2013-09-01

    Spectrometric oil analysis (SOA) is an important technique for machine state monitoring, fault diagnosis and prognosis, and SOA-based remaining useful life (RUL) prediction has the advantage of identifying the optimal maintenance strategy for a machine system. Because of the complexity of machine systems, the health-state degradation process cannot be adequately characterized by a linear model, whereas particle filtering (PF) possesses clear advantages over traditional Kalman filtering in dealing with nonlinear and non-Gaussian systems. The PF approach was therefore applied to state forecasting based on SOA, and an RUL prediction technique based on SOA and the PF algorithm is proposed. In the prediction model, the prior probability distribution is updated according to the estimate of the system's posterior probability, and a multi-step-ahead prediction model based on the PF algorithm is established. Finally, practical SOA data from an engine were analyzed and forecasted with the above method, and the forecasting result was compared with that of the traditional Kalman filtering method. The result fully shows the superiority and effectiveness of the proposed method.
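    A minimal bootstrap particle filter for a nonlinear degradation state observed through noisy SOA-like measurements, with a crude RUL estimate obtained by propagating the particles to a failure threshold. The degradation model, noise levels and threshold are all illustrative assumptions, not the paper's engine model.

```python
# Bootstrap particle filter tracking a nonlinear degradation state from noisy
# measurements, plus an RUL estimate by propagating particles to a failure threshold.
import numpy as np

rng = np.random.default_rng(5)
T, N = 40, 2000                        # observation steps, particles
a, q, r = 1.05, 0.05, 0.3              # growth rate, process noise, measurement noise
threshold = 20.0                       # assumed failure level (e.g., wear-metal ppm)

def step(x, noise_scale, size=None):
    # illustrative nonlinear degradation dynamics
    return a * x + 0.01 * np.abs(x) ** 1.5 + rng.normal(0, noise_scale, size)

# simulate a "true" degradation path and its noisy observations
x_true = np.empty(T); x_true[0] = 1.0
for t in range(1, T):
    x_true[t] = step(x_true[t - 1], q)
y_obs = x_true + rng.normal(0, r, T)

# particle filter: propagate, weight by observation likelihood, resample
particles = rng.normal(1.0, 0.5, N)
for t in range(1, T):
    particles = step(particles, q, N)
    w = np.exp(-0.5 * ((y_obs[t] - particles) / r) ** 2) + 1e-300
    particles = particles[rng.choice(N, N, p=w / w.sum())]

# multi-step-ahead prediction: push each particle forward until it crosses the threshold
rul = np.zeros(N)
state = particles.copy()
for k in range(1, 200):
    state = step(state, q, N)
    rul[(state >= threshold) & (rul == 0)] = k
rul[rul == 0] = 200                    # censored at the prediction horizon
print("median RUL estimate:", int(np.median(rul)), "steps")
```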

  10. Value of biologic therapy: a forecasting model in three disease areas.

    PubMed

    Paramore, L Clark; Hunter, Craig A; Luce, Bryan R; Nordyke, Robert J; Halbert, R J

    2010-01-01

    Forecast the return on investment (ROI) for advances in biologic therapies in years 2015 and 2030, based upon impact on disease prevalence, morbidity, and mortality for asthma, diabetes, and colorectal cancer. A deterministic, spreadsheet-based, forecasting model was developed based on trends in demographics and disease epidemiology. 'Return' was defined as reductions in disease burden (prevalence, morbidity, mortality) translated into monetary terms; 'investment' was defined as the incremental costs of biologic therapy advances. Data on disease prevalence, morbidity, mortality, and associated costs were obtained from government survey statistics or published literature. Expected impact of advances in biologic therapies was based on expert opinion. Gains in quality-adjusted life years (QALYs) were valued at $100,000 per QALY. The base case analysis, in which reductions in disease prevalence and mortality predicted by the expert panel are not considered, shows the resulting ROIs remain positive for asthma and diabetes but fall below $1 for colorectal cancer. Analysis involving expert panel predictions indicated positive ROI results for all three diseases at both time points, ranging from $207 for each incremental dollar spent on biologic therapies to treat asthma in 2030, to $4 for each incremental dollar spent on biologic therapies to treat colorectal cancer in 2015. If QALYs are not considered, the resulting ROIs remain positive for all three diseases at both time points. Society may expect substantial returns from investments in innovative biologic therapies. These benefits are most likely to be realized in an environment of appropriate use of new molecules. The potential variance between forecasted (from expert opinion) and actual future health outcomes could be significant. Similarly, the forecasted growth in use of biologic therapies relied upon unvalidated market forecasts.

  11. Advances in Monitoring, Modelling and Forecasting Volcanic Ash Plumes over the Past 5 Years and the Impact on Preparedness from the London VAAC Perspective

    NASA Astrophysics Data System (ADS)

    Lee, D. S.; Lisk, I.

    2015-12-01

    Hosted and run by the Met Office, the London VAAC (Volcanic Ash Advisory Centre) is responsible for issuing advisories on the location and likely dispersion of ash clouds originating from volcanoes in the North East Atlantic, primarily from Iceland. These advisories and additional guidance products are used by the civil aviation community to make decisions on airspace flight management. London VAAC has specialist forecasters who use a combination of volcano source data, satellite-based, ground-based and aircraft observations, weather forecast models and dispersion models. Since the eruption of the Icelandic volcano Eyjafjallajökull in 2010, which resulted in the decision by many northern European countries to impose significant restrictions on the use of their airspace, London VAAC has been active in further developing its volcanic ash monitoring, modelling and forecasting capabilities, collaborating with research organisations, industry, other VAACs, meteorological services and the Volcano Observatory in Iceland. It has been necessary to advance operational capabilities to address evolving requirements, including the need for more quantitative assessments of volcanic ash in the atmosphere. Here we summarise advances in the monitoring, modelling and forecasting of volcanic ash plumes over the past 5 years from the London VAAC perspective, and the transition of science into operations. We also highlight the importance of collaborative activities, such as the 'VAAC Best Practice' Workshop, where information is exchanged between all nine VAACs worldwide on operational practices in monitoring and forecasting volcanic ash, with the aim of working toward a more harmonized service for decision makers in the aviation community. We conclude with an evaluation of how much better prepared we are for the next significant ash-rich Icelandic eruption, and of the challenges that still remain.

  12. On the Likely Utility of Hybrid Weights Optimized for Variances in Hybrid Error Covariance Models

    NASA Astrophysics Data System (ADS)

    Satterfield, E.; Hodyss, D.; Kuhl, D.; Bishop, C. H.

    2017-12-01

    Because of imperfections in ensemble data assimilation schemes, one cannot assume that the ensemble covariance is equal to the true error covariance of a forecast. Previous work demonstrated how information about the distribution of true error variances given an ensemble sample variance can be revealed from an archive of (observation-minus-forecast, ensemble-variance) data pairs. Here, we derive a simple and intuitively compelling formula to obtain the mean of this distribution of true error variances given an ensemble sample variance from (observation-minus-forecast, ensemble-variance) data pairs produced by a single run of a data assimilation system. This formula takes the form of a Hybrid weighted average of the climatological forecast error variance and the ensemble sample variance. Here, we test the extent to which these readily obtainable weights can be used to rapidly optimize the covariance weights used in Hybrid data assimilation systems that employ weighted averages of static covariance models and flow-dependent ensemble based covariance models. Univariate data assimilation and multi-variate cycling ensemble data assimilation are considered. In both cases, it is found that our computationally efficient formula gives Hybrid weights that closely approximate the optimal weights found through the simple but computationally expensive process of testing every plausible combination of weights.
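    Schematically, and with notation assumed here rather than taken from the paper, the hybrid estimate described above takes the following form.

```latex
% Schematic form only; notation is assumed, not the authors' exact derivation.
% Mean of the true-error-variance distribution given an ensemble sample variance s^2:
\begin{equation}
  \mathbb{E}\left[\sigma^2 \mid s^2\right] \approx w\, s^2 + (1 - w)\,\bar{\sigma}^2_{\mathrm{clim}},
  \qquad 0 \le w \le 1,
\end{equation}
% where w is estimated from the archive of (observation-minus-forecast, ensemble-variance)
% pairs and \bar{\sigma}^2_{clim} is the climatological forecast error variance; the same w
% is then tested as the hybrid weight on the flow-dependent ensemble covariance.
```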

  13. The Application of Magnesium Alloys in Aircraft Interiors — Changing the Rules

    NASA Astrophysics Data System (ADS)

    Davis, Bruce

    The commercial aircraft market is forecast to grow steadily over the next two decades. Part of this growth is driven by the desire of airlines to replace older models in their fleets with newer, more fuel-efficient designs, to realize lower operating costs and to address the rising cost of aviation fuel. As such, aircraft OEMs are beginning to set increasingly demanding mass targets for their new platforms.

  14. Cluster-based exposure variation analysis

    PubMed Central

    2013-01-01

    Background Static posture, repetitive movements and lack of physical variation are known risk factors for work-related musculoskeletal disorders and thus need to be properly assessed in occupational studies. The aims of this study were (i) to investigate the effectiveness of a conventional exposure variation analysis (EVA) in discriminating exposure time lines and (ii) to compare it with a new cluster-based method for analysis of exposure variation. Methods For this purpose, we simulated a repeated cyclic exposure varying within each cycle between "low" and "high" exposure levels in a "near" or "far" range, and with "low" or "high" velocities (exposure change rates). The duration of each cycle was also manipulated by selecting a "small" or "large" standard deviation of the cycle time. These parameters reflected three dimensions of exposure variation, i.e. range, frequency and temporal similarity. Each simulation trace included two realizations of 100 concatenated cycles with either low (ρ = 0.1), medium (ρ = 0.5) or high (ρ = 0.9) correlation between the realizations. These traces were analyzed by conventional EVA and by a novel cluster-based EVA (C-EVA). Principal component analysis (PCA) was applied to the marginal distributions of 1) the EVA of each of the realizations (univariate approach), 2) a combination of the EVA of both realizations (multivariate approach) and 3) C-EVA. The smallest number of principal components describing more than 90% of the variability in each case was selected, and the projection of the marginal distributions along the selected principal components was calculated. A linear classifier was then applied to these projections to discriminate between the simulated exposure patterns, and the accuracy of the classified realizations was determined. Results C-EVA classified exposures more correctly than the univariate and multivariate EVA approaches; classification accuracy was 49%, 47% and 52% for univariate EVA, multivariate EVA and C-EVA, respectively (p < 0.001). All three methods performed poorly in discriminating exposure patterns differing with respect to the variability in cycle time duration. Conclusion While C-EVA had a higher accuracy than conventional EVA, both failed to detect differences in temporal similarity. The data-driven optimality of data reduction and the capability of handling multiple exposure time lines in a single analysis are the advantages of C-EVA. PMID:23557439
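    A minimal sketch of the analysis chain (marginal EVA-like distributions per trace, PCA retaining more than 90% of the variance, then a linear classifier), with synthetic traces standing in for the simulated exposure patterns; it is not the paper's C-EVA implementation.

```python
# Sketch: marginal (EVA-like) distributions per exposure trace -> PCA -> linear classifier.
# Synthetic traces stand in for the simulated exposure patterns.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)

def exposure_trace(level_range, n=6000):
    # square-wave-like exposure alternating between two levels within a given range
    lows = rng.normal(10, 2, n // 2)
    highs = rng.normal(10 + level_range, 2, n // 2)
    return np.ravel(np.column_stack([lows, highs]))

def marginal_distribution(trace, bins=np.linspace(0, 60, 13)):
    hist, _ = np.histogram(trace, bins=bins, density=True)
    return hist

# two exposure classes: "near" vs "far" variation range
X = np.array([marginal_distribution(exposure_trace(r)) for r in ([15] * 50 + [30] * 50)])
y = np.array([0] * 50 + [1] * 50)

pca = PCA(n_components=0.9)            # keep components explaining >90% of variance
scores = pca.fit_transform(X)
acc = cross_val_score(LinearDiscriminantAnalysis(), scores, y, cv=5).mean()
print("components kept:", pca.n_components_, " classification accuracy:", round(acc, 2))
```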

  15. The Discriminant Analysis Flare Forecasting System (DAFFS)

    NASA Astrophysics Data System (ADS)

    Leka, K. D.; Barnes, Graham; Wagner, Eric; Hill, Frank; Marble, Andrew R.

    2016-05-01

    The Discriminant Analysis Flare Forecasting System (DAFFS) has been developed under NOAA Small Business Innovative Research (SBIR) funds to quantitatively improve upon the NOAA/SWPC flare prediction. In Phase I of this project, it was demonstrated that DAFFS could indeed improve most of the standard NOAA/SWPC flare prediction data products by the requested 25%. In Phase II of this project, a prototype has been developed and is presently running autonomously at NWRA. DAFFS uses near-real-time data from NOAA/GOES, SDO/HMI, and the NSO/GONG network to issue both region- and full-disk forecasts of solar flares, based on multi-variable non-parametric discriminant analysis. Presently, DAFFS provides forecasts that match those provided by NOAA/SWPC in terms of thresholds and validity periods (including 1-, 2-, and 3-day forecasts), although issued twice daily. Of particular note regarding DAFFS capabilities are the redundant system design, the automatically generated validation statistics and the large range of customizable options available. As part of this poster, a description of the data used, the algorithm, performance and customizable options will be presented, as well as a demonstration of the DAFFS prototype. DAFFS development at NWRA is supported by NOAA/SBIR contracts WC-133R-13-CN-0079 and WC-133R-14-CN-0103, with additional support from NASA contract NNH12CG10C, plus acknowledgment to the SDO/HMI and NSO/GONG facilities and NOAA/SWPC personnel for data products, support, and feedback. DAFFS is presently ready for Phase-III development.

  16. Forecasting of hourly load by pattern recognition in a small area power system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehdashti-Shahrokh, A.

    1982-01-01

    An intuitive, logical, simple and efficient method of forecasting hourly load in a small-area power system is presented. A pattern recognition approach is used in developing the forecasting model. Pattern recognition techniques are powerful tools in the field of artificial intelligence (cybernetics) and simulate the way the human brain operates to make decisions. Pattern recognition is generally used in the analysis of processes where the total physical nature behind the process variation is unknown but specific kinds of measurements explain their behavior. In this research, basic multivariate analyses, in conjunction with pattern recognition techniques, are used to develop a linear deterministic model to forecast hourly load. The method assumes that load patterns in the same geographical area are a direct result of climatological changes (weather-sensitive load) and have occurred in the past under similar climatic conditions. The algorithm described here searches a seasonal library of load and weather data for the best possible pattern when forecasting hourly load. To accommodate the unpredictability of weather and the resulting load, the basic twenty-four-hour load pattern was divided into eight three-hour intervals. This division was made to make the model adaptive to sudden climatic changes. The proposed method offers flexible lead times of one to twenty-four hours. The results of testing on actual data indicated that the proposed method is computationally efficient and highly adaptive, with acceptable data storage requirements and accuracy comparable to many other existing methods.
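    An illustrative nearest-pattern retrieval in the spirit of this approach, assuming a synthetic seasonal library of weather and load: the forecast for a three-hour block is taken from the historical day whose weather pattern is most similar. This is a sketch under stated assumptions, not the dissertation's algorithm or data.

```python
# Nearest-pattern retrieval: forecast a 3-hour load block from the most similar
# historical day in a (synthetic) seasonal library of weather and load.
import numpy as np

rng = np.random.default_rng(7)
days = 300
temp = 20 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
humid = 60 + rng.normal(0, 8, days)
weather = np.column_stack([temp, humid])

# hourly load for each day, driven mainly by temperature (weather-sensitive load)
hours = np.arange(24)
base_shape = 100 + 30 * np.sin(2 * np.pi * (hours - 6) / 24)
load = base_shape + 2.5 * temp[:, None] + rng.normal(0, 5, (days, 24))

def forecast_block(weather_today, block=(12, 15)):
    # standardize weather features, find the most similar library day,
    # and return its load over the requested 3-hour interval
    mu, sd = weather.mean(axis=0), weather.std(axis=0)
    dist = np.linalg.norm((weather - mu) / sd - (weather_today - mu) / sd, axis=1)
    best = np.argmin(dist)
    return load[best, block[0]:block[1]]

print("forecast for 12:00-15:00:", np.round(forecast_block(np.array([28.0, 55.0])), 1))
```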

  17. Assessing methods for developing crop forecasting in the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Ines, A. V. M.; Capa Morocho, M. I.; Baethgen, W.; Rodriguez-Fonseca, B.; Han, E.; Ruiz Ramos, M.

    2015-12-01

    Seasonal climate prediction may allow predicting crop yield to reduce the vulnerability of agricultural production to climate variability and its extremes. It has already been demonstrated that seasonal climate predictions at the European (or Iberian) scale from ensembles of global coupled climate models have some skill (Palmer et al., 2004). The limited predictability that the atmosphere exhibits in mid-latitudes, and therefore over the Iberian Peninsula (IP), can be managed through a probabilistic approach based on terciles. This study presents an application for the IP of two methods for linking tercile-based seasonal climate forecasts with crop models to improve crop predictability. Two methods for disaggregating seasonal rainfall forecasts into daily weather realizations were evaluated and applied: 1) a stochastic weather generator and 2) a forecast tercile resampler. Both methods were evaluated in a case study in which the impacts of two seasonal rainfall forecasts (a wet and a dry forecast, for 1998 and 2015 respectively) on rainfed wheat yield and irrigation requirements of maize in the IP were analyzed. Simulated wheat yield and irrigation requirements of maize were computed with the crop models CERES-wheat and CERES-maize, which are included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at several locations in Spain where the crop model was calibrated and validated with independent field data. These methodologies would allow quantifying the benefits and risks of a seasonal climate forecast to potential users such as farmers, agro-industry and insurance companies in the IP. Therefore, we would be able to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse ones. References: Palmer, T. et al., 2004. Development of a European multimodel ensemble system for seasonal-to-interannual prediction (DEMETER). Bulletin of the American Meteorological Society, 85(6): 853-872.
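    A minimal sketch of a forecast tercile resampler under synthetic data: historical seasons are grouped by terciles of seasonal rainfall, and daily realizations are drawn from years sampled according to the forecast tercile probabilities. It is not the DSSAT-linked implementation used in the study.

```python
# Forecast tercile resampler sketch: draw daily-weather realizations consistent
# with seasonal tercile probabilities by resampling historical years. Synthetic data.
import numpy as np

rng = np.random.default_rng(8)
n_years, n_days = 30, 120
daily_rain = rng.gamma(0.4, 6.0, (n_years, n_days))        # placeholder daily rainfall
season_total = daily_rain.sum(axis=1)

# assign each historical year to a tercile of the seasonal total
t1, t2 = np.quantile(season_total, [1 / 3, 2 / 3])
tercile = np.digitize(season_total, [t1, t2])               # 0 = dry, 1 = normal, 2 = wet

def resample_realizations(p_terciles, n_realizations=50):
    """Draw daily-weather realizations consistent with the tercile probabilities."""
    realizations = []
    for _ in range(n_realizations):
        k = rng.choice(3, p=p_terciles)                      # pick a tercile
        year = rng.choice(np.where(tercile == k)[0])         # pick a year in that tercile
        realizations.append(daily_rain[year])
    return np.array(realizations)

wet_forecast = resample_realizations([0.15, 0.25, 0.60])     # e.g., a "wet" seasonal forecast
print("mean seasonal total of resampled realizations:",
      round(wet_forecast.sum(axis=1).mean(), 1),
      "vs climatology:", round(season_total.mean(), 1))
```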

  18. Value of Adaptive Drought Forecasting and Management for the ACF River Basin in the Southeast U.S.

    NASA Astrophysics Data System (ADS)

    Georgakakos, A. P.; Kistenmacher, M.

    2016-12-01

    In recent times, severe droughts in the southeast U.S. occur every 6 to 10 years and last for up to 4 years. During such drought episodes, the ACF River Basin supplies decline by up to 50 % of their normal levels, and water stresses increase rather markedly, exacerbating stakeholder anxiety and conflicts. As part of the ACF Stakeholder planning process, GWRI has developed new tools and carried out comprehensive assessments to provide quantitative answers to several important questions related to drought prediction and management: (i) Can dry and wet climatic periods be reliably anticipated with sufficiently long lead times? What drought indices can support reliable, skillful, and long-lead forecasts? (ii) What management objectives can seasonal climate forecasts benefit? How should benefits/impacts be shared? (iii) What operational adjustments are likely to mitigate stakeholder impacts or increase benefits consistent with stakeholder expectations? Regarding drought prediction, a large number of indices were defined and tested at different basin locations and lag times. These included local/cumulative unimpaired flows (UIFs) at 10 river nodes; Mean Areal Precipitation (MAP); Standard Precipitation Index (SPI); Palmer Drought Severity Index; Palmer Modified Drought Index; Palmer Z-Index; Palmer Hydrologic Drought Severity Index; and Soil Moisture—GWRI watershed model. Our findings show that all ACF sub-basins exhibit good forecast skill throughout the year and with sufficient lead time. Index variables with high explanatory value include: previous UIFs, soil moisture states (generated by the GWRI watershed model), and PDSI. Regarding drought management, assessments with coupled forecast-management schemes demonstrate that the use of adaptive forecast-management procedures improves reservoir operations and meets basin demands more reliably. Such improvements can support better management of lake levels, higher environmental and navigation flows, higher dependable power generation hours, and better management of consumptive uses without adverse impacts on other stakeholder interests. However, realizing these improvements requires (1) usage of adaptive reservoir management procedures (incorporating forecasts), and (2) stakeholder agreement on equitable benefit sharing.

  19. Towards guided data assimilation for operational hydrologic forecasting in the US Tennessee River basin

    NASA Astrophysics Data System (ADS)

    Weerts, A.; Wood, A. W.; Clark, M. P.; Carney, S.; Day, G. N.; Lemans, M.; Sumihar, J.; Newman, A. J.

    2014-12-01

    In the US, the forecasting approach used by the NWS River Forecast Centers and other regional organizations such as the Bonneville Power Administration (BPA) or the Tennessee Valley Authority (TVA) has traditionally involved manual model input and state modifications made by forecasters in real time. This process is time consuming and requires expert knowledge and experience. The benefits of automated data assimilation (DA) as a strategy for avoiding manual modification have been demonstrated in research studies (e.g., Seo et al., 2009). This study explores the use of various ensemble DA algorithms within the operational platform used by TVA. The final goal is to identify a DA algorithm that will guide the manual modification process used by TVA forecasters and realize considerable time gains within the forecast process, without loss of quality or even with enhanced quality. We evaluate the usability of several popular DA algorithms that have so far been applied only on a limited basis in operational hydrology. To this end, Delft-FEWS was wrapped (via piwebservice) in OpenDA to enable execution of FEWS workflows (and the chained models within these workflows, including SACSMA, UNITHG and LAGK) in a DA framework. Within OpenDA, several filter methods are available. We considered four algorithms: the particle filter (RRF), the ensemble Kalman filter, and the asynchronous ensemble Kalman and particle filters. Retrospective simulation results for one location and one algorithm (AEnKF) are illustrated in Figure 1. The initial results are promising. We will present verification results for these methods (and possibly more) for a variety of sub-basins in the Tennessee River basin. Finally, we will offer recommendations for guided DA based on our results. References: Seo, D.-J., L. Cajina, R. Corby and T. Howieson, 2009: Automatic State Updating for Operational Streamflow Forecasting via Variational Data Assimilation, Journal of Hydrology, 367, 255-275. Figure 1. Retrospectively simulated streamflow for the headwater basin above Powell River at Jonesville (red: observed flow; blue: simulated flow without DA; black: simulated flow with DA).
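    A generic stochastic EnKF analysis step for updating an ensemble of model states with a single streamflow observation, shown only to make the algorithm class concrete; it is not the OpenDA / Delft-FEWS implementation, and all numbers are placeholders.

```python
# Generic stochastic EnKF analysis step: update an ensemble of model states with
# one streamflow observation (perturbed-observation variant).
import numpy as np

rng = np.random.default_rng(9)
n_state, n_ens = 5, 40                       # e.g., SAC-SMA storages + routed flow
X = rng.normal(50, 10, (n_state, n_ens))     # forecast (prior) ensemble of states
H = np.zeros((1, n_state)); H[0, -1] = 1.0   # observe the last state (streamflow)
y_obs = np.array([70.0])                     # observed streamflow
R = np.array([[4.0]])                        # observation error variance

# ensemble perturbation matrices
x_mean = X.mean(axis=1, keepdims=True)
A = X - x_mean
HX = H @ X
HA = HX - HX.mean(axis=1, keepdims=True)

# Kalman gain from ensemble covariances: K = Pxy (Pyy + R)^{-1}
Pxy = A @ HA.T / (n_ens - 1)
Pyy = HA @ HA.T / (n_ens - 1) + R
K = Pxy @ np.linalg.inv(Pyy)

# update each member against a perturbed observation (stochastic EnKF)
y_pert = y_obs[:, None] + rng.normal(0, np.sqrt(R[0, 0]), (1, n_ens))
X_analysis = X + K @ (y_pert - HX)

print("prior mean flow:", round(HX.mean().item(), 1),
      " analysis mean flow:", round((H @ X_analysis).mean().item(), 1))
```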

  20. Research on strategy and optimization method of PRT empty vehicles resource allocation based on traffic demand forecast

    NASA Astrophysics Data System (ADS)

    Xiang, Yu; Tao, Cheng

    2018-05-01

    During the operation of a personal rapid transit (PRT) system, empty-vehicle resources are distributed unevenly because of uneven passenger demand. In order to maintain the balance between supply and demand and to meet passengers' travel needs, a PRT empty-vehicle resource allocation model is constructed in this paper based on future demand forecast from historical demand. An improved genetic algorithm is applied to the distribution of empty vehicles, which reduces customer waiting time and improves the operational efficiency of the PRT system so that all passengers can board PRT vehicles in the shortest time. The experimental results show that the improved genetic algorithm can allocate empty vehicles optimally at the system level and realize a reasonable distribution of empty-vehicle resources in the system.

  1. AOD furnace splash soft-sensor in the smelting process based on improved BP neural network

    NASA Astrophysics Data System (ADS)

    Ma, Haitao; Wang, Shanshan; Wu, Libin; Yu, Ying

    2017-11-01

    For the argon-oxygen refining process used to produce low-carbon ferrochrome, splash during smelting is taken as the research object. Based on an analysis of the splash mechanism in the smelting process, a soft-sensor approach using multi-sensor information fusion and BP neural network modeling is proposed in this paper. The vibration signal, the audio signal and the flame image signal from the furnace are used as the characteristic signals of splash; these signals are fused and modeled to reconstruct the splash signal and realize soft measurement of splash during smelting. The simulation results show that the method can accurately forecast the splash type in the smelting process, providing a new measurement approach for splash forecasting and more accurate information for splash control.

  2. Prediction of the birch pollen season characteristics in Cracow, Poland using an 18-year data series.

    PubMed

    Dorota, Myszkowska

    2013-03-01

    The aim of the study was to construct a model forecasting the birch pollen season characteristics in Cracow on the basis of an 18-year data series. The study was performed using the volumetric method (Lanzoni/Burkard trap). The 98/95 % method was used to calculate the pollen season. Spearman's correlation test was applied to find the relationship between the meteorological parameters and the pollen season characteristics. To construct the predictive model, backward stepwise multiple regression analysis was used, accounting for the multi-collinearity of variables. The predictive models best fitted the pollen season start and end, especially models containing two independent variables. The peak concentration value was predicted with a higher prediction error. Also, the accuracy of the models in predicting the pollen season characteristics was higher for 2009 than for 2010. Both the multi-variable model and the one-variable model for the beginning of the pollen season included air temperature during the last 10 days of February, while the multi-variable model also included humidity at the beginning of April. The models forecasting the end of the pollen season were based on temperature in March-April, while the peak day was predicted using the temperature during the last 10 days of March.

  3. Development of Ensemble Model Based Water Demand Forecasting Model

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Han; So, Byung-Jin; Kim, Seong-Hyeon; Kim, Byung-Seop

    2014-05-01

    The Smart Water Grid (SWG) concept has emerged globally over the last decade and has also gained significant recognition in South Korea. In particular, growing interest in water demand forecasting and optimal pump operation has led to various studies on energy saving and the improvement of water supply reliability. Existing water demand forecasting models fall into two groups with respect to how they model and predict behavior in time series. One considers embedded patterns such as seasonality, periodicity and trends, and the other is an autoregressive model using short-memory Markovian processes (Emmanuel et al., 2012). The main disadvantage of such models is the limited predictability of water demand at roughly sub-daily scales, because the system is nonlinear. In this regard, this study aims to develop a nonlinear ensemble model for hourly water demand forecasting which allows us to estimate uncertainties across different model classes. The proposed model consists of two parts. One is a multi-model scheme based on a combination of independent prediction models. The other is a cross-validation scheme, the bagging approach introduced by Breiman (1996), used to derive weighting factors for the individual models. The individual forecasting models used in this study are linear regression, polynomial regression, multivariate adaptive regression splines (MARS) and support vector machines (SVM). The concepts are demonstrated through application to data observed at water plants at several locations in South Korea. Keywords: water demand, nonlinear model, ensemble forecasting model, uncertainty. Acknowledgements: This subject is supported by the Korea Ministry of Environment under "Projects for Developing Eco-Innovation Technologies" (GT-11-G-02-001-6).
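    A compact sketch of a multi-model ensemble whose weights come from bootstrap (bagging-style) out-of-sample errors, in the spirit of the abstract; the model list is abbreviated and the inverse-error weighting rule is an illustrative assumption, not the authors' formulation.

```python
# Multi-model ensemble with bagging-style weights: each model's weight is derived
# from its out-of-bag RMSE over bootstrap resamples. Synthetic hourly demand data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(10)
hours = np.arange(24 * 60)
demand = (500 + 80 * np.sin(2 * np.pi * hours / 24)
          + 40 * np.sin(2 * np.pi * hours / (24 * 7)) + rng.normal(0, 15, hours.size))
X = np.column_stack([np.sin(2 * np.pi * hours / 24), np.cos(2 * np.pi * hours / 24),
                     np.sin(2 * np.pi * hours / (24 * 7)), np.cos(2 * np.pi * hours / (24 * 7))])

models = {
    "linear": LinearRegression(),
    "poly": make_pipeline(PolynomialFeatures(2), LinearRegression()),
    "svm": make_pipeline(StandardScaler(), SVR(C=100.0)),
}

# bagging-style weights: average out-of-bag RMSE over bootstrap resamples
errors = {name: [] for name in models}
for _ in range(10):
    idx = rng.choice(len(X), len(X), replace=True)
    oob = np.setdiff1d(np.arange(len(X)), idx)
    for name, model in models.items():
        model.fit(X[idx], demand[idx])
        errors[name].append(np.sqrt(np.mean((model.predict(X[oob]) - demand[oob]) ** 2)))

inv_err = {name: 1.0 / np.mean(e) for name, e in errors.items()}
weights = {name: v / sum(inv_err.values()) for name, v in inv_err.items()}
print("ensemble weights:", {k: round(v, 2) for k, v in weights.items()})

# weighted ensemble forecast (here for the last hour's feature vector, as a demo)
forecast = sum(w * models[name].fit(X, demand).predict(X[-1:])[0]
               for name, w in weights.items())
print("ensemble forecast:", round(forecast, 1))
```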

  4. Exploiting teleconnection indices for probabilistic forecasting of drought class transitions in Sicily region (Italy)

    NASA Astrophysics Data System (ADS)

    Bonaccorso, Brunella; Cancelliere, Antonino

    2015-04-01

    In the present study, two probabilistic models for short- to medium-term drought forecasting, able to include information provided by teleconnection indices, are proposed and applied to the Sicily region (Italy). Drought conditions are expressed in terms of the Standardized Precipitation-Evapotranspiration Index (SPEI) at different aggregation time scales. More specifically, a multivariate approach based on the normal distribution is developed in order to estimate 1) transition probabilities to future SPEI drought classes and 2) SPEI forecasts at a generic time horizon M, both as functions of past values of SPEI and the selected teleconnection index. To this end, SPEI series at the 3, 4 and 6 aggregation time scales for the Sicily region are extracted from the global SPEI database, SPEIbase, available at the web repository of the Spanish National Research Council (http://sac.csic.es/spei/database.html), and averaged over the study area. In particular, SPEIbase v2.3, with a spatial resolution of 0.5° lat/lon and temporal coverage from January 1901 to December 2013, is used. A preliminary correlation analysis is carried out to investigate the link between the drought index and different teleconnection patterns, namely the North Atlantic Oscillation (NAO), the Scandinavian (SCA) and the East Atlantic-West Russia (EA-WR) patterns. The results of this analysis indicate a stronger influence of NAO on drought conditions in Sicily than of the other teleconnection indices. The proposed forecasting methodology is then applied, and the forecasting skill of the proposed models is quantitatively assessed through a simple score approach and performance indices. Results indicate that inclusion of the NAO index generally enhances model performance, thus confirming the suitability of the models for short- to medium-term forecasting of drought conditions.
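    The conditional-normal machinery behind such models can be illustrated as follows, with an assumed (synthetic) joint covariance for future SPEI, current SPEI and NAO, and an assumed drought-class threshold; it is not the fitted Sicily model.

```python
# With an estimated joint Gaussian for (SPEI_{t+M}, SPEI_t, NAO_t), the forecast is
# the conditional mean and class-transition probabilities follow from the conditional
# distribution. Covariance values and the class threshold are illustrative.
import numpy as np
from scipy.stats import norm

# joint covariance of (SPEI_{t+M}, SPEI_t, NAO_t), standardized variables
Sigma = np.array([[1.00, 0.55, -0.35],
                  [0.55, 1.00, -0.20],
                  [-0.35, -0.20, 1.00]])
mu = np.zeros(3)

# partition: predictand (index 0) vs predictors (indices 1, 2)
S11 = Sigma[:1, :1]
S12 = Sigma[:1, 1:]
S22 = Sigma[1:, 1:]

x_obs = np.array([-1.2, 0.8])          # current SPEI and NAO values
cond_mean = mu[0] + (S12 @ np.linalg.solve(S22, x_obs - mu[1:]))[0]
cond_var = (S11 - S12 @ np.linalg.solve(S22, S12.T))[0, 0]

# probability of falling in an assumed "severe drought" class (SPEI <= -1.5) at t+M
p_severe = norm.cdf(-1.5, loc=cond_mean, scale=np.sqrt(cond_var))
print(f"forecast SPEI: {cond_mean:.2f} +/- {np.sqrt(cond_var):.2f}, "
      f"P(severe drought) = {p_severe:.2f}")
```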

  5. Big data driven cycle time parallel prediction for production planning in wafer manufacturing

    NASA Astrophysics Data System (ADS)

    Wang, Junliang; Yang, Jungang; Zhang, Jie; Wang, Xiaoxi; Zhang, Wenjun Chris

    2018-07-01

    Cycle time forecasting (CTF) is one of the most crucial issues for production planning to maintain high delivery reliability in semiconductor wafer fabrication systems (SWFS). This paper proposes a novel data-intensive cycle time (CT) prediction system with parallel computing to rapidly forecast the CT of wafer lots from large datasets. First, a density-peak-based radial basis function network (DP-RBFN) is designed to forecast the CT from the diverse and agglomerative CT data. Second, a network learning method based on a clustering technique is proposed to determine the density peaks. Third, a parallel computing approach for network training is proposed in order to speed up the training process with large-scale CT data. Finally, an experiment on an SWFS is presented, which demonstrates that the proposed CTF system can not only speed up the training process of the model but also outperform CTF methods based on the radial basis function network, the back-propagation network and multivariate regression in terms of the mean absolute deviation and standard deviation.

  6. A Copula-Based Conditional Probabilistic Forecast Model for Wind Power Ramps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Brian S; Krishnan, Venkat K; Zhang, Jie

    Efficient management of wind ramping characteristics can significantly reduce wind integration costs for balancing authorities. By considering the stochastic dependence of wind power ramp (WPR) features, this paper develops a conditional probabilistic wind power ramp forecast (cp-WPRF) model based on copula theory. The WPR dataset is constructed by extracting ramps from a large dataset of historical wind power. Each WPR feature (e.g., rate, magnitude, duration, and start time) is forecasted separately while considering the coupling effects among the different ramp features. To accurately model the marginal distributions within a copula, a Gaussian mixture model (GMM) is adopted to characterize the WPR uncertainty and features. The canonical maximum likelihood (CML) method is used to estimate the parameters of the multivariate copula. The optimal copula model is chosen from each copula family based on the Bayesian information criterion (BIC). Finally, the best condition-based cp-WPRF model is determined using prediction interval (PI) based evaluation metrics. Numerical simulations on publicly available wind power data show that the developed copula-based cp-WPRF model can predict WPRs with a high level of reliability and sharpness.
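    A simplified Gaussian-copula analogue of the idea, assuming synthetic ramp features: GMM marginals, a Gaussian copula for the dependence, and sampling of ramp magnitude conditional on an observed duration. The paper instead selects the copula family by BIC and fits it by canonical maximum likelihood; this sketch only makes the workflow concrete.

```python
# Gaussian-copula sketch with GMM marginals: sample ramp magnitude conditional on
# ramp duration. Synthetic ramp features; not the paper's copula family or fit.
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(11)
n = 3000
duration = rng.gamma(3.0, 1.5, n)                        # hours (placeholder ramps)
magnitude = 0.15 * duration + rng.gamma(2.0, 0.5, n)     # normalized power (placeholder)
data = np.column_stack([magnitude, duration])

def gmm_cdf(gmm, x):
    w, mu, sd = gmm.weights_, gmm.means_.ravel(), np.sqrt(gmm.covariances_.ravel())
    return np.sum(w * norm.cdf((x[:, None] - mu) / sd), axis=1)

# 1) GMM marginals and transformation to Gaussian-copula (normal-score) space
gmms = [GaussianMixture(3, random_state=0).fit(col.reshape(-1, 1)) for col in data.T]
U = np.column_stack([np.clip(gmm_cdf(g, col), 1e-6, 1 - 1e-6)
                     for g, col in zip(gmms, data.T)])
Z = norm.ppf(U)
rho = np.corrcoef(Z.T)[0, 1]

# 2) conditional sampling of magnitude given an observed duration
dur_obs = np.array([8.0])
z_dur = norm.ppf(np.clip(gmm_cdf(gmms[1], dur_obs), 1e-6, 1 - 1e-6))[0]
z_mag = rng.normal(rho * z_dur, np.sqrt(1 - rho ** 2), 5000)

# 3) back-transform via the numerically inverted magnitude marginal CDF
grid = np.linspace(magnitude.min(), magnitude.max() * 1.5, 4000)
mag_samples = np.interp(norm.cdf(z_mag), gmm_cdf(gmms[0], grid), grid)
print("conditional 10/50/90% magnitude:",
      np.round(np.percentile(mag_samples, [10, 50, 90]), 2))
```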

  7. Improving the effectiveness of real-time flood forecasting through Predictive Uncertainty estimation: the multi-temporal approach

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Todini, Ezio

    2015-04-01

    The negative effects of severe flood events are usually counteracted through structural measures, which, however, do not fully eliminate flood risk. Non-structural measures, such as real-time flood forecasting and warning, are also required. Accurate future predictions of stage or discharge with an appropriate forecast lead time are sought by decision-makers for implementing strategies to mitigate the adverse effects of floods. Traditionally, flood forecasting has been approached using rainfall-runoff and/or flood routing modelling. However, neither type of forecast can be considered a perfect representation of future outcomes, because of the lack of complete knowledge of the processes involved (Todini, 2004). Nonetheless, although aware that model forecasts do not perfectly represent future outcomes, decision makers de facto implicitly treat the forecast of water level/discharge/volume, etc. as "deterministic" and as coinciding with what is going to occur. Recently the concept of Predictive Uncertainty (PU) was introduced in hydrology (Krzysztofowicz, 1999), and several uncertainty processors were developed (Todini, 2008). PU is defined as the probability of occurrence of the future realization of a predictand (water level/discharge/volume) conditional on: i) prior observations and knowledge, and ii) the available information on the future value, typically provided by one or more forecast models. Unfortunately, PU has frequently been interpreted as a measure of lack of accuracy rather than as the appropriate tool for taking the most appropriate decisions given one or more model forecasts. With the aim of shedding light on the benefits of appropriately using PU, a multi-temporal approach based on the MCP approach (Todini, 2008; Coccia and Todini, 2011) is here applied to stage forecasts at sites along the Upper Tiber River. Specifically, the STAge Forecasting-Rating Curve Model Muskingum-based (STAFOM-RCM) (Barbetta et al., 2014) and the Rating-Curve Model in Real Time (RCM-RT) (Barbetta and Moramarco, 2014) are used to this end. Both models, which do not consider rainfall information, explicitly account, at each forecast time, for the estimated lateral contribution along the river reach for which the stage forecast is issued at the downstream end. The analysis is performed for several reaches using different lead times according to the channel length. Barbetta, S., Moramarco, T., Brocca, L., Franchini, M. and Melone, F. 2014. Confidence interval of real-time forecast stages provided by the STAFOM-RCM model: the case study of the Tiber River (Italy). Hydrological Processes, 28(3), 729-743. Barbetta, S. and Moramarco, T. 2014. Real-time flood forecasting by relating local stage and remote discharge. Hydrological Sciences Journal, 59(9), 1656-1674. Coccia, G. and Todini, E. 2011. Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274. doi:10.5194/hess-15-3253-2011. Krzysztofowicz, R. 1999. Bayesian theory of probabilistic forecasting via deterministic hydrologic model. Water Resour. Res., 35, 2739-2750. Todini, E. 2004. Role and treatment of uncertainty in real-time flood forecasting. Hydrological Processes, 18(14), 2743-2746. Todini, E. 2008. A model conditional processor to assess predictive uncertainty in flood forecasting. Intl. J. River Basin Management, 6(2), 123-137.

  8. Decision Support on the Sediments Flushing of Aimorés Dam Using Medium-Range Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Mainardi Fan, Fernando; Schwanenberg, Dirk; Collischonn, Walter; Assis dos Reis, Alberto; Alvarado Montero, Rodolfo; Alencar Siqueira, Vinicius

    2015-04-01

    In the present study we investigate the use of medium-range streamflow forecasts in the Doce River basin (Brazil), at the reservoir of the Aimorés hydro power plant (HPP). During daily operations this reservoir acts as a "trap" for the sediments that originate from the upstream Doce River basin. This motivates a cleaning process called "pass through" to periodically remove the sediments from the reservoir. The "pass through" or "sediment flushing" process consists of lowering the reservoir's water level to a certain flushing level when a given reservoir inflow threshold is forecasted. The water in the approaching inflow is then used to flush the sediments from the reservoir through the spillway and to recover the original reservoir storage. To be triggered, the sediment flushing operation requires an inflow larger than 3000 m³/s within a forecast horizon of 7 days. This lead time of 7 days is far beyond the basin's concentration time (around 2 days), meaning that the forecasts for the pass-through procedure depend strongly on numerical weather prediction (NWP) models that generate quantitative precipitation forecasts (QPF). This dependency creates an environment with a high amount of uncertainty for the operator. To support decision making at Aimorés HPP we developed a fully operational hydrological forecasting system for the basin. The system is capable of generating ensemble streamflow forecast scenarios when driven by QPF data from meteorological ensemble prediction systems (EPS). This approach allows uncertainties in the NWP to be accounted for at the decision-making level. This system is starting to be used operationally by CEMIG and is the one presented here, including a hindcasting analysis to assess the performance of the system for the specific flushing problem. The QPF data used in the hindcasting study were derived from the TIGGE (THORPEX Interactive Grand Global Ensemble) database. Among all EPS available on TIGGE, three were selected: ECMWF, GEFS, and CPTEC. As a deterministic reference forecast, we adopt the high-resolution ECMWF forecast for comparison. The experiment consisted of running retrospective forecasts for a full five-year period. To verify the proposed objectives of the study, we use different metrics to evaluate the forecasts: ROC curves, exceedance diagrams and the forecast convergence score (FCS). The metric results made it possible to understand the benefits of the hydrological ensemble prediction system as a decision-making tool for the HPP operation. The ROC scores indicate that using the lower percentiles of the ensemble scenarios yields a hit rate of around 0.5 to 0.8 (depending on the model and on the percentile) at the lead time of seven days, while the false alarm rate is between 0 and 0.3. These rates were better than those resulting from the deterministic reference forecast. The exceedance diagrams and forecast convergence scores indicate that the ensemble scenarios provide an early signal of the threshold crossing. Furthermore, the ensemble forecasts are more consistent between two subsequent forecasts than the deterministic forecast. The assessment results also give CEMIG more credibility in carrying out and communicating the flushing operation with the stakeholders involved.

  9. [The warning model and influence of climatic changes on hemorrhagic fever with renal syndrome in Changsha city].

    PubMed

    Xiao, Hong; Tian, Huai-yu; Zhang, Xi-xing; Zhao, Jian; Zhu, Pei-juan; Liu, Ru-chun; Chen, Tian-mu; Dai, Xiang-yu; Lin, Xiao-ling

    2011-10-01

    To assess the influence of climatic changes on the transmission of hemorrhagic fever with renal syndrome (HFRS), and to explore the use of climatic factors in HFRS early warning. A total of 2171 cases of HFRS and the synchronous climatic data in Changsha from 2000 to 2009 were collected to build a climate-based forecasting model for HFRS transmission. The Cochran-Armitage trend test was employed to explore the trend in the annual incidence of HFRS. Cross-correlation analysis was then adopted to assess the time-lag periods between the climatic factors, including monthly average temperature, relative humidity, rainfall and the Multivariate El Niño-Southern Oscillation Index (MEI), and the monthly HFRS cases. Finally, a time-series Poisson regression model was constructed to analyze the influence of the different climatic factors on HFRS transmission. The annual incidence of HFRS in Changsha between 2000-2009 was 13.09/100 000 (755 cases), 9.92/100 000 (578 cases), 5.02/100 000 (294 cases), 2.55/100 000 (150 cases), 1.13/100 000 (67 cases), 1.16/100 000 (70 cases), 0.95/100 000 (58 cases), 1.40/100 000 (87 cases), 0.75/100 000 (47 cases) and 1.02/100 000 (65 cases), respectively. The incidence showed a decline over these years (Z = -5.78, P < 0.01). The results of the Poisson regression model indicated that the monthly average temperature (18.00°C, r = 0.26, P < 0.01, 1-month lag; IRR = 1.02, 95% CI: 1.00 - 1.03, P < 0.01), relative humidity (75.50%, r = 0.62, P < 0.01, 3-month lag; IRR = 1.03, 95% CI: 1.02 - 1.04, P < 0.01), rainfall (112.40 mm, r = 0.25, P < 0.01, 6-month lag; IRR = 1.01, 95% CI: 1.01 - 1.02, P = 0.02), and MEI (r = 0.31, P < 0.01, 3-month lag; IRR = 0.77, 95% CI: 0.67 - 0.88, P < 0.01) were closely associated with the monthly HFRS cases (18.10 cases). Climatic factors significantly influence the incidence of HFRS. When the influences of autocorrelation, seasonality and long-term trend are controlled for, the forecasting accuracy of the time-series Poisson regression model in Changsha is comparatively high, and the incidence of HFRS can be forecast in advance.
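    A schematic time-series Poisson regression with lagged climate covariates plus trend and seasonality terms, using synthetic monthly data and the statsmodels GLM interface; the 1-month temperature and 3-month humidity lags are borrowed from the abstract for illustration only.

```python
# Time-series Poisson regression sketch: lagged climate covariates, linear trend
# and annual seasonality. Synthetic monthly case counts; illustrative lags.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(12)
months = np.arange(120)
temp = 18 + 8 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 1.5, months.size)
humid = 75 + 10 * np.cos(2 * np.pi * months / 12) + rng.normal(0, 3, months.size)
lam = np.exp(1.5 + 0.02 * np.roll(temp, 1) + 0.03 * (np.roll(humid, 3) - 75) - 0.005 * months)
cases = rng.poisson(lam)

df = pd.DataFrame({
    "cases": cases,
    "temp_lag1": np.roll(temp, 1),
    "humid_lag3": np.roll(humid, 3),
    "trend": months,
    "sin12": np.sin(2 * np.pi * months / 12),
    "cos12": np.cos(2 * np.pi * months / 12),
}).iloc[3:]                                   # drop rows contaminated by np.roll wrap-around

X = sm.add_constant(df[["temp_lag1", "humid_lag3", "trend", "sin12", "cos12"]])
model = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
irr = np.exp(model.params)                    # incidence rate ratios (IRR)
print(irr.round(3))
```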

  10. Extravehicular Activity (EVA) Technology Development Status and Forecast

    NASA Technical Reports Server (NTRS)

    Chullen, Cinda; Westheimer, David T.

    2010-01-01

    Beginning in Fiscal Year (FY) 2011, extravehicular activity (EVA) technology development became a foundational technology domain under a new program, Enabling Technology Development and Demonstration. The goal of the EVA technology effort is to further develop technologies that will be used to demonstrate a robust EVA system with application to a variety of future missions, including microgravity and surface EVA. Overall, the objectives are to reduce system mass, reduce consumables and maintenance, increase EVA hardware robustness and life, increase crew member efficiency and autonomy, and enable rapid vehicle egress and ingress. Over the past several years, NASA realized a tremendous increase in EVA system development as part of the Exploration Technology Development Program and the Constellation Program. The demand for efficient and reliable EVA technologies, particularly regenerable technologies, was apparent under these former programs and will continue as future mission opportunities arise. The technological need for EVA in space has been realized over the last several decades by the Gemini, Apollo, Skylab, Space Shuttle, and International Space Station (ISS) programs. EVAs were critical to the success of these programs. Now, with the ISS extension to 2028 in conjunction with a currently forecasted need of at least eight EVAs per year, EVA technology life and the limited availability of EMUs will eventually become a critical issue. The current Extravehicular Mobility Unit (EMU) has served EVA demands well, performing critical operations to assemble the ISS and to repair satellites such as the Hubble Space Telescope. However, as the life of the ISS and the vision for future mission opportunities are realized, a new EVA system capability could be an option for future mission applications, building off the technology development of the last several years. Besides the ISS, potential mission applications include EVAs for missions to near-Earth objects (NEO), Phobos, or future surface missions. Surface missions could include exploration of either the Moon or Mars. Providing an EVA capability for these types of missions enables in-space construction of complex vehicles or satellites, enables hands-on exploration of new parts of our solar system, and engages the public through the inspiration of knowing that humans are exploring places they have never been before. This paper offers insight into what is currently being developed and what the potential opportunities are in the forecast.

  11. Development and verification of a new wind speed forecasting system using an ensemble Kalman filter data assimilation technique in a fully coupled hydrologic and atmospheric model

    NASA Astrophysics Data System (ADS)

    Williams, John L.; Maxwell, Reed M.; Monache, Luca Delle

    2013-12-01

    Wind power is rapidly gaining prominence as a major source of renewable energy. Harnessing this promising energy source is challenging because of the chaotic and inherently intermittent nature of wind. Accurate forecasting tools are critical to support the integration of wind energy into power grids and to maximize its impact on renewable energy portfolios. We have adapted the Data Assimilation Research Testbed (DART), a community software facility which includes the ensemble Kalman filter (EnKF) algorithm, to expand our capability to use observational data to improve forecasts produced with a fully coupled hydrologic and atmospheric modeling system: the ParFlow (PF) hydrologic model and the Weather Research and Forecasting (WRF) mesoscale atmospheric model, coupled via mass and energy fluxes across the land surface and resulting in the PF.WRF model. Numerous studies have shown that soil moisture distribution and land surface vegetative processes profoundly influence atmospheric boundary layer development and weather processes on local and regional scales. We have used the PF.WRF model to explore the connections between the land surface and the atmosphere in terms of land surface energy flux partitioning and coupled variable fields including hydraulic conductivity, soil moisture, and wind speed, and demonstrated that reductions in uncertainty in these coupled fields, realized through assimilation of soil moisture observations, propagate through the hydrologic and atmospheric system. The sensitivities found in this study will enable further studies to optimize observation strategies and maximize the utility of the PF.WRF-DART forecasting system.

  12. Forecasting impact injuries of unrestrained occupants in railway vehicle passenger compartments.

    PubMed

    Xie, Suchao; Zhou, Hui

    2014-01-01

    In order to predict the injury parameters of occupants under different experimental conditions and to determine impact injury indices conveniently and efficiently, a model forecasting occupant impact injury was established in this work, based on a finite set of experimental observation values obtained by numerical simulation. First, the various factors influencing the impact injuries caused by the interaction between unrestrained occupants and the compartment's internal structures were collated, and the most vulnerable regions of the occupant's body were analyzed. Then, the forecast model was set up using a genetic algorithm-back propagation (GA-BP) hybrid algorithm, which combines the individual strengths of the back propagation artificial neural network (BP-ANN) and the genetic algorithm (GA). The model is well suited to studies of occupant impact injuries and allows multiple-parameter forecasts of impact injuries to be made for assumed values of the various influencing factors. Finally, the forecast results for three types of secondary collision were analyzed using forecasting accuracy evaluation methods, and all of the results showed good accuracy of the forecast model. When an occupant faced a table, the relative errors between the predicted and experimental values of the respective injury parameters were kept within ±6.0 percent and the average relative error (ARE) values did not exceed 3.0 percent. When an occupant faced a seat, the relative errors were kept within ±5.2 percent and the ARE values did not exceed 3.1 percent. When an occupant faced another occupant, the relative errors were kept within ±6.3 percent and the ARE values did not exceed 3.8 percent. The injury forecast model established in this article reduces the number of repeated experiments, improves the design efficiency of the compartment's internal structure parameters, and provides a new way to assess the safety performance of the interior structure in existing and newly designed railway vehicle compartments.
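
    As a rough illustration of the hybrid idea, the sketch below evolves neural-network settings with a small genetic algorithm on synthetic data. GA-BP hybrids commonly use the GA to optimize the network's initial weights; this simplified stand-in instead evolves only two hyperparameters (hidden-layer size and learning rate) and scores candidates by cross-validated error. The data, parameter ranges, and GA settings are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 5))                               # stand-in for collision parameters
y = X @ rng.uniform(size=5) + 0.1 * rng.normal(size=200)     # stand-in injury index

def fitness(genome):
    """Negative cross-validated MSE of an MLP defined by the genome (higher is better)."""
    hidden, lr = int(genome[0]), float(genome[1])
    model = MLPRegressor(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                         max_iter=500, random_state=0)
    return cross_val_score(model, X, y, cv=3, scoring="neg_mean_squared_error").mean()

pop = [(rng.integers(4, 32), rng.uniform(1e-3, 1e-1)) for _ in range(10)]
for generation in range(5):
    scored = sorted(pop, key=fitness, reverse=True)
    parents = scored[:4]                                     # selection: keep the fittest
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = rng.choice(len(parents), 2, replace=False)
        child = [parents[a][0], parents[b][1]]               # crossover: mix genes
        if rng.random() < 0.3:                               # mutation: perturb genes
            child[0] = int(np.clip(child[0] + rng.integers(-4, 5), 4, 64))
            child[1] = float(np.clip(child[1] * rng.uniform(0.5, 2.0), 1e-4, 0.5))
        children.append(tuple(child))
    pop = parents + children

best = max(pop, key=fitness)
print("best genome (hidden units, learning rate):", best)
```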

  13. Deconstructing multivariate decoding for the study of brain function.

    PubMed

    Hebart, Martin N; Baker, Chris I

    2017-08-04

    Multivariate decoding methods were developed originally as tools to enable accurate predictions in real-world applications. The realization that these methods can also be employed to study brain function has led to their widespread adoption in the neurosciences. However, prior to the rise of multivariate decoding, the study of brain function was firmly embedded in a statistical philosophy grounded on univariate methods of data analysis. In this way, multivariate decoding for brain interpretation grew out of two established frameworks: multivariate decoding for predictions in real-world applications, and classical univariate analysis based on the study and interpretation of brain activation. We argue that this led to two confusions, one reflecting a mixture of multivariate decoding for prediction or interpretation, and the other a mixture of the conceptual and statistical philosophies underlying multivariate decoding and classical univariate analysis. Here we attempt to systematically disambiguate multivariate decoding for the study of brain function from the frameworks it grew out of. After elaborating these confusions and their consequences, we describe six, often unappreciated, differences between classical univariate analysis and multivariate decoding. We then focus on how the common interpretation of what is signal and noise changes in multivariate decoding. Finally, we use four examples to illustrate where these confusions may impact the interpretation of neuroimaging data. We conclude with a discussion of potential strategies to help resolve these confusions in interpreting multivariate decoding results, including the potential departure from multivariate decoding methods for the study of brain function. Copyright © 2017. Published by Elsevier Inc.

  14. Forecasting Individual Headache Attacks Using Perceived Stress: Development of a Multivariable Prediction Model for Persons With Episodic Migraine.

    PubMed

    Houle, Timothy T; Turner, Dana P; Golding, Adrienne N; Porter, John A H; Martin, Vincent T; Penzien, Donald B; Tegeler, Charles H

    2017-07-01

    To develop and validate a prediction model that forecasts future migraine attacks for an individual headache sufferer. Many headache patients and physicians believe that precipitants of headache can be identified and avoided or managed to reduce the frequency of headache attacks. Of the numerous candidate triggers, perceived stress has received considerable attention for its association with the onset of headache in episodic and chronic headache sufferers. However, no evidence is available to support forecasting headache attacks within individuals using any of the candidate headache triggers. This longitudinal cohort study with forecasting model development enrolled 100 participants with episodic migraine with or without aura; 95 of them contributed 4626 days of electronic diary data and were included in the analysis. Individual headache forecasts were derived from current headache state and current levels of stress using several aspects of the Daily Stress Inventory, a measure of daily hassles that is completed at the end of each day. The primary outcome measure was the presence/absence of any headache attack (head pain > 0 on a numerical rating scale of 0-10) over the next 24 h period. After removing missing data (n = 431 days), participants in the study experienced a headache attack on 1613/4195 (38.5%) days. A generalized linear mixed-effects forecast model using either the frequency of stressful events or the perceived intensity of these events fit the data well. This simple forecasting model possessed promising predictive utility with an AUC of 0.73 (95% CI 0.71-0.75) in the training sample and an AUC of 0.65 (95% CI 0.6-0.67) in a leave-one-out validation sample. This forecasting model had a Brier score of 0.202 and possessed good calibration between forecasted probabilities and observed frequencies but had only low levels of resolution (ie, sharpness). This study demonstrates that future headache attacks can be forecasted for a diverse group of individuals over time. Future work will enhance prediction through improvements in the assessment of stress as well as the development of other candidate domains to use in the models. © 2017 American Headache Society.
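
    A minimal sketch of the forecasting setup, on synthetic diary data: tomorrow's headache is predicted from today's headache state and today's stress score, and the forecast is scored with an AUC on held-out days. A pooled logistic regression stands in for the paper's generalized linear mixed-effects model, and all variable names and coefficients are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_days = 4000
stress = rng.poisson(5, n_days)                       # daily count of stressful events
headache_today = rng.binomial(1, 0.35, n_days)        # current headache state (0/1)

# Synthetic outcome: tomorrow's headache depends on both predictors
logit = -1.5 + 0.9 * headache_today + 0.15 * stress
headache_tomorrow = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([headache_today, stress])
model = LogisticRegression().fit(X[:3000], headache_tomorrow[:3000])
probs = model.predict_proba(X[3000:])[:, 1]
print("hold-out AUC:", round(roc_auc_score(headache_tomorrow[3000:], probs), 3))
```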

  15. A three-dimensional multivariate representation of atmospheric variability

    NASA Astrophysics Data System (ADS)

    Žagar, Nedjeljka; Jelić, Damjan; Blaauw, Marten; Jesenko, Blaž

    2016-04-01

    The recently developed MODES software has been applied to the ECMWF analyses and forecasts and to several reanalysis datasets to describe the global variability of the balanced and inertio-gravity (IG) circulation across many scales, considering both the mass and wind fields and the whole model depth. In particular, the IG spectrum, which has only recently become observable in global datasets, can be studied simultaneously in the mass and wind fields over the whole model depth. MODES is open-access software that performs the normal-mode function decomposition of 3D global datasets. Its application to the ERA-Interim dataset reveals several aspects of the large-scale circulation after it has been partitioned into the linearly balanced and IG components. The global energy distribution is dominated by the balanced energy, while the IG modes contribute around 8% of the total wave energy. However, on subsynoptic scales IG energy dominates, and it is associated with the main features of tropical variability on all scales. The presented energy distribution and features of the zonally averaged and equatorial circulation provide a reference for the intercomparison of several reanalysis datasets and for the validation of climate models. Features of the global IG circulation are compared in the ERA-Interim, MERRA, and JRA reanalysis datasets and in several CMIP5 models. Since October 2014, the operational medium-range forecasts of the European Centre for Medium-Range Weather Forecasts (ECMWF) have been analyzed by MODES daily, and an online archive of all the outputs is available at http://meteo.fmf.uni-lj.si/MODES. New outputs are made available daily, based on the 00 UTC run and the subsequent forecasts at 12-hour intervals up to 240 hours. In addition to the energy spectra and the horizontal circulation on selected levels for the balanced and IG components, the equatorial Kelvin waves are presented in time and space as the most energetic tropical IG modes, propagating vertically and along the equator from their main generation regions in the upper troposphere over the Indian and Pacific region. The validation of the 10-day ECMWF forecasts against analyses in modal space suggests a lack of variability in the tropics in the medium range. References: Žagar, N., et al., 2015: Normal-mode function representation of global 3-D data sets: open-access software for the atmospheric research community. Geosci. Model Dev., 8, 1169-1195, doi:10.5194/gmd-8-1169-2015. Žagar, N., R. Buizza, and J. Tribbia, 2015: A three-dimensional multivariate modal analysis of atmospheric predictability with application to the ECMWF ensemble. J. Atmos. Sci., 72, 4423-4444. The MODES software is available from http://meteo.fmf.uni-lj.si/MODES.

  16. Time Series Model Identification by Estimating Information, Memory, and Quantiles.

    DTIC Science & Technology

    1983-07-01

    Standards, Sect. D, 68D, 937-951. Parzen, Emanuel (1969) "Multiple time series modeling" Multivariate Analysis - II, edited by P. Krishnaiah, Academic... Krishnaiah, North Holland: Amsterdam, 283-295. Parzen, Emanuel (1979) "Forecasting and Whitening Filter Estimation" TIMS Studies in the Management...principle. Applications of Statistics, P. R. Krishnaiah, ed. North Holland: Amsterdam, 27-41. Box, G. E. P. and Jenkins, G. M. (1970) Time Series Analysis

  17. An evaluation of Bayesian techniques for controlling model complexity and selecting inputs in a neural network for short-term load forecasting.

    PubMed

    Hippert, Henrique S; Taylor, James W

    2010-04-01

    Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation. Copyright 2009 Elsevier Ltd. All rights reserved.
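
    The sketch below illustrates the automatic relevance determination idea on synthetic load data: an ARD prior shrinks the weights of irrelevant inputs toward zero, leaving the informative ones. scikit-learn's linear ARDRegression is used purely for illustration; the paper embeds ARD within a Bayesian neural network, and the predictors, coefficients, and noise levels here are invented.

```python
import numpy as np
from sklearn.linear_model import ARDRegression

rng = np.random.default_rng(0)
n = 500
temperature = rng.normal(20, 8, n)
hour = rng.integers(0, 24, n)
noise_input = rng.normal(size=n)                      # deliberately irrelevant input
load = 100 + 2.5 * temperature + 4.0 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 3, n)

X = np.column_stack([temperature, np.sin(2 * np.pi * hour / 24), noise_input])
ard = ARDRegression().fit(X, load)
for name, coef in zip(["temperature", "diurnal term", "noise input"], ard.coef_):
    print(f"{name:>13s}: weight {coef:+.3f}")        # the irrelevant input gets ~0 weight
```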

  18. Improving Prediction Accuracy for WSN Data Reduction by Applying Multivariate Spatio-Temporal Correlation

    PubMed Central

    Carvalho, Carlos; Gomes, Danielo G.; Agoulmine, Nazim; de Souza, José Neuman

    2011-01-01

    This paper proposes a method based on multivariate spatial and temporal correlation to improve prediction accuracy in data reduction for Wireless Sensor Networks (WSN). Prediction of data not sent to the sink node is a technique used to save energy in WSNs by reducing the amount of data traffic. However, it may not be very accurate. Simulations were made involving simple linear regression and multiple linear regression functions to assess the performance of the proposed method. The results show a higher correlation among the gathered inputs than with time, an independent variable widely used for prediction and forecasting. Prediction accuracy is lower when simple linear regression is used, whereas multiple linear regression is the most accurate. In addition, our proposal outperforms some current solutions by about 50% in humidity prediction and 21% in light prediction. To the best of our knowledge, this is the first work to address prediction based on multivariate correlation for WSN data reduction. PMID:22346626
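
    A minimal sketch of the underlying comparison, using synthetic sensor readings: a regression on time alone versus a multivariate regression that also exploits the correlation between sensed quantities (here, humidity predicted from temperature). The data, sampling period, and variable names are illustrative assumptions, not the paper's testbed.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
t = np.arange(1000)
temperature = 25 + 3 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 0.3, t.size)
humidity = 80 - 1.8 * temperature + rng.normal(0, 0.5, t.size)   # correlated with temperature

train, test = slice(0, 800), slice(800, 1000)
simple = LinearRegression().fit(t[train, None], humidity[train])                 # time only
multi = LinearRegression().fit(np.column_stack([t, temperature])[train], humidity[train])

print("time-only MAE:   ", mean_absolute_error(humidity[test], simple.predict(t[test, None])))
print("multivariate MAE:", mean_absolute_error(humidity[test],
                                               multi.predict(np.column_stack([t, temperature])[test])))
```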

  19. Essays on oil price volatility and irreversible investment

    NASA Astrophysics Data System (ADS)

    Pastor, Daniel J.

    In chapter 1, we provide an extensive and systematic evaluation of the relative forecasting performance of several models for the volatility of daily spot crude oil prices. Empirical research over the past decades has uncovered significant gains in forecasting performance of Markov Switching GARCH models over GARCH models for the volatility of financial assets and crude oil futures. We find that, for spot oil price returns, non-switching models perform better in the short run, whereas switching models tend to do better at longer horizons. In chapter 2, I investigate the impact of volatility on firms' irreversible investment decisions using real options theory. Cost incurred in oil drilling is considered sunk cost, thus irreversible. I collect detailed data on onshore, development oil well drilling on the North Slope of Alaska from 2003 to 2014. Volatility is modeled by constructing GARCH, EGARCH, and GJR-GARCH forecasts based on monthly real oil prices, and realized volatility from 5-minute intraday returns of oil futures prices. Using a duration model, I show that oil price volatility generally has a negative relationship with the hazard rate of drilling an oil well both when aggregating all the fields, and in individual fields.
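
    For reference, a single-regime GARCH(1,1) forecast of the kind benchmarked in chapter 1 can be produced with the `arch` package as sketched below, using synthetic fat-tailed returns; the Markov-switching, EGARCH, and GJR-GARCH variants discussed in the text are not reproduced here, and the data are not oil prices.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_t(df=6, size=2000)            # fat-tailed stand-in for daily % returns

am = arch_model(returns, vol="GARCH", p=1, q=1, dist="t")
res = am.fit(disp="off")
fcast = res.forecast(horizon=10)                     # 1- to 10-step-ahead conditional variance
print(fcast.variance.iloc[-1])
```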

  20. An operational global ocean forecast system and its applications

    NASA Astrophysics Data System (ADS)

    Mehra, A.; Tolman, H. L.; Rivin, I.; Rajan, B.; Spindler, T.; Garraffo, Z. D.; Kim, H.

    2012-12-01

    A global Real-Time Ocean Forecast System (RTOFS) was implemented in operations at NCEP/NWS/NOAA on 10/25/2011. This system is based on an eddy-resolving 1/12 degree global HYCOM (HYbrid Coordinates Ocean Model) and is part of a larger national backbone capability for ocean modeling at NWS in strong partnership with the US Navy. The forecast system is run once a day and produces a 6-day forecast using the daily initialization fields produced at NAVOCEANO using NCODA (Navy Coupled Ocean Data Assimilation), a 3D multivariate data assimilation methodology. As configured within RTOFS, HYCOM has a horizontal equatorial resolution of 0.08 degrees, or ~9 km. The HYCOM grid is on a Mercator projection from 78.64 S to 47 N, and north of this it employs an Arctic dipole patch where the poles are shifted over land to avoid a singularity at the North Pole. This gives a mid-latitude (polar) horizontal resolution of approximately 7 km (3.5 km). The coastline is fixed at the 10 m isobath with an open Bering Strait. This version employs 32 hybrid vertical coordinate surfaces with potential density referenced to 2000 m. Vertical coordinates can be isopycnals (often best for resolving deep water masses), levels of equal pressure (fixed depths, best for the well-mixed, unstratified upper ocean), or sigma levels (terrain-following, often the best choice in shallow water). The dynamic ocean model is coupled to a thermodynamic energy-loan ice model and uses a non-slab mixed layer formulation. The forecast system is forced with 3-hourly momentum, radiation, and precipitation fluxes from the operational Global Forecast System (GFS) fields. Results include global sea surface height and three-dimensional fields of temperature, salinity, density, and velocity used for validation and evaluation against available observations. Several downstream applications of this forecast system will also be discussed, including search and rescue operations by the US Coast Guard, navigation safety information provided by OPC using real-time Global RTOFS surface current guidance, operational guidance on radionuclide dispersion near Fukushima using 3D tracers, and boundary conditions for various operational coastal ocean forecast systems (COFS) run by NOS, etc.

  1. Eco-morphological Real-time Forecasting tool to predict hydrodynamic, sediment and nutrient dynamic in Coastal Louisiana

    NASA Astrophysics Data System (ADS)

    Messina, F.; Meselhe, E. A.; Buckman, L.; Twight, D.

    2017-12-01

    The Louisiana coastal zone is one of the most productive and dynamic eco-geomorphic systems in the world. This unique natural environment has been altered by human activities and natural processes such as sea level rise, subsidence, dredging of canals for oil and gas production, and the Mississippi River levees, which block the natural delivery of river sediment. As a result of these alterations, land loss, erosion, and flood risk are becoming real issues for Louisiana. Coastal authorities have been studying the benefits and effects of several restoration projects, e.g., freshwater and sediment diversions. The protection of communities, wildlife, and the unique environment is a high priority in this region. The Water Institute of the Gulf, together with Deltares, has developed a forecasting and information system for a pilot location in coastal Louisiana, specifically the Barataria Bay and Breton Sound Basins in the Mississippi River Deltaic Plain. The system provides a 7-day forecast of water level, salinity, and temperature under forecasted atmospheric and coastal conditions, such as freshwater riverine inflow, rainfall, evaporation, wind, and tide. The system also forecasts nutrient and water quality variables (e.g., Chla and dissolved oxygen) and sediment transport. The Flood Early Warning System (FEWS) is used as a platform to import multivariate data from several sources, to monitor the pilot location, and to provide boundary conditions to the model. A hindcast model is applied to compare the model results to the observed data and to provide the initial condition for the forecast model. This system represents a unique tool that provides valuable information regarding the overall conditions of the basins. It offers the opportunity to adaptively manage existing and planned diversions to meet certain salinity and water level targets or thresholds while maximizing land-building goals. Moreover, water quality predictions provide valuable information on the current ecological conditions of the area. Real-time observations and model predictions can be used as guidance to decision makers regarding the operation of control structures in response to forecasted weather or river flood events. Coastal communities can benefit from the water level, salinity, and water quality forecasts to manage their activities.

  2. Causality networks from multivariate time series and application to epilepsy.

    PubMed

    Siggiridou, Elsa; Koutlis, Christos; Tsimpiris, Alkiviadis; Kimiskidis, Vasilios K; Kugiumtzis, Dimitris

    2015-08-01

    Granger causality and variants of this concept allow the study of complex dynamical systems as networks constructed from multivariate time series. In this work, a large number of Granger causality measures used to form causality networks from multivariate time series are assessed. For this, realizations of high-dimensional coupled dynamical systems are considered and the performance of the Granger causality measures is evaluated, seeking the measures that form networks closest to the true network of the dynamical system. In particular, the comparison focuses on Granger causality measures that reduce the state space dimension when many variables are observed. Further, the linear and nonlinear dimension-reduction Granger causality measures are compared to a standard Granger causality measure on electroencephalographic (EEG) recordings containing episodes of epileptiform discharges.
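
    The basic building block is a pairwise Granger causality test; a causality network is then obtained by repeating such tests, or their dimension-reduced variants, over all ordered pairs of observed variables. The sketch below runs a standard bivariate test with statsmodels on synthetic data in which x drives y with a one-step lag; the coupling coefficients and lag order are arbitrary choices for illustration.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 500
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()

# Tests whether the second column (x) Granger-causes the first column (y)
results = grangercausalitytests(np.column_stack([y, x]), maxlag=2)
p_value = results[1][0]["ssr_ftest"][1]              # p-value of the lag-1 F-test
print("lag-1 F-test p-value:", p_value)
```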

  3. Prediction of mortality rates using a model with stochastic parameters

    NASA Astrophysics Data System (ADS)

    Tan, Chon Sern; Pooi, Ah Hin

    2016-10-01

    Prediction of future mortality rates is crucial to insurance companies because they face longevity risks while providing retirement benefits to a population whose life expectancy is increasing. In the past literature, a time series model based on the multivariate power-normal distribution has been applied to mortality data from the United States for the years 1933 to 2000 to forecast mortality rates for the years 2001 to 2010. In this paper, a more dynamic approach based on multivariate time series is proposed in which the model parameters are stochastic and vary with time. The resulting prediction intervals obtained using the model with stochastic parameters perform better: they cover the observed future mortality rates well while tending to have distinctly shorter interval lengths.

  4. Influence factors and forecast of carbon emission in China: structure adjustment for emission peak

    NASA Astrophysics Data System (ADS)

    Wang, B.; Cui, C. Q.; Li, Z. P.

    2018-02-01

    This paper introduces principal component analysis (PCA) and a multivariate linear regression model to verify long-term equilibrium relationships between carbon emissions and their impact factors. The integrated model, combining improved PCA with multivariate regression analysis, is able to identify the pattern of carbon emission sources. The main empirical results indicate that, among all selected variables, the scale of energy consumption plays the largest role. GDP and population follow, and also have significant impacts on carbon emissions. The industrialization rate and the fossil fuel proportion, indicators reflecting the economic structure and energy structure, are more important than the urbanization rate and the consumption level of urban residents. On this basis, some suggestions are put forward for the government to achieve a peak in carbon emissions.
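
    A minimal sketch of the PCA-plus-regression workflow on synthetic annual data: the standardized drivers are compressed to principal components, emissions are regressed on the components, and the fitted coefficients are mapped back to the original variables to rank their influence. The driver names, sample size, and coefficients are illustrative assumptions, not the paper's data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 40                                                # e.g., 40 years of annual data
energy = rng.normal(size=n).cumsum()
gdp = 0.8 * energy + rng.normal(0, 0.5, n)
population = 0.3 * energy + rng.normal(0, 0.8, n)
emissions = 1.2 * energy + 0.5 * gdp + 0.2 * population + rng.normal(0, 0.3, n)

X = StandardScaler().fit_transform(np.column_stack([energy, gdp, population]))
pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)
reg = LinearRegression().fit(scores, emissions)

# Map the component coefficients back to weights on the standardized drivers
driver_weights = pca.components_.T @ reg.coef_
for name, w in zip(["energy consumption", "GDP", "population"], driver_weights):
    print(f"{name:>18s}: {w:+.2f}")
```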

  5. Evaluation of a regional assimilation system coupled with the WRF-chem model

    NASA Astrophysics Data System (ADS)

    Liu, Yan-an; Gao, Wei; Huang, Hung-lung; Strabala, Kathleen; Liu, Chaoshun; Shi, Runhe

    2013-09-01

    Air quality has become a social issue of great concern across the globe, particularly in developing countries. Even though the Weather Research and Forecasting with Chemistry (WRF-Chem) model has been applied in many regions, the resolution of the input meteorological analysis fields still limits forecast accuracy. This article describes the application of the CIMSS Regional Assimilation System (CRAS) in East China and its capability to assimilate direct broadcast (DB) satellite data to obtain more detailed meteorological information, including cloud top pressure (CTP) and total precipitable water (TPW) from MODIS. The performance evaluation of CRAS is based on qualitative and quantitative analyses. Verification against ERA-Interim, radiosonde, and Tropical Rainfall Measuring Mission (TRMM) precipitation measurements using bias and root mean square error (RMSE) shows that CRAS has a systematic error due to topography and other factors; however, forecast accuracy for all elements in the model's central area is higher at various levels. The bias computed against radiosondes reveals that CRAS temperature and geopotential height are better than ERA-Interim at first guess. Moreover, the locations of the 24 h accumulated precipitation forecasts are highly consistent with the TRMM retrieved precipitation, indicating good CRAS performance. In summary, the newly built Vtable enables the meteorological fields from the CRAS output to be ingested into WRF, thereby coupling CRAS with WRF-Chem. Therefore, this study not only assesses the forecast accuracy of CRAS, but also adds the capability of running the WRF-Chem model at higher resolutions in the future.
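
    The verification statistics used above are straightforward to compute; the sketch below evaluates the bias and RMSE of a forecast field against a reference analysis, with synthetic arrays standing in for the CRAS output and the verifying data.

```python
import numpy as np

rng = np.random.default_rng(0)
reference = 280 + rng.normal(0, 5, size=(50, 60))      # e.g., temperature analysis [K]
forecast = reference + 0.8 + rng.normal(0, 1.5, size=reference.shape)

bias = np.mean(forecast - reference)
rmse = np.sqrt(np.mean((forecast - reference) ** 2))
print(f"bias = {bias:.2f} K, RMSE = {rmse:.2f} K")
```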

  6. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated for reliability and skill by retrospective testing, and the models should be under continuous prospective testing against long-term forecasts and alternative time-dependent models. (d) Short-term models used in operational forecasting should be consistent with the long-term forecasts used in probabilistic seismic hazard analysis. (e) Alert procedures should be standardized to facilitate decisions at different levels of government, based in part on objective analysis of costs and benefits. (f) In establishing alert protocols, consideration should also be given to the less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Authoritative statements of increased risk, even when the absolute probability is low, can provide a psychological benefit to the public by filling information vacuums that lead to informal predictions and misinformation. Formal OEF procedures based on probabilistic forecasting appropriately separate hazard estimation by scientists from the decision-making role of civil protection authorities. The prosecution of seven Italian scientists on manslaughter charges stemming from their actions before the L'Aquila earthquake makes clear why this separation should be explicit in defining OEF protocols.

  7. Climate Forecasts and Water Resource Management: Applications for a Developing Country

    NASA Astrophysics Data System (ADS)

    Brown, C.; Rogers, P.

    2002-05-01

    While the quantity of water on the planet earth is relatively constant, the demand for water is continuously increasing. Population growth leads to linear increases in water demand, and economic growth leads to further demand growth. Strzepek et al. calculate that with a United Nations mean population estimate of 8.5 billion people by 2025 and globally balanced economic growth, water use could increase by 70% over that time (Strzepek et al., 1995). For developing nations especially, supplying water for this growing demand requires the construction of new water supply infrastructure. The prospect of designing and constructing long life-span infrastructure is clouded by the uncertainty of future climate. The availability of future water resources is highly dependent on future climate. With realization of the nonstationarity of climate, responsible design emphasizes resiliency and robustness of water resource systems (IPCC, 1995; Gleick et al., 1999). Resilient systems feature multiple sources and complex transport and distribution systems, and so come at a high economic and environmental price. A less capital-intense alternative to creating resilient and robust water resource systems is the use of seasonal climate forecasts. Such forecasts provide adequate lead time and accuracy to allow water managers and water-based sectors such as agriculture or hydropower to optimize decisions for the expected water supply. This study will assess the use of seasonal climate forecasts from regional climate models as a method to improve water resource management in systems with limited water supply infrastructure

  8. Increasing horizontal resolution in numerical weather prediction and climate simulations: illusion or panacea?

    PubMed

    Wedi, Nils P

    2014-06-28

    The steady path of doubling the global horizontal resolution approximately every 8 years in numerical weather prediction (NWP) at the European Centre for Medium Range Weather Forecasts may be substantially altered with emerging novel computing architectures. It coincides with the need to appropriately address and determine forecast uncertainty with increasing resolution, in particular, when convective-scale motions start to be resolved. Blunt increases in the model resolution will quickly become unaffordable and may not lead to improved NWP forecasts. Consequently, there is a need to accordingly adjust proven numerical techniques. An informed decision on the modelling strategy for harnessing exascale, massively parallel computing power thus also requires a deeper understanding of the sensitivity to uncertainty--for each part of the model--and ultimately a deeper understanding of multi-scale interactions in the atmosphere and their numerical realization in ultra-high-resolution NWP and climate simulations. This paper explores opportunities for substantial increases in the forecast efficiency by judicious adjustment of the formal accuracy or relative resolution in the spectral and physical space. One path is to reduce the formal accuracy by which the spectral transforms are computed. The other pathway explores the importance of the ratio used for the horizontal resolution in gridpoint space versus wavenumbers in spectral space. This is relevant for both high-resolution simulations as well as ensemble-based uncertainty estimation. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  9. Rare events modeling with support vector machine: Application to forecasting large-amplitude geomagnetic substorms and extreme events in financial markets.

    NASA Astrophysics Data System (ADS)

    Gavrishchaka, V. V.; Ganguli, S. B.

    2001-12-01

    Reliable forecasting of rare events in a complex dynamical system is a challenging problem that is important for many practical applications. Due to the nature of rare events, the data set available for constructing a statistical and/or machine learning model is often very limited and incomplete. Therefore, many widely used approaches, including such robust algorithms as neural networks, can easily become inadequate for rare event prediction. Moreover, in many practical cases models with high-dimensional inputs are required. This limits the applicability of existing rare event modeling techniques (e.g., extreme value theory) that focus on univariate cases and are not easily extended to multivariate ones. The support vector machine (SVM) is a machine learning system that can provide optimal generalization from very limited and incomplete training data sets and can efficiently handle high-dimensional data. These features may allow SVM to be used to model rare events in some applications. We have applied an SVM-based system to the problem of large-amplitude substorm prediction and extreme event forecasting in stock and currency exchange markets. Encouraging preliminary results will be presented and other possible applications of the system will be discussed.
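
    A minimal sketch of the approach on synthetic data with a roughly 2% event rate: an RBF-kernel SVM with balanced class weights is trained on high-dimensional inputs and scored by AUC on held-out samples. The event definition, dimensionality, and class-weighting choice are illustrative assumptions rather than the authors' configuration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n, d = 3000, 20                                       # high-dimensional inputs
X = rng.normal(size=(n, d))
rare_event = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, n) > 2.6).astype(int)
print("event rate:", rare_event.mean())               # roughly 2% positives

X_tr, X_te, y_tr, y_te = train_test_split(X, rare_event, test_size=0.3, random_state=0)
svm = SVC(kernel="rbf", class_weight="balanced").fit(X_tr, y_tr)
scores = svm.decision_function(X_te)
print("hold-out AUC:", round(roc_auc_score(y_te, scores), 3))
```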

  10. Body Fat Percentage Prediction Using Intelligent Hybrid Approaches

    PubMed Central

    Shao, Yuehjen E.

    2014-01-01

    Excess body fat often leads to obesity. Obesity is typically associated with serious medical conditions, such as cancer, heart disease, and diabetes. Accordingly, knowing one's body fat is extremely important since it affects everyone's health. Although there are several ways to measure body fat percentage (BFP), the accurate methods are often associated with hassle and/or high costs. Traditional single-stage approaches may use certain body measurements or explanatory variables to predict the BFP. Diverging from existing approaches, this study proposes new intelligent hybrid approaches that require fewer explanatory variables, and the proposed forecasting models are able to effectively predict the BFP. The proposed hybrid models consist of multiple regression (MR), artificial neural network (ANN), multivariate adaptive regression splines (MARS), and support vector regression (SVR) techniques. The first stage of the modeling uses MR and MARS to obtain a smaller but more important set of explanatory variables. In the second stage, the retained important variables serve as inputs for the other forecasting methods. A real dataset was used to demonstrate the development of the proposed hybrid models. The prediction results revealed that the proposed hybrid schemes outperformed the typical, single-stage forecasting models. PMID:24723804
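
    A two-stage sketch of the hybrid idea on synthetic body-measurement data: stage one screens the explanatory variables (a lasso is used here as a stand-in for the MR/MARS screening in the paper), and stage two fits an SVR on the retained variables only. The variable count, coefficients, and SVR settings are illustrative.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
measures = rng.normal(size=(n, 8))                    # e.g., abdomen, weight, height, ...
bfp = 20 + 4 * measures[:, 0] + 2 * measures[:, 1] + rng.normal(0, 1.5, n)

X = StandardScaler().fit_transform(measures)
screen = LassoCV(cv=5).fit(X, bfp)
keep = np.flatnonzero(np.abs(screen.coef_) > 1e-3)    # stage one: retained variables
print("retained variable indices:", keep)

svr = SVR(kernel="rbf", C=10.0)                       # stage two: SVR on retained variables
score = cross_val_score(svr, X[:, keep], bfp, cv=5, scoring="neg_mean_absolute_error")
print("stage-two SVR MAE:", round(-score.mean(), 2))
```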

  11. Regional Earthquake Likelihood Models: A realm on shaky grounds?

    NASA Astrophysics Data System (ADS)

    Kossobokov, V.

    2005-12-01

    Seismology is juvenile, and its appropriate statistical tools to date may have a "medieval flavor" for those who hurry to apply the fuzzy language of a highly developed probability theory. To become "quantitatively probabilistic," earthquake forecasts/predictions must be defined with scientific accuracy. Following the most popular objectivists' viewpoint on probability, we cannot claim "probabilities" adequate without a long series of "yes/no" forecast/prediction outcomes. Without the "antiquated binary language" of "yes/no" certainty we cannot judge an outcome ("success/failure") and, therefore, objectively quantify a forecast/prediction method's performance. Likelihood scoring is one of the delicate tools of statistics, which could be worthless or even misleading when inappropriate probability models are used. This is a basic loophole for the misuse of likelihood as well as other statistical methods in practice. The flaw could be avoided by an accurate verification of generic probability models on the empirical data. It is not an easy task within the frame of the Regional Earthquake Likelihood Models (RELM) methodology, which neither defines the forecast precision nor allows a means to judge the ultimate success or failure in specific cases. Hopefully, the RELM group realizes the problem and its members do their best to close the hole with an adequate, data-supported choice. Regretfully, this is not the case with the erroneous choice of Gerstenberger et al., who started the public web site with forecasts of expected ground shaking for `tomorrow' (Nature 435, 19 May 2005). Gerstenberger et al. have inverted the critical evidence of their study, i.e., the 15 years of recent seismic record accumulated in just one figure, which suggests rejecting with confidence above 97% "the generic California clustering model" used in automatic calculations. As a result, since the date of publication in Nature, the United States Geological Survey website delivers to the public, emergency planners, and the media a forecast product that is based on wrong assumptions violating the best-documented earthquake statistics in California, whose accuracy was not investigated, and whose forecasts were not tested in a rigorous way.

  12. Forecasting fluid milk and cheese demands for the next decade.

    PubMed

    Schmit, T M; Kaiser, H M

    2006-12-01

    Predictions of future market demands and farm prices for dairy products are important determinants in developing marketing strategies and farm-production planning decisions. The objective of this report was to use current aggregate forecast data, combined with existing econometric models of demand and supply, to forecast retail demands for fluid milk and cheese and the supply and price of farm milk over the next decade. In doing so, we can investigate whether projections of population and consumer food-spending patterns will extend or alter current consumption trends and examine the implications of future generic advertising strategies for dairy products. To conduct the forecast simulations and appropriately allocate the farm milk supply to various uses, we used a partial equilibrium model of the US domestic dairy sector that segmented the industry into retail, wholesale, and farm markets. Model simulation results indicated that declines in retail per capita demand would persist but at a reduced rate from years past and that retail per capita demand for cheese would continue to grow and strengthen over the next decade. These predictions rely on expected changes in the size of populations of various ages, races, and ethnicities and on existing patterns of spending on food at home and away from home. The combined effect of these forecasted changes in demand levels was reflected in annualized growth in the total farm-milk supply that was similar to growth realized during the past few years. Although we expect nominal farm milk prices to increase over the next decade, we expect real prices (relative to assumed growth in feed costs) to remain relatively stable and show no increase until the end of the forecast period. Supplemental industry model simulations also suggested that net losses in producer revenues would result if only nominal levels of generic advertising spending were maintained in forthcoming years. In fact, if real generic advertising expenditures are increased relative to 2005 levels, returns to the investment in generic advertising can be improved. Specifically, each additional real dollar invested in generic advertising for fluid milk and cheese products over the forecast period would result in an additional 5.61 dollars in producer revenues.

  13. The skill of ECMWF long range Forecasting System to drive impact models for health and hydrology in Africa

    NASA Astrophysics Data System (ADS)

    Di Giuseppe, F.; Tompkins, A. M.; Lowe, R.; Dutra, E.; Wetterhall, F.

    2012-04-01

    As the quality of numerical weather prediction at monthly to seasonal lead times steadily improves, there is an increasing motivation to apply these forecasts fruitfully to the impact sectors of health, water, energy, and agriculture. Despite these improvements, the accuracy of fields such as temperature and precipitation that are required to drive sectoral models can still be poor. This is true globally, but particularly so in Africa, the region of focus in the present study. In the last year, ECMWF has been particularly active, through EU-funded research projects, in demonstrating the capability of its longer-range forecasting system to drive impact modeling systems in this region. A first assessment of the consequences of the documented errors in the ECMWF forecasting system is therefore presented here, looking at two application fields that we found particularly critical for Africa: vector-borne disease prevention and hydrological monitoring. A new community malaria model (VECTRI) has been developed at ICTP and tested for the three target regions participating in the QWECI project. The impact on the mean malaria climate is assessed by comparing the newly released seasonal forecasting system (Sys4) with the retired System 3 (Sys3), which shared its model cycle with the up-to-date ECMWF re-analysis product (ERA-Interim). The predictive skill of Sys4 for malaria monitoring and forecasting is also evaluated by aggregating the fields to country level. As part of the DEWFORA project, ECMWF is also developing a system for drought monitoring and forecasting over Africa whose main meteorological input is precipitation. Similarly to what is done for the VECTRI model, the skill of seasonal forecasts of precipitation is, in this application, translated into the capability of predicting drought, while ERA-Interim is used for monitoring. At the monitoring level, the near real-time updates of ERA-Interim could compensate for the lack of observations in the regions. However, ERA-Interim suffers from biases and drifts that limit its application for drought monitoring purposes in some regions.

  14. Self-organizing linear output map (SOLO): An artificial neural network suitable for hydrologic modeling and analysis

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Lin; Gupta, Hoshin V.; Gao, Xiaogang; Sorooshian, Soroosh; Imam, Bisher

    2002-12-01

    Artificial neural networks (ANNs) can be useful in the prediction of hydrologic variables, such as streamflow, particularly when the underlying processes have complex nonlinear interrelationships. However, conventional ANN structures suffer from network training issues that significantly limit their widespread application. This paper presents a multivariate ANN procedure entitled self-organizing linear output map (SOLO), whose structure has been designed for rapid, precise, and inexpensive estimation of network structure/parameters and system outputs. More important, SOLO provides features that facilitate insight into the underlying processes, thereby extending its usefulness beyond forecast applications as a tool for scientific investigations. These characteristics are demonstrated using a classic rainfall-runoff forecasting problem. Various aspects of model performance are evaluated in comparison with other commonly used modeling approaches, including multilayer feedforward ANNs, linear time series modeling, and conceptual rainfall-runoff modeling.

  15. Impact Of Three-Phase Relative Permeability and Hysteresis Models On Forecasts of Storage Associated with CO2-EOR

    NASA Astrophysics Data System (ADS)

    Jia, W.; Pan, F.; McPherson, B. J. O. L.

    2015-12-01

    Due to the presence of multiple phases in a given system, CO2 sequestration with enhanced oil recovery (CO2-EOR) includes complex multiphase flow processes compared to CO2 sequestration in deep saline aquifers (no hydrocarbons). Two of the most important factors are three-phase relative permeability and hysteresis effects, both of which are difficult to measure and are usually represented by numerical interpolation models. The purposes of this study included quantification of impacts of different three-phase relative permeability models and hysteresis models on CO2 sequestration simulation results, and associated quantitative estimation of uncertainty. Four three-phase relative permeability models and three hysteresis models were applied to a model of an active CO2-EOR site, the SACROC unit located in western Texas. To eliminate possible bias of deterministic parameters on the evaluation, a sequential Gaussian simulation technique was utilized to generate 50 realizations to describe heterogeneity of porosity and permeability, initially obtained from well logs and seismic survey data. Simulation results of forecasted pressure distributions and CO2 storage suggest that (1) the choice of three-phase relative permeability model and hysteresis model have noticeable impacts on CO2 sequestration simulation results; (2) influences of both factors are observed in all 50 realizations; and (3) the specific choice of hysteresis model appears to be somewhat more important relative to the choice of three-phase relative permeability model in terms of model uncertainty.

  16. Decadal Prediction Skill in the GEOS-5 Forecast System

    NASA Technical Reports Server (NTRS)

    Ham, Yoo-Geun; Rienecker, Michael M.; Suarez, M.; Vikhliaev, Yury V.; Zhao, Bin; Marshak, Jelena; Vernieres, Guillaume; Schubert, Siegfried D.

    2012-01-01

    A suite of decadal predictions has been conducted with the NASA Global Modeling and Assimilation Office's GEOS-5 Atmosphere-Ocean General Circulation Model (AOGCM). The hindcasts are initialized every December from 1959 to 2010 following the CMIP5 experimental protocol for decadal predictions. The initial conditions are from a multivariate ensemble optimal interpolation ocean and sea-ice reanalysis, and from the atmospheric reanalysis (MERRA, the Modern-Era Retrospective Analysis for Research and Applications) generated using the GEOS-5 atmospheric model. The forecast skill of a three-member ensemble mean is compared to that of an experiment without initialization but forced with observed CO2. The results show that initialization acts to increase the forecast skill of Northern Atlantic SST compared to the uninitialized runs, with the increase in skill maintained for almost a decade over the subtropical and mid-latitude Atlantic. The annual-mean Atlantic Meridional Overturning Circulation (AMOC) index is predictable up to a 5-year lead time, consistent with the predictable signal in upper ocean heat content over the Northern Atlantic. While the skill measured by the Mean Squared Skill Score (MSSS) shows a 50% improvement up to a 10-year forecast lead over the subtropical and mid-latitude Atlantic, prediction skill is relatively low in the subpolar gyre, due in part to the fact that the spatial pattern of the dominant simulated decadal mode in upper ocean heat content over this region appears to be unrealistic. An analysis of the large-scale temperature budget shows that this is the result of a model bias, implying that realistic simulation of the climatological fields is crucial for skillful decadal forecasts.

  17. Operational Monitoring and Forecasting in Regional Seas: the Aegean Sea example

    NASA Astrophysics Data System (ADS)

    Nittis, K.; Perivoliotis, L.; Zervakis, V.; Papadopoulos, A.; Tziavos, C.

    2003-04-01

    The increasing economic activities in the coastal zone and the associated pressure on the marine environment have raised interest in monitoring systems able to provide supporting information for its effective management and protection. Such an integrated monitoring, forecasting, and information system has been developed over the past years in the Aegean Sea. Its main component is the POSEIDON network, which provides real-time data on meteorological and surface oceanographic parameters (waves, currents, hydrological and biochemical data) from 11 fixed oceanographic buoys. The numerical forecasting system is composed of an ETA atmospheric model, a WAM wave model, and a POM hydrodynamic model that provide 72-hour forecasts every day. The system has been operational since May 2000; its products are published on the Internet, while a subset is also available through cellular telephony. New types of observing platforms will become available in the near future through a number of EU-funded research projects. The Mediterranean Moored Multi-sensor Array (M3A), which was developed for the needs of the Mediterranean Forecasting System and was tested during 2000-2001, will be operational in 2004 during the MFSTEP project. The M3A system incorporates sensors for optical and chemical measurements (oxygen, turbidity, chlorophyll-a, nutrients, and PAR) in the euphotic zone (0-100 m), together with sensors for physical parameters (temperature, salinity, current speed and direction) in the 0-500 m layer. A Ferry-Box system will also operate during 2004 in the southern Aegean Sea, providing surface data on physical and biochemical properties. The ongoing modeling efforts include coupling with larger-scale circulation models of the Mediterranean, high-resolution downscaling to coastal areas of the Aegean Sea, and the development of multivariate data assimilation methods.

  18. Performance and Quality Assessment of the Forthcoming Copernicus Marine Service Global Ocean Monitoring and Forecasting Real-Time System

    NASA Astrophysics Data System (ADS)

    Lellouche, J. M.; Le Galloudec, O.; Greiner, E.; Garric, G.; Regnier, C.; Drillet, Y.

    2016-02-01

    Mercator Ocean currently delivers daily real-time services (weekly analyses and daily forecasts) with a global 1/12° high-resolution system. The model component is the NEMO platform, driven at the surface by the IFS ECMWF atmospheric analyses and forecasts. Observations are assimilated by means of a reduced-order Kalman filter with a 3D multivariate modal decomposition of the forecast error. It includes an adaptive error estimate and a localization algorithm. Along-track altimeter data, satellite sea surface temperature, and in situ temperature and salinity vertical profiles are jointly assimilated to estimate the initial conditions for numerical ocean forecasting. A 3D-Var scheme provides a correction for the slowly evolving large-scale biases in temperature and salinity. Since May 2015, Mercator Ocean has opened the Copernicus Marine Service (CMS) and is in charge of the global ocean analyses and forecasts at eddy-resolving resolution. In this context, R&D activities have been conducted at Mercator Ocean in recent years in order to improve the real-time 1/12° global system for the next CMS version in 2016. The ocean/sea-ice model and the assimilation scheme benefit, among others, from the following improvements: large-scale and objective correction of atmospheric quantities with satellite data, a new Mean Dynamic Topography taking into account the latest version of the GOCE geoid, new adaptive tuning of some observational errors, a new quality control of the assimilated temperature and salinity vertical profiles based on dynamic height criteria, assimilation of satellite sea-ice concentration, new freshwater runoff from ice sheet melting … This presentation does not focus on the impact of each update, but rather on the overall behavior of the system integrating all updates. This assessment reports on the product quality improvements, highlighting the level of performance and the reliability of the new system.

  19. New Approach To Hour-By-Hour Weather Forecast

    NASA Astrophysics Data System (ADS)

    Liao, Q. Q.; Wang, B.

    2017-12-01

    Fine-grained hourly forecasts for single stations are required in many production and everyday-life applications. Most previous MOS (Model Output Statistics) approaches, which rely on linear regression models, struggle with the nonlinear nature of weather prediction, and forecast accuracy has not been sufficient at high temporal resolution. This study aims to predict future meteorological elements, including temperature, precipitation, relative humidity, and wind speed, in a local region over a relatively short period at the hourly level. Using hour-by-hour NWP (Numerical Weather Prediction) meteorological fields from Forcastio (https://darksky.net/dev/docs/forecast) and real-time instrumental observations from 29 stations in Yunnan and 3 stations in Tianjin, China, from June to October 2016, hour-by-hour predictions are made up to 24 hours ahead. This study presents an ensemble approach that combines the information in the instrumental observations themselves with the NWP output. An autoregressive moving-average (ARMA) model is used to predict future values of the observation time series. The newest NWP products are fed into equations derived from the multiple linear regression MOS technique. The residual series of the MOS outputs is handled with an autoregressive (AR) model to capture the linear structure in the time series. Because of the nonlinear nature of atmospheric flow, a support vector machine (SVM) is also introduced. Basic data quality control and cross-validation make it possible to optimize the model parameters and to perform residual correction up to 24 hours ahead with the AR/SVM models. Results show that the AR model technique is better than the corresponding multivariate MOS regression method, especially in the first 4 hours, when the predictor is temperature. The combined MOS-AR model, which is comparable to the MOS-SVM model, outperforms MOS. Their root mean square error and correlation coefficient for 2 m temperature reach 1.6 degrees Celsius and 0.91, respectively. The fraction of 24-hour forecasts with a deviation of no more than 2 degrees Celsius is 78.75% for the MOS-AR model and 81.23% for the AR model.
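
    A minimal sketch of the MOS-AR combination on synthetic hourly data: a linear MOS regression maps the NWP temperature to the station observation, an AR(1) model is fitted to the recent MOS residuals, and the next hour's forecast is the MOS value plus the predicted residual. All data, coefficients, and the AR order are illustrative assumptions; the SVM variant is not shown.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
hours = 500
nwp_temp = 20 + 5 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 0.5, hours)
# Synthetic station observations with a bias, a scale error, and autocorrelated noise
obs_temp = 1.1 * nwp_temp - 1.5 + np.convolve(rng.normal(0, 0.8, hours), [0.6, 0.4], "same")

mos = LinearRegression().fit(nwp_temp[:-1, None], obs_temp[:-1])   # MOS regression
residuals = obs_temp[:-1] - mos.predict(nwp_temp[:-1, None])

ar = AutoReg(residuals, lags=1).fit()                              # AR(1) on MOS residuals
resid_forecast = ar.predict(start=len(residuals), end=len(residuals))[0]

next_hour = mos.predict(nwp_temp[-1:, None])[0] + resid_forecast
print("MOS-AR forecast:", round(next_hour, 2), "observed:", round(obs_temp[-1], 2))
```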

  20. Climate, Water, and Human Health: Large Scale Hydroclimatic Controls in Forecasting Cholera Epidemics

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2009-12-01

    Although cholera has ravaged the continents through seven global pandemics in past centuries, the seasonal and interannual variability of its outbreaks remains a mystery. Previous studies have focused on the role of various environmental and climatic factors, but provided little or no predictive capability. Recent findings suggest a more prominent role of large-scale hydroclimatic extremes - droughts and floods - and attempt to explain the seasonality and the unique dual cholera peaks in the Bengal Delta region of South Asia. We investigate the seasonal and interannual nature of cholera epidemiology in three geographically distinct locations within the region to identify the larger-scale hydroclimatic controls that can set the ecological and environmental ‘stage’ for outbreaks and have significant memory on a seasonal scale. Here we show that two distinctly different, pre- and post-monsoon, cholera transmission mechanisms related to large-scale climatic controls prevail in the region. An implication of our findings is that extreme climatic events such as prolonged droughts, record floods, and major cyclones may cause major disruption in the ecosystem and trigger large epidemics. We postulate that a quantitative understanding of the large-scale hydroclimatic controls and dominant processes with significant system memory will form the basis for forecasting such epidemic outbreaks. A multivariate regression method using these predictor variables to develop probabilistic forecasts of cholera outbreaks will be explored. Forecasts from such a system with a seasonal lead time are likely to have a measurable impact on early cholera detection and prevention efforts in endemic regions.

  1. Toward long-lead operational forecasts of drought: An experimental study in the Murray-Darling River Basin

    NASA Astrophysics Data System (ADS)

    Barros, Ana P.; Bowden, Gavin J.

    2008-08-01

    Resiliency and effectiveness in water resources management of drought strongly depend on advance knowledge of drought onset, duration, and severity. The motivation of this work is to extend the lead time of operational drought forecasts. The research strategy is to explore the predictability of drought severity from space-time varying indices of large-scale climate phenomena relevant to regional hydrometeorology (e.g., ENSO) by integrating linear and nonlinear statistical data models, specifically self-organizing maps (SOM) and multivariate linear regression analysis. The methodology is demonstrated through the step-by-step development of a model to forecast monthly spatial patterns of the standard precipitation index (SPI) within the Murray-Darling Basin (MDB) in Australia up to 12 months in advance. First, the rationale for the physical hypothesis and the exploratory data analysis, including principal components, wavelet, and partial mutual information analysis to identify and select predictor variables, are presented. The focus is on spatial datasets of precipitation, sea surface temperature anomaly (SSTA) patterns over the Indian and Pacific Oceans, temporal and spatial gradients of outgoing longwave radiation (OLR) in the Pacific Ocean, and the far western Pacific wind-stress anomaly. Second, the process of model construction, calibration, and evaluation is described. The experimental forecasts show that there is ample opportunity to increase the lead time of drought forecasts for decision support using parsimonious data models that capture the governing climate processes at regional scale. OLR gradients proved to be dispensable predictors, whereas SPI-based predictors appear to control predictability when the SSTA in the region [87.5°N-87.5°S; 27.5°E-67.5°W] and eastward wind-stress anomalies in the region [4°N-4°S; 130°E-160°E] are small (respectively, within ±1° and ±0.01 dyne/cm2), that is, when ENSO activity is weak. The areally averaged 12-month lead-time forecasts of SPI in the MDB explain up to 60% of the variance in the observations (r > 0.7). Based on a threshold SPI of -0.5 for severe drought at the regional scale and for a nominal 12-month lead time, the forecast of the timing of onset is within 0-2 months of the actual threshold being met by the observations, thus effectively a 10-month lead-time forecast at a minimum. Spatial analysis suggests that forecast errors can be attributed in part to a mismatch between the spatial heterogeneity of rainfall and the raingauge density of the observational network. Forecast uncertainty, on the other hand, appears associated with the number of redundant predictors used in the forecast model.

  2. R&D100: CO2 Memzyme

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rempe, Susan; Brinker, Jeff; Jiang, Ying-Bing

    2015-11-19

    By combining a water droplet loaded with CO2 enzymes in an ultrathin nanopore on a flexible substrate, researchers at Sandia National Laboratories realized the first technology that meets and exceeds DOE targets for cost-effective CO2 capture. When compared with the nearest membrane competitor, this technology delivers a three times higher permeation rate, twenty times higher selectivity, and ten times lower fabrication cost. The CO2 Memzyme has the potential to remove 90% of CO2 emissions and is forecast to save the U.S. coal industry $90 billion a year compared to conventional technology.

  3. R&D100: CO2 Memzyme

    ScienceCinema

    Rempe, Susan; Brinker, Jeff; Jiang, Ying-Bing; Vanegas, Juan

    2018-06-25

    By combining a water droplet loaded with CO2 enzymes in an ultrathin nanopore on a flexible substrate, researchers at Sandia National Laboratories realized the first technology that meets and exceeds DOE targets for cost-effective CO2 capture. When compared with the nearest membrane competitor, this technology delivers a three times higher permeation rate, twenty times higher selectivity, and ten times lower fabrication cost. The CO2 Memzyme has the potential to remove 90% of CO2 emissions and is forecast to save the U.S. coal industry $90 billion a year compared to conventional technology.

  4. Wave Extremes in the Northeast Atlantic from Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Breivik, Øyvind; Aarnes, Ole Johan; Bidlot, Jean-Raymond; Carrasco, Ana; Saetra, Øyvind

    2013-10-01

    A method for estimating return values from ensembles of forecasts at advanced lead times is presented. Return values of significant wave height in the North-East Atlantic, the Norwegian Sea and the North Sea are computed from archived +240-h forecasts of the ECMWF ensemble prediction system (EPS) from 1999 to 2009. We make three assumptions: First, each forecast is representative of a six-hour interval and collectively the data set is then comparable to a time period of 226 years. Second, the model climate matches the observed distribution, which we confirm by comparing with buoy data. Third, the ensemble members are sufficiently uncorrelated to be considered independent realizations of the model climate. We find anomaly correlations of 0.20, but peak events (>P97) are entirely uncorrelated. By comparing return values from individual members with return values of subsamples of the data set we also find that the estimates follow the same distribution and appear unaffected by correlations in the ensemble. The annual mean and variance over the 11-year archived period exhibit no significant departures from stationarity compared with a recent reforecast, i.e., there is no spurious trend due to model upgrades. EPS yields significantly higher return values than ERA-40 and ERA-Interim and is in good agreement with the high-resolution hindcast NORA10, except in the lee of unresolved islands where EPS overestimates and in enclosed seas where it is biased low. Confidence intervals are half the width of those found for ERA-Interim due to the magnitude of the data set.
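
    A minimal Python sketch of the general idea, pooling ensemble forecasts into an equivalent long model-climate record and estimating a return value from block maxima, is given below; the Gumbel-distributed pseudo-data, the 226 "years" and the GEV fit are assumptions for illustration, not the paper's procedure.

      # Minimal sketch (assumption-laden, not the paper's code): pool ensemble wave-height
      # forecasts into an equivalent long climate record and estimate a 100-year return value
      # by fitting a GEV distribution to annual-equivalent block maxima.
      import numpy as np
      from scipy.stats import genextreme

      rng = np.random.default_rng(1)
      # hs_pool stands in for archived +240-h ensemble forecasts of significant wave height,
      # each member/time treated as an independent 6-hourly realization of the model climate.
      hs_pool = rng.gumbel(loc=3.0, scale=1.2, size=226 * 1460)   # ~226 "years" of 6-hourly values

      blocks = hs_pool.reshape(226, -1)            # one row per equivalent year
      annual_max = blocks.max(axis=1)

      shape, loc, scale = genextreme.fit(annual_max)
      rv_100 = genextreme.isf(1.0 / 100.0, shape, loc, scale)      # 100-year return value
      print(f"100-year return value of Hs: {rv_100:.2f} m")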

  5. Heterogeneous autoregressive model with structural break using nearest neighbor truncation volatility estimators for DAX.

    PubMed

    Chin, Wen Cheong; Lee, Min Cherng; Yap, Grace Lee Ching

    2016-01-01

    High frequency financial data modelling has become one of the important research areas in the field of financial econometrics. However, possible structural breaks in volatile financial time series often trigger inconsistency issues in volatility estimation. In this study, we propose a structural-break heavy-tailed heterogeneous autoregressive (HAR) volatility econometric model with the enhancement of jump-robust estimators. The breakpoints in the volatility are captured by dummy variables after detection by the Bai-Perron sequential multiple-breakpoint procedure. In order to further deal with possible abrupt jumps in the volatility, the jump-robust volatility estimators are constructed using the nearest neighbor truncation approach, namely the minimum and median realized volatilities. Under the structural-break improvements in both the models and the volatility estimators, the empirical findings show that the modified HAR model provides the best in-sample and out-of-sample forecast performance compared with the standard HAR models. Accurate volatility forecasts have a direct influence on risk management and investment portfolio analysis.
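
    The Python sketch below illustrates, under stated assumptions, a MedRV-type jump-robust daily estimator feeding a HAR regression with a structural-break dummy; the intraday returns are simulated and the break date is arbitrary rather than the output of a Bai-Perron detection.

      # Minimal sketch (assumptions noted in comments, not the authors' implementation):
      # a MedRV jump-robust daily volatility estimator fed into a HAR regression with a
      # structural-break dummy. The break date here is illustrative, not Bai-Perron output.
      import numpy as np

      def medrv(intraday_returns):
          """Median realized volatility (Andersen, Dobrev & Schaumburg) for one day."""
          r = np.asarray(intraday_returns)
          m = len(r)
          med = np.median(np.abs(np.column_stack([r[:-2], r[1:-1], r[2:]])), axis=1)
          return (np.pi / (6 - 4 * np.sqrt(3) + np.pi)) * (m / (m - 2)) * np.sum(med ** 2)

      rng = np.random.default_rng(2)
      days, m = 800, 78
      rv = np.array([medrv(rng.normal(scale=0.001, size=m)) for _ in range(days)])

      rv_d = rv[21:-1]                                                  # daily component
      rv_w = np.array([rv[t - 5:t].mean() for t in range(22, days)])    # weekly component
      rv_m = np.array([rv[t - 22:t].mean() for t in range(22, days)])   # monthly component
      break_dummy = (np.arange(22, days) >= 400).astype(float)          # illustrative break point
      y = rv[22:]                                                       # next-day realized volatility

      X = np.column_stack([np.ones_like(rv_d), rv_d, rv_w, rv_m, break_dummy])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("HAR coefficients [const, daily, weekly, monthly, break]:", np.round(beta, 6))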

  6. Forecast calls for continued period of active hurricane seasons in the North Atlantic

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    “I have been designated as a representative of Chicken Little to tell you the sky is falling with regard to hurricanes.” So said William Gray, professor of atmospheric science at Colorado State University, at a July 26 briefing on Capitol Hill. The briefing, sponsored by the Congressional Natural Hazards Caucus, the (U.S.) University Corporation for Atmospheric Research, and the American Meteorological Society, highlighted a new report about the current active hurricane period in the North Atlantic, as well as funding needs for hurricane research. “It is amazing the threat we appear to be in for in the next two to three decades, and how little realization of this [there] is with the government and with the general public,” said Gray, a long-time forecaster of seasonal hurricane activity and co-author of a July 19 article in Science, “The Recent Increase in Atlantic Hurricane Activity: Causes and Implications.”

  7. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    PubMed Central

    Jensen, Tue V.; Pinson, Pierre

    2017-01-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation. PMID:29182600

  8. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system.

    PubMed

    Jensen, Tue V; Pinson, Pierre

    2017-11-28

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  9. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    NASA Astrophysics Data System (ADS)

    Jensen, Tue V.; Pinson, Pierre

    2017-11-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information on generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  10. Model-Aided Altimeter-Based Water Level Forecasting System in Mekong River

    NASA Astrophysics Data System (ADS)

    Chang, C. H.; Lee, H.; Hossain, F.; Okeowo, M. A.; Basnayake, S. B.; Jayasinghe, S.; Saah, D. S.; Anderson, E.; Hwang, E.

    2017-12-01

    The Mekong River, one of the largest river systems in the world, has a drainage area of about 795,000 km2 covering six countries. People living in its drainage area rely heavily on the resources provided by the river for agriculture, fishery, and hydropower. Monitoring and forecasting the water level in a timely manner is therefore urgently needed for the Mekong River. Recently, using TOPEX/Poseidon (T/P) altimetry water level measurements in India, Biancamaria et al. [2011] demonstrated the capability of an altimeter-based flood forecasting system in Bangladesh, with RMSE from 0.6-0.8 m for lead times up to 5 days on a 10-day basis due to T/P's repeat period. Hossain et al. [2013] further established a daily water level forecasting system in Bangladesh using observations from Jason-2 in India and the HEC-RAS hydraulic model, with RMSE from 0.5-1.5 m and an underestimating mean bias of 0.25-1.25 m. However, such a daily forecasting system relies on a collection of Jason-2 virtual stations (VSs) to ensure frequent sampling and data availability. Since the Mekong River is a meridional river with a small number of VSs, the direct application of this system to the Mekong River is challenging. To address this problem, we propose a model-aided altimeter-based forecasting system. The discharge output by the Variable Infiltration Capacity hydrologic model is used to reconstruct a daily water level product at upstream Jason-2 VSs based on the discharge-to-level rating curve. The reconstructed daily water level is then used in regression analysis against downstream in-situ water levels to build regression models, which are used to issue daily water level forecasts. In the middle reach of the Mekong River, from Nakhon Phanom to Kratie, a 3-day lead time forecast reaches an RMSE of about 0.7-1.3 m with a correlation coefficient around 0.95. For the lower reach of the Mekong River, the water flow becomes more complicated due to the reversal of flow between the Tonle Sap Lake and the Mekong River, while ocean tides can also propagate into this region. By considering the influence of the Tonle Sap Lake and the Mekong River through multi-variable regression analysis, the forecasts from Prek Kdam to Chau Doc/Tan Chau reach an RMSE of about 0.3-0.65 m and correlation coefficients of about 0.93-0.97 with a 5-day lead time.
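
    A minimal Python sketch of the two-step idea, rating-curve reconstruction of an upstream level followed by a lagged regression to the downstream gauge, is shown below; the rating-curve coefficients, travel time and series are hypothetical, not the operational system's values.

      # Minimal sketch (illustrative names and coefficients, not the operational system):
      # (1) convert model discharge to an upstream water level through a rating curve,
      # (2) regress the downstream in-situ level at a 3-day lead on that upstream level.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(3)
      n_days = 1000
      discharge = 5000 + 2000 * np.sin(2 * np.pi * np.arange(n_days) / 365) + rng.normal(0, 200, n_days)

      a, b = 0.05, 0.6                       # hypothetical rating-curve parameters h = a * Q**b
      upstream_level = a * discharge ** b    # reconstructed daily level at the altimetry VS

      lead = 3                               # assumed travel time to the downstream gauge (days)
      downstream_level = 0.8 * np.roll(upstream_level, lead) + rng.normal(0, 0.2, n_days)

      X = upstream_level[:-lead].reshape(-1, 1)
      y = downstream_level[lead:]
      split = int(0.8 * n_days)
      model = LinearRegression().fit(X[:split], y[:split])
      pred = model.predict(X[split:])
      rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
      print(f"3-day lead RMSE: {rmse:.2f} m")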

  11. Multivariate Models of Adult Pacific Salmon Returns

    PubMed Central

    Burke, Brian J.; Peterson, William T.; Beckman, Brian R.; Morgan, Cheryl; Daly, Elizabeth A.; Litz, Marisa

    2013-01-01

    Most modeling and statistical approaches encourage simplicity, yet ecological processes are often complex, as they are influenced by numerous dynamic environmental and biological factors. Pacific salmon abundance has been highly variable over the last few decades and most forecasting models have proven inadequate, primarily because of a lack of understanding of the processes affecting variability in survival. Better methods and data for predicting the abundance of returning adults are therefore required to effectively manage the species. We combined 31 distinct indicators of the marine environment collected over an 11-year period into a multivariate analysis to summarize and predict adult spring Chinook salmon returns to the Columbia River in 2012. In addition to forecasts, this tool quantifies the strength of the relationship between various ecological indicators and salmon returns, allowing interpretation of ecosystem processes. The relative importance of indicators varied, but a few trends emerged. Adult returns of spring Chinook salmon were best described using indicators of bottom-up ecological processes such as composition and abundance of zooplankton and fish prey as well as measures of individual fish, such as growth and condition. Local indicators of temperature or coastal upwelling did not contribute as much as large-scale indicators of temperature variability, matching the spatial scale over which salmon spend the majority of their ocean residence. Results suggest that effective management of Pacific salmon requires multiple types of data and that no single indicator can represent the complex early-ocean ecology of salmon. PMID:23326586

  12. Forecasting daily source air quality using multivariate statistical analysis and radial basis function networks.

    PubMed

    Sun, Gang; Hoff, Steven J; Zelle, Brian C; Nelson, Minda A

    2008-12-01

    It is vital to forecast gas and particulate matter concentrations and emission rates (GPCER) from livestock production facilities to assess the impact of airborne pollutants on human health, the ecological environment, and global warming. Modeling source air quality is a complex process because of abundant nonlinear interactions between GPCER and other factors. The objective of this study was to introduce statistical methods and a radial basis function (RBF) neural network to predict daily source air quality in Iowa swine deep-pit finishing buildings. The results show that four variables (outdoor and indoor temperature, animal units, and ventilation rates) were identified as relatively important model inputs using statistical methods. It can be further demonstrated that only two factors, the environment factor and the animal factor, were capable of explaining more than 94% of the total variability after performing principal component analysis. The introduction of fewer uncorrelated variables to the neural network would result in reduced model structure complexity, minimize computation cost, and eliminate model overfitting problems. The obtained results of the RBF network prediction were in good agreement with the actual measurements, with values of the correlation coefficient between 0.741 and 0.995 and very low values of systemic performance indexes for all the models. These good results indicated the RBF network could be trained to model these highly nonlinear relationships. Thus, RBF neural network technology combined with multivariate statistical methods is a promising tool for air pollutant emissions modeling.
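
    The following sketch mimics that workflow on synthetic data: standardize the four inputs, reduce them with PCA, and fit an RBF-kernel regressor; kernel ridge regression is used here as a stand-in for the RBF network described in the paper, and all variable names and values are invented.

      # Minimal sketch (stand-in, not the authors' model): PCA dimension reduction followed by
      # an RBF-kernel regressor (kernel ridge as a proxy for an RBF network) predicting a
      # daily emission quantity. Input names and the synthetic target are illustrative.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.kernel_ridge import KernelRidge
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(4)
      n = 365
      X = np.column_stack([
          rng.normal(20, 8, n),    # outdoor temperature
          rng.normal(22, 2, n),    # indoor temperature
          rng.normal(500, 50, n),  # animal units
          rng.normal(30, 10, n),   # ventilation rate
      ])
      y = 0.4 * X[:, 1] + 0.02 * X[:, 2] + 0.1 * X[:, 3] + rng.normal(0, 1, n)  # synthetic GPCER

      model = make_pipeline(StandardScaler(), PCA(n_components=2),
                            KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5))
      model.fit(X[:300], y[:300])
      r = np.corrcoef(model.predict(X[300:]), y[300:])[0, 1]
      print(f"correlation on held-out days: {r:.2f}")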

  13. Sequential Linker Installation: Precise Placement of Functional Groups in Multivariate Metal-Organic Frameworks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan, S; Lu, WG; Chen, YP

    2015-03-11

    A unique strategy, sequential linker installation (SLI), has been developed to construct multivariate MOFs with functional groups precisely positioned. PCN-700, a Zr-MOF with eight-connected Zr6O4(OH)8(H2O)4 clusters, has been judiciously designed; the Zr6 clusters in this MOF are arranged in such a fashion that, by replacement of terminal OH-/H2O ligands, subsequent insertion of linear dicarboxylate linkers is achieved. We demonstrate that linkers with distinct lengths and functionalities can be sequentially installed into PCN-700. Single-crystal to single-crystal transformation is realized so that the positions of the subsequently installed linkers are pinpointed via single-crystal X-ray diffraction analyses. This methodology provides a powerful tool to construct multivariate MOFs with precisely positioned functionalities in the desired proximity, which would otherwise be difficult to achieve.

  14. Forecasts of non-Gaussian parameter spaces using Box-Cox transformations

    NASA Astrophysics Data System (ADS)

    Joachimi, B.; Taylor, A. N.

    2011-09-01

    Forecasts of statistical constraints on model parameters using the Fisher matrix abound in many fields of astrophysics. The Fisher matrix formalism involves the assumption of Gaussianity in parameter space and hence fails to predict complex features of posterior probability distributions. Combining the standard Fisher matrix with Box-Cox transformations, we propose a novel method that accurately predicts arbitrary posterior shapes. The Box-Cox transformations are applied to parameter space to render it approximately multivariate Gaussian, performing the Fisher matrix calculation on the transformed parameters. We demonstrate that, after the Box-Cox parameters have been determined from an initial likelihood evaluation, the method correctly predicts changes in the posterior when varying various parameters of the experimental setup and the data analysis, with marginally higher computational cost than a standard Fisher matrix calculation. We apply the Box-Cox-Fisher formalism to forecast cosmological parameter constraints by future weak gravitational lensing surveys. The characteristic non-linear degeneracy between the matter density parameter and the normalization of matter density fluctuations is reproduced for several cases, and the capability of breaking this degeneracy with weak-lensing three-point statistics is investigated. Possible applications of Box-Cox transformations of posterior distributions are discussed, including the prospects for performing statistical data analysis steps in the transformed Gaussianized parameter space.
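
    A small Python sketch of the underlying idea, under simplifying assumptions, is given below: Box-Cox-transform a skewed two-parameter sample, form a Gaussian (Fisher-like) approximation in the transformed space, and map predictions back to the original parameters; the lognormal "posterior" and the banana-shaped degeneracy are toy stand-ins, not the paper's formalism.

      # Minimal sketch (toy example, not the paper's formalism): Gaussianize skewed posterior
      # samples with Box-Cox transforms, build a multivariate Gaussian approximation in the
      # transformed space, and map predicted samples back to the original parameters.
      import numpy as np
      from scipy.stats import boxcox, skew
      from scipy.special import inv_boxcox

      rng = np.random.default_rng(5)
      omega_m = rng.lognormal(mean=np.log(0.3), sigma=0.2, size=20000)             # skewed toy "posterior"
      sigma_8 = 0.8 * (0.3 / omega_m) ** 0.5 + rng.normal(0, 0.01, omega_m.size)   # banana-like degeneracy

      t1, lam1 = boxcox(omega_m)            # Box-Cox parameters fitted from the initial sample
      t2, lam2 = boxcox(sigma_8)
      print("skewness before:", np.round([skew(omega_m), skew(sigma_8)], 2),
            "after:", np.round([skew(t1), skew(t2)], 2))

      mu = np.mean([t1, t2], axis=1)
      cov = np.cov(np.vstack([t1, t2]))     # Gaussian (Fisher-like) approximation in transformed space
      draws = rng.multivariate_normal(mu, cov, size=5000)
      pred_omega_m = inv_boxcox(draws[:, 0], lam1)   # back-transform predicted samples/contours
      pred_sigma_8 = inv_boxcox(draws[:, 1], lam2)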

  15. Learning temporal rules to forecast instability in continuously monitored patients

    PubMed Central

    Dubrawski, Artur; Wang, Donghan; Hravnak, Marilyn; Clermont, Gilles; Pinsky, Michael R

    2017-01-01

    Inductive machine learning, and in particular extraction of association rules from data, has been successfully used in multiple application domains, such as market basket analysis, disease prognosis, fraud detection, and protein sequencing. The appeal of rule extraction techniques stems from their ability to handle intricate problems yet produce models based on rules that can be comprehended by humans, and are therefore more transparent. Human comprehension is a factor that may improve adoption and use of data-driven decision support systems clinically via face validity. In this work, we explore whether we can reliably and informatively forecast cardiorespiratory instability (CRI) in step-down unit (SDU) patients utilizing data from continuous monitoring of physiologic vital sign (VS) measurements. We use a temporal association rule extraction technique in conjunction with a rule fusion protocol to learn how to forecast CRI in continuously monitored patients. We detail our approach and present and discuss encouraging empirical results obtained using continuous multivariate VS data from the bedside monitors of 297 SDU patients spanning 29 346 hours (3.35 patient-years) of observation. We present example rules that have been learned from data to illustrate potential benefits of comprehensibility of the extracted models, and we analyze the empirical utility of each VS as a potential leading indicator of an impending CRI event. PMID:27274020
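
    As a toy illustration of scoring temporal rules on monitoring data, the Python sketch below computes support and confidence for rules of the form "antecedent within the last w minutes precedes CRI within the next h minutes"; the vital-sign thresholds, windows and labels are hypothetical and unrelated to the study's learner.

      # Minimal sketch (illustrative, not the study's rule learner): score simple temporal rules
      # "antecedent holds in the last w minutes -> CRI within the next h minutes" by support and
      # confidence on a binary event grid. Thresholds and data are hypothetical stand-ins.
      import numpy as np

      rng = np.random.default_rng(6)
      minutes = 5000
      hr = 80 + rng.normal(0, 10, minutes)          # heart rate stand-in
      spo2 = 97 + rng.normal(0, 2, minutes)         # oxygen saturation stand-in
      cri = rng.random(minutes) < 0.002             # sparse instability events (stand-in labels)

      def rule_stats(antecedent, outcome, window=30, horizon=60):
          """Support/confidence of: antecedent seen in (t-window, t] -> outcome in (t, t+horizon]."""
          fired = np.array([antecedent[max(0, t - window):t + 1].any() for t in range(len(outcome))])
          future = np.array([outcome[t + 1:t + 1 + horizon].any() for t in range(len(outcome))])
          support = np.mean(fired & future)
          confidence = future[fired].mean() if fired.any() else 0.0
          return support, confidence

      for name, ante in [("HR > 110", hr > 110), ("SpO2 < 92", spo2 < 92)]:
          s, c = rule_stats(ante, cri)
          print(f"rule [{name} -> CRI within 60 min]: support={s:.4f}, confidence={c:.3f}")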

  16. Multivariate and Multiscale Data Assimilation in Terrestrial Systems: A Review

    PubMed Central

    Montzka, Carsten; Pauwels, Valentijn R. N.; Franssen, Harrie-Jan Hendricks; Han, Xujun; Vereecken, Harry

    2012-01-01

    More and more terrestrial observational networks are being established to monitor climatic, hydrological and land-use changes in different regions of the World. In these networks, time series of states and fluxes are recorded in an automated manner, often with a high temporal resolution. These data are important for the understanding of water, energy, and/or matter fluxes, as well as their biological and physical drivers and interactions with and within the terrestrial system. Similarly, the number and accuracy of variables, which can be observed by spaceborne sensors, are increasing. Data assimilation (DA) methods utilize these observations in terrestrial models in order to increase process knowledge as well as to improve forecasts for the system being studied. The widely implemented automation in observing environmental states and fluxes makes an operational computation more and more feasible, and it opens the perspective of short-time forecasts of the state of terrestrial systems. In this paper, we review the state of the art with respect to DA focusing on the joint assimilation of observational data precedents from different spatial scales and different data types. An introduction is given to different DA methods, such as the Ensemble Kalman Filter (EnKF), Particle Filter (PF) and variational methods (3/4D-VAR). In this review, we distinguish between four major DA approaches: (1) univariate single-scale DA (UVSS), which is the approach used in the majority of published DA applications, (2) univariate multiscale DA (UVMS) referring to a methodology which acknowledges that at least some of the assimilated data are measured at a different scale than the computational grid scale, (3) multivariate single-scale DA (MVSS) dealing with the assimilation of at least two different data types, and (4) combined multivariate multiscale DA (MVMS). Finally, we conclude with a discussion on the advantages and disadvantages of the assimilation of multiple data types in a simulation model. Existing approaches can be used to simultaneously update several model states and model parameters if applicable. In other words, the basic principles for multivariate data assimilation are already available. We argue that a better understanding of the measurement errors for different observation types, improved estimates of observation bias and improved multiscale assimilation methods for data which scale nonlinearly is important to properly weight them in multiscale multivariate data assimilation. In this context, improved cross-validation of different data types, and increased ground truth verification of remote sensing products are required. PMID:23443380
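
    For concreteness, a generic stochastic ensemble Kalman filter analysis step that jointly assimilates two observation types into a multivariate model state is sketched below in Python; the dimensions, observation operator and error covariances are illustrative choices, not tied to any system reviewed here.

      # Minimal sketch (generic EnKF analysis step, not any reviewed system): jointly assimilate
      # two observation types (e.g., soil moisture and a flux) into a multivariate state ensemble.
      import numpy as np

      rng = np.random.default_rng(7)
      n_state, n_ens = 50, 32
      Xf = rng.normal(size=(n_state, n_ens))            # forecast ensemble (columns = members)

      H = np.zeros((2, n_state)); H[0, 10] = 1.0; H[1, 40] = 1.0   # observe two state components
      R = np.diag([0.1, 0.2])                                       # observation error covariance
      y = np.array([0.5, -0.3])                                     # multivariate observation vector

      Xm = Xf.mean(axis=1, keepdims=True)
      A = Xf - Xm                                                   # ensemble anomalies
      Pf = A @ A.T / (n_ens - 1)                                    # sample forecast covariance
      K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)                # Kalman gain

      Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(2), R, size=n_ens).T   # perturbed obs
      Xa = Xf + K @ (Y_pert - H @ Xf)                               # analysis ensemble
      print("analysis mean at observed components:", Xa.mean(axis=1)[[10, 40]].round(3))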

  17. Multivariate and multiscale data assimilation in terrestrial systems: a review.

    PubMed

    Montzka, Carsten; Pauwels, Valentijn R N; Franssen, Harrie-Jan Hendricks; Han, Xujun; Vereecken, Harry

    2012-11-26

    More and more terrestrial observational networks are being established to monitor climatic, hydrological and land-use changes in different regions of the World. In these networks, time series of states and fluxes are recorded in an automated manner, often with a high temporal resolution. These data are important for the understanding of water, energy, and/or matter fluxes, as well as their biological and physical drivers and interactions with and within the terrestrial system. Similarly, the number and accuracy of variables, which can be observed by spaceborne sensors, are increasing. Data assimilation (DA) methods utilize these observations in terrestrial models in order to increase process knowledge as well as to improve forecasts for the system being studied. The widely implemented automation in observing environmental states and fluxes makes an operational computation more and more feasible, and it opens the perspective of short-time forecasts of the state of terrestrial systems. In this paper, we review the state of the art with respect to DA focusing on the joint assimilation of observational data precedents from different spatial scales and different data types. An introduction is given to different DA methods, such as the Ensemble Kalman Filter (EnKF), Particle Filter (PF) and variational methods (3/4D-VAR). In this review, we distinguish between four major DA approaches: (1) univariate single-scale DA (UVSS), which is the approach used in the majority of published DA applications, (2) univariate multiscale DA (UVMS) referring to a methodology which acknowledges that at least some of the assimilated data are measured at a different scale than the computational grid scale, (3) multivariate single-scale DA (MVSS) dealing with the assimilation of at least two different data types, and (4) combined multivariate multiscale DA (MVMS). Finally, we conclude with a discussion on the advantages and disadvantages of the assimilation of multiple data types in a simulation model. Existing approaches can be used to simultaneously update several model states and model parameters if applicable. In other words, the basic principles for multivariate data assimilation are already available. We argue that a better understanding of the measurement errors for different observation types, improved estimates of observation bias and improved multiscale assimilation methods for data which scale nonlinearly is important to properly weight them in multiscale multivariate data assimilation. In this context, improved cross-validation of different data types, and increased ground truth verification of remote sensing products are required.

  18. Creating Weather System Ensembles Through Synergistic Process Modeling and Machine Learning

    NASA Astrophysics Data System (ADS)

    Chen, B.; Posselt, D. J.; Nguyen, H.; Wu, L.; Su, H.; Braverman, A. J.

    2017-12-01

    Earth's weather and climate are sensitive to a variety of control factors (e.g., initial state, forcing functions, etc.). Characterizing the response of the atmosphere to a change in initial conditions or model forcing is critical for weather forecasting (ensemble prediction) and climate change assessment. Input-response relationships can be quantified by generating an ensemble of multiple (100s to 1000s) realistic realizations of weather and climate states. Atmospheric numerical models generate simulated data through discretized numerical approximation of the partial differential equations (PDEs) governing the underlying physics. However, the computational expense of running high-resolution atmospheric state models makes generation of more than a few simulations infeasible. Here, we discuss an experiment wherein we approximate the numerical PDE solver within the Weather Research and Forecasting (WRF) Model using neural networks trained on a subset of model run outputs. Once trained, these neural nets can produce a large number of realizations of weather states from a small number of deterministic simulations, at speeds that are orders of magnitude faster than the underlying PDE solver. Our neural network architecture is inspired by the governing partial differential equations. These equations are location-invariant and consist of first and second derivatives. As such, we use a 3x3 lon-lat grid of atmospheric profiles as the predictor in the neural net to provide the network the information necessary to compute the first and second moments. Results indicate that the neural network algorithm can approximate the PDE outputs with a high degree of accuracy (less than 1% error), and that this error increases as a function of the prediction time lag.
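
    A toy version of this emulation idea is sketched below in Python: a small MLP is trained to map each 3x3 neighbourhood of a field at one step to the centre value at the next step, with a simple explicit diffusion update standing in for the WRF PDE solver; the grid size, stencil and network size are arbitrary assumptions.

      # Minimal sketch (toy stand-in for the WRF emulation): train an MLP to map each 3x3
      # neighbourhood of a 2-D field at time t to the centre value at t+1, emulating a simple
      # diffusion "PDE solver". Grid size, stencil and network size are illustrative choices.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(8)

      def diffuse(u, k=0.1):
          """One explicit diffusion step (the stand-in numerical PDE solver)."""
          return u + k * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                          np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

      n, steps = 32, 50
      u = rng.normal(size=(n, n))
      X, y = [], []
      for _ in range(steps):
          u_next = diffuse(u)
          for i in range(1, n - 1):
              for j in range(1, n - 1):
                  X.append(u[i - 1:i + 2, j - 1:j + 2].ravel())   # 3x3 neighbourhood at time t
                  y.append(u_next[i, j])                          # centre value at time t+1
          u = u_next

      emulator = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=50, random_state=0)
      emulator.fit(np.asarray(X), np.asarray(y))
      print("emulator R^2 on training stencils:", round(emulator.score(np.asarray(X), np.asarray(y)), 3))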

  19. Advances in Landslide Hazard Forecasting: Evaluation of Global and Regional Modeling Approach

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia B.; Adler, Robert; Hone, Yang; Kumar, Sujay; Peters-Lidard, Christa; Lerner-Lam, Arthur

    2010-01-01

    A prototype global satellite-based landslide hazard algorithm has been developed to identify areas that exhibit a high potential for landslide activity by combining a calculation of landslide susceptibility with satellite-derived rainfall estimates. A recent evaluation of this algorithm framework found that while this tool represents an important first step in larger-scale landslide forecasting efforts, it requires several modifications before it can be fully realized as an operational tool. The evaluation finds that landslide forecasting may be more feasible at a regional scale. This study draws upon that prior work's recommendations to develop a new approach for considering landslide susceptibility and forecasting at the regional scale. This case study uses a database of landslides triggered by Hurricane Mitch in 1998 over four countries in Central America: Guatemala, Honduras, El Salvador and Nicaragua. A regional susceptibility map is calculated from satellite and surface datasets using a statistical methodology. The susceptibility map is tested with a regional rainfall intensity-duration triggering relationship, and results are compared to the global algorithm framework for the Hurricane Mitch event. The statistical results suggest that this regional investigation provides one plausible way to approach some of the data and resolution issues identified in the global assessment, providing more realistic landslide forecasts for this case study. Evaluation of landslide hazards for this extreme event helps to identify several potential improvements of the algorithm framework, but also highlights several remaining challenges for algorithm assessment, transferability and performance accuracy. Evaluation challenges include representation errors from comparing susceptibility maps of different spatial resolutions, biases in event-based landslide inventory data, and limited non-landslide event data for more comprehensive evaluation. Additional factors that may improve algorithm performance accuracy include incorporating additional triggering factors such as tectonic activity, anthropogenic impacts and soil moisture into the algorithm calculation. Despite these limitations, the methodology presented in this regional evaluation is both straightforward to calculate and easy to interpret, making results transferable between regions and allowing findings to be placed within an inter-comparison framework. The regional algorithm scenario represents an important step in advancing regional and global-scale landslide hazard assessment and forecasting.
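
    A minimal Python sketch of combining a susceptibility class with a rainfall intensity-duration threshold of the form I = a*D^(-b) is given below; the parameter values and the susceptibility scale are illustrative, not the study's calibration.

      # Minimal sketch (illustrative parameters, not the study's calibrated values): combine a
      # susceptibility class with a rainfall intensity-duration threshold I = a * D**(-b) to flag
      # potential landslide activity for a grid cell.
      def landslide_nowcast(susceptibility, rain_intensity_mmhr, duration_hr, a=10.0, b=0.4, susc_min=3):
          """Flag a cell when susceptibility is high enough and rainfall exceeds the I-D threshold."""
          threshold = a * duration_hr ** (-b)          # critical intensity for this storm duration
          return susceptibility >= susc_min and rain_intensity_mmhr >= threshold

      # Example: a highly susceptible cell under 8 mm/hr of rain sustained for 24 hours.
      print(landslide_nowcast(susceptibility=4, rain_intensity_mmhr=8.0, duration_hr=24.0))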

  20. Test operation of a real-time tsunami inundation forecast system using actual data observed by S-net

    NASA Astrophysics Data System (ADS)

    Suzuki, W.; Yamamoto, N.; Miyoshi, T.; Aoi, S.

    2017-12-01

    If tsunami inundation information can be forecast rapidly and stably before a large tsunami strikes, the information would effectively help people realize the impending danger and the necessity of evacuation. Toward that goal, we have developed a prototype system to perform real-time tsunami inundation forecasts for Chiba prefecture, eastern Japan, using offshore ocean bottom pressure data observed by the seafloor observation network for earthquakes and tsunamis along the Japan Trench (S-net) (Aoi et al., 2015, AGU). Because tsunami inundation simulation requires a large computational cost, we employ a database approach that searches for pre-calculated tsunami scenarios that reasonably explain the observed S-net pressure data, based on the multi-index method (Yamamoto et al., 2016, EPS). The scenario search is repeated regularly, not triggered by the occurrence of a tsunami event, and the forecast information is generated from the selected scenarios that meet the criterion. Test operation of the prototype system using actual observation data started in April 2017, and the performance and behavior of the system during non-tsunami event periods have been examined. It is found that the treatment of the noise affecting the observed data is the main issue to be solved toward the improvement of the system. Even if the observed pressure data are filtered to extract tsunami signals, noise under ordinary conditions, or unusually large noise such as high ocean waves due to storms, affects the comparison between the observed and scenario data. Due to this noise, tsunami scenarios are selected and a tsunami is forecast even though no tsunami event has actually occurred. In most cases, the scenarios selected due to noise have fault models in the region along the Kurile or Izu-Bonin Trenches, far from the S-net region, or fault models below land. Based on parallel operation of the forecast system with a different scenario search condition and examination of the fault models, we improve the stability and performance of the forecast system. This work was supported by the Council for Science, Technology and Innovation (CSTI), Cross-ministerial Strategic Innovation Promotion Program (SIP), "Enhancement of societal resiliency against natural disasters" (Funding agency: JST).
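
    The Python sketch below shows a generic nearest-scenario search against a pre-computed database, which conveys the spirit (though not the detail) of the multi-index matching described above; the waveform database, noise level and selection criterion are synthetic assumptions.

      # Minimal sketch (generic nearest-scenario matching, not the actual multi-index method):
      # compare filtered offshore pressure observations with pre-computed scenario waveforms and
      # keep the scenarios whose misfit falls below a criterion. Names and sizes are illustrative.
      import numpy as np

      rng = np.random.default_rng(9)
      n_scenarios, n_stations, n_samples = 500, 20, 120
      scenario_db = rng.normal(size=(n_scenarios, n_stations, n_samples))   # pre-computed waveforms
      observed = scenario_db[42] + rng.normal(0, 0.1, size=(n_stations, n_samples))  # noisy "event"

      # Root-mean-square misfit between observations and every scenario, averaged over stations.
      misfit = np.sqrt(((scenario_db - observed[None]) ** 2).mean(axis=(1, 2)))
      criterion = np.percentile(misfit, 1)                  # keep the best-fitting ~1% of scenarios
      selected = np.where(misfit <= criterion)[0]
      print("selected scenarios:", selected, "(the true scenario 42 should be included)")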

  1. Forecasting methodologies for Ganoderma spore concentration using combined statistical approaches and model evaluations

    NASA Astrophysics Data System (ADS)

    Sadyś, Magdalena; Skjøth, Carsten Ambelas; Kennedy, Roy

    2016-04-01

    High concentration levels of Ganoderma spp. spores were observed in Worcester, UK, during 2006-2010. These basidiospores are known to cause sensitization due to their allergen content and their small dimensions, which enable them to penetrate the lower part of the respiratory tract in humans. Establishing a link between occurring symptoms of sensitization to Ganoderma spp. and other basidiospores is challenging due to the lack of information regarding spore concentration in the air. Hence, aerobiological monitoring should be conducted and, if possible, extended with the construction of forecast models. The daily mean concentration of allergenic Ganoderma spp. spores in the atmosphere of Worcester was measured using a 7-day volumetric spore sampler over five consecutive years. The relationships between the presence of spores in the air and weather parameters were examined. Forecast models were constructed for Ganoderma spp. spores using advanced statistical techniques, i.e. multivariate regression trees and artificial neural networks. Dew point temperature, along with maximum temperature, was the most important factor influencing the presence of spores in the air of Worcester. Based on these two major factors and several others of lesser importance, thresholds for certain levels of fungal spore concentration, i.e. low (0-49 s m-3), moderate (50-99 s m-3), high (100-149 s m-3) and very high (≥150 s m-3), could be designated. Despite some deviation in the results obtained by artificial neural networks, the authors achieved a forecasting model that was accurate (the correlation between observed and predicted values varied from r_s = 0.57 to r_s = 0.68).
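
    A compact Python sketch of the forecasting idea, using a regression tree on dew point and maximum temperature and binning the predicted concentration into the four alert levels, is given below; the data and tree settings are synthetic stand-ins for the published models.

      # Minimal sketch (stand-in for the paper's models): a regression tree driven by dew point and
      # maximum temperature predicting daily spore concentration, binned into four alert levels.
      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(10)
      n = 600
      dew_point = rng.normal(10, 5, n)
      max_temp = rng.normal(18, 6, n)
      spores = np.clip(8 * dew_point + 3 * max_temp + rng.normal(0, 30, n), 0, None)  # synthetic s/m3

      features = np.column_stack([dew_point, max_temp])
      tree = DecisionTreeRegressor(max_depth=4).fit(features[:500], spores[:500])
      pred = tree.predict(features[500:])

      levels = np.digitize(pred, bins=[50, 100, 150])   # 0: low, 1: moderate, 2: high, 3: very high
      print("forecast alert levels (first 10 days):", levels[:10])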

  2. Forecasting methodologies for Ganoderma spore concentration using combined statistical approaches and model evaluations.

    PubMed

    Sadyś, Magdalena; Skjøth, Carsten Ambelas; Kennedy, Roy

    2016-04-01

    High concentration levels of Ganoderma spp. spores were observed in Worcester, UK, during 2006-2010. These basidiospores are known to cause sensitization due to their allergen content and their small dimensions, which enable them to penetrate the lower part of the respiratory tract in humans. Establishing a link between occurring symptoms of sensitization to Ganoderma spp. and other basidiospores is challenging due to the lack of information regarding spore concentration in the air. Hence, aerobiological monitoring should be conducted and, if possible, extended with the construction of forecast models. The daily mean concentration of allergenic Ganoderma spp. spores in the atmosphere of Worcester was measured using a 7-day volumetric spore sampler over five consecutive years. The relationships between the presence of spores in the air and weather parameters were examined. Forecast models were constructed for Ganoderma spp. spores using advanced statistical techniques, i.e. multivariate regression trees and artificial neural networks. Dew point temperature, along with maximum temperature, was the most important factor influencing the presence of spores in the air of Worcester. Based on these two major factors and several others of lesser importance, thresholds for certain levels of fungal spore concentration, i.e. low (0-49 s m(-3)), moderate (50-99 s m(-3)), high (100-149 s m(-3)) and very high (≥150 s m(-3)), could be designated. Despite some deviation in the results obtained by artificial neural networks, the authors achieved a forecasting model that was accurate (the correlation between observed and predicted values varied from r_s = 0.57 to r_s = 0.68).

  3. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy for complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
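
    A generic Python sketch of a Hermite expansion fitted by collocation is shown below; the toy model response, the single standard normal input and the fourth-order truncation are assumptions for illustration rather than the study's setup.

      # Minimal sketch (generic, not the study's implementation): represent an uncertain model
      # output as a probabilists' Hermite expansion in a standard normal variable, with the
      # coefficients estimated by least squares at collocation points.
      import numpy as np
      from numpy.polynomial import hermite_e as He

      def model_output(k):
          """Toy hydrological response to an uncertain parameter k (stand-in for the real model)."""
          return np.exp(0.3 * k) + 0.1 * k ** 2

      rng = np.random.default_rng(11)
      xi = rng.normal(size=200)                      # collocation points in the standard normal variable
      y = model_output(xi)

      degree = 4
      coeffs = He.hermefit(xi, y, degree)            # least-squares expansion coefficients
      xi_new = rng.normal(size=100000)
      y_pce = He.hermeval(xi_new, coeffs)            # cheap surrogate evaluations for uncertainty analysis
      print("surrogate mean/std:", y_pce.mean().round(3), y_pce.std().round(3))
      print("direct    mean/std:", model_output(xi_new).mean().round(3), model_output(xi_new).std().round(3))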

  4. Tropical Pacific moisture variability: Its detection, synoptic structure and consequences in the general circulation

    NASA Technical Reports Server (NTRS)

    Mcguirk, James P.

    1990-01-01

    Satellite data analysis tools are developed and implemented for the diagnosis of atmospheric circulation systems over the tropical Pacific Ocean. The tools include statistical multivariate procedures, a multi-spectral radiative transfer model, and the global spectral forecast model at NMC. Data include in-situ observations; satellite observations from VAS (moisture, infrared and visible), the NOAA polar orbiters (including TIROS Operational Vertical Sounder (TOVS) multi-channel sounding data and OLR grids) and the scanning multichannel microwave radiometer (SMMR); and European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. A primary goal is a better understanding of the relation between synoptic structures of the area, particularly tropical plumes, and the general circulation, especially the Hadley circulation. A second goal is the definition of the quantitative structure and behavior of all Pacific tropical synoptic systems. Finally, strategies are examined for extracting new and additional information from existing satellite observations. Although moisture structure is emphasized, thermal patterns are also analyzed. Both horizontal and vertical structures are studied and objective quantitative results are emphasized.

  5. Preliminary identification of unicellular algal genus by using combined confocal resonance Raman spectroscopy with PCA and DPLS analysis

    NASA Astrophysics Data System (ADS)

    He, Shixuan; Xie, Wanyi; Zhang, Ping; Fang, Shaoxi; Li, Zhe; Tang, Peng; Gao, Xia; Guo, Jinsong; Tlili, Chaker; Wang, Deqiang

    2018-02-01

    The analysis of algae and of the dominant alga plays important roles in ecological and environmental fields, since it can be used to forecast water blooms and control their potential deleterious effects. Herein, we combine in vivo confocal resonance Raman spectroscopy with multivariate analysis methods to preliminarily identify three algal genera in water blooms at the unicellular scale. Statistical analysis of characteristic Raman peaks demonstrates that certain shifts and different normalized intensities, resulting from the composition of different carotenoids, exist in the Raman spectra of the three algal cells. Principal component analysis (PCA) scores and the corresponding loading weights show some differences in Raman spectral characteristics that are caused by vibrations of carotenoids in unicellular algae. Then, the discriminant partial least squares (DPLS) classification method is used to verify the effectiveness of algal identification with confocal resonance Raman spectroscopy. Our results show that confocal resonance Raman spectroscopy combined with PCA and DPLS can handle the preliminary identification of the dominant alga for the forecasting and control of water blooms.
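
    The Python sketch below reproduces the overall workflow on synthetic "spectra": PCA for exploratory separation followed by PLS regression on one-hot genus labels as a simple DPLS-style classifier; the band positions, noise and sample sizes are invented, not the study's data.

      # Minimal sketch (stand-in for the spectral workflow): PCA to inspect spectral variance,
      # then PLS regression on one-hot genus labels (a simple DPLS-style classifier). The synthetic
      # "spectra" only mimic genus-dependent carotenoid band intensities.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(12)
      n_per_genus, n_wavenumbers = 40, 300
      spectra, labels = [], []
      for g in range(3):                                           # three algal genera
          base = np.zeros(n_wavenumbers); base[100 + 20 * g] = 5.0   # genus-specific band position
          spectra.append(base + rng.normal(0, 0.5, size=(n_per_genus, n_wavenumbers)))
          labels += [g] * n_per_genus
      X = np.vstack(spectra); y = np.array(labels)

      scores = PCA(n_components=2).fit_transform(X)                # PC scores for exploratory separation
      Y = np.eye(3)[y]                                             # one-hot targets for discriminant PLS
      pls = PLSRegression(n_components=2).fit(X, Y)
      pred = pls.predict(X).argmax(axis=1)
      print("training classification accuracy:", (pred == y).mean())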

  6. A dynamic factor model of the evaluation of the financial crisis in Turkey.

    PubMed

    Sezgin, F; Kinay, B

    2010-01-01

    Factor analysis has been widely used in economics and finance in situations where a relatively large number of variables are believed to be driven by a few common causes of variation. Dynamic factor analysis (DFA), which is a combination of factor and time series analysis, involves autocorrelation matrices calculated from multivariate time series. Dynamic factor models were traditionally used to construct economic indicators and to support macroeconomic analysis, business cycle research and forecasting. In recent years, dynamic factor models have become more popular in empirical macroeconomics. They have advantages over other methods in various respects. Factor models can, for instance, cope with many variables without running into the scarce degrees of freedom problems often faced in regression-based analysis. In this study, a model which determines the effect of the global crisis on Turkey is proposed. The main aim of the paper is to analyze how several macroeconomic quantities change before the evolution of the crisis and to decide whether a crisis can be forecast or not.
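
    A common two-step shortcut for approximate dynamic factor models, extracting a principal-component factor from a standardized panel and forecasting it with an AR(1), is sketched below in Python as a rough illustration; it is not the estimator used in the paper, and the panel is synthetic.

      # Minimal sketch (a simple approximation, not the paper's estimator): extract a common factor
      # from standardized macro series by principal components and forecast it with an AR(1).
      import numpy as np

      rng = np.random.default_rng(13)
      T, N = 200, 12
      factor = np.zeros(T)
      for t in range(1, T):
          factor[t] = 0.8 * factor[t - 1] + rng.normal()          # latent common dynamics
      X = np.outer(factor, rng.uniform(0.5, 1.5, N)) + rng.normal(0, 1, (T, N))   # observed panel

      Z = (X - X.mean(0)) / X.std(0)                               # standardize the series
      _, _, Vt = np.linalg.svd(Z, full_matrices=False)
      f_hat = Z @ Vt[0]                                            # first principal-component factor

      phi = (f_hat[1:] @ f_hat[:-1]) / (f_hat[:-1] @ f_hat[:-1])   # AR(1) coefficient
      forecast = phi * f_hat[-1]                                   # one-step-ahead factor forecast
      print(f"estimated AR(1) coefficient: {phi:.2f}, next-period factor forecast: {forecast:.2f}")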

  7. New insights on short-term solar irradiance forecast for space weather applications

    NASA Astrophysics Data System (ADS)

    Vieira, L. A.; Dudok de Wit, T.; Balmaceda, L. A.; Dal Lago, A.; Da Silva, L. A.; Gonzalez, W. D.

    2013-12-01

    The conditions of the thermosphere, the ionosphere, the neutral atmosphere, and the oceans on time scales from days to millennia are highly dependent on the solar electromagnetic output, the solar irradiance. The development of physics-based solar irradiance models during the last decade has significantly improved our understanding of the solar forcing of Earth's climate. These models are based on the assumption that most of the solar irradiance variability is related to the magnetic field structure of the Sun. Recently, these models were extended to allow short-term forecasts (1 to 15 days) of the total and spectral solar irradiance. The extension of the irradiance models is based on solar surface magnetic flux models and/or artificial neural network models. Here, we discuss in detail the irradiance forecast models based on observations of the solar surface magnetic field made by the HMI instrument on board the SDO spacecraft. We constrained and validated the models by comparing the model output with observations of the solar irradiance made by instruments onboard the SORCE spacecraft. This study received funding from the European Community's Seventh Framework Programme (FP7/2007-2013, FP7-SPACE-2010-1) under grant agreement nrs. 218816 (SOTERIA project, www.soteria-space.eu) and 261948 (ATMOP, www.atmop.eu), and from CNPq/Brazil under grant number 312488/2012-2. We gratefully thank the instrument teams for making their data available.

  8. Integration of Local Observations into the One Dimensional Fog Model PAFOG

    NASA Astrophysics Data System (ADS)

    Thoma, Christina; Schneider, Werner; Masbou, Matthieu; Bott, Andreas

    2012-05-01

    The numerical prediction of fog requires a very high vertical resolution of the atmosphere. Owing to the prohibitive computational effort of high-resolution three-dimensional models, operational fog forecasting is usually done by means of one-dimensional fog models. An important condition for a successful fog forecast with one-dimensional models is the proper integration of observational data into the numerical simulations. The goal of the present study is to introduce new methods for the consideration of these data in the one-dimensional radiation fog model PAFOG. First, it will be shown how PAFOG may be initialized with observed visibilities. Second, a nudging scheme will be presented for the inclusion of measured temperature and humidity profiles in the PAFOG simulations. The new features of PAFOG have been tested by comparing the model results with observations of the German Meteorological Service. A case study will be presented that reveals the importance of including local observations in the model calculations. Numerical results obtained with the modified PAFOG model show a distinct improvement of fog forecasts regarding the times of fog formation and dissipation as well as the vertical extent of the investigated fog events. However, the model results also reveal that a further improvement of PAFOG might be possible if several empirical model parameters are optimized. This tuning can only be realized by comprehensive comparisons of model simulations with corresponding fog observations.
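
    A minimal Python sketch of a Newtonian-relaxation (nudging) update toward an observed profile is given below; the relaxation time, time step and profiles are illustrative assumptions, not PAFOG's actual scheme.

      # Minimal sketch (generic Newtonian relaxation, not PAFOG's scheme): nudge a modelled
      # temperature profile towards an observed profile with relaxation time tau over a step dt.
      import numpy as np

      def nudge(model_profile, observed_profile, dt=60.0, tau=1800.0):
          """Relax the model profile towards observations: dx/dt = (obs - x) / tau."""
          return model_profile + (dt / tau) * (observed_profile - model_profile)

      model_T = np.array([12.0, 11.5, 11.0, 10.2, 9.5])      # model temperatures at five levels (degC)
      obs_T = np.array([11.0, 10.8, 10.6, 10.1, 9.6])        # measured profile (e.g., from a mast or sonde)
      print(nudge(model_T, obs_T).round(2))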

  9. Forgetting to remember our experiences: People overestimate how much they will retrospect about personal events.

    PubMed

    Tully, Stephanie; Meyvis, Tom

    2017-12-01

    People value experiences in part because of the memories they create. Yet, we find that people systematically overestimate how much they will retrospect about their experiences. This overestimation results from people focusing on their desire to retrospect about experiences, while failing to consider the experience's limited enduring accessibility in memory. Consistent with this view, we find that desirability is a stronger predictor of forecasted retrospection than it is of reported retrospection, resulting in greater overestimation when the desirability of retrospection is higher. Importantly, the desire to retrospect does not change over time. Instead, past experiences become less top-of-mind over time and, as a result, people simply forget to remember. In line with this account, our results show that obtaining physical reminders of an experience reduces the overestimation of retrospection by increasing how much people retrospect, bringing their realized retrospection more in line with their forecasts (and aspirations). We further observe that the extent to which reported retrospection falls short of forecasted retrospection reliably predicts declining satisfaction with an experience over time. Despite this potential negative consequence of retrospection falling short of expectations, we suggest that the initial overestimation itself may in fact be adaptive. This possibility and other potential implications of this work are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. a system approach to the long term forecasting of the climat data in baikal region

    NASA Astrophysics Data System (ADS)

    Abasov, N.; Berezhnykh, T.

    2003-04-01

    The Angara River, running from Lake Baikal with a cascade of hydropower plants built on it, plays a peculiar role in the economy of the region. In view of the high variability of water inflow into the rivers and lakes (long low-water periods and catastrophic floods), which is due to climatic peculiarities of water resource formation, long-term forecasting is developed and applied to decrease risk at hydropower plants. The methodology and methods of long-term forecasting of natural-climatic processes employ some ideas of the research schools of Academician I.P. Druzhinin and Prof. A.P. Reznikhov and consist in a detailed investigation of cause-effect relations, identification of physical analogs and their application to formalized methods of long-term forecasting. They are divided into qualitative methods (the background method; the method of analogs based on solar activity) and probabilistic and approximative methods (analog-similarity relations; the discrete-continuous model). These forecasting methods have been implemented as analytical aids of the information-forecasting software "GIPSAR", which provides for some elements of artificial intelligence. Background forecasts of the runoff of the Ob, the Yenisei and the Angara Rivers in the south of Siberia are based on space-time regularities revealed by taking account of the phase shifts in the occurrence of secular maxima and minima on integral-difference curves of many-year hydrological processes in the compared objects. Solar activity plays an essential role in investigations of global variations of climatic processes. Its consideration in the method of superimposed epochs has allowed the conclusion that a low-water period in the actual inflow to Lake Baikal is more probable on the increasing branch of the 11-year solar activity cycle. A higher probability of a high-water period is observed on the decreasing branch of solar activity, from the 2nd to the 5th year after its maximum. The probabilistic method of forecasting (one year in advance) is based on the property of alternation of series of years with increase and decrease in the observed indicators (characteristic indices) of natural processes. Most of the series (98.4-99.6%) are series of one to three years. The problem of forecasting is divided into two parts: 1) a qualitative forecast of the probability that the started series will either continue or be replaced by a new series during the next year, based on the frequency characteristics of series of years of increase or decrease of the forecasted sequence; 2) a quantitative estimate of the forecasted value in the form of a curve of conditional frequencies, made on the basis of intra-sequence interrelations among hydrometeorological elements by differentiating them with respect to series of years of increase or decrease, by constructing particular curves of conditional frequencies of the runoff for each expected variant of series development and by subsequently constructing a generalized curve. Approximative learning methods form forecasted trajectories of the studied process indices over a long-term perspective. The method of analog-similarity relations is based on the fact that long periods of observations reveal, by definite criteria, some similarities in the character of variability of indices for some fragments of the sequence x(t). The idea of the method is to estimate the similarity of such fragments of the sequence, which are called analogs.
The method applies multistage optimization of external parameters (e.g. the number of iterations of the sliding averaging needed to decompose the sequence into two components: a smoothed one with isolated periodic oscillations and a residual, random one). The method is applicable to forecast terms ranging from the current term up to the double solar cycle. Using a special procedure of integration, it separates the terms with the best results for the given optimization subsample. Several optimal parameter vectors obtained in this way are tested on the examination (verifying) subsample. If the procedure is successful, the forecast is made immediately by integrating several of the best solutions.
    Peculiarities of forecasting extreme processes. Methods of long-term forecasting allow sufficiently reliable forecasts to be made within the interval [x_min + Δ1, x_max - Δ2] (i.e. in the interval of medium values of the indices). Meanwhile, in the intervals close to the extremes, the reliability of forecasts is substantially lower. While for medium values the statistics of the 100-year sequence gives acceptable results, owing to a sufficiently large number of revealed analogs corresponding to the prognostic samples, for extreme values the situation is quite different, first of all by virtue of the scarcity of statistical data. Decreasing the values of Δ1 and Δ2 (Δ1, Δ2 → 0) by including them in the optimization parameters of the considered forecasting methods could be one way to improve the reliability of forecasts. Such an approach has been partially realized in the method of analog-similarity relations, which makes it possible to form a range of possible forecasted trajectories in two variants, from the minimum possible trajectory to the maximum possible one.
    Reliability of long-term forecasts. Both the methodology and the methods considered above have been realized as the information-forecasting system "GIPSAR". The system includes tools implementing several methods of forecasting, analysis of initial and forecasted information, a developed database, a set of tools for verification of algorithms, additional information on the algorithms of statistical processing of sequences (sliding averaging, integral-difference curves, etc.), aids to organize the input of initial information (in its various forms) as well as aids to draw up output prognostic documents.
    Risk management. The normal functioning of the Angara cascade is periodically interrupted by risks of two types that take place in the Baikal, Bratsk and Ust-Ilimsk reservoirs: long low-water periods and sudden periods of extremely high water levels. For example, the low-water periods observed in the reservoirs of the Angara cascade can be classified into four risk categories: 1 - acceptable (negligible reduction of electric power generation by hydropower plants; certain difficulty in meeting environmental and navigation requirements); 2 - significant (substantial reduction of electric power generation by hydropower plants; certain restrictions on water releases for navigation; violation of environmental requirements in some years); 3 - emergency (big losses in electric power generation; limited electricity supply to large consumers; significant restriction of water releases for navigation; threat of exposure of drinking water intake works; violation of environmental requirements for a number of years); 4 - catastrophic (energy crisis; social crisis; exposure of drinking water intake works; termination of navigation; environmental catastrophe).
    Management of energy systems consists in operative and many-year regulation and perspective planning, and has to take into account the analysis of operative data (water reserves in reservoirs), long-term statistics and relations among natural processes, and also forecasts: short-term (for a day, week or decade), long-term and/or super-long-term (from a month to several decades). Natural processes such as the water inflow to reservoirs and air temperatures during heating periods depend in turn on external factors: prevailing types of atmospheric circulation, the intensity of the 11- and 22-year cycles of solar activity, volcanic activity, interaction between the ocean and atmosphere, etc. Until recently, despite the established scientific schools on long-term forecasting (I.P. Druzhinin, A.P. Reznikhov), energy system management has been based only on specially drawn dispatching schedules and long-term hydrometeorological forecasts, without the use of perspective forecasted indices. Insertion of a parallel forecasting block (based on the analysis of data on natural processes and special methods of forecasting) into the scheme can largely smooth the unfavorable consequences of the impact of natural processes on the sustainable development of energy systems, and especially on their safe operation. However, the requirements on the reliability and accuracy of long-term forecasts increase significantly. The considered approach to long-term forecasting can be used for the prediction of mean winter and summer air temperatures, droughts and forest fires.
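
    To make the analog idea concrete, the Python sketch below finds the historical fragments most similar to the latest fragment of a smoothed series and forecasts the next values by averaging the analogs' continuations; the window, horizon and number of analogs are arbitrary choices, not GIPSAR's optimized parameters.

      # Minimal sketch (a generic analog forecast, not the GIPSAR algorithms): match the latest
      # fragment of a smoothed annual series against history and average the continuations of
      # the most similar fragments. Window, horizon and number of analogs are illustrative.
      import numpy as np

      rng = np.random.default_rng(14)
      series = np.convolve(rng.normal(size=120), np.ones(5) / 5, mode="valid")   # smoothed "inflow" record

      window, horizon, n_analogs = 10, 5, 3
      target = series[-window:]                                  # the fragment to be matched
      candidates = range(len(series) - window - horizon)
      dist = [np.linalg.norm(series[s:s + window] - target) for s in candidates]
      best = np.argsort(dist)[:n_analogs]                        # most similar historical fragments

      forecast = np.mean([series[s + window:s + window + horizon] for s in best], axis=0)
      print("analog-based forecast for the next 5 steps:", forecast.round(2))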

  11. Selection Indices and Multivariate Analysis Show Similar Results in the Evaluation of Growth and Carcass Traits in Beef Cattle

    PubMed Central

    Brito Lopes, Fernando; da Silva, Marcelo Corrêa; Magnabosco, Cláudio Ulhôa; Goncalves Narciso, Marcelo; Sainz, Roberto Daniel

    2016-01-01

    This research evaluated a multivariate approach as an alternative tool for the purpose of selection regarding expected progeny differences (EPDs). Data were fitted using a multi-trait model and consisted of growth traits (birth weight and weights at 120, 210, 365 and 450 days of age) and carcass traits (longissimus muscle area (LMA), back-fat thickness (BF), and rump fat thickness (RF)), registered over 21 years in extensive breeding systems of Polled Nellore cattle in Brazil. Multivariate analyses were performed using standardized (zero mean and unit variance) EPDs. The k-means method revealed that the best fit of data occurred using three clusters (k = 3) (P < 0.001). Estimates of genetic correlation among growth and carcass traits and the estimates of heritability were moderate to high, suggesting that a correlated response approach is suitable for practical decision making. Estimates of correlation between selection indices and the multivariate index (LD1) were moderate to high, ranging from 0.48 to 0.97. This reveals that both types of indices give similar results and that the multivariate approach is reliable for the purpose of selection. The alternative tool seems very handy when economic weights are not available or in cases where more rapid identification of the best animals is desired. Interestingly, multivariate analysis allowed forecasting of information based on the relationships among breeding values (EPDs). Also, it enabled fine discrimination, rapid data summarization after genetic evaluation, and permitted accounting for maternal ability and the genetic direct potential of the animals. In addition, we recommend the use of longissimus muscle area and subcutaneous fat thickness as selection criteria, to allow estimation of breeding values before the first mating season in order to accelerate the response to individual selection. PMID:26789008
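
    The standardization and clustering step described above can be illustrated as follows; the EPD matrix, trait count and cluster count are hypothetical stand-ins for the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# hypothetical EPD matrix: rows = animals, columns = traits (e.g. W210, W365, LMA, BF)
rng = np.random.default_rng(0)
epds = rng.normal(size=(200, 4))

z = StandardScaler().fit_transform(epds)        # zero mean, unit variance per trait
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
```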

  12. Selection Indices and Multivariate Analysis Show Similar Results in the Evaluation of Growth and Carcass Traits in Beef Cattle.

    PubMed

    Brito Lopes, Fernando; da Silva, Marcelo Corrêa; Magnabosco, Cláudio Ulhôa; Goncalves Narciso, Marcelo; Sainz, Roberto Daniel

    2016-01-01

    This research evaluated a multivariate approach as an alternative tool for the purpose of selection regarding expected progeny differences (EPDs). Data were fitted using a multi-trait model and consisted of growth traits (birth weight and weights at 120, 210, 365 and 450 days of age) and carcass traits (longissimus muscle area (LMA), back-fat thickness (BF), and rump fat thickness (RF)), registered over 21 years in extensive breeding systems of Polled Nellore cattle in Brazil. Multivariate analyses were performed using standardized (zero mean and unit variance) EPDs. The k-means method revealed that the best fit of data occurred using three clusters (k = 3) (P < 0.001). Estimates of genetic correlation among growth and carcass traits and the estimates of heritability were moderate to high, suggesting that a correlated response approach is suitable for practical decision making. Estimates of correlation between selection indices and the multivariate index (LD1) were moderate to high, ranging from 0.48 to 0.97. This reveals that both types of indices give similar results and that the multivariate approach is reliable for the purpose of selection. The alternative tool seems very handy when economic weights are not available or in cases where more rapid identification of the best animals is desired. Interestingly, multivariate analysis allowed forecasting of information based on the relationships among breeding values (EPDs). Also, it enabled fine discrimination, rapid data summarization after genetic evaluation, and permitted accounting for maternal ability and the genetic direct potential of the animals. In addition, we recommend the use of longissimus muscle area and subcutaneous fat thickness as selection criteria, to allow estimation of breeding values before the first mating season in order to accelerate the response to individual selection.

  13. The physical and empirical basis for a specific clear-air turbulence risk index

    NASA Technical Reports Server (NTRS)

    Keller, J. L.

    1985-01-01

    An improved operational CAT detection and forecasting technique, the specific clear-air turbulence risk (SCATR) index, is developed and detailed. The index shows some promising results. The improvements seen using hand-analyzed data, which result from a more realistic representation of the vertical shear of the horizontal wind, are also realized in the data analysis used in the PROFS/CWP application. The SCATR index should improve as database enhancements such as profiler and VAS satellite data, which increase the resolution in space and time, are brought into even more sophisticated objective analysis schemes.

  14. Operational tools to help stakeholders to protect and alert municipalities facing uncertainties and changes in karst flash floods

    NASA Astrophysics Data System (ADS)

    Borrell Estupina, V.; Raynaud, F.; Bourgeois, N.; Kong-A-Siou, L.; Collet, L.; Haziza, E.; Servat, E.

    2015-06-01

    Flash floods are often responsible for many deaths and cause extensive material damage. For Mediterranean karst aquifers, the complexity of the connections between surface water and groundwater, as well as non-stationary weather patterns, increases the difficulty of understanding basin behaviour and thus of warning and protecting people. Furthermore, given the recent changes in land use and extreme rainfall events, knowledge of past floods is no longer sufficient to manage flood risks. Therefore the worst realistic flood that could occur should be considered. Physical and process-based hydrological models are considered among the best ways to forecast floods under diverse conditions. However, they rarely match the stakeholders' needs. In fact, the forecasting services, the municipalities, and the civil security have difficulties in running and interpreting data-consuming models in real time, above all if data are uncertain or non-existent. To face these social and technical difficulties and help stakeholders, this study develops two operational tools derived from these models. These tools aim at planning real-time decisions given the little, changing, and uncertain information available, namely: (i) a hydrological graphical tool (abacus) to estimate flood peak discharge from the past state of the karst and the forecasted but uncertain intense rainfall; (ii) a GIS-based method (MARE) to estimate the potential flooded pathways and areas, accounting for runoff and karst contributions and considering land use changes. Outputs of these tools are then confronted with past and recent floods and with municipalities' observations, and the impacts of uncertainties and changes on planning decisions are discussed. The use of these tools on the recent 2014 events demonstrated their reliability and interest for stakeholders. This study was realized on French Mediterranean basins, in close collaboration with the Flood Forecasting Services (SPC Med-Ouest, SCHAPI, municipalities).

  15. Petascale Diagnostic Assessment of the Global Portfolio Rainfall Space Missions' Ability to Support Flood Forecasting

    NASA Astrophysics Data System (ADS)

    Reed, P. M.; Chaney, N.; Herman, J. D.; Wood, E. F.; Ferringer, M. P.

    2015-12-01

    This research represents a multi-institutional collaboration between Cornell University, The Aerospace Corporation, and Princeton University that has completed a Petascale diagnostic assessment of the current 10 satellite missions providing rainfall observations. Our diagnostic assessment has required four core tasks: (1) formally linking high-resolution astrodynamics design and coordination of space assets with their global hydrological impacts within a Petascale "many-objective" global optimization framework, (2) developing a baseline diagnostic evaluation of a 1-degree resolution global implementation of the Variable Infiltration Capacity (VIC) model to establish the satellite observation frequencies and coverage required to maintain acceptable global flood forecasts, (3) evaluating the limitations and vulnerabilities of the full suite of current satellite precipitation missions including the recently approved Global Precipitation Measurement (GPM) mission, and (4) conceptualizing the next generation of space-based platforms for water cycle observation. Our team exploited over 100 million hours of computing access on the 700,000+ core Blue Waters machine to radically advance our ability to discover and visualize key system tradeoffs and sensitivities. This project represents, to our knowledge, the first attempt to develop a 10,000-member Monte Carlo global hydrologic simulation at one-degree resolution that characterizes the uncertain effects of changing the available frequencies of satellite precipitation on drought and flood forecasts. The simulation-optimization components of the work have set a theoretical baseline for the best possible frequencies and coverages for global precipitation given unlimited investment, broad international coordination in reconfiguring existing assets, and new satellite constellation design objectives informed directly by key global hydrologic forecasting requirements. Our research is a step towards realizing the integrated global water cycle observatory long sought by the World Climate Research Programme, which has to date eluded the world's space agencies.

  16. Artificial intelligence approach with the use of artificial neural networks for the creation of a forecasting model of Plasmopara viticola infection.

    PubMed

    Bugliosi, R; Spera, G; La Torre, A; Campoli, L; Scaglione, M

    2006-01-01

    Most of the forecasting models of Plasmopara viticola infections are based upon empirical correlations between meteorological/environmental data and pathogen outbreaks. These models generally overestimate the risk of infection and lead growers to treat the vineyard even when it is not necessary. In rare cases they underrate the risk of infection, letting the pathogen break out. Starting from these considerations, we decided to approach the problem from another point of view, utilizing Artificial Intelligence techniques for data elaboration and analysis. Meanwhile, the same data have been studied with a more classical statistical approach to verify the impact of a large data collection on standard data analysis methods. A network of RTUs (Remote Terminal Units) distributed over the Italian national territory transmits 12 environmental parameters every 15 minutes, via radio or GPRS, to a centralized database. Other pedologic data are collected directly in the field and sent via the Internet to the centralized database using Personal Digital Assistants (PDAs) running specific software. Data are stored after being preprocessed, to guarantee the quality of the information. The subsequent analysis has been realized mostly with Artificial Neural Networks (ANNs). Collecting and analyzing data in this way will probably make it possible to prevent Plasmopara viticola infection starting from the environmental conditions in this very complex context. The aim of this work is to forecast the infection while avoiding ineffective use of plant protection products in agriculture. Applying different analysis models, we will try to find the best ANN capable of forecasting with a high level of reliability.

  17. Decadal Prediction Skill in the GEOS-5 Forecast System

    NASA Technical Reports Server (NTRS)

    Ham, Yoo-Geun; Rienecker, Michele M.; Suarez, Max J.; Vikhliaev, Yury; Zhao, Bin; Marshak, Jelena; Vernieres, Guillaume; Schubert, Siegfried D.

    2013-01-01

    A suite of decadal predictions has been conducted with the NASA Global Modeling and Assimilation Office's (GMAO's) GEOS-5 Atmosphere-Ocean general circulation model. The hindcasts are initialized every December 1st from 1959 to 2010, following the CMIP5 experimental protocol for decadal predictions. The initial conditions are from a multivariate ensemble optimal interpolation ocean and sea-ice reanalysis, and from GMAO's atmospheric reanalysis, the Modern-Era Retrospective Analysis for Research and Applications. The mean forecast skill of a three-member ensemble is compared to that of an experiment without initialization but also forced with observed greenhouse gases. The results show that initialization increases the forecast skill of North Atlantic sea surface temperature compared to the uninitialized runs, with the increase in skill maintained for almost a decade over the subtropical and mid-latitude Atlantic. On the other hand, the initialization reduces the skill in predicting the warming trend over some regions outside the Atlantic. The annual-mean Atlantic meridional overturning circulation index, which is defined here as the maximum of the zonally-integrated overturning stream function at mid-latitude, is predictable up to a 4-year lead time, consistent with the predictable signal in upper ocean heat content over the North Atlantic. While the 6- to 9-year forecast skill measured by mean squared skill score shows a 50 percent improvement in the upper ocean heat content over the subtropical and mid-latitude Atlantic, prediction skill is relatively low in the sub-polar gyre. This low skill is due in part to features in the spatial pattern of the dominant simulated decadal mode in upper ocean heat content over this region that differ from observations. An analysis of the large-scale temperature budget shows that this is the result of a model bias, implying that realistic simulation of the climatological fields is crucial for skillful decadal forecasts.

  18. Strategies to reduce the complexity of hydrologic data assimilation for high-dimensional models

    NASA Astrophysics Data System (ADS)

    Hernandez, F.; Liang, X.

    2017-12-01

    Probabilistic forecasts in the geosciences offer invaluable information by allowing estimation of the uncertainty of predicted conditions (including threats like floods and droughts). However, while forecast systems based on modern data assimilation algorithms are capable of producing multi-variate probability distributions of future conditions, the computational resources required to fully characterize the dependencies between the model's state variables render their applicability impractical for high-resolution cases. This occurs because of the quadratic space complexity of storing the covariance matrices that encode these dependencies and the cubic time complexity of performing inference operations with them. In this work we introduce two complementary strategies to reduce the size of the covariance matrices that are at the heart of Bayesian assimilation methods (like some variants of (ensemble) Kalman filters and of particle filters) and variational methods. The first strategy involves the optimized grouping of state variables by clustering individual cells of the model into "super-cells." A dynamic fuzzy clustering approach is used to take into account the states (e.g., soil moisture) and forcings (e.g., precipitation) of each cell at each time step. The second strategy consists in finding a compressed representation of the covariance matrix that still encodes the most relevant information but that can be more efficiently stored and processed. A learning and a belief-propagation inference algorithm are developed to take advantage of this modified low-rank representation. The two proposed strategies are incorporated into OPTIMISTS, a state-of-the-art hybrid Bayesian/variational data assimilation algorithm, and comparative streamflow forecasting tests are performed using two watersheds modeled with the Distributed Hydrology Soil Vegetation Model (DHSVM). Contrasts are made between the efficiency gains and forecast accuracy losses of each strategy used in isolation, and of those achieved through their coupling. We expect these developments to help catalyze improvements in the predictive accuracy of large-scale forecasting operations by lowering the costs of deploying advanced data assimilation techniques.
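
    A minimal sketch of the low-rank idea behind the second strategy: keep only the leading eigenpairs of an ensemble covariance so that storage grows with the chosen rank rather than with the square of the state dimension. The ensemble, state size and rank are illustrative; this is not the OPTIMISTS implementation.

```python
import numpy as np

def low_rank_covariance(ensemble, rank):
    """Return factors U (n x rank) and s (rank,) such that
    C ~= U @ diag(s) @ U.T, where C is the ensemble covariance."""
    anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)   # n x m
    U, sing, _ = np.linalg.svd(anomalies, full_matrices=False)
    s = sing[:rank] ** 2 / (ensemble.shape[1] - 1)                # leading eigenvalues of C
    return U[:, :rank], s

# toy example: 10,000 state variables, 50 ensemble members, rank-20 approximation
ens = np.random.randn(10_000, 50)
U, s = low_rank_covariance(ens, rank=20)
```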

  19. Real-time realizations of the Bayesian Infrasonic Source Localization Method

    NASA Astrophysics Data System (ADS)

    Pinsky, V.; Arrowsmith, S.; Hofstetter, A.; Nippress, A.

    2015-12-01

    The Bayesian Infrasonic Source Localization method (BISL), introduced by Mordak et al. (2010) and upgraded by Marcillo et al. (2014), is designed for accurate estimation of the origin of atmospheric events at local, regional and global scales by seismic and infrasonic networks and arrays. BISL is based on probabilistic models of the source-station infrasonic signal propagation time, picking time and azimuth estimate, merged with prior knowledge about the celerity distribution. It requires, at each hypothetical source location, integration of the product of the corresponding source-station likelihood functions multiplied by a prior probability density function of celerity over the multivariate parameter space. The present BISL realization is generally a time-consuming procedure based on numerical integration. The computational scheme proposed here simplifies the target function so that the integrals are taken exactly and are represented via standard functions. This makes the procedure much faster and realizable in real time without practical loss of accuracy. The procedure, executed as PYTHON-FORTRAN code, demonstrates high performance on a set of model and real data.

  20. How are you feeling?: A personalized methodology for predicting mental states from temporally observable physical and behavioral information.

    PubMed

    Tuarob, Suppawong; Tucker, Conrad S; Kumara, Soundar; Giles, C Lee; Pincus, Aaron L; Conroy, David E; Ram, Nilam

    2017-04-01

    It is believed that anomalous mental states such as stress and anxiety not only cause suffering for individuals, but in some extreme cases also lead to tragedies. The ability to predict the mental state of an individual at both current and future time periods could prove critical to healthcare practitioners. Currently, the practical way to predict an individual's mental state is through mental examinations that involve psychological experts performing the evaluations. However, such methods can be time and resource consuming, limiting their broad applicability to a wide population. Furthermore, some individuals may also be unaware of their mental states or may feel uncomfortable expressing themselves during the evaluations. Hence, their anomalous mental states could remain undetected for a prolonged period of time. The objective of this work is to demonstrate the ability of advanced machine learning based approaches to generate mathematical models that predict current and future mental states of an individual. The problem of mental state prediction is transformed into a time series forecasting problem, where an individual is represented as a multivariate time series stream of monitored physical and behavioral attributes. A personalized mathematical model is then automatically generated to capture the dependencies among these attributes, and is used for prediction of mental states for each individual. In particular, we first illustrate the drawbacks of traditional multivariate time series forecasting methodologies such as vector autoregression. Then, we show that such issues can be mitigated by using machine learning regression techniques which are modified to capture temporal dependencies in time series data. A case study using data from 150 human participants illustrates that the proposed machine learning based forecasting methods are more suitable for high-dimensional psychological data than the traditional vector autoregressive model in terms of both magnitude of error and directional accuracy. These results not only present a successful use of machine learning techniques in psychological studies, but also serve as a building block for multiple medical applications that could rely on an automated system to gauge individuals' mental states. Copyright © 2017 Elsevier Inc. All rights reserved.
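
    A hedged sketch of the general recipe described above, recasting a multivariate stream as supervised regression on lagged features; the attribute count, lag window and regressor are assumptions, not the paper's personalized models.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def make_lagged(X, lags):
    """Stack the previous `lags` observations of every attribute as features
    for predicting the next value of the first attribute (the 'mental state')."""
    rows, targets = [], []
    for t in range(lags, len(X)):
        rows.append(X[t - lags:t].ravel())
        targets.append(X[t, 0])
    return np.array(rows), np.array(targets)

# hypothetical stream: 500 time steps of 6 monitored attributes
X = np.random.randn(500, 6)
features, y = make_lagged(X, lags=7)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(features[:-50], y[:-50])
preds = model.predict(features[-50:])
```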

  1. Very short-term reactive forecasting of the solar ultraviolet index using an extreme learning machine integrated with the solar zenith angle.

    PubMed

    Deo, Ravinesh C; Downs, Nathan; Parisi, Alfio V; Adamowski, Jan F; Quilty, John M

    2017-05-01

    Exposure to erythemally-effective solar ultraviolet radiation (UVR), which contributes to malignant keratinocyte cancers and the associated health risk, is best mitigated through innovative decision-support systems, with global solar UV index (UVI) forecasts necessary to inform real-time sun-protection behaviour recommendations. It follows that UVI forecasting models are useful tools for such decision-making. In this study, a model for computationally-efficient data-driven forecasting of diffuse and global very short-term reactive (VSTR) (10-min lead-time) UVI, enhanced by drawing on solar zenith angle (θ_s) data, was developed using an extreme learning machine (ELM) algorithm. An ELM algorithm typically serves to address complex and ill-defined forecasting problems. A UV spectroradiometer situated in Toowoomba, Australia, measured daily cycles (0500-1700 h) of UVI over the austral summer period. After trialling activation functions based on sine, hard limit, logarithmic and tangent sigmoid, and triangular and radial basis functions, an optimal ELM architecture utilising a logarithmic sigmoid equation in the hidden layer, with lagged combinations of θ_s as the predictor data, was developed. The ELM's performance was evaluated using statistical metrics: correlation coefficient (r), Willmott's Index (WI), Nash-Sutcliffe efficiency coefficient (E_NS), root mean square error (RMSE), and mean absolute error (MAE) between observed and forecasted UVI. Using these metrics, the ELM model's performance was compared to that of existing methods: multivariate adaptive regression splines (MARS), the M5 Model Tree, and a semi-empirical (Pro6UV) clear-sky model. Based on RMSE and MAE values, the ELM model (0.255, 0.346, respectively) outperformed the MARS (0.310, 0.438) and M5 Model Tree (0.346, 0.466) models. Concurring with these metrics, the Willmott's Index values for the ELM, MARS and M5 Model Tree models were 0.966, 0.942 and 0.934, respectively. About 57% of the ELM model's absolute errors were small in magnitude (±0.25), whereas the MARS and M5 Model Tree models generated 53% and 48% of such errors, respectively, indicating that the latter models' errors were distributed over a larger magnitude error range. In terms of peak global UVI forecasting, with half the level of error, the ELM model outperformed MARS and the M5 Model Tree. A comparison of the magnitude of hourly-cumulated errors of 10-min lead-time forecasts for diffuse and global UVI highlighted the ELM model's greater accuracy compared to the MARS, M5 Model Tree or Pro6UV models. This confirmed the versatility of an ELM model drawing on θ_s data for VSTR forecasting of UVI at a near real-time horizon. When applied to the goal of enhancing expert systems, ELM-based accurate forecasts capable of reacting quickly to measured conditions can enhance real-time exposure advice for the public, mitigating the potential for solar UV-exposure-related disease. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
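
    A compact sketch of the extreme learning machine idea: random hidden-layer weights, a log-sigmoid hidden layer, and an analytically solved output layer. The predictor matrix and layer size below are assumptions for illustration, not the tuned architecture reported above.

```python
import numpy as np

class ELM:
    def __init__(self, n_hidden=30, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))  # random input weights
        self.b = self.rng.normal(size=self.n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))            # log-sigmoid hidden layer
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)           # analytic output weights
        return self

    def predict(self, X):
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))
        return H @ self.beta

# hypothetical predictors: lagged solar zenith angle values; target: UVI 10 minutes ahead
X = np.random.rand(300, 4)
y = np.random.rand(300)
model = ELM().fit(X[:250], y[:250])
rmse = np.sqrt(np.mean((model.predict(X[250:]) - y[250:]) ** 2))
```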

  2. Forecasting East Asian Indices Futures via a Novel Hybrid of Wavelet-PCA Denoising and Artificial Neural Network Models

    PubMed Central

    2016-01-01

    The motivation behind this research is to innovatively combine new methods like wavelet, principal component analysis (PCA), and artificial neural network (ANN) approaches to analyze trade in today’s increasingly difficult and volatile financial futures markets. The main focus of this study is to facilitate forecasting by using an enhanced denoising process on market data, taken as a multivariate signal, in order to deduct the same noise from the open-high-low-close signal of a market. This research offers evidence on the predictive ability and the profitability of abnormal returns of a new hybrid forecasting model using Wavelet-PCA denoising and ANN (named WPCA-NN) on futures contracts of Hong Kong’s Hang Seng futures, Japan’s NIKKEI 225 futures, Singapore’s MSCI futures, South Korea’s KOSPI 200 futures, and Taiwan’s TAIEX futures from 2005 to 2014. Using a host of technical analysis indicators consisting of RSI, MACD, MACD Signal, Stochastic Fast %K, Stochastic Slow %K, Stochastic %D, and Ultimate Oscillator, empirical results show that the annual mean returns of WPCA-NN are more than the threshold buy-and-hold for the validation, test, and evaluation periods; this is inconsistent with the traditional random walk hypothesis, which insists that mechanical rules cannot outperform the threshold buy-and-hold. The findings, however, are consistent with literature that advocates technical analysis. PMID:27248692
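
    A simplified sketch of the wavelet-plus-PCA denoising idea under stated assumptions (db4 wavelet, two decomposition levels, a rank-1 PCA on the detail coefficients, and synthetic OHLC data); it is not the WPCA-NN pipeline, which also feeds the denoised signal to a neural network.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wpca_denoise(ohlc, wavelet="db4", level=2):
    """Treat open-high-low-close as a 4-channel signal: wavelet-decompose each
    channel, keep only the dominant principal component of the detail
    coefficients at each level (the structure shared by all channels), and
    reconstruct the smoothed channels."""
    n, k = ohlc.shape
    coeffs = [pywt.wavedec(ohlc[:, j], wavelet, level=level) for j in range(k)]
    for lev in range(1, level + 1):
        block = np.column_stack([coeffs[j][lev] for j in range(k)])  # coeffs x channels
        pca = PCA(n_components=1).fit(block)
        shared = pca.inverse_transform(pca.transform(block))         # rank-1 approximation
        for j in range(k):
            coeffs[j][lev] = shared[:, j]
    return np.column_stack([pywt.waverec(coeffs[j], wavelet)[:n] for j in range(k)])

# hypothetical OHLC matrix (e.g. daily futures quotes)
prices = np.cumsum(np.random.randn(256, 4), axis=0) + 100
smoothed = wpca_denoise(prices)
```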

  3. Forecasting East Asian Indices Futures via a Novel Hybrid of Wavelet-PCA Denoising and Artificial Neural Network Models.

    PubMed

    Chan Phooi M'ng, Jacinta; Mehralizadeh, Mohammadali

    2016-01-01

    The motivation behind this research is to innovatively combine new methods like wavelet, principal component analysis (PCA), and artificial neural network (ANN) approaches to analyze trade in today's increasingly difficult and volatile financial futures markets. The main focus of this study is to facilitate forecasting by using an enhanced denoising process on market data, taken as a multivariate signal, in order to deduct the same noise from the open-high-low-close signal of a market. This research offers evidence on the predictive ability and the profitability of abnormal returns of a new hybrid forecasting model using Wavelet-PCA denoising and ANN (named WPCA-NN) on futures contracts of Hong Kong's Hang Seng futures, Japan's NIKKEI 225 futures, Singapore's MSCI futures, South Korea's KOSPI 200 futures, and Taiwan's TAIEX futures from 2005 to 2014. Using a host of technical analysis indicators consisting of RSI, MACD, MACD Signal, Stochastic Fast %K, Stochastic Slow %K, Stochastic %D, and Ultimate Oscillator, empirical results show that the annual mean returns of WPCA-NN are more than the threshold buy-and-hold for the validation, test, and evaluation periods; this is inconsistent with the traditional random walk hypothesis, which insists that mechanical rules cannot outperform the threshold buy-and-hold. The findings, however, are consistent with literature that advocates technical analysis.

  4. Novel Methods in Disease Biogeography: A Case Study with Heterosporosis

    PubMed Central

    Escobar, Luis E.; Qiao, Huijie; Lee, Christine; Phelps, Nicholas B. D.

    2017-01-01

    Disease biogeography is currently a promising field to complement epidemiology, and ecological niche modeling theory and methods are a key component. Therefore, applying the concepts and tools from ecological niche modeling to disease biogeography and epidemiology will provide biologically sound and analytically robust descriptive and predictive analyses of disease distributions. As a case study, we explored the ecologically important fish disease Heterosporosis, a relatively poorly understood disease caused by the intracellular microsporidian parasite Heterosporis sutherlandae. We explored two novel ecological niche modeling methods, the minimum-volume ellipsoid (MVE) and the Marble algorithm, which were used to reconstruct the fundamental and the realized ecological niche of H. sutherlandae, respectively. Additionally, we assessed how the management of occurrence reports can impact the output of the models. Ecological niche models were able to reconstruct a proxy of the fundamental and realized niche for this aquatic parasite, identifying specific areas suitable for Heterosporosis. We found that the conceptual and methodological advances in ecological niche modeling provide accessible tools to update the current practices of spatial epidemiology. However, careful data curation and a detailed understanding of the algorithm employed are critical for a clear definition of the assumptions implicit in the modeling process and to ensure biologically sound forecasts. In this paper, we show how sensitive the MVE is to the input data, while the Marble algorithm may provide detailed forecasts with a minimum of parameters. We showed that exploring algorithms of different natures such as environmental clusters, climatic envelopes, and logistic regressions (e.g., Marble, MVE, and Maxent) provides different scenarios of potential distribution. Thus, no single algorithm should be used for disease mapping. Instead, different algorithms should be employed for a more informed and complete understanding of the pathogen or parasite in question. PMID:28770215

  5. Remote sensing validation through SOOP technology: implementation of Spectra system

    NASA Astrophysics Data System (ADS)

    Piermattei, Viviana; Madonia, Alice; Bonamano, Simone; Consalvi, Natalizia; Caligiore, Aurelio; Falcone, Daniela; Puri, Pio; Sarti, Fabio; Spaccavento, Giovanni; Lucarini, Diego; Pacci, Giacomo; Amitrano, Luigi; Iacullo, Salvatore; D'Andrea, Salvatore; Marcelli, Marco

    2017-04-01

    The development of low-cost instrumentation plays a key role in marine environmental studies and represents one of the most innovative aspects of marine research. The availability of low-cost technologies allows the realization of extended observatory networks for the study of marine phenomena through an integrated approach merging observations, remote sensing and operational oceanography. Marine services and practical applications critically depend on the availability of large amounts of data collected with sufficiently dense spatial and temporal sampling. This issue directly influences the robustness both of ocean forecasting models and of remote sensing observations through data assimilation and validation processes, particularly in the biological domain. For this reason, the development of cheap, small and integrated smart sensors is necessary; such sensors could serve both satellite data validation and forecasting model data assimilation, as well as support early warning systems for environmental pollution control and prevention. This is particularly true in coastal areas, which are subject to multiple anthropic pressures. Moreover, coastal waters can be classified as case 2 waters, where the optical properties of inorganic suspended matter and chromophoric dissolved organic matter must be considered and separated from the chlorophyll a contribution. Due to the high costs of mooring systems, research vessels, measurement platforms and instrumentation, a major effort was dedicated to the design, development and realization of a new low-cost mini-FerryBox system: Spectra. Thanks to the modularity and user-friendly employment of the system, Spectra allows continuous in situ measurements of temperature, conductivity, turbidity, chlorophyll a and chromophoric dissolved organic matter (CDOM) fluorescence to be acquired from voluntary vessels, even by non-specialized operators (Marcelli et al., 2014; 2016). This work shows the preliminary application of this technology to remote sensing data validation.

  6. Improving overlay manufacturing metrics through application of feedforward mask-bias

    NASA Astrophysics Data System (ADS)

    Joubert, Etienne; Pellegrini, Joseph C.; Misra, Manish; Sturtevant, John L.; Bernhard, John M.; Ong, Phu; Crawshaw, Nathan K.; Puchalski, Vern

    2003-06-01

    Traditional run-to-run controllers that rely on highly correlated historical events to forecast process corrections have been shown to provide substantial benefit over manual control in the case of a fab that is primarily manufacturing high-volume, frequently running parts (i.e., DRAM, MPU, and similar operations). However, a limitation of the traditional controller emerges when it is applied to a fab whose work in process (WIP) is composed primarily of short-running, high part count products (typical of foundries and ASIC fabs). This limitation exists because there is a strong likelihood that each reticle has a unique set of process corrections different from other reticles at the same process layer. Further limitations exist when it is realized that each reticle is loaded and aligned differently on multiple exposure tools. A structural change in how the run-to-run controller manages the frequent reticle changes associated with the high part count environment has allowed breakthrough performance to be achieved. This breakthrough was made possible by the realization that: 1. reticle-sourced errors are highly stable over long periods of time, allowing them to be deconvolved from the day-to-day tool and process drifts; 2. reticle-sourced errors can be modeled as a feedforward disturbance rather than as discriminants in defining and dividing process streams. In this paper, we show how to deconvolve the static (reticle) and dynamic (day-to-day tool and process) components from the overall error vector to better forecast feedback for existing products, as well as how to compute or learn these values for new product introductions or new tool startups. Manufacturing data will be presented to support this discussion, with some real-world success stories.

  7. The research and application of the power big data

    NASA Astrophysics Data System (ADS)

    Zhang, Suxiang; Zhang, Dong; Zhang, Yaping; Cao, Jinping; Xu, Huiming

    2017-01-01

    Facing the growing environmental crisis, improving energy efficiency is an important problem, and power big data is a main supporting tool for realizing demand-side management and response. With the promotion of smart power consumption, distributed clean energy, electric vehicles and similar technologies are widely applied; meanwhile, with the continuous development of Internet of Things technology, more applications connect to endpoints in the grid, so that a large number of electric terminal devices and new energy sources access the smart grid and produce massive heterogeneous, multi-state electricity data. These data constitute a precious asset for power grid enterprises: power big data. Transforming them into valuable knowledge and effective operations is an important problem, and it requires interoperation within the smart grid. In this paper, we research various applications of power big data that integrate cloud computing and big data technology, including online monitoring of electricity consumption, short-term power load forecasting, and energy-efficiency analysis. Based on Hadoop, HBase, Hive and related tools, we implement ETL and OLAP functions; we also adopt a parallel computing framework for the power load forecasting algorithms and propose a parallel locally weighted linear regression model; and we study an energy-efficiency rating model to comprehensively evaluate the energy consumption of electricity users, which allows users to understand their real-time energy consumption, adjust their electricity behaviour to reduce consumption, and provides a basis for decision-making. With an intelligent industrial park as an example, this paper demonstrates complete electricity management. In the future, power big data will therefore provide decision-making support tools for energy conservation and emissions reduction.
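
    The abstract mentions a parallel locally weighted linear regression model for short-term load forecasting; below is a serial, illustrative version of locally weighted regression, with a hypothetical temperature-load relationship and an assumed Gaussian kernel bandwidth.

```python
import numpy as np

def lwlr_predict(x_query, X, y, tau=1.0):
    """Locally weighted linear regression: fit a weighted least-squares line
    around each query point, with Gaussian weights of bandwidth tau."""
    Xb = np.column_stack([np.ones(len(X)), X])          # add intercept column
    xq = np.array([1.0, x_query])
    w = np.exp(-((X - x_query) ** 2) / (2 * tau ** 2))  # local weights
    W = np.diag(w)
    theta = np.linalg.pinv(Xb.T @ W @ Xb) @ Xb.T @ W @ y
    return xq @ theta

# hypothetical hourly load (MW) as a function of temperature (deg C)
temp = np.linspace(-5, 35, 120)
load = 500 + 0.8 * (temp - 18) ** 2 + np.random.normal(0, 20, temp.size)
print(lwlr_predict(30.0, temp, load, tau=3.0))
```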

  8. Analysis of significant factors for dengue fever incidence prediction.

    PubMed

    Siriyasatien, Padet; Phumee, Atchara; Ongruk, Phatsavee; Jampachaisri, Katechan; Kesorn, Kraisak

    2016-04-16

    Many popular dengue forecasting techniques have been used by several researchers to extrapolate dengue incidence rates, including the K-H model, support vector machines (SVM), and artificial neural networks (ANN). The time series analysis methodology, particularly ARIMA and SARIMA, has been increasingly applied to the field of epidemiological research on dengue fever, dengue hemorrhagic fever, and other infectious diseases. The main drawback of these methods is that they do not consider other variables that are associated with the dependent variable. Additionally, new factors correlated with the disease are needed to enhance the prediction accuracy of the model when it is applied to areas of similar climates, where weather factors such as temperature, total rainfall, and humidity are not substantially different. Such drawbacks may consequently lower the predictive power for the outbreak. The predictive power of the forecasting model, assessed by Akaike's information criterion (AIC), the Bayesian information criterion (BIC), and the mean absolute percentage error (MAPE), is improved by including the new parameters for dengue outbreak prediction. This study's selected model outperforms all three other competing models with the lowest AIC, the lowest BIC, and a small MAPE value. The exclusive use of climate factors from similar locations decreases a model's predictive power. The multivariate Poisson regression, however, effectively forecasts even when climate variables are slightly different. Female mosquitoes and seasons were strongly correlated with dengue cases. Therefore, the dengue incidence trends provided by this model will assist the optimization of dengue prevention. The present work demonstrates the important roles of female mosquito infection rates from the previous season and climate factors (represented as seasons) in dengue outbreaks. Incorporating these two factors in the model significantly improves the predictive power of dengue hemorrhagic fever forecasting models, as confirmed by AIC, BIC, and MAPE.
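
    A minimal sketch of a Poisson regression with multiple covariates of the kind described above; the covariates, sample size and counts are hypothetical, and only the information-criterion comparison step is shown.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# hypothetical monthly data: dengue case counts with entomological and seasonal covariates
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "cases": rng.poisson(20, 60),
    "female_infection_rate": rng.uniform(0, 0.3, 60),   # previous-season infection rate
    "rainy_season": rng.integers(0, 2, 60),
})

X = sm.add_constant(df[["female_infection_rate", "rainy_season"]])
model = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
print(model.aic)   # compare candidate models by AIC (and BIC, MAPE on held-out data)
```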

  9. Empowering Geoscience with Improved Data Assimilation Using the Data Assimilation Research Testbed "Manhattan" Release.

    NASA Astrophysics Data System (ADS)

    Raeder, K.; Hoar, T. J.; Anderson, J. L.; Collins, N.; Hendricks, J.; Kershaw, H.; Ha, S.; Snyder, C.; Skamarock, W. C.; Mizzi, A. P.; Liu, H.; Liu, J.; Pedatella, N. M.; Karspeck, A. R.; Karol, S. I.; Bitz, C. M.; Zhang, Y.

    2017-12-01

    The capabilities of the Data Assimilation Research Testbed (DART) at NCAR have been significantly expanded with the recent "Manhattan" release. DART is an ensemble Kalman filter based suite of tools, which enables researchers to use data assimilation (DA) without first becoming DA experts. Highlights: significant improvement in efficient ensemble DA for very large models on thousands of processors, direct read and write of model state files in parallel, more control of the DA output for finer-grained analysis, new model interfaces which are useful to a variety of geophysical researchers, new observation forward operators, and the ability to use precomputed forward operators from the forecast model. The new model interfaces and example applications include the following: MPAS-A; Model for Prediction Across Scales - Atmosphere is a global, nonhydrostatic, variable-resolution mesh atmospheric model, which facilitates multi-scale analysis and forecasting. The absence of distinct subdomains eliminates problems associated with subdomain boundaries. It demonstrates the ability to consistently produce higher-quality analyses than coarse, uniform meshes do. WRF-Chem; Weather Research and Forecasting + (MOZART) Chemistry model assimilates observations from FRAPPÉ (Front Range Air Pollution and Photochemistry Experiment). WACCM-X; Whole Atmosphere Community Climate Model with thermosphere and ionosphere eXtension assimilates observations of electron density to investigate sudden stratospheric warming. CESM (weakly) coupled assimilation; NCAR's Community Earth System Model is used for assimilation of atmospheric and oceanic observations into their respective components using coupled atmosphere+land+ocean+sea-ice forecasts. CESM2.0; Assimilation in the atmospheric component (CAM, WACCM) of the newly released version is supported. This version contains new and extensively updated components and software environment. CICE; the Los Alamos sea ice model (in CESM) is used to assimilate multivariate sea ice concentration observations to constrain the model's ice thickness, concentration, and parameters.

  10. Multi-centennial upper-ocean heat content reconstruction using online data assimilation

    NASA Astrophysics Data System (ADS)

    Perkins, W. A.; Hakim, G. J.

    2017-12-01

    The Last Millennium Reanalysis (LMR) provides an advanced paleoclimate ensemble data assimilation framework for multi-variate climate field reconstructions over the Common Era. Although reconstructions in this framework with full Earth system models remain prohibitively expensive, recent work has shown improved ensemble reconstruction validation using computationally inexpensive linear inverse models (LIMs). Here we leverage these techniques in pursuit of a new multi-centennial field reconstruction of upper-ocean heat content (OHC), synthesizing model dynamics with observational constraints from proxy records. OHC is an important indicator of internal climate variability and responds to planetary energy imbalances. Therefore, a consistent extension of the OHC record in time will help inform aspects of low-frequency climate variability. We use the Community Climate System Model version 4 (CCSM4) and Max Planck Institute (MPI) last millennium simulations to derive the LIMs, and the PAGES2K v.2.0 proxy database to perform annually resolved reconstructions of upper-OHC, surface air temperature, and wind stress over the last 500 years. Annual OHC reconstructions and uncertainties for both the global mean and regional basins are compared against observational and reanalysis data. We then investigate differences in dynamical behavior at decadal and longer time scales between the reconstruction and simulations in the last-millennium Coupled Model Intercomparison Project version 5 (CMIP5). Preliminary investigation of 1-year forecast skill for an OHC-only LIM shows largely positive spatial grid point local anomaly correlations (LAC) with a global average LAC of 0.37. Compared to 1-year OHC persistence forecast LAC (global average LAC of 0.30), the LIM outperforms the persistence forecasts in the tropical Indo-Pacific region, the equatorial Atlantic, and in certain regions near the Antarctic Circumpolar Current. In other regions, the forecast correlations are less than the persistence case but still positive overall.

  11. Harbin 2020 R&D Personnel Demand Forecast Based on Manufacturing Green Innovation System

    NASA Astrophysics Data System (ADS)

    Jiang, Xin; Duan, Yu Ting; Shen, Jun Yi; Zhang, Dong Ying

    2018-06-01

    Because of the constraints of energy conservation and the impact on the environment, the manufacturing industry has adopted sustainable development as its goal, and a green manufacturing innovation system based on environmental protection has emerged. In order to provide R&D personnel support to manufacturing enterprises in Harbin, and to promote the construction of a green innovation system for manufacturing and the realization of the 13th Five-Year Plan, this article used a grey forecasting model and univariate linear regression to predict the number of R&D personnel in Harbin in 2020, based on the numbers of R&D personnel in 2010-2016; the predicted values were 24,952 and 31,172, respectively. The results show that if Harbin continues to use its original development model, it will not be able to achieve the established development goals by 2020 because of the shortage of R&D personnel. Therefore, it is necessary to increase investment in R&D personnel so as to achieve the 13th Five-Year Plan of Harbin City and protect its green ecological development goals.
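
    A small sketch of a GM(1,1) grey forecasting model of the type mentioned above; the personnel counts are illustrative placeholders, not the study's data.

```python
import numpy as np

def gm11_forecast(x0, steps):
    """GM(1,1) grey model: fit the development coefficient a and grey input b on the
    accumulated series, then extrapolate and difference back to the original scale."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # background (mean-generated) values
    B = np.column_stack([-z1, np.ones(len(z1))])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    n = len(x0)
    def x1_hat(k):                                     # accumulated prediction at index k
        return (x0[0] - b / a) * np.exp(-a * k) + b / a
    fitted = np.array([x1_hat(k) for k in range(n + steps)])
    return np.diff(fitted)[n - 1:]                     # predictions beyond the sample

# hypothetical R&D personnel counts for 2010-2016 (illustrative numbers only)
history = [16500, 17400, 18300, 19500, 20800, 21900, 23100]
print(gm11_forecast(history, steps=4))                 # forecasts for 2017-2020
```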

  12. Ensemble machine learning and forecasting can achieve 99% uptime for rural handpumps

    PubMed Central

    Thomas, Evan A.

    2017-01-01

    Broken water pumps continue to impede efforts to deliver clean and economically-viable water to the global poor. The literature has demonstrated that customers’ health benefits and willingness to pay for clean water are best realized when clean water infrastructure performs extremely well (>99% uptime). In this paper, we used sensor data from 42 Afridev-brand handpumps observed for 14 months in western Kenya to demonstrate how sensors and supervised ensemble machine learning could be used to increase total fleet uptime from a best-practices baseline of about 70% to >99%. We accomplish this increase in uptime by forecasting pump failures and identifying existing failures very quickly. Comparing the costs of operating the pump per functional year over a lifetime of 10 years, we estimate that implementing this algorithm would save 7% on the levelized cost of water relative to a sensor-less scheduled maintenance program. Combined with a rigorous system for dispatching maintenance personnel, implementing this algorithm in a real-world program could significantly improve health outcomes and customers’ willingness to pay for water services. PMID:29182673

  13. On the log-normality of historical magnetic-storm intensity statistics: implications for extreme-event probabilities

    USGS Publications Warehouse

    Love, Jeffrey J.; Rigler, E. Joshua; Pulkkinen, Antti; Riley, Pete

    2015-01-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to −Dst storm-time maxima for the years 1957-2012; bootstrap analysis is used to establish confidence limits on the forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of the maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, −Dst≥850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; and a 100-yr magnetic storm is identified as having −Dst≥880 nT (greater than the Carrington event), with a wide 95% confidence interval of [490, 1187] nT.
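
    A short sketch of the maximum-likelihood step under simplifying assumptions (synthetic storm maxima, a log-normal with location fixed at zero); it reproduces the fitting idea only, not the paper's bootstrap confidence analysis.

```python
import numpy as np
from scipy import stats

# hypothetical sample of -Dst storm-time maxima (nT); illustrative values only
rng = np.random.default_rng(2)
dst_maxima = rng.lognormal(mean=5.0, sigma=0.6, size=56)

# maximum-likelihood log-normal fit with the location parameter fixed at zero
shape, loc, scale = stats.lognorm.fit(dst_maxima, floc=0)

# probability that a single storm maximum exceeds the Carrington level (-Dst >= 850 nT)
p_exceed = stats.lognorm.sf(850, shape, loc=loc, scale=scale)
print(p_exceed)
```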

  14. Aggregation of Environmental Model Data for Decision Support

    NASA Astrophysics Data System (ADS)

    Alpert, J. C.

    2013-12-01

    Weather forecasts and warnings must be prepared and then delivered so as to reach their intended audience in good time to enable effective decision-making. An effort to mitigate these difficulties was studied at a workshop, 'Sustaining National Meteorological Services - Strengthening WMO Regional and Global Centers', convened in June 2013 by the World Bank, the WMO and the US National Weather Service (NWS). The skill and accuracy of atmospheric forecasts from deterministic models have increased, and there are now ensembles of such models that improve decisions to protect life, property and commerce. The NWS production of numerical weather prediction products results in model output from global and high-resolution regional ensemble forecasts. Ensembles are constructed by changing the initial conditions to make a 'cloud' of forecasts that attempts to span the space of possible atmospheric realizations, which can quantify not only the most likely forecast but also the uncertainty. This has led to an unprecedented increase in data production and information content from higher resolution, multi-model output and secondary calculations. One difficulty is to obtain the needed subset of data required to estimate the probability of events, and to report the information. The calibration required to reliably estimate the probability of events, and the honing of threshold adjustments to reduce false alarms for decision makers, are also needed. To meet the future needs of the ever-broadening user community and address these issues on a national and international basis, the weather service implemented the NOAA Operational Model Archive and Distribution System (NOMADS). NOMADS provides real-time and retrospective format-independent access to climate, ocean and weather model data and delivers high-availability content services as part of NOAA's official real-time data dissemination at its new NCWCP web operations center. An important aspect of the server's abilities is to aggregate the matrix of model output, offering access to probability and calibration information for real-time decision making. The aggregation content server reports over ensemble component and forecast time in addition to the other data dimensions of vertical layer and position for each variable. The unpacking, organization and reading of many binary packed files is accomplished most efficiently on the server, while weather-element event probability calculations, threshold selection for more accurate decision support, and display remain for the client. Our goal is to reduce uncertainty for variables of interest, e.g., variables of agricultural importance. The weather service operational GFS model ensemble and short-range ensemble forecasts can make skillful probability forecasts to alert users if and when their selected weather events will occur. How this framework operates and how it can be implemented using existing NOMADS content services and applications is described.
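
    The client-side calculation alluded to above, estimating event probability by counting ensemble members that cross a user-chosen threshold, can be sketched as follows; the ensemble size, grid and threshold are hypothetical.

```python
import numpy as np

def event_probability(ensemble, threshold):
    """Fraction of ensemble members meeting or exceeding a threshold at each grid point.
    ensemble: array of shape (members, lat, lon)."""
    return (ensemble >= threshold).mean(axis=0)

# hypothetical 21-member precipitation forecast on a small grid (mm over 24 h)
members = np.random.gamma(shape=2.0, scale=5.0, size=(21, 10, 10))
prob_heavy_rain = event_probability(members, threshold=25.0)
```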

  15. Applying remote sensing to invasive species science—A tamarisk example

    USGS Publications Warehouse

    Morisette, Jeffrey T.

    2011-01-01

    The Invasive Species Science Branch of the Fort Collins Science Center provides research and technical assistance relating to management concerns for invasive species, including understanding how these species are introduced, identifying areas vulnerable to invasion, forecasting invasions, and developing control methods. This fact sheet considers the invasive plant species tamarisk (Tamarix spp.), addressing three fundamental questions: *Where is it now? *What are the potential or realized ecological impacts of invasion? *Where can it survive and thrive if introduced? It provides peer-reviewed examples of how the U.S. Geological Survey, working with other federal agencies and university partners, is applying remote-sensing technologies to address these key questions.

  16. Internet pharmaceutical sales: attributes, concerns, and future forecast.

    PubMed

    Bruckel, Katy; Capozzoli, Ernest A

    2003-01-01

    Internet pharmaceutical sales continue to skyrocket as healthcare providers and consumers increasingly rely on the efficiencies and convenience available via such transactions. Managed care companies' increasing demands to reduce healthcare inefficiencies while maximizing the quality of patient care are a significant contributing factor to the expanding utilization and success of online pharmaceutical sales. However, with the expansion of Internet pharmaceutical sales, healthcare providers, pharmacy benefit management and insurance companies, and consumers realize new opportunities and risks. This paper reviews the attributes and concerns associated with online pharmaceutical sales, discussing current and pending legislation intended to more effectively manage these parameters.

  17. Quantitative forecasting of PTSD from early trauma responses: a Machine Learning application.

    PubMed

    Galatzer-Levy, Isaac R; Karstoft, Karen-Inge; Statnikov, Alexander; Shalev, Arieh Y

    2014-12-01

    There is broad interest in predicting the clinical course of mental disorders from early, multimodal clinical and biological information. Current computational models, however, constitute a significant barrier to realizing this goal. The early identification of trauma survivors at risk of post-traumatic stress disorder (PTSD) is plausible given the disorder's salient onset and the abundance of putative biological and clinical risk indicators. This work evaluates the ability of Machine Learning (ML) forecasting approaches to identify and integrate a panel of unique predictive characteristics and determine their accuracy in forecasting non-remitting PTSD from information collected within 10 days of a traumatic event. Data on event characteristics, emergency department observations, and early symptoms were collected in 957 trauma survivors, followed for fifteen months. An ML feature selection algorithm identified a set of predictors that rendered all others redundant. Support Vector Machines (SVMs) as well as other ML classification algorithms were used to evaluate the forecasting accuracy of i) ML-selected features, ii) all available features without selection, and iii) Acute Stress Disorder (ASD) symptoms alone. SVMs were also used to compare the prediction of a) PTSD diagnostic status at 15 months with b) the posterior probability of membership in an empirically derived non-remitting PTSD symptom trajectory. Results are expressed as mean Area Under the Receiver Operating Characteristics Curve (AUC). The feature selection algorithm identified 16 predictors, present in ≥ 95% of cross-validation trials. The accuracy of predicting non-remitting PTSD from that set (AUC = .77) did not differ from predicting from all available information (AUC = .78). Predicting from ASD symptoms was not better than chance (AUC = .60). The prediction of PTSD status was less accurate than that of membership in a non-remitting trajectory (AUC = .71). ML methods may fill a critical gap in forecasting PTSD. The ability to identify and integrate unique risk indicators makes this a promising approach for developing algorithms that infer probabilistic risk of chronic posttraumatic stress psychopathology based on complex sources of biological, psychological, and social information. Copyright © 2014 Elsevier Ltd. All rights reserved.
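
    A hedged sketch of the evaluation setup described above, a support vector machine scored by cross-validated AUC; the feature matrix and labels are random placeholders, not the trauma-survivor data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# hypothetical early-response data: 957 survivors, 16 selected predictors,
# binary label = membership in a non-remitting PTSD symptom trajectory
rng = np.random.default_rng(3)
X = rng.normal(size=(957, 16))
y = rng.integers(0, 2, size=957)

svm = SVC(kernel="rbf", random_state=0)
auc = cross_val_score(svm, X, y, cv=5, scoring="roc_auc").mean()
print(f"mean cross-validated AUC: {auc:.2f}")
```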

  18. Sampling strategies based on singular vectors for assimilated models in ocean forecasting systems

    NASA Astrophysics Data System (ADS)

    Fattorini, Maria; Brandini, Carlo; Ortolani, Alberto

    2016-04-01

    Meteorological and oceanographic models need observations, not only as a ground-truth element to verify the quality of the models, but also to keep model forecast error acceptable: through data assimilation techniques that merge measured and modelled data, the natural divergence of numerical solutions from reality can be reduced or controlled and a more reliable solution - called the analysis - is computed. Although this concept is valid in general, its application, especially in oceanography, raises many problems for three main reasons: the difficulty ocean models have in reaching an acceptable state of equilibrium, the high cost of measurements, and the difficulty of carrying them out. The performance of data assimilation procedures depends on the particular observation network in use, well beyond the background quality and the assimilation method used. In this study we present some results concerning the great impact of the dataset configuration, in particular measurement positions, on the overall forecasting reliability of an ocean model. The aim is to identify operational criteria to support the design of marine observation networks at regional scale. In order to identify the observation network able to minimize the forecast error, a methodology based on Singular Vector Decomposition of the tangent linear model is proposed. Such a method can give strong indications on the local error dynamics. In addition, for the purpose of avoiding redundancy of information contained in the data, a minimal distance among data positions has been chosen on the basis of a spatial correlation analysis of the hydrodynamic fields under investigation. This methodology has been applied to the choice of data positions starting from simplified models, like an ideal double-gyre model and a quasi-geostrophic one. Model configurations and data assimilation are based on available ROMS routines, where a variational assimilation algorithm (4D-Var) is included as part of the code. These first applications have provided encouraging results in terms of increased predictability time and reduced forecast error, also improving the quality of the analysis used to recover the real circulation patterns from a first guess quite far from the real state.

  19. Liver cancer mortality rate model in Thailand

    NASA Astrophysics Data System (ADS)

    Sriwattanapongse, Wattanavadee; Prasitwattanaseree, Sukon

    2013-09-01

    Liver cancer has been a leading cause of death in Thailand. The purpose of this study was to model and forecast the liver cancer mortality rate in Thailand using death certificate reports. A retrospective analysis of the liver cancer mortality rate was conducted. A total of 123,280 deaths attributed to liver cancer were obtained from the national vital registration database for the 10-year period from 2000 to 2009, provided by the Ministry of Interior and coded as cause-of-death using ICD-10 by the Ministry of Public Health. A multivariate regression model was used for modeling and forecasting age-specific liver cancer mortality rates in Thailand. Liver cancer mortality increased with increasing age for each sex and was also higher in the North East provinces. The trends of liver cancer mortality remained stable in most age groups, with increases during the ten-year period (2000 to 2009) in the Northern and Southern regions. Liver cancer mortality was higher in males and increased with increasing age. Liver cancer control measures need to be sustained on a long-term basis to address the high liver cancer burden of Thailand.

  20. Vector autoregressive model approach for forecasting outflow cash in Central Java

    NASA Astrophysics Data System (ADS)

    hoyyi, Abdul; Tarno; Maruddani, Di Asih I.; Rahmawati, Rita

    2018-05-01

    Multivariate time series models are increasingly applied to economic and business problems as well as in other fields. One such economic application is the forecasting of cash outflow. This problem can be viewed globally in the sense that there is no spatial effect between regions, so the model used is the Vector Autoregressive (VAR) model. The data used in this research are the money supply series at the Bank Indonesia offices in Semarang, Solo, Purwokerto and Tegal. The models considered are VAR(1), VAR(2) and VAR(3). Ordinary Least Squares (OLS) is used to estimate the parameters. The best model is selected using the smallest Akaike Information Criterion (AIC). The analysis shows that the AIC value of the VAR(1) model equals 42.72292, VAR(2) equals 42.69119 and VAR(3) equals 42.87662. The differences in AIC values are small. Based on the smallest-AIC criterion, the best model is the VAR(2) model. The residuals of this model satisfy the white noise assumption.
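
    The lag-order selection described above can be reproduced in outline with statsmodels: fit VAR(1)-VAR(3) by OLS and keep the order with the smallest AIC. The series below are synthetic stand-ins for the four offices' outflow-cash data; the column names and all numbers are illustrative, not the study's values.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    # Hypothetical monthly outflow-cash series for the four offices (synthetic data).
    rng = np.random.default_rng(42)
    data = pd.DataFrame(rng.normal(size=(120, 4)).cumsum(axis=0),
                        columns=["Semarang", "Solo", "Purwokerto", "Tegal"])

    # Fit VAR(1)-VAR(3) by OLS and keep the lag order with the smallest AIC.
    model = VAR(data)
    aic = {p: model.fit(p).aic for p in (1, 2, 3)}
    best_p = min(aic, key=aic.get)
    results = model.fit(best_p)

    # Six-step-ahead forecast from the last observed lags.
    forecast = results.forecast(data.values[-best_p:], steps=6)
    print(aic, forecast.shape)
    ```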

  1. Multivariate Time Series Forecasting of Crude Palm Oil Price Using Machine Learning Techniques

    NASA Astrophysics Data System (ADS)

    Kanchymalay, Kasturi; Salim, N.; Sukprasert, Anupong; Krishnan, Ramesh; Raba'ah Hashim, Ummi

    2017-08-01

    The aim of this paper was to study the correlation between the crude palm oil (CPO) price, selected vegetable oil prices (soybean oil, coconut oil, olive oil, rapeseed oil and sunflower oil), the crude oil price and the monthly exchange rate. Comparative analysis was then performed on the CPO price forecasting results obtained with several machine learning techniques. Monthly CPO prices, selected vegetable oil prices, crude oil prices and monthly exchange rate data from January 1987 to February 2017 were utilized. Preliminary analysis showed a positive and high correlation between the CPO price and the soybean oil price, and also between the CPO price and the crude oil price. Experiments were conducted using multi-layer perceptron, support vector regression and Holt-Winters exponential smoothing techniques. The results were assessed using root mean square error (RMSE), mean absolute error (MAE), mean absolute percentage error (MAPE) and direction accuracy (DA). Among these three techniques, support vector regression (SVR) with the Sequential Minimal Optimization (SMO) algorithm showed relatively better results than the multi-layer perceptron and the Holt-Winters exponential smoothing method.
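
    A minimal sketch of the SVR forecasting-and-evaluation step, using scikit-learn on synthetic data in place of the actual price series. scikit-learn's SVR is used here only as a generic stand-in for the SVR-with-SMO setup reported in the abstract, and the RMSE/MAE/MAPE/DA computations follow their standard definitions.

    ```python
    import numpy as np
    from sklearn.metrics import mean_absolute_error, mean_squared_error
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    # Synthetic stand-in: lagged vegetable-oil/crude-oil prices and the exchange
    # rate as predictors of next-month CPO price (price level around 600 is arbitrary).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 7))
    y = 600 + 10 * (X @ rng.normal(size=7)) + rng.normal(size=300)
    X_train, X_test, y_train, y_test = X[:250], X[250:], y[:250], y[250:]

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
    model.fit(X_train, y_train)
    pred = model.predict(X_test)

    rmse = np.sqrt(mean_squared_error(y_test, pred))
    mae = mean_absolute_error(y_test, pred)
    mape = 100 * np.mean(np.abs((y_test - pred) / y_test))
    da = np.mean(np.sign(np.diff(pred)) == np.sign(np.diff(y_test)))  # direction accuracy
    print(rmse, mae, mape, da)
    ```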

  2. The joint return period analysis of natural disasters based on monitoring and statistical modeling of multidimensional hazard factors.

    PubMed

    Liu, Xueqin; Li, Ning; Yuan, Shuai; Xu, Ning; Shi, Wenqin; Chen, Weibin

    2015-12-15

    As a random event, a natural disaster has a complex occurrence mechanism. The comprehensive analysis of multiple hazard factors is important in disaster risk assessment. In order to improve the accuracy of risk analysis and forecasting, the formation mechanism of a disaster should be considered in the analysis and calculation of multiple factors. Given the importance and current deficiencies of multivariate analysis of dust storm disasters, 91 severe dust storm disasters in Inner Mongolia from 1990 to 2013 were selected as study cases in this paper. Main hazard factors from the 500-hPa atmospheric circulation system, the near-surface meteorological system, and underlying surface conditions were selected to simulate and calculate the multidimensional joint return periods. After comparing the simulation results with actual dust storm events over 54 years, we found that the two-dimensional Frank copula function showed better fitting results at the lower tail of the hazard factors and that the three-dimensional Frank copula function showed better fitting results at the middle and upper tails. However, for dust storm disasters with short return periods, the three-dimensional joint return period simulation shows no obvious advantage. If the return period is longer than 10 years, it shows significant advantages in extreme value fitting. Therefore, we suggest the multivariate analysis method be adopted in forecasting and risk analysis of serious disasters with longer return periods, such as earthquakes and tsunamis. Furthermore, the exploration of this method lays a foundation for the prediction and warning of other natural disasters. Copyright © 2015 Elsevier B.V. All rights reserved.
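
    The "OR" and "AND" joint return periods for two hazard factors can be sketched with a bivariate Frank copula as below. The copula parameter, the marginal non-exceedance probabilities, and the mean interarrival time are illustrative assumptions; the study's three-dimensional case and its parameter fitting are omitted.

    ```python
    import numpy as np

    def frank_copula(u, v, theta):
        # Bivariate Frank copula C(u, v; theta), theta != 0.
        num = (np.exp(-theta * u) - 1.0) * (np.exp(-theta * v) - 1.0)
        den = np.exp(-theta) - 1.0
        return -np.log(1.0 + num / den) / theta

    def joint_return_periods(u, v, theta, mu=1.0):
        # u, v: marginal non-exceedance probabilities of the two hazard factors;
        # mu: mean interarrival time of events (e.g. years per dust storm).
        c = frank_copula(u, v, theta)
        t_or = mu / (1.0 - c)            # at least one factor exceeds its threshold
        t_and = mu / (1.0 - u - v + c)   # both factors exceed their thresholds
        return t_or, t_and

    # Illustrative thresholds and dependence parameter.
    print(joint_return_periods(u=0.90, v=0.95, theta=5.0))
    ```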

  3. Drought: A comprehensive R package for drought monitoring, prediction and analysis

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Cheng, Hongguang

    2015-04-01

    Drought may impose serious challenges on human societies and ecosystems. Due to its complicated causes and wide-ranging impacts, a universally accepted definition of drought does not exist. Drought indicators are commonly used to characterize drought properties such as duration or severity. Various drought indicators have been developed in the past few decades to monitor particular aspects of drought conditions, along with multivariate drought indices that characterize drought from multiple sources or hydro-climatic variables. Reliable drought prediction with suitable drought indicators is critical for drought preparedness plans that aim to reduce potential drought impacts. In addition, drought analysis quantifying the risk of drought properties provides useful information for operational drought management. Drought monitoring, prediction and risk analysis are therefore important components of drought modeling and assessment. In this study, a comprehensive R package "drought" is developed to aid drought monitoring, prediction and risk analysis (available from R-Forge and CRAN soon). The drought monitoring component of the package computes a suite of univariate and multivariate drought indices that integrate drought information from various sources such as precipitation, temperature, soil moisture, and runoff. The drought prediction/forecasting component consists of statistical drought predictions to enhance drought early warning for decision making. Analysis of drought properties such as duration and severity is also provided for drought risk assessment. Based on this package, a drought monitoring and prediction/forecasting system is under development as a decision support tool. The package will be provided freely to the public to aid drought modeling and assessment for researchers and practitioners.
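
    The package itself is written in R; the snippet below is only a Python sketch of the core idea behind a standardized univariate drought index (SPI-like): fit a distribution to the accumulated variable and map its CDF to standard-normal quantiles. It is not the package's API, and zero-precipitation handling, monthly grouping, and accumulation windows are omitted.

    ```python
    import numpy as np
    from scipy import stats

    def standardized_index(series):
        """SPI-like standardized index: fit a gamma distribution to the
        accumulated variable and map its CDF to standard-normal quantiles."""
        series = np.asarray(series, dtype=float)
        shape, loc, scale = stats.gamma.fit(series, floc=0.0)
        cdf = stats.gamma.cdf(series, shape, loc=loc, scale=scale)
        return stats.norm.ppf(np.clip(cdf, 1e-6, 1.0 - 1e-6))

    # Example with synthetic monthly precipitation totals (mm).
    rng = np.random.default_rng(0)
    precip = rng.gamma(shape=2.0, scale=30.0, size=120)
    print(standardized_index(precip)[:5])
    ```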

  4. Stochastic univariate and multivariate time series analysis of PM2.5 and PM10 air pollution: A comparative case study for Plovdiv and Asenovgrad, Bulgaria

    NASA Astrophysics Data System (ADS)

    Gocheva-Ilieva, S.; Stoimenova, M.; Ivanov, A.; Voynikova, D.; Iliev, I.

    2016-10-01

    Fine particulate matter (PM2.5 and PM10) air pollutants are a serious problem in many urban areas, affecting both the health of the population and the environment as a whole. The availability of large data arrays for the levels of these pollutants makes it possible to perform statistical analysis, to obtain relevant information, and to find patterns within the data. Research in this field is particularly topical for a number of cities in Bulgaria, a European country where regulatory air pollution health limits have been constantly exceeded in recent years. This paper examines average daily data on air pollution with PM2.5 and PM10, collected by three monitoring stations in the cities of Plovdiv and Asenovgrad between 2011 and 2016. The goal is to find and analyze actual relationships in the data time series, to build adequate mathematical models, and to develop short-term forecasts. Modeling is carried out by stochastic univariate and multivariate time series analysis based on the Box-Jenkins methodology. The best models are selected following an initial transformation of the data, using a set of standard and robust statistical criteria. Calculations were performed with the Mathematica and SPSS software. The examination showed that measured concentrations of PM2.5 and PM10 in the region of Plovdiv and Asenovgrad regularly exceed permissible European and national health and safety thresholds. We obtained adequate stochastic models with a high statistical fit to the data and good forecasting quality when compared against actual measurements. The mathematical approach applied provides an independent alternative to standard official monitoring and control means for air pollution in urban areas.
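
    A minimal Box-Jenkins-style sketch in Python (the study itself used Mathematica and SPSS): fit a low-order ARIMA to a log-transformed daily PM10 series and produce a short-term forecast. The synthetic series and the (1,1,1) order are illustrative, not the models identified in the study.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.arima.model import ARIMA

    # Synthetic stand-in for an average daily PM10 series (µg/m3), always positive.
    rng = np.random.default_rng(3)
    pm10 = pd.Series(rng.gamma(shape=16.0, scale=2.5, size=730))

    # Box-Jenkins style fit on the log-transformed series, then a 7-day forecast
    # transformed back to the original scale.
    fit = ARIMA(np.log(pm10), order=(1, 1, 1)).fit()
    forecast = np.exp(fit.forecast(steps=7))
    print(forecast.round(1))
    ```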

  5. A vision for an ultra-high resolution integrated water cycle observation and prediction system

    NASA Astrophysics Data System (ADS)

    Houser, P. R.

    2013-05-01

    Society's welfare, progress, and sustainable economic growth—and life itself—depend on the abundance and vigorous cycling and replenishing of water throughout the global environment. The water cycle operates on a continuum of time and space scales and exchanges large amounts of energy as water undergoes phase changes and is moved from one part of the Earth system to another. We must move toward an integrated observation and prediction paradigm that addresses broad local-to-global science and application issues by realizing synergies associated with multiple, coordinated observations and prediction systems. A central challenge of a future water and energy cycle observation strategy is to progress from single-variable water-cycle instruments to multivariable integrated instruments in electromagnetic-band families. The microwave range in the electromagnetic spectrum is ideally suited for sensing the state and abundance of water because of water's dielectric properties. Eventually, a dedicated high-resolution water-cycle microwave-based satellite mission may be possible based on large-aperture antenna technology that can harvest the synergy that would be afforded by simultaneous multichannel active and passive microwave measurements. A partial demonstration of these ideas can even be realized with existing microwave satellite observations to support advanced multivariate retrieval methods that can exploit the totality of the microwave spectral information. The simultaneous multichannel active and passive microwave retrieval would allow improved-accuracy retrievals that are not possible with isolated measurements. Furthermore, the simultaneous monitoring of several of the land, atmospheric, oceanic, and cryospheric states brings synergies that will substantially enhance understanding of the global water and energy cycle as a system. The multichannel approach also affords advantages to some constituent retrievals—for instance, simultaneous retrieval of vegetation biomass would improve soil-moisture retrieval by avoiding the need for auxiliary vegetation information. This multivariable water-cycle observation system must be integrated with high-resolution, application-relevant prediction systems to optimize their information content and utility in addressing critical water cycle issues. One such vision is a real-time ultra-high resolution locally-mosaicked global land modeling and assimilation system that overlays regional high-fidelity information over a baseline global land prediction system. Such a system would provide the best possible local information for use in applications, while integrating and sharing information globally for diagnosing larger water cycle variability. In a sense, this would constitute a hydrologic telecommunication system, where the best local in-situ gage, Doppler radar, and weather station data can be shared internationally, and integrated in a consistent manner with global observation platforms like the multivariable water cycle mission. To realize such a vision, large issues must be addressed, such as international data sharing policy, model-observation integration approaches that maintain local extremes while achieving global consistency, and methods for establishing error estimates and uncertainty.

  6. Developing a dengue early warning system using time series model: Case study in Tainan, Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, Xiao-Wei; Jan, Chyan-Deng; Wang, Ji-Shang

    2017-04-01

    Dengue fever (DF) is a climate-sensitive disease that has been emerging in the southern regions of Taiwan over the past few decades, causing a significant health burden to affected areas. This study proposes a predictive model to implement an early warning system so as to enhance dengue surveillance and control in Tainan, Taiwan. The Seasonal Autoregressive Integrated Moving Average (SARIMA) model was used to forecast dengue cases. Temporal correlations between dengue incidence and climate variables were examined by Pearson correlation analysis and cross-correlation tests in order to identify key determinants to be included as predictors. The dengue surveillance data between 2000 and 2009, as well as the respective climate variables, were then used as inputs for the model. We validated the model by forecasting the number of dengue cases expected to occur each week between January 1, 2010 and December 31, 2015. In addition, we analyzed historical dengue trends and found that 25 cases occurring in one week was a trigger point that often led to a dengue outbreak. This threshold was combined with the season-based framework put forth by the World Health Organization to create a more accurate epidemic threshold for a Tainan-specific warning system. A seasonal ARIMA model of the general form (1,0,5)(1,1,1)52 was identified as the most appropriate based on the lowest AIC, and proved significant in predicting observed dengue cases. Based on the correlation coefficients, lag-11 maximum 1-hr rainfall (r=0.319, P<0.05) and lag-11 minimum temperature (r=0.416, P<0.05) are the most positively correlated climate variables. Comparing the four multivariate models (i.e., 1, 4, 9 and 13 weeks ahead), we found that including the climate variables improves the prediction RMSE by as much as 3.24%, 10.39%, 17.96% and 21.81%, respectively, relative to univariate models. Furthermore, the ability of the four multivariate models to determine whether the epidemic threshold would be exceeded in any given week during the forecasting period of 2010-2015 was analyzed using a contingency table. The 4-week-ahead approach was the most appropriate for an operational public health response, with a 78.7% hit rate and a 0.7% false alarm rate. Our findings indicate that the SARIMA model is well suited for detecting outbreaks, as it has high sensitivity and a low risk of false alarms. Accurately forecasting future trends will provide valuable time to activate dengue surveillance and control in Tainan, Taiwan. We conclude that this timely dengue early warning system will enable public health services to allocate limited resources more effectively, and public health officials to adjust dengue emergency response plans to their maximum capabilities.
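
    The reported seasonal model can be sketched with statsmodels' SARIMAX using the (1,0,5)(1,1,1) specification with a 52-week period. The weekly case series below is synthetic, and the lagged climate covariates of the multivariate variant would enter through the exog argument, which is omitted here.

    ```python
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    # Synthetic stand-in for weekly dengue case counts with 52-week seasonality.
    rng = np.random.default_rng(7)
    weeks = np.arange(520)
    cases = np.maximum(0, 20 + 15 * np.sin(2 * np.pi * weeks / 52)
                       + rng.normal(0, 5, weeks.size))
    series = pd.Series(cases)

    # Seasonal ARIMA of the form reported above: (1,0,5)(1,1,1) with period 52.
    # Lagged climate covariates would be supplied through the `exog` argument.
    model = SARIMAX(series, order=(1, 0, 5), seasonal_order=(1, 1, 1, 52))
    fit = model.fit(disp=False)
    print(fit.forecast(steps=4))  # 4-week-ahead forecast
    ```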

  7. An iterative ensemble quasi-linear data assimilation approach for integrated reservoir monitoring

    NASA Astrophysics Data System (ADS)

    Li, J. Y.; Kitanidis, P. K.

    2013-12-01

    Reservoir forecasting and management are increasingly relying on an integrated reservoir monitoring approach, which involves data assimilation to calibrate the complex process of multi-phase flow and transport in the porous medium. The numbers of unknowns and measurements arising in such joint inversion problems are usually very large. The ensemble Kalman filter and other ensemble-based techniques are popular because they circumvent the computational barriers of computing Jacobian matrices and covariance matrices explicitly and allow nonlinear error propagation. These algorithms are very useful but their performance is not well understood and it is not clear how many realizations are needed for satisfactory results. In this presentation we introduce an iterative ensemble quasi-linear data assimilation approach for integrated reservoir monitoring. It is intended for problems for which the posterior or conditional probability density function is not too different from a Gaussian, despite nonlinearity in the state transition and observation equations. The algorithm generates realizations that have the potential to adequately represent the conditional probability density function (pdf). Theoretical analysis sheds light on the conditions under which this algorithm should work well and explains why some applications require very few realizations while others require many. This algorithm is compared with the classical ensemble Kalman filter (Evensen, 2003) and with Gu and Oliver's (2007) iterative ensemble Kalman filter on a synthetic problem of monitoring a reservoir using wellbore pressure and flux data.

  8. Learning temporal rules to forecast instability in continuously monitored patients.

    PubMed

    Guillame-Bert, Mathieu; Dubrawski, Artur; Wang, Donghan; Hravnak, Marilyn; Clermont, Gilles; Pinsky, Michael R

    2017-01-01

    Inductive machine learning, and in particular extraction of association rules from data, has been successfully used in multiple application domains, such as market basket analysis, disease prognosis, fraud detection, and protein sequencing. The appeal of rule extraction techniques stems from their ability to handle intricate problems yet produce models based on rules that can be comprehended by humans, and are therefore more transparent. Human comprehension is a factor that may improve adoption and use of data-driven decision support systems clinically via face validity. In this work, we explore whether we can reliably and informatively forecast cardiorespiratory instability (CRI) in step-down unit (SDU) patients utilizing data from continuous monitoring of physiologic vital sign (VS) measurements. We use a temporal association rule extraction technique in conjunction with a rule fusion protocol to learn how to forecast CRI in continuously monitored patients. We detail our approach and present and discuss encouraging empirical results obtained using continuous multivariate VS data from the bedside monitors of 297 SDU patients spanning 29 346 hours (3.35 patient-years) of observation. We present example rules that have been learned from data to illustrate potential benefits of comprehensibility of the extracted models, and we analyze the empirical utility of each VS as a potential leading indicator of an impending CRI event. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Forecasting high-priority infectious disease surveillance regions: a socioeconomic model.

    PubMed

    Chan, Emily H; Scales, David A; Brewer, Timothy F; Madoff, Lawrence C; Pollack, Marjorie P; Hoen, Anne G; Choden, Tenzin; Brownstein, John S

    2013-02-01

    Few researchers have assessed the relationships between socioeconomic inequality and infectious disease outbreaks at the population level globally. We use a socioeconomic model to forecast national annual rates of infectious disease outbreaks. We constructed a multivariate mixed-effects Poisson model of the number of times a given country was the origin of an outbreak in a given year. The dataset included 389 outbreaks of international concern reported in the World Health Organization's Disease Outbreak News from 1996 to 2008. The initial full model included 9 socioeconomic variables related to education, poverty, population health, urbanization, health infrastructure, gender equality, communication, transportation, and democracy, and 1 composite index. Population, latitude, and elevation were included as potential confounders. The initial model was pared down to a final model by a backwards elimination procedure. The dependent and independent variables were lagged by 2 years to allow for forecasting future rates. Among the socioeconomic variables tested, the final model included child measles immunization rate and telephone line density. The Democratic Republic of Congo, China, and Brazil were predicted to be at the highest risk for outbreaks in 2010, and Colombia and Indonesia were predicted to have the highest percentage of increase in their risk compared to their average over 1996-2008. Understanding socioeconomic factors could help improve the understanding of outbreak risk. The inclusion of the measles immunization variable suggests that there is a fundamental basis in ensuring adequate public health capacity. Increased vigilance and expanding public health capacity should be prioritized in the projected high-risk regions.
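
    A simplified sketch of this kind of count-data model: a plain Poisson regression of outbreak counts on two-year-lagged predictors, fitted with statsmodels. The country-level random effects of the actual mixed-effects model are omitted, and the variable names and synthetic data are illustrative, not the study's data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Synthetic country-year records with two-year-lagged predictors
    # (illustrative names and values).
    rng = np.random.default_rng(5)
    n = 400
    df = pd.DataFrame({
        "measles_immunization": rng.uniform(40, 99, n),  # % coverage, lagged 2 years
        "phone_density": rng.uniform(0, 60, n),          # lines per 100 people, lagged 2 years
        "log_population": rng.normal(16, 1.5, n),
    })
    rate = np.exp(-3 + 0.01 * df["phone_density"] - 0.02 * df["measles_immunization"]
                  + 0.2 * (df["log_population"] - 16))
    df["outbreaks"] = rng.poisson(rate)

    # Plain Poisson regression as a simplified stand-in for the mixed-effects model
    # (country random effects omitted).
    X = sm.add_constant(df[["measles_immunization", "phone_density", "log_population"]])
    fit = sm.GLM(df["outbreaks"], X, family=sm.families.Poisson()).fit()
    print(fit.params)
    ```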

  10. Forecasting paediatric malaria admissions on the Kenya Coast using rainfall.

    PubMed

    Karuri, Stella Wanjugu; Snow, Robert W

    2016-01-01

    Malaria is a vector-borne disease which, despite recent scaled-up efforts to achieve control in Africa, continues to pose a major threat to child survival. The disease is caused by the protozoan parasite Plasmodium and requires mosquitoes and humans for transmission. Rainfall is a major factor in seasonal and secular patterns of malaria transmission along the East African coast. The goal of the study was to develop a model to reliably forecast incidences of paediatric malaria admissions to Kilifi District Hospital (KDH). In this article, we apply several statistical models to look at the temporal association between monthly paediatric malaria hospital admissions, rainfall, and Indian Ocean sea surface temperatures. Trend and seasonally adjusted, marginal and multivariate, time-series models for hospital admissions were applied to a unique data set to examine the role of climate, seasonality, and long-term anomalies in predicting malaria hospital admission rates and whether these might become more or less predictable with increasing vector control. The proportion of paediatric admissions to KDH that have malaria as a cause of admission can be forecast by a model which depends on the proportion of malaria admissions in the previous 2 months. This model is improved by incorporating either the previous month's Indian Ocean Dipole information or the previous 2 months' rainfall. Surveillance data can help build time-series prediction models which can be used to anticipate seasonal variations in clinical burdens of malaria in stable transmission areas and aid the timing of malaria vector control.

  11. Climate Cycles and Forecasts of Cutaneous Leishmaniasis, a Nonstationary Vector-Borne Disease

    PubMed Central

    Chaves, Luis Fernando; Pascual, Mercedes

    2006-01-01

    Background: Cutaneous leishmaniasis (CL) is one of the main emergent diseases in the Americas. As in other vector-transmitted diseases, its transmission is sensitive to the physical environment, but no study has addressed the nonstationary nature of such relationships or the interannual patterns of cycling of the disease. Methods and Findings: We studied monthly data, spanning from 1991 to 2001, of CL incidence in Costa Rica using several approaches for nonstationary time series analysis in order to ensure robustness in the description of CL's cycles. Interannual cycles of the disease and the association of these cycles with climate variables were described using frequency and time-frequency techniques for time series analysis. We fitted linear models to the data using climatic predictors, and tested forecasting accuracy for several intervals of time. Forecasts were evaluated using "out of fit" data (i.e., data not used to fit the models). We showed that CL has cycles of approximately 3 y that are coherent with those of temperature and El Niño Southern Oscillation indices (Sea Surface Temperature 4 and Multivariate ENSO Index). Conclusions: Linear models using temperature and MEI can satisfactorily predict CL incidence dynamics up to 12 mo ahead, with an accuracy that varies from 72% to 77% depending on prediction time. They clearly outperform simpler models with no climate predictors, a finding that further supports a dynamical link between the disease and climate. PMID:16903778

  12. Objective Use of Climate Indices to Inform Ensemble Streamflow Forecasts in the Columbia River Basin - An Initial Review

    NASA Astrophysics Data System (ADS)

    Pytlak, E.; McManamon, A.; Hughes, S. P.; Van Der Zweep, R. A.; Butcher, P.; Karafotias, C.; Beckers, J.; Welles, E.

    2016-12-01

    Numerous studies have documented the impacts that large-scale weather patterns and climate phenomena like the El Niño Southern Oscillation (ENSO), the Pacific-North American (PNA) Pattern, and others can have on seasonal temperature and precipitation in the Columbia River Basin (CRB). While far from perfect in terms of seasonal predictability in specific locations, these intra-annual weather and climate signals do tilt the odds toward different temperature and precipitation outcomes, which in turn can have impacts on seasonal snowpacks, streamflows and water supply in large river basins like the CRB. We hypothesize that intraseasonal climate signals and long-wave jet stream patterns can be objectively incorporated into what is otherwise a climatology-based set of Ensemble Streamflow Forecasts, and can increase the predictive skill and utility of these forecasts used for mid-range hydropower planning. The Bonneville Power Administration (BPA) and Deltares have developed a subsampling-resampling method to incorporate climate mode information into the Ensemble Streamflow Prediction (ESP) forecasts (Beckers et al., 2016). Since 2015, BPA and Deltares USA have experimented with this method in pre-operational use, using five objective multivariate climate indices that appear to have the greatest predictive value for seasonal temperature and precipitation in the CRB. The indices are used to objectively select historical weather from about twenty analog years in the 66-year (1949-2015) historical ESP set. These twenty scenarios then serve as the starting point to generate monthly synthetic weather and streamflow time series to return a set of 66 streamflow traces. Our poster will share initial results from the 2015 and 2016 water years, which included large swings in the Quasi-Biennial Oscillation, persistent blocking jet stream patterns, and the development of a strong El Niño event. While the results are very preliminary and for only two seasons, there may be some value in incorporating objectively-identified climate signals into ESP-based streamflow forecasts. Reference: Beckers, J. V. L., Weerts, A. H., Tijdeman, E., and Welles, E.: ENSO-Conditioned Weather Resampling Method for Seasonal Ensemble Streamflow Prediction, Hydrol. Earth Syst. Sci. Discuss., doi:10.5194/hess-2016-72, in review, 2016.

  13. State updating of a distributed hydrological model with Ensemble Kalman Filtering: Effects of updating frequency and observation network density on forecast accuracy

    NASA Astrophysics Data System (ADS)

    Rakovec, O.; Weerts, A.; Hazenberg, P.; Torfs, P.; Uijlenhoet, R.

    2012-12-01

    This paper presents a study on the optimal setup for discharge assimilation within a spatially distributed hydrological model (Rakovec et al., 2012a). The Ensemble Kalman filter (EnKF) is employed to update the grid-based distributed states of such an hourly spatially distributed version of the HBV-96 model. By using a physically based model for the routing, the time delay and attenuation are modelled more realistically. The discharge and states at a given time step are assumed to be dependent on the previous time step only (Markov property). Synthetic and real world experiments are carried out for the Upper Ourthe (1600 km2), a relatively quickly responding catchment in the Belgian Ardennes. The uncertain precipitation model forcings were obtained using a time-dependent multivariate spatial conditional simulation method (Rakovec et al., 2012b), which is further made conditional on preceding simulations. We assess the impact on the forecasted discharge of (1) various sets of the spatially distributed discharge gauges and (2) the filtering frequency. The results show that the hydrological forecast at the catchment outlet is improved by assimilating interior gauges. This augmentation of the observation vector improves the forecast more than increasing the updating frequency. In terms of the model states, the EnKF procedure is found to mainly change the pdfs of the two routing model storages, even when the uncertainty in the discharge simulations is smaller than the defined observation uncertainty. Rakovec, O., Weerts, A. H., Hazenberg, P., Torfs, P. J. J. F., and Uijlenhoet, R.: State updating of a distributed hydrological model with Ensemble Kalman Filtering: effects of updating frequency and observation network density on forecast accuracy, Hydrol. Earth Syst. Sci. Discuss., 9, 3961-3999, doi:10.5194/hessd-9-3961-2012, 2012a. Rakovec, O., Hazenberg, P., Torfs, P. J. J. F., Weerts, A. H., and Uijlenhoet, R.: Generating spatial precipitation ensembles: impact of temporal correlation structure, Hydrol. Earth Syst. Sci. Discuss., 9, 3087-3127, doi:10.5194/hessd-9-3087-2012, 2012b.
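
    A generic stochastic (perturbed-observation) EnKF analysis step, of the kind used to update distributed model states with discharge observations, can be sketched as below. This is not the study's HBV-96 setup; the observation operator, error statistics, and dimensions are illustrative assumptions.

    ```python
    import numpy as np

    def enkf_update(ensemble, obs, obs_err_std, H):
        """One stochastic (perturbed-observation) EnKF analysis step.
        ensemble: (n_state, n_members) forecast states
        obs: (n_obs,) observed values; obs_err_std: observation error std. dev.
        H: (n_obs, n_state) linear observation operator."""
        n_members = ensemble.shape[1]
        X = ensemble - ensemble.mean(axis=1, keepdims=True)   # state anomalies
        Y = H @ X                                              # predicted-obs anomalies
        R = (obs_err_std ** 2) * np.eye(obs.size)
        K = (X @ Y.T / (n_members - 1)) @ np.linalg.inv(Y @ Y.T / (n_members - 1) + R)
        perturbed = obs[:, None] + obs_err_std * np.random.randn(obs.size, n_members)
        return ensemble + K @ (perturbed - H @ ensemble)

    # Tiny usage example: 50 model states, 40 members, discharge observed at 3 points.
    rng = np.random.default_rng(0)
    ens = rng.normal(size=(50, 40))
    H = np.zeros((3, 50)); H[0, 5] = H[1, 20] = H[2, 45] = 1.0
    updated = enkf_update(ens, obs=np.array([0.5, -0.2, 1.0]), obs_err_std=0.1, H=H)
    ```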

  14. Travel Demand Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Southworth, Frank; Garrow, Dr. Laurie

    This chapter describes the principal types of both passenger and freight demand models in use today, providing a brief history of model development supported by references to a number of popular texts on the subject, and directing the reader to papers covering some of the more recent technical developments in the area. Over the past half century a variety of methods have been used to estimate and forecast travel demands, drawing concepts from economic/utility maximization theory, transportation system optimization and spatial interaction theory, using and often combining solution techniques as varied as Box-Jenkins methods, non-linear multivariate regression, non-linear mathematical programming, and agent-based microsimulation.

  15. Forecast for solar cycle 23 activity: a progress report

    NASA Astrophysics Data System (ADS)

    Ahluwalia, H. S.

    2001-08-01

    At the 25th International Cosmic Ray Conference (ICRC) at Durban, South Africa, I announced the discovery of a three-cycle quasi-periodicity in the ion chamber data string assembled by me for the 1937 to 1994 period (Conf. Pap., v. 2, p. 109, 1997). It corresponded in time with a similar quasi-periodicity observed in the dataset for the planetary index Ap. At the 26th ICRC at Salt Lake City, UT, I reported on our analysis of the Ap data to forecast the amplitude of solar cycle 23 activity (Conf. Pap., v. 2, p. 260, 1999). I predicted that cycle 23 would be moderate (a la cycle 17), notwithstanding the early exuberant forecasts of some solar astronomers that cycle 23 "may be one of the greatest cycles in recent times, if not the greatest." Sunspot number data up to April 2001 indicate that our forecast appears to be right on the mark. We review the solar, interplanetary and geophysical data and describe the important lessons learned from this experience. 1. Introduction Ohl (1971) was the first to realize that the Sun may be sending us a subliminal message as to its intent for its activity (Sunspot Numbers, SSN) in the next cycle. He posited that the message was embedded in the geomagnetic activity (given by sum Kp). Schatten et al. (1978) suggested that the Ohl hypothesis could be understood on the basis of the model proposed by Babcock (1961), who suggested that the high-latitude solar poloidal fields, near a minimum, emerge as the toroidal fields on opposite sides of the solar equator. This is known as the Solar Dynamo Model. One can speculate that the precursor poloidal solar field is entrained in the high-speed solar wind streams (HSSWS) from the coronal holes which are observed at Earth's orbit during the descending phase of the previous cycle. The interaction

  16. Non-uniform multivariate embedding to assess the information transfer in cardiovascular and cardiorespiratory variability series.

    PubMed

    Faes, Luca; Nollo, Giandomenico; Porta, Alberto

    2012-03-01

    The complexity of the short-term cardiovascular control prompts for the introduction of multivariate (MV) nonlinear time series analysis methods to assess directional interactions reflecting the underlying regulatory mechanisms. This study introduces a new approach for the detection of nonlinear Granger causality in MV time series, based on embedding the series by a sequential, non-uniform procedure, and on estimating the information flow from one series to another by means of the corrected conditional entropy. The approach is validated on short realizations of linear stochastic and nonlinear deterministic processes, and then evaluated on heart period, systolic arterial pressure and respiration variability series measured from healthy humans in the resting supine position and in the upright position after head-up tilt. Copyright © 2011 Elsevier Ltd. All rights reserved.

  17. Improving Air Quality Forecasts with AURA Observations

    NASA Technical Reports Server (NTRS)

    Newchurch, M. J.; Biazer, A.; Khan, M.; Koshak, W. J.; Nair, U.; Fuller, K.; Wang, L.; Parker, Y.; Williams, R.; Liu, X.

    2008-01-01

    Past studies have identified model initial and boundary conditions as sources of reducible errors in air-quality simulations. In particular, improving the initial condition improves the accuracy of short-term forecasts as it allows for the impact of local emissions to be realized by the model and improving boundary conditions improves long range transport through the model domain, especially in recirculating anticyclones. During the August 2006 period, we use AURA/OMI ozone measurements along with MODIS and CALIPSO aerosol observations to improve the initial and boundary conditions of ozone and Particulate Matter. Assessment of the model by comparison of the control run and satellite assimilation run to the IONS06 network of ozonesonde observations, which comprise the densest ozone sounding campaign ever conducted in North America, to AURA/TES ozone profile measurements, and to the EPA ground network of ozone and PM measurements will show significant improvement in the CMAQ calculations that use AURA initial and boundary conditions. Further analyses of lightning occurrences from ground and satellite observations and AURA/OMI NO2 column abundances will identify the lightning NOx signal evident in OMI measurements and suggest pathways for incorporating the lightning and NO2 data into the CMAQ simulations.

  18. Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul

    2003-01-01

    Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10^4 real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.

  19. Model-independent constraints on modified gravity from current data and from the Euclid and SKA future surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taddei, Laura; Martinelli, Matteo; Amendola, Luca, E-mail: taddei@thphys.uni-heidelberg.de, E-mail: martinelli@lorentz.leidenuniv.nl, E-mail: amendola@thphys.uni-heidelberg.de

    2016-12-01

    The aim of this paper is to constrain modified gravity with redshift space distortion observations and supernovae measurements. Compared with a standard ΛCDM analysis, we include three additional free parameters, namely the initial conditions of the matter perturbations, the overall perturbation normalization, and a scale-dependent modified gravity parameter modifying the Poisson equation, in an attempt to perform a more model-independent analysis. First, we constrain the Poisson parameter Y (also called G_eff) by using currently available fσ_8 data and the recent SN catalog JLA. We find that the inclusion of the additional free parameters makes the constraints significantly weaker than when fixing them to the standard cosmological value. Second, we forecast future constraints on Y by using the predicted growth-rate data for Euclid and SKA missions. Here again we point out the weakening of the constraints when the additional parameters are included. Finally, we adopt as modified gravity Poisson parameter the specific Horndeski form, and use scale-dependent forecasts to build an exclusion plot for the Yukawa potential akin to the ones realized in laboratory experiments, both for the Euclid and the SKA surveys.

  20. Extinction debt from climate change for frogs in the wet tropics

    PubMed Central

    Brook, Barry W.; Hoskin, Conrad J.; Pressey, Robert L.; VanDerWal, Jeremy; Williams, Stephen E.

    2016-01-01

    The effect of twenty-first-century climate change on biodiversity is commonly forecast based on modelled shifts in species ranges, linked to habitat suitability. These projections have been coupled with species–area relationships (SAR) to infer extinction rates indirectly as a result of the loss of climatically suitable areas and associated habitat. This approach does not model population dynamics explicitly, and so accepts that extinctions might occur after substantial (but unknown) delays—an extinction debt. Here we explicitly couple bioclimatic envelope models of climate and habitat suitability with generic life-history models for 24 species of frogs found in the Australian Wet Tropics (AWT). We show that (i) as many as four species of frogs face imminent extinction by 2080, due primarily to climate change; (ii) three frogs face delayed extinctions; and (iii) this extinction debt will take at least a century to be realized in full. Furthermore, we find congruence between forecast rates of extinction using SARs, and demographic models with an extinction lag of 120 years. We conclude that SAR approaches can provide useful advice to conservation on climate change impacts, provided there is a good understanding of the time lags over which delayed extinctions are likely to occur. PMID:27729484

  1. Researches on High Accuracy Prediction Methods of Earth Orientation Parameters

    NASA Astrophysics Data System (ADS)

    Xu, X. Q.

    2015-09-01

    The Earth rotation reflects the coupling process among the solid Earth, atmosphere, oceans, mantle, and core of the Earth on multiple spatial and temporal scales. The Earth rotation can be described by the Earth orientation parameters, abbreviated as EOP (mainly including two polar motion components PM_X and PM_Y, and the variation in the length of day ΔLOD). The EOP is crucial in the transformation between the terrestrial and celestial reference systems, and has important applications in many areas such as deep space exploration, satellite precise orbit determination, and astrogeodynamics. However, the EOP products obtained by space geodetic technologies generally lag by several days to two weeks. The growing demands of modern space navigation make high-accuracy EOP prediction a worthy topic. This thesis is composed of the following three aspects, for the purpose of improving the EOP forecast accuracy. (1) We analyze the relation between the length of the basic data series and the EOP forecast accuracy, and compare the EOP prediction accuracy of the linear autoregressive (AR) model and the nonlinear artificial neural network (ANN) method by performing least squares (LS) extrapolations. The results show that high-precision EOP forecasts can be realized by appropriate selection of the basic data series length according to the required time span of EOP prediction: for short-term prediction, the basic data series should be shorter, while for long-term prediction, the series should be longer. The analysis also shows that the LS+AR model is more suitable for short-term forecasts, while the LS+ANN model shows advantages in medium- and long-term forecasts. (2) We develop for the first time a new method which combines the autoregressive model and the Kalman filter (AR+Kalman) in short-term EOP prediction. The equations of observation and state are established using the EOP series and the autoregressive coefficients respectively, which are used to improve/re-evaluate the AR model. Compared to the single AR model, the AR+Kalman method performs better in the prediction of UT1-UTC and ΔLOD, and the improvement in the prediction of polar motion is significant. (3) Following the successful Earth Orientation Parameter Prediction Comparison Campaign (EOP PCC), the Earth Orientation Parameter Combination of Prediction Pilot Project (EOPC PPP) was sponsored in 2010. As one of the participants from China, we update and submit short- and medium-term (1 to 90 days) EOP predictions every day. According to the current comparative statistics, our prediction accuracy is at a medium international level. We will carry out more innovative research to improve the EOP forecast accuracy and enhance our level in EOP forecasting.
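
    A minimal sketch of the LS+AR idea: remove a least-squares fit from the EOP series, model the residuals with an AR process, and add the AR forecast of the residuals to the extrapolated fit. Here the LS part is only a linear trend (operational schemes also remove annual, semi-annual, and Chandler terms), and the lag order, synthetic series, and horizon are illustrative choices.

    ```python
    import numpy as np
    from statsmodels.tsa.ar_model import AutoReg

    def ls_ar_forecast(series, horizon, ar_lags=20):
        """LS+AR sketch: fit a deterministic part by least squares (here only a
        linear trend), model the residuals with an AR process, and add the AR
        residual forecast to the extrapolated trend."""
        series = np.asarray(series, dtype=float)
        t = np.arange(series.size)
        coeffs = np.polyfit(t, series, deg=1)        # least-squares trend
        resid = series - np.polyval(coeffs, t)
        ar_fit = AutoReg(resid, lags=ar_lags).fit()
        t_future = np.arange(series.size, series.size + horizon)
        resid_fc = ar_fit.predict(start=series.size, end=series.size + horizon - 1)
        return np.polyval(coeffs, t_future) + resid_fc

    # Example on a synthetic daily series standing in for an EOP component.
    rng = np.random.default_rng(1)
    x = 0.001 * np.arange(1000) + np.sin(2 * np.pi * np.arange(1000) / 365) \
        + 0.05 * rng.standard_normal(1000)
    print(ls_ar_forecast(x, horizon=10))
    ```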

  2. Environmental forecasting and turbulence modeling

    NASA Astrophysics Data System (ADS)

    Hunt, J. C. R.

    This review describes the fundamental assumptions and current methodologies of the two main kinds of environmental forecast; the first is valid for a limited period of time into the future and over a limited space-time 'target', and is largely determined by the initial and preceding state of the environment, such as the weather or pollution levels, up to the time when the forecast is issued and by its state at the edges of the region being considered; the second kind provides statistical information over long periods of time and/or over large space-time targets, so that they only depend on the statistical averages of the initial and 'edge' conditions. Environmental forecasts depend on the various ways that models are constructed. These range from those based on the 'reductionist' methodology (i.e., the combination of separate, scientifically based, models for the relevant processes) to those based on statistical methodologies, using a mixture of data and scientifically based empirical modeling. These are, as a rule, focused on specific quantities required for the forecast. The persistence and predictability of events associated with environmental and turbulent flows and the reasons for variation in the accuracy of their forecasts (of the first and second kinds) are now better understood and better modeled. This has partly resulted from using analogous results of disordered chaotic systems, and using the techniques of calculating ensembles of realizations, ideally involving several different models, so as to incorporate in the probabilistic forecasts a wider range of possible events. The rationale for such an approach needs to be developed. However, other insights have resulted from the recognition of the ordered, though randomly occurring, nature of the persistent motions in these flows, whose scales range from those of synoptic weather patterns (whether storms or 'blocked' anticyclones) to small scale vortices. These eigen states can be predicted from the reductionist models or may be modeled specifically, for example, in terms of 'self-organized' critical phenomena. It is noted how in certain applications of turbulent modeling its methods are beginning to resemble those of environmental simulations, because of the trend to introduce 'on-line' controls of the turbulent flows in advanced engineering fluid systems. In real time simulations, for both local environmental processes and these engineering systems, maximum information is needed about the likely flow patterns in order to optimize both the assimilation of limited real-time data and the use of limited real-time computing capacity. It is concluded that philosophical studies of how scientific models develop and of the concept of determinism in science are helpful in considering these complex issues.

  3. Subsurface characterization with localized ensemble Kalman filter employing adaptive thresholding

    NASA Astrophysics Data System (ADS)

    Delijani, Ebrahim Biniaz; Pishvaie, Mahmoud Reza; Boozarjomehry, Ramin Bozorgmehry

    2014-07-01

    Ensemble Kalman filter, EnKF, as a Monte Carlo sequential data assimilation method, has emerged promisingly for subsurface media characterization during the past decade. Due to the high computational cost of a large ensemble size, EnKF is limited to a small ensemble size in practice. This results in the appearance of spurious correlations in the covariance structure, leading to incorrect updates or even divergence of the updated realizations. In this paper, a universal/adaptive thresholding method is presented to remove and/or mitigate the spurious correlation problem in the forecast covariance matrix. This method is, then, extended to regularize the Kalman gain directly. Four different thresholding functions have been considered to threshold the forecast covariance and gain matrices. These include hard, soft, lasso and Smoothly Clipped Absolute Deviation (SCAD) functions. Three benchmarks are used to evaluate the performance of these methods. These benchmarks include a small 1D linear model and two 2D water flooding (in petroleum reservoirs) cases whose levels of heterogeneity/nonlinearity are different. It should be noted that besides the adaptive thresholding, the standard distance-dependent localization and bootstrap Kalman gain are also implemented for comparison purposes. We assessed each setup with different ensemble sets to investigate the sensitivity of each method to ensemble size. The results indicate that thresholding of the forecast covariance yields more reliable performance than thresholding the Kalman gain. Among the thresholding functions, SCAD is more robust for both covariance and gain estimation. Our analyses emphasize that not all assimilation cycles require thresholding and that it should be performed wisely during the early assimilation cycles. The proposed scheme of adaptive thresholding outperforms other methods for subsurface characterization of the underlying benchmarks.
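
    The main thresholding rules named above (hard, soft/lasso, and SCAD) can be written elementwise and applied to a noisy ensemble-estimated forecast covariance, as sketched below. The threshold value, the conventional SCAD constant a = 3.7, and the synthetic ensemble are illustrative, and in practice the diagonal is usually left unthresholded.

    ```python
    import numpy as np

    def hard_threshold(C, lam):
        # Keep entries whose magnitude exceeds lam, zero the rest.
        return np.where(np.abs(C) > lam, C, 0.0)

    def soft_threshold(C, lam):
        # Shrink all entries toward zero by lam (lasso-style rule).
        return np.sign(C) * np.maximum(np.abs(C) - lam, 0.0)

    def scad_threshold(C, lam, a=3.7):
        # SCAD rule: soft near zero, interpolated in the middle band,
        # unchanged for large entries (Fan-Li form).
        absC = np.abs(C)
        middle = ((a - 1) * C - np.sign(C) * a * lam) / (a - 2)
        return np.where(absC <= 2 * lam, soft_threshold(C, lam),
                        np.where(absC <= a * lam, middle, C))

    # Example: regularize a noisy forecast covariance estimated from 30 members.
    rng = np.random.default_rng(1)
    ensemble = rng.standard_normal((100, 30))   # 100 state variables, 30 members
    C = np.cov(ensemble)                        # spurious-correlation-prone estimate
    C_scad = scad_threshold(C, lam=0.2)
    ```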

  4. The use of spatio-temporal correlation to forecast critical transitions

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Bierkens, Marc F. P.

    2010-05-01

    Complex dynamical systems may have critical thresholds at which the system shifts abruptly from one state to another. Such critical transitions have been observed in systems ranging from the human body system to financial markets and the Earth system. Forecasting the timing of critical transitions before they are reached is of paramount importance because critical transitions are associated with a large shift in dynamical regime of the system under consideration. However, it is hard to forecast critical transitions, because the state of the system shows relatively little change before the threshold is reached. Recently, it was shown that increased spatio-temporal autocorrelation and variance can serve as alternative early warning signal for critical transitions. However, thus far these second order statistics have not been used for forecasting in a data assimilation framework. Here we show that the use of spatio-temporal autocorrelation and variance in the state of the system reduces the uncertainty in the predicted timing of critical transitions compared to classical approaches that use the value of the system state only. This is shown by assimilating observed spatio-temporal autocorrelation and variance into a dynamical system model using a Particle Filter. We adapt a well-studied distributed model of a logistically growing resource with a fixed grazing rate. The model describes the transition from an underexploited system with high resource biomass to overexploitation as grazing pressure crosses the critical threshold, which is a fold bifurcation. To represent limited prior information, we use a large variance in the prior probability distributions of model parameters and the system driver (grazing rate). First, we show that the rate of increase in spatio-temporal autocorrelation and variance prior to reaching the critical threshold is relatively consistent across the uncertainty range of the driver and parameter values used. This indicates that an increase in spatio-temporal autocorrelation and variance are consistent predictors of a critical transition, even under the condition of a poorly defined system. Second, we perform data assimilation experiments using an artificial exhaustive data set generated by one realization of the model. To mimic real-world sampling, an observational data set is created from this exhaustive data set. This is done by sampling on a regular spatio-temporal grid, supplemented by sampling locations at a short distance. Spatial and temporal autocorrelation in this observational data set is calculated for different spatial and temporal separation (lag) distances. To assign appropriate weights to observations (here, autocorrelation values and variance) in the Particle Filter, the covariance matrix of the error in these observations is required. This covariance matrix is estimated using Monte Carlo sampling, selecting a different random position of the sampling network relative to the exhaustive data set for each realization. At each update moment in the Particle Filter, observed autocorrelation values are assimilated into the model and the state of the model is updated. Using this approach, it is shown that the use of autocorrelation reduces the uncertainty in the forecasted timing of a critical transition compared to runs without data assimilation. The performance of the use of spatial autocorrelation versus temporal autocorrelation depends on the timing and number of observational data. This study is restricted to a single model only. 
However, it is becoming increasingly clear that spatio-temporal autocorrelation and variance can be used as early warning signals for a large number of systems. Thus, it is expected that spatio-temporal autocorrelation and variance are valuable in data assimilation frameworks in a large number of dynamical systems.

  5. Enviro-HIRLAM/ HARMONIE Studies in ECMWF HPC EnviroAerosols Project

    NASA Astrophysics Data System (ADS)

    Hansen Sass, Bent; Mahura, Alexander; Nuterman, Roman; Baklanov, Alexander; Palamarchuk, Julia; Ivanov, Serguei; Pagh Nielsen, Kristian; Penenko, Alexey; Edvardsson, Nellie; Stysiak, Aleksander Andrzej; Bostanbekov, Kairat; Amstrup, Bjarne; Yang, Xiaohua; Ruban, Igor; Bergen Jensen, Marina; Penenko, Vladimir; Nurseitov, Daniyar; Zakarin, Edige

    2017-04-01

    The EnviroAerosols on ECMWF HPC project (2015-2017) "Enviro-HIRLAM/ HARMONIE model research and development for online integrated meteorology-chemistry-aerosols feedbacks and interactions in weather and atmospheric composition forecasting" is aimed at analyzing the importance of the meteorology-chemistry/aerosols interactions and at developing efficient techniques for on-line coupling of numerical weather prediction and atmospheric chemical transport via process-oriented parameterizations and feedback algorithms, which will improve both the numerical weather prediction and atmospheric composition forecasts. Two main application areas of the on-line integrated modelling are considered: (i) improved numerical weather prediction with short-term feedbacks of aerosols and chemistry on the formation and development of meteorological variables, and (ii) improved atmospheric composition forecasting with on-line integrated meteorological forecast and two-way feedbacks between aerosols/chemistry and meteorology. During 2015-2016 several research projects were realized. First, the study on "On-line Meteorology-Chemistry/Aerosols Modelling and Integration for Risk Assessment: Case Studies" focused on the assessment of scenarios with accidental and continuous emissions of sulphur dioxide for case studies for Atyrau (Kazakhstan) near the northern part of the Caspian Sea and metallurgical enterprises on the Kola Peninsula (Russia), with GIS integration of modelling results into the RANDOM (Risk Assessment of Nature Detriment due to Oil spill Migration) system. Second, the studies on "The sensitivity of precipitation simulations to the soot aerosol presence" & "The precipitation forecast sensitivity to data assimilation on a very high resolution domain" focused on sensitivity and changes in the precipitation life-cycle under black carbon polluted conditions over Scandinavia. Third, the studies on "Aerosol effects over China investigated with a high resolution convection permitting weather model" & "Meteorological and chemical urban scale modelling for Shanghai metropolitan area" focused on aerosol effects and the influence of urban areas in China at regional-subregional-urban scales. Fourth, the study on "Direct variational data assimilation algorithm for atmospheric chemistry data with transport and transformation model" focused on testing a chemical data assimilation algorithm for in situ concentration measurements on a real data scenario. Fifth, the study on "Aerosol influence on High Resolution NWP HARMONIE Operational Forecasts" focused on the impact of sea salt aerosols on numerical weather prediction during low precipitation events. Finally, the study on "Impact of regional afforestation on climatic conditions in metropolitan areas: case study of Copenhagen" focused on the impact of forest and land-cover change on the formation and development of temperature regimes in the Copenhagen metropolitan area of Denmark. Selected results and findings will be presented and discussed.

  6. The clustering-based case-based reasoning for imbalanced business failure prediction: a hybrid approach through integrating unsupervised process with supervised process

    NASA Astrophysics Data System (ADS)

    Li, Hui; Yu, Jun-Ling; Yu, Le-An; Sun, Jie

    2014-05-01

    Case-based reasoning (CBR) is one of the main methods in business forecasting; it performs well in prediction and can provide explanations for its results. In business failure prediction (BFP), the number of failed enterprises is small relative to the number of non-failed ones, yet the loss is huge when an enterprise fails. It is therefore necessary to develop methods, trained on imbalanced samples, that forecast well for this small proportion of failed enterprises while maintaining high total accuracy. Commonly used methods built on the assumption of balanced samples do not predict the minority class well on imbalanced samples consisting of the minority (failed) enterprises and the majority (non-failed) ones. This article develops a new method called clustering-based CBR (CBCBR), which integrates clustering analysis, an unsupervised process, with CBR, a supervised process, to enhance the efficiency of retrieving information from both the minority and the majority classes in CBR. In CBCBR, case classes are first generated through hierarchical clustering of the stored experienced cases, and class centres are calculated by aggregating the information of the cases in each clustered class. When predicting the label of a target case, its nearest clustered case class is first retrieved by ranking the similarities between the target case and each clustered class centre. Then, the nearest neighbours of the target case within the selected clustered case class are retrieved. Finally, the labels of these nearest experienced cases are used in prediction. In an empirical experiment with two imbalanced samples from China, the performance of CBCBR was compared with classical CBR, a support vector machine, logistic regression and multivariate discriminant analysis. The results show that, compared with the other four methods, CBCBR performed significantly better in terms of sensitivity for identifying the minority samples while maintaining high total accuracy. The proposed approach makes CBR useful in imbalanced forecasting.
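    The retrieval scheme described above (cluster the case base, pick the nearest class centre, then vote among neighbours inside that class) can be sketched in a few lines. The snippet below is an illustrative reconstruction on synthetic imbalanced data, not the authors' implementation; the cluster count and neighbourhood size are arbitrary assumptions.

```python
# A minimal sketch of clustering-based case-based reasoning (CBCBR):
# cluster the stored cases, pick the nearest cluster centre for a target
# case, then vote among the nearest neighbours inside that cluster.
# Synthetic data; not the authors' implementation.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_classification
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=6, weights=[0.9, 0.1],
                           random_state=0)          # imbalanced case base

n_classes = 8                                        # number of case classes (assumed)
labels = AgglomerativeClustering(n_clusters=n_classes).fit_predict(X)
centres = np.vstack([X[labels == c].mean(axis=0) for c in range(n_classes)])

def predict(target, k=5):
    c = cdist(target[None, :], centres).argmin()     # nearest clustered case class
    Xc, yc = X[labels == c], y[labels == c]
    nearest = cdist(target[None, :], Xc)[0].argsort()[:k]
    return int(round(yc[nearest].mean()))            # majority label of the nearest cases

target = X[0] + 0.1 * rng.standard_normal(6)
print("predicted label:", predict(target))
```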

  7. Development of online NIR urine analyzing system based on AOTF

    NASA Astrophysics Data System (ADS)

    Wan, Feng; Sun, Zhendong; Li, Xiaoxia

    2006-09-01

    In this paper, some key techniques in the development of an on-line NIR urine analyzing system based on AOTF (Acousto-Optic Tunable Filter) are introduced. Problems in designing the optical system, including collimation of the incident light and the working distance (the shortest distance needed to separate the incident and diffracted light), are analyzed. A DDS (Direct Digital Synthesizer) controlled by a microprocessor is used to realize the wavelength scan. The experimental results show that this NIR urine analyzing system based on AOTF has a 10000-4000 cm-1 wavelength range and a 0.3 ms wavelength switching time. Compared with a conventional Fourier transform NIR spectrophotometer for analyzing multiple components in urine, this system features low cost, small size and on-line measurement capability. Unscrambler software (multivariate statistical software by CAMO Inc., Norway) is used for processing the data. The system can realize on-line quantitative analysis of protein, urea and creatinine in urine.

  8. A New Predictive Model of Centerline Segregation in Continuous Cast Steel Slabs by Using Multivariate Adaptive Regression Splines Approach

    PubMed Central

    García Nieto, Paulino José; González Suárez, Victor Manuel; Álvarez Antón, Juan Carlos; Mayo Bayón, Ricardo; Sirgo Blanco, José Ángel; Díaz Fernández, Ana María

    2015-01-01

    The aim of this study was to obtain a predictive model able to perform early detection of central segregation severity in continuous cast steel slabs. Segregation in steel cast products is an internal defect that can be very harmful when slabs are rolled in heavy plate mills. In this research work, the central segregation was studied successfully using a data mining methodology based on the multivariate adaptive regression splines (MARS) technique. For this purpose, the most important physical-chemical parameters are considered. The results of the present study are two-fold. First, the significance of each physical-chemical variable for segregation is presented through the model. Second, a model for forecasting segregation is obtained. Regression with optimal hyperparameters was performed on the experimental dataset, and the MARS technique yielded coefficients of determination of 0.93 for continuity factor estimation and 0.95 for average width. The agreement between the experimental data and the model confirmed the good performance of the latter.
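    As a rough illustration of the basis functions MARS builds on, the sketch below fits a fixed set of hinge (truncated linear) terms by ordinary least squares. The full MARS algorithm used in the paper adds and prunes such terms adaptively and tunes its hyperparameters; the data and knots here are synthetic placeholders.

```python
# Illustration of the hinge-function basis underlying MARS: each term is
# max(0, x - t) or max(0, t - x). A full MARS implementation adds/prunes
# such terms adaptively; here we fix the knots and fit by least squares.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)                      # a generic process variable (hypothetical)
y = 2.0 + 0.5 * np.maximum(0, x - 4) - 1.2 * np.maximum(0, 7 - x) + rng.normal(0, 0.3, x.size)

knots = [2.0, 4.0, 7.0]                          # candidate knots (assumed)
B = [np.ones_like(x)]
for t in knots:
    B += [np.maximum(0, x - t), np.maximum(0, t - x)]
B = np.column_stack(B)

coef, *_ = np.linalg.lstsq(B, y, rcond=None)     # least-squares fit of the hinge basis
yhat = B @ coef
r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R^2 of the hinge-basis fit: {r2:.3f}")
```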

  9. Occipital MEG Activity in the Early Time Range (<300 ms) Predicts Graded Changes in Perceptual Consciousness.

    PubMed

    Andersen, Lau M; Pedersen, Michael N; Sandberg, Kristian; Overgaard, Morten

    2016-06-01

    Two electrophysiological components have been extensively investigated as candidate neural correlates of perceptual consciousness: an early, occipitally realized component occurring 130-320 ms after stimulus onset and a late, frontally realized component occurring 320-510 ms after stimulus onset. Recent studies have suggested that the late component may not be uniquely related to perceptual consciousness, but also to sensory expectations, task associations, and selective attention. We conducted a magnetoencephalographic study; using multivariate analysis, we compared classification accuracies when decoding perceptual consciousness from the 2 components using sources from occipital and frontal lobes. We found that occipital sources during the early time range were significantly more accurate in decoding perceptual consciousness than frontal sources during both the early and late time ranges. These results are the first of their kind in which the predictive values of the 2 components are quantitatively compared, and they provide further evidence for the primary importance of occipital sources in realizing perceptual consciousness. The results have important consequences for current theories of perceptual consciousness, especially theories emphasizing the role of frontal sources. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. Dew point temperature affects ascospore release of allergenic genus Leptosphaeria

    NASA Astrophysics Data System (ADS)

    Sadyś, Magdalena; Kaczmarek, Joanna; Grinn-Gofron, Agnieszka; Rodinkova, Victoria; Prikhodko, Alex; Bilous, Elena; Strzelczak, Agnieszka; Herbert, Robert J.; Jedryczka, Malgorzata

    2018-06-01

    The genus Leptosphaeria contains numerous fungi that cause the symptoms of asthma and also parasitize wild and crop plants. In search of a robust and universal forecast model, the ascospore concentration in air was measured and weather data recorded from 1 March to 31 October between 2006 and 2012. The experiment was conducted in three European countries of the temperate climate, i.e., Ukraine, Poland, and the UK. Out of over 150 forecast models produced using artificial neural networks (ANNs) and multivariate regression trees (MRTs), we selected the best model for each site, as well as for joint two-site combinations. The performance of all computed models was tested against records from 1 year which had not been used for model construction. The statistical analysis of the fungal spore data was supported by a comprehensive study of both climate and land cover within a 30-km radius from the air sampler location. High-performance forecasting models were obtained for individual sites, showing that the local micro-climate plays a decisive role in biology of the fungi. Based on the previous epidemiological studies, we hypothesized that dew point temperature (DPT) would be a critical factor in the models. The impact of DPT was confirmed only by one of the final best neural models, but the MRT analyses, similarly to the Spearman's rank test, indicated the importance of DPT in all but one of the studied cases and in half of them ranked it as a fundamental factor. This work applies artificial neural modeling to predict the Leptosphaeria airborne spore concentration in urban areas for the first time.
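    A minimal sketch of the ANN side of such a forecast, assuming scikit-learn's MLPRegressor and a synthetic predictor set (dew point temperature, maximum temperature, precipitation): the network size, scaling and variables are illustrative assumptions, not the published models.

```python
# A minimal sketch of an ANN forecast model relating daily weather
# variables (including dew point temperature) to airborne ascospore
# concentration. Synthetic data; the predictor set, network size and
# scaling are assumptions, not the models published in the study.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
dpt = rng.normal(10, 5, n)            # dew point temperature [deg C]
tmax = dpt + rng.normal(8, 3, n)      # daily maximum temperature [deg C]
rain = rng.exponential(2, n)          # precipitation [mm]
spores = np.exp(0.15 * dpt - 0.1 * rain) * 50 + rng.normal(0, 10, n)

X = np.column_stack([dpt, tmax, rain])
X_tr, X_te, y_tr, y_te = train_test_split(X, spores, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```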

  12. Forecasting High-Priority Infectious Disease Surveillance Regions: A Socioeconomic Model

    PubMed Central

    Chan, Emily H.; Scales, David A.; Brewer, Timothy F.; Madoff, Lawrence C.; Pollack, Marjorie P.; Hoen, Anne G.; Choden, Tenzin; Brownstein, John S.

    2013-01-01

    Background. Few researchers have assessed the relationships between socioeconomic inequality and infectious disease outbreaks at the population level globally. We use a socioeconomic model to forecast national annual rates of infectious disease outbreaks. Methods. We constructed a multivariate mixed-effects Poisson model of the number of times a given country was the origin of an outbreak in a given year. The dataset included 389 outbreaks of international concern reported in the World Health Organization's Disease Outbreak News from 1996 to 2008. The initial full model included 9 socioeconomic variables related to education, poverty, population health, urbanization, health infrastructure, gender equality, communication, transportation, and democracy, and 1 composite index. Population, latitude, and elevation were included as potential confounders. The initial model was pared down to a final model by a backwards elimination procedure. The dependent and independent variables were lagged by 2 years to allow for forecasting future rates. Results. Among the socioeconomic variables tested, the final model included child measles immunization rate and telephone line density. The Democratic Republic of Congo, China, and Brazil were predicted to be at the highest risk for outbreaks in 2010, and Colombia and Indonesia were predicted to have the highest percentage of increase in their risk compared to their average over 1996–2008. Conclusions. Understanding socioeconomic factors could help improve the understanding of outbreak risk. The inclusion of the measles immunization variable suggests that there is a fundamental basis in ensuring adequate public health capacity. Increased vigilance and expanding public health capacity should be prioritized in the projected high-risk regions. PMID:23118271
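    The core regression idea can be sketched as a Poisson GLM of annual outbreak counts on lagged socioeconomic covariates with population as an exposure. The snippet below (statsmodels, synthetic data) omits the country-level random effects of the paper's mixed-effects model and uses hypothetical variable names.

```python
# Sketch of the modelling idea: a Poisson regression of annual outbreak
# counts on socioeconomic covariates lagged by two years, with population
# as an exposure offset. The paper uses a mixed-effects Poisson model;
# the country random effect is omitted here for brevity, and all data
# below are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200                                            # country-years (synthetic)
measles_immunization = rng.uniform(40, 99, n)      # % of children immunized
phone_density = rng.uniform(0, 60, n)              # telephone lines per 100 people
population = rng.uniform(1e6, 1e8, n)

rate = np.exp(-1.0 - 0.02 * measles_immunization - 0.01 * phone_density)
outbreaks = rng.poisson(rate * population / 1e7)

df = pd.DataFrame({"measles": measles_immunization,
                   "phones": phone_density,
                   "outbreaks": outbreaks})
# With a real panel, the covariates would be lagged by two years first,
# e.g. df.groupby("country")[["measles", "phones"]].shift(2) (hypothetical).

X = sm.add_constant(df[["measles", "phones"]])
model = sm.GLM(df["outbreaks"], X, family=sm.families.Poisson(),
               offset=np.log(population / 1e7))
print(model.fit().summary())
```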

  13. Dew point temperature affects ascospore release of allergenic genus Leptosphaeria.

    PubMed

    Sadyś, Magdalena; Kaczmarek, Joanna; Grinn-Gofron, Agnieszka; Rodinkova, Victoria; Prikhodko, Alex; Bilous, Elena; Strzelczak, Agnieszka; Herbert, Robert J; Jedryczka, Malgorzata

    2018-06-01

    The genus Leptosphaeria contains numerous fungi that cause the symptoms of asthma and also parasitize wild and crop plants. In search of a robust and universal forecast model, the ascospore concentration in air was measured and weather data recorded from 1 March to 31 October between 2006 and 2012. The experiment was conducted in three European countries of the temperate climate, i.e., Ukraine, Poland, and the UK. Out of over 150 forecast models produced using artificial neural networks (ANNs) and multivariate regression trees (MRTs), we selected the best model for each site, as well as for joint two-site combinations. The performance of all computed models was tested against records from 1 year which had not been used for model construction. The statistical analysis of the fungal spore data was supported by a comprehensive study of both climate and land cover within a 30-km radius from the air sampler location. High-performance forecasting models were obtained for individual sites, showing that the local micro-climate plays a decisive role in biology of the fungi. Based on the previous epidemiological studies, we hypothesized that dew point temperature (DPT) would be a critical factor in the models. The impact of DPT was confirmed only by one of the final best neural models, but the MRT analyses, similarly to the Spearman's rank test, indicated the importance of DPT in all but one of the studied cases and in half of them ranked it as a fundamental factor. This work applies artificial neural modeling to predict the Leptosphaeria airborne spore concentration in urban areas for the first time.

  14. Objective calibration of numerical weather prediction models

    NASA Astrophysics Data System (ADS)

    Voudouri, A.; Khain, P.; Carmona, I.; Bellprat, O.; Grazzini, F.; Avgoustoglou, E.; Bettems, J. M.; Kaufmann, P.

    2017-07-01

    Numerical weather prediction (NWP) and climate models use parameterization schemes for physical processes, which often include free or poorly constrained parameters. Model developers normally calibrate the values of these parameters subjectively to improve the agreement of forecasts with available observations, a procedure referred to as expert tuning. A practicable objective multivariate calibration method built on a quadratic meta-model (MM), which has been applied to a regional climate model (RCM), has been shown to be at least as good as expert tuning. Based on these results, an approach to implement the methodology for an NWP model is presented in this study. The challenges in transferring the methodology from RCM to NWP are not restricted to the use of higher resolution and different time scales: the sensitivity of NWP model quality with respect to the model parameter space has to be clarified, and the overall procedure has to be optimized in terms of the computing resources required to calibrate an NWP model. Three free model parameters, mainly affecting turbulence parameterization schemes, were initially selected with respect to their influence on variables associated with daily forecasts, such as daily minimum and maximum 2 m temperature and 24 h accumulated precipitation. Preliminary results indicate that the approach is both affordable in terms of computer resources and meaningful in terms of improved forecast quality. In addition, the proposed methodology has the advantage of being a replicable procedure that can be applied when an updated model version is launched and/or to customize the same model implementation over different climatological areas.
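    A toy version of the meta-model step may help fix ideas: evaluate a (here synthetic) forecast-error score at a small design of parameter settings, fit a quadratic surface to it, and take the surface minimum as the calibrated parameter set. The score function and parameter ranges below are placeholders, not the verification setup of the study.

```python
# Toy illustration of calibration via a quadratic meta-model: evaluate a
# (here synthetic) forecast-error score at a small design of parameter
# settings, fit a second-order polynomial surface, and take its minimum
# as the calibrated parameter set. Parameter names are placeholders.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

def forecast_error(p):
    # Stand-in for an expensive NWP verification score over three
    # turbulence-related parameters (hypothetical).
    return (p[0] - 0.3) ** 2 + 2 * (p[1] - 1.5) ** 2 + (p[2] + 0.5) ** 2 + 0.1 * p[0] * p[1]

rng = np.random.default_rng(7)
design = rng.uniform(-2, 2, size=(30, 3))            # parameter design points
scores = np.array([forecast_error(p) for p in design])

poly = PolynomialFeatures(degree=2)
mm = LinearRegression().fit(poly.fit_transform(design), scores)

surrogate = lambda p: mm.predict(poly.transform(p.reshape(1, -1)))[0]
best = minimize(surrogate, x0=np.zeros(3))
print("calibrated parameters (meta-model optimum):", np.round(best.x, 3))
```

    In practice each design point is an actual model run scored against observations, and the fitted quadratic surface replaces further expensive runs during the optimization.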

  15. Performance and quality assessment of the recent updated CMEMS global ocean monitoring and forecasting real-time system

    NASA Astrophysics Data System (ADS)

    Le Galloudec, Olivier; Lellouche, Jean-Michel; Greiner, Eric; Garric, Gilles; Régnier, Charly; Drévillon, Marie; Drillet, Yann

    2017-04-01

    Since May 2015, Mercator Ocean has contributed to the Copernicus Marine Environment Monitoring Service (CMEMS) and is in charge of the global eddy-resolving ocean analyses and forecasts. In this context, Mercator Ocean currently delivers daily real-time services (weekly analyses and daily forecasts) with a global 1/12° high-resolution system. The model component is the NEMO platform, driven at the surface by the IFS ECMWF atmospheric analyses and forecasts. Observations are assimilated by means of a reduced-order Kalman filter with a 3D multivariate modal decomposition of the forecast error. It includes an adaptive error estimate and a localization algorithm. Along-track altimeter data, satellite Sea Surface Temperature and in situ temperature and salinity vertical profiles are jointly assimilated to estimate the initial conditions for numerical ocean forecasting. A 3D-Var scheme provides a correction for the slowly evolving large-scale biases in temperature and salinity. R&D activities have been conducted at Mercator Ocean in recent years to improve the real-time 1/12° global system for the updated CMEMS version released in 2016. The ocean/sea-ice model and the assimilation scheme benefited from the following improvements: large-scale and objective correction of atmospheric quantities with satellite data, a new Mean Dynamic Topography taking into account the latest version of the GOCE geoid, new adaptive tuning of some observational errors, a new quality control on the assimilated temperature and salinity vertical profiles based on dynamic height criteria, assimilation of satellite sea-ice concentration, new freshwater runoff from ice sheet melting, … This presentation will show the impact of some updates separately, with a particular focus on adaptive tuning experiments for satellite Sea Level Anomaly (SLA) and Sea Surface Temperature (SST) observation errors. For SLA, the a priori prescribed observation error is greatly reduced globally: the median value of the error changed from 5 cm to 2.5 cm within a few assimilation cycles. For SST, we chose to maintain the median value of the error at 0.4°C; the spatial distribution of the SST error follows the model physics and atmospheric variability. For both SLA and SST, the adaptive tuning improves the performance of the system. The overall behaviour of the system integrating all updates will also be discussed, reporting on product quality improvements and highlighting the level of performance and the reliability of the new system.

  16. Forecasting seasonal hydrologic response in major river basins

    NASA Astrophysics Data System (ADS)

    Bhuiyan, A. M.

    2014-05-01

    Seasonal precipitation variation due to natural climate variability influences stream flow and the apparent frequency and severity of extreme hydrological conditions such as floods and droughts. To study hydrologic response and understand the occurrence of extreme hydrological events, the relevant forcing variables must be identified. This study attempts to assess and quantify the historical occurrence and context of extreme hydrologic flow events and to quantify the relations with relevant climate variables. Once identified, the flow data and climate variables are evaluated to identify the primary indicators of hydrologic extreme event occurrence. Existing studies focus on developing basin-scale forecasting techniques based on climate anomalies in El Nino/La Nina episodes linked to global climate. Building on earlier work, the goal of this research is to quantify variations in historical river flows at the seasonal temporal scale and at regional to continental spatial scales. The work identifies and quantifies runoff variability of major river basins and correlates flow with environmental forcing variables such as El Nino, La Nina and the sunspot cycle. These variables are expected to be the primary external natural indicators of inter-annual and inter-seasonal patterns of regional precipitation and river flow. Relations between continental-scale hydrologic flows and external climate variables are evaluated through direct correlations, in a seasonal context, with environmental phenomena such as sunspot numbers (SSN), the Southern Oscillation Index (SOI), and the Pacific Decadal Oscillation (PDO). Methods including stochastic time series analysis and artificial neural networks are developed to represent the seasonal variability evident in the historical records of river flows. River flows are categorized into low, average and high flow levels to evaluate and simulate flow variations under the associated climate variable variations. Results demonstrated that no single method is best suited to represent scenarios leading to extreme flow conditions. For selected flow scenarios, the persistence model performance may be comparable to more complex multivariate approaches, and complex methods did not always improve flow estimation. Overall model performance indicates that including river flows and forcing variables on average improves extreme-event forecasting skill. As a means to further refine the flow estimation, an ensemble forecast method is implemented to provide a likelihood-based indication of expected river flow magnitude and variability. Results indicate that seasonal flow variations are well captured in the ensemble range; therefore, the ensemble approach can often prove efficient in estimating extreme river flow conditions. The discriminant prediction approach, a probabilistic measure for forecasting streamflow, is also adopted to assess model performance. Results show the efficiency of the method in terms of representing uncertainties in the forecasts.
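    The direct-correlation step can be illustrated as follows, assuming monthly flow and a climate index held in a pandas DataFrame; the series are synthetic, and the seasonal aggregation (DJF, MAM, JJA, SON) is one reasonable convention rather than necessarily the one used in the study.

```python
# Sketch of the direct-correlation step: aggregate monthly river flow to
# seasonal means and correlate them with a climate index (here SOI).
# All series below are synthetic placeholders for the historical records.
import numpy as np
import pandas as pd
from scipy.stats import spearmanr

rng = np.random.default_rng(11)
dates = pd.date_range("1960-01-01", "2010-12-01", freq="MS")
soi = rng.normal(0, 1, dates.size)
flow = 100 + 20 * soi + rng.normal(0, 15, dates.size)   # flow loosely tied to SOI

df = pd.DataFrame({"flow": flow, "soi": soi}, index=dates)
seasonal = df.resample("QS-DEC").mean()                  # DJF, MAM, JJA, SON means

for season, grp in seasonal.groupby(seasonal.index.quarter):
    rho, p = spearmanr(grp["flow"], grp["soi"])
    print(f"season {season}: Spearman rho = {rho:.2f} (p = {p:.3f})")
```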

  17. Bayesian Local Contamination Models for Multivariate Outliers

    PubMed Central

    Page, Garritt L.; Dunson, David B.

    2013-01-01

    In studies where data are generated from multiple locations or sources it is common for there to exist observations that are quite unlike the majority. Motivated by the application of establishing a reference value in an inter-laboratory setting when outlying labs are present, we propose a local contamination model that is able to accommodate unusual multivariate realizations in a flexible way. The proposed method models the process level of a hierarchical model using a mixture with a parametric component and a possibly nonparametric contamination. Much of the flexibility in the methodology is achieved by allowing varying random subsets of the elements in the lab-specific mean vectors to be allocated to the contamination component. Computational methods are developed and the methodology is compared to three other possible approaches using a simulation study. We apply the proposed method to a NIST/NOAA sponsored inter-laboratory study which motivated the methodological development. PMID:24363465

  18. Long-term weather predictability: Ural case study

    NASA Astrophysics Data System (ADS)

    Kubyshen, Alexander; Shopin, Sergey

    2016-04-01

    The accuracy of state-of-the-art long-term meteorological forecasts (at the seasonal level) is still low. Here we present an approach (the RAMES method) that realizes a different forecasting methodology. It provides a prediction horizon of up to 19-22 years with equal probabilities of determining the parameters in every analyzed period [1]. The basic statements of the method are the following. 1. A long-term forecast based on numerical modelling of the global meteorological process is impossible in principle. Extension of the long-term prediction horizon can only be obtained by revealing and using the periodicity of meteorological situations at one point of observation. 2. The conventional calendar is unsuitable for generalizing meteorological data and revealing the cyclicity of meteorological processes. The RAMES method uses natural time intervals: one day, the synodic month and one year. A set of special calendars using these natural periods and the Metonic cycle was developed. 3. A long-term time series of meteorological data is not a single uniform universal set; it is a sequence of 28 universal sets appropriately superseding each other in time. The specifics of the method are: 1. Usage of an original research toolkit consisting of a set of calendars based on the Metonic cycle and a set of charts (coordinate systems) for the construction of sequence diagrams (of the daily variability of a meteorological parameter during the analyzed year; of the daily variability of a meteorological parameter using long-term dynamical time series of analogue periods; and of the monthly and yearly variability of the accumulated value of a meteorological parameter). 2. Identification and usage of new virtual meteorological objects with several degrees of generalization, appropriately located in the coordinate systems used. 3. Integration of all calculations into a single technological scheme providing comparison and mutual verification of the calculation results. During prolonged testing in the Ural region, the efficiency of the method was demonstrated for forecasting the following meteorological parameters: air temperature (minimum, maximum, daily mean, diurnal variation, last spring and first autumn freeze); periods of winds with speeds of >5 m/s and the maximum expected wind speed; precipitation periods and amounts of precipitation; relative humidity; and atmospheric pressure. Atmospheric events (thunderstorms, fog) and hydrometeors also occupy appropriate positions in the sequence diagrams, which provides a possibility of long-term forecasting for these events as well. The accuracy of the forecasts was tested in the years 2006-2009. The difference between the forecast monthly mean temperature and the actual values was <0.5°C in 40.9% of cases, between 0.5°C and 1°C in 18.2% of cases, between 1°C and 1.5°C in 18.2% of cases, and <2°C in 86% of cases. The RAMES method thus provides a toolkit to forecast weather conditions several years in advance. 1. A.F. Kubyshen, "RAMES method: revealing the periodicity of meteorological processes and its usage for long-term forecasting [Metodika «RAMES»: vyjavlenie periodichnosti meteorologicheskih processov i ee ispol'zovanie dlja dolgosrochnogo prognozirovanija]", in A.E. Fedorov (ed.), Sistema «Planeta Zemlja»: 200 let so dnja rozhdenija Izmaila Ivanovicha Sreznevskogo. 100 let so dnja izdanija ego slovarja drevnerusskogo jazyka. LENAND. Moscow. pp. 305-311. (In Russian)

  19. Implementation of Multivariable Logic Functions in Parallel by Electrically Addressing a Molecule of Three Dopants in Silicon.

    PubMed

    Fresch, Barbara; Bocquel, Juanita; Hiluf, Dawit; Rogge, Sven; Levine, Raphael D; Remacle, Françoise

    2017-07-05

    To realize low-power, compact logic circuits, one can explore parallel operation on single nanoscale devices. An added incentive is to use multivalued (as distinct from Boolean) logic. Here, we theoretically demonstrate that the computation of all the possible outputs of a multivariate, multivalued logic function can be implemented in parallel by electrical addressing of a molecule made up of three interacting dopant atoms embedded in Si. The electronic states of the dopant molecule are addressed by pulsing a gate voltage. By simulating the time evolution of the non-stationary electronic density built by the gate voltage, we show that one can implement a molecular decision tree that provides in parallel all the outputs for all the inputs of the multivariate, multivalued logic function. The outputs are encoded in the populations and in the bond orders of the dopant molecule, which can be measured using an STM tip. We show that the implementation of the molecular logic tree is equivalent to a spectral function decomposition. The function that is evaluated can be field-programmed by changing the time profile of the pulsed gate voltage. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Visualization of spatial-temporal data based on 3D virtual scene

    NASA Astrophysics Data System (ADS)

    Wang, Xianghong; Liu, Jiping; Wang, Yong; Bi, Junfang

    2009-10-01

    The main purpose of this paper is to realize three-dimensional dynamic visualization of spatial-temporal data in a three-dimensional virtual scene, using three-dimensional visualization technology combined with GIS, so that people's abilities to cognize time and space are enhanced through the design of dynamic symbols and interactive expression. Using particle systems, three-dimensional simulation, virtual reality and other visual means, we can simulate the situations produced by changes in the spatial location and property information of geographical entities over time, explore and analyze their movement and transformation rules through interaction, and also replay history and forecast the future. In this paper, the main research objects are vehicle tracks, typhoon paths and their spatial-temporal data; through three-dimensional dynamic simulation of these tracks, we realize timely monitoring of their trends and replay of historical tracks. Visualization techniques for spatial-temporal data in a three-dimensional virtual scene provide an excellent cognitive instrument for spatial-temporal information: they not only show the changes and developments of a situation with added clarity, but can also be used for the prediction and deduction of future developments and changes.

  1. The seasonal-cycle climate model

    NASA Technical Reports Server (NTRS)

    Marx, L.; Randall, D. A.

    1981-01-01

    The seasonal cycle run, which will become the control run for comparison with runs utilizing codes and parameterizations developed by outside investigators, is discussed. The climate model currently exists in two parallel versions: one running on the Amdahl and the other running on the CYBER 203. These two versions are as nearly identical as machine capability and the requirement for high-speed performance allow. Developmental changes are made in the Amdahl/CMS version for ease of testing and rapidity of turnaround. The changes are subsequently incorporated into the CYBER 203 version using vectorization techniques where speed improvement can be realized. The 400 day seasonal cycle run serves as a control run for both medium and long range climate forecasts as well as sensitivity studies.

  2. Carbon source in the future chemical industries

    NASA Astrophysics Data System (ADS)

    Hofmann, Peter; Heinrich Krauch, Carl

    1982-11-01

    Rising crude oil prices favour the exploitation of hitherto unutilised energy carriers and the realisation of new technologies in all sectors where carbon is used. These changed economic constraints necessitate both savings in conventional petrochemistry and a change to oil-independent carbon sources in the chemical industry. While, in coal chemistry, the synthesis and process principles of petrochemistry — fragmentation of the raw material and subsequent buildup of molecular structures — can be maintained, the raw material structure largely remains unchanged in the chemistry of renewable raw materials. This lecture is to demonstrate the structural as well as the technological and energy criteria of the chemistry of alternative carbon sources, to forecast the chances of commercial realization and to discuss some promising fields of research and development.

  3. Environmental prediction, risk assessment and extreme events: adaptation strategies for the developing world

    PubMed Central

    Webster, Peter J.; Jian, Jun

    2011-01-01

    The uncertainty associated with predicting extreme weather events has serious implications for the developing world, owing to the greater societal vulnerability to such events. Continual exposure to unanticipated extreme events is a contributing factor for the descent into perpetual and structural rural poverty. We provide two examples of how probabilistic environmental prediction of extreme weather events can support dynamic adaptation. In the current climate era, we describe how short-term flood forecasts have been developed and implemented in Bangladesh. Forecasts of impending floods with horizons of 10 days are used to change agricultural practices and planning, store food and household items and evacuate those in peril. For the first time in Bangladesh, floods were anticipated in 2007 and 2008, with broad actions taking place in advance of the floods, grossing agricultural and household savings measured in units of annual income. We argue that probabilistic environmental forecasts disseminated to an informed user community can reduce poverty caused by exposure to unanticipated extreme events. Second, it is also realized that not all decisions in the future can be made at the village level and that grand plans for water resource management require extensive planning and funding. Based on imperfect models and scenarios of economic and population growth, we further suggest that flood frequency and intensity will increase in the Ganges, Brahmaputra and Yangtze catchments as greenhouse-gas concentrations increase. However, irrespective of the climate-change scenario chosen, the availability of fresh water in the latter half of the twenty-first century seems to be dominated by population increases that far outweigh climate-change effects. Paradoxically, fresh water availability may become more critical if there is no climate change. PMID:22042897

  4. Forecasting the Value of Podiatric Medical Care in Newly Insured Diabetic Patients During Implementation of the Affordable Care Act in California.

    PubMed

    Labovitz, Jonathan M; Kominski, Gerald F

    2016-05-01

    Because value-based care is critical to the Affordable Care Act success, we forecasted inpatient costs and the potential impact of podiatric medical care on savings in the diabetic population through improved care quality and decreased resource use during implementation of the health reform initiatives in California. We forecasted enrollment of diabetic adults into Medicaid and subsidized health benefit exchange programs using the California Simulation of Insurance Markets (CalSIM) base model. Amputations and admissions per 1,000 diabetic patients and inpatient costs were based on the California Office of Statewide Health Planning and Development 2009-2011 inpatient discharge files. We evaluated cost in three categories: uncomplicated admissions, amputations during admissions, and discharges to a skilled nursing facility. Total costs and projected savings were calculated by applying the metrics and cost to the projected enrollment. Diabetic patients accounted for 6.6% of those newly eligible for Medicaid or health benefit exchange subsidies, with a 60.8% take-up rate. We project costs to be $24.2 million in the diabetic take-up population from 2014 to 2019. Inpatient costs were 94.3% higher when amputations occurred during the admission and 46.7% higher when discharged to a skilled nursing facility. Meanwhile, 61.0% of costs were attributed to uncomplicated admissions. Podiatric medical services saved 4.1% with a 10% reduction in admissions and amputations and an additional 1% for every 10% improvement in access to podiatric medical care. When implementing the Affordable Care Act, inclusion of podiatric medical services on multidisciplinary teams and in chronic-care models featuring prevention helps shift care to ambulatory settings to realize the greatest cost savings.

  5. Impact of Three-Phase Relative Permeability and Hysteresis Models on Forecasts of Storage Associated With CO2-EOR

    NASA Astrophysics Data System (ADS)

    Jia, Wei; McPherson, Brian; Pan, Feng; Dai, Zhenxue; Moodie, Nathan; Xiao, Ting

    2018-02-01

    Geological CO2 sequestration in conjunction with enhanced oil recovery (CO2-EOR) includes complex multiphase flow processes compared to CO2 storage in deep saline aquifers. Two of the most important factors affecting multiphase flow in CO2-EOR are three-phase relative permeability and associated hysteresis, both of which are difficult to measure and are usually represented by numerical interpolation models. The purpose of this study is to improve understanding of (1) the relative impacts of different three-phase relative permeability models and hysteresis models on CO2 trapping mechanisms, and (2) uncertainty associated with these two factors. Four different three-phase relative permeability models and three hysteresis models were applied to simulations of an active CO2-EOR site, the SACROC unit located in western Texas. To eliminate possible bias of deterministic parameters, we utilized a sequential Gaussian simulation technique to generate 50 realizations to describe heterogeneity of porosity and permeability, based on data obtained from well logs and seismic survey. Simulation results of forecasted CO2 storage suggested that (1) the choice of three-phase relative permeability model and hysteresis model led to noticeable impacts on forecasted CO2 sequestration capacity; (2) impacts of three-phase relative permeability models and hysteresis models on CO2 trapping are small during the CO2-EOR injection period, and increase during the post-EOR CO2 injection period; (3) the specific choice of hysteresis model is more important relative to the choice of three-phase relative permeability model; and (4) using the recommended three-phase WAG (Water-Alternating-Gas) hysteresis model may increase the impact of three-phase relative permeability models and uncertainty due to heterogeneity.
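    As a simplified stand-in for the sequential Gaussian simulation used to build the 50 realizations, the sketch below draws unconditional correlated realizations of log-permeability on a small 1-D grid from an exponential covariance model. The correlation length, variance and grid are assumptions; the study's realizations are 3-D and conditioned on well-log and seismic data.

```python
# Stand-in for sequential Gaussian simulation: draw correlated realizations
# of log-permeability on a small grid from an exponential covariance model
# via Cholesky factorization. Unconditional and 1-D for brevity; the study
# conditions 3-D realizations on well-log and seismic data.
import numpy as np

n_cells, n_real = 100, 50
dx, corr_len = 10.0, 150.0                 # cell size and correlation length [m] (assumed)
mean_logk, sill = -13.0, 1.0               # log10-permeability mean and variance (assumed)

x = np.arange(n_cells) * dx
cov = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_cells))

rng = np.random.default_rng(2018)
realizations = mean_logk + L @ rng.standard_normal((n_cells, n_real))
print("ensemble std of log10(k) per cell (first 5):",
      np.round(realizations.std(axis=1)[:5], 2))
```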

  6. Who needs budgets?

    PubMed

    Hope, Jeremy; Fraser, Robin

    2003-02-01

    Budgeting, as most corporations practice it, should be abolished. That may sound radical, but doing so would further companies' long-running efforts to transform themselves into developed networks that can nimbly adjust to market conditions. Most other building blocks are in place, but companies continue to restrict themselves by relying on inflexible budget processes and the command-and-control culture that budgeting entails. A number of companies have rejected the foregone conclusions embedded in budgets, and they've given up the self-interested wrangling over what the data indicate. In the absence of budgets, alternative goals and measures--some financial, such as cost-to-income ratios, and some nonfinancial, such as time to market-move to the foreground. Companies that have rejected budgets require employees to measure themselves against the performance of competitors and against internal peer groups. Because employees don't know whether they've succeeded until they can look back on the results of a given period, they must use every ounce of energy to ensure that they beat the competition. A key feature of many companies that have rejected budgets is the use of rolling forecasts, which are created every few months and typically cover five to eight quarters. Because the forecasts are regularly revised, they allow companies to continuously adapt to market conditions. The forecasting practices of two such companies, both based in Sweden, are examined in detail: the bank Svenska Handelsbanken and the wholesaler Ahlsell. Though the first companies to reject budgets were located in Northern Europe, organizations that have gone beyond budgeting can be found in a range of countries and industries. Their practices allow them to unleash the power of today's management tools and realize the potential of a fully decentralized organization.

  7. Multivariate Drought Characterization in India for Monitoring and Prediction

    NASA Astrophysics Data System (ADS)

    Sreekumaran Unnithan, P.; Mondal, A.

    2016-12-01

    Droughts are one of the most important natural hazards that affect the society significantly in terms of mortality and productivity. The metric that is most widely used by the India Meteorological Department (IMD) to monitor and predict the occurrence, spread, intensification and termination of drought is based on the univariate Standardized Precipitation Index (SPI). However, droughts may be caused by the influence and interaction of many variables (such as precipitation, soil moisture, runoff, etc.), emphasizing the need for a multivariate approach for drought characterization. This study advocates and illustrates use of the recently proposed multivariate standardized drought index (MSDI) in monitoring and prediction of drought and assessing its concerned risk in the Indian region. MSDI combines information from multiple sources: precipitation and soil moisture, and has been deemed to be a more reliable drought index. All-India monthly rainfall and soil moisture data sets are analysed for the period 1980 to 2014 to characterize historical droughts using both the univariate indices, the precipitation-based SPI and the standardized soil moisture index (SSI), as well as the multivariate MSDI using parametric and non-parametric approaches. We confirm that MSDI can capture droughts of 1986 and 1990 that aren't detected by using SPI alone. Moreover, in 1987, MSDI indicated a higher severity of drought when a deficiency in both soil moisture and precipitation was encountered. Further, this study also explores the use of MSDI for drought forecasts and assesses its performance vis-à-vis existing predictions from the IMD. Future research efforts will be directed towards formulating a more robust standardized drought indicator that can take into account socio-economic aspects that also play a key role for water-stressed regions such as India.
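    A simplified nonparametric MSDI can be sketched by ranking the joint exceedance of precipitation and soil moisture (Gringorten plotting position) and mapping it through the inverse normal CDF. The monthly series below are synthetic and the implementation is a bare-bones illustration, not the operational index.

```python
# Simplified nonparametric MSDI: compute the joint empirical probability
# that precipitation and soil moisture are both at or below their observed
# values (Gringorten plotting position) and map it through the inverse
# normal CDF. Synthetic monthly values stand in for the IMD data.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n = 420                                    # e.g. 35 years of monthly values
precip = rng.gamma(2.0, 50.0, n)
soil = 0.6 * precip + rng.normal(0, 20, n) # soil moisture loosely tied to rainfall

# Joint empirical (Gringorten) probability for each month.
joint_count = np.array([np.sum((precip <= p) & (soil <= s))
                        for p, s in zip(precip, soil)])
prob = (joint_count - 0.44) / (n + 0.12)
msdi = norm.ppf(prob)

print("most severe joint drought at month index", int(np.argmin(msdi)),
      "with MSDI =", round(float(np.min(msdi)), 2))
```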

  8. On the Log-Normality of Historical Magnetic-Storm Intensity Statistics: Implications for Extreme-Event Probabilities

    NASA Astrophysics Data System (ADS)

    Love, J. J.; Rigler, E. J.; Pulkkinen, A. A.; Riley, P.

    2015-12-01

    An examination is made of the hypothesis that the statistics of magnetic-storm-maximum intensities are the realization of a log-normal stochastic process. Weighted least-squares and maximum-likelihood methods are used to fit log-normal functions to -Dst storm-time maxima for the years 1957-2012; bootstrap analysis is used to establish confidence limits on the forecasts. Both methods provide fits that are reasonably consistent with the data; both methods also provide fits that are superior to those that can be made with a power-law function. In general, the maximum-likelihood method provides forecasts having tighter confidence intervals than those provided by weighted least-squares. From extrapolation of maximum-likelihood fits: a magnetic storm with intensity exceeding that of the 1859 Carrington event, -Dst > 850 nT, occurs about 1.13 times per century, with a wide 95% confidence interval of [0.42, 2.41] times per century; a 100-yr magnetic storm is identified as having -Dst > 880 nT (greater than Carrington) but a wide 95% confidence interval of [490, 1187] nT. This work is partially motivated by United States National Science and Technology Council and Committee on Space Research and International Living with a Star priorities and strategic plans for the assessment and mitigation of space-weather hazards.
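    The maximum-likelihood step can be sketched with scipy, fitting a log-normal to storm-maximum -Dst values and converting the tail probability beyond 850 nT into an expected number of events per century. The sample below is synthetic, so the numbers will not reproduce the published rates or confidence intervals.

```python
# Sketch of the maximum-likelihood step: fit a log-normal to storm-maximum
# -Dst values and convert the tail probability above 850 nT into an
# expected number of Carrington-class storms per century. The synthetic
# sample below does not reproduce the published 1957-2012 statistics.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1859)
dst_maxima = rng.lognormal(mean=np.log(120), sigma=0.8, size=300)  # synthetic -Dst maxima [nT]

shape, loc, scale = stats.lognorm.fit(dst_maxima, floc=0)          # ML fit, location fixed at 0
p_exceed = stats.lognorm.sf(850, shape, loc=loc, scale=scale)

storms_per_year = dst_maxima.size / 56.0        # events per year over a 56-yr record (assumed)
print("expected Carrington-class storms per century:",
      round(100 * storms_per_year * p_exceed, 2))
```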

  9. Extinction debt from climate change for frogs in the wet tropics.

    PubMed

    Fordham, Damien A; Brook, Barry W; Hoskin, Conrad J; Pressey, Robert L; VanDerWal, Jeremy; Williams, Stephen E

    2016-10-01

    The effect of twenty-first-century climate change on biodiversity is commonly forecast based on modelled shifts in species ranges, linked to habitat suitability. These projections have been coupled with species-area relationships (SAR) to infer extinction rates indirectly as a result of the loss of climatically suitable areas and associated habitat. This approach does not model population dynamics explicitly, and so accepts that extinctions might occur after substantial (but unknown) delays-an extinction debt. Here we explicitly couple bioclimatic envelope models of climate and habitat suitability with generic life-history models for 24 species of frogs found in the Australian Wet Tropics (AWT). We show that (i) as many as four species of frogs face imminent extinction by 2080, due primarily to climate change; (ii) three frogs face delayed extinctions; and (iii) this extinction debt will take at least a century to be realized in full. Furthermore, we find congruence between forecast rates of extinction using SARs, and demographic models with an extinction lag of 120 years. We conclude that SAR approaches can provide useful advice to conservation on climate change impacts, provided there is a good understanding of the time lags over which delayed extinctions are likely to occur. © 2016 The Author(s).

  10. On the use of multi-agent systems for the monitoring of industrial systems

    NASA Astrophysics Data System (ADS)

    Rezki, Nafissa; Kazar, Okba; Mouss, Leila Hayet; Kahloul, Laid; Rezki, Djamil

    2016-03-01

    The objective of the current paper is to present an intelligent system for complex process monitoring, based on artificial intelligence technologies. This system aims to carry out all the complex process monitoring tasks: detection, diagnosis, identification and reconfiguration. For this purpose, the development of a multi-agent system that combines multiple intelligent techniques, such as multivariate control charts, neural networks, Bayesian networks and expert systems, has become a necessity. The proposed system is evaluated on the monitoring of the complex Tennessee Eastman process.

  11. Methodological challenges to multivariate syndromic surveillance: a case study using Swiss animal health data.

    PubMed

    Vial, Flavie; Wei, Wei; Held, Leonhard

    2016-12-20

    In an era of ubiquitous electronic collection of animal health data, multivariate surveillance systems (which concurrently monitor several data streams) should have a greater probability of detecting disease events than univariate systems. However, despite their limitations, univariate aberration detection algorithms are used in most active syndromic surveillance (SyS) systems because of their ease of application and interpretation. On the other hand, a stochastic modelling-based approach to multivariate surveillance offers more flexibility, allowing for the retention of historical outbreaks, for overdispersion and for non-stationarity. While such methods are not new, they have yet to be applied to animal health surveillance data. We applied an example of such a stochastic model, Held and colleagues' two-component model, to two multivariate animal health datasets from Switzerland. In our first application, multivariate time series of the number of laboratory test requests were derived from Swiss animal diagnostic laboratories. We compared the performance of the two-component model to parallel monitoring using an improved Farrington algorithm and found that both methods yield a satisfactorily low false alarm rate. In addition, the calibration test of the two-component model on the one-step-ahead predictions proved satisfactory, making such an approach suitable for outbreak prediction. In our second application, the two-component model was applied to the multivariate time series of the number of cattle abortions and the number of test requests for bovine viral diarrhea (a disease that often results in abortions). We found a two-day lagged effect from the number of abortions to the number of test requests. We further compared joint and univariate modelling of the laboratory test request time series; the joint modelling approach showed evidence of superiority in terms of forecasting ability. Stochastic modelling approaches offer the potential to address more realistic surveillance scenarios through, for example, the inclusion of time-series-specific parameters or of covariates known to have an impact on syndrome counts. Nevertheless, many methodological challenges to multivariate surveillance of animal SyS data remain. Deciding on the amount of corroboration among data streams that is required to escalate into an alert is not a trivial task given the sparse data on the events under consideration (e.g. disease outbreaks).

  12. Utility and Value of Satellite-Based Frost Forecasting for Kenya's Tea Farming Sector

    NASA Astrophysics Data System (ADS)

    Morrison, I.

    2016-12-01

    Frost damage regularly inflicts millions of dollars of crop losses in the tea-growing highlands of western Kenya, a problem that the USAID/NASA Regional Visualization and Monitoring System (SERVIR) program is working to mitigate through a frost monitoring and forecasting product that uses satellite-based temperature and soil moisture data to generate up to three days of advanced warning before frost events. This paper presents the findings of a value of information (VOI) study assessing the value of this product based on Kenyan tea farmers' experiences with frost and frost-damage mitigation. Value was calculated based on historic trends of frost frequency, severity, and extent; likelihood of warning receipt and response; and subsequent frost-related crop-loss aversion. Quantification of these factors was derived through inferential analysis of survey data from 400 tea-farming households across the tea-growing regions of Kericho and Nandi, supplemented with key informant interviews with decision-makers at large estate tea plantations, historical frost incident and crop-loss data from estate tea plantations and agricultural insurance companies, and publicly available demographic and economic data. At this time, the product provides a forecasting window of up to three days, and no other frost-prediction methods are used by the large or small-scale farmers of Kenya's tea sector. This represents a significant opportunity for preemptive loss-reduction via Earth observation data. However, the tea-growing community has only two realistic options for frost-damage mitigation: preemptive harvest of available tea leaves to minimize losses, or skiving (light pruning) to facilitate fast recovery from frost damage. Both options are labor-intensive and require a minimum of three days of warning to be viable. As a result, the frost forecasting system has a very narrow margin of usefulness, making its value highly dependent on rapid access to the warning messages and flexible access to harvesting labor for mitigation activities. These findings show that the Frost monitoring product has the potential for real monetary benefit to members of the frost-vulnerable tea growing community but realization of that value needs direct collaboration with the tea-farming community to ensure effective product utilization.

  13. The impact of covariance localization on the performance of an ocean EnKF system assimilating glider data in the Ligurian Sea

    NASA Astrophysics Data System (ADS)

    Falchetti, Silvia; Alvarez, Alberto

    2018-04-01

    Data assimilation through an ensemble Kalman filter (EnKF) is not exempt from deficiencies, including the generation of long-range unphysical correlations that degrade its performance. The covariance localization technique has been proposed and used in previous research to mitigate this effect. However, an evaluation of its performance is usually hindered by the sparseness and unsustained collection of independent observations. This article assesses the performance of an ocean prediction system composed of a multivariate EnKF coupled with a regional configuration of the Regional Ocean Model System (ROMS) with a covariance localization solution and data assimilation from an ocean glider that operated over a limited region of the Ligurian Sea. Simultaneous with the operation of the forecast system, a high-quality data set was repeatedly collected with a CTD sensor, i.e., every day during the period from 5 to 20 August 2013 (approximately 4 to 5 times the synoptic time scale of the area), located on board the NR/V Alliance for model validation. Comparisons between the validation data set and the forecasts provide evidence that the performance of the prediction system with covariance localization is superior to that observed using only EnKF assimilation without localization or using a free run ensemble. Furthermore, it is shown that covariance localization also increases the robustness of the model to the location of the assimilated data. Our analysis reveals that improvements are detected with regard to not only preventing the occurrence of spurious correlations but also preserving the spatial coherence in the updated covariance matrix. Covariance localization has been shown to be relevant in operational frameworks where short-term forecasts (on the order of days) are required.
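    The localization idea itself is compact: taper the sample ensemble covariance element-wise with a distance-dependent correlation function so that spurious long-range covariances are damped. The sketch below uses a simple exponential taper as a stand-in for the Gaspari-Cohn function commonly used in EnKF systems, on a synthetic 1-D grid and ensemble.

```python
# Minimal illustration of covariance localization: the sample covariance
# from a small ensemble is tapered element-wise (Schur product) with a
# distance-based correlation function, suppressing spurious long-range
# covariances. A simple exponential taper stands in for the Gaspari-Cohn
# function typically used in EnKF systems; grid and ensemble are synthetic.
import numpy as np

n_grid, n_ens = 60, 20
x = np.linspace(0, 600, n_grid)                      # grid coordinate [km]
rng = np.random.default_rng(13)

# Synthetic ensemble with a true decorrelation length of ~100 km.
true_cov = np.exp(-np.abs(x[:, None] - x[None, :]) / 100.0)
ens = rng.multivariate_normal(np.zeros(n_grid), true_cov, size=n_ens)

sample_cov = np.cov(ens, rowvar=False)               # noisy with only 20 members

loc_radius = 200.0                                   # localization length scale (assumed)
taper = np.exp(-np.abs(x[:, None] - x[None, :]) / loc_radius)
localized_cov = sample_cov * taper                   # Schur (element-wise) product

far = np.abs(x[:, None] - x[None, :]) > 400
print("mean |cov| at separations >400 km, raw vs localized:",
      round(np.abs(sample_cov[far]).mean(), 3),
      round(np.abs(localized_cov[far]).mean(), 3))
```

    In a full EnKF the same taper is also applied to the cross-covariances between the state and the observed variables before the Kalman gain is computed.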

  14. The CONCEPTS Global Ice-Ocean Prediction System: Establishing an Environmental Prediction Capability in Canada

    NASA Astrophysics Data System (ADS)

    Pellerin, Pierre; Smith, Gregory; Testut, Charles-Emmanuel; Surcel Colan, Dorina; Roy, Francois; Reszka, Mateusz; Dupont, Frederic; Lemieux, Jean-Francois; Beaudoin, Christiane; He, Zhongjie; Belanger, Jean-Marc; Deacu, Daniel; Lu, Yimin; Buehner, Mark; Davidson, Fraser; Ritchie, Harold; Lu, Youyu; Drevillon, Marie; Tranchant, Benoit; Garric, Gilles

    2015-04-01

    Here we describe a new system implemented recently at the Canadian Meteorological Centre (CMC) entitled the Global Ice Ocean Prediction System (GIOPS). GIOPS provides ice and ocean analyses and 10 day forecasts daily at 00 GMT on a global 1/4° resolution grid. GIOPS includes a full multivariate ocean data assimilation system that combines satellite observations of sea level anomaly and sea surface temperature (SST) together with in situ observations of temperature and salinity. In situ observations are obtained from a variety of sources including the Argo network of autonomous profiling floats, moorings, ships of opportunity, marine mammals and research cruises. Ocean analyses are blended with sea ice analyses produced by the Global Ice Analysis System. GIOPS has been developed as part of the Canadian Operational Network of Coupled Environmental PredicTion Systems (CONCEPTS), a tri-departmental initiative between Environment Canada, Fisheries and Oceans Canada and National Defence. The development of GIOPS was carried out in partnership with Mercator-Océan, a French operational oceanography group, which provided the ocean data assimilation code and assistance with the system implementation. GIOPS has undergone a rigorous evaluation of the analysis, trial and forecast fields, demonstrating its capacity to provide high-quality products in a robust and reliable framework. In particular, SST and ice concentration forecasts demonstrate a clear benefit with respect to persistence. These results support the use of GIOPS products within other CMC operational systems and, more generally, as part of a Government of Canada marine core service. The impact of a two-way coupling between the GEM atmospheric model and the NEMO-CICE ocean-ice model will also be presented.

  15. 3D Exploration of Meteorological Data: Facing the challenges of operational forecasters

    NASA Astrophysics Data System (ADS)

    Koutek, Michal; Debie, Frans; van der Neut, Ian

    2016-04-01

    In the past years the Royal Netherlands Meteorological Institute (KNMI) has been working on innovation in the field of meteorological data visualization. We are dealing with Numerical Weather Prediction (NWP) model data and observational data, i.e. satellite images, precipitation radar, and ground and air-borne measurements. These multidimensional multivariate data are geo-referenced and can be combined in 3D space to provide more intuitive views of atmospheric phenomena. We developed the Weather3DeXplorer (W3DX), a visualization framework for processing, interactive exploration and visualization using Virtual Reality (VR) technology. We have had great success with research studies of extreme weather situations. In this paper we will elaborate on what we have learned from the application of interactive 3D visualization in the operational weather room. We will explain how important it is to control the degrees of freedom given to the users (forecasters and scientists) during interaction; 3D camera and 3D slicing-plane navigation appear to be rather difficult for users when not implemented properly. We will present a novel approach to operational 3D visualization user interfaces (UI) that to a great extent eliminates the obstacle of, and the time it usually takes for, setting up the visualization parameters and an appropriate camera view on a certain atmospheric phenomenon. We found our inspiration in the way our operational forecasters work in the weather room and decided to form a bridge between 2D visualization images and interactive 3D exploration. Our method combines WEB-based 2D UIs and a pre-rendered 3D visualization catalog for the latest NWP model runs with immediate entry into an interactive 3D session for a selected visualization setting. Finally, we would like to present the first user experiences with this approach.

  16. Comparative evaluation of spectroscopic models using different multivariate statistical tools in a multicancer scenario

    NASA Astrophysics Data System (ADS)

    Ghanate, A. D.; Kothiwale, S.; Singh, S. P.; Bertrand, Dominique; Krishna, C. Murali

    2011-02-01

    Cancer is now recognized as one of the major causes of morbidity and mortality. Histopathological diagnosis, the gold standard, is shown to be subjective, time consuming, prone to interobserver disagreement, and often fails to predict prognosis. Optical spectroscopic methods are being contemplated as adjuncts or alternatives to conventional cancer diagnostics. The most important aspect of these approaches is their objectivity, and multivariate statistical tools play a major role in realizing it. However, rigorous evaluation of the robustness of spectral models is a prerequisite. The utility of Raman spectroscopy in the diagnosis of cancers has been well established. Until now, the specificity and applicability of spectral models have been evaluated for specific cancer types. In this study, we have evaluated the utility of spectroscopic models representing normal and malignant tissues of the breast, cervix, colon, larynx, and oral cavity in a broader perspective, using different multivariate tests. The limit test, which was used in our earlier study, gave high sensitivity but suffered from poor specificity. The performance of other methods such as factorial discriminant analysis and partial least square discriminant analysis are at par with more complex nonlinear methods such as decision trees, but they provide very little information about the classification model. This comparative study thus demonstrates not just the efficacy of Raman spectroscopic models but also the applicability and limitations of different multivariate tools for discrimination under complex conditions such as the multicancer scenario.

  17. Prediction of the Main Engine Power of a New Container Ship at the Preliminary Design Stage

    NASA Astrophysics Data System (ADS)

    Cepowski, Tomasz

    2017-06-01

    The paper presents mathematical relationships that allow us to estimate the main engine power of new container ships, based on data concerning vessels built in 2005-2015. The presented approximations allow us to estimate the engine power from the length between perpendiculars and the number of containers the ship will carry. The approximations were developed using simple linear regression and multivariate linear regression analysis. The presented relations have practical application for estimating the container ship engine power needed in preliminary parametric design of the ship. It follows from the analysis that the use of multiple linear regression to predict the main engine power of a container ship yields more accurate results than simple linear regression.
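
    To illustrate the comparison described above, the following sketch fits both a simple and a multivariate linear regression on a hypothetical set of container-ship records; the lengths, TEU capacities and engine powers below are invented for illustration and are not taken from the paper's 2005-2015 dataset.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      # Hypothetical records: [length between perpendiculars (m), capacity (TEU)] -> engine power (kW)
      X = np.array([[210.0, 2800], [260.0, 4800], [300.0, 8500], [335.0, 11000], [366.0, 14000]])
      y = np.array([21000.0, 36000.0, 52000.0, 62000.0, 72000.0])

      simple = LinearRegression().fit(X[:, [1]], y)     # simple regression: TEU capacity only
      multi = LinearRegression().fit(X, y)              # multivariate regression: Lpp and TEU capacity

      new_ship = np.array([[320.0, 10000]])
      print(simple.predict(new_ship[:, [1]]), multi.predict(new_ship))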

  18. Penalized spline estimation for functional coefficient regression models.

    PubMed

    Cao, Yanrong; Lin, Haiqun; Wu, Tracy Z; Yu, Yan

    2010-04-01

    The functional coefficient regression models assume that the regression coefficients vary with some "threshold" variable, providing appreciable flexibility in capturing the underlying dynamics in data and avoiding the so-called "curse of dimensionality" in multivariate nonparametric estimation. We first investigate the estimation, inference, and forecasting for the functional coefficient regression models with dependent observations via penalized splines. The P-spline approach, as a direct ridge regression shrinkage type global smoothing method, is computationally efficient and stable. With established fixed-knot asymptotics, inference is readily available. Exact inference can be obtained for fixed smoothing parameter λ, which is most appealing for finite samples. Our penalized spline approach gives an explicit model expression, which also enables multi-step-ahead forecasting via simulations. Furthermore, we examine different methods of choosing the important smoothing parameter λ: modified multi-fold cross-validation (MCV), generalized cross-validation (GCV), and an extension of empirical bias bandwidth selection (EBBS) to P-splines. In addition, we implement smoothing parameter selection using mixed model framework through restricted maximum likelihood (REML) for P-spline functional coefficient regression models with independent observations. The P-spline approach also easily allows different smoothness for different functional coefficients, which is enabled by assigning different penalty λ accordingly. We demonstrate the proposed approach by both simulation examples and a real data application.
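
    As a rough illustration of the penalized-spline idea (not the full functional coefficient regression model of the paper), the sketch below fits a univariate P-spline by ridge-penalizing the knot coefficients of a truncated power basis; the basis choice, knot placement and penalty weight are simplifying assumptions.

      import numpy as np

      def pspline_fit(x, y, n_knots=20, degree=3, lam=1.0):
          # Truncated power basis with a ridge penalty on the knot coefficients only;
          # a simplified stand-in for B-spline bases with difference penalties.
          knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
          B = np.column_stack([x ** d for d in range(degree + 1)] +
                              [np.clip(x - k, 0, None) ** degree for k in knots])
          D = np.diag([0.0] * (degree + 1) + [1.0] * n_knots)
          coef = np.linalg.solve(B.T @ B + lam * D, B.T @ y)
          return knots, coef

      def pspline_predict(x, knots, coef, degree=3):
          B = np.column_stack([x ** d for d in range(degree + 1)] +
                              [np.clip(x - k, 0, None) ** degree for k in knots])
          return B @ coef

      # Toy example: smooth a noisy curve and predict on the same grid
      rng = np.random.default_rng(0)
      x = np.sort(rng.uniform(0, 1, 300))
      y = np.sin(2 * np.pi * x) + rng.normal(scale=0.2, size=x.size)
      knots, coef = pspline_fit(x, y, lam=0.1)
      yhat = pspline_predict(x, knots, coef)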

  19. Mid-Term Probabilistic Forecast of Oil Spill Trajectories

    NASA Astrophysics Data System (ADS)

    Castanedo, S.; Abascal, A. J.; Cardenas, M.; Medina, R.; Guanche, Y.; Mendez, F. J.; Camus, P.

    2012-12-01

    There is increasing concern about the threat posed by oil spills to the coastal environment. This is reflected in the promulgation of various national and international standards, among which are those that require companies whose activities involve oil spill risk to have oil pollution emergency plans or similar arrangements for responding promptly and effectively to oil pollution incidents. Operational oceanography systems (OOS) that provide decision makers with oil spill trajectory forecasts have demonstrated their usefulness in recent accidents (Castanedo et al., 2006). In recent years, many national and regional OOS have been set up focusing on short-term oil spill forecasts (up to 5 days). However, recent accidental marine oil spills (Prestige in Spain, Deepwater Horizon in the Gulf of Mexico) have revealed the importance of having larger prediction horizons (up to 15 days) in regional-scale areas. In this work, we have developed a methodology to provide probabilistic oil spill forecasts based on numerical modelling and statistical methods. The main components of this approach are: (1) use of high-resolution, long-term (1948-2009) historical hourly databases of wind, wind-induced currents and astronomical tide currents obtained using state-of-the-art numerical models; (2) classification of representative wind field patterns (n=100) using clustering techniques based on PCA and K-means algorithms (Camus et al., 2011); (3) determination of the cluster occurrence probabilities and the stochastic matrix (transition probability or Markov matrix), p_ij, the probability of moving from cluster "i" to cluster "j" in one time step; (4) the initial state for mid-term simulations is obtained from the available wind forecast using a nearest-neighbor analog method; (5) 15-day stochastic Markov chain simulations (m=1000) are launched; (6) the corresponding oil spill trajectories are computed with the TESEO Lagrangian transport model (Abascal et al., 2009); (7) probability maps are delivered using a user-friendly Web App. The application of the method to the Bay of Biscay (northern Spain) demonstrates the capability of this approach. References Abascal, A.J., Castanedo, S., Mendez, F.J., Medina, R., Losada, I.J., 2009. Calibration of a Lagrangian transport model using drifting buoys deployed during the Prestige oil spill. J. Coast. Res. 25 (1), 80-90. Camus, P., Méndez, F.J., Medina, R., 2011. Analysis of clustering and selection algorithms for the study of multivariate wave climate. Coastal Engineering, doi:10.1016/j.coastaleng.2011.02.003. Castanedo, S., Medina, R., Losada, I.J., Vidal, C., Méndez, F.J., Osorio, A., Juanes, J.A., Puente, A., 2006. The Prestige oil spill in Cantabria (Bay of Biscay). Part I: operational forecasting system for quick response, risk assessment and protection of natural resources. J. Coast. Res. 22 (6), 1474-1489.
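
    A minimal sketch of steps (2), (3) and (5) is given below, assuming hypothetical wind-field feature vectors: K-means provides the cluster patterns, transition counts give the Markov matrix p_ij, and member trajectories of cluster states are then simulated. The PCA step, the analog initialization and the TESEO transport model are omitted.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      # Hypothetical stand-in: 5000 hourly wind fields flattened to 50-dimensional feature vectors
      wind = rng.normal(size=(5000, 50))

      n_clusters = 100                               # number of representative wind patterns
      labels = KMeans(n_clusters=n_clusters, n_init=4, random_state=0).fit_predict(wind)

      # Markov (stochastic) matrix p_ij: probability of moving from cluster i to cluster j
      counts = np.zeros((n_clusters, n_clusters))
      for i, j in zip(labels[:-1], labels[1:]):
          counts[i, j] += 1
      row_sums = counts.sum(axis=1, keepdims=True)
      P = np.divide(counts, row_sums, out=np.full_like(counts, 1.0 / n_clusters), where=row_sums > 0)

      def simulate(start_state, n_steps=15, members=1000):
          # Member trajectories of cluster states over the 15-day horizon
          paths = np.empty((members, n_steps), dtype=int)
          paths[:, 0] = start_state
          for t in range(1, n_steps):
              for m in range(members):
                  paths[m, t] = rng.choice(n_clusters, p=P[paths[m, t - 1]])
          return paths

      cluster_paths = simulate(start_state=labels[-1])   # these states would drive the transport model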

  20. A constrained multinomial Probit route choice model in the metro network: Formulation, estimation and application

    PubMed Central

    Zhang, Yongsheng; Wei, Heng; Zheng, Kangning

    2017-01-01

    Considering that metro network expansion provides more alternative routes, it is attractive to integrate into route choice modeling the impacts of the route set and of the interdependency among alternative routes on route choice probability. Therefore, the formulation, estimation and application of a constrained multinomial probit (CMNP) route choice model in the metro network are carried out in this paper. The utility function is formulated as three components: the compensatory component is a function of influencing factors; the non-compensatory component measures the impact of the route set on utility; and the error component follows a multivariate normal distribution whose covariance is structured into three parts, representing the correlation among routes, the transfer variance of a route, and the unobserved variance, respectively. Because the model involves multidimensional integrals of the multivariate normal probability density function, the CMNP model is rewritten in a hierarchical Bayes form and a Markov chain Monte Carlo approach based on Metropolis-Hastings sampling is constructed to estimate all parameters. Based on Guangzhou Metro data, reliable estimation results are obtained. Furthermore, the proposed CMNP model also shows good forecasting performance for route choice probability calculation and good application performance for transfer flow volume prediction. PMID:28591188

  1. Risk assessment and stock market volatility in the Eurozone: 1986-2014

    NASA Astrophysics Data System (ADS)

    Menezes, Rui; Oliveira, Álvaro

    2015-04-01

    This paper studies the stock market return's volatility in the Eurozone as an input for evaluating the market risk. Stock market returns are endogenously determined by long-term interest rate changes and so is the return's conditional variance. The conditional variance is the time-dependent variance of the underlying variable. In other words, it is the variance of the returns measured at each moment t, so it changes through time depending on the specific market structure at each time observation. Thus, a multivariate EGARCH model is proposed to capture the complex nature of this network. By network, in this context, we mean the chain of stock exchanges that co-move and interact in such a way that a shock in one of them propagates up to the other ones (contagion). Previous studies provide evidence that the Eurozone stock exchanges are deeply integrated. The results indicate that asymmetry and leverage effects exist along with fat tails and endogeneity. In-sample and out-of-sample forecasting tests provide clear evidence that the multivariate EGARCH model performs better than the univariate counterpart to predict the behavior of returns both before and after the 2008 crisis.
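
    For readers who want to reproduce the univariate benchmark, the sketch below fits a univariate EGARCH(1,1) with an asymmetry (leverage) term using the `arch` Python package on simulated returns; the paper's multivariate EGARCH with interest-rate effects is not available in that package and is not reproduced here.

      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(2)
      returns = rng.standard_t(df=5, size=2000)      # hypothetical daily returns (percent)

      # AR(1) mean with EGARCH(1,1) volatility, asymmetry term, and Student-t errors (fat tails)
      am = arch_model(returns, mean='AR', lags=1, vol='EGARCH', p=1, o=1, q=1, dist='t')
      res = am.fit(disp='off')
      print(res.params)                              # the gamma[1] term captures the asymmetry effect

      # Multi-step EGARCH variance forecasts require simulation in the arch package
      fcast = res.forecast(horizon=5, method='simulation', simulations=500)
      print(fcast.variance.iloc[-1])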

  2. Study of cyanotoxins presence from experimental cyanobacteria concentrations using a new data mining methodology based on multivariate adaptive regression splines in Trasona reservoir (Northern Spain).

    PubMed

    Garcia Nieto, P J; Sánchez Lasheras, F; de Cos Juez, F J; Alonso Fernández, J R

    2011-11-15

    There is an increasing need to describe cyanobacteria blooms since some cyanobacteria produce toxins, termed cyanotoxins. The latter can be dangerous to humans as well as to other animals and to life in general. Cyanobacteria reproduce explosively under certain conditions, resulting in algal blooms that can become harmful to other species if the cyanobacteria involved produce cyanotoxins. In this research work, the evolution of cyanotoxins in Trasona reservoir (Principality of Asturias, Northern Spain) was studied successfully using a data mining methodology based on the multivariate adaptive regression splines (MARS) technique. The results of the present study are two-fold: on the one hand, the importance of the different kinds of cyanobacteria for the presence of cyanotoxins in the reservoir is assessed through the MARS model; on the other hand, a predictive model able to forecast the possible presence of cyanotoxins in the short term was obtained. The agreement of the MARS model with the experimental data confirmed its good performance. Finally, the conclusions of this research are presented. Copyright © 2011 Elsevier B.V. All rights reserved.
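
    A hedged sketch of the MARS step is shown below, assuming the third-party `pyearth` package (sklearn-contrib-py-earth) and entirely hypothetical cyanobacteria and cyanotoxin series; it only illustrates how hinge-function models expose which predictors matter.

      import numpy as np
      from pyearth import Earth   # third-party MARS implementation (assumed available)

      rng = np.random.default_rng(3)
      # Hypothetical weekly samples: columns = concentrations of five cyanobacteria groups
      X = rng.lognormal(mean=2.0, sigma=1.0, size=(150, 5))
      # Hypothetical cyanotoxin concentration driven mainly by the first two groups
      y = 0.02 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(scale=0.5, size=150)

      mars = Earth(max_degree=2)        # allow interactions between hinge functions
      mars.fit(X, y)
      print(mars.summary())             # selected basis functions show which predictors matter
      y_hat = mars.predict(X[-10:])     # short-term prediction for the most recent weeks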

  3. Integrated GIS and multivariate statistical analysis for regional scale assessment of heavy metal soil contamination: A critical review.

    PubMed

    Hou, Deyi; O'Connor, David; Nathanail, Paul; Tian, Li; Ma, Yan

    2017-12-01

    Heavy metal soil contamination is associated with potential toxicity to humans or ecotoxicity. Scholars have increasingly used a combination of geographical information science (GIS) with geostatistical and multivariate statistical analysis techniques to examine the spatial distribution of heavy metals in soils at a regional scale. A review of such studies showed that most soil sampling programs were based on grid patterns and composite sampling methodologies. Many programs intended to characterize various soil types and land use types. The most often used sampling depth intervals were 0-0.10 m, or 0-0.20 m, below surface; and the sampling densities used ranged from 0.0004 to 6.1 samples per km², with a median of 0.4 samples per km². The most widely used spatial interpolators were inverse distance weighted interpolation and ordinary kriging; and the most often used multivariate statistical analysis techniques were principal component analysis and cluster analysis. The review also identified several determining and correlating factors in heavy metal distribution in soils, including soil type, soil pH, soil organic matter, land use type, Fe, Al, and heavy metal concentrations. The major natural and anthropogenic sources of heavy metals were found to derive from lithogenic origin, roadway and transportation, atmospheric deposition, wastewater and runoff from industrial and mining facilities, fertilizer application, livestock manure, and sewage sludge. This review argues that the full potential of integrated GIS and multivariate statistical analysis for assessing heavy metal distribution in soils on a regional scale has not yet been fully realized. It is proposed that future research be conducted to map multivariate results in GIS to pinpoint specific anthropogenic sources, to analyze temporal trends in addition to spatial patterns, to optimize modeling parameters, and to expand the use of different multivariate analysis tools beyond principal component analysis (PCA) and cluster analysis (CA). Copyright © 2017 Elsevier Ltd. All rights reserved.
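
    The typical PCA-plus-cluster-analysis workflow the review describes can be sketched as follows on hypothetical soil-sample concentrations; the sample coordinates and the GIS mapping step are omitted.

      import numpy as np
      import pandas as pd
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(4)
      # Hypothetical regional survey: one row per soil sample, heavy-metal concentrations (mg/kg)
      metals = pd.DataFrame(rng.lognormal(mean=1.0, sigma=0.8, size=(500, 6)),
                            columns=['Cd', 'Cr', 'Cu', 'Ni', 'Pb', 'Zn'])

      Z = StandardScaler().fit_transform(np.log(metals))    # log-transform, then standardize
      pca = PCA(n_components=2).fit(Z)
      scores = pca.transform(Z)
      print(pca.explained_variance_ratio_)                  # variance explained by PC1 and PC2

      clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
      # In practice, scores and cluster labels would be joined to coordinates and mapped in a GIS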

  4. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method that makes use of multivariate reference signals from the fused silica and sapphire Raman signals generated by a ball-lens fiber-optic Raman probe for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., the shoulder of the prominent fused silica boson peak (~130 cm⁻¹); distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm⁻¹)) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R² = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting the laser excitation power changes based on a leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R² = 0.985) can also be achieved for laser power prediction in real time when we apply the multivariate method independently on five new subjects (n = 166 spectra). We further apply the multivariate reference technique for quantitative analysis of gelatin tissue phantoms, which gives rise to an RMSEP of ~2.0% (R² = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct the variations of laser excitation power and fiber coupling efficiency in situ for standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in challenging Raman endoscopic applications.
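
    A simplified sketch of the multivariate calibration idea, assuming stand-in spectra and using scikit-learn's PLS regression with leave-one-subject-out cross-validation, is given below; the spectral preprocessing and the actual reference bands of the probe are not reproduced.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import LeaveOneGroupOut, cross_val_predict

      rng = np.random.default_rng(5)
      # Stand-in data: 250 spectra restricted to the reference regions (60 intensity channels)
      power = rng.uniform(5, 65, size=250)                   # laser excitation power (mW)
      subjects = rng.integers(0, 25, size=250)               # subject label for each spectrum
      spectra = np.outer(power, rng.uniform(0.5, 1.5, 60)) + rng.normal(scale=5, size=(250, 60))

      pls = PLSRegression(n_components=3)
      pred = cross_val_predict(pls, spectra, power, groups=subjects,
                               cv=LeaveOneGroupOut()).ravel()
      rmsecv = np.sqrt(np.mean((pred - power) ** 2))         # leave-one-subject-out RMSECV
      print(f"RMSECV = {rmsecv:.2f} mW")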

  5. Operational flood forecasting: further lessons learned from a recent inundation in Tuscany, Italy

    NASA Astrophysics Data System (ADS)

    Caparrini, F.; Castelli, F.; di Carlo, E.

    2010-09-01

    After a few years of experimental setup, model refinement and parameter calibration, a distributed flood forecasting system for the Tuscany region was promoted to operational use in early 2008. The hydrologic core of the system, MOBIDIC, is a fully distributed soil moisture accounting model, with sequential assimilation of hydrometric data. The model is forced by the real-time dense hydrometeorological network of the Regional Hydrologic Service as well as by the QPF products of a number of different limited area meteorological models (LAMI, WRF+ECMWF, WRF+GFS). Given the relatively short response time of the Tuscany basins, the river flow forecasts based on ground-measured precipitation are operationally used mainly as a monitoring tool, while the truly usable predictions are necessarily based on the QPF input. The first severe flooding event the system had to face occurred in late December 2009, when a failure of the right levee of the Serchio river caused an extensive inundation (on December 25th). In the days following the levee breach, intensive monitoring and forecasting were needed (another flood peak occurred on the night between December 29th and January 1st 2010) as a support for decisions regarding the management of the increased vulnerability of the area and the planning of emergency repair works at the river banks. The operational use of the system during such a complex event, when both the meteorological and the hydrological components may be said to have performed well from a strict modeling point of view, brought to attention a number of additional issues about the system as a whole. The main issues may be phrased in terms of additional system requirements, namely: the ranking of different QPF products in terms of some likelihood measure; the rapid redefinition of alarm thresholds due to sudden changes in the river flow capacity; the supervised prediction for evaluating the consequences of different management scenarios for reservoirs, regulated floodplains, levees, etc. In order to quantitatively address these issues, a multivariate sensitivity hindcast of the above event is presented here, where the variation of model predictions and the subsequent likely decision making are measured against QPF accuracy, other possible levee failures, and different reservoir releases.

  6. Remote sensing techniques for conservation and management of natural vegetation ecosystems

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Verdesio, J. J.; Dossantos, J. R.

    1981-01-01

    The importance of using remote sensing techniques, in the visible and near-infrared ranges, for mapping, inventory, conservation and management of natural ecosystems is discussed. Some examples carried out in Brazil and other countries are given to evaluate products at the orbital level (MSS and RBV imagery from LANDSAT) and at the aerial level (photography) for ecosystem studies. The maximum quantitative and qualitative information that can be obtained from each sensor, at different levels, is discussed. Based on the experiments carried out, it is concluded that remote sensing is a useful tool for mapping vegetation units, estimating biomass, forecasting and evaluating fire damage, detecting disease, mapping deforestation and detecting land-use change. In addition, remote sensing techniques can be used in controlling implantation and planning natural or artificial regeneration.

  7. A long-term forecast analysis on worldwide land uses.

    PubMed

    Zhang, Wenjun; Qi, Yanhong; Zhang, Zhiguo

    2006-08-01

    More and more lands worldwide are being cultivated for food production while forests are disappearing at an unprecedented rate. This paper aims to make a long-term forecast of land uses worldwide and provide the public, researchers, and government officials with a clear profile of land uses in the future. Data on land uses since 1961 were used to fit historical trajectories and make the forecast. The results show that trajectories of land areas can be well fitted with univariate linear regressions. The forecasts of land uses during the coming 25 years were given in detail. Areas of agricultural land, arable land, and permanent pasture land worldwide would increase by 6.6%, 7.2%, and 6.3% respectively in the year 2030 as compared to the current areas. Permanent crop land area all over the world is forecast to increase by 0.64% by 2030. By the year 2030 the areas of forests and woodland, and of nonarable and nonpermanent land, worldwide would decrease by 2.4% and 0.9% against the current areas. All other land area in the world would dramatically decline by 6.4% by the year 2030. Overall, the land area related to agriculture would tend to decrease in developed countries, industrialized countries, Europe, and North and Central America. The agriculture-related land area would considerably increase in developing countries, least developed countries, low-income countries, Asia, Africa, South America, etc. Developing countries hold a larger total land area than developed countries. Dramatic and continuous growth in the agricultural land area of developing countries would largely contribute to the expected growth of world agricultural land area in the coming years. Population explosion, food shortage and poverty in the world, especially in developing countries, together caused the excessive cultivation of land for agricultural uses in the past years. Increasing agricultural land area exacerbates climate change and the degradation of the environment. How to limit the growth of the human population is a key problem for reducing agricultural land expansion. Development and use of high-yielding and high-quality crop and animal varieties, diversification of human food sources, and technical and financial assistance to developing countries from developed countries should also be implemented and strengthened in the future in order to slow down or even reverse the increasing trend in agricultural land area. Sustainable agriculture is an effective way to stabilize the agricultural land area without food shortages. Through various techniques and measures, sustainable agriculture may meet the food production goals with minimum environmental risk. Public awareness of and interest in sustainable agriculture will help realize and ease the increasing stress from agricultural land expansion.
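
    The fitting-and-extrapolation step described above amounts to a univariate linear trend per land-use category; a minimal sketch on a hypothetical area series is shown below (the numbers are invented, only the procedure matches).

      import numpy as np

      rng = np.random.default_rng(6)
      # Hypothetical FAO-style series: world agricultural land area (million km^2), 1961-2005
      years = np.arange(1961, 2006)
      area = 44.0 + 0.05 * (years - 1961) + rng.normal(scale=0.1, size=years.size)

      slope, intercept = np.polyfit(years, area, deg=1)      # univariate linear trajectory
      area_2030 = slope * 2030 + intercept
      change_pct = 100.0 * (area_2030 - area[-1]) / area[-1]
      print(f"2030 forecast: {area_2030:.1f} million km^2 ({change_pct:+.1f}% vs. latest year)")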

  8. Equicontrollability and the model following problem

    NASA Technical Reports Server (NTRS)

    Curran, R. T.

    1971-01-01

    Equicontrollability and its application to the linear time-invariant model-following problem are discussed. The problem is presented in the form of two systems, the plant and the model. The requirement is to find a controller to apply to the plant so that the resultant compensated plant behaves, in an input-output sense, the same as the model. All systems are assumed to be linear and time-invariant. The basic approach is to find suitable equicontrollable realizations of the plant and model and to utilize feedback so as to produce a controller of minimal state dimension. The concept of equicontrollability is a generalization of control canonical (phase variable) form applied to multivariable systems. It allows one to visualize clearly the effects of feedback and to pinpoint the parameters of a multivariable system which are invariant under feedback. The basic contributions are the development of equicontrollable form; solution of the model-following problem in an entirely algorithmic way, suitable for computer programming; and resolution of questions on system decoupling.

  9. A bispectral q-hypergeometric basis for a class of quantum integrable models

    NASA Astrophysics Data System (ADS)

    Baseilhac, Pascal; Martin, Xavier

    2018-01-01

    For the class of quantum integrable models generated from the q-Onsager algebra, a basis of bispectral multivariable q-orthogonal polynomials is exhibited. In the first part, it is shown that the multivariable Askey-Wilson polynomials with N variables and N + 3 parameters introduced by Gasper and Rahman [Dev. Math. 13, 209 (2005)] generate a family of infinite dimensional modules for the q-Onsager algebra, whose fundamental generators are realized in terms of the multivariable q-difference and difference operators proposed by Iliev [Trans. Am. Math. Soc. 363, 1577 (2011)]. Raising and lowering operators extending those of Sahi [SIGMA 3, 002 (2007)] are also constructed. In the second part, finite dimensional modules are constructed and studied for a certain class of parameters and if the N variables belong to a discrete support. In this case, the bispectral property finds a natural interpretation within the framework of tridiagonal pairs. In the third part, eigenfunctions of the q-Dolan-Grady hierarchy are considered in the polynomial basis. In particular, invariant subspaces are identified for certain conditions generalizing Nepomechie's relations. In the fourth part, the analysis is extended to the special case q = 1. This framework provides a q-hypergeometric formulation of quantum integrable models such as the open XXZ spin chain with generic integrable boundary conditions (q ≠ 1).

  10. Nonlinear Decoupling Control With ANFIS-Based Unmodeled Dynamics Compensation for a Class of Complex Industrial Processes.

    PubMed

    Zhang, Yajun; Chai, Tianyou; Wang, Hong; Wang, Dianhui; Chen, Xinkai

    2018-06-01

    Complex industrial processes are multivariable and generally exhibit strong coupling among their control loops with heavy nonlinear nature. These make it very difficult to obtain an accurate model. As a result, the conventional and data-driven control methods are difficult to apply. Using a twin-tank level control system as an example, a novel multivariable decoupling control algorithm with adaptive neural-fuzzy inference system (ANFIS)-based unmodeled dynamics (UD) compensation is proposed in this paper for a class of complex industrial processes. At first, a nonlinear multivariable decoupling controller with UD compensation is introduced. Different from the existing methods, the decomposition estimation algorithm using ANFIS is employed to estimate the UD, and the desired estimating and decoupling control effects are achieved. Second, the proposed method does not require the complicated switching mechanism which has been commonly used in the literature. This significantly simplifies the obtained decoupling algorithm and its realization. Third, based on some new lemmas and theorems, the conditions on the stability and convergence of the closed-loop system are analyzed to show the uniform boundedness of all the variables. This is then followed by the summary on experimental tests on a heavily coupled nonlinear twin-tank system that demonstrates the effectiveness and the practicability of the proposed method.

  11. A simplified dynamic model of the T700 turboshaft engine

    NASA Technical Reports Server (NTRS)

    Duyar, Ahmet; Gu, Zhen; Litt, Jonathan S.

    1992-01-01

    A simplified open-loop dynamic model of the T700 turboshaft engine, valid within the normal operating range of the engine, is developed. This model is obtained by linking linear state space models obtained at different engine operating points. Each linear model is developed from a detailed nonlinear engine simulation using a multivariable system identification and realization method. The simplified model may be used with a model-based real time diagnostic scheme for fault detection and diagnostics, as well as for open loop engine dynamics studies and closed loop control analysis utilizing a user generated control law.

  12. Basic principles of Hasse diagram technique in chemistry.

    PubMed

    Brüggemann, Rainer; Voigt, Kristina

    2008-11-01

    Principles of partial order applied to ranking are explained. The Hasse diagram technique (HDT) is the application of partial order theory based on a data matrix. In this paper, HDT is introduced in a stepwise procedure, and some elementary theorems are exemplified. The focus is to show how the multivariate character of a data matrix is realized by HDT and in which cases one should apply other mathematical or statistical methods. Many simple examples illustrate the basic theoretical ideas. Finally, it is shown that HDT is a useful alternative for the evaluation of antifouling agents, which was originally performed by amoeba diagrams.
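
    The core of HDT, the component-wise (product) order on the rows of a data matrix and the cover relation that defines the edges of the Hasse diagram, can be sketched as follows on a toy data matrix.

      import numpy as np

      def dominates(a, b):
          # Product (component-wise) order: a is above b if a >= b in every indicator
          # and strictly greater in at least one.
          return bool(np.all(a >= b) and np.any(a > b))

      def cover_relation(X):
          # Edges of the Hasse diagram: i covers j when i is above j and no object lies between them
          n = len(X)
          above = [[dominates(X[i], X[j]) for j in range(n)] for i in range(n)]
          covers = []
          for i in range(n):
              for j in range(n):
                  if above[i][j] and not any(above[i][k] and above[k][j] for k in range(n)):
                      covers.append((i, j))
          return covers

      # Toy data matrix: 5 objects evaluated on 3 indicators
      X = np.array([[3, 2, 5],
                    [2, 2, 4],
                    [1, 1, 1],
                    [2, 3, 2],
                    [1, 2, 1]])
      print(cover_relation(X))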

  13. Debris flow early warning systems in Norway: organization and tools

    NASA Astrophysics Data System (ADS)

    Kleivane, I.; Colleuille, H.; Haugen, L. E.; Alve Glad, P.; Devoli, G.

    2012-04-01

    In Norway, shallow slides and debris flows occur as a result of a combination of high-intensity precipitation, snowmelt, high groundwater levels and saturated soil. Many events have occurred in the last decades and are often associated with flood events, especially in southern Norway, causing significant damage to roads, railway lines, buildings, and other infrastructure (e.g., November 2000; August 2003; September 2005; November 2005; May 2008; June and December 2011). Since 1989 the Norwegian Water Resources and Energy Directorate (NVE) has had an operational 24-hour flood forecasting system for the entire country. Since 2009 NVE has also been responsible for assisting regions and municipalities in the prevention of disasters posed by landslides and snow avalanches. Besides assisting the municipalities through the implementation of digital landslide inventories, susceptibility and hazard mapping, areal planning, preparation of guidelines, realization of mitigation measures and support during emergencies, NVE is developing a regional-scale debris flow warning system that uses hydrological models already available in the flood warning system. It is well known that the application of rainfall thresholds alone is not sufficient to evaluate the hazard of debris flows and shallow slides, and soil moisture conditions play a crucial role in the triggering conditions. Information on simulated soil and groundwater conditions and on water supply (rain and snowmelt) based on weather forecasts has proved to be a useful indicator of the potential occurrence of debris flows and shallow slides. Forecasts of runoff and freezing-thawing are also valuable information. The early warning system uses real-time measurements (discharge; groundwater level; soil water content and soil temperature; snow water equivalent; meteorological data) and model simulations (a spatially distributed version of the HBV model and an adapted version of the 1-D soil water and energy balance model COUP). The data are presented in a web- and GIS-based system with daily nationwide maps showing the meteorological and hydrological conditions for the present and the near future based on quantitative weather forecasts. In addition, a division of the country into homogeneous debris flow-prone regions is in progress, based on geomorphological and topographic parameters and on the distribution of loose Quaternary deposits. Threshold levels are being investigated using statistical analyses of historical debris flow events and measured hydro-meteorological parameters. The debris flow early warning system is currently being tested and is expected to be operational in 2013. Final products will be warning messages and a map showing the different hazard levels, from low to high, indicating the landslide probability and the type of expected damage in a certain area. Many activities are carried out in close collaboration with the road and railway authorities, the geological survey and private consulting companies.

  14. Realized niche shift associated with the Eurasian charophyte Nitellopsis obtusa becoming invasive in North America

    PubMed Central

    Escobar, Luis E.; Qiao, Huijie; Phelps, Nicholas B. D.; Wagner, Carli K.; Larkin, Daniel J.

    2016-01-01

    Nitellopsis obtusa (starry stonewort) is a dioecious green alga native to Europe and Asia that has emerged as an aquatic invasive species in North America. Nitellopsis obtusa is rare across large portions of its native range, but has spread rapidly in northern-tier lakes in the United States, where it can interfere with recreation and may displace native species. Little is known about the invasion ecology of N. obtusa, making it difficult to forecast future expansion. Using ecological niche modeling we investigated environmental variables associated with invasion risk. We used species records, climate data, and remotely sensed environmental variables to characterize the species’ multidimensional distribution. We found that N. obtusa is exploiting novel ecological niche space in its introduced range, which may help explain its invasiveness. While the fundamental niche of N. obtusa may be stable, there appears to have been a shift in its realized niche associated with invasion in North America. Large portions of the United States are predicted to constitute highly suitable habitat for N. obtusa. Our results can inform early detection and rapid response efforts targeting N. obtusa and provide testable estimates of the physiological tolerances of this species as a baseline for future empirical research. PMID:27363541

  15. Phase I of a National Phenological Assessment

    NASA Astrophysics Data System (ADS)

    Betancourt, J. L.; Henebry, G. M.

    2009-12-01

    Phenology is the gateway to climatic effects on both managed and unmanaged ecosystems. Adaptation to climatic variability and change will require integration of phenological data and models with climatic forecasts at seasonal to decadal timescales. We propose a scoping study to identify, formulate, and refine approaches to the first National Phenological Assessment (NPA) for the U.S. The NPA should be viewed as a data product of the USA-National Phenology Network that will help guide future phenological monitoring and research at the national level. We envision three main objectives for the first NPA: 1) establish a suite of indicators of phenological change (IPCs) at regional to continental scales, following the Heinz Center model for such national assessments; 2) using sufficiently long and broad-scale time series of IPCs and legacy phenological data, assess phenological responses to what many scientists are calling the early stages of anthropogenic climate change, specifically the abrupt advance in spring onset in the late 1970s/early 1980s; and 3) project large-scale phenological changes into the 21st Century using GCM and RCM model realizations. Toward this end we see the following tasks as critical preliminary work to plan the first NPA: a) identify, evaluate, and refine IPCs based on indices developed from standard weather observations, streamflow and other hydrological observations (e.g., center of mass, lake freeze/thaw, etc.), plant and animal phenology observations from legacy datasets, remote sensing datastreams, flux tower observations, and GCM and RCM model realizations; b) evaluate covariability between IPCs, legacy phenological data, and large-scale modes of climate variability to help detection and attribution of supposed secular trends and development of short- and long-lead forecasts for phenological variations; c) identify, evaluate, and refine optimal methods for quantifying what constitutes significant statistical and ecological change in phenological indicators, given uncertainties in both data and methods and a defined range of natural variability; d) identify, evaluate, and refine key questions of natural resource managers regarding phenological indicators for monitoring and adaptive management of habitats and wildlife, given the spectrum of management objectives on federal, state, and private lands.

  16. Soil-vegetation-atmosphere energy fluxes: Land Surface Temperature evaluation by Terra/MODIS satellite images

    NASA Astrophysics Data System (ADS)

    Telesca, V.; Copertino, V. A.; Scavone, G.; Pastore, V.; Dal Sasso, S.

    2009-04-01

    Most hydrological models are now founded on the integration of field and satellite data. In fact, the use of remote sensing techniques compensates for the frequent lack of field-measured variables and parameters required to apply models of the hydrological cycle components at a regional scale. These components are very sensitive to climatic and surface features and conditions. Remote sensing represents a complementary contribution to in situ investigation methodologies, furnishing repeated and real-time observations. Naturally, the value of these techniques depends on the existence of a solid correlation between the quantities to be evaluated and the remote sensing information obtainable from the images. In this context, satellite remote sensing has become a basic tool since it allows the regular monitoring of extensive areas. Different surface variables and parameters can be extracted from the combination of the multi-spectral information contained in a satellite image. Land Surface Temperature (LST) is a fundamental parameter for estimating most of the components of the hydrological cycle and the soil-atmosphere energy balance, such as the net radiation, the sensible heat flux and the actual evapotranspiration. Moreover, LST maps can be used in models for fire monitoring and prevention. The aim of this work is to produce Land Surface Temperature maps by applying different "split-window" algorithms, exploiting the contribution of remote sensing, and to compare them with the "Day/Night" LST/MODIS product in order to select the best algorithm to apply in a Two-Source Energy Balance model (STSEB). Integrated into a rainfall-runoff model, it can help cope with land management problems related to protection from natural hazards. In particular, the energy balance procedure will be included in a model for the continuous simulation and forecasting of floods. Another important application of our model is the forecasting of scenarios connected to drought problems. In this context, it can contribute to the planning and realization of mitigation interventions for desertification risk.

  17. Accurate Influenza Monitoring and Forecasting Using Novel Internet Data Streams: A Case Study in the Boston Metropolis

    PubMed Central

    Lu, Fred Sun; Hou, Suqin; Baltrusaitis, Kristin; Shah, Manan; Leskovec, Jure; Sosic, Rok; Hawkins, Jared; Brownstein, John; Conidi, Giuseppe; Gunn, Julia; Gray, Josh; Zink, Anna

    2018-01-01

    Background Influenza outbreaks pose major challenges to public health around the world, leading to thousands of deaths a year in the United States alone. Accurate systems that track influenza activity at the city level are necessary to provide actionable information that can be used for clinical, hospital, and community outbreak preparation. Objective Although Internet-based real-time data sources such as Google searches and tweets have been successfully used to produce influenza activity estimates ahead of traditional health care–based systems at national and state levels, influenza tracking and forecasting at finer spatial resolutions, such as the city level, remain an open question. Our study aimed to present a precise, near real-time methodology capable of producing influenza estimates ahead of those collected and published by the Boston Public Health Commission (BPHC) for the Boston metropolitan area. This approach has great potential to be extended to other cities with access to similar data sources. Methods We first tested the ability of Google searches, Twitter posts, electronic health records, and a crowd-sourced influenza reporting system to detect influenza activity in the Boston metropolis separately. We then adapted a multivariate dynamic regression method named ARGO (autoregression with general online information), designed for tracking influenza at the national level, and showed that it effectively uses the above data sources to monitor and forecast influenza at the city level 1 week ahead of the current date. Finally, we presented an ensemble-based approach capable of combining information from models based on multiple data sources to more robustly nowcast as well as forecast influenza activity in the Boston metropolitan area. The performances of our models were evaluated in an out-of-sample fashion over 4 influenza seasons within 2012-2016, as well as a holdout validation period from 2016 to 2017. Results Our ensemble-based methods incorporating information from diverse models based on multiple data sources, including ARGO, produced the most robust and accurate results. The observed Pearson correlations between our out-of-sample flu activity estimates and those historically reported by the BPHC were 0.98 in nowcasting influenza and 0.94 in forecasting influenza 1 week ahead of the current date. Conclusions We show that information from Internet-based data sources, when combined using an informed, robust methodology, can be effectively used as early indicators of influenza activity at fine geographic resolutions. PMID:29317382
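
    A minimal sketch of the ARGO-style regression, autoregressive lags of the surveillance signal combined with same-week query volumes and an L1 penalty, is given below on simulated data; the actual data streams, seasonal terms and the ensemble step of the paper are not reproduced.

      import numpy as np
      from sklearn.linear_model import LassoCV

      rng = np.random.default_rng(7)
      T = 200                                            # weeks of hypothetical data
      flu = np.abs(np.cumsum(rng.normal(size=T)))        # stand-in for weekly BPHC flu activity
      queries = flu[:, None] * rng.uniform(0.5, 1.5, (T, 10)) + rng.normal(scale=0.5, size=(T, 10))

      n_lags = 4                                         # autoregressive terms, ARGO-style
      lags = np.column_stack([flu[t - n_lags:t] for t in range(n_lags, T)]).T
      X = np.column_stack([lags, queries[n_lags:]])      # lagged flu activity + same-week query volumes
      y = flu[n_lags:]

      train = slice(0, 150)
      model = LassoCV(cv=5).fit(X[train], y[train])      # L1 penalty selects the useful inputs
      nowcast = model.predict(X[150:])                   # out-of-sample weekly estimates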

  18. Constructing probabilistic scenarios for wide-area solar power generation

    DOE PAGES

    Woodruff, David L.; Deride, Julio; Staid, Andrea; ...

    2017-12-22

    Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.
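
    The quantile-based scenario construction can be sketched as follows, assuming a hypothetical history of day-ahead forecasts and observations and using raw empirical error quantiles in place of the paper's segmented, epi-spline-fitted error distributions.

      import numpy as np

      rng = np.random.default_rng(8)
      # Hypothetical history: day-ahead forecasts and observations of solar power (MW), 24 hourly values/day
      shape = np.sin(np.linspace(0, np.pi, 24))
      forecasts = np.clip(rng.uniform(40, 100, (365, 1)) * shape, 0, None)
      observed = np.clip(forecasts + rng.normal(scale=8, size=forecasts.shape), 0, None)

      errors = observed - forecasts                      # historical forecast errors, hour by hour
      levels = np.array([0.1, 0.25, 0.5, 0.75, 0.9])     # quantile levels defining the scenarios

      tomorrow = forecasts[-1]                           # stand-in for the next day-ahead point forecast
      scenarios = np.clip(tomorrow + np.quantile(errors, levels, axis=0), 0, None)
      print(scenarios.shape)                             # (5 scenarios, 24 hours)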

  19. Constructing probabilistic scenarios for wide-area solar power generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woodruff, David L.; Deride, Julio; Staid, Andrea

    Optimizing thermal generation commitments and dispatch in the presence of high penetrations of renewable resources such as solar energy requires a characterization of their stochastic properties. In this study, we describe novel methods designed to create day-ahead, wide-area probabilistic solar power scenarios based only on historical forecasts and associated observations of solar power production. Each scenario represents a possible trajectory for solar power in next-day operations with an associated probability computed by algorithms that use historical forecast errors. Scenarios are created by segmentation of historic data, fitting non-parametric error distributions using epi-splines, and then computing specific quantiles from these distributions. Additionally, we address the challenge of establishing an upper bound on solar power output. Our specific application driver is for use in stochastic variants of core power systems operations optimization problems, e.g., unit commitment and economic dispatch. These problems require as input a range of possible future realizations of renewables production. However, the utility of such probabilistic scenarios extends to other contexts, e.g., operator and trader situational awareness. Finally, we compare the performance of our approach to a recently proposed method based on quantile regression, and demonstrate that our method performs comparably to this approach in terms of two widely used methods for assessing the quality of probabilistic scenarios: the Energy score and the Variogram score.

  20. Imaging of near-Earth space plasma.

    PubMed

    Mitchell, Cathryn N

    2002-12-15

    This paper describes the technique of imaging the ionosphere using tomographic principles. It reports on current developments and speculates on the future of this research area. Recent developments in computing and ionospheric measurement, together with the sharing of data via the internet, now allow us to envisage a time when high-resolution, real-time images and 'movies' of the ionosphere will be possible for radio communications planning. There is great potential to use such images for improving our understanding of the physical processes controlling the behaviour of the ionosphere. While real-time images and movies of the electron concentration are now almost possible, forecasting of ionospheric morphology is still in its early stages. It has become clear that the ionosphere cannot be considered as a system in isolation, and consequently new research projects to link together models of the solar-terrestrial system, including the Sun, solar wind, magnetosphere, ionosphere and thermosphere, are now being proposed. The prospect is now on the horizon of assimilating data from the entire solar-terrestrial system to produce a real-time computer model and 'space weather' forecast. The role of tomography in imaging beyond the ionosphere to include the whole near-Earth space-plasma realm is yet to be realized, and provides a challenging prospect for the future. Finally, exciting possibilities exist in applying such methods to image the atmospheres and ionospheres of other planets.

  1. Ice flood velocity calculating approach based on single view metrology

    NASA Astrophysics Data System (ADS)

    Wu, X.; Xu, L.

    2017-02-01

    The Yellow River is the river in which ice floods occur most frequently in China; hence, ice flood forecasting is of great significance for the river's flood prevention work. In various ice flood forecast models, the flow velocity is one of the most important parameters. In spite of its great significance, the acquisition of the flow velocity heavily relies on manual observation or derivation from empirical formulas. In recent years, with the rapid development of video surveillance technology and wireless transmission networks, the Yellow River Conservancy Commission has set up an ice situation monitoring system, in which live videos can be transmitted to the monitoring center through 3G mobile networks. In this paper, an approach to obtain the ice velocity based on single view metrology and motion tracking techniques, using monitoring videos as input data, is proposed. First, the river surface can be approximated as a plane. Under this condition, we analyze the geometric relationship between the object side and the image side and present the principle for measuring length on the object side from the image. Second, we use LK optical flow, which supports pyramidal processing, to track the ice in motion. Combining the results of camera calibration and single view metrology, we propose a workflow to calculate the real velocity of the ice flood. Finally, we implement a prototype system and use it to test the reliability and rationality of the whole solution.
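
    A hedged sketch of the tracking step with OpenCV's pyramidal Lucas-Kanade implementation is shown below; the camera calibration and plane-based single view metrology are collapsed into a single assumed metres-per-pixel factor, and the video path is hypothetical.

      import cv2
      import numpy as np

      cap = cv2.VideoCapture("ice_monitoring.mp4")        # hypothetical surveillance clip
      ok, prev_frame = cap.read()
      prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
      pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=7)

      METRES_PER_PIXEL = 0.05                             # assumed scale from single view metrology
      fps = cap.get(cv2.CAP_PROP_FPS) or 25.0

      ok, frame = cap.read()
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      new_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None,
                                                       winSize=(21, 21), maxLevel=3)
      good_new = new_pts[status == 1]                     # successfully tracked features
      good_old = pts[status == 1]
      pixel_speed = np.linalg.norm(good_new - good_old, axis=1).mean() * fps    # pixels per second
      print("approximate ice velocity:", pixel_speed * METRES_PER_PIXEL, "m/s")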

  2. A Study on Multi-Scale Background Error Covariances in 3D-Var Data Assimilation

    NASA Astrophysics Data System (ADS)

    Zhang, Xubin; Tan, Zhe-Min

    2017-04-01

    The construction of background error covariances is a key component of three-dimensional variational data assimilation. In numerical weather prediction there are background errors at different scales, and interactions among them. However, the influence of these errors and their interactions cannot be represented in the background error covariance statistics when estimated by the leading methods. So, it is necessary to construct background error covariances influenced by multi-scale interactions among errors. With the NMC method, this article first estimates the background error covariances at given model-resolution scales. Then the information of errors whose scales are larger and smaller than the given ones is introduced, respectively, using different nesting techniques, to estimate the corresponding covariances. The comparison of the three background error covariance statistics influenced by error information at different scales reveals that the background error variances increase, particularly at large scales and higher levels, when the information of larger-scale errors is introduced via the lateral boundary condition provided by a lower-resolution model. On the other hand, the variances decrease at medium scales at the higher levels, while they show a slight improvement at lower levels in the nested domain, especially at medium and small scales, when the information of smaller-scale errors is introduced by nesting a higher-resolution model. In addition, the introduction of information of larger- (smaller-) scale errors leads to larger (smaller) horizontal and vertical correlation scales of background errors. Considering the multivariate correlations, the Ekman coupling increases (decreases) when the information of larger- (smaller-) scale errors is included, whereas the geostrophic coupling in the free atmosphere weakens in both situations. The three covariances obtained in the above work are used in a data assimilation and model forecast system, respectively, and analysis-forecast cycles for a period of one month are conducted. Through the comparison of both analyses and forecasts from this system, it is found that the trends in the variation of analysis increments when information of different-scale errors is introduced are consistent with those in the variation of the variances and correlations of background errors. In particular, the introduction of smaller-scale errors leads to a larger amplitude of analysis increments for winds at medium scales at the heights of both the high- and low-level jets. Analysis increments for both temperature and humidity are also greater at the corresponding scales at middle and upper levels under this circumstance. These analysis increments improve the intensity of the jet-convection system, which includes jets at different levels and the coupling between them associated with latent heat release, and these changes in the analyses contribute to better forecasts of winds and temperature in the corresponding areas. When smaller-scale errors are included, analysis increments for humidity increase significantly at large scales at lower levels, moistening the southern part of the analyses. This humidification helps to correct the dry bias there and ultimately improves the forecast skill for humidity. Moreover, the inclusion of larger- (smaller-) scale errors is beneficial for the forecast quality of heavy (light) precipitation at large (small) scales, due to the amplification (diminution) of intensity and area in precipitation forecasts, but tends to overestimate (underestimate) light (heavy) precipitation.
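
    For orientation, the sketch below shows the basic NMC-style estimate that such statistics start from: a background error covariance computed from differences between forecast pairs valid at the same time. The dimensions, data and 1/2 scaling are illustrative assumptions, and the nesting of larger- and smaller-scale error information is not reproduced.

      import numpy as np

      rng = np.random.default_rng(9)
      n_state, n_pairs = 500, 200                       # state-vector size, number of forecast pairs
      f24 = rng.normal(size=(n_pairs, n_state))         # stand-in 24 h forecasts valid at time t
      f48 = f24 + rng.normal(scale=0.3, size=(n_pairs, n_state))   # stand-in 48 h forecasts, same valid time

      diff = f48 - f24
      diff -= diff.mean(axis=0)
      B = diff.T @ diff / (2.0 * (n_pairs - 1))         # 1/2 scaling commonly applied in NMC estimates
      variances = np.diag(B)
      corr = B / np.sqrt(np.outer(variances, variances))   # correlation structure of background errors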

  3. Quantifying the Value of Satellite Imagery in Agriculture and other Sectors

    NASA Astrophysics Data System (ADS)

    Brown, M. E.; Abbott, P. C.; Escobar, V. M.

    2013-12-01

    This study focused on quantifying the commercial value of satellite remote sensing for agriculture. Commercial value from satellite imagery arises when improved information leads to better economic decisions. We identified five areas of application of remote sensing to agriculture where there is this potential: crop management (precision agriculture), insurance, real estate assessment, crop forecasting, and environmental monitoring. These applications can be divided between public information (crop forecasting) and those that may generate private commercial value (crop management), with both public and private information dimensions in some categories. Public information applications of remote sensing have been more successful in the past, and are likely to generate more economic value in the future. It was found that several issues have limited realization of the potential to generate private value from remote sensing in agriculture. The scale of use is small, and the high cost of acquiring and interpreting large images has limited the cost-effectiveness for individual farmers. Insurance, environmental monitoring, and crop management services provided by cooperatives or consultants may be cases that overcome this limitation. The greatest opportunities for potential commercial value from agriculture are probably in the crop forecasting area, especially where agricultural statistics services are not as well developed, since public market information benefits a broad range of economic actors, not limited to countries where forecasts are made. We estimate here the value from components of USDA's World Agricultural Supply and Demand Estimates (WASDE) forecasts for corn, indicating potential value increasing in the range of 60 to 240 million if improved satellite-based information enhances those forecasts. The research was conducted by agricultural economists at Purdue University, and will be the basis for further evaluation of the use of satellite data within the NASA Carbon Monitoring System (CMS). A general evaluation framework to determine the usefulness of the CMS products to various users and to the broader community interested in managing carbon is shown in Figure 2. The first step in conducting such an analysis is to develop an understanding of the history, institutions, behaviors and other factors setting the context of an application which CMS data products inform. Decision makers are identified (who may become early adopters), and the alternative decisions they might take are elaborated. Economic models informed by biophysical models would then predict the outcome of the engagement. The new information must then be linked to a revised decision, and that decision in turn must lead to better economic or social outcomes on average. The value of the information is estimated as the predicted increase in economic surplus (profit, cost, consumer welfare) or social outcome that is a direct result of that revised decision. Alternative Monte Carlo simulations would estimate averages of key outcomes under alternative circumstances, such as differing regulations or better data, hence capturing the consequences of the changes induced. These approaches will be described in the context of NASA and satellite data.

  4. Incomplete Data in Smart Grid: Treatment of Values in Electric Vehicle Charging Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Majipour, Mostafa; Chu, Peter; Gadh, Rajit

    2014-11-03

    In this paper, five imputation methods, namely Constant (zero), Mean, Median, Maximum Likelihood, and Multiple Imputation, have been applied to compensate for missing values in Electric Vehicle (EV) charging data. The outcome of each of these methods has been used as the input to a prediction algorithm to forecast the EV load in the next 24 hours at each individual outlet. The data are real-world data at the outlet level from the UCLA campus parking lots. Given the sparsity of the data, both the Median and the Constant (=zero) imputations improved the prediction results. Since in most missing-value cases in our database all values of that instance are missing, the multivariate imputation methods did not improve the results significantly compared to univariate approaches.
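
    The univariate imputations (constant, mean, median) map directly onto scikit-learn's SimpleImputer, and IterativeImputer can stand in for the multivariate step; the sketch below uses hypothetical outlet-level data, not the UCLA database.

      import numpy as np
      from sklearn.impute import SimpleImputer
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(10)
      # Hypothetical hourly energy draw (kWh) at four outlets, with about 15% of entries missing
      X = rng.gamma(shape=2.0, scale=1.5, size=(500, 4))
      X[rng.random(X.shape) < 0.15] = np.nan

      imputed = {
          "constant_zero": SimpleImputer(strategy="constant", fill_value=0.0).fit_transform(X),
          "mean": SimpleImputer(strategy="mean").fit_transform(X),
          "median": SimpleImputer(strategy="median").fit_transform(X),
          # Chained-regression imputation as a stand-in for the ML / multiple-imputation steps
          "iterative": IterativeImputer(random_state=0).fit_transform(X),
      }
      for name, Xi in imputed.items():
          print(name, "remaining NaNs:", int(np.isnan(Xi).sum()))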

  5. Modelization of the Current and Future Habitat Suitability of Rhododendron ferrugineum Using Potential Snow Accumulation

    PubMed Central

    Komac, Benjamin; Esteban, Pere; Trapero, Laura; Caritg, Roger

    2016-01-01

    Mountain areas are particularly sensitive to climate change. Species distribution models predict important extinctions in these areas whose magnitude will depend on a number of different factors. Here we examine the possible impact of climate change on the Rhododendron ferrugineum (alpenrose) niche in Andorra (Pyrenees). This species currently occupies 14.6 km2 of this country and relies on the protection afforded by snow cover in winter. We used high-resolution climatic data, potential snow accumulation and a combined forecasting method to obtain the realized niche model of this species. Subsequently, we used data from the high-resolution Scampei project climate change projection for the A2, A1B and B1 scenarios to model its future realized niche model. The modelization performed well when predicting the species’s distribution, which improved when we considered the potential snow accumulation, the most important variable influencing its distribution. We thus obtained a potential extent of about 70.7 km2 or 15.1% of the country. We observed an elevation lag distribution between the current and potential distribution of the species, probably due to its slow colonization rate and the small-scale survey of seedlings. Under the three climatic scenarios, the realized niche model of the species will be reduced by 37.9–70.1 km2 by the end of the century and it will become confined to what are today screes and rocky hillside habitats. The particular effects of climate change on seedling establishment, as well as on the species’ plasticity and sensitivity in the event of a reduction of the snow cover, could worsen these predictions. PMID:26824847

  6. Application of WRF - SWAT OpenMI 2.0 based models integration for real time hydrological modelling and forecasting

    NASA Astrophysics Data System (ADS)

    Bugaets, Andrey; Gonchukov, Leonid

    2014-05-01

    Adoption of deterministic distributed hydrological models into operational water management requires intensive collection and inputting of spatially distributed climatic information in a timely manner, which is both time consuming and laborious. The lead time of the data pre-processing stage could be substantially reduced by coupling hydrological and numerical weather prediction models. This is especially important for regions such as the south of the Russian Far East, where the geographical position combined with a monsoon climate affected by typhoons and extreme heavy rains causes rapid rises in mountain river water levels, leading to flash flooding and enormous damage. The objective of this study is the development of an end-to-end workflow that executes, in a loosely coupled mode, an integrated modeling system comprised of the Weather Research and Forecasting (WRF) atmospheric model and the Soil and Water Assessment Tool (SWAT 2012) hydrological model using OpenMI 2.0 and web-service technologies. Migrating SWAT into an OpenMI-compliant component involves reorganizing the model into separate initialization, perform-timestep and finalization functions that can be accessed from outside. To preserve SWAT's normal behavior, the source code was separated from the OpenMI-specific implementation into a static library. The modified code was assembled into a dynamic library and wrapped into a C# class implementing the OpenMI ILinkableComponent interface. Development of the WRF OpenMI-compliant component is based on the idea of wrapping web-service clients into a linkable component and seamlessly accessing the output netCDF files without an actual connection of the models. The weather state variables (precipitation, wind, solar radiation, air temperature and relative humidity) are processed by an automatic input selection algorithm to single out the most relevant values used by the SWAT model to yield climatic data at the subbasin scale. Spatial interpolation between the WRF regular grid and the SWAT subbasin centroids (which serve as virtual weather stations) is realized as an OpenMI AdaptedOutput. In order to make sure that the SWAT-WRF integration is technically sound and to pre-evaluate the impact of the climatic data resolution on the model parameters, a number of test calculations were performed with different temporal and spatial aggregations of the WRF output. Numerical experiments were carried out for the period 2012-2013 on the Komarovka river watershed (former Primorskaya water-balance station) located in the low-mountain landscapes of the western part of the Khankaiskaya plain. The watershed outlet is equipped with automatic water level and rain gauging stations of the Primorie Hydrometeorological Agency (Primgidromet, http://primgidromet.ru) observation network. The spatial structure of the SWAT simulation was realized with ArcSWAT 2012 using a 10 m DEM and 1:50000 soil and land-use coverage. Sensitivity analysis and calibration are performed with SWAT-CUP. The WRF-SWAT composition is assembled in the OpenMI GUI. For the test basin, the simulation results show that in most cases the predicted and measured water levels are in acceptable agreement. Forcing SWAT with WRF output avoids some semi-empirical model approximations, replaces the native weather generator over the WRF forecast interval and improves the operational streamflow forecast. It is anticipated that direct use of the WRF variables (not only substitution of the standard SWAT input) has good potential to make SWAT more physically sound.

  7. Have We Entered a 21st Century Prolonged Minimum of Solar Activity? Updated Implications of a 1987 Prediction

    NASA Astrophysics Data System (ADS)

    Shirley, James H.

    2009-05-01

    Fairbridge and Shirley (1987) predicted that a new prolonged minimum of solar activity would be underway by the year 2013 (Solar Physics 110, 191). While it is much too early to tell if this prediction will be fully realized, recent observations document a striking reduction in the Sun's general level of activity. While other forecasts of reduced future activity levels on decadal time scales have appeared, the Fairbridge-Shirley (FS) prediction is unique in pinpointing the current epoch. We are unaware of any forecast method that shows a better correspondence with the actual behavior of the Sun to this point. The FS prediction was based on the present-day recurrence of two physical indicators that were correlated in time with the occurrence of the Wolf, Sporer, and Maunder Minima. The amplitude of the inertial revolution of the axis of symmetry of the Sun's orbital motion about the solar system barycenter, and the direction in space of that axis, each bear a relationship to the occurrence of the prolonged minima of the historic record. The FS prediction appeared before the importance of solar meridional flows was generally appreciated, and before the existence and role of the tachocline was suspected. We will update and restate some of the physical implications of the FS results, along with those of some more recent investigations, particularly with reference to orbit-spin coupling hypotheses (Shirley, 2006: M.N.R.A.S. 368, 280). New investigations combining and integrating modern dynamo models with physical solutions describing key aspects of the variability of the solar motion may lead to significant advances in our ability to forecast future changes in the Sun. Acknowledgement: This work was supported by the resources of the author. No part of this work was performed at the Jet Propulsion Laboratory under a contract from NASA.

  8. Contribution of piezometric measurement on knowledge and management of low water levels

    NASA Astrophysics Data System (ADS)

    Bessiere, Hélène; Stollsteiner, Philippe; Allier, Delphine; Nicolas, Jérôme; Gourcy, Laurence

    2014-05-01

    This article is based on a BRGM study on piezometric indicators and threshold values of discharges and groundwater levels for the assessment of potentially pumpable volumes of chalky watersheds. A method for estimating low water levels from groundwater levels is presented for three examples of chalk aquifers; the first is located in Picardy and the other two in the Champagne-Ardenne region. Piezometers with "annual" cycles, used in these examples, are assumed to be representative of the aquifer hydrodynamics. The analysis leads to relatively precise and satisfactory relationships between groundwater levels and observed discharges for this chalky context. These relationships may be useful for monitoring, validation, extension or reconstruction of the low water flow. On the one hand, they allow defining the piezometric levels corresponding to the different alert thresholds of river discharges. On the other hand, they clarify the contributions to low water flow from runoff and from the draining of the aquifer. Finally, these correlations give an assessment of the minimum flow for the coming weeks using the rate of draining of the aquifer. Nevertheless, the use of these correlations does not allow optimization of the pumpable volumes, because it seems difficult to integrate the effective rainfall that may occur during the draining period. In addition, these relationships cannot be exploited for multi-annual cycle systems. In these cases, the solution seems to lie in the construction of a rainfall-runoff-piezometric level model. Two approaches are then possible. The first is to produce each year, on a given date, a forecast for the days or months to come using rainfalls with various frequency distributions. However, the forecast must be repeated each year depending on climatic conditions. The principle of the second method is to simulate forecasts for different rainfall intensities and different initial conditions. The results are presented in chart form. In addition, this last method is currently being tested for the problem of floods caused by groundwater level rise.

  9. A space weather information service based upon remote and in-situ measurements of coronal mass ejections heading for Earth

    NASA Astrophysics Data System (ADS)

    Hartkorn, O. A.; Ritter, B.; Meskers, A. J. H.; Miles, O.; Russwurm, M.; Scully, S.; Roldan, A.; Juestel, P.; Reville, V.; Lupu, S.; Ruffenach, A.

    2014-12-01

    The Earth's magnetosphere is formed as a consequence of the interaction between the planet's magnetic field and the solar wind, a continuous plasma stream from the Sun. A number of different solar wind phenomena have been studied over the past forty years with the intention of understanding and forecasting solar behavior and space weather. In particular, Earth-bound interplanetary coronal mass ejections (CMEs) can significantly disturb the Earth's magnetosphere for a short time and cause geomagnetic storms. We present a mission concept consisting of six spacecraft that are equally spaced in a heliocentric orbit at 0.72 AU. These spacecraft will monitor the plasma properties, the magnetic field's orientation and magnitude, and the 3D propagation trajectory of CMEs heading for Earth. The primary objective of this mission is to increase space weather forecasting time by means of a near real-time information service that is based upon in-situ and remote measurements of the CME properties. The mission's secondary objective is the improvement of scientific space weather models. In-situ measurements are performed using a Solar Wind Analyzer instrumentation package and fluxgate magnetometers. For remote measurements, coronagraphs are employed. The proposed instruments originate from other space missions, with the intention of reducing mission costs and streamlining the mission design process. Communication with the six identical spacecraft is realized via a deep space network consisting of six ground stations. This network provides an information service that is in uninterrupted contact with the spacecraft, allowing for continuous space weather monitoring. A dedicated data processing center will handle all the data and forward the processed data to the SSA Space Weather Coordination Center. This organization will inform the general public through a space weather forecast. The data processing center will additionally archive the data for the scientific community. This mission concept allows for major advances in space weather forecasting and the scientific modeling of space weather.

  10. Space Weather Forecasting: An Enigma

    NASA Astrophysics Data System (ADS)

    Sojka, J. J.

    2012-12-01

    The space age began in earnest on October 4, 1957 with the launch of Sputnik 1 and was fuelled for over a decade by very strong national societal concerns. Prior to this single event the adverse effects of space weather had been registered on telegraph lines as well as interference on early WWII radar systems, while for countless eons the beauty of space weather in the form of mid-latitude auroral displays was much appreciated. These prior space weather impacts were in themselves only a low-level science puzzle pursued by a few dedicated researchers. The technology boost and innovation that the post-Sputnik era generated has almost single-handedly defined our present day societal technology infrastructure. During the decade following Neil Armstrong's walk on the moon on July 21, 1969 an international thrust to understand the science of space, and its weather, was in progress. However, the search for scientific understanding was parsed into independent "stove pipe" categories: the ionosphere-aeronomy, the magnetosphere, and the heliosphere-Sun. The present day scientific infrastructure of funding agencies, learned societies, and international organizations is still hampered by these 1960s logical divisions, which today are outdated in the pursuit of understanding space weather. As this era of intensive and well funded scientific research progressed, so did society's innovative uses for space technologies and space "spin-offs". Well over a decade ago leaders in technology, science, and the military realized that there was indeed an adverse side to space weather that with each passing year became more severe. In 1994 several U.S. agencies established the National Space Weather Program (NSWP) to focus scientific attention on the system-wide issue of the adverse effects of space weather on society and its technologies. Indeed, for the past two decades a significant fraction of the scientific community has actively engaged in understanding space weather and hence in crossing the "stove-pipe" disciplines. The perceived progress in space weather understanding differs significantly depending upon which community (scientific, technology, forecaster, society) is addressing the question. These views diverge even more when the question is how valuable the scientific capability of forecasting space weather is. This talk will discuss the present day as well as the future potential for forecasting space weather for a few selected examples. The author will attempt to straddle the divergent community opinions.

  11. Geostatistical Sampling Methods for Efficient Uncertainty Analysis in Flow and Transport Problems

    NASA Astrophysics Data System (ADS)

    Liodakis, Stylianos; Kyriakidis, Phaedon; Gaganis, Petros

    2015-04-01

    In hydrogeological applications involving flow and transport in heterogeneous porous media, the spatial distribution of hydraulic conductivity is often parameterized in terms of a lognormal random field based on a histogram and variogram model inferred from data and/or synthesized from relevant knowledge. Realizations of simulated conductivity fields are then generated using geostatistical simulation involving simple random (SR) sampling and are subsequently used as inputs to physically-based simulators of flow and transport in a Monte Carlo framework for evaluating the uncertainty in the spatial distribution of solute concentration due to the uncertainty in the spatial distribution of hydraulic conductivity [1]. Realistic uncertainty analysis, however, calls for a large number of simulated concentration fields; hence, it can become expensive in terms of both time and computer resources. A more efficient alternative to SR sampling is Latin hypercube (LH) sampling, a special case of stratified random sampling, which yields a more representative distribution of simulated attribute values with fewer realizations [2]. Here, the term representative implies realizations spanning efficiently the range of possible conductivity values corresponding to the lognormal random field. In this work we investigate the efficiency of alternative methods to classical LH sampling within the context of simulation of flow and transport in a heterogeneous porous medium. More precisely, we consider the stratified likelihood (SL) sampling method of [3], in which attribute realizations are generated using the polar simulation method by exploring the geometrical properties of the multivariate Gaussian distribution function. In addition, we propose a more efficient version of the above method, here termed minimum energy (ME) sampling, whereby a set of N representative conductivity realizations at M locations is constructed by: (i) generating a representative set of N points distributed on the surface of an M-dimensional, unit radius hyper-sphere, (ii) relocating the N points on a representative set of N hyper-spheres of different radii, and (iii) transforming the coordinates of those points to lie on N different hyper-ellipsoids spanning the multivariate Gaussian distribution. The above method is applied in a dimensionality reduction context by defining flow-controlling points over which representative sampling of hydraulic conductivity is performed, thus also accounting for the sensitivity of the flow and transport model to the input hydraulic conductivity field. The performance of the various stratified sampling methods, LH, SL, and ME, is compared to that of SR sampling in terms of reproduction of ensemble statistics of hydraulic conductivity and solute concentration for different sample sizes N (numbers of realizations). The results indicate that ME sampling constitutes an equally if not more efficient simulation method than LH and SL sampling, as it can reproduce to a similar extent statistics of the conductivity and concentration fields, yet with smaller sampling variability than SR sampling. References: [1] Gutjahr A.L. and Bras R.L. Spatial variability in subsurface flow and transport: A review. Reliability Engineering & System Safety, 42, 293-316, (1993). [2] Helton J.C. and Davis F.J. Latin hypercube sampling and the propagation of uncertainty in analyses of complex systems. Reliability Engineering & System Safety, 81, 23-69, (2003). [3] Switzer P. Multiple simulation of spatial fields. In: Heuvelink G, Lemmens M (eds) Proceedings of the 4th International Symposium on Spatial Accuracy Assessment in Natural Resources and Environmental Sciences, Coronet Books Inc., pp 629-635 (2000).
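
    The contrast between simple random and Latin hypercube sampling can be sketched in a few lines of Python; the field statistics, the exponential covariance, the ten sampled locations and the Cholesky step used to impose correlation (which only approximately preserves the per-dimension stratification) are illustrative assumptions, not the SL or ME algorithms of the study.

      # Minimal sketch: SR vs. LH sampling of a correlated lognormal conductivity field.
      import numpy as np
      from scipy.stats import norm, qmc

      M, N = 10, 50                       # sampled locations, realizations
      mean_lnK, sd_lnK = -11.5, 1.0       # hypothetical ln-conductivity statistics
      x = np.linspace(0.0, 100.0, M)      # assumed 1-D layout of the locations
      C = sd_lnK**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / 30.0)
      L = np.linalg.cholesky(C)

      def realizations(u):
          """Map uniform samples (N x M) to correlated lognormal conductivities."""
          z = norm.ppf(u)                 # independent standard normals
          return np.exp(mean_lnK + z @ L.T)

      u_sr = np.random.default_rng(1).random((N, M))       # simple random sampling
      u_lh = qmc.LatinHypercube(d=M, seed=1).random(N)     # Latin hypercube sampling

      for name, u in [("SR", u_sr), ("LH", u_lh)]:
          K = realizations(u)
          print(f"{name}: mean lnK = {np.log(K).mean():.3f}, std lnK = {np.log(K).std():.3f}")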

  12. The predictive value of 2-year posttreatment biopsy after prostate cancer radiotherapy for eventual biochemical outcome

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vance, Waseet; Tucker, Susan L.; Crevoisier, Renaud de

    2007-03-01

    Purpose: To determine the value of a 2-year post-radiotherapy (RT) prostate biopsy for predicting eventual biochemical failure in patients who were treated for localized prostate cancer. Methods and Materials: This study comprised 164 patients who underwent a planned 2-year post-RT prostate biopsy. The independent prognostic value of the biopsy results for forecasting eventual biochemical outcome and overall survival was tested with other factors (the Gleason score, 1992 American Joint Committee on Cancer tumor stage, pretreatment prostate-specific antigen level, risk group, and RT dose) in a multivariate analysis. The current nadir + 2 (CN + 2) definition of biochemical failure was used. Patients with rising prostate-specific antigen (PSA) or suspicious digital rectal examination before the biopsy were excluded. Results: The biopsy results were normal in 78 patients, showed scant atypical and malignant cells in 30, carcinoma with treatment effect in 43, and carcinoma without treatment effect in 13. Using the CN + 2 definition, we found a significant association between biopsy results and eventual biochemical failure. We also found that the biopsy status provides predictive information independent of the PSA status at the time of biopsy. Conclusion: A 2-year post-RT prostate biopsy may be useful for forecasting CN + 2 biochemical failure. Posttreatment prostate biopsy may be useful for identifying patients for aggressive salvage therapy.

  13. Forecasting peak asthma admissions in London: an application of quantile regression models.

    PubMed

    Soyiri, Ireneous N; Reidpath, Daniel D; Sarran, Christophe

    2013-07-01

    Asthma is a chronic condition of great public health concern globally. The associated morbidity, mortality and healthcare utilisation place an enormous burden on healthcare infrastructure and services. This study demonstrates a multistage quantile regression approach to predicting excess demand for health care services in the form of asthma daily admissions in London, using retrospective data from the Hospital Episode Statistics, weather and air quality. Trivariate quantile regression models (QRM) of asthma daily admissions were fitted to a 14-day range of lags of environmental factors, accounting for seasonality in a hold-in sample of the data. Representative lags were pooled to form multivariate predictive models, selected through a systematic backward stepwise reduction approach. Models were cross-validated using a hold-out sample of the data, and their respective root mean square error measures, sensitivity, specificity and predictive values compared. Two of the predictive models were able to detect extreme numbers of daily asthma admissions at sensitivity levels of 76 % and 62 %, as well as specificities of 66 % and 76 %. Their positive predictive values were slightly higher for the hold-out sample (29 % and 28 %) than for the hold-in model development sample (16 % and 18 %). QRMs can be used in multiple stages to select suitable variables for forecasting extreme asthma events. The associations between asthma and environmental factors, including temperature, ozone and carbon monoxide, can be exploited in predicting future events using QRMs.
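
    A minimal Python sketch of a single quantile regression of daily admissions on lagged environmental predictors is given below; the simulated series, the 3-day lag and the 90th percentile are illustrative assumptions, not the Hospital Episode Statistics data or the full multistage, 14-lag procedure.

      # Minimal sketch: upper-quantile regression of admissions on lagged weather.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(42)
      n = 730
      df = pd.DataFrame({
          "temp": 12 + 8 * np.sin(np.arange(n) * 2 * np.pi / 365) + rng.normal(0, 2, n),
          "ozone": rng.gamma(3.0, 10.0, n),
      })
      df["temp_lag3"] = df["temp"].shift(3)
      df["admissions"] = (20 - 0.4 * df["temp_lag3"] + 0.05 * df["ozone"]
                          + rng.normal(0, 3, n)).clip(lower=0)
      df = df.dropna()

      # Fit the 90th percentile to target peak-demand days rather than the mean.
      fit = smf.quantreg("admissions ~ temp_lag3 + ozone", df).fit(q=0.9)
      print(fit.params)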

  14. Forecasting peak asthma admissions in London: an application of quantile regression models

    NASA Astrophysics Data System (ADS)

    Soyiri, Ireneous N.; Reidpath, Daniel D.; Sarran, Christophe

    2013-07-01

    Asthma is a chronic condition of great public health concern globally. The associated morbidity, mortality and healthcare utilisation place an enormous burden on healthcare infrastructure and services. This study demonstrates a multistage quantile regression approach to predicting excess demand for health care services in the form of asthma daily admissions in London, using retrospective data from the Hospital Episode Statistics, weather and air quality. Trivariate quantile regression models (QRM) of asthma daily admissions were fitted to a 14-day range of lags of environmental factors, accounting for seasonality in a hold-in sample of the data. Representative lags were pooled to form multivariate predictive models, selected through a systematic backward stepwise reduction approach. Models were cross-validated using a hold-out sample of the data, and their respective root mean square error measures, sensitivity, specificity and predictive values compared. Two of the predictive models were able to detect extreme numbers of daily asthma admissions at sensitivity levels of 76 % and 62 %, as well as specificities of 66 % and 76 %. Their positive predictive values were slightly higher for the hold-out sample (29 % and 28 %) than for the hold-in model development sample (16 % and 18 %). QRMs can be used in multiple stages to select suitable variables for forecasting extreme asthma events. The associations between asthma and environmental factors, including temperature, ozone and carbon monoxide, can be exploited in predicting future events using QRMs.

  15. The past, present, and future of the U.S. electric power sector: Examining regulatory changes using multivariate time series approaches

    NASA Astrophysics Data System (ADS)

    Binder, Kyle Edwin

    The U.S. energy sector has undergone continuous change in the regulatory, technological, and market environments. These developments show no signs of slowing. Accordingly, it is imperative that energy market regulators and participants develop a strong comprehension of market dynamics and the potential implications of their actions. This dissertation contributes to a better understanding of the past, present, and future of U.S. energy market dynamics and interactions with policy. Advancements in multivariate time series analysis are employed in three related studies of the electric power sector. Overall, results suggest that regulatory changes have had and will continue to have important implications for the electric power sector. The sector, however, has exhibited adaptability to past regulatory changes and is projected to remain resilient in the future. Tests for constancy of the long run parameters in a vector error correction model are applied to determine whether relationships among coal inventories in the electric power sector, input prices, output prices, and opportunity costs have remained constant over the past 38 years. Two periods of instability are found, the first following railroad deregulation in the U.S. and the second corresponding to a number of major regulatory changes in the electric power and natural gas sectors. Relationships among Renewable Energy Credit prices, electricity prices, and natural gas prices are estimated using a vector error correction model. Results suggest that Renewable Energy Credit prices do not completely behave as previously theorized in the literature. Potential reasons for the divergence between theory and empirical evidence are the relative immaturity of current markets and continuous institutional intervention. Potential impacts of future CO2 emissions reductions under the Clean Power Plan on economic and energy sector activity are estimated. Conditional forecasts based on an outlined path for CO2 emissions are developed from a factor-augmented vector autoregressive model for a large dataset. Unconditional and conditional forecasts are compared for U.S. industrial production, real personal income, and estimated factors. Results suggest that economic growth will be slower under the Clean Power Plan than it would otherwise; however, CO2 emissions reductions and economic growth can be achieved simultaneously.
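
    The kind of cointegration analysis referred to above can be illustrated with a small vector error correction model in Python; the three simulated price series, the lag order and the deterministic term are assumptions for illustration only and do not reproduce the dissertation's data or specifications.

      # Minimal sketch: rank selection and VECM estimation on simulated price series.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.vector_ar.vecm import VECM, select_coint_rank

      rng = np.random.default_rng(7)
      n = 400
      trend = np.cumsum(rng.normal(0, 1, n))            # shared stochastic trend
      prices = pd.DataFrame({
          "rec":  trend + rng.normal(0, 0.5, n),
          "elec": 0.8 * trend + rng.normal(0, 0.5, n),
          "gas":  0.5 * trend + rng.normal(0, 0.5, n),
      })

      rank = select_coint_rank(prices, det_order=0, k_ar_diff=2).rank
      res = VECM(prices, k_ar_diff=2, coint_rank=max(rank, 1), deterministic="co").fit()
      print("cointegration rank used:", max(rank, 1))
      print(res.alpha)    # adjustment coefficients toward the long-run relations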

  16. Spatial-temporal analysis of the risk of Rift Valley Fever in Kenya

    NASA Astrophysics Data System (ADS)

    Bett, B.; Omolo, A.; Hansen, F.; Notenbaert, A.; Kemp, S.

    2012-04-01

    Historical data on Rift Valley Fever (RVF) outbreaks in Kenya covering the period 1951 - 2010 were analyzed using a logistic regression model to identify factors associated with RVF occurrence. The analysis used a division, an administrative unit below a district, as the unit of analysis. The infection status of each division was defined on a monthly time scale and used as a dependent variable. Predictors investigated include: monthly precipitation (minimum, maximum and total), normalized difference vegetation index, altitude, agro-ecological zone, presence of game, livestock and human population densities, the number of times a division has had an outbreak before and the time interval in months between successive outbreaks (used as a proxy for immunity). Both univariable and multivariable analyses were conducted. The models used incorporated an autoregressive correlation matrix to account for clustering of observations in time, while dummy variables were fitted in the multivariable model to account for spatial relatedness/topology between divisions. This last procedure was followed because it is expected that the risk of RVF occurring in a given division increases when its immediate neighbor gets infected. Functional relationships between the continuous predictors and the outcome variable were assessed to ensure that the linearity assumption was met. Deviance and leverage residuals were also generated from the final model and used for evaluating the goodness of fit of the model. Descriptive analyses indicate that a total of 91 divisions in 42 districts (of the original 69 districts in place by 1999) reported RVF outbreaks at least once over the period. The mean interval between outbreaks was determined to be about 43 months. Factors that were positively associated with RVF occurrence include increased precipitation, a long interval since the previous outbreak and the number of times a division has been infected or reported an outbreak. The model will be validated and used for developing an RVF forecasting system. This forecasting system can then be used with the existing regional RVF prediction tools such as EMPRES-i to downscale RVF risk predictions to country-specific scales and subsequently link them with decision support systems. The ultimate aim is to increase the capacity of the national institutions to formulate appropriate RVF mitigation measures.
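
    A hedged sketch of a logistic model with an autoregressive working correlation, in the spirit of the analysis outlined above, is shown below using statsmodels GEE; the simulated divisions, predictors and effect sizes are stand-ins, not the Kenyan outbreak data.

      # Minimal sketch: logistic GEE with an AR working correlation over monthly records.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      frames = []
      for div in range(40):                          # 40 hypothetical divisions
          rain = rng.gamma(2.0, 40.0, 120)           # 120 monthly rainfall values
          since = rng.integers(6, 120, 120)          # months since the last outbreak
          p = 1.0 / (1.0 + np.exp(-(-4.0 + 0.015 * rain + 0.01 * since)))
          frames.append(pd.DataFrame({"division": div, "month": np.arange(120),
                                      "rain": rain, "months_since": since,
                                      "outbreak": (rng.random(120) < p).astype(int)}))
      df = pd.concat(frames, ignore_index=True)

      model = sm.GEE.from_formula("outbreak ~ rain + months_since",
                                  groups="division", time="month", data=df,
                                  family=sm.families.Binomial(),
                                  cov_struct=sm.cov_struct.Autoregressive())
      print(model.fit().summary())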

  17. Approach to determine measurement uncertainty in complex nanosystems with multiparametric dependencies and multivariate output quantities

    NASA Astrophysics Data System (ADS)

    Hampel, B.; Liu, B.; Nording, F.; Ostermann, J.; Struszewski, P.; Langfahl-Klabes, J.; Bieler, M.; Bosse, H.; Güttler, B.; Lemmens, P.; Schilling, M.; Tutsch, R.

    2018-03-01

    In many cases, the determination of the measurement uncertainty of complex nanosystems presents unexpected challenges. This is particularly true for complex systems with many degrees of freedom, i.e. nanosystems with multiparametric dependencies and multivariate output quantities. The aim of this paper is to address specific questions arising during the uncertainty calculation of such systems. This includes the division of the measurement system into subsystems and the distinction between systematic and statistical influences. We demonstrate that, even if the physical systems under investigation are very different, the corresponding uncertainty calculation can always be realized in a similar manner. This is shown in detail for two example experiments, namely magnetic nanosensors and ultrafast electro-optical sampling of complex time-domain signals. For these examples the approach to uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) is explained, in which correlations between multivariate output quantities are captured. To illustrate the versatility of the proposed approach, its application to other experiments, namely nanometrological instruments for terahertz microscopy, dimensional scanning probe microscopy, and measurement of the concentration of molecules using surface-enhanced Raman scattering, is briefly discussed in the appendix. We believe that the proposed approach provides a simple but comprehensive orientation for uncertainty calculation in the discussed measurement scenarios and can also be applied to similar or related situations.
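
    One simple way to capture correlations between multivariate output quantities is a Monte Carlo propagation in the style of the GUM supplements; the bivariate measurement model and input uncertainties in the Python sketch below are invented for illustration and are not the magnetic nanosensor or electro-optical sampling models of the paper.

      # Minimal sketch: Monte Carlo propagation giving a covariance of two output quantities.
      import numpy as np

      rng = np.random.default_rng(13)
      N = 100_000
      gain = rng.normal(1.00, 0.01, N)          # amplitude calibration factor (assumed)
      delay = rng.normal(5.0e-12, 0.2e-12, N)   # timing offset in seconds (assumed)
      noise = rng.normal(0.0, 0.005, N)         # common additive noise term

      # Hypothetical bivariate measurement model: both outputs share the noise input,
      # so the propagated output quantities are correlated.
      amplitude = 0.8 * gain + noise
      phase = 2 * np.pi * 10e9 * delay + 0.1 * noise

      Y = np.column_stack([amplitude, phase])
      print("best estimates:", Y.mean(axis=0))
      print("output covariance matrix:\n", np.cov(Y, rowvar=False))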

  18. Chest wall recurrence after mastectomy does not always portend a dismal outcome.

    PubMed

    Chagpar, Anees; Meric-Bernstam, Funda; Hunt, Kelly K; Ross, Merrick I; Cristofanilli, Massimo; Singletary, S Eva; Buchholz, Thomas A; Ames, Frederick C; Marcy, Sylvie; Babiera, Gildy V; Feig, Barry W; Hortobagyi, Gabriel N; Kuerer, Henry M

    2003-07-01

    Chest wall recurrence (CWR) after mastectomy often forecasts a grim prognosis. Predictors of outcome after CWR, however, are not clear. From 1988 to 1998, 130 patients with isolated CWRs were seen at our center. Clinicopathologic factors were studied by univariate and multivariate analyses for distant metastasis-free survival after CWR. The median post-CWR follow-up was 37 months. Initial nodal status was the strongest predictor of outcome by univariate analysis. Other significant factors included initial T4 disease, primary lymphovascular invasion, treatment of the primary tumor with neoadjuvant therapy or radiation, time to CWR >24 months, and treatment for CWR (surgery, radiation, or multimodality therapy). Multivariate analysis also found initial nodal status to have the greatest effect; time to CWR and use of radiation for CWR were also independent predictors. Three groups of patients were identified. Low risk was defined by initial node-negative disease, time to CWR >24 months, and radiation for CWR; intermediate risk had one or two favorable features; and high risk had none. The median distant metastasis-free survival after CWR was significantly different among these groups (P <.0001). Patients with CWR are a heterogeneous population. Patients with initial node-negative disease who develop CWR after 24 months have an optimistic prognosis, especially if they are treated with radiation.

  19. DPYD*2A and MTHFR C677T predict toxicity and efficacy, respectively, in patients on chemotherapy with 5-fluorouracil for colorectal cancer.

    PubMed

    Nahid, Noor Ahmed; Apu, Mohd Nazmul Hasan; Islam, Md Reazul; Shabnaz, Samia; Chowdhury, Surid Mohammad; Ahmed, Maizbha Uddin; Nahar, Zabun; Islam, Md Siddiqul; Islam, Mohammad Safiqul; Hasnat, Abul

    2018-01-01

    Significant inter-individual variation in sensitivity to 5-fluorouracil (5-FU) represents a major therapeutic hindrance, either by impairing drug response or by inducing adverse drug reactions (ADRs). This study aimed at exploring the causes behind these inter-individual differences in the outcomes of 5-fluorouracil-based chemotherapy by investigating the effects of DPYD*2A and MTHFR C677T polymorphisms on the toxicity and response of 5-FU in Bangladeshi colorectal cancer patients. Colorectal cancer patients (n = 161) receiving 5-FU-based chemotherapy were prospectively enrolled. DPYD and MTHFR polymorphisms were assessed in peripheral leukocytes. Multivariate analyses were applied to evaluate which variables could predict chemotherapy-induced toxicity and efficacy. Multivariate analyses showed that the DPYD*2A polymorphism was a predictive factor (P = 0.023) for grade 3 and grade 4 5-fluorouracil-related toxicities. Although the MTHFR C677T polymorphism might act as a forecaster for grade 3 or grade 4 neutropenia, diarrhea, and mucositis, this polymorphism was found to significantly increase (P = 0.006) the response to 5-FU. DPYD*2A and MTHFR C677T polymorphisms could explain 5-FU toxicity or clinical outcome in Bangladeshi colorectal cancer patients.

  20. A Personalized Predictive Framework for Multivariate Clinical Time Series via Adaptive Model Selection.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2017-11-01

    Building an accurate predictive model of clinical time series for a patient is critical for understanding the patient's condition, its dynamics, and optimal patient management. Unfortunately, this process is not straightforward. First, patient-specific variations are typically large, and population-based models derived or learned from many different patients are often unable to support accurate predictions for each individual patient. Moreover, the time series observed for one patient at any point in time may be too short and insufficient to learn a high-quality patient-specific model just from the patient's own data. To address these problems we propose, develop and experiment with a new adaptive forecasting framework for building multivariate clinical time series models for a patient and for supporting patient-specific predictions. The framework relies on an adaptive model switching approach that at any point in time selects the most promising time series model out of a pool of many possible models and, consequently, combines the advantages of population, patient-specific and short-term individualized predictive models. We demonstrate that the adaptive model switching framework is a very promising approach to supporting personalized time series prediction, and that it is able to outperform predictions based on pure population and patient-specific models, as well as other patient-specific model adaptation strategies.
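
    The switching idea can be reduced to a few lines of Python: keep a pool of forecasters and, at each step, use the one with the smallest recent error. The toy population, patient-specific and short-window models below are stand-ins chosen for illustration, not the models used in the paper.

      # Minimal sketch: pick, at every step, the pool member with the best recent error.
      import numpy as np

      rng = np.random.default_rng(5)
      series = 7.0 + np.cumsum(rng.normal(0, 0.2, 200))     # one patient's lab values

      models = {
          "population": lambda h: 7.0,                      # population prior, ignores data
          "patient": lambda h: float(np.mean(h)),           # all of the patient's history
          "recent": lambda h: float(np.mean(h[-5:])),       # short-term individualized model
      }
      errors = {name: [] for name in models}
      window, preds = 10, []

      for t in range(10, len(series)):
          history = series[:t]
          scores = {n: np.mean(e[-window:]) if e else np.inf for n, e in errors.items()}
          best = min(scores, key=scores.get)                # model with best recent MAE
          preds.append(models[best](history))
          for name, f in models.items():
              errors[name].append(abs(f(history) - series[t]))

      print("switched-model MAE:", np.mean(np.abs(np.array(preds) - series[10:])))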

  1. Forecasting of Average Monthly River Flows in Colombia

    NASA Astrophysics Data System (ADS)

    Mesa, O. J.; Poveda, G.

    2006-05-01

    The last two decades have witnessed a marked increase in our knowledge of the causes of interannual hydroclimatic variability and our ability to make predictions. Colombia, located near the seat of the ENSO phenomenon, has been shown to experience negative (positive) anomalies in precipitation in concert with El Niño (La Niña). In general, besides the Pacific Ocean, Colombia has climatic influences from the Atlantic Ocean and the Caribbean Sea, and through the tropical forest of the Amazon basin and the savannas of the Orinoco River, on top of the orographic and hydro-climatic effects introduced by the Andes. As in various other countries of the region, hydro-electric power contributes a large proportion (75 %) of the total electricity generation in Colombia. Also, most agriculture is rain-fed, and domestic water supply relies mainly on surface waters from creeks and rivers. Besides, various vector-borne tropical diseases intensify in response to rain and temperature changes. Therefore, there is a direct connection between climatic fluctuations and national and regional economies. This talk specifically presents different forecasts of average monthly stream flows for the inflow into the largest reservoir used for hydropower generation in Colombia, and illustrates the potential economic savings of such forecasts. Because of the planning of reservoir operation, the most appropriate time scale for this application is the annual to interannual. Fortunately, this corresponds to the scale at which our understanding of hydroclimate variability has improved significantly. Among the different possibilities we have explored: traditional statistical ARIMA models, multiple linear regression, natural and constructed analogue models, the linear inverse model, neural network models, the non-parametric regression splines (MARS) model, regime-dependent Markovian models and one we termed PREBEO, which is based on spectral band decomposition using wavelets. Most of the methods make use of the climatic observations and the general prediction models of ENSO which are routinely reported in various sources (http://www.cpc.ncep.noaa.gov/). We will compare the forecasting skills of the models, depending on lead time and initial month of forecasting. Besides ENSO indices, tropical Atlantic sea surface temperatures and the North Atlantic Oscillation index are relevant for these predictions in Colombia. Clear-cut benefits of these predictions are evident for the operation of the system. Ever since the 1991-1992 ENSO event the government, power companies and big consumers have realized its importance and routinely incorporated it into their operational planning. On the contrary, this new knowledge has not been useful for the expansion of the system to accommodate the increasing demand. Some kind of resonance between the scale of fluctuation of climate and the memory of decision makers produces a hydro-illogical cycle of urgency during El Niño dry times and of unawareness during La Niña abundance.

  2. Short-term forecasting of aftershock sequences, microseismicity and swarms inside the Corinth Gulf continental rift

    NASA Astrophysics Data System (ADS)

    Segou, Margarita

    2014-05-01

    Corinth Gulf (Central Greece) is the fastest continental rift in the world, with extension rates of 11-15 mm/yr and diverse seismic deformation, including earthquakes with M greater than 6.0, several periods of increased microseismic activity, usually lasting a few months and possibly related to fluid diffusion, and swarm episodes lasting a few days. In this study I perform a retrospective forecast experiment for 1995-2012, focusing on the comparison between physics-based and statistical models for short-term time classes. Even though the Corinth gulf has been studied extensively in the past, there is still a debate today as to whether earthquake activity is related to the existence of either a shallow-dipping structure or steeply dipping normal faults. In the light of the above, two CRS realizations are based on resolving Coulomb stress changes on specified receiver faults expressing the aforementioned structural models, whereas the third CRS model uses planes optimally oriented for failure. The CRS implementation accounts for stress changes following all major ruptures with M greater than 4.5 within the testing phase. I also estimate fault constitutive parameters from modeling the response to major earthquakes in the vicinity of the gulf (Aσ = 0.2, stressing rate approx. 0.02 bar/yr). The generic ETAS parameters are taken as the maximum likelihood estimates derived from the stochastic declustering of the modern seismicity catalog (1995-2012) with minimum triggering magnitude M2.5. I test whether the generic ETAS model can efficiently describe not only the aftershock spatio-temporal clustering but also the evolution of swarm episodes and microseismicity. To this end, I implement likelihood tests to evaluate the forecasts for their spatial consistency and for the total amount of predicted versus observed events with M greater than 3.0 in 10-day time windows during three distinct evaluation phases; the first evaluation phase focuses on the Aigio 1995 aftershock sequence (15/06/1995, M6.4), the second covers the period between September 2006 and May 2007, characterized by its intense microseismicity, and the third is related to the May 2013 swarm. The conclusions support that (1) geology-based CRS models are preferred over optimally oriented planes, (2) CRS models are consistent forecasters (60-70%) of transient seismicity, having in most cases comparable performance with ETAS models, (3) microseismicity and swarms are not triggered by static stress changes of preceding local events with magnitude M greater than 4.5, and (4) the generic ETAS model can efficiently describe the recent swarm episode. The findings of this study have a number of important implications for future short-term forecasting and time-dependent hazard assessment within the Corinth Gulf.
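
    For readers unfamiliar with the ETAS family used above, the Python sketch below evaluates a temporal ETAS conditional intensity for a toy catalog; the parameter values and events are illustrative assumptions, not the maximum likelihood estimates or catalog of this study.

      # Minimal sketch: temporal ETAS rate lambda(t) = mu + sum_i K*exp(alpha*(m_i-m0))/(t-t_i+c)^p
      import numpy as np

      def etas_rate(t, times, mags, mu=0.05, K=0.02, alpha=1.2, c=0.01, p=1.1, m0=2.5):
          past = times < t
          dt = t - times[past]
          return mu + np.sum(K * np.exp(alpha * (mags[past] - m0)) / (dt + c) ** p)

      # Toy catalog (days, magnitudes), e.g. a mainshock on day 10.
      times = np.array([1.0, 4.2, 10.0, 10.3, 11.1, 15.7])
      mags = np.array([2.7, 3.1, 6.4, 3.5, 4.0, 2.9])

      for day in (10.5, 12.0, 30.0):
          print(f"day {day:5.1f}: expected rate = {etas_rate(day, times, mags):.3f} events/day")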

  3. Physics-Based and Statistical Forecasting in Slowly Stressed Environments

    NASA Astrophysics Data System (ADS)

    Segou, M.; Deschamps, A.

    2013-12-01

    We perform a retrospective forecasting experiment for 1995-2012, comparing the predictive power of physics-based and statistical models in the Corinth Gulf (Central Greece), which is the fastest continental rift in the world with extension rates of 11-15 mm/yr, nevertheless at least three times lower than the motion accommodated by the San Andreas Fault System (~40 mm/yr). The seismicity of the western Corinth gulf has been characterized by significant historical events (1817 M6.6, 1861 M6.7, 1889 M7.0), whereas the modern instrumental catalog (post-1964) reveals one major event, the 1995 Aigio M6.4 (15/06/1995), together with several periods of increased microseismic activity, usually lasting a few months and possibly related to fluid diffusion. We examine six predictive models: three based on the combination of Coulomb stress changes and rate-and-state theory (CRS), two epidemic-type aftershock sequence (ETAS) models and one hybrid CRS-ETAS (h-ETAS) model. We investigate whether the above forecast models can adequately describe the episodic swarm activity within the gulf. Even though the Corinth gulf has been studied extensively in the past, there is still a debate today as to whether earthquake activity is related to the existence of either a shallow-dipping structure or steeply dipping normal faults. In the light of the above, two CRS realizations are based on resolving Coulomb stress changes on specified receiver faults expressing the aforementioned structural models, whereas the third CRS model uses planes optimally oriented for failure. In our CRS implementation we account for stress changes following all major ruptures within our testing phase with M greater than 4.5. We also estimate fault constitutive parameters from modeling the response to major earthquakes in the vicinity of the gulf (Aσ = 0.2, stressing rate 0.02 bar/yr). The ETAS parameters are taken as the maximum likelihood estimates derived from stochastic declustering of the modern seismicity catalog with minimum triggering magnitude M2.5. We implement likelihood tests to evaluate our forecasts for their spatial consistency and for the total amount of predicted versus observed events with M greater than 3.0 in 10-day time intervals in two distinct evaluation phases. The first evaluation phase focuses on the Aigio 1995 aftershock sequence (15/06/1995, M6.4), whereas the second covers the period between September 2006 and May 2007, characterized by intense swarm activity. We find that (1) geology-based CRS models are preferred over optimally oriented planes, (2) CRS models are consistent forecasters (60-70%) of transient seismicity, having in most cases comparable performance with ETAS models, and (3) swarms are not triggered by static stress changes of preceding local events.

  4. The meta-Gaussian Bayesian Processor of forecasts and associated preliminary experiments

    NASA Astrophysics Data System (ADS)

    Chen, Fajing; Jiao, Meiyan; Chen, Jing

    2013-04-01

    Public weather services are trending toward providing users with probabilistic weather forecasts in place of traditional deterministic forecasts. Probabilistic forecasting techniques are continually being improved to optimize the available forecasting information. The Bayesian Processor of Forecasts (BPF), a new statistical method for probabilistic forecasting, can transform a deterministic forecast into a probabilistic forecast according to the historical statistical relationship between observations and forecasts generated by that forecasting system. This technique accounts for the typical forecasting performance of a deterministic forecasting system in quantifying the forecast uncertainty. The meta-Gaussian likelihood model is suitable for a variety of stochastic dependence structures with monotone likelihood ratios. The meta-Gaussian BPF adopting this kind of likelihood model can therefore be applied across many fields, including meteorology and hydrology. The Bayes theorem with two continuous random variables and the normal-linear BPF are briefly introduced. The meta-Gaussian BPF for a continuous predictand using a single predictor is then presented and discussed. The performance of the meta-Gaussian BPF is tested in a preliminary experiment. Control forecasts of daily surface temperature at 0000 UTC at Changsha and Wuhan stations are used as the deterministic forecast data. These control forecasts are taken from ensemble predictions with a 96-h lead time generated by the National Meteorological Center of the China Meteorological Administration, the European Centre for Medium-Range Weather Forecasts, and the US National Centers for Environmental Prediction during January 2008. The results of the experiment show that the meta-Gaussian BPF can transform a deterministic control forecast of surface temperature from any one of the three ensemble predictions into a useful probabilistic forecast of surface temperature. These probabilistic forecasts quantify the uncertainty of the control forecast; accordingly, the performance of the probabilistic forecasts differs based on the source of the underlying deterministic control forecasts.
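
    The normal-linear special case mentioned above can be written out in a few lines: a climatological prior for the predictand is combined with a likelihood obtained by regressing the historical forecasts on the observations. The simulated training sample in the Python sketch below is an illustrative assumption, and the marginal (normal quantile) transforms that make the scheme meta-Gaussian are omitted.

      # Minimal sketch: normal-linear Bayesian processing of a deterministic forecast.
      import numpy as np

      rng = np.random.default_rng(11)
      obs = rng.normal(2.0, 6.0, 400)                    # historical observed temperatures
      fcst = 1.0 + 0.9 * obs + rng.normal(0, 2.0, 400)   # matching control forecasts

      M, S2 = obs.mean(), obs.var()                      # prior g(w): climatology
      a, b = np.polyfit(obs, fcst, 1)                    # likelihood f(x|w) = N(a*w + b, s2)
      s2 = np.var(fcst - (a * obs + b))

      def posterior(x):
          """Posterior mean and standard deviation of the predictand given forecast x."""
          T2 = 1.0 / (1.0 / S2 + a**2 / s2)
          return T2 * (M / S2 + a * (x - b) / s2), np.sqrt(T2)

      mean, sd = posterior(5.0)
      print(f"forecast 5.0 -> posterior N({mean:.2f}, {sd:.2f}^2)")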

  5. Multivariate constrained shape optimization: Application to extrusion bell shape for pasta production

    NASA Astrophysics Data System (ADS)

    Sarghini, Fabrizio; De Vivo, Angela; Marra, Francesco

    2017-10-01

    Computational science and engineering methods have allowed a major change in the way products and processes are designed, as validated virtual models - capable of simulating the physical, chemical and biological changes occurring during production processes - can be realized and used in place of real prototypes and experiments, which are often time and money consuming. Among such techniques, Optimal Shape Design (OSD) (Mohammadi & Pironneau, 2004) represents an interesting approach. While most classical numerical simulations consider fixed geometrical configurations, in OSD a certain number of geometrical degrees of freedom are considered as part of the unknowns: this implies that the geometry is not completely defined, but part of it is allowed to move dynamically in order to minimize or maximize the objective function. The applications of optimal shape design are countless. For systems governed by partial differential equations, they range from structural mechanics to electromagnetism and fluid mechanics, or a combination of the three. This paper presents one possible application of OSD, in particular how the extrusion bell shape for pasta production can be designed by applying a multivariate constrained shape optimization.

  6. Multivariate analysis of variance of designed chromatographic data. A case study involving fermentation of rooibos tea.

    PubMed

    Marini, Federico; de Beer, Dalene; Walters, Nico A; de Villiers, André; Joubert, Elizabeth; Walczak, Beata

    2017-03-17

    An ultimate goal of investigations of rooibos plant material subjected to different stages of fermentation is to identify the chemical changes taking place in the phenolic composition, using an untargeted approach and chromatographic fingerprints. Realization of this goal requires, among other things, identification of the main components of the plant material involved in chemical reactions during the fermentation process. Quantitative chromatographic data for the compounds in extracts of green, semi-fermented and fermented rooibos form the basis of a preliminary study following a targeted approach. The aim is to estimate whether the treatment has a significant effect based on all quantified compounds and to identify the compounds that contribute significantly to it. Analysis of variance is performed using modern multivariate methods such as ANOVA-Simultaneous Component Analysis, ANOVA - Target Projection and regularized MANOVA. This study is the first one in which all three approaches are compared and evaluated. For the data studied, all three methods reveal the same significance of the fermentation effect on the extract compositions, but they lead to different interpretations of it. Copyright © 2017 Elsevier B.V. All rights reserved.
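
    The core of the ANOVA-Simultaneous Component Analysis idea can be sketched as follows: centre the data, replace each row by its treatment-group mean to form the effect matrix, run PCA on that effect matrix and project the samples (effect plus residuals) onto its components. The simulated compound table and group labels below are stand-ins, not the rooibos data, and the permutation test used to assess significance is omitted.

      # Minimal sketch: ASCA-style decomposition of a samples x compounds matrix.
      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(17)
      groups = np.repeat(["green", "semi", "fermented"], 8)    # 24 samples
      p = 15                                                   # quantified compounds
      X = rng.normal(0, 1, (24, p))
      X[groups == "fermented"] += rng.normal(1.5, 0.2, p)      # fermentation shifts some compounds

      Xc = X - X.mean(axis=0)                                  # overall centring
      effect = np.vstack([Xc[groups == g].mean(axis=0) for g in groups])  # treatment effect matrix
      residuals = Xc - effect

      pca = PCA(n_components=2).fit(effect)
      scores = pca.transform(effect + residuals)               # samples projected onto effect PCs
      print("variance of the effect matrix explained:", pca.explained_variance_ratio_.round(3))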

  7. Effects of Forecasted Climate Change on Stream Temperatures in the Nooksack River Basin

    NASA Astrophysics Data System (ADS)

    Truitt, S. E.; Mitchell, R. J.; Yearsley, J. R.; Grah, O. J.

    2017-12-01

    The Nooksack River in northwest Washington State provides valuable habitat for endangered salmon species; as such, it is critical to understand how stream temperatures will be affected by forecasted climate change. The Middle and North Fork basins of the Nooksack are high-relief and glaciated, whereas the South Fork is a lower-relief, rain- and snow-dominated basin. Due to a moderate Pacific maritime climate, snowpack in the basins is sensitive to temperature increases. Previous modeling studies in the upper Nooksack basins indicate a reduction in snowpack and spring runoff, and a recession of glaciers into the 21st century. How stream temperatures will respond to these changes is unknown. We use the Distributed Hydrology Soil Vegetation Model (DHSVM) coupled with a glacier dynamics model and the River Basin Model (RBM) to simulate hydrology and stream temperature from the present to the year 2100. We calibrate the DHSVM and RBM to the three forks in the upper 1550 km2 of the Nooksack basin, which contain an estimated 3400 hectares of glacial ice. We employ observed stream-temperature data collected over the past decade, hydrologic data from the four USGS streamflow monitoring sites within the basin, and observed gridded climate data developed by Livneh et al. (2013). Field work was conducted in the summer of 2016 to determine stream morphology, discharge, and stream temperatures at a number of stream segments for the RBM calibration. We simulate forecasted climate change impacts using gridded daily downscaled data from CMIP5 global climate models under the RCP4.5 and RCP8.5 forcing scenarios, developed using the multivariate adaptive constructed analogs method (MACA; Abatzoglou and Brown, 2011). Simulation results project an increasing trend in stream temperature as a result of lower snowmelt and higher air temperatures into the 21st century, especially in the lower-relief, unglaciated South Fork basin.

  8. A two-model hydrologic ensemble prediction of hydrograph: case study from the upper Nysa Klodzka river basin (SW Poland)

    NASA Astrophysics Data System (ADS)

    Niedzielski, Tomasz; Mizinski, Bartlomiej

    2016-04-01

    The HydroProg system has been developed in the frame of research project no. 2011/01/D/ST10/04171 of the National Science Centre of Poland and steadily produces multimodel ensemble predictions of the hydrograph in real time. Although there are six ensemble members available at present, the longest record of predictions and their statistics is available for two data-based models (uni- and multivariate autoregressive models). Thus, we consider 3-hour predictions of water levels, with lead times ranging from 15 to 180 minutes, computed every 15 minutes since August 2013 for the Nysa Klodzka basin (SW Poland) using the two approaches and their two-model ensemble. Since the launch of the HydroProg system there have been 12 high flow episodes, and the objective of this work is to present the performance of the two-model ensemble in forecasting these events. For the sake of brevity, we limit our investigation to a single gauge located on the Nysa Klodzka river in the town of Klodzko, which is centrally located in the studied basin. We identified certain regular scenarios of how the models perform in predicting the high flows in Klodzko. In the initial phase of a high flow, well before the rising limb of the hydrograph, the two-model ensemble is found to provide the most skilful prognoses of water levels. However, while forecasting the rising limb of the hydrograph, either the two-model solution or the vector autoregressive model offers the best predictive performance. In addition, it is hypothesized that as the rising limb develops, the vector autoregression becomes the most skilful amongst the scrutinized approaches. Our simple two-model exercise confirms that multimodel hydrologic ensemble predictions cannot be treated as universal solutions suitable for forecasting the entire high flow event; rather, their superior performance may hold only for certain phases of a high flow.
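
    A stripped-down analogue of the uni-/multivariate pairing described above is shown below: a univariate autoregression and a vector autoregression are fitted to a simulated pair of gauges and their forecasts averaged into a two-model ensemble. The data, lag orders and equal weights are illustrative assumptions and do not reproduce the HydroProg models.

      # Minimal sketch: AR and VAR forecasts of a downstream gauge, plus their ensemble mean.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.ar_model import AutoReg
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(23)
      n = 500
      upstream = np.cumsum(rng.normal(0, 1, n))
      target = 0.6 * np.roll(upstream, 4) + rng.normal(0, 0.5, n)   # downstream lags upstream
      df = pd.DataFrame({"target": target, "upstream": upstream}).iloc[10:]

      h = 12                                          # e.g. 12 x 15-minute steps = 3 hours
      train, test = df.iloc[:-h], df.iloc[-h:]

      ar_fc = np.asarray(AutoReg(train["target"], lags=8).fit().forecast(h))
      var_res = VAR(train).fit(maxlags=8)
      var_fc = var_res.forecast(train.values[-var_res.k_ar:], steps=h)[:, 0]
      ens_fc = 0.5 * (ar_fc + var_fc)

      for name, fc in [("AR", ar_fc), ("VAR", var_fc), ("ensemble", ens_fc)]:
          rmse = np.sqrt(np.mean((fc - test["target"].values) ** 2))
          print(f"{name:>8s} RMSE: {rmse:.3f}")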

  9. A temporal and spatial analysis of ground-water levels for effective monitoring in Huron County, Michigan

    USGS Publications Warehouse

    Holtschlag, David J.; Sweat, M.J.

    1999-01-01

    Quarterly water-level measurements were analyzed to assess the effectiveness of a monitoring network of 26 wells in Huron County, Michigan. Trends were identified as constant levels and autoregressive components were computed at all wells on the basis of data collected from 1993 to 1997, using structural time series analysis. Fixed seasonal components were identified at 22 wells and outliers were identified at 23 wells. The 95-percent confidence intervals were forecast for water levels during the first and second quarters of 1998. Intervals in the first quarter were consistent with 92.3 percent of the measured values. In the second quarter, measured values were within the forecast intervals only 65.4 percent of the time. Unusually low precipitation during the second quarter is thought to have contributed to the reduced reliability of the second-quarter forecasts. Spatial interrelations among wells were investigated on the basis of the autoregressive components, which were filtered to create a set of innovation sequences that were temporally uncorrelated. The empirical covariance among the innovation sequences indicated both positive and negative spatial interrelations. The negative covariance components are considered to be physically implausible and to have resulted from random sampling error. Graphical modeling, a form of multivariate analysis, was used to model the covariance structure. Results indicate that only 29 of the 325 possible partial correlations among the water-level innovations were statistically significant. The model covariance matrix, corresponding to the model partial correlation structure, contained only positive elements. This model covariance was sequentially partitioned to compute a set of partial covariance matrices that were used to rank the effectiveness of the 26 monitoring wells from greatest to least. Results, for example, indicate that about 50 percent of the uncertainty of the water-level innovations currently monitored by the 26-well network could be described by the 6 most effective wells.
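
    A single-well version of the structural time series step can be sketched with statsmodels' unobserved-components model, combining a local level, a fixed quarterly seasonal and an autoregressive term and forecasting a 95-percent interval for the next two quarters; the simulated record and component choices are illustrative assumptions, not the Huron County data.

      # Minimal sketch: level + seasonal + AR structural model for one quarterly series.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(19)
      n = 20                                           # five years of quarterly measurements
      season = np.tile([0.3, -0.1, -0.4, 0.2], n // 4)
      level = 180.0 + np.cumsum(rng.normal(0, 0.05, n))
      y = pd.Series(level + season + rng.normal(0, 0.1, n),
                    index=pd.period_range("1993Q1", periods=n, freq="Q"))

      mod = sm.tsa.UnobservedComponents(y, level="local level", seasonal=4, autoregressive=1)
      res = mod.fit(disp=False)
      fc = res.get_forecast(steps=2)
      print(fc.summary_frame(alpha=0.05)[["mean", "mean_ci_lower", "mean_ci_upper"]])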

  10. Toward a multivariate reanalysis of the North Atlantic ocean biogeochemistry during 1998-2006 based on the assimilation of SeaWiFS chlorophyll data

    NASA Astrophysics Data System (ADS)

    Fontana, C.; Brasseur, P.; Brankart, J.-M.

    2012-04-01

    Today, the routine assimilation of satellite data into operational models of the ocean circulation is mature enough to enable the production of global reanalyses describing the ocean circulation variability during the past decades. The expansion of the "reanalysis" concept from ocean physics to biogeochemistry is a timely challenge that motivates the present study. The objective of this paper is to investigate the potential benefits of assimilating satellite-estimated chlorophyll data into a basin-scale three-dimensional coupled physical-biogeochemical model of the North Atlantic. The aim is, on the one hand, to improve forecasts of ocean biogeochemical properties and, on the other hand, to define a methodology for producing data-driven climatologies based on coupled physical-biogeochemical modelling. A simplified variant of the Kalman filter is used to assimilate ocean color data during a 9-year-long period. In this frame, two experiments are carried out, with and without anamorphic transformations of the state vector variables. Data assimilation efficiency is assessed with respect to the assimilated data set, the nitrate World Ocean Atlas database and a derived climatology. Over the simulation period, the non-linear assimilation scheme clearly improves the analysis and forecast of surface chlorophyll concentrations, especially in the North Atlantic bloom region. Nitrate concentration forecasts are also improved thanks to the assimilation of ocean color data, although this improvement is limited to the upper layer of the water column, in agreement with recent related literature. This feature is explained by the weak correlation taken into account by the assimilation between surface phytoplankton and nitrate concentrations deeper than 50 m. The assessment of the non-linear assimilation experiments indicates that the proposed methodology provides the skeleton of an assimilative system suitable for reanalysing the ocean biogeochemistry based on ocean color data.

  11. Toward a multivariate reanalysis of the North Atlantic Ocean biogeochemistry during 1998-2006 based on the assimilation of SeaWiFS chlorophyll data

    NASA Astrophysics Data System (ADS)

    Fontana, C.; Brasseur, P.; Brankart, J.-M.

    2013-01-01

    Today, the routine assimilation of satellite data into operational models of ocean circulation is mature enough to enable the production of global reanalyses describing the ocean circulation variability during the past decades. The expansion of the "reanalysis" concept from ocean physics to biogeochemistry is a timely challenge that motivates the present study. The objective of this paper is to investigate the potential benefits of assimilating satellite-estimated chlorophyll data into a basin-scale three-dimensional coupled physical-biogeochemical model of the North Atlantic. The aim is on the one hand to improve forecasts of ocean biogeochemical properties and on the other hand to define a methodology for producing data-driven climatologies based on coupled physical-biogeochemical modeling. A simplified variant of the Kalman filter is used to assimilate ocean color data during a 9-year period. In this frame, two experiments are carried out, with and without anamorphic transformations of the state vector variables. Data assimilation efficiency is assessed with respect to the assimilated data set, nitrate of the World Ocean Atlas database and a derived climatology. Over the simulation period, the non-linear assimilation scheme clearly improves the analysis and forecast of surface chlorophyll concentrations, especially in the North Atlantic bloom region. Nitrate concentration forecasts are also improved thanks to the assimilation of ocean color data, although this improvement is limited to the upper layer of the water column, in agreement with recent related literature. This feature is explained by the weak correlation taken into account by the assimilation between surface phytoplankton and nitrate concentrations deeper than 50 meters. The assessment of the non-linear assimilation experiments indicates that the proposed methodology provides the skeleton of an assimilative system suitable for reanalyzing the ocean biogeochemistry based on ocean color data.

  12. Forecasting daily attendances at an emergency department to aid resource planning

    PubMed Central

    Sun, Yan; Heng, Bee Hoon; Seow, Yian Tay; Seow, Eillyne

    2009-01-01

    Background Accurate forecasting of emergency department (ED) attendances can be a valuable tool for micro and macro level planning. Methods Data for analysis were the counts of daily patient attendances at the ED of an acute care regional general hospital from July 2005 to Mar 2008. Patients were stratified into three acuity categories, i.e. P1, P2 and P3, with P1 being the most acute and P3 being the least acute. The autoregressive integrated moving average (ARIMA) method was separately applied to each of the three acuity categories and to total patient attendances. Independent variables included in the model were public holiday (yes or no), ambient air quality measured by the pollution standard index (PSI), daily ambient average temperature and daily relative humidity. The seasonal components of weekly and yearly periodicities in the time series of daily attendances were also studied. Univariate analysis by t-tests and multivariate time series analysis were carried out in SPSS version 15. Results By time series analyses, P1 attendances did not show any weekly or yearly periodicity and were only predicted by ambient air quality of PSI > 50. P2 and total attendances showed weekly periodicities, and were also significantly predicted by public holiday. P3 attendances were significantly correlated with day of the week, month of the year, public holiday, and ambient air quality of PSI > 50. After applying the developed models to validate the forecasts, the mean absolute percentage errors (MAPE) of prediction by the models were 16.8%, 6.7%, 8.6% and 4.8% for P1, P2, P3 and total attendances, respectively. The models were able to account for most of the significant autocorrelations present in the data. Conclusion Time series analysis has been shown to provide a useful, readily available tool for predicting emergency department workload that can be used in staff rostering and resource planning. PMID:19178716
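
    A minimal sketch of this kind of model in Python (the study itself used SPSS): a seasonal ARIMA with exogenous regressors for daily attendances. The column names, model orders and the 7-day seasonal period are assumptions for illustration.

```python
# Illustrative sketch only (not the paper's SPSS model): a seasonal ARIMA with
# exogenous regressors for daily ED attendances. Orders and names are assumed.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

def fit_ed_model(df: pd.DataFrame):
    """df columns (assumed): 'attendances', 'holiday', 'psi_gt_50',
    'temperature', 'humidity'; rows indexed by a daily DatetimeIndex."""
    exog = df[["holiday", "psi_gt_50", "temperature", "humidity"]]
    model = SARIMAX(df["attendances"], exog=exog,
                    order=(1, 0, 1),              # short-memory ARMA terms
                    seasonal_order=(1, 0, 1, 7))  # weekly periodicity
    return model.fit(disp=False)

# Forecast the next 14 days given future values of the regressors:
# res = fit_ed_model(df); res.forecast(steps=14, exog=future_exog)
```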

  13. Toward the assimilation of biogeochemical data in the CMEMS BIOMER coupled physical-biogeochemical operational system

    NASA Astrophysics Data System (ADS)

    Lamouroux, Julien; Testut, Charles-Emmanuel; Lellouche, Jean-Michel; Perruche, Coralie; Paul, Julien

    2017-04-01

    The operational production of a data-assimilated biogeochemical state of the ocean is one of the challenging core projects of the Copernicus Marine Environment Monitoring Service. In that framework - and with the April 2018 CMEMS V4 release as a target - Mercator Ocean is in charge of improving the realism of its global ¼° BIOMER coupled physical-biogeochemical (NEMO/PISCES) simulations, analyses and re-analyses, and of developing an effective capacity to routinely estimate the biogeochemical state of the ocean, through the implementation of biogeochemical data assimilation. Primary objectives are to enhance the time representation of the seasonal cycle in the real time and reanalysis systems, and to provide a better control of the production in the equatorial regions. The assimilation of BGC data will rely on a simplified version of the SEEK filter, where the error statistics do not evolve with the model dynamics. The associated forecast error covariances are based on the statistics of a collection of 3D ocean state anomalies. The anomalies are computed from a multi-year numerical experiment (free run without assimilation) with respect to a running mean in order to estimate the 7-day scale error on the ocean state at a given period of the year. These forecast error covariances thus rely on a fixed-basis, seasonally variable ensemble of anomalies. This methodology, which is currently implemented in the "blue" component of the CMEMS operational forecast system, is now being adapted to the biogeochemical part of the operational system. Regarding observations - and as a first step - the system shall rely on the CMEMS GlobColour Global Ocean surface chlorophyll concentration products, delivered in NRT. The objective of this poster is to provide a detailed overview of the implementation of the aforementioned data assimilation methodology in the CMEMS BIOMER forecasting system. Focus shall be put on (1) the assessment of the capabilities of this data assimilation methodology to provide satisfactory statistics of the model variability errors (through space-time analysis of dedicated representers of satellite surface Chla observations), (2) the dedicated features of the data assimilation configuration that have been implemented so far (e.g. log-transformation of the analysis state, multivariate Chlorophyll-Nutrient control vector, etc.) and (3) the assessment of the performances of this future operational data assimilation configuration.

  14. Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Cai, X.; Yang, D.

    2010-12-01

    Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces the Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecasts as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies the reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. (Figure: schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.)
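
    The abstract's central device, the Martingale Model of Forecast Evolution, can be sketched numerically: the forecast of a fixed target is revised at each step by an independent zero-mean increment, so its error variance shrinks as lead time shortens. The additive form and the even variance schedule below are assumptions for illustration, not the paper's calibration.

```python
# Hedged sketch of an additive Martingale Model of Forecast Evolution (MMFE):
# each revision of the forecast of one target inflow is a zero-mean,
# independent increment, so uncertainty shrinks as lead time shortens.
import numpy as np

def simulate_mmfe(final_flow: float, horizon: int = 10, total_var: float = 1.0,
                  n_paths: int = 1000, seed: int = 0) -> np.ndarray:
    """Simulate forecast trajectories of one target inflow.

    Returns array (n_paths, horizon + 1); column t is the forecast issued
    horizon - t steps before the target, and column -1 equals the realization.
    """
    rng = np.random.default_rng(seed)
    # Split the total forecast-error variance evenly across the updates.
    inc_sd = np.sqrt(total_var / horizon)
    increments = rng.normal(0.0, inc_sd, size=(n_paths, horizon))
    # Work backwards from the realization by removing not-yet-revealed increments.
    paths = np.empty((n_paths, horizon + 1))
    paths[:, -1] = final_flow
    for t in range(horizon - 1, -1, -1):
        paths[:, t] = paths[:, t + 1] - increments[:, t]
    return paths

# Example: spread of the longest-lead vs the 1-step-ahead forecast
# p = simulate_mmfe(100.0); print(p[:, 0].std(), p[:, -2].std())
```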

  15. REGIONAL-SCALE WIND FIELD CLASSIFICATION EMPLOYING CLUSTER ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Glascoe, L G; Glaser, R E; Chin, H S

    2004-06-17

    The classification of time-varying multivariate regional-scale wind fields at a specific location can assist event planning as well as consequence and risk analysis. Further, wind field classification involves data transformation and inference techniques that effectively characterize stochastic wind field variation. Such a classification scheme is potentially useful for addressing overall atmospheric transport uncertainty and meteorological parameter sensitivity issues. Different methods to classify wind fields over a location include the principal component analysis of wind data (e.g., Hardy and Walton, 1978) and the use of cluster analysis for wind data (e.g., Green et al., 1992; Kaufmann and Weber, 1996). The goal of this study is to use a clustering method to classify the winds of a gridded data set, i.e., from meteorological simulations generated by a forecast model.
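
    A hedged sketch of the clustering step (not the study's exact configuration): each time step's gridded (u, v) wind field is flattened into one feature vector and the collection of fields is grouped into k regimes with k-means.

```python
# Hedged sketch of the general idea (k-means on gridded wind fields), not the
# study's exact method: flatten each time step's (u, v) field into one feature
# vector and group the fields into k regimes.
import numpy as np
from sklearn.cluster import KMeans

def classify_wind_fields(u: np.ndarray, v: np.ndarray, k: int = 5, seed: int = 0):
    """u, v: arrays (n_times, ny, nx) of wind components from model output.

    Returns cluster labels per time step and the cluster-mean wind fields.
    """
    n_times, ny, nx = u.shape
    features = np.hstack([u.reshape(n_times, -1), v.reshape(n_times, -1)])
    km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(features)
    centers = km.cluster_centers_.reshape(k, 2, ny, nx)  # (regime, component, y, x)
    return km.labels_, centers
```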

  16. Weather forecasting expert system study

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Weather forecasting is critical to both the Space Transportation System (STS) ground operations and the launch/landing activities at NASA Kennedy Space Center (KSC). The current launch frequency places significant demands on the USAF weather forecasters at the Cape Canaveral Forecasting Facility (CCFF), who currently provide the weather forecasting for all STS operations. As launch frequency increases, KSC's weather forecasting problems will be greatly magnified. The single most important problem is the shortage of highly skilled forecasting personnel. The development of forecasting expertise is difficult and requires several years of experience. Frequent personnel changes within the forecasting staff jeopardize the accumulation and retention of experience-based weather forecasting expertise. The primary purpose of this project was to assess the feasibility of using Artificial Intelligence (AI) techniques to ameliorate this shortage of experts by capturing and incorporating the forecasting knowledge of current expert forecasters into a Weather Forecasting Expert System (WFES) which would then be made available to less experienced duty forecasters.

  17. Forecasting fish biomasses, densities, productions, and bioaccumulation potentials of mid-atlantic wadeable streams.

    PubMed

    Barber, M Craig; Rashleigh, Brenda; Cyterski, Michael J

    2016-01-01

    Regional fishery conditions of Mid-Atlantic wadeable streams in the eastern United States are estimated using the Bioaccumulation and Aquatic System Simulator (BASS) bioaccumulation and fish community model and data collected by the US Environmental Protection Agency's Environmental Monitoring and Assessment Program (EMAP). Average annual biomasses and population densities and annual productions are estimated for 352 randomly selected streams. Realized bioaccumulation factors (BAF) and biomagnification factors (BMF), which are dependent on these forecasted biomasses, population densities, and productions, are also estimated by assuming constant water exposures to methylmercury and tetra-, penta-, hexa-, and hepta-chlorinated biphenyls. Using observed biomasses, observed densities, and estimated annual productions of total fish from 3 regions assumed to support healthy fisheries as benchmarks (eastern Tennessee and Catskill Mountain trout streams and Ozark Mountains smallmouth bass streams), 58% of the region's wadeable streams are estimated to be in marginal or poor condition (i.e., not healthy). Using simulated BAFs and EMAP Hg fish concentrations, we also estimate that approximately 24% of the game fish and subsistence fishing species that are found in streams having detectable Hg concentrations would exceed an acceptable human consumption criterion of 0.185 μg/g wet wt. Importantly, such streams have been estimated to represent 78.2% to 84.4% of the Mid-Atlantic's wadeable stream lengths. Our results demonstrate how a dynamic simulation model can support regional assessment and trends analysis for fisheries. © 2015 SETAC.

  18. Nature Run for the North Atlantic Ocean Hurricane Region: System Evaluation and Regional Applications

    NASA Astrophysics Data System (ADS)

    Kourafalou, V.; Androulidakis, I.; Halliwell, G. R., Jr.; Kang, H.; Mehari, M. F.; Atlas, R. M.

    2016-02-01

    A prototype ocean Observing System Simulation Experiments (OSSE) system, first developed and data-validated in the Gulf of Mexico, has been applied on the extended North Atlantic Ocean hurricane region. The main objectives of this study are: a) to contribute toward a fully relocatable ocean OSSE system by expanding the Gulf of Mexico OSSE to the North Atlantic Ocean; b) demonstrate and quantify improvements in hurricane forecasting when the ocean component of coupled hurricane models is advanced through targeted observations and assimilation. The system is based on the Hybrid Coordinate Ocean Model (HYCOM) and has been applied on a 1/25° Mercator mesh for the free-running Nature Run (NR) and on a 1/12° Mercator mesh for the data assimilative forecast model (FM). A "fraternal twin" system is employed, using two different realizations for NR and FM, each configured to produce substantially different physics and truncation errors. The NR has been evaluated using a variety of available observations, such as from AVISO, GDEM climatology and GHRSST observations, plus specific regional products (upper ocean profiles from air-borne instruments, surface velocity maps derived from the historical drifter data set and tropical cyclone heat potential maps derived from altimetry observations). The utility of the OSSE system to advance the knowledge of regional air-sea interaction processes related to hurricane activity is demonstrated in the Amazon region (salinity induced surface barrier layer) and the Gulf Stream region (hurricane impact on the Gulf Stream extension).

  19. Impact of VLSI/VHSIC on satellite on-board signal processing

    NASA Astrophysics Data System (ADS)

    Aanstoos, J. V.; Ruedger, W. H.; Snyder, W. E.; Kelly, W. L.

    Forecasted improvements in IC fabrication techniques, such as the use of X-ray lithography, are expected to yield submicron circuit feature sizes within the decade of the 1980s. As dimensions decrease, reliability, cost, speed, power consumption and density improvements will be realized which have a significant impact on the capabilities of onboard spacecraft signal processing functions. This will in turn result in increases in the intelligence that may be deployed on spaceborne remote sensing platforms. Among programs oriented toward such goals are the silicon-based Very High Speed Integrated Circuit (VHSIC) research sponsored by the U.S. Department of Defense, and efforts toward the development of GaAs devices which will compete with silicon VLSI technology for future applications. GaAs has an electron mobility which is five to six times that of silicon, and promises commensurate computation speed increases under low field conditions.

  20. A stochastic-dynamic model for global atmospheric mass field statistics

    NASA Technical Reports Server (NTRS)

    Ghil, M.; Balgovind, R.; Kalnay-Rivas, E.

    1981-01-01

    A model that yields the spatial correlation structure of atmospheric mass field forecast errors was developed. The model is governed by the potential vorticity equation forced by random noise. The solution was expanded in spherical harmonics, and the correlation function was computed analytically from the expansion coefficients. The finite-difference equivalent was solved using a fast Poisson solver, and the correlation function was computed using stratified sampling of the individual realizations of F(omega) and hence of phi(omega). A higher-order equation for gamma was derived and solved directly in finite differences by two successive applications of the fast Poisson solver. The methods were compared for accuracy and efficiency and the third method was chosen as clearly superior. The results agree well with the latitude dependence of observed atmospheric correlation data. The value of the parameter c sub o which gives the best fit to the data is close to the value expected from dynamical considerations.

  1. Global Precipitation Measurement Program and the Development of Dual-Frequency Precipitation Radar

    NASA Technical Reports Server (NTRS)

    Iguchi, Toshio; Oki, Riko; Smith, Eric A.; Furuhama, Yoji

    2002-01-01

    The Global Precipitation Measurement (GPM) program is a mission to measure precipitation from space, similar to but much expanded from the Tropical Rainfall Measuring Mission. Its scope is not limited to scientific research, but includes practical and operational applications such as weather forecasting and water resource management. To meet the requirements of operational use, the GPM uses multiple low-orbiting satellites to increase the sampling frequency and to create three-hourly global rain maps that will be delivered to the world in quasi-real time. A dual-frequency precipitation radar (DPR) will be installed on the primary satellite, which plays an important role in the whole mission. The DPR will enable measurement of precipitation with high sensitivity, high precision and high resolution. This paper describes an outline of the GPM program, its issues, and the roles and development of the DPR.

  2. Forecasting the Economic Impact of Future Space Station Operations

    NASA Technical Reports Server (NTRS)

    Summer, R. A.; Smolensky, S. M.; Muir, A. H.

    1967-01-01

    Recent manned and unmanned Earth-orbital operations have suggested great promise of improved knowledge and of substantial economic and associated benefits to be derived from services offered by a space station. Proposed application areas include agriculture, forestry, hydrology, public health, oceanography, natural disaster warning, and search/rescue operations. The need for reliable estimates of economic and related Earth-oriented benefits to be realized from Earth-orbital operations is discussed and recent work in this area is reviewed. Emphasis is given to those services based on remote sensing. Requirements for a uniform, comprehensive and flexible methodology are discussed. A brief review of the suggested methodology is presented. This methodology will be exercised through five case studies which were chosen from a gross inventory of almost 400 user candidates. The relationship of case study results to benefits in broader application areas is discussed. Some management implications of possible future program implementation are included.

  3. Integration of Behind-the-Meter PV Fleet Forecasts into Utility Grid System Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoff, Thomas Hoff; Kankiewicz, Adam

    Four major research objectives were completed over the course of this study. Three of the objectives were to evaluate three, new, state-of-the-art solar irradiance forecasting models. The fourth objective was to improve the California Independent System Operator’s (ISO) load forecasts by integrating behind-the-meter (BTM) PV forecasts. The three, new, state-of-the-art solar irradiance forecasting models included: the infrared (IR) satellite-based cloud motion vector (CMV) model; the WRF-SolarCA model and variants; and the Optimized Deep Machine Learning (ODML)-training model. The first two forecasting models targeted known weaknesses in current operational solar forecasts. They were benchmarked against existing operational numerical weather prediction (NWP) forecasts, visible satellite CMV forecasts, and measured PV plant power production. IR CMV, WRF-SolarCA, and ODML-training forecasting models all improved the forecast to a significant degree. Improvements varied depending on time of day, cloudiness index, and geographic location. The fourth objective was to demonstrate that the California ISO’s load forecasts could be improved by integrating BTM PV forecasts. This objective represented the project’s most exciting and applicable gains. Operational BTM forecasts consisting of 200,000+ individual rooftop PV forecasts were delivered into the California ISO’s real-time automated load forecasting (ALFS) environment. They were then evaluated side-by-side with operational load forecasts with no BTM-treatment. Overall, ALFS-BTM day-ahead (DA) forecasts performed better than baseline ALFS forecasts when compared to actual load data. Specifically, ALFS-BTM DA forecasts were observed to have the largest reduction of error during the afternoon on cloudy days. Shorter term 30 minute-ahead ALFS-BTM forecasts were shown to have less error under all sky conditions, especially during the morning time periods when traditional load forecasts often experience their largest uncertainties. This work culminated in a GO decision being made by the California ISO to include zonal BTM forecasts into its operational load forecasting system. The California ISO’s Manager of Short Term Forecasting, Jim Blatchford, summarized the research performed in this project with the following quote: “The behind-the-meter (BTM) California ISO region forecasting research performed by Clean Power Research and sponsored by the Department of Energy’s SUNRISE program was an opportunity to verify value and demonstrate improved load forecast capability. In 2016, the California ISO will be incorporating the BTM forecast into the Hour Ahead and Day Ahead load models to look for improvements in the overall load forecast accuracy as BTM PV capacity continues to grow.”

  4. Optimising seasonal streamflow forecast lead time for operational decision making in Australia

    NASA Astrophysics Data System (ADS)

    Schepen, Andrew; Zhao, Tongtiegang; Wang, Q. J.; Zhou, Senlin; Feikema, Paul

    2016-10-01

    Statistical seasonal forecasts of 3-month streamflow totals are released in Australia by the Bureau of Meteorology and updated on a monthly basis. The forecasts are often released in the second week of the forecast period, due to the onerous forecast production process. The current service relies on models built using data for complete calendar months, meaning the forecast production process cannot begin until the first day of the forecast period. Somehow, the bureau needs to transition to a service that provides forecasts before the beginning of the forecast period; timelier forecast release will become critical as sub-seasonal (monthly) forecasts are developed. Increasing the forecast lead time to one month ahead is not considered a viable option for Australian catchments that typically lack any predictability associated with snowmelt. The bureau's forecasts are built around Bayesian joint probability models that have antecedent streamflow, rainfall and climate indices as predictors. In this study, we adapt the modelling approach so that forecasts have any number of days of lead time. Daily streamflow and sea surface temperatures are used to develop predictors based on 28-day sliding windows. Forecasts are produced for 23 forecast locations with 0-14- and 21-day lead time. The forecasts are assessed in terms of continuous ranked probability score (CRPS) skill score and reliability metrics. CRPS skill scores, on average, reduce monotonically with increase in days of lead time, although both positive and negative differences are observed. Considering only skilful forecast locations, CRPS skill scores at 7-day lead time are reduced on average by 4 percentage points, with differences largely contained within +5 to -15 percentage points. A flexible forecasting system that allows for any number of days of lead time could benefit Australian seasonal streamflow forecast users by allowing more time for forecasts to be disseminated, comprehended and made use of prior to the commencement of a forecast season. The system would allow for forecasts to be updated if necessary.
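
    The CRPS skill score used for assessment can be illustrated with the standard sample estimator of the CRPS; the ensemble sizes, reference forecast and inputs below are generic placeholders rather than the Bureau's implementation.

```python
# Generic illustration of the CRPS and the CRPS skill score for ensemble
# forecasts, using the standard sample estimator; not the Bureau's code.
import numpy as np

def crps_sample(ensemble: np.ndarray, obs: float) -> float:
    """CRPS estimate for one forecast: mean|x_i - y| - 0.5 * mean|x_i - x_j|."""
    x = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

def crps_skill_score(ens_fcsts, obs, ref_fcsts) -> float:
    """1 - CRPS(forecast) / CRPS(reference); positive means skill over the reference."""
    num = np.mean([crps_sample(f, o) for f, o in zip(ens_fcsts, obs)])
    den = np.mean([crps_sample(r, o) for r, o in zip(ref_fcsts, obs)])
    return 1.0 - num / den
```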

  5. Evaluation of ensemble forecast uncertainty using a new proper score: application to medium-range and seasonal forecasts

    NASA Astrophysics Data System (ADS)

    Christensen, Hannah; Moroz, Irene; Palmer, Tim

    2015-04-01

    Forecast verification is important across scientific disciplines as it provides a framework for evaluating the performance of a forecasting system. In the atmospheric sciences, probabilistic skill scores are often used for verification as they provide a way of unambiguously ranking the performance of different probabilistic forecasts. In order to be useful, a skill score must be proper -- it must encourage honesty in the forecaster, and reward forecasts which are reliable and which have good resolution. A new score, the Error-spread Score (ES), is proposed which is particularly suitable for evaluation of ensemble forecasts. It is formulated with respect to the moments of the forecast. The ES is confirmed to be a proper score, and is therefore sensitive to both resolution and reliability. The ES is tested on forecasts made using the Lorenz '96 system, and found to be useful for summarising the skill of the forecasts. The European Centre for Medium-Range Weather Forecasts (ECMWF) ensemble prediction system (EPS) is evaluated using the ES. Its performance is compared to a perfect statistical probabilistic forecast -- the ECMWF high resolution deterministic forecast dressed with the observed error distribution. This generates a forecast that is perfectly reliable if considered over all time, but which does not vary from day to day with the predictability of the atmospheric flow. The ES distinguishes between the dynamically reliable EPS forecasts and the statically reliable dressed deterministic forecasts. Other skill scores are tested and found to be comparatively insensitive to this desirable forecast quality. The ES is used to evaluate seasonal range ensemble forecasts made with the ECMWF System 4. The ensemble forecasts are found to be skilful when compared with climatological or persistence forecasts, though this skill is dependent on region and time of year.

  6. Value of long-term streamflow forecast to reservoir operations for water supply in snow-dominated catchments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anghileri, Daniela; Voisin, Nathalie; Castelletti, Andrea F.

    In this study, we develop a forecast-based adaptive control framework for Oroville reservoir, California, to assess the value of seasonal and inter-annual forecasts for reservoir operation.We use an Ensemble Streamflow Prediction (ESP) approach to generate retrospective, one-year-long streamflow forecasts based on the Variable Infiltration Capacity hydrology model. The optimal sequence of daily release decisions from the reservoir is then determined by Model Predictive Control, a flexible and adaptive optimization scheme.We assess the forecast value by comparing system performance based on the ESP forecasts with that based on climatology and a perfect forecast. In addition, we evaluate system performance based onmore » a synthetic forecast, which is designed to isolate the contribution of seasonal and inter-annual forecast skill to the overall value of the ESP forecasts.Using the same ESP forecasts, we generalize our results by evaluating forecast value as a function of forecast skill, reservoir features, and demand. Our results show that perfect forecasts are valuable when the water demand is high and the reservoir is sufficiently large to allow for annual carry-over. Conversely, ESP forecast value is highest when the reservoir can shift water on a seasonal basis.On average, for the system evaluated here, the overall ESP value is 35% less than the perfect forecast value. The inter-annual component of the ESP forecast contributes 20-60% of the total forecast value. Improvements in the seasonal component of the ESP forecast would increase the overall ESP forecast value between 15 and 20%.« less

  7. Improved Anvil Forecasting

    NASA Technical Reports Server (NTRS)

    Lambert, Winifred C.

    2000-01-01

    This report describes the outcome of Phase 1 of the AMU's Improved Anvil Forecasting task. Forecasters in the 45th Weather Squadron and the Spaceflight Meteorology Group have found that anvil forecasting is a difficult task when predicting LCC and FR violations. The purpose of this task is to determine the technical feasibility of creating an anvil-forecasting tool. Work on this study was separated into three steps: literature search, forecaster discussions, and determination of technical feasibility. The literature search revealed no existing anvil-forecasting techniques. However, there appears to be growing interest in anvils in recent years. If this interest continues to grow, more information will be available to aid in developing a reliable anvil-forecasting tool. The forecaster discussion step revealed an array of methods on how better forecasting techniques could be developed. The forecasters have ideas based on sound meteorological principles and personal experience in forecasting and analyzing anvils. Based on the information gathered in the discussions with the forecasters, the conclusion of this report is that it is technically feasible at this time to develop an anvil forecasting technique that will significantly contribute to the confidence in anvil forecasts.

  8. Flare forecasting at the Met Office Space Weather Operations Centre

    NASA Astrophysics Data System (ADS)

    Murray, S. A.; Bingham, S.; Sharpe, M.; Jackson, D. R.

    2017-04-01

    The Met Office Space Weather Operations Centre produces 24/7/365 space weather guidance, alerts, and forecasts to a wide range of government and commercial end-users across the United Kingdom. Solar flare forecasts are one of its products, which are issued multiple times a day in two forms: forecasts for each active region on the solar disk over the next 24 h and full-disk forecasts for the next 4 days. Here the forecasting process is described in detail, as well as first verification of archived forecasts using methods commonly used in operational weather prediction. Real-time verification available for operational flare forecasting use is also described. The influence of human forecasters is highlighted, with human-edited forecasts outperforming original model results and forecasting skill decreasing over longer forecast lead times.

  9. Operational hydrological forecasting in Bavaria. Part II: Ensemble forecasting

    NASA Astrophysics Data System (ADS)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In part I of this study, the operational flood forecasting system in Bavaria and an approach to identify and quantify forecast uncertainty were introduced. The approach is split into the calculation of an empirical 'overall error' from archived forecasts and the calculation of an empirical 'model error' based on hydrometeorological forecast tests, in which rainfall observations were used instead of forecasts. The 'model error' can, especially in upstream catchments where forecast uncertainty is strongly dependent on the current predictability of the atmosphere, be superimposed on the spread of a hydrometeorological ensemble forecast. In Bavaria, two meteorological ensemble prediction systems are currently tested for operational use: the 16-member COSMO-LEPS forecast and a poor man's ensemble composed of DWD GME, DWD Cosmo-EU, NCEP GFS, Aladin-Austria and MeteoSwiss Cosmo-7. The determination of the overall forecast uncertainty depends on the catchment characteristics.
    1. Upstream catchments with high influence of the weather forecast: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing. b) Corresponding to the characteristics of the meteorological ensemble forecast, each resulting forecast hydrograph can be regarded as equally likely. c) The 'model error' distribution, with parameters dependent on the hydrological case and lead time, is added to each forecast time step of each ensemble member. d) For each forecast time step, the overall error distribution (i.e. over the 'model error' distributions of all ensemble members) is calculated. e) From this distribution, the uncertainty range at a desired level (here: the 10% and 90% percentiles) is extracted and drawn as a forecast envelope. f) As the mean or median of an ensemble forecast does not necessarily exhibit a meteorologically sound temporal evolution, a single hydrological forecast termed the 'lead forecast' is chosen and shown in addition to the uncertainty bounds. This can be either an intermediate forecast between the extremes of the ensemble spread or a manually selected forecast based on a meteorologist's advice.
    2. Downstream catchments with low influence of the weather forecast: In downstream catchments with strong human impact on discharge (e.g. by reservoir operation) and a large influence of upstream gauge observation quality on forecast quality, the 'overall error' may in most cases be larger than the combination of the 'model error' and the ensemble spread. Therefore, the overall forecast uncertainty bounds are calculated differently: a) A hydrological ensemble forecast is calculated using each of the meteorological forecast members as forcing; here, the corresponding inflow hydrographs from all upstream catchments must additionally be used. b) As for an upstream catchment, the uncertainty range is determined by combining the 'model error' and the ensemble member forecasts. c) In addition, the 'overall error' is superimposed on the 'lead forecast'. For reasons of consistency, the lead forecast must be based on the same meteorological forecast in the downstream and all upstream catchments. d) From the resulting two uncertainty ranges (one from the ensemble forecast and 'model error', one from the 'lead forecast' and 'overall error'), the envelope is taken as the most prudent uncertainty range.
    In sum, the uncertainty associated with each forecast run is calculated and communicated to the public in the form of 10% and 90% percentiles. As in part I of this study, the methodology, as well as the usefulness or otherwise of the resulting uncertainty ranges, will be presented and discussed using typical examples.
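
    A hedged numeric sketch of the upstream-catchment envelope construction described above: superimpose a lead-time-dependent 'model error' distribution on every ensemble member and extract the 10% and 90% percentiles per forecast time step. The Gaussian error shape and the sampling scheme are assumptions for illustration.

```python
# Hedged sketch of the envelope construction: combine each ensemble member's
# hydrograph with a lead-time-dependent model-error distribution and take the
# 10% / 90% percentiles per forecast time step. Error shapes are assumed.
import numpy as np

def forecast_envelope(members: np.ndarray, error_sd: np.ndarray,
                      n_samples: int = 500, seed: int = 0):
    """members: array (n_members, n_leadtimes) of hydrological ensemble runs.
    error_sd: array (n_leadtimes,), std of the empirical model error per lead time.

    Returns (q10, q90) arrays of length n_leadtimes.
    """
    rng = np.random.default_rng(seed)
    n_members, n_lead = members.shape
    # Superimpose the model-error distribution on every member and pool samples.
    noise = rng.normal(0.0, error_sd, size=(n_samples, n_members, n_lead))
    pooled = (members[None, :, :] + noise).reshape(-1, n_lead)
    return np.percentile(pooled, 10, axis=0), np.percentile(pooled, 90, axis=0)
```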

  10. The Wind Forecast Improvement Project (WFIP). A Public/Private Partnership for Improving Short Term Wind Energy Forecasts and Quantifying the Benefits of Utility Operations -- the Northern Study Area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Finley, Cathy

    2014-04-30

    This report contains the results from research aimed at improving short-range (0-6 hour) hub-height wind forecasts in the NOAA weather forecast models through additional data assimilation and model physics improvements for use in wind energy forecasting. Additional meteorological observing platforms including wind profilers, sodars, and surface stations were deployed for this study by NOAA and DOE, and additional meteorological data at or near wind turbine hub height were provided by South Dakota State University and WindLogics/NextEra Energy Resources over a large geographical area in the U.S. Northern Plains for assimilation into NOAA research weather forecast models. The resulting improvements in wind energy forecasts based on the research weather forecast models (with the additional data assimilation and model physics improvements) were examined in many different ways and compared with wind energy forecasts based on the current operational weather forecast models to quantify the forecast improvements important to power grid system operators and wind plant owners/operators participating in energy markets. Two operational weather forecast models (OP_RUC, OP_RAP) and two research weather forecast models (ESRL_RAP, HRRR) were used as the base wind forecasts for generating several different wind power forecasts for the NextEra Energy wind plants in the study area. Power forecasts were generated from the wind forecasts in a variety of ways, from very simple to quite sophisticated, as they might be used by a wide range of both general users and commercial wind energy forecast vendors. The error characteristics of each of these types of forecasts were examined and quantified using bulk error statistics for both the local wind plant and the system aggregate forecasts. The wind power forecast accuracy was also evaluated separately for high-impact wind energy ramp events. The overall bulk error statistics calculated over the first six hours of the forecasts at both the individual wind plant and at the system-wide aggregate level over the one year study period showed that the research weather model-based power forecasts (all types) had lower overall error rates than the current operational weather model-based power forecasts, both at the individual wind plant level and at the system aggregate level. The bulk error statistics of the various model-based power forecasts were also calculated by season and model runtime/forecast hour as power system operations are more sensitive to wind energy forecast errors during certain times of year and certain times of day. The results showed that there were significant differences in seasonal forecast errors between the various model-based power forecasts. The results from the analysis of the various wind power forecast errors by model runtime and forecast hour showed that the forecast errors were largest during the times of day that have increased significance to power system operators (the overnight hours and the morning/evening boundary layer transition periods), but the research weather model-based power forecasts showed improvement over the operational weather model-based power forecasts at these times.

  11. Medium-range reference evapotranspiration forecasts for the contiguous United States based on multi-model numerical weather predictions

    NASA Astrophysics Data System (ADS)

    Medina, Hanoi; Tian, Di; Srivastava, Puneet; Pelosi, Anna; Chirico, Giovanni B.

    2018-07-01

    Reference evapotranspiration (ET0) plays a fundamental role in agronomic, forestry, and water resources management. Estimating and forecasting ET0 have long been recognized as a major challenge for researchers and practitioners in these communities. This work explored the potential of multiple leading numerical weather predictions (NWPs) for estimating and forecasting summer ET0 at 101 U.S. Regional Climate Reference Network stations over nine climate regions across the contiguous United States (CONUS). Three leading global NWP model forecasts from THORPEX Interactive Grand Global Ensemble (TIGGE) dataset were used in this study, including the single model ensemble forecasts from the European Centre for Medium-Range Weather Forecasts (EC), the National Centers for Environmental Prediction Global Forecast System (NCEP), and the United Kingdom Meteorological Office forecasts (MO), as well as multi-model ensemble forecasts from the combinations of these NWP models. A regression calibration was employed to bias correct the ET0 forecasts. Impact of individual forecast variables on ET0 forecasts were also evaluated. The results showed that the EC forecasts provided the least error and highest skill and reliability, followed by the MO and NCEP forecasts. The multi-model ensembles constructed from the combination of EC and MO forecasts provided slightly better performance than the single model EC forecasts. The regression process greatly improved ET0 forecast performances, particularly for the regions involving stations near the coast, or with a complex orography. The performance of EC forecasts was only slightly influenced by the size of the ensemble members, particularly at short lead times. Even with less ensemble members, EC still performed better than the other two NWPs. Errors in the radiation forecasts, followed by those in the wind, had the most detrimental effects on the ET0 forecast performances.
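
    The regression calibration referred to above can be illustrated with a deliberately simple sketch (not necessarily the paper's exact scheme): fit a linear map from the raw ensemble-mean ET0 forecast to the observed ET0 over a training period, then apply it to each member of new forecasts.

```python
# Hedged sketch of a simple regression calibration for ensemble ET0 forecasts:
# regress observed ET0 on the raw ensemble-mean forecast over a training period,
# then apply the fitted line to correct each member of new forecasts.
import numpy as np

def regression_calibrate(train_ens_mean: np.ndarray, train_obs: np.ndarray,
                         new_members: np.ndarray) -> np.ndarray:
    """train_ens_mean, train_obs: 1-D arrays over the training days.
    new_members: array (n_members, n_days) of raw forecasts to correct."""
    slope, intercept = np.polyfit(train_ens_mean, train_obs, deg=1)
    return intercept + slope * new_members
```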

  12. Uses and Applications of Climate Forecasts for Power Utilities.

    NASA Astrophysics Data System (ADS)

    Changnon, Stanley A.; Changnon, Joyce M.; Changnon, David

    1995-05-01

    The uses and potential applications of climate forecasts for electric and gas utilities were assessed 1) to discern needs for improving climate forecasts and guiding future research, and 2) to assist utilities in making wise use of forecasts. In-depth structured interviews were conducted with 56 decision makers in six utilities to assess existing and potential uses of climate forecasts. Only 3 of the 56 use forecasts. Eighty percent of those sampled envisioned applications of climate forecasts, given certain changes and additional information. Primary applications exist in power trading, load forecasting, fuel acquisition, and systems planning, with slight differences in interests between utilities. Utility staff understand probability-based forecasts but desire climatological information related to forecasted outcomes, including analogs similar to the forecasts, and explanations of the forecasts. Desired lead times vary from a week to three months, along with forecasts of up to four seasons ahead. The new NOAA forecasts initiated in 1995 provide the lead times and longer-term forecasts desired. Major hindrances to use of forecasts are hard-to-understand formats, lack of corporate acceptance, and lack of access to expertise. Recent changes in government regulations altered the utility industry, leading to a more competitive world wherein information about future weather conditions assumes much more value. Outreach efforts by government forecast agencies appear valuable to help achieve the appropriate and enhanced use of climate forecasts by the utility industry. An opportunity for service exists also for the private weather sector.

  13. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2017-08-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.

  14. An overview of health forecasting.

    PubMed

    Soyiri, Ireneous N; Reidpath, Daniel D

    2013-01-01

    Health forecasting is a novel area of forecasting, and a valuable tool for predicting future health events or situations such as demands for health services and healthcare needs. It facilitates preventive medicine and health care intervention strategies by pre-informing health service providers to take appropriate mitigating actions to minimize risks and manage demand. Health forecasting requires reliable data, information and appropriate analytical tools for the prediction of specific health conditions or situations. There is no single approach to health forecasting, and so various methods have often been adopted to forecast aggregate or specific health conditions. Meanwhile, there are no defined health forecasting horizons (time frames) to match the choices of health forecasting methods/approaches that are often applied. The key principles of health forecasting have also not been adequately described to guide the process. This paper provides a brief introduction and theoretical analysis of health forecasting. It describes the key issues that are important for health forecasting, including: definitions, principles of health forecasting, and the properties of health data, which influence the choices of health forecasting methods. Other matters related to the value of health forecasting, and the general challenges associated with developing and using health forecasting services, are discussed. This overview is a stimulus for further discussions on standardizing health forecasting approaches and methods that will facilitate health care and health services delivery.

  15. Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States.

    PubMed

    Yamana, Teresa K; Kandula, Sasikiran; Shaman, Jeffrey

    2017-11-01

    Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time.
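
    A hedged sketch of a skill-weighted superensemble in the spirit of Bayesian model averaging (not the authors' implementation): weights derived from each method's historical log-score combine the individual point forecasts into one weighted average.

```python
# Hedged sketch of a skill-weighted superensemble: weights proportional to each
# method's historical log-score, combined forecast is the weighted average.
import numpy as np

def superensemble(forecasts: np.ndarray, log_scores: np.ndarray) -> np.ndarray:
    """forecasts: array (n_methods, n_targets) of individual forecasts.
    log_scores: array (n_methods,) of historical log-scores per method.

    Returns the weighted-average superensemble forecast per target.
    """
    w = np.exp(log_scores - log_scores.max())   # softmax-style normalization
    w /= w.sum()
    return w @ forecasts
```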

  16. Individual versus superensemble forecasts of seasonal influenza outbreaks in the United States

    PubMed Central

    Kandula, Sasikiran; Shaman, Jeffrey

    2017-01-01

    Recent research has produced a number of methods for forecasting seasonal influenza outbreaks. However, differences among the predicted outcomes of competing forecast methods can limit their use in decision-making. Here, we present a method for reconciling these differences using Bayesian model averaging. We generated retrospective forecasts of peak timing, peak incidence, and total incidence for seasonal influenza outbreaks in 48 states and 95 cities using 21 distinct forecast methods, and combined these individual forecasts to create weighted-average superensemble forecasts. We compared the relative performance of these individual and superensemble forecast methods by geographic location, timing of forecast, and influenza season. We find that, overall, the superensemble forecasts are more accurate than any individual forecast method and less prone to producing a poor forecast. Furthermore, we find that these advantages increase when the superensemble weights are stratified according to the characteristics of the forecast or geographic location. These findings indicate that different competing influenza prediction systems can be combined into a single more accurate forecast product for operational delivery in real time. PMID:29107987

  17. Multivariate Error Covariance Estimates by Monte-Carlo Simulation for Assimilation Studies in the Pacific Ocean

    NASA Technical Reports Server (NTRS)

    Borovikov, Anna; Rienecker, Michele M.; Keppenne, Christian; Johnson, Gregory C.

    2004-01-01

    One of the most difficult aspects of ocean state estimation is the prescription of the model forecast error covariances. The paucity of ocean observations limits our ability to estimate the covariance structures from model-observation differences. In most practical applications, simple covariances are usually prescribed. Rarely are cross-covariances between different model variables used. Here a comparison is made between a univariate Optimal Interpolation (UOI) scheme and a multivariate OI algorithm (MvOI) in the assimilation of ocean temperature. In the UOI case only temperature is updated using a Gaussian covariance function and in the MvOI salinity, zonal and meridional velocities as well as temperature, are updated using an empirically estimated multivariate covariance matrix. Earlier studies have shown that a univariate OI has a detrimental effect on the salinity and velocity fields of the model. Apparently, in a sequential framework it is important to analyze temperature and salinity together. For the MvOI an estimation of the model error statistics is made by Monte-Carlo techniques from an ensemble of model integrations. An important advantage of using an ensemble of ocean states is that it provides a natural way to estimate cross-covariances between the fields of different physical variables constituting the model state vector, at the same time incorporating the model's dynamical and thermodynamical constraints as well as the effects of physical boundaries. Only temperature observations from the Tropical Atmosphere-Ocean array have been assimilated in this study. In order to investigate the efficacy of the multivariate scheme two data assimilation experiments are validated with a large independent set of recently published subsurface observations of salinity, zonal velocity and temperature. For reference, a third control run with no data assimilation is used to check how the data assimilation affects systematic model errors. While the performance of the UOI and MvOI is similar with respect to the temperature field, the salinity and velocity fields are greatly improved when multivariate correction is used, as evident from the analyses of the rms differences of these fields and independent observations. The MvOI assimilation is found to improve upon the control run in generating the water masses with properties close to the observed, while the UOI failed to maintain the temperature and salinity structure.
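
    The multivariate analysis step can be sketched generically: the forecast error covariance, including cross-covariances between temperature, salinity and velocity, is estimated from an ensemble of state anomalies, and temperature observations then update the full state vector. Notation and shapes below are assumptions, not the scheme's actual implementation.

```python
# Hedged sketch of a multivariate OI/ensemble analysis step: the forecast error
# covariance P, including cross-covariances between variables, is estimated from
# an ensemble of state anomalies, and observations update the full state vector.
import numpy as np

def multivariate_oi(xf: np.ndarray, anomalies: np.ndarray, H: np.ndarray,
                    y: np.ndarray, R: np.ndarray) -> np.ndarray:
    """xf: forecast state (n,); anomalies: ensemble anomalies (n_ens, n);
    H: observation operator (p, n); y: observations (p,); R: obs error cov (p, p)."""
    n_ens = anomalies.shape[0]
    P = anomalies.T @ anomalies / (n_ens - 1)          # multivariate covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # gain with cross-covariances
    return xf + K @ (y - H @ xf)                       # update all variables together
```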

  18. U.S. High Seas Marine Text Forecasts by Area

    Science.gov Websites

    U.S. high seas marine text forecasts by area: OPC N. Atlantic High Seas Forecast; NHC N. Atlantic High Seas Forecast; OPC N. Pacific High Seas Forecast; HFO N. Pacific High Seas Forecast; NHC N. Pacific High Seas Forecast; HFO S. Pacific High Seas Forecast.

  19. A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions

    NASA Astrophysics Data System (ADS)

    Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.

    2017-12-01

    The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in their decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts produce large quantities of complex data, and drawing conclusions from them is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. The tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
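
    A hedged sketch of the kind of Bokeh page such a diagnostics tool might generate: plot the ensemble member traces and their median for one forecast point and write a standalone HTML file. Variable names and layout are assumptions, not the DEP tool itself.

```python
# Hedged sketch: a minimal Bokeh page showing ensemble traces and their median.
import numpy as np
from bokeh.plotting import figure, output_file, save

def ensemble_page(times, members: np.ndarray, html_path: str = "ensemble.html"):
    """times: 1-D array of lead times; members: array (n_members, n_leadtimes)."""
    p = figure(title="Ensemble streamflow forecast",
               x_axis_label="lead time (days)", y_axis_label="flow")
    # one thin line per ensemble member
    p.multi_line(xs=[list(times)] * members.shape[0],
                 ys=[list(row) for row in members],
                 line_alpha=0.3)
    # heavier line for the ensemble median
    p.line(times, np.median(members, axis=0), line_width=3)
    output_file(html_path)
    save(p)
```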

  20. Intermediate-term forecasting of aftershocks from an early aftershock sequence: Bayesian and ensemble forecasting approaches

    NASA Astrophysics Data System (ADS)

    Omi, Takahiro; Ogata, Yosihiko; Hirata, Yoshito; Aihara, Kazuyuki

    2015-04-01

    Because aftershock occurrences can cause significant seismic risks for a considerable time after the main shock, prospective forecasting of intermediate-term aftershock activity as soon as possible is important. The epidemic-type aftershock sequence (ETAS) model with the maximum likelihood estimate effectively reproduces general aftershock activity, including secondary or higher-order aftershocks, and can be employed for the forecasting. However, because we cannot always expect accurate parameter estimation from incomplete early aftershock data where many events are missing, such forecasting using only a single estimated parameter set (plug-in forecasting) can frequently perform poorly. Therefore, we here propose Bayesian forecasting that combines the forecasts by the ETAS model with various probable parameter sets given the data. By conducting forecasting tests of 1-month aftershock periods based on the first 1 day of data after the main shock, as an example of early intermediate-term forecasting, we show that the Bayesian forecasting performs better than the plug-in forecasting on average in terms of the log-likelihood score. Furthermore, to improve forecasting of large aftershocks, we apply a nonparametric (NP) model using magnitude data during the learning period and compare its forecasting performance with that of the Gutenberg-Richter (G-R) formula. We show that the NP forecast performs better than the G-R formula in some cases but worse in others. Therefore, robust forecasting can be obtained by employing an ensemble forecast that combines the two complementary forecasts. Our proposed method is useful for a stable, unbiased intermediate-term assessment of aftershock probabilities.
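
    A minimal sketch of the contrast between plug-in and Bayesian (parameter-averaged) forecasting, using a simplified Omori-Utsu aftershock rate in place of the full ETAS model; the parameter values and the posterior sample below are purely illustrative assumptions.

```python
import numpy as np

def omori_rate(t, k, c, p):
    """Modified Omori-Utsu aftershock rate at time t (days after the main shock)."""
    return k / (t + c) ** p

def expected_count(k, c, p, t0, t1, n=2000):
    """Expected number of aftershocks in [t0, t1] (simple numerical integration)."""
    t = np.linspace(t0, t1, n)
    return omori_rate(t, k, c, p).sum() * (t1 - t0) / n

# Plug-in forecast: a single best-fit parameter set estimated from the first day of data.
plugin_params = (120.0, 0.05, 1.1)                       # (k, c, p), illustrative values

# Bayesian forecast: average the forecast over a sample of probable parameter sets.
rng = np.random.default_rng(2)
posterior = np.column_stack([rng.lognormal(np.log(120.0), 0.4, 500),   # k
                             rng.lognormal(np.log(0.05), 0.3, 500),    # c
                             rng.normal(1.1, 0.1, 500)])               # p

plugin_forecast = expected_count(*plugin_params, 1.0, 31.0)
bayes_forecast = np.mean([expected_count(k, c, p, 1.0, 31.0) for k, c, p in posterior])

print(f"plug-in 1-month forecast : {plugin_forecast:6.1f} events")
print(f"Bayesian 1-month forecast: {bayes_forecast:6.1f} events")
```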

  1. Simulation Exploration through Immersive Parallel Planes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas J; Bush, Brian W; Gruchalla, Kenny M

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.

  2. Simulation Exploration through Immersive Parallel Planes: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunhart-Lupo, Nicholas; Bush, Brian W.; Gruchalla, Kenny

    We present a visualization-driven simulation system that tightly couples systems dynamics simulations with an immersive virtual environment to allow analysts to rapidly develop and test hypotheses in a high-dimensional parameter space. To accomplish this, we generalize the two-dimensional parallel-coordinates statistical graphic as an immersive 'parallel-planes' visualization for multivariate time series emitted by simulations running in parallel with the visualization. In contrast to traditional parallel coordinates, which map the multivariate dimensions onto coordinate axes represented by a series of parallel lines, we map pairs of the multivariate dimensions onto a series of parallel rectangles. As in the case of parallel coordinates, each individual observation in the dataset is mapped to a polyline whose vertices coincide with its coordinate values. Regions of the rectangles can be 'brushed' to highlight and select observations of interest; a 'slider' control allows the user to filter the observations by their time coordinate. In an immersive virtual environment, users interact with the parallel planes using a joystick that can select regions on the planes, manipulate selections, and filter time. The brushing and selection actions are used both to explore existing data and to launch additional simulations corresponding to the visually selected portions of the input parameter space. As soon as the new simulations complete, their resulting observations are displayed in the virtual environment. This tight feedback loop between simulation and immersive analytics accelerates users' realization of insights about the simulation and its output.
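
    The immersive display itself cannot be reproduced in a short script, but its two-dimensional ancestor can: the sketch below uses pandas' built-in parallel-coordinates plot on synthetic simulation output, with a selection flag standing in for the brushing interaction (all column names and data are illustrative, not taken from the system described above).

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from pandas.plotting import parallel_coordinates

# Synthetic multivariate simulation output: 60 runs, 4 output dimensions.
rng = np.random.default_rng(3)
df = pd.DataFrame(rng.normal(size=(60, 4)),
                  columns=["demand", "price", "capacity", "emissions"])

# "Brushing": flag the runs whose demand falls in a region of interest.
df["selected"] = np.where(df["demand"] > 0.5, "brushed", "other")

ax = parallel_coordinates(df, class_column="selected",
                          color=("tab:red", "lightgray"), alpha=0.6)
ax.set_title("Parallel coordinates of simulation output (2-D analogue of parallel planes)")
plt.tight_layout()
plt.savefig("parallel_coordinates.png", dpi=150)
```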

  3. Quality-by-design case study: investigation of the role of poloxamer in immediate-release tablets by experimental design and multivariate data analysis.

    PubMed

    Kaul, Goldi; Huang, Jun; Chatlapalli, Ramarao; Ghosh, Krishnendu; Nagi, Arwinder

    2011-12-01

    The roles of poloxamer 188, water, and binder addition rate in retarding dissolution in immediate-release tablets of a model drug from BCS class II were investigated by means of multivariate data analysis (MVDA) combined with design of experiments (DOE). While the DOE analysis yielded important clues about the cause-and-effect relationship between the responses and design factors, multivariate data analysis of the 40+ variables provided additional information on the slowdown in tablet dissolution. A steep dependence of both tablet dissolution and disintegration on poloxamer, and less so on the other design variables, was observed. Poloxamer was found to increase dissolution rates in granules, as expected of surfactants in general, but to retard dissolution in tablets. The unexpected effect of poloxamer in tablets was accompanied by an increase in tablet-disintegration-time-mediated slowdown of tablet dissolution and by a surrogate binding effect of poloxamer at higher concentrations. It was additionally realized through MVDA that poloxamer in tablets either acts as a binder by itself or promotes the binding action of the binder povidone, resulting in increased intragranular cohesion. Additionally, poloxamer was found to mediate tablet dissolution on stability as well. In contrast to tablet dissolution at release (time zero), poloxamer appeared to increase tablet dissolution in a concentration-dependent manner on accelerated open-dish stability. Substituting polysorbate 80 as an alternate surfactant in place of poloxamer in the formulation was found to stabilize tablet dissolution.

  4. Study on initiative vibration absorbing technology of optics in strong disturbed environment

    NASA Astrophysics Data System (ADS)

    Jia, Si-nan; Xiong, Mu-di; Zou, Xiao-jie

    2007-12-01

    A strongly disturbed environment is apt to cause irregular vibration, which seriously affects optical collimation. To improve the performance of the laser beam, a three-point dynamic vibration absorbing method is proposed, and an active laser-beam vibration absorbing system is designed. The misalignment signal is detected by a position sensitive device (PSD), and three groups of PZT actuators are driven to adjust the optical element in real time, so that the performance of the output beam is improved. The coupling model of the system is presented. A multivariable adaptive closed-loop decoupling algorithm is used to design a three-input-three-output decoupling controller, so that high-precision dynamic adjustment is realized. Experiments indicate that the system has good vibration absorbing efficiency.

  5. Simulation analysis of adaptive cruise prediction control

    NASA Astrophysics Data System (ADS)

    Zhang, Li; Cui, Sheng Min

    2017-09-01

    Predictive control is suitable for the control of multi-variable, multi-constraint systems. In order to examine the effect of predictive control on vehicle longitudinal motion, this paper establishes an expected spacing model by combining variable spacing with a safety distance strategy. Model predictive control theory and an optimization method based on quadratic programming are used to obtain and quickly track the best expected acceleration trajectory. Simulation models are established for both predictive control and adaptive fuzzy control. Simulation results show that predictive control can realize the basic function of the system while ensuring safety. The application of the predictive and fuzzy adaptive algorithms under cruise conditions indicates that predictive control performs better.
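
    A minimal sketch of the predictive-control idea summarised above: a discrete kinematic model of the spacing error and relative speed, a finite prediction horizon, and a quadratic cost on tracking error and acceleration. For brevity it is solved as an unconstrained least-squares problem; adding acceleration and spacing bounds would turn it into the quadratic program mentioned in the abstract. All parameters are illustrative.

```python
import numpy as np

dt, N = 0.2, 25                        # time step [s] and prediction horizon length
A = np.array([[1.0, dt],               # state: [spacing error e, relative speed dv]
              [0.0, 1.0]])
B = np.array([[0.0],
              [-dt]])                  # ego acceleration reduces the relative speed dv

# Stack the horizon: x_k = A^k x0 + sum_j A^(k-1-j) B u_j, written as X = F x0 + G u.
F = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(N)])          # (2N, 2)
G = np.zeros((2 * N, N))
for k in range(N):
    for j in range(k + 1):
        G[2 * k:2 * k + 2, j:j + 1] = np.linalg.matrix_power(A, k - j) @ B

Q = np.diag(np.tile([4.0, 1.0], N))    # penalise spacing error and relative speed
R = 0.5 * np.eye(N)                    # penalise control effort (acceleration)

x0 = np.array([8.0, -2.0])             # gap 8 m larger than desired, closing at 2 m/s
# Unconstrained finite-horizon MPC: minimise (F x0 + G u)' Q (F x0 + G u) + u' R u.
u = np.linalg.solve(G.T @ Q @ G + R, -G.T @ Q @ F @ x0)

print("first planned acceleration command [m/s^2]:", round(u[0], 3))
```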

  6. Quantifying model uncertainty in seasonal Arctic sea-ice forecasts

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin

    2017-04-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.

  7. Model-free aftershock forecasts constructed from similar sequences in the past

    NASA Astrophysics Data System (ADS)

    van der Elst, N.; Page, M. T.

    2017-12-01

    The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. The similarity forecast may be useful to emergency managers and non-specialists when confidence or expertise in parametric forecasting may be lacking. The method makes over-tuning impossible, and minimizes the rate of surprises. At the least, this forecast constitutes a useful benchmark for more precisely tuned parametric forecasts.
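
    A rough sketch of the similarity weighting described above (synthetic catalogue, simplified weighting; not the authors' implementation): past sequences are weighted by the Poisson probability that their early-count intensity would produce the target sequence's early count, and the forecast is the weighted distribution of their later outcomes.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(4)

# Hypothetical catalogue: early event counts (e.g. first day, relative-magnitude bins)
# and later outcomes (number of subsequent aftershocks) for 300 past sequences.
early_counts = rng.poisson(5, size=300)
outcomes = rng.poisson(3 * (early_counts + 1))

target_early = 9          # early count observed in the sequence we want to forecast

# Similarity weight: Poisson probability that a past sequence's intensity would
# produce the target's early count (a small floor avoids a zero rate).
weights = poisson.pmf(target_early, mu=np.maximum(early_counts, 0.5))
weights /= weights.sum()

# Forecast = similarity-weighted distribution of the past outcomes.
order = np.argsort(outcomes)
cdf = np.cumsum(weights[order])
median = outcomes[order][np.searchsorted(cdf, 0.5)]
p95 = outcomes[order][np.searchsorted(cdf, 0.95)]

print(f"weighted median outcome: {median}, 95th percentile: {p95}")
```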

  8. Novel forecasting approaches using combination of machine learning and statistical models for flood susceptibility mapping.

    PubMed

    Shafizadeh-Moghadam, Hossein; Valavi, Roozbeh; Shahabi, Himan; Chapi, Kamran; Shirzadi, Ataollah

    2018-07-01

    In this research, eight individual machine learning and statistical models are implemented and compared, and based on their results, seven ensemble models for flood susceptibility assessment are introduced. The individual models included artificial neural networks, classification and regression trees, flexible discriminant analysis, generalized linear model, generalized additive model, boosted regression trees, multivariate adaptive regression splines, and maximum entropy, and the ensemble models were Ensemble Model committee averaging (EMca), Ensemble Model confidence interval Inferior (EMciInf), Ensemble Model confidence interval Superior (EMciSup), Ensemble Model to estimate the coefficient of variation (EMcv), Ensemble Model to estimate the mean (EMmean), Ensemble Model to estimate the median (EMmedian), and Ensemble Model based on weighted mean (EMwmean). The data set covered 201 flood events in the Haraz watershed (Mazandaran province in Iran) and 10,000 randomly selected non-occurrence points. Among the individual models, the highest Area Under the Receiver Operating Characteristic curve (AUROC) belonged to boosted regression trees (0.975) and the lowest value was recorded for the generalized linear model (0.642). On the other hand, the proposed EMmedian resulted in the highest accuracy (0.976) among all models. Despite the outstanding performance of some models, variability among the predictions of the individual models was considerable. Therefore, to reduce uncertainty and create more generalizable, more stable, and less sensitive models, ensemble forecasting approaches, and in particular the EMmedian, are recommended for flood susceptibility assessment. Copyright © 2018 Elsevier Ltd. All rights reserved.
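
    An EMmedian-style ensemble is simply the member-wise median of the individual models' predicted probabilities; the sketch below illustrates the idea with three scikit-learn classifiers on synthetic data (the data set, members and scores are stand-ins, not those of the study).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the flood data set: 8 conditioning factors, binary flood label.
rng = np.random.default_rng(5)
X = rng.normal(size=(2000, 8))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=1.0, size=2000) > 0.5).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# A few individual susceptibility models (different model families, as in the study).
models = [LogisticRegression(max_iter=1000),
          RandomForestClassifier(n_estimators=200, random_state=0),
          GradientBoostingClassifier(random_state=0)]
probs = np.column_stack([m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1] for m in models])

# EMmedian-style ensemble: the member-wise median of the predicted probabilities.
ensemble_median = np.median(probs, axis=1)

for name, p in zip(["LR", "RF", "GBT"], probs.T):
    print(f"AUROC {name}: {roc_auc_score(y_te, p):.3f}")
print(f"AUROC EMmedian ensemble: {roc_auc_score(y_te, ensemble_median):.3f}")
```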

  9. A study for systematic errors of the GLA forecast model in tropical regions

    NASA Technical Reports Server (NTRS)

    Chen, Tsing-Chang; Baker, Wayman E.; Pfaendtner, James; Corrigan, Martin

    1988-01-01

    From the sensitivity studies performed with the Goddard Laboratory for Atmospheres (GLA) analysis/forecast system, it was revealed that the forecast errors in the tropics affect the ability to forecast midlatitude weather in some cases. Apparently, the forecast errors occurring in the tropics can propagate to midlatitudes. Therefore, the systematic error analysis of the GLA forecast system becomes a necessary step in improving the model's forecast performance. The major effort of this study is to examine the possible impact of the hydrological-cycle forecast error on dynamical fields in the GLA forecast system.

  10. Multi-Year Revenue and Expenditure Forecasting for Small Municipal Governments.

    DTIC Science & Technology

    1981-03-01

    Keywords: management audit, econometric revenue forecast, gap and impact analysis, deterministic expenditure forecast, municipal forecasting, municipal budget. The report presents a multi-year revenue and expenditure forecasting model for the City of Monterey, California. The Monterey model includes an econometric revenue forecast as well as forecasts based on expert judgment and trend analysis.

  11. Convective Weather Forecast Accuracy Analysis at Center and Sector Levels

    NASA Technical Reports Server (NTRS)

    Wang, Yao; Sridhar, Banavar

    2010-01-01

    This paper presents a detailed convective forecast accuracy analysis at center and sector levels. The study aims to provide more meaningful forecast verification measures to the aviation community, as well as to obtain useful information leading to improvements in weather translation capacity models. In general, the vast majority of forecast verification efforts over the past decades have focused on calculating traditional standard verification scores from forecast and observation data analyses on grids. These verification measures, based on binary classification, have been applied in the quality assurance of weather forecast products at the national level for many years. Our research focuses on the forecast at the center and sector levels. We first calculate the standard forecast verification scores for en-route air traffic centers and sectors, followed by a forecast validation analysis and related verification measures for weather intensities and locations at the center and sector levels. An approach to improve the prediction of sector weather coverage from multiple sector forecasts is then developed. The severe weather intensity assessment was carried out using the correlations between forecast and observed airspace weather coverage. The forecast accuracy in horizontal location was assessed by examining the forecast errors. The improvement in prediction of weather coverage was determined by the correlation between actual sector weather coverage and the prediction. The analysis used observed and forecasted Convective Weather Avoidance Model (CWAM) data collected from June to September 2007. CWAM zero-minute forecast data with aircraft avoidance probabilities of 60% and 80% are used as the actual weather observation. All forecast measurements are based on 30-minute, 60-minute, 90-minute, and 120-minute forecasts with the same avoidance probabilities. The forecast accuracy analysis for lead times under one hour showed that the errors in intensity and location for center forecasts are relatively low. For example, the 1-hour forecast intensity and horizontal location errors for the ZDC center were about 0.12 and 0.13. However, the correlation between the sector 1-hour forecast and actual weather coverage was weak: for sector ZDC32, about 32% of the total variation of observed weather intensity was unexplained by the forecast, and the sector horizontal location error was about 0.10. The paper also introduces an approach to estimate the sector three-dimensional actual weather coverage by using multiple sector forecasts, which turned out to produce better predictions. Using a Multiple Linear Regression (MLR) model for this approach, the correlations between the actual observations and the multi-sector forecast model predictions improved by several percent at the 95% confidence level in comparison with the single-sector forecast.
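
    The multi-sector MLR idea can be sketched as an ordinary least-squares regression of a sector's observed weather coverage on its own forecast plus the forecasts of neighbouring sectors; the data below are synthetic stand-ins, not CWAM output.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Synthetic hourly samples: forecast weather coverage of a target sector and
# four neighbouring sectors, plus the target sector's observed coverage.
rng = np.random.default_rng(6)
n = 1500
fcst = np.clip(rng.beta(2, 8, size=(n, 5)) + 0.05 * rng.normal(size=(n, 5)), 0, 1)
obs = np.clip(0.5 * fcst[:, 0] + 0.1 * fcst[:, 1:].sum(axis=1)
              + 0.05 * rng.normal(size=n), 0, 1)

X_tr, X_te, y_tr, y_te = train_test_split(fcst, obs, test_size=0.3, random_state=0)

single = LinearRegression().fit(X_tr[:, :1], y_tr)    # target-sector forecast only
multi = LinearRegression().fit(X_tr, y_tr)            # all five sector forecasts

corr_single = np.corrcoef(single.predict(X_te[:, :1]), y_te)[0, 1]
corr_multi = np.corrcoef(multi.predict(X_te), y_te)[0, 1]
print(f"correlation, single-sector predictor: {corr_single:.3f}")
print(f"correlation, multi-sector MLR:        {corr_multi:.3f}")
```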

  12. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    NASA Astrophysics Data System (ADS)

    Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.

    2018-03-01

    Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
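
    The Schaake Shuffle step mentioned above has a compact core: at each lead time the sorted ensemble values are reassigned the rank order of a set of historical trajectories, so the calibrated members inherit a realistic temporal rank structure while their marginal distributions are untouched. The sketch below is a simplified single-site version with synthetic data.

```python
import numpy as np

def schaake_shuffle(fcst, hist):
    """Reorder ensemble members so that, at each lead time, their ranks follow
    the rank structure of the historical trajectories.

    fcst, hist : arrays of shape (n_members, n_lead_times)
    """
    shuffled = np.empty_like(fcst)
    for t in range(fcst.shape[1]):
        ranks = np.argsort(np.argsort(hist[:, t]))     # rank of each historical trajectory at lead t
        shuffled[:, t] = np.sort(fcst[:, t])[ranks]    # sorted forecast values placed in those ranks
    return shuffled

rng = np.random.default_rng(7)
n_members, n_lead = 20, 10
raw = rng.gamma(2.0, 3.0, size=(n_members, n_lead))          # calibrated daily rainfall members
historical = rng.gamma(2.0, 3.0, size=(n_members, n_lead))   # one historical sequence per member

linked = schaake_shuffle(raw, historical)

# The marginal distribution at each lead time is unchanged ...
assert np.allclose(np.sort(raw, axis=0), np.sort(linked, axis=0))
# ... but the day-to-day structure within a member now mimics the historical sequences.
print("mean lag-1 correlation:", np.mean([np.corrcoef(m[:-1], m[1:])[0, 1] for m in linked]))
```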

  13. Bayesian analyses of seasonal runoff forecasts

    NASA Astrophysics Data System (ADS)

    Krzysztofowicz, R.; Reese, S.

    1991-12-01

    Forecasts of seasonal snowmelt runoff volume provide indispensable information for rational decision making by water project operators, irrigation district managers, and farmers in the western United States. Bayesian statistical models and communication frames have been researched in order to enhance the forecast information disseminated to the users, and to characterize forecast skill from the decision maker's point of view. Four products are presented: (i) a Bayesian Processor of Forecasts, which provides a statistical filter for calibrating the forecasts, and a procedure for estimating the posterior probability distribution of the seasonal runoff; (ii) the Bayesian Correlation Score, a new measure of forecast skill, which is related monotonically to the ex ante economic value of forecasts for decision making; (iii) a statistical predictor of monthly cumulative runoffs within the snowmelt season, conditional on the total seasonal runoff forecast; and (iv) a framing of the forecast message that conveys the uncertainty associated with the forecast estimates to the users. All analyses are illustrated with numerical examples of forecasts for six gauging stations from the period 1971-1988.
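
    Product (i), the Bayesian Processor of Forecasts, can be illustrated in its simplest conjugate form: a climatological normal prior on seasonal runoff is combined with a forecast whose conditional (likelihood) model is a linear regression estimated from past forecast-observation pairs. The numbers below are invented for illustration and are not the published estimates.

```python
import numpy as np

# Prior (climatology) for seasonal runoff volume, in arbitrary volume units.
mu0, sigma0 = 100.0, 30.0

# Likelihood model: forecast | runoff ~ N(a + b * runoff, tau^2), with (a, b, tau)
# assumed to be regression estimates from past forecast-observation pairs.
a, b, tau = 5.0, 0.9, 15.0

forecast = 120.0                        # this year's issued forecast

# Conjugate normal-normal posterior for the actual runoff given the forecast.
prec_post = 1 / sigma0**2 + b**2 / tau**2
mu_post = (mu0 / sigma0**2 + b * (forecast - a) / tau**2) / prec_post
sigma_post = np.sqrt(1 / prec_post)

print(f"posterior runoff: mean {mu_post:.1f}, sd {sigma_post:.1f} "
      f"(prior sd was {sigma0:.1f})")
```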

  14. Maximum covariance analysis to identify intraseasonal oscillations over tropical Brazil

    NASA Astrophysics Data System (ADS)

    Barreto, Naurinete J. C.; Mesquita, Michel d. S.; Mendes, David; Spyrides, Maria H. C.; Pedra, George U.; Lucio, Paulo S.

    2017-09-01

    A reliable prognosis of extreme precipitation events in the tropics is arguably challenging to obtain due to the interaction of meteorological systems at various time scales. A pivotal component of the global climate variability is the so-called intraseasonal oscillations, phenomena that occur between 20 and 100 days. The Madden-Julian Oscillation (MJO), which is directly related to the modulation of convective precipitation in the equatorial belt, is considered the primary oscillation in the tropical region. The aim of this study is to diagnose the connection between the MJO signal and the regional intraseasonal rainfall variability over tropical Brazil. This is achieved through the development of an index called Multivariate Intraseasonal Index for Tropical Brazil (MITB). This index is based on Maximum Covariance Analysis (MCA) applied to the filtered daily anomalies of rainfall data over tropical Brazil against a group of covariates consisting of: outgoing longwave radiation and the zonal component u of the wind at 850 and 200 hPa. The first two MCA modes, which were used to create the MITB_1 and MITB_2 indices, represent 65% and 16% of the explained variance, respectively. The combined multivariate index was able to satisfactorily represent the pattern of intraseasonal variability over tropical Brazil, showing that there are periods of activation and inhibition of precipitation connected with the pattern of MJO propagation. The MITB index could potentially be used as a diagnostic tool for intraseasonal forecasting.
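
    Maximum covariance analysis itself reduces to a singular value decomposition of the cross-covariance matrix between two time-by-space anomaly fields; the sketch below performs the core computation on synthetic rainfall and OLR anomalies (grid sizes, the shared mode and all values are illustrative).

```python
import numpy as np

rng = np.random.default_rng(8)
nt, nx, ny = 500, 40, 60          # time steps, rainfall grid points, covariate grid points

# Synthetic filtered anomalies sharing one coupled mode plus noise.
signal = np.sin(np.linspace(0, 40 * np.pi, nt))
rain = np.outer(signal, rng.normal(size=nx)) + rng.normal(scale=0.5, size=(nt, nx))
olr = np.outer(signal, rng.normal(size=ny)) + rng.normal(scale=0.5, size=(nt, ny))

# Maximum covariance analysis: SVD of the cross-covariance matrix.
rain_a = rain - rain.mean(axis=0)
olr_a = olr - olr.mean(axis=0)
C = rain_a.T @ olr_a / (nt - 1)                   # (nx, ny) cross-covariance matrix
U, s, Vt = np.linalg.svd(C, full_matrices=False)

frac = s**2 / np.sum(s**2)                        # squared covariance fraction per mode
pc_rain = rain_a @ U[:, 0]                        # expansion coefficients (index time series)
pc_olr = olr_a @ Vt[0, :]

print("squared covariance fraction, first two modes:", np.round(frac[:2], 3))
print("correlation of the mode-1 time series:", round(np.corrcoef(pc_rain, pc_olr)[0, 1], 3))
```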

  15. Climate variability, weather and enteric disease incidence in New Zealand: time series analysis.

    PubMed

    Lal, Aparna; Ikeda, Takayoshi; French, Nigel; Baker, Michael G; Hales, Simon

    2013-01-01

    Evaluating the influence of climate variability on enteric disease incidence may improve our ability to predict how climate change may affect these diseases. This study examines the associations between regional climate variability and enteric disease incidence in New Zealand. Associations between monthly climate and enteric diseases (campylobacteriosis, salmonellosis, cryptosporidiosis, giardiasis) were investigated using Seasonal Auto Regressive Integrated Moving Average (SARIMA) models. No climatic factors were significantly associated with campylobacteriosis and giardiasis, with similar predictive power for univariate and multivariate models. Cryptosporidiosis was positively associated with the average temperature of the previous month (β = 0.130, SE = 0.060, p < 0.01) and inversely related to the Southern Oscillation Index (SOI) two months previously (β = -0.008, SE = 0.004, p < 0.05). By contrast, salmonellosis was positively associated with the temperature of the current month (β = 0.110, SE = 0.020, p < 0.001) and with the SOI of the current (β = 0.005, SE = 0.002, p < 0.05) and previous month (β = 0.005, SE = 0.002, p < 0.05). The forecasting accuracy of the multivariate models for cryptosporidiosis and salmonellosis was significantly higher. Although spatial heterogeneity in the observed patterns could not be assessed, these results suggest that temporally lagged relationships between climate variables and national communicable disease incidence data can contribute to disease prediction models and early warning systems.
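
    A minimal sketch of the modelling approach named above: a seasonal ARIMA model for monthly case counts with lagged temperature and SOI as exogenous regressors, fitted with statsmodels. The series are synthetic and the model orders are illustrative, not those selected in the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(9)
n = 144                                             # 12 years of monthly data
month = np.arange(n)
temp = 15 + 8 * np.sin(2 * np.pi * month / 12) + rng.normal(scale=1.0, size=n)
soi = rng.normal(scale=1.0, size=n)
cases = np.round(30 + 0.8 * np.roll(temp, 1) - 2.0 * np.roll(soi, 2)
                 + rng.normal(scale=3.0, size=n)).clip(min=0)

exog = pd.DataFrame({"temp_lag1": np.roll(temp, 1), "soi_lag2": np.roll(soi, 2)})
endog = pd.Series(cases)

# Seasonal ARIMA with exogenous lagged climate covariates; skip the first year so
# the rolled (lagged) covariates do not wrap around.
model = SARIMAX(endog.iloc[12:], exog=exog.iloc[12:], order=(1, 0, 1),
                seasonal_order=(1, 0, 0, 12))
fit = model.fit(disp=False)

future_exog = exog.tail(3)                          # placeholder future covariate values
print(fit.forecast(steps=3, exog=future_exog))
```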

  16. A Regularized Linear Dynamical System Framework for Multivariate Time Series Analysis.

    PubMed

    Liu, Zitao; Hauskrecht, Milos

    2015-01-01

    Linear Dynamical System (LDS) is an elegant mathematical framework for modeling and learning Multivariate Time Series (MTS). However, in general, it is difficult to set the dimension of an LDS's hidden state space. A small number of hidden states may not be able to model the complexities of a MTS, while a large number of hidden states can lead to overfitting. In this paper, we study learning methods that impose various regularization penalties on the transition matrix of the LDS model and propose a regularized LDS learning framework (rLDS) which aims to (1) automatically shut down LDSs' spurious and unnecessary dimensions, and consequently, address the problem of choosing the optimal number of hidden states; (2) prevent the overfitting problem given a small amount of MTS data; and (3) support accurate MTS forecasting. To learn the regularized LDS from data we incorporate a second order cone program and a generalized gradient descent method into the Maximum a Posteriori framework and use Expectation Maximization to obtain a low-rank transition matrix of the LDS model. We propose two priors for modeling the matrix which lead to two instances of our rLDS. We show that our rLDS is able to recover well the intrinsic dimensionality of the time series dynamics and it improves the predictive performance when compared to baselines on both synthetic and real-world MTS datasets.

  17. A short-term ensemble wind speed forecasting system for wind power applications

    NASA Astrophysics Data System (ADS)

    Baidya Roy, S.; Traiteur, J. J.; Callicutt, D.; Smith, M.

    2011-12-01

    This study develops an adaptive, blended forecasting system to provide accurate wind speed forecasts 1 hour ahead of time for wind power applications. The system consists of an ensemble of 21 forecasts with different configurations of the Weather Research and Forecasting Single Column Model (WRFSCM) and a persistence model. The ensemble is calibrated against observations for a 2 month period (June-July, 2008) at a potential wind farm site in Illinois using the Bayesian Model Averaging (BMA) technique. The forecasting system is evaluated against observations for August 2008 at the same site. The calibrated ensemble forecasts significantly outperform the forecasts from the uncalibrated ensemble while significantly reducing forecast uncertainty under all environmental stability conditions. The system also generates significantly better forecasts than persistence, autoregressive (AR) and autoregressive moving average (ARMA) models during the morning transition and the diurnal convective regimes. This forecasting system is computationally more efficient than traditional numerical weather prediction models and can generate a calibrated forecast, including model runs and calibration, in approximately 1 minute. Currently, hour-ahead wind speed forecasts are almost exclusively produced using statistical models. However, numerical models have several distinct advantages over statistical models including the potential to provide turbulence forecasts. Hence, there is an urgent need to explore the role of numerical models in short-term wind speed forecasting. This work is a step in that direction and is likely to trigger a debate within the wind speed forecasting community.
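
    The core of a BMA calibration of this kind can be sketched as a small EM loop in which member forecasts are treated as Gaussian densities and the iterations estimate member weights and a common spread from a training window; the data are synthetic and bias correction is omitted, so this is a simplified stand-in for the operational system.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(10)
n_obs, n_members = 400, 5

# Synthetic training window: member forecasts with different skill, plus observations.
obs = 8 + 2 * rng.normal(size=n_obs)
fcst = obs[:, None] + rng.normal(scale=[0.8, 1.2, 1.6, 2.0, 3.0], size=(n_obs, n_members))

# EM for BMA with Gaussian kernels and a common predictive variance.
w = np.full(n_members, 1.0 / n_members)
sigma = 1.0
for _ in range(200):
    dens = w * norm.pdf(obs[:, None], loc=fcst, scale=sigma)        # (n_obs, n_members)
    z = dens / dens.sum(axis=1, keepdims=True)                      # E-step: responsibilities
    w = z.mean(axis=0)                                              # M-step: member weights
    sigma = np.sqrt(np.sum(z * (obs[:, None] - fcst) ** 2) / n_obs)  # M-step: common spread

print("BMA weights:", np.round(w, 3))        # more accurate members receive larger weights
print("BMA predictive spread:", round(sigma, 3))

# The BMA predictive mean for a new forecast vector is the weighted member mean.
new_fcst = np.array([7.5, 8.1, 8.4, 9.0, 6.9])
print("BMA predictive mean:", round(float(w @ new_fcst), 3))
```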

  18. Morbidity Forecast in Cities: A Study of Urban Air Pollution and Respiratory Diseases in the Metropolitan Region of Curitiba, Brazil.

    PubMed

    de Souza, Fabio Teodoro

    2018-05-29

    In the last two decades, urbanization has intensified, and in Brazil, about 90% of the population now lives in urban centers. Atmospheric patterns have changed owing to the high growth rate of cities, with negative consequences for public health. This research aims to elucidate the spatial patterns of air pollution and respiratory diseases. A data-based model to aid local urban management to improve public health policies concerning air pollution is described. An example of data preparation and multivariate analysis with inventories from different cities in the Metropolitan Region of Curitiba was studied. A predictive model with outstanding accuracy in prediction of outbreaks was developed. Preliminary results describe relevant relations among morbidity scales, air pollution levels, and atmospheric seasonal patterns. The knowledge gathered here contributes to the debate on social issues and public policies. Moreover, the results of this smaller scale study can be extended to megacities.

  19. Data-driven Analysis and Prediction of Arctic Sea Ice

    NASA Astrophysics Data System (ADS)

    Kondrashov, D. A.; Chekroun, M.; Ghil, M.; Yuan, X.; Ting, M.

    2015-12-01

    We present results of data-driven predictive analyses of sea ice over the main Arctic regions. Our approach relies on the Multilayer Stochastic Modeling (MSM) framework of Kondrashov, Chekroun and Ghil [Physica D, 2015] and it leads to prognostic models of sea ice concentration (SIC) anomalies on seasonal time scales. This approach is applied to monthly time series of leading principal components from the multivariate Empirical Orthogonal Function decomposition of SIC and selected climate variables over the Arctic. We evaluate the predictive skill of MSM models by performing retrospective forecasts with "no-look ahead" for up to 6 months ahead. It will be shown in particular that the memory effects included in our non-Markovian linear MSM models improve predictions of large-amplitude SIC anomalies in certain Arctic regions. Further improvements allowed by the MSM framework will adopt a nonlinear formulation, as well as alternative data-adaptive decompositions.

  20. Communicating uncertainty in hydrological forecasts: mission impossible?

    NASA Astrophysics Data System (ADS)

    Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian

    2010-05-01

    Cascading uncertainty in meteo-hydrological modelling chains for forecasting and integrated flood risk assessment is an essential step to improve the quality of hydrological forecasts. Although the best methodology to quantify the total predictive uncertainty in hydrology is still debated, there is a common agreement that one must avoid uncertainty misrepresentation and miscommunication, as well as misinterpretation of information by users. Several recent studies point out that uncertainty, when properly explained and defined, is no longer unwelcome among emergency response organizations, users of flood risk information and the general public. However, efficient communication of uncertain hydro-meteorological forecasts is far from being a resolved issue. This study focuses on the interpretation and communication of uncertain hydrological forecasts based on (uncertain) meteorological forecasts and (uncertain) rainfall-runoff modelling approaches to decision-makers such as operational hydrologists and water managers in charge of flood warning and scenario-based reservoir operation. An overview of the typical flow of uncertainties and risk-based decisions in hydrological forecasting systems is presented. The challenges related to the extraction of meaningful information from probabilistic forecasts and the test of its usefulness in assisting operational flood forecasting are illustrated with the help of two case-studies: 1) a study on the use and communication of probabilistic flood forecasting within the European Flood Alert System; 2) a case-study on the use of probabilistic forecasts by operational forecasters from the hydroelectricity company EDF in France. These examples show that attention must be paid to initiatives that promote or reinforce the active participation of expert forecasters in the forecasting chain. The practice of face-to-face forecast briefings, focusing on sharing how forecasters interpret, describe and perceive the model output forecasted scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission, and the suspense kept should instead act as a catalyst for overcoming the remaining challenges.

  1. Bayesian flood forecasting methods: A review

    NASA Astrophysics Data System (ADS)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been seen as one of the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, then their negative impacts could be greatly minimized. It is widely recognized that the quantification and reduction of uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and a reasonable analytic-numerical computation method, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of the BFS and recent advances in the BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way of flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, and thus gives more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and effectively reduce predictive uncertainty. In recent years, various Bayesian flood forecasting approaches have been developed and widely applied, but there is still room for improvement. Future research in the context of Bayesian flood forecasting should focus on the assimilation of various sources of newly available information and the improvement of predictive performance assessment methods.

  2. Verification of Space Weather Forecasts using Terrestrial Weather Approaches

    NASA Astrophysics Data System (ADS)

    Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.

    2015-12-01

    The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasters, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center - a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as rank probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
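
    The 2x2 contingency-table verification mentioned above reduces to a handful of standard scores computed from hit, miss, false-alarm and correct-negative counts; the sketch below defines them with made-up counts (not MOSWOC results).

```python
def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Standard 2x2 verification scores for dichotomous event forecasts."""
    pod = hits / (hits + misses)                              # probability of detection
    far = false_alarms / (hits + false_alarms)                # false alarm ratio
    csi = hits / (hits + misses + false_alarms)               # critical success index
    bias = (hits + false_alarms) / (hits + misses)            # frequency bias
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)  # equitable threat score
    return {"POD": pod, "FAR": far, "CSI": csi, "bias": bias, "ETS": ets}

# Illustrative counts for, e.g., CME arrival forecasts verified within a tolerance window.
scores = contingency_scores(hits=18, misses=7, false_alarms=9, correct_negatives=45)
for name, value in scores.items():
    print(f"{name}: {value:.2f}")
```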

  3. A framework for improving a seasonal hydrological forecasting system using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Pappenberger, Florian; Smith, Paul; Cloke, Hannah

    2017-04-01

    Seasonal streamflow forecasts are of great value for the socio-economic sector, for applications such as navigation, flood and drought mitigation and reservoir management for hydropower generation and water allocation to agriculture and drinking water. However, as we speak, the performance of dynamical seasonal hydrological forecasting systems (systems based on running seasonal meteorological forecasts through a hydrological model to produce seasonal hydrological forecasts) is still limited in space and time. In this context, the ESP (Ensemble Streamflow Prediction) remains an attractive forecasting method for seasonal streamflow forecasting as it relies on forcing a hydrological model (starting from the latest observed or simulated initial hydrological conditions) with historical meteorological observations. This makes it cheaper to run than a standard dynamical seasonal hydrological forecasting system, for which the seasonal meteorological forecasts will first have to be produced, while still producing skilful forecasts. There is thus the need to focus resources and time towards improvements in dynamical seasonal hydrological forecasting systems which will eventually lead to significant improvements in the skill of the streamflow forecasts generated. Sensitivity analyses are a powerful tool that can be used to disentangle the relative contributions of the two main sources of errors in seasonal streamflow forecasts, namely the initial hydrological conditions (IHC; e.g., soil moisture, snow cover, initial streamflow, among others) and the meteorological forcing (MF; i.e., seasonal meteorological forecasts of precipitation and temperature, input to the hydrological model). Sensitivity analyses are however most useful if they inform and change current operational practices. To this end, we propose a method to improve the design of a seasonal hydrological forecasting system. This method is based on sensitivity analyses, informing the forecasters as to which element of the forecasting chain (i.e., IHC or MF) could potentially lead to the highest increase in seasonal hydrological forecasting performance, after each forecast update.

  4. Forecasting Consumer Adoption of Information Technology and Services--Lessons from Home Video Forecasting.

    ERIC Educational Resources Information Center

    Klopfenstein, Bruce C.

    1989-01-01

    Describes research that examined the strengths and weaknesses of technological forecasting methods by analyzing forecasting studies made for home video players. The discussion covers assessments and explications of correct and incorrect forecasting assumptions, and their implications for forecasting the adoption of home information technologies…

  5. Uncertainties in Forecasting Streamflow using Entropy Theory

    NASA Astrophysics Data System (ADS)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, there are always uncertainties in a forecast, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves the determination of the spectral density, the determination of parameters, and the extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated during these processes.
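
    One generic way to combine the two ingredients named above, spectral analysis to expose periodicity and entropy to summarise uncertainty, is to compute the periodogram of a (synthetic) daily streamflow series and the Shannon entropy of its normalised spectrum; this illustrates the ingredients only and is not the specific entropy-theory forecasting procedure of the study.

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(11)
n = 10 * 365                                        # ten years of daily streamflow
t = np.arange(n)
flow = (50 + 20 * np.sin(2 * np.pi * t / 365)       # annual cycle
        + 5 * np.sin(2 * np.pi * t / 182.5)         # semi-annual cycle
        + rng.gamma(2.0, 2.0, size=n))              # skewed noise

freqs, psd = periodogram(flow - flow.mean(), fs=1.0)   # frequencies in cycles per day

# Dominant periodicities (in days) from the largest spectral peaks (skip zero frequency).
top = np.argsort(psd[1:])[::-1][:2] + 1
print("dominant periods [days]:", np.round(1 / freqs[top], 1))

# Spectral (Shannon) entropy of the normalised power spectrum: low values indicate a
# strongly periodic, more predictable series; high values indicate a noisier series.
p = psd[1:] / psd[1:].sum()
spectral_entropy = -np.sum(p * np.log(p)) / np.log(p.size)
print("normalised spectral entropy:", round(spectral_entropy, 3))
```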

  6. A national-scale seasonal hydrological forecast system: development and evaluation over Britain

    NASA Astrophysics Data System (ADS)

    Bell, Victoria A.; Davies, Helen N.; Kay, Alison L.; Brookshaw, Anca; Scaife, Adam A.

    2017-09-01

    Skilful winter seasonal predictions for the North Atlantic circulation and northern Europe have now been demonstrated and the potential for seasonal hydrological forecasting in the UK is now being explored. One of the techniques being used combines seasonal rainfall forecasts provided by operational weather forecast systems with hydrological modelling tools to provide estimates of seasonal mean river flows up to a few months ahead. The work presented here shows how spatial information contained in a distributed hydrological model typically requiring high-resolution (daily or better) rainfall data can be used to provide an initial condition for a much simpler forecast model tailored to use low-resolution monthly rainfall forecasts. Rainfall forecasts (hindcasts) from the GloSea5 model (1996 to 2009) are used to provide the first assessment of skill in these national-scale flow forecasts. The skill in the combined modelling system is assessed for different seasons and regions of Britain, and compared to what might be achieved using other approaches such as use of an ensemble of historical rainfall in a hydrological model, or a simple flow persistence forecast. The analysis indicates that only limited forecast skill is achievable for Spring and Summer seasonal hydrological forecasts; however, Autumn and Winter flows can be reasonably well forecast using (ensemble mean) rainfall forecasts based on either GloSea5 forecasts or historical rainfall (the preferred type of forecast depends on the region). Flow forecasts using ensemble mean GloSea5 rainfall perform most consistently well across Britain, and provide the most skilful forecasts overall at the 3-month lead time. Much of the skill (64 %) in the 1-month ahead seasonal flow forecasts can be attributed to the hydrological initial condition (particularly in regions with a significant groundwater contribution to flows), whereas for the 3-month ahead lead time, GloSea5 forecasts account for ˜ 70 % of the forecast skill (mostly in areas of high rainfall to the north and west) and only 30 % of the skill arises from hydrological memory (typically groundwater-dominated areas). Given the high spatial heterogeneity in typical patterns of UK rainfall and evaporation, future development of skilful spatially distributed seasonal forecasts could lead to substantial improvements in seasonal flow forecast capability, potentially benefitting practitioners interested in predicting hydrological extremes, not only in the UK but also across Europe.

  7. Improving medium-range and seasonal hydroclimate forecasts in the southeast USA

    NASA Astrophysics Data System (ADS)

    Tian, Di

    Accurate hydro-climate forecasts are important for decision making by water managers, agricultural producers, and other stake holders. Numerical weather prediction models and general circulation models may have potential for improving hydro-climate forecasts at different scales. In this study, forecast analogs of the Global Forecast System (GFS) and Global Ensemble Forecast System (GEFS) based on different approaches were evaluated for medium-range reference evapotranspiration (ETo), irrigation scheduling, and urban water demand forecasts in the southeast United States; the Climate Forecast System version 2 (CFSv2) and the North American national multi-model ensemble (NMME) were statistically downscaled for seasonal forecasts of ETo, precipitation (P) and 2-m temperature (T2M) at the regional level. The GFS mean temperature (Tmean), relative humidity, and wind speed (Wind) reforecasts combined with the climatology of Reanalysis 2 solar radiation (Rs) produced higher skill than using the direct GFS output only. Constructed analogs showed slightly higher skill than natural analogs for deterministic forecasts. Both irrigation scheduling driven by the GEFS-based ETo forecasts and GEFS-based ETo forecast skill were generally positive up to one week throughout the year. The GEFS improved ETo forecast skill compared to the GFS. The GEFS-based analog forecasts for the input variables of an operational urban water demand model were skillful when applied in the Tampa Bay area. The modified operational models driven by GEFS analog forecasts showed higher forecast skill than the operational model based on persistence. The results for CFSv2 seasonal forecasts showed maximum temperature (Tmax) and Rs had the greatest influence on ETo. The downscaled Tmax showed the highest predictability, followed by Tmean, Tmin, Rs, and Wind. The CFSv2 model could better predict ETo in cold seasons during El Nino Southern Oscillation (ENSO) events only when the forecast initial condition was in ENSO. Downscaled P and T2M forecasts were produced by directly downscaling the NMME P and T2M output or indirectly using the NMME forecasts of Nino3.4 sea surface temperatures to predict local-scale P and T2M. The indirect method generally showed the highest forecast skill which occurs in cold seasons. The bias-corrected NMME ensemble forecast skill did not outperform the best single model.

  8. Modeled Forecasts of Dengue Fever in San Juan, Puerto Rico Using NASA Satellite Enhanced Weather Forecasts

    NASA Astrophysics Data System (ADS)

    Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.

    2015-12-01

    Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question because the errors associated with weather and climate forecasts are not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the past weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system, enhanced using additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast being evaluated using county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer-range forecasts so that public health workers can better prepare for dengue epidemics.

  9. The potential of radar-based ensemble forecasts for flash-flood early warning in the southern Swiss Alps

    NASA Astrophysics Data System (ADS)

    Liechti, K.; Panziera, L.; Germann, U.; Zappa, M.

    2013-10-01

    This study explores the limits of radar-based forecasting for hydrological runoff prediction. Two novel radar-based ensemble forecasting chains for flash-flood early warning are investigated in three catchments in the southern Swiss Alps and set in relation to deterministic discharge forecasts for the same catchments. The first radar-based ensemble forecasting chain is driven by NORA (Nowcasting of Orographic Rainfall by means of Analogues), an analogue-based heuristic nowcasting system to predict orographic rainfall for the following eight hours. The second ensemble forecasting system evaluated is REAL-C2, where the numerical weather prediction model COSMO-2 is initialised with 25 different initial conditions derived from a four-day nowcast with the radar ensemble REAL. Additionally, three deterministic forecasting chains were analysed. The performance of these five flash-flood forecasting systems was analysed for 1389 h between June 2007 and December 2010 for which NORA forecasts were issued, due to the presence of orographic forcing. A clear preference was found for the ensemble approach. Discharge forecasts perform better when forced by NORA and REAL-C2 rather than by deterministic weather radar data. Moreover, it was observed that using an ensemble of initial conditions at the forecast initialisation, as in REAL-C2, significantly improved the forecast skill. These forecasts also perform better than forecasts forced by ensemble rainfall forecasts (NORA) initialised from a single initial condition of the hydrological model. Thus the best results were obtained with the REAL-C2 forecasting chain. However, for regions where REAL cannot be produced, NORA might be an option for forecasting events triggered by orographic precipitation.

  10. Physics-based and statistical earthquake forecasting in a continental rift zone: the case study of Corinth Gulf (Greece)

    NASA Astrophysics Data System (ADS)

    Segou, Margarita

    2016-01-01

    I perform a retrospective forecast experiment in the most rapid extensive continental rift worldwide, the western Corinth Gulf (wCG, Greece), aiming to predict shallow seismicity (depth <15 km) with magnitude M ≥ 3.0 for the time period between 1995 and 2013. I compare two short-term earthquake clustering models, based on epidemic-type aftershock sequence (ETAS) statistics, four physics-based (CRS) models, combining static stress change estimations and the rate-and-state laboratory law and one hybrid model. For the latter models, I incorporate the stress changes imparted from 31 earthquakes with magnitude M ≥ 4.5 at the extended area of wCG. Special attention is given on the 3-D representation of active faults, acting as potential receiver planes for the estimation of static stress changes. I use reference seismicity between 1990 and 1995, corresponding to the learning phase of physics-based models, and I evaluate the forecasts for six months following the 1995 M = 6.4 Aigio earthquake using log-likelihood performance metrics. For the ETAS realizations, I use seismic events with magnitude M ≥ 2.5 within daily update intervals to enhance their predictive power. For assessing the role of background seismicity, I implement a stochastic reconstruction (aka declustering) aiming to answer whether M > 4.5 earthquakes correspond to spontaneous events and identify, if possible, different triggering characteristics between aftershock sequences and swarm-type seismicity periods. I find that: (1) ETAS models outperform CRS models in most time intervals achieving very low rejection ratio RN = 6 per cent, when I test their efficiency to forecast the total number of events inside the study area, (2) the best rejection ratio for CRS models reaches RN = 17 per cent, when I use varying target depths and receiver plane geometry, (3) 75 per cent of the 1995 Aigio aftershocks that occurred within the first month can be explained by static stress changes, (4) highly variable performance on behalf of both statistical and physical models is suggested by large confidence intervals of information gain per earthquake and (5) generic ETAS models can adequately predict the temporal evolution of seismicity during swarms. Furthermore, stochastic reconstruction of seismicity makes possible the identification of different triggering processes between specific seismic crises (2001, 2003-04, 2006-07) and the 1995 aftershock sequence. I find that: (1) seismic events with M ≥ 5.0 are not a part of a preceding earthquake cascade, since they are characterized by high probability being a background event (average Pback > 0.8) and (2) triggered seismicity within swarms is characterized by lower event productivity when compared with the corresponding value during aftershock sequences. I conclude that physics-based models contribute on the determination of the `new-normal' seismicity rate at longer time intervals and that their joint implementation with statistical models is beneficial for future operational forecast systems.

  11. Evaluation of Flood Forecast and Warning in Elbe river basin - Impact of Forecaster's Strategy

    NASA Astrophysics Data System (ADS)

    Danhelka, Jan; Vlasak, Tomas

    2010-05-01

    The Czech Hydrometeorological Institute (CHMI) is responsible for flood forecasting and warning in the Czech Republic. To meet this responsibility, CHMI operates hydrological forecasting systems and publishes flow forecasts for selected profiles. Flood forecasting and warning is the output of a system that links observation (flow and atmosphere), data processing, weather forecasting (especially NWP QPF), hydrological modelling, and the evaluation and interpretation of model output by the forecaster. Forecast users are interested in the final output without separating the uncertainties of the individual steps of this process. Therefore, an evaluation of final operational forecasts produced by the AquaLog forecasting system during the period 2002 to 2008 was carried out for profiles within the Elbe river basin. The effects of uncertainties in observation, data processing and especially meteorological forecasts were not accounted for separately. Forecasting the exceedance of flood levels (peak over threshold) within the forecast period was the main criterion, as the forecast of flow increases is of the highest importance. Other evaluation criteria included peak flow and volume differences. In addition, the Nash-Sutcliffe efficiency was computed separately for each time step (1 to 48 h) of the forecast period to identify its change with lead time. Textual flood warnings are issued for administrative regions to initiate flood protection actions when a flood threatens. The flood warning hit rate was evaluated at the regional and national levels. The evaluation found significant differences in model forecast skill between forecasting profiles; in particular, lower skill was found for small headwater basins, where QPF uncertainty dominates. The average hit rate was 0.34 (miss rate = 0.33, false alarm rate = 0.32). However, the observed spatial differences are likely to be influenced also by the different fit of parameter sets (due to different basin characteristics) and, importantly, by the different impact of the human factor. The results suggest that the practice of interactive model operation, experience and forecasting strategy differ between the responsible forecasting offices. Warnings are based on the interpretation of model outputs by the hydrologist-forecaster. The warning hit rate reached 0.60 for a threshold set to the lowest flood stage, of which 0.11 was an underestimation of the flood degree (miss 0.22, false alarm 0.28). The critical success index of the model forecast was 0.34, while the same criterion for the warning reached 0.55. We attribute this increase not only to the change of scale from a single forecasting point to a region for warning, but partly also to the forecaster's added value. There is no officially preferred warning strategy in the Czech Republic (e.g. a tolerance for a higher false alarm rate). Therefore the forecaster's decisions and personal strategy are of great importance. The results show quite successful warning for exceedance of the 1st flood level, over-warning for the 2nd flood level, but under-warning for the 3rd (highest) flood level. This suggests a general forecaster preference for medium-level warnings (the 2nd flood level is legally defined as the start of a flood and of flood protection activities). In conclusion, the human forecaster's experience and analytical skill notably increase flood warning performance. However, societal preferences should be explicitly addressed in the warning strategy definition to support the forecaster's decision making.
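
    The categorical scores quoted above (hit rate, false alarm rate, critical success index) follow from a standard 2x2 contingency table of forecast versus observed flood-level exceedances. The sketch below uses illustrative counts rather than the study's data, and the authors' exact score definitions may differ slightly (e.g., rates expressed as fractions of all forecasts).

    ```python
    def contingency_scores(hits, misses, false_alarms):
        """Standard categorical verification scores from a 2x2 contingency table.

        hits: events both forecast and observed
        misses: events observed but not forecast
        false_alarms: events forecast but not observed
        """
        pod = hits / (hits + misses)                 # probability of detection ("hit rate")
        far = false_alarms / (hits + false_alarms)   # false alarm ratio
        csi = hits / (hits + misses + false_alarms)  # critical success index
        return pod, far, csi

    # Illustrative counts only (not taken from the study):
    pod, far, csi = contingency_scores(hits=34, misses=33, false_alarms=32)
    print(f"hit rate={pod:.2f}, false alarm ratio={far:.2f}, CSI={csi:.2f}")
    ```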

  12. An Optimization of Inventory Demand Forecasting in University Healthcare Centre

    NASA Astrophysics Data System (ADS)

    Bon, A. T.; Ng, T. K.

    2017-01-01

    The healthcare industry has become an important field nowadays because it concerns people's health. Forecasting demand for health services is accordingly an important step in managerial decision making for all healthcare organizations. A case study was therefore conducted at a University Health Centre to collect historical demand data for Panadol 650 mg over 68 months, from January 2009 until August 2014. The aim of the research is to optimize overall inventory demand through forecasting techniques. Quantitative (time series) forecasting models were used in the case study to forecast future demand as a function of past data. The data pattern must be identified before applying the forecasting techniques; here the data exhibit a trend, and ten forecasting techniques were then applied using the Risk Simulator software. Finally, the best forecasting technique is identified as the one with the smallest forecasting error. The ten forecasting techniques comprise single moving average, single exponential smoothing, double moving average, double exponential smoothing, regression, Holt-Winters additive, seasonal additive, Holt-Winters multiplicative, seasonal multiplicative and the Autoregressive Integrated Moving Average (ARIMA). According to the forecast accuracy measurements, the best forecasting technique is regression analysis.
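
    As a minimal illustration of ranking candidate techniques by forecast error (a generic sketch, not the Risk Simulator workflow or the clinic's data used in the study), the code below compares a single moving average with single exponential smoothing on a toy demand series using RMSE.

    ```python
    import numpy as np

    def moving_average_forecast(y, window=3):
        # One-step-ahead forecasts: mean of the previous `window` observations.
        return np.array([np.mean(y[t - window:t]) for t in range(window, len(y))])

    def exp_smoothing_forecast(y, alpha=0.3):
        # One-step-ahead single exponential smoothing forecasts.
        level, preds = y[0], []
        for obs in y[1:]:
            preds.append(level)
            level = alpha * obs + (1 - alpha) * level
        return np.array(preds)

    def rmse(obs, pred):
        return float(np.sqrt(np.mean((obs - pred) ** 2)))

    # Toy monthly demand series (illustrative only, not the health centre's data).
    y = np.array([120, 135, 128, 140, 150, 145, 160, 158, 170, 165, 180, 175], dtype=float)

    ma_pred = moving_average_forecast(y, window=3)
    es_pred = exp_smoothing_forecast(y, alpha=0.3)
    print("moving average RMSE:", rmse(y[3:], ma_pred))
    print("exp. smoothing RMSE:", rmse(y[1:], es_pred))
    ```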

  13. Air Pollution Forecasts: An Overview

    PubMed Central

    Bai, Lu; Wang, Jianzhou; Lu, Haiyan

    2018-01-01

    Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies. PMID:29673227

  14. Air Pollution Forecasts: An Overview.

    PubMed

    Bai, Lu; Wang, Jianzhou; Ma, Xuejiao; Lu, Haiyan

    2018-04-17

    Air pollution is defined as a phenomenon harmful to the ecological system and the normal conditions of human existence and development when some substances in the atmosphere exceed a certain concentration. In the face of increasingly serious environmental pollution problems, scholars have conducted a significant quantity of related research, and in those studies, the forecasting of air pollution has been of paramount importance. As a precaution, the air pollution forecast is the basis for taking effective pollution control measures, and accurate forecasting of air pollution has become an important task. Extensive research indicates that the methods of air pollution forecasting can be broadly divided into three classical categories: statistical forecasting methods, artificial intelligence methods, and numerical forecasting methods. More recently, some hybrid models have been proposed, which can improve the forecast accuracy. To provide a clear perspective on air pollution forecasting, this study reviews the theory and application of those forecasting models. In addition, based on a comparison of different forecasting methods, the advantages and disadvantages of some methods of forecasting are also provided. This study aims to provide an overview of air pollution forecasting methods for easy access and reference by researchers, which will be helpful in further studies.

  15. Analog-Based Postprocessing of Navigation-Related Hydrological Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Hemri, S.; Klein, B.

    2017-11-01

    Inland waterway transport benefits from probabilistic forecasts of water levels as they allow operators to optimize the ship load and, hence, to minimize transport costs. Probabilistic state-of-the-art hydrologic ensemble forecasts inherit biases and dispersion errors from the atmospheric ensemble forecasts they are driven with. The use of statistical postprocessing techniques like ensemble model output statistics (EMOS) allows for a reduction of these systematic errors by fitting a statistical model based on training data. In this study, training periods for EMOS are selected based on forecast analogs, i.e., historical forecasts that are similar to the forecast to be verified. Due to the strong autocorrelation of water levels, forecast analogs have to be selected based on entire forecast hydrographs in order to guarantee similar hydrograph shapes. Custom-tailored measures of similarity for forecast hydrographs comprise the hydrological series distance (SD), the hydrological matching algorithm (HMA), and dynamic time warping (DTW). Verification against observations reveals that EMOS forecasts for water level at three gauges along the river Rhine with training periods selected based on SD, HMA, and DTW compare favorably with reference EMOS forecasts, which are based on either seasonal training periods or on training periods obtained by dividing the hydrological forecast trajectories into runoff regimes.
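
    Of the hydrograph similarity measures listed, dynamic time warping is the simplest to sketch. Below is a plain textbook DTW distance between two forecast hydrographs; the SD and HMA measures, and any normalization or windowing used operationally, are not reproduced here.

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Classic dynamic time warping distance between two 1-D sequences."""
        n, m = len(a), len(b)
        cost = np.full((n + 1, m + 1), np.inf)
        cost[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])
                cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                     cost[i, j - 1],      # deletion
                                     cost[i - 1, j - 1])  # match
        return cost[n, m]

    # Two illustrative forecast hydrographs (arbitrary units), one slightly shifted in time:
    h1 = np.array([1.0, 1.2, 2.5, 4.0, 3.1, 2.0, 1.5])
    h2 = np.array([1.0, 1.1, 1.3, 2.6, 4.1, 3.0, 1.9])
    print("DTW distance:", dtw_distance(h1, h2))
    ```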

  16. Forecasting, Forecasting

    Treesearch

    Michael A. Fosberg

    1987-01-01

    Future improvements in the meteorological forecasts used in fire management will come from improvements in three areas: observational systems, forecast techniques, and postprocessing of forecasts and better integration of this information into the fire management process.

  17. Accuracy of short‐term sea ice drift forecasts using a coupled ice‐ocean model

    PubMed Central

    Zhang, Jinlun

    2015-01-01

    Abstract Arctic sea ice drift forecasts of 6 h–9 days for the summer of 2014 are generated using the Marginal Ice Zone Modeling and Assimilation System (MIZMAS); the model is driven by 6 h atmospheric forecasts from the Climate Forecast System (CFSv2). Forecast ice drift speed is compared to drifting buoys and other observational platforms. Forecast positions are compared with actual positions 24 h–8 days since forecast. Forecast results are further compared to those from the forecasts generated using an ice velocity climatology driven by multiyear integrations of the same model. The results are presented in the context of scheduling the acquisition of high‐resolution images that need to follow buoys or scientific research platforms. RMS errors for ice speed are on the order of 5 km/d for 24–48 h since forecast using the sea ice model compared with 9 km/d using climatology. Predicted buoy position RMS errors are 6.3 km for 24 h and 14 km for 72 h since forecast. Model biases in ice speed and direction can be reduced by adjusting the air drag coefficient and water turning angle, but the adjustments do not affect verification statistics. This suggests that improved atmospheric forecast forcing may further reduce the forecast errors. The model remains skillful for 8 days. Using the forecast model increases the probability of tracking a target drifting in sea ice with a 10 km × 10 km image from 60 to 95% for a 24 h forecast and from 27 to 73% for a 48 h forecast. PMID:27818852

  18. Two-step forecast of geomagnetic storm using coronal mass ejection and solar wind condition

    PubMed Central

    Kim, R-S; Moon, Y-J; Gopalswamy, N; Park, Y-D; Kim, Y-H

    2014-01-01

    To forecast geomagnetic storms, we had examined initially observed parameters of coronal mass ejections (CMEs) and introduced an empirical storm forecast model in a previous study. Now we suggest a two-step forecast considering not only CME parameters observed in the solar vicinity but also solar wind conditions near Earth to improve the forecast capability. We consider the empirical solar wind criteria derived in this study (Bz ≤ −5 nT or Ey ≥ 3 mV/m for t≥ 2 h for moderate storms with minimum Dst less than −50 nT) and a Dst model developed by Temerin and Li (2002, 2006) (TL model). Using 55 CME-Dst pairs during 1997 to 2003, our solar wind criteria produce slightly better forecasts for 31 storm events (90%) than the forecasts based on the TL model (87%). However, the latter produces better forecasts for 24 nonstorm events (88%), while the former correctly forecasts only 71% of them. We then performed the two-step forecast. The results are as follows: (i) for 15 events that are incorrectly forecasted using CME parameters, 12 cases (80%) can be properly predicted based on solar wind conditions; (ii) if we forecast a storm when both CME and solar wind conditions are satisfied (∩), the critical success index becomes higher than that from the forecast using CME parameters alone, however, only 25 storm events (81%) are correctly forecasted; and (iii) if we forecast a storm when either set of these conditions is satisfied (∪), all geomagnetic storms are correctly forecasted. PMID:26213515

  19. Two-step forecast of geomagnetic storm using coronal mass ejection and solar wind condition.

    PubMed

    Kim, R-S; Moon, Y-J; Gopalswamy, N; Park, Y-D; Kim, Y-H

    2014-04-01

    To forecast geomagnetic storms, we had examined initially observed parameters of coronal mass ejections (CMEs) and introduced an empirical storm forecast model in a previous study. Now we suggest a two-step forecast considering not only CME parameters observed in the solar vicinity but also solar wind conditions near Earth to improve the forecast capability. We consider the empirical solar wind criteria derived in this study (Bz ≤ -5 nT or Ey ≥ 3 mV/m for t ≥ 2 h for moderate storms with minimum Dst less than -50 nT) and a Dst model developed by Temerin and Li (2002, 2006) (TL model). Using 55 CME-Dst pairs during 1997 to 2003, our solar wind criteria produce slightly better forecasts for 31 storm events (90%) than the forecasts based on the TL model (87%). However, the latter produces better forecasts for 24 nonstorm events (88%), while the former correctly forecasts only 71% of them. We then performed the two-step forecast. The results are as follows: (i) for 15 events that are incorrectly forecasted using CME parameters, 12 cases (80%) can be properly predicted based on solar wind conditions; (ii) if we forecast a storm when both CME and solar wind conditions are satisfied (∩), the critical success index becomes higher than that from the forecast using CME parameters alone, however, only 25 storm events (81%) are correctly forecasted; and (iii) if we forecast a storm when either set of these conditions is satisfied (∪), all geomagnetic storms are correctly forecasted.
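
    The empirical solar wind criterion above (Bz ≤ -5 nT or Ey ≥ 3 mV/m persisting for at least 2 h) can be expressed as a simple check on hourly solar wind data. The sketch below assumes hourly-resolution arrays and is only an illustration of the stated thresholds, not the authors' operational code.

    ```python
    import numpy as np

    def storm_expected(bz_nT, ey_mVm, min_hours=2):
        """Return True if Bz <= -5 nT or Ey >= 3 mV/m persists for at least
        `min_hours` consecutive hourly samples (illustrative reading of the
        published criteria)."""
        condition = (np.asarray(bz_nT) <= -5.0) | (np.asarray(ey_mVm) >= 3.0)
        run = 0
        for flag in condition:
            run = run + 1 if flag else 0
            if run >= min_hours:
                return True
        return False

    # Illustrative hourly values (nT and mV/m), not real solar wind data:
    bz = [-2.0, -6.1, -5.5, -1.0]
    ey = [0.5, 1.0, 2.0, 0.8]
    print(storm_expected(bz, ey))  # True: Bz stays below -5 nT for two consecutive hours
    ```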

  20. Medium-range fire weather forecasts

    Treesearch

    J.O. Roads; K. Ueyoshi; S.C. Chen; J. Alpert; F. Fujioka

    1991-01-01

    The forecast skill of the National Meteorological Center's medium range forecast (MRF) numerical forecasts of fire weather variables is assessed for the period June 1, 1988 to May 31, 1990. Near-surface virtual temperature, relative humidity, wind speed and a derived fire weather index (FWI) are forecast well by the MRF model. However, forecast relative humidity has...

  1. Probability fire weather forecasts .. show promise in 3-year trial

    Treesearch

    Paul G. Scowcroft

    1970-01-01

    Probability fire weather forecasts were compared with categorical and climatological forecasts in a trial in southern California during the 1965-1967 fire seasons. Equations were developed to express the reliability of forecasts and degree of skill shown by the forecaster. Evaluation of 336 daily reports suggests that probability forecasts were more reliable. For...

  2. NASA Products to Enhance Energy Utility Load Forecasting

    NASA Technical Reports Server (NTRS)

    Lough, G.; Zell, E.; Engel-Cox, J.; Fungard, Y.; Jedlovec, G.; Stackhouse, P.; Homer, R.; Biley, S.

    2012-01-01

    Existing energy load forecasting tools rely upon historical load and forecasted weather to predict load within energy company service areas. The shortcomings of load forecasts are often the result of weather forecasts that are not at a fine enough spatial or temporal resolution to capture local-scale weather events. This project aims to improve the performance of load forecasting tools through the integration of high-resolution, weather-related NASA Earth science data, such as temperature, relative humidity, and wind speed. Three companies are participating in operational testing: one natural gas company and two electric providers. Operational results comparing load forecasts with and without NASA weather forecasts have been generated since March 2010. We have worked with end users at the three companies to refine the selection of weather forecast information and optimize load forecast model performance. The project will conclude in 2012 by transitioning the documented improvements from the inclusion of NASA forecasts into sustained use by energy utilities nationwide in a variety of load forecasting tools. In addition, Battelle has consulted with energy companies nationwide to document their information needs for long-term planning, in light of climate change and regulatory impacts.

  3. Optimizing Tsunami Forecast Model Accuracy

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Nyland, D. L.; Huang, P. Y.

    2015-12-01

    Recent tsunamis provide a means to determine the accuracy that can be expected of real-time tsunami forecast models. Forecast accuracy using two different tsunami forecast models are compared for seven events since 2006 based on both real-time application and optimized, after-the-fact "forecasts". Lessons learned by comparing the forecast accuracy determined during an event to modified applications of the models after-the-fact provide improved methods for real-time forecasting for future events. Variables such as source definition, data assimilation, and model scaling factors are examined to optimize forecast accuracy. Forecast accuracy is also compared for direct forward modeling based on earthquake source parameters versus accuracy obtained by assimilating sea level data into the forecast model. Results show that including assimilated sea level data into the models increases accuracy by approximately 15% for the events examined.

  4. Seasonal Water Balance Forecasts for Drought Early Warning in Ethiopia

    NASA Astrophysics Data System (ADS)

    Spirig, Christoph; Bhend, Jonas; Liniger, Mark

    2016-04-01

    Droughts severely impact Ethiopian agricultural production. Successful early warning for drought conditions in the upcoming harvest season therefore contributes to better managing food shortages arising from adverse climatic conditions. So far, however, meteorological seasonal forecasts have not been used in Ethiopia's national food security early warning system (i.e. the LEAP platform). Here we analyse the forecast quality of seasonal forecasts of total rainfall and of the meteorological water balance as a proxy for plant available water. We analyse forecast skill of June to September rainfall and water balance from dynamical seasonal forecast systems, the ECMWF System4 and EC-EARTH global forecasting systems. Rainfall forecasts outperform forecasts assuming a stationary climate mainly in north-eastern Ethiopia - an area that is particularly vulnerable to droughts. Forecasts of the water balance index seem to be even more skilful and thus more useful than pure rainfall forecasts. The results vary though for different lead times and skill measures employed. We further explore the potential added value of dynamically downscaling the forecasts through several dynamical regional climate models made available through the EU FP7 project EUPORIAS. Preliminary results suggest that dynamically downscaled seasonal forecasts are not significantly better compared with seasonal forecasts from the global models. We conclude that seasonal forecasts of a simple climate index such as the water balance have the potential to benefit drought early warning in Ethiopia, both due to its positive predictive skill and higher usefulness than seasonal mean quantities.

  5. A Decision Support System for effective use of probability forecasts

    NASA Astrophysics Data System (ADS)

    De Kleermaeker, Simone; Verkade, Jan

    2013-04-01

    Often, water management decisions are based on hydrological forecasts. These forecasts, however, are affected by inherent uncertainties. It is increasingly common for forecasting agencies to make explicit estimates of these uncertainties and thus produce probabilistic forecasts. Associated benefits include the decision makers' increased awareness of forecasting uncertainties and the potential for risk-based decision-making. Also, a stricter separation of responsibilities between forecasters and decision makers can be made. However, simply having probabilistic forecasts available is not sufficient to realise the associated benefits. Additional effort is required in areas such as forecast visualisation and communication, decision making under uncertainty and forecast verification. Also, this revised separation of responsibilities requires a shift in institutional arrangements. A recent study identified a number of additional issues related to the effective use of probability forecasts. When moving from deterministic to probability forecasting, a dimension is added to an already multi-dimensional problem; this makes it increasingly difficult for forecast users to extract relevant information from a forecast. A second issue is that while probability forecasts provide a necessary ingredient for risk-based decision making, other ingredients may not be present. For example, in many cases no estimates of flood damage, of the costs of management measures and of damage reduction are available. This paper presents the results of the study, including some suggestions for resolving these issues and the integration of those solutions in a prototype decision support system (DSS). A pathway for further development of the DSS is outlined.

  6. Evaluation of statistical models for forecast errors from the HBV model

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjørn; Renard, Benjamin; Steinsland, Ingelin; Kolberg, Sjur

    2010-04-01

    Three statistical models for the forecast errors of inflow into the Langvatn reservoir in northern Norway have been constructed and tested according to the agreement between (i) the forecast distribution and the observations and (ii) the median values of the forecast distribution and the observations. For the first model, observed and forecasted inflows were transformed by the Box-Cox transformation before a first-order autoregressive model was constructed for the forecast errors. The parameters were conditioned on weather classes. In the second model, the Normal Quantile Transformation (NQT) was applied to observed and forecasted inflows before a similar first-order autoregressive model was constructed for the forecast errors. In the third model, positive and negative errors were modeled separately. The errors were first NQT-transformed before the mean error values were conditioned on climate, forecasted inflow and the previous day's error. To test the three models we applied three criteria: we wanted (a) the forecast distribution to be reliable, (b) the forecast intervals to be narrow, and (c) the median values of the forecast distribution to be close to the observed values. Models 1 and 2 gave almost identical results. The median values improved the forecast, with the Nash-Sutcliffe efficiency (Reff) increasing from 0.77 for the original forecast to 0.87 for the corrected forecasts. Models 1 and 2 over-estimated the forecast intervals but gave the narrowest intervals. Their main drawback was that their distributions are less reliable than those of Model 3. For Model 3, the median values did not fit well since the autocorrelation was not accounted for. Since Model 3 did not benefit from the potential variance reduction that lies in bias estimation and removal, it gave on average wider forecast intervals than the two other models. At the same time, Model 3 on average slightly under-estimated the forecast intervals, probably explained by the use of average measures to evaluate the fit.
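
    A minimal sketch of the first model class described above, assuming SciPy is available: Box-Cox transform observed and forecasted inflows and fit a first-order autoregressive model to the transformed forecast errors. The series are synthetic, and the weather-class conditioning used in the study is omitted.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Synthetic positive inflows and forecasts (illustrative only).
    obs = rng.gamma(shape=3.0, scale=10.0, size=200)
    fcst = obs * rng.lognormal(mean=0.0, sigma=0.15, size=200)

    # Box-Cox transform; estimate lambda on the observations and reuse it for the forecasts.
    obs_bc, lam = stats.boxcox(obs)
    fcst_bc = stats.boxcox(fcst, lmbda=lam)

    # First-order autoregressive model for the transformed errors: e_t = c + a * e_{t-1} + eps.
    err = obs_bc - fcst_bc
    a, c, r, p, se = stats.linregress(err[:-1], err[1:])
    sigma_eps = np.std(err[1:] - (c + a * err[:-1]), ddof=2)
    print(f"AR(1) coefficient {a:.2f}, residual std {sigma_eps:.2f}")
    ```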

  7. Optical properties of β-BBO and potential for THz applications

    NASA Astrophysics Data System (ADS)

    Nikolaev, N. A.; Andreev, Yu. M.; Antsygin, V. D.; Bekker, T. B.; Ezhov, D. M.; Kokh, A. E.; Kokh, K. A.; Lanskii, G. V.; Mamrashev, A. A.; Svetlichnyi, V. A.

    2018-01-01

    The anisotropy of the optical properties of a high-quality beta barium borate crystal (β-BaB2O4, β-BBO) was studied in the main transparency window using classic spectroscopic methods and in the range of 0.2-2 THz using THz time-domain spectroscopy. β-BBO crystals were grown by the top-seeded solution technique in a highly resistive furnace with a heat field of 3-fold axis symmetry. At room temperature (RT), the absorption coefficient in the maximal transparency window of the grown crystals did not exceed 0.05 cm-1. Strong absorption anisotropy was observed at 3-5 μm and in the THz range. At 1 THz the absorption coefficients for the e and o waves were, respectively, 7 cm-1 and 21 cm-1 at RT, and 2 cm-1 and 10 cm-1 at 81 K. In the range below 0.4 THz, which is the most attractive for outdoor applications, the absorption coefficient is found to be very low: below 0.2 cm-1 at RT and 1 cm-1 at 81 K. The refractive index dispersions measured by THz-TDS were approximated in the form of Sellmeier equations. The birefringence is found to be large enough for phase-matched difference frequency generation (DFG) or down-conversion into the THz range (THz-DFG) under near-IR pumping at RT and 81 K. Type II (oe-o and eo-o) and type I (ee-e) three-wave interactions can be realized at RT. THz-DFG of a Nd:YAG laser and a KTP OPO can be realized by type II (oe-o) three-wave interaction. For selected spectral ranges of a femtosecond Ti:Sapphire laser, efficient phase-matched and group-velocity-matched optical rectification can be realized by another two types of three-wave interactions. Taking into account the other well-known attractive physical properties of the β-BBO crystal, wide application in THz techniques can be forecast.
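
    The Sellmeier-type approximation mentioned for the refractive index dispersions is typically of the following one-oscillator form; this is a generic sketch with symbolic coefficients, not the fitted THz coefficients reported by the authors.

    ```latex
    n^2(\lambda) \;=\; A \;+\; \frac{B}{\lambda^2 - C} \;-\; D\,\lambda^2
    ```

    with separate coefficient sets A, B, C, D fitted for the ordinary and extraordinary waves.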

  8. Moving beyond the cost-loss ratio: economic assessment of streamflow forecasts for a risk-averse decision maker

    NASA Astrophysics Data System (ADS)

    Matte, Simon; Boucher, Marie-Amélie; Boucher, Vincent; Fortier Filion, Thomas-Charles

    2017-06-01

    A large effort has been made over the past 10 years to promote the operational use of probabilistic or ensemble streamflow forecasts. Numerous studies have shown that ensemble forecasts are of higher quality than deterministic ones. Many studies also conclude that decisions based on ensemble rather than deterministic forecasts lead to better decisions in the context of flood mitigation. Hence, it is believed that ensemble forecasts possess a greater economic and social value for both decision makers and the general population. However, the vast majority of, if not all, existing hydro-economic studies rely on a cost-loss ratio framework that assumes a risk-neutral decision maker. To overcome this important flaw, this study borrows from economics and evaluates the economic value of early warning flood systems using the well-known Constant Absolute Risk Aversion (CARA) utility function, which explicitly accounts for the level of risk aversion of the decision maker. This new framework allows for the full exploitation of the information related to a forecasts' uncertainty, making it especially suited for the economic assessment of ensemble or probabilistic forecasts. Rather than comparing deterministic and ensemble forecasts, this study focuses on comparing different types of ensemble forecasts. There are multiple ways of assessing and representing forecast uncertainty. Consequently, there exist many different means of building an ensemble forecasting system for future streamflow. One such possibility is to dress deterministic forecasts using the statistics of past error forecasts. Such dressing methods are popular among operational agencies because of their simplicity and intuitiveness. Another approach is the use of ensemble meteorological forecasts for precipitation and temperature, which are then provided as inputs to one or many hydrological model(s). In this study, three concurrent ensemble streamflow forecasting systems are compared: simple statistically dressed deterministic forecasts, forecasts based on meteorological ensembles, and a variant of the latter that also includes an estimation of state variable uncertainty. This comparison takes place for the Montmorency River, a small flood-prone watershed in southern central Quebec, Canada. The assessment of forecasts is performed for lead times of 1 to 5 days, both in terms of forecasts' quality (relative to the corresponding record of observations) and in terms of economic value, using the new proposed framework based on the CARA utility function. It is found that the economic value of a forecast for a risk-averse decision maker is closely linked to the forecast reliability in predicting the upper tail of the streamflow distribution. Hence, post-processing forecasts to avoid over-forecasting could help improve both the quality and the value of forecasts.
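
    For reference, the CARA utility function on which the proposed economic assessment rests is commonly written as follows; the risk-aversion coefficient calibrated in the study is not reproduced here.

    ```latex
    u(w) \;=\; \frac{1 - e^{-\alpha w}}{\alpha}, \qquad \alpha > 0
    ```

    Here w is the monetary outcome (e.g., avoided flood losses minus mitigation costs) and α is the constant absolute risk-aversion coefficient; in the limit α → 0, u(w) → w and the classical risk-neutral cost-loss framework is recovered.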

  9. Multivariate Cholesky models of human female fertility patterns in the NLSY.

    PubMed

    Rodgers, Joseph Lee; Bard, David E; Miller, Warren B

    2007-03-01

    Substantial evidence now exists that variables measuring or correlated with human fertility outcomes have a heritable component. In this study, we define a series of age-sequenced fertility variables, and fit multivariate models to account for underlying shared genetic and environmental sources of variance. We make predictions based on a theory developed by Udry [(1996) Biosocial models of low-fertility societies. In: Casterline, JB, Lee RD, Foote KA (eds) Fertility in the United States: new patterns, new theories. The Population Council, New York] suggesting that biological/genetic motivations can be more easily realized and measured in settings in which fertility choices are available. Udry's theory, along with principles from molecular genetics and certain tenets of life history theory, allow us to make specific predictions about biometrical patterns across age. Consistent with predictions, our results suggest that there are different sources of genetic influence on fertility variance at early compared to later ages, but that there is only one source of shared environmental influence that occurs at early ages. These patterns are suggestive of the types of gene-gene and gene-environment interactions for which we must account to better understand individual differences in fertility outcomes.

  10. Realizing Women Living with HIV's Reproductive Rights in the Era of ART: The Negative Impact of Non-consensual HIV Disclosure on Pregnancy Decisions Amongst Women Living with HIV in a Canadian Setting.

    PubMed

    Duff, Putu; Kestler, Mary; Chamboko, Patience; Braschel, Melissa; Ogilvie, Gina; Krüsi, Andrea; Montaner, Julio; Money, Deborah; Shannon, Kate

    2018-04-07

    To better understand the structural drivers of women living with HIV's (WLWH's) reproductive rights and choices, this study examined the structural correlates, including non-consensual HIV disclosure, on WLWH's pregnancy decisions and describes access to preconception care. Analyses drew on data (2014-present) from SHAWNA, a longitudinal community-based cohort with WLWH across Metro-Vancouver, Canada. Multivariable logistic regression was used to model the effect of non-consensual HIV disclosure on WLWH's pregnancy decisions. Of the 218 WLWH included in our analysis, 24.8% had ever felt discouraged from becoming pregnant and 11.5% reported accessing preconception counseling. In multivariable analyses, non-consensual HIV disclosure was positively associated with feeling discouraged from wanting to become pregnant (AOR 3.76; 95% CI 1.82-7.80). Non-consensual HIV disclosure adversely affects WLWH's pregnancy decisions. Supporting the reproductive rights of WLWH will require further training among general practitioners on the reproductive health of WLWH and improved access to women-centred, trauma-informed care, including non-judgmental preconception counseling.

  11. A diagnostic analysis of the VVP single-doppler retrieval technique

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    1995-01-01

    A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
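
    The collinearity diagnostics discussed above (condition number, variance inflation) can be illustrated with generic regression tools. The sketch below builds a toy design matrix of azimuth-dependent basis functions over a narrow 20-degree sector and reports its condition number and variance inflation factors; it illustrates the diagnostic idea only and is not the VVP basis set itself.

    ```python
    import numpy as np

    def vif(X, cols):
        """Variance inflation factors for selected columns of design matrix X."""
        out = {}
        for k in cols:
            y = X[:, k]
            Z = np.delete(X, k, axis=1)
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            r2 = 1.0 - resid.var() / y.var()
            out[k] = 1.0 / (1.0 - r2)
        return out

    # Toy basis functions (intercept, sin, cos) sampled over a narrow 20-degree sector:
    az = np.deg2rad(np.linspace(0.0, 20.0, 50))
    X = np.column_stack([np.ones_like(az), np.sin(az), np.cos(az)])

    print("condition number:", np.linalg.cond(X))          # large for narrow sectors
    print("VIF (sin, cos columns):", vif(X, cols=[1, 2]))  # inflated by collinearity
    ```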

  12. The study of combining Latin Hypercube Sampling method and LU decomposition method (LULHS method) for constructing spatial random field

    NASA Astrophysics Data System (ADS)

    WANG, P. T.

    2015-12-01

    Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional LULHS simulations were developed. The simulation efficiency and spatial correlation of LULHS are compared to those of three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
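
    A minimal sketch of the LULHS idea for an unconditional simulation: draw stratified (Latin hypercube) uniform samples, map them to standard normal scores, and impose spatial correlation through a lower-triangular factor of the covariance matrix. The 1-D grid, exponential covariance model, use of Cholesky as the LU-type factor, and SciPy >= 1.7 (for scipy.stats.qmc) are illustrative assumptions, not the authors' exact implementation.

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    rng = np.random.default_rng(1)

    # 1-D grid of 50 cells and an exponential covariance model (illustrative choices).
    x = np.linspace(0.0, 100.0, 50)
    corr_len = 20.0
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

    # Lower-triangular factor L with C = L L^T (Cholesky used here as the LU-type factor).
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(x)))

    # Latin hypercube samples: one stratified uniform per grid cell and realization,
    # mapped to standard normal scores.
    n_real = 10
    sampler = qmc.LatinHypercube(d=len(x), seed=rng)
    z = norm.ppf(sampler.random(n=n_real))        # shape (n_real, n_cells)

    fields = z @ L.T                              # correlated standard-normal fields
    print(fields.shape, fields.std())
    ```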

  13. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    NASA Astrophysics Data System (ADS)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

    Recently, forecasting time series has attracted considerable attention in the field of analyzing financial time series data, specifically for stock market indices. Stock market forecasting is a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining Empirical Mode Decomposition with the Holt-Winters method (EMD-HW) is used to improve forecasting performance for financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to show the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to that of the traditional Holt-Winters forecasting method.
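
    A rough sketch of the EMD-HW hybrid, assuming the PyEMD (EMD-signal) and statsmodels packages are available: decompose the series into intrinsic mode functions, forecast each component with Holt-Winters exponential smoothing, and sum the component forecasts. This is a schematic reading of the method on synthetic data, not the authors' code.

    ```python
    import numpy as np
    from PyEMD import EMD                              # assumed dependency (pip install EMD-signal)
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    def emd_hw_forecast(y, horizon=5):
        """Forecast `horizon` steps ahead by Holt-Winters on each EMD component."""
        # Intrinsic mode functions (residue/trend handling may vary by PyEMD version).
        components = EMD().emd(np.asarray(y, dtype=float))
        total = np.zeros(horizon)
        for comp in components:
            model = ExponentialSmoothing(comp, trend="add", initialization_method="estimated")
            total += model.fit().forecast(horizon)
        return total

    # Illustrative noisy trend series standing in for a daily stock index.
    t = np.arange(300)
    y = 100 + 0.05 * t + 5 * np.sin(2 * np.pi * t / 30) + np.random.default_rng(2).normal(0, 1, t.size)
    print(emd_hw_forecast(y, horizon=5))
    ```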

  14. Using a Software Tool in Forecasting: a Case Study of Sales Forecasting Taking into Account Data Uncertainty

    NASA Astrophysics Data System (ADS)

    Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter

    2016-10-01

    Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects business outcomes and ultimately may significantly affect the economic stability of the company. The accuracy of the prediction depends on the suitability of the forecasting methods used, experience, the quality of the input data, the time period and other factors. The input data are usually not deterministic; they are often of a random nature, affected by uncertainties of the market environment and many other factors. By taking the input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. A forecasting approach is proposed, and the impact of uncertain input parameters on the target forecast value is simulated with a case-study model. Statistical and risk analyses of the forecast results are carried out, including sensitivity analysis and variable impact analysis.

  15. Realization of entry-to-practice milestones by Canadians who studied medicine abroad and other international medical graduates: a retrospective cohort study.

    PubMed

    Mathews, Maria; Kandar, Rima; Slade, Steve; Yi, Yanqing; Beardall, Sue; Bourgeault, Ivy

    2017-06-19

    International medical graduates must realize a series of milestones to obtain full licensure. We examined the realization of milestones by Canadian and non-Canadian graduates of Western or Caribbean medical schools, and Canadian and non-Canadian graduates from other medical schools. Using the National IMG Database (data available for 2005-2011), we created 2 cohorts: 1) international medical graduates who had passed the Medical Council of Canada Qualifying Examination Part I between 2005 and 2010 and 2) those who had first entered a family medicine postgraduate program between 2005 and 2009, or had first entered a specialty postgraduate program in 2005 or 2006. We examined 3 entry-to-practice milestones; obtaining a postgraduate position, passing the Medical Council of Canada Qualifying Examination Part II and obtaining a specialty designation. Of the 6925 eligible graduates in cohort 1, 2144 (31.0%) had obtained a postgraduate position. Of the 1214 eligible graduates in cohort 2, 1126 (92.8%) had passed the Qualifying Examination Part II, and 889 (73.2%) had obtained a specialty designation. In multivariate analyses, Canadian graduates of Western or Caribbean medical schools (odds ratio [OR] 4.69, 95% confidence interval [CI] 3.82-5.71) and Canadian graduates of other medical schools (OR 1.49, 95% CI 1.31-1.70) were more likely to obtain a postgraduate position than non-Canadian graduates of other (not Western or Caribbean) medical schools. There was no difference among the groups in passing the Qualifying Examination Part II or obtaining a specialty designation. Canadians who studied abroad were more likely than other international medical graduates to obtain a postgraduate position; there were no differences among the groups in realizing milestones once in a postgraduate program. These findings support policies that do not distinguish postgraduate applicants by citizenship or permanent residency status before medical school. Copyright 2017, Joule Inc. or its licensors.

  16. A Local Forecast of Land Surface Wetness Conditions, Drought, and St. Louis Encephalitis Virus Transmission Derived from Seasonal Climate Predictions

    NASA Astrophysics Data System (ADS)

    Shaman, J.; Stieglitz, M.; Zebiak, S.; Cane, M.; Day, J. F.

    2002-12-01

    We present an ensemble local hydrologic forecast derived from the seasonal forecasts of the International Research Institute (IRI) for Climate Prediction. Three- month seasonal forecasts were used to resample historical meteorological conditions and generate ensemble forcing datasets for a TOPMODEL-based hydrology model. Eleven retrospective forecasts were run at a Florida and New York site. Forecast skill was assessed for mean area modeled water table depth (WTD), i.e. near surface soil wetness conditions, and compared with WTD simulated with observed data. Hydrology model forecast skill was evident at the Florida site but not at the New York site. At the Florida site, persistence of hydrologic conditions and local skill of the IRI seasonal forecast contributed to the local hydrologic forecast skill. This forecast will permit probabilistic prediction of future hydrologic conditions. At the Florida site, we have also quantified the link between modeled WTD (i.e. drought) and the amplification and transmission of St. Louis Encephalitis virus (SLEV). We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission associated with human clinical cases. We then combine the seasonal forecasts of local, modeled WTD with this empirical relationship and produce retrospective probabilistic seasonal forecasts of epidemic SLEV transmission in Florida. Epidemic SLEV transmission forecast skill is demonstrated. These findings will permit real-time forecast of drought and resultant SLEV transmission in Florida.

  17. Exploring the interactions between forecast accuracy, risk perception and perceived forecast reliability in reservoir operator's decision to use forecast

    NASA Astrophysics Data System (ADS)

    Shafiee-Jood, M.; Cai, X.

    2017-12-01

    Advances in streamflow forecasts at different time scales offer promise for proactive flood management and improved risk management. Despite this potential, previous studies have found that water resources managers are often unwilling to incorporate streamflow forecast information into decision making, particularly in risky situations. While the low accuracy of forecast information is often cited as the main reason, some studies have found that the implementation of streamflow forecasts is sometimes impeded by institutional obstacles and behavioral factors (e.g., risk perception). In fact, a seminal study by O'Connor et al. (2005) found that risk perception is the strongest determinant of forecast use, while managers' perception of forecast reliability is not significant. In this study, we aim to address this issue again. However, instead of using survey data and regression analysis, we develop a theoretical framework to assess the user-perceived value of streamflow forecasts. The framework includes a novel behavioral component which incorporates both risk perception and perceived forecast reliability. The framework is then applied to a hypothetical problem in which a reservoir operator must react to probabilistic flood forecasts of different reliabilities. The framework allows us to explore the interactions between risk perception and perceived forecast reliability, and between these behavioral components and information accuracy. The findings will provide insights for improving the usability of flood forecast information through better communication and education.

  18. Parametric decadal climate forecast recalibration (DeFoReSt 1.0)

    NASA Astrophysics Data System (ADS)

    Pasternack, Alexander; Bhend, Jonas; Liniger, Mark A.; Rust, Henning W.; Müller, Wolfgang A.; Ulbrich, Uwe

    2018-01-01

    Near-term climate predictions such as decadal climate forecasts are increasingly being used to guide adaptation measures. For near-term probabilistic predictions to be useful, systematic errors of the forecasting systems have to be corrected. While methods for the calibration of probabilistic forecasts are readily available, these have to be adapted to the specifics of decadal climate forecasts including the long time horizon of decadal climate forecasts, lead-time-dependent systematic errors (drift) and the errors in the representation of long-term changes and variability. These features are compounded by small ensemble sizes to describe forecast uncertainty and a relatively short period for which typically pairs of reforecasts and observations are available to estimate calibration parameters. We introduce the Decadal Climate Forecast Recalibration Strategy (DeFoReSt), a parametric approach to recalibrate decadal ensemble forecasts that takes the above specifics into account. DeFoReSt optimizes forecast quality as measured by the continuous ranked probability score (CRPS). Using a toy model to generate synthetic forecast observation pairs, we demonstrate the positive effect on forecast quality in situations with pronounced and limited predictability. Finally, we apply DeFoReSt to decadal surface temperature forecasts from the MiKlip prototype system and find consistent, and sometimes considerable, improvements in forecast quality compared with a simple calibration of the lead-time-dependent systematic errors.
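
    For reference, the CRPS that DeFoReSt minimizes has a convenient closed form when the recalibrated predictive distribution is Gaussian (a common assumption in ensemble recalibration; the exact predictive distribution used by DeFoReSt is not restated here). For a forecast N(μ, σ²), an observation y and z = (y − μ)/σ,

    ```latex
    \mathrm{CRPS}\big(\mathcal{N}(\mu,\sigma^2),\, y\big) \;=\; \sigma \left[\, z\,\big(2\Phi(z)-1\big) \;+\; 2\varphi(z) \;-\; \frac{1}{\sqrt{\pi}} \,\right]
    ```

    where Φ and φ denote the standard normal CDF and PDF; averaging this score over all reforecast-observation pairs gives the quantity minimized when estimating the recalibration parameters.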

  19. Water Wise Sustainable Farming in the Southeast USA: It's All About the Roots

    NASA Astrophysics Data System (ADS)

    Bartel, R. L., Jr.; Dourte, D. R.; George, S.

    2014-12-01

    Sod-based crop rotation (SBR) is a relatively new system of practice that incorporates at least 2 years of a perennial grass followed by a peanut and then cotton rotation. After 15 years of research on farm-scale sites, this system has been found to have many advantages over conventional 3-year peanut-cotton-cotton rotations. These benefits include: increased profits; dramatic reductions in irrigation demand, fertilizer (N, P) and pesticide use, and energy consumption; and better carbon sequestration potential. The SBR system works primarily through enhancement of plant root growth and improvement of soil properties. To forecast the water savings potential of sod-based rotation, we employ the Soil and Water Assessment Tool (SWAT) model to simulate irrigation water demands over a 34-year period (1980-2013). We utilize data from a distributed network of weather stations to represent a range of climate conditions in the Florida, Georgia and Alabama area and data that represent a range of soil physical properties. The only calibration parameter adjusted in SWAT to distinguish between the new and conventional systems of farming is rooting depth. Each model result, providing 34 years of annual irrigation water requirements, was fit to a cumulative probability distribution function to forecast the water savings potential of SBR. Depending upon soil type and weather station location, forecasted water savings using SBR ranged between 20.3 and 33.1 cm during a 10% chance drought year. With over 526,000 ha of conventionally irrigated cotton and peanut within the states of Alabama, Florida, Georgia, and South Carolina alone, this equates to a potential water savings of 1.07 billion to 1.74 billion m3 of water in a drought year. The cumulative mitigative effect of this system has yet to be realized through actual application in these states. However, even at low rates of application it is evident that SBR could significantly reduce the negative impacts of irrigation in the Southeast US as well as in other locations where row crops grow in humid subtropical climates.
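
    The quoted basin-wide volumes follow directly from the per-area savings; at the lower bound, for example,

    ```latex
    526{,}000\ \mathrm{ha} \times 10^{4}\ \mathrm{m^2\,ha^{-1}} \times 0.203\ \mathrm{m} \;\approx\; 1.07 \times 10^{9}\ \mathrm{m^3}
    ```

    and the upper bound of 0.331 m gives approximately 1.74 × 10^9 m^3, matching the 1.07-1.74 billion m3 range stated above.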

  20. Turbidity forecasting at a karst spring using combined machine learning and wavelet multiresolution analysis.

    NASA Astrophysics Data System (ADS)

    Savary, M.; Massei, N.; Johannet, A.; Dupont, J. P.; Hauchard, E.

    2016-12-01

    About 25% of the world's population drinks water extracted from karst aquifers. Understanding and protecting these aquifers is therefore crucial, given increasing drinking water needs. In Normandie (north-west France), the principal exploited aquifer is the chalk aquifer. This highly karstified chalk aquifer is an important regional water resource. Karstification creates connections between surface water and groundwater that introduce turbidity and decrease water quality. The many parameters and processes involved, together with the non-linearity of the rainfall/turbidity relation, make turbidity peaks difficult to model and forecast. In this context, the Yport pumping well provides half of the drinking water supply of the Le Havre conurbation (236 000 inhabitants). The aim of this work is thus to predict turbidity peaks in order to help pumping well managers reduce the impact of turbidity on water treatment. The database consists of hourly rainfall from six rain gauges located on the recharge basin since 2009 and hourly turbidity since 1993. Because an accurate physical description of the karst system and its surface basin is lacking, a systemic paradigm is adopted and a black-box model, namely a neural network, is chosen. In a first step, correlation analyses are used to design the original model architecture by identifying the relation between output and input. Subsequent optimization phases yield four different architectures. These models were tested to forecast turbidity 12 h ahead and to predict threshold exceedance. The first model is a simple multilayer perceptron. The second is a two-branch model designed to better represent the fast (rainfall) and slow (evapotranspiration) dynamics. Each kind of model is developed using both recurrent and feed-forward architectures. This work highlights that the feed-forward multilayer perceptron is better at predicting turbidity peaks, whereas the feed-forward two-branch model is better at predicting threshold exceedance. In a second step, the implementation of wavelet decomposition within the neural network model to better capture slow and fast dynamics is tested and discussed; this could also account, to some extent, for the non-linearity of the turbidity response. This second approach is still being implemented.
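
    The wavelet-plus-neural-network idea mentioned in the last step can be sketched as follows, assuming the pywt (PyWavelets) and scikit-learn packages: decompose the rainfall input into multiresolution components with a discrete wavelet transform and feed lagged components to a feed-forward multilayer perceptron predicting turbidity 12 h ahead. This is a generic illustration on synthetic data, not the study's model.

    ```python
    import numpy as np
    import pywt                                       # assumed dependency (PyWavelets)
    from sklearn.neural_network import MLPRegressor   # assumed dependency (scikit-learn)

    rng = np.random.default_rng(3)

    # Synthetic hourly rainfall and a lagged, nonlinear turbidity response (illustrative only).
    n = 2000
    rain = np.clip(rng.gamma(0.3, 2.0, n) - 0.3, 0.0, None)
    turb = np.convolve(rain, np.exp(-np.arange(24) / 6.0), mode="full")[:n] ** 1.3

    # Split the rainfall signal into multiresolution components (approximation + details).
    coeffs = pywt.wavedec(rain, "db4", level=3)
    components = [
        pywt.waverec([c if i == j else np.zeros_like(c) for j, c in enumerate(coeffs)], "db4")[:n]
        for i in range(len(coeffs))
    ]

    # Lagged wavelet components as inputs, turbidity 12 h ahead as the target.
    lags, horizon = 24, 12
    rows = range(lags, n - horizon)
    X = np.array([np.concatenate([comp[t - lags:t] for comp in components]) for t in rows])
    y = np.array([turb[t + horizon] for t in rows])

    split = int(0.8 * len(X))
    mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    mlp.fit(X[:split], y[:split])
    print("held-out R^2:", mlp.score(X[split:], y[split:]))
    ```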

  1. Somerset County Flood Information System

    USGS Publications Warehouse

    Hoppe, Heidi L.

    2007-01-01

    The timely warning of a flood is crucial to the protection of lives and property. One has only to recall the floods of August 2, 1973, September 16 and 17, 1999, and April 16, 2007, in Somerset County, New Jersey, in which lives were lost and major property damage occurred, to realize how costly, especially in terms of human life, an unexpected flood can be. Accurate forecasts and warnings cannot be made, however, without detailed information about precipitation and streamflow in the drainage basin. Since the mid 1960's, the National Weather Service (NWS) has been able to forecast flooding on larger streams in Somerset County, such as the Raritan and Millstone Rivers. Flooding on smaller streams in urban areas was more difficult to predict. In response to this problem the NWS, in cooperation with the Green Brook Flood Control Commission, installed a precipitation gage in North Plainfield, and two flash-flood alarms, one on Green Brook at Seeley Mills and one on Stony Brook at Watchung, in the early 1970's. In 1978, New Jersey's first countywide flood-warning system was installed by the U.S. Geological Survey (USGS) in Somerset County. This system consisted of a network of eight stage and discharge gages equipped with precipitation gages linked by telephone telemetry and eight auxiliary precipitation gages. The gages were installed throughout the county to collect precipitation and runoff data that could be used to improve flood-monitoring capabilities and flood-frequency estimates. Recognizing the need for more detailed hydrologic information for Somerset County, the USGS, in cooperation with Somerset County, designed and installed the Somerset County Flood Information System (SCFIS) in 1990. This system is part of a statewide network of stream gages, precipitation gages, weather stations, and tide gages that collect data in real time. The data provided by the SCFIS improve the flood forecasting ability of the NWS and aid Somerset County and municipal agencies in the planning and execution of flood-preparation and emergency-evacuation procedures in the county. This fact sheet describes the SCFIS and identifies its benefits.

  2. The Impact of Land-Atmosphere Coupling on the 2017 Northern Great Plains Drought

    NASA Astrophysics Data System (ADS)

    Roundy, J. K.; Santanello, J. A., Jr.

    2017-12-01

    In a changing climate, the potential for increased frequency and duration of drought implies devastating impacts on many aspects of society. The negative impacts of drought can be reduced by informing sustainable water management through real-time monitoring and prediction. The refinement of forecast models is best realized through large-scale, observation-based datasets, yet few such datasets are currently available. The Coupling Drought Index (CDI) is a metric based on the persistence of land-atmosphere (L-A) coupling in distinct regimes derived from observations of the land and atmospheric state. Coupling regime persistence has been shown to relate to drought intensification and recovery and is the basis for the Coupling Statistical Model (CSM), which uses a Markov chain framework to make statistical predictions. The CDI and CSM have been used to understand the predictability of L-A interactions in NCEP's Climate Forecast System version 2 (CFSv2) and indicated that the forecasts exhibit strong biases in the L-A coupling that produce biases in precipitation and limit the predictability of drought. The CDI can also be derived exclusively from satellite data, which provides an observational large-scale metric of L-A coupling and drought evolution. This provides a unique observational tool for understanding the persistence and intensification of drought through land-atmosphere interactions. During the spring and summer of 2017, a drought developed over the Northern Great Plains that caused substantial agricultural losses in parts of Montana and North and South Dakota. In this work, we use the satellite-derived CDI to explore the impact of land-atmosphere interactions on the persistence and intensification of the 2017 Northern Great Plains drought. To do this we analyze and quantify the change in CDI at various spatial and temporal scales and correlate these changes with other drought indicators, including the U.S. Drought Monitor (http://droughtmonitor.unl.edu). The 2017 Northern Great Plains drought is compared to previous droughts in the region, and the predictability of the 2017 drought, as well as of future droughts in the area, from the CSM is assessed.

  3. Monitoring the performance of the next Climate Forecast System version 3, throughout its development stage at EMC/NCEP

    NASA Astrophysics Data System (ADS)

    Peña, M.; Saha, S.; Wu, X.; Wang, J.; Tripp, P.; Moorthi, S.; Bhattacharjee, P.

    2016-12-01

    The next version of the operational Climate Forecast System (version 3, CFSv3) will be a fully coupled six-component system with diverse applications to Earth system modeling, including weather and climate prediction. This system will couple the Earth's atmosphere, land, ocean, sea ice, waves and aerosols for both data assimilation and modeling. It will also use the NOAA Environmental Modeling System (NEMS) software superstructure to couple these components. The CFSv3 is part of the next Unified Global Coupled System (UGCS), which will unify the global prediction systems that are now operational at NCEP. The UGCS is being developed through the efforts of dedicated research and engineering teams and through coordination across many CPO/MAPP and NGGPS groups. During this development phase, the UGCS is being tested for seasonal applications and undergoes frequent revisions. Each new revision is evaluated to quickly discover, isolate and solve problems that negatively impact its performance. In the UGCS-seasonal model, components (e.g., ocean, sea ice, atmosphere) are coupled through a NEMS-based "mediator". In this numerical infrastructure, model diagnostics and forecast validation are carried out, both component by component and as a whole. The next stage, model optimization, will require enhanced performance diagnostics tools to help prioritize areas of numerical improvement. After the technical development of the UGCS-seasonal is completed, it will become the first realization of the CFSv3. All future development of this system will be carried out by the climate team at NCEP, in scientific collaboration with the groups that developed the individual components, as well as the climate community. A unique challenge in evaluating this unified weather-climate system is the large number of variables, which evolve over a wide range of temporal and spatial scales. A small set of performance measures and scorecard displays is being created, and collaboration and software contributions from research and operational centers are being incorporated. The status of the CFSv3/UGCS-seasonal development and examples of its performance and measurement tools will be presented.

  4. 40 CFR Appendix A to Part 57 - Primary Nonferrous Smelter Order (NSO) Application

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Profit and Loss Summary A.4 Historical Capital Investment Summary B.1 Pre-Control Revenue Forecast B.2 Pre-Control Cost Forecast B.3 Pre-Control Forecast Profit and Loss Summary B.4 Constant Controls Revenue Forecast B.5 Constant Controls Cost Forecast B.6 Constant Controls Forecast Profit and Loss...

  5. Proceedings of the Annual Federal Forecasters Conference (2nd, Washington, D.C., September 6-7, 1989).

    ERIC Educational Resources Information Center

    Sonnenberg, William; And Others

    The Second Annual Federal Forecasters Conference, "Forecasting and Public Policy", provided a forum where forecasters from various Federal agencies could meet and discuss aspects of forecasting in the U.S. Government. A total of 140 forecasters from 42 Federal agencies and other organizations attended the conference. Opening remarks by…

  6. The Federal Forecasters Conference-1997. Papers and Proceedings (9th, Washington, D.C., September 11, 1997).

    ERIC Educational Resources Information Center

    Gerald, Debra E., Ed.

    The Ninth Federal Forecasters Conference provided a forum in which forecasters from different federal agencies and other organizations could meet to discuss various aspects of forecasting in the United States. The theme was "Forecasting in an Era of Diminishing Resources." The conference was attended by 150 forecasters. A keynote address by…

  7. Realizing NASA's Goal of Societal Benefits From Earth Observations in Mesoamerica Through the SERVIR Project

    NASA Astrophysics Data System (ADS)

    Hardin, D. M.; Irwin, D.; Sever, T.; Graves, S.

    2006-12-01

    One of the goals of NASA's Applied Sciences Program is to manifest societal benefits from the vast store of Earth Observations through partnerships with public, private and academic organizations. The SERVIR project represents an early success toward this goal. By combining Earth Observations from NASA missions, results from environmental models and decision support tools from its partners, the SERVIR project has produced an integrated systems solution that is yielding societal benefits for the region of Mesoamerica. The architecture of the SERVIR system consists of an operational facility in Panama with regional nodes in Costa Rica, Nicaragua, Honduras, Guatemala, El Salvador and Belize, plus a Rapid Prototyping Center (RPC) located in Huntsville, Alabama. The RPC, funded by NASA's Applied Sciences Division and developed by the Information Technology and Systems Center at the University of Alabama in Huntsville and NASA Marshall Space Flight Center, produces scientifically strong decision support products and applications. When mature, the products and applications migrate to the operational center in Panama. There, they are available to environmental ministers and decision makers in Mesoamerica. In June 2004, the SERVIR project was contacted by the environmental ministry of El Salvador, which urgently requested remote sensing imagery of the location, direction, and extent of a harmful algal bloom (HAB) event off the coast of El Salvador and Guatemala. Using MODIS data, the SERVIR team developed a value-added product that predicts the location, direction, and extent of HABs. The products are produced twice daily and are used by the El Salvadoran and Guatemalan governments to alert their tourism and fishing industries of potential red tide events. This has enabled these countries to save millions of dollars for their industries as well as improve the health of harvested fish. In the area of short-term weather forecasting, the SERVIR team, in collaboration with the NASA Short-term Prediction Research and Transition (SPoRT) Center, generates 24-hour forecasts twice daily utilizing the Weather Research and Forecasting (WRF) model. Originally aimed at forecasts for the United States, the SPoRT team extended their work to cover the Mesoamerican region. Following testing at the RPC, the system was installed in Panama and is currently producing forecasts that are used by tour guides, boat captains on river and ocean fishing tours, and cruise ship captains. This capability fits perfectly with NASA's goals since an existing project was modified, at minimal cost, to provide societal benefits to the population of a different geographic region. On June 30, 2006, several new applications matured and the inventory of decision support products was significantly expanded. As a result, the SERVIR website was reorganized to reflect the changes. The degree of change was sufficient for the developers to designate it as a new release of SERVIR. The applications include a Real-Time Image Viewer, a customized version of NASA World Wind for Mesoamerica known as SERVIR-VIZ (developed by IAGT) and the SERVIR Data Portal (developed by the Water Center for the Humid Tropics of Latin America and the Caribbean). The success of the SERVIR project is reflected in its selection by NASA as the decision support system for the Ecological Forecasting National Application. The SERVIR model is also under consideration for other regions of the globe. Potential areas for development are Africa, South America and the Caribbean.

  8. GloFAS-Seasonal: Operational Seasonal Ensemble River Flow Forecasts at the Global Scale

    NASA Astrophysics Data System (ADS)

    Emerton, Rebecca; Zsoter, Ervin; Smith, Paul; Salamon, Peter

    2017-04-01

    Seasonal hydrological forecasting has potential benefits for many sectors, including agriculture, water resources management and humanitarian aid. At present, no global-scale seasonal hydrological forecasting system exists operationally; although smaller-scale systems have begun to emerge around the globe over the past decade, a system providing consistent global-scale seasonal forecasts would be of great benefit in regions where no other forecasting system exists, and to organisations operating at the global scale, such as those involved in disaster relief. We present here a new operational global ensemble seasonal hydrological forecast, currently under development at ECMWF as part of the Global Flood Awareness System (GloFAS). The proposed system, which builds upon the current version of GloFAS, takes the runoff from the long-range forecasts of the ECMWF System4 ensemble seasonal forecast system (which incorporates the HTESSEL land surface scheme) and uses it as input to the Lisflood routing model, producing a seasonal river flow forecast out to 4 months lead time for the global river network. The seasonal forecasts will be evaluated using the global river discharge reanalysis, and observations where available, to determine the potential value of the forecasts across the globe. The seasonal forecasts will be presented as a new layer in the GloFAS interface, which will provide a global map of river catchments, indicating whether the catchment-averaged discharge forecast is showing abnormally high or low flows during the 4-month lead time. Each catchment will display the corresponding forecast as an ensemble hydrograph of the weekly-averaged discharge forecast out to 4 months, with percentile thresholds shown for comparison with the discharge climatology. The forecast visualisation is based on a combination of the current medium-range GloFAS forecasts and the operational EFAS (European Flood Awareness System) seasonal outlook, and aims to effectively communicate the nature of a seasonal outlook while providing useful information to users and partners. We demonstrate the first version of an operational GloFAS seasonal outlook, outlining the model set-up and presenting a first look at the seasonal forecasts that will be displayed in the GloFAS interface, and discuss the initial results of the forecast evaluation.
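    The thresholding of catchment-averaged ensemble discharge against climatology can be sketched as below; the ensemble size, the gamma-distributed toy data and the 10th/90th-percentile cut-offs are illustrative assumptions, not the GloFAS configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical data: 20 years of weekly-mean discharge climatology for one catchment
    # and a 25-member ensemble forecast of an upcoming week's mean discharge (m^3/s).
    climatology = rng.gamma(shape=2.0, scale=50.0, size=20 * 52)
    ensemble = rng.gamma(shape=2.0, scale=65.0, size=25)

    # Assumed thresholds: below the 10th / above the 90th climatological percentile.
    low_thr, high_thr = np.percentile(climatology, [10, 90])

    p_low = np.mean(ensemble < low_thr)    # fraction of members "abnormally low"
    p_high = np.mean(ensemble > high_thr)  # fraction of members "abnormally high"

    if p_high > 0.5:
        flag = "abnormally high flow likely"
    elif p_low > 0.5:
        flag = "abnormally low flow likely"
    else:
        flag = "near-normal flow"
    print(f"P(low) = {p_low:.2f}, P(high) = {p_high:.2f} -> {flag}")
    ```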

  9. Ensemble flare forecasting: using numerical weather prediction techniques to improve space weather operations

    NASA Astrophysics Data System (ADS)

    Murray, S.; Guerra, J. A.

    2017-12-01

    One essential component of operational space weather forecasting is the prediction of solar flares. Early flare forecasting work focused on statistical methods based on historical flaring rates, but more complex machine learning methods have been developed in recent years. A multitude of flare forecasting methods are now available; however, it is still unclear which of these methods performs best, and none are substantially better than climatological forecasts. Current operational space weather centres cannot rely on automated methods, and generally use statistical forecasts with a little human intervention. Space weather researchers are increasingly looking towards methods used in terrestrial weather to improve current forecasting techniques. Ensemble forecasting has been used in numerical weather prediction for many years as a way to combine different predictions in order to obtain a more accurate result. It has proved useful in areas such as magnetospheric modelling and coronal mass ejection arrival analysis, but has not yet been implemented in operational flare forecasting. Here we construct ensemble forecasts for major solar flares by linearly combining the full-disk probabilistic forecasts from a group of operational forecasting methods (ASSA, ASAP, MAG4, MOSWOC, NOAA, and Solar Monitor). Forecasts from each method are weighted by a factor that accounts for the method's ability to predict previous events, and several performance metrics (both probabilistic and categorical) are considered. The results provide space weather forecasters with a set of parameters (combination weights, thresholds) that allow them to select the most appropriate values for constructing the 'best' ensemble forecast probability value, according to the performance metric of their choice. In this way different forecasts can be made to fit different end-user needs.
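    A minimal sketch of such a performance-weighted linear combination is given below; the three method names, the short verification history and the choice of inverse-Brier-score weights are assumptions for illustration, and the weighting and metrics used by the authors may differ.

    ```python
    import numpy as np

    def brier_score(probs, outcomes):
        """Mean squared error of probabilistic forecasts against 0/1 outcomes."""
        return np.mean((np.asarray(probs) - np.asarray(outcomes)) ** 2)

    # Hypothetical past daily flare probabilities from three methods and observed outcomes.
    past = {
        "method_A": [0.10, 0.40, 0.70, 0.20],
        "method_B": [0.30, 0.50, 0.60, 0.10],
        "method_C": [0.05, 0.20, 0.90, 0.30],
    }
    observed = [0, 1, 1, 0]

    # Weight each method by the inverse of its Brier score (better past skill gives a
    # larger weight), then normalise the weights so they sum to one.
    inv_bs = {m: 1.0 / brier_score(p, observed) for m, p in past.items()}
    total = sum(inv_bs.values())
    weights = {m: w / total for m, w in inv_bs.items()}

    # Today's forecasts from each method are combined linearly into one probability.
    today = {"method_A": 0.35, "method_B": 0.55, "method_C": 0.25}
    p_ensemble = sum(weights[m] * today[m] for m in today)
    print("weights =", {m: round(w, 2) for m, w in weights.items()})
    print(f"ensemble flare probability = {p_ensemble:.2f}")
    ```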

  10. Fuzzy logic-based analogue forecasting and hybrid modelling of horizontal visibility

    NASA Astrophysics Data System (ADS)

    Tuba, Zoltán; Bottyán, Zsolt

    2018-04-01

    Forecasting visibility is one of the greatest challenges in aviation meteorology. At the same time, high-accuracy visibility forecasts can significantly reduce, or make avoidable, weather-related risk in aviation. To improve visibility forecasting, this research links fuzzy logic-based analogue forecasting and post-processed numerical weather prediction model outputs in a hybrid forecast. The performance of the analogue forecasting model was improved by the application of the Analytic Hierarchy Process. Then, a linear combination of these outputs was applied to create an ultra-short-term hybrid visibility prediction which gradually shifts the focus from statistical to numerical products, taking advantage of each during the forecast period. This brings the numerical visibility forecast closer to the observations even if it is initially wrong. Complete verification of the categorical forecasts was carried out; results are also available for persistence and terminal aerodrome forecasts (TAF) for comparison. The average Heidke Skill Score (HSS) of the examined airports for the analogue and hybrid forecasts shows very similar results even at the end of the forecast period, where the weight of the analogue prediction in the final hybrid output is only 0.1-0.2. However, in the case of poor visibility (1000-2500 m), hybrid (0.65) and analogue forecasts (0.64) have a similar average HSS in the first 6 h of the forecast period, and perform better than persistence (0.60) or TAF (0.56). An important achievement is that the hybrid model takes the physics and dynamics of the atmosphere into account through the increasing weight of the numerical weather prediction. Despite this, its performance is similar to that of the most effective visibility forecasting methods and does not inherit the poor verification results of purely numerical outputs.
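    A minimal sketch of a lead-time-dependent linear blend is shown below; the weight schedule (0.9 falling to 0.15) and the visibility values are hypothetical and only illustrate how a hybrid forecast can gradually hand over from the analogue (statistical) to the NWP-based product.

    ```python
    import numpy as np

    def blend_weight(lead_h, w_start=0.9, w_end=0.15, horizon_h=9):
        """Weight given to the analogue forecast, decreasing linearly with lead time."""
        frac = min(max(lead_h / horizon_h, 0.0), 1.0)
        return w_start + (w_end - w_start) * frac

    # Hypothetical visibility forecasts (metres) for lead times 0..9 h.
    analogue_vis = np.array([800, 900, 1100, 1400, 1800, 2300, 2800, 3400, 4000, 4500])
    nwp_vis = np.array([2500, 2500, 2600, 2700, 2900, 3100, 3300, 3600, 3900, 4200])

    for lead in range(10):
        w = blend_weight(lead)
        hybrid = w * analogue_vis[lead] + (1.0 - w) * nwp_vis[lead]
        print(f"+{lead} h: w_analogue={w:.2f}, hybrid visibility = {hybrid:.0f} m")
    ```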

  11. Difficult decisions in times of constraint: Criteria based Resource Allocation in the Vancouver Coastal Health Authority

    PubMed Central

    2011-01-01

    Objectives The aim of the project was to develop a plan to address a forecasted deficit of approximately $4.65 million for fiscal year 2010/11 in the Vancouver Communities division of the Vancouver Coastal Health Authority. For disinvestment opportunities identified beyond the forecasted deficit, a commitment was made to consider options for resource re-allocation within the Vancouver Communities division. Methods A standard approach to program budgeting and marginal analysis (PBMA) was taken with a priority setting working committee and a broader advisory panel. An experienced, non-vested internal project manager worked closely with the two-member external research team throughout the process. Face-to-face evaluation interviews were held with 10 decision makers immediately following the process. Results The recommendations of the working committee included the implementation of 44 disinvestment initiatives with an annualized value of CAD $4.9 million, as well as consideration of possible investments if the realized savings match expectations. Overall, decision makers viewed the process favorably and the primary aim of addressing the deficit gap was met. Discussion A key challenge was the tight timeline, which likely led to less evidence-informed decision making than one would hope for. Despite this, decision makers felt that better decisions were made than had the process not been in place. In the end, this project adds value in finding that PBMA can be used to cover a deficit and minimize opportunity cost through systematic application of criteria, whilst ensuring process fairness through a focus on communication, transparency and decision maker engagement. PMID:21756357

  12. Cosmic Rays in the Earth's Atmosphere and Underground

    NASA Astrophysics Data System (ADS)

    Dorman, Lev I.

    2004-08-01

    This book consists of four parts. In the first part (Chapters 1-4) a full overview is given of the theoretical and experimental basis of Cosmic Ray (CR) research in the atmosphere and underground for Geophysics and Space Physics; the development of CR research and a short history of many fundamental discoveries, main properties of primary and secondary CR, methods of transformation of CR observation data in the atmosphere and underground to space, and the experimental basis of CR research underground and on the ground, on balloons and on satellites and space probes. The second part (Chapters 5-9) is devoted to the influence of atmospheric properties on CR, so called CR meteorological effects; pressure, temperature, humidity, snow, wind, gravitation, and atmospheric electric field effects. The inverse problem - the influence of CR properties on the atmosphere and atmospheric processes is considered in the third part (Chapters 10-14); influence on atmospheric, nuclear and chemical compositions, ionization and radio-wave propagation, formation of thunderstorms and lightning, clouds and climate change. The fourth part (Chapters 15-18) describes many realized and potential applications of CR research in different branches of Science and Technology; Meteorology and Aerodrome Service, Geology and Geophysical Prospecting, Hydrology and Agricultural Applications, Archaeology and Medicine, Seismology and Big Earthquakes Forecasting, Space Weather and Environment Monitoring/Forecasting. The book ends with a list providing more than 1,500 full references, a discussion on future developments and unsolved problems, as well as object and author indices. This book will be useful for experts in different branches of Science and Technology, and for students to be used as additional literature to text-books.

  13. Integrated Modelling in CRUCIAL Science Education

    NASA Astrophysics Data System (ADS)

    Mahura, Alexander; Nuterman, Roman; Mukhamedzhanova, Elena; Nerobelov, Georgiy; Sedeeva, Margarita; Suhodskiy, Alexander; Mostamandy, Suleiman; Smyshlyaev, Sergey

    2017-04-01

    The NordForsk CRUCIAL project (2016-2017), "Critical steps in understanding land surface - atmosphere interactions: from improved knowledge to socioeconomic solutions", part of the Pan-Eurasian EXperiment (PEEX; https://www.atm.helsinki.fi/peex) programme activities, seeks deeper collaboration between Nordic and Russian science communities. In particular, following collaboration between Danish and Russian partners, several topics were selected for joint research, focused on the evaluation of: (1) the impact of urbanization processes on changes in urban weather and climate at urban, subregional and regional scales, contributing to assessment studies for population and environment; (2) the effects of various feedback mechanisms on aerosol and cloud formation and radiative forcing at urban-regional scales, for better prediction of extreme weather events and as a contribution to early warning systems; (3) environmental contamination from continuous emissions and industrial accidents, for better assessment and decision making for sustainable social and economic development; and (4) the climatology of the atmospheric boundary layer in northern latitudes, to improve understanding of processes, revise parameterizations, and improve weather forecasting. These research topics are addressed using the online integrated Enviro-HIRLAM (Environment - High Resolution Limited Area Model) model within students' research projects: (1) "Online integrated high-resolution modelling of Saint-Petersburg metropolitan area influence on weather and air pollution forecasting"; (2) "Modeling of aerosol impact on regional-urban scales: case study of Saint-Petersburg metropolitan area"; (3) "Regional modeling and GIS evaluation of environmental pollution from Kola Peninsula sources"; and (4) "Climatology of the High-Latitude Planetary Boundary Layer". The results achieved in the students' projects and the planned young scientists' research training on online integrated modelling (June 2017) will be presented and discussed.

  14. Solar and Space Physics: A Science for a Technological Society

    NASA Technical Reports Server (NTRS)

    2013-01-01

    From the interior of the Sun, to the upper atmosphere and near-space environment of Earth, and outward to a region far beyond Pluto where the Sun's influence wanes, advances during the past decade in space physics and solar physics (the disciplines NASA refers to as heliophysics) have yielded spectacular insights into the phenomena that affect our home in space. This report, from the National Research Council's (NRC's) Committee for a Decadal Strategy in Solar and Space Physics, is the second NRC decadal survey in heliophysics. Building on the research accomplishments realized over the past decade, the report presents a program of basic and applied research for the period 2013-2022 that will improve scientific understanding of the mechanisms that drive the Sun's activity and the fundamental physical processes underlying near-Earth plasma dynamics, determine the physical interactions of Earth's atmospheric layers in the context of the connected Sun-Earth system, and greatly enhance the capability to provide realistic and specific forecasts of Earth's space environment that will better serve the needs of society. Although the recommended program is directed primarily to NASA (Science Mission Directorate -- Heliophysics Division) and the National Science Foundation (NSF) (Directorate for Geosciences -- Atmospheric and Geospace Sciences) for action, the report also recommends actions by other federal agencies, especially those parts of the National Oceanic and Atmospheric Administration (NOAA) charged with the day-to-day (operational) forecast of space weather. In addition to the recommendations included in this summary, related recommendations are presented in the main text of the report.

  15. A Hybrid Supervised/Unsupervised Machine Learning Approach to Solar Flare Prediction

    NASA Astrophysics Data System (ADS)

    Benvenuto, Federico; Piana, Michele; Campi, Cristina; Massone, Anna Maria

    2018-01-01

    This paper introduces a novel method for flare forecasting, combining prediction accuracy with the ability to identify the most relevant predictive variables. This result is obtained by means of a two-step approach: first, a supervised regularization method for regression, namely LASSO, is applied, where a sparsity-enhancing penalty term allows the identification of the significance with which each data feature contributes to the prediction; then, an unsupervised fuzzy clustering technique for classification, namely Fuzzy C-Means, is applied, where the regression outcome is partitioned through the minimization of a cost function and without focusing on the optimization of a specific skill score. This approach is therefore hybrid, since it combines supervised and unsupervised learning; realizes classification in an automatic, skill-score-independent way; and provides effective prediction performance even in the case of imbalanced data sets. Its prediction power is verified against NOAA Space Weather Prediction Center data, using as test set data in the range between August 1996 and December 2010 and as training set data in the range between December 1988 and June 1996. To validate the method, we computed several skill scores typically utilized in flare prediction and compared the values provided by the hybrid approach with those provided by several standard (non-hybrid) machine learning methods. The results showed that the hybrid approach performs classification better than all other supervised methods and with an effectiveness comparable to that of clustering methods; but, in addition, it provides a reliable ranking of the weights with which the data properties contribute to the forecast.
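    The two-step idea can be sketched as follows, with scikit-learn's Lasso for the sparse regression and a small hand-written Fuzzy C-Means on the one-dimensional regression output; the synthetic features, the penalty weight alpha and the fuzzifier m are illustrative assumptions rather than the authors' settings.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)

    # Hypothetical active-region features and a continuous flare-productivity target;
    # the real feature set and labels come from NOAA SWPC data.
    X = rng.normal(size=(300, 8))
    y = 1.5 * X[:, 0] - 0.8 * X[:, 3] + 0.1 * rng.normal(size=300)

    # Step 1: LASSO regression; near-zero coefficients mark weakly predictive features.
    lasso = Lasso(alpha=0.05).fit(X, y)
    print("LASSO coefficients:", np.round(lasso.coef_, 2))
    scores = lasso.predict(X)

    # Step 2: Fuzzy C-Means on the regression output, splitting it into two regimes
    # without optimising any particular skill score.
    def fuzzy_cmeans_1d(x, c=2, m=2.0, n_iter=100):
        x = np.asarray(x, dtype=float)
        centers = np.percentile(x, np.linspace(10, 90, c))  # simple initialisation
        for _ in range(n_iter):
            d = np.abs(x[:, None] - centers[None, :]) + 1e-12
            u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
            centers = (u ** m).T @ x / np.sum(u ** m, axis=0)
        return centers, u

    centers, u = fuzzy_cmeans_1d(scores)
    labels = np.argmax(u, axis=1)  # hard labels derived from the fuzzy memberships
    print("cluster centres:", np.round(centers, 2), "cluster sizes:", np.bincount(labels))
    ```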

  16. Appraisal of cooperation with a palliative care case manager by general practitioners and community nurses: a cross-sectional questionnaire study.

    PubMed

    van der Plas, Annicka G M; Onwuteaka-Philipsen, Bregje D; Vissers, Kris C; Deliens, Luc; Jansen, Wim J J; Francke, Anneke L

    2016-01-01

    To investigate how general practitioners and community nurses value the support that they receive from a nurse case manager with expertise in palliative care, whether they think the case manager is helpful in realizing appropriate care and what characteristics of the patient and case management are associated with this view. For sustainable palliative care in an ageing society, basic palliative care is provided by generalists and specialist palliative care is reserved for complex situations. Acceptance of and cooperation with specialist palliative care providers by the general practitioner and community nurse is pivotal. Cross-sectional questionnaire study. Questionnaire data from 168 general practitioners and 125 community nurses were analysed using chi-square tests and univariate and multivariate logistic regression. Data were gathered between March 2011 and December 2013. Of general practitioners, 46% rated the case manager as helpful in realizing care that is appropriate for the patient; for community nurses this was 49%. The case manager did not hinder the process of care and had added value for patients, according to the general practitioners and community nurses. The tasks of the case manager were associated with whether or not the case manager was helpful in realizing appropriate care, whereas patient characteristics and the number of contacts with the case manager were not. General practitioners and community nurses are moderately positive about the support from the case manager. To improve cooperation further, case managers should invest in contact with general practitioners and community nurses. © 2015 John Wiley & Sons Ltd.

  17. WPC Maximum Heat Index Forecasts

    Science.gov Websites

    Interactive maps of maximum heat index and probability forecasts for the western United States, issued by the Weather Prediction Center.

  18. Assimilating Ferry Box data into the Aegean Sea model

    NASA Astrophysics Data System (ADS)

    Korres, G.; Ntoumas, M.; Potiris, M.; Petihakis, G.

    2014-12-01

    Operational monitoring and forecasting of marine environmental conditions is a necessary tool for the effective management and protection of the marine ecosystem. It requires the use of multi-variable real-time measurements combined with advanced physical and ecological numerical models. Towards this, a FerryBox system was originally installed and operated on the route Piraeus-Heraklion in 2003 for one year. In early 2012 the system was upgraded and moved to a new high-speed ferry traveling daily on the same route as before. This route largely traverses the Cretan Sea, the largest and deepest basin (2500 m) in the south Aegean Sea. The HCMR FerryBox is today the only one in the Mediterranean and can thus be considered a pilot case. The analysis of FerryBox SST and SSS in situ data revealed the presence of important regional and sub-basin scale physical phenomena, such as wind-driven coastal upwelling and the presence of a mesoscale cyclone to the north of Crete. In order to assess the impact of the FerryBox SST data in constraining the Aegean Sea hydrodynamic model, which is part of the POSEIDON forecasting system, the in situ data were assimilated using an advanced multivariate assimilation scheme based on the Singular Evolutive Extended Kalman (SEEK) filter, a simplified square-root extended Kalman filter that operates with low-rank error covariance matrices as a way to reduce the computational burden. Thus, during the period mid-August 2012 to mid-January 2013, in addition to the standard assimilated parameters, daily SST data along the ferryboat route from Piraeus to Heraklion were assimilated into the model. Inter-comparisons between the control run of the system (a model run that uses only the standard set of observations) and the experiment in which the observational data set is augmented with the FerryBox SST data produce interesting results. Apart from the improvement of the SST error, the additional assimilation of daily FerryBox SST observations is found to have a significant impact on the correct representation of the dynamical dipole in the central Cretan Sea and other dynamic features of the South Aegean Sea, which is reflected in the decrease of the basin-wide SSH RMS error.
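    A minimal sketch of a reduced-rank analysis step in the spirit of the SEEK filter is given below; the dimensions, the synthetic ferry-track observations and the diagonal observation-error covariance are assumptions for illustration, and the operational SEEK implementation also evolves the low-rank error basis, which is omitted here.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    n, r, m = 200, 10, 5          # state size, covariance rank, number of SST observations
    x_f = rng.normal(size=n)      # forecast (background) state vector
    S = 0.3 * rng.normal(size=(n, r))   # low-rank square root of the error covariance, P ~ S S^T
    H = np.zeros((m, n))          # observation operator: sample m model points on the ferry track
    H[np.arange(m), rng.choice(n, size=m, replace=False)] = 1.0
    R = 0.1 * np.eye(m)           # observation-error covariance
    y = H @ x_f + rng.normal(scale=0.3, size=m)   # synthetic FerryBox-like SST observations

    # Reduced-rank Kalman analysis: only the n x r factor S is stored, never the full n x n P.
    HS = H @ S                                 # (m, r)
    innov_cov = HS @ HS.T + R                  # (m, m)
    K = S @ HS.T @ np.linalg.inv(innov_cov)    # Kalman gain, (n, m)
    x_a = x_f + K @ (y - H @ x_f)              # analysis state

    print("max analysis increment:", float(np.max(np.abs(x_a - x_f))))
    ```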

  19. Global Tree Range Shifts Under Forecasts from Two Alternative GCMs Using Two Future Scenarios

    NASA Astrophysics Data System (ADS)

    Hargrove, W. W.; Kumar, J.; Potter, K. M.; Hoffman, F. M.

    2013-12-01

    Global shifts in the environmentally suitable ranges of 215 tree species were predicted under forecasts from two GCMs (the Parallel Climate Model (PCM) and the Hadley Model), each under two IPCC future climatic scenarios (A1 and B1), each at two future dates (2050 and 2100). The analysis considers the entire global land surface at a resolution of 4 km². A statistical multivariate clustering procedure was used to quantitatively delineate 30 thousand environmentally homogeneous ecoregions across present and 8 potential future global locations at once, using global maps of 17 environmental characteristics describing temperature, precipitation, soils, topography and solar insolation. Presence of each tree species on Forest Inventory and Analysis (FIA) plots and in Global Biodiversity Information Facility (GBIF) samples was used to select a subset of suitable ecoregions from the full set of 30 thousand. Once identified, this suitable subset of ecoregions was compared to the known current range of the tree species under present conditions. Predicted present ranges correspond well with current understanding for all but a few of the 215 tree species. The subset of suitable ecoregions for each tree species can then be tracked into the future to determine whether the suitable home range for this species remains the same, moves, grows, shrinks, or disappears under each model/scenario combination. Occurrence and growth performance measurements for various tree species across the U.S. are limited to FIA plots. We present a new, general-purpose empirical imputation method which associates sparse measurements of dependent variables with particular multivariate clustered combinations of the independent variables, and then estimates values for unmeasured clusters, based on directional proximity in multidimensional data space, at both the cluster and map-cell levels of resolution. Using Associative Clustering, we scaled up the FIA point measurements into continuous maps that show the expected growth and suitability for individual tree species across the continental US. Maps were generated for each tree species showing the Minimum Required Movement (MRM) straight-line distance from each currently suitable location to the geographically nearest "lifeboat" location having suitable conditions in the future. Locations that are the closest "lifeboats" for many MRM propagules originating from wide surrounding areas may constitute high-priority preservation targets as a refugium against climatic change.
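    A rough sketch of the workflow is given below, with k-means standing in for the multivariate clustering procedure and entirely synthetic coordinates, environmental layers and occupied clusters; the Minimum Required Movement is then the distance from each currently suitable cell to the nearest cell that remains suitable under the future climate.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(3)

    # Hypothetical map cells: x/y coordinates (km) plus standardised environmental
    # variables (temperature, precipitation, soils, ...) for present and future climate.
    n_cells = 5000
    coords = rng.uniform(0, 1000, size=(n_cells, 2))
    env_present = rng.normal(size=(n_cells, 5))
    env_future = env_present + rng.normal(scale=0.4, size=(n_cells, 5))  # shifted climate

    # Cluster present and future environments together so cluster IDs are comparable in time.
    km = KMeans(n_clusters=50, n_init=10, random_state=0)
    labels = km.fit_predict(np.vstack([env_present, env_future]))
    lab_present, lab_future = labels[:n_cells], labels[n_cells:]

    # Suppose a species currently occupies these clusters (inferred from FIA/GBIF points).
    suitable_clusters = [3, 7, 11]
    now_suitable = np.isin(lab_present, suitable_clusters)
    future_suitable = np.isin(lab_future, suitable_clusters)

    # Minimum Required Movement: distance from each currently suitable cell to the
    # nearest "lifeboat" cell that is still suitable in the future.
    tree = cKDTree(coords[future_suitable])
    mrm_km, _ = tree.query(coords[now_suitable])
    print(f"median MRM = {np.median(mrm_km):.1f} km over {now_suitable.sum()} cells")
    ```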

  20. EnrollForecast for Excel: K-12 Enrollment Forecasting Program. Software & User's Guide. [Computer Diskette].

    ERIC Educational Resources Information Center

    Smith, Curtis A.

    "EnrollForecast for Excel" will generate a 5-year forecast of K-12 student enrollment. It will also work for any combination of grades between kindergarten and twelth. The forecasts can be printed as either a table or a graph. The user must provide birth history (only if forecasting kindergarten) and enrollment history information. The user also…

  1. On the reliability of seasonal climate forecasts.

    PubMed

    Weisheimer, A; Palmer, T N

    2014-07-06

    Seasonal climate forecasts are being used increasingly across a range of application sectors. A recent UK governmental report asked: how good are seasonal forecasts on a scale of 1-5 (where 5 is very good), and how good can we expect them to be in 30 years time? Seasonal forecasts are made from ensembles of integrations of numerical models of climate. We argue that 'goodness' should be assessed first and foremost in terms of the probabilistic reliability of these ensemble-based forecasts; reliable inputs are essential for any forecast-based decision-making. We propose that a '5' should be reserved for systems that are not only reliable overall, but where, in particular, small ensemble spread is a reliable indicator of low ensemble forecast error. We study the reliability of regional temperature and precipitation forecasts of the current operational seasonal forecast system of the European Centre for Medium-Range Weather Forecasts, universally regarded as one of the world-leading operational institutes producing seasonal climate forecasts. A wide range of 'goodness' rankings, depending on region and variable (with summer forecasts of rainfall over Northern Europe performing exceptionally poorly) is found. Finally, we discuss the prospects of reaching '5' across all regions and variables in 30 years time.
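    Probabilistic reliability of this kind is commonly summarised with a reliability diagram, in which binned forecast probabilities are compared with the observed relative frequency of the event; the sketch below uses synthetic forecast-outcome pairs, and the bin width and the imperfect-calibration toy model are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic event probabilities (e.g. "seasonal rainfall in the upper tercile")
    # and the corresponding 0/1 observed outcomes from a deliberately imperfect system.
    p_forecast = rng.uniform(size=2000)
    outcomes = (rng.uniform(size=2000) < 0.7 * p_forecast + 0.1).astype(int)

    # Reliability: within each probability bin, the observed frequency should match the
    # mean forecast probability (points lying on the diagonal of a reliability diagram).
    bins = np.linspace(0.0, 1.0, 11)
    idx = np.digitize(p_forecast, bins) - 1
    for b in range(10):
        mask = idx == b
        if mask.any():
            print(f"forecast {bins[b]:.1f}-{bins[b + 1]:.1f}: "
                  f"mean p = {p_forecast[mask].mean():.2f}, "
                  f"observed frequency = {outcomes[mask].mean():.2f} (n = {mask.sum()})")
    ```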

  2. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting life cycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include the identification of a core set of well-defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  3. Utilizing Climate Forecasts for Improving Water and Power Systems Coordination

    NASA Astrophysics Data System (ADS)

    Arumugam, S.; Queiroz, A.; Patskoski, J.; Mahinthakumar, K.; DeCarolis, J.

    2016-12-01

    Climate forecasts, typically monthly-to-seasonal precipitation forecasts, are commonly used to develop streamflow forecasts for improving reservoir management. Despite their high skill, temperature forecasts are seldom used to develop power demand forecasts alongside streamflow forecasts for improving water and power systems coordination. In this study, we consider a prototype system to analyze the utility of climate forecasts, both precipitation and temperature, for improving water and power systems coordination. The prototype system, a unit-commitment model that schedules power generation from various sources, is considered and its performance is compared with that of an energy system model having an equivalent reservoir representation. Different skill levels of streamflow forecasts and power demand forecasts are imposed on both the water and power systems representations to understand the level of model complexity required for utilizing monthly-to-seasonal climate forecasts to improve coordination between these two systems. The analyses also identify various decision-making strategies - forward purchasing of fuel stocks, scheduled maintenance of various power systems and tradeoffs in water appropriation between hydropower and other uses - in the context of various water and power systems configurations. Potential application of such analyses for integrating large power systems with multiple river basins is also discussed.

  4. Verification of temperature, precipitation, and streamflow forecasts from the NOAA/NWS Hydrologic Ensemble Forecast Service (HEFS): 1. Experimental design and forcing verification

    NASA Astrophysics Data System (ADS)

    Brown, James D.; Wu, Limin; He, Minxue; Regonda, Satish; Lee, Haksu; Seo, Dong-Jun

    2014-11-01

    Retrospective forecasts of precipitation, temperature, and streamflow were generated with the Hydrologic Ensemble Forecast Service (HEFS) of the U.S. National Weather Service (NWS) for a 20-year period between 1979 and 1999. The hindcasts were produced for two basins in each of four River Forecast Centers (RFCs), namely the Arkansas-Red Basin RFC, the Colorado Basin RFC, the California-Nevada RFC, and the Middle Atlantic RFC. Precipitation and temperature forecasts were produced with the HEFS Meteorological Ensemble Forecast Processor (MEFP). Inputs to the MEFP comprised "raw" precipitation and temperature forecasts from the frozen (circa 1997) version of the NWS Global Forecast System (GFS) and a climatological ensemble, which involved resampling historical observations in a moving window around the forecast valid date ("resampled climatology"). In both cases, the forecast horizon was 1-14 days. This paper outlines the hindcasting and verification strategy, and then focuses on the quality of the temperature and precipitation forecasts from the MEFP. A companion paper focuses on the quality of the streamflow forecasts from the HEFS. In general, the precipitation forecasts are more skillful than resampled climatology during the first week, but have little or no skill during the second week. In contrast, the temperature forecasts improve upon resampled climatology at all forecast lead times. However, there are notable differences among RFCs and for different seasons, aggregation periods and magnitudes of the observed and forecast variables, both for precipitation and temperature. For example, the MEFP-GFS precipitation forecasts show the highest correlations and greatest skill in the California-Nevada RFC, particularly during the wet season (November-April). While generally reliable, the MEFP forecasts typically underestimate the largest observed precipitation amounts (a Type-II conditional bias). As a statistical technique, the MEFP cannot detect, and thus appropriately correct for, conditions that are undetected by the GFS. The calibration of the MEFP to provide reliable and skillful forecasts of a range of precipitation amounts (not only large amounts) is a secondary factor responsible for these Type-II conditional biases. Interpretation of the verification results leads to guidance on the expected performance and limitations of the MEFP, together with recommendations on future enhancements.
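    The "resampled climatology" benchmark can be sketched as below; the 20-year synthetic record, the +/- 7-day window and the 50-member ensemble size are assumptions for illustration, not the MEFP configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical 20-year daily precipitation record (mm/day), year x day-of-year.
    n_years, n_days = 20, 365
    precip = rng.gamma(shape=0.4, scale=6.0, size=(n_years, n_days))

    def resampled_climatology(obs, valid_doy, window=7, n_members=50):
        """Ensemble built by resampling historical observations in a +/- window of days
        around the forecast valid day of year ("resampled climatology")."""
        days = (valid_doy + np.arange(-window, window + 1)) % obs.shape[1]
        pool = obs[:, days].ravel()                  # all years x all window days
        return rng.choice(pool, size=n_members, replace=True)

    ens = resampled_climatology(precip, valid_doy=150)
    print(f"climatological ensemble: mean = {ens.mean():.1f} mm/day, "
          f"90th percentile = {np.percentile(ens, 90):.1f} mm/day")
    ```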

  5. Assessing the skill of seasonal precipitation and streamflow forecasts in sixteen French catchments

    NASA Astrophysics Data System (ADS)

    Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian

    2015-04-01

    Meteorological centres make sustained efforts to provide seasonal forecasts that are increasingly skilful. Streamflow forecasting is one of the many applications that can benefit from these efforts. Seasonal flow forecasts generated using seasonal ensemble precipitation forecasts as input to a hydrological model can help to take anticipatory measures for water supply reservoir operation or drought risk management. The objective of the study is to assess the skill of seasonal precipitation and streamflow forecasts in France. First, we evaluated the skill of ECMWF SYS4 seasonal precipitation forecasts for streamflow forecasting in sixteen French catchments. Daily flow forecasts were produced using raw seasonal precipitation forecasts as input to the GR6J hydrological model. Ensemble forecasts are issued every month with 15 or 51 members, according to the month of the year, and evaluated for up to 90 days ahead. In a second step, we applied eight variants of bias correction approaches to the precipitation forecasts prior to generating the flow forecasts. The approaches were based on the linear scaling and distribution mapping methods. The skill of the ensemble forecasts was assessed in terms of accuracy (MAE), reliability (PIT diagram) and overall performance (CRPS). The results show that, in most catchments, raw seasonal precipitation and streamflow forecasts are more skilful in terms of accuracy and overall performance than a reference prediction based on historic observed precipitation and watershed initial conditions at the time of forecast. Reliability is the only attribute that is not significantly improved. The skill of the forecasts is, in general, improved when applying bias correction. Two bias correction methods showed the best performance for the studied catchments: the simple linear scaling of monthly values and the empirical distribution mapping of daily values. L. Crochemore is funded by the Interreg IVB DROP Project (Benefit of governance in DROught adaPtation).
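    A minimal sketch of the two bias correction families named above, linear scaling and empirical distribution (quantile) mapping, is given below with synthetic hindcast data; the gamma-distributed toy values and the imposed wet bias are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical hindcast-period data: observed and forecast monthly precipitation (mm),
    # with the forecasts given a deliberate wet bias.
    obs = rng.gamma(shape=3.0, scale=30.0, size=600)
    fcst_hind = 1.3 * rng.gamma(shape=3.0, scale=30.0, size=600)
    fcst_new = 1.3 * rng.gamma(shape=3.0, scale=30.0, size=51)   # new ensemble to correct

    # (1) Linear scaling: multiply by the ratio of observed to forecast climatological means.
    fcst_ls = fcst_new * (obs.mean() / fcst_hind.mean())

    # (2) Empirical distribution (quantile) mapping: replace each forecast value by the
    # observed value with the same non-exceedance probability in the hindcast period.
    def quantile_map(x, fcst_ref, obs_ref):
        ranks = np.searchsorted(np.sort(fcst_ref), x) / len(fcst_ref)  # empirical forecast CDF
        return np.quantile(obs_ref, np.clip(ranks, 0.0, 1.0))          # map through observed CDF

    fcst_qm = quantile_map(fcst_new, fcst_hind, obs)
    print(f"raw mean = {fcst_new.mean():.0f}, linear-scaled = {fcst_ls.mean():.0f}, "
          f"quantile-mapped = {fcst_qm.mean():.0f}, observed climatology = {obs.mean():.0f} mm")
    ```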

  6. Risky Business: Development, Communication and Use of Hydroclimatic Forecasts

    NASA Astrophysics Data System (ADS)

    Lall, U.

    2012-12-01

    Inter-seasonal and longer hydroclimatic forecasts have been made increasingly in the last two decades following the increase in ENSO activity since the early 1980s and the success in seasonal ENSO forecasting. Yet, the number of examples of systematic use of these forecasts and their incorporation into water systems operation continue to be few. This may be due in part to the limited skill in such forecasts over much of the world, but is also likely due to the limited evolution of methods and opportunities to "safely" use uncertain forecasts. There has been a trend to rely more on "physically based" rather than "physically informed" empirical forecasts, and this may in part explain the limited success in developing usable products in more locations. Given the limited skill, forecasters have tended to "dumb" down their forecasts - either formally or subjectively shrinking the forecasts towards climatology, or reducing them to tercile forecasts that serve to obscure the potential information in the forecast. Consequently, the potential utility of such forecasts for decision making is compromised. Water system operating rules are often designed to be robust in the face of historical climate variability, and consequently are adapted to the potential conditions that a forecast seeks to inform. In such situations, there is understandable reluctance by managers to use the forecasts as presented, except in special cases where an alternate course of action is pragmatically appealing in any case. In this talk, I review opportunities to present targeted forecasts for use with decision systems that directly address climate risk and the risk induced by unbiased yet uncertain forecasts, focusing especially on extreme events and water allocation in a competitive environment. Examples from Brazil and India covering surface and ground water conjunctive use strategies that could potentially be insured and lead to improvements over the traditional system operation and resource allocation are provided.

  7. A quality assessment of the MARS crop yield forecasting system for the European Union

    NASA Astrophysics Data System (ADS)

    van der Velde, Marijn; Bareuth, Bettina

    2015-04-01

    Timely information on crop production forecasts will become increasingly important as commodity markets become more and more interconnected. Impacts across large crop production areas due to, for example, extreme weather and pest outbreaks can create ripple effects that may affect food prices and availability elsewhere. The MARS Unit (Monitoring Agricultural ResourceS), DG Joint Research Centre, European Commission, has been providing forecasts of European crop production levels since 1993. The operational crop production forecasting is carried out with the MARS Crop Yield Forecasting System (M-CYFS). The M-CYFS is used to monitor crop growth development, evaluate short-term effects of anomalous meteorological events, and provide monthly forecasts of crop yield at national and European Union level. The crop production forecasts are published in the so-called MARS bulletins. Forecasting crop yield over large areas in an operational context requires quality benchmarks. Here we present an analysis of the accuracy and skill of past crop yield forecasts of the main crops (e.g. soft wheat, grain maize), throughout the growing season and specifically for the final forecast before harvest. Two simple benchmarks to assess the skill of the forecasts were defined: comparing the forecasts to 1) a forecast equal to the average yield and 2) a forecast using a linear trend established through the crop yield time series. These reveal a variability in performance as a function of crop and Member State. In terms of production, yields accounting for 67% of EU-28 soft wheat production and 80% of EU-28 maize production were forecast better than both benchmarks during the 1993-2013 period. In a changing and increasingly variable climate, crop yield forecasts can become increasingly valuable - provided they are used wisely. We end our presentation by discussing research activities that could contribute to this goal.
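    The two benchmarks can be sketched as below with a synthetic national yield series; the trend benchmark is fitted only to the years preceding each forecast, and all numbers are illustrative rather than MARS data.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical yield series (t/ha) for 1993-2013 with a weak upward trend,
    # and the corresponding final pre-harvest forecasts.
    years = np.arange(1993, 2014)
    yields = 5.0 + 0.03 * (years - years[0]) + rng.normal(scale=0.4, size=years.size)
    forecasts = yields + rng.normal(scale=0.25, size=years.size)

    mae = lambda pred: np.mean(np.abs(pred - yields))

    # Benchmark 1: forecast every year with the long-term average yield.
    bench_mean = np.full_like(yields, yields.mean())

    # Benchmark 2: forecast each year by extrapolating a trend fitted to the earlier years.
    bench_trend = np.empty_like(yields)
    for i, yr in enumerate(years):
        if i < 3:
            bench_trend[i] = yields[:max(i, 1)].mean()   # too little history: use the mean
        else:
            a, b = np.polyfit(years[:i], yields[:i], deg=1)
            bench_trend[i] = a * yr + b

    print(f"MAE forecast        = {mae(forecasts):.2f} t/ha")
    print(f"MAE mean benchmark  = {mae(bench_mean):.2f} t/ha")
    print(f"MAE trend benchmark = {mae(bench_trend):.2f} t/ha")
    ```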

  8. Understanding Farmers’ Forecast Use from Their Beliefs, Values, Social Norms, and Perceived Obstacles

    NASA Astrophysics Data System (ADS)

    Hu, Qi; Pytlik Zillig, Lisa M.; Lynne, Gary D.; Tomkins, Alan J.; Waltman, William J.; Hayes, Michael J.; Hubbard, Kenneth G.; Artikov, Ikrom; Hoffman, Stacey J.; Wilhite, Donald A.

    2006-09-01

    Although the accuracy of weather and climate forecasts is continuously improving and new information retrieved from climate data is adding to the understanding of climate variation, use of the forecasts and climate information by farmers in farming decisions has changed little. This lack of change may result from knowledge barriers and psychological, social, and economic factors that undermine farmer motivation to use forecasts and climate information. According to the theory of planned behavior (TPB), the motivation to use forecasts may arise from personal attitudes, social norms, and perceived control or ability to use forecasts in specific decisions. These attributes are examined using data from a survey designed around the TPB and conducted among farming communities in the region of eastern Nebraska and the western U.S. Corn Belt. There were three major findings: 1) the utility and value of the forecasts for farming decisions as perceived by farmers are, on average, around 3.0 on a 0-7 scale, indicating much room to improve attitudes toward the forecast value. 2) The use of forecasts by farmers to influence decisions is likely affected by several social groups that can provide “expert viewpoints” on forecast use. 3) A major obstacle, next to forecast accuracy, is the perceived identity and reliability of the forecast makers. Given the rapidly increasing number of forecasts in this growing service business, the ambiguous identity of forecast providers may have left farmers confused and may have prevented them from developing both trust in forecasts and skills to use them. These findings shed light on productive avenues for increasing the influence of forecasts, which may lead to greater farming productivity. In addition, this study establishes a set of reference points that can be used for comparisons with future studies to quantify changes in forecast use and influence.

  9. How Hydroclimate Influences the Effectiveness of Particle Filter Data Assimilation of Streamflow in Initializing Short- to Medium-range Streamflow Forecasts

    NASA Astrophysics Data System (ADS)

    Clark, E.; Wood, A.; Nijssen, B.; Clark, M. P.

    2017-12-01

    Short- to medium-range (1- to 7-day) streamflow forecasts are important for flood control operations and in issuing potentially life-saving flood warnings. In the U.S., the National Weather Service River Forecast Centers (RFCs) issue such forecasts in real time, depending heavily on a manual data assimilation (DA) approach. Forecasters adjust model inputs, states, parameters and outputs based on experience and consideration of a range of supporting real-time information. Achieving high-quality forecasts from new automated, centralized forecast systems will depend critically on the adequacy of automated DA approaches to make analogous corrections to the forecasting system. Such approaches would further enable systematic evaluation of real-time flood forecasting methods and strategies. Toward this goal, we have implemented a real-time Sequential Importance Resampling particle filter (SIR-PF) approach to assimilate observed streamflow into simulated initial hydrologic conditions (states) for initializing ensemble flood forecasts. Assimilating streamflow alone in the SIR-PF improves simulated streamflow and soil moisture during the model spin-up period prior to a forecast, with consequent benefits for forecasts. Nevertheless, it only consistently limits error in simulated snow water equivalent during the snowmelt season and in basins where precipitation falls primarily as snow. We examine how the simulated initial conditions with and without the SIR-PF propagate into 1- to 7-day ensemble streamflow forecasts. Forecasts are evaluated in terms of reliability and skill over a 10-year period from 2005 to 2015. The focus of this analysis is on how interactions between hydroclimate and SIR-PF performance impact forecast skill. To this end, we examine forecasts for 5 hydroclimatically diverse basins in the western U.S. Some of these basins receive most of their precipitation as snow, others as rain. Some freeze throughout the mid-winter while others experience significant mid-winter melt events. We describe the methodology and present seasonal and inter-basin variations in DA-enhanced forecast skill.
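    A minimal Sequential Importance Resampling particle filter sketch for streamflow assimilation is given below; the one-parameter linear-reservoir model, the perturbed forcing and the Gaussian observation error are assumptions chosen to keep the example self-contained, and are far simpler than the operational hydrologic model and SIR-PF configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Toy hydrologic model: a single linear reservoir driven by precipitation.
    def step(storage, precip, k=0.1):
        flow = k * storage
        return storage + precip - flow, flow

    n_particles, n_days = 500, 60
    true_S = 100.0
    sim_S = np.full(n_particles, 80.0)      # deliberately biased initial particle states
    obs_err = 2.0

    for t in range(n_days):
        p_true = rng.gamma(0.5, 8.0)                    # "true" precipitation forcing
        true_S, true_Q = step(true_S, p_true)
        obs_Q = true_Q + rng.normal(scale=obs_err)      # noisy streamflow observation

        # Propagate particles with perturbed forcing to represent input uncertainty.
        p_particles = p_true * rng.lognormal(mean=0.0, sigma=0.3, size=n_particles)
        sim_S, sim_Q = step(sim_S, p_particles)

        # Importance weights from the Gaussian likelihood of the observed streamflow.
        w = np.exp(-0.5 * ((obs_Q - sim_Q) / obs_err) ** 2)
        w /= w.sum()

        # Sequential Importance Resampling: resample particle states by their weights.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        sim_S = sim_S[idx]

    print(f"true storage = {true_S:.1f}, particle-filter ensemble mean = {sim_S.mean():.1f}")
    ```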

  10. Accuracy of forecasts in strategic intelligence

    PubMed Central

    Mandel, David R.; Barnes, Alan

    2014-01-01

    The accuracy of 1,514 strategic intelligence forecasts abstracted from intelligence reports was assessed. The results show that both discrimination and calibration of forecasts was very good. Discrimination was better for senior (versus junior) analysts and for easier (versus harder) forecasts. Miscalibration was mainly due to underconfidence such that analysts assigned more uncertainty than needed given their high level of discrimination. Underconfidence was more pronounced for harder (versus easier) forecasts and for forecasts deemed more (versus less) important for policy decision making. Despite the observed underconfidence, there was a paucity of forecasts in the least informative 0.4–0.6 probability range. Recalibrating the forecasts substantially reduced underconfidence. The findings offer cause for tempered optimism about the accuracy of strategic intelligence forecasts and indicate that intelligence producers aim to promote informativeness while avoiding overstatement. PMID:25024176

  11. Seasonal forecast of St. Louis encephalitis virus transmission, Florida.

    PubMed

    Shaman, Jeffrey; Day, Jonathan F; Stieglitz, Marc; Zebiak, Stephen; Cane, Mark

    2004-05-01

    Disease transmission forecasts can help minimize human and domestic animal health risks by indicating where disease control and prevention efforts should be focused. For disease systems in which weather-related variables affect pathogen proliferation, dispersal, or transmission, the potential for disease forecasting exists. We present a seasonal forecast of St. Louis encephalitis virus transmission in Indian River County, Florida. We derive an empiric relationship between modeled land surface wetness and levels of SLEV transmission in humans. We then use these data to forecast SLEV transmission with a seasonal lead. Forecast skill is demonstrated, and a real-time seasonal forecast of epidemic SLEV transmission is presented. This study demonstrates how weather and climate forecast skill-verification analyses may be applied to test the predictability of an empiric disease forecast model.

  12. Seasonal Forecast of St. Louis Encephalitis Virus Transmission, Florida

    PubMed Central

    Day, Jonathan F.; Stieglitz, Marc; Zebiak, Stephen; Cane, Mark

    2004-01-01

    Disease transmission forecasts can help minimize human and domestic animal health risks by indicating where disease control and prevention efforts should be focused. For disease systems in which weather-related variables affect pathogen proliferation, dispersal, or transmission, the potential for disease forecasting exists. We present a seasonal forecast of St. Louis encephalitis virus transmission in Indian River County, Florida. We derive an empirical relationship between modeled land surface wetness and levels of SLEV transmission in humans. We then use these data to forecast SLEV transmission with a seasonal lead. Forecast skill is demonstrated, and a real-time seasonal forecast of epidemic SLEV transmission is presented. This study demonstrates how weather and climate forecast skill verification analyses may be applied to test the predictability of an empirical disease forecast model. PMID:15200812

  13. Comparative Analysis of NOAA REFM and SNB3GEO Tools for the Forecast of the Fluxes of High-Energy Electrons at GEO

    NASA Technical Reports Server (NTRS)

    Balikhin, M. A.; Rodriguez, J. V.; Boynton, R. J.; Walker, S. N.; Aryan, Homayon; Sibeck, D. G.; Billings, S. A.

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field B(sub z) observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.

  14. Comparative analysis of NOAA REFM and SNB3GEO tools for the forecast of the fluxes of high-energy electrons at GEO.

    PubMed

    Balikhin, M A; Rodriguez, J V; Boynton, R J; Walker, S N; Aryan, H; Sibeck, D G; Billings, S A

    2016-01-01

    Reliable forecasts of relativistic electrons at geostationary orbit (GEO) are important for the mitigation of their hazardous effects on spacecraft at GEO. For a number of years the Space Weather Prediction Center at NOAA has provided advanced online forecasts of the fluence of electrons with energy >2 MeV at GEO using the Relativistic Electron Forecast Model (REFM). The REFM forecasts are based on real-time solar wind speed observations at L1. The high reliability of this forecasting tool serves as a benchmark for the assessment of other forecasting tools. Since 2012 the Sheffield SNB3GEO model has been operating online, providing a 24 h ahead forecast of the same fluxes. In addition to solar wind speed, the SNB3GEO forecasts use solar wind density and interplanetary magnetic field Bz observations at L1. The period of joint operation of both of these forecasts has been used to compare their accuracy. Daily averaged measurements of electron fluxes by GOES 13 have been used to estimate the prediction efficiency of both forecasting tools. To assess the reliability of both models to forecast infrequent events of very high fluxes, the Heidke skill score was employed. The results obtained indicate that SNB3GEO provides a more accurate 1 day ahead forecast when compared to REFM. It is shown that the correction methodology utilized by REFM potentially can improve the SNB3GEO forecast.
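    For a yes/no exceedance event, the Heidke skill score used in both studies follows directly from the 2x2 contingency table of forecasts against observations; the sketch below uses synthetic rare-event data, and the event frequency and error rate are assumptions.

    ```python
    import numpy as np

    def heidke_skill_score(forecast_event, observed_event):
        """HSS from a 2x2 contingency table of yes/no event forecasts vs observations."""
        f = np.asarray(forecast_event, dtype=bool)
        o = np.asarray(observed_event, dtype=bool)
        a = np.sum(f & o)      # hits
        b = np.sum(f & ~o)     # false alarms
        c = np.sum(~f & o)     # misses
        d = np.sum(~f & ~o)    # correct negatives
        num = 2.0 * (a * d - b * c)
        den = (a + c) * (c + d) + (a + b) * (b + d)
        return num / den if den else np.nan

    # Hypothetical daily exceedances of a very-high-flux threshold: observed vs forecast.
    rng = np.random.default_rng(9)
    observed = rng.random(365) < 0.05                  # rare "very high flux" days
    forecast = observed ^ (rng.random(365) < 0.03)     # imperfect forecast of those days
    print(f"HSS = {heidke_skill_score(forecast, observed):.2f}")
    ```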

  15. Using Temperature Forecasts to Improve Seasonal Streamflow Forecasts in the Colorado and Rio Grande Basins

    NASA Astrophysics Data System (ADS)

    Lehner, F.; Wood, A.; Llewellyn, D.; Blatchford, D. B.; Goodbody, A. G.; Pappenberger, F.

    2017-12-01

    Recent studies have documented the influence of increasing temperature on streamflow across the American West, including snowmelt-driven rivers such as the Colorado or Rio Grande. At the same time, some basins are reporting decreasing skill in seasonal streamflow forecasts, termed water supply forecasts (WSFs), over the recent decade. While the skill in seasonal precipitation forecasts from dynamical models remains low, their skill in predicting seasonal temperature variations could potentially be harvested for WSFs to account for non-stationarity in regional temperatures. Here, we investigate whether WSF skill can be improved by incorporating seasonal temperature forecasts from dynamical forecasting models (from the North American Multi-Model Ensemble and the European Centre for Medium-Range Weather Forecasts System 4) into traditional statistical forecast models. We find improved streamflow forecast skill relative to traditional WSF approaches in a majority of headwater locations in the Colorado and Rio Grande basins. Incorporation of temperature into WSFs thus provides a promising avenue to increase the robustness of current forecasting techniques in the face of continued regional warming.
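    The idea of adding a seasonal temperature predictor to a traditional statistical water supply forecast can be sketched with an ordinary least-squares regression; the snow-water-equivalent, temperature-anomaly and runoff series below are synthetic, and the two-predictor set is an assumption rather than the authors' model.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)

    # Hypothetical 30-year training set: April 1 snow water equivalent (SWE, mm), forecast
    # April-July temperature anomaly (K, with a warming trend) and observed seasonal runoff.
    n = 30
    swe = rng.normal(500.0, 120.0, size=n)
    t_anom = rng.normal(0.0, 1.0, size=n) + 0.02 * np.arange(n)
    runoff = 0.9 * swe - 40.0 * t_anom + rng.normal(scale=30.0, size=n)

    def fit_wsf(predictors, target):
        """Ordinary least-squares water-supply-forecast regression (with intercept)."""
        A = np.column_stack([np.ones(len(target))] + list(predictors))
        coef, *_ = np.linalg.lstsq(A, target, rcond=None)
        return coef

    coef_swe = fit_wsf([swe], runoff)              # traditional snow-only model
    coef_swe_t = fit_wsf([swe, t_anom], runoff)    # adds the seasonal temperature forecast

    # Forecast for a hypothetical warm year with an average snowpack.
    x_new = np.array([1.0, 500.0, 1.5])
    print(f"snow-only forecast:       {coef_swe @ x_new[:2]:.0f}")
    print(f"snow + temperature model: {coef_swe_t @ x_new:.0f}")
    ```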

  16. Skillful Spring Forecasts of September Arctic Sea Ice Extent Using Passive Microwave Data

    NASA Technical Reports Server (NTRS)

    Petty, A. A.; Schroder, D.; Stroeve, J. C.; Markus, Thorsten; Miller, Jeffrey A.; Kurtz, Nathan Timothy; Feltham, D. L.; Flocco, D.

    2017-01-01

    In this study, we demonstrate skillful spring forecasts of detrended September Arctic sea ice extent using passive microwave observations of sea ice concentration (SIC) and melt onset (MO). We compare these to forecasts produced using data from a sophisticated melt pond model, and find similar or higher skill values, where the forecast skill is calculated relative to linear trend persistence. The MO forecasts show the highest skill in March-May, while the SIC forecasts produce the highest skill in June-August, especially when the forecasts are evaluated over recent years (since 2008). The high MO forecast skill in early spring appears to be driven primarily by the presence and timing of open water anomalies, while the high SIC forecast skill appears to be driven by both open water and surface melt processes. Spatial maps of detrended anomalies highlight the drivers of the different forecasts, and enable us to understand regions of predictive importance. Correctly capturing sea ice state anomalies, along with changes in open water coverage, appears to be a key process in skillfully forecasting summer Arctic sea ice.
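
    Skill "relative to linear trend persistence" is commonly expressed as S = 1 - MSE(forecast) / MSE(trend persistence), where the reference forecast extrapolates a linear trend fitted to the years preceding each target year. The sketch below illustrates that calculation on synthetic extents; it is not the authors' code, and the hindcast period and noise levels are assumptions.

    ```python
    import numpy as np

    def trend_persistence_forecast(years, extents, target_year):
        """Extrapolate a linear trend fitted to all years before `target_year`."""
        mask = years < target_year
        slope, intercept = np.polyfit(years[mask], extents[mask], 1)
        return slope * target_year + intercept

    def skill_vs_trend(years, extents, forecasts, eval_years):
        """S = 1 - MSE(forecast) / MSE(linear trend persistence) over the evaluation years."""
        err_f, err_t = [], []
        for y, f in zip(eval_years, forecasts):
            obs = extents[years == y][0]
            err_f.append((f - obs) ** 2)
            err_t.append((trend_persistence_forecast(years, extents, y) - obs) ** 2)
        return 1.0 - np.mean(err_f) / np.mean(err_t)

    # Synthetic September extents with a downward trend plus noise (values are illustrative only).
    years = np.arange(1990, 2018)
    rng = np.random.default_rng(2)
    extents = 7.5 - 0.08 * (years - 1990) + rng.normal(0, 0.4, years.size)
    eval_years = np.arange(2008, 2018)
    forecasts = [extents[years == y][0] + rng.normal(0, 0.2) for y in eval_years]  # stand-in forecasts
    print(skill_vs_trend(years, extents, forecasts, eval_years))
    ```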

  17. Why Don't We Learn to Accurately Forecast Feelings? How Misremembering Our Predictions Blinds Us to Past Forecasting Errors

    ERIC Educational Resources Information Center

    Meyvis, Tom; Ratner, Rebecca K.; Levav, Jonathan

    2010-01-01

    Why do affective forecasting errors persist in the face of repeated disconfirming evidence? Five studies demonstrate that people misremember their forecasts as consistent with their experience and thus fail to perceive the extent of their forecasting error. As a result, people do not learn from past forecasting errors and fail to adjust subsequent…

  18. Assessing the viability of 'over-the-loop' real-time short-to-medium range ensemble streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Wood, A. W.; Clark, E.; Mendoza, P. A.; Nijssen, B.; Newman, A. J.; Clark, M. P.; Arnold, J.; Nowak, K. C.

    2016-12-01

    Many, if not most, national operational short-to-medium range streamflow prediction systems rely on a forecaster-in-the-loop approach in which some parts of the forecast workflow are automated, but others require the hands-on effort of an experienced human forecaster. This approach evolved out of the need to correct for deficiencies in the models and datasets that were available for forecasting, and often leads to skillful predictions despite the use of relatively simple, conceptual models. On the other hand, the process is not reproducible, which limits opportunities to assess and incorporate process variations, and the effort required to make forecasts in this way is an obstacle to expanding forecast services - e.g., through adding new forecast locations or more frequent forecast updates, running more complex models, or producing forecast ensembles and hindcasts that can support verification. In the last decade, the hydrologic forecasting community has begun to develop more centralized, 'over-the-loop' systems. The quality of these new forecast products will depend on their ability to leverage research in areas including earth system modeling, parameter estimation, data assimilation, statistical post-processing, weather and climate prediction, verification, and uncertainty estimation through the use of ensembles. Currently, the operational streamflow forecasting and water management communities have little experience with the strengths and weaknesses of over-the-loop approaches, even as the systems are being rolled out in major operational forecasting centers. There is thus a need both to evaluate these forecasting advances and to demonstrate their potential in a public arena, raising awareness in forecast user communities and development programs alike. To address this need, the National Center for Atmospheric Research is collaborating with the University of Washington, the Bureau of Reclamation and the US Army Corps of Engineers, using the NCAR 'System for Hydromet Analysis, Research, and Prediction' (SHARP) to implement, assess and demonstrate real-time over-the-loop forecasts. We present early hindcast and verification results from SHARP for short to medium range streamflow forecasts in a number of US case study watersheds.

  19. Assessment of an ensemble seasonal streamflow forecasting system for Australia

    NASA Astrophysics Data System (ADS)

    Bennett, James C.; Wang, Quan J.; Robertson, David E.; Schepen, Andrew; Li, Ming; Michael, Kelvin

    2017-11-01

    Despite an increasing availability of skilful long-range streamflow forecasts, many water agencies still rely on simple resampled historical inflow sequences (stochastic scenarios) to plan operations over the coming year. We assess a recently developed forecasting system called forecast guided stochastic scenarios (FoGSS) as a skilful alternative to standard stochastic scenarios for the Australian continent. FoGSS uses climate forecasts from a coupled ocean-land-atmosphere prediction system, post-processed with the method of calibration, bridging and merging. Ensemble rainfall forecasts force a monthly rainfall-runoff model, while a staged hydrological error model quantifies and propagates hydrological forecast uncertainty through forecast lead times. FoGSS is able to generate ensemble streamflow forecasts in the form of monthly time series to a 12-month forecast horizon. FoGSS is tested on 63 Australian catchments that cover a wide range of climates, including 21 ephemeral rivers. In all perennial and many ephemeral catchments, FoGSS provides an effective alternative to resampled historical inflow sequences. FoGSS generally produces skilful forecasts at shorter lead times (< 4 months), and transitions to climatology-like forecasts at longer lead times. Forecasts are generally reliable and unbiased. However, FoGSS does not perform well in very dry catchments (catchments that experience zero flows more than half the time in some months), sometimes producing strongly negative forecast skill and poor reliability. We attempt to improve forecasts through the use of (i) ESP rainfall forcings, (ii) different rainfall-runoff models, and (iii) a Bayesian prior to encourage the error model to return climatology forecasts in months when the rainfall-runoff model performs poorly. Of these, the use of the prior offers the clearest benefit in very dry catchments, where it moderates strongly negative forecast skill and reduces bias in some instances. However, the prior does not remedy poor reliability in very dry catchments. Overall, FoGSS is an attractive alternative to historical inflow sequences in all but the driest catchments. We discuss ways in which forecast reliability in very dry catchments could be improved in future work.
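
    For ensemble monthly streamflow forecasts of this kind, skill relative to climatology is often summarised with a skill score built on the continuous ranked probability score (CRPS): positive values beat the climatological ensemble, zero matches it, and negative values (as reported here for very dry catchments) fall below it. The sketch below is a minimal illustration of that metric under the assumption of an empirical ensemble CRPS estimator; it is not necessarily the exact score used in the FoGSS evaluation, and the synthetic ensembles are placeholders.

    ```python
    import numpy as np

    def crps_ensemble(members, obs):
        """Empirical CRPS of an ensemble forecast against a single observation."""
        m = np.asarray(members, float)
        return np.mean(np.abs(m - obs)) - 0.5 * np.mean(np.abs(m[:, None] - m[None, :]))

    def crps_skill(fcst_ensembles, clim_ensembles, observations):
        """Skill score 1 - CRPS(forecast) / CRPS(climatology), averaged over forecast events."""
        crps_f = np.mean([crps_ensemble(f, o) for f, o in zip(fcst_ensembles, observations)])
        crps_c = np.mean([crps_ensemble(c, o) for c, o in zip(clim_ensembles, observations)])
        return 1.0 - crps_f / crps_c

    # Synthetic example: 20 forecast events, 100-member ensembles (all values illustrative).
    rng = np.random.default_rng(3)
    obs = rng.gamma(2.0, 10.0, 20)
    fcst = [o + rng.normal(0, 5, 100) for o in obs]  # sharper, roughly centred forecast ensembles
    clim = [rng.gamma(2.0, 10.0, 100) for _ in obs]  # climatological resampling as the reference
    print(crps_skill(fcst, clim, obs))
    ```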

  20. Evaluation of streamflow forecast for the National Water Model of U.S. National Weather Service

    NASA Astrophysics Data System (ADS)

    Rafieeinasab, A.; McCreight, J. L.; Dugger, A. L.; Gochis, D.; Karsten, L. R.; Zhang, Y.; Cosgrove, B.; Liu, Y.

    2016-12-01

    The National Water Model (NWM), an implementation of the community WRF-Hydro modeling system, is an operational hydrologic forecasting model for the contiguous United States. The model forecasts distributed hydrologic states and fluxes, including soil moisture, snowpack, evapotranspiration (ET), and ponded water. In particular, the NWM provides streamflow forecasts at more than 2.7 million river reaches for three forecast ranges: short (15 hr), medium (10 days), and long (30 days). In this study, we verify short and medium range streamflow forecasts in the context of the verification of their respective quantitative precipitation forecasts/forcing (QPF), the High Resolution Rapid Refresh (HRRR) and the Global Forecast System (GFS). The streamflow evaluation is performed for the summer of 2016 at more than 6,000 USGS gauges. Both individual forecasts and forecast lead times are examined. Selected case studies of extreme events aim to provide insight into the quality of the NWM streamflow forecasts. A goal of this comparison is to address how much streamflow bias originates from precipitation forcing bias. To this end, precipitation verification is performed over the contributing areas above (and between) assimilated USGS gauge locations. Precipitation verification is based on the aggregated, blended StageIV/StageII data as the "reference truth". We summarize the skill of the streamflow forecasts, their skill relative to the QPF, and make recommendations for improving NWM forecast skill.
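
    One transparent way to frame the question of how much streamflow bias originates from precipitation forcing bias (a sketch, not the NWM verification code) is to compute percent bias per forecast lead time for both the forecast precipitation aggregated over a gauge's contributing area and the forecast streamflow at that gauge, and then compare the two. The arrays, error magnitudes, and gauge setup below are hypothetical.

    ```python
    import numpy as np

    def percent_bias(sim, obs):
        """100 * (sum(sim) - sum(obs)) / sum(obs); positive means over-forecasting."""
        sim, obs = np.asarray(sim, float), np.asarray(obs, float)
        return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

    # Hypothetical arrays indexed by (forecast lead, time): forecast and reference precipitation
    # aggregated over one contributing area, and forecast and observed streamflow at the outlet gauge.
    rng = np.random.default_rng(4)
    n_leads, n_times = 10, 200
    p_ref = rng.gamma(1.2, 2.0, (n_leads, n_times))             # stand-in for a blended StageIV/StageII reference
    p_fcst = p_ref * rng.normal(1.1, 0.1, (n_leads, n_times))   # wetter-than-observed forcing
    q_obs = rng.gamma(2.0, 5.0, (n_leads, n_times))             # stand-in for observed flow at a USGS gauge
    q_fcst = q_obs * rng.normal(1.2, 0.15, (n_leads, n_times))  # flow forecast with amplified bias

    for lead in range(n_leads):
        pb_p = percent_bias(p_fcst[lead], p_ref[lead])
        pb_q = percent_bias(q_fcst[lead], q_obs[lead])
        print(f"lead {lead + 1:2d} days: precip bias {pb_p:6.1f}%  streamflow bias {pb_q:6.1f}%")
    ```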
